CN102902419A - Mixed type pointing device - Google Patents
- Publication number
- CN102902419A (application CN201110212481A)
- Authority
- CN
- China
- Prior art keywords
- light source
- module
- pointing device
- image sensor
- image
- Prior art date
- Legal status
- Pending
Landscapes
- Position Input By Displaying (AREA)
Abstract
The invention provides a hybrid pointing device comprising an optical navigation module and a pointing module. The optical navigation module replaces the buttons of an existing pointing device such as an optical mouse or a wheel mouse. It senses the gesture of at least one object operated by the user, so that a command of a specific program executed by a host can be started. Because the optical navigation module only senses the gesture of the object and does not sense the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough to sense the gesture.
Description
Technical field
The present invention relates to a hybrid pointing device, and more particularly to a hybrid pointing device comprising an optical navigation module for sensing the gesture of at least one finger and a pointing module for sensing the movement of the hybrid pointing device relative to a surface.
Background
In existing pointing devices such as optical mice and wheel mice, the cursor shown on a host display is controlled by the relative displacement between the pointing device and a surface. Such a pointing device generally includes two buttons (a left button and a right button) used to start commands related to the cursor movement on the display. Typically, when the user wants to execute a program, drag a drawing or modify a pattern, the user moves the cursor on the display, points the cursor at a specific graphical user interface (GUI) element, and then presses at least one button to start the command. To extend the applications of existing pointing devices, some pointing devices are provided with more than two buttons. The user can therefore define a specific function that is started by pressing the extra button, or by pressing several buttons simultaneously while moving the cursor on the display.
However, because the user can operate at most five buttons simultaneously with five fingers, providing too many buttons confuses the user. For example, while trying to press several buttons, the user may find it difficult to move the pointing device to move the cursor on the display.
Another kind of pointing device provides an optical sensing module to replace the conventional mouse. The optical sensing module emits light toward a finger and receives the light reflected by the finger to sense the finger's movement, and thereby controls the cursor on the display. This kind of pointing device is smaller and its sensing region is smaller, so it suffers from low resolution, making it difficult to control the cursor precisely and difficult to move the cursor quickly.
In addition, because the user's hand and fingers do not operate steadily, the above existing mice make it difficult to move the cursor in a straight line toward one direction, to move it along a particular path, to draw a smooth arc, or to perform precise fine movements.
In recent years, the industry has proposed pointing devices with a capacitive touch module (CTM) or a resistive touch module (RTM). The capacitive or resistive touch module senses the touch action of a finger to start a command. More specifically, the capacitive or resistive touch module has a sensing array distributed evenly over a sensing region. When a finger properly contacts the sensing region, the touch action causes an electrical change in the sensing array that indicates the contact position on the array. However, to ensure that the finger can be detected correctly, the whole capacitive or resistive touch module must be kept in normal operation. Once any part of the module fails to operate normally, the movement of the finger cannot be detected correctly. Moreover, the finger must touch the capacitive or resistive touch module in a specific manner and with sufficient force to be sensed by the pointing device. These characteristics limit the application of this technology.
In view of this, it is important to provide a pointing device that can start commands without using buttons and that can be moved accurately in different ways to achieve better control.
Summary of the invention
The invention provides a hybrid pointing device comprising an optical navigation module and a pointing module. The pointing module senses the movement of the hybrid pointing device relative to a surface to move the cursor on a display. The optical navigation module replaces the buttons of existing pointing devices such as optical mice or wheel mice, i.e. the left button, right button, scroll wheel and so on. The optical navigation module senses the gesture of at least one finger of the user to start a command of a specific program executed by a host. Because the optical navigation module only senses the gesture of the finger and does not sense the movement of the hybrid pointing device relative to the surface, its resolution need only be high enough to sense the gesture and does not need to be particularly high.
The invention also provides a hybrid pointing device comprising an optical navigation module and a pointing module. The optical navigation module assists in moving the cursor so as to better meet the user's needs. By sensing a specific gesture of at least one finger, the optical navigation module can start a command that constrains the direction in which the cursor moves, so that the cursor moves on the display in a straight line, the page scrolls up and down, the page scrolls left and right, and so on. The user can therefore control the cursor to move along a desired direction more accurately than with existing pointing devices. In addition, by sensing a specific gesture of at least one finger, the optical navigation module can directly move the cursor, move the cursor on the display at a relatively high speed, directly move the cursor within a limited range, directly move the cursor in cooperation with at least one key on a keyboard, or scroll pages at different speeds, and so on.
Because the optical navigation module can be operated in many ways, such as pointing with at least one finger, performing a gesture such as sliding with at least one finger, touching with several fingers, tapping, and drawing a circle with at least one finger, whereas with an existing pointing device the user can only choose to press or not press a key to run a command, the optical navigation module of the invention provides a more intuitive way to control the cursor on the display or the picture on the screen.
The optical navigation module of the invention comprises at least one image sensor and at least one light source. The light source emits light, and at least one object operated by the user reflects the light emitted by the light source, which is then received by the image sensor. Because different gestures of the object form different images in the image sensor, the optical navigation module converts the image into an electric signal to control the cursor on the display or to start a specific program executed by the host.
The hybrid pointing device of the invention is operated by the user on a surface. The hybrid pointing device comprises a first module, a second module and a processor. The first module senses the movement of the hybrid pointing device relative to the surface. The second module comprises a light source and an image sensor. The light source emits light. The image sensor captures an image comprising at least one light spot produced when at least one object operated by the user reflects the light emitted by the light source. The processor recognizes the gesture of the object according to the position information of the light spot in the image.
In an embodiment of the invention, the position information of the light spot in the image is obtained through a look-up table, which is formed from a matrix that divides the angular field of view of the image sensor into a plurality of subregions and is stored in the processor in advance.
The invention can be integrated into an existing optical mouse or wheel mouse structure by adding the optical navigation module of the invention and modifying the related peripheral components. In one embodiment, the first module and the second module of the hybrid pointing device may share the same light source.
Description of drawings
Fig. 1 shows a top view of the hybrid pointing device of the first embodiment of the invention.
Fig. 2 shows a cross-sectional view of Fig. 1 along line A-A'.
Fig. 3a and Fig. 3b show schematic diagrams of the images, containing shadows, captured by the image sensors of Fig. 1.
Fig. 4 shows a top view of the hybrid pointing device of the second embodiment of the invention.
Fig. 5 shows a schematic diagram of the hybrid pointing device of the third embodiment of the invention.
Fig. 6a shows a schematic diagram of the hybrid pointing device of the fourth embodiment of the invention.
Fig. 6b shows a schematic diagram of the touch pad of the hybrid pointing device of the fourth embodiment of the invention, in which the touch pad has a plurality of through holes.
Fig. 7 shows a schematic diagram of the operating area divided into a matrix of a plurality of subregions.
Description of main element symbols
10, 40, 50, 60 hybrid pointing devices
101, 103, 401, 501, 601 image sensors
1011, 1012, 1031, 1032 boundaries of angular fields of view
105, 403, 405, 505 light sources
107, 407, 507, 607 operating areas
108, 508, 608 pointing modules
109, 409, 509, 609 processors
111, 113 virtual lines; 115 finger position
301, 303 shadows; 406 reflecting element
421 finger position; 423 finger mirror image
71, 71', 72 finger positions
Embodiment
In order to make the above and other objects, features and advantages of the invention more apparent, detailed descriptions are given below with reference to the accompanying drawings.
In the following description, the hybrid pointing device of the invention is explained through embodiments having an optical navigation module that replaces the buttons of an existing pointing device. However, the embodiments of the invention are not limited to any specific environment, application or implementation. Therefore, the following descriptions of the embodiments are only exemplary and are not intended to limit the invention. It should be understood that elements not directly related to the invention are omitted from the following embodiments and drawings.
The following drawings show various examples of the invention, which are similar to existing mice. That is, the hybrid pointing device of the invention described below is intended to be placed on a smooth working surface, and it comprises a pointing module that emits light toward the working surface and receives the light reflected from the working surface, so as to control the movement of the cursor shown on a host display. In other embodiments, the pointing module may be replaced by other devices, for example a roller pointing module. Those skilled in the art will understand the functions of the existing pointing module, so they are not repeated in the following description.
Fig. 1 shows a top view of the hybrid pointing device of the first embodiment of the invention. The hybrid pointing device 10 comprises an optical navigation module having two image sensors 101 and 103, a light source 105 and a processor 109. The processor 109 is electrically connected to the image sensors 101 and 103 and to the light source 105. It should be noted that the numbers of light sources and image sensors are not intended to limit the invention. In this embodiment, the processor 109 may also be electrically connected to a pointing module 108 (as shown in Fig. 2) to process the data transmitted from that module. In other embodiments, however, the pointing module 108 may comprise its own independently operating processor. The image sensors optically cover the operating area 107 of the hybrid pointing device 10. In this embodiment, the image sensors 101 and 103 optically cover the operating area 107 as shown in Fig. 2, where Fig. 2 is a cross-sectional view of the hybrid pointing device 10 of Fig. 1 along line A-A'. It should be noted that the processor 109 in Fig. 2 is shown only schematically for the purpose of explanation. The image sensors 101 and 103 are arranged with respect to the light source 105 so that they can receive the light emitted by the light source 105. The operating area 107 of the first embodiment is an inclined surface on which the user can place and move a finger. In other embodiments, the operating area 107 may also be vertical or concave, as long as it does not prevent the image sensors 101 and 103 from receiving the light emitted by the light source 105. In other embodiments, the image sensors 101 and 103 and the light source 105 may be arranged at different positions, as long as the angular fields of view of the image sensors 101 and 103 cover the light source 105. As shown in Fig. 2, the light source 105 may be the light source of the pointing module 108 whose light is guided by a specific light guiding structure; that is, the optical navigation module and the pointing module 108 share the same light source. In other embodiments, the light source 105 may be any light source other than that of the pointing module 108.
In particular, the relative position between the light source 105 and the image sensors 101 and 103 is constrained by the angular fields of view of the image sensors and the illumination angle of the light source. The illumination angle of the light source refers to the angular range that can be illuminated by the light emitted from the light source. Generally, an image sensor has an angular field of view defining the maximum range it can sense. For example, the image sensor 101 has an angular field of view of roughly 90 degrees, indicated by the two arrows 1011 and 1012. The arrows 1011 and 1012 indicate the two boundaries of the field of view; the image sensor 101 cannot sense objects outside this field of view. Similarly, the image sensor 103 may have an angular field of view of 90 degrees, indicated by the other two arrows 1031 and 1032. In the first embodiment, the image sensors 101 and 103 both have angular fields of view covering the illumination angle of the light source 105. In other embodiments, the light source 105 may be replaced by a linear light source, for example a light emitting section that emits roughly parallel light.
When the user places at least one finger in the operating area 107, the finger blocks the light emitted by the light source 105 and forms at least one shadow in the image sensors 101 and 103. Refer to Fig. 3a and Fig. 3b, which show schematic diagrams of the shadows sensed by the image sensors 101 and 103 respectively. In this embodiment, the image sensors 101 and 103 are line image sensors, and the finger forms shadows 301 and 303 in the images captured by the image sensors 101 and 103 respectively. Each image displays a numerical range representing positions between the two boundaries of the angular field of view of the corresponding image sensor. In one embodiment, for example, each image sensor has a numerical range of 0 to 255. When the user places at least one finger in the operating area 107, the finger blocks the light emitted by the light source and forms shadows 301 and 303 in the image sensors 101 and 103 respectively. As shown in Fig. 3a and Fig. 3b, the shadow 301 for example has a value of 120, and the shadow 303 for example has a value of 200. It should be understood that the user may also operate in the operating area 107 with other objects, such as a stylus, touch rod or similar article, instead of the finger described in the embodiments of the invention.
A virtual line 111 can be drawn passing through the image sensor 101 and the contact finger position 115, and a virtual line 113 can be drawn passing through the image sensor 103 and the contact finger position 115. The positions of the image sensors 101 and 103 can be mapped to two sets of coordinate values in any existing coordinate system, for example a polar coordinate system or a rectangular coordinate system. The processor 109 then maps the two values, for example 120 and 200, to a set of coordinate values based on the same coordinate system, i.e. the contact coordinate values, and derives two equations from the virtual lines 111 and 113. By solving the two equations, the processor 109 obtains the position 115 of the finger in contact with the operating area 107. When the contacting finger moves, the processor can track the finger by continuously obtaining the new positions of the contacting finger. Once the contacting finger leaves the operating area 107, the processor 109 detects the rapid change caused by the disappearance of the contact coordinate values.
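The two-line triangulation just described can be illustrated with a short sketch. It is a minimal Python example under assumed geometry: the sensor positions, mounting angles, 0-255 pixel range and linear pixel-to-angle mapping are hypothetical choices for illustration, not values specified by the patent.

```python
import math

SENSOR_RES = 256           # pixels per line image sensor (assumed)
FOV_DEG = 90.0             # angular field of view of each sensor (assumed)

def pixel_to_bearing(pixel, mount_angle_deg):
    """Convert a shadow pixel index into an absolute bearing in radians."""
    offset = (pixel / (SENSOR_RES - 1)) * FOV_DEG - FOV_DEG / 2.0
    return math.radians(mount_angle_deg + offset)

def intersect(p1, a1, p2, a2):
    """Intersect two rays p + t*(cos a, sin a); returns the contact point."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None            # rays are parallel: no unique contact point
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Hypothetical geometry: sensor 101 in the left corner aimed 45 degrees into
# the operating area, sensor 103 in the right corner aimed 135 degrees.
pos_101, aim_101 = (0.0, 0.0), 45.0
pos_103, aim_103 = (10.0, 0.0), 135.0

# Shadow readings 120 and 200 from the example in the description.
bearing_101 = pixel_to_bearing(120, aim_101)
bearing_103 = pixel_to_bearing(200, aim_103)
print(intersect(pos_101, bearing_101, pos_103, bearing_103))
```

With these assumed values the two rays meet at a single point inside the operating area, which corresponds to the contact position 115 obtained by solving the two line equations.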
In other embodiments, the image sensor 101 may also be arranged so that when a finger is placed in the operating area 107, the finger reflects the light emitted by the light source 105, and the reflected light is received by the image sensor 101 and forms a bright spot on the image sensor 101. Because the relative position of the light source 105 and the operating area 107 is predetermined, the distance between the finger and the light source 105 can be determined by evaluating the change of the bright spot in the image formed on the image sensor 101, and the movement of the finger can be tracked through the change of its distance from the light source 105. For example, a brighter spot indicates that the finger is closer to the light source, and a darker spot indicates that the finger is farther from the light source. As another example, a larger spot extent indicates that the finger is farther from the light source, and a smaller spot extent indicates that the finger is closer to the light source; the extent of the spot is determined according to the brightness of the pixels of the image sensor, and it comprises the pixels whose brightness exceeds a predetermined value.
As another example, when the pointing device has an operating plane so that the user places a finger on the operating plane during use, because the relative position of the light source 105 and the image sensor 101 is fixed, and the finger is restricted to moving on the operating plane, the image position on the image sensor 101 changes when the relative distance or relative angle between the finger and the light source 105 changes. The processor obtains the distance between the object and the light source by calculation from the bright spot position, according to the triangle formed by the bright spot position, the light source and the object.
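A small sketch of how such a triangle-based range estimate could be computed from the bright spot column is given below; the baseline, emission angle and sensor parameters are assumptions made only for the example and are not taken from the patent.

```python
import math

BASELINE_MM = 20.0                   # assumed light-source-to-sensor baseline
EMIT_ANGLE = math.radians(60.0)      # assumed beam angle measured from the baseline
SENSOR_RES = 256
FOV = math.radians(90.0)
SENSOR_AIM = math.radians(90.0)      # assumed sensor centre direction vs. the baseline

def spot_to_range(spot_pixel):
    """Estimate the object-to-light-source distance from the spot column."""
    # Viewing angle of the spot, measured from the baseline at the sensor end.
    view_angle = SENSOR_AIM + (spot_pixel / (SENSOR_RES - 1) - 0.5) * FOV
    # Triangle source-sensor-object: the remaining angle sits at the object.
    object_angle = math.pi - EMIT_ANGLE - view_angle
    if object_angle <= 0:
        return None                  # geometry impossible for this reading
    # Law of sines: range from the source / sin(view) = baseline / sin(object).
    return BASELINE_MM * math.sin(view_angle) / math.sin(object_angle)

print(spot_to_range(100))   # nearer object under the assumed geometry
print(spot_to_range(180))   # farther object under the assumed geometry
```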
The aforementioned distance change can be used to trigger a corresponding command, for example to replace the function of the wheel of a wheel mouse, so as to control the up-down and left-right scrolling of a window picture.
Once the optical navigation module can track the movement of at least one finger, the tracking data can be used to start specific commands. For example, if the contacting finger moves to the right, the optical navigation module can capture the tracking data and start a command to scroll the current picture; if two contacting fingers move away from or toward each other, the tracking data can be used to start a command to zoom in or zoom out the current picture or pattern; if at least one contacting finger moves clockwise or counterclockwise, the tracking data can be used to rotate the current picture. Moreover, the user can assign desired commands to specific finger actions.
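For illustration, one way such tracking data could be mapped onto the commands mentioned above is sketched below; the threshold values and command names are assumptions for the example, not part of the patent.

```python
def interpret_gesture(tracks, move_thresh=5.0, pinch_thresh=3.0):
    """tracks: list of (start_xy, end_xy) tuples, one per tracked finger."""
    if len(tracks) == 1:
        (x0, y0), (x1, y1) = tracks[0]
        if x1 - x0 > move_thresh:
            return "SCROLL_RIGHT"          # single finger moved right
        if x0 - x1 > move_thresh:
            return "SCROLL_LEFT"           # single finger moved left
    elif len(tracks) == 2:
        (a0, a1), (b0, b1) = tracks
        def dist(p, q):
            return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
        spread = dist(a1, b1) - dist(a0, b0)
        if spread > pinch_thresh:
            return "ZOOM_IN"               # fingers moved apart
        if spread < -pinch_thresh:
            return "ZOOM_OUT"              # fingers moved together
    return "NONE"

# Example: one finger sliding to the right by 8 units.
print(interpret_gesture([((0.0, 0.0), (8.0, 0.5))]))
```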
Fig. 4 shows a top view of the hybrid pointing device 40 of the second embodiment of the invention. The main difference between the first and second embodiments is that the second embodiment uses only one image sensor 401 together with a reflecting element 406 and two linear light sources 403 and 405. In other embodiments, the linear light sources 403 and 405 may be formed as a single linear light source, and the reflecting element 406 may extend a suitable length in different directions within the angular field of view of the image sensor 401, for example to the side opposite the light source 405 in Fig. 4. The reflecting element 406 may also be composed of a plurality of mirror surfaces. In other embodiments, one of the linear light sources 403 and 405 may be an active light source that emits light, and the other may be a passive light source (for example a reflective fabric) that reflects the light emitted by the active light source. It should be understood that when a finger contacts the position 421 on the operating area 407, a mirror image symmetrical to the position 421 is formed at the position 423 with respect to the reflecting element 406. The contacting finger forms two shadows in the image sensor 401 (one formed by blocking the light reflected by the reflecting element 406, and the other being the mirror image of the finger at the position 421 mapped onto the reflecting element 406), producing two values, and the processor 409 then maps the two values to coordinate values. As described above, the position of the image sensor 401 can also be mapped to coordinate values. By solving the equations determined by these coordinate values, the processor 409 can determine the position at which the finger makes contact.
Fig. 5 shows a schematic diagram of the hybrid pointing device 50 of the third embodiment of the invention. The hybrid pointing device 50 comprises an image sensor 501, a light source 505, a processor 509 and a pointing module 508; the processor 509 is electrically connected to the image sensor 501, the light source 505 and the pointing module 508. It should be noted that the numbers of light sources and image sensors are not intended to limit the invention. The hybrid pointing device 50 also has an operating area 507, which is the upper surface of a touch pad, on which the user places and moves at least one finger. As shown in Fig. 5, the light source 505 emits light and the user's finger reflects the light emitted by the light source. The reflected light is then received by the image sensor 501. The processor 509 can then recognize the position of the finger on the operating area 507 and continue to track the movement of the finger on it. The tracking information is used to start specific commands as described in the first and second embodiments.
Moreover, because different gesture actions of at least one finger form different images in the image sensor 501, the different gestures of at least one finger can be recognized through image recognition techniques to start commands. When no finger is on the operating area 507, the light emitted by the light source 505 passes outward and the image sensor 501 cannot sense light reflected from the operating area 507; that is, the touch pad is transparent to the light emitted by the light source 505. When the user places at least one finger on the operating area 507, the light emitted by the light source 505 is reflected at the surface between the contacting finger and the operating area 507 and forms at least one gently varying light spot in the image sensed by the image sensor 501. The processor 509 then converts the image into an electric signal to control the cursor shown on the display or to start a specific program executed by the host.
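A minimal sketch of the kind of spot detection this embodiment relies on is shown below: the captured frame is thresholded and the centroid of the bright pixels is taken as the contact position. The frame format and the threshold value are assumptions made for the example.

```python
def find_contact(frame, threshold=200):
    """frame: 2D list of pixel intensities (0-255). Returns (row, col) or None."""
    count = row_sum = col_sum = 0
    for r, line in enumerate(frame):
        for c, value in enumerate(line):
            if value >= threshold:          # pixel belongs to the bright spot
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None                         # no finger on the operating area
    return (row_sum / count, col_sum / count)

# Example: a small frame with a bright spot near the lower-right corner.
frame = [[10, 10, 10, 10],
         [10, 10, 30, 40],
         [10, 20, 230, 240],
         [10, 10, 220, 250]]
print(find_contact(frame))
```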
Fig. 6a shows a schematic diagram of the hybrid pointing device 60 of the fourth embodiment of the invention. The hybrid pointing device 60 comprises an image sensor 601, a processor 609 and a pointing module 608; the processor 609 is electrically connected to the image sensor 601 and the pointing module 608. When the ambient light is strong enough, the image sensor 601 directly senses the operating area 607 and can recognize at least one finger on the operating area 607 from the ambient light. It should be noted that the number of image sensors is not intended to limit the invention.
A finger placed on the operating area 607 produces shadows of different shapes, which are then sensed by the image sensor 601. The processor 609 can then recognize the position of the finger on the operating area 607 and continue to track its movement on it. The tracking information is used to start specific commands as in the first and second embodiments. Moreover, because different gesture actions of at least one finger form different images in the image sensor 601, the user can start commands with different gestures of at least one finger. The processor 609 can recognize the images caused by the different gestures through image recognition techniques as described in the third embodiment. Because the image sensor 601 senses the contacting finger by the shadow that the finger produces when blocking the ambient light, the touch pad may also be formed with a plurality of through holes as shown in Fig. 6b, and the processor 609 can then recognize the position of the finger on the operating area 607 according to the through holes covered by the contacting finger and continue to track the movement of the finger on the operating area 607. It should be understood that the shape and density of the through holes shown in Fig. 6b are not intended to limit the invention.
In other embodiments, the operating areas 507 and 607 may have a light guiding element for guiding light to the whole operating area. For example, in Fig. 5, the light source 505 may be repositioned at one end of the operating area 507 that has the light guiding element, and the light it emits travels along the operating area 507 guided by the light guiding element. The image sensor 501 then senses an image of the operating area 507 with evenly distributed brightness. When the user places at least one finger in the operating area 507, the contacting finger changes the light intensity, and at least one shadow caused by the contacting finger appears in the sensed image. The processor 509 can recognize the sensed image through image recognition techniques as described in the third embodiment. In Fig. 6a, the operating area 607 may have a light guiding element for guiding ambient light or the light of an auxiliary light source to the whole operating area 607, and the image sensor 601 then operates like the image sensor 501 described above.
Because the optical navigation module of the invention senses the gestures or movements of the user's fingers, the resolution of the image sensor of the optical navigation module in the above embodiments does not need to be as high as that of the image sensor of the pointing module. In particular, it is sufficient if the image sensor of the optical navigation module can sense the gesture or movement of at least one finger; it does not need to sense details of the finger surface, for example a fingerprint. In other words, its resolution is sufficient as long as the image sensor of the optical navigation module can sense the general contour of the finger. Examples of the image sensor include a CCD image sensor, a CMOS image sensor or similar image sensors.
In addition, the above hybrid pointing device may further comprise a transmission interface unit for transmitting the movement sensed by the pointing module to the display to move the cursor on the display, and for transmitting the commands recognized by the processor from the finger gestures to start a specific program executed by the host or to directly move the cursor on the display.
The operating area of the above embodiments can be divided into many subregions according to the coordinate system applied by the optical navigation module. Taking a rectangular coordinate system as an example, refer to Fig. 7, which shows a schematic diagram of the operating area divided into a matrix of a plurality of subregions. Suppose the contacting finger occupies the position 71 of the operating area and then moves to the position 72. The optical navigation module only needs to continue sensing which subregions are contacted by the finger to calculate the movement of the contacting finger, and then starts a specific command in response to the movement. Even when the contacting finger is moving, the optical navigation module may sense only the start point and the end point to obtain the moving direction of the contacting finger, and then start a specific command in response to the movement information of the contacting finger.
When a finger contacts more than one subregion simultaneously, as at the positions 71 and 71' shown in Fig. 7, the optical navigation module can estimate the position information with various methods, for example averaging the relative coordinate values of the two subregions, treating the two subregions together as the start point, selecting the subregion most occupied by the contacting finger, or randomly selecting one of the contacted subregions, but the invention is not limited to these. Meanwhile, the optical navigation module can store the position information of every subregion in advance, for example in a look-up table kept in a storage device, and then retrieve the pre-stored position information when the finger contacts the operating area, to improve the processing speed. Because the main purpose of the optical navigation module is to determine the gesture or movement of the finger, as long as the optical navigation module estimates the position information with a consistent method, for example always using one of the methods listed above, the output of the optical navigation module can be used to determine the gesture or movement of the finger.
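The look-up-table idea can be sketched as follows, assuming an operating area divided into an 8-by-8 matrix whose subregion centres are computed once in advance; the grid size and area dimensions are illustrative values only, not figures from the patent.

```python
GRID_ROWS, GRID_COLS = 8, 8
AREA_W, AREA_H = 40.0, 40.0     # assumed operating-area size in mm

# Precomputed look-up table: subregion index -> centre coordinate.
LOOKUP = {
    (r, c): ((c + 0.5) * AREA_W / GRID_COLS, (r + 0.5) * AREA_H / GRID_ROWS)
    for r in range(GRID_ROWS)
    for c in range(GRID_COLS)
}

def movement(start_cell, end_cell):
    """Return the displacement vector between two occupied subregions."""
    x0, y0 = LOOKUP[start_cell]
    x1, y1 = LOOKUP[end_cell]
    return (x1 - x0, y1 - y0)

# Example: finger moves from subregion (2, 1) to subregion (5, 6).
print(movement((2, 1), (5, 6)))
```

Because the table is built once, run-time tracking only needs to know which subregion the finger occupies, which matches the purpose of improving processing speed described above.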
The above light source can be any existing light source, such as a light emitting diode, a laser diode or an infrared light source, but the invention is not limited to these. The advantage of using an infrared light source is that its invisibility avoids affecting the user's vision. The tracking information captured from the movement of the contacting finger can also be used to assist in moving the cursor on the display. For example, when the optical navigation module detects that the moving direction of the contacting finger is the same as the moving direction sensed by the pointing module, for example the contacting finger moves to the left while the pointing module moves the cursor to the left, the cursor can be moved to the left at an accelerated speed. Alternatively, after detecting a specific gesture, the optical navigation module can, with or without cooperation of at least one key on the keyboard, temporarily control the movement of the cursor through at least one finger moving on the operating area. The operating area and/or image sensor of the above embodiments can be arranged at an inclined angle so that placing a finger and sensing its image are easier.
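One possible realisation of the acceleration behaviour described above is sketched below; the gain factor and the simple sign test are assumptions for the example rather than details specified by the patent.

```python
def cursor_delta(pointing_dx, pointing_dy, gesture_dx, gesture_dy, gain=2.0):
    """Scale the pointing-module displacement when the gesture agrees with it."""
    same_x = pointing_dx * gesture_dx > 0     # both moving the same way in x
    same_y = pointing_dy * gesture_dy > 0     # both moving the same way in y
    scale = gain if (same_x or same_y) else 1.0
    return (pointing_dx * scale, pointing_dy * scale)

# Example: device and finger both move left, so the cursor moves left faster.
print(cursor_delta(-3.0, 0.0, -1.0, 0.0))
```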
As mentioned above, existing pointing devices with an optical sensing module have the problems that it is difficult to control the cursor accurately and difficult to move the cursor at higher speed, and existing pointing devices with a capacitive or resistive touch module must be operated with a relatively large pressing force and must be maintained in good condition. Therefore, the invention provides a hybrid pointing device that has a multi-finger touch control function and a more intuitive operation than existing pointing devices, and that can accurately move the cursor along a desired direction and path. Moreover, because the optical navigation module of the invention senses the gestures or movements of the user's fingers, the resolution of the image sensor of the optical navigation module in the above embodiments can be lower than that of the image sensor of the pointing module.
Although the invention has been disclosed through the above embodiments, the above embodiments are not intended to limit the invention. Any person skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the scope of protection of the invention shall be defined by the appended claims.
Claims (12)
1. A pointing device, operated by a user so that a relative displacement occurs between the pointing device and a surface, the pointing device comprising:
a first module for sensing the movement of the pointing device relative to the surface;
a second module comprising:
a light source for emitting light; and
an image sensor for capturing an image comprising at least one light spot produced when at least one object operated by the user reflects the light emitted by the light source; and
a processor for recognizing the relative distance between the object and the light source according to the change of the light spot image in the image and for generating a distance signal.
2. The pointing device according to claim 1, wherein the first module is an optical navigation module or a mechanical navigation module.
3. The pointing device according to claim 1, wherein the processor calculates the movement sensed by the first module.
4. The pointing device according to claim 1, wherein the processor recognizes the relative distance between the object and the light source according to the brightness of the light spot image; when the brightness of the light spot image increases, the relative distance between the object and the light source shortens, and when the brightness of the light spot image decreases, the relative distance between the object and the light source increases.
5. The pointing device according to claim 1, wherein the processor recognizes the relative distance between the object and the light source according to the extent of the bright spot in the light spot image; when the extent of the bright spot shrinks, the relative distance between the object and the light source shortens, and when the extent of the bright spot increases, the relative distance between the object and the light source increases, the extent of the bright spot being determined according to the brightness of the pixels of the image sensor and comprising the pixels whose brightness exceeds a predetermined value.
6. The pointing device according to claim 1, further comprising an operating plane on which the user places the object during use, wherein the relative position of the image sensor and the light source is fixed and the object is restricted to use on the operating plane, the processor recognizes the relative distance between the object and the light source according to the change of the bright spot position within the image capturing range of the light spot image, and the processor obtains the distance between the object and the light source by calculation from the bright spot position according to the triangle formed by the bright spot position, the light source and the object.
7. The pointing device according to claim 1, wherein the second module is a distance measuring sensor module.
8. The pointing device according to claim 1, wherein the second module is a proximity sensor module.
9. The pointing device according to claim 1, wherein the second module further comprises another light source, so that the object forms at least two light spot images on the image sensor.
10. The pointing device according to claim 1, wherein the second module further comprises another light source and another image sensor corresponding to the another light source, so that the object forms a light spot image on each of the respective image sensors.
11. The pointing device according to claim 1, wherein the second module further comprises a blocking element for preventing the light emitted by the light source from being received directly by the image sensor.
12. The pointing device according to claim 1, wherein the distance signal is used to control a corresponding command so as to control the scrolling of a window on the screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011102124819A | 2011-07-27 | 2011-07-27 | Mixed type pointing device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102902419A | 2013-01-30 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 2013-01-30 |