CN201465045U - Cursor locating system - Google Patents

Cursor locating system

Info

Publication number
CN201465045U
Authority
CN
China
Prior art keywords
unit
cursor
image capture
motion
button
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009200675548U
Other languages
Chinese (zh)
Inventor
袁鸿军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN2009200675548U priority Critical patent/CN201465045U/en
Application granted granted Critical
Publication of CN201465045U publication Critical patent/CN201465045U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The utility model discloses a cursor positioning system. The cursor is driven by the relative three-dimensional motion between an image capture unit and an arbitrarily photographed object; the arbitrarily photographed object serves as the instantaneous measurement datum for the three-dimensional motion of the image capture unit; the three-dimensional motion of the image capture unit causes the image point of the arbitrarily photographed object to move within the unit's field of view; and the cursor is driven according to the two-dimensional motion of that image point. The cursor positioning system comprises the image capture unit, an image processing and computation unit, a data transmitting unit, a data receiving unit, and a processing unit. Because the cursor is driven by the relative motion between a camera device and whatever objects it photographs, the user does not need a smooth desktop as a support and can use the system freely anywhere in space.

Description

Cursor positioning system
Technical field
The utility model belongs to the field of electronic information technology and relates to a positioning system, and in particular to a cursor positioning system.
Background art
Existing computer mice are usually classified by structure into mechanical mice and optical mice.
Early mechanical mice judged their direction of movement with sliding potentiometers, so their sensitivity was low and their wear was high. As the technology developed, the mechanical mouse absorbed some designs from the optical mouse and evolved from a purely mechanical structure into the opto-mechanical mouse, which uses an encoder instead of the purely mechanical arrangement and a ball resting against two rotating shafts. After years of development the opto-mechanical mouse became the most technically mature type of mouse; it is simple in structure and low in cost, and it still holds a certain share of the market today.
The optical mouse was invented in 1981 by Dick Lyon and Steve Kirsch. This ball-less mouse uses optical positioning; the earliest optical mice had to be used together with a special mouse pad, which was inconvenient. As the technology developed, the optical mouse finally abandoned the pad: during operation it shines a beam of red light onto the desktop and judges the motion of the mouse from the motion and reflection of the desktop's color differences or surface relief. The optical mouse is therefore comparatively precise, light in weight, and needs no regular cleaning, so it was formerly used mainly in fields requiring precise positioning; as its cost has fallen it has gradually become widespread.
The defect of both kinds of mouse is that they must be placed on a smooth contact surface and the cursor is moved by sliding over that surface, so the user is confined to operating within a narrow space; over time this easily leads to occupational ailments and harms the user's health.
In addition, there is a kind of air mouse based on a gyroscope and an accelerometer, but it is too expensive.
Summary of the utility model
The technical problem to be solved by the utility model is to provide a cursor positioning system that can be used freely anywhere in space.
To solve the above technical problem, the utility model adopts the following technical scheme:
A cursor positioning system comprises an image capture unit, an image processing and computation unit, a data transmitting unit, a data receiving unit, and a processing unit. The image capture unit photographs images and sends each captured frame to the image processing and computation unit. The image processing and computation unit processes the relative three-dimensional motion between the image capture unit and an arbitrarily photographed object into cursor position offset data and transfers the data to the data transmitting unit; the three-dimensional motion of the image capture unit causes the image of the arbitrarily photographed object to move within the unit's field of view, and the cursor is driven according to the two-dimensional motion of that image. The data transmitting unit sends the calculated cursor position compensation to the data receiving unit. The data receiving unit receives the cursor position compensation sent by the data transmitting unit and forwards it to the processing unit. The processing unit drives the cursor to be displayed at the corresponding position.
As a preferred scheme of the utility model, the arbitrarily photographed object serves as the instantaneous measurement datum for the three-dimensional motion of the image capture unit.
As a preferred scheme of the utility model, the image processing and computation unit records the position of the image point of the instantaneous measurement datum in frame i; it searches for the image point of the instantaneous measurement datum in frame i+1 by template matching; and it calculates the two-dimensional motion of the image point of the arbitrarily photographed object from the positions of these two image points.
As a preferred scheme of the utility model, the image capture unit sends the captured frame i to the image processing and computation unit, and the value of i increases by 1 with each shot. The image processing and computation unit obtains the image point of the instantaneous measurement datum and records its position as position C; it then searches frame i+1 by matching for the image point that the instantaneous measurement datum had when frame i was taken. If the search succeeds, the best match position is recorded as position D. The offset vector is V' = D - C, and the cursor position is Q = Q' + V, where Q is the cursor position after the move, Q' is the cursor position before the move, V is the cursor position compensation, V = k*V', and k is a scale factor.
As a preferred scheme of the utility model, the image capture unit is a camera; a spatial point P(0, 0, z) on the Z axis of the camera's optical axis is chosen as the instantaneous measurement datum of the camera's three-dimensional motion, and the image point of the instantaneous measurement datum lies at the center of the image plane.
As a preferred scheme of the utility model, when the image capture unit takes frame i, the coordinate frame of the image capture unit is taken as the reference coordinate system $o_0$; the point $P = [0\ 0\ z]^T$ on the z axis of this coordinate system is taken as the instantaneous measurement datum, and the image of the datum P lies at the center $p_0(0, 0)$ of the image plane.
After the image capture unit undergoes a motion $(R, t)$ it forms a new coordinate system $o_1$ and takes frame i+1, and the image of the datum P moves to $p_1(u, v)$. From the linear camera imaging model,

$$
s\begin{bmatrix}u\\v\\1\end{bmatrix}
=\begin{bmatrix}A&0\end{bmatrix}\begin{bmatrix}R&t\\0&1\end{bmatrix}\begin{bmatrix}X\\1\end{bmatrix}
=A\,(RX+t)
=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}\left(\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}\begin{bmatrix}0\\0\\z\end{bmatrix}+\begin{bmatrix}t_1\\t_2\\t_3\end{bmatrix}\right),
$$

that is,

$$
s\begin{bmatrix}u/f\\v/f\\1\end{bmatrix}=\begin{bmatrix}-\beta z+t_1\\\alpha z+t_2\\z+t_3\end{bmatrix},
\qquad
\frac{u}{f}=\frac{-\beta z+t_1}{z+t_3},
\qquad
\frac{v}{f}=\frac{\alpha z+t_2}{z+t_3},
$$

where $A=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}$ is the camera intrinsic parameter matrix, $R=\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}$ is the camera rotation matrix, $t=[t_1\ t_2\ t_3]^T$ is the camera translation vector, $X$ is the coordinate of the instantaneous measurement datum in coordinate system $o_0$, $[u\ v]^T$ is the coordinate of the image point in the image coordinate system, and $s$ is a scale constant.
As a preferred scheme of the utility model, the input control unit comprises a camera device and at least one control button, and operating functions are assigned to the control button.
As a preferred scheme of the utility model, the input control unit comprises a pen-shaped housing, with a camera device arranged at the head of the housing and at least three buttons arranged on the side of the housing, whose functions are set respectively to the left button, the right button and the power switch of a common mouse.
As a preferred scheme of the utility model, the input control unit comprises a finger ring convenient to wear on a finger, a camera device arranged on the finger ring, and a button unit arranged separately from the finger ring. The button unit comprises at least three buttons whose functions are set respectively to the left button, the right button and the power switch of a common mouse. The button unit is arranged on a keyboard or is an independent part separate from other mechanisms.
The beneficial effects of the utility model are as follows: with the cursor positioning system disclosed by the utility model, the user does not need a smooth desktop as a support and can use the system freely anywhere in space; at the same time, the utility model can photograph objects arbitrarily and does not need to photograph only a preset reference object.
Brief description of the drawings
Fig. 1 is a schematic diagram of the composition of the cursor positioning system in embodiment one.
Fig. 2 is a flow chart of the positioning method of the cursor positioning system in embodiment one.
Fig. 3 is a schematic diagram of the working principle of the utility model.
Fig. 4-1 and Fig. 4-2 are structural schematic diagrams of the input control unit in embodiment four.
Fig. 5-1 and Fig. 5-2 are structural schematic diagrams of the input control unit in embodiment five.
Fig. 6 is a schematic diagram of the composition of the cursor positioning system in embodiment six.
Description of reference numerals:
10: cursor positioning system; 11: image capture unit; 111: housing; 112: camera device; 113: button; 114: button; 115: button; 116: camera device; 117: finger ring; 12: image processing and computation unit; 13: data transmitting unit; 14: data receiving unit; 15: processing unit; 20: electronic product.
Detailed description of the embodiments
Preferred embodiments of the utility model are described in detail below with reference to the accompanying drawings.
Embodiment one
[Working principle]
The utility model drives the cursor through the motion of a camera. Before describing the utility model in detail, its working principle is explained first, that is, why the motion of the cursor is a reflection of the motion of the camera.
According to imaging theory, there is a definite relation between the motion of a camera and the motion of the image of a target object. If only a target object lying on the camera's optical axis is considered, the relation between the motion of its image and the motion of the camera becomes much simpler. Taking the imaging principle as the theoretical basis, the relations among the motion of the camera, the motion of the image of a target object on the camera's optical axis, and the motion of the cursor are studied here to control the motion of the cursor.
1. The relation between the motion of the image of a target object on the camera's optical axis and the motion of the camera.
The linear camera imaging model is

$$
s\begin{bmatrix}u\\v\\1\end{bmatrix}
=\begin{bmatrix}A&0\end{bmatrix}\begin{bmatrix}R&t\\0&1\end{bmatrix}\begin{bmatrix}X\\1\end{bmatrix}
=A\,(RX+t),
$$

where $A=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}$ is the camera intrinsic parameter matrix, $R=\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}$ is the camera rotation matrix, $t=[t_1\ t_2\ t_3]^T$ is the camera translation vector, $X$ is the coordinate of the instantaneous measurement datum in coordinate system $o_0$, $[u\ v]^T$ is the coordinate of the image point in the image coordinate system, and $s$ is a scale constant.
As shown in Fig. 3, suppose the camera coordinate system when the camera takes frame i is the world coordinate system $o_0$ (i is an integer, and the value of i increases by 1 with each shot). Take the point $P=[0\ 0\ z]^T$ on the z axis of this coordinate system as the target point under investigation; at this moment the image of P lies at the center $p_0(0,0)$ of the image plane. After the camera undergoes a motion $(R, t)$ it forms a new camera coordinate system $o_1$ and takes frame i+1, and the image of P moves to $p_1(u, v)$:

$$
s\begin{bmatrix}u\\v\\1\end{bmatrix}
=A\,(RX+t)
=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}\left(\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}\begin{bmatrix}0\\0\\z\end{bmatrix}+\begin{bmatrix}t_1\\t_2\\t_3\end{bmatrix}\right),
$$

that is,

$$
s\begin{bmatrix}u/f\\v/f\\1\end{bmatrix}=\begin{bmatrix}-\beta z+t_1\\\alpha z+t_2\\z+t_3\end{bmatrix},
\qquad
\frac{u}{f}=\frac{-\beta z+t_1}{z+t_3},
\qquad
\frac{v}{f}=\frac{\alpha z+t_2}{z+t_3}.
$$

The formulas above show that the motion $(R, t)$ which the camera undergoes between frame i and frame i+1 causes the motion $[u\ v]$ of the image of the point P(0, 0, z), and that this image motion $[u\ v]$ depends on the camera motion $(R, t)$ and on the z coordinate of P(0, 0, z).
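The two scalar relations above can be checked numerically. As an illustration only (the patent contains no program code, and the function and parameter names below are assumptions), a short Python sketch that evaluates the linearized model for the on-axis datum point is:

```python
def image_point_after_motion(alpha, beta, t, z, f):
    """Evaluate u/f = (-beta*z + t1)/(z + t3) and v/f = (alpha*z + t2)/(z + t3).

    alpha, beta: small rotation angles of the camera about its x and y axes
                 (a rotation gamma about the optical axis leaves the on-axis
                 point fixed, so it does not appear in the scalar equations)
    t:           camera translation (t1, t2, t3)
    z:           depth of the datum point P = (0, 0, z) along the optical axis
    f:           focal length expressed in pixels
    Returns the image-point position (u, v); before the motion it is (0, 0).
    """
    t1, t2, t3 = t
    u = f * (-beta * z + t1) / (z + t3)
    v = f * (alpha * z + t2) / (z + t3)
    return u, v

# A small pan of the camera about its y axis shifts the image point horizontally:
print(image_point_after_motion(alpha=0.0, beta=0.01, t=(0.0, 0.0, 0.0), z=2000.0, f=500.0))
# -> (-5.0, 0.0)
```

As the sketch also shows, a pure translation along the optical axis (t1 = t2 = 0) leaves the on-axis image point at the center, which is consistent with the cursor being driven only by the two-dimensional image motion.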
2. Motion control of the cursor.
In the image processing of this air-mouse scheme, the point P(0, 0, z) is found in two adjacent frames by matching. The point P(0, 0, z) is therefore a statistic; it does not correspond directly to any particular point on a physical object. Because P(0, 0, z) is obtained statistically within an image window, its position is determined by all of the objects inside that window. One difference from a point with direct physical meaning is that the z coordinate of P(0, 0, z) is a slowly varying quantity that does not jump. Combining this with the formulas above, it can be seen that the main factor determining the motion $[u\ v]$ of the image of P(0, 0, z) is the motion $(R, t)$ of the camera itself, and the motion of the camera is a factor the user can fully control. In addition, because of the factor z, which the user cannot control, the image motion $[u\ v]$ is only an approximate reflection of the camera motion; if the cursor is driven with the image motion $[u\ v]$ of P(0, 0, z), the motion of the cursor is likewise an approximate reflection of the camera motion. Although it is not exact, it undoubtedly reflects the tendency of the camera motion. Driving the cursor with the motion of the camera is somewhat like writing with a brush: the trajectory of the hand obviously differs from the shape of the characters being written, yet despite this difference one can still freely produce beautiful calligraphy.
[System composition]
As shown in Fig. 1, the utility model discloses a cursor positioning system 10 comprising an image capture unit 11, an image processing and computation unit 12, a data transmitting unit 13, a data receiving unit 14, and a processing unit 15.
The image capture unit 11 photographs images and sends each captured frame to the image processing and computation unit 12. In this embodiment, the user moves the image capture unit 11 so that relative three-dimensional motion is produced between the image capture unit 11 and the photographed objects; of course, the photographed objects may be moved instead, which likewise produces relative three-dimensional motion between the image capture unit 11 and the photographed objects.
The image processing and computation unit 12 processes the relative three-dimensional motion between the image capture unit 11 and the arbitrarily photographed object into cursor position offset data and transfers the data to the data transmitting unit 13. The relative three-dimensional motion of the image capture unit 11 causes the image of the arbitrarily photographed object to move within its field of view; the arbitrarily photographed object serves as the instantaneous measurement datum of the three-dimensional motion of the image capture unit 11, and the cursor is driven according to the two-dimensional motion of the image of the arbitrarily photographed object.
The data transmitting unit 13 sends the calculated cursor position compensation to the data receiving unit 14.
The data receiving unit 14 receives the cursor position compensation sent by the data transmitting unit 13 and forwards it to the processing unit 15.
The processing unit 15 drives the cursor to be displayed at the corresponding position. In this embodiment, the processing unit 15 is part of the electronic product 20 that incorporates the cursor positioning system 10.
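The data flow among the five units can be pictured with a minimal sketch, assuming a generic frame source and a host cursor interface; none of the class or method names below come from the patent:

```python
from dataclasses import dataclass

@dataclass
class CursorOffset:
    """Cursor position compensation V = k * V' computed from one frame pair."""
    dx: float
    dy: float

class CursorPositioningPipeline:
    """Illustrative wiring of units 11-15 of the cursor positioning system 10."""

    def __init__(self, capture_unit, compute_unit, tx_unit, rx_unit, processing_unit):
        self.capture_unit = capture_unit        # image capture unit 11
        self.compute_unit = compute_unit        # image processing and computation unit 12
        self.tx_unit = tx_unit                  # data transmitting unit 13
        self.rx_unit = rx_unit                  # data receiving unit 14
        self.processing_unit = processing_unit  # processing unit 15 (inside electronic product 20)

    def step(self):
        frame = self.capture_unit.grab_frame()          # one captured frame
        offset = self.compute_unit.offset_from(frame)   # CursorOffset or None
        if offset is not None:
            self.tx_unit.send(offset)                   # e.g. over a wireless link
            received = self.rx_unit.receive()           # cursor position compensation
            self.processing_unit.move_cursor(received)  # display the cursor at the new position
```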
[Positioning method]
The utility model discloses a cursor positioning method in which the cursor is driven by the relative three-dimensional motion between an image capture unit 11 and an arbitrarily photographed object; that is, either the image capture unit 11 may move, or the objects photographed by the image capture unit 11 may move.
In this embodiment, the arbitrarily photographed object serves as the instantaneous measurement datum of the three-dimensional motion of the image capture unit 11. The three-dimensional motion of the image capture unit 11 causes the two-dimensional motion of the image point of the arbitrarily photographed object within its field of view, and the utility model drives the cursor according to this two-dimensional motion of the image point.
The image capture unit 11 may be a camera. From the working principle above, choosing a point on the Z axis of the camera's optical axis as the instantaneous measurement datum of the camera's three-dimensional motion reflects the camera's motion more accurately. Therefore, in this embodiment, the spatial point P(0, 0, z) on the Z axis of the camera's optical axis is chosen as the instantaneous measurement datum, and the image point of the instantaneous measurement datum lies at the center of the image plane.
The two-dimensional motion of the image point of the arbitrarily photographed object is obtained as follows: record the position of the image point of the instantaneous measurement datum in frame i; find the image point of the instantaneous measurement datum in frame i+1 by template matching; and calculate the two-dimensional motion of the image point of the arbitrarily photographed object from the positions of these two image points. This embodiment searches by template matching; without such matching, the working principle above shows that the z value could jump (for example from 1 to 10), which would make the image point jump and hence make the cursor jump. With template matching the z value changes slowly and can be regarded as a constant, so the cursor does not jump.
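The patent does not name a particular matching implementation; as one possible sketch, the template-matching search for the datum's image point could be done with OpenCV's normalized cross-correlation (the window size and threshold below are illustrative assumptions):

```python
import cv2

def track_datum_point(frame_i, frame_i1, center, half_size=16, threshold=0.8):
    """Search frame i+1 for the image point of the datum recorded in frame i.

    frame_i, frame_i1: grayscale frames as 2-D arrays
    center:            position C of the datum's image point in frame i
                       (for an on-axis datum this is the image-plane center)
    Returns the best-match position D in frame i+1, or None when the
    normalized correlation never exceeds the threshold.
    """
    x, y = center
    # Template: a window around the datum's image point in frame i.
    template = frame_i[y - half_size:y + half_size, x - half_size:x + half_size]
    # Normalized cross-correlation of the template against frame i+1.
    result = cv2.matchTemplate(frame_i1, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None                                   # matching failed; take a new frame i
    # max_loc is the top-left corner of the best match; return its center as D.
    return (max_loc[0] + half_size, max_loc[1] + half_size)
```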
Referring to Fig. 2, the cursor positioning method of this embodiment comprises the following steps:
(1) The user moves an image capture unit; the image capture unit photographs images and sends the captured frame i to an image processing and computation unit. The value of i increases by 1 with each shot.
When the image capture unit takes frame i, the coordinate frame of the image capture unit is taken as the reference coordinate system $o_0$; the point $P=[0\ 0\ z]^T$ on the z axis of this coordinate system is taken as the instantaneous measurement datum, and the image of the datum P lies at the center $p_0(0, 0)$ of the image plane.
(2) The image processing and computation unit obtains the image point of the instantaneous measurement datum and records its position as position C.
(3) After relative motion of the image capture unit, the image processing and computation unit obtains frame i+1 from the image capture unit.
After the image capture unit undergoes a motion $(R, t)$ it forms a new coordinate system $o_1$ and takes frame i+1, and the image of the instantaneous measurement datum P moves to $p_1(u, v)$. From the linear camera imaging model,

$$
s\begin{bmatrix}u\\v\\1\end{bmatrix}
=\begin{bmatrix}A&0\end{bmatrix}\begin{bmatrix}R&t\\0&1\end{bmatrix}\begin{bmatrix}X\\1\end{bmatrix}
=A\,(RX+t)
=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}\left(\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}\begin{bmatrix}0\\0\\z\end{bmatrix}+\begin{bmatrix}t_1\\t_2\\t_3\end{bmatrix}\right),
$$

that is,

$$
s\begin{bmatrix}u/f\\v/f\\1\end{bmatrix}=\begin{bmatrix}-\beta z+t_1\\\alpha z+t_2\\z+t_3\end{bmatrix},
\qquad
\frac{u}{f}=\frac{-\beta z+t_1}{z+t_3},
\qquad
\frac{v}{f}=\frac{\alpha z+t_2}{z+t_3},
$$

where $A=\begin{bmatrix}f&0&0\\0&f&0\\0&0&1\end{bmatrix}$ is the camera intrinsic parameter matrix, $R=\begin{bmatrix}1&\gamma&-\beta\\-\gamma&1&\alpha\\\beta&-\alpha&1\end{bmatrix}$ is the camera rotation matrix, $t=[t_1\ t_2\ t_3]^T$ is the camera translation vector, $X$ is the coordinate of the instantaneous measurement datum in coordinate system $o_0$, $[u\ v]^T$ is the coordinate of the image point in the image coordinate system, and $s$ is a scale constant.
(4) The image processing and computation unit searches frame i+1 by matching for the image point of the instantaneous measurement datum; in this embodiment, the matching search uses template matching.
(5) If there is a template match whose correlation function value is greater than the set threshold, the match is considered successful and the method proceeds to step (6); otherwise it returns to step (1).
(6) The position with the maximum correlation function value is taken as the best match position, and this best match position is recorded as position D.
(7) Calculate the position offset vector V' = D - C.
(8) Calculate the cursor position Q = Q' + V, where Q is the cursor position after the move, Q' is the cursor position before the move, V is the cursor position compensation, V = k*V', and k is a scale factor.
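Steps (1) to (8) can be strung together in a loop; the sketch below is an illustration only, and the helper names, restart policy and cursor interface are assumptions rather than taken from the patent:

```python
def image_plane_center(frame):
    """Position C of the on-axis datum's image point: the center of the frame."""
    h, w = frame.shape[:2]
    return (w // 2, h // 2)

def run_cursor_loop(capture, track, move_cursor, k=1.0):
    """capture():           returns the next frame from the image capture unit
    track(f_i, f_i1, c): returns the best-match position D in frame i+1, or None
    move_cursor(dx, dy): asks the host to offset the cursor
    k:                   scale factor between image motion and cursor motion
    """
    frame_i = capture()                                # step (1): frame i
    C = image_plane_center(frame_i)                    # step (2): position C
    while True:
        frame_i1 = capture()                           # step (3): frame i+1 after relative motion
        D = track(frame_i, frame_i1, C)                # steps (4)-(6): template matching, best match D
        if D is None:                                  # step (5): correlation below threshold,
            frame_i = frame_i1                         #           restart from step (1)
            continue
        v_prime = (D[0] - C[0], D[1] - C[1])           # step (7): offset vector V' = D - C
        move_cursor(k * v_prime[0], k * v_prime[1])    # step (8): Q = Q' + k * V'
        frame_i = frame_i1                             # the latest frame becomes frame i
```

A function like the earlier template-matching sketch could be passed as track, and move_cursor would wrap whatever cursor interface the processing unit 15 exposes on the electronic product 20.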
With the cursor positioning system of the utility model and its positioning method, the cursor is driven by the relative motion between the camera device and the objects it photographs, so the user does not need a smooth desktop as a support and can use the system freely anywhere in space.
Embodiment two
This embodiment differs from embodiment one in that a point near the Z axis of the camera's optical axis is chosen as the instantaneous measurement datum of the camera's three-dimensional motion; with this scheme the motion of the camera can still be reflected fairly accurately.
In addition, an arbitrary point within the camera's field of view may also be chosen as the instantaneous measurement datum of the camera's three-dimensional motion.
Embodiment three
This embodiment differs from embodiments one and two in that the photographed objects are moved, so that relative three-dimensional motion is produced between the image capture unit and the arbitrarily photographed objects.
In this case the image capture unit is stationary and all of the arbitrarily photographed objects within its field of view undergo a rigid motion as a whole; this overall rigid motion causes an overall motion of the image points of the photographed objects within the field of view of the image capture unit. The image capture unit serves as the instantaneous measurement datum of this rigid motion, and the cursor is driven according to the two-dimensional motion of the images of the arbitrarily photographed objects.
Embodiment four
Referring to Fig. 4-1 and Fig. 4-2, in this embodiment the input control unit comprises a pen-shaped housing 111, a camera device 112 arranged at the head of the housing 111, and three buttons 113, 114, 115 arranged on the side of the housing 111; the functions of the buttons 113, 114, 115 are set respectively to the left button, the right button and the power switch of a common mouse.
In addition, buttons may be added or removed (for example, the middle-button function of a common mouse may be added), and the functions of the above buttons may also be changed (for example, the power switch may be set to act as the middle button of a common mouse).
Embodiment five
Referring to Fig. 5-1 and Fig. 5-2, in this embodiment the input control unit comprises a finger ring 117 convenient to wear on a finger, a camera device 116 arranged on the finger ring 117, and a button unit (not shown) arranged separately from the finger ring 117. The finger ring 117 can be worn on the user's finger like a ring or a band, and it may be closed or have an opening. The button unit comprises at least three buttons whose functions are set respectively to the left button, the right button and the power switch of a common mouse. The button unit is arranged on a keyboard (for example, the left and right buttons are simulated by key combinations on the keyboard), or the button unit is an independent part separate from other mechanisms.
In addition, buttons may be added or removed (for example, the middle-button function of a common mouse may be added), and the functions of the above buttons may also be changed (for example, the power switch may be set to act as the middle button of a common mouse).
Embodiment six
Referring to Fig. 6, in this embodiment the input control unit contains no buttons and consists only of the image capture unit. The image capture unit 11, the image processing and computation unit 12, the data transmitting unit 13, the data receiving unit 14 and the processing unit 15 handle image information only, and thereby control the motion of the cursor (for the control principle see embodiment one). The left and right mouse buttons are simulated by key combinations on the keyboard of the electronic product 20.
The description and application of the utility model given here are illustrative and are not intended to limit the scope of the utility model to the embodiments described above. Variations and modifications of the embodiments disclosed here are possible, and replacements and equivalents of the various components of the embodiments are known to those of ordinary skill in the art. Those skilled in the art should note that, without departing from the spirit or essential characteristics of the utility model, the utility model may be realized in other forms, structures, arrangements and proportions and with other elements, materials and components. Other variations and modifications of the embodiments disclosed here may be made without departing from the scope and spirit of the utility model.

Claims (5)

1. A cursor positioning system, characterized in that it comprises:
an image capture unit, which photographs images and sends each captured frame to an image processing and computation unit;
the image processing and computation unit, which processes the relative three-dimensional motion between the image capture unit and an arbitrarily photographed object into cursor position offset data and transfers the data to a data transmitting unit, wherein the three-dimensional motion of the image capture unit causes the image of the arbitrarily photographed object to move within its field of view and the cursor is driven according to the two-dimensional motion of the image of the arbitrarily photographed object;
the data transmitting unit, which sends the calculated cursor position compensation to a data receiving unit;
the data receiving unit, which receives the cursor position compensation sent by the data transmitting unit and forwards it to a processing unit; and
the processing unit, which drives the cursor to be displayed at the corresponding position.
2. The cursor positioning system according to claim 1, characterized in that:
the system comprises an input control unit, and the input control unit comprises the image capture unit;
the input control unit comprises a camera device and at least one control button;
operating functions are assigned to the control button.
3. The cursor positioning system according to claim 2, characterized in that:
the input control unit comprises a pen-shaped housing;
a camera device is arranged at the head of the housing;
at least three buttons are arranged on the side of the housing, and their functions are set respectively to the left button, the right button and the power switch of a common mouse.
4. The cursor positioning system according to claim 2, characterized in that:
the input control unit comprises a finger ring convenient to wear on a finger, a camera device arranged on the finger ring, and a button unit arranged separately from the finger ring;
the button unit comprises at least three buttons, and their functions are set respectively to the left button, the right button and the power switch of a common mouse.
5. The cursor positioning system according to claim 4, characterized in that:
the button unit is arranged on a keyboard, or is an independent part separate from other mechanisms.
CN2009200675548U 2009-02-06 2009-02-06 Cursor locating system Expired - Fee Related CN201465045U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009200675548U CN201465045U (en) 2009-02-06 2009-02-06 Cursor locating system

Publications (1)

Publication Number Publication Date
CN201465045U true CN201465045U (en) 2010-05-12

Family

ID=42392393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009200675548U Expired - Fee Related CN201465045U (en) 2009-02-06 2009-02-06 Cursor locating system

Country Status (1)

Country Link
CN (1) CN201465045U (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841679A (en) * 2012-05-14 2012-12-26 乐金电子研发中心(上海)有限公司 Non-contact man-machine interaction method and device
CN102841679B (en) * 2012-05-14 2015-02-04 乐金电子研发中心(上海)有限公司 Non-contact man-machine interaction method and device
CN103067653A (en) * 2013-01-04 2013-04-24 苏州云都网络技术有限公司 Mobile digital camera device and operation method thereof
CN104223616A (en) * 2014-09-22 2014-12-24 重庆邮电大学 Information input method, information input device, processing chip and information input finger ring
CN104223616B (en) * 2014-09-22 2015-12-30 重庆邮电大学 A kind of data inputting method, device, process chip and information input finger ring
CN105223396A (en) * 2015-10-08 2016-01-06 中国电子科技集团公司第四十一研究所 The device and method of waveform measurement cursor display is realized based on FPGA
CN105223396B (en) * 2015-10-08 2017-12-01 中国电子科技集团公司第四十一研究所 The device and method that waveform measurement cursor shows is realized based on FPGA

Similar Documents

Publication Publication Date Title
Zhang et al. Visual panel: virtual mouse, keyboard and 3D controller with an ordinary piece of paper
Büschel et al. Miria: A mixed reality toolkit for the in-situ visualization and analysis of spatio-temporal interaction data
US11182962B2 (en) Method and system for object segmentation in a mixed reality environment
KR102182607B1 (en) How to determine hand-off for virtual controllers
Balakrishnan et al. The Rockin'Mouse: integral 3D manipulation on a plane
Mohr et al. Retargeting video tutorials showing tools with surface contact to augmented reality
CN101482782A (en) Cursor positioning system and method
Chun et al. Real-time hand interaction for augmented reality on mobile phones
US20110012830A1 (en) Stereo image interaction system
CN103270423B (en) For estimated rigid motion to be used to respond the system and method to determine object information
CN106055090A (en) Virtual reality and augmented reality control with mobile devices
US20130318479A1 (en) Stereoscopic user interface, view, and object manipulation
CN201465045U (en) Cursor locating system
CN103858074A (en) System and method for interfacing with a device via a 3d display
CN104281397A (en) Refocusing method and device for multiple depth sections and electronic device
Romat et al. Flashpen: A high-fidelity and high-precision multi-surface pen for virtual reality
CN109871117A (en) Information processing unit, display device and information processing system
Ye et al. 3D curve creation on and around physical objects with mobile AR
Lüthi et al. DeltaPen: A device with integrated high-precision translation and rotation sensing on passive surfaces
Malik An exploration of multi-finger interaction on multi-touch surfaces
EP3627288A1 (en) Camera module and system using the same
Yoo et al. 3D remote interface for smart displays
CN103869941B (en) Have electronic installation and the instant bearing calibration of virtual touch-control of virtual touch-control service
Sato et al. Video-based tracking of user's motion for augmented desk interface
Huang et al. Three-dimensional virtual touch display system for multi-user applications

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100512

Termination date: 20120206