CN102193631A - Wearable three-dimensional gesture interaction system and using method thereof - Google Patents


Info

Publication number
CN102193631A
CN102193631A (application CN2011101146045A / CN201110114604A)
Authority
CN
China
Prior art keywords
gesture
mark
wearable
change step
colour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011101146045A
Other languages
Chinese (zh)
Inventor
凌晨
陈明
张文俊
赵凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN2011101146045A priority Critical patent/CN102193631A/en
Publication of CN102193631A publication Critical patent/CN102193631A/en
Pending legal-status Critical Current


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a wearable three-dimensional gesture interaction system and a method of using it. The system consists of a micro projector, an ordinary camera, a portable computer, and a group of simple markers. Human-computer interaction through gestures and a three-dimensional graphical interface is achieved with lightweight computer-vision computation. A projection-screen-marker mode for the three-dimensional graphical interface and a virtual-mouse gesture make the three-dimensional interface more vivid and the interaction more flexible. The system uses simple, low-cost equipment and meets the requirements of mobile devices.

Description

Wearable three-dimensional gesture interaction system and method of using the same
Technical field
The present invention relates to a wearable gesture interaction technique for three-dimensional graphical interfaces, and specifically to a wearable three-dimensional gesture interaction system, based on monocular vision and markers, in which gestures are used for human-computer interaction, together with a method of using the system.
Background technology
As people's demands for natural interaction keep rising, human-computer interaction through gestures is becoming the mainstream trend. Wearable computing is a feasible way to let users operate on virtual data anytime and anywhere. With the development of micro projectors, their becoming a standard component of mobile devices is an irreversible trend. At the same time, people are increasingly dissatisfied with the small display screens of mobile devices, so projecting the interface onto a plane to enlarge the screen has become a research focus.
Several Chinese patents address wearable computing. Application No. 200410043913.8, "Subminiature keyboard dedicated to a wearable computer", provides a keyboard that is small and structurally simple, but the interaction remains conventional key pressing and does not meet the requirement of natural interaction. Application No. 200920178274.4, "Wearable input device", and Application No. 200920204864.X, "A glove-type virtual input device", escape the hardware constraints of the conventional mouse and keyboard and virtualize the input device, but they require special hardware support and are therefore unsuited to mobile devices. Application No. 00107607.8, "Wearable wireless multimedia miniature computer with a Chinese LINUX operating system", uses natural spoken language for human-computer interaction; but people are more accustomed to interacting with their hands, so that approach is ill-suited to large-scale commercialization. Application No. 200710023167.X, "A human-computer interaction helmet for a wearable computer", proposes a helmet with eye tracking that, in cooperation with voice, enables human-computer interaction with a wearable computer; it suits applications in aviation, aerospace, machine building, electric power, the military, and other fields where both hands cannot be fully freed, but its hardware requirements are too high.
Summary of the invention
In view of the problems and shortcomings of the prior art, the object of the present invention is to provide a wearable three-dimensional gesture interaction system and a method of using it, in which the system architecture is simple, the user can interact with virtual three-dimensional objects by gesture, and the three-dimensional interface appears concrete. To achieve this object, the invention is conceived as follows. In accordance with wearable-computing requirements, the invention comprises a portable computer (1), a camera (2), a micro projector (3), and colour markers (4). So that the picture the camera captures is the picture the user sees, the camera should be close to the user's eyes, worn on the head or hung on the chest. So that the projected picture is not blocked by the hands, the micro projector should be worn on the head. To keep the geometric distortion caused by the offset between the camera and the projector from affecting later processing, and to reduce program complexity, the micro projector and camera should be as close together as possible, i.e. their distances and angles to the projection screen should be as nearly identical as possible. The micro projector and camera are connected to the portable computer, which may be a notebook computer or a smartphone. The user's fingers carry painted-on markers: in the present invention, the index finger, thumb, and little finger of the left hand and the index finger and thumb of the right hand each wear a marker of a different colour.
No special micro projector or camera is required; any model currently on the market will do. The camera and micro projector are placed on a cap worn on the head, and the portable computer is carried on the back; together these constitute the hardware of the wearable three-dimensional gesture interaction system. As mobile devices develop and micro projectors, like cameras, become standard mobile-device components, the system can be reduced to a single mobile device, meeting the requirement of full integration.
In accordance with the above inventive concept, the present invention adopts the following technical solution:
A wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a micro projector (3), and colour markers (4). It is characterized in that the camera and micro projector are placed on a cap, with their distances and angles to the projection screen kept as nearly identical as possible, and are connected to the portable computer, and in that markers are painted on the user's fingers.
A method of using the above wearable three-dimensional gesture interaction system is characterized by the following operation steps:
1) Marker recognition: the camera (2) captures the painted markers (4) on the user's fingers, and image segmentation yields their coordinates;
2) Action recognition: the motion vector of each marker and the positional relations among the markers are computed in real time and analysed to determine which action command is intended;
3) Response output: the picture is modified according to the action command and projected onto a plane by the micro projector (3), realizing human-computer interaction.
Marker recognition in step 1 applies mature colour-based image-processing techniques to segment the real image captured by the camera and obtain the coordinates of each colour marker.
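The colour segmentation just described can be sketched as follows. This is a minimal illustration, not code from the patent; the function name, the nominal marker colours, and the per-channel tolerance are all assumptions:

```python
import numpy as np

def marker_centroids(rgb, targets, tol=30):
    """Find the centroid of each coloured finger marker in an RGB frame.

    rgb     : H x W x 3 uint8 image
    targets : dict name -> (r, g, b) nominal marker colour
    tol     : per-channel tolerance for the colour match (assumed value)
    """
    out = {}
    for name, colour in targets.items():
        # Boolean mask of pixels within `tol` of the target colour on every channel.
        mask = np.all(np.abs(rgb.astype(int) - np.array(colour)) <= tol, axis=-1)
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            out[name] = None              # marker not visible in this frame
        else:
            out[name] = (xs.mean(), ys.mean())  # (column, row) centroid
    return out

# Tiny synthetic frame: a red "left index" patch on a black background.
frame = np.zeros((40, 40, 3), dtype=np.uint8)
frame[10:14, 20:24] = (255, 0, 0)
print(marker_centroids(frame, {"left_index": (255, 0, 0)}))  # → {'left_index': (21.5, 11.5)}
```

A real system would work in a more illumination-robust colour space such as HSV, but the per-marker mask-then-centroid structure is the same.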
Action recognition in step 2: the interaction mode of the present invention is gesture interaction, comprising four classes of gestures: multi-touch gestures, free gestures, mixed gestures, and the virtual mouse. Multi-touch gestures are the now-popular family of multi-touch gestures, by which the user can freely zoom images, move icons, and so on. Free gestures are gestures from daily life: for example, making an "OK" sign with the left hand confirms an operation; crossing the left and right index fingers into an "X" cancels one; holding both hands flat, left hand above and right hand below, with the left thumb touching the right index finger and the left index finger touching the right thumb, forms a "framing" gesture that saves the picture the camera currently sees, i.e. a camera function. Mixed gestures coordinate both hands for interaction: the left hand controls three-dimensional transformations, such as selecting the viewing angle, while the right hand controls planar transformations, such as moving the picture left and right. The virtual mouse is a gesture proposed by the present invention, in which the left palm is used as a mouse. The left hand is spread open with the palm facing the camera, and the four fingers other than the thumb are bent as far as possible so that their fingernails lie in a straight line. The colour markers on the left index finger and little finger are then collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is drawn downward. This square region, lying over the palm, is the control area, and it corresponds to coordinates on the graphical interface. When the right index finger moves within the control area it can touch the left palm, so the centre of the palm can be regarded as the control area. The advantage of this design is that the moving right index finger has something to bear against, which steadies the motion and achieves accurate positioning. This is described in detail with the figures in the embodiments.
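The virtual-mouse mapping above, from the square control area on the palm to interface coordinates, can be sketched as follows. This is a hypothetical illustration; the function name and coordinate conventions are assumptions, and for simplicity the two markers are taken to lie on a horizontal image line:

```python
def palm_to_screen(index_xy, pinky_xy, tip_xy, screen_w, screen_h):
    """Map the right-index fingertip inside the palm 'control area' to
    interface coordinates.

    The square control area has the left-index marker as its top-left
    vertex and the little-finger marker as its top-right vertex; its side
    equals the distance between the two markers, and the square extends
    downward over the palm.
    """
    x0, y0 = index_xy              # top-left corner of the control square
    x1, _ = pinky_xy               # top-right corner (same row, by assumption)
    side = x1 - x0                 # side length of the square
    if side <= 0:
        return None                # markers not in the expected order: no mapping
    # Normalised fingertip position inside the square, clamped to [0, 1].
    u = min(max((tip_xy[0] - x0) / side, 0.0), 1.0)
    v = min(max((tip_xy[1] - y0) / side, 0.0), 1.0)
    return (u * screen_w, v * screen_h)

# A fingertip at the centre of a 100-pixel square maps to the screen centre.
print(palm_to_screen((100, 100), (200, 100), (150, 150), 1280, 720))  # → (640.0, 360.0)
```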
Response output in step 3 projects the graphical interface of the portable computer onto a physical plane such as a wall or a desktop via the micro projector. The interface may be a conventional two-dimensional graphical interface or a three-dimensional one, and current commercial three-dimensional interfaces, such as BumpTop and Real Desktop, can be accommodated. The present invention also proposes a method of realizing a three-dimensional interface with projection-screen markers: distinct marks are drawn at the four corners of the original interface and projected onto the screen with it; these marks are called "projection-screen markers". Because the angle between the user and the normal of the viewing plane varies, the projected picture undergoes geometric distortion, and the camera then captures the actual projected image. From the four markers and their visual transformation matrix, the direction vector of the projection plane is computed; the virtual three-dimensional object is then rendered and projected onto the plane, and the user can interact with it by gesture. As the user moves, i.e. as the angle between the user and the viewing-plane normal changes, different views of the virtual object are rendered, so that the user perceives the virtual object as if it really existed. This is described in detail with the figures in the embodiments.
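For a planar screen, the visual transformation relating the four interface corners to the four projected marks seen by the camera is a homography. A minimal numpy sketch of estimating it by the direct linear transform follows; this is an assumed implementation for illustration, not the patent's own code:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping four source points to four
    destination points (direct linear transform, solved by SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector spans the null space of A: last right singular vector.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]             # normalise so the bottom-right entry is 1

def apply_h(H, pt):
    """Apply a homography to a 2D point (homogeneous multiply, then divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Interface corners (unit square) versus where the camera saw the four
# projected corner marks; H encodes how the projection plane is tilted
# relative to the camera.
corners = [(0, 0), (1, 0), (1, 1), (0, 1)]
seen = [(10, 12), (95, 20), (90, 110), (5, 100)]
H = homography(corners, seen)
print(np.allclose(apply_h(H, (1, 1)), (90, 110)))  # → True
```

From H (and the camera intrinsics) the orientation of the projection plane can be recovered, which is what the patent uses to re-render the virtual object for the user's current viewpoint.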
Compared with the prior art, the present invention has the following evident substantive features and notable advantages:
The projection-screen-marker mode of the three-dimensional graphical interface and the virtual-mouse gesture make the three-dimensional interface more vivid and the interaction more flexible. The equipment is simple and low-cost and meets the requirements of mobile devices.
Description of drawings
Fig. 1 is a schematic diagram of an embodiment of the invention;
Fig. 2 is a schematic flow chart of an embodiment of the invention;
Fig. 3 is a schematic diagram of the virtual mouse;
Fig. 4 is a schematic diagram of the projection-screen markers.
Embodiments
Preferred embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
Embodiment one: as shown in Fig. 1, the wearable three-dimensional gesture interaction system comprises a portable computer (1), a camera (2), a micro projector (3), and several colour markers (4). The system's hardware requirements are low: no special micro projector or camera is needed, so the system cost is low. So that the picture the camera captures is the picture the user sees, and so that hand motion does not affect the projection, the micro projector and camera are placed on a cap, as shown in Fig. 1.
Embodiment two: the method of using the wearable three-dimensional gesture interaction system, as shown in Fig. 2, comprises the following concrete steps:
1. marker recognition;
2. action recognition;
3. response output.
Marker recognition in step 1 captures the painted finger markers with the camera and obtains their coordinates by image segmentation; its concrete steps are:
(1) obtain a key-frame picture;
(2) segment the colour markers in the key frame, obtain the coordinates of each colour marker, and compute the relative positions of the markers as the gesture.
Action recognition in step 2 performs gesture interaction; its concrete steps are:
(3) Is it a multi-touch gesture? If so, go to step (7); otherwise go to step (4).
(4) Is it a free gesture? If so, go to step (8); otherwise go to step (5).
(5) Is it a mixed gesture? If so, go to step (9); otherwise go to step (6).
(6) Is it the virtual mouse? If so, go to step (10); otherwise return to step (1).
A special gesture in step (6) is the virtual mouse. As shown in Fig. 3, the colour markers on the left index finger and little finger are collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is drawn downward. This square region, i.e. the centre of the palm, is the control area, and it corresponds to coordinates on the graphical interface. The advantage of this design is that the moving right index finger has the palm to bear against, which steadies the motion and achieves accurate positioning. With the virtual mouse the user can enter characters on a virtual keyboard and draw precisely on the interface.
Response output in step 3 modifies the picture according to the action command and projects it onto the plane via the micro projector, realizing human-computer interaction; its concrete steps are:
(7) According to the different colour-marker gestures, perform the corresponding multi-touch operation; after the interaction, return to step (1).
(8) According to the different colour-marker gestures, perform the corresponding free-gesture operation; after the interaction, return to step (1).
(9) According to the different colour-marker gestures, perform the corresponding mixed-gesture operation; after the interaction, return to step (1).
(10) According to the different colour-marker gestures, perform the corresponding virtual-mouse operation; after the interaction, return to step (1).
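The decision chain of steps (3)-(10) amounts to a dispatch loop over the four gesture classes. A schematic sketch follows; the classifier stub and handler names are entirely hypothetical, standing in for the real marker-geometry tests:

```python
# Hypothetical classifier stub: a real system would derive the gesture class
# from the relative marker positions computed in the recognition step.
def classify(markers):
    kinds = {
        "pinch": "multitouch",       # step (3): multi-touch gestures
        "ok_sign": "free",           # step (4): free gestures
        "two_handed": "mixed",       # step (5): mixed gestures
        "palm_open": "virtual_mouse" # step (6): virtual mouse
    }
    return kinds.get(markers.get("kind"))

def interaction_step(markers, handlers):
    """One pass of steps (3)-(10): test the gesture classes in order and
    run the matching handler; fall through when none match."""
    kind = classify(markers)
    if kind in handlers:
        return handlers[kind](markers)
    return "recapture"               # back to step (1): grab the next key frame

handlers = {
    "multitouch":    lambda m: "zoom/move",
    "free":          lambda m: "confirm/cancel/snapshot",
    "mixed":         lambda m: "3d+2d transform",
    "virtual_mouse": lambda m: "pointer",
}
print(interaction_step({"kind": "palm_open"}, handlers))  # → pointer
```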
An ordinary two-dimensional graphical interface can be used in the present invention, and so can a three-dimensional one. This embodiment provides a special three-dimensional interface, the "projection-screen markers": as shown in Fig. 4, distinct marks are drawn at the four corners of the original interface and projected onto the projection screen. Because the angle between the user and the viewing-plane normal varies, the projected picture undergoes geometric distortion, and the camera then captures the actual projected image. From the four markers, the visual transformation matrix is

    M = P · T · R

where M is the transformation from the world coordinate system (X_w, Y_w, Z_w) to the camera coordinate system (X_c, Y_c, Z_c); within M, R is the rotation matrix, T is the translation matrix, and P is the perspective transformation matrix. From this transformation the direction vector of the projection plane is computed; the virtual three-dimensional object is then rendered and projected onto the plane, and the user can interact with it by gesture. In earlier marker-based augmented-reality systems the camera position was fixed and the marker position changed; with the projection-screen markers of the present invention it is the marker position, i.e. the projection screen, that is fixed, while the camera moves. As the user moves, i.e. as the angle between the user and the viewing-plane normal changes, different views of the virtual object are rendered, so that the user perceives the virtual object as if it really existed; the resulting three-dimensional interface is more vivid and concrete than before. And the user can interact with the virtual three-dimensional object by gesture.

Claims (5)

1. A wearable three-dimensional gesture interaction system, comprising a portable computer (1), a camera (2), a micro projector (3), and colour markers (4), characterized in that the camera and micro projector are placed on a cap, with their distances and angles to the projection screen kept as nearly identical as possible, and are connected to the portable computer, and in that markers are painted on the user's fingers.
2. A method of using the wearable three-dimensional gesture interaction system according to claim 1, characterized by the operation steps:
1. marker recognition: capture the painted markers (4) on the user's fingers with the camera (2) and obtain their coordinates by image segmentation;
2. action recognition: compute in real time the motion vector of each marker and the positional relations among the markers, and analyse them to determine which action command is intended;
3. response output: modify the picture according to the action command and project it onto a plane via the micro projector (3), realizing human-computer interaction.
3. The method of using the wearable three-dimensional gesture interaction system according to claim 2, characterized in that the concrete steps of marker recognition in step 1 are:
(1) obtain a key-frame picture;
(2) segment the colour markers in the key frame, obtain the coordinates of each colour marker, and compute the relative positions of the markers as the gesture.
4. The method of using the wearable three-dimensional gesture interaction system according to claim 2, characterized in that the concrete steps of action recognition in step 2 are:
(3) Is it a multi-touch gesture? If so, go to step (7); otherwise go to step (4);
(4) Is it a free gesture? If so, go to step (8); otherwise go to step (5);
(5) Is it a mixed gesture? If so, go to step (9); otherwise go to step (6);
(6) Is it the virtual mouse? If so, go to step (10); otherwise return to step (1);
A special gesture in step (6) is the virtual mouse: the colour markers on the left index finger and little finger are collinear; taking the index-finger marker as the top-left vertex and the little-finger marker as the top-right vertex, a square is drawn downward; this square region, i.e. the centre of the palm, is the control area and corresponds to coordinates on the graphical interface; the advantage of this design is that the moving right index finger has the palm to bear against, which steadies the motion and achieves accurate positioning.
5. The method of using the wearable three-dimensional gesture interaction system according to claim 2, characterized in that the concrete steps of response output in step 3 are:
(7) according to the different colour-marker gestures, perform the corresponding multi-touch operation, and after the interaction return to step (1);
(8) according to the different colour-marker gestures, perform the corresponding free-gesture operation, and after the interaction return to step (1);
(9) according to the different colour-marker gestures, perform the corresponding mixed-gesture operation, and after the interaction return to step (1);
(10) according to the different colour-marker gestures, perform the corresponding virtual-mouse operation, and after the interaction return to step (1).
CN2011101146045A 2011-05-05 2011-05-05 Wearable three-dimensional gesture interaction system and using method thereof Pending CN102193631A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011101146045A CN102193631A (en) 2011-05-05 2011-05-05 Wearable three-dimensional gesture interaction system and using method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011101146045A CN102193631A (en) 2011-05-05 2011-05-05 Wearable three-dimensional gesture interaction system and using method thereof

Publications (1)

Publication Number Publication Date
CN102193631A true CN102193631A (en) 2011-09-21

Family

ID=44601809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011101146045A Pending CN102193631A (en) 2011-05-05 2011-05-05 Wearable three-dimensional gesture interaction system and using method thereof

Country Status (1)

Country Link
CN (1) CN102193631A (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014012486A1 (en) * 2012-07-17 2014-01-23 Gao Shouqian Wearable wireless intelligent electronic device having removable and freely-combinable functional modules
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal
CN104063038A (en) * 2013-03-18 2014-09-24 联想(北京)有限公司 Information processing method and device and electronic equipment
CN104102071A (en) * 2013-04-12 2014-10-15 王旭东 Full-automatic photographing equipment
CN104345802A (en) * 2013-08-08 2015-02-11 派布勒斯有限公司 Method and device for controlling a near eye display
CN104461277A (en) * 2013-09-23 2015-03-25 Lg电子株式会社 Mobile terminal and method of controlling therefor
CN104460988A (en) * 2014-11-11 2015-03-25 陈琦 Input control method of intelligent cell phone virtual reality device
CN104660941A (en) * 2013-11-22 2015-05-27 北京弘天智达科技有限公司 Micro display communication device, system and method
CN104866103A (en) * 2015-06-01 2015-08-26 联想(北京)有限公司 Relative position determining method, wearable electronic equipment and terminal equipment
WO2015165181A1 (en) * 2014-04-28 2015-11-05 京东方科技集团股份有限公司 Method and apparatus for controlling projection of wearable device, and wearable device
CN105302337A (en) * 2013-06-04 2016-02-03 李文傑 High resolution and high sensitivity three-dimensional (3D) cursor maneuvering system, device and motion detection method
CN106293078A (en) * 2016-08-02 2017-01-04 福建数博讯信息科技有限公司 Virtual reality exchange method based on photographic head and device
CN104090465B (en) * 2014-06-17 2017-01-11 福建水立方三维数字科技有限公司 Three-dimensional interactive projection imaging method
WO2017107182A1 (en) * 2015-12-25 2017-06-29 深圳市柔宇科技有限公司 Head-mounted display device
CN107085467A (en) * 2017-03-30 2017-08-22 北京奇艺世纪科技有限公司 A kind of gesture identification method and device
CN109445599A (en) * 2018-11-13 2019-03-08 宁波视睿迪光电有限公司 Interaction pen detection method and 3D interactive system
CN109496331A (en) * 2016-05-20 2019-03-19 奇跃公司 The context aware of user interface
CN110049228A (en) * 2018-01-17 2019-07-23 北京林业大学 A kind of new method and system taken pictures based on gesture control
CN110140099A (en) * 2017-01-27 2019-08-16 高通股份有限公司 System and method for tracking control unit
CN111624770A (en) * 2015-04-15 2020-09-04 索尼互动娱乐股份有限公司 Pinch and hold gesture navigation on head mounted display
CN112416133A (en) * 2020-11-30 2021-02-26 魔珐(上海)信息科技有限公司 Hand motion capture method and device, electronic equipment and storage medium
CN112515661A (en) * 2020-11-30 2021-03-19 魔珐(上海)信息科技有限公司 Posture capturing method and device, electronic equipment and storage medium
CN114967927A (en) * 2022-05-30 2022-08-30 桂林电子科技大学 Intelligent gesture interaction method based on image processing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101077232A (en) * 2007-06-07 2007-11-28 南京航空航天大学 Human-computer interaction helmet for type computer

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101077232A (en) * 2007-06-07 2007-11-28 南京航空航天大学 Human-computer interaction helmet for type computer

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Yang et al., "A hand tracking and gesture recognition method based on wearable vision", Journal of Beijing Institute of Technology, Vol. 25, No. 12, Dec. 2005, pp. 1083-1086, claims 1-2 *
Li Yan et al., "A virtual-real collision detection method with real-time hand tracking and localization", Journal of Computer-Aided Design & Computer Graphics, Vol. 23, No. 4, Apr. 2011, pp. 713-718, claims 1-2 *

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014012486A1 (en) * 2012-07-17 2014-01-23 Gao Shouqian Wearable wireless intelligent electronic device having removable and freely-combinable functional modules
CN104063038A (en) * 2013-03-18 2014-09-24 联想(北京)有限公司 Information processing method and device and electronic equipment
CN104102071A (en) * 2013-04-12 2014-10-15 王旭东 Full-automatic photographing equipment
CN105302337A (en) * 2013-06-04 2016-02-03 李文傑 High resolution and high sensitivity three-dimensional (3D) cursor maneuvering system, device and motion detection method
CN105302337B (en) * 2013-06-04 2019-10-18 李文傑 High-resolution and three-dimensional (3D) the mouse mobile control system of high sensitive, mobile controller part and its motion detection method
CN104345802B (en) * 2013-08-08 2019-03-22 脸谱公司 For controlling the devices, systems, and methods of near-to-eye displays
CN104345802A (en) * 2013-08-08 2015-02-11 派布勒斯有限公司 Method and device for controlling a near eye display
CN104461277A (en) * 2013-09-23 2015-03-25 Lg电子株式会社 Mobile terminal and method of controlling therefor
CN104660941A (en) * 2013-11-22 2015-05-27 北京弘天智达科技有限公司 Micro display communication device, system and method
WO2015165181A1 (en) * 2014-04-28 2015-11-05 京东方科技集团股份有限公司 Method and apparatus for controlling projection of wearable device, and wearable device
US9872002B2 (en) 2014-04-28 2018-01-16 Boe Technology Group Co., Ltd. Method and device for controlling projection of wearable apparatus, and wearable apparatus
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal
CN104090465B (en) * 2014-06-17 2017-01-11 福建水立方三维数字科技有限公司 Three-dimensional interactive projection imaging method
CN104460988B (en) * 2014-11-11 2017-12-22 陈琦 A kind of input control method of smart mobile phone virtual reality device
CN104460988A (en) * 2014-11-11 2015-03-25 陈琦 Input control method of intelligent cell phone virtual reality device
CN111624770A (en) * 2015-04-15 2020-09-04 索尼互动娱乐股份有限公司 Pinch and hold gesture navigation on head mounted display
CN104866103A (en) * 2015-06-01 2015-08-26 联想(北京)有限公司 Relative position determining method, wearable electronic equipment and terminal equipment
CN104866103B (en) * 2015-06-01 2019-12-24 联想(北京)有限公司 Relative position determining method, wearable electronic device and terminal device
WO2017107182A1 (en) * 2015-12-25 2017-06-29 深圳市柔宇科技有限公司 Head-mounted display device
CN109496331B (en) * 2016-05-20 2022-06-21 奇跃公司 Context awareness for user interface menus
CN109496331A (en) * 2016-05-20 2019-03-19 奇跃公司 The context aware of user interface
CN106293078A (en) * 2016-08-02 2017-01-04 福建数博讯信息科技有限公司 Virtual reality exchange method based on photographic head and device
CN110140099B (en) * 2017-01-27 2022-03-11 高通股份有限公司 System and method for tracking controller
CN110140099A (en) * 2017-01-27 2019-08-16 高通股份有限公司 System and method for tracking control unit
US11740690B2 (en) 2017-01-27 2023-08-29 Qualcomm Incorporated Systems and methods for tracking a controller
CN107085467A (en) * 2017-03-30 2017-08-22 北京奇艺世纪科技有限公司 A kind of gesture identification method and device
CN110049228A (en) * 2018-01-17 2019-07-23 北京林业大学 A kind of new method and system taken pictures based on gesture control
CN109445599A (en) * 2018-11-13 2019-03-08 宁波视睿迪光电有限公司 Interaction pen detection method and 3D interactive system
CN112416133A (en) * 2020-11-30 2021-02-26 魔珐(上海)信息科技有限公司 Hand motion capture method and device, electronic equipment and storage medium
CN112515661A (en) * 2020-11-30 2021-03-19 魔珐(上海)信息科技有限公司 Posture capturing method and device, electronic equipment and storage medium
CN112515661B (en) * 2020-11-30 2021-09-14 魔珐(上海)信息科技有限公司 Posture capturing method and device, electronic equipment and storage medium
WO2022111525A1 (en) * 2020-11-30 2022-06-02 魔珐(上海)信息科技有限公司 Posture capturing method and apparatus, electronic device, and storage medium
CN114967927A (en) * 2022-05-30 2022-08-30 桂林电子科技大学 Intelligent gesture interaction method based on image processing
CN114967927B (en) * 2022-05-30 2024-04-16 桂林电子科技大学 Intelligent gesture interaction method based on image processing

Similar Documents

Publication Publication Date Title
CN102193631A (en) Wearable three-dimensional gesture interaction system and using method thereof
KR101844390B1 (en) Systems and techniques for user interface control
EP2987063B1 (en) Virtual tools for use with touch-sensitive surfaces
Mistry et al. WUW-wear Ur world: a wearable gestural interface
JP6165485B2 (en) AR gesture user interface system for mobile terminals
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
JP4513830B2 (en) Drawing apparatus and drawing method
CN107728792A (en) A kind of augmented reality three-dimensional drawing system and drawing practice based on gesture identification
US20190050132A1 (en) Visual cue system
WO2013010027A1 (en) Drawing aid system for multi-touch devices
Matulic et al. Phonetroller: Visual representations of fingers for precise touch input with mobile phones in vr
CN103092437A (en) Portable touch interactive system based on image processing technology
CN107179876B (en) Man-machine interaction device based on virtual reality system
Choi et al. Bare-hand-based augmented reality interface on mobile phone
CN104820584B (en) Construction method and system of 3D gesture interface for hierarchical information natural control
WO2023024536A1 (en) Drawing method and apparatus, and computer device and storage medium
CN115328304A (en) 2D-3D fused virtual reality interaction method and device
Jiang et al. A brief analysis of gesture recognition in VR
Bai et al. Poster: Markerless fingertip-based 3D interaction for handheld augmented reality in a small workspace
Zhang et al. A hybrid 2d-3d tangible interface for virtual reality
CN103558914A (en) Single-camera virtual keyboard based on geometric correction and optimization
CN113434046A (en) Three-dimensional interaction system, method, computer device and readable storage medium
Han et al. Ar pottery: Experiencing pottery making in the augmented space
Tao et al. Human-Computer Interaction Using Fingertip Based on Kinect
Schlattmann et al. Efficient bimanual symmetric 3d manipulation for bare-handed interaction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110921