CN105184268A - Gesture recognition device, gesture recognition method, and virtual reality system - Google Patents


Info

Publication number
CN105184268A
CN105184268A (application CN201510586276.7A; granted publication CN105184268B)
Authority
CN
China
Prior art keywords
hand
imaging device
orientation information
palm
centre
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510586276.7A
Other languages
Chinese (zh)
Other versions
CN105184268B (en)
Inventor
吴涛
郭成
曹萌
谢祖永
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING G-WEARABLES INFORMATION SCIENCE & TECHNOLOGY Co Ltd
Original Assignee
BEIJING G-WEARABLES INFORMATION SCIENCE & TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING G-WEARABLES INFORMATION SCIENCE & TECHNOLOGY Co Ltd filed Critical BEIJING G-WEARABLES INFORMATION SCIENCE & TECHNOLOGY Co Ltd
Priority to CN201510586276.7A priority Critical patent/CN105184268B/en
Publication of CN105184268A publication Critical patent/CN105184268A/en
Application granted granted Critical
Publication of CN105184268B publication Critical patent/CN105184268B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm

Abstract

The invention provides a gesture recognition device, a gesture recognition method, and a virtual reality system. The gesture recognition device comprises an imaging device, a first inertial sensor, a second inertial sensor, and a processor. The imaging device captures hand images; the first inertial sensor, fixed relative to the imaging device, senses the lens orientation of the imaging device; the second inertial sensor, arranged on a hand, senses the orientation of the palm center; and the processor, connected to both inertial sensors, determines the direction of the palm center relative to the imaging device from the lens orientation and the palm-center orientation. The gesture recognition method comprises: determining the orientation of the palm center of a hand; and determining the direction of the palm center relative to an imaging device from the lens orientation and the palm-center orientation. The virtual reality system comprises the gesture recognition device and a head-mounted device. Gesture recognition thereby becomes more accurate.

Description

Gesture recognition device, gesture recognition method, and virtual reality system
Technical field
The present invention relates to a gesture recognition device, a gesture recognition method, and a virtual reality system.
Background art
With the rapid development of virtual reality, human-computer interaction, and image recognition technology, the demand for accurate gesture recognition is growing across industries. It can be applied to smart home control, in-vehicle operation, PC and mobile terminal manipulation, industrial design, and many other fields, and its commercial value is increasing day by day.
Gesture recognition methods are varied, but can be roughly divided into three technologies: image recognition based on optical techniques, motion capture based on inertial sensors, and hand-configuration simulation based on mechanical structures. Each of the three has its own strengths and weaknesses, and they are not interchangeable.
Optical gesture recognition generally recovers hand images from light, collecting depth information about the hand's shape. Its drawbacks are: judging the front and back of the hand is difficult and prone to error; it is strongly affected by light, including outdoor visible light, the laser emitted by a laser camera itself, and interference from the capture camera; and the recognition range is limited by the optical path, with low tolerance for obstacles, so that overlapping hands are easily misrecognized.
Motion capture with inertial sensors offers strong portability, simple operation, immunity to occlusion, high capture precision, and high sampling rates. Built from high-speed integrated chips and modules, it is small, lightweight, and cost-effective. In the field of gesture recognition, however, it has obvious defects: it cannot recover the fine movements of the hand joints; it cannot perform absolute spatial positioning, because spatial position is obtained by integrating the attitude of each limb segment, which causes integrator drift of varying degrees, so hand movements cannot be positioned precisely; and its precision is easily degraded by ferromagnetic interference in the surrounding environment.
Summary of the invention
The problem addressed by the invention is to provide a gesture recognition device, a gesture recognition method, and a virtual reality system that make gesture recognition more accurate.
According to one aspect of the invention, a gesture recognition device is provided, comprising: an imaging device for capturing hand images; a first inertial sensor, fixed relative to the imaging device, for sensing the lens orientation of the imaging device; a second inertial sensor, suitable for being arranged on a hand, for sensing the orientation of the palm center; and a processor, connected to the first and second inertial sensors, for determining the direction of the palm center relative to the imaging device from the lens orientation and the palm-center orientation.
Preferably, the processor can also obtain hand shape information from the hand images and, combining the hand shape information with the relative direction, determine whether the hand is a left hand or a right hand.
Preferably, the imaging device can comprise a depth camera based on the time-of-flight (TOF) principle.
Preferably, the first inertial sensor can be a nine-axis sensor.
Preferably, the second inertial sensor can be a nine-axis sensor.
According to another aspect of the invention, a virtual reality system is provided, comprising the gesture recognition device and a head-mounted device.
Preferably, the imaging device can be arranged on the head-mounted device, arranged independently, or mounted on other equipment.
Preferably, the virtual reality system can further comprise a backpack device suitable for being carried by the user. The processor can be arranged in the backpack device, and the imaging device and the first and second inertial sensors can communicate with the processor by wired or wireless means.
According to another aspect of the invention, a gesture recognition method is provided, comprising: determining the lens orientation of an imaging device; determining the orientation of the palm center of a hand; and determining the direction of the palm center relative to the imaging device from the lens orientation and the palm-center orientation.
Preferably, the lens orientation and the palm-center orientation can be represented as vectors in a spatial coordinate system, and determining the direction of the palm center relative to the imaging device can comprise: computing the inner product of the lens-orientation vector and the palm-center-orientation vector; if the inner product is greater than 0, judging that the palm faces away from the imaging device; and if the inner product is less than 0, judging that the palm faces the imaging device.
Preferably, the hand image can be a hand depth image.
Preferably, the gesture recognition method can further comprise: obtaining a hand image captured by the imaging device; obtaining hand shape information from the hand image; and, combining the hand shape information with the relative direction, determining whether the hand is a left hand or a right hand.
By using the gesture recognition device, gesture recognition method, and virtual reality system according to the invention, the shortcomings of purely optical recognition or purely inertial motion capture can be compensated, making gesture recognition more accurate.
Brief description of the drawings
The above and other objects, features, and advantages of the disclosure will become more apparent from the following detailed description of exemplary embodiments taken in conjunction with the accompanying drawings, in which like reference numbers generally denote like parts.
Fig. 1 is a schematic diagram of a virtual reality system according to the invention.
Fig. 2 is a flowchart of one embodiment of a method by which the gesture recognition device identifies the direction of the palm center.
Fig. 3 is a flowchart of the sub-steps of one embodiment of determining the direction of the palm center relative to the imaging device in the gesture recognition method.
Fig. 4 is a flowchart of one embodiment of a method by which the gesture recognition device identifies hand posture information.
Detailed description of embodiments
Preferred embodiments of the disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show preferred embodiments of the disclosure, it should be understood that the disclosure can be implemented in various forms and should not be limited by the embodiments set forth here; rather, these embodiments are provided so that the disclosure will be thorough and complete and will fully convey its scope to those skilled in the art.
Fig. 1 schematically shows a virtual reality system according to the invention.
As shown in Fig. 1, the virtual reality system comprises a head-mounted device 40 and a gesture recognition device according to one aspect of the invention.
The gesture recognition device comprises an imaging device 15 and a first inertial sensor 10 arranged on the head-mounted device 40. Both can be fixedly mounted on the head-mounted device 40, so that the first inertial sensor 10 is fixed relative to the imaging device 15. The gesture recognition device further comprises a second inertial sensor 20 and a processor 30.
The processor 30 can be arranged at any position. In one embodiment, the virtual reality system may further comprise a backpack device (not shown) suitable for being carried by the user; the processor 30 may be arranged in this backpack device, and the imaging device 15, the first inertial sensor 10, and the second inertial sensor 20 may communicate with the processor 30 by wired or wireless means.
The imaging device 15 captures images of the hand.
The imaging device 15 can be a depth camera based on the time-of-flight (TOF) principle, which captures hand depth images and thereby provides hand depth information.
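Purely as an illustration of how a TOF depth image could support hand extraction (the patent does not specify this step; the function name and thresholds below are assumptions): because the hand is usually the object closest to a head-mounted camera, a simple depth-range threshold can isolate a candidate hand region.

```python
import numpy as np

def segment_hand(depth_mm, near=200, far=600):
    """Return a boolean mask of pixels in the assumed hand depth range.

    depth_mm: 2-D array of per-pixel depths in millimetres from a TOF
    depth camera. near/far: assumed depth window for the hand.
    """
    return (depth_mm >= near) & (depth_mm <= far)

# Toy frame: background at ~1.5 m, a 2x2 "hand" patch at 0.4 m.
frame = np.full((4, 4), 1500)
frame[1:3, 1:3] = 400
mask = segment_hand(frame)
print(mask.sum())  # -> 4
```

A real pipeline would follow this with connected-component filtering, but the depth window alone already shows why a TOF camera simplifies separating the hand from the background.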
The first inertial sensor 10 is fixed relative to the imaging device 15, so that by sensing its own orientation it senses the lens orientation of the imaging device 15.
The first inertial sensor can be a nine-axis sensor.
The second inertial sensor 20 can be arranged on the hand, for example at the palm center, the back of the hand, the wrist, or any other position that rotates with the palm, so that by sensing its own orientation it senses the orientation of the palm center.
The second inertial sensor can also be a nine-axis sensor.
The first inertial sensor 10 and the second inertial sensor 20 can determine their own coordinate orientations without relying on any other positioning system, and can thus provide the lens orientation and the palm-center orientation respectively.
The processor 30 is connected to the first inertial sensor 10 and the second inertial sensor 20 and determines the direction of the palm center relative to the imaging device 15 from the lens orientation and the palm-center orientation.
The processor 30 can also obtain hand shape information from the hand images and, combining it with the relative direction, determine whether the hand is a left hand or a right hand.
The virtual reality system may further comprise a backpack device (not shown) suitable for being carried by the user. The processor 30 may be arranged in the backpack device, which may also house other components such as a power supply and a signal transceiver.
The first inertial sensor 10, the imaging device 15, and the second inertial sensor 20 can communicate with the processor 30 by wired or wireless means.
Fig. 1 illustrates the gesture recognition device used in a virtual reality system, in which case the imaging device 15 is arranged on the head-mounted device. The head-mounted device can, for example, also include a head-mounted display.
In other gesture recognition scenarios, such as smart home control, in-vehicle operation, PC and mobile terminal manipulation, or industrial design, the gesture recognition device can also be arranged separately or mounted on other equipment.
The gesture recognition device and virtual reality system according to the invention have been described above with reference to Fig. 1.
A gesture recognition method usable with this device is described below with reference to Figs. 2 to 4.
The method can be performed in a gesture recognition device such as the one described above.
Fig. 2 is a flowchart of one embodiment of a method by which the gesture recognition device identifies the direction of the palm center.
In step S100, the lens orientation of the imaging device 15 is determined, for example by obtaining it from the first inertial sensor 10.
In step S200, the orientation of the palm center is determined, for example by obtaining it from the second inertial sensor 20.
Although determining the lens orientation is described first here and the palm-center orientation second, it should be understood that steps S100 and S200 can be performed in either order or simultaneously.
Then, in step S300, the direction of the palm center relative to the imaging device 15 is determined from the lens orientation and the palm-center orientation.
The lens orientation can be represented as a vector, for example the normal of the lens surface.
The palm-center orientation can likewise be represented as a vector, for example the normal of the palm-center surface.
Both vectors can be expressed in a spatial coordinate system, also called a "ground coordinate system", which is generally fixed in the environment; for example, when the device is used in a room, the coordinate system is fixed relative to the room.
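For illustration, suppose each nine-axis sensor reports its attitude in the shared ground frame as a unit quaternion (a common output format for such sensors, but an assumption here), and that the lens or palm normal is the +z axis of the sensor's local frame. The ground-frame normal vector is then obtained by rotating that local axis:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Quaternion rotation: v' = v + 2w(u x v) + 2 u x (u x v)
    return v + 2.0 * w * np.cross(u, v) + 2.0 * np.cross(u, np.cross(u, v))

local_normal = np.array([0.0, 0.0, 1.0])  # assumed sensor-frame normal

identity = np.array([1.0, 0.0, 0.0, 0.0])   # sensor aligned with ground frame
print(quat_rotate(identity, local_normal))  # unchanged: stays along +z

flip_x = np.array([0.0, 1.0, 0.0, 0.0])     # 180 degrees about the x axis
print(quat_rotate(flip_x, local_normal))    # flipped to point along -z
```

Once both normals are expressed in this common ground frame, they can be compared directly, which is what step S300 relies on.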
The relative direction can be determined by taking the inner product of the vectors corresponding to the lens orientation and the palm-center orientation.
An example of a method that can be used to determine the relative direction in step S300 is described in more detail below with reference to Fig. 3.
Fig. 3 is a flowchart of the sub-steps of one example of determining the direction of the palm center relative to the imaging device in step S300.
In step S310, the inner product of the lens-orientation vector and the palm-center-orientation vector is calculated.
In step S320, it is judged whether the inner product is greater than 0.
If the inner product is greater than 0, it is judged in step S330 that the palm faces away from the imaging device.
If the inner product is less than 0, it is judged in step S340 that the palm faces the imaging device.
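The sign test of steps S310 to S340 can be sketched as follows (the helper name and return strings are illustrative; an inner product of exactly 0 means the palm is edge-on to the camera, a boundary case the patent does not address):

```python
import numpy as np

def palm_relative_direction(lens_normal, palm_normal):
    """Steps S310-S340: classify palm direction from an inner product.

    Both vectors are expressed in the same ground coordinate system;
    the lens normal is taken to point out of the camera toward the scene.
    """
    inner = float(np.dot(lens_normal, palm_normal))       # step S310
    if inner > 0:                                         # step S320
        return "palm faces away from the imaging device"  # step S330
    if inner < 0:
        return "palm faces the imaging device"            # step S340
    return "palm edge-on to the imaging device"  # boundary, not in patent

# Camera looks along +x; the palm normal also points along +x,
# so the camera sees the back of the hand.
print(palm_relative_direction(np.array([1.0, 0.0, 0.0]),
                              np.array([1.0, 0.0, 0.0])))
```

Note that only the sign of the inner product matters here, so the vectors need not be normalized.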
A gesture recognition method according to another embodiment of the invention is described below with reference to Fig. 4.
Fig. 4 is a flowchart of another embodiment of a method by which the gesture recognition device identifies hand posture information.
In the embodiment shown in Fig. 4, on the basis of the direction of the palm center relative to the imaging device 15 determined by steps S100, S200, and S300 described above with reference to Fig. 2, the hand posture is further determined in combination with the hand images captured by the imaging device 15. Only the steps beyond those of Fig. 2 are described in detail below.
In step S400, a hand image captured by the imaging device 15 is obtained. For example, the processor 30 can obtain the hand image directly from the imaging device 15; alternatively, the image can be stored in a memory (not shown) from which the processor 30 reads it.
When the imaging device 15 is a TOF depth camera, the hand image can be a hand depth image providing hand depth information.
In step S500, hand shape information is obtained from the hand image.
The hand shape information can include, for example, relative finger lengths and finger bending or extension configurations.
In step S600, the hand shape information determined in step S500 is combined with the relative direction determined in step S300 to determine whether the hand is a left hand or a right hand.
Once the hand is known to be a left or right hand, hand posture information, such as the degree and direction of finger bending, can be determined more accurately.
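Step S600 is described only abstractly, so the concrete rule below is an assumption rather than the patent's method: when the palm faces the camera, a right hand shows its thumb on the right side of a non-mirrored image (the subject's left appears on the image's right), and the cue inverts when the camera sees the back of the hand.

```python
def classify_hand(palm_faces_camera, thumb_on_image_right):
    """Hypothetical left/right decision (not the patent's own rule).

    palm_faces_camera: result of the inner-product test (steps S310-S340).
    thumb_on_image_right: shape cue extracted from the image (step S500).
    """
    if palm_faces_camera:
        # Palm visible: a right hand's thumb points to the subject's
        # left, which is the right side of a non-mirrored image.
        return "right" if thumb_on_image_right else "left"
    # Back of the hand visible: the thumb side is mirrored.
    return "left" if thumb_on_image_right else "right"

print(classify_hand(True, True))   # -> right
```

This illustrates why the inner-product result matters: the same thumb-side cue implies the opposite hand once the palm direction flips.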
It should be understood that steps S100 to S300 and steps S400 to S500 shown in Fig. 4 have no fixed order; they can be performed in any order or in parallel.
The gesture recognition device, gesture recognition method, and virtual reality system according to the invention have thus been described in detail. Those skilled in the art will understand, however, that the invention is not limited to the details described herein and that suitable modifications can be made. The scope of protection of the invention is defined by the appended claims.

Claims (10)

1. A gesture recognition device, characterized by comprising:
an imaging device, for capturing hand images;
a first inertial sensor, fixed relative to the imaging device, for sensing the lens orientation of the imaging device;
a second inertial sensor, suitable for being arranged on a hand, for sensing the orientation of the palm center; and
a processor, connected to the first inertial sensor and the second inertial sensor, for determining the direction of the palm center relative to the imaging device from the lens orientation and the palm-center orientation.
2. The gesture recognition device according to claim 1, characterized in that
the processor is further configured to obtain hand shape information from the hand images and, combining the hand shape information with the relative direction, to determine whether the hand is a left hand or a right hand.
3. The gesture recognition device according to claim 1 or 2, characterized in that
the imaging device comprises a depth camera based on the time-of-flight principle.
4. The gesture recognition device according to claim 1, characterized in that
the first inertial sensor is a nine-axis sensor; and/or
the second inertial sensor is a nine-axis sensor.
5. A virtual reality system, characterized by comprising:
the gesture recognition device according to any one of claims 1 to 4; and
a head-mounted device, the imaging device being arranged on the head-mounted device.
6. The virtual reality system according to claim 5, characterized by further comprising:
a backpack device, suitable for being carried by a user, the processor being arranged in the backpack device.
7. A gesture recognition method for identifying hand posture information using the gesture recognition device according to any one of claims 1 to 4, characterized by comprising:
determining the lens orientation of the imaging device;
determining the orientation of the palm center; and
determining the direction of the palm center relative to the imaging device from the lens orientation and the palm-center orientation.
8. The method according to claim 7, characterized in that the lens orientation and the palm-center orientation are both represented as vectors in a spatial coordinate system, and determining the direction of the palm center relative to the imaging device comprises:
computing the inner product of the lens-orientation vector and the palm-center-orientation vector;
if the inner product is greater than 0, judging that the palm faces away from the imaging device; and
if the inner product is less than 0, judging that the palm faces the imaging device.
9. The method according to claim 8, characterized by further comprising:
obtaining a hand image captured by the imaging device;
obtaining hand shape information from the hand image; and
combining the hand shape information with the relative direction to determine whether the hand is a left hand or a right hand.
10. The method according to claim 9, characterized in that
the hand image is a hand depth image.
CN201510586276.7A 2015-09-15 2015-09-15 Gesture identification equipment, gesture identification method and virtual reality system Active CN105184268B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510586276.7A CN105184268B (en) 2015-09-15 2015-09-15 Gesture identification equipment, gesture identification method and virtual reality system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510586276.7A CN105184268B (en) 2015-09-15 2015-09-15 Gesture identification equipment, gesture identification method and virtual reality system

Publications (2)

Publication Number Publication Date
CN105184268A true CN105184268A (en) 2015-12-23
CN105184268B CN105184268B (en) 2019-01-25

Family

ID=54906335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510586276.7A Active CN105184268B (en) 2015-09-15 2015-09-15 Gesture identification equipment, gesture identification method and virtual reality system

Country Status (1)

Country Link
CN (1) CN105184268B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100156787A1 (en) * 2008-12-19 2010-06-24 Brother Kogyo Kabushiki Kaisha Head mount display
CN102906623A (en) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
CN103018905A (en) * 2011-09-23 2013-04-03 奇想创造事业股份有限公司 Head-mounted somatosensory manipulation display system and method thereof
CN103324309A (en) * 2013-06-18 2013-09-25 杭鑫鑫 Wearable computer

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017113793A1 (en) * 2015-12-31 2017-07-06 北京体基科技有限公司 Method and apparatus for determining area of finger in image, and a wrist-type device
WO2017113794A1 (en) * 2015-12-31 2017-07-06 北京体基科技有限公司 Gesture recognition method, control method and apparatus, and wrist-type device
CN106933340A (en) * 2015-12-31 2017-07-07 北京体基科技有限公司 Gesture motion recognition methods, control method and device and wrist equipment
CN106933341A (en) * 2015-12-31 2017-07-07 北京体基科技有限公司 It is a kind of to determine the method in region, device and wrist equipment residing for finger in the picture
CN106933340B (en) * 2015-12-31 2024-04-26 北京体基科技有限公司 Gesture motion recognition method, control method and device and wrist type equipment
CN106933341B (en) * 2015-12-31 2024-04-26 北京体基科技有限公司 Method and device for determining region of finger in image and wrist type equipment
CN105912117A (en) * 2016-04-12 2016-08-31 北京锤子数码科技有限公司 Motion state capture method and system
CN105912117B (en) * 2016-04-12 2019-05-07 北京锤子数码科技有限公司 Motion state method for catching and system
CN105807624A (en) * 2016-05-03 2016-07-27 惠州Tcl移动通信有限公司 Method for controlling intelligent home equipment through VR equipment and VR equipment
CN112000224A (en) * 2020-08-24 2020-11-27 北京华捷艾米科技有限公司 Gesture interaction method and system

Also Published As

Publication number Publication date
CN105184268B (en) 2019-01-25

Similar Documents

Publication Publication Date Title
US11887312B2 (en) Fiducial marker patterns, their automatic detection in images, and applications thereof
CN103134489B (en) The method of target localization is carried out based on mobile terminal
CN103471580B (en) For providing method, mobile terminal and the server of navigation information
CN107748569B (en) Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system
CN105184268A (en) Gesture recognition device, gesture recognition method, and virtual reality system
CN111091587B (en) Low-cost motion capture method based on visual markers
CN105814609A (en) Fusing device and image motion for user identification, tracking and device association
JP6943988B2 (en) Control methods, equipment and systems for movable objects
JP2016524128A (en) Locating system and locating method
CA2697060A1 (en) Method and apparatus for sending data relating to a target to a mobile device
US20210183100A1 (en) Data processing method and apparatus
CN102446048A (en) Information processing device and information processing method
JP6927937B2 (en) Systems and methods for generating 3D skeletal representations
CN110470333A (en) Scaling method and device, the storage medium and electronic device of sensor parameters
CN108600367A (en) Internet of Things system and method
CN109767470A (en) A kind of tracking system initial method and terminal device
Stranner et al. A high-precision localization device for outdoor augmented reality
CN116348916A (en) Azimuth tracking for rolling shutter camera
TWI722738B (en) Augmented reality device and positioning method
CN103903253A (en) Mobile terminal positioning method and system
JP2022108823A (en) Search support system and search support program
CN109579830A (en) The air navigation aid and navigation system of intelligent robot
KR20200134401A (en) Smart glasses operation method interworking to mobile device
Praschl et al. Enabling outdoor MR capabilities for head mounted displays: a case study
CN115480483A (en) Method, device, equipment and medium for identifying kinetic parameters of robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant