CN105184268B - Gesture recognition device, gesture recognition method and virtual reality system - Google Patents
Gesture recognition device, gesture recognition method and virtual reality system
- Publication number
- CN105184268B (application CN201510586276.7A)
- Authority
- CN
- China
- Prior art keywords
- hand
- imaging device
- palm
- orientation information
- centre
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides a gesture recognition device, a gesture recognition method, and a virtual reality system. The gesture recognition device includes: an imaging device for capturing hand images; a first inertial sensor, arranged to be fixed relative to the imaging device, for sensing lens orientation information of the imaging device; a second inertial sensor, arranged on the hand, for sensing palm orientation information of the hand; and a processor, connected to the first inertial sensor and the second inertial sensor, for determining the relative direction of the palm with respect to the imaging device according to the lens orientation information and the palm orientation information. The gesture recognition method comprises: determining the lens orientation information of the imaging device; determining the palm orientation information of the hand; and determining, according to the lens orientation information and the palm orientation information, the relative direction of the palm with respect to the imaging device. The virtual reality system includes the gesture recognition device and a head-mounted device. Gesture recognition can thereby be made more accurate.
Description
Technical field
The present invention relates to a gesture recognition device, a gesture recognition method, and a virtual reality system.
Background technique
With the rapid development of virtual reality, human-computer interaction, and image recognition technologies, demand across industries for accurate gesture recognition keeps growing. Gesture recognition can be applied in many areas, such as smart home control, in-vehicle controls, PC and mobile terminal operation, and industrial design, and its commercial value is increasing day by day.
Gesture recognition methods are varied, but can be roughly divided into three kinds of technology: image recognition based on optics, motion capture based on inertial sensors, and hand-configuration simulation based on mechanical structures. Each of the three has its own strengths and weaknesses, and their advantages are not easily combined.
Techniques that use light to reconstruct hand images and perform gesture recognition generally collect depth information of the hand's form. Their shortcomings are: judging the front and back of the hand is difficult and prone to error; they are strongly affected by light, including outdoor visible light, the laser emitted by a laser camera itself, and interference from the capture camera's own recognition; and the recognition range is limited by the optical path, with low tolerance of obstacles and erroneous decisions when the two hands overlap.
Motion capture with inertial sensors has the advantages of strong portability, easy operation, immunity to occlusion, high capture precision, and high sampling rate. It uses high-speed integrated chips and modules, is small in size, light in weight, and offers relatively good value. In the field of gesture recognition, however, it has some obvious defects: the fine motion of the hand joints cannot be recovered; absolute spatial positioning is not possible — spatial position is obtained by integrating the posture information of each limb segment, which causes varying degrees of integrator drift, so hand motion cannot be precisely located; and precision is easily degraded by ferromagnetic influences in the surrounding environment.
Summary of the invention
The problem to be solved by the invention is to provide a gesture recognition device, a gesture recognition method, and a virtual reality system that make gesture recognition more accurate.
According to an aspect of the invention, there is provided a gesture recognition device, comprising: an imaging device for capturing hand images; a first inertial sensor, arranged to be fixed relative to the imaging device, for sensing the lens orientation information of the imaging device; a second inertial sensor, suitable for being arranged on the hand, for sensing the palm orientation information of the hand; and a processor, connected to the first inertial sensor and the second inertial sensor, for determining the relative direction of the palm with respect to the imaging device according to the lens orientation information and the palm orientation information.
Preferably, the processor may also be used to obtain hand shape information from the hand images and, combining the hand shape information with the relative direction relationship, determine whether the hand is the left hand or the right hand.
Preferably, the imaging device may include a depth camera based on the time-of-flight (TOF) principle.
Preferably, the first inertial sensor may be a nine-axis sensor.
Preferably, the second inertial sensor may be a nine-axis sensor.
According to an aspect of the invention, there is provided a virtual reality system, comprising: the gesture recognition device; and a head-mounted device.
Preferably, the imaging device may be arranged in the head-mounted device, arranged independently, or mounted on other equipment.
Preferably, the virtual reality system may also include a backpack device suitable for being carried by the user. The processor may be arranged in the backpack device. The imaging device, the first inertial sensor, and the second inertial sensor may communicate with the processor in a wired or wireless manner.
According to an aspect of the invention, there is provided a gesture recognition method, comprising: determining the lens orientation information of the imaging device; determining the palm orientation information of the hand; and, according to the lens orientation information and the palm orientation information, determining the relative direction of the palm with respect to the imaging device.
Preferably, the lens orientation information and the palm orientation information may be represented as vectors in a spatial coordinate system, and the step of determining the relative direction of the palm with respect to the imaging device may include: computing the inner product of the lens orientation vector and the palm orientation vector; if the inner product is greater than 0, determining that the palm faces away from the imaging device; and if the inner product is less than 0, determining that the palm faces the imaging device.
Preferably, the hand images may be hand depth images.
Preferably, the gesture recognition method may also include: obtaining the hand images captured by the imaging device; obtaining hand shape information from the hand images; and, combining the hand shape information with the relative direction relationship, determining whether the hand is the left hand or the right hand.
By using the gesture recognition device, gesture recognition method, and virtual reality system according to the present invention, the disadvantages of optical recognition or inertial motion capture used alone can be compensated for, making gesture recognition more accurate.
Detailed description of the invention
Illustrative embodiments of the disclosure are described in more detail in conjunction with the accompanying drawings, whereby the above and other objects, features, and advantages of the disclosure will become apparent. In the illustrative embodiments, identical reference labels typically denote the same parts.
Fig. 1 is a schematic diagram of a virtual reality system according to the present invention.
Fig. 2 is a flowchart of one embodiment of the method by which the gesture recognition device identifies the relative direction of the palm.
Fig. 3 is a flowchart of the sub-steps of one embodiment of determining the relative direction of the palm with respect to the imaging device in the gesture recognition method according to the present invention.
Fig. 4 is a flowchart of one embodiment of the method by which the gesture recognition device identifies hand posture information.
Specific embodiment
Preferred embodiments of the disclosure are described more fully below with reference to the accompanying drawings. Although the drawings show preferred embodiments of the disclosure, it should be appreciated that the disclosure may be realized in various forms and should not be limited by the embodiments illustrated here. Rather, these embodiments are provided so that the disclosure is thorough and complete and its scope is fully conveyed to those skilled in the art.
Fig. 1 schematically shows a virtual reality system according to the present invention.
As shown in Fig. 1, the virtual reality system includes a head-mounted device 40 and a gesture recognition device according to one aspect of the present invention.
The gesture recognition device according to the present invention includes an imaging device 15 and a first inertial sensor 10 arranged in the head-mounted device 40. The imaging device 15 and the first inertial sensor 10 may both be fixed to the head-mounted device 40, so that the first inertial sensor 10 is fixed relative to the imaging device 15. The gesture recognition device further includes a second inertial sensor 20 and a processor 30.
The processor 30 can be set at an arbitrary position. In one embodiment, the virtual reality system may also include a backpack device (not shown) suitable for being carried by the user, and the processor 30 may be arranged in it. The imaging device 15, the first inertial sensor 10, and the second inertial sensor 20 may communicate with the processor 30 in a wired or wireless manner.
The imaging device 15 is used to capture images of the hand.
The imaging device 15 may use a depth camera based on the time-of-flight (TOF) principle, which captures hand depth images and is thus capable of providing hand depth information.
The first inertial sensor 10 is arranged to be fixed relative to the imaging device 15, so that by sensing its own directional information it senses the lens orientation information of the imaging device 15.
The first inertial sensor may be a nine-axis sensor.
The second inertial sensor 20 can be arranged on the hand — for example at the palm, the back of the hand, the wrist, or any other position that rotates together with the palm — so that by sensing its own directional information it can sense the palm orientation information of the hand.
The second inertial sensor may also be a nine-axis sensor.
The first inertial sensor 10 and the second inertial sensor 20 can determine their respective coordinate orientation information without depending on other positioning systems, and can thus provide the lens orientation information and the palm orientation information, respectively.
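The patent does not specify in what form the nine-axis sensors report their orientation. As a hedged sketch, assuming each sensor fuses its readings into a unit quaternion expressed in the shared earth coordinate system (a common convention for such sensors, not stated in the source), a world-frame orientation vector — the lens normal or the palm normal — could be obtained by rotating a known body-frame normal:

```python
import numpy as np

def rotate_by_quaternion(q, v):
    """Rotate body-frame vector v into the world frame using unit quaternion
    q = (w, x, y, z) (scalar-first convention, an assumption here)."""
    w, x, y, z = q
    # Standard quaternion-to-rotation-matrix conversion.
    r = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return r @ np.asarray(v, dtype=float)

# The identity orientation leaves the body-frame lens normal unchanged.
lens_normal = rotate_by_quaternion((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

The same rotation applied to the palm sensor's quaternion and a body-frame palm normal would yield the palm orientation vector used in the inner-product test below.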
The processor 30 is connected to the first inertial sensor 10 and the second inertial sensor 20, and determines the relative direction of the palm with respect to the imaging device 15 according to the lens orientation information and the palm orientation information.
The processor 30 can also obtain hand shape information from the hand images and, combining the hand shape information with the relative direction relationship, determine whether the hand is the left hand or the right hand.
The virtual reality system can also include a backpack device (not shown) suitable for being carried by the user. The processor 30 may be arranged in the backpack device, and other devices — such as a power source and signal transceivers — may also be provided there.
The first inertial sensor 10, the imaging device 15, and the second inertial sensor 20 may communicate with the processor 30 in a wired or wireless manner.
Fig. 1 shows the gesture recognition device used in a virtual reality system, in which case the imaging device 15 is arranged in the head-mounted device. The head-mounted device may, for example, also include a head-mounted display.
In other gesture recognition scenarios — such as smart home control, in-vehicle controls, PC and mobile terminal operation, or industrial design — the gesture recognition device can also be provided separately or mounted on other equipment.
The gesture recognition device and virtual reality system according to the present invention have been described above with reference to Fig. 1. A gesture recognition method usable with the gesture recognition device is described below with reference to Figs. 2 to 4. The method can be executed in, for example, the gesture recognition device described above.
Fig. 2 is a flowchart of one embodiment of the method by which the gesture recognition device identifies the relative direction of the palm.
In step S100, the lens orientation information of the imaging device 15 is determined; it may, for example, be obtained from the first inertial sensor 10.
In step S200, the palm orientation information of the hand is determined; it may, for example, be obtained from the second inertial sensor 20.
Determining the lens orientation is described here before determining the palm orientation. It should be understood, however, that the order of steps S100 and S200 can be reversed, and that they may also be performed simultaneously.
Then, in step S300, the relative direction of the palm with respect to the imaging device 15 is determined according to the lens orientation information and the palm orientation information.
The lens orientation information can be represented as a vector, for example the normal of the lens surface. The palm orientation information can likewise be represented as a vector, for example the normal of the palm.
Both can be expressed as vectors in a spatial coordinate system. The spatial coordinate system here, sometimes called an "earth coordinate system", is usually a coordinate system fixed in the environment; for example, when the above device is used in a room, the coordinate system is fixed relative to the room.
The relative direction relationship can be determined by taking the inner product of the vectors corresponding to the lens orientation information and the palm orientation information.
An example of a method that can be used to determine the relative direction relationship in step S300 is described more fully below with reference to Fig. 3.
Fig. 3 shows a flowchart of the sub-steps of one example of determining the relative direction of the palm with respect to the imaging device in step S300.
In step S310, the inner product of the lens orientation vector and the palm orientation vector is calculated.
In step S320, it is judged whether the inner product is greater than 0.
If the inner product is greater than 0, then in step S330 it is determined that the palm faces away from the imaging device.
If the inner product is less than 0, then in step S340 it is determined that the palm faces the imaging device.
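Sub-steps S310 to S340 amount to a sign test on a dot product. A minimal sketch in Python follows; the function name, the return labels, and the handling of an inner product of exactly 0 (which the patent leaves unspecified) are all assumptions:

```python
import numpy as np

def palm_facing(lens_dir, palm_normal):
    """Classify the palm's direction relative to the camera from two
    world-frame vectors.

    lens_dir: vector along the camera's viewing direction (S100).
    palm_normal: vector out of the palm surface (S200).
    Returns 'away' when the inner product is > 0 (palm faces away from the
    camera, i.e. the back of the hand is toward it), 'toward' when it is < 0.
    """
    d = float(np.dot(lens_dir, palm_normal))  # step S310
    if d > 0:                                 # step S320/S330
        return "away"
    elif d < 0:                               # step S340
        return "toward"
    return "edge-on"  # d == 0: palm plane parallel to the view (assumption)

# Camera looks along +x; a palm normal pointing back at the camera (-x)
# means the palm faces the camera.
print(palm_facing([1, 0, 0], [-1, 0, 0]))  # prints "toward"
```

Note that the sign test needs no normalization: scaling either vector by a positive factor leaves the sign of the inner product unchanged.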
A gesture recognition method according to another embodiment of the invention is described below with reference to Fig. 4.
Fig. 4 is a flowchart of another embodiment of the method by which the gesture recognition device identifies hand posture information.
In the embodiment shown in Fig. 4, on the basis of steps S100, S200, and S300 described above with reference to Fig. 2 — which determine the relative direction of the palm with respect to the imaging device 15 — the hand images captured by the imaging device 15 are further combined to determine the hand posture. Only the steps beyond those of Fig. 2 are detailed below.
In step S400, the hand images captured by the imaging device 15 are obtained. For example, the processor 30 can acquire them directly from the imaging device 15; alternatively, the captured hand images can be stored in a memory (not shown) from which the processor 30 reads them.
Where the imaging device 15 uses a TOF depth camera, the hand images can be hand depth images, which can provide hand depth information.
In step S500, hand shape information is obtained from the hand images. The hand shape information may include, for example, the relative lengths of the fingers and their bent/extended form.
In step S600, the hand shape information determined in step S500 is combined with the relative direction relationship determined in step S300 to decide whether the hand is the left hand or the right hand.
Once the left and right hands have been distinguished, hand posture information — for example the degree and direction of finger bending — can be determined more accurately.
It should be understood that steps S100 to S300 and steps S400 to S500 shown in Fig. 4 have no fixed order when implemented; they can be executed in any order, and can also be executed concurrently.
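The patent does not detail how step S600 combines the shape information with the relative direction. One plausible sketch, under the assumption that the shape-analysis step can report which side of the hand silhouette the thumb appears on (the `thumb_side` input is hypothetical, not from the source): with the palm toward the camera, a left hand shows its thumb on the viewer's right, and flipping the hand over mirrors the image, so the rule inverts when the palm faces away.

```python
def identify_hand(thumb_side, palm_facing):
    """Infer left/right hand from image shape plus palm direction.

    thumb_side: 'left' or 'right' -- side of the hand silhouette on which
    the thumb appears in the captured image (hypothetical S500 output).
    palm_facing: 'toward' or 'away' -- result of the inner-product test (S300).
    """
    if palm_facing == "toward":
        # Palm toward the camera: a left hand's thumb is on the viewer's right.
        return "left" if thumb_side == "right" else "right"
    else:
        # Back of the hand toward the camera: the image is mirrored.
        return "left" if thumb_side == "left" else "right"

print(identify_hand("right", "toward"))  # prints "left"
```

This illustrates why the relative direction matters: the same silhouette with the thumb on one side is ambiguous between a left and a right hand until the palm's facing resolves the mirror ambiguity.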
The gesture recognition device, gesture recognition method, and virtual reality system according to the present invention have now been described in detail. Those skilled in the art will appreciate that the invention is not limited to the details described here and that appropriate modifications can be made. The protection scope of the present invention is defined by the appended claims.
Claims (10)
1. A gesture recognition device, characterized by comprising:
an imaging device for capturing hand images;
a first inertial sensor, arranged to be fixed relative to the imaging device, for sensing the lens orientation information of the imaging device;
a second inertial sensor, suitable for being arranged on the hand at a position that rotates together with the palm, for sensing the palm orientation information of the hand by sensing its own directional information; and
a processor, connected to the first inertial sensor and the second inertial sensor, for determining the relative direction of the palm with respect to the imaging device according to the lens orientation information and the palm orientation information.
2. The gesture recognition device according to claim 1, characterized in that
the processor is also used to obtain hand shape information from the hand images and, combining the hand shape information with the relative direction relationship, determine whether the hand is the left hand or the right hand.
3. The gesture recognition device according to claim 1 or 2, characterized in that
the imaging device includes a depth camera based on the time-of-flight principle.
4. The gesture recognition device according to claim 1, characterized in that
the first inertial sensor is a nine-axis sensor; and/or
the second inertial sensor is a nine-axis sensor.
5. A virtual reality system, characterized by comprising:
the gesture recognition device according to any one of claims 1 to 4; and
a head-mounted device, the imaging device being arranged in the head-mounted device.
6. The virtual reality system according to claim 5, characterized by further comprising:
a backpack device, suitable for being carried by the user, the processor being arranged in the backpack device.
7. A gesture recognition method for identifying hand posture information using the gesture recognition device according to any one of claims 1 to 4, characterized by comprising:
determining the lens orientation information of the imaging device;
determining the palm orientation information of the hand; and
determining the relative direction of the palm with respect to the imaging device according to the lens orientation information and the palm orientation information.
8. The method according to claim 7, characterized in that the lens orientation information and the palm orientation information are represented as vectors in a spatial coordinate system, and the step of determining the relative direction of the palm with respect to the imaging device includes:
computing the inner product of the lens orientation vector and the palm orientation vector;
if the inner product is greater than 0, determining that the palm faces away from the imaging device; and
if the inner product is less than 0, determining that the palm faces the imaging device.
9. The method according to claim 8, characterized by further comprising:
obtaining the hand images captured by the imaging device;
obtaining hand shape information from the hand images; and
combining the hand shape information with the relative direction relationship, determining whether the hand is the left hand or the right hand.
10. The method according to claim 9, characterized in that
the hand images are hand depth images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510586276.7A CN105184268B (en) | 2015-09-15 | 2015-09-15 | Gesture identification equipment, gesture identification method and virtual reality system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105184268A CN105184268A (en) | 2015-12-23 |
CN105184268B true CN105184268B (en) | 2019-01-25 |
Family
ID=54906335
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510586276.7A Active CN105184268B (en) | 2015-09-15 | 2015-09-15 | Gesture identification equipment, gesture identification method and virtual reality system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105184268B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106933341B (en) * | 2015-12-31 | 2024-04-26 | 北京体基科技有限公司 | Method and device for determining region of finger in image and wrist type equipment |
CN106933340B (en) * | 2015-12-31 | 2024-04-26 | 北京体基科技有限公司 | Gesture motion recognition method, control method and device and wrist type equipment |
CN105912117B (en) * | 2016-04-12 | 2019-05-07 | 北京锤子数码科技有限公司 | Motion state method for catching and system |
CN105807624A (en) * | 2016-05-03 | 2016-07-27 | 惠州Tcl移动通信有限公司 | Method for controlling intelligent home equipment through VR equipment and VR equipment |
CN112000224A (en) * | 2020-08-24 | 2020-11-27 | 北京华捷艾米科技有限公司 | Gesture interaction method and system |
US20230140030A1 (en) * | 2021-11-03 | 2023-05-04 | Htc Corporation | Method, system and recording medium for accessory pairing |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102906623A (en) * | 2010-02-28 | 2013-01-30 | 奥斯特豪特集团有限公司 | Local advertising content on an interactive head-mounted eyepiece |
CN103018905A (en) * | 2011-09-23 | 2013-04-03 | 奇想创造事业股份有限公司 | Head-mounted somatosensory manipulation display system and method thereof |
CN103324309A (en) * | 2013-06-18 | 2013-09-25 | 杭鑫鑫 | Wearable computer |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5293154B2 (en) * | 2008-12-19 | 2013-09-18 | ブラザー工業株式会社 | Head mounted display |
- 2015-09-15: CN application CN201510586276.7A filed; patent CN105184268B granted (active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105184268B (en) | Gesture identification equipment, gesture identification method and virtual reality system | |
US11100649B2 (en) | Fiducial marker patterns, their automatic detection in images, and applications thereof | |
US11402903B1 (en) | Fiducial rings in virtual reality | |
US10261595B1 (en) | High resolution tracking and response to hand gestures through three dimensions | |
US20220206588A1 (en) | Micro hand gestures for controlling virtual and graphical elements | |
US20220326781A1 (en) | Bimanual interactions between mapped hand regions for controlling virtual and graphical elements | |
US20160232715A1 (en) | Virtual reality and augmented reality control with mobile devices | |
TWI722280B (en) | Controller tracking for multiple degrees of freedom | |
US20170357332A1 (en) | Six dof mixed reality input by fusing inertial handheld controller with hand tracking | |
CN110647237A (en) | Gesture-based content sharing in an artificial reality environment | |
US20210407178A1 (en) | Generating ground truth datasets for virtual reality experiences | |
US20220099972A1 (en) | Geometry modeling of eyewear devices with flexible frames | |
WO2017043181A1 (en) | Sensor device, sensor system, and information-processing device | |
CN104756045A (en) | Wearable sensor for tracking articulated body-parts | |
US20240031678A1 (en) | Pose tracking for rolling shutter camera | |
US20230419615A1 (en) | Robotic learning of assembly tasks using augmented reality | |
JP2009258884A (en) | User interface | |
WO2014106862A2 (en) | A method and system enabling control of different digital devices using gesture or motion control | |
Grimm et al. | VR/AR input devices and tracking | |
Vu et al. | Hand pose detection in hmd environments by sensor fusion using multi-layer perceptron | |
CN115023732A (en) | Information processing apparatus, information processing method, and information processing program | |
EP4246282A1 (en) | Information processing apparatus, information processing system, and information processing method | |
Li et al. | Handheld pose tracking using vision-inertial sensors with occlusion handling | |
US20220358736A1 (en) | Mobile device tracking module within a vr simulation | |
Li | A new efficient pose estimation and tracking method for personal devices: application to interaction in smart spaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||