CN105867599A - Gesture control method and device - Google Patents


Info

Publication number
CN105867599A
Authority
CN
China
Prior art keywords
option
coordinate
control part
touch control
options menu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510505966.5A
Other languages
Chinese (zh)
Inventor
聂林
于燕
黄波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leshi Zhixin Electronic Technology Tianjin Co Ltd
Original Assignee
Leshi Zhixin Electronic Technology Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leshi Zhixin Electronic Technology Tianjin Co Ltd filed Critical Leshi Zhixin Electronic Technology Tianjin Co Ltd
Priority to CN201510505966.5A priority Critical patent/CN105867599A/en
Publication of CN105867599A publication Critical patent/CN105867599A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention provide a gesture control method and device. The method comprises the following steps: displaying an options menu when it is determined that a collected first image contains a set gesture; determining, when a touch control part in a collected second image falls into the options menu region, the region into which the coordinate of the touch control part falls; determining, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part; and performing a corresponding operation according to the determined option. With the scheme provided by the embodiments, options in the options menu are selected through gestures and tools such as remote controllers are not required, so user operations are simpler and more convenient, the user experience is improved, and the cost of use is reduced.

Description

Gesture control method and device
Technical field
The embodiments of the present invention relate to the field of computer technology, and in particular to a gesture control method and device.
Background technology
With the advancement of computing, computer-based simulation of environments offers people new kinds of experiences. In particular, terminal devices built with virtual reality (VR) technology and augmented reality (AR) technology (for example, VR helmets and AR glasses) are gradually being applied in different fields. Virtual reality technology is a computer simulation system that can create a virtual world and let users experience it: a computer generates a simulated environment that typically shuts the user off from the real world. Augmented reality is a new technology that integrates real-world information with virtual-world information: entity information that would otherwise be difficult to perceive within a certain time and space range of the real world is simulated and then superimposed, so that virtual information is applied to the real world. Terminal devices built with these two technologies can provide users with many services, including visual, auditory, taste and tactile experiences. How to provide users with a convenient human-computer interaction experience is therefore an important problem.
At present, existing human-computer interaction mainly relies on physical tools such as remote controllers, keyboards and mice: menu options are selected and the corresponding operations are triggered through buttons on the tool. When such tools are used to operate a menu, a user who is cut off from the real world cannot see the buttons on the tool and can only operate from experience, so misoperations may occur and the user experience is poor. In addition, each terminal device must be equipped with an operating tool, which increases cost.
Summary of the invention
The embodiments of the present invention provide a gesture control method and device, so as to solve the problems in the prior art that human-computer interaction using physical tools to operate menus leads to a poor user experience and high cost.
An embodiment of the present invention provides a gesture control method, including:
when it is determined that a collected first image contains a set gesture, displaying an options menu;
when a touch control part in a collected second image falls into the options menu region, determining the region into which the coordinate of the touch control part falls;
determining, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part;
performing a corresponding operation according to the determined option.
An embodiment of the present invention provides a gesture control device, including:
a display unit, configured to display an options menu when it is determined that a collected first image contains a set gesture;
an area determination unit, configured to determine, when a touch control part in a collected second image falls into the options menu region, the region into which the coordinate of the touch control part falls;
an option determination unit, configured to determine, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part;
a manipulation unit, configured to perform a corresponding operation according to the determined option.
The gesture control method and device provided by the embodiments of the present invention have the following beneficial effect: since options in the options menu are selected by gesture, tools such as remote controllers are not needed, so user operations are simpler, the user experience is improved, and the cost of use is reduced.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description show only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of the gesture control method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of the gesture control method in Embodiment 1 of the present invention;
Fig. 3 is a schematic structural diagram of the gesture control device in Embodiment 2 of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
An embodiment of the present invention provides a gesture control method, which, as shown in Fig. 1, includes:
Step 101: when it is determined that a collected first image contains a set gesture, display an options menu.
Step 102: when a touch control part in a collected second image falls into the options menu region, determine the region into which the coordinate of the touch control part falls.
Step 103: determine, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part.
Step 104: perform a corresponding operation according to the determined option.
In the embodiment of the present invention, the gesture control method can be applied to devices such as virtual reality helmets and augmented reality glasses, which are provided with an image acquisition device, for example a camera. The set gesture can be configured flexibly according to needs; for example, the set gesture can be that the five fingers are stretched out with the palm facing the image acquisition device. The touch control part is the part of the touch object that overlaps the options menu region when the touch object selects an option in the options menu, where the touch object can be a fingertip, a stylus or the like. The coordinate of the touch control part can be obtained by processing the touch control part with the prior-art motion-sensing controller Leap Motion or the Open Source Computer Vision Library (OpenCV).
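As an illustration only, the following Python sketch outlines the control loop described in steps 101 to 104 (together with the stop condition discussed later). The frame source and the helper functions (detect_set_gesture, show_menu, hide_menu, find_fingertip, resolve_option, execute) are hypothetical placeholders under assumed interfaces, not part of this disclosure.

```python
def control_loop(get_frame, detect_set_gesture, show_menu, hide_menu,
                 find_fingertip, resolve_option, execute):
    """Hypothetical flow for steps 101-104: show the menu on the set gesture,
    map the fingertip coordinate to an option region, execute the operation,
    and stop displaying the menu when the set gesture is no longer seen."""
    menu = None                                    # option regions while the menu is shown
    while True:
        frame = get_frame()                        # the currently collected image
        if menu is None:                           # step 101: wait for the set gesture
            if detect_set_gesture(frame):
                menu = show_menu(frame)            # display the options menu; returns its regions
            continue
        tip = find_fingertip(frame, menu)          # coordinate of the touch control part
        if tip is not None:
            option = resolve_option(menu, tip)     # steps 102-103: region -> option
            if option is not None:
                execute(option)                    # step 104: perform the operation
        elif not detect_set_gesture(frame):        # a later image without the set gesture
            hide_menu()                            # stop displaying the options menu
            menu = None
```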
In the embodiment of the present invention, the display of the virtual menu is activated by a preset gesture and operations are performed on the displayed virtual menu, so tools such as remote controllers are not needed; user operations are therefore simpler, the user experience is improved, and the cost of use is reduced.
The method and device provided by the present invention are described in detail below with reference to the accompanying drawings and specific embodiments.
Embodiment 1:
In Embodiment 1 of the present invention, a virtual reality helmet is controlled by gesture and the touch object is a fingertip. Fig. 2 is a flowchart of the gesture control method provided by Embodiment 1 of the present invention, which specifically includes the following processing steps:
Step 201: collect a first image containing the set gesture.
In the embodiment of the present invention, a camera is installed on the virtual reality (VR) helmet to collect images in front of the camera. The camera can be an ordinary USB camera or a depth camera, and can collect images at a preset period, which can be set flexibly according to practical experience and needs.
When the user wants to control the VR helmet by gesture, the user makes the set gesture in front of the camera, for example stretching out the five fingers with the palm facing the camera; the orientation of the fingers is not restricted. The purpose of making the set gesture is to activate the options menu. In practice, after the camera collects the current image of its front view, the current image is sent to the image recognition module in the central processing unit of the VR helmet, which processes the current image and determines whether it contains the set gesture. Specifically, the gesture features in the current image can be extracted and matched against the gesture features of the set gesture; if the matching succeeds, the current image is a first image containing the set gesture.
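One possible sketch of such gesture-feature matching, assuming OpenCV (4.x) and a crude skin-colour segmentation, is to count the convexity defects (the valleys between fingers) of the largest hand contour; the colour range and thresholds below are illustrative assumptions rather than values from this disclosure.

```python
import cv2
import numpy as np

def contains_set_gesture(frame_bgr):
    """Rough check for the set gesture (five fingers stretched out, palm facing
    the camera) using a skin mask and convexity defects (OpenCV 4.x assumed)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))        # crude skin-colour range
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 5000:                            # too small to be a hand
        return False
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return False
    # Count deep valleys between fingers; an open hand shows roughly four of them.
    deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] / 256.0 > 20)
    return deep >= 4
```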
Step 202: take the coordinate position of a set first reference point in the first image as the position reference point of the options menu, and determine the display area of the options menu.
In this step, after the set gesture is determined, and taking the set gesture of five stretched-out fingers as an example, a first reference point is set within the outline of the palm and fingers in the first image, and this first reference point is used as the reference point for the display position of the options menu to determine the display area of the options menu. The number of options in the options menu, the content of each option and the positions of the options relative to each other are all preset and fixed, so once this reference point is determined, the position of the options menu is determined.
For example, the thumb tip can be chosen as the first reference point. Suppose the options menu has 16 options arranged in 4 rows and 4 columns; the positions of the options relative to each other are fixed, and the option content corresponding to each position is also fixed. The overall shape of the options menu can be regular, for example a rectangle, or irregular, for example arranged along the shape of the set gesture. The coordinate position of the first reference point serves as the position reference point of the options menu; for example, the center of a rectangular options menu can be made to coincide with the coordinate position of the first reference point to determine the display area of the options menu.
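As a sketch of this layout rule, the snippet below centres a 4 x 4 rectangular menu on the first reference point (for example, the thumb tip); the cell size and the option labels are illustrative assumptions.

```python
def layout_menu(ref_x, ref_y, rows=4, cols=4, cell_w=80, cell_h=60):
    """Return a list of (label, (x0, y0, x1, y1)) option regions, in image
    coordinates, for a rows x cols menu centred on the reference point."""
    menu_w, menu_h = cols * cell_w, rows * cell_h
    left, top = ref_x - menu_w // 2, ref_y - menu_h // 2     # centre the menu on the reference point
    regions = []
    for r in range(rows):
        for c in range(cols):
            x0, y0 = left + c * cell_w, top + r * cell_h
            regions.append((f"option_{r * cols + c}", (x0, y0, x0 + cell_w, y0 + cell_h)))
    return regions
```

Because the relative positions and contents of the options are fixed in advance, fixing the single reference point fixes every option region, as stated above.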
Step 203: display each option of the options menu in the determined display area of the options menu.
Step 204: collect a second image in which the fingertip falls into the options menu region.
In the embodiment of the present invention, after the user sees the displayed options menu through the VR helmet, the user stretches out one fingertip of the other hand to point at the options menu. For the VR helmet, after the options menu is displayed, the camera sends the collected current image of its front view to the image recognition module in the central processing unit of the VR helmet, and the image recognition module processes the current image to determine whether a second image in which the fingertip falls into the options menu region has been collected. Specifically, the gesture features in the current image can be extracted; when the extracted gesture features meet the feature that four of the five fingers are closed and one finger is stretched out, the current image is determined to be a second image in which the fingertip falls into the options menu region.
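A minimal sketch of this pointing-pose check, again assuming OpenCV and the same crude skin mask, takes the topmost point of the largest hand contour as the fingertip and tests whether it lies inside the menu rectangle; both heuristics are illustrative assumptions, not part of this disclosure.

```python
import cv2
import numpy as np

def find_fingertip(frame_bgr, menu_rect):
    """Return the fingertip coordinate (x, y) if it falls inside menu_rect
    (x0, y0, x1, y1), otherwise None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 3000:
        return None
    # With four fingers closed and one extended, the extended fingertip is
    # typically the topmost point of the hand contour.
    tip_x, tip_y = hand[hand[:, :, 1].argmin()][0]
    x0, y0, x1, y1 = menu_rect
    if x0 <= tip_x < x1 and y0 <= tip_y < y1:                # inside the options menu region
        return int(tip_x), int(tip_y)
    return None
```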
After the second image in which the fingertip falls into the options menu region is collected, the region into which the coordinate of the fingertip falls is determined: the coordinate of the fingertip is compared with the position of each option region in the second image, the region into which the coordinate of the fingertip falls is determined, and the option pointed to by the fingertip is then determined. The specific processing is described in steps 205 to 207.
Step 205: determine the coordinate region corresponding to the pixels contained in each option region of the options menu in the second image.
In the embodiment of the present invention, there may or may not be gaps between the option regions of the options menu. If there are no gaps, that is, adjacent option regions share boundaries, the option region to which a shared boundary belongs is preset; for example, for two option regions on the left and right, the shared boundary can be set to belong to the left option region, and for two option regions above and below, the shared boundary can be set to belong to the upper option region.
Step 206: obtain the coordinate of the fingertip.
In the embodiment of the present invention, the coordinate of the fingertip can be obtained by processing the touch control part with the prior-art motion-sensing controller Leap Motion or the Open Source Computer Vision Library (OpenCV).
Step 207: compare the coordinate of the fingertip with the coordinates corresponding to the pixels contained in each option region of the options menu, and determine whether there is a first option region that contains the coordinate of the fingertip; if there is, go to step 208; if there is not, go to step 210. A sketch of the comparison in steps 207 to 210 is given after the description of step 210 below.
Step 208: determine that the coordinate of the fingertip falls into the first option region.
Step 209: perform the corresponding operation according to the option corresponding to the determined first option region.
In this step, each option in the options menu is preset; for example, an option can be play, pause, fast-forward or rewind. Furthermore, each option in the options menu can correspond to two operations. For example, an option can be play/pause, and the initial operation corresponding to this option can be set to play, that is, the first trigger of this option performs the play operation. After the options menu starts to be displayed, each operation on this option can be recorded in turn. After it is determined that the selected option is the option pointed to by the fingertip, the operation corresponding to the last operation performed on this option is determined: if the play operation was performed last time, the pause operation is performed this time; if the pause operation was performed last time, the play operation is performed this time. When the options menu stops being displayed, the records of the operations on all options are deleted.
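The record-and-toggle behaviour described above can be sketched as follows; the option name and the play/pause operation names are illustrative.

```python
class OptionToggler:
    """Track the last operation performed on each two-operation option while
    the options menu is shown, and alternate between the pair on each trigger."""

    def __init__(self, pairs):
        # e.g. {"play_pause": ("play", "pause")}: the first trigger performs "play"
        self.pairs = pairs
        self.last_op = {}                  # per-option record of the last operation

    def trigger(self, option):
        first, second = self.pairs[option]
        op = second if self.last_op.get(option) == first else first
        self.last_op[option] = op          # remember this operation for the next selection
        return op                          # the operation to perform this time

    def on_menu_hidden(self):
        self.last_op.clear()               # delete the records when the menu stops displaying
```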
Step 210: determine that the fingertip falls into the non-option area.
When there are gaps between the regions where the options of the options menu are located, if no option region in the options menu contains the coordinate of the fingertip, it is determined that the fingertip falls into the non-option area; that is, the fingertip is within the display area of the options menu but falls into a gap between option regions.
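Steps 207 to 210 amount to a point-in-region test over the option regions. A minimal sketch under the grid layout assumed earlier is shown below; the half-open intervals implement the preset rule from step 205 that a boundary shared by two adjacent regions belongs to the left or upper one.

```python
def resolve_option(regions, tip):
    """regions: list of (label, (x0, y0, x1, y1)) option regions;
    tip: (x, y) fingertip coordinate. Returns the matching option label,
    or None when the fingertip falls into the non-option area (step 210)."""
    tip_x, tip_y = tip
    for label, (x0, y0, x1, y1) in regions:
        # Half-open test: the right/bottom boundary is excluded, so a boundary
        # shared with the next region counts only for the left/upper region.
        if x0 <= tip_x < x1 and y0 <= tip_y < y1:
            return label                   # the "first option region" (steps 207-208)
    return None
```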
With the gesture control method provided by Embodiment 1 of the present invention above, since options in the options menu are selected by gesture, tools such as remote controllers are not needed, so user operations are simpler, the user experience is improved, and the cost of use is reduced.
Embodiment 2:
Based on the same inventive concept and in accordance with the gesture control method provided by the above embodiments of the present invention, Embodiment 2 of the present invention correspondingly provides a gesture control device, whose schematic structure is shown in Fig. 3 and which specifically includes:
a display unit 301, configured to display an options menu when the image acquisition device collects a first image containing the set gesture.
In the embodiment of the present invention, the image acquisition device can be an ordinary USB camera or a depth camera. The camera can collect images at a preset period, which can be set flexibly according to practical experience and needs. When the user wants to control the VR helmet by gesture, the user makes the set gesture in front of the camera, for example stretching out the five fingers with the palm facing the camera; the orientation of the fingers is not restricted. The purpose of making the set gesture is to activate the options menu. In practice, after the camera collects the current image of its front view, the current image is sent to the image recognition module in the central processing unit of the VR helmet, which processes the current image and determines whether it contains the set gesture. Specifically, the gesture features in the current image can be extracted and matched against the gesture features of the set gesture; if the matching succeeds, the current image is a first image containing the set gesture.
After the set gesture is determined, and taking the set gesture of five stretched-out fingers as an example, a first reference point is chosen within the outline of the palm and fingers, and this first reference point is used as the reference point for the display position of the options menu to determine the display area of the options menu. The number of options in the options menu, the content of each option and the positions of the options relative to each other are all preset and fixed, so once this reference point is determined, the position of the options menu is determined.
For example, the thumb tip can be chosen as the first reference point, and the positions of the options of the options menu relative to each other are fixed; for example, the options menu has 16 options arranged in 4 rows and 4 columns, and the option content corresponding to each position is also fixed. The overall shape of the options menu can be regular, for example a rectangle, or irregular, for example arranged along the shape of the set gesture. The coordinate position of the first reference point serves as the position reference point of the options menu; for example, the center of a rectangular options menu can be made to coincide with the coordinate position of the first reference point to determine the display area of the options menu.
an area determination unit 302, configured to determine, when a touch control part in a collected second image falls into the options menu region, the region into which the coordinate of the touch control part falls.
In the embodiment of the present invention, the touch control part is the part of the touch object that overlaps the options menu region when the touch object selects an option in the options menu, where the touch object can be a fingertip, a stylus or the like. Taking the touch object being a fingertip as an example, after the user sees the displayed options menu, the user stretches out one finger of the other hand to point at the options menu. After the options menu is displayed, the area determination unit 302 processes the collected current image of the front view in which the fingertip falls into the options menu region, that is, the second image, and determines the region into which the coordinate of the fingertip falls.
an option determination unit 303, configured to determine, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part.
a manipulation unit 304, configured to perform a corresponding operation according to the determined option.
In the embodiment of the present invention, each option in the options menu is preset; for example, an option can be play, pause, fast-forward or rewind. Furthermore, each option in the options menu can correspond to two operations. For example, an option can be play/pause, and the initial operation corresponding to this option can be set to play, that is, the first trigger of this option performs the play operation. After the options menu starts to be displayed, each operation on this option can be recorded in turn. After it is determined that the selected option is the option pointed to by the touch control part, the operation corresponding to the last operation performed on this option is determined: if the play operation was performed last time, the pause operation is performed this time; if the pause operation was performed last time, the play operation is performed this time. When the options menu stops being displayed, the records of the operations on all options are deleted.
Further, the display unit 301 is specifically configured to take the coordinate position of the set first reference point as the position reference point of the options menu, determine the display area of the options menu, and display each option of the options menu in the determined display area of the options menu.
Further, the area determination unit 302 is specifically configured to determine the coordinate region corresponding to the pixels contained in each option region of the options menu in the second image; obtain the coordinate of the touch control part; compare the coordinate of the touch control part with the coordinates corresponding to the pixels contained in each option region of the options menu, and determine whether there is a first option region that contains the coordinate of the touch control part; if there is, determine that the coordinate of the touch control part falls into the first option region; if there is not, determine that the touch control part falls into the non-option area.
When there are gaps between the regions where the options of the options menu are located, if no pixel in any option region of the options menu has the same coordinate as the fingertip, it is determined that the fingertip falls into the non-option area; that is, the fingertip is within the display area of the options menu but falls into a gap between option regions.
Further, the above device also includes:
a stop display unit 305, configured to stop displaying the options menu when it is determined that a collected third image does not contain the set gesture.
Further, the set gesture is that the five fingers are stretched out with the palm facing the image acquisition device.
The functions of the above units can correspond to the respective processing steps of the flows shown in Fig. 1 or Fig. 2, and are not repeated here.
In the embodiments of the present invention, the relevant functional modules may be implemented by a hardware processor.
In summary, the solution provided by the embodiments of the present invention includes: when it is determined that a collected first image contains a set gesture, displaying an options menu; when a touch control part in a collected second image falls into the options menu region, determining the region into which the coordinate of the touch control part falls; determining, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part; and performing a corresponding operation according to the determined option. With the solution provided by the embodiments of the present invention, since options in the options menu are selected by gesture, tools such as remote controllers are not needed, so user operations are simpler, the user experience is improved, and the cost of use is reduced.
The device embodiments described above are only schematic. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
Through the description of the above embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly also by hardware. Based on such an understanding, the part of the above technical solution that contributes to the prior art can essentially be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and includes several instructions that enable a computer device (which may be a personal computer, a server, a network device or the like) to perform the method described in each embodiment or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A gesture control method, characterized by comprising:
when it is determined that a collected first image contains a set gesture, displaying an options menu;
when a touch control part in a collected second image falls into the options menu region, determining the region into which the coordinate of the touch control part falls;
determining, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part;
performing a corresponding operation according to the determined option.
2. The method according to claim 1, characterized in that displaying the options menu specifically comprises:
taking the coordinate position of a set first reference point as the position reference point of the options menu, and determining the display area of the options menu;
displaying each option of the options menu in the determined display area of the options menu.
3. The method according to claim 1, characterized in that determining the region into which the coordinate of the touch control part falls specifically comprises:
determining the coordinate region corresponding to the pixels contained in each option region of the options menu in the second image;
obtaining the coordinate of the touch control part;
comparing the coordinate of the touch control part with the coordinates corresponding to the pixels contained in each option region of the options menu, and determining whether there is a first option region that contains the coordinate of the touch control part;
if there is, determining that the coordinate of the touch control part falls into the first option region;
if there is not, determining that the touch control part falls into a non-option area.
4. The method according to claim 1, characterized by further comprising:
when it is determined that a collected third image does not contain the set gesture, stopping displaying the options menu.
5. The method according to any one of claims 1-4, characterized in that the set gesture is that the five fingers are stretched out with the palm facing the image acquisition device.
6. A gesture control device, characterized by comprising:
a display unit, configured to display an options menu when it is determined that a collected first image contains a set gesture;
an area determination unit, configured to determine, when a touch control part in a collected second image falls into the options menu region, the region into which the coordinate of the touch control part falls;
an option determination unit, configured to determine, according to the region into which the coordinate of the touch control part falls, the option pointed to by the touch control part;
a manipulation unit, configured to perform a corresponding operation according to the determined option.
7. The device according to claim 6, characterized in that the display unit is specifically configured to take the coordinate position of a set first reference point as the position reference point of the options menu, determine the display area of the options menu, and display each option of the options menu in the determined display area of the options menu.
8. The device according to claim 6, characterized in that the area determination unit is specifically configured to determine the coordinate region corresponding to the pixels contained in each option region of the options menu in the second image; obtain the coordinate of the touch control part; compare the coordinate of the touch control part with the coordinates corresponding to the pixels contained in each option region of the options menu, and determine whether there is a first option region that contains the coordinate of the touch control part; if there is, determine that the coordinate of the touch control part falls into the first option region; if there is not, determine that the touch control part falls into a non-option area.
9. The device according to claim 6, characterized by further comprising:
a stop display unit, configured to stop displaying the options menu when it is determined that a collected third image does not contain the set gesture.
10. The device according to any one of claims 6-9, characterized in that the set gesture is that the five fingers are stretched out with the palm facing the image acquisition device.
CN201510505966.5A 2015-08-17 2015-08-17 Gesture control method and device Pending CN105867599A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510505966.5A CN105867599A (en) 2015-08-17 2015-08-17 Gesture control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510505966.5A CN105867599A (en) 2015-08-17 2015-08-17 Gesture control method and device

Publications (1)

Publication Number Publication Date
CN105867599A true CN105867599A (en) 2016-08-17

Family

ID=56624251

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510505966.5A Pending CN105867599A (en) 2015-08-17 2015-08-17 Gesture control method and device

Country Status (1)

Country Link
CN (1) CN105867599A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445118A (en) * 2016-09-06 2017-02-22 网易(杭州)网络有限公司 Virtual reality interaction method and apparatus
CN106445152A (en) * 2016-09-29 2017-02-22 珠海市魅族科技有限公司 Method for managing menus in virtual reality environments and virtual reality equipment
CN106598277A (en) * 2016-12-19 2017-04-26 网易(杭州)网络有限公司 Virtual reality interactive system
CN106774872A (en) * 2016-12-09 2017-05-31 网易(杭州)网络有限公司 Virtual reality system, virtual reality exchange method and device
CN106940477A (en) * 2017-03-14 2017-07-11 联想(北京)有限公司 A kind of control method and electronic equipment
CN107168530A (en) * 2017-04-26 2017-09-15 腾讯科技(深圳)有限公司 Object processing method and device in virtual scene
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
CN108536273A (en) * 2017-03-01 2018-09-14 天津锋时互动科技有限公司深圳分公司 Man-machine menu mutual method and system based on gesture
CN109144598A (en) * 2017-06-19 2019-01-04 天津锋时互动科技有限公司深圳分公司 Electronics mask man-machine interaction method and system based on gesture
CN109460149A (en) * 2018-10-31 2019-03-12 北京百度网讯科技有限公司 System management facility, display methods, VR equipment and computer-readable medium
CN109934929A (en) * 2017-12-15 2019-06-25 深圳梦境视觉智能科技有限公司 The method, apparatus of image enhancement reality, augmented reality show equipment and terminal
CN112198962A (en) * 2020-09-30 2021-01-08 聚好看科技股份有限公司 Method for interacting with virtual reality equipment and virtual reality equipment
US11194400B2 (en) 2017-04-25 2021-12-07 Tencent Technology (Shenzhen) Company Limited Gesture display method and apparatus for virtual reality scene
CN114168034A (en) * 2017-05-26 2022-03-11 成都理想境界科技有限公司 Menu operation method applied to head-mounted display equipment and head-mounted display equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729808A (en) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
CN102467326A (en) * 2010-11-16 2012-05-23 富泰华工业(深圳)有限公司 Menu display device and display control method thereof
US20130239041A1 (en) * 2012-03-06 2013-09-12 Sony Corporation Gesture control techniques for use with displayed virtual keyboards
CN103365402A (en) * 2012-03-31 2013-10-23 青岛海信电器股份有限公司 Control method and device for display equipment
CN104076907A (en) * 2013-03-25 2014-10-01 联想(北京)有限公司 Control method, control device and wearable electronic equipment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729808A (en) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
CN102467326A (en) * 2010-11-16 2012-05-23 富泰华工业(深圳)有限公司 Menu display device and display control method thereof
US20130239041A1 (en) * 2012-03-06 2013-09-12 Sony Corporation Gesture control techniques for use with displayed virtual keyboards
CN103365402A (en) * 2012-03-31 2013-10-23 青岛海信电器股份有限公司 Control method and device for display equipment
CN104076907A (en) * 2013-03-25 2014-10-01 联想(北京)有限公司 Control method, control device and wearable electronic equipment

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
CN106445118B (en) * 2016-09-06 2019-05-17 网易(杭州)网络有限公司 Virtual reality exchange method and device
CN106445118A (en) * 2016-09-06 2017-02-22 网易(杭州)网络有限公司 Virtual reality interaction method and apparatus
CN106445152A (en) * 2016-09-29 2017-02-22 珠海市魅族科技有限公司 Method for managing menus in virtual reality environments and virtual reality equipment
CN106774872A (en) * 2016-12-09 2017-05-31 网易(杭州)网络有限公司 Virtual reality system, virtual reality exchange method and device
CN106598277A (en) * 2016-12-19 2017-04-26 网易(杭州)网络有限公司 Virtual reality interactive system
CN106598277B (en) * 2016-12-19 2019-09-17 网易(杭州)网络有限公司 Virtual reality interactive system
CN108536273A (en) * 2017-03-01 2018-09-14 天津锋时互动科技有限公司深圳分公司 Man-machine menu mutual method and system based on gesture
CN106940477A (en) * 2017-03-14 2017-07-11 联想(北京)有限公司 A kind of control method and electronic equipment
US11194400B2 (en) 2017-04-25 2021-12-07 Tencent Technology (Shenzhen) Company Limited Gesture display method and apparatus for virtual reality scene
CN107168530A (en) * 2017-04-26 2017-09-15 腾讯科技(深圳)有限公司 Object processing method and device in virtual scene
CN114168034A (en) * 2017-05-26 2022-03-11 成都理想境界科技有限公司 Menu operation method applied to head-mounted display equipment and head-mounted display equipment
CN109144598A (en) * 2017-06-19 2019-01-04 天津锋时互动科技有限公司深圳分公司 Electronics mask man-machine interaction method and system based on gesture
CN109934929A (en) * 2017-12-15 2019-06-25 深圳梦境视觉智能科技有限公司 The method, apparatus of image enhancement reality, augmented reality show equipment and terminal
CN109460149A (en) * 2018-10-31 2019-03-12 北京百度网讯科技有限公司 System management facility, display methods, VR equipment and computer-readable medium
CN112198962A (en) * 2020-09-30 2021-01-08 聚好看科技股份有限公司 Method for interacting with virtual reality equipment and virtual reality equipment
CN112198962B (en) * 2020-09-30 2023-04-28 聚好看科技股份有限公司 Method for interacting with virtual reality equipment and virtual reality equipment

Similar Documents

Publication Publication Date Title
CN105867599A (en) Gesture control method and device
US11048333B2 (en) System and method for close-range movement tracking
KR101979317B1 (en) System and method for close-range movement tracking
Mossel et al. 3DTouch and HOMER-S: intuitive manipulation techniques for one-handed handheld augmented reality
EP3275514A1 (en) Virtuality-and-reality-combined interactive method and system for merging real environment
JP2013037675A5 (en)
CN105892635A (en) Image capture realization method and apparatus as well as electronic device
EP2816456A1 (en) Information processing device, information processing method, and computer program
CN106861184B (en) Method and system for realizing human-computer interaction in immersive VR game
CN105975072A (en) Method, device and system for identifying gesture movement
CN106504001A (en) Method of payment and device in a kind of VR environment
CN106200941B (en) Control method of virtual scene and electronic equipment
CN104571811B (en) Information processing method and electronic equipment
CN109933190B (en) Head-mounted display equipment and interaction method thereof
CN112991555B (en) Data display method, device, equipment and storage medium
CN107479902B (en) Control processing method and device, storage medium, processor and terminal
JP6131004B2 (en) Object display method, program, and apparatus
CN116360589A (en) Method and medium for inputting information by virtual keyboard and electronic equipment
CN111265849A (en) Interaction method and device for virtual cards
CN106569675A (en) Method and device for displaying prompt dialog boxes
CN115904201A (en) Virtual reality device, control method and device thereof, and computer storage medium
KR101861096B1 (en) Method and apparatus for controlling information displayed on screen by recognizing hand gesture of user
Teixeira et al. Open/closed hand classification using Kinect data
CN113672158A (en) Human-computer interaction method and device for augmented reality
CN114327063A (en) Interaction method and device of target virtual object, electronic equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160817