CN218214058U - Virtual reality interaction equipment based on gesture recognition - Google Patents
- Publication number
- CN218214058U (application CN202222399206.9U)
- Authority
- CN
- China
- Prior art keywords
- gesture
- unit
- gesture recognition
- virtual reality
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The utility model discloses a virtual reality interaction device based on gesture recognition, comprising an image acquisition module (an infrared motion sensor, a video snapshot camera, and a memory); an image processing module (a gesture segmentation unit, a gesture analysis unit, an instruction confirmation unit, and a semantic conversion unit); a screen control module (a signal conversion unit and a control unit); and a data communication module (a wireless communication unit and signal lines). The utility model achieves contactless human-computer interaction, overcoming the heavy reliance on hardware peripherals in existing human-computer interaction and improving user experience. Compared with control methods such as a keyboard, mouse, or infrared remote controller, the utility model allows the user to control the device without physical contact, making operation simpler and more convenient.
Description
Technical Field
The utility model relates to the technical field of human-computer interaction, and specifically to a virtual reality interaction device based on gesture recognition.
Background
With the evolution of mobile computing devices from notebook computers to mobile phones and tablet computers, their control modes have evolved from keyboards and mice to phone keys, handwriting pads, touch screens, and virtual keyboards. Control of mobile devices has thus become increasingly intuitive and simple, evolving in step with people's natural habits.
At present, the touch-screen control widely used on mobile computing devices matches human intuition better and is easier to learn than a keyboard and mouse. However, screen touch ultimately captures only the actions of the fingers; in settings that require richer body input, such as sports games, simulation training, complex control, and remote control, it is limited by capturing only this single channel of human body information.
Moreover, most existing virtual reality interaction technologies still rely on conventional inputs such as a mouse and keys to interact with the device. These input methods are too limited, so selecting or executing a function is cumbersome and the user experience is poor.
SUMMARY OF THE UTILITY MODEL
The technical problem to be solved by the utility model is to overcome the above technical defects and provide a virtual reality interaction device based on gesture recognition.
In order to solve the above problem, the technical scheme of the utility model is a virtual reality interaction device based on gesture recognition, comprising:
an image acquisition module: comprising an infrared motion sensor, a video snapshot camera, and a memory;
an image processing module: comprising a gesture segmentation unit, a gesture analysis unit, an instruction confirmation unit, and a semantic conversion unit;
a screen control module: comprising a signal conversion unit and a control unit;
a communication module: comprising a wireless communication unit and signal lines.
Further, the infrared motion sensor detects changes in the gesture, the video snapshot camera records the gesture and takes a certain number of snapshots within a preset time, and the memory stores the snapshot images.
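As a concrete illustration of the acquisition step, the burst-capture behavior ("a certain number of snapshots within a preset time") could be sketched as below. This is a minimal sketch in Python; the function names, frame limit, and time window are illustrative assumptions, not values taken from the patent:

```python
import time
from collections import deque

def capture_burst(camera_read, duration_s=1.0, max_frames=10, clock=time.monotonic):
    """Capture up to max_frames snapshots within a preset time window.

    camera_read is a zero-argument callable returning one frame (a
    stand-in for the video snapshot camera); the returned list plays
    the role of the memory that stores the snapshot images.
    """
    frames = deque(maxlen=max_frames)
    deadline = clock() + duration_s
    # Stop when either the preset time elapses or the frame quota is met.
    while clock() < deadline and len(frames) < max_frames:
        frames.append(camera_read())
    return list(frames)
```

In a real device the loop would be triggered by the infrared motion sensor rather than called directly.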
Further, the gesture segmentation unit converts the gestures in the plurality of captured images into a three-dimensional model, the gesture analysis unit extracts the edge contour of the three-dimensional model and compares it with a gesture database to obtain gesture shape data, the instruction confirmation unit determines a gesture interaction instruction according to the change of the gesture shape data within a preset time interval, and the semantic conversion unit converts the obtained gesture interaction instruction into a corresponding electrical signal.
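The database-comparison step of the gesture analysis unit could be sketched as a nearest-neighbor match on a contour feature vector. The gesture names, feature values, and distance metric below are illustrative assumptions; a real system would derive the features from the three-dimensional hand model:

```python
import math

# Hypothetical gesture database: gesture name -> edge-contour feature vector.
# The numbers are made up for illustration only.
GESTURE_DB = {
    "open_palm": [0.9, 0.8, 0.7, 0.9],
    "fist":      [0.2, 0.3, 0.2, 0.1],
    "point":     [0.5, 0.9, 0.1, 0.2],
}

def match_gesture(contour_features, db=GESTURE_DB):
    """Return the (gesture name, distance) pair whose stored contour
    feature vector is nearest to the extracted one."""
    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(((name, dist(contour_features, ref)) for name, ref in db.items()),
               key=lambda pair: pair[1])
```

The matched name (plus its distance, usable as a confidence check) is the "gesture shape data" that the instruction confirmation unit then tracks over time.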
Further, the signal conversion unit converts the electrical signal into a control signal, and the control unit directs the interaction device to execute the corresponding interaction action according to the control signal.
Further, the wireless communication unit connects to an external wireless network, and the signal lines carry data communication among the modules.
Compared with the prior art, the advantage of the utility model lies in the following: it achieves contactless human-computer interaction, overcoming the heavy reliance on hardware peripherals in existing human-computer interaction and improving user experience. Compared with control methods such as a keyboard, mouse, or infrared remote controller, the utility model allows the user to control the device without contact, making user control simpler and more convenient.
Drawings
Fig. 1 is a schematic structural diagram of the present invention.
Fig. 2 is a schematic flow chart of the present invention.
As shown in the figures: 1. an image acquisition module; 101. an infrared motion sensor; 102. a video snapshot camera; 103. a memory; 2. an image processing module; 201. a gesture segmentation unit; 202. a gesture analysis unit; 203. an instruction confirmation unit; 204. a semantic conversion unit; 3. a screen control module; 301. a signal conversion unit; 302. a control unit; 4. a communication module.
Detailed Description
The technical solutions in the embodiments of the utility model will be described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the possible embodiments; all other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative work fall within the protection scope of the utility model.
As shown in fig. 1 to 2, a virtual reality interaction device based on gesture recognition includes:
image acquisition module 1: comprising an infrared motion sensor 101, a video snapshot camera 102, and a memory 103, wherein the infrared motion sensor 101 detects changes in the gesture, the video snapshot camera 102 records the gesture and takes a certain number of snapshots within a preset time, and the memory 103 stores the snapshot images;
the image processing module 2: comprising a gesture segmentation unit 201, a gesture analysis unit 202, an instruction confirmation unit 203, and a semantic conversion unit 204, wherein the gesture segmentation unit 201 converts the gestures in the plurality of captured images into a three-dimensional model, the gesture analysis unit 202 extracts the edge contour of the three-dimensional model and compares it with a gesture database to obtain gesture shape data, the instruction confirmation unit 203 determines a gesture interaction instruction according to the change of the gesture shape data within a preset time interval, and the semantic conversion unit 204 converts the obtained gesture interaction instruction into a corresponding electrical signal;
the screen control module 3: comprising a signal conversion unit 301 and a control unit 302, wherein the signal conversion unit 301 converts the electrical signal into a control signal, and the control unit 302 directs the interaction device to execute the corresponding interaction action according to the control signal;
the communication module 4: comprising a wireless communication unit, which connects to an external wireless network, and signal lines, which carry data communication among the modules.
In specific use, the interaction device connects to an external wireless network through the wireless communication unit. When the infrared motion sensor 101 detects a user's gesture, the video snapshot camera 102 begins capturing snapshots of it. The gesture segmentation unit 201 converts the gestures in the plurality of captured images into a three-dimensional model; the gesture analysis unit 202 extracts the edge contour of the model and compares it with the gesture database to obtain gesture shape data; the instruction confirmation unit 203 determines the gesture interaction instruction from the change of the gesture shape data within a preset time interval; and the semantic conversion unit 204 converts the resulting instruction into a corresponding electrical signal. The signal conversion unit 301 converts the electrical signal into a control signal, and the control unit 302 directs the interaction device to execute the corresponding action according to the control signal, thereby achieving human-computer interaction.
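The end-to-end flow just described, and in particular how an instruction is confirmed from the *change* of gesture shape over the preset interval, could be sketched as follows. The instruction table, gesture names, and classify callback are hypothetical placeholders for the patent's units, not details disclosed in the patent:

```python
# Hypothetical instruction table: (first shape, last shape) -> command.
INSTRUCTION_TABLE = {
    ("open_palm", "fist"): "select",
    ("fist", "open_palm"): "release",
    ("point", "point"):    "move_cursor",
}

def confirm_instruction(shape_sequence, table=INSTRUCTION_TABLE):
    """Determine the interaction instruction from how the recognized
    gesture shape changed over the interval (first vs. last shape)."""
    if len(shape_sequence) < 2:
        return None  # not enough snapshots to observe a change
    return table.get((shape_sequence[0], shape_sequence[-1]))

def run_pipeline(snapshots, classify):
    """classify maps one snapshot to a gesture name, standing in for the
    segmentation and analysis units; the confirmed instruction would then
    be handed to the semantic conversion and screen control stages."""
    shapes = [classify(s) for s in snapshots]
    return confirm_instruction(shapes)
```

For example, a snapshot burst classified as open palm closing into a fist would confirm a "select" instruction under this assumed table.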
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
The utility model and its embodiments have been described above, but the description is not limiting: what is shown in the drawings is only one embodiment, and the actual structure is not limited to it. In summary, if a person of ordinary skill in the art, inspired by the utility model, designs a similar structure or embodiment without creative effort and without departing from its spirit, it shall fall within the protection scope of the utility model.
Claims (5)
1. A virtual reality interaction device based on gesture recognition, comprising:
an image acquisition module (1): comprising an infrared motion sensor (101), a video snapshot camera (102), and a memory (103);
an image processing module (2): comprising a gesture segmentation unit (201), a gesture analysis unit (202), an instruction confirmation unit (203), and a semantic conversion unit (204);
a screen control module (3): comprising a signal conversion unit (301) and a control unit (302);
a communication module (4): comprising a wireless communication unit and signal lines.
2. The virtual reality interaction device based on gesture recognition according to claim 1, wherein: the infrared motion sensor (101) is used for detecting the change of gestures, the video snapshot camera (102) is used for recording the gestures and taking a certain number of snapshots within a preset time, and the memory (103) is used for storing the snapshot images.
3. The virtual reality interaction device based on gesture recognition as claimed in claim 1, wherein: the gesture segmentation unit (201) converts the gestures in the plurality of captured images into a three-dimensional model, the gesture analysis unit (202) extracts the edge contour of the three-dimensional model and compares it with a gesture database to obtain gesture shape data, the instruction confirmation unit (203) determines a gesture interaction instruction according to the change of the gesture shape data within a preset time interval, and the semantic conversion unit (204) converts the obtained gesture interaction instruction into a corresponding electrical signal.
4. The virtual reality interaction device based on gesture recognition as claimed in claim 1, wherein: the signal conversion unit (301) converts the electric signal into a control signal, and the control unit (302) controls the interaction device to execute corresponding interaction action according to the control signal.
5. The virtual reality interaction device based on gesture recognition as claimed in claim 1, wherein: the wireless communication unit is used for connecting with an external wireless network, and the signal line is used for data communication among the modules.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202222399206.9U CN218214058U (en) | 2022-09-09 | 2022-09-09 | Virtual reality interaction equipment based on gesture recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202222399206.9U CN218214058U (en) | 2022-09-09 | 2022-09-09 | Virtual reality interaction equipment based on gesture recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN218214058U true CN218214058U (en) | 2023-01-03 |
Family
ID=84631803
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202222399206.9U Active CN218214058U (en) | 2022-09-09 | 2022-09-09 | Virtual reality interaction equipment based on gesture recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN218214058U (en) |
- 2022-09-09: CN application CN202222399206.9U filed; granted as CN218214058U (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| GR01 | Patent grant | |