CN112904999A - Augmented reality somatosensory interaction method and system based on laser radar - Google Patents
Augmented reality somatosensory interaction method and system based on laser radar
- Publication number
- CN112904999A (application CN202011611544.3A)
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- somatosensory interaction
- image
- module
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C9/00—Measuring inclination, e.g. by clinometers, by levels
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/50—Systems of measurement based on relative movement of target
- G01S17/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/3058—Monitoring arrangements for monitoring environmental properties or parameters of the computing system or of the computing system component, e.g. monitoring of power, currents, temperature, humidity, position, vibrations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/32—Monitoring with visual or acoustical indication of the functioning of the machine
- G06F11/324—Display of status information
- G06F11/327—Alarm or error message display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Quality & Reliability (AREA)
- Electromagnetism (AREA)
- Computing Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an augmented reality somatosensory interaction method and system based on a laser radar, and relates to the technical field of somatosensory interaction systems. The system comprises an interaction system and an auxiliary system; the auxiliary system is arranged at the output end of the interaction system, and the input end of the auxiliary system is electrically connected with the output end of the interaction system through a wire harness. The interaction system comprises an image transmission module, a radar processing module, an augmented reality device, a somatosensory interaction device display and a voice control module. By providing the interaction system, the somatosensory interaction of the image can be enhanced after the image is processed by the radar, so the enhancement effect is more remarkable, the system is convenient to use, and the user experience is improved. By providing the auxiliary system, the running state of the somatosensory interaction device display can be monitored in real time, abnormal operation of the display is avoided, its service life is prolonged, and the somatosensory interaction effect on the image is also improved.
Description
Technical Field
The invention belongs to the technical field of somatosensory interaction systems, and particularly relates to an augmented reality somatosensory interaction method and system based on a laser radar.
Background
A somatosensory interaction system blends motion and entertainment into daily life. The operator controls the system with his or her own limbs and can interact with other internet users to share pictures and video information. Imagine standing in front of a large screen and browsing goods or completing a purchase with simple hand gestures such as page turning, sliding and confirming; this is the convenience that somatosensory technology brings in the consumer-goods field. In addition, merchants can obtain accurate consumption feedback through the computer: using anonymous recognition technology, the computer's camera can record information such as facial expression, the consumer's sex and dwell time. However, existing somatosensory interaction systems still have shortcomings in use: they do not apply a laser radar within the somatosensory interaction system, so the enhancement effect is not obvious, the systems are inconvenient to use, and the user experience is reduced.
Disclosure of Invention
The invention aims to provide an augmented reality somatosensory interaction method and system based on a laser radar, in order to solve the existing problems: the existing somatosensory interaction system does not apply a laser radar within the system when in use, so the enhancement is not obvious, the system is inconvenient to use, and the user experience is reduced.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention provides an augmented reality somatosensory interaction method based on a laser radar, which comprises the following steps of:
s1, the image acquisition module acquires an image and transmits the image through the image transmission module;
s2, the radar processing module and the augmented reality equipment perform augmented processing on the image;
and S3, displaying the image through the somatosensory interaction device display.
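The three steps above can be pictured with the following minimal Python sketch. It is illustrative only: the function names and the frame/trajectory data shapes are assumptions made for this sketch, not structures defined by the disclosure.

```python
# Minimal sketch of steps S1-S3; all names and data shapes are illustrative.

def acquire_and_transmit():
    """S1: acquire an image and pass it on via the image transmission module."""
    return {"pixels": [], "width": 640, "height": 480}

def enhance(frame, trajectory):
    """S2: radar processing module and augmented reality device enhance the image."""
    return {**frame, "overlay": {"trajectory": trajectory}}

def display(frame):
    """S3: show the enhanced image on the somatosensory interaction device display."""
    print("displaying:", frame["overlay"])

if __name__ == "__main__":
    frame = acquire_and_transmit()                        # S1
    frame = enhance(frame, trajectory=[(0, 0), (1, 1)])   # S2
    display(frame)                                        # S3
```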
Further, in step S2, a plurality of radar antennas are distributed around the circumference; the radar processing module receives the signals from the radar antennas and recognizes the motion trajectory, so that the image is enhanced.
Further, when the radar processing module is not operating in step S2, the motion trajectory is recognized by a gyroscope instead, so that the image is enhanced.
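The choice between the radar antennas and the gyroscope described in the two paragraphs above can be sketched as follows. The range format, the nearest-antenna position estimate and the integration step are assumptions made for illustration; the disclosure does not specify how the trajectory is computed.

```python
import math

def trajectory_from_radar(range_scans, num_antennas=8):
    """Crude per-step estimate: take the antenna reporting the shortest range and
    place the target along that antenna's bearing (antennas assumed evenly spaced
    around a circle). Illustrative only; a real system would fuse all returns."""
    trajectory = []
    for ranges in range_scans:                      # one list of ranges per time step
        i, r = min(enumerate(ranges), key=lambda p: p[1])
        bearing = 2 * math.pi * i / num_antennas
        trajectory.append((r * math.cos(bearing), r * math.sin(bearing)))
    return trajectory

def trajectory_from_gyroscope(angular_rates, dt=0.01):
    """Fallback when the radar processing module is not running: integrate
    gyroscope angular rates into a heading track."""
    heading, track = 0.0, []
    for rate in angular_rates:
        heading += rate * dt
        track.append(heading)
    return track

def recognize_trajectory(radar_running, radar_scans=None, gyro_rates=None):
    if radar_running and radar_scans:
        return trajectory_from_radar(radar_scans)
    return trajectory_from_gyroscope(gyro_rates or [])
```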
Further, while the image is displayed through the somatosensory interaction device display in step S3, the operation time recorder, the display abnormality sensor, the alarm module and the temperature sensor monitor the running state of the somatosensory interaction device display in real time. When the display abnormality sensor senses a display abnormality signal, the abnormality flag is bound to a timestamp to generate a display abnormality record, which is stored, and the alarm module raises an alarm at the same time; when the temperature sensor senses that the temperature exceeds the set temperature, the alarm module also raises an alarm.
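The monitoring behaviour described above (binding a display abnormality signal to a timestamp, storing the record, and alarming on over-temperature) could look like the sketch below. The threshold value, record format and log storage are illustrative assumptions.

```python
import time

ANOMALY_LOG = []       # stored display-abnormality records (illustrative storage)
TEMP_LIMIT_C = 60.0    # illustrative "set temperature"; not specified in the disclosure

def raise_alarm(reason):
    # stand-in for the alarm module
    print(f"ALARM: {reason}")

def on_display_abnormality(abnormality_flag):
    """Bind the abnormality flag to a timestamp, store the record, and alarm."""
    record = {"flag": abnormality_flag, "timestamp": time.time()}
    ANOMALY_LOG.append(record)
    raise_alarm(f"display abnormality: {abnormality_flag}")

def on_temperature_reading(temperature_c):
    """Alarm when the sensed temperature exceeds the set temperature."""
    if temperature_c > TEMP_LIMIT_C:
        raise_alarm(f"over-temperature: {temperature_c:.1f} C")
```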
The invention relates to an augmented reality somatosensory interaction system based on a laser radar, which comprises an interaction system and an auxiliary system, wherein the auxiliary system is arranged at the output end of the interaction system, and the input end of the auxiliary system is electrically connected with the output end of the interaction system through a wire harness.
Further, the interaction system comprises an image transmission module, a radar processing module, an augmented reality device, a somatosensory interaction device display and a voice control module; the radar processing module is arranged at the output end of the image transmission module, and the input end of the radar processing module is electrically connected with the output end of the image transmission module through a wire harness.
Further, the augmented reality device is arranged at the output end of the radar processing module, and the input end of the augmented reality device is electrically connected with the output end of the radar processing module through a wire harness.
Further, the somatosensory interaction device display is arranged at the output end of the augmented reality device, and the input end of the somatosensory interaction device display is electrically connected with the output end of the augmented reality device through a wire harness.
Further, the voice control module is arranged at the output end of the somatosensory interaction device display, and the input end of the voice control module is electrically connected with the output end of the somatosensory interaction device display through a wire harness.
Furthermore, the auxiliary system comprises an operation time recorder, a display abnormality sensor, an alarm module and a temperature sensor, which are all arranged at the output end of the somatosensory interaction device display.
Furthermore, the input ends of the operation time recorder, the display abnormality sensor, the alarm module and the temperature sensor are each electrically connected with the output end of the somatosensory interaction device display through a wire harness.
The invention has the following beneficial effects:
1. By providing the interaction system, the somatosensory interaction of the image can be enhanced after the image is processed by the radar, so the enhancement effect is more obvious, the system is convenient to use, and the user experience is improved.
2. By providing the auxiliary system, the running state of the somatosensory interaction device display can be monitored in real time, so abnormal operation of the display is avoided, its service life is prolonged, and the somatosensory interaction effect on the image is also improved.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is an overall connection block diagram of the present invention.
In the drawings, the components represented by the respective reference numerals are listed below:
1. an interaction system; 101. an image transmission module; 102. a radar processing module; 103. an augmented reality device; 104. a somatosensory interaction device display; 105. a voice control module; 2. an auxiliary system; 201. an operation time recorder; 202. a display abnormality sensor; 203. an alarm module; 204. a temperature sensor.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As a first embodiment of the present invention, this embodiment provides an augmented reality somatosensory interaction method based on a lidar, including the following steps:
s1, the image acquisition module acquires an image and transmits the image through the image transmission module;
s2, the radar processing module and the augmented reality equipment perform augmented processing on the image; the radar processing module receives signals of the radar antennas and identifies an action track, so that an image is enhanced;
and S3, displaying the image through the somatosensory interaction device display.
As a second embodiment, this embodiment further provides an augmented reality somatosensory interaction method based on a lidar, including the following steps:
s1, the image acquisition module acquires an image and transmits the image through the image transmission module;
s2, when the radar processing module does not work, recognizing an action track through a gyroscope, and then enhancing the image through augmented reality equipment;
and S3, displaying the image through the somatosensory interaction device display.
As a third embodiment, the rest is the same as the first embodiment, except that in this embodiment, while the image is displayed through the somatosensory interaction device display in step S3, the operation time recorder, the display abnormality sensor, the alarm module and the temperature sensor monitor the running state of the somatosensory interaction device display in real time. When the display abnormality sensor senses a display abnormality signal, the abnormality flag is bound to a timestamp to generate a display abnormality record, which is stored, and an alarm is raised through the alarm module; when the temperature sensor senses that the temperature exceeds the set temperature, the alarm module also raises an alarm.
Referring to fig. 1, the augmented reality somatosensory interaction system based on the laser radar includes an interaction system 1 and an auxiliary system 2, wherein the auxiliary system 2 is arranged at an output end of the interaction system 1, and an input end of the auxiliary system 2 is electrically connected with an output end of the interaction system 1 through a wire harness.
The interaction system 1 comprises an image transmission module 101, a radar processing module 102, an augmented reality device 103, a somatosensory interaction device display 104 and a voice control module 105. The radar processing module 102 is arranged at the output end of the image transmission module 101, and the input end of the radar processing module 102 is electrically connected with the output end of the image transmission module 101 through a wire harness.
The augmented reality device 103 is arranged at the output end of the radar processing module 102, and the input end of the augmented reality device 103 is electrically connected with the output end of the radar processing module 102 through a wire harness; the radar processing module 102 and the augmented reality device 103 can enhance the image.
The somatosensory interaction device display 104 is arranged at the output end of the augmented reality device 103, and the input end of the somatosensory interaction device display 104 is electrically connected with the output end of the augmented reality device 103 through a wire harness.
The voice control module 105 is arranged at the output end of the somatosensory interaction device display 104, and the input end of the voice control module 105 is electrically connected with the output end of the somatosensory interaction device display 104 through a wire harness; the voice control module 105 makes the whole system more intelligent and quicker to use.
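The chain of connections described for modules 101 to 105, with each module's output end feeding the next module's input end, can be modelled as the ordered pipeline below. The stand-in functions are placeholders introduced for illustration only and are not APIs defined by the disclosure.

```python
from typing import Callable, Dict, List

def image_transmission(frame: Dict) -> Dict:         # module 101
    return frame

def radar_processing(frame: Dict) -> Dict:           # module 102
    return {**frame, "trajectory": "recognized"}

def augmented_reality_device(frame: Dict) -> Dict:   # module 103
    return {**frame, "overlay": "rendered"}

def somatosensory_display(frame: Dict) -> Dict:      # module 104
    print("showing frame:", frame)
    return frame

def voice_control(frame: Dict) -> Dict:              # module 105 (voice commands would act on the display)
    return frame

PIPELINE: List[Callable[[Dict], Dict]] = [
    image_transmission,
    radar_processing,
    augmented_reality_device,
    somatosensory_display,
    voice_control,
]

def run(frame: Dict) -> Dict:
    for module in PIPELINE:   # the wire-harness connection is modelled as a call chain
        frame = module(frame)
    return frame

if __name__ == "__main__":
    run({"pixels": []})
```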
The auxiliary system 2 comprises an operation time recorder 201, a display abnormality sensor 202, an alarm module 203 and a temperature sensor 204, which are all arranged at the output end of the somatosensory interaction device display 104.
The input ends of the operation time recorder 201, the display abnormality sensor 202, the alarm module 203 and the temperature sensor 204 are each electrically connected with the output end of the somatosensory interaction device display 104 through a wire harness. The operation time recorder 201 records the running time of the somatosensory interaction device display 104; the display abnormality sensor 202 monitors whether the display 104 is operating abnormally, and when it is, an alarm sound is emitted through the alarm module 203; the temperature sensor 204 monitors the temperature of the display 104 during operation, preventing the display 104 from failing to operate normally because of excessive temperature and thus prolonging its service life.
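The operation time recorder 201 could be realized along the lines of the sketch below; the bookkeeping shown (monotonic-clock accumulation) is an assumption, since the disclosure does not specify how running time is recorded.

```python
import time

class OperationTimeRecorder:
    """Accumulates the display's total running time (illustrative sketch of 201)."""

    def __init__(self):
        self.total_seconds = 0.0
        self._started_at = None

    def start(self):
        self._started_at = time.monotonic()

    def stop(self):
        if self._started_at is not None:
            self.total_seconds += time.monotonic() - self._started_at
            self._started_at = None

    def total_hours(self):
        return self.total_seconds / 3600.0
```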
One specific application of this embodiment is as follows: the image is transmitted through the image transmission module 101, enhanced through the radar processing module 102 and the augmented reality device 103, and displayed through the somatosensory interaction device display 104; the operation time recorder 201, the display abnormality sensor 202, the alarm module 203 and the temperature sensor 204 monitor the running state of the somatosensory interaction device display 104 in real time, so that damage to the display 104 is avoided.
In the description herein, references to the description of "one embodiment," "an example," "a specific example" or the like are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.
Claims (10)
1. An augmented reality somatosensory interaction method based on a laser radar is characterized in that: the method comprises the following steps:
s1, the image acquisition module acquires an image and transmits the image through the image transmission module;
s2, the radar processing module and the augmented reality equipment perform augmented processing on the image;
and S3, displaying the image through the somatosensory interaction device display.
2. The lidar-based augmented reality somatosensory interaction method according to claim 1, wherein: in step S2, a plurality of radar antennas are distributed around the circumference, and the radar processing module receives the signals from the radar antennas and recognizes the motion trajectory, thereby enhancing the image.
3. The lidar-based augmented reality somatosensory interaction method according to claim 1, wherein: when the radar processing module is not operating in step S2, the motion trajectory is recognized by a gyroscope, thereby enhancing the image.
4. The lidar-based augmented reality somatosensory interaction method according to claim 1, wherein: while the image is displayed through the somatosensory interaction device display in step S3, the operation time recorder, the display abnormality sensor, the alarm module and the temperature sensor monitor the running state of the somatosensory interaction device display in real time; when the display abnormality sensor senses a display abnormality signal, the abnormality flag is bound to a timestamp to generate a display abnormality record, which is stored, and the alarm module raises an alarm at the same time; when the temperature sensor senses that the temperature exceeds the set temperature, the alarm module gives an alarm.
5. An augmented reality somatosensory interaction system based on a laser radar, comprising an interaction system (1) and an auxiliary system (2), characterized in that: the auxiliary system (2) is arranged at the output end of the interaction system (1), and the input end of the auxiliary system (2) is electrically connected with the output end of the interaction system (1) through a wire harness; the interaction system (1) comprises an image transmission module (101), a radar processing module (102), an augmented reality device (103), a somatosensory interaction device display (104) and a voice control module (105); the radar processing module (102) is arranged at the output end of the image transmission module (101), and the input end of the radar processing module (102) is electrically connected with the output end of the image transmission module (101) through a wire harness.
6. The lidar-based augmented reality somatosensory interaction system according to claim 5, wherein: the augmented reality device (103) is arranged at the output end of the radar processing module (102), and the input end of the augmented reality device (103) is electrically connected with the output end of the radar processing module (102) through a wire harness.
7. The lidar-based augmented reality somatosensory interaction system according to claim 6, wherein: the somatosensory interaction device display (104) is arranged at the output end of the augmented reality device (103), and the input end of the somatosensory interaction device display (104) is electrically connected with the output end of the augmented reality device (103) through a wire harness.
8. The lidar-based augmented reality somatosensory interaction system according to claim 7, wherein: the voice control module (105) is arranged at the output end of the somatosensory interaction device display (104), and the input end of the voice control module (105) is electrically connected with the output end of the somatosensory interaction device display (104) through a wire harness.
9. The lidar-based augmented reality somatosensory interaction system according to claim 8, wherein: the auxiliary system (2) comprises an operation time recorder (201), a display abnormality sensor (202), an alarm module (203) and a temperature sensor (204), which are all arranged at the output end of the somatosensory interaction device display (104).
10. The lidar-based augmented reality somatosensory interaction system according to claim 9, wherein: the input ends of the operation time recorder (201), the display abnormality sensor (202), the alarm module (203) and the temperature sensor (204) are each electrically connected with the output end of the somatosensory interaction device display (104) through a wire harness.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011611544.3A CN112904999A (en) | 2020-12-30 | 2020-12-30 | Augmented reality somatosensory interaction method and system based on laser radar |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011611544.3A CN112904999A (en) | 2020-12-30 | 2020-12-30 | Augmented reality somatosensory interaction method and system based on laser radar |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112904999A true CN112904999A (en) | 2021-06-04 |
Family
ID=76112066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011611544.3A Pending CN112904999A (en) | 2020-12-30 | 2020-12-30 | Augmented reality somatosensory interaction method and system based on laser radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112904999A (en) |
- 2020-12-30: Application CN202011611544.3A filed in China (CN); published as CN112904999A (en), status: Pending
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN204375125U (en) * | 2014-11-28 | 2015-06-03 | 国家电网公司 | Operation irregularity alarm device |
CN104483754A (en) * | 2014-12-04 | 2015-04-01 | 上海交通大学 | Head-wearing type multimedia terminal assisted watching system aiming at patient with dysopia |
CN106502424A (en) * | 2016-11-29 | 2017-03-15 | 上海小持智能科技有限公司 | Based on the interactive augmented reality system of speech gestures and limb action |
CN207380667U (en) * | 2017-08-10 | 2018-05-18 | 苏州神秘谷数字科技有限公司 | Augmented reality interactive system based on radar eye |
CN207115332U (en) * | 2017-08-28 | 2018-03-16 | 重庆卢浮印象数字科技有限公司 | Body-sensing gloves and gesture-capture system |
CN107491008A (en) * | 2017-08-29 | 2017-12-19 | 歌尔股份有限公司 | Wearable device |
CN211698428U (en) * | 2018-08-01 | 2020-10-16 | 深圳先进技术研究院 | Head-mounted display device |
CN111149079A (en) * | 2018-08-24 | 2020-05-12 | 谷歌有限责任公司 | Smart phone, system and method including radar system |
CN209572101U (en) * | 2019-04-24 | 2019-11-01 | 杭州赛鲁班网络科技有限公司 | Teaching share system based on augmented reality |
CN110688005A (en) * | 2019-09-11 | 2020-01-14 | 塔普翊海(上海)智能科技有限公司 | Mixed reality teaching environment, teacher and teaching aid interaction system and interaction method |
CN110553686A (en) * | 2019-10-12 | 2019-12-10 | 重庆四标精密零部件制造有限公司 | Cold header running state monitoring system |
CN111488823A (en) * | 2020-04-09 | 2020-08-04 | 福州大学 | Dimension-increasing gesture recognition and interaction system and method based on two-dimensional laser radar |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |