CN104866106A - HUD and infrared identification-combined man-machine interactive method and system - Google Patents

HUD and infrared identification-combined man-machine interactive method and system

Info

Publication number
CN104866106A
CN104866106A (application CN201510297399.9A)
Authority
CN
China
Prior art keywords
hud
man
infrared
gesture
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510297399.9A
Other languages
Chinese (zh)
Inventor
何杰
莫冰
韩宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Halation Network Technology Co Ltd Of Shenzhen
Original Assignee
Halation Network Technology Co Ltd Of Shenzhen
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Halation Network Technology Co Ltd Of Shenzhen filed Critical Halation Network Technology Co Ltd Of Shenzhen
Priority to CN201510297399.9A priority Critical patent/CN104866106A/en
Publication of CN104866106A publication Critical patent/CN104866106A/en
Pending legal-status Critical Current

Links

Abstract

The invention discloses a human-machine interaction method combining a HUD with infrared recognition, comprising the following steps: S1, photographing a gesture made by the user and obtaining the actual action feature value corresponding to that gesture; S2, matching the actual action feature value obtained in S1 against preset action feature values, each associated with an action command, to obtain the corresponding command; S3, automatically executing the command obtained in S2 and feeding the execution result back to the in-vehicle HUD display system; S4, reflecting and imaging the result of S3 onto a display surface via the in-vehicle HUD display system. Compared with the prior art, the method lets the driver keep watching the road ahead and complete a variety of gestures to interact with the system without shifting gaze; it is low-cost, safe and simple to operate, highly extensible, and convenient for the driver to use while driving.

Description

Human-machine interaction method and system combining a HUD with infrared recognition
Technical field
The present invention relates to the field of human-machine interaction, and specifically to a human-machine interaction method and system combining a HUD with infrared recognition.
Background technology
A HUD (Head-Up Display) presents important information such as vehicle speed, engine RPM, coolant temperature, voltage, and navigation within the driver's line of sight on the windshield, so the driver can read it without looking down, avoiding both distraction from the road ahead and eye fatigue. Visually a HUD resembles a transparent display, but it is essentially an application of projection display technology: the signals produced by conventional instruments cannot drive a HUD directly and must first be converted by a computer, after which the required data is passed to the HUD's display unit and projected onto the windshield. In automotive use, the most practical HUD functions fall into three categories: vehicle information, navigation, and safety; the safety functions are the most valuable and the main direction of future development.
At present, drivers can interact with in-car systems only by taking their eyes off the road to look down at the instrument panel and by touching physical buttons or touchscreens, which is dangerous while driving. Some manufacturers use infrared sensors to detect left/right waving gestures, but these can trigger only very simple interactions. A few manufacturers use structured light or install dual cameras to obtain the depth of the gesture in space for gesture recognition, but this is expensive to implement and hard to popularize.
Summary of the invention
To address these problems of the prior art — drivers must look down at the instrument panel and interact through physical buttons, touchscreens, or simple infrared sensors, which is dangerous while driving, while structured-light and dual-camera gesture recognition is too expensive — the present invention proposes a human-machine interaction method and system combining a HUD with infrared recognition that is cheap to install, safe, and reliable.
The technical scheme adopted by the present invention is a human-machine interaction method combining a HUD with infrared recognition, characterized in that the method comprises the following steps:
S1: photograph the user's gesture and obtain the actual action feature value corresponding to that gesture;
S2: match the actual action feature value obtained in S1 against the preset action feature values of the corresponding action commands, obtaining the action command corresponding to this feature value;
S3: automatically execute the action command matched in S2 and feed the execution result back to the in-vehicle HUD display system;
S4: the in-vehicle HUD display system reflects and images the execution result of S3 onto the display surface.
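The four steps S1 to S4 can be sketched as a single processing loop. This is only an illustrative sketch: `extract_feature`, the feature table, and the tolerance threshold are assumptions for demonstration, not the patent's actual implementation.

```python
def extract_feature(frame):
    """S1: reduce a captured gesture frame to a feature value (placeholder)."""
    return sum(frame) % 256

def match_command(feature, table, tolerance=8):
    """S2: match the measured feature against preset feature values."""
    for preset, command in table:
        if abs(feature - preset) <= tolerance:
            return command
    return None  # no preset matched: ignore the gesture

def run_pipeline(frame, table, execute, display):
    feature = extract_feature(frame)          # S1: feature from the photo
    command = match_command(feature, table)   # S2: feature -> command
    if command is None:
        return None
    result = execute(command)                 # S3: act and collect the result
    display(result)                           # S4: reflect it onto the HUD
    return result
```

`execute` and `display` are injected so the same loop works whether the result goes to a real HUD or a test stub.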
Preferably, photographing the user's gesture in step S1 comprises capturing the infrared light reflected by the gesture, generating an optical signal that stores the raw information, and converting it into a RAW-format digital signal.
Preferably, the actual action feature value is contour data of the gesture, obtained as the difference between the RGB pixel values on the gesture contour and the pixel values at random distances around them.
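A minimal sketch of the contour feature described above, assuming a grayscale image stored as a coordinate-to-intensity map: each contour pixel is compared with a neighbour at a random offset. The fixed RNG seed and the `max_dist` bound are illustrative assumptions, not values from the patent.

```python
import random

def contour_feature(image, contour, max_dist=3, seed=0):
    """image: dict mapping (x, y) -> intensity; contour: list of (x, y).

    Returns, for each contour pixel, the difference between its value and
    that of a neighbour at a random offset (missing neighbours count as 0).
    """
    rng = random.Random(seed)  # fixed seed keeps the sketch deterministic
    feats = []
    for (x, y) in contour:
        dx = rng.randint(-max_dist, max_dist)
        dy = rng.randint(-max_dist, max_dist)
        neighbour = image.get((x + dx, y + dy), 0)
        feats.append(image[(x, y)] - neighbour)
    return feats
```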
Preferably, the matching in step S2 comprises: first removing the background from the captured picture with a mixture-of-Gaussians background detection algorithm, extracting the foreground of the picture as the gesture contour; classifying the colour values of the contour data with a CNN deep-learning algorithm; and combining the gesture type chosen by the classifier with the motion trajectory of the gesture to determine the gesture command.
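The first stage of that matching, foreground/background separation, can be illustrated as follows. The patent names a mixture-of-Gaussians background model; this sketch substitutes a much simpler running-average background with a fixed threshold, purely to show the role the step plays.

```python
class BackgroundSubtractor:
    """Stand-in for a mixture-of-Gaussians background model (assumption:
    one running-average background per pixel instead of a full mixture)."""

    def __init__(self, alpha=0.1, threshold=30):
        self.alpha = alpha          # background learning rate
        self.threshold = threshold  # foreground decision threshold
        self.background = None

    def apply(self, frame):
        """frame: flat list of pixel intensities; returns a 0/1 mask."""
        if self.background is None:
            self.background = list(frame)   # first frame seeds the model
            return [0] * len(frame)
        mask = [1 if abs(p - b) > self.threshold else 0
                for p, b in zip(frame, self.background)]
        # drift the background model toward the current frame
        self.background = [(1 - self.alpha) * b + self.alpha * p
                           for p, b in zip(frame, self.background)]
        return mask
```

The 1-pixels of the mask are the foreground; their outline would be the gesture contour handed to the classifier.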
Preferably, automatically executing the matched action command in step S3 means that the GPU submits the command's data and computation logic to an ARM processor through a compute framework such as OpenCL or CUDA, and the ARM processor executes the corresponding action command.
Preferably, reflecting and imaging the execution result in step S4 comprises: the HUD display system reads the digital signal of the executed command's result from the ARM processor over an RGB888 interface, converts the digital signal into an optical signal, and images it through optical lenses as a magnified virtual image in front of the windshield.
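The RGB888 link mentioned above carries three 8-bit colour channels per pixel. A tiny pack/unpack helper illustrates the format; it is a generic illustration of RGB888, not code from the patent.

```python
def pack_rgb888(r, g, b):
    """Pack three 8-bit channels into one 24-bit RGB888 word."""
    return (r << 16) | (g << 8) | b

def unpack_rgb888(word):
    """Recover the (r, g, b) channels from a 24-bit RGB888 word."""
    return (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF
```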
To support the above method, the present invention also provides a human-machine interaction system combining a HUD with infrared recognition, comprising:
a photographing module, for capturing the gesture when the user interacts and converting the recorded original optical signal of the gesture into a digital signal;
a HUD module, for displaying the result of the command executed by the system in response to the gesture; characterized in that the system further comprises
a circuit module, electrically connected to the photographing module and the HUD module, which receives and processes the digital gesture signal from the photographing module and sends the processed gesture result to the HUD module.
Preferably, the photographing module comprises an infrared camera and an infrared LED array. The LED array is made up of several LEDs distributed symmetrically around the infrared camera and powered by the circuit module. The photographing module is electrically connected to the circuit module through a mainstream interface such as MIPI or CSI and transmits the RAW data captured by the infrared camera; these RAW data record the RGB value of each pixel from the image sensor. An infrared-transmissive filter layer sits below the infrared light source to improve transmittance in the required waveband. The infrared light source illuminates the driver's gesture, and the infrared camera sends the captured gesture data to the circuit module.
Preferably, the circuit module integrates an ARM processor and a GPU for vision computation. The GPU separates the background from the gesture contour in the pictures delivered by the photographing module, classifies the separated gesture image with a CNN deep-learning algorithm, combines the classification result with the motion trajectory to compute the sliding direction of the gesture, predicts the direction of the gesture motion with a hidden Markov model, and finally determines the action command of the gesture and passes it to the ARM processor. The ARM processor accepts and completes the command and sends the resulting information to the HUD module, which generates the corresponding display interface and projects it onto the windshield. The circuit module also powers the system through a lead wire connected to the car's cigarette lighter or to the vehicle's OBD system. When the circuit module is connected to the vehicle's OBD interface, the OBD system not only supplies power to the interaction system but also transmits instrument-panel information such as speed and fuel level to the circuit module in the standard OBD encoding; after processing this information, the circuit module sends a custom-generated display pattern to the HUD module.
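Of the pipeline described above, the trajectory half is easy to illustrate: reduce the chronological hand positions to a swipe direction and look up a (gesture class, direction) pair in a command table. The command table and class names here are assumed for illustration; the patent's CNN classifier and hidden-Markov predictor are not reproduced.

```python
def swipe_direction(track):
    """track: chronological (x, y) positions of the hand centroid."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else "down"   # image y grows downward

# Assumed mapping, loosely following the incoming-call example:
# waving sideways declines, a thumbs-up moved upward answers.
COMMANDS = {
    ("wave", "left"): "decline_call",
    ("wave", "right"): "decline_call",
    ("thumb_up", "up"): "answer_call",
}

def to_command(gesture_class, track):
    """Combine the classifier's output with the trajectory direction."""
    return COMMANDS.get((gesture_class, swipe_direction(track)))
```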
Preferably, the HUD module is electrically connected to the circuit module through an RGB888 interface and consists of a projection module, a front-projection imaging film, and a mirror lens; it reflects the result of the processed gesture command onto the windshield by projection imaging. The projection module is a DLP micro-projector, a laser micro-projector, or a high-brightness LCD screen, and displays the digital signal received from the circuit module. The front-projection imaging film is made by printing more than 100,000 near-nanoscale convex structures onto the film surface with a chip lithography process and then coating it with a nanoscale photosensitive layer by vacuum magnetron sputtering, so that the film has both good diffusion and high optical gain. The projection module faces the front-projection imaging film, and the mirror lens sits behind the projection module to receive and image the light reflected by the film. When the projection module of the HUD module is a DLP projector, it contains a digital DMD micromirror chip: an array of many tiny square mirrors packed onto the electronic nodes of a silicon wafer, each mirror corresponding to one pixel of the synthesized image. In operation, the memory cell behind each micromirror switches it between two tilt positions. The RGB LED light source is projected onto the mirrors; when light of one colour strikes the DMD surface, each micromirror, according to the amount of that colour in its own pixel, determines how often it is in the "on" position for that light, and hence how much of the reflected light the projection lens sends to the screen. As the light of the other colours strikes the DMD surface in turn, all the micromirrors rapidly repeat this action, and the final result is a colour image on the front-projection imaging film.
The working process of the human-machine interaction system combining a HUD with infrared recognition provided by the invention: the system is suspended below the sun visor at the driver's position or directly in front of the console. The reserved power lead is connected to the car's OBD interface to start the system. The infrared light source illuminates the driver's hand, and the infrared camera captures the driver's gesture in real time through the infrared-transmissive layer. The captured pictures are sent to the GPU of the circuit module, which classifies them with a CNN deep-learning algorithm and, combining the classification with the gesture's motion trajectory, determines the action command the gesture represents and completes it. The status information of the executed command is processed by the HUD module into an optical signal and projected onto the mirror lens, where the displayed virtual image is fed back to the driver. The displayed virtual-image content includes, but is not limited to, instrument-panel data, speed, navigation data, mobile-phone link information, and other real-time information that assists the driver.
The beneficial effects of the invention are: by combining a HUD display with infrared recognition, the driver can keep attention on the road ahead while driving and complete a variety of gestures to interact with the system without shifting gaze; the system is low-cost, safe and simple to operate, highly extensible, and convenient for the driver to use while driving.
Accompanying drawing explanation
Fig. 1 is a block diagram of the human-machine interaction method combining a HUD with infrared recognition provided by the invention;
Fig. 2 is an architecture diagram of the human-machine interaction system combining a HUD with infrared recognition provided by the invention;
Fig. 3 shows the content displayed by the HUD module when the mobile phone receives an incoming call after the system is switched on, as provided in Embodiment 1;
Fig. 4 is a hardware structure diagram of the human-machine interaction system provided in Embodiment 1;
Fig. 5 shows the mounting of the human-machine interaction system provided in Embodiment 1.
Reference numerals in the figures: 1, mirror lens; 2, circuit module; 3, infrared camera; 4, infrared-transmissive layer; 5, projection module; 6, front-projection imaging film; 7, windshield; 8, HUD module; 9, sun visor; 10, shooting angle of the infrared camera.
Embodiment
The present invention is further described below with reference to Figs. 1 to 5 and the embodiments, which should not be taken to limit the scope of the invention.
As shown in Figs. 1 and 2, the invention provides a human-machine interaction method combining a HUD with infrared recognition, characterized in that the method comprises the following steps:
S1: photograph the user's gesture and obtain the actual action feature value corresponding to that gesture;
S2: match the actual action feature value obtained in S1 against the preset action feature values of the corresponding action commands, obtaining the action command corresponding to this feature value;
S3: automatically execute the action command matched in S2 and feed the execution result back to the in-vehicle HUD display system;
S4: the in-vehicle HUD display system reflects and images the execution result of S3 onto the display surface.
To support the above method, the present invention also provides a human-machine interaction system combining a HUD with infrared recognition, comprising:
a photographing module 1, for capturing the gesture when the user interacts and converting the recorded original optical signal of the gesture into a digital signal;
a HUD module 2, for displaying the result of the command executed by the system in response to the gesture; characterized in that the system further comprises
a circuit module 3, electrically connected to the photographing module 1 and the HUD module 2, which receives and processes the digital gesture signal captured by the photographing module 1 and sends the processed gesture result to the HUD module 2.
Embodiment one
As shown in Fig. 3, when the driver's mobile phone receives a call while driving, the phone transmits the caller information, including phone number and portrait, over the in-car Bluetooth connection, and the in-car HUD displays it on the front windshield. Seeing the caller information and the on-screen gesture prompts — waving left or right declines the call, a thumbs-up answers it — the driver reaches the right hand below the system's photographing module and makes a thumbs-up gesture. The light from the gesture passes through the system's infrared-transmissive filter layer; after the infrared camera of the photographing module takes the picture, it is converted into RAW-format digital data and transmitted to the circuit module. The GPU of the circuit module receives this digital signal, removes the background from the picture, classifies the gesture contour with a CNN deep-learning algorithm to determine its type and, combining this with the upward trajectory of the thumb, determines that the command is to activate the answer function. The GPU passes this command to the circuit module, which connects to the phone via Bluetooth and to the HUD module, and activates the phone's answer function; at the same time the HUD module converts the digital command into an optical signal, the projection module of the HUD module projects the call interface onto the front-projection imaging film, and the film reflects the projected light to the opposite mirror lens. The driver sees the virtual call interface formed on the windshield, magnified by the mirror lens.
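The call-handling logic of this embodiment amounts to a two-entry dispatcher, sketched below; `BluetoothPhone` is a hypothetical stand-in for the real in-car Bluetooth link, not an API from the patent.

```python
class BluetoothPhone:
    """Hypothetical stand-in for the phone reached over in-car Bluetooth."""
    def __init__(self):
        self.state = "ringing"
    def answer(self):
        self.state = "in_call"
    def decline(self):
        self.state = "idle"

def handle_gesture(phone, gesture):
    """Map the recognized gesture to a call action, as in Embodiment 1:
    a thumbs-up answers, a left or right wave declines, anything else
    leaves the call untouched."""
    if gesture == "thumb_up":
        phone.answer()
    elif gesture in ("wave_left", "wave_right"):
        phone.decline()
    return phone.state
```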
Embodiment two
As shown in Figs. 4 and 5, the human-machine interaction system combining a HUD with infrared recognition provided by the invention is installed in the car as follows: the system comprises a photographing module 1, a HUD module 2, and a circuit module 3. The whole system is suspended below the sun visor at the driver's position, and the OBD interface pins preset in the circuit module are connected to the car's OBD interface to power the system and provide relevant driving information.
The photographing module of the system comprises an infrared camera and an infrared LED array; the array is made up of several LEDs distributed symmetrically around the infrared camera, with an infrared-transmissive layer below the camera.
The circuit module of the system sits at the top of the whole assembly, against the car's sun visor, and integrates an ARM processor and a GPU.
The HUD module of the system consists of a projection module, a front-projection imaging film, and a mirror lens. In this embodiment the projection module is a DLP micro-projector. The front-projection imaging film is made by printing more than 100,000 near-nanoscale convex structures onto the film surface with a chip lithography process and then coating it with a nanoscale photosensitive layer by vacuum magnetron sputtering. The projection module faces the front-projection imaging film, and the mirror lens behind the projection module magnifies the light reflected by the film onto the windshield for imaging.
Embodiment three
The working process of the human-machine interaction system combining a HUD with infrared recognition provided by the invention: the system is suspended below the sun visor at the driver's position or directly in front of the console. The reserved power lead is connected to the car's OBD interface to start the system. The infrared light source illuminates the driver's hand, and the infrared camera captures the driver's gesture in real time through the infrared-transmissive layer. The captured pictures are sent to the GPU of the circuit module, which classifies them with a CNN deep-learning algorithm and, combining the classification with the gesture's motion trajectory, determines the action command the gesture represents and completes it. The status information of the executed command is processed by the HUD module into an optical signal and projected onto the mirror lens, where the displayed virtual image is fed back to the driver. The displayed virtual-image content includes, but is not limited to, instrument-panel data, speed, navigation data, mobile-phone link information, and other real-time information that assists the driver.
The positive effects of the present application: by combining a HUD display with infrared recognition, the driver can keep attention on the road ahead while driving and complete a variety of gestures to interact with the system without shifting gaze; the system is low-cost, safe and simple to operate, highly extensible, and convenient for the driver to use while driving.
In the light of the disclosure and teaching of the above description, those skilled in the art can also modify and vary the above embodiments. The invention is therefore not limited to the embodiments disclosed and described above, and some modifications and variations of the invention shall also fall within the protection scope of the claims of the invention. In addition, although certain specific terms are used in this specification, they are used only for convenience of description and do not limit the invention in any way.

Claims (10)

1. A human-machine interaction method combining a HUD with infrared recognition, characterized in that the method comprises the following steps:
S1: photograph the user's gesture and obtain the actual action feature value corresponding to that gesture;
S2: match the actual action feature value obtained in S1 against the preset action feature values of the corresponding action commands, obtaining the action command corresponding to this feature value;
S3: automatically execute the action command matched in S2 and feed the execution result back to the in-vehicle HUD display system;
S4: the in-vehicle HUD display system reflects and images the execution result of S3 onto the display surface.
2. The human-machine interaction method combining a HUD with infrared recognition of claim 1, characterized in that photographing the user's gesture in step S1 comprises capturing the infrared light reflected by the gesture, generating an optical signal that stores the raw information, and converting it into a RAW-format digital signal.
3. The human-machine interaction method combining a HUD with infrared recognition of claim 1, characterized in that the actual action feature value is contour data of the gesture, obtained as the difference between the RGB pixel values on the gesture contour and the pixel values at random distances around them.
4. The human-machine interaction method combining a HUD with infrared recognition of claim 1, characterized in that the matching in step S2 comprises: matching the actual action feature value against the preset action feature values and then classifying, and determining the action command corresponding to the gesture from the classification result and the motion trajectory of the gesture.
5. The human-machine interaction method combining a HUD with infrared recognition of claim 1, characterized in that automatically executing the matched action command in step S3 is accomplished by an ARM processor automatically loading and reading the action command and activating the corresponding program.
6. The human-machine interaction method combining a HUD with infrared recognition of claim 1, characterized in that reflecting and imaging the execution result in step S4 comprises: the HUD display system reads the digital signal of the executed command's result from the ARM processor, converts the digital signal into an optical signal, and projects the image onto the display surface.
7. A human-machine interaction system combining a HUD with infrared recognition, for implementing the human-machine interaction method of any one of claims 1 to 6, comprising:
a photographing module, for capturing the gesture when the user interacts and converting the recorded original optical signal of the gesture into a digital signal;
a HUD module, for displaying the result of the command executed by the system in response to the gesture; characterized in that the system further comprises
a circuit module, electrically connected to the photographing module and the HUD module, which receives and processes the digital gesture signal from the photographing module and sends the processed gesture result to the HUD module.
8. The human-machine interaction system combining a HUD with infrared recognition of claim 7, characterized in that the photographing module comprises several infrared LED light sources distributed symmetrically around an infrared camera, with an infrared-transmissive layer below the infrared light source to improve transmittance in the required waveband.
9. The human-machine interaction system combining a HUD with infrared recognition of claim 7, characterized in that the circuit module integrates an ARM processor and a GPU for vision computation; the GPU performs gesture-contour recognition on the infrared image data from the photographing module and matches the recognition result, combined with the motion trajectory of the gesture, to the action command corresponding to the gesture.
10. The human-machine interaction system combining a HUD with infrared recognition of claim 7, characterized in that the HUD module consists of a projection module, a front-projection imaging film, and a mirror lens, wherein the projection module faces the front-projection imaging film and the mirror lens sits behind the projection module to receive and image the light reflected by the film.
CN201510297399.9A 2015-06-03 2015-06-03 HUD and infrared identification-combined man-machine interactive method and system Pending CN104866106A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510297399.9A CN104866106A (en) 2015-06-03 2015-06-03 HUD and infrared identification-combined man-machine interactive method and system


Publications (1)

Publication Number Publication Date
CN104866106A true CN104866106A (en) 2015-08-26

Family

ID=53911987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510297399.9A Pending CN104866106A (en) 2015-06-03 2015-06-03 HUD and infrared identification-combined man-machine interactive method and system

Country Status (1)

Country Link
CN (1) CN104866106A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105677206A (en) * 2016-01-08 2016-06-15 北京乐驾科技有限公司 System and method for controlling head-up display based on vision
CN106693361A (en) * 2016-12-23 2017-05-24 武汉市马里欧网络有限公司 Ultrasonic hand gesture recognition based AR (augmented reality) dress-up game projection method and ultrasonic hand gesture recognition based AR dress-up game projection system
CN107719122A (en) * 2017-09-14 2018-02-23 中国第一汽车股份有限公司 A kind of new line actuation means and method applied to automobile
CN108073274A (en) * 2016-11-10 2018-05-25 财团法人金属工业研究发展中心 Gesture operation method and system based on depth value
CN108229345A (en) * 2017-12-15 2018-06-29 吉利汽车研究院(宁波)有限公司 A kind of driver's detecting system
CN108430819A (en) * 2015-12-22 2018-08-21 歌乐株式会社 Car-mounted device
CN110696614A (en) * 2018-07-10 2020-01-17 福特全球技术公司 System and method for controlling vehicle functions via driver HUD and passenger HUD
CN111124198A (en) * 2018-11-01 2020-05-08 广州汽车集团股份有限公司 Animation playing and interaction method, device, system and computer equipment
CN111158457A (en) * 2019-12-31 2020-05-15 苏州莱孚斯特电子科技有限公司 Vehicle-mounted HUD (head Up display) human-computer interaction system based on gesture recognition
CN111158491A (en) * 2019-12-31 2020-05-15 苏州莱孚斯特电子科技有限公司 Gesture recognition man-machine interaction method applied to vehicle-mounted HUD
CN111596766A (en) * 2020-05-22 2020-08-28 福建天晴数码有限公司 Gesture recognition method of head-mounted device and storage medium
WO2020183249A1 (en) * 2019-03-08 2020-09-17 Indian Institute Of Science A system for man-machine interaction in vehicles
CN112784926A (en) * 2021-02-07 2021-05-11 四川长虹电器股份有限公司 Gesture interaction method and system
CN114701409A (en) * 2022-04-28 2022-07-05 东风汽车集团股份有限公司 Gesture interactive intelligent seat adjusting method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294190A (en) * 2012-02-06 2013-09-11 Ford Global Technologies, LLC Recognition system interacting with vehicle controls through gesture recognition
CN103885580A (en) * 2012-12-20 2014-06-25 Hyundai Motor Company Control system for using hand gesture for vehicle
CN104238731A (en) * 2013-06-24 2014-12-24 Huizhou Huayang Multimedia Electronics Co., Ltd. Gesture control system of head-up display and control method thereof
WO2014203534A1 (en) * 2013-06-20 2014-12-24 Denso Corporation Head-up display device, and illuminating device employed in head-up display device

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108430819A (en) * 2015-12-22 2018-08-21 Clarion Co., Ltd. Vehicle-mounted device
CN105677206A (en) * 2016-01-08 2016-06-15 Beijing Lejia Technology Co., Ltd. System and method for controlling head-up display based on vision
CN108073274B (en) * 2016-11-10 2021-04-06 Metal Industries Research & Development Centre Gesture operation method and system based on depth value
CN108073274A (en) * 2016-11-10 2018-05-25 Metal Industries Research & Development Centre Gesture operation method and system based on depth value
CN106693361A (en) * 2016-12-23 2017-05-24 Wuhan Mario Network Co., Ltd. AR (augmented reality) dress-up game projection method and system based on ultrasonic hand gesture recognition
CN107719122A (en) * 2017-09-14 2018-02-23 China FAW Co., Ltd. Head-up operation device and method applied to automobiles
CN108229345A (en) * 2017-12-15 2018-06-29 Geely Automobile Research Institute (Ningbo) Co., Ltd. Driver detection system
CN110696614A (en) * 2018-07-10 2020-01-17 Ford Global Technologies, LLC System and method for controlling vehicle functions via driver HUD and passenger HUD
CN110696614B (en) * 2018-07-10 2024-04-16 Ford Global Technologies, LLC System and method for controlling vehicle functions via driver HUD and passenger HUD
CN111124198A (en) * 2018-11-01 2020-05-08 Guangzhou Automobile Group Co., Ltd. Animation playing and interaction method, device, system and computer equipment
WO2020183249A1 (en) * 2019-03-08 2020-09-17 Indian Institute Of Science A system for man-machine interaction in vehicles
CN111158457A (en) * 2019-12-31 2020-05-15 Suzhou Laifusite Electronic Technology Co., Ltd. Vehicle-mounted HUD (head-up display) human-computer interaction system based on gesture recognition
CN111158491A (en) * 2019-12-31 2020-05-15 Suzhou Laifusite Electronic Technology Co., Ltd. Gesture recognition man-machine interaction method applied to vehicle-mounted HUD
CN111596766A (en) * 2020-05-22 2020-08-28 Fujian Tianqing Digital Co., Ltd. Gesture recognition method of head-mounted device and storage medium
CN111596766B (en) * 2020-05-22 2023-04-28 Fujian Tianqing Digital Co., Ltd. Gesture recognition method of head-mounted device and storage medium
CN112784926A (en) * 2021-02-07 2021-05-11 Sichuan Changhong Electric Co., Ltd. Gesture interaction method and system
CN114701409A (en) * 2022-04-28 2022-07-05 Dongfeng Motor Group Co., Ltd. Gesture interactive intelligent seat adjusting method and system
CN114701409B (en) * 2022-04-28 2023-09-05 Dongfeng Motor Group Co., Ltd. Gesture interactive intelligent seat adjusting method and system

Similar Documents

Publication Publication Date Title
CN104866106A (en) HUD and infrared identification-combined man-machine interactive method and system
US9116666B2 (en) Gesture based region identification for holograms
CN108885341B (en) Prism-based eye tracking
CN206031079U (en) Vehicle-mounted augmented reality head-up display (AR HUD)
US9727132B2 (en) Multi-visor: managing applications in augmented reality environments
US9651781B2 (en) Head-up display device
CN104875680A (en) HUD (head up display) device combining voice and video recognition
US10445594B2 (en) Onboard display system
US20140285520A1 (en) Wearable display device using augmented reality
RU2017105422A (en) COMPACT WINDSHIELD INDICATION SYSTEM
CN102566756A (en) Comprehension and intent-based content for augmented reality displays
US10067415B2 (en) Method for displaying image using projector and wearable electronic device for implementing the same
TWI572503B (en) Head up display system
CN105677206A (en) System and method for controlling head-up display based on vision
US20190339535A1 (en) Automatic eye box adjustment
US20130003028A1 (en) Floating virtual real image display apparatus
CN107872659B (en) Projection arrangement and projecting method
JP5821464B2 (en) Head-mounted display device
JP5783045B2 (en) Input device and input system
US20120327130A1 (en) Floating virtual plasma display apparatus
KR20180050811A (en) Head up display apparatus for vehicle and method for controlling thereof
US20210191133A1 (en) External recording indicators
US20170285765A1 (en) Input apparatus, input method, and computer program
CN115685654A (en) Projection device, vehicle and display apparatus
KR20230034448A (en) Vehicle and method for controlling thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20150826)