CN116501167A - In-vehicle interaction system based on gesture operation and vehicle - Google Patents
- Publication number
- CN116501167A (application CN202310362987.0A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- vehicle
- sensor
- information
- interaction system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention provides an in-vehicle interaction system based on gesture operation, and a vehicle. The system comprises a sensor and a vehicle body controller which are electrically connected. The sensor collects image information inside the vehicle. The vehicle body controller comprises a gesture segmentation module, a gesture tracking module, a gesture recognition module and an execution module connected in sequence, which receive the image information acquired by the sensor and generate control signals for vehicle-mounted functions. The technical scheme of the invention balances the experience of the driver and the passengers, meets the interaction needs of occupants at multiple positions in the vehicle, improves the safety of both the interaction system and the vehicle, and improves the precision and applicability of the interaction system.
Description
Technical Field
The invention relates to the field of human-computer interaction, in particular to an in-vehicle interaction system based on gesture operation and a vehicle.
Background
With the rapid development of computer technology, people increasingly expect human-computer interaction to happen in more natural ways. The emergence of virtual reality technology has opened new forms and directions for human-computer interaction, and gesture interaction, thanks to its natural and intuitive advantages, has become one of the main modes of human-computer interaction today, widely applied to mobile phones, computer control, motion-sensing games, robot control and the like. Automotive interaction, as a newer area of interaction design, presents new challenges and opportunities.
The patent with publication number CN107102731A discloses a gesture control method for a vehicle. The vehicle includes a vehicle-mounted central control system that is disposed on the vehicle, communicates with it, and controls its actions. The gesture control method includes the following steps: entering a gesture control mode; issuing a gesture instruction; the vehicle-mounted central control system receiving the gesture instruction; the vehicle-mounted central control system controlling the vehicle according to the gesture instruction; and the vehicle acting under the control of the vehicle-mounted central control system.
The patent with publication number CN111469859A discloses an automobile gesture interaction system comprising an acquisition device and a control device. The acquisition device captures images of a specific area in the automobile in real time. The control device comprises an identification unit, a matching unit and an execution unit: the identification unit performs human-hand recognition on the image; the matching unit matches the recognition result against preset actions; and if the match succeeds, the execution unit executes the corresponding instruction.
However, the following problems exist in the prior art:
1. The currently mainstream voice interaction responds slowly, because speech recognition starts only after a voice assistant is invoked, and recognition performance for non-Mandarin speech is poor, so recognition errors occur frequently.
2. Existing gesture recognition operations are designed primarily for the driver, while the riding experience of passengers is often ignored.
3. Most current sensors realize interaction by detecting the limb movements of occupants, involving the hands, arms, legs, head and other body parts, which brings various problems: an overly large monitoring range can cause accidental triggering, since common limb movements may change the intelligent in-vehicle environment even when the user does not intend to; and wide-range, multi-angle monitoring also means greater technical complexity and cost.
4. Existing sensors are sensitive to ambient light, and their sensing accuracy drops greatly under insufficient lighting, while low-light conditions inside a vehicle are by no means rare.
Therefore, how to provide an in-vehicle interaction system based on gesture operation, and a vehicle, capable of solving the above problems is a technical problem to be solved in the art.
Disclosure of Invention
To solve the above technical problems, the invention provides an in-vehicle interaction system based on gesture operation, and a vehicle.
The invention discloses an in-vehicle interaction system based on gesture operation, which comprises a sensor and a vehicle body controller which are electrically connected;
the vehicle body controller comprises a gesture segmentation module, a gesture tracking module, a gesture recognition module and an execution module which are connected in sequence;
the sensor is used for acquiring image information in the vehicle;
the vehicle body controller is used for receiving the image information and generating a control signal to control vehicle-mounted functions;
the method for controlling the vehicle-mounted function comprises the following steps:
the vehicle central control system opens a gesture control switch and enters a gesture control mode;
the interactive system judges whether the gesture is recognized or not, if not, the interactive system returns; if the gesture is identifiable, judging gesture content and linking with preset gesture command database information stored in the vehicle body controller to generate control command data, wherein the method specifically comprises the following steps of:
the sensor acquires image information in the vehicle;
the gesture segmentation module receives the image information and calculates to obtain initial position information of a human hand;
the gesture tracking module receives the initial position information, and tracks and identifies a human hand to obtain imaged hand information and sensor detection range information;
the gesture recognition module receives the hand information and the detection range information of the sensor, and calculates to obtain gesture information;
the execution module receives the gesture information, compares the gesture information with the preset gesture instruction database information, and calculates control instruction data;
and the interactive system controls the vehicle-mounted function according to the control instruction data and outputs an execution instruction.
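The control flow above can be sketched as a minimal Python pipeline. All names (`segment`, `track`, `recognize`, `PRESET_GESTURE_DB`, the frame representation) are illustrative assumptions, not part of the patent; the stubs only show how the four modules chain together and how an unrecognizable gesture returns without issuing a command.

```python
# Hypothetical sketch of the segmentation -> tracking -> recognition ->
# execution pipeline; a frame is modelled as a plain dict for illustration.
PRESET_GESTURE_DB = {
    "five_fingers_open": "play_music",
    "fist": "pause_music",
}

def segment(frame):
    # Gesture segmentation: return the initial hand position, or None.
    return frame.get("hand_start")

def track(frame, start):
    # Gesture tracking: follow the hand from its initial position.
    return {"start": start, "fingers_open": frame.get("fingers_open", 0)}

def recognize(hand):
    # Gesture recognition: classify by the number of extended fingers.
    if hand["fingers_open"] == 5:
        return "five_fingers_open"
    if hand["fingers_open"] == 0:
        return "fist"
    return "unknown"

def process_frame(frame):
    start = segment(frame)
    if start is None:
        return None                          # no gesture recognized: return
    gesture = recognize(track(frame, start))
    # Execution: link the gesture with the preset command database.
    return PRESET_GESTURE_DB.get(gesture)
```

A frame with no hand yields no command; a recognized gesture is looked up in the preset database, mirroring the "judge, link, generate control command" steps.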
According to the in-vehicle interaction system based on gesture operation of the first aspect of the present invention, the preset gesture command database information may be modified, and the modifying method includes:
entering a vehicle central control system modification interface;
receiving a gesture image;
judging whether the gesture image is in compliance or not, and if not, returning; if the gesture is compliant, inputting gesture content;
reading an operation instruction corresponding to the gesture content;
judging whether the manipulation instruction conflicts with the preset gesture instruction database information or not, and if so, returning; if the control instructions do not conflict, the control instructions are input into the preset gesture instruction database information.
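The modification flow above — validate the gesture image, read its command, reject conflicts, otherwise store the mapping — can be sketched as follows. The function and parameter names are assumptions for illustration; "conflict" is interpreted here as the gesture or the command already being bound in the database.

```python
def modify_gesture_db(db, gesture, command, image_compliant=True):
    """Hypothetical sketch of adding a user-defined gesture-to-command mapping."""
    if not image_compliant:
        return False                 # gesture image not compliant: return
    if gesture in db or command in db.values():
        return False                 # conflicts with the preset database: return
    db[gesture] = command            # record the new mapping in the database
    return True
```
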
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, the sensor is an infrared binocular stereo vision sensor.
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, the sensor can be arranged at any of the following positions: the driver's seat; the front passenger seat; a position in front of and to the right of the front passenger, kept at a preset distance from the console buttons; the back of a front-row seat, facing a rear passenger; or above and in front of a rear passenger's head.
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, a shielding object is arranged above the sensor.
According to the in-vehicle interaction system based on gesture operation, which is provided by the first aspect of the invention, the interaction system further comprises a screen; and the interaction system issues an operation instruction according to the hand model data identified by the sensor, and feeds back a gesture instruction corresponding to the hand model data on the screen.
According to the in-vehicle interaction system based on gesture operation, the preset gesture command database information comprises static or dynamic gesture definitions and command definitions for operating hardware devices or multimedia systems in the vehicle.
According to the in-vehicle interaction system based on gesture operation, which is disclosed by the first aspect of the invention, a driver can adjust the authority of gesture operation through the vehicle central control system.
A second aspect of the invention provides a vehicle comprising the gesture-based in-vehicle interaction system of any of the first aspects of the invention.
According to the technical content disclosed by the invention, the method has the following beneficial effects:
1. The technical scheme pays attention not only to the driver but also to the experience of non-drivers: the driver can reduce safety hazards during driving by using gestures, while passengers, being far from the centre console, otherwise cannot effectively control the in-vehicle environment, so sensors arranged at each position can serve more users while occupying little space and improve the riding experience of rear passengers. Meanwhile, the driver can adjust the permissions of gesture operation through the central control to ensure driving safety.
2. Gesture interaction is faster than voice interaction and can be used in particular environments; for example, when a passenger is on the phone or sleeping, gesture operation is quieter.
3. The technical scheme of the invention adjusts the image sensor used and narrows its monitoring range, enabling faster and more accurate control at close range: the frame rate exceeds 200 frames per second, and the precision can reach 0.01 millimetre. Narrowing the range and adding a cover reduce accidental triggering by the user. The adjusted sensor also adapts to more environments, reducing the influence of lighting on imaging and improving applicability.
Other features of the present invention and its advantages will become apparent from the following detailed description of exemplary embodiments of the invention, which proceeds with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of an in-vehicle interaction system based on gesture operation according to an embodiment;
FIG. 2 is a flowchart of an interaction method of an in-vehicle interaction system based on gesture operation according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for linking a gesture content determination with preset gesture command database information stored in a body controller according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method of modifying preset gesture command database information according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The first aspect of the invention discloses an in-vehicle interaction system based on gesture operation shown in fig. 1, which comprises a sensor and a vehicle body controller which are electrically connected;
the vehicle body controller comprises a gesture segmentation module, a gesture tracking module, a gesture recognition module and an execution module which are connected in sequence;
the sensor acquires left and right visual images through the camera and transmits the calibrated stereogram to the vehicle body controller;
the vehicle body controller is used for receiving the image information and generating a control signal to control vehicle-mounted functions;
the method for controlling the vehicle-mounted function is as shown in fig. 2:
the vehicle central control system opens a gesture control switch and enters a gesture control mode;
the interactive system judges whether the gesture is recognized or not, if not, the interactive system returns; if the gesture is identifiable, judging gesture content and linking with preset gesture command database information stored in the vehicle body controller to generate control command data, wherein the specific method is as shown in fig. 3 and comprises the following steps:
the sensor acquires image information in the vehicle;
the gesture segmentation module receives the image information input by the sensor, performs triangulation using the camera's intrinsic and extrinsic parameters to obtain a depth image, processes the depth image with a gesture segmentation algorithm, and segments out the initial position of the human hand, which serves as the starting position for the gesture tracking algorithm; if the tracked target disappears, gesture segmentation is performed again;
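The triangulation step relies on the standard pinhole-stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of a point between the left and right views. A minimal sketch (parameter values in the test are arbitrary examples, not from the patent):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole-stereo triangulation: Z = f * B / d.

    focal_px     -- focal length in pixels (camera intrinsic)
    baseline_m   -- distance between the two cameras in metres (extrinsic)
    disparity_px -- horizontal pixel offset of the same point in both views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Applying this per pixel over the calibrated stereo pair yields the depth image that the segmentation algorithm then processes.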
the gesture tracking module receives the data from the gesture segmentation module, including whether a gesture was detected, the starting position of the gesture, the trajectory of each individual hand and the like, and tracks and identifies the gesture from its starting position. Once the hand position is tracked, a hand object is defined using a HandController component, whose main function is to represent the hand information detected by the sensor and the sensor's detection range.
The gesture recognition module receives the information processed by the gesture tracking module and runs a script to analyse and recognize the gesture. For example, to distinguish a simple fist from an open palm, the finger information is traversed once to count the extended fingers, and that count decides between the two. To recognize a "scissors" gesture, the module checks that the thumb, ring finger and little finger are closed while the index and middle fingers are extended; common gestures can be defined in this way. For more complex or dynamic gestures, the angle between the finger direction and the fingertip direction, and the angle between the palm's normal vector and the fingertip direction, must also be calculated. For two-handed motions, the data of both hands are retrieved and their relationship is judged from the spatial distance between the hands and the relative positions of the fingertips.
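The finger-traversal logic described above can be sketched as a small classifier. The finger names and the boolean "extended" representation are assumptions for illustration; a real module would derive these flags from the tracked hand model.

```python
# Classify a static gesture from per-finger open/closed flags, mirroring
# the traversal described in the text. Names are illustrative.
FINGERS = ("thumb", "index", "middle", "ring", "pinky")

def classify(extended):
    """extended maps each finger name to True if that finger is open."""
    n_open = sum(extended[f] for f in FINGERS)
    if n_open == 5:
        return "five_fingers_open"
    if n_open == 0:
        return "fist"
    # Scissors: index and middle open, all other fingers closed.
    if n_open == 2 and extended["index"] and extended["middle"]:
        return "scissors"
    return "unknown"
```
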
The execution module compares against the preset gesture command database information stored in the vehicle body controller, obtains command data from the information parameters, controls vehicle-mounted functions such as the power windows, air conditioning and music system, and outputs an execution command.
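The execution module's lookup-and-dispatch step might look like the sketch below. The gesture names, device names and the string message format are all hypothetical; a production system would emit real bus messages to the window, air-conditioning and media controllers instead.

```python
# Illustrative dispatch from a recognized gesture to an on-board function.
VEHICLE_COMMANDS = {
    "five_fingers_open": ("music_system", "play"),
    "fist": ("music_system", "pause"),
    "swipe_up": ("air_conditioner", "fan_up"),
    "swipe_down": ("air_conditioner", "fan_down"),
}

def execute(gesture):
    entry = VEHICLE_COMMANDS.get(gesture)
    if entry is None:
        return None                    # unknown gesture: issue no command
    device, command = entry
    return f"{device}:{command}"       # stands in for the real bus message
```
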
According to the in-vehicle interaction system based on gesture operation in the first aspect of the present invention, the preset gesture command database information is preset system information, and the user can modify the settings according to the needs during the use, and the modification method is shown in fig. 4:
entering a vehicle central control system modification interface;
receiving a gesture image;
judging whether the gesture image is in compliance or not, and if not, returning; if the gesture is compliant, inputting gesture content;
reading an operation instruction corresponding to the gesture content;
judging whether the manipulation instruction conflicts with the preset gesture instruction database information or not, and if so, returning; if the control instructions do not conflict, the control instructions are input into the preset gesture instruction database information.
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, the sensor is an infrared binocular stereo vision sensor, which combines infrared and binocular imaging. Compared with structured-light and TOF sensors, a binocular stereo vision sensor is better suited to short-range monitoring and, for small targets such as gestures, can meet the requirements while fully exploiting the sensor's performance. The infrared cameras can image at night, mainly by capturing infrared light emitted by an infrared lamp panel, which compensates well for binocular imaging's susceptibility to ambient light.
When using the gesture operation-based in-vehicle interaction system of the first aspect of the invention, the user places a hand close to the sensor along its forward direction; for example, for a Leap Motion sensor, the hand should be about 10-60 cm from the sensor, and the actual range varies with the type of binocular sensor used.
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, the sensor may be installed at any suitable position in the vehicle, as long as two conditions are met: the user's hand can reach the sensor's control range, and the region is not prone to accidental triggering. For example, it can be installed near the driver's and front passenger's seats, although the physical controls there are already fairly comprehensive, so installation there is optional. It can also be placed in front of and to the right of the front passenger, away from the console buttons, to prevent accidental triggering; or on the back of a front-row seat facing a rear passenger, or above and in front of a rear passenger's head, making it convenient for rear passengers to operate and improving their interactivity.
According to the gesture operation-based in-vehicle interaction system of the first aspect of the invention, a cover is arranged above the sensor, which may be a sliding cover, a film or the like, opened only when in use, to prevent misoperation caused by everyday activities.
According to the in-vehicle interaction system based on gesture operation of the first aspect of the invention, the interaction system further comprises a screen; the interaction system issues an operation instruction according to the hand model data identified by the sensor, and feeds back the corresponding gesture instruction on the screen. The sensor can project the hand model data onto the screen through an algorithm to operate the actual application panel, so that operations achievable with a mouse or touch screen, such as mid-air clicking, dragging and page turning, become possible; pairing with a projector achieves an even better effect.
According to the in-vehicle interaction system based on gesture operation of the first aspect of the invention, the preset gesture command database information includes a plurality of static or dynamic gesture definitions, such as opening five fingers, making a fist, swiping a finger up, swiping a finger down, and making a V sign; it also includes command definitions for operating various hardware devices or multimedia systems in the vehicle, such as playing music, pausing music, increasing the air-conditioning fan speed, decreasing the air-conditioning fan speed, taking a photo and the like.
According to the in-vehicle interaction system based on gesture operation, which is disclosed by the invention, a driver can adjust the authority of gesture operation through the vehicle central control system so as to ensure the driving safety.
In summary, the technical solutions of the aspects of the present invention have the following advantages compared with the prior art:
1. Gesture interaction is faster than voice interaction and can be used in particular environments; for example, when a passenger is on the phone or sleeping, gesture operation is quieter.
2. The scheme pays attention not only to the driver but also to the experience of non-drivers: the driver can reduce safety hazards during driving by using gestures, while passengers, being far from the centre console, otherwise cannot effectively control the in-vehicle environment, so sensors arranged at each position can serve more users while occupying little space and improve the riding experience of rear passengers. Meanwhile, the driver can adjust the permissions of gesture operation through the central control to ensure driving safety.
3. The technical scheme adjusts the image sensor used and narrows its monitoring range, enabling fast and accurate control at close range: the frame rate exceeds 200 frames per second, and the precision can even reach 0.01 millimetre. Narrowing the range and adding a cover reduce accidental triggering by the user. The adjusted sensor also adapts to more environments, reducing the influence of lighting on imaging and improving applicability.
The second aspect of the invention provides a vehicle comprising the gesture operation-based in-vehicle interaction system according to any embodiment of the first aspect, which not only meets the interaction needs of occupants at all positions, but also improves the safety of the interaction system, thereby improving the safety and comfort of the vehicle.
Note that the technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be regarded as the scope of the description. The foregoing examples represent only a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the invention. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application is to be determined by the claims appended hereto.
Claims (9)
1. An in-vehicle interaction system based on gesture operation is characterized in that,
comprises a sensor and a vehicle body controller which are electrically connected;
the vehicle body controller comprises a gesture segmentation module, a gesture tracking module, a gesture recognition module and an execution module which are connected in sequence;
the sensor is used for acquiring image information in the vehicle;
the vehicle body controller is used for receiving the image information and generating a control signal to control vehicle-mounted functions;
the method for controlling the vehicle-mounted function comprises the following steps:
the vehicle central control system opens a gesture control switch and enters a gesture control mode;
the interactive system judges whether the gesture is recognized or not, if not, the interactive system returns; if the gesture is identifiable, judging gesture content and linking with preset gesture command database information stored in the vehicle body controller to generate control command data, wherein the method specifically comprises the following steps of:
the sensor acquires image information in the vehicle;
the gesture segmentation module receives the image information and calculates to obtain initial position information of a human hand;
the gesture tracking module receives the initial position information, and tracks and identifies a human hand to obtain imaged hand information and sensor detection range information;
the gesture recognition module receives the hand information and the detection range information of the sensor, and calculates to obtain gesture information;
the execution module receives the gesture information, compares it with the preset gesture command database information, and calculates control command data;
and the interaction system controls the vehicle-mounted function according to the control command data and outputs an execution command.
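The control flow recited in claim 1 — segment, track, recognize, then match against the preset database — can be sketched as a simple pipeline. This is a hypothetical illustration only: the gesture labels, command names, and frame representation below are invented assumptions, not part of the claim or any disclosed implementation.

```python
# Hypothetical sketch of the claim-1 flow: gesture segmentation -> tracking ->
# recognition -> execution against a preset gesture command database.
# All gesture names and commands are invented for illustration.

PRESET_GESTURE_COMMANDS = {
    "palm_open": "pause_media",
    "swipe_left": "previous_track",
    "two_finger_up": "volume_up",
}

def segment(frame):
    """Gesture segmentation module: return an initial hand position, or None."""
    return frame.get("hand_position")  # placeholder for a real detector

def track(frame, position):
    """Gesture tracking module: return imaged hand information, or None."""
    if position is None:
        return None
    return {"position": position, "shape": frame.get("hand_shape")}

def recognize(hand_info):
    """Gesture recognition module: map hand information to a gesture label."""
    return hand_info["shape"] if hand_info else None

def execute(frame):
    """Execution module: produce a control command, or None if unrecognized."""
    hand_info = track(frame, segment(frame))
    gesture = recognize(hand_info)
    return PRESET_GESTURE_COMMANDS.get(gesture)

# A frame in which the sensor images an open palm:
command = execute({"hand_position": (120, 80), "hand_shape": "palm_open"})
```

In this sketch an unrecognizable frame simply yields `None`, mirroring the "if not recognized, return" branch of the claim.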
2. The in-vehicle interaction system based on gesture operation according to claim 1, wherein the preset gesture command database information is modifiable, and the method of modification comprises:
entering a vehicle central control system modification interface;
receiving a gesture image;
judging whether the gesture image is compliant; if not, returning; if compliant, inputting the gesture content;
reading a manipulation instruction corresponding to the gesture content;
judging whether the manipulation instruction conflicts with the preset gesture command database information; if so, returning; if there is no conflict, entering the manipulation instruction into the preset gesture command database information.
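The modification procedure of claim 2 amounts to a validated insert with a conflict check. The sketch below is an assumption-laden illustration: the compliance rule (a non-empty gesture name) and the dictionary data shape are placeholders, not the patent's actual validation logic.

```python
# Hypothetical sketch of the claim-2 modification flow: validate the new
# gesture, then reject it if it conflicts with an existing database entry.

def modify_gesture_database(database, gesture_name, command):
    """Return True if the gesture/command pair was stored, False if rejected."""
    # Compliance check (placeholder rule: gesture name must be non-empty).
    if not gesture_name:
        return False
    # Conflict check: reject if the gesture is already bound to another command.
    if gesture_name in database and database[gesture_name] != command:
        return False
    database[gesture_name] = command
    return True

db = {"palm_open": "pause_media"}
modify_gesture_database(db, "fist", "mute")       # no conflict: stored
modify_gesture_database(db, "palm_open", "mute")  # conflict: rejected
```

Treating "conflict" as "same gesture already bound to a different command" is one plausible reading; the claim itself does not define the conflict criterion.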
3. The in-vehicle interaction system based on gesture operations of claim 1, wherein the sensor is an infrared binocular stereo vision sensor.
4. The in-vehicle interaction system based on gesture operation according to claim 1, wherein the sensor is installed in at least one of the following positions: the driver seat; the front passenger seat; a position to the right front of the front passenger seat at a predetermined distance from the console buttons; the back of a front seat, facing a rear passenger; or above and in front of the top of a rear passenger's head.
5. The in-vehicle interaction system based on gesture operation according to claim 1, wherein a shielding object is arranged above the sensor.
6. The in-vehicle interaction system based on gesture operation according to claim 1, wherein
the interaction system further comprises a screen; the interaction system issues manipulation instructions according to the hand model data identified by the sensor, and feeds back the gesture command corresponding to the hand model data on the screen.
7. The in-vehicle interaction system based on gesture operations of claim 1, wherein the preset gesture command database information includes static or dynamic gesture definitions, and further includes command definitions for operating hardware devices or multimedia systems in the vehicle.
8. The in-vehicle interaction system based on gesture operation according to claim 1, wherein a driver can adjust the authority of gesture operation through the vehicle central control system.
9. A vehicle, comprising the in-vehicle interaction system based on gesture operation according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310362987.0A CN116501167A (en) | 2023-04-07 | 2023-04-07 | In-vehicle interaction system based on gesture operation and vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116501167A true CN116501167A (en) | 2023-07-28 |
Family
ID=87325765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310362987.0A Pending CN116501167A (en) | 2023-04-07 | 2023-04-07 | In-vehicle interaction system based on gesture operation and vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116501167A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117789297A (en) * | 2023-12-26 | 2024-03-29 | 大明电子股份有限公司 | Vehicle-mounted quick-charging gesture recognition processing method and system |
CN117789297B (en) * | 2023-12-26 | 2024-06-11 | 大明电子股份有限公司 | Vehicle-mounted quick-charging gesture recognition processing method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220164035A1 (en) | Systems and methods for triggering actions based on touch-free gesture detection | |
US9547792B2 (en) | Control apparatus, vehicle, and portable terminal | |
JP5261554B2 (en) | Human-machine interface for vehicles based on fingertip pointing and gestures | |
US9235269B2 (en) | System and method for manipulating user interface in vehicle using finger valleys | |
US10025388B2 (en) | Touchless human machine interface | |
US20160368382A1 (en) | Motor vehicle control interface with gesture recognition | |
KR20150054042A (en) | Vehicle and control method for the same | |
WO2022226736A1 (en) | Multi-screen interaction method and apparatus, and terminal device and vehicle | |
KR102084032B1 (en) | User interface, means of transport and method for distinguishing a user | |
US20210072831A1 (en) | Systems and methods for gaze to confirm gesture commands in a vehicle | |
KR101438615B1 (en) | System and method for providing a user interface using 2 dimension camera in a vehicle | |
US20140168068A1 (en) | System and method for manipulating user interface using wrist angle in vehicle | |
KR20170002902A (en) | Vehicle and method of controlling the same | |
US20230078074A1 (en) | Methods and devices for hand-on-wheel gesture interaction for controls | |
KR20210057358A (en) | Gesture recognition method and gesture recognition device performing the same | |
CN114564102A (en) | Automobile cabin interaction method and device and vehicle | |
CN105759955B (en) | Input device | |
CN114821810A (en) | Static gesture intention recognition method and system based on dynamic feature assistance and vehicle | |
CN116198435B (en) | Vehicle control method and device, vehicle and storage medium | |
CN111638786B (en) | Display control method, device, equipment and storage medium of vehicle-mounted rear projection display system | |
CN116501167A (en) | In-vehicle interaction system based on gesture operation and vehicle | |
CN110850975B (en) | Electronic system with palm recognition, vehicle and operation method thereof | |
CN107107756B (en) | Human/machine interface and method for controlling vehicle functions by detecting driver's movements and/or expressions | |
CN117136347A (en) | Method, system and computer program for touch stabilization | |
US20200218347A1 (en) | Control system, vehicle and method for controlling multiple facilities |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||