CN114779920A - Whole vehicle window gesture control system based on biological recognition and control method thereof - Google Patents


Info

Publication number
CN114779920A
CN114779920A
Authority
CN
China
Prior art keywords
gesture
control system
user
biological
window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111419372.4A
Other languages
Chinese (zh)
Inventor
田鋆
李彦奇
龚晓琴
娄凌宇
张云轩
张楠
王星皓
李佳
苗冬梅
孙玥茹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Bestune Car Co Ltd
Original Assignee
FAW Bestune Car Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Bestune Car Co Ltd filed Critical FAW Bestune Car Co Ltd
Priority to CN202111419372.4A
Publication of CN114779920A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60JWINDOWS, WINDSCREENS, NON-FIXED ROOFS, DOORS, OR SIMILAR DEVICES FOR VEHICLES; REMOVABLE EXTERNAL PROTECTIVE COVERINGS SPECIALLY ADAPTED FOR VEHICLES
    • B60J7/00Non-fixed roofs; Roofs with movable panels, e.g. rotary sunroofs
    • B60J7/0007Non-fixed roofs; Roofs with movable panels, e.g. rotary sunroofs moveable head-liners, screens, curtains or blinds for ceilings

Abstract

The invention discloses a whole vehicle window gesture control system based on biological recognition and a control method thereof, and belongs to the technical field of automobile window control. A biological recognition module acquires dynamic human body data of a user; through these data, a central controller calls the corresponding initially learned user gesture habit data and sends them to a gesture recognition unit. The central controller then generates an execution instruction according to the acquired gesture motion image signal and sends it to the vehicle window driving control system. Operation efficiency and control precision are thereby effectively improved, and the driver no longer needs to press and hold a glass lifting switch to control the window glass.

Description

Whole vehicle window gesture control system based on biological recognition and control method thereof
Technical Field
The invention discloses a full-window gesture control system based on biological recognition and a control method thereof, and belongs to the technical field of automobile window control.
Background
At present, the raising and lowering of a vehicle window is achieved by operating a control button. When the window needs to be moved, the driver must successively search for the control button and then operate it, a sequence of actions that invisibly increases the safety hazard during driving.
To solve this problem, most existing window control schemes change the positions and shapes of the control buttons, require the user to actively identify the control buttons, or perform gesture recognition according to user habits; however, such schemes are difficult to adapt to actual conditions, are easy to imitate, and suffer from low detection accuracy.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a full car window gesture control system based on biological recognition and a control method thereof.
The invention aims to solve the problems by the following technical scheme:
a full vehicle window gesture control system based on biological recognition comprises:
a vehicle window driving control system, a gesture recognition unit, a central controller and a biological recognition module, wherein the central controller is electrically connected with the vehicle window driving control system, the gesture recognition unit and the biological recognition module respectively.
Preferably, the gesture recognition unit adopts a 15W4K32S4 chip.
Preferably, the central controller is a BCM body control module.
Preferably, the gesture recognition unit includes: a first gesture recognition unit and a second gesture recognition unit.
A control method of a whole vehicle window gesture control system based on biological recognition comprises the following steps:
the biological recognition module acquires dynamic human body data of a user and feeds the dynamic human body data back to the central controller;
the central controller obtains the human body biological characteristics of the user from the dynamic human body data, and performs a similarity identity judgment between these characteristics and all biological characteristics in a comparison library constructed offline, to judge whether the requirements are met;
if yes, calling corresponding initially learned user gesture habit data and sending the initially learned user gesture habit data to a gesture recognition unit;
the gesture recognition unit acquires a gesture action image signal and feeds the gesture action image signal back to the central controller;
and the central controller generates an execution instruction according to the acquired gesture motion image signal and sends the execution instruction to the car window driving control system.
Preferably, if the similarity between the human body biological characteristics of the user and all the biological characteristics in the comparison library constructed offline does not meet the requirement, an executable instruction is generated and fed back to the biological recognition module, and the dynamic human body data of the user are re-acquired.
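The method above (identify the user from biometric data, load that user's learned gesture habits, then map a recognized gesture to a window-drive instruction) can be sketched in code. Everything below — the function names, the habit-data structure, and the 0.9 threshold — is an illustrative assumption, not part of the patent:

```python
# Illustrative sketch of the claimed control method.  All names, data
# structures and the similarity threshold are assumptions; the patent
# does not specify them.

SIMILARITY_THRESHOLD = 0.9  # assumed acceptance threshold

def identify_user(sample, comparison_library, similarity):
    """Return the best-matching user id, or None if no entry meets the threshold."""
    best_id, best_score = None, 0.0
    for user_id, reference in comparison_library.items():
        score = similarity(sample, reference)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= SIMILARITY_THRESHOLD else None

def control_window(sample, comparison_library, habit_db,
                   recognize_gesture, similarity):
    """One pass: identify user, load habits, map a gesture to an instruction."""
    user_id = identify_user(sample, comparison_library, similarity)
    if user_id is None:
        return None  # failure branch: re-acquire the dynamic human body data
    habits = habit_db[user_id]           # initially learned gesture habit data
    gesture = recognize_gesture(habits)  # gesture recognition unit feedback
    return habits.get(gesture)           # execution instruction for window drive
```

The similarity function and gesture recognizer are passed in as callables because the patent leaves both unspecified.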
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a full vehicle window gesture control system based on biological recognition and a control method thereof, wherein dynamic human body data of a user are acquired through a biological recognition module, and a central controller calls corresponding initially learned user gesture habit data through the dynamic human body data of the user and sends the initially learned user gesture habit data to a gesture recognition unit; the central controller generates an execution instruction according to the acquired gesture action image signal and sends the execution instruction to the car window driving control system, the gesture sliding replaces a control key, dynamic human body data is a dynamic and sequential biological characteristic, and is more difficult to simulate and forge, the operation efficiency and the control precision are effectively improved, a driver does not need to press a glass lifting switch for a long time to control the car window glass, the inattention of the driver during driving is avoided, and the safety problem during driving is avoided.
Drawings
Fig. 1 is an electrical connection diagram of a full-vehicle window gesture control system based on biological recognition.
Fig. 2 is a schematic diagram of a main driving window control gesture of the whole window gesture control system based on biometric identification according to the invention.
Fig. 3 is a schematic diagram of a passenger window control gesture of the whole window gesture control system based on biometric identification according to the present invention.
Fig. 4 is a schematic diagram of a left rear window control gesture of a whole window gesture control system based on biometric identification according to the present invention.
Fig. 5 is a schematic diagram of a right rear window control gesture of the whole window gesture control system based on biometric identification according to the present invention.
Fig. 6 is an electrical connection diagram of the central controller of the whole window gesture control system based on biometric identification according to the present invention.
Detailed Description
The invention is further illustrated below with reference to the accompanying figures 1-6:
the technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art based on the embodiments of the present invention without any creative effort, fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly stated or limited, the terms "mounted" and "connected" are to be construed broadly and may mean, for example, fixedly connected, detachably connected or integrally connected; mechanically or electrically connected; directly connected, or indirectly connected through intervening media, or denoting internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art in specific cases.
As shown in fig. 1, a first embodiment of the present invention provides a whole vehicle window gesture control system based on biological recognition which, on the basis of the prior art, includes: a vehicle window driving control system, a gesture recognition unit, a central controller and a biological recognition module, wherein the central controller is electrically connected with the vehicle window driving control system, the gesture recognition unit and the biological recognition module respectively. The central controller is a BCM body control module, and the gesture recognition unit includes a first gesture recognition unit and a second gesture recognition unit. Each adopts a 15W4K32S4 chip and corresponds to the left or right hand of the user, respectively; gesture data are read over IIC (inter-integrated circuit) communication simulated on I/O (input/output) pins and sent to the BCM body control module.
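As a rough illustration of the I/O-simulated IIC reading mentioned above, the sketch below bit-bangs one byte from a data line. It is deliberately simplified — no start/stop conditions, addressing, or ACK handling — and the `Pin` class is a stand-in for real GPIO; actual 15W4K32S4 firmware would be written in C against the chip's registers:

```python
# Simplified software ("bit-banged") I2C byte read, for illustration only.
# Pin is a fake open-drain GPIO; sda_read() samples the data line.

class Pin:
    """Stand-in for a microcontroller GPIO pin."""
    def __init__(self, level=1):
        self.level = level
    def high(self):
        self.level = 1
    def low(self):
        self.level = 0
    def read(self):
        return self.level

def i2c_read_byte(scl, sda_read):
    """Clock in 8 bits, MSB first, sampling the data line while SCL is high."""
    value = 0
    for _ in range(8):
        scl.low()    # slave drives the next bit while the clock is low
        scl.high()   # data is valid while the clock is high
        value = (value << 1) | (sda_read() & 1)
    return value
```

A real driver would also generate start/stop conditions, send the slave address, and acknowledge each byte per the I2C specification.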
The vehicle window driving control system reuses the function of the ordinary electric window lifter on the automobile; its controller is connected in parallel with the BCM body control module to control the electric window lifter, thereby realizing gesture control of raising and lowering the window glass. The operation logic for gesture motion image signals is shown in Table 1 below. The first gesture recognition unit and the second gesture recognition unit correspond to the left rear window and the right rear window, respectively; both units can recognize the main driving window and the auxiliary driving window, whose control gestures are not limited to the left or right hand. Taking the left hand as an example, the control gestures of the main and auxiliary driving windows are shown in figs. 2-3, and those of the left and right rear windows in figs. 4-5.
TABLE 1 gesture image signal operation logic table
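Since Table 1 is reproduced only as an image in the published document, a plausible shape for such an operation logic table can be sketched as a lookup structure. The gesture names and mappings below are entirely hypothetical; only the four target windows and the two recognizing units come from the description:

```python
# Hypothetical reconstruction of the *shape* of Table 1.  The actual
# gestures appear only in Figs. 2-5 of the patent; every gesture name
# and mapping below is an illustrative assumption.
GESTURE_LOGIC = {
    # (recognizing unit, gesture)     -> (target window, action)
    ("either", "palm_swipe_down"):       ("driver_window", "lower"),
    ("either", "palm_swipe_up"):         ("driver_window", "raise"),
    ("either", "two_finger_swipe_down"): ("passenger_window", "lower"),
    ("either", "two_finger_swipe_up"):   ("passenger_window", "raise"),
    ("unit_1", "fist_swipe_down"):       ("left_rear_window", "lower"),
    ("unit_1", "fist_swipe_up"):         ("left_rear_window", "raise"),
    ("unit_2", "fist_swipe_down"):       ("right_rear_window", "lower"),
    ("unit_2", "fist_swipe_up"):         ("right_rear_window", "raise"),
}

def lookup(unit, gesture):
    """Resolve a gesture seen by one unit, falling back to either-hand entries."""
    return (GESTURE_LOGIC.get((unit, gesture))
            or GESTURE_LOGIC.get(("either", gesture)))
```

The "either" fallback mirrors the statement that front-window gestures are not limited to the left or right hand, while rear-window gestures are unit-specific.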
In the above, a whole vehicle window gesture control system based on biometric identification is introduced, and the control method thereof is described below, including:
the biological recognition module acquires dynamic human body data of a user and feeds the dynamic human body data back to the central controller;
the central controller obtains the human body biological characteristics of the user from the dynamic human body data, and performs a similarity identity judgment between these characteristics and all biological characteristics in a comparison library constructed offline, to judge whether the requirements are met;
if yes, calling corresponding initially learned user gesture habit data and sending the initially learned user gesture habit data to a gesture recognition unit;
and if not, an executable instruction is generated and fed back to the biological recognition module, and the dynamic human body data of the user are acquired again.
The gesture recognition unit acquires a gesture action image signal and feeds the gesture action image signal back to the central controller;
and the central controller generates an execution instruction according to the acquired gesture motion image signal and sends the execution instruction to the car window driving control system.
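The patent does not specify how the similarity identity judgment against the offline comparison library is computed. One common choice for comparing biometric feature vectors is cosine similarity with an acceptance threshold, sketched here purely as an illustration:

```python
import math

# Cosine similarity is one common measure for comparing feature vectors;
# the patent names no specific measure, so this is an assumption.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def meets_requirement(sample, library, threshold=0.95):
    """True if any offline-library feature vector matches the sampled biometrics."""
    return any(cosine_similarity(sample, ref) >= threshold for ref in library)
```

The 0.95 threshold is likewise assumed; a production system would tune it against false-accept and false-reject rates.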
The central controller is in the form of a general purpose computing device. As shown in fig. 6, in general, the computing device 100 includes: a processor 101 and a memory 102.
Processor 101 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 101 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 101 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 101 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, the processor 101 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 102 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 102 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 102 is used to store at least one instruction for execution by the processor 101 to implement the whole vehicle window gesture control system based on biological recognition and the control method thereof provided herein.
In some embodiments, the terminal 100 may further include: a peripheral interface 103 and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 104, touch screen display 105, camera 106, audio circuitry 107, positioning components 108, and power supply 109.
The peripheral interface 103 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 101 and the memory 102. In some embodiments, processor 101, memory 102, and peripheral interface 103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 101, the memory 102 and the peripheral device interface 103 may be implemented on separate chips or circuit boards, which are not limited in this embodiment.
The Radio Frequency circuit 104 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 104 communicates with a communication network and other communication devices via electromagnetic signals. The rf circuit 104 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 104 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 104 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display screen 105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display screen 105 also has the ability to acquire touch signals on or over the surface of the touch display screen 105. The touch signal may be input to the processor 101 as a control signal for processing. The touch screen display 105 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display screen 105 may be one, providing a front panel of the terminal 100; in other embodiments, the touch display screen 105 may be at least two, respectively disposed on different surfaces of the terminal 100 or in a folded design; in still other embodiments, the touch display 105 may be a flexible display disposed on a curved surface or a folded surface of the terminal 100. Even more, the touch screen display 105 may be configured as a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display screen 105 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 106 is used to capture images or video. Optionally, the camera assembly 106 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera head assembly 106 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp and can be used for light compensation at different color temperatures.
Audio circuitry 107 is used to provide an audio interface between a user and terminal 100. Audio circuitry 107 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 101 for processing or inputting the electric signals to the radio frequency circuit 104 to realize voice communication. The microphones may be plural and respectively provided at different portions of the terminal 100 for the purpose of stereo sound collection or noise reduction. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 101 or the radio frequency circuit 104 into sound waves. The loudspeaker can be a traditional film loudspeaker, and can also be a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 107 may also include a headphone jack.
The positioning component 108 is used to locate the current geographic location of the terminal 100 to implement navigation or LBS (Location Based Service). The positioning component 108 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
The power supply 109 is used to supply power to the various components in the terminal 100. The power source 109 may be alternating current, direct current, disposable or rechargeable. When the power supply 109 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 100 also includes one or more sensors 110. The one or more sensors 110 include, but are not limited to: acceleration sensor 111, gyro sensor 112, pressure sensor 113, fingerprint sensor 114, optical sensor 115, and proximity sensor 116.
The acceleration sensor 111 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 100. For example, the acceleration sensor 111 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 101 may control the touch screen 105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 111. The acceleration sensor 111 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 112 may detect a body direction and a rotation angle of the terminal 100, and the gyro sensor 112 may collect a 3D (3 Dimensions) motion of the user on the terminal 100 in cooperation with the acceleration sensor 111. From the data collected by the gyro sensor 112, the processor 101 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization while shooting, game control, and inertial navigation.
The pressure sensor 113 may be disposed on a side bezel of the terminal 100 and/or an underlying layer of the touch display screen 105. When the pressure sensor 113 is disposed at a side frame of the terminal 100, a user's grip signal on the terminal 100 can be detected, and left-right hand recognition or shortcut operation can be performed according to the grip signal. When the pressure sensor 113 is disposed at the lower layer of the touch display screen 105, it is possible to control the operability control on the UI interface according to the pressure operation of the user on the touch display screen 105. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 114 is used to collect a fingerprint of a user to identify the user's identity based on the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 101 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 114 may be disposed on the front, back, or side of the terminal 100. When a physical button or a vendor Logo is provided on the terminal 100, the fingerprint sensor 114 may be integrated with the physical button or the vendor Logo.
The optical sensor 115 is used to collect the ambient light intensity. In one embodiment, the processor 101 may control the display brightness of the touch screen display 105 based on the ambient light intensity collected by the optical sensor 115. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 105 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 105 is turned down. In another embodiment, the processor 101 may also dynamically adjust the shooting parameters of the camera head assembly 106 according to the ambient light intensity collected by the optical sensor 115.
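The brightness adjustment described above amounts to a monotone mapping from measured ambient light to display brightness. A minimal sketch, with entirely assumed breakpoints and units, might look like:

```python
# Illustrative ambient-light-to-brightness mapping.  The lux breakpoints
# and the 0.2 floor are assumptions; the text only says brightness is
# raised in bright surroundings and lowered in dim ones.
def display_brightness(ambient_lux, lo=10.0, hi=1000.0):
    """Linearly map ambient light (lux) onto a 0.0-1.0 brightness scale."""
    if ambient_lux <= lo:
        return 0.2  # dim floor so the screen stays readable in the dark
    if ambient_lux >= hi:
        return 1.0
    frac = (ambient_lux - lo) / (hi - lo)
    return 0.2 + 0.8 * frac
```

A real implementation would typically also smooth the sensor readings to avoid flicker as the ambient level fluctuates.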
A proximity sensor 116, also known as a distance sensor, is typically disposed on the front face of the terminal 100. The proximity sensor 116 is used to collect the distance between the user and the front surface of the terminal 100. In one embodiment, when the proximity sensor 116 detects that the distance between the user and the front surface of the terminal 100 is gradually decreased, the processor 101 controls the touch display screen 105 to switch from the bright screen state to the dark screen state; when the proximity sensor 116 detects that the distance between the user and the front surface of the terminal 100 becomes gradually larger, the processor 101 controls the touch display screen 105 to switch from the breath screen state to the bright screen state.
Those skilled in the art will appreciate that the configurations illustrated are not limiting of terminal 100 and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer readable storage medium is further provided, on which a computer program is stored; when executed by a processor, the program implements the whole vehicle window gesture control system based on biological recognition and the control method thereof as provided by all inventive embodiments of this application.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
While embodiments of the invention have been disclosed above, it is not intended to be limited to the uses set forth in the specification and examples. It can be applied to all kinds of fields suitable for the present invention. Additional modifications will readily occur to those skilled in the art. It is therefore intended that the invention not be limited to the exact details and illustrations described and illustrated herein, but fall within the scope of the appended claims and their equivalents.

Claims (6)

1. A full vehicle window gesture control system based on biological recognition is characterized by comprising:
a vehicle window driving control system, a gesture recognition unit, a central controller and a biological recognition module, wherein the central controller is electrically connected with the vehicle window driving control system, the gesture recognition unit and the biological recognition module respectively.
2. The biological recognition-based full vehicle window gesture control system according to claim 1, wherein the gesture recognition unit adopts a 15W4K32S4 chip.
3. The biological recognition-based full-vehicle-window gesture control system as claimed in claim 2, wherein the central controller is a BCM vehicle body control module.
4. The biological recognition-based full-vehicle-window gesture control system according to claim 2 or 3, wherein the gesture recognition unit comprises: a first gesture recognition unit and a second gesture recognition unit.
5. A control method of a full vehicle window gesture control system based on biological recognition is characterized by comprising the following steps:
the biological recognition module acquires dynamic human body data of a user and feeds the dynamic human body data back to the central controller;
the central controller extracts the human body biological characteristics of the user from the dynamic human body data, and performs similarity-based identity judgment between those characteristics and all biological characteristics in an offline-constructed comparison library to determine whether the similarity meets the requirement;
if yes, the corresponding initially learned user gesture habit data is retrieved and sent to the gesture recognition unit;
the gesture recognition unit acquires a gesture action image signal and feeds the gesture action image signal back to the central controller;
and the central controller generates an execution instruction according to the acquired gesture action image signal and sends the execution instruction to the vehicle window driving control system.
6. The control method of the whole vehicle window gesture control system based on biological recognition according to claim 5,
wherein if the similarity between the human body biological characteristics of the user and all the biological characteristics in the offline-constructed comparison library does not meet the requirement, an instruction is generated and fed back to the biological recognition module to re-acquire the dynamic human body data of the user.
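The control-method steps recited in claims 5 and 6 can be sketched as the following illustrative flow. Everything here is a hypothetical assumption for illustration only: the cosine similarity metric, the acceptance threshold, the contents of the comparison library, and the gesture-to-command map are not specified in the patent.

```python
# Hypothetical sketch of the claim 5/6 control flow; all names and values
# below are illustrative assumptions, not details from the patent.

SIMILARITY_THRESHOLD = 0.85  # assumed acceptance requirement

# Offline-constructed comparison library: user id -> enrolled feature vector
COMPARISON_LIBRARY = {
    "user_a": [0.12, 0.80, 0.44],
    "user_b": [0.90, 0.10, 0.33],
}

# Initially learned gesture habit data: per-user gesture -> window command
GESTURE_HABITS = {
    "user_a": {"swipe_up": "CLOSE_WINDOW", "swipe_down": "OPEN_WINDOW"},
    "user_b": {"fist": "CLOSE_WINDOW", "open_palm": "OPEN_WINDOW"},
}

def cosine_similarity(a, b):
    """One possible similarity measure between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def identify_user(features):
    """Compare the user's features against every library entry (claim 5)."""
    best_user, best_score = None, 0.0
    for user, enrolled in COMPARISON_LIBRARY.items():
        score = cosine_similarity(features, enrolled)
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= SIMILARITY_THRESHOLD else None

def handle_frame(features, gesture):
    """Central-controller step: identify the user, then map the gesture."""
    user = identify_user(features)
    if user is None:
        return "REACQUIRE"  # claim 6: feed back and re-acquire body data
    return GESTURE_HABITS[user].get(gesture, "IGNORE")
```

In this sketch a successful identification selects that user's own gesture habit map before any gesture is interpreted, which mirrors the claimed ordering: identity check first, gesture-to-instruction mapping second.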
CN202111419372.4A 2021-11-26 2021-11-26 Whole vehicle window gesture control system based on biological recognition and control method thereof Pending CN114779920A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111419372.4A CN114779920A (en) 2021-11-26 2021-11-26 Whole vehicle window gesture control system based on biological recognition and control method thereof

Publications (1)

Publication Number Publication Date
CN114779920A true CN114779920A (en) 2022-07-22

Family

ID=82423633

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111419372.4A Pending CN114779920A (en) 2021-11-26 2021-11-26 Whole vehicle window gesture control system based on biological recognition and control method thereof

Country Status (1)

Country Link
CN (1) CN114779920A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105365707A (en) * 2014-08-11 2016-03-02 Ford Global Technologies, LLC Vehicle driver identification

Similar Documents

Publication Publication Date Title
CN108961681B (en) Fatigue driving reminding method and device and storage medium
KR20180136776A (en) Mobile terminal and method for controlling the same
CN110705614A (en) Model training method and device, electronic equipment and storage medium
CN111753606A (en) Intelligent model upgrading method and device
CN111931712B (en) Face recognition method, device, snapshot machine and system
CN110349527B (en) Virtual reality display method, device and system and storage medium
CN110058729B (en) Method and electronic device for adjusting sensitivity of touch detection
CN114090140A (en) Interaction method between devices based on pointing operation and electronic device
CN111741266B (en) Image display method and device, vehicle-mounted equipment and storage medium
CN114595019A (en) Theme setting method, device and equipment of application program and storage medium
CN114594885A (en) Application icon management method, device and equipment and computer readable storage medium
CN114789734A (en) Perception information compensation method, device, vehicle, storage medium, and program
CN114834221A (en) Intelligent sun shading method, system, terminal and storage medium for automobile sun visor
CN110717365B (en) Method and device for obtaining picture
CN114779920A (en) Whole vehicle window gesture control system based on biological recognition and control method thereof
CN108090438B (en) Fingerprint acquisition method and device
CN110992954A (en) Method, device, equipment and storage medium for voice recognition
CN111723615A (en) Method and device for carrying out detection object matching judgment on detection object image
CN114566064B (en) Method, device, equipment and storage medium for determining position of parking space
CN114816043A (en) Vehicle-mounted gesture control skylight system based on pupil recognition and control method thereof
CN115237246A (en) Vehicle-mounted gesture control skylight system based on semantic recognition and control method thereof
CN110659609B (en) Fingerprint matching method and device, electronic equipment and medium
CN113138782A (en) Method, device, terminal and storage medium for OTA (over the air) upgrading calibration data
CN114360538A (en) Voice data acquisition method, device, equipment and computer readable storage medium
CN113450799A (en) Vehicle-mounted schedule management method, system, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination