CN118107605A - Vehicle control method and system based on steering wheel gesture interaction - Google Patents


Info

Publication number
CN118107605A
Authority
CN
China
Prior art keywords
steering wheel
driver
hand
gesture
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410533515.1A
Other languages
Chinese (zh)
Inventor
钟星宇 (Zhong Xingyu)
刘青 (Liu Qing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rivotek Technology Jiangsu Co Ltd
Original Assignee
Rivotek Technology Jiangsu Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rivotek Technology Jiangsu Co Ltd filed Critical Rivotek Technology Jiangsu Co Ltd
Priority to CN202410533515.1A priority Critical patent/CN118107605A/en
Publication of CN118107605A publication Critical patent/CN118107605A/en
Pending legal-status Critical Current


Abstract

The invention relates to the technical field of vehicle control systems, and in particular to a vehicle control method and system based on steering wheel gesture interaction, comprising the following steps: after the vehicle is powered on for the first time, automatically detecting the steering wheel boundary with an image processing algorithm, acquiring the steering wheel boundary coordinates, and storing them locally; when the vehicle is in a driving state, keeping a camera on to capture images of the driver's hands in real time, detecting and tracking the driver's hand positions in real time with image processing and computer vision techniques, and obtaining the driver's hand position coordinates; and comparing the steering wheel boundary coordinates with the driver's hand position coordinates in real time to judge whether the driver's hands are placed on the steering wheel, and if so, enabling the steering wheel gesture recognition function so that the driver interacts with the vehicle through gestures. With this method, the driver can perform various operations while holding the steering wheel, which improves operating convenience and enhances driving safety.

Description

Vehicle control method and system based on steering wheel gesture interaction
Technical Field
The invention relates to the technical field of vehicle control, in particular to a vehicle control method and system based on steering wheel gesture interaction.
Background
Currently, interactive modes in the automotive industry have advanced to some extent. In-car interaction generally relies on touch screens and buttons: the driver must take a hand off the steering wheel to tap the screen or press a button, which introduces operational complexity and safety risks. Head-up display (HUD) technology has been applied in some high-end automobiles; by projecting information onto a transparent display in the driver's field of view, a HUD lets the driver obtain necessary information without diverting the line of sight. However, current HUDs are mainly operated through control buttons or voice commands, and gesture interaction is rarely applied. Gesture recognition is widely used on smartphones and other devices but remains rare in the automotive field; some research and experimental projects have explored applying it to vehicle control, but no mature commercial products have emerged. Intelligent voice assistants have become a primary mode of in-car interaction, letting the driver control vehicle functions and obtain information through voice commands; this reduces reliance on touch screens and buttons but still requires the driver to issue verbal instructions.
With the rapid development of the automobile industry, users interact with vehicles or other in-vehicle devices in more and more driving scenarios, but steering-wheel-based gesture interaction is still at the exploration and development stage. Chinese patent application No. 2021107108229, titled "Man-machine interaction method, apparatus, electronic device and storage medium", controls a target object through gesture actions in different areas; however, it recognizes the whole palm, so recognition precision is low, and during palm recognition the palm may leave the steering wheel, so driving safety cannot be guaranteed.
Disclosure of Invention
Aiming at the problems of low safety, low recognition precision and complex operation in vehicle interaction during driving, the invention provides a vehicle control method and system based on steering wheel gesture interaction, intended to offer a convenient, safe, and intuitive way for the driver to perform various operations while holding the steering wheel.
In order to achieve the above purpose, the present invention provides the following technical solutions:
a vehicle control method based on steering wheel gesture interaction, the method comprising:
After the vehicle is powered on for the first time, automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing them locally, wherein the steering wheel boundary coordinates comprise the steering wheel center coordinates (x, y), the upper left corner coordinates (x₁, y₁), and the lower right corner coordinates (x₂, y₂);
When the vehicle is in a driving state, keeping a camera on to capture images of the driver's hands in real time, detecting and tracking the driver's hand positions in real time using image processing and computer vision techniques, and acquiring the driver's hand position coordinates, which comprise the center point coordinates (m, n), upper left corner coordinates (m₁, n₁), and lower right corner coordinates (m₂, n₂) of the driver's hand;
And comparing the boundary coordinates of the steering wheel with the position coordinates of the hands of the driver in real time, judging whether the hands of the driver are placed on the steering wheel, and if the hands of the driver are determined to be placed on the steering wheel, starting a gesture recognition function of the steering wheel, and interacting with the vehicle through gestures of the driver.
As a preferred solution of the present invention, the method for automatically detecting the steering wheel boundary by using an image processing algorithm, and obtaining the coordinates of the steering wheel boundary specifically includes:
The camera collects image data of the steering wheel and converts it into a gray-scale image, and the gray-scale conversion formula is:

I = 0.299·R + 0.587·G + 0.114·B

wherein I represents the gray value of each pixel of the original image data; R, G, and B represent the intensity values of the red, green, and blue channels of each pixel in the image data;
The image is smoothed by applying Gaussian blur:

G(p, q) = (1 / (2πσ²)) · exp(-(p² + q²) / (2σ²))

wherein G(p, q) represents the blur weight of the Gaussian function at any coordinate (p, q); σ is the standard deviation of the Gaussian blur, used to control the degree of blur, and is adapted as

σ = σ₀ + α · f(L, F)

wherein σ₀ is the base blur value, α is a regulatory factor, and f(L, F) is an adjustment function based on the illumination intensity L and the reflectivity F of the steering wheel surface, with L_max representing the maximum illumination intensity in the vehicle (L enters f normalized as L / L_max);
Processing the blurred gray image with the Canny edge detection algorithm: computing the gradient magnitude and direction at each point with a set of linear filters, performing non-maximum suppression to remove points not on a boundary, detecting strong and weak edges with dual thresholds T_high and T_low, screening out real edge points, and tracking and connecting the detected edges to form a complete edge image; the thresholds are adapted as

T_high = T⁰_high · g(E),  T_low = T⁰_low · g(E)

wherein T⁰_high and T⁰_low are the base thresholds and g(E) is an adjustment function based on environmental parameters;
The circle in the processed image is detected through Hough circle transformation, and its center and radius are determined: the center is the steering wheel center coordinate (x, y), the center coordinate minus the radius gives the steering wheel's upper left corner coordinate (x₁, y₁) = (x - r, y - r), and the center coordinate plus the radius gives the steering wheel's lower right corner coordinate (x₂, y₂) = (x + r, y + r).
As a preferable scheme of the invention, driver hand detection is performed based on a skin color model, and the size and proportion of the anchor frame are adaptively adjusted according to historical tracking data:

w_new = β · w_prev + (1 - β) · w_det,  h_new = β · h_prev + (1 - β) · h_det

wherein (w_new, h_new) are the width and height of the new anchor frame, (w_prev, h_prev) are the width and height in the previous frame, (w_det, h_det) are the currently detected width and height, and β is a smoothing factor balancing the influence of the previous and current frames;
And after the hands of the driver are detected, tracking the hand positions and the motion tracks of the driver in real time by using a particle filter or a Kalman filter, and recording the hand position coordinates of the driver.
As a preferred embodiment of the present invention, the method for comparing the boundary coordinates of the steering wheel with the hand position coordinates of the driver in real time to determine whether the driver's hand is placed on the steering wheel includes:
Defining a steering wheel rectangle from the steering wheel's upper left corner coordinates (x₁, y₁) and lower right corner coordinates (x₂, y₂), defining a hand rectangle from the driver hand's upper left corner coordinates (m₁, n₁) and lower right corner coordinates (m₂, n₂), and calculating the overlap between the driver's hand and the steering wheel to judge their relative position;
Overlap = (intersection area of the steering wheel rectangle and the hand rectangle ÷ area of the steering wheel rectangle) × 100%;
if the overlap exceeds 20% at each of the camera angles, the driver's hand is judged to be above the steering wheel;
the Euclidean distance is further used to calculate the distance d of the driver's hand position and the steering wheel boundary:
when the distances d at each of the camera angles are all smaller than a preset distance threshold, the driver's hand is judged to be placed on the steering wheel.
As a preferable scheme of the invention, enabling the steering wheel gesture recognition function is divided into an active mode and a passive mode;
Active on refers to: if the camera continuously monitors that the driver sends out the active starting gesture, the steering wheel gesture recognition function is started actively, otherwise, the steering wheel gesture recognition function is kept in a silent state;
passive on refers to: and under specific conditions, when the HUD or the vehicle screen is occupied, automatically starting a steering wheel gesture recognition function.
As a preferable scheme of the invention, after a steering wheel gesture recognition function is started, a camera captures gesture actions of a driver, and the captured gesture actions are subjected to gesture recognition and conversion through an image processing and machine learning algorithm, and the gesture recognition and conversion method comprises the following steps:
identifying a finger: identifying the tip, middle phalanx and end phalanx of each finger;
Identifying the stretching degree of the finger: if the fingertip, the middle phalanx and the tail end phalanx are not in a straight line, the finger is considered to be bent, otherwise, the finger is considered to be unbent;
Identifying a finger movement state: judging whether the fingertip is lifted up and dropped down rapidly within a preset time, if so, recognizing that the finger has knocking movement, and if not, recognizing that the finger is stationary;
and once the preset gesture is recognized, converting the gesture into a corresponding control instruction, transmitting the corresponding control instruction to a vehicle control system for execution, and feeding back an execution result in a voice or screen display mode.
As a preferred aspect of the present invention, in the steering wheel gesture recognition function, the preset gesture includes:
active on gesture: all fingers of the hands keep a bending state, hold the steering wheel and tighten the steering wheel into a fist, and grasp the steering wheel twice within 2 seconds;
Confirmation gesture: any thumb of the left hand or the right hand is quickly knocked on the surface of the steering wheel for 1 time within 1 second, and other fingers keep a bending state;
adjusting a volume gesture: the thumb of the right hand slides more than 3 cm along the surface of the steering wheel, and other fingers keep a bending state and do not move;
adjust brightness gesture: the thumb of the left hand slides more than 3 cm along the surface of the steering wheel, and other fingers keep a bending state and do not move;
Previous gesture: all fingers of the left hand hold the steering wheel and tighten into a fist, gripping the steering wheel twice within 2 seconds, while the fingers of the right hand keep a bending state and do not move;
Next gesture: all fingers of the right hand hold the steering wheel and tighten into a fist, gripping the steering wheel twice within 2 seconds, while the fingers of the left hand keep a bending state and do not move;
Telephone answering gestures: any one index finger of the left hand or the right hand rapidly strikes the surface of the steering wheel for 1 time within 1 second, and other fingers keep a bending state;
Phone hang-up gesture: the index finger and the middle finger of the left hand or the right hand simultaneously and rapidly strike the surface of the steering wheel for 1 time within 1 second, and other fingers keep a bending state;
air volume adjustment gesture: all fingers of the left hand hold the steering wheel and tighten the steering wheel into fists, when the steering wheel is held and slid for more than 10 degrees in 1 second, the air quantity is adjusted, if the steering wheel is held and slid clockwise, the air quantity is increased, if the steering wheel is held and slid anticlockwise, the air quantity is reduced, after the steering wheel is stopped and slid for more than 2 seconds, the fists are loosened, the adjustment is finished, and the fingers of the right hand keep a bending state and do not move;
temperature adjustment gesture: all fingers of the right hand hold the steering wheel and tighten the steering wheel into fists, the temperature starts to be adjusted when the steering wheel is held and slid for more than 10 degrees within 1 second, the temperature is reduced when the steering wheel is held and slid clockwise, the temperature is increased when the steering wheel is held and slid anticlockwise, the fists are loosened after the steering wheel is stopped and slid for more than 2 seconds, the adjustment is finished, and the fingers of the left hand keep the bending state and do not move.
As a preferred aspect of the present invention, when the automobile's steering wheel has a lifting function, the method further includes: automatically detecting whether the steering wheel position has changed after the vehicle is powered on; if so, re-detecting the steering wheel boundary and updating the locally stored steering wheel boundary coordinates; if not, directly calling the locally stored steering wheel boundary coordinates to judge the relative position of the hand and the steering wheel.
A vehicle control system based on steering wheel gesture interaction, the system comprising: the device comprises a plurality of cameras, a steering wheel boundary detection module, a hand detection and tracking module, a relative position judgment module of hands and a steering wheel, a gesture recognition and instruction conversion module, an instruction transmission and execution module and a feedback module;
The cameras are arranged around the steering wheel of the vehicle and are used for collecting image data of the steering wheel or capturing hand images of a driver in real time;
The steering wheel boundary detection module is used for automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing the steering wheel boundary coordinates to the local;
The hand detection and tracking module is used for continuously starting the camera to capture hand images of the driver in real time when the vehicle is in a driving state, and detecting and tracking the hand positions of the driver in real time by utilizing image processing and computer vision technology to obtain hand position coordinates of the driver;
The relative position judging module of the hand and the steering wheel is used for comparing the steering wheel boundary coordinates with the driver's hand position coordinates in real time to judge whether the driver's hand is placed on the steering wheel; if so, the steering wheel gesture recognition function is enabled and the driver interacts with the vehicle through gestures;
The gesture recognition and instruction conversion module is used for performing gesture recognition and conversion on the captured gesture actions through an image processing and machine learning algorithm;
The instruction transmission and execution module is used for converting the recognized gesture into a corresponding control instruction and transmitting the corresponding control instruction to the vehicle control system for execution;
The feedback module is used for feeding back the execution result in a voice or screen display mode.
Compared with the prior art, the invention has the following beneficial effects. Gesture matching is performed purely through image recognition, without any additional sensor, which saves cost. Recognition is limited to the steering wheel area, ensuring that the palm does not leave the steering wheel, which safeguards driving safety; and only finger actions are recognized, which improves recognition precision. By integrating image processing, computer vision, and machine learning, efficient and intuitive gesture interaction between the driver and the vehicle is achieved. With the steering wheel gesture recognition function, the driver need not be distracted by control buttons or touch screens and can concentrate on the road ahead, reducing the safety risk caused by distraction. Gesture interaction is more intuitive and natural: controlling the vehicle through simple gestures improves the convenience and enjoyment of driving. Through continuous optimization of the machine learning algorithm, the system can recognize more kinds of gestures and can be customized to the driver's personal habits, enabling personalized vehicle control. The real-time feedback mechanism informs the driver of the execution status of gesture instructions, ensuring that instructions are executed correctly and enhancing the transparency and reliability of the interaction.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a flow chart of a method for active and passive opening in an embodiment of the invention;
FIG. 3 is a flow chart of a method for gesture recognition in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a finger identification in an embodiment of the present invention;
FIG. 5 is a diagram illustrating the identification of the degree of stretching of a finger in an embodiment of the present invention;
FIG. 6 is a schematic diagram of recognizing finger motion status in an embodiment of the present invention;
FIG. 7 is a system modular block diagram of the present invention;
fig. 8 is a schematic diagram of a camera mounting position in an embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1, an embodiment of the present invention provides a vehicle control method based on gesture interaction of a steering wheel, which includes the following steps:
S1: after the vehicle is powered on for the first time, automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing them locally, wherein the steering wheel boundary coordinates comprise the steering wheel center coordinates (x, y), the upper left corner coordinates (x₁, y₁), and the lower right corner coordinates (x₂, y₂);
And (3) carrying out boundary detection on the steering wheel by using an image processing algorithm, and determining the position and the shape of the steering wheel by analyzing the shape and the color information in the image to obtain the outline or the boundary frame of the steering wheel.
In one embodiment, step S1 specifically includes:
S11: the camera collects image data of the steering wheel and converts it into a gray-scale image, and the gray-scale conversion formula is:

I = 0.299·R + 0.587·G + 0.114·B

wherein I represents the gray value of each pixel of the original image data; R, G, and B represent the intensity values of the red, green, and blue channels of each pixel in the image data; the three coefficients (0.299, 0.587, 0.114) derive from the sensitivity of the human eye to light of different colors;
S12: the image is smoothed by applying Gaussian blur:

G(p, q) = (1 / (2πσ²)) · exp(-(p² + q²) / (2σ²))

wherein G(p, q) represents the blur weight of the Gaussian function at any coordinate (p, q); σ is the standard deviation of the Gaussian blur, used to control the degree of blur, and is adapted as

σ = σ₀ + α · f(L, F)

wherein σ₀ is the base blur value, α is a regulatory factor, and f(L, F) is an adjustment function based on the illumination intensity L and the reflectivity F of the steering wheel surface, with L_max representing the maximum illumination intensity in the vehicle (L enters f normalized as L / L_max);
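The adaptive σ above can be sketched in a few lines of Python. The patent defines f(L, F) only abstractly; the specific form (L / L_max) · F used here is an illustrative assumption, as are all parameter values:

```python
def adaptive_sigma(sigma0, alpha, illumination, reflectivity, l_max):
    """Adaptive Gaussian-blur standard deviation: sigma = sigma0 + alpha * f(L, F).

    sigma0 is the base blur value and alpha the regulatory factor; the
    adjustment term (illumination / l_max) * reflectivity is one plausible
    reading of the patent's abstract f(L, F), not its actual definition.
    """
    if l_max <= 0:
        raise ValueError("l_max must be positive")
    return sigma0 + alpha * (illumination / l_max) * reflectivity
```

Under this form, a brighter and more reflective steering wheel surface receives stronger smoothing, which matches the stated goal of suppressing glare-induced noise before edge detection.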
S13: processing the blurred gray image with the Canny edge detection algorithm: computing the gradient magnitude and direction at each point with a set of linear filters, performing non-maximum suppression to remove points not on a boundary, detecting strong and weak edges with dual thresholds T_high and T_low, screening out real edge points, and tracking and connecting the detected edges to form a complete edge image; the thresholds are adapted as

T_high = T⁰_high · g(E),  T_low = T⁰_low · g(E)

wherein T⁰_high and T⁰_low are the base thresholds and g(E) is an adjustment function based on environmental parameters;
The Canny edge detection algorithm uses two thresholds to detect strong and weak edges. In high-illumination or high-reflectivity environments, edge contrast may be harder to distinguish, so in this embodiment the thresholds are adjusted to adapt to such changes, based on factors such as the current ambient illumination and the reflectivity of the steering wheel material, adapting to various driving environments and improving the accuracy and reliability of steering wheel detection.
S14: the circle in the processed image is detected through Hough circle transformation, and its center and radius are determined: the center is the steering wheel center coordinate (x, y), the center coordinate minus the radius gives the steering wheel's upper left corner coordinate (x₁, y₁) = (x - r, y - r), and the center coordinate plus the radius gives the steering wheel's lower right corner coordinate (x₂, y₂) = (x + r, y + r).
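Steps S11 through S14 can be sketched with OpenCV as follows. The kernel size, σ, Canny thresholds, and Hough parameters below are illustrative assumptions, not values from the patent; only the box derivation (upper left = center minus radius, lower right = center plus radius) follows the text directly:

```python
def circle_to_box(x, y, r):
    """Bounding box from the Hough circle: upper left = center - radius,
    lower right = center + radius, as in step S14."""
    return (x - r, y - r), (x + r, y + r)

def detect_steering_wheel(frame_bgr, t_low=50, t_high=150):
    """Sketch of the S11-S14 pipeline; returns (center, upper_left,
    lower_right) or None if no circle is found."""
    import cv2          # OpenCV; imported lazily so circle_to_box stays pure
    import numpy as np
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)    # S11: I = 0.299R + 0.587G + 0.114B
    blurred = cv2.GaussianBlur(gray, (9, 9), sigmaX=2.0)  # S12: Gaussian smoothing
    edges = cv2.Canny(blurred, t_low, t_high)             # S13: dual-threshold edge map
    # S14: HoughCircles applies its own internal Canny (param1 = high threshold),
    # so it takes the blurred image rather than the edge map.
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                               param1=t_high, param2=60,
                               minRadius=80, maxRadius=400)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    upper_left, lower_right = circle_to_box(int(x), int(y), int(r))
    return (int(x), int(y)), upper_left, lower_right
```

In practice the detected center and radius would be stored locally after the first power-on, as step S1 describes.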
S2: when the vehicle is in a driving state, continuously starting a camera to capture hand images of a driver in real time, detecting and tracking the hand positions of the driver in real time by utilizing image processing and computer vision technology, and acquiring hand position coordinates of the driver, wherein the method comprises the steps of; center point coordinates (m, n), upper left corner coordinates (m 1,n1), and lower right corner coordinates (m 2,n2) of the driver's hand;
In a specific embodiment, a hand detection algorithm, such as one based on a skin color model, detects the driver's hand, and the size and proportion of the anchor frame are adaptively adjusted according to historical tracking data to better match the size and shape of the hand:

w_new = β · w_prev + (1 - β) · w_det,  h_new = β · h_prev + (1 - β) · h_det

wherein (w_new, h_new) are the width and height of the new anchor frame, (w_prev, h_prev) are the width and height in the previous frame, (w_det, h_det) are the currently detected width and height, and β is a smoothing factor balancing the influence of the previous and current frames;
After the driver's hands are detected, a target tracking algorithm such as a particle filter or a Kalman filter tracks the driver's hand positions and motion tracks in real time; these filters can estimate the dynamic hand position across consecutive video frames, handle blurring or occlusion caused by rapid movement, and record the driver's hand position coordinates.
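The anchor-frame smoothing formula above is a plain exponential moving average; a minimal sketch (the default β is an illustrative assumption):

```python
def smooth_anchor(prev_wh, det_wh, beta=0.7):
    """Adaptively adjust the hand anchor box across frames:
    new = beta * previous + (1 - beta) * detected, applied to both the
    width and the height, so beta balances the previous frame against
    the current detection."""
    w_prev, h_prev = prev_wh
    w_det, h_det = det_wh
    return (beta * w_prev + (1 - beta) * w_det,
            beta * h_prev + (1 - beta) * h_det)
```

A larger β makes the anchor box more stable but slower to follow genuine changes in apparent hand size, which is the trade-off the smoothing factor is meant to tune.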
S3: and comparing the boundary coordinates of the steering wheel with the position coordinates of the hands of the driver in real time, judging whether the hands of the driver are placed on the steering wheel, and if the hands of the driver are determined to be placed on the steering wheel, starting a gesture recognition function of the steering wheel, and interacting with the vehicle through gestures of the driver.
In one embodiment, step S3 specifically includes:
S31: defining a steering wheel rectangle from the steering wheel's upper left corner coordinates (x₁, y₁) and lower right corner coordinates (x₂, y₂), defining a hand rectangle from the driver hand's upper left corner coordinates (m₁, n₁) and lower right corner coordinates (m₂, n₂), and calculating the overlap between the driver's hand and the steering wheel to judge their relative position;
Overlap = (intersection area of the steering wheel rectangle and the hand rectangle ÷ area of the steering wheel rectangle) × 100%;
The overlap threshold is designed according to the actual situation, for example: if the overlap exceeds 20% at each of the camera angles, the driver's hand is judged to be above the steering wheel;
S32: the Euclidean distance is further used to calculate the distance d between the driver's hand position and the steering wheel boundary:

d = |√((m - x)² + (n - y)²) - r|

i.e. the distance from the hand center (m, n) to the circle of radius r centered at the wheel center (x, y);
a suitable threshold is set according to actual conditions and requirements to judge whether the driver's hand is on the steering wheel: when the distances d at each of the camera angles are all smaller than the preset distance threshold, the driver's hand is judged to be placed on the steering wheel.
Since a calculation based on two-dimensional images cannot directly measure the true three-dimensional distance between the driver's hand and the steering wheel, data from cameras at multiple angles are computed simultaneously; only when, at every angle, the hand-to-wheel distance is below the set threshold and the overlap is high can the driver's hand be considered to be in contact with the steering wheel.
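The overlap and distance tests of S31/S32 reduce to a few geometric checks. A minimal sketch, with the overlap computed against the steering wheel rectangle's area as the text states, the boundary distance taken as one plausible reading of d, and the thresholds (20%, 15 px) as illustrative assumptions:

```python
import math

def overlap_percent(wheel_box, hand_box):
    """Overlap = intersection(wheel, hand) / wheel area * 100, with each
    box given as ((x1, y1), (x2, y2))."""
    (x1, y1), (x2, y2) = wheel_box
    (m1, n1), (m2, n2) = hand_box
    iw = max(0, min(x2, m2) - max(x1, m1))   # intersection width
    ih = max(0, min(y2, n2) - max(y1, n1))   # intersection height
    wheel_area = (x2 - x1) * (y2 - y1)
    return 100.0 * iw * ih / wheel_area

def boundary_distance(hand_center, wheel_center, wheel_radius):
    """Distance from the hand center (m, n) to the steering-wheel rim,
    taken here as |distance-to-center - radius|."""
    (m, n), (x, y) = hand_center, wheel_center
    return abs(math.hypot(m - x, n - y) - wheel_radius)

def hand_on_wheel(views, overlap_thresh=20.0, dist_thresh=15.0):
    """Every camera view, given as (overlap %, distance d), must agree:
    overlap above and distance below its threshold."""
    return all(o > overlap_thresh and d < dist_thresh for o, d in views)
```

Requiring agreement across all views is what compensates for each camera only seeing a 2D projection, per the paragraph above.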
Further, as shown in fig. 2, enabling the steering wheel gesture recognition function is divided into two modes, active and passive;
Active on refers to: if the camera continuously monitors that the driver sends out the active starting gesture, the steering wheel gesture recognition function is started actively, otherwise, the steering wheel gesture recognition function is kept in a silent state;
Passive on refers to: under specific conditions (such as a music full-screen playing page, an air conditioner full-screen page, an incoming call and the like), when the HUD or the car machine screen is occupied, the gesture recognition function of the steering wheel is automatically started.
S4: after the gesture recognition function of the steering wheel is started, capturing gesture actions of a driver by a camera, and carrying out gesture recognition and conversion on the captured gesture actions through image processing and a machine learning algorithm;
As shown in fig. 3, the method for gesture recognition and conversion includes:
s41: identifying a finger: as shown in fig. 4, the tip, middle phalanx and end phalanx of each finger are identified;
s42: identifying the stretching degree of the finger: as shown in fig. 5, if the fingertip, middle phalanx and end phalanx are not in a straight line, the finger is considered to be bent, otherwise the finger is considered to be unbent;
S43: identifying the finger movement state: as shown in fig. 6, within a preset time (e.g., 2 seconds), judging whether the fingertip is lifted up and dropped down rapidly; if so, a knocking motion is recognized, otherwise the finger is recognized as stationary;
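The knock test of S43 can be sketched over a per-frame series of fingertip vertical positions. The frame rate, lift threshold, and return tolerance below are illustrative assumptions, not values from the patent:

```python
def is_tap(tip_y, frame_rate=30, window_s=2.0, lift_px=8):
    """Detect a knock within the preset window: the fingertip rises
    (y decreases, in image coordinates) by at least lift_px and then
    returns near its starting level; otherwise it is stationary."""
    n = min(len(tip_y), int(frame_rate * window_s))  # samples in the window
    ys = tip_y[:n]
    if not ys:
        return False
    start = ys[0]
    peak_lift = start - min(ys)            # maximum upward excursion
    returned = abs(ys[-1] - start) <= 2    # dropped back to the surface
    return peak_lift >= lift_px and returned
```

Gestures like "tap twice within 1 second" would then be expressed as two such detections inside one window.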
Once a preset gesture is recognized, it is converted into the corresponding control instruction and transmitted to the vehicle control system for execution; the vehicle control system is connected with the HUD or the screen, transmission can be wired or wireless, for example over the vehicle's CAN bus or a Bluetooth connection, and the execution result is fed back by voice or on-screen display.
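The gesture-to-instruction conversion and feedback loop above amounts to a dispatch table. The gesture names and command strings here are hypothetical placeholders, not the patent's actual instruction set, and the transport is abstracted behind a callback:

```python
# Hypothetical mapping from recognized gestures to control instructions.
GESTURE_COMMANDS = {
    "confirm": "OK",
    "volume_up": "VOLUME+",
    "previous_track": "MEDIA_PREV",
    "answer_call": "PHONE_ANSWER",
}

def dispatch(gesture, send, feedback):
    """Convert a recognized gesture into its control instruction, hand it
    to the vehicle control system via send() (e.g. a CAN or Bluetooth
    transport), and feed the result back by voice or on-screen display."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return False                      # unrecognized gesture: do nothing
    ok = send(command)                    # wired or wireless transmission
    feedback(f"{gesture}: {'done' if ok else 'failed'}")
    return ok
```

Keeping the transport behind a callback mirrors the text's point that either the CAN bus or Bluetooth may carry the instruction without changing the recognition side.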
In one specific embodiment, the preset gestures in the steering wheel gesture recognition function include:
Active-start gesture: with all fingers of both hands kept bent, hold the steering wheel, tighten into fists, and grip the wheel twice within 2 seconds;
Confirmation gesture: the thumb of either the left or right hand quickly taps the steering wheel surface once within 1 second, while the other fingers remain bent;
Volume adjustment gesture: the right thumb slides more than 3 cm along the steering wheel surface, while the other fingers remain bent and still;
Brightness adjustment gesture: the left thumb slides more than 3 cm along the steering wheel surface, while the other fingers remain bent and still;
Previous-item gesture: all fingers of the left hand hold the steering wheel, tighten into a fist and grip the wheel twice within 2 seconds, while the fingers of the right hand remain bent and still;
Next-item gesture: all fingers of the right hand hold the steering wheel, tighten into a fist and grip the wheel twice within 2 seconds, while the fingers of the left hand remain bent and still;
Call-answering gesture: the index finger of either the left or right hand quickly taps the steering wheel surface once within 1 second, while the other fingers remain bent;
Call hang-up gesture: the index and middle fingers of the left or right hand simultaneously and quickly tap the steering wheel surface once within 1 second, while the other fingers remain bent;
Air volume adjustment gesture: all fingers of the left hand hold the steering wheel and tighten into a fist; adjustment starts when the fist slides more than 10 degrees along the wheel within 1 second, increasing the air volume for a clockwise slide and decreasing it for a counterclockwise slide; after the slide stops for more than 2 seconds the fist is released and the adjustment ends, while the fingers of the right hand remain bent and still;
Temperature adjustment gesture: all fingers of the right hand hold the steering wheel and tighten into a fist; adjustment starts when the fist slides more than 10 degrees along the wheel within 1 second, decreasing the temperature for a clockwise slide and increasing it for a counterclockwise slide; after the slide stops for more than 2 seconds the fist is released and the adjustment ends, while the fingers of the left hand remain bent and still.
The preset gestures are system defaults; the user can adjust them in the settings according to personal habits.
Taking the confirmation gesture as an example: when a pop-up window appears on the center control screen while the vehicle is running, the driver can close the pop-up with this gesture while still holding the steering wheel, with no need to take the hands off; only the movement state of the finger joints is detected;
The first step: identify the user's fingers, and the fingertip, middle phalanx and end phalanx of each;
The second step: detect a bending motion of the left or right thumb;
The third step: recognize that the left or right thumb quickly taps the steering wheel surface once, while the other fingers remain bent;
The fourth step: trigger the "confirmation gesture".
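The four-step confirmation flow above can be sketched as a predicate over a short window of per-frame finger states; the frame-record layout (`thumb_tap`, `others_bent`) and the window length are illustrative assumptions, not the patent's data model.

```python
def confirm_gesture(frames, window=30):
    """A 'confirmation gesture' fires when, within the window, exactly
    one thumb tap is seen while all other fingers stay bent.
    Each frame is a dict {'thumb_tap': bool, 'others_bent': bool}
    (this record layout is assumed for illustration)."""
    taps = 0
    for f in frames[-window:]:
        if not f['others_bent']:
            return False          # any straightened finger cancels the gesture
        if f['thumb_tap']:
            taps += 1
    return taps == 1              # exactly one quick tap triggers "confirm"
```

A production recognizer would derive `thumb_tap` and `others_bent` from the straightness and tap predicates of steps S42 and S43.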
When the steering wheel has a height-adjustment function, the method further comprises: automatically detecting after the vehicle is powered on whether the steering wheel position has changed; if it has changed, re-detecting the steering wheel boundary and updating the locally stored boundary coordinates; if it has not changed, directly calling the locally stored steering wheel boundary coordinates to judge the relative position of the hand and the steering wheel.
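The re-detection rule for height-adjustable steering wheels amounts to a position-keyed cache of the boundary coordinates. A minimal sketch, with `detect_fn` standing in for the boundary-detection routine (the class and its interface are illustrative assumptions):

```python
class BoundaryCache:
    """Re-run steering wheel boundary detection only when the column
    position has changed since the stored coordinates were taken;
    otherwise return the locally cached coordinates."""
    def __init__(self, detect_fn):
        self.detect_fn = detect_fn   # callable returning boundary coordinates
        self.position = None
        self.coords = None

    def get(self, current_position):
        if self.coords is None or current_position != self.position:
            self.coords = self.detect_fn()   # re-detect and update the cache
            self.position = current_position
        return self.coords
```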
As shown in fig. 7, another embodiment of the present invention provides a vehicle control system based on gesture interaction of a steering wheel, including: the device comprises a plurality of cameras, a steering wheel boundary detection module, a hand detection and tracking module, a relative position judgment module of hands and a steering wheel, a gesture recognition and instruction conversion module, an instruction transmission and execution module and a feedback module;
a plurality of cameras are installed around the steering wheel of the vehicle for collecting image data of the steering wheel or capturing hand images of the driver in real time, as shown in fig. 8;
the steering wheel boundary detection module is used for automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing the steering wheel boundary coordinates to the local;
the hand detection and tracking module is used for continuously starting the camera to capture hand images of the driver in real time when the vehicle is in a driving state, detecting and tracking the hand positions of the driver in real time by utilizing image processing and computer vision technology, and obtaining hand position coordinates of the driver;
The relative position judging module of the hand and the steering wheel is used for comparing the steering wheel boundary coordinates with the driver's hand position coordinates in real time to judge whether the driver's hand is placed on the steering wheel; if so, the steering wheel gesture recognition function is started and the driver interacts with the in-vehicle system through gestures;
The gesture recognition and instruction conversion module is used for carrying out gesture recognition and conversion on the captured gesture actions through an image processing and machine learning algorithm;
The instruction transmission and execution module is used for converting the recognized gesture into a corresponding control instruction and transmitting the corresponding control instruction to the vehicle control system for execution;
The feedback module is used for feeding back the execution result in a voice or screen display mode.
In conclusion, gesture matching requires only image recognition, without any additional sensors, which saves cost. Recognition is limited to the steering wheel area, ensuring that the palms never leave the wheel and driving safety is preserved, and only finger actions are recognized, which improves recognition accuracy. By integrating image processing, computer vision and machine learning technologies, efficient and intuitive gesture interaction between the driver and the vehicle is realized. With the steering wheel gesture recognition function, the driver need not be distracted by the vehicle's control buttons or touch screens and can concentrate on the road ahead, reducing the safety risk caused by operational distraction. Gesture interaction is more intuitive and natural: the vehicle is controlled through simple gesture actions, improving the convenience and pleasure of driving. Through continuous optimization of the machine learning algorithm, the system can recognize more kinds of gestures and can be customized to the driver's personal habits, realizing personalized vehicle control. The real-time feedback mechanism informs the driver of the execution status of gesture instructions, ensures that instructions are executed correctly, and enhances the transparency and reliability of the interaction.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are intended to be embraced therein.

Claims (9)

1. A vehicle control method based on steering wheel gesture interaction, the method comprising:
After the vehicle is powered on for the first time, automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing them locally, wherein the steering wheel boundary coordinates comprise the center coordinates (x, y), the upper-left corner coordinates (x1, y1) and the lower-right corner coordinates (x2, y2) of the steering wheel;
When the vehicle is in a driving state, continuously starting a camera to capture hand images of the driver in real time, detecting and tracking the driver's hand position in real time using image processing and computer vision technology, and acquiring the driver's hand position coordinates, comprising: the center coordinates (m, n), upper-left corner coordinates (m1, n1) and lower-right corner coordinates (m2, n2) of the driver's hand;
And comparing the boundary coordinates of the steering wheel with the position coordinates of the hands of the driver in real time, judging whether the hands of the driver are placed on the steering wheel, and if the hands of the driver are determined to be placed on the steering wheel, starting a gesture recognition function of the steering wheel, and interacting with the vehicle through gestures of the driver.
2. The vehicle control method based on the gesture interaction of the steering wheel according to claim 1, wherein the steering wheel boundary detection is automatically performed through an image processing algorithm, and steering wheel boundary coordinates are obtained, and the method specifically comprises:
The camera collects image data of the steering wheel and converts it into a gray-scale image; the gray-scale conversion formula is:
I_gray = 0.299·R + 0.587·G + 0.114·B
wherein I represents the original image data; R, G and B represent the intensity values of the red, green and blue channels of each pixel in the image data;
Gaussian blur is applied to smooth the image; the formula of the Gaussian function is:
G(p, q) = (1 / (2πσ²)) · exp(−(p² + q²) / (2σ²))
wherein G(p, q) represents the blur intensity of the Gaussian function at any coordinate (p, q); σ is the standard deviation of the Gaussian blur, used to control the degree of blur; σ is computed from a basic blur value σ₀, an adjustment factor α, and an adjustment function f(L/L_max, F) based on the illumination intensity L and the reflectivity F of the steering wheel surface, with L_max representing the maximum illumination intensity in the vehicle;
The blurred gray-scale image is processed with the Canny edge detection algorithm: the gradient magnitude and direction of each point in the gray-scale image are calculated with a group of linear filters, non-maximum suppression is performed to remove points not on a boundary, strong and weak edges are detected with thresholds T_high and T_low, real edge points are screened out, and the detected edges are tracked and connected to form a complete edge image; wherein T_high and T_low are obtained from basic thresholds adjusted by a function based on environmental parameters;
Circles in the processed image are detected through Hough circle transformation, and the circle center and radius are determined; the circle center gives the steering wheel center coordinates (x, y), the center coordinates minus the radius give the upper-left corner coordinates (x1, y1) of the steering wheel, and the center coordinates plus the radius give the lower-right corner coordinates (x2, y2) of the steering wheel.
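In practice the pipeline of claim 2 maps onto standard OpenCV calls (`cv2.cvtColor`, `cv2.GaussianBlur`, `cv2.Canny`, `cv2.HoughCircles`); the sketch below covers only the two arithmetic steps the claim spells out, with the BT.601 gray-scale weights assumed as the intended conversion:

```python
def to_gray(r, g, b):
    """Gray-scale conversion using BT.601 luminance weights — the
    standard weighting the claim most likely intends (an assumption)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def wheel_box(center, radius):
    """Hough circle -> steering wheel bounding box: subtracting the
    radius from the center gives the upper-left corner (x1, y1),
    adding it gives the lower-right corner (x2, y2)."""
    x, y = center
    return (x - radius, y - radius), (x + radius, y + radius)
```

With `cv2.HoughCircles` the detected circle arrives as an `(x, y, r)` triple, which feeds `wheel_box` directly.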
3. The vehicle control method based on steering wheel gesture interaction according to claim 1, wherein the driver's hand is detected based on a skin color model, and the size and proportion of the anchor frame are adaptively adjusted according to historical tracking data; the adaptive adjustment formula is:
w_new = β·w_prev + (1 − β)·w_det,  h_new = β·h_prev + (1 − β)·h_det
wherein w_new and h_new are the width and height of the new anchor frame; w_prev and h_prev are the width and height in the previous frame; w_det and h_det are the currently detected width and height; β is a smoothing factor used to balance the influence of the previous and current frames;
After the driver's hand is detected, a particle filter or Kalman filter tracks the driver's hand position and motion trajectory in real time, and the driver's hand position coordinates are recorded.
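The adaptive anchor-frame update of claim 3 is an exponential moving average of the box size. A sketch, with β = 0.6 as an illustrative default value:

```python
def smooth_anchor(prev, detected, beta=0.6):
    """Blend the previous frame's anchor box size with the current
    detection using smoothing factor beta; higher beta weights the
    history more heavily (beta = 0.6 is an assumed default)."""
    w_prev, h_prev = prev
    w_det, h_det = detected
    return (beta * w_prev + (1 - beta) * w_det,
            beta * h_prev + (1 - beta) * h_det)
```

The smoothed size would then seed the hand detector's search window in the next frame, damping jitter in the tracked box.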
4. The vehicle control method based on steering wheel gesture interaction according to claim 1, wherein the steering wheel boundary coordinates are compared with the driver's hand position coordinates in real time to judge whether the driver's hand is placed on the steering wheel, the method comprising:
Defining a steering wheel rectangular frame from the upper-left corner coordinates (x1, y1) and lower-right corner coordinates (x2, y2) of the steering wheel, defining a hand rectangular frame from the upper-left corner coordinates (m1, n1) and lower-right corner coordinates (m2, n2) of the driver's hand, and calculating the degree of overlap between the driver's hand and the steering wheel to judge their relative position:
Overlap = (intersection area of the steering wheel rectangular frame and the hand rectangular frame ÷ steering wheel rectangular frame area) × 100%;
If the overlap exceeds 20% from a plurality of camera angles, the driver's hand is judged to be above the steering wheel;
The Euclidean distance is further used to calculate the distance d between the driver's hand position and the steering wheel boundary:
d = √((x − m)² + (y − n)²)
wherein (x, y) and (m, n) are the center coordinates of the steering wheel and of the driver's hand respectively;
When the distance d from each camera angle is smaller than the preset distance threshold, the driver's hand is judged to be placed on the steering wheel.
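The overlap and distance tests of claim 4 can be sketched as follows; dividing by the steering-wheel box area and measuring the center-to-center Euclidean distance are readings of the claim's garbled formulas, not confirmed choices:

```python
import math

def overlap_percent(wheel_box, hand_box):
    """Intersection area of the two axis-aligned boxes divided by the
    steering-wheel box area, times 100 (the divisor is an assumed
    reading of the claim)."""
    x1, y1, x2, y2 = wheel_box
    m1, n1, m2, n2 = hand_box
    iw = max(0, min(x2, m2) - max(x1, m1))
    ih = max(0, min(y2, n2) - max(y1, n1))
    wheel_area = (x2 - x1) * (y2 - y1)
    return 100.0 * iw * ih / wheel_area

def hand_on_wheel(wheel_box, hand_box, wheel_center, hand_center, d_max):
    """Hand is judged 'on the wheel' when the overlap exceeds 20% and
    the Euclidean distance between the two centers is under the preset
    threshold (center-to-center distance is an illustrative reading)."""
    d = math.dist(wheel_center, hand_center)
    return overlap_percent(wheel_box, hand_box) > 20.0 and d < d_max
```

In the multi-camera setup of the claim, both conditions would have to hold across all camera angles before the gesture function is enabled.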
5. The vehicle control method based on steering wheel gesture interaction according to claim 1, wherein the steering wheel gesture recognition function is started in two modes: active start and passive start;
Active start refers to: if the camera continuously monitors the driver and detects the active-start gesture, the steering wheel gesture recognition function is started actively; otherwise it remains in a silent state;
Passive start refers to: under specific conditions, when the HUD or the in-vehicle screen is occupied, the steering wheel gesture recognition function is started automatically.
6. The vehicle control method based on steering wheel gesture interaction according to claim 5, wherein after the steering wheel gesture recognition function is started, the camera captures the driver's gesture actions, and the captured gestures are recognized and converted through image processing and machine learning algorithms, the method of gesture recognition and conversion comprising:
Identifying the fingers: identifying the fingertip, middle phalanx and end phalanx of each finger;
Identifying the degree of finger extension: if the fingertip, middle phalanx and end phalanx are not in a straight line, the finger is considered bent; otherwise it is considered straight;
Identifying the finger movement state: judging within a preset time whether the fingertip lifts and drops rapidly; if so, a tapping motion is recognized; if not, the finger is considered stationary;
Once a preset gesture is recognized, it is converted into the corresponding control instruction and transmitted to the vehicle control system for execution, and the execution result is fed back by voice or on-screen display.
7. The vehicle control method based on steering wheel gesture interaction according to claim 6, wherein the preset gestures in the steering wheel gesture recognition function comprise:
Active-start gesture: with all fingers of both hands kept bent, hold the steering wheel, tighten into fists, and grip the wheel twice within 2 seconds;
Confirmation gesture: the thumb of either the left or right hand quickly taps the steering wheel surface once within 1 second, while the other fingers remain bent;
Volume adjustment gesture: the right thumb slides more than 3 cm along the steering wheel surface, while the other fingers remain bent and still;
Brightness adjustment gesture: the left thumb slides more than 3 cm along the steering wheel surface, while the other fingers remain bent and still;
Previous-item gesture: all fingers of the left hand hold the steering wheel, tighten into a fist and grip the wheel twice within 2 seconds, while the fingers of the right hand remain bent and still;
Next-item gesture: all fingers of the right hand hold the steering wheel, tighten into a fist and grip the wheel twice within 2 seconds, while the fingers of the left hand remain bent and still;
Call-answering gesture: the index finger of either the left or right hand quickly taps the steering wheel surface once within 1 second, while the other fingers remain bent;
Call hang-up gesture: the index and middle fingers of the left or right hand simultaneously and quickly tap the steering wheel surface once within 1 second, while the other fingers remain bent;
Air volume adjustment gesture: all fingers of the left hand hold the steering wheel and tighten into a fist; adjustment starts when the fist slides more than 10 degrees along the wheel within 1 second, increasing the air volume for a clockwise slide and decreasing it for a counterclockwise slide; after the slide stops for more than 2 seconds the fist is released and the adjustment ends, while the fingers of the right hand remain bent and still;
Temperature adjustment gesture: all fingers of the right hand hold the steering wheel and tighten into a fist; adjustment starts when the fist slides more than 10 degrees along the wheel within 1 second, decreasing the temperature for a clockwise slide and increasing it for a counterclockwise slide; after the slide stops for more than 2 seconds the fist is released and the adjustment ends, while the fingers of the left hand remain bent and still.
8. The vehicle control method based on steering wheel gesture interaction according to claim 1, wherein, when the vehicle's steering wheel has a height-adjustment function, the method further comprises: automatically detecting after the vehicle is powered on whether the steering wheel position has changed; if it has changed, re-detecting the steering wheel boundary and updating the locally stored boundary coordinates; if it has not changed, directly calling the locally stored steering wheel boundary coordinates to judge the relative position of the hand and the steering wheel.
9. A vehicle control system based on a steering wheel gesture interaction based vehicle control method according to any of claims 1-8, characterized in that the system comprises: the device comprises a plurality of cameras, a steering wheel boundary detection module, a hand detection and tracking module, a relative position judgment module of hands and a steering wheel, a gesture recognition and instruction conversion module, an instruction transmission and execution module and a feedback module;
The cameras are arranged around the steering wheel of the vehicle and are used for collecting image data of the steering wheel or capturing hand images of a driver in real time;
The steering wheel boundary detection module is used for automatically detecting the steering wheel boundary through an image processing algorithm, acquiring the steering wheel boundary coordinates and storing the steering wheel boundary coordinates to the local;
The hand detection and tracking module is used for continuously starting the camera to capture hand images of the driver in real time when the vehicle is in a driving state, and detecting and tracking the hand positions of the driver in real time by utilizing image processing and computer vision technology to obtain hand position coordinates of the driver;
The relative position judging module of the hand and the steering wheel is used for comparing the steering wheel boundary coordinates with the driver's hand position coordinates in real time to judge whether the driver's hand is placed on the steering wheel; if so, the steering wheel gesture recognition function is started and the driver interacts with the in-vehicle system through gestures;
The gesture recognition and instruction conversion module is used for performing gesture recognition and conversion on the captured gesture actions through an image processing and machine learning algorithm;
The instruction transmission and execution module is used for converting the recognized gesture into a corresponding control instruction and transmitting the corresponding control instruction to the vehicle control system for execution;
The feedback module is used for feeding back the execution result in a voice or screen display mode.
CN202410533515.1A 2024-04-30 2024-04-30 Vehicle control method and system based on steering wheel gesture interaction Pending CN118107605A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410533515.1A CN118107605A (en) 2024-04-30 2024-04-30 Vehicle control method and system based on steering wheel gesture interaction


Publications (1)

Publication Number Publication Date
CN118107605A true CN118107605A (en) 2024-05-31



Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105404862A (en) * 2015-11-13 2016-03-16 山东大学 Hand tracking based safe driving detection method
CN110008834A (en) * 2019-02-28 2019-07-12 中电海康集团有限公司 A kind of the steering wheel intervention detection and statistical method of view-based access control model
CN110135398A (en) * 2019-05-28 2019-08-16 厦门瑞为信息技术有限公司 Both hands off-direction disk detection method based on computer vision
CN112036314A (en) * 2020-08-31 2020-12-04 上海商汤临港智能科技有限公司 Steering wheel hands-off detection method and device, electronic equipment and storage medium
CN115268651A (en) * 2022-08-09 2022-11-01 重庆理工大学 Implicit gesture interaction method and system for steering wheel
WO2023000119A1 (en) * 2021-07-17 2023-01-26 华为技术有限公司 Gesture recognition method and apparatus, system, and vehicle
CN117392649A (en) * 2023-12-11 2024-01-12 武汉未来幻影科技有限公司 Identification method and device for indicating operation of vehicle part and processing equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination