CN110794819A - Intelligent automobile wireless driving control system with gesture key fusion - Google Patents

Intelligent automobile wireless driving control system with gesture key fusion

Info

Publication number
CN110794819A
CN110794819A (application CN201911117896.0A)
Authority
CN
China
Prior art keywords
vehicle
end controller
palm
control
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911117896.0A
Other languages
Chinese (zh)
Inventor
张阳
罗杰
朱彬
刘亚
周玉栋
张灿明
尤虎
韩雪非
杨旭
陈向成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Lu Chang Intelligent Technology Co Ltd
Original Assignee
Shenzhen Lu Chang Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Lu Chang Intelligent Technology Co Ltd filed Critical Shenzhen Lu Chang Intelligent Technology Co Ltd
Priority to CN201911117896.0A
Publication of CN110794819A
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0016: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement characterised by the operator's input device
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00: Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20: Means to switch the anti-theft system on or off
    • B60R25/25: Means to switch the anti-theft system on or off using biometry
    • B60R25/252: Fingerprint recognition
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/005: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with signals other than visual, e.g. acoustic, haptic

Abstract

A gesture-and-key-fused wireless driving control system for intelligent vehicles is provided, addressing the poor operating accuracy and user experience of existing intelligent driving take-over techniques. The system comprises a vehicle body controller, a vehicle-end controller, a palm-end controller and a camera. The palm-end controller generates control parameters for vehicle steering, braking, accelerator, gear and parking, receives voice instructions, and transmits them to the vehicle-end controller for processing through a short-distance wireless module. The vehicle-end controller receives the control parameters sent by the palm-end controller, performs further processing of this information using the camera's video data, packages the corresponding control messages, and sends them over the CAN bus to the vehicle's actuators to complete vehicle control. The invention allows the vehicle to be operated with one hand, freeing the safety operator's other hand to hold a handrail, which improves the operator's stability and safety while driving.

Description

Intelligent automobile wireless driving control system with gesture key fusion
Technical Field
The invention relates to the technical field of wireless driving, in particular to an intelligent automobile wireless driving control system with gesture key fusion.
Background
In the field of intelligent driving, vehicles are increasingly automated and the need to control them manually in specific scenarios is decreasing; some new vehicle designs even omit traditional operating mechanisms such as the steering wheel and brake pedal. Nevertheless, the vehicle still needs to be taken over and controlled manually in certain scenarios, such as low-speed towing, manual obstacle avoidance and fine driving maneuvers.
For example, emerging driverless minibuses such as the Dongfeng Sharing-VAN and the Yutong unmanned minibus have gradually removed traditional operating mechanisms such as the steering wheel and brake pedal. When such a vehicle encounters a scenario that requires manually maneuvering it, a worker connects an operating device similar to a game controller inside the vehicle and operates the vehicle with two-handed key combinations. This does achieve control of the vehicle, but the form is dated and the operating accuracy and experience are poor.
Disclosure of Invention
The gesture-and-key-fused wireless driving control system for intelligent vehicles provided by the invention addresses the technical problems of poor operating accuracy and poor experience in existing intelligent driving take-over techniques.
In order to achieve the purpose, the invention adopts the following technical scheme:
the utility model provides a wireless driving control system of intelligent car of gesture button integration, includes:
based on the automobile body controller, the system also comprises an automobile end controller, a palm end controller and a camera;
the palm end controller and the camera are respectively in communication connection with the vehicle end controller, and the vehicle end controller is in communication connection with the automobile body controller through a CAN bus;
the palm-end controller is held by an operator and comprises a fingerprint identification module;
the camera is arranged on the front windshield of the automobile and directly faces the driving position, and is used for collecting the face information of a driver sitting on the driving position;
the palm-end controller is used for generating control parameters for steering, braking, accelerator, gear and parking of the vehicle, receiving voice instructions and transmitting the voice instructions to the vehicle-end controller for processing through the short-distance wireless module;
the vehicle-end controller is used for receiving various control parameters sent by the palm-end controller, performing advanced processing on the set information by using the video data of the camera, completing the packaging of corresponding control messages, sending the control messages to corresponding execution components of the vehicle through the CAN bus, and completing the vehicle control.
Furthermore, the palm-end controller comprises a processor, and a fingerprint sensor, an inertial measurement unit, a short-distance wireless communication module, a linear key, a toggle key, a microphone and a loudspeaker that are each in communication connection with the processor;
the fingerprint sensor is used for collecting the operator's fingerprint and checking identity;
the inertial measurement unit is used for estimating the tilt angle of the operator's arm swing and converting it into a first control parameter for vehicle steering, which is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the linear key circuit generates a second control parameter for controlling the accelerator and brake, which is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the toggle key and self-locking key generate a third control parameter for controlling the gear and parking, which is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the microphone is used for collecting the operator's voice, which is decoded and transmitted to the vehicle-end controller through the short-distance wireless communication module;
the loudspeaker is used for voice output and voice prompts, informing the driver of relevant states and information.
Furthermore, the palm-end controller also comprises an indicator light, which is in communication connection with the processor and is used for indicating abnormal conditions.
Furthermore, the palm-end controller also comprises a vibration motor, which is in communication connection with the processor and is used for indicating abnormal conditions.
Furthermore, the vehicle-end controller comprises a second processor, and a second short-distance wireless communication module, an expansion interface and a CAN bus transceiver controller that are each in communication connection with the second processor;
the second processor receives the operator identity information acquired from the fingerprint and sent by the palm-end controller, and checks both identities against the driver face-recognition information obtained by reading and processing the camera video, which guarantees that the operator's authority is legitimate and that the driver operates within the effective camera area, providing the conditions for subsequent safe operation;
the second processor also receives the driver arm tilt information and the generated steering control parameters sent by the palm-end controller; it acquires the camera's video data and recognizes the operator's gestures with machine learning and deep learning methods to obtain a pixel-level estimate of the operator's arm tilt angle in the image, and then fuses the IMU tilt angle with the pixel-fitted tilt angle to obtain a more reliable measurement of the operator's arm tilt, which is converted into a more stable steering control parameter;
the second processor also receives the operator's decoded voice data sent by the palm-end controller and generates voice control instructions through further analysis and processing;
and the second processor converts the key codes and control parameters for the accelerator, brake, parking and gear sent by the palm-end controller into corresponding vehicle body control messages, converts the direction control parameters generated by the fusion gesture processing into vehicle body control messages, generates corresponding vehicle body control messages according to the voice commands, and sends these messages to the CAN bus at the vehicle control cycle to complete vehicle body control.
Furthermore, the second processor analyzes the operator's fatigue and attention state from the video data collected by the camera, providing a safety verification condition for vehicle operation.
Further, the inertial measurement unit of the palm-end controller estimates the tilt angle of the operator's arm swing and converts it into a first control parameter for vehicle steering, which is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the tilt angle of the operator's arm swing is calculated as follows:
based on the Euler angles yaw, pitch and roll in an east-north-up coordinate system, the range of roll is set as r1 ∈ [-k°, +k°], where k < 45 according to the comfortable range of motion of the human joints;
after the vehicle-end controller receives a message frame, it caches the payload data; at the same time the camera captures the operator's arm posture, and the included angle r2 between the central axis of the arm and the image y-axis is calculated, where r2 ∈ [-k°, +k°], k < 45; the calculation proceeds as follows:
the r1 value from the payload and r2 are fused using a Kalman filter, which filters out sudden jitter from the inertial measurement unit or the camera and yields a more stable arm tilt angle value:
kalman prediction equation:
X(k|k-1) = A·X(k-1|k-1) + B·U(k)
P(k|k-1) = A·P(k-1|k-1)·A^T + Q
Update equations:
Kg(k) = P(k|k-1)·H^T / (H·P(k|k-1)·H^T + R)
X(k|k) = X(k|k-1) + Kg(k)·(Z(k) - H·X(k|k-1))
P(k|k) = (I - Kg(k)·H)·P(k|k-1)
Let U(k) = r1 and Z(k) = r2.
[The state vector X and the matrices A and B are given as an image in the original document.]
Here P is the covariance matrix of X, k is the discrete time index, dt is the sampling interval of 50 ms, Q_bias is the IMU angular-velocity drift, Kg is the Kalman gain, H = [1 0], I is the identity matrix, and R is the noise of the angle between the arm and the y-axis computed from the video image, taken as 0.01°;
calculating a prediction fusion angle r by continuously completing the prediction and updating processes;
then mapping the inclination angle r into a steering wheel turning angle value; the mapping method comprises the following steps:
knowing that the range of r is ±k° (k < 45) and that the steering wheel angle range controlled by the controller of the electric power steering system is S ∈ [-t°, +t°], r is mapped onto the full range of S using a cubic polynomial model:
S = a*r^3 + b*r^2 + c*r + d
and S is the level-2 direction control command; the vehicle-end controller parses the cached palm-end controller message frame, packages the CAN control commands according to the CAN protocol of the vehicle controller, and sends them to the vehicle body CAN bus to control the vehicle body.
According to the above technical solution, the gesture-and-key-fused wireless driving control system for intelligent vehicles requires a vehicle whose accelerator, brake, gear and parking functions support drive-by-wire control (a capability provided when the vehicle is produced), and it is used to take over and drive the vehicle manually in specific scenarios. The operator grips the palm-end controller with one hand and raises that arm toward the camera, while the other hand can hold a handrail to keep the body stable. Dual identity authentication is completed through the palm-end controller and the vehicle-end controller; the vehicle's accelerator, brake, gear and parking are controlled through the palm-end controller's keys; steering is controlled by tilting the arm left or right (the palm-end IMU attitude solution is fused with the camera's gesture analysis); and richer interaction with the vehicle (lights, entertainment, windows and doors, etc.) is completed through speech recognition.
Aiming at the manual control scene of unmanned vehicles without a steering wheel, a brake pedal and the like, the invention mainly has the following beneficial effects:
(1) Dual identity authentication ensures operating legitimacy. The safety requirements for managing unmanned vehicles are very high, and the risk of an unauthorized person taking control of the vehicle must not arise; dual identity authentication is completed through fingerprint recognition on the palm-end controller and face recognition at the vehicle-end controller, ensuring that the operator is legitimate;
(2) The safe operating area is restricted. If the vehicle were operated purely by remote control, the safety operator would not be on the vehicle and would have insufficient judgment of the vehicle's surroundings, so safety risks could arise. Because the palm-end controller and the vehicle-end controller jointly perform gesture direction recognition, the safety operator is confined to the camera area, and the operator's driving authority is revoked if the operator leaves that area;
(3) The vehicle can be operated with one hand, freeing the safety operator's other hand to hold a handrail and improving the operator's stability and safety while driving;
(4) The palm-end controller follows ergonomic design; the design and layout of the press keys, toggle switch and microphone make the operating experience better;
(5) Dual gesture direction recognition makes control of the vehicle's direction more stable while remaining convenient.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a schematic diagram of the tilting motion of the arm holding the palm-end controller according to the present invention;
FIG. 3 is a schematic diagram of the linear key principle of the present invention;
FIG. 4 is a schematic diagram of the external structure of the palm-end controller of the present invention;
FIG. 5 is a schematic block diagram of the palm-end controller of the present invention;
FIG. 6 is a schematic block diagram of the vehicle-end controller of the present invention;
FIG. 7 is a flow chart of the operation of the present invention;
FIG. 8 is a diagram illustrating a structure of a private frame for short-range wireless transmission according to an embodiment of the present invention;
FIG. 9 is a gear mapping table of an embodiment of the present invention;
FIG. 10 is a graph of target throttle opening p for an embodiment of the present invention;
FIG. 11 is a parking map of an embodiment of the present invention;
FIG. 12 is a schematic view of a palm end rotation coordinate system and a rotation angle;
FIG. 13 is a schematic view of the range of angles through which the arm swings about the elbow joint;
FIG. 14 is a flowchart illustrating the calculation of the arm tilt angle according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
The intelligent automobile wireless driving control system with the gesture key fusion mainly comprises a palm-end controller, an automobile-end controller, a camera and a necessary cable, wherein the palm-end controller and the camera are respectively in communication connection with the automobile-end controller, and the automobile-end controller is in communication connection with an automobile body controller through a CAN bus;
the palm-end controller is held by an operator and comprises a fingerprint identification module;
the camera is arranged on the front windshield of the automobile or at a position which can be directly opposite to the driving position at other positions in the automobile, and is used for collecting the face information of a driver sitting on the driving position;
the following is detailed for each module:
palm end controller
The palm-end controller is used for generating control parameters and operator voice instructions for steering, braking, accelerator, gear and parking of the vehicle and transmitting the control parameters and the operator voice instructions to the vehicle-end controller for processing through the short-distance wireless module.
The palm-end controller mainly comprises a linear key, a toggle key, an indicator light, a microphone, a fingerprint sensor, a processor, a wireless transceiver module and a built-in rechargeable battery. It is roughly the size of an ordinary passenger-car key fob, can be held in one hand, and provides the following functions:
collecting the fingerprint of a control person through a fingerprint sensor, and checking the identity;
the inclination angle of the arm swing of the operator is estimated through an IMU (inertial measurement unit) comprising an accelerometer and a gyroscope, and is converted into a control parameter of vehicle steering, and the control parameter is transmitted to a vehicle-end controller through a short-distance wireless communication module. An operator needs to make an arm holding the palm-end controller as vertical as possible, and the attitude angle calculated by the internal IMU is synchronously changed by inclining left and right, so that the inclination angle of the arm is calculated; as shown in fig. 2;
control parameters for controlling the accelerator and the brake are generated through the linear key circuit, and the control parameters can be transmitted to the vehicle-end controller through the short-distance wireless communication module. The linear key is a specially designed structure with certain resilience, a finger can output a linear analog signal after being pressed, the linear analog signal is collected by the processor, if the finger is released, the key can automatically rebound and reset, and the schematic diagram of the internal principle of the key is shown in fig. 3;
control parameters for controlling gears and parking are generated through shifting and self-locking keys, and the control parameters can be transmitted to a vehicle-end controller through a short-distance wireless communication module. The toggle button and the self-locking button are common electronic components and parts, and can respectively realize the purpose of toggling gears to output different electronic signal quantities and the purpose of pressing self-locking to generate different electronic signals;
the voice of a controller is collected through the microphone, the decoded voice is transmitted to the vehicle-end controller through the short-distance wireless communication module, the microphone is arranged at a certain port of the palm-end controller, and the palm-end controller is close to the mouth of an operator when being held, so that the operator can input the voice with the voice smaller than the voice normally communicated with people, and the design can reduce the volume requirement on voice control;
related working states such as abnormity, low electric quantity, illegal authority and the like are indicated through the working lamp and the vibrating motor;
the short-distance wireless communication module is connected with the vehicle-end controller to complete control parameter and instruction transmission and complete bidirectional interaction. The external structure schematic diagram of the palm-end controller is shown in FIG. 4;
a schematic diagram of the palm-end controller principle framework is shown in fig. 5.
Vehicle end controller
The vehicle-end controller is used for receiving various control parameters sent by the palm-end controller, processing the video data of the camera to complete the deep processing of key information, such as secondary comparison of operator identities, operator state detection, gesture inclination angle fusion processing of operators and the like, and meanwhile completing the packaging of corresponding control messages, and sending the control messages to corresponding execution components of the vehicle through the CAN bus to complete vehicle control.
The vehicle-end controller mainly comprises a camera, a processor, a short-distance wireless communication module, an expansion interface (such as a vehicle networking module), a CAN bus transceiver controller and the like; its principle framework is shown in fig. 6;
the functions are as follows:
double identity authentication: the data of the palm-end controller is received, the identity information of the operator acquired by the fingerprint is contained, and the double identities are checked by further processing the face identification information of the driver acquired by the camera video, so that the legality of the authority of the operator is guaranteed, the driver is guaranteed to operate in an effective lens area, and conditions are provided for subsequent safe operation;
operator state analysis: the video data collected by the camera is used for analyzing the fatigue, attention and other conditions of an operator, and providing safety verification conditions for vehicle operation;
and (3) fusion gesture processing: the data of the palm-end controller are received, the inclination angle information of the arm of the driver and the generated steering control parameters are included, at the moment, the gesture actions of the operator are recognized by the aid of methods such as machine learning and deep learning through the video data of the camera, pixel-level judgment of the inclination angle of the arm of the operator in the image is obtained, the IMU inclination angle and the pixel fitting inclination angle are fused, more reliable measurement of the inclination angle of the arm of the operator is obtained, and the more reliable measurement of the inclination angle of the arm of the operator is converted into the steering control parameters with higher stability. The process can eliminate the shaking and misoperation behaviors of a large number of operators;
and (3) voice control instruction analysis: receiving data of the palm-end controller, wherein the data contains voice decoding data of an operator, and generating a voice control instruction through further analysis and processing;
generating and forwarding a vehicle body control command: converting key codes and control parameters of an accelerator, a brake, a parking position and a gear position transmitted from a palm-end controller into corresponding vehicle body control messages, converting direction control parameters generated after the fusion gesture processing into vehicle body control messages, generating corresponding vehicle body control messages according to voice commands, and sending the messages to a CAN bus according to a vehicle control cycle to complete vehicle body control;
the message information and the key-related mapping data transmitted from the palm-end controller to the vehicle-end controller are shown in fig. 8:
frame head and frame length are self-defined code words;
the load section is a control parameter;
the check is a frame check field, and a code word such as CS or CRC can be used.
Gear mapping: as shown in fig. 9, the state value of the toggle switch corresponds to a physical gear, and the gear value is stored;
Throttle mapping: the ADC value of the linear push key is mapped to the throttle opening, and the throttle opening value is stored. In the curve of fig. 10, the horizontal axis e is the ADC value of the push key, which can be normalized; the vertical axis p is the throttle opening value, whose range can be 0-255 according to the vehicle controller protocol so that the throttle controller can drive the throttle opening directly; f is the mapping curve:
p = a*(e/W)^2 + b*(e/W) + c
wherein a, b and c are coefficients, W is the full-scale range of the ADC, e/W is the ADC numerical normalization, and p is the accelerator opening.
Since the range of the target throttle opening p is known (it is the manually operated control target, as shown in fig. 10) and the value of e is also available, the values of the coefficients a, b and c can be determined through experimental testing.
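A minimal sketch of this throttle mapping is given below: the quadratic model p = a*(e/W)^2 + b*(e/W) + c is fitted to a handful of calibration points and then applied to raw ADC readings. The ADC resolution and the calibration values are invented for illustration; in a real system they come from the experimental tests mentioned above.

```python
# Sketch of the throttle mapping; calibration data and ADC width are assumptions.
import numpy as np

W = 4095                     # full-scale range of a 12-bit ADC (assumed)

# Hypothetical calibration pairs: normalized key travel -> target throttle opening (0..255)
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
p_target = np.array([0.0, 40.0, 100.0, 175.0, 255.0])

a, b, c = np.polyfit(x, p_target, 2)     # least-squares fit of the quadratic model

def throttle_opening(e_adc: int) -> int:
    """Map a raw linear-key ADC reading to a throttle opening command (0..255)."""
    ratio = e_adc / W
    p = a * ratio**2 + b * ratio + c
    return int(min(max(p, 0), 255))      # clamp to the controller protocol range
```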
Brake mapping: the linear press key is mapped to a brake-pressure curve and the brake pressure value is stored; the mapping process is similar to the throttle mapping and is not repeated here.
Parking mapping: the stored value of the self-locking switch controls the vehicle's parking state, as shown in fig. 11;
Direction level-1 mapping: the measured value of the IMU attitude angle roll is stored.
an IMU (inertial measurement unit) in the palm-end controller can output attitude angles and Euler angles yaw, pitch and roll under an east-north-sky coordinate system, the part of the principle belongs to a strapdown inertial navigation theory, the invention does not expand or deduce, the schematic diagram of the attitude angles and the northeast-sky coordinate system is shown in the following figure, and after the static initialization in the step (3) of the working flow, the east-north-sky system initialized by the coordinate system is as follows: the operator faces the front of the vehicle north, the right hand east, and directly above east. When the arm rotates to an inclined angle left and right along with the elbow joint, the arm actually rotates around the north axis in the coordinate system, and the rotation angle is a roll, as shown in fig. 12.
According to the comfortable range of human joint movement, the range of roll is defined as r1 ∈ [-k°, +k°], with k < 45.
After the vehicle-end controller receives a message frame, it caches the payload data; at the same time the camera captures the operator's arm posture, and the included angle r2 between the central axis of the arm and the image y-axis is calculated, where r2 ∈ [-k°, +k°], k < 45. A keypoint-based sketch of this estimate is given below; r1 and r2 are then fused as follows:
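One simple way to obtain r2, assuming a pose estimator that returns elbow and wrist pixel coordinates, is to take the signed angle between the elbow-to-wrist segment and the image vertical. The sketch below shows this; the choice of keypoints and the pose detector itself are assumptions, since the patent only states that the arm angle is obtained by gesture recognition on the camera image.

```python
# Hedged sketch: estimate r2 from two assumed pose keypoints (image x right, y down).
import math

def arm_tilt_r2(elbow_xy, wrist_xy, k=30.0):
    """Signed angle (deg) between the elbow->wrist segment and the image vertical,
    positive when the forearm leans toward +x, clamped to [-k, +k]."""
    dx = wrist_xy[0] - elbow_xy[0]
    dy = elbow_xy[1] - wrist_xy[1]          # image y grows downward, so flip the sign
    r2 = math.degrees(math.atan2(dx, dy))   # 0 deg when the forearm points straight up
    return max(-k, min(k, r2))

# Example: wrist 20 px to the right of and 100 px above the elbow -> about +11.3 deg
print(arm_tilt_r2((320, 400), (340, 300)))
```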
as shown in fig. 14, r1 and r2 in the load segment are fused, and fused by using a kalman filter method, so as to filter out sudden jitter identified by the inertial measurement unit or the camera, and obtain a more stable arm tilt angle value:
kalman prediction equation:
X(k|k-1) = A·X(k-1|k-1) + B·U(k)
P(k|k-1) = A·P(k-1|k-1)·A^T + Q
Update equations:
Kg(k) = P(k|k-1)·H^T / (H·P(k|k-1)·H^T + R)
X(k|k) = X(k|k-1) + Kg(k)·(Z(k) - H·X(k|k-1))
P(k|k) = (I - Kg(k)·H)·P(k|k-1)
Let U(k) = r1 and Z(k) = r2.
[The state vector X and the matrices A and B are given as an image in the original document.]
Here P is the covariance matrix of X, k is the discrete time index, dt is the sampling interval of 50 ms, Q_bias is the IMU angular-velocity drift, Kg is the Kalman gain, H = [1 0], I is the identity matrix, and R is the noise of the angle between the arm and the y-axis computed from the video image, taken as 0.01°;
calculating a prediction fusion angle r by continuously completing the prediction and updating processes;
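A hedged sketch of this r1/r2 fusion is shown below. The state vector and the A, B matrices are given only as a figure in the original document, so the example assumes the common two-state form X = [angle, gyro_bias] with A = [[1, -dt], [0, 1]] and B = [dt, 0]^T, which is consistent with the listed parameters (dt = 50 ms, Q_bias, H = [1 0], R = 0.01°); the process-noise values are likewise assumptions. Treat it as an illustration of the prediction/update equations, not the patented filter itself.

```python
# Illustrative Kalman fusion of the IMU angle r1 and the camera angle r2.
import numpy as np

dt = 0.05                                   # 50 ms sampling interval
A = np.array([[1.0, -dt], [0.0, 1.0]])      # assumed state transition (angle, gyro bias)
B = np.array([[dt], [0.0]])                 # assumed control input matrix
H = np.array([[1.0, 0.0]])                  # measurement matrix, H = [1 0] as stated
Q = np.diag([1e-4, 1e-6])                   # assumed process noise for angle and bias
R = np.array([[0.01]])                      # camera angle measurement noise (deg)

X = np.zeros((2, 1))                        # state: [fused angle, drift estimate]
P = np.eye(2)                               # state covariance

def fuse_step(r1: float, r2: float) -> float:
    """One prediction + update cycle; returns the fused arm tilt angle r."""
    global X, P
    # Prediction: X(k|k-1) = A X(k-1|k-1) + B U(k), P(k|k-1) = A P A^T + Q
    X = A @ X + B * r1
    P = A @ P @ A.T + Q
    # Update: Kg = P H^T (H P H^T + R)^-1, then correct with the camera angle Z(k) = r2
    Kg = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    X = X + Kg @ (np.array([[r2]]) - H @ X)
    P = (np.eye(2) - Kg @ H) @ P
    return float(X[0, 0])
```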
the tilt angle r needs to be mapped to a steering wheel angle value, i.e. the above-mentioned directional control level 2 command, to be used by the directional controller of the vehicle chassis.
The mapping method comprises the following steps:
given that the range of r is ± k °, k <45, and given that the steering wheel angle range controlled by an EPS (electric power steering) controller is S e [ -t °, + t ° ], r can be mapped to the full range of S by a linear mapping, similar to the throttle mapping, where a cubic linear model can be used:
S = a*r^3 + b*r^2 + c*r + d
and S is the level-2 direction control command; the vehicle-end controller parses the cached palm-end controller message frame, packages the CAN control commands according to the CAN protocol of the whole-vehicle controller, and sends them to the vehicle body CAN bus to control the vehicle body.
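The cubic mapping can be illustrated as follows; the limits K_MAX and T_MAX and the coefficient choice are examples only, since the patent fixes only the model form S = a*r^3 + b*r^2 + c*r + d and the requirement that r cover the full range of S.

```python
# Example coefficients for the cubic arm-angle -> steering-wheel-angle mapping (assumed values).
K_MAX = 30.0        # assumed arm tilt limit, k < 45
T_MAX = 540.0       # assumed EPS steering wheel angle limit

# One simple choice: an odd, monotonic mapping that covers the full range of S
a = T_MAX * 0.3 / K_MAX**3
b = 0.0
c = T_MAX * 0.7 / K_MAX
d = 0.0

def steering_angle(r: float) -> float:
    r = max(-K_MAX, min(K_MAX, r))                 # clamp the fused arm angle
    return a * r**3 + b * r**2 + c * r + d         # direction control level-2 command

# r = +-K_MAX maps to +-T_MAX; small r gives a gentler steering response
assert abs(steering_angle(K_MAX) - T_MAX) < 1e-6
```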
System workflow
The system can work normally and needs the following conditions:
(1) the palm-end controller is sufficiently charged; a dedicated charging cradle can be provided on the vehicle;
(2) the vehicle has drive-by-wire capability, so that the accelerator, gear, brake, steering, parking and other functions can be controlled through CAN messages.
Assuming the intelligent driving vehicle is fitted with the palm-end controller and vehicle-end controller components of this embodiment of the invention, a safety operator enters the vehicle to prepare to take over the vehicle manually; the workflow is shown in fig. 7:
1) The safety operator removes the palm-end controller from the charging cradle in the vehicle, holds it in one hand with the correct posture, and holds the handrail with the other hand to keep the body stable; removing the palm-end controller from its cradle activates the control system, which then waits and reports its state with sound and light prompts;
2) The safety operator presses the fingerprint recognition module of the palm-end controller while looking toward the front windshield; a camera facing the interior of the vehicle is mounted above this area to collect the face information and is connected to the vehicle-end controller. The palm-end controller transmits the recognized identity information over the short-distance wireless channel to the vehicle-end controller, where it is compared with the face-recognition identity completed by the vehicle-end controller to obtain the dual identity authentication result. If authentication passes, the driver's control authority over the vehicle is activated; otherwise the driver must re-authenticate, and after repeated failures the vehicle's driving authority is locked for a short period. The safety operator must also remain within the camera area; if the operator moves out of it, the driving authority is suspended, and if it remains suspended for a long time it is locked;
3) If identity authentication passes, the safety operator raises the arm holding the palm-end controller as directed by the voice prompt and holds it still for a few seconds so that the palm-end IMU can complete its initial attitude computation, while the vehicle-end controller and camera recognize the posture at the same time; once the system confirms, everything is ready;
4) The safety operator generates vehicle control commands through the keys and microphone of the palm-end controller: the linear press keys map to the accelerator and brake commands, the toggle key maps to the gear command, the self-locking key maps to the parking brake command, the microphone and speech recognition module map to voice control commands, and the built-in IMU attitude maps to the direction control level-1 command. The commands are transmitted from the palm-end controller to the vehicle-end controller over the short-distance wireless channel;
5) The vehicle-end controller receives and parses the commands from step 4), recognizes the operator's arm tilt angle through the camera, maps it to a direction control level-2 command, fuses it with the received direction control level-1 command, and generates the final direction control command;
6) The vehicle-end controller packages the relevant commands and transmits them to the vehicle body CAN network; the vehicle controller receives and parses them, generates accelerator, brake, steering and other CAN messages according to the chassis drive-by-wire protocol, and sends them to the chassis CAN bus. Each chassis actuator, such as the EPS and ESC, drives its mechanical components after receiving the messages, completing the vehicle control.
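For illustration, the sketch below packages such commands into CAN frames with the python-can library. The arbitration IDs and byte layouts are hypothetical; a real deployment must follow the vehicle controller's own drive-by-wire CAN protocol.

```python
# Illustrative CAN packaging of steering and drive commands; IDs and layouts are assumptions.
import can
import struct

STEER_CMD_ID = 0x101      # hypothetical arbitration ID for steering commands
DRIVE_CMD_ID = 0x102      # hypothetical arbitration ID for accelerator/brake/gear

def send_control(bus: can.BusABC, steer_deg: float, throttle: int, brake: int, gear: int):
    # Steering: signed angle in 0.1 deg units, big-endian 16-bit, padded to 8 bytes
    steer_payload = struct.pack(">h6x", int(steer_deg * 10))
    bus.send(can.Message(arbitration_id=STEER_CMD_ID, data=steer_payload,
                         is_extended_id=False))
    # Drive: throttle opening (0-255), brake level (0-255), gear code, padded to 8 bytes
    drive_payload = struct.pack(">BBB5x", throttle, brake, gear)
    bus.send(can.Message(arbitration_id=DRIVE_CMD_ID, data=drive_payload,
                         is_extended_id=False))

# Example usage on a Linux SocketCAN interface (interface name is an assumption):
# bus = can.interface.Bus(channel="can0", bustype="socketcan")
# send_control(bus, steer_deg=-15.0, throttle=40, brake=0, gear=3)
```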
In summary, the embodiment of the present invention has the following features:
the system provides dual identity authentication, ensuring operating legitimacy;
the vehicle can be operated and controlled with one hand;
an integrated linear rebound key circuit lets the fingers simulate the feel of the accelerator and brake pedals;
voice control is supported, and because the microphone is positioned close to the mouth, commands do not need to be spoken loudly, so the requirements on the voice input environment are low;
gesture control fusing video and IMU data makes steering control more stable.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A gesture-and-key-fused wireless driving control system for an intelligent vehicle, based on a vehicle body controller, characterized in that the system further comprises a vehicle-end controller, a palm-end controller and a camera;
the palm end controller and the camera are respectively in communication connection with the vehicle end controller, and the vehicle end controller is in communication connection with the automobile body controller through a CAN bus;
the palm-end controller is held by an operator and comprises a fingerprint identification module;
the camera is arranged at a position in the vehicle facing the driving position and is used for collecting the face information of a driver sitting in the driving position;
the palm-end controller is used for generating control parameters for steering, braking, accelerator, gear and parking of the vehicle, receiving voice instructions and transmitting the voice instructions to the vehicle-end controller for processing through the short-distance wireless module;
the vehicle-end controller is used for receiving the various control parameters sent by the palm-end controller, performing further processing of this information using the camera's video data, packaging the corresponding control messages, and sending the control messages over the CAN bus to the corresponding actuators of the vehicle to complete vehicle control.
2. The intelligent automobile wireless driving control system based on gesture key fusion of claim 1, characterized in that: the palm end controller comprises a processor, and further comprises a fingerprint sensor, an inertia measurement unit, a short-distance wireless communication module, a linear key, a toggle key, a microphone and a loudspeaker which are in communication connection with the processor;
the fingerprint sensor is used for collecting fingerprints of control personnel and checking identities;
the inertia measurement unit is used for estimating the tilt angle of the operator's arm swing and converting it into a first control parameter of vehicle steering, and the first control parameter is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the linear key circuit generates a second control parameter for controlling an accelerator and a brake, and the second control parameter is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the toggle key and self-locking key generate a third control parameter for controlling the gear and parking, and the third control parameter is transmitted to the vehicle-end controller through the short-distance wireless communication module;
the microphone is used for collecting the voice of a controller, and the voice is decoded and transmitted to the vehicle-end controller through the short-distance wireless communication module;
the loudspeaker is used for finishing voice output or voice prompt and informing the driver of relevant states and information.
3. The intelligent automobile wireless driving control system with the integrated gesture keys according to claim 2, characterized in that: the palm-end controller also comprises an indicator light which is in communication connection with the processor and used for prompting abnormal information.
4. The intelligent automobile wireless driving control system with the integrated gesture keys according to claim 2, characterized in that: the palm end controller further comprises a vibration motor, and the vibration motor is in communication connection with the processor and used for prompting abnormal information.
5. The intelligent automobile wireless driving control system based on gesture key fusion of claim 1, characterized in that: the vehicle-end controller comprises a second processor, a second short-distance wireless communication module, an expansion interface and a CAN bus transceiving controller, wherein the second short-distance wireless communication module, the expansion interface and the CAN bus transceiving controller are respectively in communication connection with the second processor;
the second processor receives the operator identity information acquired by the fingerprint sent by the palm-end controller, and checks the double identities by combining with the driver face identification information acquired by reading and processing the camera video, so that the legality of the authority of the operator is guaranteed, the driver is guaranteed to operate in an effective lens area, and conditions are provided for subsequent safe operation;
the second processor also receives the inclination angle information of the arm of the driver and the generated steering control parameters sent by the palm-end controller, acquires the pixel-level judgment of the inclination angle of the arm of the operator in the image by acquiring the video data of the camera and identifying the gesture actions of the operator by using a machine learning and deep learning method, and then fuses the IMU inclination angle and the pixel fitting inclination angle to obtain more reliable inclination measurement of the arm of the operator so as to convert the inclination measurement into the steering control parameters with the stability meeting the set requirement;
the second processor also receives voice decoding data of an operator sent by the palm-end controller, and generates a voice control instruction through further analysis and processing;
and the second processor converts the key codes and control parameters of the accelerator, the brake, the parking and the gear positions sent by the palm-end controller into corresponding vehicle body control messages, converts the direction control parameters generated after the fusion gesture processing into the vehicle body control messages, generates the corresponding vehicle body control messages according to the voice commands, and sends the messages to the CAN bus according to the vehicle control cycle to complete vehicle body control.
6. The intelligent automobile wireless driving control system based on gesture key fusion of claim 5, characterized in that: the processor analyzes the fatigue and attention conditions of the operator through the video data collected by the camera, and provides a safety verification condition for vehicle operation.
7. The intelligent automobile wireless driving control system based on gesture key fusion of claim 5, characterized in that: the inertia measurement unit of the palm-end controller is used for estimating the tilt angle of the operator's arm swing and converting it into a first control parameter of vehicle steering, and the first control parameter is transmitted to the vehicle-end controller through the short-distance wireless communication module;
wherein the calculation of the inclination angle of the operator's arm swing is as follows:
based on the Euler angles yaw, pitch and roll in an east-north-up coordinate system, the range of roll is set as r1 ∈ [-k°, +k°], where k < 45 according to the comfortable range of motion of the human joints;
after the vehicle-end controller receives a message frame, it caches the payload data; at the same time the camera captures the operator's arm posture, and the included angle r2 between the central axis of the arm and the image y-axis is calculated, where r2 ∈ [-k°, +k°], k < 45; the calculation proceeds as follows:
the r1 value from the payload and r2 are fused using a Kalman filter, which filters out sudden jitter from the inertial measurement unit or the camera and yields a more stable arm tilt angle value:
kalman prediction equation:
X(k|k-1) = A·X(k-1|k-1) + B·U(k)
P(k|k-1) = A·P(k-1|k-1)·A^T + Q
Update equations:
Kg(k) = P(k|k-1)·H^T / (H·P(k|k-1)·H^T + R)
X(k|k) = X(k|k-1) + Kg(k)·(Z(k) - H·X(k|k-1))
P(k|k) = (I - Kg(k)·H)·P(k|k-1)
Let U(k) = r1 and Z(k) = r2, where P is the covariance matrix of X, k is the discrete time index, dt is the sampling interval of 50 ms, Q_bias is the IMU angular-velocity drift, Kg is the Kalman gain, H = [1 0], I is the identity matrix, and R is the noise of the angle between the arm and the y-axis computed from the video image, taken as 0.01°;
calculating a prediction fusion angle r by continuously completing the prediction and updating processes;
then mapping the inclination angle r into a steering wheel turning angle value; the mapping method comprises the following steps:
knowing that the range of r is ±k° (k < 45) and that the steering wheel angle range controlled by the controller of the electric power steering system is S ∈ [-t°, +t°], r is mapped onto the full range of S using a cubic polynomial model:
S = a*r^3 + b*r^2 + c*r + d
and S is the level-2 direction control command; the vehicle-end controller parses the cached palm-end controller message frame, packages the CAN control commands according to the CAN protocol of the vehicle controller, and sends them to the vehicle body CAN bus to control the vehicle body.
8. The intelligent automobile wireless driving control system based on gesture key fusion of claim 1, characterized in that: the camera is arranged on the front windshield in the automobile and is right opposite to the driving position.
CN201911117896.0A 2019-11-13 2019-11-13 Intelligent automobile wireless driving control system with gesture key fusion Pending CN110794819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911117896.0A CN110794819A (en) 2019-11-13 2019-11-13 Intelligent automobile wireless driving control system with gesture key fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911117896.0A CN110794819A (en) 2019-11-13 2019-11-13 Intelligent automobile wireless driving control system with gesture key fusion

Publications (1)

Publication Number Publication Date
CN110794819A true CN110794819A (en) 2020-02-14

Family

ID=69444816

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911117896.0A Pending CN110794819A (en) 2019-11-13 2019-11-13 Intelligent automobile wireless driving control system with gesture key fusion

Country Status (1)

Country Link
CN (1) CN110794819A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112114969A (en) * 2020-09-23 2020-12-22 北京百度网讯科技有限公司 Data processing method and device, electronic equipment and storage medium
CN113721632A (en) * 2021-09-08 2021-11-30 阿波罗智能技术(北京)有限公司 Vehicle remote control method, device and equipment and cloud cockpit
CN113805508A (en) * 2021-09-03 2021-12-17 奇瑞汽车股份有限公司 Interactive controller of intelligent automobile

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5989123A (en) * 1994-05-20 1999-11-23 Sega Enterprises, Ltd. Steering wheel control apparatus for a television game machine
US20120283894A1 (en) * 2001-10-24 2012-11-08 Mouhamad Ahmad Naboulsi Hands on steering wheel vehicle safety control system
CN103067630A (en) * 2012-12-26 2013-04-24 刘义柏 Method of generating wireless control command through gesture movement of mobile phone
CN204123979U (en) * 2014-10-20 2015-01-28 东北石油大学 Based on the information acquisition tow lift that three axle Gravity accelerometers control
CN106960486A (en) * 2016-01-08 2017-07-18 福特全球技术公司 The system and method that functional characteristic activation is carried out by gesture identification and voice command
JP2019049146A (en) * 2017-09-11 2019-03-28 Ihi運搬機械株式会社 Automatic loading and unloading apparatus and method for remote control vehicle
CN208867942U (en) * 2018-09-06 2019-05-17 烟台市安特洛普网络科技有限公司 Intelligent vehicle-carried interactive system
CN209085953U (en) * 2018-12-27 2019-07-09 河南护航实业股份有限公司 A kind of tele-control system of intelligent driving test target vehicle


Similar Documents

Publication Publication Date Title
WO2017038485A1 (en) Electronic key system
CN110794819A (en) Intelligent automobile wireless driving control system with gesture key fusion
JP7345919B2 (en) vehicle
EP3503068B1 (en) Vehicle and control method
US9346471B2 (en) System and method for controlling a vehicle user interface based on gesture angle
CN104460974B (en) Vehicle control device
JP2018191322A (en) Headset computer capable of enabling and disabling features on the basis of real time image analysis
US10732760B2 (en) Vehicle and method for controlling the vehicle
JP2018162061A (en) Method for using communication terminal in motor vehicle while autopilot device is activated and motor vehicle
TWI459234B (en) Handheld device and method for controlling a unmanned aerial vehicle using the handheld device
CN112513787B (en) Interaction method, electronic device and system for in-vehicle isolation gesture
CN109933191B (en) Gesture recognition and control method and system
CN110968184B (en) Equipment control device
WO2022141648A1 (en) Method for human-computer interaction and device for human-computer interaction
CN108073118B (en) Vehicle body monitoring system and method based on mobile phone APP
KR20140072734A (en) System and method for providing a user interface using hand shape trace recognition in a vehicle
JP7443820B2 (en) In-vehicle equipment control device and vehicle control system
CN206096929U (en) Balance car automatic navigation driving device
CN112722331B (en) Interaction device and interaction control method of lunar manned mobile vehicle system
CN107264444A (en) The professional reversing of trailer auxiliary knob of modularization for open storage region
WO2023036230A1 (en) Execution instruction determination method and apparatus, device, and storage medium
JP2006312346A (en) Command input device
KR20140079025A (en) Method for providing a user interface using leg gesture recognition in a vehicle
CN108674344A (en) Speech processing system based on steering wheel and its application
CN114987364A (en) Multi-mode human-vehicle interaction system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20231103