WO2021218900A1 - Vehicle control method, device and system - Google Patents

Vehicle control method, device and system

Info

Publication number
WO2021218900A1
WO2021218900A1 PCT/CN2021/089847 CN2021089847W
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
target
gesture data
vehicle
function
Prior art date
Application number
PCT/CN2021/089847
Other languages
English (en)
French (fr)
Inventor
陈林
林瑟·斯坦尼斯拉夫
孙娇敏
Original Assignee
长城汽车股份有限公司 (Great Wall Motor Company Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 长城汽车股份有限公司 (Great Wall Motor Company Limited)
Priority to US17/758,204 priority Critical patent/US20230039339A1/en
Priority to EP21797619.0A priority patent/EP4067192A4/en
Publication of WO2021218900A1 publication Critical patent/WO2021218900A1/zh

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/10 Interpretation of driver requests or demands
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device

Definitions

  • the present disclosure relates to the field of automobile technology, and in particular to a vehicle control method, device and system.
  • gesture-controlled vehicles have become a new research focus.
  • in the related art, the driver's gestures are processed directly by the Human Machine Interface (HMI) device, which recognizes the driver's intention and transmits that intention directly to the target device; the target device then performs the corresponding operation.
  • the present disclosure aims to provide a vehicle control method, device, and system to solve the problem that existing methods of using gestures to control a vehicle cannot effectively avoid the dangers caused by gesture recognition and data transmission errors.
  • a vehicle control method, applied to a terminal control device in a vehicle, wherein the terminal control device is communicatively connected with a human-machine interface device, the vehicle includes a plurality of actuators, and the method includes:
  • the recommended gestures corresponding to each sub-function are stored in the vehicle;
  • the receiving of current gesture data sent by the human-machine interface device when the target function activation instruction is monitored includes:
  • the recommended gesture corresponding to each sub-function is stored in the vehicle;
  • the current gesture data includes gesture parameters;
  • generating a target control instruction according to the current gesture data, and sending the target control instruction to a target execution mechanism includes:
  • a target control instruction is generated according to the recommended gesture, and the target control instruction is sent to a target execution mechanism.
  • the current gesture data further includes a preliminary gesture, and the preliminary gesture is a gesture determined by the human-machine interface device based on the acquired gesture operation;
  • the acquiring recommended gestures corresponding to each sub-function in the target function includes:
  • the recommended gesture corresponding to each sub-function in the target function is acquired.
  • the recommended gesture corresponding to each sub-function is stored in the vehicle; when the recommended gesture corresponding to the target function is a periodic gesture, the step of generating a target control instruction according to the current gesture data when the current gesture data meets a preset condition, and sending the target control instruction to the target execution mechanism, includes:
  • a target control instruction is generated according to the current gesture data, and the target control instruction is sent to a target execution mechanism.
  • the vehicle stores the historical usage of each function; before the receiving of the current gesture data sent by the human-machine interface device when the activation instruction of the target function is monitored, the method further includes:
  • a recommended function is determined according to the wake-up gesture data, and the recommendation information of the recommended function is sent to the human-machine interface device for the human-machine interface device to display;
  • the vehicle stores the turn-on gesture corresponding to each function, and the recommendation information includes the target turn-on gesture for the recommended function; the step of determining that the recommended function is the target function and generating an opening instruction for the target function when the confirmation operation for the recommendation information is monitored includes:
  • the recommended function is determined to be the target function, and an opening instruction for the target function is generated.
  • the current gesture data includes gesture position, movement speed, movement direction, or movement shape, wherein the movement shape includes the diameter of the curved surface and the center of the curved surface.
  • Another objective of the embodiments of the present disclosure is to provide a vehicle control device, applied to a terminal control device in a vehicle, wherein the terminal control device is communicatively connected with a human-machine interface device, the vehicle includes a plurality of actuators, and the device includes:
  • the receiving module is configured to receive the current gesture data sent by the human-machine interface device when the target function activation instruction is monitored;
  • the instruction module is configured to generate a target control instruction according to the current gesture data when the current gesture data meets the preset conditions, and to send the target control instruction to the target actuator for the target actuator to execute the operation corresponding to the current gesture data.
  • the vehicle control method and device described in the present disclosure have the following advantages:
  • the terminal control device in the vehicle recognizes and judges the gesture data generated by the gesture operation, and only when the gesture data meets the preset conditions does it generate a target control instruction based on the gesture data to control the target actuator to perform the corresponding operation. This avoids the danger of falsely triggering vehicle functions through erroneous gesture data when the gesture data is corrupted during transmission from the human-machine interface device to the terminal control device, and thus ensures the safety of using gestures to control the vehicle. At the same time, no additional protection measures need to be added on the transmission path between the human-machine interface device and the terminal control device.
  • existing modules and technologies can be fully reused, and only the corresponding modules of the terminal control device need to be developed, which greatly reduces the development difficulty of the entire system and also saves cost; this therefore solves the problem that existing methods of using gestures to control a vehicle cannot effectively avoid dangers caused by gesture recognition and data transmission errors.
  • the embodiment of the present disclosure also proposes another vehicle control method, applied to a human-machine interface device, wherein the human-machine interface device is communicatively connected with a terminal control device in a vehicle, the vehicle includes a plurality of actuators, and the method includes:
  • sending the current gesture data to the terminal control device, so that when the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data and sends the target control instruction to the target actuator for the target actuator to perform the operation corresponding to the current gesture data.
  • the generating current gesture data according to the gesture operation includes:
  • the current gesture data is generated from the gesture parameter and the preliminary gesture.
  • Another object of the embodiments of the present disclosure is to propose another vehicle control device, applied to a human-machine interface device, which is communicatively connected with a terminal control device in a vehicle, wherein the vehicle includes a plurality of actuators, and the device includes:
  • the monitoring module is used to monitor the gesture operation when the target function opening instruction is monitored;
  • the data generation module is used to generate current gesture data according to the gesture operation;
  • the sending module is configured to send the current gesture data to the terminal control device, so that when the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data and sends the target control instruction to the target actuator for the target actuator to execute the operation corresponding to the current gesture data.
  • the embodiment of the present disclosure also proposes a vehicle control system, which includes a human-machine interface device and a vehicle terminal control device, wherein:
  • the human-machine interface device monitors the gesture operation;
  • the human-machine interface device generates current gesture data according to the gesture operation;
  • the human-machine interface device sends the current gesture data to the terminal control device;
  • when the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data and sends the target control instruction to the target actuator, so that the target actuator executes the operation corresponding to the current gesture data.
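The four-step system flow above can be sketched in Python. This is a minimal illustration only, not the patent's implementation: the function names, the message fields, and the speed-based preset condition are all our assumptions.

```python
def hmi_generate(position, speed, direction):
    """HMI side: package a monitored gesture operation as current gesture data."""
    return {"position": position, "speed": speed, "direction": direction}

def terminal_handle(gesture_data, preset_condition):
    """Terminal control device: generate a target control instruction only
    when the received gesture data meets the preset condition."""
    if not preset_condition(gesture_data):
        return None  # safe state: nothing is sent to any actuator
    return {"target_actuator": "steering", "command": gesture_data}

def actuator_execute(instruction):
    """Target actuator: perform the operation the instruction describes."""
    return "executing {}-degree maneuver".format(instruction["command"]["direction"])

# Example preset condition (an assumption): gesture speed must be plausible.
plausible = lambda d: 0.0 <= d["speed"] <= 5.0
```

Note that the preset condition lives on the terminal side; the HMI device only packages and forwards data.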
  • Another object of the present disclosure is to provide a vehicle, wherein the vehicle includes the vehicle control system as described above.
  • the gesture data corresponding to the gesture operation is sent to the terminal control device of the vehicle, and the terminal control device recognizes and judges the gesture data; only when the gesture data meets the preset conditions does it generate a target control instruction based on the gesture data and control the target actuator to perform the corresponding operation.
  • even if the gesture data is corrupted during transmission from the human-machine interface device to the terminal control device, false triggering of vehicle functions caused by gesture data errors is avoided, and the safety of using gestures to control the vehicle is guaranteed.
  • existing modules and technologies can be fully reused.
  • the present disclosure also proposes a computing processing device, including:
  • a memory in which computer readable codes are stored
  • one or more processors, wherein when the computer-readable code is executed by the one or more processors, the computing processing device executes any one of the vehicle control methods described above.
  • the present disclosure also proposes a computer program, including computer readable code, which when the computer readable code runs on a computing processing device, causes the computing processing device to execute any of the aforementioned vehicle control methods.
  • the present disclosure also proposes a computer-readable medium in which the above-mentioned computer program is stored.
  • FIG. 1 is a schematic flowchart of a vehicle control method proposed by an embodiment of the disclosure
  • FIG. 2 is a schematic flowchart of a vehicle control method proposed by another embodiment of the disclosure.
  • FIG. 4 is a schematic structural diagram of a vehicle control device proposed in an embodiment of the disclosure.
  • FIG. 5 is a schematic structural diagram of a vehicle control device proposed in another embodiment of the disclosure.
  • Fig. 6 schematically shows a block diagram of a computing processing device for executing the method according to the present disclosure.
  • Fig. 7 schematically shows a storage unit for holding or carrying program codes for implementing the method according to the present disclosure.
  • FIG. 1 shows a schematic flowchart of a vehicle control method provided by an embodiment of the present disclosure.
  • the vehicle control method provided by the embodiment of the present disclosure is applied to a terminal control device in a vehicle, wherein the terminal control device is communicatively connected with the human-machine interface device, the vehicle includes a plurality of actuators, and the method includes steps S100-S200.
  • the terminal control device may be a vehicle controller or a specific functional module; the human-machine interface device may be an on-board device that communicates with the terminal control device directly through the Controller Area Network (CAN), such as the vehicle central control unit or a vehicle-mounted camera, or a mobile device that can monitor gesture operations, such as a mobile phone or a tablet computer. In the latter case, the human-machine interface device communicates with the vehicle-mounted communication terminal through a wireless network, and the terminal control device communicates with the vehicle-mounted communication terminal through the CAN network, thereby realizing the communication connection between the terminal control device and the human-machine interface device.
  • the target function refers to the specific vehicle function that the driver intends to achieve through gesture operation, such as remote control parking in a vehicle.
  • the above-mentioned actuator refers to a specific vehicle mechanism used to realize different vehicle functions. Through the independent work of the above-mentioned actuator or the cooperation of multiple actuators, the driving intention expressed by the driver's gesture operation can be realized. In practical applications, the above-mentioned actuators include steering systems, braking systems, powertrains, transmissions, four-wheel drive control systems, and so on.
  • Step S100: When the start instruction of the target function is monitored, receive the current gesture data sent by the human-machine interface device.
  • In step S100, because the vehicle has a variety of functions that can be executed through gesture operations, a function must first be turned on before it can be executed according to the driver's gesture operation. When the start instruction of the target function is monitored, it means that the driver needs to turn on the target function, and how the target function is executed depends on the driver's specific gesture operation; therefore, the current gesture data sent by the human-machine interface device, which reflects the driver's current specific gesture operation, needs to be received.
  • the current gesture data may include gesture parameters such as gesture position, movement speed, movement direction, and movement shape.
  • the moving shape may include the diameter of the curved surface and the center of the curved surface.
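As a concrete illustration of these parameters, here is one possible container type; the field names are our assumptions for this sketch, not terms defined by the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GestureParameters:
    """Illustrative bundle of the gesture parameters listed above."""
    position: Tuple[float, float]                      # gesture position
    speed: float                                       # movement speed
    direction: float                                   # movement direction, degrees
    arc_diameter: Optional[float] = None               # diameter of the curved shape
    arc_center: Optional[Tuple[float, float]] = None   # center of the curved shape
```

The shape fields are optional because a straight swipe carries no curve geometry.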
  • Step S200: When the current gesture data meets a preset condition, generate a target control instruction according to the current gesture data, and send the target control instruction to a target actuator for the target actuator to execute the operation corresponding to the current gesture data.
  • the preset condition is a condition set in advance for determining whether the current gesture data is reasonable and whether it would endanger the vehicle; it is also the condition for determining whether the current gesture data faithfully reflects the driver's driving intention.
  • the above-mentioned target control instruction is an instruction that controls a specific actuator to perform the corresponding operation, and the target actuator is the specific mechanism that realizes the target function.
  • when the current gesture data meets the preset condition, the target control instruction is generated according to the current gesture data and sent to the target actuator, so that the target actuator executes the operation corresponding to the current gesture data, thereby realizing the target function.
  • when the current gesture data does not meet the preset condition, the step of generating a target control instruction based on the current gesture data is not executed; instead, the transmission of the gesture data is interrupted so as to enter a safe state, or appropriate actions are taken to avoid outputting abnormal functional behavior that would violate safety.
  • In step S200, the analysis that judges whether the acquired current gesture data meets the preset conditions, the generation of the target control instruction from the current gesture data when it does, and the sending of the target control instruction to the target actuator are all executed by the terminal control device rather than on the gesture acquisition device. In this way, if the gesture data, or the instructions corresponding to the gesture, are corrupted while being transmitted from the human-machine interface device to the terminal control device, the terminal control device will not recognize the expected gesture and driving intention from the received data and will not generate the corresponding target control instruction, so no erroneous gesture-based functional behavior is output and safety is ensured. At the same time, only the terminal control device needs to be developed to the corresponding automotive safety integrity level, while the HMI device and the transmission-path devices need no additional design for the automotive safety integrity level, thereby saving development and design costs.
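The corruption scenario described here can be pictured with a toy sketch. The additive checksum, the message layout, and the names below are assumptions; the disclosure does not specify the integrity mechanism.

```python
from dataclasses import dataclass

def compute_checksum(position, speed, direction):
    # Toy integrity value; stands in for whatever the real link layer uses.
    return int(sum(position) * 1000 + speed * 10 + direction)

@dataclass
class GestureData:
    position: tuple
    speed: float
    direction: float
    checksum: int  # set by the HMI device when the data is generated

def terminal_control(data: GestureData):
    """Generate a target control instruction only if the gesture data arrived
    intact; corrupted data yields no instruction (safe state)."""
    if data.checksum != compute_checksum(data.position, data.speed, data.direction):
        return None  # data damaged in transit: no functional behavior is output
    return {"actuator": "parking_module", "operation": data.direction}
```

Because the check runs on the terminal control device, a corrupted field simply yields no instruction, which is exactly the fail-safe behavior the passage argues for.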
  • the vehicle control method described in the present disclosure has the following advantages:
  • the gesture data corresponding to the gesture operation is sent to the terminal control device of the vehicle, and the terminal control device recognizes and judges the gesture data; only when the gesture data meets the preset conditions does it generate a target control instruction based on the gesture data and control the target actuator to perform the corresponding operation.
  • even if the gesture data is corrupted during transmission from the human-machine interface device to the terminal control device, false triggering of vehicle functions caused by gesture data errors is avoided, and the safety of using gestures to control the vehicle is guaranteed.
  • existing modules and technologies can be fully reused.
  • the target function can be remote parking function, remote control window lift function, remote door unlocking function, remote back door opening and closing function, remote engine control function, remote heating control function, etc.
  • the target function is the remote parking function.
  • following the prompts on the mobile phone app interface, the driver turns on the remote parking function with a certain gesture operation, and controls the start and stop of the parking process by interacting with the mobile phone app through specific gesture operations.
  • during the remote parking process, the mobile phone app acquires the driver's gesture operation, recognizes the driver's intention (such as remote parking, pause, stop, and start), converts the gesture operation into gesture data, and sends the gesture data to the vehicle-mounted communication terminal.
  • after the vehicle-mounted communication terminal receives the gesture data sent by the mobile phone, it passes the data to the remote parking control module.
  • when the remote parking control module determines that the gesture data meets the preset conditions, it generates the corresponding parking instruction to control the vehicle to park automatically; when it determines that the gesture data does not meet the preset conditions, it does not generate the parking instruction and enters a safe state.
  • the vehicle control method provided by the embodiment of the present disclosure further includes steps S1001 to S1002 before the above step S100.
  • In step S1001, because the sub-functions of each function have corresponding recommended gestures, and the recommended gestures corresponding to sub-functions of different functions may be similar or even identical, when the driver performs the wake-up gesture operation through the human-machine interface device and the wake-up gesture data is sent, the recommended function corresponding to the wake-up gesture data can be determined based on that data. The recommended function includes at least one sub-function whose recommended gesture is the same as, or similar to, the wake-up gesture data; the recommended function is then sent to the human-machine interface device for display, allowing the driver to choose whether to use it.
  • the vehicle stores the historical usage of each of the functions.
  • the recommended function can be determined based on the wake-up gesture data and the aforementioned historical usage; that is, the functions the driver is most likely to need are presented as recommended functions for the driver to choose and use.
  • the recommended function can be one or more.
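The selection described above might be sketched as follows. The similarity metric, the tie-break on historical usage, the 2-D gesture representation, and all names are assumptions made for illustration.

```python
def recommend_functions(wake_gesture, stored_gestures, usage_counts, top_n=2):
    """Rank functions by closeness of their stored wake-up gesture to the
    observed one, breaking ties by historical usage. Gestures are (x, y) points."""
    def similarity(a, b):
        # Negative squared distance: closer gestures score higher.
        return -((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2)
    ranked = sorted(
        stored_gestures,
        key=lambda name: (similarity(wake_gesture, stored_gestures[name]),
                          usage_counts.get(name, 0)),
        reverse=True,
    )
    return ranked[:top_n]
```

Returning `top_n` items reflects the passage's note that one or more functions may be recommended.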
  • the above confirmation operation is an operation confirming the use of the recommended function, issued through the human-machine interface device by voice, gesture, touch, pressing, and so on. If the confirmation operation for the recommended function is monitored, it means that the driver has confirmed the use of the recommended function; the recommended function is therefore taken as the target function, and an opening instruction for the target function is generated, so that the driver's subsequent gesture operations can control the specific actuators to perform the corresponding work and realize the target function.
  • the vehicle also stores an opening gesture corresponding to each function, and the recommended information includes a target opening gesture for the recommended function and a function description.
  • the above step S1002 specifically includes: when the difference between the actual gesture operation for the recommendation information and the target opening gesture is within a fourth preset range, determining that the recommended function is the target function, and generating an opening instruction for the target function.
  • the fourth preset range is a preset limit for determining whether the actual gesture operation monitored by the human-machine interface device matches the opening gesture. When the difference between the actual gesture operation for the recommendation information and the target opening gesture is within the fourth preset range, it indicates that the driver wants to turn on the recommended function; the recommended function is therefore determined to be the target function, and an opening instruction for the target function is generated.
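One way to picture the fourth-preset-range check is below. The per-point distance measure, the equal-length point-sequence representation of a gesture, and the names are assumptions for this sketch.

```python
def matches_opening_gesture(actual, target, fourth_preset_range):
    """Return True when the largest per-point difference between the actual
    gesture and the target opening gesture stays within the preset range.
    Gestures are equal-length sequences of (x, y) sample points."""
    diff = max(
        abs(ax - tx) + abs(ay - ty)
        for (ax, ay), (tx, ty) in zip(actual, target)
    )
    return diff <= fourth_preset_range  # True -> generate the opening instruction
```

Widening the range makes confirmation more forgiving; narrowing it reduces accidental activations.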
  • the recommended gestures corresponding to each sub-function are stored in the vehicle, and the above step S100 specifically includes steps S101 to S102.
  • the recommended gesture corresponding to each sub-function is the standard operation gesture for implementing that function; by performing gesture operations in accordance with the recommended gesture, the driver can trigger the vehicle to implement the corresponding function.
  • the aforementioned recommended gestures may include the shape of the gesture and the corresponding movement trajectory; displaying the recommended gesture allows the driver to intuitively know the specific gesture operation that controls the vehicle to perform the target function.
  • Step S101: When the target function activation instruction is detected, send the recommended gesture corresponding to each sub-function in the target function to the human-machine interface device for display by the human-machine interface device.
  • In step S101, when the target function activation instruction is monitored, that is, when the driver needs to turn on the target function, the recommended gesture corresponding to the target function is first sent to the human-machine interface device, which then displays it; this better prompts the driver on how to perform the gesture operations that control the vehicle to achieve the target function.
  • Step S102: Receive the current gesture data sent by the human-machine interface device.
  • In step S102, after the recommended gestures corresponding to each sub-function in the target function are sent to the human-machine interface device, the gesture data acquired and sent by the human-machine interface device, which represents the driver's specific gesture operation, is received; according to this gesture data, the corresponding instruction is generated to control the specific actuator to achieve the corresponding target function.
  • In this way, the recommended gesture corresponding to each sub-function in the target function is first sent to the human-machine interface device, which then displays the gesture, so as to better remind the driver how to perform the gesture operations that control the vehicle to achieve each sub-function of the target function.
  • the recommended gestures corresponding to each sub-function are stored in the vehicle; when the recommended gestures corresponding to each sub-function in the target function are periodic gestures, the above step S200 includes steps S201 to S202.
  • the recommended gesture corresponding to each sub-function is a standard operation gesture that specifically implements each function.
  • the vehicle can be triggered to control the vehicle to implement the corresponding function.
  • When the recommended gesture corresponding to the target function is a periodic gesture, it means that the driver needs to perform periodic gesture operations to control the vehicle to continuously perform the target function.
  • Step S201 Determine expected gesture data according to the gesture data acquired last time and the recommended gesture corresponding to each sub-function in the target function.
  • In step S201, because the recommended gesture corresponding to each sub-function in the target function is a periodic gesture, the man-machine interface device receives the periodic gesture operation performed by the driver and sends the gesture data generated from that periodic gesture operation to the terminal control device, so the terminal control device also periodically obtains the gesture data corresponding to the target function.
  • The gesture data and recommended gesture received last time can be used to predict the gesture data to be received next; that is, the expected gesture data that should be obtained from the man-machine interface device this time can be determined from the gesture data and recommended gesture received last time.
  • Step S202 If the difference value between the current gesture data and the expected gesture data is within a third preset range, a target control instruction is generated according to the current gesture data, and the target control instruction is sent to the target actuator .
  • The third preset range is preset as the limit for determining whether the received current gesture data meets expectations, that is, the boundary for determining whether the received current gesture data faithfully represents the driver's driving intention.
  • If the difference value is within the third preset range, the target control instruction is generated according to the current gesture data and sent to the target actuator. When the difference value between the current gesture data and the expected gesture data is not within the third preset range, indicating that the received current gesture data does not meet expectations, it is determined that the data does not faithfully represent the driver's driving intention or is incorrect, so no target control instruction is generated from the current gesture data and none is sent to the target actuator, avoiding misrecognition of the driver's intention and the dangerous false triggering of vehicle functions.
  • The above-mentioned third preset range can be 3% to 30%. In addition, to ensure realization of the target function, the gesture data transmission rate should be sufficiently high and orderly; for periodic gestures, the data transmission rate should be at least twice the rate of the gesture operation.
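The periodic-gesture check of steps S201 to S202 can be illustrated with a minimal sketch. The sinusoidal gesture model, the 30% tolerance default, and all function names below are illustrative assumptions for a hypothetical periodic gesture, not details taken from the disclosure:

```python
import math

def expected_gesture_value(last_phase, phase_step, amplitude=1.0):
    """Step S201 (sketch): extrapolate the next sample of a periodic
    recommended gesture (modelled here as a sine wave) one phase step
    beyond the previously received sample."""
    return amplitude * math.sin(last_phase + phase_step)

def within_third_range(current, expected, tolerance=0.30):
    """Step S202 (sketch): accept the current gesture data only if its
    relative difference from the expected data stays inside the third
    preset range (here the 30% upper bound mentioned above)."""
    if expected == 0:
        return abs(current) <= tolerance
    return abs(current - expected) / abs(expected) <= tolerance

def min_transmission_rate(gesture_rate_hz):
    """For periodic gestures, the data transmission rate should be at
    least twice the rate of the gesture operation."""
    return 2.0 * gesture_rate_hz

expected = expected_gesture_value(0.0, math.pi / 8)  # next sample of the cycle
assert within_third_range(0.40, expected)            # close enough: accept
assert not within_third_range(0.90, expected)        # too far: reject, no instruction
assert min_transmission_rate(0.5) == 1.0             # 0.5 Hz gesture needs >= 1 Hz data
```

A real implementation would compare full parameter vectors (position, speed, direction) rather than a single scalar sample.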
  • The recommended gestures corresponding to each sub-function are stored in the vehicle; the gesture data includes gesture parameters; the above step S200 includes steps S211 to S213.
  • the recommended gesture corresponding to each function is a standard operation gesture that specifically implements the sub-function in each function.
  • the vehicle can be triggered to control the vehicle to realize the corresponding sub-function.
  • Step S211 According to the gesture data, reconstruct the actual gesture.
  • The gesture parameters specifically include the gesture position, movement speed, movement direction, movement shape, etc., so the actual gesture corresponding to the gesture operation performed by the driver can be reconstructed from the received gesture parameters.
  • Step S212 Obtain recommended gestures corresponding to each sub-function in the target function.
  • step S212 because the recommended gesture corresponding to each function is stored in the vehicle, when the activation instruction of the target function is monitored, the target function can be determined, and then the recommended gesture can be determined.
  • Step S213 If the difference value between the actual gesture and the recommended gesture is within the first preset range, generate a target control instruction according to the recommended gesture, and send the target control instruction to the target actuator.
  • the first preset range is a limit for determining whether the actual gesture matches the recommended gesture, that is, the limit for determining whether the received current gesture data faithfully represents the driver's driving intention.
  • If the difference value between the actual gesture and the recommended gesture corresponding to each sub-function of the target function is within the first preset range, it indicates that the actual gesture matches the recommended gesture, that is, the recommended gesture can be used as the target recommended gesture, and it is determined that the received current gesture data faithfully represents the driver's driving intention; the target control instruction is therefore generated according to the current gesture data and sent to the target actuator. If the difference value between the actual gesture and the recommended gesture exceeds the first preset range, indicating that the actual gesture does not match the recommended gesture, it is determined that the received current gesture data does not faithfully represent the driver's driving intention or that the data is incorrect; therefore no target control instruction is generated from the current gesture data and none is sent to the target actuator, avoiding erroneous recognition of the driver's intention and the dangerous false triggering of vehicle functions.
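Steps S211 to S213 can be sketched as follows. The path-length descriptor, the 20% first preset range, and the function names are illustrative assumptions, not the disclosure's actual reconstruction algorithm:

```python
import math

def reconstruct_actual_gesture(points):
    """Step S211 (sketch): rebuild a coarse gesture descriptor from the
    received gesture parameters. Only total path length and endpoints
    are used here; a real system would also use movement speed,
    direction, and shape."""
    length = sum(math.hypot(x1 - x0, y1 - y0)
                 for (x0, y0), (x1, y1) in zip(points, points[1:]))
    return {"length": length, "start": points[0], "end": points[-1]}

def matches_recommended(actual, recommended, first_range=0.2):
    """Step S213 (sketch): the actual gesture matches the recommended
    gesture when their difference value (relative path-length error
    here) lies within the first preset range."""
    diff = abs(actual["length"] - recommended["length"])
    return diff / recommended["length"] <= first_range

full_swipe = reconstruct_actual_gesture([(0, 0), (1, 0), (2, 0)])
recommended = {"length": 2.0}                # recommended gesture stored in the vehicle (S212)
assert matches_recommended(full_swipe, recommended)      # match: generate instruction
half_swipe = reconstruct_actual_gesture([(0, 0), (1, 0)])
assert not matches_recommended(half_swipe, recommended)  # mismatch: no instruction
```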
  • the gesture data further includes a preliminary gesture
  • the preliminary gesture is a gesture determined by the human-machine interface device based on the acquired gesture operation.
  • The step S212 specifically includes: if the difference value between the preliminary gesture and the actual gesture is within a second preset range, acquiring the recommended gesture corresponding to each sub-function in the target function.
  • When the human-machine interface device monitors the driver's gesture operation, it first determines the corresponding gesture parameters based on the gesture operation and, at the same time, recognizes and analyzes those gesture parameters to obtain the preliminary gesture. The preliminary gesture and gesture parameters are then packaged into gesture data and sent to the terminal control device of the vehicle. After receiving the gesture data, the terminal control device reconstructs the actual gesture based on the gesture parameters in it and compares the reconstructed actual gesture with the preliminary gesture sent from the human-machine interface device, so it can be determined whether the gesture parameters were damaged during transmission from the human-machine interface device to the terminal control device.
  • The above-mentioned second preset range is a preset limit used to determine whether the actual gesture matches the preliminary gesture, that is, the boundary for determining whether the received gesture parameters were damaged during transmission from the human-machine interface device to the terminal control device. If the difference between the preliminary gesture and the actual gesture is within the second preset range, it indicates that the actual gesture matches the preliminary gesture, that is, the received gesture parameters were not damaged in transit and are the gesture parameters issued by the human-machine interface device. The method therefore enters the subsequent step of obtaining the recommended gesture corresponding to each sub-function in the target function, to further determine whether the actual gesture corresponding to the gesture parameters is safe and reasonable and whether it reflects the driver's real driving intention.
  • The human-machine interface device needs to perform preliminary gesture recognition and send the sensed gesture parameters together with the gesture recognition result to the terminal control device. The terminal control device then first recognizes the gesture expressed by the gesture parameters in the received data and compares it with the preliminary gesture in the received data. If they are consistent, it is considered that no abnormality occurred as the data passed through the human-machine interface device and the transmission system. Because the human-machine interface device and the terminal control device identify the gesture separately and the two results are finally compared, the reliability of the identification result is increased and the safety of the system is further ensured.
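The dual-recognition check described above (HMI-side preliminary gesture versus terminal-side re-recognition, with the second preset range) can be reduced to the following sketch; the toy classifier and the label-equality comparison are illustrative assumptions:

```python
def classify_gesture(points):
    """A toy classifier standing in for both recognizers: predominantly
    horizontal motion is a 'swipe', anything else is 'other'."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    return "swipe" if abs(x1 - x0) > abs(y1 - y0) else "other"

def transmission_intact(gesture_data):
    """The terminal control device re-runs recognition on the raw
    gesture parameters and compares the result with the HMI's
    preliminary gesture; a mismatch suggests the parameters were
    damaged in transit (the second-preset-range check, reduced here
    to label equality)."""
    return classify_gesture(gesture_data["points"]) == gesture_data["preliminary"]

intact = {"points": [(0, 0), (3, 1)], "preliminary": "swipe"}
damaged = {"points": [(0, 0), (0, 5)], "preliminary": "swipe"}  # parameters corrupted in transit
assert transmission_intact(intact)
assert not transmission_intact(damaged)
```

Running the same recognition independently on both devices and comparing the results is what makes the check effective against single-point transmission faults.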
  • FIG. 2 shows a schematic flowchart of another vehicle control method provided by an embodiment of the present disclosure.
  • Another vehicle control method provided by an embodiment of the present disclosure is applied to a human-machine interface device. The human-machine interface device is in communication connection with the terminal control device in the vehicle, the vehicle includes a plurality of actuators, and the method includes steps S221 to S223.
  • Step S221 When the start instruction of the target function is detected, the gesture operation is monitored.
  • In step S221, when the driver chooses to turn on the target function, the activation instruction is triggered on the side of the man-machine interface device, and at the same time the activation instruction of the target function is also sent to the terminal control device. Therefore, when the instruction to turn on the target function is monitored, meaning that the driver needs to turn on the target function, the driver will perform gesture operations on the man-machine interface device; these gesture operations can thus be monitored, and the corresponding gesture data is obtained in the subsequent step S222.
  • the touch sensor in the human-machine interface device can specifically execute the steps of monitoring the gesture operation described above, so as to obtain the gesture parameters.
  • Step S222 Generate current gesture data according to the gesture operation.
  • step S222 by monitoring the gesture operation performed by the driver on the human-machine interface device, the corresponding gesture data can be obtained.
  • Step S223 Send the current gesture data to the terminal control device, so that when the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data and sends the target control instruction to a target actuator, for the target actuator to perform the operation corresponding to the current gesture data.
  • the gesture data corresponding to the gesture operation is sent to the terminal control device of the vehicle via the transmission system, and then the terminal control device in the vehicle recognizes and judges the gesture data, and only when the gesture data meets the preset conditions, Generate target control instructions based on the gesture data, and then control the target actuator to perform corresponding operations.
  • Even if the gesture data is damaged during transmission from the human-machine interface device to the terminal control device, this avoids the dangerous false triggering of vehicle functions caused by gesture-data errors, so the safety of using gestures to control the vehicle is guaranteed.
  • the existing modules and technologies can be completely borrowed.
  • the above-mentioned transmission system refers to a system that can establish a communication connection between a human-machine interface device and a terminal control device.
  • Such a transmission system can include an Internet of Vehicles server, an in-vehicle communication terminal, and a CAN network.
  • step S222 specifically includes steps S2221 to S2223:
  • In step S2221, by monitoring the gesture operation performed by the driver on the human-machine interface device, the corresponding gesture parameters can be determined, and through these gesture parameters the corresponding gesture shape can be determined, that is, the driver's gesture operation can be reproduced.
  • The gesture parameters include the gesture position, movement speed, movement direction, and movement shape.
  • the moving shape may include the diameter of the curved surface and the center of the curved surface.
  • In step S2222, the human-machine interface device analyzes and recognizes the gesture reflected by the gesture parameters determined in step S2221, and the above-mentioned preliminary gesture is obtained.
  • In step S2223, the gesture parameters determined in step S2221 according to the driver's gesture operation and the preliminary gesture determined in step S2222 are packaged together as the current gesture data and sent to the terminal control device. When the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data and sends the target control instruction to a target actuator, so that the target actuator can execute the operation corresponding to the current gesture data.
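Steps S2221 to S2223 amount to packaging the sensed parameters together with the HMI's own recognition result. A minimal sketch, with an assumed packet layout and a toy preliminary-recognition rule (neither is specified by the disclosure):

```python
from dataclasses import dataclass, asdict

@dataclass
class GestureData:
    """Assumed layout of the 'current gesture data' packet: raw gesture
    parameters plus the preliminary gesture decided by the HMI device."""
    positions: list   # sampled (x, y) gesture positions
    speed: float      # movement speed
    direction: float  # movement direction, degrees
    shape: str        # movement shape, e.g. 'arc' or 'line'
    preliminary: str  # preliminary gesture recognized by the HMI (S2222)

def package_gesture(positions, speed, direction, shape):
    # S2221: parameters come from the monitored gesture operation;
    # S2222: the HMI derives a preliminary gesture (toy rule below);
    # S2223: both are packed together as the current gesture data.
    preliminary = "circle" if shape == "arc" else "swipe"
    return GestureData(positions, speed, direction, shape, preliminary)

packet = package_gesture([(0, 0), (1, 1)], speed=0.4, direction=45.0, shape="line")
assert packet.preliminary == "swipe"
assert asdict(packet)["shape"] == "line"  # dict form, ready to serialize for transmission
```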
  • the vehicle control method described in the present disclosure has the following advantages:
  • the gesture data corresponding to the gesture operation is sent to the terminal control device of the vehicle, and then the terminal control device in the vehicle recognizes and judges the gesture data, and only when the gesture data meets the preset conditions Generate target control instructions based on the gesture data, and then control the target actuator to perform corresponding operations.
  • Even if the gesture data is damaged during transmission from the human-machine interface device to the terminal control device, this avoids the dangerous false triggering of vehicle functions caused by gesture-data errors, so the safety of using gestures to control the vehicle is guaranteed.
  • the existing modules and technologies can be completely borrowed.
  • FIG. 3 shows a flow chart of the interaction steps of a vehicle control method proposed in an embodiment of the present disclosure, using a human-machine interface device and a vehicle terminal control device, which includes steps S301 to S305.
  • the human-machine interface device communicates with the terminal control device of the vehicle via a transmission system such as an Internet of Vehicles server and a CAN network.
  • Step S301 When the start instruction of the target function is detected, the man-machine interface device monitors the gesture operation.
  • step S301 reference may be made to the description of step S221, which will not be repeated here.
  • Step S302 The human-machine interface device generates current gesture data according to the gesture operation.
  • step S302 reference may be made to the description of step S222, which will not be repeated here.
  • Step S303 The human-machine interface device sends the current gesture data to the terminal control device.
  • step S303 reference may be made to the description of step S223, which will not be repeated here.
  • S304 The terminal control device receives the current gesture data sent by the man-machine interface device.
  • step S304 reference may be made to the description of step S100, which will not be repeated here.
  • Step S305 When the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data, and sends the target control instruction to the target actuator for the target actuator to perform the operation corresponding to the current gesture data.
  • step S305 reference may be made to the description of step S200, which will not be repeated here.
  • the vehicle control method described in the embodiments of the present disclosure has the following advantages:
  • When a gesture operation is performed, the human-machine interface device acquires the gesture operation and generates the gesture data, then sends the gesture data to the terminal control device of the vehicle; the terminal control device in the vehicle recognizes and judges the gesture data, and only when the gesture data meets the preset conditions is the target control instruction generated according to the gesture data, after which the target actuator is controlled to perform the corresponding operation. This avoids the danger of false triggering of vehicle functions caused by gesture-data errors when the gesture data is damaged during transmission from the human-machine interface device to the terminal control device, ensuring the safety of using gestures to control the vehicle; at the same time, there is no need to add additional protective measures on the transmission path between the human-machine interface device and the terminal control device, so existing modules and technologies can be completely borrowed.
  • FIG. 4 shows a schematic structural diagram of a vehicle control device proposed by an embodiment of the present disclosure, and the device includes:
  • the receiving module 41 is configured to receive the current gesture data sent by the man-machine interface device when the start instruction of the target function is monitored;
  • the instruction module 42 is configured to generate a target control instruction according to the current gesture data when the current gesture data meets a preset condition, and send the target control instruction to the target executing agency for execution by the target executing agency The operation corresponding to the current gesture data.
  • The receiving module 41 receives the current gesture data sent by the human-machine interface device, and the terminal control device then recognizes and judges the current gesture data; only when the gesture data meets the preset conditions does the instruction module 42 generate the target control instruction according to the gesture data, after which the target actuator is controlled to perform the corresponding operation. Even when the gesture data is damaged during transmission from the human-machine interface device to the terminal control device, the danger of false triggering of vehicle functions is avoided, ensuring the safety of using gestures to control the vehicle. No additional protection measures are needed on the transmission path: existing modules and technologies can be completely borrowed, and only the corresponding modules of the terminal control device need to be developed, which greatly reduces the development difficulty of the entire system and saves costs. Therefore, this solves the problem that the existing way of using gestures to control vehicles cannot effectively avoid dangers caused by gesture recognition and data transmission errors.
  • the vehicle control device further includes:
  • the control module is configured to, when receiving the wake-up gesture data sent by the man-machine interface device, determine a recommended function according to the wake-up gesture data, and send the recommended information of the recommended function to the man-machine interface device to For display of the man-machine interface device;
  • the activation module is configured to determine that the recommended function is a target function when the confirmation operation for the recommended information is monitored, and generate an activation instruction for the target function.
  • the receiving module 41 includes:
  • a sending unit configured to send recommended gestures corresponding to each sub-function in the target function to the man-machine interface device for display by the man-machine interface device when the start instruction of the target function is monitored;
  • the receiving unit is configured to receive the current gesture data sent by the man-machine interface device.
  • the vehicle control device recommended gestures corresponding to each sub-function are stored in the vehicle; the gesture data includes gesture parameters;
  • the instruction module 42 includes:
  • the reconstruction unit is used to reconstruct the actual gesture according to the gesture parameter
  • An acquiring unit configured to acquire recommended gestures corresponding to each sub-function in the target function
  • the first instruction unit is configured to generate a target control instruction according to the recommended gesture if the difference value between the actual gesture and the recommended gesture is within a first preset range, and send the target control instruction to the target for execution mechanism.
  • the gesture data further includes a preliminary gesture, and the preliminary gesture is a gesture determined by the human-machine interface device based on the acquired gesture operation;
  • the acquiring unit is specifically configured to acquire the recommended gesture corresponding to each sub-function in the target function if the difference value between the preliminary gesture and the actual gesture is within a second preset range.
  • the recommended gesture corresponding to each sub-function is stored in the vehicle; when the recommended gesture corresponding to the target function is a periodic gesture, the instruction module 42 includes:
  • the first determining unit is configured to determine expected gesture data according to the gesture data obtained last time and the recommended gesture corresponding to the target function;
  • the second instruction unit is configured to generate a target control instruction according to the current gesture data if the difference value between the current gesture data and the expected gesture data is within a third preset range, and send the target control instruction To the target implementing agency.
  • Another object of the present disclosure is to provide a vehicle control device, which is applied to a human-machine interface device, the human-machine interface device is communicatively connected with a terminal control device in a vehicle, the vehicle includes a plurality of actuators, of which, please refer to Fig. 5 shows a schematic structural diagram of a vehicle control device proposed by an embodiment of the present disclosure, and the device includes:
  • the monitoring module 51 is used to monitor gesture operations when the target function activation instruction is detected
  • the data generating module 52 is configured to generate current gesture data according to the gesture operation
  • the sending module 53 is configured to send the current gesture data to the terminal control device, so that the terminal control device generates a target control instruction according to the current gesture data when the current gesture data meets a preset condition, And send the target control instruction to the target executing agency, so that the target executing agency executes the operation corresponding to the current gesture data.
  • The monitoring module 51 monitors the gesture operation when the activation instruction of the target function is monitored, the data generation module 52 generates current gesture data according to the gesture operation, and the sending module 53 sends the current gesture data to the terminal control device; the terminal control device then recognizes and judges the current gesture data, and only when the gesture data meets the preset conditions is the target control instruction generated according to the gesture data, after which the target actuator is controlled to perform the corresponding operation. When the gesture data is damaged in the process of being transmitted from the human-machine interface device to the terminal control device, this avoids the risk of false triggering of vehicle functions due to gesture-data errors and ensures the safety of using gestures to control the vehicle. At the same time, there is no need to add additional protection measures on the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be fully borrowed, and only the corresponding modules of the terminal control device need to be developed, which greatly reduces the development difficulty of the entire system and saves costs. Therefore, this solves the problem that the existing way of using gestures to control vehicles cannot effectively avoid dangers caused by gesture recognition and data transmission errors.
  • the data generating module 52 includes:
  • the second determining unit is configured to determine gesture parameters according to the gesture operation
  • the third determining unit is configured to determine a preliminary gesture according to the gesture parameter
  • the generating unit is configured to generate the current gesture data from the gesture parameter and the preliminary gesture.
  • Another object of the present disclosure is to provide a vehicle control system, which includes a human-machine interface device and a vehicle terminal control device, wherein:
  • the man-machine interface device monitors the gesture operation
  • the man-machine interface device generates current gesture data according to the gesture operation
  • the man-machine interface device sends the current gesture data to the terminal control device
  • When the current gesture data meets a preset condition, the terminal control device generates a target control instruction according to the current gesture data, and sends the target control instruction to the target actuator, so that the target actuator can execute the operation corresponding to the current gesture data.
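The interaction between the two sides of the system can be condensed into a short sketch; the "sunroof" actuator, the acceptance rule, and the function names are illustrative assumptions only, not taken from the disclosure:

```python
def hmi_send(points):
    """Human-machine interface side: monitor the gesture operation and
    generate the current gesture data."""
    return {"points": points, "preliminary": "swipe"}

def terminal_receive(gesture_data, recommended="swipe"):
    """Terminal control device side: only when the current gesture data
    meets the preset condition is a target control instruction generated
    and routed to the target actuator; otherwise nothing is triggered,
    avoiding false activation of vehicle functions."""
    if gesture_data["preliminary"] != recommended:
        return None  # reject: no instruction is issued
    return {"actuator": "sunroof", "command": "open"}

instruction = terminal_receive(hmi_send([(0, 0), (2, 0)]))
assert instruction == {"actuator": "sunroof", "command": "open"}
assert terminal_receive({"points": [], "preliminary": "other"}) is None
```

The key design point, as in the method above, is that the terminal control device, not the HMI device, makes the final accept/reject decision.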
  • Another object of the present disclosure is to provide a vehicle, wherein the vehicle includes the vehicle control system as described above.
  • the vehicle control system and vehicle have the same advantages as the aforementioned vehicle control method and device compared to the prior art, and will not be repeated here.
  • the vehicle control method, device, system and vehicle provided by the present disclosure send gesture data corresponding to the gesture operation to the terminal control device of the vehicle when performing gesture operations, and then the terminal control device in the vehicle recognizes And judge the gesture data, and only when the gesture data meets the preset conditions, will the target control instruction be generated according to the gesture data, and then the target actuator will be controlled to perform the corresponding operation.
  • When the gesture data is damaged in the process of being transmitted from the human-machine interface device to the terminal control device, this avoids the danger of false triggering of vehicle functions due to errors in the gesture data, ensuring the safety of using gestures to control the vehicle; at the same time, there is no need to add additional protection measures on the transmission path between the human-machine interface device and the terminal control device, existing modules and technologies can be completely borrowed, and only the corresponding modules of the terminal control device need to be developed, which greatly reduces the development difficulty of the entire system and saves costs; therefore, this solves the problem that the existing way of using gestures to control vehicles cannot effectively avoid dangers caused by gesture recognition and data transmission errors.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in One place, or it can be distributed to multiple network units.
  • Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement it without creative work.
  • the various component embodiments of the present disclosure may be implemented by hardware, or by software modules running on one or more processors, or by a combination of them.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the computing processing device according to the embodiments of the present disclosure.
  • the present disclosure can also be implemented as a device or device program (for example, a computer program and a computer program product) for executing part or all of the methods described herein.
  • Such a program for realizing the present disclosure may be stored on a computer-readable medium, or may have the form of one or more signals.
  • Such a signal can be downloaded from an Internet website, or provided on a carrier signal, or provided in any other form.
  • FIG. 6 shows a computing processing device that can implement the method according to the present disclosure.
  • the computing processing device traditionally includes a processor 1010 and a computer program product in the form of a memory 1020 or a computer readable medium.
  • the memory 1020 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 1020 has a storage space 1030 for executing program codes 1031 of any method steps in the above methods.
  • the storage space 1030 for program codes may include various program codes 1031 respectively used to implement various steps in the above method. These program codes can be read from or written into one or more computer program products.
  • Such computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards, or floppy disks.
  • Such a computer program product is usually a portable or fixed storage unit as described with reference to FIG. 7.
  • the storage unit may have storage segments, storage spaces, etc., arranged similarly to the memory 1020 in the computing processing device of FIG. 6.
  • For example, the program code can be compressed in an appropriate form.
  • The storage unit includes computer-readable code 1031', that is, code that can be read by a processor such as 1010, which, when run by a computing processing device, causes the computing processing device to execute the various steps of the method described above.
  • Any reference signs placed between parentheses should not be construed as a limitation to the claims.
  • the word “comprising” does not exclude the presence of elements or steps not listed in the claims.
  • the word “a” or “an” preceding an element does not exclude the presence of multiple such elements.
  • The present disclosure can be realized by means of hardware including several different elements and by means of a suitably programmed computer. In a unit claim listing several devices, several of these devices may be embodied in the same hardware item. The use of the words first, second, third, etc. does not indicate any order; these words can be interpreted as names.

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a vehicle control method, applied to a terminal control device in a vehicle. The terminal control device is communicatively connected to a human-machine interface (HMI) device, and the vehicle includes a plurality of actuators. The method includes: upon detecting an activation instruction for a target function, receiving current gesture data sent by the HMI device; and, when the current gesture data meets a preset condition, generating a target control instruction from the current gesture data and sending the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data. With this vehicle control method, if the gesture data is corrupted while being transmitted from the HMI device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. A vehicle control apparatus and a vehicle control system are also disclosed.

Description

Vehicle control method, apparatus and system
Cross-reference to related applications
The present disclosure claims priority to Chinese patent application No. 202010366741.7, entitled "Vehicle control method, apparatus and system" and filed with the Chinese Patent Office on April 30, 2020, the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the field of automotive technology, and in particular to a vehicle control method, apparatus and system.
Background
Gesture-based vehicle control has become a new research focus. In existing gesture-control techniques, the driver's gestures are processed directly by a human-machine interface (HMI) device: the HMI device recognizes the driver's intent and transmits that intent directly to a target device, which then performs the corresponding operation.
However, in the above approach, once the HMI module that recognizes driving intent fails, or the path that conveys driving intent fails, the data received by the terminal receiving device may no longer represent the true driving intent, and a function may even be falsely triggered, creating a driving hazard. For example, a false activation of fully automatic parking may cause unexpected lateral movement at high speed, resulting in a serious or even fatal accident.
In the prior art, to ensure that driving intent is correctly recognized and conveyed, protective measures must be added to the HMI device and to the modules along the transmission path, and the related devices must be developed to the corresponding ASIL (Automotive Safety Integrity Level). This not only adds considerable cost, but also makes the system so complex that meeting the corresponding ASIL requirements becomes difficult.
Summary
In view of this, the present disclosure aims to provide a vehicle control method, apparatus and system, to solve the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
To achieve the above object, the technical solution of the present disclosure is implemented as follows:
A vehicle control method, applied to a terminal control device in a vehicle, the terminal control device being communicatively connected to a human-machine interface device, and the vehicle including a plurality of actuators, the method including:
upon detecting an activation instruction for a target function, receiving current gesture data sent by the human-machine interface device;
when the current gesture data meets a preset condition, generating a target control instruction from the current gesture data, and sending the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
Further, in the vehicle control method, the vehicle stores a recommended gesture for each sub-function, and the receiving, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device includes:
upon detecting the activation instruction for the target function, sending the recommended gestures for the sub-functions of the target function to the human-machine interface device for display by the human-machine interface device;
receiving the current gesture data sent by the human-machine interface device.
Further, in the vehicle control method, the vehicle stores a recommended gesture for each sub-function, and the current gesture data includes gesture parameters;
the generating, when the current gesture data meets a preset condition, a target control instruction from the current gesture data and sending the target control instruction to a target actuator includes:
reconstructing an actual gesture from the current gesture data;
obtaining the recommended gestures for the sub-functions of the target function;
if the difference between the actual gesture and the recommended gesture is within a first preset range, generating the target control instruction from the recommended gesture and sending the target control instruction to the target actuator.
Further, in the vehicle control method, the current gesture data further includes a preliminary gesture, the preliminary gesture being the gesture determined by the human-machine interface device from the captured gesture operation;
the obtaining the recommended gestures for the sub-functions of the target function includes:
if the difference between the preliminary gesture and the actual gesture is within a second preset range, obtaining the recommended gestures for the sub-functions of the target function.
Further, in the vehicle control method, the vehicle stores a recommended gesture for each sub-function; when the recommended gesture for the target function is a periodic gesture, the generating, when the current gesture data meets a preset condition, a target control instruction from the current gesture data and sending the target control instruction to a target actuator includes:
determining expected gesture data from the most recently acquired gesture data and the recommended gesture for the target function;
if the difference between the current gesture data and the expected gesture data is within a third preset range, generating the target control instruction from the current gesture data and sending the target control instruction to the target actuator.
Further, in the vehicle control method, the vehicle stores the usage history of each function; before the receiving, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device, the method further includes:
upon receiving wake-up gesture data sent by the human-machine interface device, determining a recommended function from the wake-up gesture data, and sending recommendation information for the recommended function to the human-machine interface device for display by the human-machine interface device;
upon detecting a confirmation operation for the recommendation information, determining the recommended function to be the target function, and generating the activation instruction for the target function.
Further, the vehicle stores an activation gesture for each function, and the recommendation information includes a target activation gesture for the recommended function; the step of determining, upon detecting a confirmation operation for the recommendation information, the recommended function to be the target function and generating the activation instruction for the target function includes:
upon detecting that the difference between the actual gesture operation performed on the recommendation information and the target activation gesture is within a fourth preset range, determining the recommended function to be the target function and generating the activation instruction for the target function.
Further, the current gesture data includes a gesture position, a movement speed, a movement direction, or a movement shape, where
the movement shape includes a curved-surface diameter and a curved-surface center.
Another object of embodiments of the present disclosure is to provide a vehicle control apparatus, applied to a terminal control device in a vehicle, the terminal control device being communicatively connected to a human-machine interface device, and the vehicle including a plurality of actuators, the apparatus including:
a receiving module, configured to receive, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device;
an instruction module, configured to generate, when the current gesture data meets a preset condition, a target control instruction from the current gesture data, and send the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
Compared with the prior art, the vehicle control method and apparatus of the present disclosure have the following advantages:
Because, when a gesture operation is performed, the gesture data produced by the gesture operation is identified and evaluated by the terminal control device in the vehicle, and only when the gesture data meets the preset condition is a target control instruction generated from it — thereby controlling the target actuator to perform the corresponding operation — dangerous false triggering of vehicle functions due to erroneous gesture data is avoided even if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
An embodiment of the present disclosure further provides another vehicle control method, applied to a human-machine interface device, the human-machine interface device being communicatively connected to a terminal control device in a vehicle, and the vehicle including a plurality of actuators, the method including:
upon detecting an activation instruction for a target function, monitoring a gesture operation;
generating current gesture data from the gesture operation;
sending the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
Further, in the vehicle control method, the generating current gesture data from the gesture operation includes:
determining gesture parameters from the gesture operation;
determining a preliminary gesture from the gesture parameters;
generating the current gesture data from the gesture parameters and the preliminary gesture.
Another object of embodiments of the present disclosure is to provide another vehicle control apparatus, applied to a human-machine interface device, the human-machine interface device being communicatively connected to a terminal control device in a vehicle, and the vehicle including a plurality of actuators, the apparatus including:
a monitoring module, configured to monitor a gesture operation upon detecting an activation instruction for a target function;
a data generation module, configured to generate current gesture data from the gesture operation;
a sending module, configured to send the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
An embodiment of the present disclosure further provides a vehicle control system, the system including a human-machine interface device and a terminal control device of a vehicle, wherein:
upon detecting an activation instruction for a target function, the human-machine interface device monitors a gesture operation;
the human-machine interface device generates current gesture data from the gesture operation;
the human-machine interface device sends the current gesture data to the terminal control device;
the terminal control device receives the current gesture data sent by the human-machine interface device;
when the current gesture data meets a preset condition, the terminal control device generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
A further object of the present disclosure is to provide a vehicle, the vehicle including the vehicle control system described above.
Compared with the prior art, the vehicle control method, apparatus and system of the present disclosure have the following advantages:
Because, when a gesture operation is performed, the gesture data corresponding to the gesture operation is sent to the vehicle's terminal control device, which identifies and evaluates the gesture data, and only when the gesture data meets the preset condition is a target control instruction generated from it — thereby controlling the target actuator to perform the corresponding operation — dangerous false triggering of vehicle functions due to erroneous gesture data is avoided even if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
The present disclosure further provides a computing processing device, including:
a memory storing computer-readable code; and
one or more processors, wherein, when the computer-readable code is executed by the one or more processors, the computing processing device performs any one of the vehicle control methods described above.
The present disclosure further provides a computer program, including computer-readable code that, when run on a computing processing device, causes the computing processing device to perform any one of the vehicle control methods described above.
The present disclosure further provides a computer-readable medium storing the above computer program.
The above description is only an overview of the technical solution of the present disclosure. To make the technical means of the present disclosure clearer, so that it can be implemented according to the contents of the specification, and to make the above and other objects, features and advantages of the present disclosure more readily apparent, specific embodiments of the present disclosure are set forth below.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present disclosure or the related art more clearly, the drawings needed for describing the embodiments or the related art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present disclosure, and those of ordinary skill in the art can derive other drawings from them without creative effort.
The drawings, which form a part of the present disclosure, are provided for further understanding of the present disclosure. The illustrative embodiments of the present disclosure and their descriptions explain the present disclosure and do not unduly limit it. In the drawings:
FIG. 1 is a schematic flowchart of a vehicle control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flowchart of a vehicle control method according to another embodiment of the present disclosure;
FIG. 3 is a flowchart of the interaction steps of a vehicle control method according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a vehicle control apparatus according to an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a vehicle control apparatus according to another embodiment of the present disclosure;
FIG. 6 schematically shows a block diagram of a computing processing device for performing the method according to the present disclosure; and
FIG. 7 schematically shows a storage unit for holding or carrying program code implementing the method according to the present disclosure.
Detailed description of the embodiments
Embodiments of the present disclosure are described in more detail below with reference to the drawings. Although the drawings show embodiments of the present disclosure, it should be understood that the present disclosure can be implemented in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope conveyed fully to those skilled in the art.
It should be noted that, unless they conflict, the embodiments of the present disclosure and the features in the embodiments may be combined with each other.
The present disclosure is described in detail below with reference to the drawings and in conjunction with the embodiments.
Referring to FIG. 1, which shows a schematic flowchart of a vehicle control method provided by an embodiment of the present disclosure: the method is applied to a terminal control device in a vehicle, the terminal control device is communicatively connected to a human-machine interface device, the vehicle includes a plurality of actuators, and the method includes steps S100 to S200.
In the vehicle control method provided by the embodiment of the present disclosure, the terminal control device may be a vehicle control unit or a specific function module; the human-machine interface device may be an in-vehicle device, such as the vehicle head unit or an in-vehicle camera, that communicates with the terminal control device directly over a Controller Area Network (CAN). The human-machine interface device may also be a mobile device capable of monitoring gesture operations, such as a mobile phone or a tablet computer; in that case the human-machine interface device communicates with an in-vehicle telematics terminal over a wireless network, and the terminal control device communicates with the telematics terminal over the CAN network, thereby establishing the communication connection between the terminal control device and the human-machine interface device.
The target function is the specific vehicle function the driver intends to perform through gesture operations, such as remote-controlled parking. The actuators are the specific vehicle mechanisms that implement the various vehicle functions; the driving intent expressed by the driver's gesture operation can be realized by one actuator working alone or by several actuators working together. In practice, the actuators include the steering system, braking system, powertrain, transmission, four-wheel-drive control system, and so on.
Step S100: upon detecting an activation instruction for a target function, receive current gesture data sent by the human-machine interface device.
In step S100: the vehicle has multiple functions that can be performed through gesture operations, but a function must first be activated before it can be executed according to the driver's gestures. When the activation instruction for the target function is detected, the driver wants to activate the target function; how the target function is actually executed depends on the driver's specific gesture operations. The current gesture data sent by the human-machine interface device must therefore be received; it reflects the driver's current specific gesture operation.
In practice, the current gesture data may include gesture parameters such as the gesture position, movement speed, movement direction, and movement shape, where the movement shape may include a curved-surface diameter and a curved-surface center.
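The gesture parameters listed above can be modeled as a small record type. The following is a minimal Python sketch; the field names and sample values are illustrative, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureData:
    """Gesture parameters as described: position, speed, direction, shape."""
    position: tuple[float, float]                        # touch position on the HMI surface
    speed: float                                         # movement speed
    direction: float                                     # movement direction, in degrees
    curve_diameter: Optional[float] = None               # movement shape: curved-surface diameter
    curve_center: Optional[tuple[float, float]] = None   # movement shape: curved-surface center

# A sample of "current gesture data" for a circular swipe.
sample = GestureData(position=(0.4, 0.7), speed=12.5, direction=90.0,
                     curve_diameter=3.0, curve_center=(0.5, 0.5))
print(sample.curve_diameter)  # -> 3.0
```

A record like this is what the HMI device would serialize and what the terminal control device would validate before acting on it.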
Step S200: when the current gesture data meets a preset condition, generate a target control instruction from the current gesture data and send the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
In step S200, the preset condition is a pre-configured condition used to decide whether the current gesture data is reasonable and whether it could endanger the vehicle — that is, whether the current gesture data faithfully reflects the driver's driving intent. The target control instruction is the instruction that controls a specific actuator to perform the corresponding operation, and the target actuator is the specific mechanism that implements the target function.
When the current gesture data meets the preset condition, the acquired gesture data is reasonable and will not endanger the vehicle, and it is determined to faithfully reflect the driver's driving intent. A target control instruction is therefore generated from the current gesture data and sent to the target actuator, so that the target actuator performs the operation corresponding to the current gesture data, thereby implementing the target function.
When the current gesture data does not meet the preset condition, the acquired gesture data is unreasonable and could endanger the vehicle, and it is determined not to faithfully reflect the driver's driving intent. The step of generating the target control instruction from the current gesture data is therefore not performed; instead, gesture-data transmission is interrupted and the system enters a safe state or takes other appropriate action, so as to avoid outputting abnormal function behavior that would violate safety.
In step S200, the analysis that decides whether the acquired current gesture data meets the preset condition — and, when it does, the generation of the target control instruction and its transmission to the target actuator — is performed directly by the terminal control device, not by the gesture-capture device. Thus, if the gesture data, or an instruction derived from a gesture, is corrupted while being transmitted from the human-machine interface device to the terminal control device, the terminal control device will be unable to recognize the expected gesture and driving intent from the received data and will not generate the corresponding target control instruction, so no gesture-based function behavior is falsely output, ensuring safety. At the same time, only the terminal control device needs to be developed to the corresponding Automotive Safety Integrity Level; the HMI device and the devices along the transmission path need no additional ASIL-driven design, saving development cost.
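The terminal-side gatekeeping just described — actuate only when the data passes the preset condition, otherwise fall back to a safe state — can be sketched as follows. The function and condition names are illustrative; the concrete condition logic is function-specific and not specified here:

```python
def handle_gesture_data(gesture_data, meets_preset_condition, build_instruction):
    """Terminal-side check (step S200): act only on validated gesture data.

    meets_preset_condition and build_instruction are supplied by the terminal
    control device for the active target function.
    """
    if meets_preset_condition(gesture_data):
        # Data is plausible: generate and forward the target control instruction.
        return ("SEND_TO_ACTUATOR", build_instruction(gesture_data))
    # Corrupted or implausible data: do not actuate, enter a safe state instead.
    return ("SAFE_STATE", None)

# Example: accept only movement speeds within a plausible band.
result = handle_gesture_data(
    {"speed": 500.0},  # implausibly fast, e.g. corrupted in transit
    meets_preset_condition=lambda d: 0.0 < d["speed"] < 100.0,
    build_instruction=lambda d: {"cmd": "continue_parking"},
)
print(result)  # -> ('SAFE_STATE', None)
```

The key design point, per the text above, is that this decision runs on the terminal control device, so a corrupted payload simply fails the check rather than triggering a function.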
Compared with the prior art, the vehicle control method of the present disclosure has the following advantages:
Because, when a gesture operation is performed, the gesture data corresponding to the gesture operation is sent to the vehicle's terminal control device, which identifies and evaluates the gesture data, and only when the gesture data meets the preset condition is a target control instruction generated from it — thereby controlling the target actuator to perform the corresponding operation — dangerous false triggering of vehicle functions due to erroneous gesture data is avoided even if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
In practical scenarios, the target function may be remote-controlled parking, remote window closing, remote door locking and unlocking, remote tailgate opening and closing, remote engine control, remote heating control, and so on. Take remote-controlled parking as an example: guided by prompts in a mobile phone app, the driver activates the remote parking function with a certain gesture and controls the starting and stopping of the parking process by exchanging specific gestures with the app. During remote parking, the app captures the driver's gesture operations and recognizes the driver's intent — for example, activating, pausing, terminating, or starting remote parking — then converts the gesture operations into gesture data and sends it to the in-vehicle telematics terminal. After receiving the gesture data from the phone, the telematics terminal passes it to the remote parking control module. Based on the gesture data and the current vehicle state, the remote parking control module generates the corresponding parking instruction to control automatic parking when the gesture data is determined to meet the preset condition; when the gesture data is determined not to meet the preset condition, it generates no parking instruction and enters a safe state.
Optionally, in one implementation, the vehicle control method provided by the embodiment of the present disclosure further includes steps S1001 to S1002 before step S100.
S1001: upon receiving wake-up gesture data sent by the human-machine interface device, determine a recommended function from the wake-up gesture data, and send recommendation information for the recommended function to the human-machine interface device for display by the human-machine interface device.
In step S1001: each sub-function of each function has a corresponding recommended gesture, and the recommended gestures of sub-functions of different functions may be similar or even identical. Therefore, when the driver performs a wake-up gesture on the human-machine interface device and wake-up gesture data is sent, a recommended function corresponding to the wake-up gesture data can be determined from that data; the recommended function contains at least one sub-function whose recommended gesture is the same as, or similar to, the gesture corresponding to the wake-up gesture data. Sending the recommended function to the human-machine interface device for display lets the driver choose whether to execute it.
Optionally, in practice, the vehicle stores the usage history of each function, and the recommended function may be determined from both the wake-up gesture data and the usage history — that is, the function the driver is most likely to need is offered as the recommended function for the driver to select.
In practice, there may be one or more recommended functions.
S1002: upon detecting a confirmation operation for the recommendation information, determine the recommended function to be the target function, and generate an activation instruction for the target function.
In step S1002, the confirmation operation is an operation — voice, gesture, touch, press, etc. — performed through the human-machine interface device to confirm use of the recommended function. Detecting a confirmation operation for the recommended function means the driver has confirmed the need to use the recommended function; the recommended function is therefore taken as the target function and an activation instruction for it is generated, so that the driver's subsequent gesture operations can control the specific actuators to do the corresponding work and implement the target function.
Optionally, in a specific implementation, the vehicle also stores an activation gesture for each function, and the recommendation information includes a target activation gesture for the recommended function together with a description of the function. Step S1002 then specifically includes: upon detecting that the difference between the actual gesture operation performed on the recommendation information and the target activation gesture is within a fourth preset range, determining the recommended function to be the target function and generating the activation instruction for the target function.
In this implementation, the fourth preset range is a pre-configured bound for deciding whether the actual gesture operation monitored via the human-machine interface device matches the activation gesture. When the difference between the actual gesture operation performed on the recommendation information and the target activation gesture is within the fourth preset range, the driver wants to activate the recommended function; the recommended function is therefore determined to be the target function, and the activation instruction for the target function is generated.
Optionally, in one implementation of the vehicle control method provided by the embodiment of the present disclosure, the vehicle stores a recommended gesture for each sub-function, and step S100 specifically includes steps S101 to S102.
In this implementation, the recommended gesture for each sub-function is the standard operating gesture that implements that function; when the driver performs the recommended gesture, control of the vehicle to implement the corresponding function is triggered. In practice, a recommended gesture may include a gesture shape and a corresponding movement trajectory, from which the driver can intuitively learn the specific gesture operation that controls the vehicle to perform the target function.
Step S101: upon detecting the activation instruction for the target function, send the recommended gestures for the sub-functions of the target function to the human-machine interface device for display by the human-machine interface device.
In step S101: when the activation instruction for the target function is detected — that is, when the driver wants to activate the target function — the recommended gestures for the target function are first sent to the human-machine interface device and displayed there, better prompting the driver on how to perform the gesture operations that control the vehicle to implement the target function.
Step S102: receive the current gesture data sent by the human-machine interface device.
In step S102: after the recommended gestures for the sub-functions of the target function have been sent to the human-machine interface device, the gesture data representing the driver's specific gesture operation — captured and sent by the human-machine interface device — is received, so that the corresponding instruction can be generated from it to control the specific actuator to implement the target function.
In this implementation, by first sending the recommended gestures for the sub-functions of the target function to the human-machine interface device and having the device display them, the driver is better prompted on how to perform the gesture operations that control the vehicle to implement each sub-function of the target function.
Optionally, in a specific implementation, the vehicle stores a recommended gesture for each sub-function; when the recommended gestures for the sub-functions of the target function are periodic gestures, step S200 includes steps S201 to S202.
In this implementation, the recommended gesture for each sub-function is the standard operating gesture that implements that function; when the driver performs the recommended gesture, control of the vehicle to implement the corresponding function is triggered. A periodic recommended gesture for the target function means the driver must perform the gesture operation periodically for the vehicle to keep executing the target function.
Step S201: determine expected gesture data from the most recently acquired gesture data and the recommended gestures for the sub-functions of the target function.
In step S201: when the recommended gestures for the sub-functions of the target function are periodic, the human-machine interface device receives the driver's periodic gesture operations and sends the resulting gesture data to the terminal control device, so the terminal control device also periodically obtains gesture data for the target function. Because the vehicle stores the recommended gestures for the sub-functions of the target function, and only gesture operations following the recommended gesture can control the vehicle to execute the target function, the next gesture data to be received can be predicted from the currently received gesture data and the recommended gesture whenever the driver is performing the target function normally; that is, the expected gesture data to be obtained this time from the human-machine interface device can be determined from the previously received gesture data and the recommended gesture.
Step S202: if the difference between the current gesture data and the expected gesture data is within a third preset range, generate the target control instruction from the current gesture data and send the target control instruction to the target actuator.
In step S202, the third preset range is a pre-configured bound for deciding whether the received current gesture data matches expectations — that is, whether it faithfully represents the driver's driving intent. When the difference between the current gesture data and the expected gesture data is within the third preset range, the received current gesture data matches expectations and is judged to faithfully represent the driver's driving intent, so the target control instruction is generated from the current gesture data and sent to the target actuator. When the difference is outside the third preset range, the received current gesture data does not match expectations and is judged not to faithfully represent the driver's driving intent, or to be erroneous; no target control instruction is generated from it and none is sent to the target actuator, avoiding the danger that would result from misrecognizing the driver's intent and falsely triggering a vehicle function.
In practice, the third preset range may be 3% to 30%. In addition, to guarantee that the target function works, the gesture-data transmission rate should be sufficiently high and the data should arrive in order; for periodic gestures, the data transmission rate should be at least twice the gesture operation rate.
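The expected-value check in steps S201–S202 amounts to predicting the next sample from the last one plus the recommended gesture's period, and accepting the received sample only if its deviation stays inside the third preset range. A sketch under assumed names, modeling a periodic circular gesture by its phase angle (the 3% tolerance is the lower bound of the range quoted above):

```python
def expected_phase(last_phase: float, phase_step: float) -> float:
    """S201: predict the next phase of a periodic (e.g. circular) gesture."""
    return (last_phase + phase_step) % 360.0

def within_third_range(current: float, expected: float, rel_tol: float) -> bool:
    """S202: third-preset-range check as a relative tolerance on a full cycle."""
    return abs(current - expected) <= rel_tol * 360.0

last, step = 90.0, 30.0            # last received phase; recommended per-sample step
exp = expected_phase(last, step)   # predicted next phase: 120.0
print(within_third_range(123.0, exp, rel_tol=0.03))  # 3 deg deviation  -> True
print(within_third_range(200.0, exp, rel_tol=0.03))  # 80 deg deviation -> False
```

The 2x sampling requirement stated above is the usual Nyquist-style condition: with fewer than two samples per gesture cycle, the terminal could not distinguish the expected periodic motion from an aliased one.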
Optionally, in one implementation, the vehicle stores a recommended gesture for each sub-function, the gesture data includes gesture parameters, and step S200 includes steps S211 to S213.
In this implementation, the recommended gesture for each function is the standard operating gesture that implements each sub-function of that function; when the driver performs the recommended gesture, control of the vehicle to implement the corresponding sub-function is triggered.
Step S211: reconstruct the actual gesture from the gesture data.
In step S211, the gesture parameters specifically include the gesture position, movement speed, movement direction, movement shape, and so on; the actual gesture corresponding to the driver's gesture operation can therefore be reconstructed from the received gesture parameters.
Step S212: obtain the recommended gestures for the sub-functions of the target function.
In step S212: because the vehicle stores the recommended gestures for the functions, the target function can be determined when its activation instruction is detected, and the recommended gestures can then be determined.
Step S213: if the difference between the actual gesture and a recommended gesture is within a first preset range, generate the target control instruction from the recommended gesture and send the target control instruction to the target actuator.
In step S213, the first preset range is the bound for deciding whether the actual gesture matches the recommended gesture — that is, whether the received current gesture data faithfully represents the driver's driving intent. When the difference between the actual gesture and the recommended gesture of a sub-function of the target function is within the first preset range, the actual gesture matches the recommended gesture: the recommended gesture can be taken as the target recommended gesture, and the received current gesture data is judged to faithfully represent the driver's driving intent, so the target control instruction is generated from the recommended gesture and sent to the target actuator. When the difference exceeds the first preset range, the actual gesture does not match the recommended gesture: the received current gesture data is judged not to faithfully represent the driver's driving intent, or to be erroneous, so no target control instruction is generated and none is sent to the target actuator, avoiding the danger that would result from misrecognizing the driver's intent and falsely triggering a vehicle function.
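Steps S211–S213 can be sketched as matching a reconstructed gesture trace against the stored recommended gestures. The distance metric below (mean point-to-point distance over equal-length traces) is a toy stand-in; a production system would use a proper trajectory metric such as dynamic time warping:

```python
def gesture_difference(actual, recommended):
    """Mean point-to-point distance between two equal-length gesture traces."""
    assert len(actual) == len(recommended)
    return sum(((ax - rx) ** 2 + (ay - ry) ** 2) ** 0.5
               for (ax, ay), (rx, ry) in zip(actual, recommended)) / len(actual)

def match_recommended(actual, recommended_gestures, first_preset_range):
    """S212-S213: return the first recommended gesture within range, else None."""
    for name, trace in recommended_gestures.items():
        if gesture_difference(actual, trace) <= first_preset_range:
            return name
    return None  # no match: do not generate a target control instruction

# One stored recommended gesture and a noisy reconstructed actual gesture (S211).
recommended = {"circle_cw": [(0, 1), (1, 0), (0, -1), (-1, 0)]}
actual = [(0, 0.9), (1.1, 0), (0, -1), (-1, 0.1)]
print(match_recommended(actual, recommended, first_preset_range=0.2))  # -> circle_cw
```

When `match_recommended` returns a gesture name, the instruction is generated from that recommended gesture, mirroring step S213's use of the recommended gesture rather than the noisy actual one.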
Optionally, in a more specific implementation, the gesture data further includes a preliminary gesture — the gesture determined by the human-machine interface device from the captured gesture operation — and step S212 specifically includes: if the difference between the preliminary gesture and the actual gesture is within a second preset range, obtaining the recommended gestures for the sub-functions of the target function.
In this implementation, when monitoring the driver's gesture operation, the human-machine interface device first determines the corresponding gesture parameters from the operation and recognizes a gesture from those parameters, yielding the gesture identified and analyzed by the human-machine interface device; the preliminary gesture and the gesture parameters are then packaged into gesture data and sent to the vehicle's terminal control device. After receiving the gesture data, the terminal control device reconstructs the actual gesture from the gesture parameters and compares the reconstructed actual gesture with the preliminary gesture sent by the human-machine interface device, which reveals whether the gesture parameters were corrupted during transmission from the human-machine interface device to the terminal control device.
Here, the second preset range is the pre-configured bound for deciding whether the actual gesture matches the preliminary gesture — that is, whether the received gesture parameters were corrupted during transmission from the human-machine interface device to the terminal control device. If the difference between the preliminary gesture and the actual gesture is within the second preset range, the actual gesture matches the preliminary gesture: the received gesture parameters were not corrupted in transit, and the gesture parameters obtained by the terminal control device are the ones emitted by the human-machine interface device. The method then proceeds to the subsequent step of obtaining the recommended gestures for the sub-functions of the target function, to further decide whether the actual gesture corresponding to the gesture parameters is safe and reasonable and reflects the driver's true driving intent.
In this implementation, the human-machine interface device performs preliminary gesture recognition and sends the sensed gesture parameters together with the recognition result to the terminal control device. The terminal control device first recognizes the gesture expressed by the gesture parameters in the received data and then compares it with the preliminary gesture in the received data; if they agree, the data is deemed to have passed through the human-machine interface device and the transmission system without anomaly. Because the gesture is recognized separately by the human-machine interface device and the terminal control device and the two results are finally compared, the reliability of the recognition result is increased, further ensuring the safety of the system.
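The dual-recognition check described above reduces to comparing two independently derived gesture labels — the HMI's preliminary gesture and the terminal's own reconstruction from the raw parameters — before trusting the data. A sketch with a stand-in recognizer (the second-preset-range comparison is collapsed to label equality here for brevity):

```python
def terminal_accepts(gesture_params, preliminary_gesture, recognize) -> bool:
    """Accept the payload only if the terminal's own recognition of the raw
    parameters agrees with the HMI's preliminary gesture."""
    actual_gesture = recognize(gesture_params)  # terminal-side reconstruction
    return actual_gesture == preliminary_gesture

# Stand-in recognizer: classify by net horizontal movement of the gesture.
recognize = lambda params: "swipe_right" if params["dx"] > 0 else "swipe_left"

print(terminal_accepts({"dx": 5.0}, "swipe_right", recognize))   # -> True
# Parameters corrupted in transit no longer support the claimed gesture:
print(terminal_accepts({"dx": -5.0}, "swipe_right", recognize))  # -> False
```

The safety argument is that a transmission fault would have to corrupt the parameters and the preliminary label in a mutually consistent way to slip through, which is far less likely than corrupting either alone.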
Referring to FIG. 2, which shows a schematic flowchart of another vehicle control method provided by an embodiment of the present disclosure: the method is applied to a human-machine interface device, the human-machine interface device is communicatively connected to a terminal control device in a vehicle, the vehicle includes a plurality of actuators, and the method includes steps S221 to S223.
Step S221: upon detecting an activation instruction for a target function, monitor a gesture operation.
In step S221: when the driver chooses to activate the target function, the activation instruction is triggered on the human-machine interface device side, and the activation instruction for the target function is also sent to the terminal control device. When the activation instruction for the target function is detected — that is, when the driver wants to activate the target function — the driver will perform gesture operations on the human-machine interface device, so those operations can be monitored and the corresponding gesture data obtained in the subsequent step S222.
In practice, the step of monitoring the gesture operation may be performed by a touch sensor in the human-machine interface device, which captures the gesture parameters.
Step S222: generate current gesture data from the gesture operation.
In step S222: by monitoring the gesture operation the driver performs on the human-machine interface device, the corresponding gesture data is obtained.
Step S223: send the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
In step S223, the gesture data corresponding to the gesture operation is sent via the transmission system to the vehicle's terminal control device, which identifies and evaluates the gesture data; only when the gesture data meets the preset condition is the target control instruction generated from it, thereby controlling the target actuator to perform the corresponding operation. Thus, if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
In practice, the transmission system is the system that establishes the communication connection between the human-machine interface device and the terminal control device; it may specifically include a connected-vehicle server, an in-vehicle telematics terminal, a CAN network, and so on.
Optionally, in one implementation, step S222 specifically includes steps S2221 to S2223:
S2221: determine gesture parameters from the gesture operation.
In step S2221: by monitoring the gesture operation the driver performs on the human-machine interface device, the corresponding gesture parameters are determined; from these parameters the corresponding gesture shape can be determined, i.e. the driver's gesture operation can be reproduced. Specifically, the gesture parameters include the gesture position, movement speed, movement direction, and movement shape, where the movement shape may include a curved-surface diameter and a curved-surface center.
S2222: determine a preliminary gesture from the gesture parameters.
In step S2222: the human-machine interface device analyzes and recognizes the gesture reflected by the gesture parameters determined in step S2221, yielding the preliminary gesture.
S2223: generate the current gesture data from the gesture parameters and the preliminary gesture.
In step S2223: the gesture parameters determined from the driver's gesture operation in step S2221 and the preliminary gesture determined in step S2222 are packaged together as the current gesture data and sent to the terminal control device. The terminal control device then, when the current gesture data meets the preset condition, generates the target control instruction from the current gesture data and sends it to the target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
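Steps S2221–S2223 on the HMI side amount to bundling the sensed parameters with the locally recognized preliminary gesture into one payload. A sketch follows; the JSON payload layout is illustrative, not a wire format from the patent:

```python
import json

def build_current_gesture_data(gesture_params: dict, recognize) -> str:
    """Package gesture parameters plus the HMI's preliminary gesture (S2221-S2223)."""
    preliminary = recognize(gesture_params)          # S2222: local recognition
    payload = {"params": gesture_params,             # S2221: sensed parameters
               "preliminary_gesture": preliminary}   # both travel to the terminal
    return json.dumps(payload)                       # S2223: serialized gesture data

# Stand-in recognizer: classify by net vertical movement.
recognize = lambda p: "swipe_up" if p["dy"] > 0 else "swipe_down"
msg = build_current_gesture_data({"dy": 4.2, "speed": 10.0}, recognize)
print(msg)
```

Sending the raw parameters alongside the label is what lets the terminal control device redo the recognition independently and compare, as described in the terminal-side embodiment.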
Compared with the prior art, the vehicle control method of the present disclosure has the following advantages:
Because, when a gesture operation is performed, the gesture data corresponding to the gesture operation is sent to the vehicle's terminal control device, which identifies and evaluates the gesture data, and only when the gesture data meets the preset condition is a target control instruction generated from it — thereby controlling the target actuator to perform the corresponding operation — dangerous false triggering of vehicle functions due to erroneous gesture data is avoided even if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
Referring to FIG. 3, which shows a flowchart of the interaction steps of a vehicle control method proposed by an embodiment of the present disclosure, involving a human-machine interface device and a terminal control device of a vehicle, and including steps S301 to S305.
In practice, the human-machine interface device is communicatively connected to the vehicle's terminal control device via a transmission system such as a connected-vehicle server and a CAN network.
Step S301: upon detecting an activation instruction for a target function, the human-machine interface device monitors a gesture operation.
For step S301, refer to the description of step S221; details are not repeated here.
Step S302: the human-machine interface device generates current gesture data from the gesture operation.
For step S302, refer to the description of step S222; details are not repeated here.
Step S303: the human-machine interface device sends the current gesture data to the terminal control device.
For step S303, refer to the description of step S223; details are not repeated here.
S304: the terminal control device receives the current gesture data sent by the human-machine interface device.
For step S304, refer to the description of step S100; details are not repeated here.
S305: when the current gesture data meets a preset condition, the terminal control device generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
For step S305, refer to the description of step S200; details are not repeated here.
Compared with the prior art, the vehicle control method of the embodiment of the present disclosure has the following advantages:
When a gesture operation is performed, the human-machine interface device captures the gesture operation and generates the gesture data, then sends the gesture data to the vehicle's terminal control device, which identifies and evaluates it; only when the gesture data meets the preset condition is a target control instruction generated from it, thereby controlling the target actuator to perform the corresponding operation. Thus, if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
Another object of the present disclosure is to provide a vehicle control apparatus, applied to a terminal control device in a vehicle, the terminal control device being communicatively connected to a human-machine interface device, and the vehicle including a plurality of actuators. Referring to FIG. 4, which shows a schematic structural diagram of a vehicle control apparatus proposed by an embodiment of the present disclosure, the apparatus includes:
a receiving module 41, configured to receive, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device;
an instruction module 42, configured to generate, when the current gesture data meets a preset condition, a target control instruction from the current gesture data, and send the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
In the apparatus of the embodiment of the present disclosure, upon detecting the activation instruction for the target function, the receiving module 41 receives the current gesture data sent by the human-machine interface device; the terminal control device then identifies and evaluates the current gesture data, and only when the gesture data meets the preset condition does the instruction module 42 generate the target control instruction from the gesture data, thereby controlling the target actuator to perform the corresponding operation. Thus, if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
Optionally, the vehicle control apparatus further includes:
a control module, configured to determine, upon receiving wake-up gesture data sent by the human-machine interface device, a recommended function from the wake-up gesture data, and send recommendation information for the recommended function to the human-machine interface device for display by the human-machine interface device;
a start module, configured to determine, upon detecting a confirmation operation for the recommendation information, the recommended function to be the target function, and generate an activation instruction for the target function.
Optionally, in the vehicle control apparatus, the vehicle stores a recommended gesture for each sub-function, and the receiving module 41 includes:
a sending unit, configured to send, upon detecting the activation instruction for the target function, the recommended gestures for the sub-functions of the target function to the human-machine interface device for display by the human-machine interface device;
a receiving unit, configured to receive the current gesture data sent by the human-machine interface device.
Further, in the vehicle control apparatus, the vehicle stores a recommended gesture for each sub-function, and the gesture data includes gesture parameters;
the instruction module 42 includes:
a reconstruction unit, configured to reconstruct the actual gesture from the gesture parameters;
an obtaining unit, configured to obtain the recommended gestures for the sub-functions of the target function;
a first instruction unit, configured to generate, if the difference between the actual gesture and the recommended gesture is within a first preset range, the target control instruction from the recommended gesture, and send the target control instruction to the target actuator.
Optionally, in the vehicle control apparatus, the gesture data further includes a preliminary gesture, the preliminary gesture being the gesture determined by the human-machine interface device from the captured gesture operation;
the obtaining unit is specifically configured to obtain the recommended gestures for the sub-functions of the target function if the difference between the preliminary gesture and the actual gesture is within a second preset range.
Optionally, in the vehicle control apparatus, the vehicle stores a recommended gesture for each sub-function; when the recommended gesture for the target function is a periodic gesture, the instruction module 42 includes:
a first determination unit, configured to determine expected gesture data from the most recently acquired gesture data and the recommended gesture for the target function;
a second instruction unit, configured to generate, if the difference between the current gesture data and the expected gesture data is within a third preset range, the target control instruction from the current gesture data, and send the target control instruction to the target actuator.
Another object of the present disclosure is to provide a vehicle control apparatus, applied to a human-machine interface device, the human-machine interface device being communicatively connected to a terminal control device in a vehicle, and the vehicle including a plurality of actuators. Referring to FIG. 5, which shows a schematic structural diagram of a vehicle control apparatus proposed by an embodiment of the present disclosure, the apparatus includes:
a monitoring module 51, configured to monitor a gesture operation upon detecting an activation instruction for a target function;
a data generation module 52, configured to generate current gesture data from the gesture operation;
a sending module 53, configured to send the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
In the apparatus of the embodiment of the present disclosure, the monitoring module 51 monitors the gesture operation upon detecting the activation instruction for the target function, the data generation module 52 generates the current gesture data from the gesture operation, and the sending module 53 sends the current gesture data to the terminal control device; the terminal control device then identifies and evaluates the current gesture data, and only when the gesture data meets the preset condition is the target control instruction generated from it, thereby controlling the target actuator to perform the corresponding operation. Thus, if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
Optionally, in the vehicle control apparatus, the data generation module 52 includes:
a second determination unit, configured to determine gesture parameters from the gesture operation;
a third determination unit, configured to determine a preliminary gesture from the gesture parameters;
a generation unit, configured to generate the current gesture data from the gesture parameters and the preliminary gesture.
Another object of the present disclosure is to provide a vehicle control system, the system including a human-machine interface device and a terminal control device of a vehicle, wherein:
upon detecting an activation instruction for a target function, the human-machine interface device monitors a gesture operation;
the human-machine interface device generates current gesture data from the gesture operation;
the human-machine interface device sends the current gesture data to the terminal control device;
the terminal control device receives the current gesture data sent by the human-machine interface device;
when the current gesture data meets a preset condition, the terminal control device generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
A further object of the present disclosure is to provide a vehicle, the vehicle including the vehicle control system described above.
The advantages of the vehicle control system and the vehicle over the prior art are the same as those of the vehicle control method and apparatus described above, and are not repeated here.
In summary, in the vehicle control method, apparatus, system and vehicle provided by the present disclosure, when a gesture operation is performed, the gesture data corresponding to the gesture operation is sent to the vehicle's terminal control device, which identifies and evaluates the gesture data; only when the gesture data meets the preset condition is the target control instruction generated from it, thereby controlling the target actuator to perform the corresponding operation. Thus, if the gesture data is corrupted while being transmitted from the human-machine interface device to the terminal control device, dangerous false triggering of vehicle functions due to erroneous gesture data is avoided, ensuring the safety of gesture-based vehicle control. At the same time, no additional protective measures need to be added along the transmission path between the human-machine interface device and the terminal control device: existing modules and technologies can be reused as-is, and only the corresponding module of the terminal control device needs to be developed, which greatly reduces the development difficulty of the whole system and saves cost. This solves the problem that existing gesture-based vehicle control cannot effectively avoid danger caused by gesture recognition and data transmission errors.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not repeated here.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected as actually needed to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
The component embodiments of the present disclosure may be implemented in hardware, in software modules running on one or more processors, or in a combination of them. Those skilled in the art should understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the computing processing device according to embodiments of the present disclosure. The present disclosure may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing part or all of the method described herein. Such a program implementing the present disclosure may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, FIG. 6 shows a computing processing device that can implement the method according to the present disclosure. The computing processing device conventionally includes a processor 1010 and a computer program product or computer-readable medium in the form of a memory 1020. The memory 1020 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read-Only Memory), EPROM, a hard disk, or a ROM. The memory 1020 has a storage space 1030 for program code 1031 for performing any of the method steps of the methods described above. For example, the storage space 1030 for program code may include individual program codes 1031 for implementing the various steps of the above method. These program codes can be read from or written into one or more computer program products. These computer program products include program code carriers such as a hard disk, a compact disc (CD), a memory card, or a floppy disk. Such a computer program product is usually a portable or fixed storage unit as described with reference to FIG. 7. The storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 1020 in the computing processing device of FIG. 6. The program code may, for example, be compressed in an appropriate form. Usually, the storage unit includes computer-readable code 1031', i.e. code that can be read by a processor such as the processor 1010, which, when run by a computing processing device, causes the computing processing device to perform the various steps of the method described above.
Reference herein to "one embodiment", "an embodiment", or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Also, note that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
Numerous specific details are set forth in the specification provided here. However, it is understood that embodiments of the present disclosure may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure an understanding of this description.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present disclosure can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
The above are only preferred embodiments of the present disclosure and are not intended to limit it; any modifications, equivalent replacements, improvements, etc. made within the spirit and principles of the present disclosure shall be included within its scope of protection.
The above are only specific embodiments of the present disclosure, but the scope of protection of the present disclosure is not limited thereto. Any person skilled in the art could readily conceive of changes or substitutions within the technical scope disclosed by the present disclosure, and all of these shall be covered by the scope of protection of the present disclosure. Therefore, the scope of protection of the present disclosure shall be subject to the scope of protection of the claims.

Claims (16)

  1. A vehicle control method, applied to a terminal control device in a vehicle, the terminal control device being communicatively connected to a human-machine interface device, and the vehicle comprising a plurality of actuators, the method comprising:
    upon detecting an activation instruction for a target function, receiving current gesture data sent by the human-machine interface device;
    when the current gesture data meets a preset condition, generating a target control instruction from the current gesture data, and sending the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
  2. The vehicle control method according to claim 1, wherein the vehicle stores a recommended gesture for each sub-function, and the receiving, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device comprises:
    upon detecting the activation instruction for the target function, sending the recommended gestures for the sub-functions of the target function to the human-machine interface device for display by the human-machine interface device;
    receiving the current gesture data sent by the human-machine interface device.
  3. The vehicle control method according to claim 1, wherein the vehicle stores a recommended gesture for each sub-function, and the current gesture data comprises gesture parameters;
    the generating, when the current gesture data meets a preset condition, a target control instruction from the current gesture data and sending the target control instruction to a target actuator comprises:
    reconstructing an actual gesture from the current gesture data;
    obtaining the recommended gestures for the sub-functions of the target function;
    if the difference between the actual gesture and the recommended gesture is within a first preset range, generating the target control instruction from the recommended gesture and sending the target control instruction to the target actuator.
  4. The vehicle control method according to claim 3, wherein the current gesture data further comprises a preliminary gesture, the preliminary gesture being the gesture determined by the human-machine interface device from the captured gesture operation;
    the obtaining the recommended gestures for the sub-functions of the target function comprises:
    if the difference between the preliminary gesture and the actual gesture is within a second preset range, obtaining the recommended gestures for the sub-functions of the target function.
  5. The vehicle control method according to claim 1, wherein the vehicle stores a recommended gesture for each sub-function; when the recommended gestures for the sub-functions of the target function are periodic gestures, the generating, when the current gesture data meets a preset condition, a target control instruction from the current gesture data and sending the target control instruction to a target actuator comprises:
    determining expected gesture data from the most recently acquired gesture data and the recommended gestures for the sub-functions of the target function;
    if the difference between the current gesture data and the expected gesture data is within a third preset range, generating the target control instruction from the current gesture data and sending the target control instruction to the target actuator.
  6. The vehicle control method according to claim 1, wherein, before the receiving, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device, the method further comprises:
    upon receiving wake-up gesture data sent by the human-machine interface device, determining a recommended function from the wake-up gesture data, and sending recommendation information for the recommended function to the human-machine interface device for display by the human-machine interface device;
    upon detecting a confirmation operation for the recommendation information, determining the recommended function to be the target function, and generating the activation instruction for the target function.
  7. The vehicle control method according to claim 6, wherein the vehicle stores an activation gesture for each function, and the recommendation information comprises a target activation gesture for the recommended function; the step of determining, upon detecting a confirmation operation for the recommendation information, the recommended function to be the target function and generating the activation instruction for the target function comprises:
    upon detecting that the difference between the actual gesture operation performed on the recommendation information and the target activation gesture is within a fourth preset range, determining the recommended function to be the target function and generating the activation instruction for the target function.
  8. The vehicle control method according to claim 1, wherein the current gesture data comprises a gesture position, a movement speed, a movement direction, or a movement shape, wherein
    the movement shape comprises a curved-surface diameter and a curved-surface center.
  9. A vehicle control method, applied to a human-machine interface device, the human-machine interface device being communicatively connected to a terminal control device in a vehicle, and the vehicle comprising a plurality of actuators, the method comprising:
    upon detecting an activation instruction for a target function, monitoring a gesture operation;
    generating current gesture data from the gesture operation;
    sending the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
  10. The vehicle control method according to claim 9, wherein the generating current gesture data from the gesture operation comprises:
    determining gesture parameters from the gesture operation;
    determining a preliminary gesture from the gesture parameters;
    generating the current gesture data from the gesture parameters and the preliminary gesture.
  11. A vehicle control apparatus, applied to a terminal control device in a vehicle, the terminal control device being communicatively connected to a human-machine interface device, and the vehicle comprising a plurality of actuators, the apparatus comprising:
    a receiving module, configured to receive, upon detecting an activation instruction for a target function, current gesture data sent by the human-machine interface device;
    an instruction module, configured to generate, when the current gesture data meets a preset condition, a target control instruction from the current gesture data, and send the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
  12. A vehicle control apparatus, applied to a human-machine interface device, the human-machine interface device being communicatively connected to a terminal control device in a vehicle, and the vehicle comprising a plurality of actuators, the apparatus comprising:
    a monitoring module, configured to monitor a gesture operation upon detecting an activation instruction for a target function;
    a data generation module, configured to generate current gesture data from the gesture operation;
    a sending module, configured to send the current gesture data to the terminal control device, so that the terminal control device, when the current gesture data meets a preset condition, generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
  13. A vehicle control system, the system comprising a human-machine interface device and a terminal control device of a vehicle, wherein:
    upon detecting an activation instruction for a target function, the human-machine interface device monitors a gesture operation;
    the human-machine interface device generates current gesture data from the gesture operation;
    the human-machine interface device sends the current gesture data to the terminal control device;
    the terminal control device receives the current gesture data sent by the human-machine interface device;
    when the current gesture data meets a preset condition, the terminal control device generates a target control instruction from the current gesture data and sends the target control instruction to a target actuator, so that the target actuator performs the operation corresponding to the current gesture data.
  14. A computing processing device, comprising:
    a memory storing computer-readable code; and
    one or more processors, wherein, when the computer-readable code is executed by the one or more processors, the computing processing device performs the vehicle control method according to any one of claims 1-8, or performs the vehicle control method according to claim 9 or 10.
  15. A computer program, comprising computer-readable code that, when run on a computing processing device, causes the computing processing device to perform the vehicle control method according to any one of claims 1-8, or to perform the vehicle control method according to claim 9 or 10.
  16. A computer-readable medium storing the computer program according to claim 15.
PCT/CN2021/089847 2020-04-30 2021-04-26 一种车辆控制方法、装置及系统 WO2021218900A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/758,204 US20230039339A1 (en) 2020-04-30 2021-04-26 Vehicle control method, apparatus and system
EP21797619.0A EP4067192A4 (en) 2020-04-30 2021-04-26 VEHICLE CONTROL METHOD, DEVICE AND SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010366741.7A CN111645701B (zh) 2020-04-30 2020-04-30 一种车辆控制方法、装置及系统
CN202010366741.7 2020-04-30

Publications (1)

Publication Number Publication Date
WO2021218900A1 true WO2021218900A1 (zh) 2021-11-04

Family

ID=72345974

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/089847 WO2021218900A1 (zh) 2020-04-30 2021-04-26 一种车辆控制方法、装置及系统

Country Status (4)

Country Link
US (1) US20230039339A1 (zh)
EP (1) EP4067192A4 (zh)
CN (1) CN111645701B (zh)
WO (1) WO2021218900A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016448A (zh) * 2022-06-16 2022-09-06 中国第一汽车股份有限公司 一种车辆控制方法、装置、车载终端、车辆及介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111645701B (zh) * 2020-04-30 2022-12-06 长城汽车股份有限公司 一种车辆控制方法、装置及系统

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441635A1 (en) * 2010-10-06 2012-04-18 Harman Becker Automotive Systems GmbH Vehicle User Interface System
CN105807904A (zh) * 2014-10-06 2016-07-27 现代自动车株式会社 用于车辆的人机接口设备及其控制方法
CN106354259A (zh) * 2016-08-30 2017-01-25 同济大学 基于Soli和Tobii的汽车HUD眼动辅助手势交互系统及其装置
CN107640159A (zh) * 2017-08-04 2018-01-30 吉利汽车研究院(宁波)有限公司 一种自动驾驶人机交互系统及方法
CN109552340A (zh) * 2017-09-22 2019-04-02 奥迪股份公司 用于车辆的手势和表情控制
US20190144000A1 (en) * 2017-11-13 2019-05-16 Ford Global Technologies, Llc Method and device to enable a fast stop of an autonomously driving vehicle
DE102018208889A1 (de) * 2018-06-06 2019-12-12 Faurecia Innenraum Systeme Gmbh Steuereinrichtung für ein Fahrzeug und Verfahren zur Steuerung eines Fahrzeugs
CN111645701A (zh) * 2020-04-30 2020-09-11 长城汽车股份有限公司 一种车辆控制方法、装置及系统

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011525283A (ja) * 2008-06-18 2011-09-15 オブロング・インダストリーズ・インコーポレーテッド 車両インターフェース用ジェスチャ基準制御システム
CN101349944A (zh) * 2008-09-03 2009-01-21 宏碁股份有限公司 手势引导系统及以触控手势控制计算机系统的方法
US10613642B2 (en) * 2014-03-12 2020-04-07 Microsoft Technology Licensing, Llc Gesture parameter tuning
CN106843729A (zh) * 2017-01-20 2017-06-13 宇龙计算机通信科技(深圳)有限公司 一种终端操控方法及终端
CN107102731A (zh) * 2017-03-31 2017-08-29 斑马信息科技有限公司 用于车辆的手势控制方法及其系统
CN107493495B (zh) * 2017-08-14 2019-12-13 深圳市国华识别科技开发有限公司 交互位置确定方法、系统、存储介质和智能终端
CN110435561B (zh) * 2019-07-26 2021-05-18 中国第一汽车股份有限公司 一种车辆控制方法、系统和车辆
CN110764616A (zh) * 2019-10-22 2020-02-07 深圳市商汤科技有限公司 手势控制方法和装置

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441635A1 (en) * 2010-10-06 2012-04-18 Harman Becker Automotive Systems GmbH Vehicle User Interface System
CN105807904A (zh) * 2014-10-06 2016-07-27 现代自动车株式会社 用于车辆的人机接口设备及其控制方法
CN106354259A (zh) * 2016-08-30 2017-01-25 同济大学 基于Soli和Tobii的汽车HUD眼动辅助手势交互系统及其装置
CN107640159A (zh) * 2017-08-04 2018-01-30 吉利汽车研究院(宁波)有限公司 一种自动驾驶人机交互系统及方法
CN109552340A (zh) * 2017-09-22 2019-04-02 奥迪股份公司 用于车辆的手势和表情控制
US20190144000A1 (en) * 2017-11-13 2019-05-16 Ford Global Technologies, Llc Method and device to enable a fast stop of an autonomously driving vehicle
DE102018208889A1 (de) * 2018-06-06 2019-12-12 Faurecia Innenraum Systeme Gmbh Steuereinrichtung für ein Fahrzeug und Verfahren zur Steuerung eines Fahrzeugs
CN111645701A (zh) * 2020-04-30 2020-09-11 长城汽车股份有限公司 一种车辆控制方法、装置及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4067192A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016448A (zh) * 2022-06-16 2022-09-06 中国第一汽车股份有限公司 一种车辆控制方法、装置、车载终端、车辆及介质
CN115016448B (zh) * 2022-06-16 2024-05-07 中国第一汽车股份有限公司 一种车辆控制方法、装置、车载终端、车辆及介质

Also Published As

Publication number Publication date
EP4067192A1 (en) 2022-10-05
EP4067192A4 (en) 2023-08-16
CN111645701A (zh) 2020-09-11
CN111645701B (zh) 2022-12-06
US20230039339A1 (en) 2023-02-09

Similar Documents

Publication Publication Date Title
US20220355791A1 (en) Automatic parking method, device, and system
US20230073942A1 (en) Method and apparatus for vehicle control
WO2021218900A1 (zh) 一种车辆控制方法、装置及系统
CN107804321B (zh) 高级自主车辆教程
CN105501227B (zh) 道路紧急激活
CN107000762B (zh) 用于自动实施机动车的至少一个行驶功能的方法
CN113085885A (zh) 一种驾驶模式的切换方法、装置、设备及可读存储介质
CN113359759A (zh) 基于自动驾驶的泊车控制方法、系统、车辆及存储介质
EP3623950B1 (en) System and method for verifying vehicle controller based on virtual machine
CN106873572B (zh) 自动驾驶切断装置以及自动驾驶切断方法和系统
JPWO2019131003A1 (ja) 車両制御装置および電子制御システム
CN114084142A (zh) 拖车控制系统、车辆、拖动控制方法及存储介质
JP7275262B2 (ja) 車両運転権限移転方法および装置
CN115601856B (zh) 自动驾驶系统预期功能安全测试场景确定方法和设备
GB2563871A (en) Control system
CN114844764B (zh) 一种网络安全功能检测的方法及相关设备
CN111505977B (zh) 功能辅助调试方法、功能调试方法、装置、系统及介质
CN115140054A (zh) 车辆驾驶模式的切换方法、切换装置和切换系统
CN111149088B (zh) 用于运行控制器的方法以及具有对应的控制器的设备
RU2798256C1 (ru) Способ автоматической парковки, а также устройства и системы для его реализации.
US11891050B2 (en) Terminal apparatus for management of autonomous parking, system having the same and method thereof
CN111591305B (zh) 驾驶辅助系统的控制方法、系统、计算机设备及存储介质
CN115979671A (zh) 基于时间限制的车辆下线检测方法、系统介质及设备
US9355600B2 (en) Display device with readback display
CN108205303B (zh) 一种车辆电子部件更新方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21797619

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021797619

Country of ref document: EP

Effective date: 20220629

NENP Non-entry into the national phase

Ref country code: DE