CN113840766A - A vehicle control method and device - Google Patents

A vehicle control method and device

Info

Publication number
CN113840766A
CN113840766A · Application CN202180003366.2A
Authority
CN
China
Prior art keywords
vehicle
intention
intent
driver
condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202180003366.2A
Other languages
Chinese (zh)
Other versions
CN113840766B (en)
Inventor
许明霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yinwang Intelligent Technology Co ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN113840766A publication Critical patent/CN113840766A/en
Application granted granted Critical
Publication of CN113840766B publication Critical patent/CN113840766B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The present application discloses a vehicle control method and device that can recognize the driver's driving intention based on the driver's gesture operations and control the driving state of the vehicle according to that intention, thereby providing more flexible driving-intention recognition and driving control for a better driving experience.


Description

Vehicle control method and device
Technical Field
The present disclosure relates to the field of automatic driving, and more particularly, to a method and an apparatus for controlling a vehicle.
Background
According to the Society of Automotive Engineers (SAE) definition of automation levels, automated driving spans six levels, L0 to L5. In low-level scenarios (L2 and below), automated driving is realized by manual driving assisted by an Advanced Driver Assistance System (ADAS), based on the driver's driving intention. In high-level scenarios (L3 to L5), L3 is automated driving performed within a specific area, L4 is highly automated driving, and L5 is fully automated driving; driving at L3 to L5 is performed autonomously by the vehicle, and the driver does not need to concentrate on driving operations. In other words, automated driving, particularly high-level automated driving, frees the driver's hands, feet, and even the driver's thinking about driving.
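The level distinctions above can be summarized in a small lookup table. This is an illustrative sketch only; the short descriptions paraphrase the text rather than the formal SAE J3016 definitions, and the function name is hypothetical.

```python
# Paraphrased summary of the SAE automation levels as described in the text.
SAE_LEVELS = {
    "L0": "no automation",
    "L1": "driver assistance",
    "L2": "partial automation (manual driving plus ADAS)",
    "L3": "automated driving within a specific area",
    "L4": "highly automated driving",
    "L5": "fully automated driving",
}

def driver_attention_required(level: str) -> bool:
    """Per the text: at L3-L5 the vehicle drives autonomously and the driver
    need not concentrate on driving operations."""
    return level in ("L0", "L1", "L2")

assert driver_attention_required("L2")
assert not driver_attention_required("L4")
```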
In existing technologies for automated driving based on the driver's driving intention, the driving intention is recognized mainly from the driver's driving actions, and automated driving is then performed according to the recognized intention. This recognition mode is therefore not flexible enough.
Disclosure of Invention
The application provides a vehicle control method and device, which are used for flexibly recognizing the driving intention of a driver.
The vehicle control method provided by the present application may be executed by an electronic device that supports a vehicle control function; such an electronic device can be abstracted as a computer system. The electronic device supporting the vehicle control function may also be referred to as a vehicle control device. The vehicle control device may be the complete electronic device, or may be a part of it, for example, a chip related to the vehicle control function, such as a system chip or an image chip, where a system chip is also called a system on chip (SoC). Specifically, the vehicle control device may be a terminal device, an in-vehicle apparatus in the vehicle such as an in-vehicle computer or a head unit, or a system chip, an image processing chip, or another type of chip that can be provided in a computer system of the vehicle or the in-vehicle apparatus.
In a first aspect, a vehicle control method is provided. The method comprises the following steps: the vehicle control apparatus acquires a first gesture operation of the driver, determines a first intention of the driver according to the first gesture operation, and controls the driving state of the vehicle according to the first intention. The first gesture operation comprises a touch operation by the driver on a touch screen and/or an air gesture operation by the driver, and the touch operation comprises a tap operation or a drag operation.
By adopting the method, the vehicle control device can flexibly recognize the driving intention of the driver based on the first gesture operation of the driver, and control the driving state of the vehicle according to the driving intention, thereby providing a more flexible driving intention recognition mode and bringing better driving experience.
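The first-aspect flow (acquire gesture, determine intent, control driving state) can be sketched as a minimal pipeline. All names and the gesture-to-intent mapping below are hypothetical illustrations, not the patent's actual implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Intent(Enum):
    OVERTAKE = auto()
    LANE_CHANGE = auto()
    STEER = auto()
    TRAJECTORY = auto()

@dataclass
class GestureOperation:
    kind: str    # "tap", "drag", or "air" (air gesture)
    target: str  # hypothetical UI target, e.g. "vehicle_icon"

def determine_first_intent(op: GestureOperation) -> Intent:
    # Toy mapping: dragging the vehicle icon sketches a driving trajectory,
    # tapping an adjacent lane requests a lane change.
    if op.kind == "drag" and op.target == "vehicle_icon":
        return Intent.TRAJECTORY
    if op.kind == "tap" and op.target == "adjacent_lane":
        return Intent.LANE_CHANGE
    return Intent.STEER

def control_driving_state(intent: Intent) -> str:
    # Stand-in for issuing a command to the automated-driving system.
    return f"command:{intent.name.lower()}"

op = GestureOperation("drag", "vehicle_icon")
assert control_driving_state(determine_first_intent(op)) == "command:trajectory"
```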
In one possible design, the first intent may include at least one of an overtaking intent, a lane-change intent, a steering intent, or a driving-trajectory intent. Multiple types of driving intents can therefore be recognized from the first gesture operation, enabling flexible control of the vehicle.
In one possible design, if the first gesture operation includes a drag operation of the driver on an icon of the vehicle displayed on the touch screen, the vehicle control apparatus may further move a display position of the icon of the vehicle in the touch screen in accordance with a change in a drag trajectory of the drag operation, and/or display the drag trajectory of the drag operation in the touch screen. Therefore, the visualization of the dragging track can be realized, and the user experience is improved.
In one possible design, the vehicle control apparatus may control the driving state of the vehicle according to the drag trajectory when the first intention corresponding to the drag trajectory is allowed to be executed. When the first intention corresponding to the drag trajectory is not allowed to be executed, the vehicle control apparatus may further perform at least one of the following operations: sending a first prompt message, wherein the first prompt message is used for notifying that a first intention corresponding to the dragging operation is not allowed to be executed; or displaying an icon of the vehicle at a first display position, wherein the first display position is the display position of the icon of the vehicle before the drag operation is acquired; or clearing the dragging track displayed in the touch screen. Therefore, when the first intention is allowed to be executed, the control of the driving state of the vehicle can be realized according to the first intention, and when the first intention is not allowed to be executed, the driver can be fed back in time, so that the user experience is improved.
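The allowed/not-allowed branching described above can be sketched as follows. The `ui_state` dictionary and its keys are hypothetical stand-ins for the touch-screen state; which feedback operations are performed (prompt, icon restore, trajectory clear) is a design choice in the patent, and the sketch simply performs all three.

```python
def handle_drag_result(ui_state: dict, allowed: bool) -> dict:
    """Sketch of the feedback described above: follow the trajectory when the
    first intention is allowed, otherwise notify the driver and reset the UI."""
    if allowed:
        ui_state["action"] = "control vehicle along drag trajectory"
    else:
        # Any or all of the following feedback operations may be performed.
        ui_state["prompt"] = "first intention not allowed to be executed"
        ui_state["icon_pos"] = ui_state["icon_pos_before_drag"]  # restore icon
        ui_state["trajectory"] = []                              # clear the track
    return ui_state

state = {"icon_pos": (3, 4), "icon_pos_before_drag": (0, 0),
         "trajectory": [(0, 0), (3, 4)]}
state = handle_drag_result(state, allowed=False)
assert state["icon_pos"] == (0, 0) and state["trajectory"] == []
```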
In one possible design, if the first intention corresponding to the dragging operation includes a driving track intention, when the first intention corresponding to the dragging track is not allowed to be executed, the vehicle control device may further correct the dragging track according to at least one of a traffic regulation condition, a safe driving condition, an environmental condition, or a comfort condition, and display the corrected dragging track in the touch screen, where the corrected dragging track may represent a suggested driving route. Therefore, when the first intention corresponding to the drag operation is not allowed to be executed, the drag trajectory can be corrected according to the first condition and the correction of the drag trajectory can be intuitively fed back to the driver.
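The trajectory correction can be viewed as projecting the drawn route back into an allowed region. In this sketch the "first condition" is reduced to a single lane-boundary check purely for illustration; the real conditions (traffic regulations, safety, environment, comfort) would be far richer, and all numbers are assumptions.

```python
def correct_drag_trajectory(points, lat_min=-1.75, lat_max=1.75):
    """Clamp each point's lateral offset into [lat_min, lat_max], standing in
    for correcting the drag trajectory against traffic-regulation, safe-driving,
    environmental, and comfort conditions. Returns the suggested route."""
    return [(x, min(max(y, lat_min), lat_max)) for x, y in points]

drawn = [(0.0, 0.0), (5.0, 2.5), (10.0, 1.0)]   # 2.5 m strays outside the lane
assert correct_drag_trajectory(drawn) == [(0.0, 0.0), (5.0, 1.75), (10.0, 1.0)]
```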
In one possible design, the vehicle control device may further send a second prompt message, where the second prompt message is used to inquire whether to control the driving state of the vehicle according to the corrected dragging track. The vehicle control apparatus may further acquire a first operation of the driver, wherein the first operation indicates agreement to control the driving state of the vehicle according to the corrected drag trajectory, and control the driving state of the vehicle according to the corrected drag trajectory. Therefore, the driving state of the vehicle can be controlled according to the corrected dragging track according to the feedback of the driver to the second prompt message, so that the success rate of controlling the driving state of the vehicle according to the driving intention of the driver is improved.
In one possible design, the vehicle control apparatus may further determine that the first intention does not satisfy a first condition including at least one of a traffic regulation condition, a safe-driving condition, an environmental condition, or a comfort condition. The vehicle control apparatus may further determine a second intention that satisfies the first condition, based on the first intention, the second intention being executed at a timing different from a timing at which the first intention is executed. The vehicle control device may also control the driving state of the vehicle according to the second intention. Therefore, when the first intention is not allowed to be executed, the first intention can be corrected according to the first condition and the second intention can be obtained, and therefore controlling the driving state of the vehicle according to the second intention can improve the success rate of the driving intention recognition.
In one possible embodiment, the vehicle control device may also send a third prompting message, which is used to inquire whether to control the driving state of the vehicle according to the second intention. The vehicle control apparatus may further acquire a second operation by the driver, the second operation indicating agreement to control the driving state of the vehicle according to a second intention. Therefore, the driving state of the vehicle can be controlled according to the corrected second intention according to the feedback of the driver to the third prompt message, so that the success rate of controlling the driving state of the vehicle according to the driving intention of the driver is improved.
In one possible design, the vehicle control apparatus may further determine that the first intention does not satisfy a first condition including at least one of a traffic regulation condition, a safe-driving condition, or a comfort condition. The vehicle control apparatus may also determine a second intention according to the first condition, the second intention satisfying the first condition. The vehicle control apparatus may further transmit a third prompting message for inquiring whether to control the driving state of the vehicle according to the second intention. A third operation of the driver is acquired, the third operation indicating that the driving state of the vehicle is not approved to be controlled according to the second intention. The vehicle control apparatus may also stop controlling the driving state of the vehicle according to the second intention. Therefore, according to the third operation, whether to perform the vehicle driving state control can be effectively determined according to the operation of the driver.
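One way to read the "second intention with a different execution timing" designs above is a deferred-execution search: keep the same maneuver but find a later time at which the first condition holds. The predicate, step size, and horizon below are all hypothetical; this is a sketch of the idea, not the patent's algorithm.

```python
def plan_second_intent(first_intent, condition_ok, step_s=5.0, horizon_s=60.0):
    """If the first intent fails the first condition now (t=0), search for a
    later execution time at which the condition holds; the second intent is
    then the same maneuver deferred to that time. Returns None if no time is
    found within the horizon. `condition_ok(intent, t)` is a hypothetical
    predicate bundling the traffic/safety/comfort checks."""
    t = 0.0
    while t <= horizon_s:
        if condition_ok(first_intent, t):
            return {"intent": first_intent, "execute_at": t}
        t += step_s
    return None

# Example: overtaking is unsafe until an oncoming car has passed at t = 10 s.
plan = plan_second_intent("overtake", lambda i, t: t >= 10.0)
assert plan == {"intent": "overtake", "execute_at": 10.0}
```

If the driver declines the proposed second intent (the third operation in the design above), the planner's result is simply discarded and no control is issued.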
In a second aspect, the present application provides a vehicle control apparatus that includes a processing module and an input-output module. The input-output module can be used to acquire a first gesture operation of the driver. The processing module may be configured to determine a first intention of the driver based on the first gesture operation, and to control the driving state of the vehicle based on the first intention. The first gesture operation comprises at least one of: a touch operation by the driver on a touch screen, where the touch operation comprises a tap operation or a drag operation; or an air gesture operation by the driver.
In one possible design, the first intent includes at least one of the following intents: an overtaking intent, a lane change intent, a steering intent, or a driving trajectory intent.
In one possible design, if the first gesture operation includes a drag operation of an icon of the vehicle displayed on the touch screen by the driver, the input-output module may be further configured to: moving the display position of the icon of the vehicle in the touch screen along with the change of the dragging track of the dragging operation; and/or displaying a dragging track of the dragging operation in the touch screen.
In one possible design, the processing module may be further configured to control the driving state of the vehicle according to the drag trajectory when the first intention corresponding to the drag trajectory is allowed to be executed, or, when it is not allowed to be executed, to clear the drag trajectory displayed on the touch screen. When the first intention corresponding to the drag trajectory is not allowed to be executed, the input-output module may be further configured to send a first prompt message notifying that the first intention corresponding to the drag operation is not allowed to be executed, and/or to display the icon of the vehicle at a first display position, where the first display position is the display position of the icon of the vehicle before the drag operation was acquired.
In one possible design, the first intention corresponding to the drag operation includes a driving-trajectory intention, and when the first intention corresponding to the drag trajectory is not allowed to be executed, the processing module is further configured to correct the drag trajectory according to at least one of a traffic regulation condition, a safe driving condition, an environmental condition, or a comfort condition. The input-output module may be further configured to display the corrected drag trajectory on the touch screen, where the corrected drag trajectory represents a suggested driving route.
In one possible design, the input/output module may be further configured to send a second prompt message, where the second prompt message is used to inquire whether to control the driving state of the vehicle according to the modified drag trajectory, and the input/output module may be further configured to obtain a first operation of the driver, where the first operation indicates that the driver agrees to control the driving state of the vehicle according to the modified drag trajectory. The processing module can be further used for controlling the driving state of the vehicle according to the corrected dragging track.
In one possible design, the processing module is further operable to: determining that the first intent does not satisfy a first condition, the first condition comprising at least one of a traffic regulation condition, a safe driving condition, an environmental condition, or a comfort condition; the processing module can be further used for determining a second intention according to the first intention, wherein the second intention meets the first condition, and the execution time of the second intention is different from that of the first intention; the processing module is specifically operable to control a driving state of the vehicle in accordance with the second intent.
In one possible design, the input-output module may be further configured to send a third prompting message for inquiring whether to control the driving state of the vehicle according to the second intention, and the input-output module may be further configured to acquire a second operation of the driver, the second operation indicating agreement to control the driving state of the vehicle according to the second intention.
In one possible design, the processing module may be further operative to determine that the first intent does not satisfy a first condition, the first condition including at least one of a traffic regulation condition, a safe driving condition, or a comfort condition, and to determine a second intent according to the first condition, the second intent satisfying the first condition. The input-output module may be further operative to send a third prompt message inquiring whether to control the driving state of the vehicle according to the second intention, and to acquire a third operation of the driver, the third operation indicating disagreement with controlling the driving state of the vehicle according to the second intention. The processing module may then be configured to stop controlling the driving state of the vehicle according to the second intent.
In a third aspect, the present application provides a computing device, comprising a processor, the processor being connected to a memory, the memory storing a computer program or instructions, the processor being configured to execute the computer program or instructions stored in the memory, so as to cause the computing device to perform the method of the first aspect or any one of the possible implementation manners of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program or instructions which, when executed, cause a computer to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, the present application provides a computer program product, which, when executed by a computer, causes the computer to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a chip, which is connected to a memory and is configured to read and execute a computer program or instructions stored in the memory to implement the method in the first aspect or any one of the possible implementation manners of the first aspect.
In a seventh aspect, the present application provides a vehicle including the vehicle control apparatus of the second aspect or any possible implementation of the second aspect, and an execution device, so as to implement the method of the first aspect or any possible implementation of the first aspect.
In an eighth aspect, the present application provides a vehicle comprising the chip and the execution apparatus of the sixth aspect, so as to implement the method of the first aspect or any possible implementation manner of the first aspect.
Based on the technical solution provided by this application, the vehicle control device can determine the first intention of the driver according to the driver's touch operation and/or air gesture operation, and control the driving state of the vehicle in the automated driving state according to the first intention; in other words, the driving intention is used to control the driving state of the vehicle, where the driving state refers to a state related to the driving mode of the vehicle. The solution therefore provides a more flexible driving-intention recognition mode for automated driving scenarios and a better automated driving experience for the driver.
Here, the control of the driving state may be regarded as control in a short time or control in a limited number of times, for example, the vehicle control device detects a touch operation and performs control of the driving state once according to the touch operation.
Further, if the driver's touch operation is a drag operation performed on the touch screen, the vehicle control device may display the drag trajectory of the drag operation on the display screen, and/or move the icon of the vehicle along with the driver's drag operation, so as to visualize the drag operation. If the vehicle control device identifies that the drag trajectory is not allowed to be executed, it can clear the drag trajectory, restore the position of the vehicle icon, or inform the driver that the drag operation, or the driving intention corresponding to it, is not allowed to be executed. In addition, the vehicle control device may correct the drag trajectory according to at least one of a traffic regulation condition, a safe driving condition, an environmental condition, or a comfort condition, and display the corrected drag trajectory on the touch screen; the corrected drag trajectory may represent a suggested driving route, so that the correction is intuitively fed back to the driver. The vehicle control device can also ask the driver whether to control the driving state of the vehicle according to the corrected drag trajectory; if the driver indicates agreement through a gesture or other human-computer interaction, the vehicle control device controls the driving state of the vehicle accordingly, improving the success rate of vehicle control.
In addition, when the vehicle control device recognizes that the first intention does not satisfy at least one of the traffic regulation condition, the safe driving condition, the environmental condition, or the comfort condition, the driver may be prompted that the first intention is not allowed to be executed, thereby providing feedback to the driver regarding the first gesture operation. Further, the driver may also be asked by the vehicle control apparatus whether to execute a second intention, which may be an intention that satisfies the traffic regulation condition, the safe driving condition, the environmental condition, and the comfort condition determined from the first intention, for example, the second intention may be the first intention that is delayed in execution. The driver may thereafter be asked whether to perform the second intention, and if the driver indicates agreement to perform the second intention through a gesture or other human-machine interaction operation, the vehicle control apparatus may control the driving state of the vehicle according to the second intention, thereby improving the vehicle control success rate.
Therefore, the technical scheme provided by the embodiment of the application can support the vehicle control device to flexibly identify the driving intention of the driver, and improve the user experience in the automatic driving scene. In addition, in the implementation of the technical scheme, the vehicle control device can interact with the driver in time according to the conditions of whether the driving intention is allowed to be executed or not and the like, so that the driving safety and the driving compliance are ensured, and the interactive experience and the driving participation sense of the driver are enhanced.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a vehicle control device according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of another vehicle control device provided in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of another vehicle control device provided in the embodiments of the present application;
FIG. 5 is a schematic flow chart illustrating a vehicle control method according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating a vehicle icon display according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a drag trajectory provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of another drag trajectory provided by embodiments of the present application;
FIG. 9 is a schematic diagram of another drag trajectory provided by embodiments of the present application;
FIG. 10 is a schematic diagram of an interface for querying whether to execute a second intent according to an embodiment of the present application;
fig. 11 is a flowchart illustrating another vehicle control method according to an embodiment of the present application.
Detailed Description
The application provides a vehicle control method and device for flexibly recognizing the driving intention of a driver from the driver's gesture operations, with an automated driving algorithm providing driving-safety and driving-compliance guarantees for the vehicle control commands generated from that intention. The method and the device are based on the same technical concept; because the principles by which they solve the problem are similar, the implementations of the device and the method may refer to each other, and repeated descriptions are omitted.
In the method provided by the embodiments of this application, the vehicle control device can determine the first intention of the driver according to the driver's first gesture operation and control the driving state of the vehicle according to the first intention, so as to realize automated driving of the vehicle. The first gesture operation comprises a touch operation by the driver on the touch screen or an air gesture; the touch operation is, for example, a tap operation or a drag operation. In this method, the vehicle is controlled based on the driver's human-computer interaction, and the vehicle control device recognizes the driver's driving intention from that interaction, thereby realizing flexible recognition of the driving intention.
Hereinafter, portions of the present application are explained to facilitate understanding by those skilled in the art.
1. Vehicle
The present application is applicable to an autonomous vehicle (hereinafter, simply referred to as a vehicle), and particularly to a vehicle having a Human Machine Interface (HMI) function, a function of calculating and judging a driving state of the vehicle by an autonomous driving algorithm, and a function of controlling a motion of the vehicle.
Optionally, the vehicle may include at least one autopilot system to support autopilot of the vehicle.
It should be understood that the vehicle may be replaced by another conveyance such as a train, an aircraft, or a mobile platform, according to actual use requirements. This is not limited in this application.
2. Vehicle control device
The vehicle control device can be used to support the vehicle in implementing the method provided by the embodiments of this application. The vehicle control device may support a human-computer interaction function, a function of calculation and judgment about the driving state of the vehicle by an automated driving algorithm, and a function of motion control of the vehicle.
Alternatively, the vehicle control device may be integrated with the vehicle, for example, the vehicle control device may be disposed in the vehicle. Alternatively, a separate arrangement may be employed, with remote communication supported between the vehicle control device and the vehicle.
The vehicle control device may be implemented in the form of a terminal device and/or a server. The terminal device may be, for example, a mobile phone, a computer, a tablet computer, or a vehicle-mounted device; the server may be a cloud server or the like. Taking a terminal device as an example, it can provide the following functions: collecting the user's gesture operations or other human-computer interactions, querying or prompting the user, and controlling the driving state of the vehicle according to the gesture operations input by the user. If the separate arrangement is adopted, the driving state of the vehicle can be controlled based on communication between the terminal device and the vehicle. For example, as shown in fig. 1, the vehicle control device is a terminal, and the terminal assists in recognizing the driving intention of the driver and controlling the driving state of the vehicle. It should be understood that "input" in this application refers to information transfer from the driver to the vehicle control device, and "output" means that the vehicle control device conveys information to the driver.
For example, fig. 2 shows a schematic diagram of a possible vehicle control device, which may include a processing module 210 and an input/output module 220. Illustratively, the structure shown in fig. 2 may be a terminal device, or a functional part having the vehicle control apparatus shown in the present application. When the structure is a terminal device or other electronic devices, the input/output module 220 may include a device (or called human-computer interaction device) for supporting a human-computer interaction function, such as a touch screen, a camera, a speaker, a microphone, or an ultrasonic sensor, or a module for supporting communication with a device for supporting a human-computer interaction function, such as a touch screen, a camera, a speaker, a microphone, or an ultrasonic sensor. The processing module 210 may be a processor, such as a Central Processing Unit (CPU). When the structure is a functional component having the vehicle control device shown in the present application, the input/output module 220 may be an interface circuit for communicating with a device such as a touch screen, a camera, a millimeter wave radar, a speaker, a microphone, or an ultrasonic sensor for supporting a human-computer interaction function, and the processing module 210 may be a processor. When the structure is a chip or a system of chips, the input-output module 220 may be an input-output interface of the chip, and the processing module 210 may be a processor of the chip, and may include one or more central processing units. It should be understood that the processing module 210 in the embodiment of the present application may be implemented by a processor or a processor-related circuit component, and the input/output module 220 may be implemented by a transceiver, a device for supporting a human-computer interaction function, or a related circuit component.
For example, the processing module 210 may be used to perform all operations performed by the vehicle control apparatus in any embodiment of the present application except input/output operations, such as the processing operations of identifying the driving intent, determining whether the driving intent is allowed to be executed, and controlling the driving state of the vehicle, and/or to support other processes in the embodiments described herein, such as generating the messages, information, and/or signaling output by the input/output module 220 and processing the messages, information, and/or signaling input through the input/output module 220. The input/output module 220 may be used to perform all input and output operations performed by the terminal device and/or server in any embodiment of the present application, such as carrying out human-machine interaction through a device supporting the human-machine interaction function (or communicating with such a device), or communicating with a device that controls the driving state of the vehicle, and/or to support other processes in the embodiments described herein.
In addition, the input/output module 220 may be a single functional module capable of performing both output and input operations: when performing a sending operation, it may be regarded as a sending module, and when performing a receiving operation, it may be regarded as a receiving module. Alternatively, the input/output module 220 may comprise two functional modules, in which case the input/output module 220 is a general term for the two: a sending module used to complete sending operations and a receiving module used to complete receiving operations.
Fig. 3 is a schematic structural diagram, provided for ease of understanding and illustration, of another vehicle control device for executing the actions performed by the vehicle control device according to the embodiments of the present application. As shown in fig. 3, the vehicle control device may include at least one of a processor, a memory, an interface circuit, or a human-machine interaction device. The processor is mainly used to implement the processing operations provided by the embodiments of the present application, such as controlling the vehicle control device, executing software programs, and processing software-program data. The memory is mainly used to store software programs and data. The human-machine interaction device can be used to support human-machine interaction and may include a touch screen, a camera, a millimeter-wave radar, a speaker, a microphone, or an ultrasonic sensor, with functions including, but not limited to, acquiring operations such as gestures input by the driver and outputting (or presenting) information to the driver. The interface circuit may be used to support communication with the human-machine interaction device, for example when that device is externally connected to the vehicle control device (i.e., the vehicle control device does not include the human-machine interaction device). The interface circuit may include a transceiver or an input/output interface.
It should be understood that, for ease of illustration, only one memory and one processor are shown in fig. 3. An actual vehicle control device may have one or more processors and one or more memories. The memory may also be referred to as a storage medium, a storage device, or the like. The memory may be provided independently of the processor or integrated with the processor, which is not limited in this embodiment.
Fig. 4 shows another vehicle control device according to an embodiment of the present application. As shown in fig. 4, the vehicle control device may include a human-machine interaction module, a human-machine interaction control module, a decision planning calculation module, and a whole-vehicle motion control module.
The human-machine interaction module can be used to implement input and output interaction with the driver in the vehicle; its specific form depends on the interaction mode. For example, if interaction is implemented through touch, the human-machine interaction module may include a touch screen; if it is implemented through air-gesture recognition, it may include a camera, a millimeter-wave radar, an ultrasonic sensor, or the like; and if it is implemented through semantic recognition, it may include a microphone and a speaker. Input and output may of course also be mixed, such as input through a microphone and output through a touch screen. There may also be multiple input and output modes; for example, the input mode may combine touch gestures and air gestures, and/or the output mode may combine displaying on the display screen with playing voice, sound, or music, which is not limited in this application.
The human-machine interaction module can also be used to notify the human-machine interaction control module of a detected human-machine interaction operation. For example, the human-machine interaction module can identify the driver's human-machine interaction operation according to the predefined features corresponding to that operation, and send information about the detected operation to the human-machine interaction control module, so that the human-machine interaction control module can identify the operation from that information. It should be understood that the human-machine interaction module is at least operable to detect the first gesture operation and the human-machine interaction operations referred to in this application.
For example, if the human-machine interaction operation is a touch operation, the human-machine interaction module may send the touch operation information acquired by the touch screen to the human-machine interaction control module; this information may include the type of touch operation, its position on the touch screen, the drag trajectory, the drag duration, and interactive content such as the characters and symbols displayed on the display screen. If the human-machine interaction operation is voice, the human-machine interaction module can send the voice signal to the human-machine interaction control module, or can extract the semantics of the voice signal and send those semantics instead. If the human-machine interaction operation is an air gesture, the human-machine interaction module can send the video signal acquired by the camera, the detection signal of the ultrasonic sensor, or a photo signal to the human-machine interaction control module; alternatively, the human-machine interaction module can extract description information of the air gesture from the video and/or photo signal (for example, information describing the gesture as a nod or a hand wave) and send that description information to the human-machine interaction control module.
The human-machine interaction module may also be used to output (or present) information in order to ask and/or inform the driver, for example by displaying symbols or words on a display screen, or by playing voice, sound, or music through a speaker. If the purpose of the output (or presented) information is to ask the driver, the human-machine interaction module may also detect the driver's feedback to the inquiry, for example whether the driver has made a human-machine interaction operation representing feedback within a period of time (e.g., 10 seconds or 20 seconds) after the inquiry is made, and send information about that operation to the human-machine interaction control module so that the human-machine interaction control module learns the driver's feedback result, which may represent, for example, agreement or disagreement with the content of the inquiry.
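The inquiry-feedback window described above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the event representation, the answer labels, and the 10-second default are all hypothetical.

```python
# Sketch of the query-feedback window: after an inquiry, only a
# confirming/rejecting interaction made within a fixed window (e.g. 10 s)
# counts as feedback. Events are assumed to be (timestamp, answer) pairs.
def feedback_within_window(query_time, events, window_s=10.0):
    """Return the first 'agree'/'disagree' answer made inside the window
    after `query_time`, or None if the driver gave no timely feedback."""
    for t, answer in events:
        if query_time <= t <= query_time + window_s and answer in ("agree", "disagree"):
            return answer
    return None
```

A None result would be treated as "no feedback", after which the device could repeat the inquiry or discard the pending intent.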
The human-machine interaction control module can be used to recognize the driving intent input through human-machine interaction and forward the recognized intent to the decision planning calculation module. Optionally, the human-machine interaction control module may be further configured to forward information from the decision planning calculation module that needs to be output (or presented) to the driver/user by the human-machine interaction module.
For example, the human-machine interaction control module may identify the driving intent corresponding to a human-machine interaction operation according to the description information sent by the human-machine interaction module. When the driver needs to be queried, the driver can be queried through the human-machine interaction module according to the information from the decision planning calculation module.
The decision planning calculation module can be used to judge whether the driving intent determined by the human-machine interaction control module complies with the rules of automated driving, and to adjust the driving intent when adjustment is needed. The decision planning calculation module can also be used to determine a control command according to the driving intent, so that the whole-vehicle motion control module controls the vehicle according to the control command.
The human-machine interaction control module may also be referred to as an autonomous-driving brain or an autonomous-driving system, and may include a chip for executing autonomous-driving algorithms, such as an artificial intelligence (AI) chip, a graphics processing unit (GPU) chip, or a central processing unit (CPU), or a system formed by several such chips, which is not limited in this embodiment.
The whole-vehicle motion control module can be used to control the driving state of the vehicle according to the control command from the decision planning calculation module, thereby implementing the control of the driving state designed in the embodiments of the present application.
It should be understood that the vehicle control device in the present application may be replaced with an electronic device, a vehicle, an in-vehicle device, or the like; that is, the operations performed by the vehicle control device may be implemented by an electronic device, a vehicle, an in-vehicle device, or a similar entity.
The method provided by the embodiments of the present application is explained below with reference to fig. 5. The method may be performed by a vehicle control device, which may include any one or more of the structures shown in figs. 2-4. When the vehicle control method is implemented, the processing actions in the method may be implemented by the processing module 210 shown in fig. 2, the processor shown in fig. 3, or the human-machine interaction control module, decision planning calculation module, and whole-vehicle motion control module shown in fig. 4; these actions include, but are not limited to, recognizing a gesture or human-machine interaction operation, determining the driving intent, and controlling the vehicle driving state according to the driving intent. The interaction between the vehicle control device and the driver can be realized by the input/output module shown in fig. 2, or by the interface circuit or human-machine interaction device shown in fig. 3; these interactions include, but are not limited to, obtaining gestures or human-machine interaction operations of the driver, and displaying or presenting information to the driver, so as to implement driver-facing actions such as display, notification, or inquiry.
S101: the vehicle control device acquires a first gesture operation of a driver.
The first gesture operation comprises at least one of a touch operation acquired through the touch screen or an air gesture operation of the driver. The touch operation may be a tap operation and/or a drag operation performed by the driver on the touch screen. The air gesture operation includes actions made by the driver with body parts such as the hands or head, including but not limited to waving a hand, making a fist, nodding, blinking, or shaking the head.
It should be understood that the first gesture operation may be one or more of the operations exemplified above; for example, it may be a touch operation or an air gesture operation alone, or a combination of several such operations.
If the first gesture operation is a touch operation, the vehicle control device may identify the type of the driver's touch on the touch screen. The touch operation may include a tap-type operation and/or a drag-type operation. A tap-type operation may be a single-click or a double-click. For a drag-type operation, the touch screen can also identify information such as the trajectory and/or speed of the drag. It should be appreciated that an icon of the vehicle may be displayed on the display screen, and the driver may input the touch operation by tapping or dragging the icon. As shown in fig. 6, the touch screen may display an icon representing the vehicle together with the current driving state of the vehicle, such as the heading direction, the lane the vehicle is in, and/or the driving speed. The driver can then perform the first touch operation on the vehicle displayed on the touch screen.
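Distinguishing single-clicks from double-clicks, as mentioned above, can be sketched as follows. This is a minimal illustrative sketch; the 0.3-second pairing threshold and the function name are assumptions, not values from the patent.

```python
# Sketch of tap-type classification: two taps closer together than a
# threshold interval are paired into one double-click event.
def classify_taps(tap_times, double_click_window=0.3):
    """tap_times: sorted tap timestamps (seconds).
    Returns a list of 'double_click'/'single_click' events."""
    events, i = [], 0
    while i < len(tap_times):
        if i + 1 < len(tap_times) and tap_times[i + 1] - tap_times[i] <= double_click_window:
            events.append("double_click")
            i += 2  # consume both taps of the pair
        else:
            events.append("single_click")
            i += 1
    return events
```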
If the first gesture operation is an air gesture operation, the vehicle control device can recognize the driver's air gesture through a camera, a millimeter-wave radar, an ultrasonic sensor, or the like. Different air gestures may be preset with different meanings; for example, nodding or making a fist may indicate agreement, while shaking the head or waving a hand may indicate disagreement.
Here, acquiring the air gesture through the camera is taken as an example to describe one recognition method. The camera can collect consecutive frames of images, and the position-change features of characteristic pixel points are determined by identifying those points across the frames; for example, the characteristic pixel points may be edge images of the user's fingers, palm, or face obtained through an edge-detection algorithm. If the position-change feature matches a preset feature, the meaning of the air gesture can be determined to be the preset meaning corresponding to that feature. For example, if the characteristic pixel points include edge images of the user's fingers and palm, and across the frames the finger edges become increasingly curled and approach the palm, the air gesture can be recognized as a fist, and its meaning determined to be the preset meaning corresponding to a fist.
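The fist-recognition idea above can be sketched as follows. This is an illustrative sketch under stated assumptions: the per-frame input (tracked fingertip points plus a palm centre), the monotonic-shrink test, and the 0.5 shrink ratio are hypothetical choices, not the patent's algorithm.

```python
# Sketch of frame-based fist detection: characteristic points (fingertip
# edges) are tracked across frames, and a fist is inferred when their mean
# distance to the palm centre shrinks steadily to a fraction of its
# initial value (fingers curling in toward the palm).
from math import hypot

def mean_fingertip_distance(fingertips, palm_center):
    """Average Euclidean distance from tracked fingertip points to the palm centre."""
    px, py = palm_center
    return sum(hypot(x - px, y - py) for x, y in fingertips) / len(fingertips)

def detect_fist(frames, shrink_ratio=0.5):
    """frames: list of (fingertip_points, palm_center) per video frame.
    True when the fingertip-to-palm distance decreases monotonically and
    ends below `shrink_ratio` of its initial value."""
    dists = [mean_fingertip_distance(f, p) for f, p in frames]
    monotonic = all(b <= a + 1e-6 for a, b in zip(dists, dists[1:]))
    return monotonic and dists[-1] <= shrink_ratio * dists[0]
```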
S102: the vehicle control device determines a first intention of the driver based on the first gesture operation. The first intention may also be referred to as a driving intention for indicating an intention of the driver for the driving state control of the vehicle.
Optionally, the first intent comprises at least one of an acceleration intent, a deceleration intent, a parking intent, a passing intent, a lane-change intent, a steering intent, a driving-trajectory intent, a drift intent, a following intent, or an acceleration-rate intent (i.e., an intent concerning the vehicle's acceleration value).
It should be understood that in S101, the driver may express the driving intention through the first gesture operation, for example, when the driver wants the vehicle to perform the passing action, the driver may perform the first gesture operation corresponding to the passing intention, and in S102, the vehicle control device may recognize the intention of the driver as the passing intention according to the first gesture operation.
The vehicle control apparatus may also ask the driver whether to execute the first intention after recognizing the first intention, so as to avoid a driving intention recognition error. For example, after obtaining the first intention, the vehicle control apparatus may inquire whether the driver agrees to execute the first intention, or in other words, whether the driver controls the driving state of the vehicle in accordance with the first intention.
When inquiring whether the driver agrees to execute the first intent, the vehicle control device can display the words corresponding to the first intent on the display screen, or play the corresponding voice to the driver through the speaker, then detect the driver's human-machine interaction operation and judge from it whether the driver agrees to execute the first intent.
A human-machine interaction operation indicating that the driver agrees to execute the first intent includes, for example, a touch operation on a virtual key displayed on the touch screen that indicates agreement, answering with a voice containing words such as "agree" or "confirm", or making an action or gesture indicating agreement, such as nodding. A human-machine interaction operation indicating that the first intent should not be executed includes, for example, a touch operation on a virtual key indicating disagreement, answering with a voice containing words such as "disagree" or "cancel", or making an action or gesture such as shaking the head or waving a hand.
By way of example, the correspondence between several human-computer interaction operations and driving intentions is illustrated here to facilitate understanding of the present application.
(1) For a touch operation, the vehicle control device may identify the corresponding intent according to the position information of the operation. For example, if the touch operation is performed by the driver on a specific area, the vehicle control device may identify which human-machine interaction operation it is by combining the touch position with the meaning of that specific area.
For example, the touch screen may display at least one virtual key corresponding to the driving intention, and when the user performs a touch operation on the virtual key displayed on the touch screen, the driving intention expressed by the touch operation is the driving intention corresponding to the virtual key.
A virtual key can display its corresponding driving intent by means of text, an icon, or the like. Taking an acceleration intent as an example, as shown in fig. 6, a virtual key may display information such as "accelerate" and/or an acceleration value such as "5 kilometers per hour (km/h)". For example, after the driver taps an "accelerate" virtual key on the touch screen, the screen may display one virtual key for an acceleration value of 5 km/h and another for 10 km/h; when the driver taps the 5 km/h key, the vehicle control device recognizes the first intent as an acceleration intent with a value of 5 km/h.
In addition, the virtual keys displayed on the touch screen can correspond to speed-change values represented by a number and a sign: a positive number on a virtual key represents an acceleration intent, and a negative number represents a deceleration intent. For example, when the driver taps a virtual key labeled "-5 km/h", the first intent is a deceleration intent with a deceleration value of 5 km/h.
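The signed virtual-key mapping above can be sketched as follows. The `Intent` tuple and function name are illustrative assumptions introduced for this sketch.

```python
# Sketch of the signed-key mapping: a positive key value means an
# acceleration intent, a negative value a deceleration intent.
from collections import namedtuple

Intent = namedtuple("Intent", ["kind", "delta_kmh"])

def intent_from_key(key_value_kmh: float) -> Intent:
    """Map the speed-change value shown on a virtual key to a driving intent."""
    if key_value_kmh > 0:
        return Intent("accelerate", key_value_kmh)
    if key_value_kmh < 0:
        return Intent("decelerate", -key_value_kmh)
    return Intent("keep_speed", 0.0)
```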
Similarly, virtual keys corresponding to driving intentions such as a deceleration intention, a parking intention, a passing intention, a drifting intention, a following intention and/or a lane changing intention can also be set in a similar manner.
It should be understood that in actual use, the virtual keys may be replaced by virtual areas, icons, text or windows, etc.
For example, the driving intention may be input by touching a vehicle icon or a blank display area as shown in fig. 6. For example, the driver touches a blank area in the vehicle front direction of the vehicle icon shown in fig. 6, and the corresponding driving intention is an intention to accelerate according to a set threshold value. On the contrary, if the driver touches a blank area in the vehicle tail direction of the vehicle icon, the deceleration intention is represented. The driving intention which is expected to be realized when the driver performs single-click and/or double-click operation on the vehicle icon can be determined through manual setting or predefining and the like, so that the driving intention can be conveniently input.
(2) For a drag operation, the vehicle control device may identify the corresponding intent according to information such as the trajectory and acceleration of the drag. An acceleration intent, deceleration intent, parking intent, passing intent, lane-change intent, steering intent, driving-trajectory intent, drift intent, following intent, or acceleration-rate intent, among others, may be determined from the drag operation.
Alternatively, as shown in fig. 7, a drag trajectory of the drag operation by the driver may be displayed on the display screen. For example, when the driver drags an icon of the vehicle, the display screen may display a drag trajectory along which the icon is dragged.
In addition, optionally, after the drag operation on the icon is started, the display screen may display the icon at a first position where the finger of the driver is located at a first time, and as the drag operation is performed, the display screen may display the icon at a second position where the finger of the driver is located at a second time, where the first time is different from the second time, and the first position is different from the second position. That is, the display screen may move the display position of the vehicle icon in the display screen as the driver's dragging trajectory changes.
The manner in which the vehicle control apparatus recognizes the first intention from the drag trajectory will be described below by way of example.
As shown in fig. 7, suppose the driver drags the vehicle icon to a lane other than the current lane. Taking a capacitive touch screen as an example, the position touched by the finger (in coordinate form) is detected by the touch-screen controller and sent to the CPU through an interface (e.g., an RS-232 serial port), thereby determining the input information. The vehicle control device judges that the first intent is a lane-change intent from the coordinate position of the lane line displayed on the touch screen, the initial vehicle position coordinates detected by the touch screen, and the dragged vehicle position coordinates, and can then execute the lane-change operation according to that intent. As another example, if the driver drags the vehicle icon in the same direction as the driving direction (i.e., the direction of the vehicle head), the CPU can record the coordinates and times of the initial and final drag positions, and obtain the drag speed by dividing the distance between the coordinate points by the time difference. If the drag speed is higher than the current vehicle speed, the first intent is an acceleration intent; if it is lower, the first intent is a deceleration intent.
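The drag-speed and lane-change judgments above can be sketched as follows. This is an illustrative sketch: the `(x, y, t)` sample format, the `lane_of` helper, and the direct comparison of on-screen drag speed with vehicle speed (ignoring any pixels-to-km/h scaling) are all assumptions.

```python
# Sketch of drag-trajectory interpretation: drag speed is the distance
# between the first and last touch samples divided by the elapsed time;
# a drag ending in another lane is read as a lane-change intent.
from math import hypot

def drag_speed(start, end):
    """start/end: (x, y, t) touch samples. Returns drag speed in units/s."""
    (x0, y0, t0), (x1, y1, t1) = start, end
    return hypot(x1 - x0, y1 - y0) / (t1 - t0)

def classify_drag(start, end, current_speed, lane_of=None):
    """Return 'lane_change' if the drag ends in a different lane (per the
    hypothetical lane_of helper), else 'accelerate'/'decelerate' by
    comparing the drag speed with the current vehicle speed."""
    if lane_of is not None and lane_of(start[:2]) != lane_of(end[:2]):
        return "lane_change"
    return "accelerate" if drag_speed(start, end) > current_speed else "decelerate"
```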
The vehicle control apparatus may further obtain a travel track intention, that is, a driving intention of controlling the travel track of the vehicle in accordance with (or based on) a drag track of the driver on the display screen.
The vehicle control device may also determine an acceleration-rate intent according to the acceleration of the drag operation: if the drag indicates that the driver expects the vehicle to travel with (or based on) the acceleration of the drag, the determined intent is to control the vehicle's acceleration based on the acceleration of the driver's drag operation.
The vehicle control device may also determine an acceleration or deceleration intent according to the direction of the drag. For example, if the angle between the direction of the drag trajectory and the direction of the vehicle head is less than 90 degrees, an acceleration intent is indicated, and the speed-change value may be a default or a value the driver inputs by other means. If the angle is greater than 90 degrees, a deceleration intent is indicated, and the speed-change value may likewise be a default or a value input by the driver in another way.
The speed and/or acceleration corresponding to the drag operation can be displayed on the display screen, and the driver controls their increase or decrease by, for example, changing the drag speed; when the driver finishes the drag, the speed and/or acceleration shown on the screen is the value the driver wants the vehicle to reach. If that speed is greater than the current driving speed, the intent corresponding to the drag is an acceleration intent; if it is less, a deceleration intent. Likewise, if that acceleration is greater than the vehicle's current acceleration, the corresponding intent is to increase the acceleration; if it is less, to decrease it.
As another example, a drift intent may be identified from the shape of the drag trajectory; one possible implementation is shown in fig. 8. The driver draws a line on the touch screen ahead of the vehicle: the trajectory makes a small but distinct turn in one direction, then turns sharply in the other direction while keeping a high speed, and finally stops abruptly, like a sudden brake. The vehicle control device reads the drag trajectory from the touch screen and checks three features: A) after turning in one direction, the trajectory quickly turns in the other direction; B) the angle between the final direction of the trajectory and the original heading direction, measured toward the other direction, is in the range of 90 to 120 degrees; C) the trajectory speed decreases rapidly and finally stops. If all three features are satisfied, the first intent is judged to be a drift intent. A prompt such as "Drift?" can then be popped up, and after the user clicks "Confirm", the drift intent is confirmed.
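The three drift features can be sketched over a drag trajectory given as per-sample heading and speed values. This is only an illustrative sketch under stated assumptions: the sample representation, the near-zero stop test, and the 0.2 speed-drop factor are hypothetical, not the patent's thresholds.

```python
# Sketch of the three drift-trajectory features:
#   A) a turn in one direction followed by a turn the other way,
#   B) final direction deviating 90-120 degrees from the original heading,
#   C) speed dropping rapidly and ending at (near) zero.
def signed_turns(headings):
    """Per-step heading changes in degrees (positive/negative = direction)."""
    return [b - a for a, b in zip(headings, headings[1:])]

def is_drift_trajectory(headings, speeds):
    turns = signed_turns(headings)
    # A) sign change between consecutive turns = reversal of turning direction
    reversed_turn = any(t1 * t2 < 0 for t1, t2 in zip(turns, turns[1:]))
    # B) final deviation from the original heading within 90-120 degrees
    final_dev = abs(headings[-1] - headings[0])
    dev_ok = 90 <= final_dev <= 120
    # C) trajectory speed ends near zero, far below its peak
    speed_ok = speeds[-1] < 1 and speeds[-1] < 0.2 * max(speeds)
    return reversed_turn and dev_ok and speed_ok
```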
As another example, a following intent can be expressed through the drag trajectory on the touch screen, so that driving states of the vehicle such as the driving trajectory, acceleration, and deceleration are controlled, while remaining compliant, to be similar to those of the vehicle ahead; for example, the vehicle accelerates and decelerates at positions close to those of the vehicle ahead, or turns at the same intersection. One possible implementation: the driver double-clicks the vehicle icon and then, without the finger leaving the touch screen, drags the icon until it overlaps the icon of the vehicle ahead, as shown in fig. 9; the finger then leaves the touch screen (i.e., releases), and the driver double-clicks at the position of the front vehicle's icon. The vehicle control device reads the user's touch operations from the touch screen and checks three features: A) the vehicle icon is double-clicked and dragged; B) when released, the vehicle icon overlaps another vehicle's icon; C) a double-click follows the release. On this basis it anticipates that the user wants to follow the designated front vehicle, pops up a prompt such as "Follow this vehicle?", and after the user clicks "Confirm", the following intent is confirmed.
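The follow-vehicle interaction above can be sketched as an event-sequence check. The event names, the `icons_overlap` helper, and the requirement that the final double-click land exactly at the release position are illustrative assumptions.

```python
# Sketch of the follow-vehicle gesture check: double-click the ego icon,
# drag it onto another vehicle's icon, release, then double-click at the
# release point.
def is_follow_gesture(events, icons_overlap):
    """events: list of (kind, position) tuples in order of occurrence.
    icons_overlap(pos): hypothetical helper telling whether the ego icon
    at `pos` overlaps another vehicle's icon."""
    kinds = [k for k, _ in events]
    if kinds != ["double_click", "drag", "release", "double_click"]:
        return False
    release_pos = events[2][1]
    final_click_pos = events[3][1]
    return icons_overlap(release_pos) and release_pos == final_click_pos
```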
It should be understood that the above description of recognizing the driving intent from a drag operation is merely exemplary, and should not be construed as limiting the scope of protection of the present application to the above examples.
(3) For an air gesture operation, the vehicle control device can recognize the driver's air gesture collected by a camera, a millimeter-wave radar, an ultrasonic sensor, or the like, and identify the first intent corresponding to that gesture, thereby obtaining the first intent. For example, if the driver gestures collected by the camera include a gesture indicating acceleration, the vehicle control device may recognize the first intent as an acceleration intent according to that gesture.
In a specific implementation, a correspondence between air gestures and driving intents may be set. Since current air-gesture recognition technology can recognize both the shape and the movement trajectory of a gesture, correspondences between gesture shapes and driving intents can be predefined, such as pointing left for a left lane change, pointing right for a right lane change, pointing forward for acceleration, and pointing backward for deceleration; and/or correspondences between gesture movement trajectories and driving intents can be predefined, such as moving the palm left for a left lane change, moving the palm right for a right lane change, moving the palm forward for acceleration, and moving the palm backward for deceleration. As another example, the driver may mime holding a steering wheel with both hands and turning it in the air: turning left indicates a left lane change, turning right a right lane change, and so on. The technical means of air-gesture recognition is not limited; for example, a camera or a millimeter-wave radar can be used. When the vehicle control device identifies an air gesture corresponding to a driving intent, it recognizes the first intent as the intent corresponding to that gesture.
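The predefined gesture-to-intent correspondence above can be sketched as a simple lookup table. The table entries follow the examples in the text; the label strings themselves are illustrative assumptions.

```python
# Sketch of a predefined gesture-to-intent table; unrecognised gestures
# map to no intent.
GESTURE_INTENTS = {
    "point_left": "lane_change_left",
    "point_right": "lane_change_right",
    "point_forward": "accelerate",
    "point_backward": "decelerate",
    "palm_move_left": "lane_change_left",
    "palm_move_right": "lane_change_right",
}

def intent_for_gesture(gesture: str):
    """Return the predefined driving intent for a recognised air gesture,
    or None when the gesture has no assigned meaning."""
    return GESTURE_INTENTS.get(gesture)
```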
S103: the vehicle control device controls a driving state of the vehicle according to the first intention.
The driving state is a state related to the driving behavior of the vehicle, as distinguished from states unrelated to driving, such as the in-vehicle entertainment system (for example, the audio system or a video device) or light control. The driving behavior includes, for example, accelerating, decelerating, reversing, stopping, whether to change speed, whether to change the driving trajectory, whether to overtake, whether to steer, whether to drift, whether to follow another vehicle, or whether to change lanes.
For example, the vehicle control device may control the driving state of the vehicle a limited number of times (for example, once) according to the first intention, or control the vehicle for a limited period of time (for example, one minute). That is, the number of times the first intention takes effect, or the period for which it remains in effect, may be set.
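The limited validity described above can be sketched as an intent that expires after a set number of uses or a set duration. This is a minimal sketch under assumed names; the patent does not prescribe this structure.

```python
import time

class TimedIntent:
    """Hypothetical sketch: an intent valid for a limited number of
    executions (e.g., once) or a limited time period (e.g., 60 s)."""

    def __init__(self, name, max_uses=1, valid_seconds=60.0, now=None):
        self.name = name
        self.remaining_uses = max_uses
        start = now if now is not None else time.monotonic()
        self.expires_at = start + valid_seconds

    def try_execute(self, now=None):
        """Consume one use if the intent is still valid; return success."""
        t = now if now is not None else time.monotonic()
        if self.remaining_uses <= 0 or t > self.expires_at:
            return False  # exhausted or expired: intent no longer in effect
        self.remaining_uses -= 1
        return True
```

Passing `now` explicitly keeps the sketch testable; a real system would use the clock directly.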
It is understood that the vehicle control apparatus ensures safety and compliance when controlling the driving state of the vehicle according to the first intention. For example, in S103 the vehicle control device may control the vehicle based on the first intention in accordance with autonomous driving rules.
By adopting the flow of fig. 5, the vehicle control device can flexibly recognize the driving intention based on the driver's gesture operation and control the driving state of the vehicle accordingly, thereby providing a more flexible way of recognizing driving intention and a better driving experience.
Optionally, in S103, the vehicle control apparatus may determine whether the first intention is allowed to be executed: a first intention that meets a first condition is allowed to be executed, and one that does not is not.
For example, the vehicle control apparatus may determine whether the first intention meets the first condition; if so, the first intention is allowed to be executed, otherwise it is not. The first condition includes, but is not limited to, at least one of a traffic regulation condition, a safe driving condition, or a comfort condition. Accordingly, when it determines that the first intention does not meet the first condition, the vehicle control apparatus may refuse to control the vehicle according to the first intention, so as to ensure driving safety, compliance, and comfort.
The traffic regulation conditions include, but are not limited to: no crossing of double solid lines, no driving on diversion (channelizing) markings, no prolonged straddling of lane lines, and the maximum and minimum speed limits of the road.
The safe driving condition is a condition intended to further improve driving safety on the premise that the traffic regulation conditions are satisfied. It may be related to the vehicle's current condition information, current road information, or the like. For example, safe driving conditions include keeping a safe driving distance from pedestrians, other vehicles, traffic facilities such as guardrails, and road structures. In addition, to ensure safe driving, additional maximum and minimum speed limits may be set for the vehicle itself.
The comfort condition is a driving condition specified to meet the driver's and passengers' requirements for a comfortable ride: for example, limiting the vehicle's acceleration to a set range to avoid discomfort caused by abrupt acceleration or deceleration, or requiring that the vehicle's speed not exceed a set value while steering. The comfort condition may reflect the general comfort requirements of the public, or may be set according to the individual comfort requirements of a driver or passenger.
In one possible implementation, when the vehicle control device recognizes that the first intention does not meet the first condition, it stops execution of the first intention, that is, it stops controlling the driving state of the vehicle according to the first intention. At this time, the vehicle control apparatus may inform the driver that controlling the driving state of the vehicle according to the first intention is refused, or that execution of the control instruction corresponding to the first intention is refused.
For example, when the driver indicates a lane change intention through a drag operation, if the lane change indicated by the drag trajectory is not permitted, the vehicle control apparatus stops controlling the driving state of the vehicle according to that intention. For instance, if the driver drags the vehicle icon into an oncoming lane, or the drag trajectory crosses a double solid line, the lane change intention does not meet the first condition, and the vehicle control device may refuse to execute it, since controlling the driving state of the vehicle according to that intention is not allowed. In addition, when the first intention does not meet the first condition, the vehicle control apparatus may prompt the driver, by playing a voice message or via the touch screen display, that controlling the driving state of the vehicle according to the driver's first intention is not permitted.
Alternatively, after determining that the first intention does not meet the first condition, the vehicle control apparatus may stop execution of the first intention and send a first prompt message notifying the driver that the first intention is not allowed to be executed. Taking the drag operation as an example, when the first intention corresponding to the drag operation is not allowed to be executed, the vehicle control apparatus may send a first prompt message indicating as much.
In the present application, the vehicle control device sending a prompt message (such as the first, second, or third prompt message) may be understood as the vehicle control device outputting (or presenting) the prompt message to the driver directly, or as the vehicle control device sending the prompt message to a human-computer interaction device such as a display screen, which then outputs (or presents) the prompt message to the driver.
Specifically, the vehicle control device may transmit the first prompt message to a human-computer interaction device such as a display screen, so that the human-computer interaction device notifies the driver that the first intention is not allowed to be executed. Alternatively, sending the first prompt message may mean that the vehicle control device notifies the driver, through its human-computer interaction module, that the first intention is not allowed to be executed.
For example, if the first gesture operation includes a drag operation, when the vehicle control device recognizes that the driving intention corresponding to the drag operation is not allowed to be executed, the first prompt message may be sent by, for example, displaying a symbol such as "×" or text such as "cannot be executed" on the display screen, indicating that controlling the driving state of the vehicle according to the drag operation is not allowed.
Further, for a drag operation that is not permitted to be executed, the vehicle control apparatus may also clear the drag trajectory displayed on the display screen. And/or, if the vehicle control device moved the displayed vehicle icon along with the changing drag trajectory, then upon determining that the drag operation is not allowed to be executed, it may display the vehicle icon at a first display position, namely the position the icon occupied before the drag operation was acquired. In other words, the vehicle control device may restore the vehicle icon on the display screen to the position it occupied before the driver performed the drag operation on it.
In another possible implementation, when the vehicle control apparatus recognizes that the first intention does not meet the first condition, it may determine, based on the first intention, a second intention that does meet the first condition. After obtaining the second intention, the vehicle control device may inquire, through a third prompt message, whether the driver agrees to execute the second intention, that is, whether to control the driving state of the vehicle according to the second intention.
One way to obtain the second intention from the first intention is to keep the intent type of the first intention unchanged while adjusting the associated information, such as a numerical value or a drag trajectory. For example, if the current traveling speed of the vehicle is 75 km/h and the driver's first intention is to accelerate by 10 km/h, but the maximum speed limit of the current road is 80 km/h, the vehicle may determine, according to that limit, that the second intention is to accelerate by 5 km/h. As another example, suppose the maximum speed limit of the current road section is 80 km/h, but because the section carries many vehicles, the safe driving condition caps the vehicle's speed at, say, 70 km/h to keep traffic smooth and safe; then if the current driving speed is 65 km/h and the first intention is to accelerate by 10 km/h, the vehicle may determine that the second intention is to accelerate by 5 km/h. As yet another example, if the first intention is an acceleration intention and the acceleration it indicates falls outside the range required by the comfort condition, the vehicle control apparatus may generate a second intention whose acceleration lies within that range.
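The two speed examples above amount to clamping the requested speed change to the tightest applicable cap. A worked sketch, with function and parameter names that are assumptions for illustration:

```python
def derive_second_intent(current_speed, requested_delta, legal_limit, safe_limit=None):
    """Clamp an acceleration intent to the tightest applicable speed cap.

    Keeps the intent type (acceleration) and adjusts only its value,
    as in the text: at 75 km/h, "+10 km/h" under an 80 km/h legal limit
    becomes "+5 km/h". Returns the adjusted delta in km/h, or None if no
    acceleration is possible at all.
    """
    cap = legal_limit if safe_limit is None else min(legal_limit, safe_limit)
    allowed_delta = cap - current_speed
    if allowed_delta <= 0:
        return None  # already at or above the cap: no second intent of this type
    return min(requested_delta, allowed_delta)
```

When the safe driving condition imposes a stricter cap than the legal limit (70 km/h vs. 80 km/h in the second example), the stricter cap binds.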
As another example, when the first intention determined from a drag trajectory input by the driver on the touch screen is not allowed to be executed, the drag trajectory may be corrected according to the first condition, and the second intention obtained from the corrected trajectory. For instance, if the first intention is a driving trajectory intention, but driving along the dragged trajectory would violate the first condition because the trajectory passes through an oncoming lane or another area the vehicle is not allowed to enter, the trajectory may be corrected to remove the cause of the violation, and a second intention, which is also a driving trajectory intention, obtained from the corrected trajectory.
It should be understood that in the present application, if the vehicle control apparatus recognizes that the first intention is a driving intention that does not satisfy the first condition, the first intention may be adjusted to obtain a second intention that satisfies the first condition. Thereafter, the vehicle control apparatus may inquire of the driver whether to execute the second intention.
When inquiring whether the driver agrees to execute the second intention, the vehicle control device may display text corresponding to the second intention, or the corrected drag trajectory, on the display screen, or play a voice message corresponding to the second intention through a speaker. It then detects the driver's human-computer interaction operation and judges from it whether the driver agrees to execute the second intention.
A human-computer interaction operation indicating agreement to execute the second intention may be, for example, the driver touching a virtual key indicating agreement on the touch screen, speaking a reply containing words such as "agree" or "confirm", or indicating agreement with an action such as nodding or a gesture. A human-computer interaction operation indicating disagreement may be, for example, the driver touching a virtual key indicating disagreement on the touch screen, speaking a reply containing words such as "disagree" or "cancel", or performing an action or gesture such as shaking the head or waving a hand.
Further, if the vehicle control apparatus detects a human-computer interaction operation in which the driver indicates agreement to execute the second intention, it may control the driving state of the vehicle according to the second intention. If it detects an operation indicating disagreement, or receives no operation indicating agreement within a set period, it may judge that the driver does not agree to execute the second intention and stop its execution; at this point, the vehicle control device may prompt the driver to input the driving intention again.
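The consent rule just described, explicit agreement within the set period, with refusal or silence both treated as declining, can be sketched as follows. The interaction labels and the 10-second default are assumptions for illustration.

```python
# Hypothetical labels for interactions the human-computer interaction
# device might report after recognizing touch, voice, or body gestures.
AGREE_INPUTS = {"agree", "confirm", "ok_button", "nod"}
REFUSE_INPUTS = {"disagree", "cancel", "cancel_button", "shake_head", "wave_hand"}

def driver_consents(interaction, elapsed_s, timeout_s=10.0):
    """Return True only for a timely, explicit agreement.

    A refusal, an unrecognized input, or no input within timeout_s
    all count as declining the second intent.
    """
    if elapsed_s > timeout_s:
        return False  # no timely response: treat as disagreement
    return interaction in AGREE_INPUTS
```

When this returns `False`, the device would stop executing the second intention and prompt the driver to input a driving intention again.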
Taking a mobile phone with a touch screen as an example of the vehicle control device, the inquiry operation in the present application can be realized through a dialog box displayed on the phone screen as shown in fig. 10. Taking the second intention as an adjusted acceleration intention as an example, the dialog box may state that a second intention meeting the first condition has been determined and ask the driver whether to execute it. Optionally, as in the dialog box shown in fig. 9, the phone screen may further display a virtual key indicating agreement to execute the second intention (e.g., an "OK" key) and a virtual key indicating disagreement (e.g., a "Cancel" key); when the driver touches either key, the phone learns the result, that is, whether the driver agrees to execute the second intention. Optionally, a countdown of a set duration may also be displayed in the dialog box; when the countdown reaches 0, if the phone has not detected a touch on the key indicating agreement, it determines that the driver does not agree to execute the second intention.
The following describes, using a specific example of the first intention, the process of correcting the first intention to obtain the second intention and inquiring whether the driver agrees to execute it.
When the vehicle control device recognizes that a driving trajectory intention does not meet the first condition, it may correct the trajectory corresponding to that intention to obtain a corrected trajectory representing a suggested driving route (for example, one identified as satisfying the first condition); the driving trajectory intention corresponding to the corrected trajectory is the second intention. The vehicle control apparatus may send a second prompt message inquiring whether to control the driving state of the vehicle according to the corrected drag trajectory (or, equivalently, whether the driver agrees to execute the second intention, or agrees to control the driving state of the vehicle according to it). Sending the second prompt message may mean that the vehicle control device sends it to a human-computer interaction device such as a display screen, so that the human-computer interaction device asks the driver whether to control the driving state of the vehicle according to the corrected drag trajectory; or it may mean that the vehicle control device asks this itself through its human-computer interaction module.
Optionally, the vehicle control device may further display the corrected dragging track through a display screen.
Thereafter, the vehicle control apparatus may acquire a first operation of the driver indicating agreement to control the driving state of the vehicle according to the corrected drag trajectory (or, equivalently, agreement to execute the second intention), in which case it controls the driving state of the vehicle according to the corrected drag trajectory, that is, according to the second intention. If instead it acquires an operation indicating that the driver does not agree, the second intention is not executed.
In addition, in the present application, the vehicle control device may also obtain the second intention from the first intention by delaying the execution time of the first intention; that is, the first and second intentions differ in execution timing. For example, if the vehicle control device finds that a road section the vehicle is due to pass through after some time allows acceleration by the value corresponding to the first intention, it may prompt the user to accelerate once that section is reached. In this example, "accelerate after reaching the upcoming road section on the route" can be regarded as a second intention obtained from the first intention.
For example, after obtaining the second intention from the first intention, the vehicle control apparatus may send the driver a third prompt message inquiring whether to control the driving state of the vehicle according to the second intention (or, equivalently, whether the driver agrees to execute it). Thereafter, if the vehicle control device obtains a second operation of the driver, a human-computer interaction operation indicating agreement to control the driving state of the vehicle according to the second intention, it controls the driving state accordingly. If instead it acquires an operation indicating that the driver does not agree, the second intention is not executed.
As shown in fig. 11, when the vehicle control device includes the human-machine interaction module, the human-machine interaction control module, the decision-making plan calculation module, and the entire vehicle motion control module shown in fig. 4, the method provided in the embodiment of the present application may include the following steps:
S201: The human-computer interaction module acquires a first gesture operation of the driver.
S202: The human-computer interaction module notifies the human-computer interaction control module of the first gesture operation.
S203: the human-computer interaction control module determines a first intention according to the first gesture operation.
S204: the human-computer interaction control module notifies the decision-making plan calculation module of the first intent.
S205: the decision plan calculation module determines whether the first intention satisfies a first condition, if so, performs S206, and if not, performs S207 and/or S208.
S206: The decision planning calculation module generates a control command according to the first intention, and the vehicle motion control module executes the command to control the driving state of the vehicle according to the first intention.
S207: the decision planning calculation module informs the driver through the man-machine interaction module: the driving state of the vehicle is not allowed to be controlled according to the first intention.
For example, the decision plan calculation module sends an instruction to the human-machine interaction module, and the instruction is used for informing the driver that the first intention is not allowed to be executed. Thereafter, the human-computer interaction module may notify the driver through the display screen and/or the speaker.
S208: the decision plan calculation module determines a second intent that meets the first condition based on the first intent.
S209: the decision plan calculation module notifies the second intent.
S210: the human-computer interaction module inquires the driver: whether to execute the second intent.
S211: the man-machine interaction module acquires a first man-machine interaction operation of a driver.
The first human-computer interaction operation is, for example, a touch operation performed by the driver on the touch screen, an air gesture operation, or a voice input operation; the present application does not specifically limit it.
S212: The human-computer interaction module notifies the human-computer interaction control module of the driver's first human-computer interaction operation.
S213: the human-computer interaction control module identifies whether the first human-computer interaction operation is a human-computer interaction operation indicating consent to perform the second intention. If the first human-machine interaction operation is a human-machine interaction operation indicating consent to perform the second intention, S214-S215 are performed, otherwise, if the first human-machine interaction operation is not a human-machine interaction operation indicating consent to perform the second intention, S216 is performed.
S214: the human-computer interaction control module informs the decision-making planning calculation module that the driver agrees to execute the second intention.
S215: The decision planning calculation module generates a control command according to the second intention, and the vehicle motion control module executes the command to control the driving state of the vehicle according to the second intention.
S216: The human-computer interaction control module informs the decision planning calculation module that the driver does not agree to execute the second intention. The process of controlling the driving state of the vehicle according to the first intention and/or the second intention then ends, that is, the driving state of the vehicle is controlled according to neither the first intention nor the second intention.
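The S201–S216 flow can be sketched end to end with the four modules reduced to plain callables. Only the control flow mirrors the text; the recognition, condition-check, correction, and interaction functions are injected stand-ins, not the patent's modules.

```python
def run_pipeline(gesture, recognize, allowed, correct, ask_driver, execute, notify):
    """Illustrative sketch of the fig. 11 flow.

    recognize  -- gesture -> first intent        (S201-S203)
    allowed    -- intent -> bool (first condition, S205)
    correct    -- first intent -> second intent or None (S208)
    ask_driver -- second intent -> bool (consent, S210-S213)
    execute    -- intent -> None (control command, S206 / S214-S215)
    notify     -- message -> None (prompt to driver, S207)
    """
    first_intent = recognize(gesture)
    if allowed(first_intent):
        execute(first_intent)
        return "executed_first"
    notify("controlling the vehicle according to the first intention is not allowed")
    second_intent = correct(first_intent)
    if second_intent is not None and ask_driver(second_intent):
        execute(second_intent)
        return "executed_second"
    return "not_executed"  # S216: neither intention is executed
```

Injecting the module behaviors as functions keeps the sketch self-contained and makes each branch of the flow easy to exercise in isolation.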
According to the flow shown in fig. 11, the vehicle control method provided by the embodiments of the present application can be implemented by the vehicle control apparatus shown in fig. 4. It should be understood that, in implementations of the steps shown in fig. 11 by the vehicle control device of fig. 4, some steps shown in fig. 11 may be omitted, some may be replaced by other steps, or the vehicle control device may additionally perform steps not shown in fig. 11.
Based on the above and the same conception, the present application further provides a vehicle control apparatus for implementing the functions of the vehicle control device in the vehicle control method described in the method embodiments above, and thus having the beneficial effects of those embodiments. The vehicle control apparatus may adopt any one of the structures of fig. 2 to 4, or a combination of any several of them.
The vehicle control device shown in fig. 2 may be a terminal or a vehicle, or may be a chip inside the terminal or the vehicle. The vehicle control apparatus may implement the vehicle control method shown in fig. 5 or fig. 11 and the above-described alternative embodiments. The vehicle control apparatus may include a processing module 210 and an input-output module 220.
The processing module 210 may be configured to execute any step of S102, S103 in the method shown in fig. 5, or S203, S205, S206, S208, S213, or S215 in the method shown in fig. 11, or may be configured to execute any step related to vehicle control, intention recognition, determination as to whether to allow execution of driving intention or correction of driving intention in the above-described alternative embodiments. The input/output module 220 may be configured to perform S101 in the method shown in fig. 5, or any one of S201, S207, S210, or S211 in the method described in fig. 11, or may be configured to perform any one of the steps related to human-computer interaction in the above-described alternative embodiments. For details, reference is made to the detailed description in the method example, which is not repeated herein.
The input-output module 220 may be used to obtain a first gesture operation of the driver. The processing module 210 may be configured to determine a first intention of the driver based on the first gesture operation and to control the driving state of the vehicle based on the first intention. The first gesture operation includes at least one of: an operation performed by the driver on a touch screen, including a tap operation or a drag operation; or an air gesture operation performed by the driver.
It should be understood that the vehicle control device in the embodiments of the present application may be implemented in software, for example as a computer program or instructions having the functions described above, stored in a memory inside the terminal; a processor reads the corresponding program or instructions from the memory to realize the functions of the processing module 210 and/or the input-output module 220. Alternatively, the vehicle control device may be implemented in hardware. In that case, the processing module 210 may be a processor (e.g., a CPU, or a processor in a system chip), and the input-output module 220 may include a human-computer interaction device, or an interface (such as an interface circuit) through which the processing module 210 communicates with a human-computer interaction device, used to notify the processing module 210 of the first gesture operation recognized by that device. If a human-computer interaction device is included, the input-output module 220 may include a processor for recognizing the first gesture operation. Alternatively, the vehicle control apparatus may be implemented by a combination of a processor and software modules.
In an alternative implementation, the first intent includes at least one of the following intents: an overtaking intent, a lane change intent, a steering intent, or a driving trajectory intent.
In an alternative implementation manner, if the first gesture operation includes a drag operation of an icon of the vehicle displayed on the touch screen by the driver, the input-output module 220 is further configured to: moving the display position of the icon of the vehicle in the touch screen along with the change of the dragging track of the dragging operation; and/or displaying a dragging track of the dragging operation in the touch screen. The input/output module 220 may include a touch screen, or an interface connected to the touch screen, for displaying a dragging track on the touch screen.
In an alternative implementation, the processing module 210 may be further configured to control the driving state of the vehicle according to the drag trajectory when the first intention corresponding to the drag trajectory is allowed to be executed. When that first intention is not allowed to be executed, the device may instead: clear the drag trajectory displayed on the touch screen, in which case the input-output module 220 may include the touch screen, or an interface connected to it, for having the touch screen clear the trajectory; send a first prompt message notifying that the first intention corresponding to the drag operation is not allowed to be executed, in which case the input-output module 220 may include a touch screen, a speaker, or another human-computer interaction device, or an interface connected to one, for notifying the driver through that device; and/or display the vehicle icon at a first display position, namely the position the icon occupied before the drag operation was acquired, in which case the input-output module 220 may include the touch screen, or an interface connected to it, for displaying the icon at that position.
In an alternative implementation, the first intention corresponding to the drag operation includes a driving trajectory intention, and when the first intention corresponding to the drag trajectory is not allowed to be executed, the processing module 210 is further configured to correct the drag trajectory according to at least one of a traffic regulation condition, a safe driving condition, an environmental condition, or a comfort condition. The input-output module 220 may also be configured to display the corrected drag trajectory on the touch screen, the corrected trajectory representing a suggested driving route; for this, the input-output module 220 may include a touch screen, or an interface connected to the touch screen.
In an optional implementation manner, the input/output module 220 may be further configured to send a second prompt message, where the second prompt message is used to inquire whether to control the driving state of the vehicle according to the modified dragging track, and the input/output module 220 may be further configured to obtain a first operation of the driver, where the first operation indicates that the driver agrees to control the driving state of the vehicle according to the modified dragging track. The processing module 210 may also be configured to control a driving state of the vehicle according to the modified drag trajectory. The input/output module 220 may include a touch screen, a speaker, or other human-computer interaction device, or an interface connected to the human-computer interaction device, and is configured to send the second prompt message to the driver through the human-computer interaction device and obtain the first operation of the driver.
In an alternative implementation, the processing module 210 is further configured to: determine that the first intention does not satisfy a first condition, the first condition including at least one of a traffic rule condition, a safe driving condition, an environmental condition, or a comfort condition; and determine a second intention according to the first intention, where the second intention satisfies the first condition and the execution timing of the second intention is different from the execution timing of the first intention. The processing module 210 is specifically configured to control the driving state of the vehicle according to the second intention.
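Since the second intention differs from the first only in execution timing, one simple realization is to search progressively later timings until the first condition holds. The `Intent` dataclass, the time step, and the search horizon below are illustrative assumptions, not the patent's method.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Intent:
    kind: str           # e.g. "overtake" or "lane_change"
    execute_at: float   # planned execution time, in seconds

def satisfies(intent, conditions):
    """True if the intent passes every condition predicate."""
    return all(check(intent) for check in conditions)

def derive_second_intent(first, conditions, step=1.0, horizon=30.0):
    """Search progressively later execution timings for a second intention of
    the same kind that satisfies the first condition; return None if no
    feasible timing exists within the horizon."""
    t = first.execute_at
    while t < first.execute_at + horizon:
        t += step
        candidate = replace(first, execute_at=t)
        if satisfies(candidate, conditions):
            return candidate
    return None
```

For example, an overtaking intention rejected now may become feasible a few seconds later once a safe gap opens.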
In an alternative implementation, the input/output module 220 may be further configured to send a third prompt message, where the third prompt message is used to inquire whether to control the driving state of the vehicle according to the second intention, and may be further configured to obtain a second operation of the driver, where the second operation indicates agreement to control the driving state of the vehicle according to the second intention. The input/output module 220 may include a touch screen, a speaker, or another human-computer interaction device, or an interface connected to a human-computer interaction device, for sending the third prompt message to the driver and obtaining the second operation of the driver through that device.
In an alternative implementation, the processing module 210 may be further configured to determine that the first intention does not satisfy a first condition, the first condition including at least one of a traffic rule condition, a safe driving condition, or a comfort condition, and to determine a second intention according to the first condition, where the second intention satisfies the first condition. The input/output module 220 may be further configured to send a third prompt message for inquiring whether to control the driving state of the vehicle according to the second intention, and to obtain a third operation of the driver, where the third operation indicates disagreement with controlling the driving state of the vehicle according to the second intention. The processing module 210 may be further configured to stop controlling the driving state of the vehicle according to the second intention.
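The third-prompt rejection path can be sketched as follows. `HMI`, the `"disagree"` reply value, and the `execute`/`cancel` callbacks are hypothetical names standing in for the human-computer interaction device and the vehicle controller.

```python
class HMI:
    """Human-computer interaction stub (hypothetical interface)."""
    def __init__(self, reply):
        self.reply = reply
        self.prompts = []
    def send_prompt(self, message):
        self.prompts.append(message)
    def wait_for_operation(self):
        return self.reply   # e.g. "disagree" represents the driver's third operation

def offer_second_intent(hmi, execute, cancel, second_intent):
    """Third-prompt flow: propose the alternative intention; on disagreement,
    stop controlling the vehicle according to it, otherwise execute it."""
    hmi.send_prompt(f"Control the vehicle according to '{second_intent}'?")
    if hmi.wait_for_operation() == "disagree":
        cancel(second_intent)   # stop controlling per the second intention
        return False
    execute(second_intent)
    return True
```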
It should be understood that, for details of the processing of the vehicle control device in the embodiments of the present application, reference may be made to fig. 5 and fig. 11 and the related description in the method embodiments of the present application; details are not repeated here.
The vehicle control device shown in fig. 3 may be a terminal or a vehicle, or may be a chip inside the terminal or the vehicle. The vehicle control apparatus may implement the vehicle control method shown in fig. 5 or fig. 11 and the above-described alternative embodiments. The vehicle control device may include at least one of a processor, a memory, an interface circuit, or a human-computer interaction device. It should be understood that although only one processor, one memory, one interface circuit, and one human-computer interaction device are shown in fig. 3, the vehicle control device may include other numbers of processors, memories, interface circuits, and human-computer interaction devices.
The interface circuit is used for communication between the vehicle control device and other components of the terminal or the vehicle, such as a memory, another processor, or a human-computer interaction device. The processor may be used for signal interaction with other components through the interface circuit. The interface circuit may be an input/output interface of the processor.
For example, the processor may read computer programs or instructions in a memory coupled thereto through the interface circuit, and decode and execute them. It should be understood that these computer programs or instructions may include the functional programs of the vehicle control method described above, as well as the functional programs of the vehicle control device described above. When the corresponding functional program is decoded and executed by the processor, the vehicle control device can implement the scheme in the vehicle control method provided by the embodiments of the present application.
Alternatively, these functional programs are stored in a memory outside the vehicle control apparatus, and the vehicle control apparatus may not include the memory at this time. When the functional program is decoded and executed by the processor, part or all of the content of the functional program is temporarily stored in the memory.
Alternatively, these functional programs are stored in a memory inside the vehicle control device. When the above-described functional programs are stored in the memory inside the vehicle control device, that memory may be provided in the vehicle control device of the embodiment of the present application.
Alternatively, part of these functional programs is stored in a memory outside the vehicle control apparatus, and the other part is stored in a memory inside the vehicle control apparatus.
It should be understood that the processor may be a chip. For example, the processor may be a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a system on chip (SoC), a Central Processing Unit (CPU), a Network Processor (NP), a digital signal processing circuit (DSP), a Microcontroller (MCU), a Programmable Logic Device (PLD), or other integrated chips.
It should be noted that the processor in the embodiments of the present application may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor described above may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the present application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory can be a random access memory (RAM), which acts as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
It should be understood that, when the vehicle control apparatus is implemented by the structure shown in fig. 3, the memory may store a computer program or instructions, and the processor may execute them to perform the actions performed by the processing module 210 when the vehicle control apparatus is implemented by the structure shown in fig. 2. The actions performed by the input/output module 220 when the vehicle control apparatus is implemented by the structure shown in fig. 2 may likewise be performed by the interface circuit and/or the human-computer interaction device. In other words, the processing module 210 shown in fig. 2 may be implemented by the processor and the memory shown in fig. 3, or the processing module 210 shown in fig. 2 includes the processor and the memory, or the processor executes the computer program or instructions stored in the memory to implement the actions performed by the processing module 210 shown in fig. 2. And/or, the input/output module 220 shown in fig. 2 may be implemented by the interface circuit and/or the human-computer interaction device shown in fig. 3, or the input/output module 220 shown in fig. 2 includes the interface circuit and/or the human-computer interaction device, or the interface circuit and/or the human-computer interaction device performs the actions performed by the input/output module 220 shown in fig. 2.
When the vehicle control apparatus is implemented by the structure shown in fig. 4, the actions performed by the processing module 210 when the vehicle control apparatus is implemented by the structure shown in fig. 2 may be performed by the human-computer interaction control module, the decision plan calculation module, and the entire vehicle motion control module. The actions performed by the input-output module 220 when the vehicle control apparatus is implemented by the structure shown in fig. 2 may also be performed by the human-machine interaction module. When the vehicle control apparatus is implemented by the structure shown in fig. 4, the actions respectively executed by the human-computer interaction module, the human-computer interaction control module, the decision-making plan calculation module, and the entire vehicle motion control module may refer to the description in the flow shown in fig. 11, and are not described again here.
It should be understood that the structures of the vehicle control devices shown in any one of fig. 2 to 4 may be combined with each other, and the relevant design details of the vehicle control devices shown in any one of fig. 2 to 4 and of the various alternative embodiments may be referred to each other; reference may also be made to the vehicle control method shown in fig. 5 or fig. 11 and the relevant design details of its various alternative embodiments. Details are not repeated here.
Based on the above and the same idea, the present application provides a computing device comprising a processor connected to a memory, the memory being configured to store a computer program or instructions, the processor being configured to execute the computer program stored in the memory, so as to cause the computing device to perform the method in the above method embodiments.
Based on the above and the same idea, the present application provides a computer-readable storage medium having stored thereon a computer program or instructions, which, when executed, cause a computing device to perform the method in the above-described method embodiments.
Based on the above and the same idea, the present application provides a computer program product which, when executed by a computing device, causes the computing device to perform the method in the above method embodiments.
Based on the above and the same idea, the present application provides a chip, which is connected to a memory and is used to read and execute a computer program or instructions stored in the memory, so as to cause a computing device to execute the method in the above method embodiments.
Based on the foregoing and similar concepts, embodiments of the present application provide an apparatus that includes a processor and an interface circuit configured to receive a computer program or instructions and transmit the computer program or instructions to the processor; the processor executes the computer program or instructions to perform the methods in the above-described method embodiments.
It should be understood that the division of the modules in the embodiments of the present application is illustrative, and is only one logical function division, and there may be other division manners in actual implementation. In addition, functional modules in the embodiments of the present application may be integrated into one processor, may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (21)

1. A vehicle control method, wherein the vehicle is in an automatic driving state, the method comprising: obtaining a first gesture operation of a driver; determining a first intention of the driver according to the first gesture operation; and controlling a driving state of the vehicle according to the first intention; wherein the first gesture operation comprises one or more of the following operations: a touch operation of the driver on a touch screen, the touch operation comprising a tap operation or a drag operation; or an air gesture operation of the driver.

2. The method of claim 1, wherein the first intention comprises at least one of the following intentions: an overtaking intention, a lane-change intention, a steering intention, or a driving-trajectory intention.

3. The method of claim 1 or 2, wherein the first gesture operation comprises a drag operation of the driver on an icon of the vehicle displayed on the touch screen; the method further comprising: moving the display position of the icon of the vehicle on the touch screen as the drag trajectory of the drag operation changes; and/or displaying the drag trajectory of the drag operation on the touch screen.

4. The method of claim 3, further comprising: when the first intention corresponding to the drag trajectory is allowed to be executed, controlling the driving state of the vehicle according to the drag trajectory; and when the first intention corresponding to the drag trajectory is not allowed to be executed, performing at least one of the following operations: sending a first prompt message, the first prompt message being used to notify that the first intention corresponding to the drag operation is not allowed to be executed; or displaying the icon of the vehicle at a first display position, the first display position being the display position of the icon of the vehicle before the drag operation was obtained; or clearing the drag trajectory displayed on the touch screen.

5. The method of claim 3 or 4, wherein the first intention corresponding to the drag operation comprises a driving-trajectory intention, and when the first intention corresponding to the drag trajectory is not allowed to be executed, the method further comprises: correcting the drag trajectory according to at least one of a traffic rule condition, a safe driving condition, an environmental condition, or a comfort condition; and displaying the corrected drag trajectory on the touch screen, the corrected drag trajectory representing a suggested driving route.

6. The method of claim 5, further comprising: sending a second prompt message, the second prompt message being used to inquire whether to control the driving state of the vehicle according to the corrected drag trajectory; and obtaining a first operation of the driver, the first operation indicating agreement to control the driving state of the vehicle according to the corrected drag trajectory; wherein controlling the driving state of the vehicle according to the first intention comprises: controlling the driving state of the vehicle according to the corrected drag trajectory.

7. The method of claim 1 or 2, further comprising: determining that the first intention does not satisfy a first condition, the first condition comprising at least one of a traffic rule condition, a safe driving condition, an environmental condition, or a comfort condition; and determining a second intention according to the first intention, the second intention satisfying the first condition, an execution timing of the second intention being different from an execution timing of the first intention; wherein controlling the driving state of the vehicle according to the first intention comprises: controlling the driving state of the vehicle according to the second intention.

8. The method of claim 7, further comprising: sending a third prompt message, the third prompt message being used to inquire whether to control the driving state of the vehicle according to the second intention; and obtaining a second operation of the driver, the second operation indicating agreement to control the driving state of the vehicle according to the second intention.

9. The method of claim 1 or 2, further comprising: determining that the first intention does not satisfy a first condition, the first condition comprising at least one of a traffic rule condition, a safe driving condition, or a comfort condition; determining a second intention according to the first condition, the second intention satisfying the first condition; sending a third prompt message, the third prompt message being used to inquire whether to control the driving state of the vehicle according to the second intention; obtaining a third operation of the driver, the third operation indicating disagreement with controlling the driving state of the vehicle according to the second intention; and stopping controlling the driving state of the vehicle according to the second intention.

10. A vehicle control device, wherein the vehicle is in an automatic driving state, the device comprising a processing module and an input/output module: the input/output module being configured to obtain a first gesture operation of a driver; the processing module being configured to determine a first intention of the driver according to the first gesture operation, and to control a driving state of the vehicle according to the first intention; wherein the first gesture operation comprises at least one of the following operations: a touch operation of the driver on a touch screen, the touch operation comprising a tap operation or a drag operation; or an air gesture operation of the driver.

11. The vehicle control device of claim 10, wherein the first intention comprises at least one of the following intentions: an overtaking intention, a lane-change intention, a steering intention, or a driving-trajectory intention.

12. The vehicle control device of claim 10 or 11, wherein the first gesture operation comprises a drag operation of the driver on an icon of the vehicle displayed on the touch screen; the input/output module being further configured to: move the display position of the icon of the vehicle on the touch screen as the drag trajectory of the drag operation changes; and/or display the drag trajectory of the drag operation on the touch screen.

13. The vehicle control device of claim 12, wherein the processing module is further configured to: when the first intention corresponding to the drag trajectory is allowed to be executed, control the driving state of the vehicle according to the drag trajectory; and when the first intention corresponding to the drag trajectory is not allowed to be executed, clear the drag trajectory displayed on the touch screen; and when the first intention corresponding to the drag trajectory is not allowed to be executed, the input/output module is further configured to: send a first prompt message, the first prompt message being used to notify that the first intention corresponding to the drag operation is not allowed to be executed; and/or display the icon of the vehicle at a first display position, the first display position being the display position of the icon of the vehicle before the drag operation was obtained.

14. The vehicle control device of claim 12 or 13, wherein the first intention corresponding to the drag operation comprises a driving-trajectory intention, and when the first intention corresponding to the drag trajectory is not allowed to be executed, the processing module is further configured to: correct the drag trajectory according to at least one of a traffic rule condition, a safe driving condition, an environmental condition, or a comfort condition; the input/output module being further configured to: display the corrected drag trajectory on the touch screen, the corrected drag trajectory representing a suggested driving route.

15. The vehicle control device of claim 14, wherein the input/output module is further configured to: send a second prompt message, the second prompt message being used to inquire whether to control the driving state of the vehicle according to the corrected drag trajectory; and obtain a first operation of the driver, the first operation indicating agreement to control the driving state of the vehicle according to the corrected drag trajectory; the processing module being specifically configured to: control the driving state of the vehicle according to the corrected drag trajectory.

16. The vehicle control device of claim 10 or 11, wherein the processing module is further configured to: determine that the first intention does not satisfy a first condition, the first condition comprising at least one of a traffic rule condition, a safe driving condition, an environmental condition, or a comfort condition; and determine a second intention according to the first intention, the second intention satisfying the first condition, an execution timing of the second intention being different from an execution timing of the first intention; the processing module being specifically configured to: control the driving state of the vehicle according to the second intention.

17. The vehicle control device of claim 16, wherein the input/output module is further configured to: send a third prompt message, the third prompt message being used to inquire whether to control the driving state of the vehicle according to the second intention; and obtain a second operation of the driver, the second operation indicating agreement to control the driving state of the vehicle according to the second intention.

18. The vehicle control device of claim 10 or 11, wherein the processing module is further configured to: determine that the first intention does not satisfy a first condition, the first condition comprising at least one of a traffic rule condition, a safe driving condition, or a comfort condition; and determine a second intention according to the first condition, the second intention satisfying the first condition; the input/output module being further configured to: send a third prompt message, the third prompt message being used to inquire whether to control the driving state of the vehicle according to the second intention; and obtain a third operation of the driver, the third operation indicating disagreement with controlling the driving state of the vehicle according to the second intention; the processing module being further configured to: stop controlling the driving state of the vehicle according to the second intention.

19. A computing device, comprising a processor connected to a memory, the memory storing a computer program or instructions, the processor being configured to execute the computer program or instructions stored in the memory, so as to cause the computing device to perform the method of any one of claims 1 to 9.

20. A computer-readable storage medium, storing a computer program or instructions which, when executed by a computing device, cause the computing device to perform the method of any one of claims 1 to 9.

21. A chip, comprising at least one processor and an interface; the interface being configured to provide a computer program, instructions, or data to the at least one processor; the at least one processor being configured to execute the computer program or instructions, so that the method of any one of claims 1 to 9 is performed.
CN202180003366.2A 2021-03-31 2021-03-31 A vehicle control method and device Active CN113840766B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/084650 WO2022205159A1 (en) 2021-03-31 2021-03-31 Vehicle control method and apparatus

Publications (2)

Publication Number Publication Date
CN113840766A true CN113840766A (en) 2021-12-24
CN113840766B CN113840766B (en) 2022-10-18

Family

ID=78971725

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180003366.2A Active CN113840766B (en) 2021-03-31 2021-03-31 A vehicle control method and device

Country Status (2)

Country Link
CN (1) CN113840766B (en)
WO (1) WO2022205159A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851394A (en) * 2015-07-31 2018-03-27 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method and autonomous driving vehicle
CN107848540A (en) * 2015-07-31 2018-03-27 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method, driving assistance program and autonomous driving vehicle
CN107848541A (en) * 2015-07-31 2018-03-27 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method and autonomous driving vehicle
CN107851395A (en) * 2015-07-31 2018-03-27 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method and autonomous driving vehicle
CN107924629A (en) * 2015-07-31 2018-04-17 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method and autonomous driving vehicle
CN107924627A (en) * 2015-07-31 2018-04-17 Panasonic Intellectual Property Management Co., Ltd. Driving assistance device, driving assistance system, driving assistance method, driving assistance program and autonomous driving vehicle
CN108334258A (en) * 2018-04-11 2018-07-27 Liu Lianbo Automatic driving assistance device, automatic driving assistance method and automatic driving assistance system
CN111813314A (en) * 2019-04-12 2020-10-23 BYD Co., Ltd. Vehicle control method and device, storage medium and electronic equipment
CN112141124A (en) * 2019-08-27 2020-12-29 Maidichuang Technology Co., Ltd. (Cayman Islands) Assisted driving system for vehicle and method of operation thereof

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114510193A (en) * 2022-02-24 2022-05-17 Wuhu Xiongshi Automotive Technology Co., Ltd. Vehicle control method and device, vehicle and storage medium
CN114510193B (en) * 2022-02-24 2025-04-04 Wuhu Xiongshi Automotive Technology Co., Ltd. Vehicle control method, device, vehicle and storage medium
CN115092163A (en) * 2022-06-21 2022-09-23 Chongqing Changan Automobile Co., Ltd. Artificial intelligence assisted driving method and system and automobile
WO2024156254A1 (en) * 2023-01-28 2024-08-02 Huawei Technologies Co., Ltd. Voice interaction guidance method, program, device, and vehicle
CN119527307A (en) * 2025-01-22 2025-02-28 Zhejiang Geely Holding Group Co., Ltd. Vehicle distance control method, device, electronic device, vehicle and storage medium
CN119527332A (en) * 2025-01-22 2025-02-28 Zhejiang Geely Holding Group Co., Ltd. Human-computer interaction method, device, electronic equipment, vehicle and medium for vehicle control
CN119527332B (en) * 2025-01-22 2025-05-13 Zhejiang Geely Holding Group Co., Ltd. Human-computer interaction method, device, electronic device, vehicle and medium for vehicle control

Also Published As

Publication number Publication date
CN113840766B (en) 2022-10-18
WO2022205159A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
CN113840766B (en) A vehicle control method and device
CN115158354B (en) Driving assistance method, driving assistance device and driving assistance system
JP6575818B2 (en) Driving support method, driving support device using the same, automatic driving control device, vehicle, driving support system, program
JP5957745B1 (en) Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle
JP5945999B1 (en) Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle
US10532763B2 (en) Driving support device, driving support system, and driving support method
CN109562760B (en) Predictions for Autonomous Vehicle Testing
JP7329755B2 (en) Support method and support system and support device using the same
JP2023041765A (en) Driving control device, driving control program, presentation control device and presentation control program
JP7006326B2 (en) Autonomous driving system
US11945442B2 (en) Autonomous driving system and control method for autonomous driving system
JPWO2018029758A1 (en) Control method and control device of autonomous driving vehicle
JP2017013614A (en) Vehicle travel control apparatus
US20200017123A1 (en) Drive mode switch controller, method, and program
CN113942504B (en) Self-adaptive cruise control method and device
JP2017119508A (en) Drive assist device, drive assist system, drive assist method, drive assist program and automatic operation vehicle
CN112009480A (en) Driving assistance system
JP2024546056A (en) Intelligent driving method and vehicle to which the method is applied
JP6090727B2 (en) Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle
JP2023031225A (en) Presentation control device, presentation control program, automatic driving control device, and automatic driving control program
JP7218752B2 (en) PRESENTATION CONTROL DEVICE, PRESENTATION CONTROL PROGRAM AND OPERATION CONTROL DEVICE
CN114030477A (en) Method and system for realizing adaptive ADAS function based on vehicle driving related data
JP7338653B2 (en) Presentation control device and presentation control program
JP2020121619A (en) Vehicle control device and vehicle control method
JP6907802B2 (en) Driving support method and driving support device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20241106

Address after: 518129 Huawei Headquarters Office Building 101, Wankecheng Community, Bantian Street, Longgang District, Shenzhen, Guangdong

Patentee after: Shenzhen Yinwang Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Patentee before: HUAWEI TECHNOLOGIES Co.,Ltd.

Country or region before: China