WO2023201563A1 - Control method, apparatus, and vehicle - Google Patents

Control method, apparatus, and vehicle

Info

Publication number
WO2023201563A1
Authority
WO
WIPO (PCT)
Prior art keywords
controller
vehicle
sensor group
sensing result
control unit
Prior art date
Application number
PCT/CN2022/087879
Other languages
English (en)
French (fr)
Inventor
王剑伟
严立
赖龙珍
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to PCT/CN2022/087879 (WO2023201563A1)
Priority to CN202280005225.9A (CN117279818A)
Publication of WO2023201563A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/023 Avoiding failures by using redundant parts
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B9/00 Safety arrangements
    • G05B9/02 Safety arrangements electric
    • G05B9/03 Safety arrangements electric with multiple-channel loop, i.e. redundant control systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00 Arrangements for detecting or preventing errors in the information received
    • H04L1/22 Arrangements for detecting or preventing errors in the information received using redundant apparatus to increase reliability

Definitions

  • Embodiments of the present application relate to the field of intelligent driving, and more specifically, to a control method, device and vehicle.
  • In autonomous driving scenarios at level L3 and above, the responsible entity is the autonomous driving system.
  • The design requirement for the autonomous driving system is "fail operational": when a fault occurs, the system continues to run the autonomous driving function and takes corresponding measures so that the vehicle exits autonomous driving safely.
  • The current mainstream manufacturers in the industry use two autonomous driving controllers in a "1:1" backup redundancy scheme; that is, two identical autonomous driving domain controllers are connected in parallel in the system.
  • One autonomous driving controller serves as the main controller, runs the complete autonomous driving business, and outputs vehicle control instructions to control the vehicle.
  • The other autonomous driving controller serves as a backup controller.
  • When the main controller fails, the backup controller can replace it to continue business processing and control the behavior of the vehicle. This requires both the main controller and the backup controller to have high computing performance to meet the system requirements.
  • When the main controller works normally, the backup controller is idle, which wastes cost and computing resources.
  • Embodiments of the present application provide a control method, device and vehicle, which help to improve the utilization of computing resources and also help to reduce the cost of the controller.
  • Vehicles may include one or more different types of transport or movable objects that operate or move on land (for example, roads and railways), on water (for example, waterways, rivers, and oceans), or in space.
  • For example, vehicles may include cars, bicycles, motorcycles, trains, subways, airplanes, ships, aircraft, robots, or other types of transportation vehicles or movable objects.
  • In a first aspect, a control method is provided, including: a first controller obtains a first sensing result based on data collected by sensors in a first sensor group; a second controller obtains a second sensing result based on data collected by sensors in a second sensor group; the first controller receives the second sensing result sent by the second controller; and the first controller sends a first control instruction to the actuator based on the first sensing result and the second sensing result.
  • the first controller and the second controller can respectively sense the data collected by the sensors in the first sensor group and the second sensor group, thereby obtaining the first sensing result and the second sensing result.
  • the first controller may use the first sensing result calculated by the first controller and the second sensing result sent by the second controller to the first controller to generate and send the first control instruction to the actuator.
  • In this way, the first controller can use the computing power of the second controller, which helps to improve the utilization of computing resources. At the same time, each controller only needs to process the data collected by the sensors in its own sensor group, so neither controller needs to have high computing performance, which helps to reduce the cost of the controllers.
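The load-sharing exchange described above can be sketched as follows. This is an illustrative sketch only; the class names, fields, and fusion logic are assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class SensingResult:
    """Perception output for one sensor group (hypothetical structure)."""
    source: str
    obstacles: list

class Controller:
    def __init__(self, name, sensor_group):
        self.name = name
        # sensor_group: a list of zero-argument read callables (stand-ins
        # for real sensor drivers).
        self.sensor_group = sensor_group
        self.peer_result = None  # latest sensing result from the peer

    def perceive(self):
        # Each controller only processes its own sensor group's data.
        data = [read() for read in self.sensor_group]
        return SensingResult(source=self.name, obstacles=data)

    def receive(self, result):
        # Sensing result arriving over the inter-controller bus.
        self.peer_result = result

    def control_command(self, own_result):
        # Fuse the locally computed result with the peer's result, so the
        # command is based on the view of all sensors.
        fused = list(own_result.obstacles)
        if self.peer_result is not None:
            fused += self.peer_result.obstacles
        return {"cmd": "drive", "fused": fused}
```

Either controller can play either role here, which mirrors the symmetry of the first and second aspects: each perceives its own group, exchanges results, then generates a command from both.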
  • the first controller may be a primary controller
  • the second controller may be a backup controller
  • The method further includes: the second controller receives the first sensing result sent by the first controller; and the second controller generates a second control instruction based on the first sensing result and the second sensing result.
  • the second controller may use the second sensing result calculated by the second controller and the sensing result sent by the first controller to the second controller to generate the second control instruction. In this way, the second controller can also utilize the computing power of the first controller, which helps to further improve the utilization of computing resources.
  • the method further includes: the second controller sending the second control instruction to the actuator.
  • both the first controller and the second controller can send control instructions generated by each to the actuator.
  • the actuator can perform corresponding control operations according to the first control instruction sent by the first controller and the second control instruction sent by the second controller.
  • the first control instruction includes first identification information
  • the second control instruction includes second identification information, where the first identification information and the second identification information are different.
  • After receiving the first control instruction and the second control instruction, the actuator can perform the corresponding control operation according to the first identification information and the second identification information.
  • the first identification information may be a first controller area network identification (CAN ID)
  • the second identification information may be a second CAN ID.
  • The actuator can store a correspondence between identification information (for example, a CAN ID) and control-instruction priority. For example, the priority of the control instruction corresponding to the first CAN ID is higher than that of the control instruction corresponding to the second CAN ID. In this way, when the actuator receives both the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and discard or not execute the second control instruction.
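As a sketch of the CAN-ID-based arbitration just described: the IDs, priority table, and message structure below are assumptions for illustration (note that on a real CAN bus a numerically lower identifier also wins bus arbitration, so mapping lower IDs to higher priority is the conventional choice):

```python
# Actuator-side priority table: maps CAN ID to a priority rank
# (lower rank value = higher priority). Values are illustrative.
PRIORITY_BY_CAN_ID = {0x100: 1, 0x200: 2}

def select_command(commands):
    """Pick the command whose CAN ID maps to the highest priority.

    Commands with unknown CAN IDs are ignored; returns None when no
    known command is present this cycle.
    """
    known = [c for c in commands if c["can_id"] in PRIORITY_BY_CAN_ID]
    if not known:
        return None
    return min(known, key=lambda c: PRIORITY_BY_CAN_ID[c["can_id"]])
```

When both controllers transmit, the command tagged with the primary controller's CAN ID wins; when only the backup's command arrives, it is executed directly, with no negotiation step.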
  • the first control instruction includes first priority information
  • the second control instruction includes second priority information.
  • The actuator can directly execute the control instruction with the higher priority. For example, if the first priority is higher than the second priority, then when the actuator receives the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and discard or not execute the second control instruction.
  • the first control instruction may include identification information of the first controller
  • the second control instruction may include identification information of the second controller.
  • The actuator may store a correspondence between controller identification information and the priority of the control instructions issued by that controller. For example, the priority of the control instructions issued by the first controller is higher than that of the control instructions issued by the second controller. In this way, when the actuator receives the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and discard or not execute the lower-priority second control instruction.
  • the method further includes: when the first controller fails, the first controller stops sending the first control instruction to the actuator.
  • When the first controller fails, it can stop sending the first control instruction to the actuator. In this way, when the actuator does not receive the first control instruction but receives the second control instruction, it can directly execute the second control instruction, avoiding a communication negotiation process when switching from the first controller to the second controller. By prohibiting the faulty first controller from sending control instructions to the actuator, the switch can be completed quickly, which helps to improve the switching speed between the first controller and the second controller. At the same time, because the second controller can quickly take over control of the vehicle, this helps to improve the safety performance of the vehicle.
  • The method further includes: when the first controller fails, the first controller stops sending the first control instruction; when it is determined that the first controller has failed and the second controller has not failed, the second controller sends the second control instruction to the actuator.
  • When it is determined that the first controller has failed and the second controller has not failed, the second controller can send the second control instruction to the actuator, so that when the first controller fails, control of the vehicle is switched from the first controller to the second controller.
  • The second controller determining that the first controller has failed includes: the second controller receives indication information sent by the first controller, where the indication information indicates that the first controller has failed.
  • the first controller may periodically send information to the second controller (for example, sensing results or information indicating whether the first controller is faulty).
  • The second controller may receive information sent by the first controller while a timer is running. If no information from the first controller has been received when the timer expires, the second controller may determine that the first controller has failed.
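The timer-based failure detection can be sketched as a simple watchdog; the timeout value, class, and method names here are assumptions, not from the patent:

```python
import time

class PeerWatchdog:
    """Declares the peer failed if no message arrives within `timeout`
    seconds. The timeout value is an assumption; the source does not
    specify one."""

    def __init__(self, timeout, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock           # injectable for testing
        self.last_seen = clock()

    def on_message(self):
        # Called whenever a periodic message (a sensing result, or
        # information indicating the peer's health) is received.
        self.last_seen = self.clock()

    def peer_failed(self):
        # Failure is declared once the quiet period exceeds the timeout.
        return self.clock() - self.last_seen > self.timeout
```

Using `time.monotonic` (rather than wall-clock time) makes the timeout robust against system clock adjustments, which matters for a safety mechanism like this.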
  • The method is applied to a vehicle, and before the first control instruction is sent to the actuator, the method further includes: determining that the vehicle is in an autonomous driving state. The method further includes: when the first controller fails, prompting the user to take over the vehicle.
  • the user when the first controller fails, the user can be prompted to take over the vehicle. This allows the user to quickly take over the vehicle after seeing the prompt, thus helping to ensure the user's driving safety.
  • The first controller fails at a first moment, and the method further includes: the first controller sends a third sensing result to the second controller, where the third sensing result includes the first controller's sensing result for the data collected by the sensors in the first sensor group within a first time period, and the first time period is before the first moment; the second controller controls the vehicle to stop traveling based on the third sensing result and the second sensing result.
  • In this way, when the first controller fails, the second controller can use the third sensing result, calculated before the first controller failed, together with the second sensing result to control the vehicle to stop driving, which helps to improve the safety of the vehicle.
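A minimal sketch of this safe-stop behavior, with illustrative (assumed) data structures: the backup retains the primary's last pre-failure sensing result and fuses it with its own to plan a controlled stop:

```python
class BackupController:
    """Safe-stop sketch: when the primary fails at a first moment, the
    backup keeps the primary's last sensing result (computed before the
    failure) and combines it with its own result to stop the vehicle.
    Names and the obstacle-list representation are illustrative."""

    def __init__(self):
        self.last_peer_result = None   # the "third sensing result"
        self.own_result = None         # the "second sensing result"

    def on_peer_result(self, obstacles):
        # Updated periodically while the primary is healthy.
        self.last_peer_result = obstacles

    def on_own_result(self, obstacles):
        self.own_result = obstacles

    def stop_plan(self):
        # Fuse both views (the peer's view may be slightly stale, but it
        # still covers sensors the backup cannot see) and command a stop.
        known = (self.last_peer_result or []) + (self.own_result or [])
        return {"cmd": "stop", "obstacles": known}
```

The design point is that even a stale view from the failed primary's sensor group is better than none while bringing the vehicle to a halt.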
  • At least some of the sensors in the first sensor group are different from the sensors in the second sensor group.
  • Because at least some of the sensors in the first sensor group are different from the sensors in the second sensor group, the first controller and the second controller can each sense data collected by different sensors, which helps to improve the utilization of computing resources in the vehicle.
  • That at least some of the sensors in the first sensor group are different from the sensors in the second sensor group can be understood in several ways: the second sensor group does not include at least some of the sensors in the first sensor group; or all of the sensors in the first sensor group differ from those in the second sensor group, that is, the two groups share no sensors; or the two groups are partly the same and partly different, that is, the first sensor group does not include some sensors in the second sensor group, and the second sensor group does not include some sensors in the first sensor group.
  • the first sensor group and the second sensor group include positioning sensors and millimeter wave radars.
  • the first sensor group and the second sensor group may include positioning sensors and millimeter wave radars.
  • The second controller can also use the data collected by the positioning sensor and the millimeter wave radar in the first sensor group to perform perception, which helps to improve the safety performance of the vehicle.
  • the second sensor group includes a side-view camera.
  • the second controller can use the data collected by the side-view camera in the second sensor group to ensure safe parking of the vehicle.
  • In a second aspect, a control device is provided, including: a first control unit, configured to obtain a first sensing result according to data collected by sensors in a first sensor group; and a second control unit, configured to obtain a second sensing result according to data collected by sensors in a second sensor group. The second control unit is further configured to send the second sensing result to the first control unit, and the first control unit is configured to send a first control instruction to the actuator according to the first sensing result and the second sensing result.
  • The first control unit is further configured to send the first sensing result to the second control unit, and the second control unit is further configured to generate a second control instruction according to the first sensing result and the second sensing result.
  • the second control unit is also configured to send the second control instruction to the actuator.
  • the first control unit is also configured to stop sending the first control instruction to the actuator when the first control unit fails.
  • The first control unit is further configured to stop sending the first control instruction when the first control unit fails; the second control unit is configured to send the second control instruction to the actuator when it is determined that the first control unit has failed and the second control unit has not failed.
  • The first control unit is further configured to determine that the vehicle is in an autonomous driving state before sending the first control instruction to the actuator; the first control unit is further configured to prompt the user to take over the vehicle when the first control unit fails.
  • The first control unit fails at a first moment, and the first control unit is further configured to send a third sensing result to the second control unit, where the third sensing result includes the first control unit's sensing result for the data collected by the sensors in the first sensor group within a first time period, and the first time period is before the first moment; the second control unit is further configured to control the vehicle to stop traveling according to the third sensing result and the second sensing result.
  • At least some of the sensors in the first sensor group are different from the sensors in the second sensor group.
  • the first sensor group and the second sensor group include positioning sensors and millimeter wave radars.
  • the second sensor group includes a side-view camera.
  • In a third aspect, a device is provided, including: a memory for storing computer instructions; and a processor for executing the computer instructions stored in the memory, so that the device performs the method in the first aspect.
  • A fourth aspect provides a vehicle, which includes the device described in the second aspect or the third aspect.
  • the means of transportation is a vehicle.
  • A computer program product is provided, including computer program code.
  • When the computer program code runs on a computer, it causes the computer to execute the method in the first aspect.
  • the above computer program code may be stored in whole or in part on the first storage medium, where the first storage medium may be packaged together with the processor, or may be packaged separately from the processor, which is not specifically limited in the embodiments of the present application.
  • A computer-readable medium is provided, which stores program code.
  • When the program code runs on a computer, it causes the computer to execute the method in the first aspect.
  • Embodiments of the present application provide a chip system.
  • the chip system includes a processor for calling a computer program or computer instructions stored in a memory, so that the processor executes the method described in the first aspect.
  • the processor is coupled with the memory through an interface.
  • the chip system further includes a memory, and a computer program or computer instructions are stored in the memory.
  • Figure 1 is a schematic functional block diagram of a vehicle provided by an embodiment of the application.
  • Figure 2 is a schematic diagram of the system architecture provided by the embodiment of the present application.
  • Figure 3 is another schematic diagram of the system architecture provided by the embodiment of the present application.
  • Figure 4 is a schematic flow chart of the control method provided by the embodiment of the present application.
  • Figure 5 is a schematic block diagram of a control device provided by an embodiment of the present application.
  • Prefixes such as "first" and "second" are used in the embodiments of this application only to distinguish different described objects, and do not limit the position, order, priority, quantity, or content of the described objects. Such ordinal words and other prefixes used to distinguish described objects do not constitute a limitation on those objects.
  • "Plural" means two or more.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 may include a perception system 120 , a display device 130 , and a computing platform 150 , where the perception system 120 may include several types of sensors that sense information about the environment surrounding the vehicle 100 .
  • The perception system 120 may include a positioning system, which may be the global positioning system (GPS), the BeiDou system, or another positioning system, or an inertial measurement unit (IMU).
  • the perception system 120 may also include one or more of lidar, millimeter wave radar, ultrasonic radar, and camera devices.
  • the computing platform 150 may include processors 151 to 15n (n is a positive integer).
  • the processor is a circuit with signal processing capabilities.
  • For example, the processor may be a circuit with instruction reading and execution capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), or a digital signal processor (DSP).
  • Alternatively, the processor can realize certain functions through the logic of a hardware circuit, where the logic of the hardware circuit is either fixed or reconfigurable.
  • For example, the processor may be a hardware circuit implemented by an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field-programmable gate array (FPGA).
  • For a reconfigurable hardware circuit, the process of the processor loading a configuration file to configure the hardware circuit can be understood as the processor loading instructions to implement the functions of some or all of the above units.
  • The processor may also be a hardware circuit designed for artificial intelligence, which can be understood as a type of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU).
  • the computing platform 150 may also include a memory, which is used to store instructions. Some or all of the processors 151 to 15n may call instructions in the memory and execute the instructions to implement corresponding functions.
  • The vehicle 100 may include an advanced driving assistance system (ADAS), which uses a variety of sensors on the vehicle (including but not limited to lidar, millimeter wave radar, cameras, ultrasonic sensors, the global positioning system, and an inertial measurement unit) to acquire information from around the vehicle, and analyzes and processes the acquired information to implement functions such as obstacle perception, target recognition, vehicle positioning, path planning, and driver monitoring/reminding, thereby improving the safety, level of automation, and comfort of vehicle driving.
  • ADAS systems generally include three main functional modules: perception module, decision-making module and execution module.
  • The perception module senses the environment around the vehicle body through sensors and inputs corresponding real-time data to the decision-making processing center; it mainly includes on-board cameras, ultrasonic radar, millimeter wave radar, lidar, and the like. The decision-making module uses computing devices and algorithms to make corresponding decisions based on the information obtained by the perception module. The execution module takes corresponding actions after receiving the decision signal from the decision-making module, such as driving, changing lanes, steering, braking, and warning.
  • L0 level is no automation
  • L1 level is driving support
  • L2 level is partial automation
  • L3 level is conditional automation
  • L4 level is high automation
  • L5 level is complete automation.
  • From L1 to L3, the tasks of monitoring and responding to road conditions are completed jointly by the driver and the system, and the driver is required to take over dynamic driving tasks when necessary.
  • L4 and L5 levels allow the driver to completely transform into a passenger role.
  • Functions that ADAS can implement mainly include but are not limited to: adaptive cruise, automatic emergency braking, automatic parking, blind spot monitoring, front cross-traffic warning/braking, rear cross-traffic warning/braking, forward collision warning, lane departure warning, lane keeping assist, rear collision warning, traffic sign recognition, traffic jam assist, highway assist, etc.
  • automatic parking can include APA, RPA, AVP, etc.
  • For APA, the driver does not need to control the steering wheel but still needs to control the accelerator and brake in the vehicle; for RPA, the driver can use a terminal (such as a mobile phone) outside the vehicle to park the vehicle remotely; for AVP, parking is completed without a driver in the vehicle.
  • APA is approximately at the L1 level
  • RPA is approximately at the L2-L3 level
  • AVP is approximately at the L4 level.
  • Embodiments of the present application provide a control method, device and vehicle.
  • Two controllers work together in a load-sharing manner: each controller handles different services and sends its sensing results to the peer controller through a communication bus between the controllers, so that each controller can obtain the sensing results of all sensors. This helps to improve the utilization of computing resources and also helps to reduce the cost of the controllers.
  • Figure 2 shows a schematic diagram of the system architecture provided by the embodiment of the present application.
  • the system architecture includes sensor group A, sensor group B, controller A, controller B, and body actuators 1-n.
  • the sensors in sensor group A can be connected to controller A, and the sensors in sensor group B can be connected to controller B.
  • Controller A can send the generated vehicle control instructions to vehicle control bus A, and controller B can send the generated vehicle control instructions to vehicle control bus B.
  • The system architecture shown in Figure 2 can be applied in autonomous driving scenarios requiring high functional safety, high reliability, and high performance. It is a new interconnected, interactive, software-hardware integrated architecture for vehicle-mounted autonomous driving controllers.
  • Sensor group A and sensor group B include but are not limited to a certain number of camera devices, laser radar, millimeter wave radar, ultrasonic radar, GPS, IMU, etc. At the same time, some sensors in sensor group A are allowed to be connected to controller B, and some sensors in sensor group B are allowed to be connected to controller A.
  • Sensors that output data through a controller area network (CAN) bus or a CAN with flexible data-rate (CANFD) bus can be connected to controller A and controller B respectively.
  • Controller A and controller B are capable of performing perception calculations on externally input sensor data to identify information about the vehicle's surroundings, and of controlling vehicle behavior through a series of calculation processes.
  • controller A and controller B may be interconnected through a communication bus.
  • the communication bus may be an Ethernet bus or a CAN bus.
  • controller A and controller B shown in Figure 2 may be located in the above-mentioned ADAS system.
  • The system architecture shown in Figure 2 is described as including two sensor groups and two controllers by way of example.
  • The embodiments of the present application do not specifically limit the number of sensor groups and controllers.
  • the system architecture may also include three (or more than three) sensor groups and three (or more than three) controllers.
  • the system architecture may include sensor group A, sensor group B, sensor group C, controller A, and controller B. Sensor group A and sensor group C can access controller A and sensor group B can access controller B.
  • the system architecture may include sensor group A, sensor group B, controller A, controller B, and controller C. Sensor group A can be connected to controller A, and sensor group B can be connected to controller B and controller C respectively.
  • controller A and controller B can jointly process important services in the autonomous driving service, such as the processing of sensing results, in a load balancing manner.
  • Controller A calculates the data collected by the sensors in sensor group A connected to itself to obtain a first perception result.
  • The first perception result may include environmental information around the vehicle (including but not limited to lane line information, obstacle information, traffic signs, location information, etc.).
  • controller B can also calculate the data collected by the sensors in sensor group B connected to itself to obtain a second perception result.
  • the second perception result can include environmental information around the vehicle.
  • After the perception calculation, controller A and controller B can exchange structured data, so that each controller can obtain the perception results of the other. In this way, perception computing is processed separately on the two controllers, and the calculation results are shared between them.
  • Therefore, controller A can use the computing power of controller B, and controller B can use the computing power of controller A, which helps to improve the utilization of computing resources. At the same time, each controller only needs to process the data collected by the sensors in its own sensor group, so neither controller needs to have high computing performance, which helps to reduce the cost of the controllers.
  • the body actuators 1-n may include body actuators on the vehicle used to control the lateral and longitudinal behavior of the vehicle.
  • the body actuators 1-n may include a motor control unit (integrated power unit, IPU), an electric power steering system (EPS), an electronic brake system (EBS), etc.
  • These body actuators are responsible for receiving the vehicle control instructions output by controller A and/or controller B, thereby implementing the control of the vehicle by controller A or controller B.
  • the controllers can be connected to the body actuators through two different vehicle control buses.
  • Controller A is connected to the body actuator 1-n through the vehicle control bus A
  • controller B is connected to the body actuators 1-n through the vehicle control bus B.
  • Controller A can send vehicle control instructions to vehicle control bus A through the CAN bus or CANFD bus
  • controller B can send vehicle control instructions to vehicle control bus B through the CAN bus or CANFD bus.
  • Controller A and controller B can control the behavior of the vehicle through the vehicle control bus.
  • For example, if the body actuator supports control of the vehicle through two-way vehicle control instructions, then when controller A and controller B are both normal, controller A and controller B send vehicle control instructions through vehicle control bus A and vehicle control bus B respectively. If the priority of the vehicle control instructions on vehicle control bus A is higher than that of the vehicle control instructions on vehicle control bus B, then the body actuator obtains the vehicle control instructions from vehicle control bus A. At this time, controller A dominates the control of the vehicle.
  • When controller A fails, controller A stops sending vehicle control instructions to vehicle control bus A, and the body actuator instead receives vehicle control instructions from vehicle control bus B. At this time, control of the vehicle switches from controller A to controller B. At the same time, controller A (or controller B) can also control the prompt device to prompt the user to take over the vehicle and/or control the vehicle to pull over. If controller B fails, controller B stops sending vehicle control instructions through vehicle control bus B, and the vehicle remains under the control of controller A; at the same time, controller A (or controller B) can also control the prompt device to prompt the user to take over the vehicle and/or control the vehicle to pull over.
  • When controller A fails, it can stop sending vehicle control instructions to the body actuator. In this way, the body actuator no longer receives the vehicle control instructions sent by controller A and instead receives the vehicle control instructions sent by controller B.
  • the vehicle control command sent by controller B can then be executed directly, avoiding a communication negotiation process when switching between controller A and controller B, which helps to increase the switching speed between the two controllers; at the same time, because controller B can quickly take over control of the vehicle, this helps to improve the safety performance of the vehicle.
  • controller A and controller B may negotiate so that only one of them issues vehicle control commands.
  • When controller A and controller B are both normal, controller A can send vehicle control instructions to the body actuator through vehicle control bus A.
  • When controller A fails and controller B is normal, controller A stops sending vehicle control instructions to the body actuators through vehicle control bus A, and controller B can send vehicle control instructions to the body actuators through vehicle control bus B.
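The actuator-side takeover logic described above can be reduced to a small selection rule: commands on vehicle control bus A take priority, a failed controller simply falls silent, and no master-slave negotiation is needed for the switch. The function name, the None-as-silence convention, and the emergency-brake fallback string are assumptions for this sketch.

```python
def select_command(cmd_bus_a, cmd_bus_b):
    """Body-actuator arbitration sketch: bus A has priority over bus B;
    a controller that fails stops sending, so the actuator falls back
    to the other bus automatically (names are illustrative)."""
    if cmd_bus_a is not None:        # controller A normal: A dominates
        return cmd_bus_a
    if cmd_bus_b is not None:        # bus A silent: switch to controller B
        return cmd_bus_b
    return "EMERGENCY_BRAKE"         # both silent: brake to a stop

assert select_command("steer_A", "steer_B") == "steer_A"
assert select_command(None, "steer_B") == "steer_B"
assert select_command(None, None) == "EMERGENCY_BRAKE"
```

Because the switch is driven purely by which bus is carrying commands, controllers from different vendors can pair up without a shared negotiation protocol.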
  • controller A can include visual perception module A, lidar perception module A, millimeter wave perception module A, position positioning module A, local perception fusion module A, perception result extraction module A, global perception fusion module A, planning control module A, vehicle control command issuing module A, hardware monitoring module A, software monitoring module A, fault management module A, master-slave management module A and time synchronization module A.
  • Controller B may include visual perception module B, lidar perception module B, millimeter wave perception module B, position positioning module B, local perception fusion module B, perception result extraction module B, global perception fusion module B, planning control module B, vehicle control command issuing module B, hardware monitoring module B, software monitoring module B, fault management module B, master-slave management module B and time synchronization module B.
  • Controller A and controller B perform time synchronization through time synchronization module A and time synchronization module B, so that the time on controller A and the time on controller B remain synchronized.
  • Either controller A or controller B can serve as the primary controller, and the other can serve as the backup controller.
  • the visual perception module A, lidar perception module A, millimeter wave perception module A and position positioning module A on controller A can respectively process the data collected by the camera device in sensor group A, the data collected by the lidar, the data collected by the millimeter wave radar, and the data collected by the GPS/IMU to obtain corresponding sensing results.
  • the visual perception module B, lidar perception module B, millimeter wave perception module B and position positioning module B on controller B can respectively process the data collected by the camera device in sensor group B, the data collected by the lidar, the data collected by the millimeter wave radar, and the data collected by the GPS/IMU to obtain corresponding sensing results.
  • controller A and controller B both have a visual perception module, a lidar perception module, a millimeter wave perception module and a position positioning module.
  • sensor group A may include a front-view long-range camera, a front-view short-range camera, surround-view cameras (such as a front-view camera, a rear-view camera, a left-view camera, and a right-view camera), a forward lidar, a backward lidar, a GPS, and an IMU.
  • the controller A may include a visual perception module A, a lidar perception module A, and a position positioning module A.
  • sensor group B may include side-view cameras (for example, a left front-view camera, a right front-view camera, a left rear-view camera, and a right rear-view camera), a GPS, and an IMU.
  • the controller B may include a visual perception module B and a position positioning module B.
  • when controller A is responsible for the autonomous driving service of the vehicle, sensor group A may include a front-view long-range camera, a front-view short-range camera, surround-view cameras (such as a front-view camera, a rear-view camera, a left-view camera, and a right-view camera), a GPS, and an IMU.
  • the controller A may include a visual perception module A and a position positioning module A.
  • sensor group B may include side-view cameras (for example, a left front-view camera, a right front-view camera, a left rear-view camera, and a right rear-view camera), a forward lidar, and a backward lidar.
  • controller B may include visual perception module B and lidar perception module B.
  • controller A and controller B may each include a visual perception module, a lidar perception module, a millimeter wave perception module and a position positioning module. In this way, no matter what types of sensors are included in the sensor group, controller A and controller B can process the data collected by them.
  • controller A and controller B may set sensing modules according to the types of sensors in the connected sensor group. For example, when sensor group A does not include lidar, controller A may not include lidar sensing module A; for example, when sensor group B does not include positioning sensors, controller B may not include a position positioning module.
  • Millimeter-wave radars and positioning sensors that output data through the CAN bus or CANFD bus can be connected to controller A and controller B at the same time.
  • the millimeter wave radar and positioning sensor in sensor group A can be connected to controller A and controller B respectively.
  • if controller A fails and the millimeter wave radar and positioning sensor in sensor group A are normal, controller B can also use the data collected by the millimeter wave radar and positioning sensor in sensor group A to make the vehicle stop in its own lane or pull over and park.
  • the local perception fusion module A on controller A receives the perception results of the visual perception module A, the lidar perception module A, the millimeter wave perception module A and the position positioning module A, and fuses these perception results to obtain a model of the environment information around the vehicle in which the sensors on controller A are placed in the same temporal and spatial coordinate system.
  • the local perception fusion module B on controller B receives the perception results of the visual perception module B, the lidar perception module B, the millimeter wave perception module B and the position positioning module B, and fuses these perception results to obtain a model of the environment information around the vehicle in which the sensors on controller B are placed in the same temporal and spatial coordinate system.
  • the above models of vehicle surrounding environment information include but are not limited to: lane line information, traffic sign information (such as traffic light information, speed limit sign information, etc.), obstacle information on the road, etc.
  • the perception result extraction module A on the controller A selects and extracts the data in the local perception fusion module A, and sends the selected and extracted data to the global perception fusion module B.
  • the global perception fusion module B can further fuse the fusion results obtained by the local perception fusion module B and the data sent by the perception result extraction module A.
  • the perception result extraction module B on the controller B selects and extracts the data in the local perception fusion module B, and sends the selected and extracted data to the global perception fusion module A.
  • the global perception fusion module A can further fuse the fusion results obtained by the local perception fusion module A and the data sent by the perception result extraction module B.
  • the methods by which the perception result extraction module A selects and extracts the data in the local perception fusion module A include but are not limited to the following:
  • if the computing power of both controller A and controller B is strong, their memory space is large, or the high-speed bus bandwidth between the two controllers is sufficient, then after the local perception fusion modules on controller A and controller B complete fusion, all data can be synchronized to the opposite controller in real time, so that both controllers can access all information to the greatest extent.
  • the perception result extraction module A can filter the data fused by the local perception fusion module A and sort it according to the criticality of the information.
  • the sorting method can use key directions, distance, etc.
  • Perception result extraction module A first removes non-key information in non-key directions (such as obstacle information in the left and right directions, and long-distance object information in the backward direction) and long-distance information in key directions (such as obstacle information 200 meters forward). If the performance of controller B is still insufficient at this point, perception result extraction module A can further prune this information while retaining other important information and passing it to global perception fusion module B. It should be understood that the above screening process may be completed during the controller performance testing phase.
  • the perception result extraction module A will give priority to sending the forward, right front, right rear, and rear obstacle information required for parking to global perception fusion module B. For obstacle information in other directions, less information, or none, may be sent. It should be understood that for vehicles traveling on the left, perception result extraction module A can instead give priority to sending the forward, left front, left rear, and rear obstacle information required for parking to global perception fusion module B.
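A minimal sketch of the screening performed by perception result extraction module A, assuming obstacle tracks with a direction label and a range. The field names, the 150 m threshold, and the nearest-first criticality ordering are illustrative assumptions.

```python
def extract_for_peer(tracks, key_directions, max_range_m=150.0):
    """Sketch of perception-result extraction: drop non-key directions
    and long-range objects first, keeping what the peer controller needs
    for its function (e.g. pulling over). Thresholds are assumptions."""
    kept = [t for t in tracks
            if t["direction"] in key_directions and t["range_m"] <= max_range_m]
    # Order by criticality: nearer obstacles first.
    return sorted(kept, key=lambda t: t["range_m"])

tracks = [
    {"id": 1, "direction": "forward", "range_m": 200.0},  # too far: pruned
    {"id": 2, "direction": "left",    "range_m": 10.0},   # non-key: pruned
    {"id": 3, "direction": "forward", "range_m": 30.0},
    {"id": 4, "direction": "rear",    "range_m": 50.0},
]
out = extract_for_peer(
    tracks,
    key_directions={"forward", "right_front", "right_rear", "rear"})
assert [t["id"] for t in out] == [3, 4]
```

Tightening `max_range_m` or shrinking `key_directions` is the knob that keeps the transfer within the peer controller's performance budget.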
  • the data sent by the perception result extraction module A to the global perception fusion module B and the data sent by the perception result extraction module B to the global perception fusion module A can be transmitted through the high-speed Ethernet bus.
  • the range, type, and data volume of data selection on controller A and controller B may be the same or different, and may depend on the information required for business deployment on the opposite controller.
  • the planning control module A on the controller A plans and calculates the vehicle trajectory and generates the corresponding vehicle control instructions A according to the function deployment strategy of the autonomous driving business on the controller A.
  • the planning control module A can send the generated vehicle control instruction A to the vehicle control instruction issuing module A.
  • the planning control module B on the controller B plans and calculates the vehicle trajectory according to the function deployment strategy of the automatic driving business on the controller B and generates the corresponding vehicle control instructions B.
  • the planning control module B can send the generated vehicle control instruction B to the vehicle control instruction issuing module B.
  • the function deployment on controller A and controller B may have different strategies.
  • controller A deploys a high-speed cruise function and controller B deploys a pull-over function.
  • the trajectory planning and motion control strategies on the two controllers may be different.
  • For another example, the same function may be deployed on controller A and controller B.
  • the high-speed cruise function is deployed on both controller A and controller B.
  • the same trajectory planning and motion control strategies can be used on the two controllers.
  • controller A deploys the high-speed cruise function and the pull-over function
  • controller B deploys the pull-over function
  • the vehicle control instruction issuing module A and the vehicle control instruction issuing module B can output the vehicle control instruction to the vehicle control bus in the following two ways.
  • the body actuator can receive the vehicle control command A sent by the vehicle control command issuing module A and receive the vehicle control command B sent by the vehicle control command issuing module B.
  • the vehicle control instruction A includes first identification information
  • the vehicle control instruction B includes second identification information
  • the first identification information and the second identification information are different.
  • the body actuator may store the corresponding relationship between the identification information and the priority corresponding to the vehicle control command.
  • the actuator can save the corresponding relationship between CAN ID and priority in Table 1.
  • when the body actuator receives vehicle control command A and vehicle control command B, it can parse out the CAN ID of vehicle control command A and the CAN ID of vehicle control command B. If the CAN ID in vehicle control command A is 1 and the CAN ID in vehicle control command B is 2, then the body actuator can execute the higher-priority vehicle control command A according to the corresponding relationship shown in Table 1 above, without executing vehicle control command B.
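The CAN-ID-based arbitration can be sketched as a table lookup on the actuator side. The priority mapping below merely mirrors the role of Table 1; the concrete IDs and payload strings are assumptions.

```python
# Sketch of the body actuator's priority arbitration by CAN ID.
# Mirrors the role of Table 1 (lower value = higher priority).
CAN_ID_PRIORITY = {1: 0, 2: 1}

def pick_command(commands):
    """Given received commands as (can_id, payload) pairs, select only
    the highest-priority one for execution."""
    return min(commands, key=lambda c: CAN_ID_PRIORITY[c[0]])

cmd_a = (1, "decelerate")   # from controller A, CAN ID 1
cmd_b = (2, "pull over")    # from controller B, CAN ID 2
assert pick_command([cmd_b, cmd_a]) == cmd_a
```

The same lookup shape works for the controller-identification variant described below; only the key used for the table changes.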
  • vehicle control instruction A includes first priority information
  • vehicle control instruction B includes second priority information
  • the first priority is higher than the second priority.
  • vehicle control instruction A may include identification information of controller A
  • vehicle control instruction B may include identification information of controller B
  • the body actuator may store the corresponding relationship between the identification information of the controller and the priority corresponding to the vehicle control command.
  • Table 2 shows the corresponding relationship between the identification information of a controller and the priority corresponding to the vehicle control instruction.
  • After the body actuator receives vehicle control instruction A and vehicle control instruction B, it can parse out the identification information of the controller in vehicle control instruction A and the identification information of the controller in vehicle control instruction B.
  • the body actuator can execute the vehicle control command A with a higher priority instead of executing the vehicle control command B according to the corresponding relationship shown in Table 2 above.
  • the above process, in which the vehicle body actuator determines the priority of vehicle control instruction A and vehicle control instruction B through the information carried in those instructions, is merely illustrative; the embodiments of the present application are not limited thereto.
  • the vehicle control command can also carry other information to determine its priority. For example, if the first field in vehicle control command A carries certain information but the first field in vehicle control command B does not, then the body actuator can determine that the priority of vehicle control command A is higher than the priority of vehicle control command B.
  • the priority of vehicle control instructions can also be determined through the vehicle control bus. For example, the priority of vehicle control instructions on vehicle control bus A is higher than the priority of vehicle control instructions on vehicle control bus B.
  • controller A and controller B are both normal (or both are in a healthy working state)
  • controller A and controller B both issue vehicle control instructions.
  • controller A actually controls the operation of the vehicle.
  • when controller A fails, or when controller A is unable to use the existing sensor resources and computing capabilities to control the vehicle, the master-slave management module A prohibits the vehicle control instruction issuing module A from issuing vehicle control instructions onto vehicle control bus A.
  • master-slave management module B allows vehicle control command issuing module B to deliver vehicle control command B to vehicle control bus B. At this time, the control authority of the vehicle can be quickly switched from controller A to controller B.
  • controller A (or controller B) can also control the prompt device to prompt the driver to take over control of the vehicle.
  • the prompt device includes one or more of a display screen, an ambient light, and a voice module.
  • the display screen can be controlled to display the prompt message "Please take over the vehicle".
  • the color of the ambient light can be controlled to turn red to prompt the driver to take over the vehicle.
  • the voice module can be controlled to send out the voice message "Please take over the vehicle" to prompt the driver to take over the vehicle.
  • suppose controller A fails at time T1; perception result extraction module A can then send to global perception fusion module B the perception results of the data collected by the sensors in sensor group A during the period from time T0 to time T1.
  • the global perception fusion module B can further fuse the perception results sent by perception result extraction module A with controller B's perception fusion results of the data collected by the sensors in sensor group B, thereby improving the safety with which controller B controls the vehicle to stop in its lane or pull over and park.
  • When controller B also fails, or when controller B cannot use the existing sensor resources and computing capabilities to control the vehicle, the master-slave management module B prohibits the vehicle control instruction issuing module B from sending vehicle control instruction B onto vehicle control bus B. At this time, neither vehicle control bus A nor vehicle control bus B carries any vehicle control instructions, and the body actuator performs emergency braking to decelerate the vehicle to a stop.
  • both controllers may send vehicle control instructions to the body actuator.
  • control of the vehicle can be switched quickly by having the failed controller stop issuing vehicle control instructions.
  • the vehicle control command switching solution provided by the embodiments of this application does not require master-slave negotiation between systems, which makes it easier for controllers from heterogeneous manufacturers to form a master-backup system.
  • the body actuator can receive only one vehicle control command; that is, the body actuator receives only the vehicle control command sent by vehicle control command issuing module A, or only the vehicle control command sent by vehicle control command issuing module B.
  • controller A can be set as the main controller, and controller A will send vehicle control instructions first;
  • controller B can be set as a backup controller, and controller B will not send vehicle control instructions.
  • when controller A and controller B are both normal, master-slave management module A and master-slave management module B perform master-backup selection between the two controllers; at this time, vehicle control command issuing module A is allowed to send vehicle control commands, and vehicle control command issuing module B is prohibited from sending vehicle control commands.
  • when controller A fails, the master-slave management module A prohibits vehicle control command issuing module A from sending vehicle control command A onto vehicle control bus A.
  • the master-slave management module B determines whether the controller B is normal. If the controller B is normal, the vehicle control command issuance module B is allowed to issue the vehicle control command at this time.
  • the master-slave management module A may periodically send indication information to the master-slave management module B.
  • the indication information is used to indicate whether the controller A is normal.
  • the master-slave management module B can periodically send indication information to the master-slave management module A, and the indication information is used to indicate whether the controller B is normal.
  • the master-slave management module B determines that controller A is faulty and controller B is normal, it can allow vehicle control command issuing module B to issue vehicle control commands.
  • the master-slave management module A may maintain a timer. If the information sent by master-slave management module B is received while the timer is running, master-slave management module A may consider controller B to be normal; if the information sent by master-slave management module B is not received while the timer is running, master-slave management module A may consider controller B to be faulty. Similarly, the master-slave management module B may maintain a timer.
  • If the information sent by master-slave management module A is received while the timer is running, master-slave management module B can consider controller A to be normal; if the information sent by master-slave management module A is not received while the timer is running, master-slave management module B can consider controller A to be faulty.
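The timer-based liveness check described above can be sketched as follows; the class name, the timeout value, and the use of a monotonic clock are illustrative assumptions.

```python
import time

class PeerMonitor:
    """Sketch of the master-slave heartbeat check: if no indication
    message arrives from the peer controller before the timer expires,
    the peer is considered faulty (timeout value is an assumption)."""
    def __init__(self, timeout_s=0.1):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.peer_reported_normal = None

    def on_indication(self, peer_is_normal: bool):
        # Called for each periodic indication message from the peer,
        # which states whether the peer controller is normal.
        self.last_heartbeat = time.monotonic()
        self.peer_reported_normal = peer_is_normal

    def peer_alive(self):
        # The peer is alive only if a message arrived within the window.
        return (time.monotonic() - self.last_heartbeat) < self.timeout_s

m = PeerMonitor(timeout_s=0.05)
m.on_indication(True)
assert m.peer_alive()
time.sleep(0.06)          # heartbeat missed: timer window elapsed
assert not m.peer_alive()
```

Each controller runs one such monitor for its peer; a silent peer is treated the same as a peer that explicitly reports a fault.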
  • the hardware monitoring module A on the controller A can monitor the fault status of the hardware system on the controller A in real time. If there is a fault status, the fault information is reported to the fault management module A.
  • the hardware monitoring module B on the controller B can monitor the fault status of the hardware system on the controller B in real time. If there is a fault status, the fault information is reported to the fault management module B.
  • the software monitoring module A on the controller A monitors the health status of the software on the controller in real time. If a fault occurs, the fault information is reported to the fault management module A.
  • the software monitoring module B on the controller B monitors the health status of the software on the controller in real time. If a fault occurs, the fault information is reported to the fault management module B.
  • Fault management module A summarizes and grades software faults and hardware faults on controller A to determine whether faults that affect the autonomous driving business have occurred and give the severity of the fault's impact.
  • fault management module B summarizes and grades software faults and hardware faults on controller B to determine whether faults that affect the autonomous driving business occur and the severity of the faults.
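The summarizing-and-grading step of the fault management modules might be sketched like this; the numeric severity scale and the threshold at which a fault is considered to affect the autonomous driving service are assumptions.

```python
def grade_faults(hw_faults, sw_faults):
    """Sketch of a fault management module: summarize hardware and
    software faults and grade the worst impact on the autonomous
    driving service (severity scale is an assumption)."""
    severities = [f["severity"] for f in hw_faults + sw_faults]
    worst = max(severities, default=0)   # 0 = no fault reported
    affects_ads = worst >= 2             # grade >= 2 affects the service
    return worst, affects_ads

hw = [{"source": "camera_link", "severity": 1}]
sw = [{"source": "planner",     "severity": 3}]
assert grade_faults(hw, sw) == (3, True)
assert grade_faults([], []) == (0, False)
```

The returned grade is what the master-slave management module would consult when deciding whether to stop issuing vehicle control commands.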
  • the fault management module A and the fault management module B run at the automotive safety integrity level D (ASIL-D) in their respective controllers.
  • the master-slave management module A obtains the fault information of the controller A from the fault management module A
  • the master-slave management module B obtains the fault information of the controller B from the fault management module B.
  • Master-slave management module A and master-slave management module B can run at the functional safety level of ASIL-D in their respective controllers.
  • the master-slave management module A and the master-slave management module B communicate through two heterogeneous buses between the two controllers, such as a CAN bus and an Ethernet bus, to notify the other end of their own health status and whether they are sending vehicle control instructions.
  • two controllers can respectively access different sensor groups, and the two controllers respectively perform perception calculations on the data collected by the sensors in these sensor groups.
  • Each controller sends the calculated structured data to the peer controller, so each controller can obtain the sensing results of all sensors. In this way, the sensing and computing capabilities of both controllers can be effectively utilized, which helps to improve the utilization of computing resources.
  • Figure 4 shows a schematic flow chart of a control method 400 provided by an embodiment of the present application.
  • the method 400 may be applied in a control system including a first controller and a second controller.
  • the control system may be located in the vehicle; or the control system may be located in the computing platform shown in FIG. 1; or the control system may be located in the ADAS system.
  • the method 400 includes:
  • the first controller obtains the first sensing result based on the data collected by the sensors in the first sensor group.
  • the first controller may be the above-mentioned controller A
  • the first sensor group may be the above-mentioned sensor group A.
  • the first controller may be a master controller.
  • the second controller obtains the second sensing result based on the data collected by the sensors in the second sensor group.
  • the second controller may be the above-mentioned controller B
  • the second sensor group may be the above-mentioned sensor group B.
  • the second controller is a backup controller.
  • the first sensor group and the second sensor group may include the same sensor.
  • At least some of the sensors in the first sensor group are different from the sensors in the second sensor group.
  • that the sensors in the first sensor group are different from the sensors in the second sensor group can be understood as meaning that the two groups are entirely different, that is, there are no identical sensors in the two sensor groups.
  • the first controller may be responsible for the automatic driving service and the second controller may be responsible for the safe parking function.
  • the first sensor group may include a forward-looking long-range camera, a forward-looking short-range camera, surround-view cameras (such as a front-view camera, a rear-view camera, a left-view camera, and a right-view camera), a forward lidar, a backward lidar, a GPS, and an IMU; the second sensor group may include side-view cameras (for example, a left front-view camera, a right front-view camera, a left rear-view camera, and a right rear-view camera). In this case, the sensors in the first sensor group and the sensors in the second sensor group are entirely different.
  • that the sensors in the first sensor group are different from the sensors in the second sensor group can also be understood as meaning that the two groups are partly the same and partly different; that is, the first sensor group does not include some sensors in the second sensor group, and the second sensor group does not include some sensors in the first sensor group.
  • the first controller may be responsible for the automatic driving service and the second controller may be responsible for the safe parking function.
  • the first sensor group may include a front-view long-range camera, a front-view short-range camera, a surround-view camera (such as a front-view camera, a rear-view camera, a left-view camera, a right-view camera), GPS, and an IMU;
  • the second sensor group can include side-view cameras (for example, left front-view camera, right front-view camera, left rear-view camera, right rear-view camera), forward-facing lidar, rear-facing lidar, GPS, and IMU.
  • both the first sensor group and the second sensor group have GPS and IMU, and the first sensor group does not include the side-view camera, forward lidar, and backward lidar in the second sensor group.
  • the second sensor group does not include the forward-looking long-range camera, the forward-looking short-range camera, and the surround-view cameras in the first sensor group.
  • S430 The second controller sends the second sensing result to the first controller.
  • the first controller receives the second sensing result sent by the second controller.
  • the second controller sends the second sensing result to the first controller, including: the second controller sends the second sensing result to the first controller through the CAN bus, the CANFD bus, or the Ethernet bus.
  • the second sensing result includes a part of the sensing results obtained by the second controller from the data collected by the sensors in the second sensor group.
  • the second controller may not send the sensing result of the location of the vehicle to the first controller.
  • the first controller sends a first control instruction to the actuator based on the first sensing result and the second sensing result.
  • the global perception fusion module A can further fuse the fusion results obtained by the local perception fusion module A and the data sent by the perception result extraction module B.
  • Planning control module A can generate vehicle control instructions based on the fusion results.
  • the vehicle control command issuing module A can send the vehicle control command to the body actuator.
  • the method 400 further includes: the first controller sending the first sensing result to the second controller.
  • the second controller receives the first sensing result sent by the first controller.
  • the second controller generates a second control instruction based on the first sensing result and the second sensing result.
  • the first sensing result includes part of the sensing results obtained by the first controller from the data collected by the sensors in the first sensor group.
  • the first controller may not send the sensing result of the location of the vehicle to the second controller.
  • the first controller can sense, through the data collected by the surround-view cameras, information about obstacles to the left and right of the vehicle, information about objects 100 meters behind, and information about obstacles 200 meters ahead.
  • the second sensing result sent by the second controller may carry only the information about obstacles 200 meters ahead, but not the information about obstacles to the left and right or about objects 100 meters behind.
  • the method 400 further includes: the second controller sending the second control instruction to the actuator.
  • the second controller can send the second control instruction to the body actuator through the CAN bus or the CANFD bus.
  • the first control instruction includes first identification information
  • the second control instruction includes second identification information, wherein the first identification information and the second identification information are different.
  • After receiving the first control instruction and the second control instruction, the actuator can perform the corresponding control operation according to the first identification information and the second identification information.
  • the first identification information may be a first CAN ID
  • the second identification information may be a second CAN ID.
  • the actuator can store a correspondence between identification information (for example, a CAN ID) and the priority of the corresponding control instruction. For example, the priority of the control instruction corresponding to the first CAN ID is higher than the priority of the control instruction corresponding to the second CAN ID. In this way, when the actuator receives the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and not execute the second control instruction.
  • the first control instruction includes first priority information
  • the second control instruction includes second priority information.
  • the actuator can directly execute the control instruction with the higher priority. For example, if the first priority is higher than the second priority, then when the actuator receives the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and not execute the second control instruction.
  • the first control instruction may include identification information of the first controller
  • the second control instruction may include identification information of the second controller.
  • the actuator may store a correspondence between a controller's identification information and the priority of the control instructions issued by that controller. For example, the priority of the first controller is higher than the priority of the second controller. In this way, when the actuator receives the first control instruction and the second control instruction, it can execute the higher-priority first control instruction and not execute the lower-priority second control instruction.
  • the method 400 further includes: when the first controller fails, the first controller stops sending the first control instruction to the actuator.
  • if fault management module A determines, from the monitoring results of hardware monitoring module A and/or software monitoring module A, that a fault has occurred in controller A, then fault management module A can notify master-slave management module A that controller A has failed. Master-slave management module A can then control vehicle control instruction issuing module A to stop issuing vehicle control instructions to vehicle control bus A.
  • the method 400 further includes: when the first controller fails, the first controller stops sending the first control instruction; when the second controller determines that the first controller has failed and that the second controller has not failed, the second controller sends the second control instruction to the actuator.
  • if fault management module A determines, from the monitoring results of hardware monitoring module A and/or software monitoring module A, that a fault has occurred in controller A, then fault management module A can notify master-slave management module A that controller A has failed. Master-slave management module A can then control vehicle control instruction issuing module A to stop issuing vehicle control instructions to vehicle control bus A. At the same time, master-slave management module A can also notify master-slave management module B that controller A has failed. After receiving the notification, master-slave management module B can switch the status of vehicle control instruction issuing module B from prohibiting issuing vehicle control instructions to vehicle control bus B to allowing issuing vehicle control instructions to vehicle control bus B.
  • the method is applied to a vehicle. Before the first control instruction is sent to the actuator, the method further includes: determining that the vehicle is in an autonomous driving state; and the method 400 further includes: prompting the user to take over the vehicle.
  • prompting the user to take over the vehicle includes: controlling the prompting device to prompt the user to take over the vehicle.
  • the user can be prompted to take over the vehicle by controlling one or more of the following: controlling the display screen to display prompt information, controlling the color change of the ambient light, and controlling the voice module to emit voice prompts.
  • the first controller fails at a first moment, and the method further includes: the first controller sends a third sensing result to the second controller, where the third sensing result includes sensing results obtained by the first controller, during a first time period, for the data collected by the sensors in the first sensor group, the first time period being before the first moment; and the second controller controls the vehicle to stop based on the third sensing result and the second sensing result.
  • sensors that output data through the CAN bus or CANFD bus can be connected to the first controller and the second controller respectively.
  • the first sensor group and the second sensor group include positioning sensors and millimeter-wave radars.
  • the second sensor group includes a side-view camera.
  • the second controller may be responsible for the safe parking of the vehicle.
  • including the side-view camera in the second sensor group can ensure that when the first controller fails, the second controller can achieve safe parking of the vehicle through the data collected by the side-view camera.
  • Embodiments of the present application also provide a device for implementing any of the above methods.
  • a device is provided that includes units (or means) for implementing each step performed by a vehicle in any of the above methods.
  • FIG. 5 shows a schematic block diagram of a control device 500 provided by an embodiment of the present application.
  • the device 500 includes: a first control unit 510, configured to obtain a first sensing result based on data collected by sensors in a first sensor group; and a second control unit 520, configured to obtain a second sensing result based on data collected by sensors in a second sensor group.
  • the second control unit 520 is also configured to send the second sensing result to the first control unit 510;
  • the first control unit 510 is configured to send a first control instruction to the actuator based on the first sensing result and the second sensing result.
  • the first control unit 510 is also configured to send the first sensing result to the second control unit; the second control unit 520 is also configured to determine based on the first sensing result and the second sensing result, Generate a second control instruction.
  • the second control unit 520 is also used to send the second control instruction to the actuator.
  • the first control unit 510 is also configured to stop sending the first control instruction to the actuator when the first control unit fails.
  • the first control unit 510 is also configured to stop sending the first control instruction when the first control unit fails; the second control unit 520 is configured to send the second control instruction to the actuator when it is determined that the first control unit has failed and the second control unit has not failed.
  • the first control unit 510 is also configured to determine, before sending the first control instruction to the actuator, that the vehicle is in an autonomous driving state; the first control unit 510 is also configured to, when the first control unit fails, control the prompting device to prompt the user to take over the vehicle.
  • the first control unit 510 fails at the first moment.
  • the first control unit 510 is also configured to send a third sensing result to the second control unit 520, where the third sensing result includes sensing results obtained by the first control unit, during a first time period, for the data collected by the sensors in the first sensor group, the first time period being before the first moment;
  • the second control unit 520 is also configured to control the vehicle to stop based on the third sensing result and the second sensing result.
  • At least some of the sensors in the first sensor group are different from the sensors in the second sensor group.
  • the first sensor group and the second sensor group include positioning sensors and millimeter wave radars.
  • the second sensor group includes a side-view camera.
  • the division of units in the above device is only a division by logical function. In actual implementation, the units may be fully or partially integrated into one physical entity, or may be physically separate.
  • the unit in the device can be implemented in the form of a processor calling software; for example, the device includes a processor, the processor is connected to a memory, instructions are stored in the memory, and the processor calls the instructions stored in the memory to implement any of the above methods.
  • the processor is, for example, a general-purpose processor, such as a CPU or a microprocessor
  • the memory is a memory within the device or a memory outside the device.
  • the units in the device can be implemented in the form of hardware circuits, and some or all of the functions of the units can be implemented through the design of the hardware circuit, which can be understood as one or more processors. For example, in one implementation, the hardware circuit is an ASIC, which implements the functions of some or all of the above units through the designed logical relationships of the components in the circuit. As another example, in another implementation, the hardware circuit can be implemented through a PLD; taking an FPGA as an example, it can include a large number of logic gate circuits whose connections are configured through a configuration file to implement the functions of some or all of the above units. All units of the above device may be implemented entirely by the processor calling software, entirely by hardware circuits, or partly by the processor calling software with the remainder implemented by hardware circuits.
  • the processor is a circuit with signal processing capabilities.
  • the processor may be a circuit with instruction reading and execution capabilities, such as a CPU, a microprocessor, a GPU, or a DSP; in another implementation, the processor can implement a certain function through the logical relationships of a hardware circuit, where those logical relationships are fixed or reconfigurable.
  • the processor is a hardware circuit implemented by ASIC or PLD. For example, FPGA.
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • it can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as NPU, TPU, DPU, etc.
  • each unit in the above device can be one or more processors (or processing circuits) configured to implement the above methods, for example: a CPU, GPU, NPU, TPU, DPU, microprocessor, DSP, ASIC, or FPGA, or a combination of at least two of these processor forms.
  • each unit in the above device may be integrated together in whole or in part, or may be implemented independently. In one implementation, these units are integrated together and implemented as a system-on-a-chip (SOC).
  • SOC may include at least one processor for implementing any of the above methods or implementing the functions of each unit of the device.
  • the at least one processor may be of different types, such as a CPU and an FPGA, a CPU and an artificial intelligence processor, or a CPU and a GPU.
  • Embodiments of the present application also provide a device that includes a processing unit and a storage unit, where the storage unit is used to store instructions, and the processing unit executes the instructions stored in the storage unit so that the device performs the methods or steps performed in the above embodiments.
  • the above-mentioned processing unit may be one of the processors 151 to 15n shown in Figure 1.
  • An embodiment of the present application also provides a transportation means, which may include the above control device 500.
  • the transportation means may be a vehicle such as an automobile.
  • Embodiments of the present application also provide a computer program product.
  • the computer program product includes: computer program code.
  • when the computer program code is run on a computer, it causes the computer to execute the above method.
  • Embodiments of the present application also provide a computer-readable medium.
  • the computer-readable medium stores program code.
  • when the program code is run on a computer, it causes the computer to perform the above method.
  • each step of the above method can be completed by instructions in the form of hardware integrated logic circuits or software in the processor.
  • the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware processor for execution, or can be executed by a combination of hardware and software modules in the processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in conjunction with its hardware. To avoid repetition, it will not be described in detail here.
  • the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor.
  • the size of the sequence numbers of the above processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and does not constitute any limitation on the implementation of the embodiments of this application.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solutions of this application, in essence, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of this application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.
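As an illustration of the method summarized in the points above, the following sketch shows the load-sharing flow: each controller perceives only its own sensor group, the sensing results are exchanged, and the first controller fuses both views before issuing a control instruction. All function names and data shapes here are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the load-sharing perception flow (S410-S440).
# Function names and data shapes are illustrative assumptions.

def perceive(sensor_data):
    # Stand-in for per-controller perception over one sensor group.
    return {"obstacles": sensor_data.get("obstacles", [])}

def fuse(first_result, second_result):
    # Global fusion: combine obstacle information from both controllers.
    return {"obstacles": first_result["obstacles"] + second_result["obstacles"]}

def first_controller_step(group_a_data, second_result):
    first_result = perceive(group_a_data)       # S410: perceive own sensor group
    fused = fuse(first_result, second_result)   # after S430: fuse peer's result
    # S440: derive a (trivial) control instruction from the fused view.
    return "brake" if fused["obstacles"] else "cruise"
```

Note that neither controller recomputes the other's perception: the exchanged structured results are fused directly, which is the source of the computing-resource savings described above.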


Abstract

Embodiments of this application provide a control method, a control apparatus, and a vehicle. The method includes: a first controller obtains a first sensing result based on data collected by sensors in a first sensor group; a second controller obtains a second sensing result based on data collected by sensors in a second sensor group; the first controller receives the second sensing result sent by the second controller; and the first controller sends a first control instruction to an actuator based on the first sensing result and the second sensing result. The embodiments of this application can be applied to intelligent vehicles or electric vehicles, and help improve the utilization of computing resources and reduce controller costs.

Description

Control Method, Apparatus, and Vehicle

Technical Field

Embodiments of this application relate to the field of intelligent driving, and more specifically, to a control method, a control apparatus, and a vehicle.

Background

From the perspective of international autonomous driving standards and functional safety requirements, the entity responsible in L3-and-above autonomous driving scenarios is the autonomous driving system. The design requirement for an autonomous driving system is "fail operational": after a fault occurs, the system continues to run the autonomous driving function and takes appropriate measures so that the vehicle exits autonomous driving safely.

For this application scenario, mainstream vendors currently use two autonomous driving controllers in "1:1" backup redundancy: two identical autonomous driving domain controllers are connected in parallel. One acts as the primary controller, running the complete autonomous driving service and outputting vehicle control instructions to control the vehicle. The other acts as the backup controller; when the primary controller fails, the backup takes over service processing and controls the vehicle's behavior. This requires both the primary and backup controllers to have high computing performance to meet the system's requirements. When the primary controller has not failed, the backup controller runs idle, wasting cost and computing resources.

Summary

Embodiments of this application provide a control method, a control apparatus, and a vehicle, which help improve the utilization of computing resources and reduce controller costs.

In the embodiments of this application, a transportation means may include one or more types of conveyances or movable objects that operate or move on land (e.g., highways, roads, railways), on water (e.g., waterways, rivers, oceans), or in the air. For example, the transportation means may include a car, bicycle, motorcycle, train, subway, airplane, ship, aircraft, robot, or other types of conveyances or movable objects.
In a first aspect, a control method is provided. The method includes: a first controller obtains a first sensing result based on data collected by sensors in a first sensor group; a second controller obtains a second sensing result based on data collected by sensors in a second sensor group; the first controller receives the second sensing result sent by the second controller; and the first controller sends a first control instruction to an actuator based on the first sensing result and the second sensing result.

In the embodiments of this application, the first controller and the second controller each perform perception on the data collected by the sensors in their respective sensor groups, obtaining the first and second sensing results. The first controller can generate and send the first control instruction to the actuator using both the first sensing result computed by itself and the second sensing result sent to it by the second controller. In this way, the first controller leverages the second controller's computing power, which helps improve the utilization of computing resources; at the same time, each controller only needs to process the data collected by the sensors in its own sensor group, so neither controller needs especially high computing performance, which helps reduce controller costs.

In some possible implementations, the first controller may be a primary controller and the second controller may be a backup controller.

With reference to the first aspect, in some implementations of the first aspect, the method further includes: the second controller receives the first sensing result sent by the first controller; and the second controller generates a second control instruction based on the first sensing result and the second sensing result.

In the embodiments of this application, the second controller can generate the second control instruction using the second sensing result computed by itself and the sensing result sent to it by the first controller. In this way, the second controller also leverages the first controller's computing power, which helps further improve the utilization of computing resources.

With reference to the first aspect, in some implementations of the first aspect, the method further includes: the second controller sends the second control instruction to the actuator.

In the embodiments of this application, both controllers may send their respectively generated control instructions to the actuator. The actuator can then perform the corresponding control operation based on the first control instruction sent by the first controller and the second control instruction sent by the second controller.

In some possible implementations, the first control instruction includes first identification information and the second control instruction includes second identification information, where the first identification information and the second identification information are different. After receiving both instructions, the actuator can perform the corresponding control operation according to the first and second identification information.

For example, the first identification information may be a first controller area network identifier (CAN ID), and the second identification information may be a second CAN ID. The actuator may store a correspondence between identification information (e.g., a CAN ID) and instruction priority, for example, that instructions carrying the first CAN ID have higher priority than instructions carrying the second CAN ID. When the actuator receives both instructions, it can execute the higher-priority first control instruction and discard, or simply not execute, the second control instruction.

In some possible implementations, the first control instruction includes first priority information and the second control instruction includes second priority information. The actuator then does not need to store a correspondence between identification information and instruction priority; it can directly execute the higher-priority instruction. For example, if the first priority is higher than the second priority, then when the actuator receives both instructions it can execute the higher-priority first control instruction and discard or not execute the second control instruction.

In some possible implementations, the first control instruction may include identification information of the first controller, and the second control instruction may include identification information of the second controller. The actuator may store a correspondence between a controller's identification information and the priority of the control instructions issued by that controller, for example, that instructions issued by the first controller have higher priority than instructions issued by the second controller. When the actuator receives both instructions, it can execute the higher-priority first control instruction and discard or not execute the lower-priority second control instruction.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: when the first controller fails, the first controller stops sending the first control instruction to the actuator.

In the embodiments of this application, when the first controller fails it can stop sending the first control instruction, so that when the actuator receives the second control instruction but not the first, it can directly execute the second control instruction. This avoids any communication negotiation when switching between the first and second controllers: the switch of control instructions is achieved quickly simply by prohibiting the first controller from sending instructions to the actuator, which helps speed up the handover between the two controllers. At the same time, because the second controller can quickly take over control of the vehicle, the vehicle's safety is improved.

With reference to the first aspect, in some implementations of the first aspect, the method further includes: when the first controller fails, the first controller stops sending the first control instruction; and when the second controller determines that the first controller has failed and that the second controller itself has not failed, the second controller sends the second control instruction to the actuator.

In the embodiments of this application, the second controller can send the second control instruction to the actuator upon determining that the first controller has failed and that it itself has not, thereby switching control of the vehicle from the first controller to the second controller when the first controller fails.

In some possible implementations, the second controller determining that the first controller has failed includes: the second controller receives indication information sent by the first controller, where the indication information indicates that the first controller has failed.

In some possible implementations, the first controller may periodically send information to the second controller (for example, sensing results or information indicating whether the first controller has failed). The second controller receives the first controller's information while a timer is running; if no information has been received from the first controller when the timer expires, the second controller can determine that the first controller has failed.
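The timer-based fault detection described above, in which the second controller tracks the first controller's periodic messages and declares a fault when no message arrives within the timeout window, can be sketched as follows; the class name and timeout value are illustrative assumptions.

```python
# Minimal watchdog sketch: the backup controller declares the primary
# controller faulty when its periodic messages stop arriving.
# The timeout value is an illustrative assumption.

class Watchdog:
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_seen = None

    def on_message(self, now):
        # Called whenever a periodic message (e.g. a sensing result or a
        # status indication) arrives from the peer controller.
        self.last_seen = now

    def peer_failed(self, now):
        # The peer is considered faulty if no message has ever arrived,
        # or the last message is older than the timeout.
        return self.last_seen is None or (now - self.last_seen) > self.timeout
```

In practice the timeout would be chosen relative to the peer's sending period (e.g. a few missed periods) so that a single delayed message does not trigger a spurious failover.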
With reference to the first aspect, in some implementations of the first aspect, the method is applied to a vehicle, and before the first control instruction is sent to the actuator, the method further includes: determining that the vehicle is in an autonomous driving state. The method further includes: prompting the user to take over the vehicle.

In the embodiments of this application, when the first controller fails, the user can be prompted to take over the vehicle, so that the user takes over quickly after seeing the prompt, which helps ensure the user's driving safety.

With reference to the first aspect, in some implementations of the first aspect, the first controller fails at a first moment, and the method further includes: the first controller sends a third sensing result to the second controller, where the third sensing result includes sensing results obtained by the first controller, during a first time period, for the data collected by the sensors in the first sensor group, the first time period being before the first moment; and the second controller controls the vehicle to stop based on the third sensing result and the second sensing result.

In the embodiments of this application, when the first controller fails, the second controller can use the third sensing result, computed before the first controller failed, together with the second sensing result to bring the vehicle to a stop, which helps improve the vehicle's safety.
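The third sensing result described above can be kept as a small buffer of the first controller's recent, timestamped sensing results, so that after a fault at the first moment the second controller can still use results from the period before it. The class name and buffer capacity below are illustrative assumptions.

```python
from collections import deque

# Ring buffer of the first controller's recent sensing results, so that
# after the first controller fails at a first moment, the second controller
# can still use results from the time period before that moment.
# Capacity is an illustrative assumption.

class PerceptionHistory:
    def __init__(self, capacity=50):
        # Oldest entries are discarded automatically once full.
        self.buf = deque(maxlen=capacity)

    def record(self, timestamp, result):
        self.buf.append((timestamp, result))

    def before(self, fault_time):
        # Return results perceived strictly before the fault occurred.
        return [r for t, r in self.buf if t < fault_time]
```

The bounded capacity matters here: the buffer only ever needs to cover the short window needed to plan a safe stop, not the whole drive.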
With reference to the first aspect, in some implementations of the first aspect, at least some of the sensors in the first sensor group are different from the sensors in the second sensor group.

In the embodiments of this application, at least some of the sensors in the first sensor group differ from those in the second sensor group, so the first and second controllers can each perceive data collected by different sensors, which helps improve the utilization of the vehicle's computing resources.

In some possible implementations, "at least some of the sensors in the first sensor group are different from the sensors in the second sensor group" may be understood as the second sensor group not including at least some of the sensors in the first sensor group; or as the sensors in the two groups being entirely different, i.e., no sensor appears in both groups; or as the two groups being partly the same and partly different, i.e., the first sensor group does not include some of the sensors in the second sensor group, and the second sensor group does not include some of the sensors in the first sensor group.

With reference to the first aspect, in some implementations of the first aspect, the first sensor group and the second sensor group include positioning sensors and millimeter-wave radars.

In the embodiments of this application, both sensor groups may include positioning sensors and millimeter-wave radars. In this way, when the first controller fails while the positioning sensor and millimeter-wave radar in the first sensor group have not, the second controller can still perform perception using the data collected by the first sensor group's positioning sensor and millimeter-wave radar, which helps improve the vehicle's safety.

With reference to the first aspect, in some implementations of the first aspect, the second sensor group includes a side-view camera.

In the embodiments of this application, taking the case where the first and second controllers are located in a vehicle as an example, the second controller can use the data collected by the side-view camera in the second sensor group to ensure that the vehicle parks safely.
In a second aspect, a control apparatus is provided. The apparatus includes: a first control unit, configured to obtain a first sensing result based on data collected by sensors in a first sensor group; and a second control unit, configured to obtain a second sensing result based on data collected by sensors in a second sensor group. The second control unit is further configured to send the second sensing result to the first control unit; and the first control unit is configured to send a first control instruction to an actuator based on the first sensing result and the second sensing result.

With reference to the second aspect, in some implementations of the second aspect, the first control unit is further configured to send the first sensing result to the second control unit; and the second control unit is further configured to generate a second control instruction based on the first sensing result and the second sensing result.

With reference to the second aspect, in some implementations of the second aspect, the second control unit is further configured to send the second control instruction to the actuator.

With reference to the second aspect, in some implementations of the second aspect, the first control unit is further configured to stop sending the first control instruction to the actuator when the first control unit fails.

With reference to the second aspect, in some implementations of the second aspect, the first control unit is further configured to stop sending the first control instruction when the first control unit fails; and the second control unit is configured to send the second control instruction to the actuator when it is determined that the first control unit has failed and the second control unit has not failed.

With reference to the second aspect, in some implementations of the second aspect, the first control unit is further configured to determine, before sending the first control instruction to the actuator, that the vehicle is in an autonomous driving state; and the first control unit is further configured to, when the first control unit fails, control a prompting apparatus to prompt the user to take over the vehicle.

With reference to the second aspect, in some implementations of the second aspect, the first control unit fails at a first moment; the first control unit is further configured to send a third sensing result to the second control unit, where the third sensing result includes sensing results obtained by the first control unit, during a first time period, for the data collected by the sensors in the first sensor group, the first time period being before the first moment; and the second control unit is further configured to control the vehicle to stop based on the third sensing result and the second sensing result.

With reference to the second aspect, in some implementations of the second aspect, at least some of the sensors in the first sensor group are different from the sensors in the second sensor group.

With reference to the second aspect, in some implementations of the second aspect, the first sensor group and the second sensor group include positioning sensors and millimeter-wave radars.

With reference to the second aspect, in some implementations of the second aspect, the second sensor group includes a side-view camera.
In a third aspect, an apparatus is provided. The apparatus includes: a memory, configured to store computer instructions; and a processor, configured to execute the computer instructions stored in the memory, so that the apparatus performs the method in the first aspect.

In a fourth aspect, a transportation means is provided, which includes the apparatus of either the second aspect or the third aspect.

With reference to the fourth aspect, in some implementations of the fourth aspect, the transportation means is a vehicle.

In a fifth aspect, a computer program product is provided. The computer program product includes computer program code which, when run on a computer, causes the computer to perform the method in the first aspect.

The computer program code may be stored, in whole or in part, on a first storage medium, which may be packaged together with the processor or packaged separately from it; this is not specifically limited in the embodiments of this application.

In a sixth aspect, a computer-readable medium is provided. The computer-readable medium stores program code which, when run on a computer, causes the computer to perform the method in the first aspect.

In a seventh aspect, an embodiment of this application provides a chip system. The chip system includes a processor configured to invoke a computer program or computer instructions stored in a memory, so that the processor performs the method of the first aspect.

With reference to the seventh aspect, in a possible implementation, the processor is coupled to the memory through an interface.

With reference to the seventh aspect, in a possible implementation, the chip system further includes a memory, in which the computer program or computer instructions are stored.
Brief Description of the Drawings

Figure 1 is a schematic functional block diagram of a vehicle provided by an embodiment of this application.

Figure 2 is a schematic diagram of a system architecture provided by an embodiment of this application.

Figure 3 is another schematic diagram of a system architecture provided by an embodiment of this application.

Figure 4 is a schematic flowchart of a control method provided by an embodiment of this application.

Figure 5 is a schematic block diagram of a control apparatus provided by an embodiment of this application.
Detailed Description

The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings. In the descriptions of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may represent three cases: only A exists, both A and B exist, and only B exists.

Prefixes such as "first" and "second" in the embodiments of this application are used only to distinguish different objects of description and impose no limitation on the position, order, priority, quantity, or content of the described objects. The use of ordinal prefixes to distinguish described objects does not restrict those objects; for statements about the described objects, refer to the claims or the context in the embodiments, and no redundant limitation should be construed from the use of such prefixes. In addition, in the descriptions of these embodiments, unless otherwise stated, "multiple" means two or more.
Figure 1 is a schematic functional block diagram of a vehicle 100 provided by an embodiment of this application. The vehicle 100 may include a perception system 120, a display apparatus 130, and a computing platform 150, where the perception system 120 may include several kinds of sensors that sense information about the environment around the vehicle 100. For example, the perception system 120 may include a positioning system, which may be the global positioning system (GPS), the BeiDou system, or another positioning system, and an inertial measurement unit (IMU). As another example, the perception system 120 may further include one or more of a lidar, a millimeter-wave radar, an ultrasonic radar, and a camera apparatus.

Some or all functions of the vehicle 100 may be controlled by the computing platform 150. The computing platform 150 may include processors 151 to 15n (n being a positive integer). A processor is a circuit with signal processing capabilities. In one implementation, the processor may be a circuit with instruction reading and execution capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a kind of microprocessor), or a digital signal processor (DSP). In another implementation, the processor may implement a certain function through the logical relationships of a hardware circuit, where those logical relationships are fixed or reconfigurable; for example, the processor may be a hardware circuit implemented as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field programmable gate array (FPGA). In a reconfigurable hardware circuit, the process of the processor loading a configuration file to configure the hardware circuit can be understood as the processor loading instructions to implement the functions of some or all of the above units. In addition, the processor may be a hardware circuit designed for artificial intelligence, which can be understood as a kind of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU). The computing platform 150 may further include a memory for storing instructions; some or all of the processors 151 to 15n may invoke and execute the instructions in the memory to implement the corresponding functions.

The vehicle 100 may include an advanced driving assistant system (ADAS). The ADAS obtains information from around the vehicle using a variety of on-vehicle sensors (including but not limited to lidar, millimeter-wave radar, camera apparatuses, ultrasonic sensors, GPS, and IMU), analyzes and processes the obtained information, and implements functions such as obstacle perception, object recognition, vehicle positioning, path planning, and driver monitoring/alerting, thereby improving the safety, automation, and comfort of driving.

In terms of logical function, an ADAS generally includes three main functional modules: a perception module, a decision module, and an execution module. The perception module senses the environment around the vehicle body via sensors and feeds the corresponding real-time data to the decision-layer processing center; it mainly includes on-vehicle cameras, ultrasonic radars, millimeter-wave radars, lidars, and the like. The decision module makes decisions using computing devices and algorithms based on the information obtained by the perception module. The execution module takes corresponding actions after receiving a decision signal from the decision module, such as driving, lane changing, steering, braking, or warning.

At the different levels of autonomous driving (L0-L5), ADAS can provide different levels of driving assistance based on artificial intelligence algorithms and information obtained by multiple sensors. The levels L0-L5 follow the grading standard of the Society of Automotive Engineers (SAE): L0 is no automation; L1 is driver assistance; L2 is partial automation; L3 is conditional automation; L4 is high automation; and L5 is full automation. At L1-L3, the tasks of monitoring road conditions and reacting are shared by the driver and the system, and the driver must be ready to take over the dynamic driving task; L4 and L5 allow the driver to become entirely a passenger. The functions ADAS can currently implement mainly include, but are not limited to: adaptive cruise control, automatic emergency braking, automatic parking, blind-spot monitoring, front and rear cross-traffic alert/braking, forward collision warning, lane departure warning, lane keeping assist, rear collision warning, traffic sign recognition, traffic jam assist, and highway assist. It should be understood that each of these functions may have specific modes at the different levels (L0-L5); the higher the level, the more intelligent the corresponding mode. For example, automatic parking may include APA, RPA, and AVP. With APA, the driver does not need to operate the steering wheel but still needs to operate the accelerator and brake in the vehicle; with RPA, the driver can remotely park the vehicle from outside it using a terminal (e.g., a mobile phone); with AVP, the vehicle can complete parking without a driver. In terms of the corresponding levels, APA is roughly at L1, RPA roughly at L2-L3, and AVP roughly at L4.
As mentioned above, from the perspective of international autonomous driving standards and functional safety requirements, the entity responsible in L3-and-above scenarios is the autonomous driving system. The design requirement for an autonomous driving system is "fail operational": after a fault occurs, the system continues to run the autonomous driving function and takes appropriate measures so that the vehicle exits autonomous driving safely.

For this application scenario, mainstream vendors currently use two autonomous driving controllers in "1:1" backup redundancy: two identical autonomous driving domain controllers are connected in parallel. One acts as the primary controller, running the complete autonomous driving service and outputting vehicle control instructions to control the vehicle. The other acts as the backup controller; when the primary controller fails, the backup takes over service processing and controls the vehicle's behavior. This requires both the primary and backup controllers to have high computing performance to meet the system's requirements. When the primary controller has not failed, the backup controller runs idle, wasting cost and computing resources.

Embodiments of this application provide a control method, apparatus, and vehicle in which two controllers work together in a load-sharing manner: each controller handles different services and sends its sensing results to the peer controller over the inter-controller communication bus, so that each controller obtains the sensing results of all sensors. This helps improve the utilization of computing resources and reduce controller costs.
Figure 2 shows a schematic diagram of a system architecture provided by an embodiment of this application. As shown in Figure 2, the architecture includes sensor group A, sensor group B, controller A, controller B, and body actuators 1-n. Sensors in sensor group A can be connected to controller A, and sensors in sensor group B to controller B. Controller A can send its generated vehicle control instructions to vehicle control bus A, and controller B can send its generated vehicle control instructions to vehicle control bus B. The architecture shown in Figure 2 can be used in autonomous driving scenarios requiring high functional safety, high reliability, and high performance; it is a new interconnected, interactive, integrated hardware-software architecture for on-vehicle autonomous driving controllers.

Sensor groups A and B include, but are not limited to, some number of camera apparatuses, lidars, millimeter-wave radars, ultrasonic radars, GPS units, IMUs, and so on. Some sensors in group A are also allowed to be connected to controller B, and likewise some sensors in group B may be connected to controller A.

In one embodiment, sensors that output data via a controller area network (CAN) bus or a CAN with flexible data-rate (CANFD) bus can be connected to both controller A and controller B.

Controllers A and B are capable of performing perception computation on externally input sensor data to recognize information about the vehicle's surroundings, and of controlling the vehicle's behavior through a series of computations. In the embodiments of this application, controllers A and B may be interconnected through a communication bus, for example, an Ethernet bus or a CAN bus.

It should be understood that controllers A and B shown in Figure 2 may be located in the ADAS described above.

It should also be understood that the architecture in Figure 2 is described using two sensor groups and two controllers as an example; the embodiments of this application do not specifically limit the numbers of sensor groups and controllers. For example, the architecture may also include three (or more) sensor groups and three (or more) controllers.

It should also be understood that the number of sensor groups and the number of controllers may be equal or unequal. For example, the architecture may include sensor groups A, B, and C and controllers A and B, where groups A and C connect to controller A and group B connects to controller B. As another example, the architecture may include sensor groups A and B and controllers A, B, and C, where group A connects to controller A and group B connects to controllers B and C respectively.
In the embodiments of this application, controllers A and B can jointly process the important parts of the autonomous driving service, such as the processing of sensing results, in a load-balanced manner. Controller A computes over the data collected by the sensors in its own sensor group A to produce a first sensing result, which may include information about the vehicle's surroundings (including but not limited to lane line information, obstacle information, traffic signs, and position information). Likewise, controller B computes over the data collected by the sensors in its own sensor group B to produce a second sensing result, which may also include information about the vehicle's surroundings. Controllers A and B can exchange the structured data produced by perception computation, so that each controller obtains the sensing results held by the peer controller; perception computation is thus performed separately on the two controllers while the computed results are shared between them.

In this way, controller A can use controller B's computing power and controller B can use controller A's, which helps improve the utilization of computing resources. At the same time, each controller only needs to process the data collected by the sensors in its own sensor group, so neither controller needs especially high computing performance, which helps reduce controller costs.

Body actuators 1-n may include the actuators on the vehicle that control the vehicle's lateral and longitudinal behavior. For example, body actuators 1-n may include an integrated power unit (IPU), an electronically controlled steering system (EPS), and an electrical brake system (EBS). These body actuators receive the vehicle control instructions output by controller A and/or controller B, implementing controller A's or controller B's control of the vehicle. As shown in Figure 2, the actuators can be connected to the controllers through two different vehicle control buses: controller A connects to body actuators 1-n through vehicle control bus A, and controller B connects to body actuators 1-n through vehicle control bus B. Controller A can send vehicle control instructions onto bus A through a CAN or CANFD bus, and controller B can send vehicle control instructions onto bus B through a CAN or CANFD bus.

Controllers A and B can control the vehicle's behavior through the vehicle control buses.

For example, if the body actuators support being controlled by two channels of vehicle control instructions, then when both controllers A and B are healthy, they send instructions through buses A and B respectively. If the instructions on bus A have higher priority than those on bus B, the body actuators take their instructions from bus A, and controller A leads the control of the vehicle.

When controller A fails, it stops sending instructions to bus A, and the body actuators switch to receiving the instructions on bus B; leadership of the vehicle's control thus switches from controller A to controller B. Meanwhile, controller A (or controller B) can also control a prompting apparatus to prompt the user to take over the vehicle and/or control the vehicle to pull over. If controller B fails, controller B stops sending instructions through bus B and the vehicle remains under controller A's control; likewise, controller A (or controller B) can control the prompting apparatus to prompt the user to take over the vehicle and/or control the vehicle to pull over.

In the embodiments of this application, when controller A fails it can stop sending vehicle control instructions to the body actuators, so that when the actuators receive instructions from controller B but none from controller A, they can directly execute controller B's instructions. This avoids any communication negotiation during the switch from controller A to controller B, which helps speed up the handover between the two controllers; and because controller B can quickly take over control of the vehicle, the vehicle's safety is improved.
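The negotiation-free switch described above (controller A stops issuing; controller B's issuing module is flipped from prohibited to allowed) can be sketched as follows; class and function names are illustrative assumptions, not module names from the patent.

```python
# Sketch of the bus-gating failover described above: each controller's
# instruction-issuing module is either allowed or prohibited from writing
# to its vehicle control bus, and switching controllers only flips these
# gates -- no negotiation protocol is needed. Names are illustrative.

class IssuingModule:
    def __init__(self, allowed):
        self.allowed = allowed
        self.bus = []  # stand-in for the vehicle control bus

    def issue(self, command):
        # A prohibited module silently drops its commands.
        if self.allowed:
            self.bus.append(command)

def on_fault_of_a(module_a, module_b):
    # Controller A stops issuing; controller B is switched from
    # prohibited to allowed, taking over control of the vehicle.
    module_a.allowed = False
    module_b.allowed = True
```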
As another example, if the body actuators support only one channel of vehicle control instructions, controllers A and B negotiate so that only one channel issues instructions. When both controllers are healthy, controller A can send vehicle control instructions to the body actuators through bus A; when controller A fails and controller B is healthy, controller A stops sending instructions to the body actuators through bus A and controller B can send instructions to the body actuators through bus B.
Figure 3 shows another schematic diagram of the system architecture provided by an embodiment of this application. As shown in Figure 3, controller A may include visual perception module A, lidar perception module A, millimeter-wave perception module A, positioning module A, local perception fusion module A, sensing result extraction module A, global perception fusion module A, planning control module A, vehicle control instruction issuing module A, hardware monitoring module A, software monitoring module A, fault management module A, master-slave management module A, and time synchronization module A. Controller B may include visual perception module B, lidar perception module B, millimeter-wave perception module B, positioning module B, local perception fusion module B, sensing result extraction module B, global perception fusion module B, planning control module B, vehicle control instruction issuing module B, hardware monitoring module B, software monitoring module B, fault management module B, master-slave management module B, and time synchronization module B.

Controllers A and B synchronize time through time synchronization modules A and B, so that the time on controller A stays synchronized with the time on controller B. Either of controllers A and B may act as the primary controller, with the other acting as the backup controller.

Visual perception module A, lidar perception module A, millimeter-wave perception module A, and positioning module A on controller A can respectively process the data collected by the camera apparatuses, the lidar, the millimeter-wave radar, and the GPS/IMU in sensor group A, obtaining the corresponding sensing results. At the same time, visual perception module B, lidar perception module B, millimeter-wave perception module B, and positioning module B on controller B can respectively process the data collected by the camera apparatuses, the lidar, the millimeter-wave radar, and the GPS/IMU in sensor group B, obtaining the corresponding sensing results. When these data enter a controller, they are all timestamped using that controller's time information, so that the data can be ordered in time.
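The timestamping step described above, where each controller stamps incoming sensor data with its synchronized local time so that samples from both controllers can be placed in order, might look like the following sketch; the function names and data shapes are assumptions.

```python
# Sketch of the timestamping described above. Because the two controllers'
# clocks are kept synchronized by the time synchronization modules,
# timestamps from either side are directly comparable.

def stamp(samples, timestamps):
    # Pair each raw sensor sample with the controller's time at arrival.
    return list(zip(timestamps, samples))

def in_order(stamped_a, stamped_b):
    # Merge both controllers' stamped streams into one time-ordered stream.
    return sorted(stamped_a + stamped_b, key=lambda ts: ts[0])
```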
It should be understood that the above description takes as an example the case where both controllers A and B have a visual perception module, a lidar perception module, a millimeter-wave perception module, and a positioning module. The embodiments of this application are not limited to this. For example, when controller A is responsible for the vehicle's autonomous driving service, sensor group A may include a front-view long-range camera, a front-view short-range camera, surround-view cameras (e.g., a front-view camera, a rear-view camera, a left-view camera, and a right-view camera), a forward lidar, a backward lidar, GPS, and an IMU; in this case controller A may include visual perception module A, lidar perception module A, and positioning module A. When controller B is responsible for the vehicle's safe-parking function, sensor group B may include side-view cameras (e.g., a left-front camera, a right-front camera, a left-rear camera, and a right-rear camera), GPS, and an IMU; in this case controller B may include visual perception module B and positioning module B.

As another example, when controller A is responsible for the vehicle's autonomous driving service, sensor group A may include a front-view long-range camera, a front-view short-range camera, surround-view cameras (e.g., a front-view camera, a rear-view camera, a left-view camera, and a right-view camera), GPS, and an IMU; in this case controller A may include visual perception module A and positioning module A. When controller B is responsible for the vehicle's safe-parking function, sensor group B may include side-view cameras (e.g., a left-front camera, a right-front camera, a left-rear camera, and a right-rear camera), a forward lidar, and a backward lidar; in this case controller B may include visual perception module B and lidar perception module B.

Both controllers A and B may include a visual perception module, a lidar perception module, a millimeter-wave perception module, and a positioning module; then, whatever types of sensors a sensor group contains, either controller can process the data they collect. Alternatively, controllers A and B may be equipped with perception modules according to the types of sensors in the sensor groups they are connected to. For example, if sensor group A contains no lidar, controller A may omit lidar perception module A; and if sensor group B contains no positioning sensor, controller B may omit the positioning module.

Millimeter-wave radars and positioning sensors (e.g., GPS, IMU) that output data via a CAN or CANFD bus can be connected to both controller A and controller B simultaneously. For example, the millimeter-wave radar and positioning sensor in sensor group A can be connected to controller A and controller B respectively. When controller A fails while the millimeter-wave radar and positioning sensor in sensor group A remain healthy, controller B can still use the data they collect so that the vehicle stops in its own lane or pulls over.
控制器A上的本地感知融合模块A接收视觉感知模块A、激光雷达感知模块A、毫米波感知模块A和位置定位模块A的感知结果并对感知结果进行融合,得到控制器A上传感器同一时空坐标系下的车辆周边环境信息的模型。控制器B上的本地感知融合模块B接收视觉感知模块B、激光雷达感知模块B、毫米波感知模块B和位置定位模块B的感知结果并对感知结果进行融合,得到控制器B上传感器同一时空坐标系下的车辆周边环境信息的模型。
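示例性的,将不同传感器的感知结果换算到同一时空坐标系的过程可以用如下Python示意代码表示。其中传感器安装位置等数值均为假设,仅用于说明坐标统一的思路:

```python
# 示意:本地感知融合模块将各传感器的目标位置换算到统一的车辆坐标系(简化模型)
SENSOR_OFFSET = {"front_lidar": (3.8, 0.0), "front_camera": (1.2, 0.0)}  # 假设的安装位置(米)

def to_vehicle_frame(sensor, pos):
    """pos:传感器坐标系下的目标位置(x, y);返回车辆坐标系下的位置。"""
    ox, oy = SENSOR_OFFSET[sensor]
    return (pos[0] + ox, pos[1] + oy)

def fuse(results):
    """results:[(传感器名, (x, y), 时间戳), ...];输出统一时空坐标系下按时间排序的目标。"""
    return sorted((ts, to_vehicle_frame(s, p)) for s, p, ts in results)

fused = fuse([("front_camera", (10.0, 0.5), 2.0),
              ("front_lidar", (8.0, 0.5), 1.0)])
```

实际融合还需考虑旋转标定、目标关联与置信度加权等,此处仅示意"统一坐标系+统一时间戳"这一前提。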
以上车辆周边环境信息的模型包括但不限于:车道线信息,交通标识信息(如红绿灯信息、限速标志信息等)、道路上障碍物信息等。
控制器A上的感知结果抽取模块A对本地感知融合模块A中的数据进行选择和抽取,并将选择和抽取后的数据发送到全局感知融合模块B。全局感知融合模块B可以对本地感知融合模块B获得的融合结果以及感知结果抽取模块A发送的数据进行进一步融合。同时,控制器B上的感知结果抽取模块B对本地感知融合模块B中的数据进行选择和抽取,并将选择和抽取后的数据发送到全局感知融合模块A。全局感知融合模块A可以对本地感知融合模块A获得的融合结果以及感知结果抽取模块B发送的数据进行进一步融合。
一个实施例中,感知结果抽取模块A对本地感知融合模块A中的数据进行选择和抽取的方式包括但不限于以下几种:
(1)若控制器A和控制器B的性能均较强、具备较大的内存空间或者两个控制器之间的高速总线带宽足够,那么可以将控制器A和控制器B上本地感知融合模块融合后数据全部实时同步到对端控制器,这样两个控制器上均能最大化的获得所有信息。
(2)若控制器B的性能较弱或者内存空间有限,那么感知结果抽取模块A可以对本地感知融合模块A融合后的数据进行筛选,按照信息的关键程度进行排序。排序的方式可以采用关键方向、距离远近等等。感知结果抽取模块A首先去掉非关键方向的非关键信息(如左右方向的障碍物信息、后向的远距离物体信息);如果这时控制器B的性能依然不足,则感知结果抽取模块A可以进一步裁减关键方向的远距离信息(如前向200米处的障碍物信息),而保留其他重要信息传递到全局感知融合模块B。应理解,以上筛选的过程可以是在测试控制器性能阶段完成的。
(3)以业务为重点,若控制器B主要承担安全停车的功能,则感知结果抽取模块A优先将停车所需的前向、右前、右后、后向的障碍物信息发送给全局感知融合模块B。对于其他方向的障碍物信息可以少发送或者不发送。应理解,对于左侧行驶的车辆,感知结果抽取模块A也可以优先将停车所需的前向、左前、左后、后向的障碍物信息发送给全局感知融合模块B。
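示例性的,上述按业务裁减感知结果的方式(3)可以用如下Python示意代码表示。其中方向名称、距离阈值均为假设值,仅用于说明筛选思路:

```python
# 示意:感知结果抽取模块按停车业务所需方向与距离筛选障碍物信息(简化策略)
STOP_DIRECTIONS = {"front", "front_right", "rear_right", "rear"}  # 靠边停车所需方向

def extract(obstacles, max_range=200.0, directions=STOP_DIRECTIONS):
    """obstacles:[{'dir': 方向, 'dist': 距离(米)}, ...];
    仅保留所需方向且在关注距离内的障碍物信息。"""
    return [o for o in obstacles if o["dir"] in directions and o["dist"] <= max_range]

local_fusion = [{"dir": "front", "dist": 50.0},
                {"dir": "left", "dist": 10.0},    # 非关键方向,裁减
                {"dir": "rear", "dist": 300.0}]   # 超出关注距离,裁减
```

对于左侧行驶的车辆,仅需将STOP_DIRECTIONS中的右侧方向替换为左侧方向即可。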
感知结果抽取模块A发送给全局感知融合模块B的数据以及感知结果抽取模块B发送给全局感知融合模块A的数据可以通过高速以太总线进行传输。控制器A和控制器B上的数据选择的范围、类型以及数据量的大小可以相同也可以不同,可以取决于对端控制器上部署业务所需要的信息。
控制器A上的规划控制模块A根据本控制器上的自动驾驶业务的功能部署策略,对车辆轨迹进行规划、计算并生成相应的车控指令A。规划控制模块A可以将生成的车控指令A发送给车控指令下发模块A。同时,控制器B上的规划控制模块B根据本控制器上的自动驾驶业务的功能部署策略,对车辆轨迹进行规划、计算并生成相应的车控指令B。规划控制模块B可以将生成的车控指令B发送给车控指令下发模块B。
一个实施例中,控制器A和控制器B上的功能部署可以有不同的策略。
例如,控制器A和控制器B上部署不同的功能,如控制器A部署高速巡航功能,控制器B上部署靠边停车功能,两个控制器上的轨迹规划和运动控制的策略可以不相同。
又例如,控制器A和控制器B上部署相同的功能,如控制器A和B上都部署高速巡航功能,两个控制器上可以采用相同的轨迹规划和运动控制的策略。
又例如,控制器A和控制器B上可以部署一些相同的功能,以及分别部署一些不同的功能,如控制器A部署高速巡航功能和靠边停车功能,控制器B部署靠边停车功能等。
车控指令下发模块A和车控指令下发模块B在接收到车控指令后,向车控总线输出车控指令时可以包括以下两种方式。
方式一:车身执行器可以接收车控指令下发模块A发送的车控指令A以及接收车控指令下发模块B发送的车控指令B。
一个实施例中,车控指令A中包括第一标识信息,车控指令B中包括第二标识信息,该第一标识信息和该第二标识信息不同。车身执行器中可以保存有标识信息与车控指令对应的优先级的对应关系。
例如,以车控指令中包括的标识信息为CAN ID为例,执行器中可以保存表1中CAN ID与优先级的对应关系。
表1
CAN ID 车控指令对应的优先级
1 高
2 低
这样,当车身执行器接收到车控指令A和车控指令B后,可以解析获得车控指令A的CAN ID以及车控指令B的CAN ID。若车控指令A中的CAN ID为1且车控指令B中的CAN ID为2,那么车身执行器可以根据上述表1所示的对应关系,执行优先级较高的车控指令A而不执行车控指令B。
一个实施例中,车控指令A中包括第一优先级信息,车控指令B中包括第二优先级信息,该第一优先级高于该第二优先级。这样,当车身执行器接收到车控指令A和车控指令B后,可以解析获得车控指令A的第一优先级以及车控指令B的第二优先级。车身执行器可以执行优先级较高的车控指令A而不执行车控指令B。
一个实施例中,车控指令A中可以包括控制器A的标识信息,车控指令B中可以包括控制器B的标识信息。车身执行器中可以保存有控制器的标识信息与车控指令对应的优先级的对应关系。例如,表2示出了一种控制器的标识信息与车控指令对应的优先级的对应关系。
表2
控制器的标识信息 车控指令对应的优先级
A 高
B 低
这样,当车身执行器接收到车控指令A和车控指令B后,可以解析获得车控指令A中控制器的标识信息以及车控指令B中控制器的标识信息。车身执行器可以根据上述表2所示的对应关系,执行优先级较高的车控指令A而不执行车控指令B。
应理解,以上表格所示的对应关系仅仅是示意性的,本申请实施例并不限于此。
还应理解,以上车身执行器通过车控指令A和车控指令B中携带的信息来确定车控指令A和车控指令B的优先级的过程也仅仅是示意性的,本申请实施例中并不限于此。还可以通过车控指令中携带的其他信息来确定车控指令的优先级,例如,车控指令A中的第一字段中携带了某个信息而车控指令B中的第一字段中未携带该信息,则车身执行器可以确定车控指令A的优先级高于车控指令B的优先级。又例如,还可以通过车控总线来确定车控指令的优先级,如车控总线A上的车控指令的优先级高于车控总线B上的车控指令的优先级。
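示例性的,执行器按表1的对应关系确定并执行高优先级车控指令的过程,可以用如下Python示意代码表示。表中"高"、"低"的取值以及排序方式均为假设:

```python
# 示意:车身执行器按保存的CAN ID与优先级的对应关系择优执行车控指令(简化模型)
PRIORITY_BY_CAN_ID = {1: "高", 2: "低"}  # 对应表1,假设CAN ID 1为控制器A
RANK = {"高": 0, "低": 1}                 # 数值越小优先级越高

def arbitrate(commands):
    """commands:[(can_id, 指令内容), ...];返回优先级最高的一条车控指令的内容。"""
    can_id, action = min(commands, key=lambda c: RANK[PRIORITY_BY_CAN_ID[c[0]]])
    return action
```

当两路指令同时到达时执行控制器A的指令;当控制器A停止下发后,列表中只剩控制器B的指令,自然由其生效。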
当控制器A和控制器B均正常(或者,均处于健康工作状态)时,控制器A和控制器B均下发车控指令,此时由于控制器A发送的车控指令拥有高优先级,则控制器A实际控制车辆的运行。
当控制器A发生故障时,或者,当控制器A无法利用现有传感器资源和计算能力控制车辆时,则主从管理模块A禁止车控指令下发模块A将车控指令下发到车控总线A上。当控制器B未发生故障可以维持当前的自动驾驶业务时,主从管理模块B允许车控指令下发模块B将车控指令B下发到车控总线B上。此时,车辆的控制权限可以快速由控制器A切换至控制器B。
一个实施例中,控制器A(或者控制器B)还可以控制提示装置提示驾驶员接管车辆的控制权。
示例性的,该提示装置包括显示屏、氛围灯、语音模块中的一种或者多种。例如,可以控制显示屏显示提示信息“请接管车辆”。又例如,可以控制氛围灯的颜色变红来提示驾驶员接管车辆。又例如,可以控制语音模块发出语音信息“请接管车辆”来提示驾驶员接管车辆。
一个实施例中,控制器A在T1时刻发生故障,感知结果抽取模块A可以向全局感知融合模块B发送T0时刻至T1时刻这一时间段内控制器A对传感器组A采集的数据的感知融合结果。全局感知融合模块B可以结合感知结果抽取模块A发送的感知结果和控制器B对传感器组B中的传感器采集的数据的感知融合结果进行进一步融合,从而提升控制器B控制车辆在本车道停车或者靠边停车过程中的安全性。
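示例性的,控制器A保留T0至T1时间段内感知融合结果、并在故障时交给控制器B的过程,可以用如下Python示意代码表示。其中时间窗长度等均为假设值:

```python
from collections import deque

# 示意:控制器A缓存最近一个时间窗内的感知融合结果,故障时整体交给控制器B(简化模型)
class PerceptionBuffer:
    def __init__(self, window=2.0):
        self.window = window  # T0至T1时间窗的长度(秒),假设值
        self.buf = deque()

    def push(self, ts, fused):
        self.buf.append((ts, fused))
        while self.buf and ts - self.buf[0][0] > self.window:
            self.buf.popleft()  # 丢弃时间窗之外的旧结果

    def handover(self, t1):
        """控制器A在T1时刻发生故障:取出T0=T1-window至T1之间的感知融合结果。"""
        return [r for ts, r in self.buf if t1 - self.window <= ts <= t1]

pb = PerceptionBuffer(window=2.0)
for t in (0.0, 1.0, 2.5, 3.0):
    pb.push(t, f"fusion@{t}")
```

控制器B将这段历史结果与自身对传感器组B的感知融合结果叠加,可在接管初期弥补自身传感器视野的不足。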
当控制器B也发生故障,或者,当控制器B无法利用现有传感器资源和计算能力控制车辆时,主从管理模块B禁止车控指令下发模块B将车控指令B下发到车控总线B上。此时车控总线A和车控总线B均无车控指令下发,车身执行器紧急制动将车辆减速停车。
本申请实施例中,两个控制器可以均向车身执行器发送车控指令。当某个控制器发生故障时,通过停止该控制器下发车控指令的方式进行快速的车控指令的切换。本申请实施例提供的车控指令切换方案,无需要进行系统间的主从协商,方便支持异构厂家的控制器构成主备系统。
方式二:车身执行器只能接收一路车控指令,或者,车身执行器只接收车控指令下发模块A发送的车控指令或者只接收车控指令下发模块B发送的车控指令。
例如,可以将控制器A设置为主用控制器,则控制器A优先发送车控指令;控制器B设置为备用控制器,控制器B不发送车控指令。
当控制器A和控制器B均正常时,主从管理模块A和主从管理模块B对两个控制器进行选择,此时允许车控指令下发模块A发送车控指令,车控指令下发模块B禁止发送车控指令。
当控制器A发生故障,或者,当控制器A无法利用现有传感器资源和计算能力控制车辆时,则主从管理模块A禁止车控指令下发模块A将车控指令下发到车控总线A上。同时主从管理模块B判断控制器B是否正常,若控制器B正常,则此时允许车控指令下发模块B下发车控指令。
一个实施例中,主从管理模块A可以周期性的向主从管理模块B发送指示信息,该指示信息用于指示控制器A是否正常。同样,主从管理模块B可以周期性的向主从管理模块A发送指示信息,该指示信息用于指示控制器B是否正常。主从管理模块B在确定控制器A发生故障且控制器B正常时,可以允许车控指令下发模块B下发车控指令。
一个实施例中,主从管理模块A中可以保存有定时器,若在定时器运行期间内接收到主从管理模块B发送的信息,则主从管理模块A可以认为控制器B是正常的;若在定时器运行期间内未接收到主从管理模块B发送的信息,则主从管理模块A可以认为控制器B发生故障。同样,主从管理模块B中可以保存有定时器,若在定时器运行期间内接收到主从管理模块A发送的信息,则主从管理模块B可以认为控制器A是正常的;若在定时器运行期间内未接收到主从管理模块A发送的信息,则主从管理模块B可以认为控制器A发生故障。
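示例性的,上述基于定时器的对端状态判断可以用如下Python示意代码表示。其中超时时长为假设值:

```python
# 示意:主从管理模块通过定时器判断对端控制器是否正常(简化模型)
class PeerMonitor:
    def __init__(self, timeout=0.5):
        self.timeout = timeout  # 定时器时长(秒):超时未收到对端消息即视为故障,假设值
        self.last_rx = None

    def on_message(self, ts):
        self.last_rx = ts  # 收到对端主从管理模块周期性发送的指示信息

    def peer_ok(self, now):
        return self.last_rx is not None and now - self.last_rx <= self.timeout

monitor = PeerMonitor(timeout=0.5)
monitor.on_message(ts=10.0)
```

主从管理模块A和主从管理模块B各自维护一个这样的监视器,即可在不依赖对端主动上报故障的情况下发现对端失效。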
当控制器A和控制器B均发生故障造成都无法正常下发车控指令时,车控总线A和车控总线B均无车控指令下发,此时车身执行器紧急制动将车辆减速停车。
控制器A上硬件监控模块A可以实时监控控制器A上硬件系统的故障状态,如果有故障状态,则将故障信息上报到故障管理模块A。同样,控制器B上硬件监控模块B可以实时监控控制器B上硬件系统的故障状态,如果有故障状态,则将故障信息上报到故障管理模块B。
控制器A上软件监控模块A实时监控本控制器上软件的健康状态,如果有故障发生,则将故障信息上报到故障管理模块A。同样,控制器B上软件监控模块B实时监控本控制器上软件的健康状态,如果有故障发生,则将故障信息上报到故障管理模块B。
故障管理模块A对控制器A上的软件故障和硬件故障进行汇总和分级,从而判断是否有影响自动驾驶业务的故障发生,并给出故障的影响严重程度。同样,故障管理模块B对控制器B上的软件故障和硬件故障进行汇总和分级,判断是否有影响自动驾驶业务的故障发生以及故障的影响严重程度。
一个实施例中,故障管理模块A和故障管理模块B在各自控制器上分别以汽车安全完整性等级D级(automotive safety integrity level D,ASIL-D)运行。
主从管理模块A从故障管理模块A上获取控制器A的故障信息,主从管理模块B从故障管理模块B获取控制器B的故障信息。主从管理模块A和主从管理模块B可以在各自控制器上分别以ASIL-D的功能安全级别运行。主从管理模块A和主从管理模块B之间通过两个控制器之间的两种异构的总线进行通信,如CAN总线和以太总线,以通知对端自身的健康状态以及自身是否正在发送车控指令。
本申请实施例中,两个控制器可以分别接入不同的传感器组,两个控制器分别对这些传感器组中的传感器采集的数据进行感知计算。每个控制器将计算后的结构化数据发送到对端控制器,每个控制器上均可以获得所有传感器的感知结果,使两个控制器的感知计算的能力均可以得到有效利用,从而有助于提升计算资源的利用率。
图4示出了本申请实施例提供的一种控制方法400的示意性流程图。该方法400可以应用于包括第一控制器和第二控制器的控制系统中。例如,该控制系统可以位于交通工具中;或者,该控制系统可以位于上述图1所示的计算平台中;或者,该控制系统可以位于上述ADAS系统中。该方法400包括:
S410,第一控制器根据第一传感器组中的传感器采集的数据,获取第一感知结果。
示例性的,该第一控制器可以为上述控制器A,该第一传感器组可以为上述传感器组A。
可选地,该第一控制器可以为主控制器。
S420,第二控制器根据第二传感器组中的传感器采集的数据,获取第二感知结果。
示例性的,该第二控制器可以为上述控制器B,该第二传感器组可以为上述传感器组B。
可选地,该第二控制器为备用控制器。
可选地,该第一传感器组和该第二传感器组中可以包括相同的传感器。
可选地,该第一传感器组中的至少部分传感器和该第二传感器组中的传感器不同。
应理解,该第一传感器组中的至少部分传感器和该第二传感器组中的传感器不同可以理解为第一传感器组中的传感器和第二传感器组中的传感器不同,即第一传感器组和第二传感器组中不存在相同的传感器。
示例性的,以该控制系统位于车辆中为例,该第一控制器可以负责自动驾驶业务且该第二控制器可以负责安全停车功能。那么,第一传感器组中可以包括前视长距摄像头、前视短距摄像头、环视摄像头(例如前视摄像头、后视摄像头、左视摄像头、右视摄像头)、前向激光雷达、后向激光雷达、GPS、IMU;第二传感器组中可以包括侧视摄像头(例如,左前视摄像头、右前视摄像头、左后视摄像头、右后视摄像头)。此时,第一传感器组中的传感器和第二传感器组中的传感器可以均不相同。
或者,该第一传感器组中的至少部分传感器和该第二传感器组中的传感器不同还可以理解为第一传感器组中的传感器和第二传感器组中的传感器部分相同且另一部分不同,即第一传感器组中不包括第二传感器组中的部分传感器,第二传感器组中不包括第一传感器组中的部分传感器。
示例性的,以该控制系统位于车辆中为例,该第一控制器可以负责自动驾驶业务且该第二控制器可以负责安全停车功能。那么,第一传感器组中可以包括前视长距摄像头、前视短距摄像头、环视摄像头(例如前视摄像头、后视摄像头、左视摄像头、右视摄像头)、GPS、IMU;第二传感器组中可以包括侧视摄像头(例如,左前视摄像头、右前视摄像头、左后视摄像头、右后视摄像头)、前向激光雷达、后向激光雷达、GPS、IMU。此时,第一传感器组和第二传感器组中均具有GPS和IMU,第一传感器组中不包括第二传感器组中的侧视摄像头、前向激光雷达和后向激光雷达。第二传感器组中不包括第一传感器组中的前视长距摄像头、前视短距摄像头、环视摄像头。
S430,该第二控制器向该第一控制器发送该第二感知结果。
相应的,该第一控制器接收该第二控制器发送的该第二感知结果。
可选地,该第二控制器向该第一控制器发送该第二感知结果,包括:第二控制器通过CAN总线、CANFD总线或者以太总线,向该第一控制器发送该第二感知结果。
可选地,该第二感知结果包括该第二控制器对第二传感器组中的传感器采集的数据的感知结果中的部分。例如,第一传感器组和第二传感器组中均具有GPS和IMU时,第二控制器可以不向该第一控制器发送对于交通工具的位置的感知结果。
S440,该第一控制器根据该第一感知结果和该第二感知结果,向执行器发送第一控制指令。
示例性的,如图3所示,全局感知融合模块A可以对本地感知融合模块A获得的融合结果以及感知结果抽取模块B发送的数据进行进一步融合。规划控制模块A可以根据融合后的结果生成车控指令。车控指令下发模块A可以将该车控指令发送至车身执行器。
可选地,该方法400还包括:该第一控制器向该第二控制器发送该第一感知结果。
相应的,该第二控制器接收该第一控制器发送的该第一感知结果。该第二控制器根据该第一感知结果和该第二感知结果,生成第二控制指令。
可选地,该第一感知结果包括该第一控制器对第一传感器组中的传感器采集的数据的感知结果中的部分。
例如,第一传感器组和第二传感器组中均具有GPS和IMU时,第一控制器可以不向该第二控制器发送对于交通工具的位置的感知结果。
又例如,第一控制器通过环视摄像头采集的数据可以感知交通工具左右方向的障碍物的信息、后向100米物体的信息以及前向200米处的障碍物信息,第一控制器在向第二控制器发送的第一感知结果中可以只携带前向200米处的障碍物信息,而不携带左右方向的障碍物的信息以及后向100米物体的信息。
可选地,该方法400还包括:该第二控制器向该执行器发送该第二控制指令。
可选地,以第一控制器和第二控制器位于车辆中为例,该第二控制器可以通过CAN总线或者CANFD总线向车身执行器发送该第二控制指令。
可选地,该第一控制指令中包括第一标识信息,该第二控制指令中包括第二标识信息,其中,该第一标识信息和该第二标识信息不同。执行器在接收到第一控制指令和第二控制指令后,可以根据第一标识信息和第二标识信息,执行相应的控制操作。
示例性的,该第一标识信息可以为第一CAN ID,第二标识信息可以为第二CAN ID。执行器中可以保存有标识信息(例如,CAN ID)与控制指令的优先级的对应关系,如第一CAN ID对应的控制指令的优先级大于第二CAN ID对应的控制指令的优先级。这样,当执行器接收到该第一控制指令和第二控制指令时,可以执行优先级较高的第一控制指令,而不执行第二控制指令。
可选地,该第一控制指令中包括第一优先级信息,该第二控制指令中包括第二优先级信息。这样,执行器中无需保存标识信息与控制指令的优先级的对应关系。执行器可以直接执行优先级较高的控制指令。例如,若第一优先级高于第二优先级,那么当执行器接收到该第一控制指令和第二控制指令时,可以执行优先级较高的第一控制指令,而不执行第二控制指令。
可选地,该第一控制指令中可以包括第一控制器的标识信息,该第二控制指令可以包括第二控制器的标识信息。执行器中可以保存有控制器与控制器发出的车控指令的优先级的对应关系,如第一控制器的优先级高于第二控制器的优先级。这样,当执行器接收到该第一控制指令和第二控制指令时,可以执行优先级较高的第一控制指令,而不执行优先级较低的第二控制指令。
可选地,该方法400还包括:在该第一控制器发生故障时,该第一控制器停止向该执行器发送该第一控制指令。
示例性的,如图3所示,若车身执行器支持两路车控指令控制车辆,故障管理模块A通过硬件监控模块A和/或软件监控模块A的监控结果判断控制器A发生了故障,那么故障管理模块A可以向主从管理模块A通知控制器A发生了故障。从而主从管理模块A可以控制车控指令下发模块A停止向车控总线A下发车控指令。
可选地,该方法400还包括:在该第一控制器发生故障时,该第一控制器停止发送该第一控制指令;该第二控制器在确定该第一控制器发生故障且该第二控制器未发生故障时,向该执行器发送该第二控制指令。
示例性的,如图3所示,若车身执行器只支持一路车控指令控制车辆,故障管理模块A通过硬件监控模块A和/或软件监控模块A的监控结果判断控制器A发生了故障,那么故障管理模块A可以向主从管理模块A通知控制器A发生了故障。从而主从管理模块A可以控制车控指令下发模块A停止向车控总线A下发车控指令。同时,主从管理模块A还可以向主从管理模块B通知控制器A发生了故障。主从管理模块B在接收到该通知后,可以将车控指令下发模块B的状态从禁止向车控总线B下发车控指令切换至允许向车控总线B下发车控指令。
可选地,该方法应用于交通工具中,该向执行器发送该第一控制指令之前,该方法还包括:确定该交通工具处于自动驾驶状态;其中,该方法400还包括:提示用户接管该交通工具。
可选地,该提示用户接管该交通工具,包括:控制提示装置提示用户接管交通工具。示例性的,可以通过控制显示屏显示提示信息、控制氛围灯颜色变化以及控制语音模块发出语音提示音中的一种或者多种,来提示用户接管交通工具。
可选地,该第一控制器在第一时刻发生故障,该方法还包括:该第一控制器向该第二控制器发送第三感知结果,该第三感知结果包括第一时间段内该第一控制器对该第一传感器组中的传感器采集的数据的感知结果,该第一时间段位于该第一时刻之前;该第二控制器根据该第三感知结果和该第二感知结果,控制车辆停车。
可选地,通过CAN总线或者CANFD总线输出数据的传感器,可以分别接入第一控制器和第二控制器中。
可选地,该第一传感器组和该第二传感器组中包括定位传感器和/或毫米波雷达。
可选地,该第二传感器组中包括侧视摄像头。
以该交通工具是车辆为例,该第二控制器可以用于负责车辆的安全停车。这样在第二传感器组中包括侧视摄像头,可以保证在第一控制器出现故障时,第二控制器通过侧视摄像头采集的数据实现车辆的安全停车。
本申请实施例还提供用于实现以上任一种方法的装置,例如,提供一种装置包括用以实现以上任一种方法中交通工具所执行的各步骤的单元(或手段)。
图5示出了本申请实施例提供的一种控制装置500的示意性框图。如图5所示,该装置500包括:第一控制单元510,用于根据第一传感器组中的传感器采集的数据,获取第一感知结果;第二控制单元520,用于根据第二传感器组中的传感器采集的数据,获取第二感知结果;该第二控制单元520,还用于向该第一控制单元510发送该第二感知结果;该第一控制单元510,还用于根据该第一感知结果和该第二感知结果,向执行器发送第一控制指令。
可选地,该第一控制单元510,还用于向该第二控制单元发送该第一感知结果;该第二控制单元520,还用于根据该第一感知结果和该第二感知结果,生成第二控制指令。
可选地,该第二控制单元520,还用于向该执行器发送该第二控制指令。
可选地,该第一控制单元510,还用于在该第一控制单元发生故障时,停止向该执行器发送该第一控制指令。
可选地,该第一控制单元510,还用于在该第一控制单元发生故障时,停止发送该第一控制指令;该第二控制单元520,用于在确定该第一控制单元发生故障且该第二控制单元未发生故障时,向该执行器发送该第二控制指令。
可选地,该第一控制单元510,还用于向该执行器发送该第一控制指令之前,确定该交通工具处于自动驾驶状态;该第一控制单元510,还用于在第一控制单元发生故障时,控制提示装置提示用户接管该交通工具。
可选地,该第一控制单元510在第一时刻发生故障,该第一控制单元510,还用于向该第二控制单元520发送第三感知结果,该第三感知结果包括第一时间段内该第一控制单元对该第一传感器组中的传感器采集的数据的感知结果,该第一时间段位于该第一时刻之前;该第二控制单元520,还用于根据该第三感知结果和该第二感知结果,控制车辆停车。
可选地,该第一传感器组中的至少部分传感器和该第二传感器组中的传感器不同。
可选地,该第一传感器组和该第二传感器组中包括定位传感器和毫米波雷达。
可选地,该第二传感器组中包括侧视摄像头。
应理解,以上装置中各单元的划分仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。此外,装置中的单元可以以处理器调用软件的形式实现;例如装置包括处理器,处理器与存储器连接,存储器中存储有指令,处理器调用存储器中存储的指令,以实现以上任一种方法或实现该装置各单元的功能,其中处理器例如为通用处理器,例如CPU或微处理器,存储器为装置内的存储器或装置外的存储器。或者,装置中的单元可以以硬件电路的形式实现,可以通过对硬件电路的设计实现部分或全部单元的功能,该硬件电路可以理解为一个或多个处理器;例如,在一种实现中,该硬件电路为ASIC,通过对电路内元件逻辑关系的设计,实现以上部分或全部单元的功能;再如,在另一种实现中,该硬件电路可以通过PLD实现,以FPGA为例,其可以包括大量逻辑门电路,通过配置文件来配置逻辑门电路之间的连接关系,从而实现以上部分或全部单元的功能。以上装置的所有单元可以全部通过处理器调用软件的形式实现,或全部通过硬件电路的形式实现,或部分通过处理器调用软件的形式实现,剩余部分通过硬件电路的形式实现。
在本申请实施例中,处理器是一种具有信号的处理能力的电路,在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如CPU、微处理器、GPU、或DSP等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为ASIC或PLD实现的硬件电路,例如FPGA。在可重构的硬件电路中,处理器加载配置文档,实现硬件电路配置的过程,可以理解为处理器加载指令,以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如NPU、TPU、DPU等。
可见,以上装置中的各单元可以是被配置成实施以上方法的一个或多个处理器(或处理电路),例如:CPU、GPU、NPU、TPU、DPU、微处理器、DSP、ASIC、FPGA,或这些处理器形式中至少两种的组合。
此外,以上装置中的各单元可以全部或部分可以集成在一起,或者可以独立实现。在一种实现中,这些单元集成在一起,以片上系统(system-on-a-chip,SOC)的形式实现。该SOC中可以包括至少一个处理器,用于实现以上任一种方法或实现该装置各单元的功能,该至少一个处理器的种类可以不同,例如包括CPU和FPGA,CPU和人工智能处理器,CPU和GPU等。
本申请实施例还提供了一种装置,该装置包括处理单元和存储单元,其中存储单元用于存储指令,处理单元执行存储单元所存储的指令,以使该装置执行上述实施例执行的方法或者步骤。
可选地,若该装置位于车辆中,上述处理单元可以是图1所示的处理器151-15n。
本申请实施例还提供了一种交通工具,该交通工具可以包括上述控制装置500。
可选地,该交通工具可以为车辆。
本申请实施例还提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在计算机上运行时,使得计算机执行上述方法。
本申请实施例还提供了一种计算机可读介质,所述计算机可读介质存储有程序代码,当所述程序代码在计算机上运行时,使得计算机执行上述方法。
在实现过程中,上述方法的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。结合本申请实施例所公开的方法可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。为避免重复,这里不再详细描述。
应理解,本申请实施例中,该存储器可以包括只读存储器和随机存取存储器,并向处理器提供指令和数据。
还应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (26)

  1. 一种控制方法,其特征在于,包括:
    第一控制器根据第一传感器组中的传感器采集的数据,获取第一感知结果;
    第二控制器根据第二传感器组中的传感器采集的数据,获取第二感知结果;
    所述第一控制器接收所述第二控制器发送的所述第二感知结果;
    所述第一控制器根据所述第一感知结果和所述第二感知结果,向执行器发送第一控制指令。
  2. 如权利要求1所述的方法,其特征在于,所述方法还包括:
    所述第二控制器接收所述第一控制器发送的所述第一感知结果;
    所述第二控制器根据所述第一感知结果和所述第二感知结果,生成第二控制指令。
  3. 如权利要求2所述的方法,其特征在于,所述方法还包括:
    所述第二控制器向所述执行器发送所述第二控制指令。
  4. 如权利要求3所述的方法,其特征在于,所述方法还包括:
    在所述第一控制器发生故障时,所述第一控制器停止向所述执行器发送所述第一控制指令。
  5. 如权利要求2所述的方法,其特征在于,所述方法还包括:
    在所述第一控制器发生故障时,所述第一控制器停止发送所述第一控制指令;
    所述第二控制器在确定所述第一控制器发生故障且所述第二控制器未发生故障时,向所述执行器发送所述第二控制指令。
  6. 如权利要求4或5所述的方法,其特征在于,所述方法应用于交通工具中,所述向执行器发送所述第一控制指令之前,所述方法还包括:
    确定所述交通工具处于自动驾驶状态;
    其中,所述方法还包括:
    提示用户接管所述交通工具。
  7. 如权利要求4至6中任一项所述的方法,其特征在于,所述第一控制器在第一时刻发生故障,所述方法还包括:
    所述第一控制器向所述第二控制器发送第三感知结果,所述第三感知结果包括第一时间段内所述第一控制器对所述第一传感器组中的传感器采集的数据的感知结果,所述第一时间段位于所述第一时刻之前;
    所述第二控制器根据所述第三感知结果和所述第二感知结果,控制交通工具停止行驶。
  8. 如权利要求1至7中任一项所述的方法,其特征在于,所述第一传感器组中的至少部分传感器和所述第二传感器组中的传感器不同。
  9. 如权利要求1至8中任一项所述的方法,其特征在于,所述第一传感器组和所述第二传感器组中包括定位传感器和毫米波雷达。
  10. 如权利要求1至9中任一项所述的方法,其特征在于,所述第二传感器组中包括侧视摄像头。
  11. 一种控制装置,其特征在于,包括:
    第一控制单元,用于根据第一传感器组中的传感器采集的数据,获取第一感知结果;
    第二控制单元,用于根据第二传感器组中的传感器采集的数据,获取第二感知结果;
    所述第二控制单元,还用于向所述第一控制单元发送所述第二感知结果;
    所述第一控制单元,用于根据所述第一感知结果和所述第二感知结果,向执行器发送第一控制指令。
  12. 如权利要求11所述的装置,其特征在于,
    所述第一控制单元,还用于向所述第二控制单元发送所述第一感知结果;
    所述第二控制单元,还用于根据所述第一感知结果和所述第二感知结果,生成第二控制指令。
  13. 如权利要求12所述的装置,其特征在于,
    所述第二控制单元,还用于向所述执行器发送所述第二控制指令。
  14. 如权利要求13所述的装置,其特征在于,
    所述第一控制单元,还用于在所述第一控制单元发生故障时,停止向所述执行器发送所述第一控制指令。
  15. 如权利要求12所述的装置,其特征在于,
    所述第一控制单元,还用于在所述第一控制单元发生故障时,停止发送所述第一控制指令;
    所述第二控制单元,用于在确定所述第一控制单元发生故障且所述第二控制单元未发生故障时,向所述执行器发送所述第二控制指令。
  16. 如权利要求14或15所述的装置,其特征在于,
    所述第一控制单元,还用于向所述执行器发送所述第一控制指令之前,确定所述交通工具处于自动驾驶状态;
    所述第一控制单元,还用于在第一控制单元发生故障时,控制提示装置提示用户接管所述交通工具。
  17. 如权利要求14至16中任一项所述的装置,其特征在于,所述第一控制单元在第一时刻发生故障,
    所述第一控制单元,还用于向所述第二控制单元发送第三感知结果,所述第三感知结果包括第一时间段内所述第一控制单元对所述第一传感器组中的传感器采集的数据的感知结果,所述第一时间段位于所述第一时刻之前;
    所述第二控制单元,还用于根据所述第三感知结果和所述第二感知结果,控制所述交通工具停止行驶。
  18. 如权利要求11至17中任一项所述的装置,其特征在于,所述第一传感器组中的至少部分传感器和所述第二传感器组中的传感器不同。
  19. 如权利要求11至18中任一项所述的装置,其特征在于,所述第一传感器组和所述第二传感器组中包括定位传感器和毫米波雷达。
  20. 如权利要求11至19中任一项所述的装置,其特征在于,所述第二传感器组中包括侧视摄像头。
  21. 一种控制装置,其特征在于,所述控制装置包括处理器和存储器,所述存储器用于存储程序指令,所述处理器用于调用所述程序指令来执行权利要求1至10中任一项所述的方法。
  22. 一种交通工具,其特征在于,所述交通工具包括如权利要求11至21中任一项所述的控制装置。
  23. 如权利要求22所述的交通工具,其特征在于,所述交通工具为车辆。
  24. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有程序代码,当所述程序代码在计算机上运行时,使得计算机执行如权利要求1至10中任意一项所述的方法。
  25. 一种芯片,其特征在于,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,以执行如权利要求1至10中任一项所述的方法。
  26. 一种计算机程序产品,其特征在于,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在计算机上运行时,使得所述计算机执行如权利要求1至10中任一项所述的方法。
PCT/CN2022/087879 2022-04-20 2022-04-20 一种控制方法、装置和交通工具 WO2023201563A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/087879 WO2023201563A1 (zh) 2022-04-20 2022-04-20 一种控制方法、装置和交通工具
CN202280005225.9A CN117279818A (zh) 2022-04-20 2022-04-20 一种控制方法、装置和交通工具

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/087879 WO2023201563A1 (zh) 2022-04-20 2022-04-20 一种控制方法、装置和交通工具

Publications (1)

Publication Number Publication Date
WO2023201563A1 true WO2023201563A1 (zh) 2023-10-26

Family

ID=88418932

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087879 WO2023201563A1 (zh) 2022-04-20 2022-04-20 一种控制方法、装置和交通工具

Country Status (2)

Country Link
CN (1) CN117279818A (zh)
WO (1) WO2023201563A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117163071A (zh) * 2023-11-03 2023-12-05 安徽蔚来智驾科技有限公司 车辆控制方法、控制装置、可读存储介质及车辆

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105515739A (zh) * 2014-10-14 2016-04-20 罗伯特·博世有限公司 用于自动化行驶的防止失效的e/e结构
CN108776472A (zh) * 2018-05-17 2018-11-09 驭势(上海)汽车科技有限公司 智能驾驶控制方法及系统、车载控制设备和智能驾驶车辆
CN109367501A (zh) * 2018-09-07 2019-02-22 百度在线网络技术(北京)有限公司 自动驾驶系统、车辆控制方法及装置
CN111295319A (zh) * 2018-12-26 2020-06-16 华为技术有限公司 车辆控制方法、相关设备及计算机存储介质
CN113267992A (zh) * 2021-07-19 2021-08-17 北京踏歌智行科技有限公司 一种基于冗余设计的矿卡无人驾驶控制系统
WO2021255985A1 (ja) * 2020-06-16 2021-12-23 日立Astemo株式会社 電子制御装置、及び車両制御方法
CN113838226A (zh) * 2020-06-23 2021-12-24 图森有限公司 用于自动驾驶车辆的冗余硬件和软件架构


Also Published As

Publication number Publication date
CN117279818A (zh) 2023-12-22


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202280005225.9

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22937801

Country of ref document: EP

Kind code of ref document: A1