WO2023050129A1 - Method and Apparatus for Vehicle Control - Google Patents

Method and Apparatus for Vehicle Control

Info

Publication number
WO2023050129A1
Authority
WO
WIPO (PCT)
Prior art keywords: function, sub, sensor, failure, functions
Application number: PCT/CN2021/121638
Other languages: English (en), French (fr)
Inventors: 佘晓丽, 方晔阳, 李帅君
Original Assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司
Priority to CN202180039563.XA (patent CN116209608A)
Priority to PCT/CN2021/121638 (publication WO2023050129A1)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02: Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures

Definitions

  • the embodiments of the present application relate to the field of smart cars, and more specifically, to a method and device for controlling a vehicle.
  • smart terminals such as smart transportation equipment and smart cars are gradually entering people's daily lives.
  • Sensors play a very important role in smart terminals.
  • Various sensors installed on the smart terminal, such as millimeter-wave radar, lidar, cameras, and ultrasonic radar, sense the surrounding environment while the smart terminal moves and collect data for the smart terminal to recognize its environment: for example, identifying and tracking moving objects, recognizing stationary scenes such as lane lines and signboards, and planning paths in combination with navigator and map data.
  • Sensors can detect possible dangers in advance and assist or even take necessary avoidance measures autonomously, effectively increasing the safety and comfort of smart terminals.
  • Intelligent driving is a mainstream application in the field of artificial intelligence.
  • sensor failures can occur for reasons including wiring harnesses loosened by vehicle vibration, damage to the sensor surface, and occlusion by dirt. A damaged sensor can greatly degrade the performance of the perception system and thereby the vehicle's intelligent driving capability.
  • Embodiments of the present application provide a vehicle control method and device, which can improve the intelligence level of automatic driving on the premise of ensuring driving safety.
  • a vehicle control method, including: acquiring sensor state information and automatic driving function information, where the sensor state information includes information indicating that one or more sensors are in a failure state, and the automatic driving function information indicates the currently running automatic driving function, which includes a plurality of operating conditions; and determining, according to the impact of the failure of the one or more sensors on a first operating condition, the driving strategy corresponding to the failure of the one or more sensors under the first operating condition, where the first operating condition is any one of the plurality of operating conditions.
  • Operating conditions refer to the various working states or scenarios of an automatic driving function, such as cruising, car following, lane keeping, lane changing, turning, start-stop, reverse parking, and side parking; more specifically, for example, the cruising condition of ICA, the following condition of AVP, or the reverse-into-slot condition of APA.
  • Intelligent driving vehicles generally present or provide automatic driving functions to the vehicle user according to different operating conditions (for example, in the form of automatic driving function packages); that is, when the driver activates an automatic driving function, one or several function packages corresponding to different operating conditions are generally activated.
  • the driver can turn corresponding operating conditions on or off according to his preferences or needs: for example, when the driver wants to enable the car-following and lane-keeping functions, he can send a corresponding message to the vehicle through a lever or button; when the driver wants to enable only the car-following function but not the lane-keeping function, he can send a corresponding automatic driving instruction to the vehicle through another lever or button.
  • the vehicle control solution provided by the embodiment of the present application formulates different levels of driving strategies according to the degree to which each operating condition depends on different sensors, rather than simply exiting all automatic driving functions directly. In this way, when one or more sensors fail during automatic driving, the vehicle maintains all or part of the current automatic driving function as far as possible, improving the user's automatic driving experience while ensuring driving safety.
  • the vehicle control solution of the embodiment of the present application can perform control at the granularity of vehicle operating conditions. Compared with methods that can only control at the granularity of the overall automatic driving function, it provides more flexible, finer-grained automatic driving capabilities and improves the user's automatic driving experience.
  • the driving strategies of different levels are multi-level function-degradation driving strategies, including: maintaining the current automatic driving function, disabling part of the automatic driving function, and exiting the automatic driving function.
  • the vehicle control solution provided by the embodiment of the present application formulates at least three levels of driving strategies according to the different degrees to which vehicle operating conditions depend on different sensors, instead of only either maintaining the automatic driving function or directly exiting it. This provides more flexible, finer-grained levels of automatic driving capability and ensures that functional degradation is reasonable.
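  • As an illustrative sketch (not taken from the patent itself), the three degradation levels described above could be encoded as a simple mapping from the impact of a failure on an operating condition to a strategy; all identifier names here are hypothetical.

```python
# Hypothetical sketch of the three-level degradation scheme described above.
# The impact labels and strategy names are illustrative, not from the patent.

def strategy_for_impact(impact: str) -> str:
    """Map the impact of a sensor failure on one operating condition
    to one of the three driving-strategy levels."""
    levels = {
        "none": "maintain",                 # failure does not affect the condition
        "non_critical": "disable_partial",  # only non-critical sub-functions are lost
        "critical": "exit",                 # a key sub-function is lost
    }
    return levels[impact]
```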
  • the realization of the first operating condition is related to one or more sub-functions, which include at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions; the first operating condition is any one of the plurality of operating conditions. The loss of a key sub-function makes the first operating condition impossible to realize; the loss of a non-critical sub-function does not affect the realization of the other sub-functions of the first operating condition; and the loss of an auxiliary sub-function does not affect the realization of the first operating condition.
  • each operating condition is related to one or more sub-functions.
  • Sub-functions are generally not presented to the vehicle user, i.e., the driver cannot select a specific sub-function within an operating condition.
  • the realization of the turning condition can depend on non-critical sub-functions for leftward, rightward, and rearward target recognition, together with an auxiliary target-recognition sub-function;
  • the realization of the oblique parking condition can depend on a key sub-function for nearby obstacle recognition, a non-critical sub-function for high-precision recognition, and an auxiliary target-recognition sub-function.
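  • The two example conditions above can be captured in a small dependency table. The condition and sub-function names follow the text, while the dictionary layout is an assumption made for illustration.

```python
# Illustrative sub-function dependency table for the two example operating
# conditions; the structure and identifier names are assumed, not the patent's.
CONDITION_SUBFUNCTIONS = {
    "turning": {
        "left_target_recognition": "non_critical",
        "right_target_recognition": "non_critical",
        "rearward_target_recognition": "non_critical",
        "target_recognition": "auxiliary",
    },
    "oblique_parking": {
        "nearby_obstacle_recognition": "key",
        "high_precision_recognition": "non_critical",
        "target_recognition": "auxiliary",
    },
}
```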
  • determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition includes: determining that the driving strategy corresponding to a first-type sensor failure under the first operating condition is to maintain the automatic driving function, where the one or more sensors include the first type of sensor, and the failure of the first type of sensor does not affect the realization of the first operating condition.
  • determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition includes: determining that the driving strategy corresponding to a second-type sensor failure under the first operating condition is to disable part of the automatic driving function, where the one or more sensors include the second type of sensor, and the failure of the second type of sensor affects the realization of a non-critical sub-function of the first operating condition.
  • when some sensors fail so that the first operating condition cannot operate fully, the vehicle can disable the part of the automatic driving function associated with the failed sensors while maintaining the rest of the current automatic driving function. The driver therefore does not need to manually take over some or all of the automatic driving functions, which improves the automatic driving experience while ensuring driving safety.
  • determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition includes: determining that the driving strategy corresponding to a third-type sensor failure under the first operating condition is to exit the automatic driving function, where the one or more sensors include the third type of sensor, and the failure of the third type of sensor affects the realization of a key sub-function of the first operating condition.
  • the first type of sensor includes a first sensor unit, the first sensor unit is associated with a first sub-function, and there are other non-failed sensor units associated with the first sub-function; the first sub-function is any one of the one or more sub-functions.
  • under a certain operating condition, when the sensing information of failed sensors can be supplemented by other non-failed sensors, the vehicle can still maintain the current automatic driving function. This makes it possible to provide more flexible, richer-layered driving strategies and improve the user's automatic driving experience.
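  • The redundancy check just described might be sketched as follows; the sensor record layout (keys "id", "ok", "subfunctions") is a hypothetical representation, not the patent's data model.

```python
def failure_masked_by_backup(failed_id, sub_function, sensors):
    """Return True when some non-failed sensor other than `failed_id` also
    provides `sub_function`, so the failure does not affect the operating
    condition. Each sensor is a dict with keys "id", "ok", "subfunctions"."""
    return any(
        s["id"] != failed_id and s["ok"] and sub_function in s["subfunctions"]
        for s in sensors
    )
```

  • For instance, a failed front camera whose forward target recognition is also covered by a working front radar would be treated as masked, so the current automatic driving function could be maintained.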
  • the first type of sensor further includes a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.
  • the second type of sensor includes a third sensor unit, the third sensor unit is associated with a third non-critical sub-function, and there is no other non-failed sensor associated with the third non-critical sub-function; the third non-critical sub-function is any one of the one or more non-critical sub-functions.
  • the second type of sensor further includes a fourth sensor unit, the fourth sensor unit is associated with a fourth key sub-function and a fourth non-key sub-function, and there exist other non-failed sensor units that perceive the surrounding environment (vehicles, railings, etc.) and can supplement the fourth key sub-function; the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-key sub-function is any one of the one or more non-critical sub-functions.
  • the surrounding environment and the other sensor units can thus supplement the key sensing information of the failed sensor.
  • the vehicle therefore need not exit the automatic driving function, which improves the user's automatic driving experience.
  • the third type of sensor includes a fifth sensor unit, the fifth sensor unit is associated with a fifth key sub-function, and there is no other non-failed sensor unit associated with the fifth key sub-function, nor any non-failed sensor unit perceiving the surrounding environment that can supplement the fifth key sub-function; the fifth key sub-function is any one of the one or more key sub-functions.
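  • Putting the three sensor types together, an end-to-end strategy decision for one operating condition could look like the following self-contained sketch; the data shapes and identifier names are assumptions for illustration, not the patent's implementation.

```python
def driving_strategy(condition_deps, sensors):
    """Classify each failed sensor against one operating condition and return
    the resulting degraded driving strategy.

    condition_deps: {sub_function: "key" | "non_critical" | "auxiliary"}
    sensors: list of dicts with keys "id", "ok", "subfunctions" (hypothetical).
    """
    def has_backup(sub_function, failed_id):
        # A non-failed peer covering the same sub-function masks the failure.
        return any(
            s["id"] != failed_id and s["ok"] and sub_function in s["subfunctions"]
            for s in sensors
        )

    strategy = "maintain"
    for s in sensors:
        if s["ok"]:
            continue
        for sf in s["subfunctions"]:
            role = condition_deps.get(sf)
            if role in (None, "auxiliary") or has_backup(sf, s["id"]):
                continue                  # first-type case: condition unaffected
            if role == "key":
                return "exit"             # third-type case: key sub-function lost
            strategy = "disable_partial"  # second-type case: non-critical lost
    return strategy
```

  • With the oblique-parking dependencies from the earlier example, a failed ultrasonic sensor whose nearby-obstacle recognition is backed up by a working camera yields "maintain"; losing only high-precision recognition yields "disable_partial"; losing nearby-obstacle recognition with no backup yields "exit".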
  • a vehicle control device, including: an acquisition unit configured to acquire sensor state information and automatic driving function information, where the sensor state information includes information indicating that one or more sensors are in a failure state, and the automatic driving function information indicates the currently running automatic driving function, which includes a plurality of operating conditions; and a processing unit configured to determine, according to the impact of the failure of the one or more sensors on a first operating condition, the driving strategy corresponding to the failure of the one or more sensors under the first operating condition, where the first operating condition is any one of the plurality of operating conditions.
  • the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of the first type of sensor under the first operating condition is to maintain the automatic driving function, where the one or more sensors include the first type of sensor, and the failure of the first type of sensor does not affect the realization of the first operating condition.
  • the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of the second type of sensor under the first operating condition is to disable part of the current automatic driving function, where the one or more sensors include the second type of sensor, and the failure of the second type of sensor affects the realization of a non-critical sub-function of the first operating condition.
  • the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of the third type of sensor under the first operating condition is to exit the automatic driving function, where the one or more sensors include the third type of sensor, and the failure of the third type of sensor affects the realization of a key sub-function of the first operating condition.
  • a computing device, including a memory and a processor, where the memory is used to store program instructions; when the program instructions are executed by the processor, the processor performs the method described in the first aspect or the second aspect.
  • the processor in the third aspect above may include a central processing unit (CPU), or a combination of a CPU and a neural-network processing unit.
  • a computer-readable medium storing program code for execution by a device, where the program code is used to perform the method in the first aspect or the second aspect.
  • in a fifth aspect, a chip includes a processor and a data interface; the processor reads, through the data interface, instructions stored in a memory, and executes the method in the first aspect or the second aspect above.
  • optionally, the chip may further include a memory storing instructions; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor performs the method in the first aspect or in any one implementation manner of the first aspect.
  • the aforementioned chip may specifically be a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
  • an automatic driving vehicle, including at least one processor and a memory, where the at least one processor is coupled with the memory and is used to read and execute the instructions in the memory to perform the method in any one possible implementation of the first aspect above.
  • the method in the first aspect may specifically refer to the method in the first aspect or in any one of its various implementation manners.
  • Fig. 1 is a functional block diagram of a vehicle to which the embodiment of the present application is applied.
  • Fig. 2 is a schematic structural diagram of a vehicle sensor provided by an embodiment of the present application.
  • Fig. 3 is a schematic structural diagram of a simplified vehicle control system applicable to an embodiment of the present application.
  • Fig. 4 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application.
  • Fig. 5 is a schematic diagram of HMI interaction of the vehicle control method provided by the embodiment of the present application.
  • Fig. 6 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application.
  • Fig. 7 is a schematic block diagram of a vehicle control device provided by an embodiment of the present application.
  • Fig. 8 is a schematic block diagram of a vehicle control device provided by an embodiment of the present application.
  • the vehicle control method and device provided in the embodiments of the present application can be applied to intelligent driving vehicles, and can also be applied to intelligent terminals such as intelligent household equipment and robots.
  • FIG. 1 is a functional block diagram of a vehicle 100 to which the embodiment of the present application is applied.
  • the vehicle 100 may be an intelligent driving vehicle, and the vehicle 100 may fully or partially support an automatic driving mode.
  • Various subsystems may be included in the vehicle 100, such as a travel system 110, a sensing system 120, a control system 130, one or more peripheral devices 140, as well as a power supply 160, a computer system 150, and a user interface 170.
  • vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each subsystem and element of the vehicle 100 may be interconnected by wire or wirelessly.
  • the travel system 110 may include components for providing powered motion to the vehicle 100.
  • for example, the travel system 110 may include an engine 111, a transmission 112, an energy source 113, and wheels/tires 114.
  • the engine 111 may be an internal combustion engine, an electric motor, an air-compression engine, or a combination of engine types; for example, a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air-compression engine.
  • Engine 111 may convert energy source 113 into mechanical energy.
  • the energy source 113 may include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. Energy source 113 may also provide energy to other systems of vehicle 100 .
  • the transmission 112 may include a gearbox, a differential, and a drive shaft; the transmission 112 may transmit mechanical power from the engine 111 to the wheels 114.
  • the transmission 112 may also include other devices, such as clutches.
  • drive shafts may include one or more axles that may be coupled to one or more wheels 114 .
  • the sensing system 120 may include several sensors that sense information about the environment around the vehicle 100 .
  • the sensing system 120 may include a positioning system 121 (for example, the global positioning system (GPS), the BeiDou system, or another positioning system), an inertial measurement unit (IMU) 122, a radar 123, a laser rangefinder 124, a camera 125, and a vehicle speed sensor 126.
  • the sensing system 120 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an interior air quality monitor, fuel gauge, oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding properties (position, shape, orientation, velocity, etc.). Such detection and identification are critical to the safe operation of the autonomous vehicle 100.
  • the positioning system 121 can be used to estimate the geographic location of the vehicle 100 .
  • the IMU 122 may be used to sense changes in position and orientation of the vehicle 100 based on inertial acceleration.
  • IMU 122 may be a combination accelerometer and gyroscope.
  • radar 123 may utilize radio signals to sense objects within the surrounding environment of vehicle 100.
  • radar 123 may be used to sense the velocity and/or heading of an object.
  • laser range finder 124 may utilize laser light to sense objects in the environment in which vehicle 100 is located.
  • laser rangefinder 124 may include one or more laser sources, a laser scanner, and one or more detectors, among other system components.
  • camera 125 may be used to capture multiple images of the environment surrounding vehicle 100 .
  • camera 125 may be a still camera or a video camera.
  • a vehicle speed sensor 126 may be used to measure the speed of the vehicle 100 in real time.
  • the measured vehicle speed may be communicated to the control system 130 to enable control of the vehicle.
  • control system 130 controls the operation of the vehicle 100 and its components.
  • the control system 130 may include various elements, such as a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
  • the steering system 131 is operable to adjust the heading of the vehicle 100; for example, it may be a steering-wheel system.
  • the throttle 132 may be used to control the operating speed of the engine 111 and thus the speed of the vehicle 100 .
  • the braking unit 133 may be used to control the deceleration of the vehicle 100 ; the braking unit 133 may use friction to slow the wheels 114 . In other embodiments, the brake unit 133 can convert the kinetic energy of the wheel 114 into electric current. The braking unit 133 may also take other forms to slow down the rotation of the wheels 114 to control the speed of the vehicle 100 .
  • computer vision system 134 is operable to process and analyze images captured by camera 125 in order to identify objects and/or features in the environment surrounding vehicle 100 .
  • the aforementioned objects and/or features may include traffic information, road boundaries and obstacles.
  • the computer vision system 134 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • the computer vision system 134 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 135 may be used to determine the travel route of the vehicle 100 .
  • route control system 135 may combine data from sensors, GPS, and one or more predetermined maps to determine a travel route for vehicle 100 .
  • an obstacle avoidance system 136 may be used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 100 .
  • the control system 130 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be omitted.
  • the vehicle 100 can interact with external sensors, other vehicles, other computer systems or users through the peripheral devices 140 ; the peripheral devices 140 may include a wireless communication system 141 , an on-board computer 142 , a microphone 143 and/or a speaker 144 .
  • peripheral device 140 may provide a means for vehicle 100 to interact with user interface 170 .
  • on-board computer 142 may provide information to a user of vehicle 100 .
  • the user interface 170 can also operate the on-board computer 142 to receive user input; the on-board computer 142 can be operated through a touch screen.
  • peripheral device 140 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 143 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speaker 144 may output audio to a user of vehicle 100 .
  • the wireless communication system 141 may communicate wirelessly with one or more devices, either directly or via a communication network.
  • the wireless communication system 141 may use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication, such as long term evolution (LTE); or 5G cellular communication.
  • the wireless communication system 141 may use wireless Internet access (WiFi) to communicate with a wireless local area network (wireless local area network, WLAN).
  • the wireless communication system 141 can utilize an infrared link, Bluetooth, or the ZigBee protocol to communicate directly with a device, or use other wireless protocols, such as various vehicle communication systems; for example, the wireless communication system 141 can include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • power supply 160 may provide power to various components of vehicle 100 .
  • the power source 160 may be a rechargeable lithium ion battery or a lead acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100 .
  • power source 160 and energy source 113 may be implemented together, such as in some all-electric vehicles.
  • vehicle 100 may be controlled by computer system 150 , wherein computer system 150 may include at least one processor 151 executing instructions 153 stored in, for example, memory 152 .
  • the computer system 150 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 151 may be any conventional processor, such as a commercially available central processing unit (CPU).
  • the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processors.
  • although FIG. 1 functionally illustrates a processor, memory, and other elements of a computer in the same block, those of ordinary skill in the art will appreciate that the processor, computer, or memory may actually include multiple processors, computers, or memories that may or may not be located within the same physical housing.
  • memory may be a hard drive or other storage medium located in a different housing than the computer.
  • references to a processor or computer are to be understood to include references to collections of processors or computers or memories that may or may not operate in parallel.
  • some components such as the steering and deceleration components, may each have their own processor that only performs calculations related to component-specific functions .
  • the processor may be located remotely from the vehicle and be in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle while others are executed by a remote processor, including taking the necessary steps to perform a single maneuver.
  • memory 152 may contain instructions 153 (eg, program logic) that may be used by processor 151 to perform various functions of vehicle 100 , including those described above.
  • memory 152 may also include additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of travel system 110 , sensing system 120 , control system 130 , and peripheral devices 140 .
  • memory 152 may also store data such as road maps, route information, the vehicle's position, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 150 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • user interface 170 may be used to provide information to or receive information from a user of vehicle 100 .
  • user interface 170 may include one or more input/output devices within set of peripheral devices 140 , such as wireless communication system 141 , onboard computer 142 , microphone 143 and speaker 144 .
  • computer system 150 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 110 , sensing system 120 , and control system 130 ) and from user interface 170 .
  • computer system 150 may utilize input from control system 130 in order to control braking unit 133 to avoid obstacles detected by sensing system 120 and obstacle avoidance system 136 .
  • the computer system 150 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed separately from or associated with the vehicle 100 .
  • memory 152 may exist partially or completely separate from vehicle 100 .
  • the components described above may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as limiting the embodiment of the present application.
  • the vehicle 100 may be an autonomous vehicle traveling on the road, which can recognize objects in its surroundings to determine adjustments to the current speed.
  • Objects may be other vehicles, traffic control devices, or other types of objects.
  • each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance to the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
  • the vehicle 100 or a computing device associated with the vehicle 100 (such as the computer system 150 , computer vision system 134 , or memory 152 of FIG. 1 ) may predict the behavior of the identified objects based on the characteristics of the identified objects and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • each recognized object is dependent on the behavior of the others, so it is also possible to predict the behavior of a single recognized object by considering all recognized objects together.
  • the vehicle 100 is able to adjust its speed based on the predicted behavior of the identified object.
  • the self-driving car can determine based on the predicted behavior of the object that the vehicle will need to adjust (eg, accelerate, decelerate, or stop) to a steady state.
  • other factors may also be considered to determine the speed of the vehicle 100 , such as the lateral position of the vehicle 100 in the traveling road, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • the computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the self-driving car (e.g., cars in adjacent lanes on the road).
  • the above-mentioned vehicles 100 may be cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, playground vehicles, construction equipment, trams, golf carts, trains, and trolleys, etc.
  • the embodiments of the present application are not particularly limited in this regard.
  • the vehicle 100 shown in FIG. 1 may be an automatic driving vehicle, and the automatic driving system will be described in detail below.
  • Fig. 2 is a schematic structural diagram of a vehicle sensor provided by an embodiment of the present application.
  • the vehicle sensor in FIG. 2 may include the laser rangefinder 124 in the vehicle 100 in FIG. 1 , may include the vehicle speed sensor 126 in the vehicle 100 in FIG. 1 , and may also include other types of sensors.
  • the vehicle may include multiple sensors, such as lidar, front-view camera, rear-view camera, side-view camera, millimeter-wave radar, ultrasonic radar, fisheye camera, and the like.
  • a vehicle may include 3 lidars facing forward, left, and right respectively.
  • the laser radar (light detection and ranging, LiDAR) emits laser light toward the detection target, the receiver collects the light signal reflected by the target, and the distance of the target is determined by measuring the round-trip time of the transmitted signal. Owing to the high coherence, directivity, and monochromaticity of laser light, laser radar can realize long-distance, high-precision ranging. LiDAR extends the single-point ranging result to two dimensions through scanning or multi-element array detection to form a range image. At present, lidar is applied in many scenarios such as automatic driving, three-dimensional modeling of buildings, terrain mapping, robotics, and rendezvous and docking.
  • LIDAR can be used to identify the exact location and shape of objects.
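  • As a rough numerical illustration of the time-of-flight ranging just described (distance = speed of light × round-trip time / 2), the following Python sketch may help; the function name and the sample pulse time are illustrative assumptions, not taken from this application:

```python
# Time-of-flight ranging as described above: the distance to the target is
# half the round-trip path travelled by the laser pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s (air is very close)

def lidar_range(round_trip_s: float) -> float:
    """Distance (m) to a target from the measured round-trip time (s)."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 667 nanoseconds corresponds to ~100 m.
print(lidar_range(667e-9))  # ≈ 99.98 m
```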
  • a vehicle may include 6 millimeter-wave radars, including 1 forward, 1 backward, and 4 lateral.
  • the four millimeter-wave radars on the side can be corner millimeter-wave radars, oriented front-left, rear-left, front-right, and rear-right.
  • millimeter-wave radar is radar that operates in the millimeter-wave band.
  • the frequency of the millimeter wave is 30-300 gigahertz (GHz), that is, the wavelength is 1-10 millimeters (mm).
  • millimeter waves have wavelengths between those of microwaves and light waves.
  • the detection range of millimeter-wave radar is generally 0-200 meters. Compared with optical beams such as infrared and laser, millimeter waves have a stronger ability to penetrate fog, smoke, and dust, so millimeter-wave radar can work in all weather conditions.
  • Millimeter wave radar can be used to identify the distance and speed of objects.
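  • The stated band limits can be checked against the relation wavelength = c / frequency; a small sketch (approximating c as 3×10⁸ m/s; the function name is an illustrative assumption):

```python
C_APPROX = 3.0e8  # approximate speed of light, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Free-space wavelength in millimeters for a frequency given in GHz."""
    return C_APPROX / (freq_ghz * 1e9) * 1000.0

# 30 GHz -> 10 mm and 300 GHz -> 1 mm, matching the 1-10 mm band above.
print(wavelength_mm(30.0), wavelength_mm(300.0))
```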
  • a vehicle may include 12 ultrasonic radars.
  • the ultrasonic transmitter emits an ultrasonic signal in a certain direction outside, and starts timing at the same time as the ultrasonic wave is emitted.
  • the ultrasonic wave propagates through the air and returns immediately upon hitting an obstacle; the receiver stops timing as soon as the reflected wave is received.
  • the propagation speed of ultrasonic waves in air is about 340 meters per second (m/s); by recording the round-trip time t, the distance s from the emission point to the obstacle can be calculated as s = 340 × t / 2.
  • the propagation speed of ultrasonic waves is relatively slow.
  • when the vehicle is traveling at high speed, ultrasonic ranging cannot keep up with the real-time changes in the distance between cars, and the error is relatively large.
  • the ultrasonic wave has a large scattering angle and poor directivity.
  • however, in short-range measurement, ultrasonic ranging sensors have great advantages.
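  • The ranging arithmetic above, s = 340 × t / 2, together with the slow-update drawback, can be sketched as follows (the function names and the 3 m example are illustrative assumptions):

```python
V_SOUND = 340.0  # propagation speed of ultrasound in air, m/s (as above)

def ultrasonic_range(round_trip_s: float) -> float:
    """Distance (m) from the emission point to the obstacle."""
    return V_SOUND * round_trip_s / 2.0

def round_trip_time(distance_m: float) -> float:
    """Round-trip time (s) of the echo for a target at the given distance."""
    return 2.0 * distance_m / V_SOUND

print(ultrasonic_range(0.01))   # a 10 ms round trip -> ≈ 1.7 m
# Even a target only 3 m away needs ~17.6 ms per measurement, which is why
# ultrasonic ranging struggles to track fast inter-vehicle distance changes.
print(round_trip_time(3.0))     # ≈ 0.0176 s
```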
  • the vehicle may include four forward-looking cameras, which are respectively a telephoto camera, a wide-angle camera, and a binocular camera (including two cameras).
  • a telephoto camera, also known as a long-focus camera, refers to a camera that has a longer focal length than a standard camera.
  • the focal length of a telephoto camera is generally in the range of 135-800 millimeters (mm), and some telephoto cameras have even larger focal lengths.
  • a wide-angle camera is a camera with a shorter focal length than a standard camera and a larger viewing angle than a standard camera.
  • the focal length of an ordinary wide-angle lens is generally 38-24 mm, and the viewing angle is 60-84 degrees; the focal length of an ultra-wide-angle lens is 20-13 mm, and the viewing angle is 94-118 degrees. Because the focal length of the wide-angle lens is short and the angle of view is large, a larger area of the scene can be captured within a shorter shooting distance.
  • Binocular cameras can be used for ranging.
  • the ranging principle of the binocular camera is similar to that of the human eye.
  • the human eye can perceive the distance of an object because the images of the same object presented to the two eyes differ, which is known as "parallax". The farther the object, the smaller the parallax; conversely, the larger the parallax.
  • the size of the parallax corresponds to the distance between the object and the eye.
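  • The inverse relationship between parallax and distance can be made concrete with the standard stereo triangulation formula Z = f · B / d (focal length f, baseline B, disparity d). This formula is the conventional stereo-vision relation, and all numbers below are illustrative assumptions, not taken from this application:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (m) of a point from its disparity between the two camera images."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With an assumed 1000-px focal length and 0.12 m baseline: the nearer object
# produces the larger disparity, the farther object the smaller one.
print(stereo_depth(1000.0, 0.12, 60.0))  # ≈ 2 m
print(stereo_depth(1000.0, 0.12, 6.0))   # ≈ 20 m
```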
  • a vehicle may include a rear view camera.
  • the vehicle may include four side-facing cameras facing left front, left rear, right front, and right rear, respectively.
  • the rear-view camera and the side-facing camera can be standard cameras or mid-range cameras.
  • the vehicle may include four fisheye cameras facing forward, backward, left, and right respectively.
  • a fisheye camera is a special type of lens in an ultra-wide-angle camera.
  • the angle of view of a fisheye camera can reach or exceed what the human eye can see.
  • the focal length of the fisheye camera is generally 16mm or less, and the angle of view of the fisheye camera is close to or equal to 180 degrees.
  • the front lens element of this camera has a very short diameter and protrudes toward the front of the lens in a parabolic shape, quite similar to the eye of a fish.
  • in some solutions, the part of the system that can no longer operate is handed over to the driver after a sensor fails.
  • the driver is allowed to input the speed limit when the speed limit cannot be recognized, and the driver is responsible for controlling the direction when the lane cannot be recognized.
  • although this can strictly guarantee driving safety, it reduces the usability of the intelligent driving system and degrades the intelligent driving experience.
  • the vehicle control scheme provided by the embodiments of the present application formulates driving strategies of different levels according to the degree to which the automatic driving function depends on different sensors and according to the vehicle operating conditions, rather than simply exiting all automatic driving functions directly. In this way, during automatic driving, when one or more sensors fail, the vehicle maintains all or part of the current automatic driving function as much as possible, which can improve the user's automatic driving experience while ensuring driving safety.
  • Fig. 3 shows a schematic structural diagram of a simplified vehicle control system according to an embodiment of the present application.
  • the vehicle control system in Figure 3 can implement different driving strategies for different sensor failures in Figure 2, and perform information interaction with the driver.
  • the vehicle control system shown in FIG. 3 may include a sensor state monitoring module 310 , a central processing unit 320 , a regulation and control module 330 and a display module 340 .
  • the sensor state monitoring module 310 can monitor in real time whether one or more of the multiple sensors included in the sensor system 120 in FIG. 1 has failed or is abnormal.
  • the form of sensor failure or abnormality may specifically include: whether the sensor has a signal, whether the sensor signal is abnormal, whether the sensing result of the sensor is abnormal, and the like.
  • the sensor status monitoring module 310 may send failure information of the monitored one or more sensors to the central processing unit 320 .
  • the central processing unit 320 may receive sensor status information sent by the sensor status monitoring module 310 .
  • the central processing unit 320 determines the corresponding function degradation strategy when one or more sensors fail according to the currently running intelligent driving function, and sends the function degradation driving strategy to the regulation and control module 330 .
  • the regulation and control module 330 receives and executes the function degradation strategy sent by the central processing unit 320 .
  • the central processing unit can display the function-degraded driving strategy on a display module 340 such as a human machine interface (HMI), informing the driver of the current sensor failure status, the driving strategy, and so on.
  • the central processing unit 320 can also express the driving strategy through voice, ambient light, vibration, etc., to remind or warn the driver of the current sensor failure status and function degradation strategy.
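  • The interaction among the monitoring module 310 , central processing unit 320 , regulation module 330 and display module 340 described above can be sketched as a simple message flow; all function names and the strategy table below are illustrative assumptions, not the actual implementation:

```python
# Hypothetical sketch of the FIG. 3 pipeline: monitored failures flow to the
# central processing unit, which selects a function-degradation strategy and
# fans it out to the control side and to the HMI.
FAILURE_TO_STRATEGY = {
    "auxiliary": "maintain current automatic driving function",
    "non_critical": "disable some automatic driving functions",
    "critical": "exit current automatic driving function",
}

def central_processing_unit(failed_sensor_class: str) -> str:
    """Map a monitored sensor-failure class to a driving strategy."""
    return FAILURE_TO_STRATEGY[failed_sensor_class]

def regulation_module(strategy: str) -> str:
    """The regulation module receives and executes the strategy."""
    return f"executing: {strategy}"

def display_module(strategy: str) -> str:
    """The HMI reminds the driver of the failure status and strategy."""
    return f"HMI: sensor failure detected; strategy = {strategy}"

strategy = central_processing_unit("non_critical")
print(regulation_module(strategy))
print(display_module(strategy))
```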
  • Fig. 4 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application. The method of FIG. 4 may be performed by the vehicle of FIG. 1 , the sensor of FIG. 2 , and the vehicle control system of FIG. 3 .
  • the sensor status information may include information indicating that one or more sensors are in a failure state, and the automatic driving function information may be used to indicate a currently operating automatic driving function, and the currently operating automatic driving function includes a plurality of operating conditions.
  • there are many forms of sensor failure. For example, a sensor may have no signal due to drive or wiring problems, a sensor may be damaged by splashing gravel, or a sensor's perception performance may degrade due to water or dirt. In this embodiment of the present application, no limitation is imposed on the specific form of sensor failure.
  • the automatic driving function can be integrated cruise assist (integrated cruise assist, ICA), navigation cruise assist (navigation cruise assist, NCA), automatic parking assist (auto parking assist, APA), remote parking assist (remote parking assist, RPA), autonomous valet parking (automated valet parking, AVP) and other modes, the embodiment of the present application only uses the automatic driving function of the vehicle as an example and does not limit it.
  • automatic cruise can be carried out in the mode of ICA or NCA.
  • ICA mode does not require high-precision maps, and can use sensors for road recognition. ICA mode is suitable for the situation when the vehicle is driving on the highway.
  • NCA mode requires prefabricated high-precision maps. Using the NCA mode, route planning and cruise can be realized according to the destination entered by the user on the map. NCA is suitable for vehicles driving on urban roads or highways.
  • autonomous parking can use modes such as AVP, APA or RPA.
  • in AVP mode, the automatic driving system replaces the driver to complete the driving and parking tasks from a specific area of the parking lot (such as the entrance and exit, or the elevator room) to the target parking space. That is to say, in the process of parking in AVP mode, the automatic driving system can control the cruising of the vehicle and sense the surrounding environment to determine whether there is an empty parking space. After it is determined that there is an empty parking space, the positional relationship between the empty parking space and the vehicle can be determined through the APA mode or RPA mode, and the vehicle is controlled to drive to the parking space.
  • ICA can include cruising, car following, start and stop, lane keeping, lane changing, etc.
  • NCA and AVP can include cruising, car following, starting and stopping, lane keeping, lane changing, turning, etc.
  • APA and RPA can include working conditions such as parallel parking, reversing into a perpendicular space, and parking in angled spaces.
  • the operating conditions included in the automatic driving function of the vehicle are only used as examples without limitation.
  • according to the currently running automatic driving function, a driving strategy corresponding to the failure of the one or more sensors under a first operating condition is determined, where the first operating condition is any one of the multiple operating conditions.
  • the above-mentioned driving strategy is a multi-level function degradation driving strategy, including: maintaining the current automatic driving function, disabling some automatic driving functions, and exiting the current automatic driving function.
  • the realization of the first operating condition is related to one or more sub-functions, and the one or more sub-functions include at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions. The first operating condition is any one of the multiple operating conditions of the current automatic driving function. The loss of a key sub-function causes the failure of the first operating condition; the loss of a non-critical sub-function does not affect the realization of the sub-functions other than that non-critical sub-function in the first operating condition; and the loss of an auxiliary sub-function does not affect the realization of the first operating condition.
  • the first operating condition can be any operating condition under any automatic driving function; for example, it can be the cruising condition of ICA, the car-following condition of AVP, or the reversing-into-space condition of APA.
  • the first working condition is only used as an example without limitation.
  • the failure of the first type of sensor does not affect the realization of the first operating condition, so under the first operating condition the driving strategy corresponding to the failure of the first type of sensor is to maintain the current automatic driving function.
  • failure of the first type of sensor does not affect the realization of the first working condition. Therefore, when the first type of sensor fails, the first operating condition can still operate normally.
  • the failure of the second type of sensor affects the realization of non-critical sub-functions in the first operating condition. Under the first operating condition, the driving strategy corresponding to the failure of this type of sensor is to disable some automatic driving functions.
  • when the second type of sensor fails, the non-critical sub-functions under the first operating condition cannot be realized (for example, the turning function or the high-precision detection function cannot be realized), while the remaining sub-functions other than the non-critical sub-functions can still be realized.
  • since disabling the non-critical sub-function corresponding to the failed second type of sensor does not cause driving safety or driving route deviation problems, the driving strategy corresponding to the failure of the second type of sensor is to disable the non-critical sub-functions associated with the second type of sensor while maintaining the current automatic driving function.
  • the failure of the third type of sensor affects the realization of key sub-functions in the first operating condition. Under the first operating condition, the driving strategy corresponding to the failure of this type of sensor is to exit the current automatic driving function.
  • the loss of critical sub-functions can lead to driving safety or driving route deviation issues.
  • when the third type of sensor fails, the key sub-functions of the first operating condition are lost and the first operating condition cannot be realized; therefore, under the first operating condition, the driving strategy corresponding to the failure of the third type of sensor is to exit the automatic driving function.
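  • A minimal sketch of the three-way classification above, assuming a per-condition table of which sensors back key, non-critical, and auxiliary sub-functions (the table contents are invented for illustration only):

```python
from typing import Dict, Set

# Illustrative dependency table for one operating condition; which physical
# sensor backs which class of sub-function is an assumption for this sketch.
CONDITION_DEPS: Dict[str, Set[str]] = {
    "key": {"forward_lidar", "front_camera"},      # loss => condition fails
    "non_critical": {"corner_radar_front_left"},   # loss => disable sub-function
    "auxiliary": {"rear_fisheye_camera"},          # loss => no effect
}

def driving_strategy(failed_sensors: Set[str]) -> str:
    """Pick the most conservative strategy demanded by the failed sensors."""
    if failed_sensors & CONDITION_DEPS["key"]:
        return "exit current automatic driving function"
    if failed_sensors & CONDITION_DEPS["non_critical"]:
        return "disable associated non-critical sub-functions"
    return "maintain current automatic driving function"

print(driving_strategy({"rear_fisheye_camera"}))
print(driving_strategy({"corner_radar_front_left"}))
print(driving_strategy({"front_camera", "corner_radar_front_left"}))
```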
  • Fig. 5 is a schematic diagram of HMI interaction of the vehicle control method provided by the embodiment of the present application.
  • the driving strategy corresponding to the failure of one or more sensors can be conveyed to the driver in different forms through the vehicle control system in FIG. 3 , so that the driver can be informed of the sensor failure state and the function degradation driving strategy and can perform related operations.
  • the HMI informs the driver of the sensor failure status and reminds the driver to pay attention to the road conditions.
  • Notification and reminder methods may include: displaying sensor failure status and road conditions that need to be observed on the screen in the car in the form of text and/or pictures, and the vehicle voice system broadcasting the status of sensor failure and road conditions that need to be observed.
  • the HMI informs the driver of the sensor failure status, the driving strategy of function degradation (such as vehicle slowing down, prohibiting left turn, etc.), and reminds the driver to pay attention to observe the road conditions.
  • notification and reminder methods may include: displaying the sensor failure status, the corresponding function-degraded driving strategy, and the road conditions that need to be observed in the form of text and/or pictures on the vehicle screen, and the vehicle voice system broadcasting the sensor failure status, the function-degraded driving strategy, and the road conditions that need to be observed.
  • the first operating condition is any one of various operating conditions of the current automatic driving function.
  • the HMI informs the driver of the sensor failure status and warns the driver to take over the vehicle.
  • the notification and warning methods include: the HMI displaying the sensor failure status and the takeover request in a conspicuous color such as red, the vehicle voice system broadcasting the sensor failure status and the takeover request, and requesting the driver to take over the vehicle by means of vibration or interior ambient light.
  • Fig. 6 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application. The method of FIG. 6 may be performed by the vehicle of FIG. 1 , the sensor of FIG. 2 , and the vehicle control system of FIG. 3 .
  • the sensor status information may include information indicating that one or more sensors are in a failure state, and the automatic driving function information may be used to indicate the currently operating automatic driving function, and the automatic driving function includes a plurality of operating conditions.
  • sensor failure may include a sensor having no signal due to drive or wiring problems, a sensor damaged by splashing gravel, or sensor perception performance degraded by water, dirt, or the like.
  • the embodiment of the present application does not limit the specific form of sensor failure.
  • the automatic driving function can be integrated cruise assist (integrated cruise assist, ICA), navigation cruise assist (navigation cruise assist, NCA), automatic parking assist (auto parking assist, APA), remote parking assist (remote parking assist, RPA), autonomous valet parking (automated valet parking, AVP) and other modes, the embodiment of the present application only uses the automatic driving function of the vehicle as an example and does not limit it.
  • automatic cruise can adopt ICA mode.
  • the ICA mode does not require high-precision maps and is suitable for vehicles driving on expressways.
  • automatic cruise can also adopt NCA mode.
  • NCA mode requires prefabricated high-precision maps, which are suitable for vehicles driving on urban roads or highways.
  • autonomous parking can use AVP mode.
  • in AVP mode, the automatic driving system replaces the driver to complete the driving and parking tasks from a specific area of the parking lot (such as an entrance or an elevator room) to a target parking space. That is to say, in the process of parking in AVP mode, the automatic driving system can control the cruising of the vehicle and sense the surrounding environment to determine whether there is an empty parking space. That is, the automatic driving system can find a parking space.
  • autonomous parking can also use APA or RPA mode. After determining that there is an empty parking space, the positional relationship between the empty parking space and the vehicle can be determined through the APA mode or RPA mode, and the vehicle can be controlled to drive to the parking space:
  • in APA mode, the automatic driving system determines the positional relationship between the empty parking space and the vehicle, and controls the vehicle to park in the parking space.
  • in RPA mode, the driver can leave the vehicle and send a parking command to the automatic driving system using a terminal device such as a mobile phone.
  • the automatic driving system can complete the parking operation according to the received parking instruction. That is to say, after receiving the parking instruction, the automatic driving system can determine the positional relationship between the empty parking space and the vehicle, and control the vehicle to park in the parking space.
  • RPA technology involves the communication between vehicles and terminal equipment, and the communication method generally adopted is Bluetooth.
  • ICA can include cruising, car following, start and stop, lane keeping, lane changing, etc.
  • NCA and AVP can include cruising, car following, starting and stopping, lane keeping, lane changing, turning, etc.
  • APA and RPA can include working conditions such as parallel parking, reversing into a perpendicular space, and parking in angled spaces.
  • the operating conditions included in the automatic driving function of the vehicle are only used as examples without limitation.
  • the first operating condition may be the cruising condition of the ICA mode, the car-following condition of the AVP mode, or the reversing-into-space condition of the APA mode.
  • the first operating condition may be any operating condition under any automatic driving function, which is not limited in this embodiment of the present application.
  • according to the currently running automatic driving function, a driving strategy corresponding to the failure of the one or more sensors under a first operating condition is determined, where the first operating condition is any one of the multiple operating conditions.
  • the driving strategy is a multi-level function degradation driving strategy, including: maintaining the currently running automatic driving function, disabling some automatic driving functions, and exiting the current automatic driving function.
• the realization of the first operating condition is related to one or more sub-functions, which include at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions. The first operating condition is any one of the multiple operating conditions of the current automatic driving function. Loss of a key sub-function causes the first operating condition to fail; loss of a non-critical sub-function does not affect the realization of the remaining sub-functions of the first operating condition other than that non-critical sub-function; and loss of an auxiliary sub-function does not affect the realization of the first operating condition.
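The three-way sub-function taxonomy above can be sketched in code. The following is a minimal illustration only; the class and function names are ours, not the patent's:

```python
from enum import Enum

class SubFunctionClass(Enum):
    """Criticality classes of the sub-functions of an operating condition."""
    KEY = "key"                    # loss makes the operating condition fail
    NON_CRITICAL = "non_critical"  # loss disables only that sub-function
    AUXILIARY = "auxiliary"        # loss does not affect the condition at all

def condition_realizable(lost_classes):
    """The first operating condition stays realizable unless a key
    sub-function is among the lost ones."""
    return SubFunctionClass.KEY not in set(lost_classes)
```

For example, losing only auxiliary and non-critical sub-functions leaves the operating condition realizable, while losing any key sub-function does not.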
• the first operating condition can be any operating condition under any automatic driving function, for example, the cruising condition of ICA, the car-following condition of AVP, or the reversing-into-a-space condition of APA.
  • the first working condition is only used as an example without limitation.
• the one or more sensors include a first type of sensor, whose failure does not affect the realization of the first operating condition; under the first operating condition, the driving strategy corresponding to failure of the first type of sensor is to maintain the current automatic driving function.
  • failure of the first type of sensor does not affect the realization of the first operating condition. Therefore, when the first type of sensor fails, the first operating condition can still operate normally.
  • one or more sensors include the second type of sensor, and the failure of the second type of sensor affects the realization of non-critical sub-functions in the first operating condition.
• under the first operating condition, the driving strategy corresponding to failure of the second type of sensor is to disable part of the automatic driving function.
• when the second type of sensor fails, the non-critical sub-functions associated with it cannot be realized; for example, an oncoming vehicle at an intersection (a distant forward target) cannot be accurately identified, or high-precision detection cannot be achieved. However, the remaining sub-functions other than the non-critical ones can still be realized.
• under the first operating condition, the driving strategy corresponding to failure of the second type of sensor is to disable the non-critical sub-functions associated with that sensor while maintaining the current automatic driving function.
  • one or more sensors include the third type of sensor, and the failure of the third type of sensor affects the realization of key sub-functions in the first operating condition.
• under the first operating condition, the driving strategy corresponding to failure of the third type of sensor is to exit the current automatic driving function.
  • the loss of critical sub-functions can lead to driving safety or driving route deviation issues.
• when the third type of sensor fails, the key sub-functions of the first operating condition are lost and the first operating condition cannot be realized; therefore, under the first operating condition, the driving strategy corresponding to failure of the third type of sensor is to exit the automatic driving function.
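The mapping from sensor type to driving strategy described in the three paragraphs above is one-to-one and can be written as a small lookup table. This is an illustrative sketch; all identifiers are ours:

```python
from enum import Enum

class SensorType(Enum):
    FIRST = 1   # failure does not affect the first operating condition
    SECOND = 2  # failure affects only a non-critical sub-function
    THIRD = 3   # failure affects a key sub-function

class DrivingStrategy(Enum):
    MAINTAIN = "maintain the current automatic driving function"
    DISABLE_PARTIAL = "disable the affected non-critical sub-functions"
    EXIT = "exit the current automatic driving function"

# One-to-one mapping described in the text.
STRATEGY_ON_FAILURE = {
    SensorType.FIRST: DrivingStrategy.MAINTAIN,
    SensorType.SECOND: DrivingStrategy.DISABLE_PARTIAL,
    SensorType.THIRD: DrivingStrategy.EXIT,
}
```

The dictionary makes the multi-level degradation explicit: each higher sensor type triggers a more severe fallback.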
• the first type of sensor includes a first sensor unit associated with a first sub-function, and there are other unfailed sensor units associated with the first sub-function; the first sub-function is any one of the one or more sub-functions of the first operating condition.
  • the first type of sensor further includes a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.
• the second type of sensor includes a third sensor unit associated with a third non-critical sub-function, and there is no other unfailed sensor unit associated with the third non-critical sub-function; the third non-critical sub-function is any one of the one or more non-critical sub-functions of the first operating condition.
• the second type of sensor further includes a fourth sensor unit associated with a fourth key sub-function and a fourth non-key sub-function, for which there exist both a surrounding environment that can compensate for the fourth key sub-function and other unfailed sensor units that perceive that surrounding environment; the surrounding environment includes vehicles, railings, etc.
• the fourth key sub-function is any one of the one or more key sub-functions of the first operating condition, and the fourth non-critical sub-function is any one of the one or more non-critical sub-functions of the first operating condition.
• the third type of sensor includes a fifth sensor unit associated with a fifth key sub-function, for which there is no other unfailed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that can compensate for the fifth key sub-function, or other unfailed sensor units that perceive that surrounding environment.
  • the fifth key sub-function is any one of one or more key sub-functions of the first operating condition.
  • the sensor unit may be a single sensor, or a group of two or more sensors, which is not limited in this application.
  • the sub-functions associated with different sensor units may be the same or different.
  • the sensor unit may be associated with multiple sub-functions at the same time, or may be associated with only one sub-function, which is not limited in this application.
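The classification of a failed sensor unit into the first, second, or third type (sensor units, redundancy, and compensation by the surrounding environment, as described above) reduces to a short decision. In this sketch, the two boolean arguments already account for redundancy, i.e. they are True only when no other unfailed unit and no perceivable surrounding environment can compensate; the function and argument names are ours:

```python
def classify_failed_unit(key_lost, non_critical_lost):
    """
    Assign a failed sensor unit to sensor type 1, 2, or 3.
    key_lost / non_critical_lost: True when a key / non-critical sub-function
    is actually lost, i.e. no other unfailed sensor unit (and no perceivable
    surrounding environment) can compensate for it.
    """
    if key_lost:
        return 3   # third type: exit the automatic driving function
    if non_critical_lost:
        return 2   # second type: disable the lost non-critical sub-function
    return 1       # first type: only compensated or auxiliary losses remain
```

A unit whose losses are all covered by redundancy, or that served only auxiliary sub-functions, falls into the first type.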
  • one or more sensors may fail.
  • the embodiment of the present application provides aggressive processing strategies (see Table 1 to Table 27 for details) and conservative processing strategies (see Table 28 to Table 30 for details) for each sensor failure.
• in the aggressive processing strategy, when a sensor fails the system keeps the function running normally as far as possible; in the conservative processing strategy, when a sensor fails the system prefers to hand control of the vehicle back to the driver.
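The difference between the two policy families can be illustrated with a simplified selector. This is our illustration, not a rule taken from the tables: the conservative policy here escalates a partial-degradation outcome to an exit, while for many sensors the two policies in fact coincide:

```python
def choose_strategy(mode, aggressive_strategy):
    """
    Simplified policy-mode selection (illustrative): the aggressive policy
    keeps the function running where it can; the conservative policy tends
    to return control to the driver, i.e. to exit, where the aggressive
    policy would merely degrade. For many sensors the two coincide.
    """
    if mode == "conservative" and aggressive_strategy == "disable_partial":
        return "exit"
    return aggressive_strategy
```

For instance, a failure that would only disable lane changing under the aggressive policy would end the automatic driving function under this conservative variant.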
  • Both the ICA mode and the NCA mode involve the longitudinal control of the vehicle under the conditions of vehicle cruising, car following, start and stop, and the lateral control of the vehicle under the conditions of lane keeping and lane changing.
  • the NCA mode also involves the turning of the vehicle, while the ICA mode is not applicable to the turning of the vehicle (not applicable, N/A).
• when the telephoto camera fails, recognition of traffic lights far from the vehicle and of the object type of oncoming vehicles is affected, so the vehicle can be decelerated before it reaches the intersection to allow enough time to identify the traffic lights and oncoming objects. Equivalently, failure of the telephoto camera affects the non-critical sub-function of distant-target recognition in the cruising condition: the vehicle cannot drive at high speed, and the corresponding driving strategy is to disable the high-speed driving function (drive at reduced speed).
• in any working condition of ICA or NCA mode, if a forward-looking camera other than the telephoto camera fails, other unfailed forward-looking cameras can be associated with the sub-functions of the failed camera; in other words, the unfailed forward-looking camera group can make up for the perception information lost through the failure. Any working condition of ICA or NCA mode can therefore still operate normally, and the corresponding driving strategy is to maintain the current automatic driving function.
• note that the driving strategy corresponding to failure of the telephoto camera is therefore not exactly the same as the driving strategy corresponding to failure of the other forward-looking cameras described above.
  • Table 1 is limited to driving strategies when one forward-looking camera fails.
• if two or more forward-looking cameras fail and no other unfailed forward-looking camera group can be associated with the sub-functions of the failed cameras (that is, the forward information required for the safe driving of the vehicle can no longer be obtained), the key sub-functions associated with those cameras cannot be realized, and the automatic driving function can be exited.
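The aggressive handling of forward-looking camera failures in ICA/NCA mode, as described above, can be summarized in a small function (names and structure are ours):

```python
def forward_camera_strategy(num_failed, telephoto_failed):
    """
    Aggressive ICA/NCA handling of forward-looking camera failures:
    - two or more failures: no covering camera group remains -> exit;
    - only the telephoto camera failed: long-range recognition is lost,
      so disable high-speed driving (decelerate);
    - any other single failure: the remaining group covers it -> maintain.
    """
    if num_failed >= 2:
        return "exit"
    if num_failed == 1 and telephoto_failed:
        return "disable high-speed driving"
    return "maintain"
```

This encodes the distinction drawn in the text between the telephoto camera and the other forward-looking cameras.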
  • One or more of the cameras on the left fails, and the aggressive processing strategies during automatic cruise are shown in Table 2.
• the left camera provides an auxiliary target-recognition function for the cruising, car-following, start-stop, and lane-keeping conditions; equivalently, the left camera is associated with the auxiliary sub-function of left-side object recognition in those conditions.
  • the loss of auxiliary sub-functions will not affect the realization or normal operation of the cruising, car following, start-stop, and lane keeping conditions. Therefore, in ICA or NCA mode, when the left camera fails, the vehicle can still maintain the current automatic driving function.
  • One or more of the cameras on the right fails, and the aggressive processing strategies during automatic cruise are shown in Table 3.
• in the turning condition of NCA mode, the target on the right side must be observed regardless of whether the vehicle turns left or right. Therefore, when one of the right cameras fails, the key sub-function of observing the right-side target cannot be realized, and NCA can be exited.
• in ICA or NCA mode, the right camera can be considered to provide auxiliary target-recognition information for the cruising, car-following, start-stop, and lane-keeping conditions; that is, the right camera is associated with an auxiliary sub-function of those conditions.
• in ICA or NCA mode, failure of the right camera does not affect the realization of the cruising, car-following, start-stop, and lane-keeping conditions; these conditions can operate normally and the vehicle can still maintain the current automatic driving function.
  • Lane changing requires observation of the rear target to prevent the vehicle from colliding with the rear vehicle in the target lane.
  • the rear-view camera fails, the non-critical sub-function of accurate recognition of rear targets cannot be realized, so automatic lane change cannot be realized.
  • prohibiting vehicles from changing lanes will not cause safety problems or problems such as deviation from the route. Therefore, in the lane changing condition of ICA or NCA mode, when the rear view camera fails, the lane changing function can be disabled.
  • the observation of the rear target mainly depends on the left camera or the right camera.
  • the rear view camera only provides auxiliary information for object recognition in cornering conditions.
  • the failure of the rear-view camera does not affect the realization of the turning condition, and the turning condition can still operate normally. Therefore, the vehicle can maintain the current autonomous driving function.
  • the rear-view camera is associated with ICA or NCA modes to assist target recognition in cruise, car following, start-stop, and lane-keeping conditions. Therefore, the failure of the rear-view camera does not affect the realization of cruising, car following, start-stop, and lane-keeping conditions.
• in the cruising, car-following, start-stop, and lane-keeping conditions of ICA or NCA mode, when the rear-view camera fails, the corresponding driving strategy can be to maintain the current automatic driving function.
  • Fisheye cameras are mainly used for parking. It can be considered that the fisheye camera provides an auxiliary target recognition function for the automatic driving function of ICA or NCA mode, so the failure of the fisheye camera has no effect on the realization of each working condition of ICA or NCA. Therefore, when one or more fisheye cameras fail, each operating condition can operate normally, and the vehicle can still maintain the automatic driving function in ICA or NCA mode.
  • the forward-facing lidar and the forward-looking camera together can realize the perception of forward-facing targets.
• when the forward lidar fails, the exact position and shape of forward targets cannot be recognized and the accuracy of forward-target perception cannot be guaranteed. In other words, the key sub-functions associated with the forward lidar are lost and the operating conditions of ICA or NCA mode cannot be realized. Therefore, under any operating condition of ICA or NCA, when the forward lidar fails, the corresponding driving strategy can be to exit the automatic driving function of ICA or NCA mode.
  • Side LiDAR is used to accurately identify the exact position and shape of objects located to the side of the vehicle.
• when the side lidar fails, the key sub-functions associated with it are lost and the operating conditions of ICA or NCA mode cannot be realized. Therefore, under any operating condition of ICA or NCA, when the side lidar fails, the corresponding driving strategy is to exit the automatic driving function of ICA or NCA mode.
  • the forward-facing millimeter-wave radar can be mainly used to identify targets at a distance ahead and track the speed of the vehicle ahead.
• when the forward millimeter-wave radar fails, judgment of the speed of the vehicle ahead can rely only on the forward lidar and the forward-looking camera, which introduces a large error. Therefore, in the car-following and start-stop conditions of ICA or NCA mode, when the forward millimeter-wave radar fails, the following distance (that is, the distance between the ego vehicle and the vehicle ahead) can be increased, so as to reduce or eliminate the impact of the speed-estimation error on the safety of the vehicle.
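The gap-widening rule above can be sketched as follows. The headway formula and the 1.5x margin are illustrative assumptions of ours; the patent only states that the following distance is increased:

```python
def target_gap_m(base_gap_m, ego_speed_mps, forward_radar_ok, margin=1.5):
    """
    Car-following target gap: when the forward millimeter-wave radar has
    failed, speed estimates from lidar and camera alone carry a larger
    error, so the target gap is enlarged by a safety margin.
    (Headway formula and margin value are illustrative assumptions.)
    """
    gap = base_gap_m + 0.5 * ego_speed_mps  # simple speed-dependent headway
    return gap if forward_radar_ok else gap * margin
```

With a 10 m base gap at 20 m/s, the target gap grows from 20 m to 30 m once the radar is flagged as failed.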
  • the forward-facing millimeter-wave radar only provides auxiliary target recognition functions for cruise, lane keeping, lane changing, and turning conditions in ICA or NCA mode. Therefore, the failure of the forward millimeter-wave radar does not affect the realization of cruise, lane keeping, lane changing, and turning conditions. In the cruising, lane keeping, lane changing, and turning conditions in ICA or NCA mode, when the forward millimeter-wave radar fails, the automatic driving function in ICA or NCA mode can still operate normally.
  • Angular millimeter-wave radar can be used to provide assistance for vehicle sideways target recognition.
  • the angular millimeter wave radar only provides auxiliary target recognition function. That is to say, the failure of the angular millimeter-wave radar does not affect the realization of any operating condition in ICA or NCA mode. Therefore, when the angular millimeter-wave radar fails, the autopilot function in ICA or NCA mode can still operate normally.
  • the vehicle shown in Figure 2 does not have a laser radar in the rear direction. That is to say, the rear of the vehicle is in the blind spot of each lidar.
  • the observation and perception of targets located behind the vehicle can mainly rely on the rear-view camera and the rear-facing millimeter-wave radar.
• when the backward millimeter-wave radar fails, the position, velocity, and other information of rear targets cannot be accurately estimated.
• when the ego vehicle changes lanes into the target lane, it must sense vehicles behind it in the target lane and adjust its lane-change timing according to their driving state. Therefore, in the lane-changing condition of ICA or NCA mode, when the rear-view camera detects a vehicle behind in the target lane, failure of the backward millimeter-wave radar affects the lane change, and the corresponding driving strategy is to prohibit lane changing.
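The lane-change gate just described combines two signals: the health of the backward millimeter-wave radar and the rear-view camera's detection of a vehicle in the target lane. A minimal sketch (names are ours):

```python
def lane_change_permitted(rear_radar_ok, camera_sees_rear_vehicle):
    """
    Lane-change gate for ICA/NCA: if the backward millimeter-wave radar has
    failed, the position and speed of a rear vehicle in the target lane
    cannot be estimated accurately, so the lane change is prohibited
    whenever the rear-view camera still detects such a vehicle.
    """
    return rear_radar_ok or not camera_sees_rear_vehicle
```

Note that with a failed radar the lane change remains permitted only while the camera sees no vehicle behind in the target lane.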
  • Ultrasonic radar is mainly used for parking. It can be considered that ultrasonic radar only provides an auxiliary target recognition function for the automatic driving function of ICA or NCA mode. The failure of ultrasonic radar does not affect the realization of each working condition of ICA or NCA mode. Therefore, when the ultrasonic radar fails, all operating conditions can operate normally, and the vehicle can still maintain the automatic driving function in ICA or NCA mode.
  • the aggressive processing strategy in the case of sensor failure can be described in Table 12 to Table 22 in detail.
  • the driving speed of the vehicle is lower when the vehicle is looking for and driving towards the parking space.
• while driving to the parking space, the vehicle generally travels in areas such as parking lots and garages, and this generally does not involve recognition of signal indicators such as traffic lights at intersections.
• there are multiple forward-looking cameras. When one of them fails, the other unfailed forward-looking cameras can be associated with the sub-functions of the failed camera; that is, the unfailed forward-looking camera group can make up for the perception information lost through the failure. Therefore, if one forward-looking camera fails, all operating conditions can operate normally, and the vehicle can still maintain the automatic driving function of the current AVP mode.
• Table 12 is limited to driving strategies when one forward-looking camera fails. Under any operating condition, if two or more forward-looking cameras fail and no other unfailed forward-looking camera group can be associated with the sub-functions of the failed cameras (that is, the forward information required for safe driving cannot be obtained), the key sub-functions associated with the failed cameras cannot be realized. Therefore, in any working condition, when two or more forward-looking cameras fail, the driving strategy can be to exit the automatic driving function.
  • One or more of the left cameras fails, and the aggressive processing strategies for the vehicle autonomously driving to the parking space are shown in Table 13.
  • One or more of the right cameras fails, and the aggressive processing strategies during the vehicle autonomously driving to the parking space are shown in Table 14.
  • Table 15 shows the processing strategy when the rear-view camera fails and the vehicle autonomously drives to the parking space.
  • the driving strategy for each operating condition in the AVP mode is the same as that in the NCA mode, and will not be repeated here.
  • LiDAR has a blind spot near the vehicle.
  • Each camera except the fisheye camera also has a blind spot near the vehicle.
• when the fisheye camera fails, the automatic driving system cannot realize the key sub-function of accurately sensing nearby obstacles. Therefore, when the fisheye camera fails, the driving strategy corresponding to each operating condition is to exit the automatic driving function of AVP mode.
  • Table 17 shows the aggressive processing strategies when the forward-facing lidar fails and the vehicle drives autonomously to the parking space.
  • the perception of forward-facing targets can be achieved.
  • the forward-facing lidar can be used to determine the position of the forward-facing obstacles.
• when the forward lidar fails, the position and shape of forward obstacles cannot be accurately perceived; its associated key sub-functions are lost and the operating conditions cannot be realized. Therefore, when the forward lidar fails, the driving strategy corresponding to each operating condition is to exit the automatic driving function of the AVP mode.
  • Side LiDAR can be used to accurately identify the exact position and shape of objects located to the side of the vehicle.
• when the side lidar fails, its associated key sub-functions cannot be realized. Therefore, when the side lidar fails, the driving strategy corresponding to each operating condition is to exit the automatic driving function of the AVP mode.
  • Table 19 shows the aggressive processing strategies when the forward millimeter-wave radar fails and the vehicle drives autonomously to the parking space.
  • the driving speed of the vehicle is low.
  • the forward camera and forward laser radar can be used to realize the perception of the forward target, and the ultrasonic sensor and fisheye camera can be used to complete the perception of nearby obstacles.
• the forward millimeter-wave radar can provide auxiliary target-identification information.
  • One or more of the angular millimeter-wave radars fails, and the aggressive processing strategies in the process of the vehicle autonomously driving to the parking space are shown in Table 20.
  • the driving speed of the vehicle is low.
  • the side camera and side lidar can be used to realize the perception of side targets, and the ultrasonic sensor and fisheye camera can be used to complete the perception of nearby obstacles.
• the angular millimeter-wave radar is associated only with an auxiliary target-recognition function.
  • Table 21 shows the aggressive processing strategies when the backward millimeter-wave radar fails and the vehicle drives autonomously to the parking space.
  • the driving speed of the vehicle is low.
  • the side camera and side lidar can be used to realize the perception of side targets, and the ultrasonic sensor and fisheye camera can be used to complete the perception of nearby obstacles.
• the backward millimeter-wave radar can provide an auxiliary target-identification function.
  • Ultrasonic radar is mainly used for nearby obstacle detection.
• when the ultrasonic radar fails, the fisheye camera can still be used to perceive nearby targets. Therefore, when the ultrasonic radar fails, the automatic driving function of AVP mode can still be maintained in each operating condition.
  • each operating condition can exit the automatic driving function of AVP mode.
  • the first method uses lidar, fisheye camera and ultrasonic radar at the same time.
  • the second way is to use fisheye camera and ultrasonic radar at the same time.
• the third way is to use only the fisheye camera or only the ultrasonic radar for perception.
  • the second method reduces the accuracy of obstacle detection, but does not affect the realization of the function of autonomously parking into a parking space.
• when the third method is used for perception, the accuracy of obstacle detection is further reduced.
• when only the ultrasonic radar is used for perception to realize autonomous parking into the space, objects around the parking space (such as walls, fences, or other vehicles) are required to indicate the extent of the parking space for auxiliary positioning.
  • the first way can be used for perception.
  • the second method can be used for perception.
  • the third method can be used for perception.
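The three perception methods above form a fallback chain in decreasing order of accuracy. A sketch of the selection (function and return strings are ours; recall that ultrasonic-only perception additionally needs surrounding objects such as walls, fences, or other vehicles to delimit the space):

```python
def parking_perception_mode(lidar_ok, fisheye_ok, ultrasonic_ok):
    """
    Fallback ordering of the perception methods for parking into the space;
    accuracy decreases down the list.
    """
    if lidar_ok and fisheye_ok and ultrasonic_ok:
        return "lidar + fisheye + ultrasonic"   # first method
    if fisheye_ok and ultrasonic_ok:
        return "fisheye + ultrasonic"           # second method
    if fisheye_ok:
        return "fisheye only"                   # third method
    if ultrasonic_ok:
        return "ultrasonic only"                # third method (needs nearby objects)
    return "exit"
```

Each step down the chain trades detection accuracy for continued availability of the autonomous-parking function.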
  • the aggressive processing strategy when the sensor fails can be described in Table 23 to Table 27.
  • the vehicle travels at a low speed and only needs to perceive objects that are relatively close.
  • the forward-looking camera, left-side camera, right-side camera, and rear-view camera only provide auxiliary target recognition functions for the automatic driving function in APA or RPA mode.
  • the failure of one or more of the forward-looking camera, left-side camera, right-side camera, and rear-view camera does not affect the realization of the automatic driving function of the APA or RPA mode. Therefore, each operating condition can still maintain the automatic driving function of APA or RPA mode.
  • One or more of the fisheye cameras fails, and the aggressive processing strategies during the process of autonomous parking of the vehicle into the parking space are shown in Table 24.
  • Fisheye cameras are mainly used to perceive the surrounding environment when parking.
• when the fisheye camera fails, under each operating condition the vehicle can switch to a degraded driving strategy that uses only the function of the ultrasonic radar.
• the surrounding environment can assist the ultrasonic radar in identifying and modeling the nearby environment; although recognition accuracy and modeling performance are reduced, the vehicle can still park autonomously into the space under each operating condition.
• alternatively, when the fisheye camera fails, the vehicle can switch to a degraded driving strategy that uses the functions of the ultrasonic radar and the lidar, and the vehicle can still park autonomously in each operating condition.
• each operating condition exits the current automatic driving function.
  • the processing strategy for the vehicle autonomously parking into the parking space is shown in Table 25.
  • Lidar, ultrasonic radar, and fisheye cameras can all detect the nearby environment when the vehicle is autonomously parked in the parking space.
• when the lidar fails, the ultrasonic radar and fisheye camera can be used to realize the key sub-function of nearby-environment detection; although recognition accuracy and modeling performance are reduced (the high-precision non-critical sub-function cannot be realized), the vehicle can still park autonomously into the space under each operating condition.
• when the ultrasonic radar and the lidar fail at the same time, the fisheye camera can also be used to realize the key sub-function of nearby-environment detection. Although recognition accuracy and modeling performance are reduced, the vehicle can still park autonomously into the space under each operating condition.
  • Millimeter wave radar can provide the function of assisting target identification. It can be considered that the failure of millimeter-wave radar does not affect the realization of any operating condition under the automatic driving function of APA or RPA mode. In the event that one or more millimeter-wave radars fail, the vehicle can still realize the perception of the surrounding environment of the vehicle during the process of autonomously parking into the parking space. Therefore, under various operating conditions, if one or more millimeter-wave radars fail, the vehicle can still maintain the automatic driving function in APA or RPA mode.
  • One or more of the ultrasonic radars fails, and the aggressive processing strategies during the process of autonomously parking the vehicle into the parking space are shown in Table 26.
• Table 26 (ultrasonic radar failure): the strategy options are normal operation, downgrade, and exit; for the parallel parking, reversing-into-a-space, and angled parking conditions, the downgrade strategy of using fisheye camera perception applies.
  • Ultrasonic radar is mainly used to accurately detect the distance of nearby obstacles.
  • the fisheye camera can be used to detect the nearby environment such as obstacles. Although the detection accuracy is reduced, the vehicle can still be parked in the parking space autonomously under various working conditions. It can also be understood that when the ultrasonic radar fails, under various operating conditions, the key sub-function of detecting nearby obstacles can be realized by using the fisheye camera, but high detection accuracy (non-critical sub-function) cannot be achieved.
  • the fisheye camera and lidar can also be used to detect the nearby environment.
• when the lidar, fisheye camera, and ultrasonic radar fail at the same time, high-precision detection of nearby obstacles cannot be achieved under any operating condition, and the current automatic driving function is exited.
  • the automatic driving function is more dependent on sensors.
• in the conservative processing strategy, the system is more inclined to exit the automatic driving function, that is, to hand control of the vehicle over to the driver completely.
  • the sub-functions associated with each forward-looking camera are key sub-functions, and the key sub-functions associated with each forward-looking camera are different from each other.
  • the corresponding driving strategy under each operating condition is to exit the automatic driving function of ICA or NCA mode.
  • the driving strategy when two or more forward-looking cameras fail is the same as the driving strategy when one of the forward-looking cameras fails.
  • the forward millimeter-wave radar is mainly used to realize forward distant target recognition and front vehicle speed detection.
  • the driving strategy corresponding to each operating condition is to exit the current automatic driving function.
  • the conservative driving strategy is the same as the aggressive driving strategy, which will not be described here.
• when one or more of the front-view camera, left-side camera, right-side camera, rear-view camera, fisheye camera, lidar, millimeter-wave radar, and ultrasonic radar fail, the conservative processing strategy is the same as the aggressive processing strategy and will not be described here.
• the conservative processing strategy while the vehicle autonomously parks into the space is shown in Table 30.
• when one or more of the front-view camera, left-side camera, right-side camera, rear-view camera, and lidar fail, under any operating condition the vehicle cannot realize one or more of the key sub-functions such as parking-space search, accurate positioning, and high-precision environment modeling. Therefore, in the event of such a failure, the associated key sub-functions are lost, no operating condition can be realized, and the vehicle can exit the automatic driving function of APA or RPA mode.
  • the driving strategy corresponding to each operating condition is to exit the automatic driving function of APA or RPA mode.
  • the driving strategy for conservative processing is the same as the driving strategy for aggressive processing during the process of autonomously parking the vehicle into a parking space, so details will not be repeated here.
  • the corresponding driving strategy is delivered to the driver through the HMI.
  • the HMI can transmit the driving strategy corresponding to one or more sensor failures to the driver in different forms, so that the driver can be notified of the sensor failure status and related operations.
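The HMI delivery step above can be sketched as a thin notification helper. The `send` callback is a placeholder of ours standing for whatever output channel the HMI offers (screen text, voice prompt, warning tone); the patent does not fix a concrete interface:

```python
def notify_driver(send, failed_sensor, strategy):
    """
    Deliver the driving strategy corresponding to a sensor failure to the
    driver through an HMI output channel, so the driver learns both the
    failure status and the related operation.
    """
    message = f"Sensor failure: {failed_sensor}. Action: {strategy}."
    send(message)
    return message
```

In practice the same message could be fanned out to several channels by calling the helper once per output device.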
  • Fig. 7 is a schematic block diagram of a vehicle control device provided by an embodiment of the present application.
  • the apparatus 700 includes an acquisition unit 710 and a processing unit 720 .
  • the acquisition unit 710 can implement a corresponding communication function, and the processing unit 720 is used for data processing.
  • the acquisition unit 710 may also be referred to as a communication interface or a communication unit.
  • the apparatus 700 in FIG. 7 can execute each process in the foregoing method embodiment, and to avoid repetition, no detailed description is given here.
  • the apparatus 700 may further include a storage unit 730 .
  • the storage unit 730 can be used to store instructions and/or data.
  • the processing unit 720 can read instructions and/or data in the storage unit, so that the apparatus 700 implements the aforementioned method embodiments.
  • the device 700 can be used to execute the actions performed by the second node in the method embodiment above.
  • the acquisition unit 710 is used to perform acquisition-related operations on the second node side in the method embodiment above.
  • the processing unit 720 is used to perform processing-related operations on the second node side in the above method embodiment.
  • the apparatus 700 may include corresponding units for executing each process of the method in FIG. 4 .
  • each unit in the device 700 and other operations and/or functions described above are respectively for realizing the corresponding process of the method embodiment in FIG. 4 .
  • the acquiring unit 710 can be used to execute step 410 in the method 400, and the processing unit 720 can be used to execute step 420 in the method 400.
  • the acquiring unit 710 is configured to acquire sensor state information and automatic driving function information; the processing unit 720 is configured to determine a driving strategy corresponding to the failure of one or more sensors under the first operating condition.
  • the processing unit 720 is specifically configured to determine that the driving strategy corresponding to the failure of the first type of sensor under the first operating condition is to maintain the automatic driving function.
  • the one or more sensors include the first type of sensor, and the failure of the first type of sensor does not affect the realization of the first operating condition.
  • the processing unit 720 is specifically configured to determine that the driving strategy corresponding to the failure of the second type of sensor under the first operating condition is to disable the non-critical sub-functions of the first operating condition; the one or more sensors include the second type of sensor, and the failure of the second type of sensor affects the realization of the non-critical sub-functions of the first operating condition.
  • the processing unit 720 is specifically configured to determine that the driving strategy corresponding to the failure of the third type of sensor under the first operating condition is to exit the automatic driving function; the one or more sensors include the third type of sensor, and the failure of the third type of sensor affects the realization of the key sub-functions of the first operating condition.
  • the first type of sensor includes a first sensor unit, the first sensor unit is associated with a first sub-function, there is another unfailed sensor unit associated with the first sub-function, and the first sub-function is any one of the one or more sub-functions.
  • the first type of sensor further includes a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of one or more auxiliary sub-functions.
  • the second type of sensor includes a third sensor unit, the third sensor unit is associated with the third non-critical sub-function, there is no other failed sensor unit associated with the third non-critical sub-function, and the third non-critical sub-function is any one of the one or more non-critical sub-functions.
  • the second type of sensor includes a fourth sensor unit, the fourth sensor unit is associated with the fourth key sub-function and the fourth non-critical sub-function, and there exist a surrounding environment that compensates for the fourth key sub-function and other unfailed sensor units that perceive the surrounding environment; the surrounding environment includes vehicles, railings, and the like; the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-critical sub-function is any one of the one or more non-critical sub-functions.
  • the third type of sensor includes a fifth sensor unit, the fifth sensor unit is associated with the fifth key sub-function, and there is no other unfailed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other unfailed sensor units that perceive the surrounding environment.
  • the fifth key sub-function is any one of the one or more key sub-functions of the first operating condition.
  • the third type of sensor further includes a sixth sensor unit, the sixth sensor unit is associated with a sixth key sub-function, the sixth key sub-function cannot be compensated by the surrounding environment, and the sixth key sub-function is any one of the one or more key sub-functions.
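The classification rules above (first, second, and third type of sensor) can be sketched as a small decision routine. The following is a minimal, illustrative sketch in Python; the function name, the association dictionary, and the sensor names are hypothetical and not part of the embodiments:

```python
# Hypothetical sketch of the sensor classification described above.
# A failed sensor unit is classified by the sub-functions it is associated
# with and by whether redundancy or the surrounding environment compensates.

def classify_failed_unit(unit, associations, healthy_units, env_compensates):
    """Return 1, 2, or 3 (first/second/third type of sensor).

    associations: dict unit -> set of (sub_function, kind), kind being
                  'key', 'non-key', or 'auxiliary'
    healthy_units: set of unfailed sensor units
    env_compensates: key sub-functions the surrounding environment
                     (e.g. vehicles, railings) can compensate for
    """
    def has_healthy_backup(sub_fn):
        # another unfailed unit associated with the same sub-function
        return any(
            any(sf == sub_fn for sf, _ in associations[u])
            for u in healthy_units
        )

    worst = 1  # assume first type (no impact) until proven otherwise
    for sub_fn, kind in associations[unit]:
        if kind == 'auxiliary':
            continue                  # loss never affects the condition
        if has_healthy_backup(sub_fn):
            continue                  # redundancy covers this sub-function
        if kind == 'key':
            if sub_fn in env_compensates and healthy_units:
                worst = max(worst, 2)  # environment + other units compensate
            else:
                return 3               # key sub-function lost -> third type
        else:                          # non-key sub-function with no backup
            worst = max(worst, 2)
    return worst
```

Under this sketch, type 1 maps to maintaining the automatic driving function, type 2 to disabling the affected sub-functions, and type 3 to exiting the automatic driving function.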
  • the processing unit 720 in FIG. 7 may be implemented by at least one processor or a processor-related circuit.
  • the acquiring unit 710 may be implemented by a transceiver or a transceiver-related circuit.
  • the storage unit 730 may be implemented by at least one memory.
  • FIG. 8 is a schematic block diagram of a vehicle control device of an embodiment of the present application.
  • the vehicle control device 800 shown in FIG. 8 may include a memory 810 , a processor 820 , and a communication interface 830 .
  • the memory 810, the processor 820, and the communication interface 830 are connected through internal connection paths.
  • the memory 810 is used to store instructions.
  • the processor 820 is configured to execute the instructions stored in the memory 810 to control the input/output interface 830 to receive/send at least part of the parameters of the second channel model.
  • the memory 810 may be coupled to the processor 820 through an interface, or may be integrated with the processor 820.
  • the above-mentioned communication interface 830 uses a transceiver apparatus, such as but not limited to a transceiver, to implement communication between the communication device 800 and other devices or communication networks.
  • the above-mentioned communication interface 830 may also include an input/output interface.
  • each step of the above method may be implemented by an integrated logic circuit of hardware in the processor 820 or an instruction in the form of software.
  • the methods disclosed in the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
  • the software module can be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, register.
  • the storage medium is located in the memory 810, and the processor 820 reads the information in the memory 810, and completes each step of the above method in combination with its hardware. To avoid repetition, no detailed description is given here.
  • the processor may be a central processing unit (CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor.
  • a portion of the processor may also include non-volatile random access memory.
  • the processor may also store device type information.
  • the sequence numbers of the above-mentioned processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and shall not constitute any limitation on the implementation process of the embodiments of the present application.
  • An embodiment of the present application also provides a computing device, including: at least one processor and a memory, the at least one processor being coupled to the memory and configured to read and execute the instructions in the memory, so as to perform any one of the methods in FIG. 4 to FIG. 6 above.
  • the embodiment of the present application also provides a computer-readable medium, the computer-readable medium stores program codes, and when the computer program codes run on the computer, the computer executes any one of the above-mentioned methods in FIG. 4 to FIG. 6 .
  • The embodiment of the present application also provides a chip, including: at least one processor and a memory, the at least one processor being coupled with the memory and configured to read and execute the instructions in the memory, so as to perform any one of the methods in FIG. 4 to FIG. 6 above.
  • The embodiment of the present application also provides an automatic driving vehicle, including: at least one processor and a memory, the at least one processor being coupled with the memory and configured to read and execute the instructions in the memory, so as to perform any one of the methods in FIG. 4 to FIG. 6 above.
  • a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a computing device and the computing device can be components.
  • One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers.
  • these components can execute from various computer readable media having various data structures stored thereon.
  • a component may, for example, communicate through local and/or remote processes based on a signal having one or more data packets (e.g., data from two components interacting with another component in a local system, in a distributed system, and/or across a network such as the Internet interacting with other systems by way of the signal).
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation there may be other division manners.
  • for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • If the functions described above are realized in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.


Abstract

A vehicle control method, comprising: acquiring information indicating that one or more sensors are in a failure state, and information indicating a currently running automatic driving function; and determining, according to the impact of the failure of the one or more sensors on each operating condition of the currently running automatic driving function, a multi-level function-degradation driving strategy corresponding to the failure of the one or more sensors under each operating condition, thereby ensuring driving safety and improving the availability of the automatic driving system. A vehicle control apparatus, a computing device, a computer-readable medium, a chip, and an autonomous driving vehicle are also disclosed.

Description

Vehicle control method and apparatus

Technical field

The embodiments of the present application relate to the field of smart cars, and more specifically, to a vehicle control method and apparatus.
Background

With the development of technology, smart terminals such as intelligent transportation equipment and smart cars are gradually entering people's daily lives. Sensors play a very important role in smart terminals. Various sensors installed on a smart terminal, such as millimeter-wave radar, lidar, cameras, and ultrasonic radar, perceive the surrounding environment during the movement of the smart terminal and collect data for environment recognition, for example, the identification and tracking of moving objects, the recognition of static scenes such as lane lines and signboards, and path planning in combination with the navigator and map data. Sensors can detect possible dangers in advance and assist, or even autonomously take, the necessary avoidance measures, effectively increasing the safety and comfort of the smart terminal.

Intelligent driving is a mainstream application in the field of artificial intelligence. During the operation of an autonomous vehicle, sensor failures may occur, including wiring-harness loosening caused by vehicle vibration, damage to the sensor surface, occlusion by dirt, and so on. A damaged sensor greatly affects the performance of the perception system and thus the intelligent driving capability.

At present, most intelligent driving systems directly exit the automatic driving mode after a sensor fails, i.e., exit all automatic driving functions. Although this strictly guarantees driving safety, it reduces the availability of the intelligent driving system and the intelligent driving experience.
Summary

The embodiments of the present application provide a vehicle control method and apparatus, which can improve the degree of intelligence of automatic driving while ensuring driving safety.

In a first aspect, a vehicle control method is provided, comprising: acquiring sensor state information and automatic driving function information, the sensor state information comprising information indicating that one or more sensors are in a failure state, and the automatic driving function information being used to indicate a currently running automatic driving function, the automatic driving function comprising multiple operating conditions; and determining, according to the impact of the failure of the one or more sensors on a first operating condition, a driving strategy corresponding to the failure of the one or more sensors under the first operating condition, wherein the first operating condition is any one of the multiple operating conditions.

An operating condition refers to a working state or situation of the automatic driving function, such as cruising, car following, lane keeping, lane changing, turning, start-stop, reversing into a parking space, or parallel parking; more specifically, it may be, for example, the cruising condition of ICA, the car-following condition of AVP, or the reverse-parking condition of APA. An intelligent driving vehicle generally presents or provides the automatic driving function to the vehicle user according to different operating conditions (for example, in the form of function packages of the automatic driving function); that is, when the driver starts the automatic driving function, one or several function packages of the automatic driving function are generally started, corresponding to different operating conditions. The driver can turn the corresponding operating conditions on or off according to his or her preferences or needs: for example, when the driver wants to enable the car-following and lane-keeping functions, a corresponding automatic driving instruction can be sent to the vehicle by one kind of lever or button; for another example, when the driver only wants to enable the car-following function but not the lane-keeping function, a corresponding automatic driving instruction can be sent to the vehicle by another kind of lever or button.

According to the vehicle control solution provided by the embodiments of the present application, driving strategies of different levels are formulated according to the different degrees of dependence of each operating condition on different sensors, instead of simply and directly exiting all automatic driving functions. In this way, during automatic driving, when one or more sensors fail, the vehicle maintains all or part of the current automatic driving functions as far as possible, which can improve the user's automatic driving experience while ensuring driving safety.

On the other hand, the vehicle control solution of the embodiments of the present application can perform control at the granularity of vehicle operating conditions. Compared with a manner that can only perform control at the granularity of the overall automatic driving function, it can more flexibly provide automatic driving capabilities with richer levels and finer grades, improving the user's automatic driving experience.

With reference to the first aspect, in some implementations of the first aspect, the driving strategies of different levels are multi-level function-degradation driving strategies, comprising: maintaining the current automatic driving function, disabling part of the automatic driving function, and exiting the automatic driving function.

According to the vehicle control solution provided by the embodiments of the present application, at least three levels of driving strategies are formulated according to the different degrees of dependence of vehicle operating conditions on different sensors, instead of simply maintaining the automatic driving function or directly exiting it. In this way, automatic driving capabilities with richer levels and finer grades can be provided more flexibly, ensuring the rationality of function degradation.
With reference to the first aspect, in some implementations of the first aspect, the realization of the first operating condition is related to one or more sub-functions, and the one or more sub-functions comprise at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions; the first operating condition is any one of the multiple operating conditions; loss of a key sub-function makes the first operating condition impossible to realize; loss of a non-critical sub-function does not affect the realization of the sub-functions of the first operating condition other than the non-critical sub-function; and loss of an auxiliary sub-function does not affect the realization of the first operating condition.

The realization of each operating condition is related to one or more sub-functions. Sub-functions are generally not presented to the vehicle user; that is, the driver cannot select a specific sub-function under an operating condition. For example, the realization of the turning condition may depend on the non-critical sub-function of left-side target recognition, the non-critical sub-function of right-side target recognition, the non-critical sub-function of rearward target recognition, and the auxiliary sub-function of target recognition; for another example, the realization of the oblique-parking condition may depend on the key sub-function of nearby obstacle recognition, the non-critical sub-function of high-precision recognition, and the auxiliary sub-function of target recognition.

With reference to the first aspect, in some implementations of the first aspect, determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises: determining that the driving strategy corresponding to the failure of a first type of sensor under the first operating condition is to maintain the automatic driving function, wherein the one or more sensors comprise the first type of sensor, and the failure of the first type of sensor does not affect the realization of the first operating condition.

With reference to the first aspect, in some implementations of the first aspect, determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises: determining that the driving strategy corresponding to the failure of a second type of sensor under the first operating condition is to disable part of the automatic driving function, wherein the one or more sensors comprise the second type of sensor, and the failure of the second type of sensor affects the realization of a non-critical sub-function of the first operating condition.

According to the vehicle control solution provided by the embodiments of the present application, when the failure of certain sensors prevents the first operating condition from running normally, the vehicle can disable the part of the automatic driving function associated with the failed sensors while maintaining the current automatic driving function. In this way, there is no need for a human to take over part or all of the automatic driving functions, which improves the user's automatic driving experience while ensuring driving safety.

With reference to the first aspect, in some implementations of the first aspect, determining the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises: determining that the driving strategy corresponding to the failure of a third type of sensor under the first operating condition is to exit the automatic driving function, wherein the one or more sensors comprise the third type of sensor, and the failure of the third type of sensor affects the realization of a key sub-function of the first operating condition.

With reference to the first aspect, in some implementations of the first aspect, the first type of sensor comprises a first sensor unit, the first sensor unit is associated with a first sub-function, there exists another unfailed sensor unit associated with the first sub-function, and the first sub-function is any one of the one or more sub-functions.

In the vehicle control solution provided by the embodiments of the present application, under a certain operating condition, when the perception information of certain failed sensors can be compensated by other unfailed sensors, the vehicle can still maintain the current automatic driving function. In this way, driving strategies with richer levels can be provided more flexibly, improving the user's automatic driving experience.

With reference to the first aspect, in some implementations of the first aspect, the first type of sensor further comprises a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.

With reference to the first aspect, in some implementations of the first aspect, the second type of sensor comprises a third sensor unit, the third sensor unit is associated with a third non-critical sub-function, there is no other failed sensor unit associated with the third non-critical sub-function, and the third non-critical sub-function is any one of the one or more non-critical sub-functions.

With reference to the first aspect, in some implementations of the first aspect, the second type of sensor further comprises a fourth sensor unit, the fourth sensor unit is associated with a fourth key sub-function and a fourth non-critical sub-function, and there exist a surrounding environment that compensates for the fourth key sub-function and other unfailed sensor units that perceive the surrounding environment; the surrounding environment includes vehicles, railings, and the like; the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-critical sub-function is any one of the one or more non-critical sub-functions.

According to the vehicle control solution provided by the embodiments of the present application, when a sensor unit associated with a key sub-function fails, the key perception information of the failed sensor can also be compensated through the surrounding environment and other sensor units. In this way, even if a sensor associated with a key sub-function fails, the vehicle can still refrain from exiting the automatic driving function, which can improve the user's automatic driving experience.

With reference to the first aspect, in some implementations of the first aspect, the third type of sensor comprises a fifth sensor unit, the fifth sensor unit is associated with a fifth key sub-function, and there is no other unfailed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other unfailed sensor units that perceive the surrounding environment; the fifth key sub-function is any one of the one or more key sub-functions.
In a second aspect, a vehicle control apparatus is provided, comprising: an acquisition unit configured to acquire sensor state information and automatic driving function information, the sensor state information comprising information indicating that one or more sensors are in a failure state, and the automatic driving function information being used to indicate a currently running automatic driving function, the automatic driving function comprising multiple operating conditions; and a processing unit configured to determine, according to the impact of the failure of the one or more sensors on a first operating condition, a driving strategy corresponding to the failure of the one or more sensors under the first operating condition, wherein the first operating condition is any one of the multiple operating conditions.

An operating condition refers to a working state or situation of the automatic driving function, such as cruising, car following, lane keeping, lane changing, turning, start-stop, reversing into a parking space, or parallel parking; more specifically, it may be, for example, the cruising condition of ICA, the car-following condition of AVP, or the reverse-parking condition of APA. An intelligent driving vehicle generally presents or provides the automatic driving function to the vehicle user according to different operating conditions (for example, in the form of function packages of the automatic driving function); that is, when the driver starts the automatic driving function, one or several function packages of the automatic driving function are generally started, corresponding to different operating conditions. The driver can turn the corresponding operating conditions on or off according to his or her preferences or needs: for example, when the driver wants to enable the car-following and lane-keeping functions, a corresponding automatic driving instruction can be sent to the vehicle by one kind of lever or button; for another example, when the driver only wants to enable the car-following function but not the lane-keeping function, a corresponding automatic driving instruction can be sent to the vehicle by another kind of lever or button.

According to the vehicle control solution provided by the embodiments of the present application, driving strategies of different levels are formulated according to the different degrees of dependence of each operating condition on different sensors, instead of simply and directly exiting all automatic driving functions. In this way, during automatic driving, when one or more sensors fail, the vehicle maintains all or part of the current automatic driving functions as far as possible, which can improve the user's automatic driving experience while ensuring driving safety.

On the other hand, the vehicle control solution of the embodiments of the present application can perform control at the granularity of vehicle operating conditions. Compared with a manner that can only perform control at the granularity of the overall automatic driving function, it can more flexibly provide automatic driving capabilities with richer levels and finer grades, improving the user's automatic driving experience.

With reference to the second aspect, in some implementations of the second aspect, the driving strategies of different levels are multi-level function-degradation driving strategies, comprising: maintaining the automatic driving function, disabling part of the automatic driving function, and exiting the automatic driving function.

According to the vehicle control solution provided by the embodiments of the present application, at least three levels of driving strategies are formulated according to the different degrees of dependence of vehicle operating conditions on different sensors, instead of simply maintaining the automatic driving function or directly exiting it. In this way, automatic driving capabilities with richer levels and finer grades can be provided more flexibly, ensuring the rationality of function degradation.
With reference to the second aspect, in some implementations of the second aspect, the realization of the first operating condition is related to one or more sub-functions, and the one or more sub-functions comprise at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions; the first operating condition is any one of the multiple operating conditions; loss of a key sub-function makes the first operating condition impossible to realize; loss of a non-critical sub-function does not affect the realization of the sub-functions of the first operating condition other than the non-critical sub-function; and loss of an auxiliary sub-function does not affect the realization of the first operating condition.

The realization of each operating condition is related to one or more sub-functions. Sub-functions are generally not presented to the vehicle user; that is, the driver cannot select a specific sub-function under an operating condition. For example, the realization of the turning condition may depend on the non-critical sub-function of left-side target recognition, the non-critical sub-function of right-side target recognition, the non-critical sub-function of rearward target recognition, and the auxiliary sub-function of target recognition; for another example, the realization of the oblique-parking condition may depend on the key sub-function of nearby obstacle recognition, the non-critical sub-function of high-precision recognition, and the auxiliary sub-function of target recognition.

With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of a first type of sensor under the first operating condition is to maintain the automatic driving function, wherein the one or more sensors comprise the first type of sensor, and the failure of the first type of sensor does not affect the realization of the first operating condition.

With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of a second type of sensor under the first operating condition is to disable part of the current automatic driving function, wherein the one or more sensors comprise the second type of sensor, and the failure of the second type of sensor affects the realization of a non-critical sub-function of the first operating condition.

According to the vehicle control solution provided by the embodiments of the present application, when the failure of certain sensors prevents the first operating condition from running normally, the vehicle can disable the part of the automatic driving function associated with the failed sensors while maintaining the current automatic driving function. In this way, there is no need for a human to take over part or all of the automatic driving functions, which improves the user's automatic driving experience while ensuring driving safety.

With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to determine that the driving strategy corresponding to the failure of a third type of sensor under the first operating condition is to exit the automatic driving function, wherein the one or more sensors comprise the third type of sensor, and the failure of the third type of sensor affects the realization of a key sub-function of the first operating condition.

With reference to the second aspect, in some implementations of the second aspect, the first type of sensor comprises a first sensor unit, the first sensor unit is associated with a first sub-function, there exists another unfailed sensor unit associated with the first sub-function, and the first sub-function is any one of the one or more sub-functions.

In the vehicle control solution provided by the embodiments of the present application, under a certain operating condition, when the perception information of certain failed sensors can be compensated by other unfailed sensors, the vehicle can still maintain the current automatic driving function. In this way, driving strategies with richer levels can be provided more flexibly, improving the user's automatic driving experience.

With reference to the second aspect, in some implementations of the second aspect, the first type of sensor further comprises a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.

With reference to the second aspect, in some implementations of the second aspect, the second type of sensor comprises a third sensor unit, the third sensor unit is associated with a third non-critical sub-function, there is no other failed sensor unit associated with the third non-critical sub-function, and the third non-critical sub-function is any one of the one or more non-critical sub-functions.

With reference to the second aspect, in some implementations of the second aspect, the second type of sensor comprises a fourth sensor unit, the fourth sensor unit is associated with a fourth key sub-function and a fourth non-critical sub-function, and there exist a surrounding environment that compensates for the fourth key sub-function and other unfailed sensor units that perceive the surrounding environment; the surrounding environment includes vehicles, railings, and the like; the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-critical sub-function is any one of the one or more non-critical sub-functions.

According to the vehicle control solution provided by the embodiments of the present application, when a sensor unit associated with a key sub-function fails, the key perception information of the failed sensor can also be compensated through the surrounding environment and other sensor units. In this way, even if a sensor associated with a key sub-function fails, the vehicle can still refrain from exiting the automatic driving function, which can improve the user's automatic driving experience.

With reference to the second aspect, in some implementations of the second aspect, the third type of sensor comprises a fifth sensor unit, the fifth sensor unit is associated with a fifth key sub-function, and there is no other unfailed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other unfailed sensor units that perceive the surrounding environment; the fifth key sub-function is any one of the one or more key sub-functions.
In a third aspect, a computing device is provided, comprising a memory and a processor, wherein the memory is configured to store program instructions, and when the program instructions are executed in the processor, the processor is configured to perform the method of the first aspect or the second aspect.

The processor in the third aspect above may comprise a central processing unit (CPU), or a combination of a CPU and a neural network processor.

In a fourth aspect, a computer-readable medium is provided, the computer-readable medium storing program code for execution by a device, the program code comprising code for performing the method in the first aspect or the second aspect.

In a fifth aspect, a chip is provided, the chip comprising a processor and a data interface, wherein the processor reads instructions stored in a memory through the data interface to perform the method in the first aspect or the second aspect above.

Optionally, as an implementation, the chip may further comprise a memory in which instructions are stored, the processor being configured to execute the instructions stored in the memory; when the instructions are executed, the processor is configured to perform the method in the first aspect or any one of the implementations of the first aspect.

The above chip may specifically be a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).

In a sixth aspect, an autonomous driving vehicle is provided, comprising: at least one processor and a memory, the at least one processor being coupled to the memory and configured to read and execute the instructions in the memory, so as to perform the method in any one of the possible implementations of the first aspect above.

It should be understood that, in the present application, the method of the first aspect may specifically refer to the method in the first aspect or in any one of the various implementations of the first aspect.
Brief description of the drawings

Fig. 1 is a functional block diagram of a vehicle to which an embodiment of the present application is applicable.

Fig. 2 is a schematic structural diagram of vehicle sensors provided by an embodiment of the present application.

Fig. 3 is a schematic architecture diagram of a simplified vehicle control system to which an embodiment of the present application is applicable.

Fig. 4 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application.

Fig. 5 is a schematic diagram of HMI interaction of the vehicle control method provided by an embodiment of the present application.

Fig. 6 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application.

Fig. 7 is a schematic block diagram of a vehicle control apparatus provided by an embodiment of the present application.

Fig. 8 is a schematic block diagram of a vehicle control apparatus provided by an embodiment of the present application.

Detailed description

The technical solutions in the present application will be described below with reference to the accompanying drawings.
The vehicle control method and apparatus provided by the embodiments of the present application can be applied to intelligent driving vehicles, and can also be applied to smart terminals such as smart home devices and robots. The technical solutions of the embodiments of the present application are introduced below with reference to the accompanying drawings.

Fig. 1 is a functional block diagram of a vehicle 100 to which an embodiment of the present application is applicable. The vehicle 100 may be an intelligent driving vehicle, and the vehicle 100 may fully or partially support an automatic driving mode.

The vehicle 100 may include various subsystems, for example, a travel system 110, a sensing system 120, a control system 130, one or more peripheral devices 140, as well as a power supply 160, a computer system 150, and a user interface 170.

Optionally, the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, the subsystems and elements of the vehicle 100 may be interconnected in a wired or wireless manner.
Illustratively, the travel system 110 may include components that provide powered motion for the vehicle 100. In one embodiment, the travel system 110 may include an engine 111, a transmission 112, an energy source 113, and wheels 114/tires. The engine 111 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of engine types, for example, a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine. The engine 111 may convert the energy source 113 into mechanical energy.

Illustratively, the energy source 113 may include gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, solar panels, batteries, and other sources of electric power. The energy source 113 may also provide energy for other systems of the vehicle 100.

Illustratively, the transmission 112 may include a gearbox, a differential, and a drive shaft; the transmission 112 may transmit mechanical power from the engine 111 to the wheels 114.

In one embodiment, the transmission 112 may also include other devices, such as a clutch. The drive shaft may include one or more axles that may be coupled to one or more wheels 114.

Illustratively, the sensing system 120 may include several sensors that sense information about the environment around the vehicle 100.

For example, the sensing system 120 may include a positioning system 121 (for example, the global positioning system (GPS), the BeiDou system, or another positioning system), an inertial measurement unit (IMU) 122, a radar 123, a laser rangefinder 124, a camera 125, and a vehicle speed sensor 126. The sensing system 120 may also include sensors that monitor internal systems of the vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, and the like). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, and the like). Such detection and recognition are key functions for the safe operation of the autonomous vehicle 100.

The positioning system 121 may be used to estimate the geographic location of the vehicle 100. The IMU 122 may be used to sense changes in the position and orientation of the vehicle 100 based on inertial acceleration. In one embodiment, the IMU 122 may be a combination of an accelerometer and a gyroscope.

Illustratively, the radar 123 may use radio signals to sense objects in the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing objects, the radar 123 may also be used to sense the speed and/or heading of the objects.

Illustratively, the laser rangefinder 124 may use laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 124 may include one or more laser sources, a laser scanner, one or more detectors, and other system components.

Illustratively, the camera 125 may be used to capture multiple images of the surrounding environment of the vehicle 100. For example, the camera 125 may be a still camera or a video camera.

Illustratively, the vehicle speed sensor 126 may be used to measure the speed of the vehicle 100; for example, the speed of the vehicle may be measured in real time. The measured speed may be transmitted to the control system 130 to realize control of the vehicle.
As shown in Fig. 1, the control system 130 controls the operation of the vehicle 100 and its components. The control system 130 may include various elements, for example, a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.

Illustratively, the steering system 131 is operable to adjust the heading of the vehicle 100; for example, in one embodiment it may be a steering wheel system. The throttle 132 may be used to control the operating speed of the engine 111 and thus the speed of the vehicle 100.

Illustratively, the braking unit 133 may be used to control the vehicle 100 to decelerate; the braking unit 133 may use friction to slow the wheels 114. In other embodiments, the braking unit 133 may convert the kinetic energy of the wheels 114 into electric current. The braking unit 133 may also take other forms to slow the rotation of the wheels 114 so as to control the speed of the vehicle 100.

As shown in Fig. 1, the computer vision system 134 is operable to process and analyze images captured by the camera 125 in order to recognize objects and/or features in the surrounding environment of the vehicle 100. The objects and/or features may include traffic information, road boundaries, and obstacles. The computer vision system 134 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 134 may be used to map the environment, track objects, estimate the speed of objects, and so on.

Illustratively, the route control system 135 may be used to determine the travel route of the vehicle 100. In some embodiments, the route control system 135 may combine data from the sensors, GPS, and one or more predetermined maps to determine the travel route for the vehicle 100.

As shown in Fig. 1, the obstacle avoidance system 136 may be used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 100.

In one example, the control system 130 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be removed.

As shown in Fig. 1, the vehicle 100 may interact with external sensors, other vehicles, other computer systems, or users through the peripheral devices 140; the peripheral devices 140 may include a wireless communication system 141, an on-board computer 142, a microphone 143, and/or a speaker 144.

In some embodiments, the peripheral devices 140 may provide a means for the vehicle 100 to interact with the user interface 170. For example, the on-board computer 142 may provide information to the user of the vehicle 100. The user interface 116 may also operate the on-board computer 142 to receive user input; the on-board computer 142 may be operated through a touch screen. In other cases, the peripheral devices 140 may provide a means for the vehicle 100 to communicate with other devices located in the vehicle. For example, the microphone 143 may receive audio (for example, voice commands or other audio input) from the user of the vehicle 100. Similarly, the speaker 144 may output audio to the user of the vehicle 100.

As shown in Fig. 1, the wireless communication system 141 may communicate wirelessly with one or more devices directly or via a communication network. For example, the wireless communication system 141 may use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); or 4G cellular communication, such as long term evolution (LTE); or 5G cellular communication. The wireless communication system 141 may communicate with a wireless local area network (WLAN) using WiFi.

In some embodiments, the wireless communication system 141 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee, or other wireless protocols such as various vehicle communication systems. For example, the wireless communication system 141 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communication between vehicles and/or roadside stations.

As shown in Fig. 1, the power supply 160 may provide power to various components of the vehicle 100. In one embodiment, the power supply 160 may be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such batteries may be configured as a power supply to provide power to various components of the vehicle 100. In some embodiments, the power supply 160 and the energy source 113 may be implemented together, as in some all-electric vehicles.
Illustratively, some or all of the functions of the vehicle 100 may be controlled by the computer system 150, which may include at least one processor 151; the processor 151 executes instructions 153 stored in a non-transitory computer-readable medium such as a memory 152. The computer system 150 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.

For example, the processor 151 may be any conventional processor, such as a commercially available central processing unit (CPU).

Optionally, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Although Fig. 1 functionally illustrates the processor, the memory, and other elements of the computer in the same block, a person of ordinary skill in the art should understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or another storage medium located in a housing different from that of the computer. Thus, a reference to a processor or computer will be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to the component-specific functions.

In various aspects described herein, the processor may be located remotely from the vehicle and communicate wirelessly with the vehicle. In other aspects, some of the processes described herein are executed on a processor arranged within the vehicle while others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.

In some embodiments, the memory 152 may contain instructions 153 (for example, program logic) that may be executed by the processor 151 to perform various functions of the vehicle 100, including those described above. The memory 152 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 110, the sensing system 120, the control system 130, and the peripheral devices 140.

Illustratively, in addition to the instructions 153, the memory 152 may also store data, such as road maps, route information, the position, direction, and speed of the vehicle and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 150 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.

As shown in Fig. 1, the user interface 170 may be used to provide information to or receive information from the user of the vehicle 100. Optionally, the user interface 170 may include one or more input/output devices within the set of peripheral devices 140, such as the wireless communication system 141, the on-board computer 142, the microphone 143, and the speaker 144.

In the embodiments of the present application, the computer system 150 may control the functions of the vehicle 100 based on input received from various subsystems (for example, the travel system 110, the sensing system 120, and the control system 130) and from the user interface 170. For example, the computer system 150 may utilize input from the control system 130 in order to control the braking unit 133 to avoid obstacles detected by the sensing system 120 and the obstacle avoidance system 136. In some embodiments, the computer system 150 is operable to provide control over many aspects of the vehicle 100 and its subsystems.

Optionally, one or more of these components may be installed separately from or associated with the vehicle 100. For example, the memory 152 may exist partially or completely separately from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.

Optionally, the above components are only an example; in practical applications, components in the above modules may be added or deleted according to actual needs, and Fig. 1 should not be understood as a limitation on the embodiments of the present application.
Optionally, the vehicle 100 may be an autonomous car traveling on a road, which can recognize objects in its surrounding environment to determine an adjustment to its current speed. The objects may be other vehicles, traffic control devices, or other types of objects. In some examples, each recognized object may be considered independently, and the respective characteristics of each object, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous car is to adjust.

Optionally, the vehicle 100 or a computing device associated with the vehicle 100 (such as the computer system 150, the computer vision system 134, or the memory 152 of Fig. 1) may predict the behavior of the recognized objects based on the characteristics of the recognized objects and the state of the surrounding environment (for example, traffic, rain, ice on the road, and so on).

Optionally, each recognized object depends on the behavior of the others, so all the recognized objects may also be considered together to predict the behavior of a single recognized object. The vehicle 100 can adjust its speed based on the predicted behavior of the recognized objects. In other words, the autonomous car can determine, based on the predicted behavior of the objects, the stable state to which the vehicle will need to adjust (for example, accelerate, decelerate, or stop). In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so on.

In addition to providing instructions to adjust the speed of the autonomous car, the computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the autonomous car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the autonomous car (for example, cars in adjacent lanes on the road).

The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a handcart, or the like, which is not particularly limited in the embodiments of the present application.

In a possible implementation, the vehicle 100 shown in Fig. 1 above may be an autonomous driving vehicle; the automatic driving system is described in detail below.
Fig. 2 is a schematic structural diagram of vehicle sensors provided by an embodiment of the present application. The vehicle sensors of Fig. 2 may include the laser rangefinder 124 in the vehicle 100 of Fig. 1, may include the vehicle speed sensor 126 in the vehicle 100 of Fig. 1, and may also include other types of sensors.

Specifically, the vehicle may include multiple sensors, for example, lidar, a front-view camera, a rear-view camera, side cameras, millimeter-wave radar, ultrasonic radar, fisheye cameras, and so on. Fig. 2 depicts some common sensor types and installation positions, but the embodiments of the present application do not limit the types, number, or positions of the vehicle sensors.

For example, the vehicle may include three lidars facing forward, left, and right, respectively.

Lidar (light detection and ranging) is used for distance detection. A lidar emits laser light toward a detection target, a receiver then collects the optical signal reflected by the target, and the distance of the target is determined by measuring the round-trip time of the emitted signal. Owing to the high coherence, directionality, and monochromaticity of laser light, lidar can realize long-range, high-precision ranging. By scanning or multi-element array detection, lidar extends single-point ranging results into two dimensions to form a range image. Lidar is currently applied in many scenarios such as autonomous driving, three-dimensional building modeling, terrain mapping, robotics, and rendezvous and docking.

Lidar can be used to recognize the accurate position and shape of objects.

As another example, the vehicle may include six millimeter-wave radars: one forward, one rearward, and four lateral. The four lateral millimeter-wave radars may be corner millimeter-wave radars facing front-left, rear-left, front-right, and rear-right, respectively.

Millimeter-wave radar is radar that detects in the millimeter-wave band. Usually the frequency of millimeter waves is 30-300 gigahertz (GHz), i.e., the wavelength is 1-10 millimeters (mm). The millimeter-wave wavelength lies between microwaves and centimeter waves.

The detection range of millimeter-wave radar is generally 0-200 meters. Compared with optical beams such as infrared and laser, millimeter waves penetrate fog, smoke, and dust strongly, so millimeter-wave radar has all-weather characteristics.

Millimeter-wave radar can be used to recognize the distance and speed of objects.

As another example, the vehicle may include twelve ultrasonic radars.

An ultrasonic transmitter emits an ultrasonic signal in a certain outward direction, and timing starts at the moment of emission; the ultrasonic wave propagates through the air and is immediately reflected back when it encounters an obstacle on the way, and the ultrasonic receiver stops timing immediately upon receiving the reflected wave. The propagation speed of ultrasound in air is 340 meters per second (m/s); by recording the time t, the timer can calculate the distance from the emission point to the obstacle.
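The round-trip timing described above reduces to a one-line computation, since the pulse travels to the obstacle and back. A minimal sketch (the 340 m/s figure is the propagation speed quoted above; the function name is illustrative):

```python
SPEED_OF_SOUND = 340.0  # m/s, propagation speed of ultrasound in air (quoted above)

def ultrasonic_distance(round_trip_time_s):
    """Distance from the emission point to the obstacle.

    The pulse travels out and back, so the one-way distance is half
    of speed * measured round-trip time t.
    """
    return SPEED_OF_SOUND * round_trip_time_s / 2.0
```

For example, a measured echo delay of 10 ms corresponds to an obstacle about 1.7 m away.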
The propagation speed of ultrasound is relatively slow; when a car travels at high speed, ultrasonic ranging cannot keep up with the real-time change of the inter-vehicle distance, and the error is large. In addition, ultrasonic waves have a large scattering angle and poor directionality. However, for short-distance measurement, ultrasonic ranging sensors have a very large advantage.
As another example, the vehicle may include four front-view cameras: a long-focus camera, a wide-angle camera, and a binocular camera (comprising two cameras).

A long-focus camera, also called a telephoto camera, is a camera whose focal length is longer than that of a standard camera. The focal length of a long-focus camera is generally in the range of 135-800 millimeters (mm), and some long-focus cameras have even longer focal lengths.

A wide-angle camera is a camera whose focal length is shorter, and whose angle of view is larger, than those of a standard camera. The focal length of an ordinary wide-angle lens is generally 38-24 mm with an angle of view of 60-84 degrees; the focal length of an ultra-wide-angle lens is 20-13 mm with an angle of view of 94-118 degrees. Because of its short focal length and large angle of view, a wide-angle lens can capture a large area of scenery within a relatively short shooting distance.

A binocular camera can be used for ranging. The ranging principle of a binocular camera is similar to that of the human eye. The human eye can perceive the distance of objects because there is a difference, also called "disparity", between the images of the same object presented to the two eyes. The farther the object, the smaller the disparity; conversely, the larger the disparity. The magnitude of the disparity corresponds to the distance between the object and the eyes. By computing the disparity between the two images captured by a binocular camera, the scenery within the captured range can be ranged.
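The inverse relation between distance and disparity described above is the standard pinhole stereo formula, depth = focal length × baseline / disparity. A minimal illustrative sketch; the focal length and baseline values in the usage note are hypothetical calibration values, not parameters of the embodiments:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular disparity (Z = f * B / d).

    The farther the object, the smaller the disparity, as described
    above; zero disparity would mean an object at infinity.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For instance, with a focal length of 700 px and a 0.12 m baseline, a 7 px disparity corresponds to a depth of about 12 m.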
As another example, the vehicle may include one rear-view camera.

As another example, the vehicle may include four side cameras facing front-left, rear-left, front-right, and rear-right, respectively. The rear-view camera and the side cameras may be standard or mid-range cameras.

As another example, the vehicle may include four fisheye cameras facing forward, rearward, left, and right, respectively.

A fisheye camera is a special kind of ultra-wide-angle camera whose angle of view can reach or exceed the range visible to the human eye. The focal length of a fisheye camera is generally 16 mm or shorter, and its angle of view is close or equal to 180 degrees. To achieve the maximum photographic angle of view, the front lens element of such a camera has a very short diameter and bulges parabolically toward the front of the lens, quite similar to the eye of a fish.

At present, some automatic driving systems hand over the inoperable part of the system to the driver after a sensor fails; for example, the driver is asked to input the speed limit when it cannot be recognized, and the driver is responsible for steering when the lane cannot be recognized. Although this strictly guarantees driving safety, it reduces the availability of the intelligent driving system and the intelligent driving experience.

Different automatic driving functions depend on sensors to different degrees. The vehicle control solution provided by the embodiments of the present application formulates driving strategies of different levels according to the different degrees of dependence of the automatic driving functions and vehicle operating conditions on different sensors, instead of simply and directly exiting all automatic driving functions. In this way, during automatic driving, when one or more sensors fail, the vehicle maintains all or part of the current automatic driving functions as far as possible, which can improve the user's automatic driving experience while ensuring driving safety.
Fig. 3 shows a schematic architecture diagram of a simplified vehicle control system of an embodiment of the present application. The vehicle control system of Fig. 3 can execute different driving strategies for the failure of the different sensors in Fig. 2 and exchange information with the driver.

The vehicle control system shown in Fig. 3 may include a sensor state monitoring module 310, a central processing unit 320, a planning and control module 330, and a display module 340.

The sensor state monitoring module can monitor in real time whether one or more of the multiple sensors included in the sensing system 120 of Fig. 1 have failed or behave abnormally. The forms of sensor failure or abnormality may specifically include: whether the sensor has a signal, whether the sensor signal is abnormal, whether the sensor perception result is abnormal, and so on.

When one or more of the multiple sensors included in the sensing system 120 fail, the sensor state monitoring module 310 can send the monitored failure information of the one or more sensors to the central processing unit 320.

The central processing unit 320 can receive the sensor state information sent by the sensor state monitoring module 310. The central processing unit 320 determines the function-degradation strategy corresponding to the failure of the one or more sensors according to the currently running intelligent driving function, and sends the function-degradation driving strategy to the planning and control module 330.

Correspondingly, the planning and control module 330 receives and executes the function-degradation strategy sent by the central processing unit 320.

The central processing unit can display the function-degradation driving strategy on a display module 340 such as a human machine interface (HMI); the display module includes interfaces that can interact with the user, such as the display screen and central control screen inside the vehicle, to remind the user of the current sensor failure state, the driving strategy, and so on.

Optionally, the central processing unit 320 can also express the driving strategy in forms such as voice, ambient lighting, and vibration, so as to remind or warn the driver of the current sensor failure state, the function-degradation strategy, and so on.
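The monitoring → central processing → planning/control → HMI flow described above can be sketched as a simple message pipeline. This is an illustrative sketch only; the function name, the strategy table, and the callback interfaces are hypothetical stand-ins for modules 310-340:

```python
# Hypothetical sketch of the Fig. 3 flow: the monitor reports failed sensors,
# the central unit maps them to a function-degradation strategy, the
# planning/control module executes it, and the HMI displays it.

def run_cycle(failed_sensors, strategy_table, execute, display):
    """One monitoring cycle.

    strategy_table: dict frozenset(failed sensor names) -> strategy string
                    (standing in for the central processing unit 320)
    execute:        callback standing in for the planning/control module 330
    display:        callback standing in for the HMI display module 340
    """
    key = frozenset(failed_sensors)
    # no entry means the failure does not affect the running function
    strategy = strategy_table.get(key, "keep current autonomous function")
    execute(strategy)   # planning/control module executes the degradation
    display(strategy)   # HMI reminds the driver of failure state and strategy
    return strategy
```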
Fig. 4 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application. The method of Fig. 4 can be executed by the vehicle of Fig. 1, the sensors of Fig. 2, and the vehicle control system of Fig. 3.

In S410, sensor state information and automatic driving function information are acquired.

The sensor state information may include information indicating that one or more sensors are in a failure state; the automatic driving function information may be used to indicate the currently running automatic driving function, and the currently running automatic driving function includes multiple operating conditions.

Sensor failure can take many forms; for example, it may include no signal from the sensor due to reasons such as the driver circuit or wiring, sensor damage caused by flying gravel and the like, or degradation of sensor perception performance caused by water, dirt occlusion, and the like. The embodiments of the present application do not limit the specific form of sensor failure.

The automatic driving function may be one of modes such as integrated cruise assist (ICA), navigation cruise assist (NCA), auto parking assist (APA), remote parking assist (RPA), and automated valet parking (AVP); the automatic driving functions in which the vehicle operates are merely examples in the embodiments of the present application and are not limiting.

For example, automatic cruising may be performed in ICA or NCA mode.

ICA mode does not require a high-precision map and can use sensors for road recognition. ICA mode is suitable for vehicles traveling on highways.

NCA mode requires a prefabricated high-precision map. Using NCA mode, path planning and cruising can be realized according to the destination entered by the user in the map. NCA is suitable for vehicles traveling on urban roads or highways.

For example, autonomous parking may adopt modes such as AVP, APA, or RPA.

In AVP mode, the automatic driving system completes, in place of the driver, the driving and parking tasks from a specific area of the parking lot (such as the entrance/exit or the elevator lobby) to the target parking space. That is to say, during parking in AVP mode, the automatic driving system can control the vehicle to cruise and perceive the surrounding environment to determine whether an empty parking space exists. After determining that an empty parking space exists, the positional relationship between the empty parking space and the vehicle can be determined through APA mode or RPA mode, and the vehicle is controlled to drive into the space.

ICA may include operating conditions such as cruising, car following, start-stop, lane keeping, and lane changing; NCA and AVP may include operating conditions such as cruising, car following, start-stop, lane keeping, lane changing, and turning; APA and RPA may include operating conditions such as parallel parking, reversing into a parking space, and oblique parking. The operating conditions included in the automatic driving functions are merely examples in the embodiments of the present application and are not limiting.

In S420, the driving strategy corresponding to the failure of the one or more sensors under the first operating condition is determined.

Specifically, the driving strategy corresponding to the failure of the one or more sensors under the first operating condition is determined according to the impact of the failure of the one or more sensors on the first operating condition, where the first operating condition is any one of the multiple operating conditions.

The above driving strategies are multi-level function-degradation driving strategies, including: maintaining the currently running automatic driving function, disabling part of the automatic driving function, and exiting the current automatic driving function.

The realization of the first operating condition is related to one or more sub-functions, which include at least one of the following: one or more key sub-functions, one or more non-critical sub-functions, and one or more auxiliary sub-functions. The first operating condition is any one of the multiple operating conditions of the current automatic driving function; loss of a key sub-function makes the first operating condition impossible to realize; loss of a non-critical sub-function does not affect the realization of the sub-functions of the first operating condition other than that non-critical sub-function; and loss of an auxiliary sub-function does not affect the realization of the first operating condition. The first operating condition may be any operating condition of any automatic driving function; for example, it may be the cruising condition of ICA, the car-following condition of AVP, or the reverse-parking condition of APA. The first operating condition is merely an example in the embodiments of the present application and is not limiting.
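The three-level decision just described (a lost key sub-function forces an exit, a lost non-critical sub-function only disables that part, a lost auxiliary sub-function has no effect) can be sketched per operating condition. This is a minimal illustrative sketch; the function name and the sub-function labels are hypothetical:

```python
def strategy_for_condition(lost_sub_functions, condition_sub_functions):
    """Pick the degradation level for one operating condition.

    condition_sub_functions: dict sub_function -> 'key' | 'non-key'
                             | 'auxiliary' (role within this condition)
    lost_sub_functions: sub-functions that no working sensor can still
                        provide after the failure
    """
    lost_kinds = {condition_sub_functions.get(sf) for sf in lost_sub_functions}
    if 'key' in lost_kinds:
        # the operating condition cannot be realized at all
        return 'exit autonomous driving'
    if 'non-key' in lost_kinds:
        # keep the function but disable the affected sub-functions
        return 'disable affected sub-functions'
    # only auxiliary sub-functions (or nothing) were lost
    return 'keep current autonomous function'
```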
In some possible implementations, a first type of sensor exists among the one or more sensors; the failure of the first type of sensor does not affect the realization of the first operating condition, and the driving strategy corresponding to the failure of the first type of sensor under the first operating condition is to maintain the current automatic driving function.

It should be understood that the failure of the first type of sensor does not affect the realization of the first operating condition. Therefore, when the first type of sensor fails, the first operating condition can still run normally.

In some possible implementations, a second type of sensor exists among the one or more sensors; the failure of the second type of sensor affects the realization of a non-critical sub-function of the first operating condition, and the driving strategy corresponding to the failure of the second type of sensor under the first operating condition is to disable part of the automatic driving function.

It should be understood that when the second type of sensor fails, a non-critical sub-function under the first operating condition cannot be realized (for example, the turning function cannot be realized, or high-precision detection cannot be realized), while the remaining sub-functions other than that non-critical sub-function can be realized. Since disabling the non-critical sub-function corresponding to the failed second-type sensor does not cause driving safety or route deviation problems, the driving strategy corresponding to the failure of the second type of sensor under the first operating condition is to disable the non-critical sub-function associated with the second type of sensor while maintaining the current automatic driving function.

In some possible implementations, a third type of sensor exists among the one or more sensors; the failure of the third type of sensor affects the realization of a key sub-function of the first operating condition, and the driving strategy corresponding to the failure of the third type of sensor under the first operating condition is to exit the current automatic driving function.

It should be understood that the loss of a key sub-function can cause driving safety or route deviation problems. When the third type of sensor fails, a key sub-function of the first operating condition is lost and the first operating condition cannot be realized. Then the driving strategy corresponding to the failure of the third type of sensor under the first operating condition is to exit the automatic driving function.

Fig. 5 is a schematic diagram of HMI interaction of the vehicle control method provided by an embodiment of the present application.

During automatic driving, the driving strategy corresponding to the failure of one or more sensors can be conveyed to the driver in different forms through the vehicle control system of Fig. 3, so that the driver is informed of the sensor failure state, the function-degradation driving strategy, and the related operations.

For the driving strategy of maintaining the current automatic driving function, the HMI informs the driver of the sensor failure state and reminds the driver to watch the road conditions. The forms of informing and reminding may include: displaying the sensor failure state and the road conditions to be watched in the form of text and/or pictures on the in-vehicle screen, and broadcasting the sensor failure state and the road conditions to be watched through the vehicle voice system.

For the driving strategy of disabling part of the automatic driving function, the HMI informs the driver of the sensor failure state and the function-degradation driving strategy (for example, the vehicle decelerating, left turns being prohibited, and so on), and reminds the driver to watch the road conditions. The forms of informing and reminding may include: displaying the sensor failure state, the corresponding function-degradation driving strategy, and the road conditions to be watched in the form of text and/or pictures on the vehicle screen, and broadcasting them through the vehicle voice system.

It should be understood that the first operating condition is any one of the multiple operating conditions of the current automatic driving function.

For the driving strategy of exiting the automatic driving function, the HMI informs the driver of the sensor failure state and warns the driver to take over the vehicle. The forms of informing and warning include: the HMI displaying the sensor failure state and the takeover request in a conspicuous color such as red, the vehicle voice system broadcasting the sensor failure state and the takeover request, and requesting the driver to take over the vehicle by means of vibration or the in-vehicle ambient light.
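The graded HMI behavior described above can be sketched as a lookup from degradation level to notification channels and message. This is an illustrative sketch only; the level keys, channel names, and message strings are hypothetical, not the actual HMI of the embodiments:

```python
# Illustrative mapping from degradation level to the HMI behavior described
# above: screen text and voice for the first two levels, plus a red display,
# vibration, and ambient light when the driver must take over.

HMI_ACTIONS = {
    'keep':    {'channels': ['screen', 'voice'],
                'message': 'Sensor failed; function kept. Watch the road.'},
    'disable': {'channels': ['screen', 'voice'],
                'message': 'Sensor failed; some functions disabled.'},
    'exit':    {'channels': ['screen', 'voice', 'vibration', 'ambient_light'],
                'message': 'Sensor failed; please take over the vehicle.'},
}

def notify(level):
    """Return the (channel, message) pairs the HMI would emit for a level."""
    action = HMI_ACTIONS[level]
    return [(ch, action['message']) for ch in action['channels']]
```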
Fig. 6 is a schematic flowchart of a vehicle control method provided by an embodiment of the present application. The method of Fig. 6 can be executed by the vehicle of Fig. 1, the sensors of Fig. 2, and the vehicle control system of Fig. 3.

In S610, sensor state information and automatic driving function information are acquired.

The sensor state information may include information indicating that one or more sensors are in a failure state; the automatic driving function information may be used to indicate the currently running automatic driving function, and the automatic driving function includes multiple operating conditions.

Sensor failure can take many forms; for example, it may include no signal from the sensor due to reasons such as the driver circuit or wiring, sensor damage caused by flying gravel and the like, or degradation of sensor perception performance caused by water, dirt occlusion, and the like. The embodiments of the present application do not limit the specific form of sensor failure.

The automatic driving function may be one of modes such as integrated cruise assist (ICA), navigation cruise assist (NCA), auto parking assist (APA), remote parking assist (RPA), and automated valet parking (AVP); the automatic driving functions in which the vehicle operates are merely examples in the embodiments of the present application and are not limiting.

For example, automatic cruising may adopt ICA mode. ICA mode does not require a high-precision map and is suitable for vehicles traveling on highways.

As another example, automatic cruising may also adopt NCA mode. NCA mode requires a prefabricated high-precision map and is suitable for vehicles traveling on urban roads or highways.

For example, autonomous parking may adopt AVP mode. In AVP mode, the automatic driving system completes, in place of the driver, the driving and parking tasks from a specific area of the parking lot (such as the entrance/exit or the elevator lobby) to the target parking space. That is to say, during parking in AVP mode, the automatic driving system can control the vehicle to cruise and perceive the surrounding environment to determine whether an empty parking space exists. In other words, the automatic driving system can search for a parking space.

As another example, autonomous parking may also adopt APA or RPA mode. After determining that an empty parking space exists, the positional relationship between the empty parking space and the vehicle can be determined through APA mode or RPA mode, and the vehicle is controlled to drive into the space:

For example, in APA mode, the automatic driving system determines the positional relationship between the empty parking space and the vehicle, and controls the vehicle to park into the space.

For example, in RPA mode, the driver can leave the vehicle and send a parking instruction to the automatic driving system using a terminal device such as a mobile phone. The automatic driving system can complete the parking operation according to the received parking instruction. That is to say, after receiving the parking instruction, the automatic driving system can determine the positional relationship between the empty parking space and the vehicle and control the vehicle to park into the space. RPA technology involves communication between the vehicle and the terminal device, and the communication method generally adopted is Bluetooth.

ICA may include operating conditions such as cruising, car following, start-stop, lane keeping, and lane changing; NCA and AVP may include operating conditions such as cruising, car following, start-stop, lane keeping, lane changing, and turning; APA and RPA may include operating conditions such as parallel parking, reversing into a parking space, and oblique parking. The operating conditions included in the automatic driving functions are merely examples in the embodiments of the present application and are not limiting.

The first operating condition may be the cruising condition of ICA mode, the car-following condition of AVP mode, or the reverse-parking condition of APA mode. The first operating condition may be any operating condition of any automatic driving function, which is not limited in the embodiments of the present application.
At S620, the driving strategy corresponding to the failure of the one or more sensors under the first operating condition is determined.
Specifically, according to the impact of the failure of the one or more sensors on the first operating condition, the driving strategy corresponding to that failure under the first operating condition is determined, where the first operating condition is any one of the multiple operating conditions.
In some possible implementations, the driving strategy is a multi-level degradation strategy, including: keeping the currently running autonomous driving function, disabling part of the autonomous driving function, and exiting the current autonomous driving function.
The realization of the first operating condition is related to one or more sub-functions, which include at least one of the following: one or more key sub-functions, one or more non-key sub-functions, and one or more auxiliary sub-functions. The first operating condition is any one of the multiple operating conditions of the current autonomous driving function. Loss of a key sub-function makes the first operating condition unachievable; loss of a non-key sub-function does not affect the realization of the sub-functions of the first operating condition other than that non-key sub-function; loss of an auxiliary sub-function does not affect the realization of the first operating condition. The first operating condition may be any operating condition of any autonomous driving function, for example the cruising condition of ICA, the car-following condition of AVP, or the reverse-parking condition of APA.
The first operating condition in the embodiments of this application is merely an example and is not limiting. In some possible implementations, the one or more sensors include a first-class sensor, whose failure does not affect the realization of the first operating condition; the driving strategy corresponding to the failure of a first-class sensor under the first operating condition is to keep the current autonomous driving function.
It should be understood that, since failure of a first-class sensor does not affect the realization of the first operating condition, the first operating condition can still run normally when a first-class sensor fails.
In some possible implementations, the one or more sensors include a second-class sensor, whose failure affects the realization of a non-key sub-function of the first operating condition; the driving strategy corresponding to the failure of a second-class sensor under the first operating condition is to disable part of the autonomous driving function.
It should be understood that when a second-class sensor fails, the non-key sub-function associated with it cannot be realized (for example, oncoming vehicles at an intersection, that is, distant forward targets, cannot be accurately identified, or high-precision detection cannot be achieved), while the remaining sub-functions can still be realized. Since disabling the non-key sub-function associated with the failed second-class sensor does not cause driving safety or route-deviation problems, the driving strategy corresponding to the failure of a second-class sensor under the first operating condition is to disable the associated non-key sub-function while keeping the current autonomous driving function.
In some possible implementations, the one or more sensors include a third-class sensor, whose failure affects the realization of a key sub-function of the first operating condition; the driving strategy corresponding to the failure of a third-class sensor under the first operating condition is to exit the current autonomous driving function.
It should be understood that the loss of a key sub-function can cause driving safety or route-deviation problems. When a third-class sensor fails, a key sub-function of the first operating condition is lost and the first operating condition cannot be realized; the corresponding driving strategy is therefore to exit the autonomous driving function. In some possible implementations, the first-class sensors include a first sensor unit that is associated with a first sub-function, and there exist other non-failed sensor units associated with the first sub-function, the first sub-function being any one of the one or more sub-functions of the first operating condition.
In some possible implementations, the first-class sensors further include a second sensor unit associated with a second auxiliary sub-function, the second auxiliary sub-function being any one of the one or more auxiliary sub-functions.
In some possible implementations, the second-class sensors include a third sensor unit associated with a third non-key sub-function, and there is no other failed sensor unit associated with the third non-key sub-function, the third non-key sub-function being any one of the one or more non-key sub-functions of the first operating condition.
In some possible implementations, the second-class sensors further include a fourth sensor unit associated with a fourth key sub-function and a fourth non-key sub-function, and there exist a surrounding environment that compensates for the fourth key sub-function and other non-failed sensor units that sense that surrounding environment, where the surrounding environment includes vehicles, railings, and the like; the fourth key sub-function is any one of the one or more key sub-functions of the first operating condition, and the fourth non-key sub-function is any one of the one or more non-key sub-functions of the first operating condition.
In some possible implementations, the third-class sensors include a fifth sensor unit associated with a fifth key sub-function, and there is no other non-failed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other non-failed sensor units that sense the surrounding environment; the fifth key sub-function is any one of the one or more key sub-functions of the first operating condition.
It should be understood that a sensor unit may be a single sensor or a group of two or more sensors; this application does not limit this.
It should be understood that, in a smart vehicle, the sub-functions associated with different sensor units may be the same or different. A sensor unit may be associated with multiple sub-functions at the same time or with only one; this application does not limit this.
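The three-class decision described above can be sketched in code. This is a minimal illustration under assumptions, not the claimed implementation: the `Role` and `Strategy` names, the association map, and the `env_compensates` flag are introduced only for the example.

```python
from enum import Enum

class Role(Enum):
    KEY = "key"          # loss makes the operating condition unachievable
    NON_KEY = "non_key"  # loss only disables this sub-function itself
    AUX = "auxiliary"    # loss does not affect the operating condition

class Strategy(Enum):
    KEEP = "keep the current autonomous driving function"
    DEGRADE = "disable part of the autonomous driving function"
    EXIT = "exit the autonomous driving function"

def classify_failure(failed_unit, assoc, healthy_units, env_compensates=False):
    """Map one failed sensor unit to a driving strategy for one operating condition.

    assoc: dict mapping a sensor unit to the set of (sub_function, Role)
           pairs it is associated with under this operating condition.
    healthy_units: sensor units that have not failed.
    env_compensates: True if the surroundings (vehicles, railings, walls)
           together with other healthy sensors can stand in for a lost key
           sub-function (the "fourth sensor unit" case).
    """
    uncovered = set()
    for sub_fn, role in assoc[failed_unit]:
        # Another healthy unit covering the same sub-function means no impact.
        covered = any((sub_fn, role) in assoc.get(u, set()) for u in healthy_units)
        if not covered:
            uncovered.add(role)
    if Role.KEY in uncovered and not env_compensates:
        return Strategy.EXIT          # third-class sensor
    if Role.KEY in uncovered or Role.NON_KEY in uncovered:
        return Strategy.DEGRADE       # second-class sensor
    return Strategy.KEEP              # first-class sensor (covered or auxiliary)
```

Used with a hypothetical association map, a front camera backed up by a second front camera yields `KEEP`, a sole rear camera carrying a non-key sub-function yields `DEGRADE`, and an unbacked key sensor yields `EXIT`.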
During autonomous driving, one or more sensors may fail. The embodiments of this application provide aggressive handling strategies (see Tables 1 to 27) and conservative handling strategies (see Tables 28 to 30) for each sensor failure.
In the aggressive strategies, when a sensor fails, the system keeps the function running normally as far as possible; in the conservative strategies, when a sensor fails, the system prefers to hand control of the vehicle back to the driver.
During automatic cruising, for example in the ICA or NCA mode, the aggressive handling strategies for failures of the sensors in Fig. 2 are detailed in Tables 1 to 11.
Both the ICA and NCA modes involve longitudinal control of the vehicle during cruising, car following, and stop-and-go, as well as lateral control during lane keeping and lane changing. In addition, the NCA mode also covers turning, whereas turning is not applicable (N/A) to the ICA mode.
When one of the front cameras fails, the aggressive handling strategies during automatic cruising are shown in Table 1.
Table 1
Figure PCTCN2021121638-appb-000001
Under the cruising condition of the NCA mode, if the telephoto camera fails, the recognition of the object types of distant traffic lights and of oncoming vehicles is affected. The vehicle can therefore be decelerated before it reaches the intersection, leaving enough time for traffic-light detection and oncoming-vehicle recognition. Equivalently, failure of the telephoto camera affects the non-key sub-function of distant target recognition under the cruising condition, so the vehicle cannot travel at high speed, and the corresponding driving strategy is to disable high-speed driving (drive at reduced speed).
Under any operating condition of the ICA or NCA mode, if one front camera other than the telephoto camera fails, there exists a group of other non-failed front cameras associated with the sub-functions of the failed camera; in other words, the remaining front cameras can compensate for the perception information lost with the failed camera. Any operating condition of the ICA or NCA mode can therefore still run normally, and the corresponding driving strategy is to keep the current autonomous driving function.
Under the cruising condition of the ICA mode, control of the vehicle at intersections is not involved, so the driving strategy for a telephoto camera failure is identical to the strategy described above for the failure of any other single front camera.
Table 1 only covers the driving strategies when a single front camera fails. Under any operating condition of the ICA or NCA mode, if two or more front cameras fail and no group of non-failed front cameras associated with their sub-functions exists (in other words, the forward information required for safe driving can no longer be obtained), the key sub-functions associated with those cameras cannot be realized, and the autonomous driving function can be exited.
When one or more of the left-side cameras fail, the aggressive handling strategies during automatic cruising are shown in Table 2.
Table 2
Figure PCTCN2021121638-appb-000002
Figure PCTCN2021121638-appb-000003
When a left-side camera fails, the non-key sub-function of accurate recognition of left-side targets cannot be realized, so the vehicle cannot change lanes to the left. Generally, in the ICA or NCA mode, prohibiting left lane changes does not cause safety or route-deviation problems. Failure of only the left-side cameras does not affect observation of targets on the right of the vehicle, so lane changes to the right remain possible. Therefore, under the lane-changing condition of the ICA or NCA mode, the driving strategy can be to prohibit lane changes to the left.
Under the turning condition of the NCA mode, left-side targets must be observed regardless of whether the vehicle turns left or right. Therefore, when one of the left-side cameras is determined to have failed, the key sub-function of left-side target observation cannot be realized, and NCA can be exited.
In the ICA or NCA mode, the left-side cameras provide auxiliary target recognition for the cruising, car-following, stop-and-go, and lane-keeping conditions; in other words, they are associated with the auxiliary sub-function of left-side target recognition for those conditions. Loss of an auxiliary sub-function does not affect the realization or normal operation of those conditions. Therefore, in the ICA or NCA mode, the vehicle can keep the current autonomous driving function when a left-side camera fails.
When one or more of the right-side cameras fail, the aggressive handling strategies during automatic cruising are shown in Table 3.
Table 3
Figure PCTCN2021121638-appb-000004
Similar to the left-side case, when a right-side camera fails, the non-key sub-function of accurate recognition of right-side targets cannot be realized, so the vehicle cannot change lanes to the right. Generally, in the ICA or NCA mode, prohibiting right lane changes does not affect normal driving; for example, it does not cause safety or route-deviation problems. Failure of only the right-side cameras does not affect observation of targets on the left of the vehicle, so lane changes to the left remain possible. Therefore, under the lane-changing condition of the ICA or NCA mode, the driving strategy can be to prohibit lane changes to the right.
Under the turning condition of the NCA mode, right-side targets must be observed regardless of whether the vehicle turns left or right. Therefore, when one of the right-side cameras is determined to have failed, the key sub-function of right-side target observation cannot be realized, and NCA can be exited.
In the ICA or NCA mode, the right-side cameras can be regarded as providing auxiliary target-recognition information for the cruising, car-following, stop-and-go, and lane-keeping conditions; that is, they are associated with auxiliary sub-functions of those conditions. Failure of a right-side camera does not affect the realization of those conditions, which can continue to run normally, and the vehicle can keep the current autonomous driving function.
When the rear-view camera fails, the aggressive handling strategies during automatic cruising are shown in Table 4.
Table 4
Figure PCTCN2021121638-appb-000005
Lane changing requires observation of targets behind the vehicle to prevent a collision with a vehicle in the destination lane. When the rear-view camera fails, the non-key sub-function of accurate recognition of rear targets cannot be realized, so automatic lane changing is not possible. Generally, in the ICA or NCA mode, prohibiting lane changes does not cause safety or route-deviation problems. Therefore, under the lane-changing condition of the ICA or NCA mode, the lane-changing function can be disabled when the rear-view camera fails.
Under the turning condition of the NCA mode, observation of rear targets relies mainly on the left-side or right-side cameras; the rear-view camera only provides auxiliary target-recognition information for turning. Its failure does not affect the turning condition, which can continue to run normally, so the vehicle can keep the current autonomous driving function.
The rear-view camera can be regarded as associated with auxiliary target recognition for the cruising, car-following, stop-and-go, and lane-keeping conditions of the ICA or NCA mode. Its failure therefore does not affect those conditions, and the corresponding driving strategy can be to keep the current autonomous driving function.
When one or more of the fisheye cameras fail, the aggressive handling strategies during automatic cruising are shown in Table 5.
Table 5
Figure PCTCN2021121638-appb-000006
The fisheye cameras are mainly used for parking. They can be regarded as providing auxiliary target recognition for the ICA or NCA autonomous driving functions, so their failure has no impact on the realization of the ICA or NCA operating conditions. Therefore, when one or more fisheye cameras fail, each operating condition can run normally and the vehicle can keep the ICA or NCA autonomous driving function.
When the forward lidar fails, the aggressive handling strategies during automatic cruising are shown in Table 6.
Table 6
Figure PCTCN2021121638-appb-000007
Figure PCTCN2021121638-appb-000008
Together with the front cameras, the forward lidar realizes perception of forward targets. When the forward lidar fails, the accurate position and shape of forward targets cannot be identified, and perception accuracy for forward targets cannot be guaranteed. In other words, when the forward lidar fails, its associated key sub-function is lost and none of the ICA or NCA operating conditions can be realized. Therefore, under any ICA or NCA operating condition, the driving strategy corresponding to a forward lidar failure can be to exit the ICA or NCA autonomous driving function.
When one or more of the side lidars fail, the aggressive handling strategies during automatic cruising are shown in Table 7.
Table 7
Figure PCTCN2021121638-appb-000009
The side lidars are used to accurately identify the position and shape of targets to the side of the vehicle. When a side lidar fails, its associated key sub-function is lost and none of the ICA or NCA operating conditions can be realized. Therefore, under any ICA or NCA operating condition, the driving strategy corresponding to a side lidar failure is to exit the ICA or NCA autonomous driving function.
When the forward mmWave radar fails, the aggressive handling strategies during automatic cruising are shown in Table 8.
Table 8
Figure PCTCN2021121638-appb-000010
The forward mmWave radar is mainly used to identify distant forward targets and to track the speed of the lead vehicle. When it fails, the lead vehicle's speed can only be estimated from the forward lidar and front cameras, with larger errors. Therefore, under the car-following and stop-and-go conditions of the ICA or NCA mode, when the forward mmWave radar fails, the following distance can be increased and the interval between the lead vehicle's start and the ego vehicle's start lengthened (that is, the distance to the lead vehicle increased), so as to reduce or eliminate the impact of the speed-estimation error on driving safety.
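The enlarged following gap can be sketched as a simple scaling rule. This is an assumption-laden illustration: the base time gap and the 1.5x degradation factor are invented for the example and are not values from this application.

```python
def following_gap_s(base_gap_s: float, forward_radar_ok: bool,
                    degrade_factor: float = 1.5) -> float:
    """Time gap to the lead vehicle for car-following / stop-and-go.

    With the forward mmWave radar failed, the lead vehicle's speed must be
    estimated from lidar and camera alone, so the gap is enlarged to absorb
    the larger estimation error. The 1.5x factor is illustrative only.
    """
    return base_gap_s if forward_radar_ok else base_gap_s * degrade_factor
```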
The forward mmWave radar can be regarded as providing only auxiliary target recognition for the cruising, lane-keeping, lane-changing, and turning conditions of the ICA or NCA mode. Its failure therefore does not affect those conditions, and the ICA or NCA autonomous driving function can continue to run normally under them when the forward mmWave radar fails.
When one or more of the corner mmWave radars fail, the aggressive handling strategies during automatic cruising are shown in Table 9.
Table 9
Figure PCTCN2021121638-appb-000011
Perception of targets to the side of the vehicle relies mainly on the cameras and lidars; the corner mmWave radars can be used to assist side target recognition.
During automatic cruising, the corner mmWave radars can be regarded as providing only auxiliary target recognition; that is, their failure does not affect the realization of any operating condition of the ICA or NCA mode. Therefore, the ICA or NCA autonomous driving function can continue to run normally when a corner mmWave radar fails.
When the rearward mmWave radar fails, the aggressive handling strategies during automatic cruising are shown in Table 10.
Table 10
Figure PCTCN2021121638-appb-000012
In the vehicle shown in Fig. 2, no lidar is mounted at the rear; that is, the rear of the vehicle lies in the blind zone of every lidar. Observation and perception of rear targets therefore rely mainly on the rear-view camera and the rearward mmWave radar.
When the rearward mmWave radar fails, the position, speed, and other information of rear targets cannot be accurately estimated. When the ego vehicle changes into a destination lane, it must perceive the vehicle behind it in that lane and adjust the timing of the lane change according to that vehicle's driving state. Therefore, under the lane-changing condition of the ICA or NCA mode, when the rear-view camera detects a target such as a vehicle behind in the destination lane, failure of the rearward mmWave radar affects the lane change, and the corresponding driving strategy is to prohibit lane changes.
It should be understood that, in general, prohibiting lane changes in the ICA or NCA mode does not cause safety or route-deviation problems.
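The lane-change rules of Tables 2 to 4 and 10 can be combined into one gating function. This is a sketch under assumptions: the boolean sensor-health flags and the direction labels are introduced for the example, not taken from the claimed system.

```python
def allowed_lane_changes(left_cams_ok: bool, right_cams_ok: bool,
                         rear_cam_ok: bool, rear_radar_ok: bool,
                         rear_target_seen: bool) -> set:
    """Lane-change directions still permitted in ICA/NCA, per Tables 2-4 and 10.

    A failed left (right) camera disables left (right) lane changes; a failed
    rear-view camera disables both; a failed rearward mmWave radar disables
    lane changes while the rear-view camera still sees a target behind in
    the destination lane.
    """
    if not rear_cam_ok:
        return set()                       # no rear target recognition at all
    if not rear_radar_ok and rear_target_seen:
        return set()                       # rear target present, speed unknown
    allowed = set()
    if left_cams_ok:
        allowed.add("left")
    if right_cams_ok:
        allowed.add("right")
    return allowed
```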
When one or more of the ultrasonic radars fail, the aggressive handling strategies during automatic cruising are shown in Table 11.
Table 11
Figure PCTCN2021121638-appb-000013
The ultrasonic radars are mainly used for parking and can be regarded as providing only auxiliary target recognition for the ICA or NCA autonomous driving functions. Their failure does not affect the realization of the ICA or NCA operating conditions. Therefore, when an ultrasonic radar fails, each operating condition can run normally and the vehicle can keep the ICA or NCA autonomous driving function.
During autonomous parking, the aggressive handling strategies when one of the sensors in Fig. 4 fails are detailed in Tables 12 to 27.
While the vehicle searches for and drives toward a parking space, for example during autonomous driving in the AVP mode, the aggressive handling strategies for sensor failures are detailed in Tables 12 to 22. In general, compared with modes such as ICA and NCA, the vehicle travels at a lower speed while searching for and driving toward a parking space.
When one of the front cameras fails, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 12.
Table 12
Figure PCTCN2021121638-appb-000014
While driving toward a parking space, the vehicle generally travels in areas such as parking lots and garages, and recognition of signal lights such as traffic lights at intersections is generally not involved.
There are multiple front cameras. When one of them fails, the other non-failed front cameras are considered able to take over the sub-functions of the failed camera; in other words, the remaining front cameras can compensate for the perception information lost with the failed camera. Therefore, when one front camera fails, each operating condition can run normally and the vehicle can keep the current AVP autonomous driving function.
Table 12 only covers the driving strategies when a single front camera fails. Under any operating condition, if two or more front cameras fail and no group of non-failed front cameras associated with their sub-functions exists (in other words, the forward information required for safe driving can no longer be obtained), the key sub-functions associated with the failed cameras cannot be realized. Therefore, under any operating condition, the driving strategy when two or more front cameras fail can be to exit the autonomous driving function.
When one or more of the left-side cameras fail, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 13.
Table 13
Figure PCTCN2021121638-appb-000015
When a left-side camera fails, the driving strategies for each operating condition in the AVP mode are the same as in the NCA mode and are not repeated here.
When one or more of the right-side cameras fail, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 14.
Table 14
Figure PCTCN2021121638-appb-000016
When a right-side camera fails, the driving strategies for each operating condition in the AVP mode are the same as in the NCA mode and are not repeated here.
When the rear-view camera fails, the handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 15.
Table 15
Figure PCTCN2021121638-appb-000017
When the rear-view camera fails, the driving strategies for each operating condition in the AVP mode are the same as in the NCA mode and are not repeated here.
When one or more of the fisheye cameras fail, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 16.
Table 16
Figure PCTCN2021121638-appb-000018
Figure PCTCN2021121638-appb-000019
The lidars have blind zones close to the vehicle, and every camera other than the fisheye cameras also has a blind zone close to the vehicle. Under each operating condition of the AVP mode, when a fisheye camera fails, the autonomous driving system cannot realize the key sub-function of accurately perceiving nearby obstacles. Therefore, when a fisheye camera fails, the driving strategy for every operating condition is to exit the AVP autonomous driving function.
When the forward lidar fails, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 17.
Table 17
Figure PCTCN2021121638-appb-000020
With the forward lidar and front cameras, forward targets can be perceived. While the vehicle autonomously drives toward a parking space, obstacles such as pillars, bollards, and barrier arms may be present, and the forward lidar can determine their positions. When the forward lidar fails, the position and shape of forward obstacles cannot be accurately perceived; its associated key sub-function is lost and no operating condition can be realized. Therefore, when the forward lidar fails, the driving strategy for every operating condition is to exit the AVP autonomous driving function.
When one or more of the side lidars fail, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 18.
Table 18
Figure PCTCN2021121638-appb-000021
The side lidars can be used to accurately identify the position and shape of targets to the side of the vehicle. When a side lidar fails, its associated key sub-function cannot be realized. Therefore, when a side lidar fails, the driving strategy for every operating condition is to exit the AVP autonomous driving function.
When the forward mmWave radar fails, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 19.
Table 19
Figure PCTCN2021121638-appb-000022
Figure PCTCN2021121638-appb-000023
While the vehicle autonomously drives toward a parking space, the driving speed is low; forward targets can be perceived with the front cameras and forward lidar, and nearby obstacles with the ultrasonic sensors and fisheye cameras. The forward mmWave radar only provides auxiliary target-recognition information. At low driving speeds, its failure can be considered not to affect the realization of the AVP autonomous driving function. Therefore, when the forward mmWave radar fails, every operating condition can keep the AVP autonomous driving function.
When one or more of the corner mmWave radars fail, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 20.
Table 20
Figure PCTCN2021121638-appb-000024
While the vehicle autonomously drives toward a parking space, the driving speed is low; side targets can be perceived with the side cameras and side lidars, and nearby obstacles with the ultrasonic sensors and fisheye cameras. The corner mmWave radars are associated with auxiliary target recognition. At low driving speeds, their failure can be considered not to affect the realization of the AVP autonomous driving function. Therefore, when a corner mmWave radar fails, every operating condition can keep the AVP autonomous driving function.
When the rearward mmWave radar fails, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 21.
Table 21
Figure PCTCN2021121638-appb-000025
While the vehicle autonomously drives toward a parking space, the driving speed is low; side targets can be perceived with the side cameras and side lidars, and nearby obstacles with the ultrasonic sensors and fisheye cameras. The rearward mmWave radar only provides auxiliary target recognition. At low driving speeds, its failure can be considered not to affect the realization of the AVP autonomous driving function. Therefore, when the rearward mmWave radar fails, every operating condition can keep the AVP autonomous driving function.
When an ultrasonic radar fails, the aggressive handling strategies while the vehicle autonomously drives toward a parking space are shown in Table 22.
Table 22
Figure PCTCN2021121638-appb-000026
Figure PCTCN2021121638-appb-000027
The ultrasonic radars are mainly used for detecting nearby obstacles. When an ultrasonic radar fails, nearby targets can still be perceived with the fisheye cameras. Therefore, when an ultrasonic radar fails, every operating condition can keep the AVP autonomous driving function.
In some embodiments, when the ultrasonic radars and the fisheye cameras fail at the same time, the vehicle cannot detect nearby targets; in that case, every operating condition can exit the AVP autonomous driving function.
While the vehicle autonomously parks into a space, three sensing approaches can be used. The first uses the lidars, fisheye cameras, and ultrasonic radars together. The second uses the fisheye cameras and ultrasonic radars together. The third uses only the fisheye cameras or only the ultrasonic radars for perception.
Compared with the first approach, the second reduces obstacle-detection accuracy but does not affect the realization of autonomous park-in. The third approach reduces accuracy further. In addition, realizing autonomous park-in with only the ultrasonic radars requires objects around the space, such as walls, fences, or other vehicles, that mark its extent and assist positioning.
When all sensors are normal, the first approach can be used for perception. When a lidar fails, the second can be used. When a fisheye camera or an ultrasonic radar fails, the third can be used.
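The fallback among the three sensing approaches can be sketched as a selection function. This is a minimal illustration: the mode names and the `boundary_objects` flag (walls, fences, or parked cars marking the space when only the ultrasonic radars remain) are assumptions introduced for the example.

```python
from typing import Optional

def parking_perception_mode(lidar_ok: bool, fisheye_ok: bool,
                            ultrasonic_ok: bool,
                            boundary_objects: bool) -> Optional[str]:
    """Pick the sensing approach for autonomous park-in, or None to exit."""
    if lidar_ok and fisheye_ok and ultrasonic_ok:
        return "lidar+fisheye+ultrasonic"   # first approach: full accuracy
    if fisheye_ok and ultrasonic_ok:
        return "fisheye+ultrasonic"         # second approach: reduced accuracy
    if fisheye_ok:
        return "fisheye"                    # third approach, camera only
    if ultrasonic_ok and boundary_objects:
        return "ultrasonic"                 # third approach, needs landmarks
    return None                             # exit the parking function
```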
While the vehicle autonomously parks into a space, for example in the APA or RPA mode, the aggressive handling strategies for sensor failures are detailed in Tables 23 to 27. In general, during autonomous park-in, the vehicle travels at a low speed and only nearby targets need to be perceived.
When one or more of the front, left-side, right-side, or rear-view cameras fail, the aggressive handling strategies during autonomous park-in are shown in Table 23.
Table 23
Figure PCTCN2021121638-appb-000028
During autonomous park-in, fast, accurate recognition and modeling of the near and far environment and obstacles rely mainly on the lidars, fisheye cameras, and ultrasonic radars. The front, left-side, right-side, and rear-view cameras can be regarded as providing only auxiliary target recognition for the APA or RPA autonomous driving functions, and failure of one or more of them does not affect the realization of those functions. Therefore, every operating condition can keep the APA or RPA autonomous driving function.
When one or more of the fisheye cameras fail, the aggressive handling strategies during autonomous park-in are shown in Table 24.
Table 24
Fisheye camera failure | Normal operation | Degradation strategy | Exit
Parallel parking |  | Perceive with the ultrasonic radars |  
Reverse parking |  | Perceive with the ultrasonic radars |  
Angled parking |  | Perceive with the ultrasonic radars |  
The fisheye cameras are mainly used to perceive the nearby environment during parking. When the fisheye cameras fail, if surroundings such as vehicles, railings, or walls on both sides of the empty space can indicate its boundaries, the vehicle can, under each operating condition, switch to a degraded driving strategy that uses only the ultrasonic radars. With the surroundings assisting the ultrasonic radars in recognizing and modeling the nearby environment, each operating condition can still park the vehicle autonomously into the space, although recognition accuracy and modeling performance are reduced.
In some embodiments, when the fisheye cameras fail, the vehicle can also switch to a degraded driving strategy that uses the ultrasonic radars and lidars; each operating condition can still park the vehicle autonomously into the space.
Optionally, if no surroundings such as vehicles, railings, or walls on both sides of the empty space can indicate its boundaries, and/or the ultrasonic radars and fisheye cameras fail at the same time, every operating condition exits the current autonomous driving function.
When one or more of the forward or side lidars fail, the handling strategies during autonomous park-in are shown in Table 25.
Table 25
Figure PCTCN2021121638-appb-000029
The lidars, ultrasonic radars, and fisheye cameras can all detect the nearby environment during autonomous park-in. When one or more lidars fail, the key sub-function of nearby-environment detection can be realized with the ultrasonic radars and fisheye cameras; although recognition accuracy and modeling performance are reduced (the high-precision non-key sub-function cannot be realized), each operating condition can still park the vehicle autonomously into the space.
In some embodiments, when the ultrasonic radars and lidars fail at the same time, the key sub-function of nearby-environment detection can still be realized with the fisheye cameras; although recognition accuracy and modeling performance are reduced, each operating condition can still park the vehicle autonomously into the space.
When one or more of the forward, corner, or rearward mmWave radars fail, the aggressive handling strategies during autonomous park-in are shown in Table 26.
Table 26
Figure PCTCN2021121638-appb-000030
The mmWave radars provide auxiliary target recognition, and their failure can be considered not to affect the realization of any operating condition of the APA or RPA autonomous driving functions. With one or more mmWave radars failed, the vehicle's nearby environment can still be perceived during autonomous park-in. Therefore, under each operating condition, the vehicle can keep the APA or RPA autonomous driving function when one or more mmWave radars fail.
When one or more of the ultrasonic radars fail, the aggressive handling strategies during autonomous park-in are shown in Table 27.
Table 27
Ultrasonic radar failure | Normal operation | Degradation strategy | Exit
Parallel parking |  | Perceive with the fisheye cameras |  
Reverse parking |  | Perceive with the fisheye cameras |  
Angled parking |  | Perceive with the fisheye cameras |  
The ultrasonic radars are mainly used for accurate detection of the distance to nearby obstacles. When the ultrasonic radars fail, the nearby environment, such as obstacles, can be detected with the fisheye cameras; although detection accuracy drops, each operating condition can still park the vehicle autonomously into the space. Equivalently, when the ultrasonic radars fail, the key sub-function of nearby-obstacle detection can be realized with the fisheye cameras under each operating condition, but high detection accuracy (a non-key sub-function) cannot.
In some embodiments, when the ultrasonic radars fail, the nearby environment can also be detected with the fisheye cameras and lidars.
In some embodiments, when the lidars, fisheye cameras, and ultrasonic radars fail at the same time, high-precision detection of nearby obstacles cannot be realized under any operating condition, and the current autonomous driving function is exited.
During automatic cruising, for example in the ICA or NCA mode, the conservative handling strategies for failures of the sensors in Fig. 2 are detailed in Tables 28 to 30.
In the conservative strategies, the autonomous driving function can be considered more dependent on the sensors. When one or more sensors fail, exiting the autonomous driving function, that is, handing full control of the vehicle back to the driver, is preferred.
When one or more of the front cameras fail, the conservative handling strategies during automatic cruising are shown in Table 28.
Table 28
Figure PCTCN2021121638-appb-000031
In the conservative strategies, under any operating condition, the sub-function associated with each front camera can be considered a key sub-function, and the key sub-functions associated with the front cameras differ from one another. When one front camera fails, there is no group of non-failed front cameras associated with its sub-function (in other words, the forward information required for safe driving cannot be obtained), and its associated key sub-function cannot be realized. Therefore, the driving strategy for every operating condition is to exit the ICA or NCA autonomous driving function.
The driving strategy when two or more front cameras fail is the same as when one front camera fails.
When the forward mmWave radar fails, the conservative handling strategies during automatic cruising are shown in Table 29.
Table 29
Figure PCTCN2021121638-appb-000032
In the conservative strategies, under every operating condition of the ICA or NCA mode, the forward mmWave radar can be considered mainly responsible for distant forward target recognition and lead-vehicle speed detection. When it fails, its associated key sub-functions cannot be realized, and the driving strategy for every operating condition is to exit the current autonomous driving function.
During automatic cruising, for example under the ICA or NCA autonomous driving function, when one or more of the left-side, right-side, rear-view, or fisheye cameras, the forward or side lidars, the corner or rearward mmWave radars, or the ultrasonic radars fail, the conservative driving strategies are the same as the aggressive ones and are not repeated here.
While the vehicle searches for and drives toward a parking space, for example under the AVP autonomous driving function, when one or more of the front, left-side, right-side, rear-view, or fisheye cameras, the lidars, the mmWave radars, or the ultrasonic radars fail, the conservative handling strategies are the same as the aggressive ones and are not repeated here.
During autonomous park-in, for example under the APA or RPA autonomous driving function, the conservative handling strategies for failures of the sensors in Fig. 2 are detailed in Table 30 and in some of the tables above. In general, the vehicle travels at a low speed during autonomous park-in.
In the conservative strategies, parking-space search, positioning, and modeling can rely on the ordinary cameras and lidars, and obstacle recognition during parking can rely on the fisheye cameras and ultrasonic radars. The ordinary cameras, lidars, fisheye cameras, and ultrasonic radars are all key sensors of the system.
When one or more of the front, left-side, right-side, or rear-view cameras, the forward or side lidars, or the fisheye cameras fail, the conservative handling strategies during autonomous park-in are shown in Table 30.
Table 30
Figure PCTCN2021121638-appb-000033
When one or more of the front, left-side, right-side, or rear-view cameras or the lidars fail, one or more of the key sub-functions of parking-space search, accurate positioning, and high-precision environment modeling cannot be realized during autonomous park-in under any operating condition. Therefore, in that case the associated key sub-functions are lost, no operating condition can be realized, and the vehicle can exit the APA or RPA autonomous driving function.
When the fisheye cameras fail, under any operating condition, the key sub-function of accurately perceiving nearby obstacles during autonomous park-in can be considered unrealizable. Therefore, when the fisheye cameras fail, the driving strategy for every operating condition is to exit the APA or RPA autonomous driving function.
When the mmWave radars or ultrasonic radars fail, the conservative driving strategies during autonomous park-in are the same as the aggressive ones and are not repeated here.
When the current autonomous driving function is exited during autonomous driving because of sensor failures, the specific strategies are shown in Table 31.
Table 31
Figure PCTCN2021121638-appb-000034
At S630, the corresponding driving strategy is conveyed to the driver through the HMI.
It should be understood that, during autonomous driving, the HMI can convey the driving strategy corresponding to the failure of one or more sensors to the driver in different forms, so that the driver is informed of the sensor failure state and the related operations; see the description of Fig. 5 for details, which are not repeated here.
The vehicle control method provided by the embodiments of this application has been described above with reference to Figs. 1 to 6. The apparatus embodiments of this application are described below with reference to Figs. 7 and 8. It should be understood that the descriptions of the method embodiments and of the apparatus embodiments correspond to each other; for parts not described in detail, refer to the description above.
Fig. 7 is a schematic block diagram of the vehicle control apparatus provided by an embodiment of this application. The apparatus 700 includes an obtaining unit 710 and a processing unit 720. The obtaining unit 710 can implement the corresponding communication functions, and the processing unit 720 is used for data processing. The obtaining unit 710 may also be called a communication interface or communication unit. The apparatus 700 of Fig. 7 can perform each process of the foregoing method embodiments; to avoid repetition, details are not described again.
Optionally, the apparatus 700 may further include a storage unit 730, which may be used to store instructions and/or data. The processing unit 720 can read the instructions and/or data in the storage unit, so that the apparatus 700 implements the foregoing method embodiments.
The apparatus 700 may be used to perform the actions performed by the second node in the foregoing method embodiments. Specifically, the obtaining unit 710 is used to perform the obtaining-related operations on the second-node side, and the processing unit 720 is used to perform the processing-related operations on the second-node side.
The apparatus 700 may include units corresponding to each process of the method of Fig. 4, and the units of the apparatus 700 and the other operations and/or functions above are respectively intended to implement the corresponding flows of the method embodiment of Fig. 4.
When the apparatus 700 is used to perform the method 400 of Fig. 4, the obtaining unit 710 may be used to perform step 410 of the method 400, and the processing unit 720 may be used to perform step 420.
Specifically, the obtaining unit 710 is used to obtain sensor state information and autonomous driving function information; the processing unit 720 is used to determine the driving strategy corresponding to the failure of the one or more sensors under the first operating condition.
In a possible implementation, the processing unit 720 is specifically used to determine that the driving strategy corresponding to the failure of a first-class sensor under the first operating condition is to keep the current autonomous driving function; the one or more sensors include the first-class sensor, whose failure does not affect the realization of the first operating condition.
In a possible implementation, the processing unit 720 is specifically used to determine that the driving strategy corresponding to the failure of a second-class sensor under the first operating condition is to disable the non-key sub-function of the first operating condition; the one or more sensors include the second-class sensor, whose failure affects the realization of a non-key sub-function of the first operating condition.
In a possible implementation, the processing unit 720 is specifically used to determine that the driving strategy corresponding to the failure of a third-class sensor under the first operating condition is to exit the autonomous driving function; the one or more sensors include the third-class sensor, whose failure affects the realization of a key sub-function of the first operating condition.
In a possible implementation, the first-class sensors include a first sensor unit associated with a first sub-function, and there exist other non-failed sensor units associated with the first sub-function, the first sub-function being any one of the one or more sub-functions.
In a possible implementation, the first-class sensors further include a second sensor unit associated with a second auxiliary sub-function, the second auxiliary sub-function being any one of the one or more auxiliary sub-functions.
In a possible implementation, the second-class sensors include a third sensor unit associated with a third non-key sub-function, and there is no other failed sensor unit associated with the third non-key sub-function, the third non-key sub-function being any one of the one or more non-key sub-functions.
In a possible implementation, the second-class sensors include a fourth sensor unit associated with a fourth key sub-function and a fourth non-key sub-function, and there exist a surrounding environment that compensates for the fourth key sub-function and other non-failed sensor units that sense the surrounding environment, the surrounding environment including vehicles, railings, and the like; the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-key sub-function is any one of the one or more non-key sub-functions.
In a possible implementation, the third-class sensors include a fifth sensor unit associated with a fifth key sub-function, and there is no other non-failed sensor unit associated with the fifth key sub-function, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other non-failed sensor units that sense the surrounding environment; the fifth key sub-function is any one of the one or more key sub-functions of the first operating condition.
In a possible implementation, the third-class sensors further include a sixth sensor unit associated with a sixth key sub-function that cannot be compensated by the surrounding environment, the sixth key sub-function being any one of the one or more key sub-functions.
It should be understood that the specific processes by which each unit performs the corresponding steps have been described in detail in the method embodiments above and, for brevity, are not repeated here.
The processing unit 720 in Fig. 7 may be implemented by at least one processor or processor-related circuitry, the obtaining unit 710 by a transceiver or transceiver-related circuitry, and the storage unit 730 by at least one memory.
Fig. 8 is a schematic block diagram of the vehicle control device of an embodiment of this application. The vehicle control device 800 shown in Fig. 8 may include a memory 810, a processor 820, and a communication interface 830, which are connected through an internal connection path. The memory 810 is used to store instructions. The processor 820 is used to execute the instructions stored in the memory 810, so as to control the input/output interface 830 to receive/send at least part of the parameters of the second channel model. Optionally, the memory 810 may be coupled to the processor 820 through an interface or integrated with the processor 820.
It should be noted that the communication interface 830 uses a transceiving apparatus such as, but not limited to, a transceiver to implement communication between the communication device 800 and other devices or a communication network. The communication interface 830 may further include an input/output interface.
During implementation, each step of the foregoing method may be completed by an integrated logic circuit of hardware in the processor 820 or by instructions in the form of software. The method disclosed in the embodiments of this application may be directly embodied as being performed by a hardware processor, or performed by a combination of hardware and software modules in the processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 810; the processor 820 reads the information in the memory 810 and completes the steps of the foregoing method in combination with its hardware. To avoid repetition, details are not described here.
It should be understood that, in the embodiments of this application, the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should also be understood that, in the embodiments of this application, the memory may include a read-only memory and a random access memory and provide instructions and data to the processor. A part of the processor may further include a non-volatile random access memory. For example, the processor may further store device-type information.
It should be understood that the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
It should be understood that, in the various embodiments of this application, the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic and should not constitute any limitation on the implementation of the embodiments of this application.
An embodiment of this application further provides a computing device, including at least one processor and a memory, the at least one processor being coupled to the memory and used to read and execute the instructions in the memory to perform any one of the methods of Figs. 4 to 6.
An embodiment of this application further provides a computer-readable medium storing program code that, when run on a computer, causes the computer to perform any one of the methods of Figs. 4 to 6.
An embodiment of this application further provides a chip, including at least one processor and a memory, the at least one processor being coupled to the memory and used to read and execute the instructions in the memory to perform any one of the methods of Figs. 4 to 6.
An embodiment of this application further provides an autonomous driving vehicle, including at least one processor and a memory, the at least one processor being coupled to the memory and used to read and execute the instructions in the memory to perform any one of the methods of Figs. 4 to 6.
The terms "component", "module", "system", and the like used in this specification denote computer-related entities: hardware, firmware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable file, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computing device and the computing device itself may be components. One or more components may reside within a process and/or thread of execution, and a component may be located on one computer and/or distributed between two or more computers. In addition, these components may execute from various computer-readable media having various data structures stored thereon. The components may communicate by local and/or remote processes, for example according to a signal having one or more data packets (such as data from two components interacting with another component in a local system or a distributed system, and/or interacting with other systems by way of the signal across a network such as the Internet).
A person of ordinary skill in the art may realize that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on the particular application and the design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
A person skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the units is merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Further, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of this application, in essence or in the part contributing to the prior art or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely the specific implementation of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (26)

  1. A vehicle control method, characterized by comprising:
    obtaining sensor state information and autonomous driving function information, wherein the sensor state information comprises information indicating that one or more sensors are in a failure state, the autonomous driving function information is used to indicate a currently running autonomous driving function, and the autonomous driving function comprises multiple operating conditions; and
    determining, according to an impact of the failure of the one or more sensors on a first operating condition, a driving strategy corresponding to the failure of the one or more sensors under the first operating condition, wherein the first operating condition is any one of the multiple operating conditions.
  2. The vehicle control method according to claim 1, characterized in that the driving strategy is a multi-level degradation driving strategy comprising: keeping the autonomous driving function, disabling part of the autonomous driving function, and exiting the autonomous driving function.
  3. The vehicle control method according to claim 1 or 2, characterized in that realization of the first operating condition is related to one or more sub-functions, and the one or more sub-functions comprise at least one of the following: one or more key sub-functions, one or more non-key sub-functions, and one or more auxiliary sub-functions; the first operating condition is any one of the multiple operating conditions; loss of a key sub-function makes the first operating condition unachievable; loss of a non-key sub-function does not affect realization of the sub-functions of the first operating condition other than the non-key sub-function; and loss of an auxiliary sub-function does not affect realization of the first operating condition.
  4. The vehicle control method according to any one of claims 1-3, characterized in that the determining of the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises:
    determining that the driving strategy corresponding to failure of a first-class sensor under the first operating condition is to keep the autonomous driving function, wherein the one or more sensors comprise the first-class sensor, and failure of the first-class sensor does not affect realization of the first operating condition.
  5. The vehicle control method according to any one of claims 1-4, characterized in that the determining of the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises:
    determining that the driving strategy corresponding to failure of a second-class sensor under the first operating condition is to disable part of the autonomous driving function, wherein the one or more sensors comprise the second-class sensor, and failure of the second-class sensor affects realization of a non-key sub-function of the first operating condition.
  6. The vehicle control method according to any one of claims 1-5, characterized in that the determining of the driving strategy corresponding to the failure of the one or more sensors under the first operating condition comprises:
    determining that the driving strategy corresponding to failure of a third-class sensor under the first operating condition is to exit the autonomous driving function, wherein the one or more sensors comprise the third-class sensor, and failure of the third-class sensor affects realization of a key sub-function of the first operating condition.
  7. The vehicle control method according to claim 4, characterized in that the first-class sensor comprises a first sensor unit, the first sensor unit is associated with a first sub-function, other non-failed sensor units associated with the first sub-function exist, and the first sub-function is any one of the one or more sub-functions.
  8. The vehicle control method according to claim 4, characterized in that the first-class sensor further comprises a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.
  9. The vehicle control method according to claim 5, characterized in that the second-class sensor comprises a third sensor unit, the third sensor unit is associated with a third non-key sub-function, no other failed sensor unit associated with the third non-key sub-function exists, and the third non-key sub-function is any one of the one or more non-key sub-functions.
  10. The vehicle control method according to claim 5, characterized in that the second-class sensor further comprises a fourth sensor unit, the fourth sensor unit is associated with a fourth key sub-function and a fourth non-key sub-function, a surrounding environment that compensates for the fourth key sub-function and other non-failed sensor units that sense the surrounding environment exist, the surrounding environment comprises vehicles, railings, and the like, the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-key sub-function is any one of the one or more non-key sub-functions.
  11. The vehicle control method according to claim 6, characterized in that the third-class sensor comprises a fifth sensor unit, the fifth sensor unit is associated with a fifth key sub-function, and no other non-failed sensor unit associated with the fifth key sub-function exists, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other non-failed sensor units that sense the surrounding environment; and the fifth key sub-function is any one of the one or more key sub-functions.
  12. A vehicle control apparatus, characterized by comprising:
    an obtaining unit, configured to obtain sensor state information and autonomous driving function information, wherein the sensor state information comprises information indicating that one or more sensors are in a failure state, the autonomous driving function information is used to indicate a currently running autonomous driving function, and the autonomous driving function comprises multiple operating conditions; and
    a processing unit, configured to determine, according to an impact of the failure of the one or more sensors on a first operating condition, a driving strategy corresponding to the failure of the one or more sensors under the first operating condition, wherein the first operating condition is any one of the multiple operating conditions.
  13. The apparatus according to claim 12, characterized in that the driving strategy is a multi-level degradation driving strategy comprising: keeping the autonomous driving function, disabling part of the autonomous driving function, and exiting the autonomous driving function.
  14. The apparatus according to claim 12 or 13, characterized in that realization of the first operating condition is related to one or more sub-functions, and the one or more sub-functions comprise at least one of the following: one or more key sub-functions, one or more non-key sub-functions, and one or more auxiliary sub-functions; the first operating condition is any one of the multiple operating conditions; loss of a key sub-function makes the first operating condition unachievable; loss of a non-key sub-function does not affect realization of the sub-functions of the first operating condition other than the non-key sub-function; and loss of an auxiliary sub-function does not affect realization of the first operating condition.
  15. The apparatus according to any one of claims 12-14, characterized in that the processing unit is specifically configured to determine that the driving strategy corresponding to failure of a first-class sensor under the first operating condition is to keep the autonomous driving function, wherein the one or more sensors comprise the first-class sensor, and failure of the first-class sensor does not affect realization of the first operating condition.
  16. The apparatus according to any one of claims 12-15, characterized in that the processing unit is specifically configured to determine that the driving strategy corresponding to failure of a second-class sensor under the first operating condition is to disable part of the autonomous driving function, wherein the one or more sensors comprise the second-class sensor, and failure of the second-class sensor affects realization of a non-key sub-function of the first operating condition.
  17. The apparatus according to any one of claims 12-16, characterized in that the processing unit is specifically configured to determine that the driving strategy corresponding to failure of a third-class sensor under the first operating condition is to exit the autonomous driving function, wherein the one or more sensors comprise the third-class sensor, and failure of the third-class sensor affects realization of a key sub-function of the first operating condition.
  18. The apparatus according to claim 15, characterized in that the first-class sensor comprises a first sensor unit, the first sensor unit is associated with a first sub-function, other non-failed sensor units associated with the first sub-function exist, and the first sub-function is any one of the one or more sub-functions.
  19. The apparatus according to claim 15, characterized in that the first-class sensor further comprises a second sensor unit, the second sensor unit is associated with a second auxiliary sub-function, and the second auxiliary sub-function is any one of the one or more auxiliary sub-functions.
  20. The apparatus according to claim 16, characterized in that the second-class sensor comprises a third sensor unit, the third sensor unit is associated with a third non-key sub-function, no other failed sensor unit associated with the third non-key sub-function exists, and the third non-key sub-function is any one of the one or more non-key sub-functions.
  21. The apparatus according to claim 16, characterized in that the second-class sensor comprises a fourth sensor unit, the fourth sensor unit is associated with a fourth key sub-function and a fourth non-key sub-function, a surrounding environment that compensates for the fourth key sub-function and other non-failed sensor units that sense the surrounding environment exist, the surrounding environment comprises vehicles, railings, and the like, the fourth key sub-function is any one of the one or more key sub-functions, and the fourth non-key sub-function is any one of the one or more non-key sub-functions.
  22. The apparatus according to claim 17, characterized in that the third-class sensor comprises a fifth sensor unit, the fifth sensor unit is associated with a fifth key sub-function, and no other non-failed sensor unit associated with the fifth key sub-function exists, or at least one of the following is absent: a surrounding environment that compensates for the fifth key sub-function, or other non-failed sensor units that sense the surrounding environment; and the fifth key sub-function is any one of the one or more key sub-functions.
  23. A computing device, characterized by comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1 to 11.
  24. A computer-readable medium, characterized in that the computer-readable medium stores program code which, when run on a computer, causes the computer to perform the method according to any one of claims 1 to 11.
  25. A chip, characterized by comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1 to 11.
  26. An autonomous driving vehicle, characterized by comprising: at least one processor and a memory, wherein the at least one processor is coupled to the memory and is configured to read and execute instructions in the memory to perform the method according to any one of claims 1 to 11.
PCT/CN2021/121638 2021-09-29 2021-09-29 车辆控制的方法和装置 WO2023050129A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180039563.XA CN116209608A (zh) 2021-09-29 2021-09-29 车辆控制的方法和装置
PCT/CN2021/121638 WO2023050129A1 (zh) 2021-09-29 2021-09-29 车辆控制的方法和装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/121638 WO2023050129A1 (zh) 2021-09-29 2021-09-29 车辆控制的方法和装置

Publications (1)

Publication Number Publication Date
WO2023050129A1 true WO2023050129A1 (zh) 2023-04-06

Family

ID=85781013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121638 WO2023050129A1 (zh) 2021-09-29 2021-09-29 车辆控制的方法和装置

Country Status (2)

Country Link
CN (1) CN116209608A (zh)
WO (1) WO2023050129A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109910910A (zh) * 2019-03-13 2019-06-21 浙江吉利汽车研究院有限公司 一种车辆控制系统及方法
CN110386148A (zh) * 2019-06-26 2019-10-29 北京汽车集团有限公司 自动驾驶车辆的控制方法、装置和车辆
US20190382031A1 (en) * 2018-06-18 2019-12-19 Baidu Usa Llc Methods for handling sensor failures in autonomous driving vehicles
CN112572465A (zh) * 2019-09-12 2021-03-30 中车时代电动汽车股份有限公司 一种智能驾驶汽车感知系统故障处理方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190382031A1 (en) * 2018-06-18 2019-12-19 Baidu Usa Llc Methods for handling sensor failures in autonomous driving vehicles
CN109910910A (zh) * 2019-03-13 2019-06-21 浙江吉利汽车研究院有限公司 一种车辆控制系统及方法
CN110386148A (zh) * 2019-06-26 2019-10-29 北京汽车集团有限公司 自动驾驶车辆的控制方法、装置和车辆
CN112572465A (zh) * 2019-09-12 2021-03-30 中车时代电动汽车股份有限公司 一种智能驾驶汽车感知系统故障处理方法

Also Published As

Publication number Publication date
CN116209608A (zh) 2023-06-02

Similar Documents

Publication Publication Date Title
US11845472B2 (en) Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
US11203337B2 (en) Vehicle with autonomous driving capability
CN110775063B (zh) 一种车载设备的信息显示方法、装置及车辆
US10444754B2 (en) Remote assistance for an autonomous vehicle in low confidence situations
US10437257B2 (en) Autonomous driving system
US11010624B2 (en) Traffic signal recognition device and autonomous driving system
US20170313321A1 (en) Vehicle control system, vehicle control method, and vehicle control program
JP2021144732A (ja) 制御装置、制御方法、制御プログラム及び制御システム
JP6692986B2 (ja) 車両制御システム、車両制御方法、および車両制御プログラム
JP2015516623A (ja) 他の車両の予測挙動に基づく自律型車両の挙動の変更
US20240106987A1 (en) Multi-Sensor Assembly with Improved Backward View of a Vehicle
WO2023050129A1 (zh) 车辆控制的方法和装置
CN116203938A (zh) 用于运载工具传感器管理的系统、方法和存储介质
EP4397561A1 (en) Vehicle control method and apparatus
JP7340669B2 (ja) 制御装置、制御方法、制御プログラム及び制御システム
WO2023061013A1 (zh) 自动泊车方法和装置
WO2023010267A1 (zh) 确定泊出方向的方法和装置
JP2023046107A (ja) 車両の運転支援装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21958719

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021958719

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2021958719

Country of ref document: EP

Effective date: 20240404

NENP Non-entry into the national phase

Ref country code: DE