WO2020103515A1 - Method and apparatus for controlling an unmanned vehicle - Google Patents

Method and apparatus for controlling an unmanned vehicle

Info

Publication number
WO2020103515A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unmanned vehicle
information
unmanned
control instruction
Prior art date
Application number
PCT/CN2019/103252
Other languages
English (en)
French (fr)
Inventor
王月
吴泽琳
薛晶晶
刘颖楠
饶文龙
王子杰
龚伟
Original Assignee
百度在线网络技术(北京)有限公司 (Baidu Online Network Technology (Beijing) Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 百度在线网络技术(北京)有限公司 (Baidu Online Network Technology (Beijing) Co., Ltd.)
Priority to EP19887694.8A (published as EP3756966A4)
Priority to JP2020550098A (published as JP7236454B2)
Publication of WO2020103515A1
Priority to US17/024,629 (published as US11511774B2)

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
      • B60 VEHICLES IN GENERAL
        • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
          • B60W10/00 Conjoint control of vehicle sub-units of different type or different function
          • B60W10/18 Conjoint control of vehicle sub-units of different type or different function including control of braking systems
          • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
          • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
          • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
          • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
          • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
          • B60W50/0205 Diagnosing or detecting failures; Failure detection models
          • B60W50/04 Monitoring the functioning of the control system
          • B60W50/045 Monitoring control system parameters
          • B60W50/08 Interaction between the driver and the control system
          • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
          • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
          • B60W60/001 Planning or execution of driving tasks
          • B60W60/0015 Planning or execution of driving tasks specially adapted for safety
          • B60W60/0016 Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
          • B60W60/0018 Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions
          • B60W60/00186 Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions related to the vehicle
          • B60W60/007 Emergency override
          • B60W2040/0809 Driver authorisation; Driver identity check
          • B60W2040/0818 Inactivity or incapacity of driver
          • B60W2050/0001 Details of the control system
          • B60W2050/0002 Automatic control, details of type of controller or control system architecture
          • B60W2050/0004 In digital systems, e.g. discrete-time systems involving sampling
          • B60W2050/0005 Processor details or data handling, e.g. memory registers or chip architecture
          • B60W2050/046 Monitoring control system parameters involving external transmission of data to or from the vehicle, e.g. via telemetry, satellite, Global Positioning System [GPS]
          • B60W2050/143 Alarm means
          • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
          • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
          • B60W2420/403 Image sensing, e.g. optical camera
          • B60W2530/00 Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
          • B60W2540/00 Input parameters relating to occupants
          • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
          • B60W2540/229 Attention level, e.g. attentive to driving, reading or sleeping
          • B60W2540/26 Incapacity
          • B60W2555/00 Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
          • B60W2556/00 Input parameters relating to data
          • B60W2556/10 Historical data
          • B60W2556/45 External transmission of data to or from the vehicle
          • B60W2710/00 Output or target parameters relating to a particular sub-units
          • B60W2710/18 Braking system
          • B60W2756/00 Output or target parameters relating to data
          • B60W2756/10 Involving external transmission of data to or from the vehicle
    • G PHYSICS
      • G05 CONTROLLING; REGULATING
        • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
          • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
          • G05D1/02 Control of position or course in two dimensions
          • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
          • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
          • G05D1/028 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
          • G05D1/0282 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal generated in a local control room
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V10/00 Arrangements for image or video recognition or understanding
          • G06V10/94 Hardware or software architectures specially adapted for image or video understanding
          • G06V10/95 Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
          • G06V20/00 Scenes; Scene-specific elements
          • G06V20/50 Context or environment of the image
          • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
          • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
      • G07 CHECKING-DEVICES
        • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
          • G07C5/00 Registering or indicating the working of vehicles
          • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
          • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
          • G07C5/0808 Diagnosing performance data
      • G08 SIGNALLING
        • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
          • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
          • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
          • G08B25/016 Personal emergency signalling and security systems

Definitions

  • The embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for controlling an unmanned vehicle.
  • Unmanned vehicles can reduce problems caused by human error (such as drunk driving, speeding, and fatigued driving) and can also reduce the workload of drivers.
  • the embodiment of the present application proposes a method and device for controlling an unmanned vehicle.
  • an embodiment of the present application provides a method for controlling an unmanned vehicle.
  • The method includes: receiving environmental information sent by the unmanned vehicle, where the environmental information includes environment information outside the vehicle; determining, according to the environment information outside the vehicle and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an abnormal operation state; and, in response to determining that the unmanned vehicle is in an abnormal operation state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control braking of the unmanned vehicle, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
  • In some embodiments, before the receiving of the environmental information sent by the unmanned vehicle, the method further includes: receiving a vehicle control request sent by the unmanned vehicle.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the method further includes: determining, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and, in response to determining that the unmanned vehicle does not contain the predetermined passenger, sending a first control instruction to the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
  • In some embodiments, the method further includes: in response to determining that the unmanned vehicle contains a predetermined passenger, determining whether the predetermined passenger is in an abnormal state; and, in response to determining that the predetermined passenger is in an abnormal state, sending a second control instruction to the unmanned vehicle, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the method further includes: determining, based on the in-vehicle video information, whether a passenger with abnormal behavior is present in the unmanned vehicle; and, in response to determining that the unmanned vehicle contains a passenger with abnormal behavior, sending a third control instruction to the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information, and in-vehicle temperature information, and the method further includes: determining, according to the in-vehicle video information, the in-vehicle smoke concentration information, and/or the in-vehicle temperature information, whether item burning occurs in the unmanned vehicle; and, in response to determining that item burning occurs in the unmanned vehicle, sending a fourth control instruction to the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for item burning.
  • An embodiment of the present application provides an apparatus for controlling an unmanned vehicle. The apparatus includes: a first receiving unit configured to receive environmental information sent by the unmanned vehicle, where the environmental information includes environment information outside the vehicle; a first determining unit configured to determine, according to the environment information outside the vehicle and operation information of the operation performed by the unmanned vehicle, whether the unmanned vehicle is in an abnormal operation state; and a sending unit configured to, in response to determining that the unmanned vehicle is in an abnormal operation state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control braking of the unmanned vehicle, and the data acquisition instruction is used to acquire data of the driving recorder in the unmanned vehicle.
  • the device further includes: a second receiving unit configured to receive a vehicle control request sent by an unmanned vehicle.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the device further includes: a second determining unit configured to determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and a first control instruction sending unit configured to, in response to determining that the unmanned vehicle does not contain the predetermined passenger, send a first control instruction to the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
  • In some embodiments, the apparatus further includes: a third determining unit configured to, in response to determining that the unmanned vehicle contains a predetermined passenger, determine whether the predetermined passenger is in an abnormal state; and a second control instruction sending unit configured to, in response to determining that the predetermined passenger is in an abnormal state, send a second control instruction to the unmanned vehicle, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the device further includes: a fourth determining unit configured to determine, based on the in-vehicle video information, whether the unmanned vehicle contains a passenger with abnormal behavior; and a third control instruction sending unit configured to, in response to determining that the unmanned vehicle contains a passenger with abnormal behavior, send a third control instruction to the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information, and in-vehicle temperature information, and the device further includes: a fifth determining unit configured to determine, based on the in-vehicle video information, the in-vehicle smoke concentration information, and/or the in-vehicle temperature information, whether item burning occurs in the unmanned vehicle; and a fourth control instruction sending unit configured to, in response to determining that item burning occurs in the unmanned vehicle, send a fourth control instruction to the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for item burning.
  • An embodiment of the present application provides a server including: one or more processors; and a storage device on which one or more programs are stored, where, when the one or more programs are executed by the one or more processors, the one or more processors implement the method described in any implementation of the first aspect.
  • An embodiment of the present application provides a computer-readable medium on which a computer program is stored, where the computer program, when executed by a processor, implements the method described in any one of the implementations of the first aspect.
  • The method and apparatus for controlling an unmanned vehicle provided by the embodiments of the present application first receive environmental information sent by the unmanned vehicle, where the environmental information includes environment information outside the vehicle; then determine, according to the environment information outside the vehicle and the operation performed by the unmanned vehicle, whether the unmanned vehicle is in an abnormal operation state; and, in response to determining that the unmanned vehicle is in an abnormal operation state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle. In this way, when the unmanned vehicle is in an abnormal operation state, the vehicle is braked in time to improve safety, and the data of the driving recorder in the vehicle is obtained in time, thereby improving the efficiency of exception handling.
  • FIG. 1 is an exemplary system architecture diagram to which an embodiment of the present application can be applied;
  • FIG. 2 is a flowchart of an embodiment of a method for controlling an unmanned vehicle according to the present application
  • FIG. 3 is a schematic diagram of an application scenario of a method for controlling an unmanned vehicle according to the present application
  • FIG. 4 is a flowchart of another embodiment of a method for controlling an unmanned vehicle according to the present application.
  • FIG. 5 is a schematic structural diagram of an embodiment of an apparatus for controlling an unmanned vehicle according to the present application.
  • FIG. 6 is a schematic structural diagram of a computer system suitable for implementing the server of the embodiment of the present application.
  • FIG. 1 shows an exemplary system architecture 100 to which the method for controlling an unmanned vehicle or the apparatus for controlling an unmanned vehicle of embodiments of the present application may be applied.
  • the system architecture 100 may include driverless vehicles 101, 102 and 103, a network 104 and a server 105.
  • the network 104 serves as a medium for providing a communication link between the unmanned vehicles 101, 102, 103 and the server 105.
  • the network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, and so on.
  • the unmanned vehicles 101, 102, 103 interact with the server 105 through the network 104 to receive or send messages, and so on.
  • the unmanned vehicles 101, 102, and 103 may be equipped with various information collection devices, such as image collection devices, binocular cameras, radar detectors, sensors, and so on.
  • the above-mentioned information collection device can be used to collect the outside environment information and the inside environment information of the unmanned vehicles 101, 102, 103.
  • Unmanned vehicles 101, 102, and 103 can also be equipped with an in-vehicle intelligent brain.
  • The in-vehicle intelligent brain can receive the information collected by the above-mentioned information collection devices, analyze and process the information, and then control the unmanned vehicles 101, 102, and 103 to perform corresponding operations (for example, continue driving, emergency stop, etc.).
  • the unmanned vehicles 101, 102, 103 may be various types of vehicles, including but not limited to large passenger cars, tractors, city buses, medium passenger cars, large trucks, small cars, and so on.
  • the server 105 may be a server that provides various services, such as a background server that processes environmental information sent by the unmanned vehicles 101, 102, and 103.
  • the background server can perform various analysis processes on the received environmental information, and send instructions to the unmanned vehicles 101, 102, 103 according to the processing results to control the unmanned vehicles 101, 102, 103.
  • the server 105 may be hardware or software. When the server 105 is hardware, it can be implemented as a distributed server cluster composed of multiple servers or as a single server. When the server 105 is software, it may be implemented as multiple software or software modules (for example, to provide distributed services), or as a single software or software module. There is no specific limit here.
  • the method for controlling the unmanned vehicle provided by the embodiment of the present application is generally executed by the server 105, and accordingly, the device for controlling the unmanned vehicle is generally provided in the server 105.
  • terminal devices, networks, and servers in FIG. 1 are only schematic. According to the implementation needs, there can be any number of terminal devices, networks and servers.
  • the method for controlling an unmanned vehicle includes the following steps:
  • Step 201 Receive environmental information sent by an unmanned vehicle.
  • The execution subject of the method for controlling an unmanned vehicle (for example, the server 105 shown in FIG. 1) can receive, through a wireless connection, the environmental information sent by an unmanned vehicle (for example, the unmanned vehicles 101, 102, and 103 shown in FIG. 1).
  • the above-mentioned environment information may include the outside environment information of the driverless vehicle.
  • the above-mentioned environment information outside the vehicle may be information about the environment outside the vehicle collected by various information collection devices installed on the unmanned vehicle. For example, it may be information about the outside environment collected by a binocular camera or a radar detector installed in an unmanned vehicle.
  • Step 202 Determine whether the unmanned vehicle is in an abnormal operation state according to the environment information outside the vehicle and the operation performed by the unmanned vehicle.
  • the driverless vehicle needs to perform various operations during driving, for example, acceleration, deceleration, braking, turning, and so on.
  • the unmanned vehicle can send operation information (for example, turning direction and angle, braking force, etc.) of the performed operation to the above-mentioned execution subject in real time.
  • the above-mentioned execution subject can determine whether the above-mentioned unmanned vehicle is in an abnormal operation state according to the outside environment information received in step 201 and operation information of the operation performed by the unmanned vehicle.
  • the above-mentioned execution body may pre-store the correspondence between the outside environment information and the operation, or may also pre-store the judgment rule for determining the operation according to the outside environment information.
  • the above-mentioned execution subject can predict the operation to be performed by the unmanned vehicle based on the outside environment information sent by the unmanned vehicle, and use the predicted operation as the prediction operation.
  • the above-mentioned execution subject may determine whether the predicted operation matches the operation (ie, the actual operation) corresponding to the operation information sent by the driverless vehicle (for example, the same or similar). If there is no match, it can be considered that the driverless vehicle is in an abnormal state of operation.
  • The environment information outside the vehicle may include information about obstacles (e.g., buildings, pedestrians, other vehicles, etc.), such as the size of an obstacle and the distance to the obstacle.
  • The above-mentioned execution subject can predict, based on the obstacle information, the operation the unmanned vehicle should perform to avoid the obstacle, for example, turning by a certain angle in a certain direction. It can then judge whether the predicted operation matches the actual operation of the unmanned vehicle. If there is no match, for example, the predicted operation is "turning by a certain angle in a certain direction" while the actual operation of the unmanned vehicle is "forward acceleration", the unmanned vehicle is determined to be in an abnormal operation state.
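  • The matching step described above can be made concrete with a minimal Python sketch. The Operation structure, the obstacle-avoidance rule, and the tolerance values below are illustrative assumptions and are not taken from this application.
```python
# Minimal sketch (all names and thresholds assumed) of comparing the operation
# predicted from the out-of-vehicle environment information with the operation
# actually reported by the vehicle.
from dataclasses import dataclass


@dataclass
class Operation:
    kind: str               # e.g. "turn", "brake", "accelerate"
    direction: str = ""     # e.g. "left", "right" (only meaningful for turns)
    magnitude: float = 0.0  # steering angle in degrees, braking force, etc.


def predict_operation(obstacle_distance_m: float, obstacle_bearing: str) -> Operation:
    """Toy rule: steer away from a close obstacle, otherwise keep going."""
    if obstacle_distance_m < 20.0:
        away = "left" if obstacle_bearing == "right" else "right"
        return Operation(kind="turn", direction=away, magnitude=15.0)
    return Operation(kind="accelerate")


def is_abnormal(predicted: Operation, actual: Operation, angle_tol_deg: float = 10.0) -> bool:
    """Treat the vehicle as abnormal when the reported operation is neither the
    same as nor similar to the predicted one."""
    if predicted.kind != actual.kind:
        return True
    if predicted.kind == "turn":
        return (predicted.direction != actual.direction
                or abs(predicted.magnitude - actual.magnitude) > angle_tol_deg)
    return False


# Example: an evasive turn is expected, but the vehicle reports forward acceleration.
predicted = predict_operation(obstacle_distance_m=12.0, obstacle_bearing="right")
actual = Operation(kind="accelerate")
assert is_abnormal(predicted, actual)  # abnormal operation state detected
```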
  • Step 203 In response to determining that the driverless vehicle is in an abnormal operation state, send a brake control instruction and a data acquisition instruction to the driverless vehicle.
  • the above-mentioned execution subject may send a brake control instruction and a data acquisition instruction to the unmanned vehicle.
  • The above brake control instruction may be used to control braking of the above-mentioned unmanned vehicle.
  • the above-mentioned execution subject may send different braking control instructions to the driverless vehicle according to the difference between the predicted operation and the actual operation. For example, when the error between the predicted operation and the actual operation is large, a brake control command for emergency braking may be sent to the unmanned vehicle. When the error between the predicted operation and the actual operation is small, a brake control command for slow braking can be sent to the unmanned vehicle.
  • the above-mentioned executive body can also find the nearest parking spot through various methods (such as querying a high-precision map) to control the driverless vehicle to park at a safe location.
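  • The following is a hedged sketch of how graded brake control instructions and the data acquisition instruction might be constructed; the threshold and the message fields are assumptions made for illustration only.
```python
# Hypothetical instruction builders; the dict format is not a real protocol.
def build_brake_instruction(prediction_error: float, hard_brake_threshold: float = 0.5) -> dict:
    # Large deviation between predicted and actual operation -> emergency braking,
    # small deviation -> slow braking.
    mode = "emergency" if prediction_error > hard_brake_threshold else "slow"
    return {"type": "brake_control", "mode": mode}


def build_data_acquisition_instruction() -> dict:
    # Requests the driving recorder ("black box") data from the vehicle.
    return {"type": "data_acquisition", "target": "driving_recorder"}
```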
  • the above data acquisition instruction can be used to acquire data of a driving recorder in an unmanned vehicle.
  • A car driving recorder, commonly known as a car black box, is a digital electronic recording device that records and stores the driving speed, time, mileage, and other state information about the driving of the vehicle, and can output the data through an interface.
  • the above-mentioned executive body may forward the acquired data to a preset device, for example, a terminal used by a technician who analyzes and processes the abnormal operation state of the unmanned vehicle.
  • the above-mentioned executive body may also analyze and process the acquired data to obtain the cause of the abnormal operation of the unmanned vehicle, and display or send the obtained reason to a preset device.
  • the above-mentioned environment information may include in-vehicle environment information
  • the above-mentioned in-vehicle environment information may include in-vehicle video information
  • the above-mentioned video information in the vehicle may be a video collected by a video collection device installed in the above-mentioned unmanned vehicle.
  • the above method for controlling an unmanned vehicle may also include the following:
  • the execution subject may determine whether the unmanned vehicle includes a predetermined passenger based on the in-vehicle video information.
  • the above-mentioned predetermined passenger may refer to a person responsible for the safety of the unmanned vehicle on the unmanned vehicle, for example, a safety officer.
  • In practice, in order to ensure safety in public places, security officers are usually arranged on city buses, coaches, and the like, and the security officers wear uniforms.
  • face information of a predetermined passenger may be pre-stored in the above-mentioned executive body.
  • the above-mentioned execution subject can perform face detection and face recognition on passengers in the video information in the vehicle. According to the processing result, it is determined whether there is a predetermined passenger in the driverless vehicle. It should be noted that processing such as face detection and face recognition is a well-known technology that has been widely researched and applied at present, and will not be repeated here.
  • In response to determining that the unmanned vehicle does not contain the predetermined passenger, the above-mentioned execution subject may send the first control instruction to the unmanned vehicle.
  • the above-mentioned first control instruction can be used to control the driverless vehicle to send warning information.
  • the first control instruction may control the unmanned vehicle to send an alarm message to a predetermined device (for example, a terminal used by a person responsible for vehicle safety) to notify the user of the device that there is no scheduled passenger in the unmanned vehicle.
  • The alarm information may include information such as the identity and location of the above-mentioned unmanned vehicle, so that the relevant personnel can quickly locate the above-mentioned unmanned vehicle.
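  • A rough, runnable sketch of the presence check is given below. Face detection and embedding extraction are abstracted away (faces are represented as embedding vectors), and the similarity threshold and instruction payload are assumptions rather than details of this application.
```python
# Sketch of the "is the safety officer on board?" check; numpy is used only for
# a cosine-similarity comparison of assumed face embeddings.
import numpy as np


def predetermined_passenger_present(detected_faces, officer_faces, threshold=0.8) -> bool:
    """Return True if any detected face matches a registered safety officer."""
    for face in detected_faces:
        for officer in officer_faces:
            sim = float(np.dot(face, officer) /
                        (np.linalg.norm(face) * np.linalg.norm(officer)))
            if sim >= threshold:
                return True
    return False


def check_officer_presence(detected_faces, officer_faces, send_instruction, vehicle_id):
    if not predetermined_passenger_present(detected_faces, officer_faces):
        # First control instruction: the vehicle sends alarm information
        # (its identity and location) to a predetermined device.
        send_instruction(vehicle_id, {"type": "first_control_instruction",
                                      "action": "send_alarm"})
```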
  • the above method for controlling an unmanned vehicle may further include the following:
  • the above-mentioned executive body may further determine whether the predetermined passenger is in an abnormal state.
  • the execution subject may perform human motion recognition on the predetermined passenger based on the in-vehicle video information, thereby recognizing the motion of the predetermined passenger.
  • According to the recognized action, it is determined whether the predetermined passenger is in an abnormal state.
  • the abnormal state may refer to a non-working state in which actions such as operating a mobile phone and sleeping are performed. It should be noted that human motion recognition is a well-known technology that has been widely researched and applied at present, and will not be repeated here.
  • In response to determining that the predetermined passenger is in an abnormal state, a second control instruction is sent to the unmanned vehicle.
  • the second control instruction may be used to control the unmanned vehicle to play preset prompt information.
  • The prompt information can be used to remind the above-mentioned predetermined passenger that being in an abnormal state may place the unmanned vehicle and the other passengers in an unsafe state. Through this implementation, a prompt can be issued promptly when the predetermined passenger is in an abnormal state, thereby improving the safety of the unmanned vehicle and the other passengers.
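  • A minimal sketch of this state check follows, assuming that human motion recognition has already produced per-frame action labels; the label set, the persistence window, and the instruction payload are assumptions.
```python
# Hedged sketch: flag a non-working state when a non-working action persists.
NON_WORKING_ACTIONS = {"using_phone", "sleeping"}


def officer_in_abnormal_state(recent_action_labels, min_consecutive: int = 30) -> bool:
    streak = 0
    for label in recent_action_labels:
        streak = streak + 1 if label in NON_WORKING_ACTIONS else 0
        if streak >= min_consecutive:
            return True
    return False


def check_officer_state(recent_action_labels, send_instruction, vehicle_id):
    if officer_in_abnormal_state(recent_action_labels):
        # Second control instruction: have the vehicle play preset prompt information.
        send_instruction(vehicle_id, {"type": "second_control_instruction",
                                      "action": "play_prompt"})
```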
  • The above-mentioned environment information may include in-vehicle environment information, and the above-mentioned in-vehicle environment information may include in-vehicle video information.
  • the above method for controlling an unmanned vehicle may further include:
  • the above-mentioned executive body can recognize the actions of various passengers in the unmanned vehicle based on the above-mentioned video information in the vehicle, and determine whether there are any passengers who perform predetermined abnormal actions (such as dangerous actions) based on the actions.
  • A passenger who performs a predetermined abnormal action is determined to be a passenger with abnormal behavior.
  • the above-mentioned execution subject may send a third control instruction to the unmanned vehicle.
  • the above-mentioned third control instruction may be used to control the above-mentioned unmanned vehicle to perform a predetermined emergency operation for passenger behavior abnormality.
  • The above-mentioned emergency operation for abnormal passenger behavior may be preset, for example, sending alarm information including the vehicle identification and location to the public security organ.
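  • As a rough illustration, the mapping from recognized passenger actions to the third control instruction might look like the sketch below; the action labels and the payload fields are hypothetical.
```python
# Hypothetical set of predetermined abnormal (dangerous) actions.
DANGEROUS_ACTIONS = {"fighting", "tampering_with_equipment", "brandishing_object"}


def check_passenger_behavior(passenger_actions, send_instruction, vehicle_id, location):
    """`passenger_actions` maps a passenger id to the action label recognized from
    the in-vehicle video. If any passenger performs a predetermined abnormal
    action, the third control instruction is sent."""
    offenders = [pid for pid, action in passenger_actions.items()
                 if action in DANGEROUS_ACTIONS]
    if offenders:
        send_instruction(vehicle_id, {
            "type": "third_control_instruction",
            "action": "passenger_behavior_emergency",  # e.g. alarm to the public security organ
            "location": location,
            "passengers": offenders,
        })
```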
  • the environmental information may include in-vehicle environmental information
  • the in-vehicle environmental information may include in-vehicle video information, in-vehicle smoke concentration information, and in-vehicle temperature information;
  • the above method for controlling an unmanned vehicle may further include:
  • the above-mentioned executive body may determine whether the burning of objects occurs in the above-mentioned unmanned vehicle according to the video information in the vehicle, the smoke concentration information in the vehicle, and / or the temperature information in the vehicle.
  • the smoke concentration information in the vehicle and the temperature information in the vehicle may be collected by a sensor installed in an unmanned vehicle.
  • The above-mentioned execution subject may detect, based on the in-vehicle video information, whether flames appear in the unmanned vehicle, and determine whether item burning occurs in the unmanned vehicle according to the flame detection result, the in-vehicle smoke concentration information, and/or the in-vehicle temperature information.
  • That is, the above-mentioned execution subject can determine whether item burning occurs according to any one of the in-vehicle video information, the in-vehicle smoke concentration information, and the in-vehicle temperature information, according to any two of them, or by integrating all three.
  • the above-mentioned execution subject may send a fourth control instruction to the above-mentioned unmanned vehicle.
  • the above-mentioned fourth control command may be used to control the above-mentioned unmanned vehicle to perform a predetermined item burning emergency operation.
  • the above item burning emergency operation may be preset, for example, sending the item burning alarm information including the vehicle identification and location to the fire department.
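  • One possible way to fuse the three in-vehicle signals is sketched below; the thresholds and the simple two-out-of-three rule are illustrative assumptions rather than the method of this application.
```python
# Hedged sketch of deciding whether an item is burning inside the vehicle.
def item_burning_detected(flame_seen_in_video: bool, smoke_concentration: float,
                          cabin_temperature_c: float,
                          smoke_threshold: float = 0.12,
                          temperature_threshold_c: float = 60.0) -> bool:
    indicators = [
        flame_seen_in_video,
        smoke_concentration > smoke_threshold,
        cabin_temperature_c > temperature_threshold_c,
    ]
    return sum(indicators) >= 2  # require agreement of at least two signals


def check_fire(flame, smoke, temp, send_instruction, vehicle_id):
    if item_burning_detected(flame, smoke, temp):
        # Fourth control instruction: perform the predetermined emergency
        # operation for item burning (e.g. alert the fire department).
        send_instruction(vehicle_id, {"type": "fourth_control_instruction",
                                      "action": "item_burning_emergency"})
```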
  • FIG. 3 is a schematic diagram of an application scenario of the method for controlling an unmanned vehicle according to this embodiment.
  • The server 301 receives environmental information sent by the unmanned vehicle 302, where the environmental information includes environment information outside the vehicle. After that, the server 301 determines whether the unmanned vehicle 302 is in an abnormal operation state according to the environment information outside the vehicle and the operation information of the operation performed by the unmanned vehicle 302. Finally, in response to determining that the unmanned vehicle 302 is in an abnormal operation state, the server 301 sends a brake control instruction and a data acquisition instruction to the unmanned vehicle 302, where the brake control instruction is used to control braking of the unmanned vehicle and the data acquisition instruction is used to acquire the data of the driving recorder in the unmanned vehicle.
  • The method provided by the above embodiment of the present application thus brakes the unmanned vehicle in time when it is in an abnormal operation state, improving safety, and at the same time obtains the data of the driving recorder in the unmanned vehicle in time, thereby improving the efficiency of exception handling.
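  • Under the same illustrative assumptions as the earlier sketches, this scenario can be summarized as a single hypothetical server-side handler; the message format and the helper callbacks are not taken from this application.
```python
# Hypothetical orchestration: receive a message from the vehicle, check for an
# abnormal operation state, and reply with the two instructions when needed.
def handle_environment_message(message, predict_operation, is_abnormal, send_instruction):
    vehicle_id = message["vehicle_id"]
    predicted = predict_operation(message["outside_environment"])  # from out-of-vehicle info
    actual = message["reported_operation"]                         # operation info from the vehicle
    if is_abnormal(predicted, actual):
        send_instruction(vehicle_id, {"type": "brake_control", "mode": "emergency"})
        send_instruction(vehicle_id, {"type": "data_acquisition", "target": "driving_recorder"})
```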
  • FIG. 4 shows a flow 400 of yet another embodiment of a method for controlling an unmanned vehicle.
  • the process 400 of the method for controlling an unmanned vehicle includes the following steps:
  • Step 401 Receive a vehicle control request sent by an unmanned vehicle.
  • The execution subject of the method for controlling an unmanned vehicle (for example, the server 105 shown in FIG. 1) can receive, through a wireless connection, the vehicle control request sent by an unmanned vehicle (for example, the unmanned vehicles 101, 102, and 103 shown in FIG. 1).
  • the above vehicle control request is used to request the execution subject to control the unmanned vehicle.
  • The above-mentioned vehicle control request may be sent by a passenger in the unmanned vehicle by triggering a preset device (for example, a preset button). For example, when a passenger feels that the unmanned vehicle is driving abnormally, the passenger may trigger the preset device to request the execution subject to control the unmanned vehicle from the cloud, so as to ensure the driving safety of the unmanned vehicle.
  • the above-mentioned vehicle control request may be sent by an unmanned vehicle.
  • When the unmanned vehicle determines, according to vehicle parameters (for example, acceleration, speed, etc.), that a driving abnormality has occurred, it may send a vehicle control request to request the execution subject to control the unmanned vehicle from the cloud, so as to ensure the driving safety of the unmanned vehicle.
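  • The two trigger paths for a vehicle control request described above can be sketched as follows; the parameter thresholds and the request format are assumptions for illustration.
```python
# Hypothetical vehicle-side trigger for requesting cloud control.
def should_request_cloud_control(button_pressed: bool, speed_kmh: float,
                                 accel_mps2: float, max_speed_kmh: float = 120.0,
                                 max_abs_accel: float = 8.0) -> bool:
    if button_pressed:  # passenger pressed the preset device/button
        return True
    # Vehicle-detected driving abnormality based on its own parameters.
    return speed_kmh > max_speed_kmh or abs(accel_mps2) > max_abs_accel


def build_vehicle_control_request(vehicle_id: str) -> dict:
    # Sent to the server to request that it take over control of the vehicle.
    return {"type": "vehicle_control_request", "vehicle_id": vehicle_id}
```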
  • Step 402 Receive environmental information sent by an unmanned vehicle.
  • step 402 is similar to step 201 in the embodiment shown in FIG. 2 and will not be repeated here.
  • Step 403 Determine whether the unmanned vehicle is in an abnormal operation state according to the environment information outside the vehicle and the operation information of the operation performed by the unmanned vehicle.
  • step 403 is similar to step 202 in the embodiment shown in FIG. 2 and will not be repeated here.
  • Step 404 In response to determining that the unmanned vehicle is in an abnormal operation state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle.
  • step 404 is similar to step 203 in the embodiment shown in FIG. 2 and will not be repeated here.
  • The process 400 of the method for controlling an unmanned vehicle in this embodiment highlights the step of receiving a vehicle control request sent by an unmanned vehicle. Therefore, the solution described in this embodiment controls the unmanned vehicle that sends the vehicle control request, so that the execution subject can quickly determine which unmanned vehicle needs to be controlled, making the control more targeted and improving control efficiency.
  • the present application provides an embodiment of a device for controlling an unmanned vehicle, which is similar to the method embodiment shown in FIG. 2
  • the device can be specifically applied to various electronic devices.
  • the device 500 for controlling an unmanned vehicle of this embodiment includes: a first receiving unit 501, a first determining unit 502 and a sending unit 503.
  • The first receiving unit 501 is configured to receive environmental information sent by the unmanned vehicle, where the environmental information includes environment information outside the vehicle; the first determining unit 502 is configured to determine, according to the environment information outside the vehicle and the operation information of the operation performed by the unmanned vehicle, whether the unmanned vehicle is in an abnormal operation state; and the sending unit 503 is configured to, in response to determining that the unmanned vehicle is in an abnormal operation state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control braking of the unmanned vehicle, and the data acquisition instruction is used to acquire data of the driving recorder in the unmanned vehicle.
  • For the specific processing of these units, reference may be made to step 201, step 202, and step 203 in the foregoing embodiment, which will not be repeated here.
  • the device 500 further includes: a second receiving unit (not shown in the figure) configured to receive a vehicle control request sent by an unmanned vehicle.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the device 500 further includes: a second determining unit (not shown in the figure) configured to determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and a first control instruction sending unit (not shown in the figure) configured to, in response to determining that the unmanned vehicle does not contain the predetermined passenger, send a first control instruction to the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
  • In some embodiments, the device 500 further includes: a third determining unit (not shown in the figure) configured to, in response to determining that the unmanned vehicle contains a predetermined passenger, determine whether the predetermined passenger is in an abnormal state; and a second control instruction sending unit (not shown in the figure) configured to, in response to determining that the predetermined passenger is in an abnormal state, send a second control instruction to the unmanned vehicle, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, and the device 500 further includes: a fourth determining unit (not shown in the figure) configured to determine, based on the in-vehicle video information, whether the unmanned vehicle contains a passenger with abnormal behavior; and a third control instruction sending unit (not shown in the figure) configured to, in response to determining that the unmanned vehicle contains a passenger with abnormal behavior, send a third control instruction to the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
  • In some embodiments, the environment information includes in-vehicle environment information, the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information, and in-vehicle temperature information, and the device 500 further includes: a fifth determining unit (not shown in the figure) configured to determine, based on the in-vehicle video information, the in-vehicle smoke concentration information, and/or the in-vehicle temperature information, whether item burning occurs in the unmanned vehicle; and a fourth control instruction sending unit (not shown in the figure) configured to, in response to determining that item burning occurs in the unmanned vehicle, send a fourth control instruction to the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for item burning.
  • FIG. 6 shows a schematic structural diagram of a computer system 600 suitable for implementing the server of the embodiment of the present application.
  • the server shown in FIG. 6 is only an example, and should not bring any limitation to the functions and usage scope of the embodiments of the present application.
  • The computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603.
  • In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored.
  • the CPU 601, ROM 602, and RAM 603 are connected to each other through a bus 604.
  • An input/output (I/O) interface 605 is also connected to the bus 604.
  • The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a cathode ray tube (CRT), a liquid crystal display (LCD), and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, and the like. The communication section 609 performs communication processing via a network such as the Internet.
  • A drive 610 is also connected to the I/O interface 605 as needed.
  • a removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is installed on the drive 610 as necessary, so that the computer program read out therefrom is installed into the storage section 608 as needed.
  • the process described above with reference to the flowchart may be implemented as a computer software program.
  • embodiments of the present disclosure include a computer program product that includes a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart.
  • the computer program may be downloaded and installed from the network through the communication section 609, and / or installed from the removable medium 611.
  • when the computer program is executed by the central processing unit (CPU) 601, the above-mentioned functions defined in the method of the present application are performed.
  • the computer-readable medium described in this application may be a computer-readable signal medium or a computer-readable storage medium or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal that is propagated in a baseband or as part of a carrier wave, in which a computer-readable program code is carried. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and the computer-readable medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical cable, RF, etc., or any suitable combination of the foregoing.
  • the computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof, the programming languages including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagrams may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logic functions.
  • the functions noted in the blocks may occur in an order different from that noted in the figures. For example, two blocks shown in succession may actually be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented with a dedicated hardware-based system that performs the specified functions or operations, or with a combination of dedicated hardware and computer instructions.
  • the units described in the embodiments of the present application may be implemented in software or hardware.
  • the described unit may also be provided in the processor.
  • a processor includes a first receiving unit, a first determining unit, and a sending unit.
  • the names of these units do not constitute a limitation on the unit itself.
  • the first receiving unit may also be described as “a unit that receives environmental information sent by an unmanned vehicle”.
  • the present application also provides a computer-readable medium, which may be included in the device described in the foregoing embodiments; or may exist alone without being assembled into the device.
  • the computer-readable medium carries one or more programs.
  • when the one or more programs are executed by the device, the device is caused to: receive environment information sent by an unmanned vehicle, where the environment information includes outside-vehicle environment information; determine, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state;
  • and, in response to determining that the unmanned vehicle is in the operation abnormal state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control the unmanned vehicle to brake,
  • and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
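The bullets above summarize the end-to-end server-side logic: receive environment information, judge whether the vehicle is in an operation abnormal state, then send a brake control instruction and a data acquisition instruction. A minimal sketch of that flow follows; the class, field and function names (EnvironmentReport, predict_operation, send_instruction and so on) are illustrative assumptions, not part of the application.

```python
# Illustrative sketch only: names and message formats are assumptions,
# not part of the patent application.
from dataclasses import dataclass
from typing import Callable


@dataclass
class EnvironmentReport:
    vehicle_id: str
    outside_info: dict      # e.g. obstacle size / distance from radar or cameras
    operation_info: dict    # e.g. {"action": "accelerate", "value": 1.2}


def control_loop(report: EnvironmentReport,
                 predict_operation: Callable[[dict], dict],
                 operations_match: Callable[[dict, dict], bool],
                 send_instruction: Callable[[str, dict], None]) -> None:
    """Server-side handling of one environment report."""
    predicted = predict_operation(report.outside_info)            # expected operation
    actual = report.operation_info                                 # reported operation
    if not operations_match(predicted, actual):                    # operation abnormal state
        send_instruction(report.vehicle_id, {"type": "brake"})     # brake control instruction
        send_instruction(report.vehicle_id, {"type": "fetch_recorder_data"})  # data acquisition
```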

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A method and apparatus for controlling an unmanned vehicle. The method includes: receiving environment information sent by an unmanned vehicle (201), where the environment information includes outside-vehicle environment information; determining, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state (202); and, in response to determining that the unmanned vehicle is in the operation abnormal state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle (203), where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle. The method brakes the unmanned vehicle in time when the unmanned vehicle is in an operation abnormal state, improving safety.

Description

Method and Apparatus for Controlling an Unmanned Vehicle
This application claims priority to Chinese Patent Application No. 201811374621.0, filed on November 19, 2018 by Baidu Online Network Technology (Beijing) Co., Ltd. and entitled "Method and Apparatus for Controlling Unmanned Vehicle", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for controlling an unmanned vehicle.
Background
As unmanned vehicle technology matures, unmanned vehicles are attracting more and more attention. Unmanned vehicles can reduce problems caused by human error (for example, drunk driving, speeding, fatigued driving and so on), and can also reduce the workload of vehicle drivers.
To ensure that an unmanned vehicle travels safely, the state of the vehicle needs to be monitored in real time, and emergency measures need to be taken in time when an abnormality occurs, so as to protect the personal and property safety of passengers and reduce losses. In addition, when an abnormality occurs in an unmanned vehicle, the automobile black box data of the unmanned vehicle needs to be acquired in time so that the abnormality can be analyzed and otherwise handled.
Summary
Embodiments of the present application provide a method and apparatus for controlling an unmanned vehicle.
In a first aspect, an embodiment of the present application provides a method for controlling an unmanned vehicle, the method including: receiving environment information sent by an unmanned vehicle, where the environment information includes outside-vehicle environment information; determining, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state; and, in response to determining that the unmanned vehicle is in the operation abnormal state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
In some embodiments, before the receiving environment information sent by an unmanned vehicle, the method further includes: receiving a vehicle control request sent by the unmanned vehicle.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the method further includes: determining, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and, in response to determining that no predetermined passenger is present in the unmanned vehicle, sending a first control instruction to the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
In some embodiments, the method further includes: in response to determining that a predetermined passenger is present in the unmanned vehicle, determining whether the predetermined passenger is in an abnormal state; and, in response to determining that the predetermined passenger is in an abnormal state, sending a second control instruction to the unmanned vehicle, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the method further includes: determining, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle; and, in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, sending a third control instruction to the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and the method further includes: determining, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle; and, in response to determining that an item is burning inside the unmanned vehicle, sending a fourth control instruction to the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined item-burning emergency operation.
In a second aspect, an embodiment of the present application provides an apparatus for controlling an unmanned vehicle, the apparatus including: a first receiving unit, configured to receive environment information sent by an unmanned vehicle, where the environment information includes outside-vehicle environment information; a first determining unit, configured to determine, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state; and a sending unit, configured to send a brake control instruction and a data acquisition instruction to the unmanned vehicle in response to determining that the unmanned vehicle is in the operation abnormal state, where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
In some embodiments, the apparatus further includes: a second receiving unit, configured to receive a vehicle control request sent by the unmanned vehicle.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the apparatus further includes: a second determining unit, configured to determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and a first control instruction sending unit, configured to send a first control instruction to the unmanned vehicle in response to determining that no predetermined passenger is present in the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
In some embodiments, the apparatus further includes: a third determining unit, configured to determine whether the predetermined passenger is in an abnormal state in response to determining that a predetermined passenger is present in the unmanned vehicle; and a second control instruction sending unit, configured to send a second control instruction to the unmanned vehicle in response to determining that the predetermined passenger is in an abnormal state, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the apparatus further includes: a fourth determining unit, configured to determine, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle; and a third control instruction sending unit, configured to send a third control instruction to the unmanned vehicle in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
In some embodiments, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and the apparatus further includes: a fifth determining unit, configured to determine, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle; and a fourth control instruction sending unit, configured to send a fourth control instruction to the unmanned vehicle in response to determining that an item is burning inside the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined item-burning emergency operation.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; and a storage device storing one or more programs thereon, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium storing a computer program thereon, where the computer program, when executed by a processor, implements the method described in any implementation of the first aspect.
According to the method and apparatus for controlling an unmanned vehicle provided by the embodiments of the present application, environment information sent by an unmanned vehicle is first received, where the environment information includes outside-vehicle environment information; then, whether the unmanned vehicle is in an operation abnormal state is determined based on the outside-vehicle environment information and the operation performed by the unmanned vehicle; and, in response to determining that the unmanned vehicle is in the operation abnormal state, a brake control instruction and a data acquisition instruction are sent to the unmanned vehicle. In this way, when the unmanned vehicle is in an operation abnormal state, the unmanned vehicle is braked in time, which improves safety, and the data of the driving recorder in the unmanned vehicle is acquired in time, which improves the efficiency of abnormality handling.
Brief Description of the Drawings
Other features, objects and advantages of the present application will become more apparent from the detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 is an exemplary system architecture diagram to which an embodiment of the present application may be applied;
Fig. 2 is a flowchart of an embodiment of a method for controlling an unmanned vehicle according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the method for controlling an unmanned vehicle according to the present application;
Fig. 4 is a flowchart of another embodiment of the method for controlling an unmanned vehicle according to the present application;
Fig. 5 is a schematic structural diagram of an embodiment of an apparatus for controlling an unmanned vehicle according to the present application;
Fig. 6 is a schematic structural diagram of a computer system of a server suitable for implementing embodiments of the present application.
Detailed Description
The present application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention, rather than to limit the invention. It should also be noted that, for ease of description, only the parts related to the relevant invention are shown in the drawings.
It should be noted that the embodiments in the present application and the features in the embodiments may be combined with one another as long as there is no conflict. The present application will be described in detail below with reference to the drawings and in combination with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which the method for controlling an unmanned vehicle or the apparatus for controlling an unmanned vehicle of embodiments of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include unmanned vehicles 101, 102 and 103, a network 104 and a server 105. The network 104 serves as a medium for providing communication links between the unmanned vehicles 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or optical fiber cables.
The unmanned vehicles 101, 102 and 103 interact with the server 105 through the network 104 to receive or send messages and the like. Various information collection devices, such as image collection devices, binocular cameras, radar detectors and sensors, may be installed on the unmanned vehicles 101, 102 and 103. These information collection devices may be used to collect outside-vehicle environment information and in-vehicle environment information of the unmanned vehicles 101, 102 and 103. An on-board intelligent brain may also be installed on the unmanned vehicles 101, 102 and 103. The on-board intelligent brain may receive the information collected by the information collection devices, analyze and otherwise process the information, and then control the unmanned vehicles 101, 102 and 103 to perform corresponding operations (for example, continuing to travel, emergency stopping and so on) according to the processing results.
The unmanned vehicles 101, 102 and 103 may be various types of vehicles, including but not limited to large buses, tractors, city buses, medium buses, large trucks, cars and so on.
The server 105 may be a server providing various services, for example, a backend server that processes the environment information sent by the unmanned vehicles 101, 102 and 103. The backend server may perform various kinds of analysis and processing on the received environment information, and send instructions to the unmanned vehicles 101, 102 and 103 according to the processing results, so as to control the unmanned vehicles 101, 102 and 103.
It should be noted that the server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (for example, for providing distributed services), or as a single piece of software or software module. No specific limitation is made here.
It should be noted that the method for controlling an unmanned vehicle provided by the embodiments of the present application is generally performed by the server 105, and accordingly, the apparatus for controlling an unmanned vehicle is generally arranged in the server 105.
It should be understood that the numbers of terminal devices, networks and servers in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks and servers according to implementation requirements.
With continued reference to Fig. 2, a flow 200 of an embodiment of the method for controlling an unmanned vehicle according to the present application is shown. The method for controlling an unmanned vehicle includes the following steps:
Step 201: receiving environment information sent by an unmanned vehicle.
In this embodiment, the execution body of the method for controlling an unmanned vehicle (for example, the server 105 shown in Fig. 1) may receive, through a wireless connection, environment information sent by an unmanned vehicle (for example, the unmanned vehicles 101, 102 and 103 shown in Fig. 1). The environment information may include outside-vehicle environment information of the unmanned vehicle. The outside-vehicle environment information may be information about the environment outside the vehicle collected by various information collection devices installed on the unmanned vehicle, for example, information collected by a binocular camera or a radar detector installed on the unmanned vehicle.
Step 202: determining, based on the outside-vehicle environment information and the operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state.
In this embodiment, the unmanned vehicle needs to perform various operations while traveling, for example, accelerating, decelerating, braking, turning and so on. The unmanned vehicle may send the operation information of the performed operations (for example, the turning direction and angle, the braking force and so on) to the execution body in real time. In this way, the execution body may determine whether the unmanned vehicle is in an operation abnormal state based on the outside-vehicle environment information received in step 201 and the operation information of the operations performed by the unmanned vehicle.
As an example, correspondences between outside-vehicle environment information and operations, or judgment rules for determining an operation from outside-vehicle environment information, may be pre-stored in the execution body. In this way, the execution body may predict, based on the outside-vehicle environment information sent by the unmanned vehicle, the operation that the unmanned vehicle should perform, and take the predicted operation as a predicted operation. Then, the execution body may determine whether the predicted operation matches (for example, is identical or similar to) the operation corresponding to the operation information sent by the unmanned vehicle (that is, the actual operation). If they do not match, the unmanned vehicle may be considered to be in an operation abnormal state. For example, suppose the outside-vehicle environment information includes obstacle (for example, building, pedestrian, other vehicle) information, such as the size of an obstacle and the distance to the obstacle. The execution body may predict, based on the obstacle information, the operation that the unmanned vehicle should perform to avoid the obstacle, for example, turning by a certain angle in a certain direction. It may then be determined whether the predicted operation matches the actual operation of the unmanned vehicle. If they do not match, for example, if the predicted operation is "turning by a certain angle in a certain direction" while the actual operation of the unmanned vehicle is "accelerating forward", the unmanned vehicle is shown to be in an operation abnormal state.
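As one purely illustrative reading of the example above, the match test between the predicted operation and the actual operation could be as simple as comparing the operation type and checking that numeric parameters fall within a tolerance. The rule table and tolerance below are assumptions made for the sketch, not values from the application.

```python
# Illustrative sketch: the rule table and tolerance are assumptions.
def predict_operation(outside_info: dict) -> dict:
    """Very small stand-in for the pre-stored judgment rules."""
    obstacle = outside_info.get("obstacle")
    if obstacle and obstacle["distance_m"] < 20.0:
        # An obstacle close ahead: the vehicle is expected to steer away.
        return {"action": "turn", "direction": obstacle["clear_side"], "angle_deg": 15.0}
    return {"action": "keep_lane"}


def operations_match(predicted: dict, actual: dict, angle_tol_deg: float = 10.0) -> bool:
    """Treat the vehicle as normal only if the actual operation matches the prediction."""
    if predicted["action"] != actual.get("action"):
        return False
    if predicted["action"] == "turn":
        same_dir = predicted["direction"] == actual.get("direction")
        close_angle = abs(predicted["angle_deg"] - actual.get("angle_deg", 0.0)) <= angle_tol_deg
        return same_dir and close_angle
    return True


# Example: predicted "turn" vs. actual "accelerate" -> operation abnormal state.
predicted = predict_operation({"obstacle": {"distance_m": 12.0, "clear_side": "left"}})
print(operations_match(predicted, {"action": "accelerate"}))  # False
```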
Step 203: in response to determining that the unmanned vehicle is in the operation abnormal state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle.
In this embodiment, in response to determining that the unmanned vehicle is in the operation abnormal state, the execution body may send a brake control instruction and a data acquisition instruction to the unmanned vehicle.
Here, the brake control instruction may be used to control the unmanned vehicle to brake. As an example, the execution body may send different brake control instructions to the unmanned vehicle according to the magnitude of the difference between the predicted operation and the actual operation. For example, when the deviation between the predicted operation and the actual operation is large, a brake control instruction for emergency braking may be sent to the unmanned vehicle; when the deviation is small, a brake control instruction for gentle braking may be sent. In practice, the execution body may also search for the nearest stopping point in various ways (for example, by querying a high-precision map), so as to control the unmanned vehicle to pull over at a safe position.
Here, the data acquisition instruction may be used to acquire the data of the driving recorder in the unmanned vehicle. An automobile driving recorder, commonly known as an automobile black box, is a digital electronic recording device that records and stores the traveling speed, time and mileage of a vehicle as well as other state information about the traveling of the vehicle, and can output the data through an interface. After acquiring the data of the driving recorder in the unmanned vehicle, the execution body may forward the acquired data to a preset device, for example, a terminal used by a technician who analyzes and handles the operation abnormal state of the unmanned vehicle. The execution body may also analyze and process the acquired data to obtain the cause of the operation abnormality, and display the obtained cause or send it to a preset device.
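One way to read the example above is to scale the brake command with the deviation between the predicted and actual operations and to pair it with a recorder-data request; the sketch below does that, with the thresholds, message fields and forwarding target chosen as assumptions rather than taken from the application.

```python
# Illustrative sketch: thresholds and message fields are assumptions.
def deviation(predicted: dict, actual: dict) -> float:
    """Crude scalar deviation between predicted and actual operations."""
    if predicted.get("action") != actual.get("action"):
        return 1.0
    return min(abs(predicted.get("angle_deg", 0.0) - actual.get("angle_deg", 0.0)) / 90.0, 1.0)


def build_instructions(predicted: dict, actual: dict) -> list:
    dev = deviation(predicted, actual)
    brake_mode = "emergency" if dev > 0.5 else "gentle"   # large error -> emergency braking
    return [
        {"type": "brake", "mode": brake_mode},
        {"type": "fetch_recorder_data"},                   # automobile black box data
    ]


def forward_recorder_data(data: bytes, technician_endpoint: str, send) -> None:
    """Forward acquired driving-recorder data to a preset device for analysis."""
    send(technician_endpoint, data)
```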
In some optional implementations of this embodiment, the environment information may include in-vehicle environment information, and the in-vehicle environment information may include in-vehicle video information. The in-vehicle video information may be video collected by a video collection device installed in the unmanned vehicle. The method for controlling an unmanned vehicle may further include the following:
First, the execution body may determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle.
In this implementation, the predetermined passenger may refer to a person on the unmanned vehicle who is responsible for the safety of the unmanned vehicle, for example, a safety officer. In practice, to ensure safety in public places, a safety officer is usually assigned to public vehicles such as city buses and coaches, and the safety officer wears a uniform. As an example, face information of the predetermined passenger may be pre-stored in the execution body. In this way, the execution body may perform face detection, face recognition and other processing on the passengers in the in-vehicle video information, and determine, according to the processing results, whether the predetermined passenger is in the unmanned vehicle. It should be noted that face detection and face recognition are well-known technologies that are widely studied and applied at present, and are not described in detail here.
Then, in response to determining that no predetermined passenger is present in the unmanned vehicle, the execution body may send a first control instruction to the unmanned vehicle.
In this implementation, in response to determining that no predetermined passenger is present in the unmanned vehicle, the execution body may send a first control instruction to the unmanned vehicle, where the first control instruction may be used to control the unmanned vehicle to send alarm information. As an example, the first control instruction may control the unmanned vehicle to send alarm information to a predetermined device (for example, a terminal used by the personnel responsible for vehicle safety), so as to notify the user of the device that there is no predetermined passenger in the unmanned vehicle. Here, the alarm information may include information such as the identifier and position of the unmanned vehicle, so that the relevant personnel can quickly locate the unmanned vehicle. Through this implementation, after it is determined that the predetermined passenger is missing from the unmanned vehicle, the unmanned vehicle can be controlled to send alarm information in time, which improves the safety of the unmanned vehicle.
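A minimal sketch of the presence check described above, assuming a face-embedding pipeline (the embedding inputs, the similarity threshold and the alarm fields are hypothetical; any off-the-shelf face detection and recognition library could play that role):

```python
# Illustrative sketch: the embeddings, threshold and alarm fields are hypothetical
# stand-ins for an off-the-shelf face detection / recognition pipeline.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def predetermined_passenger_present(frame_embeddings: list, officer_embedding: np.ndarray,
                                    threshold: float = 0.6) -> bool:
    """True if any detected face in the cabin matches the pre-stored safety officer."""
    return any(cosine_similarity(e, officer_embedding) >= threshold for e in frame_embeddings)


def first_control_instruction(vehicle_id: str, position: tuple) -> dict:
    """Alarm message sent when no predetermined passenger is found (fields assumed)."""
    return {"type": "alarm", "vehicle_id": vehicle_id, "position": position,
            "reason": "no predetermined passenger on board"}
```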
In some optional implementations, the method for controlling an unmanned vehicle may further include the following:
First, in response to determining that a predetermined passenger is present in the unmanned vehicle, determining whether the predetermined passenger is in an abnormal state.
In this implementation, in response to determining that a predetermined passenger is present in the unmanned vehicle, the execution body may further determine whether the predetermined passenger is in an abnormal state. As an example, the execution body may perform human action recognition on the predetermined passenger based on the in-vehicle video information, so as to recognize the actions of the predetermined passenger, and determine whether the predetermined passenger is in an abnormal state according to the recognized actions. Here, the abnormal state may refer to a non-working state in which actions such as operating a mobile phone or sleeping are performed. It should be noted that human action recognition is a well-known technology that is widely studied and applied at present, and is not described in detail here.
Then, in response to determining that the predetermined passenger is in an abnormal state, a second control instruction is sent to the unmanned vehicle. Here, the second control instruction may be used to control the unmanned vehicle to play preset prompt information. The prompt information may be used to remind the predetermined passenger that he or she is in an abnormal state, which would put the unmanned vehicle and the other passengers in an unsafe state. Through this implementation, when the predetermined passenger is in an abnormal state, a prompt can be given in time, thereby improving the safety of the unmanned vehicle and the other passengers.
In some optional implementations of this embodiment, the environment information may include in-vehicle environment information, and the in-vehicle environment information may include in-vehicle video information; and
the method for controlling an unmanned vehicle may further include:
First, determining, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle. As an example, the execution body may recognize the actions of each passenger in the unmanned vehicle based on the in-vehicle video information, and judge, according to the actions, whether any passenger performs a predetermined abnormal action (for example, a dangerous action). If a passenger performs a predetermined abnormal action, that passenger is determined to be a passenger behaving abnormally.
Then, in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, the execution body may send a third control instruction to the unmanned vehicle. Here, the third control instruction may be used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior. As an example, the emergency operation for abnormal behavior may be preset, for example, sending alarm information including the vehicle identifier and position to the public security authority.
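The two in-cabin checks above both reduce to mapping recognized actions to a control instruction; a sketch of that dispatch follows, with the action labels and instruction payloads assumed purely for illustration.

```python
# Illustrative sketch: action labels and payloads are assumptions.
from typing import Optional

NON_WORKING_ACTIONS = {"using_phone", "sleeping"}        # safety officer in an abnormal state
DANGEROUS_ACTIONS = {"fighting", "damaging_equipment"}   # passenger behaving abnormally


def dispatch_cabin_instruction(actor: str, action: str) -> Optional[dict]:
    """Return the control instruction to send, or None if no action is needed."""
    if actor == "safety_officer" and action in NON_WORKING_ACTIONS:
        # Second control instruction: play preset prompt information.
        return {"type": "play_prompt", "prompt_id": "officer_attention"}
    if action in DANGEROUS_ACTIONS:
        # Third control instruction: predetermined emergency operation, e.g. alert police.
        return {"type": "emergency", "target": "public_security",
                "payload": {"reason": "abnormal passenger behavior"}}
    return None
```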
In some optional implementations of this embodiment, the environment information may include in-vehicle environment information, and the in-vehicle environment information may include in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and
the method for controlling an unmanned vehicle may further include:
First, the execution body may determine, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle. Here, the in-vehicle smoke concentration information and the in-vehicle temperature information may be collected by sensors installed in the unmanned vehicle. As an example, the execution body may detect, based on the in-vehicle video information, whether a flame appears inside the unmanned vehicle, and determine whether an item is burning inside the unmanned vehicle according to the flame detection result, the in-vehicle smoke concentration information and/or the in-vehicle temperature information. It is easy to understand that, according to actual needs, the execution body may determine whether an item is burning based on any one of the in-vehicle video information, the in-vehicle smoke concentration information and the in-vehicle temperature information, based on any two of them, or based on all of the information together.
Then, in response to determining that an item is burning inside the unmanned vehicle, the execution body may send a fourth control instruction to the unmanned vehicle. Here, the fourth control instruction may be used to control the unmanned vehicle to perform a predetermined item-burning emergency operation. As an example, the item-burning emergency operation may be preset, for example, sending item-burning alarm information including the vehicle identifier and position to the fire department.
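A sketch of the item-burning decision described above, combining a flame-detection flag with smoke and temperature readings; the thresholds and the two-out-of-three voting rule are assumptions chosen only to make the idea concrete.

```python
# Illustrative sketch: thresholds and the voting rule are assumptions.
def item_burning(flame_detected: bool, smoke_ppm: float, cabin_temp_c: float,
                 smoke_limit: float = 300.0, temp_limit: float = 60.0) -> bool:
    """Declare burning if at least two of the three signals agree."""
    votes = sum([flame_detected, smoke_ppm >= smoke_limit, cabin_temp_c >= temp_limit])
    return votes >= 2


def fourth_control_instruction(vehicle_id: str, position: tuple) -> dict:
    """Item-burning emergency operation, e.g. an alarm to the fire department (fields assumed)."""
    return {"type": "emergency", "target": "fire_department",
            "payload": {"vehicle_id": vehicle_id, "position": position}}


print(item_burning(flame_detected=False, smoke_ppm=450.0, cabin_temp_c=72.0))  # True
```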
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the method for controlling an unmanned vehicle according to this embodiment. In the application scenario of Fig. 3, the server 301 receives environment information sent by the unmanned vehicle 302, where the environment information includes outside-vehicle environment information. Then, the server 301 determines, based on the outside-vehicle environment information and the operation information of the operation performed by the unmanned vehicle 302, whether the unmanned vehicle 302 is in an operation abnormal state. Finally, in response to determining that the unmanned vehicle 302 is in the operation abnormal state, the server 301 sends a brake control instruction and a data acquisition instruction to the unmanned vehicle 302, where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire the data of the driving recorder in the unmanned vehicle.
The method provided by the above embodiment of the present application brakes the unmanned vehicle in time when the unmanned vehicle is in an operation abnormal state, which improves safety, and at the same time acquires the data of the driving recorder in the unmanned vehicle in time, thereby improving the efficiency of abnormality handling.
With further reference to Fig. 4, a flow 400 of another embodiment of the method for controlling an unmanned vehicle is shown. The flow 400 of the method for controlling an unmanned vehicle includes the following steps:
Step 401: receiving a vehicle control request sent by an unmanned vehicle.
In this embodiment, the execution body of the method for controlling an unmanned vehicle (for example, the server 105 shown in Fig. 1) may receive, through a wireless connection, a vehicle control request sent by an unmanned vehicle (for example, the unmanned vehicles 101, 102 and 103 shown in Fig. 1). Here, the vehicle control request is used to request the execution body to control the unmanned vehicle. As an example, the vehicle control request may be sent by a passenger in the unmanned vehicle by triggering a preset device (for example, a preset button). For example, when a passenger feels that the unmanned vehicle is traveling abnormally, the passenger may trigger the preset device to request the execution body to control the unmanned vehicle from the cloud, so as to ensure the driving safety of the unmanned vehicle. As another example, the vehicle control request may be sent by the unmanned vehicle itself. For example, when the unmanned vehicle determines, according to vehicle parameters (for example, acceleration, speed and so on), that it is traveling abnormally, it may send a vehicle control request to the execution body, requesting the execution body to control the unmanned vehicle from the cloud so as to ensure its driving safety.
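A sketch of how such a request might be received and used to scope subsequent control to the requesting vehicle; the request fields and the registry below are assumptions, not structures defined by the application.

```python
# Illustrative sketch: request fields and the registry are assumptions.
from dataclasses import dataclass, field


@dataclass
class VehicleControlRequest:
    vehicle_id: str
    source: str              # "passenger_button" or "vehicle_self_check"
    reason: str = ""


@dataclass
class CloudController:
    controlled_vehicles: set = field(default_factory=set)

    def handle_request(self, request: VehicleControlRequest) -> None:
        # Only vehicles that asked for cloud control are subsequently monitored
        # and controlled, which keeps the control targeted (cf. flow 400).
        self.controlled_vehicles.add(request.vehicle_id)

    def should_control(self, vehicle_id: str) -> bool:
        return vehicle_id in self.controlled_vehicles


controller = CloudController()
controller.handle_request(VehicleControlRequest("bus-101", "passenger_button", "jerky braking"))
print(controller.should_control("bus-101"))  # True
```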
Step 402: receiving environment information sent by the unmanned vehicle.
In this embodiment, step 402 is similar to step 201 of the embodiment shown in Fig. 2, and is not described in detail here.
Step 403: determining, based on the outside-vehicle environment information and the operation information of the operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state.
In this embodiment, step 403 is similar to step 202 of the embodiment shown in Fig. 2, and is not described in detail here.
Step 404: in response to determining that the unmanned vehicle is in the operation abnormal state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle.
In this embodiment, step 404 is similar to step 203 of the embodiment shown in Fig. 2, and is not described in detail here.
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the method for controlling an unmanned vehicle in this embodiment highlights the step of receiving the vehicle control request sent by the unmanned vehicle. Therefore, the solution described in this embodiment can control the unmanned vehicle that sends the vehicle control request, so that the execution body can quickly determine which unmanned vehicle needs to be controlled, which makes the control more targeted and improves control efficiency.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for controlling an unmanned vehicle. The apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus can be specifically applied to various electronic devices.
As shown in Fig. 5, the apparatus 500 for controlling an unmanned vehicle of this embodiment includes: a first receiving unit 501, a first determining unit 502 and a sending unit 503. The first receiving unit 501 is configured to receive environment information sent by an unmanned vehicle, where the environment information includes outside-vehicle environment information; the first determining unit 502 is configured to determine, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state; and the sending unit 503 is configured to send a brake control instruction and a data acquisition instruction to the unmanned vehicle in response to determining that the unmanned vehicle is in the operation abnormal state, where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
In this embodiment, for the specific processing of the first receiving unit 501, the first determining unit 502 and the sending unit 503 of the apparatus 500 for controlling an unmanned vehicle and the technical effects brought by them, reference may be made to the relevant descriptions of step 201, step 202 and step 203 in the embodiment corresponding to Fig. 2, and the details are not repeated here.
In some optional implementations of this embodiment, the apparatus 500 further includes: a second receiving unit (not shown in the figure), configured to receive a vehicle control request sent by the unmanned vehicle.
In some optional implementations of this embodiment, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the apparatus 500 further includes: a second determining unit (not shown in the figure), configured to determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and a first control instruction sending unit (not shown in the figure), configured to send a first control instruction to the unmanned vehicle in response to determining that no predetermined passenger is present in the unmanned vehicle, where the first control instruction is used to control the unmanned vehicle to send alarm information.
In some optional implementations of this embodiment, the apparatus 500 further includes: a third determining unit (not shown in the figure), configured to determine whether the predetermined passenger is in an abnormal state in response to determining that a predetermined passenger is present in the unmanned vehicle; and a second control instruction sending unit (not shown in the figure), configured to send a second control instruction to the unmanned vehicle in response to determining that the predetermined passenger is in an abnormal state, where the second control instruction is used to control the unmanned vehicle to play preset prompt information.
In some optional implementations of this embodiment, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information; and the apparatus 500 further includes: a fourth determining unit (not shown in the figure), configured to determine, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle; and a third control instruction sending unit (not shown in the figure), configured to send a third control instruction to the unmanned vehicle in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, where the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
In some optional implementations of this embodiment, the environment information includes in-vehicle environment information, and the in-vehicle environment information includes in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and
the apparatus 500 further includes: a fifth determining unit (not shown in the figure), configured to determine, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle; and a fourth control instruction sending unit (not shown in the figure), configured to send a fourth control instruction to the unmanned vehicle in response to determining that an item is burning inside the unmanned vehicle, where the fourth control instruction is used to control the unmanned vehicle to perform a predetermined item-burning emergency operation.
Referring now to Fig. 6, it shows a schematic structural diagram of a computer system 600 suitable for implementing the server of the embodiments of the present application. The server shown in Fig. 6 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded into a random access memory (RAM) 603 from a storage section 608. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, the ROM 602 and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse and the like; an output section 607 including a cathode ray tube (CRT), a liquid crystal display (LCD) and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to the embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, the embodiments of the present disclosure include a computer program product, which includes a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from the network through the communication section 609, and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above-mentioned functions defined in the method of the present application are performed.
It should be noted that the computer-readable medium described in the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus or device. In the present application, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal or any suitable combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and the computer-readable medium may send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical cable, RF, or any suitable combination of the above.
The computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions and operations of the systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment or a part of code, and the module, program segment or part of code contains one or more executable instructions for implementing the specified logic functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented with a dedicated hardware-based system that performs the specified functions or operations, or with a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor. For example, a processor may be described as including a first receiving unit, a first determining unit and a sending unit. The names of these units do not constitute a limitation on the units themselves in some cases. For example, the first receiving unit may also be described as "a unit that receives environment information sent by an unmanned vehicle".
As another aspect, the present application further provides a computer-readable medium. The computer-readable medium may be included in the apparatus described in the above embodiments, or may exist alone without being assembled into the apparatus. The computer-readable medium carries one or more programs. When the one or more programs are executed by the apparatus, the apparatus is caused to: receive environment information sent by an unmanned vehicle, where the environment information includes outside-vehicle environment information; determine, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state;
and, in response to determining that the unmanned vehicle is in the operation abnormal state, send a brake control instruction and a data acquisition instruction to the unmanned vehicle, where the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (15)

  1. A method for controlling an unmanned vehicle, comprising:
    receiving environment information sent by an unmanned vehicle, wherein the environment information comprises outside-vehicle environment information;
    determining, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state; and
    in response to determining that the unmanned vehicle is in the operation abnormal state, sending a brake control instruction and a data acquisition instruction to the unmanned vehicle, wherein the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
  2. The method according to claim 1, wherein, before the receiving environment information sent by an unmanned vehicle, the method further comprises:
    receiving a vehicle control request sent by the unmanned vehicle.
  3. The method according to claim 1, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information; and
    the method further comprises:
    determining, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and
    in response to determining that no predetermined passenger is present in the unmanned vehicle, sending a first control instruction to the unmanned vehicle, wherein the first control instruction is used to control the unmanned vehicle to send alarm information.
  4. The method according to claim 3, wherein the method further comprises:
    in response to determining that a predetermined passenger is present in the unmanned vehicle, determining whether the predetermined passenger is in an abnormal state; and
    in response to determining that the predetermined passenger is in an abnormal state, sending a second control instruction to the unmanned vehicle, wherein the second control instruction is used to control the unmanned vehicle to play preset prompt information.
  5. The method according to claim 1, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information; and
    the method further comprises:
    determining, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle; and
    in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, sending a third control instruction to the unmanned vehicle, wherein the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
  6. The method according to claim 1, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and
    the method further comprises:
    determining, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle; and
    in response to determining that an item is burning inside the unmanned vehicle, sending a fourth control instruction to the unmanned vehicle, wherein the fourth control instruction is used to control the unmanned vehicle to perform a predetermined item-burning emergency operation.
  7. An apparatus for controlling an unmanned vehicle, comprising:
    a first receiving unit, configured to receive environment information sent by an unmanned vehicle, wherein the environment information comprises outside-vehicle environment information;
    a first determining unit, configured to determine, based on the outside-vehicle environment information and operation information of an operation performed by the unmanned vehicle, whether the unmanned vehicle is in an operation abnormal state; and
    a sending unit, configured to send a brake control instruction and a data acquisition instruction to the unmanned vehicle in response to determining that the unmanned vehicle is in the operation abnormal state, wherein the brake control instruction is used to control the unmanned vehicle to brake, and the data acquisition instruction is used to acquire data of a driving recorder in the unmanned vehicle.
  8. The apparatus according to claim 7, wherein the apparatus further comprises:
    a second receiving unit, configured to receive a vehicle control request sent by the unmanned vehicle.
  9. The apparatus according to claim 7, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information; and
    the apparatus further comprises:
    a second determining unit, configured to determine, based on the in-vehicle video information, whether a predetermined passenger is present in the unmanned vehicle; and
    a first control instruction sending unit, configured to send a first control instruction to the unmanned vehicle in response to determining that no predetermined passenger is present in the unmanned vehicle, wherein the first control instruction is used to control the unmanned vehicle to send alarm information.
  10. The apparatus according to claim 9, wherein the apparatus further comprises:
    a third determining unit, configured to determine whether the predetermined passenger is in an abnormal state in response to determining that a predetermined passenger is present in the unmanned vehicle; and
    a second control instruction sending unit, configured to send a second control instruction to the unmanned vehicle in response to determining that the predetermined passenger is in an abnormal state, wherein the second control instruction is used to control the unmanned vehicle to play preset prompt information.
  11. The apparatus according to claim 7, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information; and
    the apparatus further comprises:
    a fourth determining unit, configured to determine, based on the in-vehicle video information, whether a passenger behaving abnormally is present in the unmanned vehicle; and
    a third control instruction sending unit, configured to send a third control instruction to the unmanned vehicle in response to determining that a passenger behaving abnormally is present in the unmanned vehicle, wherein the third control instruction is used to control the unmanned vehicle to perform a predetermined emergency operation for abnormal passenger behavior.
  12. The apparatus according to claim 7, wherein the environment information comprises in-vehicle environment information, and the in-vehicle environment information comprises in-vehicle video information, in-vehicle smoke concentration information and in-vehicle temperature information; and
    the apparatus further comprises:
    a fifth determining unit, configured to determine, based on the in-vehicle video information, the in-vehicle smoke concentration information and/or the in-vehicle temperature information, whether an item is burning inside the unmanned vehicle; and
    a fourth control instruction sending unit, configured to send a fourth control instruction to the unmanned vehicle in response to determining that an item is burning inside the unmanned vehicle, wherein the fourth control instruction is used to control the unmanned vehicle to perform a predetermined item-burning emergency operation.
  13. A server, comprising:
    one or more processors; and
    a storage device storing one or more programs thereon,
    wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-6.
  14. A computer-readable medium storing a computer program thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-6.
  15. The method according to claim 1, wherein the unmanned vehicle is determined to be in the operation abnormal state in response to a predicted operation not matching an actual operation corresponding to the operation information, the predicted operation being an operation that the unmanned vehicle needs to perform, predicted based on the outside-vehicle environment information.
PCT/CN2019/103252 2018-11-19 2019-08-29 用于控制无人驾驶车辆的方法和装置 WO2020103515A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19887694.8A EP3756966A4 (en) 2018-11-19 2019-08-29 PILOTLESS VEHICLE CONTROL PROCESS AND APPARATUS
JP2020550098A JP7236454B2 (ja) 2018-11-19 2019-08-29 無人運転車両を制御するための方法及び装置
US17/024,629 US11511774B2 (en) 2018-11-19 2020-09-17 Method and apparatus for controlling autonomous driving vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811374621.0 2018-11-19
CN201811374621.0A CN109532847B (zh) 2018-11-19 2018-11-19 用于控制无人驾驶车辆的方法和装置、服务器、介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/024,629 Continuation US11511774B2 (en) 2018-11-19 2020-09-17 Method and apparatus for controlling autonomous driving vehicle

Publications (1)

Publication Number Publication Date
WO2020103515A1 true WO2020103515A1 (zh) 2020-05-28

Family

ID=65848301

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103252 WO2020103515A1 (zh) 2018-11-19 2019-08-29 用于控制无人驾驶车辆的方法和装置

Country Status (5)

Country Link
US (1) US11511774B2 (zh)
EP (1) EP3756966A4 (zh)
JP (1) JP7236454B2 (zh)
CN (1) CN109532847B (zh)
WO (1) WO2020103515A1 (zh)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109532847B (zh) * 2018-11-19 2020-01-24 百度在线网络技术(北京)有限公司 用于控制无人驾驶车辆的方法和装置、服务器、介质
JP7052709B2 (ja) * 2018-12-25 2022-04-12 トヨタ自動車株式会社 車両制御装置及び車両制御方法
CN110032176A (zh) * 2019-05-16 2019-07-19 广州文远知行科技有限公司 无人驾驶汽车的远程接管方法、装置、设备和存储介质
CN110525225B (zh) * 2019-08-22 2022-03-22 易特智行科技(张家口)有限公司 一种无人驾驶电动汽车的制动方法、储存介质及其开发方法
CN110949377B (zh) * 2019-12-18 2021-08-24 成都坦途智行科技有限公司 一种低速无人车的判断急停方法和系统
CN113255347B (zh) * 2020-02-10 2022-11-15 阿里巴巴集团控股有限公司 实现数据融合的方法和设备及实现无人驾驶设备的识别方法
CN114162125A (zh) * 2020-09-11 2022-03-11 奥迪股份公司 用于控制自动驾驶车辆的方法、装置、介质及车辆
CN112947362A (zh) * 2021-01-29 2021-06-11 知行汽车科技(苏州)有限公司 无人驾驶车辆异常状态的远程控制方法、装置及存储介质
US11899449B1 (en) * 2021-03-10 2024-02-13 State Farm Mutual Automobile Insurance Company Autonomous vehicle extended reality environments
CN113386689B (zh) * 2021-07-08 2023-07-21 东软睿驰汽车技术(大连)有限公司 基于云实现的车内设备条件联动方法和装置
CN113569718A (zh) * 2021-07-26 2021-10-29 阿波罗智联(北京)科技有限公司 提醒方法、装置、设备、系统和存储介质
CN113821010B (zh) * 2021-08-11 2023-03-31 安途智行(北京)科技有限公司 自动驾驶车辆路测的监控方法
CN113635911B (zh) * 2021-09-07 2023-03-14 阿波罗智能技术(北京)有限公司 车辆控制方法、装置、设备、存储介质及自动驾驶车辆
CN114475564B (zh) * 2022-03-01 2023-09-26 清华大学苏州汽车研究院(相城) 一种车辆紧急应对控制方法、系统、车辆及存储介质
US12065159B2 (en) 2022-10-17 2024-08-20 Toyota Motor Engineering & Manufacturing North America, Inc. Customizable abnormal driving detection

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105976450A (zh) * 2016-04-27 2016-09-28 百度在线网络技术(北京)有限公司 无人驾驶车辆的数据处理方法和装置、黑匣子系统
JP2017047835A (ja) * 2015-09-04 2017-03-09 日立オートモティブシステムズ株式会社 車載ネットワーク装置
CN107357194A (zh) * 2016-05-10 2017-11-17 通用汽车环球科技运作有限责任公司 自主驾驶车辆中的热监测
CN107415602A (zh) * 2017-07-06 2017-12-01 上海小蚁科技有限公司 用于车辆的监测方法、设备和系统、计算机可读存储介质
CN107949504A (zh) * 2015-06-26 2018-04-20 英特尔公司 自主车辆安全系统和方法
JP6381835B1 (ja) * 2017-06-08 2018-08-29 三菱電機株式会社 車両制御装置
CN109532847A (zh) * 2018-11-19 2019-03-29 百度在线网络技术(北京)有限公司 用于控制无人驾驶车辆的方法和装置

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015056530A1 (ja) * 2013-10-17 2015-04-23 みこらった株式会社 自動運転車、自動運転車の盗難防止システム、自動運転車の盗難防止プログラム、端末制御用プログラム及び自動運転車のレンタル方法
JP6150258B2 (ja) * 2014-01-15 2017-06-21 みこらった株式会社 自動運転車
US9720410B2 (en) * 2014-03-03 2017-08-01 Waymo Llc Remote assistance for autonomous vehicles in predetermined situations
JP5898746B1 (ja) * 2014-09-29 2016-04-06 富士重工業株式会社 車両の走行制御装置
US9494935B2 (en) * 2014-11-13 2016-11-15 Toyota Motor Engineering & Manufacturing North America, Inc. Remote operation of autonomous vehicle in unexpected environment
CN104875745B (zh) * 2015-05-18 2018-11-09 百度在线网络技术(北京)有限公司 一种状态信息的处理方法及系统
JP6429202B2 (ja) * 2016-02-10 2018-11-28 本田技研工業株式会社 車両、車両制御装置、車両制御方法、および車両制御プログラム
US9964948B2 (en) * 2016-04-20 2018-05-08 The Florida International University Board Of Trustees Remote control and concierge service for an autonomous transit vehicle fleet
US9725036B1 (en) * 2016-06-28 2017-08-08 Toyota Motor Engineering & Manufacturing North America, Inc. Wake-up alerts for sleeping vehicle occupants
KR101891599B1 (ko) * 2016-09-30 2018-08-24 엘지전자 주식회사 자율 주행 차량의 제어방법과 서버
JP6870270B2 (ja) * 2016-10-14 2021-05-12 日産自動車株式会社 無人運転システムの遠隔操作方法と遠隔操作装置
WO2018110124A1 (ja) * 2016-12-13 2018-06-21 日立オートモティブシステムズ株式会社 車両制御装置
JP2018134949A (ja) 2017-02-21 2018-08-30 アイシン精機株式会社 運転支援装置
US10133270B2 (en) * 2017-03-28 2018-11-20 Toyota Research Institute, Inc. Electronic control units, vehicles, and methods for switching vehicle control from an autonomous driving mode
CN107123175A (zh) * 2017-03-31 2017-09-01 百度在线网络技术(北京)有限公司 一种记录驾驶信息的方法、装置和系统
US20180315314A1 (en) * 2017-04-28 2018-11-01 GM Global Technology Operations LLC Automated vehicle route traversal
WO2019046204A1 (en) * 2017-08-28 2019-03-07 nuTonomy Inc. MIXED-MODE DRIVING OF A VEHICLE HAVING AUTONOMOUS DRIVING CAPABILITIES
CN108162981A (zh) * 2017-12-29 2018-06-15 山东渔翁信息技术股份有限公司 一种无人驾驶设备控制方法、装置及系统
US11022971B2 (en) * 2018-01-16 2021-06-01 Nio Usa, Inc. Event data recordation to identify and resolve anomalies associated with control of driverless vehicles
US10901413B2 (en) * 2018-06-13 2021-01-26 City University Of Hong Kong System and method for controlling operation of an autonomous vehicle
US11354406B2 (en) * 2018-06-28 2022-06-07 Intel Corporation Physics-based approach for attack detection and localization in closed-loop controls for autonomous vehicles
US10953830B1 (en) * 2018-07-13 2021-03-23 State Farm Mutual Automobile Insurance Company Adjusting interior configuration of a vehicle based on vehicle contents

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107949504A (zh) * 2015-06-26 2018-04-20 英特尔公司 自主车辆安全系统和方法
JP2017047835A (ja) * 2015-09-04 2017-03-09 日立オートモティブシステムズ株式会社 車載ネットワーク装置
CN105976450A (zh) * 2016-04-27 2016-09-28 百度在线网络技术(北京)有限公司 无人驾驶车辆的数据处理方法和装置、黑匣子系统
CN107357194A (zh) * 2016-05-10 2017-11-17 通用汽车环球科技运作有限责任公司 自主驾驶车辆中的热监测
JP6381835B1 (ja) * 2017-06-08 2018-08-29 三菱電機株式会社 車両制御装置
CN107415602A (zh) * 2017-07-06 2017-12-01 上海小蚁科技有限公司 用于车辆的监测方法、设备和系统、计算机可读存储介质
CN109532847A (zh) * 2018-11-19 2019-03-29 百度在线网络技术(北京)有限公司 用于控制无人驾驶车辆的方法和装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3756966A4

Also Published As

Publication number Publication date
JP2022511161A (ja) 2022-01-31
EP3756966A1 (en) 2020-12-30
EP3756966A4 (en) 2021-11-24
CN109532847B (zh) 2020-01-24
US20210001887A1 (en) 2021-01-07
JP7236454B2 (ja) 2023-03-09
US11511774B2 (en) 2022-11-29
CN109532847A (zh) 2019-03-29

Similar Documents

Publication Publication Date Title
WO2020103515A1 (zh) 用于控制无人驾驶车辆的方法和装置
KR102205240B1 (ko) 예상치 못한 임펄스 변화 충돌 검출기
US20200351281A1 (en) Systems and methods for detection of malicious activity in vehicle data communication networks
US12080284B2 (en) Two-way in-vehicle virtual personal assistant
US11400944B2 (en) Detecting and diagnosing anomalous driving behavior using driving behavior models
CN109455180B (zh) 用于控制无人车的方法和装置
JP2020525916A (ja) 運転中の自動運転車による非行行動を検出するシステムおよび方法
JPWO2017168883A1 (ja) 情報処理装置、情報処理方法、プログラム、およびシステム
WO2021017057A1 (zh) 载客运营车辆监管系统及设备、介质
US20200216027A1 (en) Detecting vehicle intrusion using command pattern models
CN111231972B (zh) 基于驾驶行为习惯的告警方法、车辆及存储介质
US20190077353A1 (en) Cognitive-based vehicular incident assistance
US11609565B2 (en) Methods and systems to facilitate monitoring center for ride share and safe testing method based for selfdriving cars to reduce the false call by deuddaction systems based on deep learning machine
CN113581195B (zh) 特种车辆识别方法、电子设备和计算机可读介质
CN112991684A (zh) 一种驾驶预警的方法及装置
WO2020215976A1 (zh) 用于处理交通事故的方法和装置
WO2021004212A1 (zh) 一种车辆的驾驶权限的移交方法及装置
US11263837B2 (en) Automatic real-time detection of vehicular incidents
CN110059619B (zh) 基于图像识别自动报警的方法和装置
CN109308802A (zh) 异常车辆管理方法及装置
WO2020059115A1 (ja) 運転判定装置および運転判定方法
CN111225033A (zh) 一种车辆的管理方法、装置、设备及存储介质
JP2020071594A (ja) 履歴蓄積装置、及び履歴蓄積プログラム
CN110503522B (zh) 跟踪订单生成方法、存储介质和电子设备
CN114582088A (zh) 一种网约车的报警处理方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19887694

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020550098

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019887694

Country of ref document: EP

Effective date: 20200923

NENP Non-entry into the national phase

Ref country code: DE