WO2020155617A1 - Method and device for determining an operation scenario of an unmanned vehicle - Google Patents

Method and device for determining an operation scenario of an unmanned vehicle

Info

Publication number
WO2020155617A1
WO2020155617A1 (PCT/CN2019/103323)
Authority
WO
WIPO (PCT)
Prior art keywords
data
unmanned vehicle
target
driving
parameter
Prior art date
Application number
PCT/CN2019/103323
Other languages
English (en)
French (fr)
Inventor
于高
冯岩
Original Assignee
北京百度网讯科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司
Publication of WO2020155617A1
Priority to US17/020,874 (published as US20210024083A1)


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 Input parameters relating to data
    • B60W2556/10 Historical data
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • G07C5/0866 Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera

Definitions

  • the embodiments of the present application relate to the technical field of unmanned vehicles, and in particular to a method and equipment for determining an operation scenario of an unmanned vehicle.
  • with the continuous advancement of artificial intelligence technology, unmanned driving technology has also made considerable progress, and various unmanned vehicles are favored by more and more users.
  • the unmanned vehicle needs to send its operating scenario to the cloud server, and the cloud server analyzes the driving situation of the unmanned vehicle according to the operating scenario so as to realize control of the unmanned vehicle.
  • the basis of the corrections made here is the operating scenario of the unmanned vehicle.
  • the operating scenario includes information such as travel time and location, obstacle conditions, and the driving status of the unmanned vehicle.
  • the existing process of determining the operating scenario of an unmanned vehicle is to collect the driving data of the unmanned vehicle during the driving process and send it to the cloud server in real time.
  • after receiving the driving data sent by the unmanned vehicle, the cloud server analyzes the driving data to determine the operating state the unmanned vehicle was in when it sent the data, and then analyzes the driving status of the unmanned vehicle according to that operating state to obtain the operating scenario of the unmanned vehicle.
  • the inventor found at least the following problem in the prior art: since the unmanned vehicle can only collect fixed types of data during the data collection process, the data received by the cloud server is fixed and uniform, the operating scenario of the unmanned vehicle cannot be accurately analyzed, and as a result the cloud server cannot accurately correct its control of the unmanned vehicle.
  • the embodiments of the present application provide a method and equipment for determining an operation scenario of an unmanned vehicle, so as to solve the technical problem in the prior art that the unmanned vehicle can only collect fixed types of data during the data collection process, so that the data received by the cloud server is fixed and uniform and the operating scenario of the unmanned vehicle cannot be accurately analyzed.
  • an embodiment of the present application provides a method for determining an operation scenario of an unmanned vehicle.
  • the method is applied to an unmanned vehicle and includes:
  • receiving a data collection instruction sent by the server, where the data collection instruction includes a collection condition and a type of data to be collected; acquiring driving data corresponding to the type of data to be collected when the collection condition is met; and sending the driving data to the server, so that the server determines and restores the operating scenario of the unmanned vehicle according to the driving data.
  • since the data collection instruction is sent by the server, the corresponding driving data is collected according to the collection condition and the type of data to be collected in the data collection instruction, so that the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scenario of the unmanned vehicle, and realize accurate control of the unmanned vehicle by the server.
  • the collection condition includes at least one of a target time, a target obstacle quantity, a target position, and a target driving state; acquiring the driving data corresponding to the type of data to be collected when the collection condition is met includes: acquiring, in real time, target data sent by each sensor device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; and when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position, and the target driving state, determining that the unmanned vehicle meets the collection condition, and acquiring the driving data of the unmanned vehicle corresponding to the type of data to be collected.
  • receiving the data collection instruction sent by the server includes: receiving, through the over-the-air (OTA) method, the data collection instruction sent by the server at set time intervals.
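For illustration only, a data collection instruction of the kind described above might be represented as follows. This is a minimal sketch; all field and class names are hypothetical and are not taken from the application itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CollectionCondition:
    # Each field is optional, matching "at least one of" in the text.
    target_time: Optional[str] = None           # e.g. a driving time
    target_obstacle_count: Optional[int] = None
    target_position: Optional[str] = None       # e.g. "location B"
    target_driving_state: Optional[str] = None  # e.g. "sudden braking"

@dataclass
class DataCollectionInstruction:
    vehicle_type: str                           # type identification carried in the instruction
    condition: CollectionCondition
    data_types: List[str] = field(default_factory=list)

# An instruction a server might push to a type-A vehicle over OTA.
instruction = DataCollectionInstruction(
    vehicle_type="A",
    condition=CollectionCondition(target_obstacle_count=0,
                                  target_driving_state="sudden braking"),
    data_types=["body_state", "obstacle", "user_somatosensory"],
)
```

The structure mirrors the two parts named in the text: the collection condition and the types of data to be collected.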
  • an embodiment of the present application provides a method for determining an operation scenario of an unmanned vehicle.
  • the method is applied to a server and includes: sending a data collection instruction to the unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected; receiving corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is obtained by the unmanned vehicle according to the type of data to be collected; and determining and restoring the operating scenario of the unmanned vehicle according to the driving data.
  • since the data collection instruction is sent by the server, the corresponding driving data is collected according to the collection condition and the type of data to be collected in the data collection instruction, so that the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scenario of the unmanned vehicle, and realize accurate control of the unmanned vehicle by the server.
  • the collection condition includes at least one of a target time, a target obstacle quantity, a target position, and a target driving state; the driving data is obtained as follows: the unmanned vehicle acquires, at set time intervals, at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter from each of its sensing devices; when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
  • the driving data includes a target identification code corresponding to the collection condition and historical driving data; determining and restoring the operating scenario of the unmanned vehicle according to the driving data includes: querying a pre-stored correspondence between identification codes and collection conditions to obtain the collection condition corresponding to the target identification code; and determining and generating the operating scenario of the unmanned vehicle according to the collection condition and the historical driving data.
  • the method further includes: judging whether the time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the driving data was generated by the unmanned vehicle; and if the time identifier of the driving data exceeds the set time threshold, executing the step of sending a data collection instruction to the unmanned vehicle again.
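The staleness check above can be sketched as follows. This is an illustrative sketch only: the threshold value, function names, and the idea of passing the resend step as a callback are assumptions, not details from the application.

```python
SET_TIME_THRESHOLD = 300.0  # seconds; illustrative value for the "set time threshold"

def driving_data_is_stale(time_identifier: float, now: float,
                          threshold: float = SET_TIME_THRESHOLD) -> bool:
    """True when the driving data was generated longer ago than the threshold."""
    return (now - time_identifier) > threshold

def handle_driving_data(time_identifier: float, now: float, resend_instruction) -> bool:
    """Re-execute the "send data collection instruction" step when the data is too old.

    Returns True when the data is fresh enough to use for scenario restoration.
    """
    if driving_data_is_stale(time_identifier, now):
        resend_instruction()  # the server sends a fresh data collection instruction
        return False
    return True
```

A simple design choice here is that the freshness decision is kept separate from the resend action, so the same predicate can be reused by the server-side module described later.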
  • sending the data collection instruction to the unmanned vehicle includes: sending, through the over-the-air (OTA) method, the data collection instruction to the unmanned vehicle at set time intervals.
  • an embodiment of the present application provides an apparatus for determining an operation scenario of an unmanned vehicle.
  • the apparatus is applied to an unmanned vehicle and includes:
  • the collection instruction receiving module is configured to receive the data collection instruction sent by the server, wherein the data collection instruction includes the collection condition and the type of data to be collected;
  • a driving data acquisition module configured to acquire driving data corresponding to the type of data to be collected when the collection condition is met;
  • the driving data sending module is configured to send the driving data to the server, so that the server can determine and generate the operating scene of the unmanned vehicle according to the driving data.
  • the collection condition includes at least one of a target time, a target obstacle quantity, a target position, and a target driving state; the driving data acquisition module is specifically configured to acquire, at set time intervals, target data sent by the sensor devices of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
  • the collection instruction receiving module is specifically configured to receive the data collection instruction sent by the server at a set time interval in an OTA manner.
  • an embodiment of the present application provides an apparatus for determining an operating scenario of an unmanned vehicle.
  • the apparatus is applied to a server and includes:
  • the collection instruction sending module is used to send a data collection instruction to the unmanned vehicle, wherein the data collection instruction includes the collection condition and the type of data to be collected;
  • a data receiving module configured to receive corresponding driving data sent when the unmanned vehicle meets the collection condition, wherein the driving data is obtained by the unmanned vehicle according to the type of data to be collected;
  • the operating scenario determination module is configured to determine and restore the operating scenario of the unmanned vehicle according to the driving data.
  • the collection condition includes at least one of a target time, a target obstacle quantity, a target position, and a target driving state; the driving data is obtained as follows: the unmanned vehicle acquires, at set time intervals, at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter from each of its sensing devices; when at least one of these parameters matches at least one of the target time, the target obstacle quantity, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
  • the driving data includes a target identification code corresponding to the collection condition and historical driving data; the operating scenario determination module is specifically configured to query a pre-stored correspondence between identification codes and collection conditions to obtain the collection condition corresponding to the target identification code, and to determine and generate the operating scenario of the unmanned vehicle according to the collection condition and the historical driving data.
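The identification-code lookup described above amounts to a table query followed by combining the retrieved condition with the historical data. The sketch below illustrates this; the table contents, key names, and scenario fields are hypothetical, not taken from the application.

```python
# Hypothetical pre-stored correspondence between identification codes and
# collection conditions, as held by the server.
CONDITION_TABLE = {
    "COND-001": {"target_obstacle_count": 0, "target_driving_state": "sudden braking"},
    "COND-002": {"target_position": "location B"},
}

def restore_operating_scenario(driving_data: dict) -> dict:
    """Look up the collection condition for the target identification code and
    combine it with the historical driving data to form the operating scenario."""
    condition = CONDITION_TABLE[driving_data["target_identification_code"]]
    return {
        "collection_condition": condition,
        "historical_driving_data": driving_data["historical_driving_data"],
    }

# Example payload a vehicle might upload: the code plus recorded samples.
scenario = restore_operating_scenario({
    "target_identification_code": "COND-001",
    "historical_driving_data": [{"speed": 12.0}, {"speed": 0.0}],
})
```

Carrying only a short identification code instead of the full condition keeps the uploaded driving data compact, since the server already stores the correspondence.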
  • the device further includes a time threshold judgment module for judging whether the time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the driving data was generated by the unmanned vehicle; if the time identifier of the driving data exceeds the set time threshold, the step of sending a data collection instruction to the unmanned vehicle is executed again.
  • the collection instruction sending module is specifically configured to send the data collection instruction to the unmanned vehicle every set time interval in an OTA manner.
  • an embodiment of the present application provides a device for determining an operating scenario of an unmanned vehicle, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor executes the unmanned vehicle operating scenario determination method described in the first aspect and various possible designs of the first aspect.
  • an embodiment of the present application provides a device for determining an operating scenario of an unmanned vehicle, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor executes the unmanned vehicle operating scenario determination method described in the second aspect and various possible designs of the second aspect.
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, and when a processor executes the computer-executable instructions, the method for determining the unmanned vehicle operating scenario described in the first aspect and the various possible designs of the first aspect is implemented.
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, and when a processor executes the computer-executable instructions, the method for determining the unmanned vehicle operating scenario described in the second aspect and the various possible designs of the second aspect is implemented.
  • the method first receives a data collection instruction sent by a server, where the data collection instruction includes a collection condition and the type of data to be collected; then, when the collection condition is met, acquires driving data corresponding to the type of data to be collected; finally, sends the driving data to the server, so that the server determines and restores the operating scenario of the unmanned vehicle according to the driving data.
  • since the data collection instruction is sent by the server, the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, so that the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scenario of the unmanned vehicle, and realize accurate control of the unmanned vehicle by the server.
  • FIG. 1 is a schematic diagram of the system architecture of an unmanned vehicle operation scenario determination system provided by an embodiment of the application;
  • FIG. 2 is a schematic flowchart 1 of a method for determining an operation scenario of an unmanned vehicle provided by an embodiment of the application;
  • FIG. 3 is a second schematic flowchart of a method for determining an unmanned vehicle operation scenario provided by an embodiment of the application;
  • FIG. 4 is a third schematic flowchart of a method for determining an operation scenario of an unmanned vehicle provided by an embodiment of the application;
  • FIG. 5 is a schematic diagram of the interaction flow of the method for determining the operation scenario of an unmanned vehicle provided by an embodiment of the application;
  • FIG. 6 is a structural schematic diagram 1 of an apparatus for determining an operation scenario of an unmanned vehicle provided by an embodiment of the application;
  • FIG. 7 is a second structural schematic diagram of an unmanned vehicle operation scene determining apparatus provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of the hardware structure of an unmanned vehicle positioning device provided by an embodiment of the application.
  • FIG. 1 is a schematic diagram of the system architecture of an unmanned vehicle operating scenario determination system provided by an embodiment of the application.
  • the system provided in this embodiment includes an unmanned vehicle 101 and a server 102, and the unmanned vehicle 101 and the server 102 communicate through a network 103.
  • the unmanned vehicle 101 may be equipped with a processor and various sensors, and the sensors are used to perceive various states or driving parameters of the unmanned vehicle.
  • the server 102 may be a server, a server cluster composed of multiple servers, or a cloud computing platform.
  • the server 102 can realize data transmission with the unmanned vehicle 101 through the network 103, and complete the control of the unmanned vehicle 101.
  • the number of unmanned vehicles 101 and servers 102 in FIG. 1 is only illustrative; any number of unmanned vehicles 101 and servers 102 can be provided as needed.
  • the existing process of determining the operating scenario of the unmanned vehicle is to collect the driving data of the unmanned vehicle during the driving process.
  • after the cloud server receives the driving data sent by the unmanned vehicle, it needs to analyze the driving data to determine the operating state of the unmanned vehicle when the data was sent, and then analyze the driving conditions of the unmanned vehicle according to that operating state to obtain the operating scenario of the unmanned vehicle.
  • because unmanned vehicles can only collect fixed types of data during data collection, the data received by the cloud server is fixed and uniform and cannot support accurate analysis of the operating scenarios of unmanned vehicles, so the control of unmanned vehicles cannot be corrected accurately.
  • the embodiments of the present application provide a method and equipment for determining the operating scenario of an unmanned vehicle, so that the driving data collected by unmanned vehicles is determined by data collection instructions instead of being fixed as in the prior art.
  • the server can obtain the driving data of unmanned vehicles according to the data collection instructions, accurately analyze the operating scenarios of unmanned vehicles, and realize accurate control of unmanned vehicles by the server.
  • Fig. 2 is a schematic flow chart 1 of the method for determining the operating scenario of an unmanned vehicle provided by an embodiment of the application.
  • the execution subject of this embodiment may be the unmanned vehicle in the embodiment shown in FIG. 1; this embodiment is not particularly limited here.
  • the method includes:
  • Step S201 Receive a data collection instruction sent by a server, where the data collection instruction includes collection conditions and types of data to be collected.
  • the data collection instruction may be determined according to the type of the unmanned vehicle, and the type identification of the unmanned vehicle may be carried in the data collection instruction.
  • each type of unmanned vehicle corresponds to a different data collection instruction. For example, for a type A unmanned vehicle, the received data collection instruction carries the collection condition and the type of data to be collected for type A unmanned vehicles; for a type B unmanned vehicle, the received data collection instruction carries the collection condition and the type of data to be collected for type B unmanned vehicles.
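The per-type lookup described above can be sketched as a simple table keyed by the vehicle type identification. The table contents and names below are illustrative placeholders, not values from the application.

```python
# Illustrative per-type instruction table held by the server; the condition
# values and data types are stand-ins, not taken from the application.
INSTRUCTIONS_BY_VEHICLE_TYPE = {
    "A": {"condition": {"target_driving_state": "sudden braking"},
          "data_types": ["body_state", "obstacle"]},
    "B": {"condition": {"target_obstacle_count": 0},
          "data_types": ["user_somatosensory"]},
}

def build_data_collection_instruction(vehicle_type: str) -> dict:
    """Return the data collection instruction for a vehicle type, carrying
    the type identification alongside the condition and data types."""
    entry = INSTRUCTIONS_BY_VEHICLE_TYPE[vehicle_type]
    return {"vehicle_type": vehicle_type, **entry}
```

For example, `build_data_collection_instruction("A")` would yield the type-A condition and data types with the type identification attached.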
  • the acquisition conditions include the target time of the unmanned vehicle, the number of target obstacles, the target location, and the target driving state.
  • the target time refers to the driving time of the unmanned vehicle
  • the target obstacle quantity refers to the number of obstacles detected by the unmanned vehicle's sensors in its driving direction
  • the target position refers to the location where the unmanned vehicle is driving
  • the target driving state refers to whether the unmanned vehicle is running or braking.
  • the unmanned vehicle is equipped with various sensing devices that collect its driving data, such as a camera for collecting images, a Global Positioning System (GPS) module for positioning, an acceleration sensor for sensing the driving state of the unmanned vehicle, and so on.
  • the type of data to be collected is used to instruct the unmanned vehicle to obtain driving data corresponding to the type of data to be collected from various sensor devices installed on the unmanned vehicle.
  • the types of data to be collected may include vehicle body state data types, obstacle data types, and user somatosensory data types.
  • the body state data type is used to instruct the unmanned vehicle to collect various parameters during the operation of the unmanned vehicle, such as speed, acceleration, direction, etc.
  • the obstacle data type is used to instruct the unmanned vehicle to collect the number and size of obstacles
  • the user somatosensory data type is used to instruct the unmanned vehicle to collect the relative acceleration perceived by the user.
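The three data types above can be pictured as a mapping from each type to the sensor readings it covers, from which the requested driving data is filtered out of a full sensor snapshot. The field names below are hypothetical, chosen only to match the parameters named in the text.

```python
# Hypothetical mapping from data types to the sensor readings they cover.
DATA_TYPE_FIELDS = {
    "body_state": ["speed", "acceleration", "direction"],
    "obstacle": ["obstacle_count", "obstacle_sizes"],
    "user_somatosensory": ["relative_acceleration"],
}

def select_driving_data(sensor_snapshot: dict, data_types: list) -> dict:
    """Keep only the readings that belong to the requested data types."""
    wanted = {f for t in data_types for f in DATA_TYPE_FIELDS[t]}
    return {k: v for k, v in sensor_snapshot.items() if k in wanted}

# A full snapshot from the vehicle's sensing devices.
snapshot = {"speed": 10.5, "direction": "N", "obstacle_count": 2,
            "obstacle_sizes": [1.2], "relative_acceleration": 0.3}
```

With this filter, an instruction asking only for the body state data type would leave obstacle and somatosensory readings out of the upload.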
  • Step S202 Acquire driving data corresponding to the type of data to be collected when the collection condition is met.
  • the collection conditions may include the target time, the number of target obstacles, the target location, and the target driving state.
  • the process of judging whether the unmanned vehicle meets the acquisition conditions can be:
  • the unmanned vehicle obtains the current time parameter, obstacle quantity parameter, position parameter, and driving state parameter from each of its sensing devices at set time intervals; when these are consistent with the target time, target obstacle quantity, target position, and target driving state, it is determined that the unmanned vehicle meets the collection condition.
  • the process of judging whether the unmanned vehicle meets the acquisition conditions can also be:
  • the unmanned vehicle acquires, at set time intervals, target data sent by its sensor devices, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; when at least one of the current time parameter, obstacle quantity parameter, position parameter, and driving state parameter matches at least one of the target time, target obstacle quantity, target position, and target driving state, it is determined that the unmanned vehicle meets the collection condition.
  • the process of obtaining the driving data corresponding to the type of data to be collected may be:
  • the types of data to be collected may include the vehicle body state data type, the obstacle data type, and the user somatosensory data type, and the driving data corresponding to each indicated type is acquired from the corresponding sensor devices.
  • Step S203 Send the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data.
  • the driving data includes unmanned vehicle body state data, obstacle data, user somatosensory data, and so on; by analyzing and restoring this driving data, the operating scenario of the unmanned vehicle is obtained.
  • the operating scenario of the unmanned vehicle refers to the time and location information of the unmanned vehicle, the obstacle situation outside the vehicle (including information such as the number and size of the obstacles), and the status of the user.
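Assembling the operating scenario from the uploaded driving data can be sketched as below. The scenario fields follow the ones named in the text (time and location, obstacle situation, user status); the exact key names are assumptions for illustration.

```python
def build_operating_scenario(driving_data: dict) -> dict:
    """Assemble the operating scenario from body state, obstacle, and
    user somatosensory driving data. Field names are illustrative."""
    return {
        "time": driving_data["body_state"]["time"],
        "location": driving_data["body_state"]["location"],
        "obstacles": {"count": driving_data["obstacle"]["count"],
                      "sizes": driving_data["obstacle"]["sizes"]},
        "user_status": driving_data["user_somatosensory"],
    }
```

A server-side module of this shape keeps the scenario a plain record that downstream analysis and control correction can consume directly.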
  • this embodiment first receives the data collection instruction sent by the server, where the data collection instruction includes the collection condition and the type of data to be collected; then, when the collection condition is met, acquires the driving data corresponding to the type of data to be collected; finally, sends the driving data to the server, so that the server determines and restores the operating scenario of the unmanned vehicle according to the driving data. Since the data collection instruction is sent by the server, the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, so the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scenario of the unmanned vehicle, and realize accurate control of the unmanned vehicle by the server.
  • the collection condition includes at least one of the target time, the target obstacle quantity, the target position, and the target driving state; in the embodiment corresponding to FIG. 2, the process in step S202 of acquiring the driving data corresponding to the type of data to be collected when the collection condition is met includes:
  • when at least one of the acquired parameters matches the collection condition, determining that the unmanned vehicle meets the collection condition, and acquiring the driving data of the unmanned vehicle corresponding to the type of data to be collected.
  • that at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position, and the target driving state means that one or more of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter are the same as the corresponding one or more of the target time, the target obstacle quantity, the target position, and the target driving state.
  • for example, the collection condition is that the unmanned vehicle is at target time A, the number of obstacles is 0, the target location is location B, and the target driving state is "sudden braking".
  • when the unmanned vehicle detects that the parameters obtained from each sensing device match target time A, an obstacle quantity of 0, target location B, and the target driving state of "sudden braking", it is determined that the unmanned vehicle meets the collection condition, and at this time the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
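The matching rule above ("at least one of the parameters matches") can be sketched as a small predicate. The condition values below are stand-ins for "target time A" and "location B" from the example; the function and key names are illustrative assumptions.

```python
def meets_collection_condition(params: dict, condition: dict) -> bool:
    """'At least one matches' rule: any parameter named in the condition that
    equals the corresponding target value satisfies the condition."""
    return any(params.get(key) == target for key, target in condition.items())

# The collection condition from the example above (stand-in values).
condition = {"time": "A", "obstacle_count": 0,
             "position": "B", "driving_state": "sudden braking"}

# Sensor parameters read at one sampling interval.
params = {"time": "A", "obstacle_count": 0,
          "position": "B", "driving_state": "sudden braking"}
```

Note that a vehicle whose driving state alone matches "sudden braking" already satisfies the condition under this rule, even if its time and position differ from the targets.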
  • in the embodiment corresponding to FIG. 2, receiving the data collection instruction sent by the server in step S201 includes: receiving the data collection instruction sent by the server at set time intervals in an over-the-air (OTA) manner.
  • FIG. 3 is a schematic diagram of the second flow of a method for determining an unmanned vehicle operating scenario provided by an embodiment of the application.
  • the execution subject of this embodiment may be the server in the embodiment shown in FIG. 1, and this embodiment is not particularly limited here.
  • the method includes:
  • Step S301 Send a data collection instruction to the unmanned vehicle, where the data collection instruction includes the collection condition and the type of data to be collected.
  • Step S302 Receive corresponding driving data sent when the unmanned vehicle satisfies the collection condition, where the driving data is obtained by the unmanned vehicle according to the type of data to be collected.
  • Step S303 Determine and restore the operating scenario of the unmanned vehicle according to the driving data.
  • this embodiment sends a data collection instruction to the unmanned vehicle, where the data collection instruction includes the collection condition and the type of data to be collected; the corresponding data sent when the unmanned vehicle meets the collection condition is received Driving data, wherein the driving data is obtained by an unmanned vehicle according to the type of data to be collected; and determining and restoring the operating scene of the unmanned vehicle according to the driving data.
  • since the data collection instruction is sent by the server, and the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art.
  • the server can obtain driving data of the unmanned vehicle according to the data collection instructions, and then accurately analyze the operation scene of the unmanned vehicle, and realize the accurate control of the unmanned vehicle by the server.
  • the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
  • FIG. 4 is a schematic diagram of the third flow of a method for determining an unmanned vehicle operating scene provided by an embodiment of the application.
  • on the basis of the embodiment in FIG. 3, the driving data includes the target identification code corresponding to the collection condition and historical driving data; this embodiment describes in detail the process of determining and generating the operating scene of the unmanned vehicle according to the driving data.
  • the method includes:
  • Step S401 Query the correspondence between the pre-stored identification code and the collection condition, and obtain the collection condition corresponding to the target identification code.
  • in this embodiment, since multiple collection instructions may have been sent to the unmanned vehicle, in order to determine which collection condition the driving data corresponds to, the target identification code corresponding to the collection condition under which the unmanned vehicle collected the driving data is packed together with the historical driving data to form the driving data.
  • after the driving data is received, the target identification code is extracted, the correspondence between the pre-stored identification codes and the collection conditions is queried according to the target identification code, and the collection condition corresponding to the target identification code is obtained.
  • for example, Table 1 is an example of the correspondence between pre-stored identification codes and collection conditions.
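The lookup described above can be sketched as follows, assuming the server keeps the Table 1 mapping from identification codes to collection conditions in a plain dict; the key names and packed-data layout are hypothetical.

```python
# A minimal sketch of the identification-code lookup. The mapping mirrors
# Table 1; the "target_id" field name in the packed driving data is an
# assumption for illustration.
ID_TO_CONDITION = {
    "01": "target time 1, target obstacle number 1, target position 1, target driving state 1",
    "02": "target time 2, target obstacle number 2, target position 2, target driving state 2",
    "03": "target time 3, target obstacle number 4, target position 4, target driving state 4",
}

def resolve_collection_condition(driving_data: dict) -> str:
    """Extract the target identification code packed into the driving data
    and look up the collection condition it stands for."""
    return ID_TO_CONDITION[driving_data["target_id"]]

resolve_collection_condition({"target_id": "02", "history": []})
```

Sending only the short code and resolving it server-side is what keeps the transmitted driving data small, as the passage below notes.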
  • Step S402 Determine and generate the operating scenario of the unmanned vehicle according to the collection condition and the historical driving data.
  • the historical driving data may include body state data, obstacle data, and user somatosensory data.
  • the acquisition conditions include the target time of the unmanned vehicle, the number of target obstacles, the target location and the target driving state.
  • by extracting the body state data, obstacle data, and user somatosensory data, the operating scene of the unmanned vehicle is obtained.
  • the operating scene of the unmanned vehicle refers to the time and position information of the unmanned vehicle's operation, the situation of obstacles outside the unmanned vehicle (including information such as their number and size), and the state of the user.
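A hedged sketch of assembling such an operating-scene record from the resolved collection condition and the historical driving data; the record layout and field names are illustrative assumptions, not the patent's data format.

```python
# Illustrative only: combining the collection condition (time/position
# targets) with the historical driving data (body state, obstacle and user
# somatosensory data) into one operating-scene record.
def restore_operating_scene(condition: dict, history: dict) -> dict:
    return {
        "time": condition.get("target_time"),
        "position": condition.get("target_position"),
        "obstacles": history.get("obstacle_data", []),   # number, size, ...
        "body_state": history.get("body_state", {}),     # speed, acceleration, ...
        "user_state": history.get("somatosensory", {}),  # perceived acceleration, ...
    }

scene = restore_operating_scene(
    {"target_time": "A", "target_position": "B"},
    {"obstacle_data": [{"size": 2}], "body_state": {"speed": 30}},
)
```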
  • querying the collection conditions based on the target identifier instead of directly sending the data of the collection conditions and historical driving data together, can reduce the volume of transmitted data and improve the efficiency of data transmission.
  • after receiving the corresponding driving data sent when the unmanned vehicle meets the collection condition, the method further includes: judging whether the time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the unmanned vehicle generated the driving data; if the time identifier exceeds the set time threshold, the step of sending a data collection instruction to the unmanned vehicle is executed again.
  • since the control of the unmanned vehicle is real-time, the received driving data must not exceed a certain expiration time; otherwise, the received driving data cannot represent the current driving state of the unmanned vehicle.
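The expiry check above can be sketched as a timestamp comparison, assuming the time identifier is a Unix timestamp recorded when the unmanned vehicle generated the driving data (the representation is an assumption for illustration).

```python
import time

# Sketch of the staleness check: driving data older than the set time
# threshold is treated as expired and the collection instruction is re-sent.
def is_expired(time_identifier, threshold_s, now=None):
    """True when the driving data is older than the set time threshold."""
    if now is None:
        now = time.time()
    return now - time_identifier > threshold_s
```

Making `now` injectable keeps the check deterministic under test while defaulting to the wall clock in normal use.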
  • the sending a data collection instruction to the unmanned vehicle includes:
  • the data collection instruction is sent to the unmanned vehicle at set time intervals in an over-the-air (OTA) manner.
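The periodic sending described above can be sketched as follows. This is illustrative only: `send_ota` is a placeholder for whatever OTA transport is actually used, and the sleep function is injectable so the loop can be exercised without waiting.

```python
import time

# Sketch of re-issuing the data collection instruction at a set interval.
def send_periodically(send_ota, instruction, interval_s, rounds, sleep=time.sleep):
    """Send the data collection instruction `rounds` times, once every
    `interval_s` seconds."""
    for _ in range(rounds):
        send_ota(instruction)
        sleep(interval_s)
```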
  • FIG. 5 is a schematic diagram of the interaction flow of the method for determining the operation scenario of the unmanned vehicle provided by the embodiment of the application. The interaction process between the unmanned vehicle and the server in this embodiment is described, and this embodiment does not specifically limit it here. As shown in Figure 5, the method includes:
  • Step S501 The server sends a data collection instruction to the unmanned vehicle, where the collection instruction includes the collection condition and the type of data to be collected.
  • Step S502 The unmanned vehicle obtains driving data corresponding to the type of data to be collected when the collection condition is satisfied.
  • Step S503 The unmanned vehicle sends the driving data to the server.
  • Step S504 The server determines the operating scenario of the unmanned vehicle according to the driving data.
  • the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, and then accurately analyze the operation scene of the unmanned vehicle, and realize the accurate control of the unmanned vehicle by the server.
  • FIG. 6 is a first structural diagram of an apparatus for determining an unmanned vehicle operating scene provided by an embodiment of the application.
  • the device 600 for determining the operating scene of the unmanned vehicle includes: a collection instruction receiving module 601, a driving data acquiring module 602 and a driving data sending module 603.
  • the collection instruction receiving module 601 is configured to receive data collection instructions sent by the server, where the data collection instructions include collection conditions and types of data to be collected;
  • a driving data acquisition module 602 configured to acquire driving data corresponding to the type of data to be collected when the collection conditions are met;
  • the driving data sending module 603 is configured to send the driving data to the server, so that the server determines and generates the operating scene of the unmanned vehicle according to the driving data.
  • the device provided in this embodiment can be used to execute the technical solution of the method embodiment corresponding to FIG. 2, and its implementation principles and technical effects are similar, and will not be repeated here in this embodiment.
  • the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state;
  • the driving data acquisition module 602 is specifically configured to acquire, at set time intervals, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
  • the collection instruction receiving module 601 is specifically configured to receive the data collection instruction sent by the server at a set time interval in an OTA manner.
  • FIG. 7 is a second structural schematic diagram of an apparatus for determining an operation scenario of an unmanned vehicle provided by an embodiment of the application.
  • the device 700 for determining the operating scene of the unmanned vehicle includes:
  • the collection instruction sending module 701 is configured to send data collection instructions to the unmanned vehicle, where the data collection instructions include collection conditions and types of data to be collected;
  • the data receiving module 702 is configured to receive corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is obtained by the unmanned vehicle according to the type of data to be collected;
  • the operating scenario determination module 703 is configured to determine and restore the operating scenario of the unmanned vehicle according to the driving data.
  • the device provided in this embodiment can be used to execute the technical solution of the method embodiment corresponding to FIG. 3, and its implementation principles and technical effects are similar, and details are not described herein again in this embodiment.
  • the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
  • the driving data includes a target identification code corresponding to the acquisition condition and historical driving data
  • the operating scenario determination module 703 is specifically configured to query the correspondence between the pre-stored identification codes and the collection conditions, obtain the collection condition corresponding to the target identification code, and determine and generate the operating scene of the unmanned vehicle according to the collection condition and the historical driving data.
  • the device further includes:
  • the time threshold judgment module 704 is used to judge whether the time identifier of the driving data exceeds a set time threshold, wherein the time identifier corresponds to the time when the unmanned vehicle generates the driving data; if the time identifier of the driving data If the set time threshold is exceeded, the step of sending a data collection instruction to the unmanned vehicle is executed again.
  • the collection instruction sending module 701 is specifically configured to send the data collection instruction to the unmanned vehicle every set time interval in an OTA manner.
  • FIG. 8 is a schematic diagram of the hardware structure of an unmanned vehicle positioning device provided by an embodiment of the application.
  • the unmanned vehicle positioning device 800 provided in this embodiment includes: at least one processor 801 and a memory 802.
  • the device 800 further includes a communication component 803, where the processor 801, the memory 802, and the communication component 803 are connected through a bus 804.
  • At least one processor 801 executes the computer-executable instructions stored in the memory 802, so that at least one processor 801 executes the unmanned vehicle operating scenario determination method in any of the foregoing method embodiments.
  • the communication component 803 is used to communicate with the terminal device and/or the server.
  • the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like. The steps of the method disclosed in combination with the application can be directly embodied as executed and completed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the memory may include high-speed RAM memory, and may also include non-volatile storage NVM, such as at least one disk memory.
  • the bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on.
  • the buses in the drawings of this application are not limited to only one bus or one type of bus.
  • the embodiment of the present application also provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the above method for determining an operating scene of an unmanned vehicle.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division into modules is only a logical functional division; in actual implementation there may be other divisions, for example, multiple modules may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or modules, and may be in electrical, mechanical or other forms.
  • modules described as separate components may or may not be physically separated, and the components displayed as modules may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional modules in the various embodiments of the present application may be integrated into one processing unit, or each module may exist alone physically, or two or more modules may be integrated into one unit.
  • the units formed by the above-mentioned modules can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • the above-mentioned integrated modules implemented in the form of software function modules may be stored in a computer readable storage medium.
  • the above-mentioned software function modules are stored in a storage medium and include several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute part of the steps of the methods described in the embodiments of this application.
  • the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like. The steps of the method disclosed in combination with the application can be directly embodied as executed and completed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the memory may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk storage, and may also be a U disk, a mobile hard disk, a read-only memory, a magnetic disk, or an optical disk.
  • the bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on.
  • the buses in the drawings of this application are not limited to only one bus or one type of bus.
  • the above-mentioned storage medium can be realized by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disks, or optical disks.
  • a storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from the storage medium and can write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and the storage medium may be located in an application-specific integrated circuit (ASIC).
  • the processor and the storage medium may also exist as discrete components in the electronic device or main control device.
  • a person of ordinary skill in the art can understand that all or part of the steps in the foregoing method embodiments can be implemented by a program instructing relevant hardware.
  • the aforementioned program can be stored in a computer-readable storage medium. When the program is executed, it performs the steps of the foregoing method embodiments; the foregoing storage medium includes media that can store program code, such as ROM, RAM, magnetic disks, or optical disks.


Abstract

Embodiments of the present application provide a method and device for determining an operating scene of an unmanned vehicle. The method first receives a data collection instruction sent by a server, where the data collection instruction includes a collection condition and a type of data to be collected; then acquires driving data corresponding to the type of data to be collected when the collection condition is met; and finally sends the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data. Since the data collection instruction is sent by the server, and the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and thereby achieve accurate control of the unmanned vehicle.

Description

Method and device for determining an operating scene of an unmanned vehicle
This application claims priority to Chinese patent application No. CN201910104851.3, filed with the Chinese Patent Office on February 1, 2019 and entitled "Method and device for determining an operating scene of an unmanned vehicle", the entire contents of which are incorporated herein by reference.
Technical field
Embodiments of the present application relate to the field of unmanned vehicle technology, and in particular to a method and device for determining an operating scene of an unmanned vehicle.
Background
With the continuous progress of artificial intelligence technology, unmanned driving technology has also developed considerably, and various unmanned vehicles are favored by more and more users. During driving, an unmanned vehicle needs to send its operating scene to a cloud server; the cloud server analyzes the driving situation of the unmanned vehicle according to the operating scene and corrects the control of the unmanned vehicle. The operating scene here includes information such as driving time and position, obstacle conditions, and the driving state of the unmanned vehicle.
At present, the existing process of determining the operating scene of an unmanned vehicle is as follows: the unmanned vehicle collects its driving data during driving and sends it to the cloud server in real time; after receiving the driving data, the cloud server needs to analyze the driving data to determine what operating state the unmanned vehicle was in when the data was sent, and then analyzes the driving situation of the unmanned vehicle according to its operating state to obtain the operating scene of the unmanned vehicle.
However, the inventors found at least the following problem in the prior art: since the unmanned vehicle can only collect fixed types of data during data collection, the data collected by the cloud server is fixed and monotonous, and the operating scene of the unmanned vehicle cannot be accurately analyzed, so the cloud server cannot accurately correct the control of the unmanned vehicle.
Summary
Embodiments of the present application provide a method and device for determining an operating scene of an unmanned vehicle, so as to solve the technical problem in the prior art that, since an unmanned vehicle can only collect fixed types of data during data collection, the data collected by the cloud server is fixed and monotonous and the operating scene of the unmanned vehicle cannot be accurately analyzed.
In a first aspect, an embodiment of the present application provides a method for determining an operating scene of an unmanned vehicle, applied to an unmanned vehicle, including:
receiving a data collection instruction sent by a server, where the data collection instruction includes a collection condition and a type of data to be collected;
acquiring driving data corresponding to the type of data to be collected when the collection condition is met; and
sending the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data.
On the basis of the above technical content, since the data collection instruction is sent by the server and the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and thereby achieve accurate control of the unmanned vehicle.
In a possible design, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; and acquiring the driving data corresponding to the type of data to be collected when the collection condition is met includes: acquiring, in real time, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; and when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle number, the target position, and the target driving state, determining that the unmanned vehicle meets the collection condition, and acquiring the driving data corresponding to the type of data to be collected of the unmanned vehicle.
By sending the driving data to the server only when the collection condition is met, the network bandwidth pressure caused by sending data to the server in real time can be reduced.
In a possible design, receiving the data collection instruction sent by the server includes: receiving the data collection instruction sent by the server at set time intervals in an over-the-air (OTA) manner.
In a second aspect, an embodiment of the present application provides a method for determining an operating scene of an unmanned vehicle, applied to a server, including:
sending a data collection instruction to an unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected;
receiving corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and
determining and restoring the operating scene of the unmanned vehicle according to the driving data.
On the basis of the above technical content, since the data collection instruction is sent by the server and the corresponding driving data is collected according to the collection condition and the type of data to be collected in the instruction, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art. The server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and thereby achieve accurate control of the unmanned vehicle.
In a possible design, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
In a possible design, the driving data includes a target identification code corresponding to the collection condition and historical driving data; and determining and restoring the operating scene of the unmanned vehicle according to the driving data includes: querying a pre-stored correspondence between identification codes and collection conditions to obtain the collection condition corresponding to the target identification code; and determining and generating the operating scene of the unmanned vehicle according to the collection condition and the historical driving data.
Querying the collection condition by the target identification code, instead of sending the data of the collection condition together with the historical driving data, reduces the volume of transmitted data and improves transmission efficiency.
In a possible design, after receiving the corresponding driving data sent when the unmanned vehicle meets the collection condition, the method further includes: judging whether a time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the unmanned vehicle generated the driving data; and if the time identifier of the driving data exceeds the set time threshold, executing the step of sending a data collection instruction to the unmanned vehicle again.
By re-executing the step of sending a data collection instruction to the unmanned vehicle when the time identifier of the driving data exceeds the set time threshold, receiving driving data that has expired can be avoided.
In a possible design, sending the data collection instruction to the unmanned vehicle includes: sending the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
In a third aspect, an embodiment of the present application provides an apparatus for determining an operating scene of an unmanned vehicle, applied to an unmanned vehicle, including:
a collection instruction receiving module, configured to receive a data collection instruction sent by a server, where the data collection instruction includes a collection condition and a type of data to be collected;
a driving data acquisition module, configured to acquire driving data corresponding to the type of data to be collected when the collection condition is met; and
a driving data sending module, configured to send the driving data to the server, so that the server determines and generates the operating scene of the unmanned vehicle according to the driving data.
In a possible design, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; and the driving data acquisition module is specifically configured to acquire, at set time intervals, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, determine that the unmanned vehicle meets the collection condition and acquire the driving data corresponding to the type of data to be collected of the unmanned vehicle.
In a possible design, the collection instruction receiving module is specifically configured to receive the data collection instruction sent by the server at set time intervals in an OTA manner.
In a fourth aspect, an embodiment of the present application provides an apparatus for determining an operating scene of an unmanned vehicle, applied to a server, including:
a collection instruction sending module, configured to send a data collection instruction to an unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected;
a data receiving module, configured to receive corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and
an operating scene determination module, configured to determine and restore the operating scene of the unmanned vehicle according to the driving data.
In a possible design, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
In a possible design, the driving data includes a target identification code corresponding to the collection condition and historical driving data; and the operating scene determination module is specifically configured to query a pre-stored correspondence between identification codes and collection conditions, obtain the collection condition corresponding to the target identification code, and determine and generate the operating scene of the unmanned vehicle according to the collection condition and the historical driving data.
In a possible design, the apparatus further includes: a time threshold judgment module, configured to judge whether a time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the unmanned vehicle generated the driving data; and if the time identifier of the driving data exceeds the set time threshold, execute the step of sending a data collection instruction to the unmanned vehicle again.
In a possible design, the collection instruction sending module is specifically configured to send the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
In a fifth aspect, an embodiment of the present application provides a device for determining an operating scene of an unmanned vehicle, including: at least one processor and a memory;
the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for determining an operating scene of an unmanned vehicle described in the first aspect and its various possible designs.
In a sixth aspect, an embodiment of the present application provides a device for determining an operating scene of an unmanned vehicle, including: at least one processor and a memory;
the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for determining an operating scene of an unmanned vehicle described in the second aspect and its various possible designs.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the method for determining an operating scene of an unmanned vehicle described in the first aspect and its various possible designs.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the method for determining an operating scene of an unmanned vehicle described in the second aspect and its various possible designs.
The method and device for determining an operating scene of an unmanned vehicle provided by the embodiments first receive a data collection instruction sent by a server, where the data collection instruction includes a collection condition and a type of data to be collected; then acquire the driving data corresponding to the type of data to be collected when the collection condition is met; and finally send the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data. Since the data collection instruction is sent by the server, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art; the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and achieve accurate control of the unmanned vehicle.
Brief description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate some embodiments of the present application, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
FIG. 1 is a schematic diagram of the system architecture of the system for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 2 is a first schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 3 is a second schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 4 is a third schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of the interaction flow of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 6 is a first structural schematic diagram of the apparatus for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 7 is a second structural schematic diagram of the apparatus for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of the hardware structure of the unmanned vehicle positioning device provided by an embodiment of the present application.
Detailed description
In order to make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are part of the embodiments of the present application rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
FIG. 1 is a schematic diagram of the system architecture of the system for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. As shown in FIG. 1, the system provided by this embodiment includes an unmanned vehicle 101 and a server 102, which communicate through a network 103.
Here, the unmanned vehicle 101 may be equipped with a processor and various sensors, and the sensors are used to sense various states or driving parameters of the unmanned vehicle.
The server 102 may be a single server, a server cluster composed of multiple servers, or a cloud computing platform. The server 102 can exchange data with the unmanned vehicle 101 through the network 103 to control the unmanned vehicle 101.
It should be understood that the numbers of unmanned vehicles 101 and servers 102 in FIG. 1 are merely illustrative; any number of unmanned vehicles 101 and servers 102 may be provided as needed.
At present, the existing process of determining the operating scene of an unmanned vehicle is as follows: the unmanned vehicle collects its driving data during driving and sends it to the cloud server in real time; after receiving the driving data, the cloud server needs to analyze the driving data to determine what operating state the unmanned vehicle was in when the data was sent, and then analyzes the driving situation of the unmanned vehicle according to its operating state to obtain the operating scene of the unmanned vehicle.
Since the unmanned vehicle can only collect fixed types of data during data collection, the data collected by the cloud server is fixed and monotonous, the operating scene of the unmanned vehicle cannot be accurately analyzed, and the cloud server cannot accurately correct the control of the unmanned vehicle. The embodiments of the present application provide a method and device for determining an operating scene of an unmanned vehicle, so that the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art; the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and achieve accurate control of the unmanned vehicle.
FIG. 2 is a first schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. The execution subject of this embodiment may be the unmanned vehicle in the embodiment shown in FIG. 1, and this embodiment is not particularly limited here. As shown in FIG. 2, the method includes:
Step S201: receive a data collection instruction sent by the server, where the data collection instruction includes a collection condition and a type of data to be collected.
In this embodiment, the data collection instruction may be determined according to the type of the unmanned vehicle to be confirmed, and may carry a type identifier of the unmanned vehicle. Each type of unmanned vehicle corresponds to a different data collection instruction. For example, for an unmanned vehicle of type A, the received data collection instruction carries the collection condition and type of data to be collected for type A; for an unmanned vehicle of type B, the received data collection instruction carries the collection condition and type of data to be collected for type B.
In this embodiment, the collection condition includes the target time, target obstacle number, target position, and target driving state of the unmanned vehicle, among others. The target time refers to the time at which the unmanned vehicle is driving; the target obstacle number refers to the number of obstacles detected by sensors in the driving direction of the unmanned vehicle; the target position refers to the place where the unmanned vehicle is driving; and the target driving state indicates whether the unmanned vehicle is driving or braking.
The unmanned vehicle is provided with various sensing devices for collecting driving data, for example, a camera for capturing images, a Global Positioning System (GPS) module for positioning, an acceleration sensor for acquiring the driving state of the unmanned vehicle, and so on.
The type of data to be collected is used to instruct the unmanned vehicle to acquire the driving data corresponding to that type from the various sensing devices installed on the unmanned vehicle.
The type of data to be collected may include a body state data type, an obstacle data type, a user somatosensory data type, and the like. The body state data type instructs the unmanned vehicle to collect various parameters of its operation, such as speed, acceleration, and direction; the obstacle data type instructs the unmanned vehicle to collect the number and size of obstacles; and the user somatosensory data type instructs the unmanned vehicle to collect the relative acceleration perceived by the user, and so on.
Step S202: acquire the driving data corresponding to the type of data to be collected when the collection condition is met.
In this embodiment, the collection condition may include a target time, a target obstacle number, a target position, and a target driving state.
Specifically, the process of judging whether the unmanned vehicle meets the collection condition may be:
among the time parameter, obstacle quantity parameter, position parameter, and driving state parameter acquired by the unmanned vehicle from its sensing devices at set time intervals, when these parameters are consistent with the target time, target obstacle number, target position, and target driving state, it is determined that the unmanned vehicle meets the collection condition.
The process of judging whether the unmanned vehicle meets the collection condition may also be:
acquiring, in real time, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, determining that the unmanned vehicle meets the collection condition.
Specifically, the process of acquiring the driving data corresponding to the type of data to be collected may be:
acquiring each type identifier in the type of data to be collected; determining, according to each type identifier, the sensor corresponding to it; sending a collection instruction to the sensor corresponding to each type identifier, so as to control each sensor to collect the driving data corresponding to that type identifier; and receiving the driving data corresponding to each type identifier collected by each sensor. The type of data to be collected may include a body state data type, an obstacle data type, a user somatosensory data type, and the like.
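The dispatch-by-type step above can be sketched as a mapping from each type identifier to the sensing devices that provide it; the sensor names and the `read_sensor` interface are hypothetical assumptions for illustration, not the patent's device interface.

```python
# Sketch: map each type identifier in the data type to be collected onto
# the sensing devices that provide it, then gather their readings.
SENSORS_BY_TYPE = {
    "body_state": ["speedometer", "accelerometer", "gyroscope"],
    "obstacle": ["camera", "lidar"],
    "somatosensory": ["accelerometer"],
}

def collect_driving_data(type_ids, read_sensor):
    """Return {type identifier: {sensor: reading}} for the requested types;
    read_sensor(name) stands in for the actual device interface."""
    return {t: {s: read_sensor(s) for s in SENSORS_BY_TYPE[t]} for t in type_ids}

collect_driving_data(["obstacle"], lambda name: f"{name}-reading")
```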
Step S203: send the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data.
In this embodiment, the driving data includes the body state data, obstacle data, and user somatosensory data of the unmanned vehicle. By extracting the body state data, obstacle data, and user somatosensory data, the operating scene of the unmanned vehicle is obtained. The operating scene of the unmanned vehicle refers to the time and position information of the unmanned vehicle's operation, the situation of obstacles outside the unmanned vehicle (including information such as their number and size), and the state of the user.
As can be seen from the above description, this embodiment first receives the data collection instruction sent by the server, where the data collection instruction includes a collection condition and a type of data to be collected; then acquires the driving data corresponding to the type of data to be collected when the collection condition is met; and finally sends the driving data to the server, so that the server determines and restores the operating scene of the unmanned vehicle according to the driving data. Since the data collection instruction is sent by the server, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art; the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and achieve accurate control of the unmanned vehicle.
In an embodiment of the present application, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; in the embodiment corresponding to FIG. 2, the process in step S202 of acquiring the driving data corresponding to the type of data to be collected when the collection condition is met includes:
acquiring, in real time, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter;
when at least one of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter matches at least one of the target time, the target obstacle number, the target position, and the target driving state, determining that the unmanned vehicle meets the collection condition, and acquiring the driving data corresponding to the type of data to be collected of the unmanned vehicle.
Here, at least one of the parameters matching at least one of the targets means that one or more of the time parameter, the obstacle quantity parameter, the position parameter, and the driving state parameter are the same as the corresponding one or more of the target time, the target obstacle number, the target position, and the target driving state.
As can be seen from the above embodiment, by sending the driving data to the server only when the collection condition is met, the network bandwidth pressure caused by sending data to the server in real time can be reduced.
For example, the collection condition is that the unmanned vehicle is at target time A, the obstacle number is 0, the target position is location B, and the target driving state is "sudden braking". When the unmanned vehicle detects that the parameters obtained from its sensing devices satisfy target time A, an obstacle number of 0, target position B, and the target driving state "sudden braking", it is determined that the unmanned vehicle meets the collection condition, and at this time the driving data corresponding to the type of data to be collected is obtained.
In an embodiment of the present application, in the embodiment corresponding to FIG. 2, receiving the data collection instruction sent by the server in step S201 includes:
receiving the data collection instruction sent by the server at set time intervals in an over-the-air (OTA) manner.
FIG. 3 is a second schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. The execution subject of this embodiment may be the server in the embodiment shown in FIG. 1, and this embodiment is not particularly limited here. As shown in FIG. 3, the method includes:
Step S301: send a data collection instruction to the unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected.
Step S302: receive corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is acquired by the unmanned vehicle according to the type of data to be collected.
Step S303: determine and restore the operating scene of the unmanned vehicle according to the driving data.
As can be seen from the above description, this embodiment sends a data collection instruction to the unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected; receives the corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and determines and restores the operating scene of the unmanned vehicle according to the driving data. Since the data collection instruction is sent by the server, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art; the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and achieve accurate control of the unmanned vehicle.
In an embodiment of the present application, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
FIG. 4 is a third schematic flowchart of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. On the basis of the embodiment in FIG. 3, the driving data includes the target identification code corresponding to the collection condition and historical driving data; this embodiment describes in detail the process of determining and generating the operating scene of the unmanned vehicle according to the driving data. As shown in FIG. 4, the method includes:
Step S401: query the pre-stored correspondence between identification codes and collection conditions, and obtain the collection condition corresponding to the target identification code.
In this embodiment, since multiple collection instructions may have been sent to the unmanned vehicle, in order to determine which collection condition the driving data corresponds to, the target identification code corresponding to the collection condition under which the unmanned vehicle collected the driving data is packed together with the historical driving data to form the driving data. After the driving data is received, the target identification code is extracted, the pre-stored correspondence between identification codes and collection conditions is queried according to the target identification code, and the collection condition corresponding to the target identification code is obtained.
For example, refer to Table 1, which is an example of the pre-stored correspondence between identification codes and collection conditions.
Table 1. An example of the pre-stored correspondence between identification codes and collection conditions
Identification code | Collection condition
01 | target time 1, target obstacle number 1, target position 1, and target driving state 1
02 | target time 2, target obstacle number 2, target position 2, and target driving state 2
03 | target time 3, target obstacle number 4, target position 4, and target driving state 4
Step S402: determine and generate the operating scene of the unmanned vehicle according to the collection condition and the historical driving data.
In this embodiment, the historical driving data may include body state data, obstacle data, and user somatosensory data, and the collection condition includes the target time, target obstacle number, target position, and target driving state of the unmanned vehicle. By extracting the body state data, obstacle data, and user somatosensory data, the operating scene of the unmanned vehicle is obtained. The operating scene of the unmanned vehicle refers to the time and position information of the unmanned vehicle's operation, the situation of obstacles outside the unmanned vehicle (including information such as their number and size), and the state of the user.
As can be seen from the above description, querying the collection condition by the target identification code, instead of sending the data of the collection condition together with the historical driving data, reduces the volume of transmitted data and improves transmission efficiency.
In an embodiment of the present application, after receiving the corresponding driving data sent when the unmanned vehicle meets the collection condition, the method further includes:
judging whether the time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the unmanned vehicle generated the driving data;
if the time identifier of the driving data exceeds the set time threshold, executing the step of sending a data collection instruction to the unmanned vehicle again.
In this embodiment, since the control of the unmanned vehicle is real-time, the received driving data must not exceed a certain expiration time; otherwise, the received driving data cannot represent the current driving state of the unmanned vehicle.
As can be seen from the above description, by re-executing the step of sending a data collection instruction to the unmanned vehicle when the time identifier of the driving data exceeds the set time threshold, receiving driving data that has expired can be avoided.
In an embodiment of the present application, sending the data collection instruction to the unmanned vehicle includes:
sending the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
FIG. 5 is a schematic diagram of the interaction flow of the method for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. This embodiment describes the interaction between the unmanned vehicle and the server and is not particularly limited here. As shown in FIG. 5, the method includes:
Step S501: the server sends a data collection instruction to the unmanned vehicle, where the collection instruction includes a collection condition and a type of data to be collected.
Step S502: the unmanned vehicle acquires the driving data corresponding to the type of data to be collected when the collection condition is met.
Step S503: the unmanned vehicle sends the driving data to the server.
Step S504: the server determines the operating scene of the unmanned vehicle according to the driving data.
As can be seen from the above description, since the data collection instruction is sent by the server and the corresponding driving data is collected according to the collection condition and the type of data to be collected, the driving data collected by the unmanned vehicle is determined by the data collection instruction rather than being fixed as in the prior art; the server can obtain the driving data of the unmanned vehicle according to the data collection instruction, accurately analyze the operating scene of the unmanned vehicle, and achieve accurate control of the unmanned vehicle.
FIG. 6 is a first structural schematic diagram of the apparatus for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. As shown in FIG. 6, the apparatus 600 for determining an operating scene of an unmanned vehicle includes: a collection instruction receiving module 601, a driving data acquisition module 602, and a driving data sending module 603.
The collection instruction receiving module 601 is configured to receive a data collection instruction sent by a server, where the data collection instruction includes a collection condition and a type of data to be collected;
the driving data acquisition module 602 is configured to acquire driving data corresponding to the type of data to be collected when the collection condition is met; and
the driving data sending module 603 is configured to send the driving data to the server, so that the server determines and generates the operating scene of the unmanned vehicle according to the driving data.
The apparatus provided in this embodiment can be used to execute the technical solution of the method embodiment corresponding to FIG. 2; its implementation principle and technical effects are similar and will not be repeated here.
In an embodiment of the present application, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state;
the driving data acquisition module 602 is specifically configured to acquire, at set time intervals, target data sent by each sensing device of the unmanned vehicle, where the target data includes at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, determine that the unmanned vehicle meets the collection condition and acquire the driving data corresponding to the type of data to be collected of the unmanned vehicle.
In an embodiment of the present application, the collection instruction receiving module 601 is specifically configured to receive the data collection instruction sent by the server at set time intervals in an OTA manner.
FIG. 7 is a second structural schematic diagram of the apparatus for determining an operating scene of an unmanned vehicle provided by an embodiment of the present application. As shown in FIG. 7, the apparatus 700 for determining an operating scene of an unmanned vehicle includes:
a collection instruction sending module 701, configured to send a data collection instruction to an unmanned vehicle, where the data collection instruction includes a collection condition and a type of data to be collected;
a data receiving module 702, configured to receive corresponding driving data sent when the unmanned vehicle meets the collection condition, where the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and
an operating scene determination module 703, configured to determine and restore the operating scene of the unmanned vehicle according to the driving data.
The apparatus provided in this embodiment can be used to execute the technical solution of the method embodiment corresponding to FIG. 3; its implementation principle and technical effects are similar and will not be repeated here.
In an embodiment of the present application, the collection condition includes at least one of a target time, a target obstacle number, a target position, and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter, and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of these parameters matches at least one of the target time, the target obstacle number, the target position, and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data corresponding to the type of data to be collected of the unmanned vehicle is acquired.
In an embodiment of the present application, the driving data includes a target identification code corresponding to the collection condition and historical driving data;
the operating scene determination module 703 is specifically configured to query the pre-stored correspondence between identification codes and collection conditions, obtain the collection condition corresponding to the target identification code, and determine and generate the operating scene of the unmanned vehicle according to the collection condition and the historical driving data.
In an embodiment of the present application, referring to FIG. 7, the apparatus further includes:
a time threshold judgment module 704, configured to judge whether the time identifier of the driving data exceeds a set time threshold, where the time identifier corresponds to the time when the unmanned vehicle generated the driving data; and if the time identifier of the driving data exceeds the set time threshold, execute the step of sending a data collection instruction to the unmanned vehicle again.
In an embodiment of the present application, the collection instruction sending module 701 is specifically configured to send the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
FIG. 8 is a schematic diagram of the hardware structure of a device for determining a running scene of an unmanned vehicle according to an embodiment of the present application. As shown in FIG. 8, the device 800 provided in this embodiment includes: at least one processor 801 and a memory 802. The device 800 further includes a communication component 803, and the processor 801, the memory 802 and the communication component 803 are connected via a bus 804.
In a specific implementation process, the at least one processor 801 executes the computer-executable instructions stored in the memory 802, so that the at least one processor 801 performs the method for determining a running scene of an unmanned vehicle in any of the above method embodiments. The communication component 803 is used to communicate with a terminal device and/or a server.
For the specific implementation process of the processor 801, reference may be made to the above method embodiments; its implementation principle and technical effect are similar and are not repeated here.
In the embodiment shown in FIG. 8 above, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. The general-purpose processor may be a microprocessor, or any conventional processor. The steps of the methods disclosed in this application may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The memory may include a high-speed RAM memory, and may further include a non-volatile memory (NVM), for example at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, the buses in the drawings of this application are not limited to only one bus or one type of bus.
An embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the method for determining a running scene of an unmanned vehicle as described above.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative: the division into modules is merely a division by logical function, and there may be other divisions in actual implementation; for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in other forms.
The modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of this application may be integrated into one processing unit, each module may exist physically alone, or two or more modules may be integrated into one unit. The unit formed by the above modules may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute some of the steps of the methods described in the embodiments of this application.
It should be understood that the above processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. The general-purpose processor may be a microprocessor, or any conventional processor. The steps of the methods disclosed in this application may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The memory may include a high-speed RAM memory, and may further include a non-volatile memory (NVM), for example at least one magnetic disk memory; it may also be a USB flash drive, a removable hard disk, a read-only memory, a magnetic disk, an optical disc, or the like.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, the buses in the drawings of this application are not limited to only one bus or one type of bus.
The above storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk or an optical disc. The storage medium may be any available medium accessible to a general-purpose or special-purpose computer.
An exemplary storage medium is coupled to the processor, so that the processor can read information from and write information to the storage medium. Of course, the storage medium may also be an integral part of the processor. The processor and the storage medium may be located in an application-specific integrated circuit (ASIC). Of course, the processor and the storage medium may also exist as discrete components in an electronic device or a master control device.
A person of ordinary skill in the art can understand that all or some of the steps of the above method embodiments may be implemented by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments, and the aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk or an optical disc.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of this application rather than to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (20)

  1. A method for determining a running scene of an unmanned vehicle, wherein the method is applied to an unmanned vehicle and comprises:
    receiving a data collection instruction sent by a server, wherein the data collection instruction comprises a collection condition and a type of data to be collected;
    acquiring driving data corresponding to the type of data to be collected when the collection condition is met; and
    sending the driving data to the server, so that the server determines and restores the running scene of the unmanned vehicle according to the driving data.
  2. The method according to claim 1, wherein the collection condition comprises at least one of a target time, a target obstacle quantity, a target position and a target driving state;
    the acquiring driving data corresponding to the type of data to be collected when the collection condition is met comprises:
    acquiring, in real time, target data sent by each sensing device of the unmanned vehicle, wherein the target data comprises at least one of a time parameter, an obstacle quantity parameter, a position parameter and a driving state parameter; and
    when at least one of the time parameter, the obstacle quantity parameter, the position parameter and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position and the target driving state, determining that the unmanned vehicle meets the collection condition, and acquiring the driving data of the unmanned vehicle corresponding to the type of data to be collected.
  3. The method according to claim 1, wherein the receiving a data collection instruction sent by a server comprises:
    receiving the data collection instruction sent by the server at set time intervals in an over-the-air (OTA) manner.
  4. A method for determining a running scene of an unmanned vehicle, wherein the method is applied to a server and comprises:
    sending a data collection instruction to an unmanned vehicle, wherein the data collection instruction comprises a collection condition and a type of data to be collected;
    receiving corresponding driving data sent by the unmanned vehicle when the collection condition is met, wherein the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and
    determining and restoring the running scene of the unmanned vehicle according to the driving data.
  5. The method according to claim 4, wherein the collection condition comprises at least one of a target time, a target obstacle quantity, a target position and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of the time parameter, the obstacle quantity parameter, the position parameter and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
  6. The method according to claim 4, wherein the driving data comprises a target identifier code corresponding to the collection condition and historical driving data;
    the determining and restoring the running scene of the unmanned vehicle according to the driving data comprises:
    querying a pre-stored correspondence between identifier codes and collection conditions to obtain the collection condition corresponding to the target identifier code; and
    determining the running scene of the unmanned vehicle according to the collection condition and the historical driving data.
  7. The method according to claim 4, wherein after the receiving corresponding driving data sent by the unmanned vehicle when the collection condition is met, the method further comprises:
    determining whether a time identifier of the driving data exceeds a set time threshold, wherein the time identifier corresponds to the time at which the unmanned vehicle generated the driving data; and
    if the time identifier of the driving data exceeds the set time threshold, re-executing the step of sending the data collection instruction to the unmanned vehicle.
  8. The method according to any one of claims 4 to 7, wherein the sending a data collection instruction to an unmanned vehicle comprises:
    sending the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
  9. An apparatus for determining a running scene of an unmanned vehicle, wherein the apparatus is applied to an unmanned vehicle and comprises:
    a collection instruction receiving module, configured to receive a data collection instruction sent by a server, wherein the data collection instruction comprises a collection condition and a type of data to be collected;
    a driving data acquiring module, configured to acquire driving data corresponding to the type of data to be collected when the collection condition is met; and
    a driving data sending module, configured to send the driving data to the server, so that the server determines the running scene of the unmanned vehicle according to the driving data.
  10. The apparatus according to claim 9, wherein the collection condition comprises at least one of a target time, a target obstacle quantity, a target position and a target driving state;
    the driving data acquiring module is specifically configured to acquire, at set time intervals, target data sent by each sensing device of the unmanned vehicle, wherein the target data comprises at least one of a time parameter, an obstacle quantity parameter, a position parameter and a driving state parameter; and when at least one of the time parameter, the obstacle quantity parameter, the position parameter and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position and the target driving state, determine that the unmanned vehicle meets the collection condition, and acquire the driving data of the unmanned vehicle corresponding to the type of data to be collected.
  11. The apparatus according to claim 9, wherein the collection instruction receiving module is specifically configured to receive the data collection instruction sent by the server at set time intervals in an OTA manner.
  12. An apparatus for determining a running scene of an unmanned vehicle, wherein the apparatus is applied to a server and comprises:
    a collection instruction sending module, configured to send a data collection instruction to an unmanned vehicle, wherein the data collection instruction comprises a collection condition and a type of data to be collected;
    a data receiving module, configured to receive corresponding driving data sent by the unmanned vehicle when the collection condition is met, wherein the driving data is acquired by the unmanned vehicle according to the type of data to be collected; and
    a running scene determining module, configured to determine and restore the running scene of the unmanned vehicle according to the driving data.
  13. The apparatus according to claim 12, wherein the collection condition comprises at least one of a target time, a target obstacle quantity, a target position and a target driving state; the driving data is at least one of a time parameter, an obstacle quantity parameter, a position parameter and a driving state parameter acquired by the unmanned vehicle from each of its sensing devices at set time intervals; and when at least one of the time parameter, the obstacle quantity parameter, the position parameter and the driving state parameter matches at least one of the target time, the target obstacle quantity, the target position and the target driving state, it is determined that the unmanned vehicle meets the collection condition, and the driving data of the unmanned vehicle corresponding to the type of data to be collected is acquired.
  14. The apparatus according to claim 12, wherein the driving data comprises a target identifier code corresponding to the collection condition and historical driving data;
    the running scene determining module is specifically configured to query a pre-stored correspondence between identifier codes and collection conditions to obtain the collection condition corresponding to the target identifier code, and determine the running scene of the unmanned vehicle according to the collection condition and the historical driving data.
  15. The apparatus according to claim 12, wherein the apparatus further comprises:
    a time threshold determining module, configured to determine whether a time identifier of the driving data exceeds a set time threshold, wherein the time identifier corresponds to the time at which the unmanned vehicle generated the driving data; and if the time identifier of the driving data exceeds the set time threshold, re-execute the step of sending the data collection instruction to the unmanned vehicle.
  16. The apparatus according to any one of claims 12 to 15, wherein the collection instruction sending module is specifically configured to send the data collection instruction to the unmanned vehicle at set time intervals in an OTA manner.
  17. A device for determining a running scene of an unmanned vehicle, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for determining a running scene of an unmanned vehicle according to any one of claims 1 to 3.
  18. A device for determining a running scene of an unmanned vehicle, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for determining a running scene of an unmanned vehicle according to any one of claims 4 to 8.
  19. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the method for determining a running scene of an unmanned vehicle according to any one of claims 1 to 3.
  20. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the method for determining a running scene of an unmanned vehicle according to any one of claims 4 to 8.
PCT/CN2019/103323 2019-02-01 2019-08-29 Method and device for determining unmanned vehicle running scene WO2020155617A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/020,874 US20210024083A1 (en) 2019-02-01 2020-09-15 Method and device for determining unmanned vehicle running scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910104851.3A 2019-02-01 2019-06-25 Method and device for determining unmanned vehicle running scene
CN201910104851.3 2019-02-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/020,874 Continuation US20210024083A1 (en) 2019-02-01 2020-09-15 Method and device for determining unmanned vehicle running scene

Publications (1)

Publication Number Publication Date
WO2020155617A1 true WO2020155617A1 (zh) 2020-08-06

Family

ID=66985401

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103323 WO2020155617A1 (zh) 2019-02-01 2019-08-29 无人车运行场景确定方法及设备

Country Status (3)

Country Link
US (1) US20210024083A1 (zh)
CN (1) CN109934954B (zh)
WO (1) WO2020155617A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112147935A (zh) * 2020-09-25 2020-12-29 劢微机器人科技(深圳)有限公司 Fork arm control method, apparatus, device and storage medium for an unmanned forklift
CN112862404A (zh) * 2021-02-24 2021-05-28 招商局国际信息技术有限公司 Method, apparatus, device and medium for automated management of port operations

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109934954B (zh) * 2019-02-01 2020-10-16 北京百度网讯科技有限公司 Method and device for determining unmanned vehicle running scene
CN110584601B (zh) * 2019-08-26 2022-05-17 首都医科大学 System for monitoring and evaluating cognitive function of the elderly
CN111582018B (zh) * 2020-03-24 2024-02-09 北京掌行通信息技术有限公司 Method, system, terminal and storage medium for determining dynamic interaction scenes of unmanned vehicles
CN112764916B (zh) * 2020-12-18 2023-08-22 北京百度网讯科技有限公司 Data collection method and apparatus
CN116416706A (zh) 2020-12-18 2023-07-11 北京百度网讯科技有限公司 Data collection method and apparatus
CN113879302A (zh) * 2021-10-21 2022-01-04 中寰卫星导航通信有限公司 Vehicle control method, apparatus, device and storage medium
CN114095555A (zh) * 2021-12-08 2022-02-25 金蝶医疗软件科技有限公司 Information collection method and related device
CN115240450A (zh) * 2022-07-13 2022-10-25 购旺工业(赣州)有限公司 Intelligent traffic data collection device and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170278312A1 (en) * 2016-03-22 2017-09-28 GM Global Technology Operations LLC System and method for automatic maintenance
CN108242166A (zh) * 2016-12-24 2018-07-03 钱浙滨 Vehicle driving monitoring method and apparatus
CN109215164A (zh) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Driving data acquisition method and apparatus
CN109934954A (zh) * 2019-02-01 2019-06-25 北京百度网讯科技有限公司 Method and device for determining unmanned vehicle running scene

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5798332B2 (ja) * 2011-02-10 2015-10-21 トヨタ自動車株式会社 Vehicle information acquisition system and vehicle information acquisition method
US20140257686A1 (en) * 2013-03-05 2014-09-11 GM Global Technology Operations LLC Vehicle lane determination
JP2016119547A (ja) * 2014-12-19 2016-06-30 トヨタ自動車株式会社 Remote collection system for vehicle data
JP6020611B2 (ja) * 2015-01-20 2016-11-02 トヨタ自動車株式会社 Remote collection system for vehicle data
CN106856502A (zh) * 2016-12-02 2017-06-16 北京京东尚科信息技术有限公司 Unmanned vehicle control method, unmanned vehicle, server and unmanned vehicle system
CN107063713B (zh) * 2017-04-27 2020-03-10 百度在线网络技术(北京)有限公司 Testing method and apparatus applied to driverless cars
CN109032102B (zh) * 2017-06-09 2020-12-18 百度在线网络技术(北京)有限公司 Unmanned vehicle testing method, apparatus, device and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112147935A (zh) * 2020-09-25 2020-12-29 劢微机器人科技(深圳)有限公司 Fork arm control method, apparatus, device and storage medium for an unmanned forklift
CN112862404A (zh) * 2021-02-24 2021-05-28 招商局国际信息技术有限公司 Method, apparatus, device and medium for automated management of port operations
CN112862404B (zh) * 2021-02-24 2023-09-05 招商局国际科技有限公司 Method, apparatus, device and medium for automated management of port operations

Also Published As

Publication number Publication date
US20210024083A1 (en) 2021-01-28
CN109934954B (zh) 2020-10-16
CN109934954A (zh) 2019-06-25

Similar Documents

Publication Publication Date Title
WO2020155617A1 (zh) Method and device for determining unmanned vehicle running scene
CN108639048B (zh) Automobile lane-change assistance method and system, and automobile
US10077054B2 (en) Tracking objects within a dynamic environment for improved localization
US10160448B2 (en) Object tracking using sensor fusion within a probabilistic framework
US20210108943A1 (en) Map data updating method, apparatus, system and storage medium
US10262537B1 (en) Autonomous optimization of parallel parking space utilization
US20210365024A1 (en) Method and device for positioning unmanned vehicle
WO2021059714A1 (ja) Occupancy grid map generation device, system, method, and program
JP2020042783A (ja) Obstacle classification method, apparatus, device and storage medium based on an unmanned vehicle
CN111256687A (zh) Map data processing method and apparatus, collection device, and storage medium
KR20210055746A (ko) Method, apparatus, device and computer storage medium for detecting a driver's working state
JP7120170B2 (ja) Lane estimation device
CN114495056A (zh) Parking lot pillar detection method, detection device, vehicle, and storage medium
US20180357839A1 (en) Method and device for uploading data of a motor vehicle
TWI832302B (zh) Depth image acquisition method, electronic device, and computer-readable storage medium
US20230392940A1 (en) Systems and methods for generating fuel efficiency score based on cell phone sensor data
US20230274585A1 (en) On-vehicle device, management system, and upload method
US20230039032A1 (en) Apparatus and method for updating map and non-transitory computer-readable medium containing computer program for updating map
WO2024066980A1 (zh) Relocalization method and apparatus
CN115092136B (zh) Vehicle speed planning method and apparatus, vehicle, and storage medium
US20240096113A1 (en) Method and apparatus for calibrating roll angle of on-board camera, device, and storage medium
JP7204818B2 (ja) Vehicle control system and vehicle control method
JP6948222B2 (ja) System, method, and program for determining a stopping place contained in a captured image
WO2023135781A1 (ja) Fall detection device, system and method, and computer-readable medium
CN114763145A (zh) Driving behavior detection method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913275

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913275

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24/02/2022)