CN109472884B - Unmanned vehicle data storage method, device, equipment and storage medium - Google Patents


Info

Publication number
CN109472884B
CN109472884B (granted publication of application CN201811270051.0A)
Authority
CN
China
Prior art keywords
unmanned vehicle
scene
information
determining
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811270051.0A
Other languages
Chinese (zh)
Other versions
CN109472884A (en)
Inventor
Wang Xiaoxia (王小霞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN201811270051.0A priority Critical patent/CN109472884B/en
Publication of CN109472884A publication Critical patent/CN109472884A/en
Application granted granted Critical
Publication of CN109472884B publication Critical patent/CN109472884B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0602 Interfaces specially adapted for storage systems specifically adapted to achieve a particular effect
    • G06F3/0604 Improving or facilitating administration, e.g. storage management
    • G06F3/0608 Saving storage space on storage systems
    • G06F3/0628 Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0655 Vertical data movement, i.e. input-output transfer; data movement between one or more hosts and one or more storage devices
    • G06F3/0668 Interfaces specially adapted for storage systems adopting a particular infrastructure
    • G06F3/067 Distributed or networked storage systems, e.g. storage area networks [SAN], network attached storage [NAS]
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention discloses a method, a device, equipment and a storage medium for storing data of an unmanned vehicle. The method comprises the following steps: determining scene information of the unmanned vehicle according to driving behavior information and/or driving environment information of the unmanned vehicle acquired in real time; if the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a target channel to be stored from the channels of the unmanned vehicle according to the scene information; and uploading the data of the target channel to a server, which stores the received data of the target channel. According to the technical scheme provided by the embodiment of the invention, when the unmanned vehicle needs to store data, the data of different channels are selectively stored according to the different scenes in which the unmanned vehicle is located, and other useless data are discarded, so that the network resources of the unmanned vehicle are saved.

Description

Unmanned vehicle data storage method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of information processing, in particular to a method, a device, equipment and a storage medium for storing data of an unmanned vehicle.
Background
With the development of science and technology, automobiles have gradually become intelligent, and unmanned vehicles have gradually entered public view. An unmanned vehicle operates mainly through the cooperation of the many sensors mounted on it. Using these sensors, the unmanned vehicle can acquire massive amounts of data every second; however, vehicle-end resources are limited, and the vehicle-end data are valuable and contain much information worth mining. It is therefore very important to make full use of vehicle-end resources, screen the data that need to be retained at different times and in different scenes, improve resource utilization, and reduce the occupation of unnecessary resources.
However, in order to reduce the occupation of vehicle-end network resources, the amount of data persisted to disk is continually being reduced, so a great deal of useful data cannot be retained. As a result, when an unmanned vehicle is involved in an accident, determining responsibility is complex and requires the participation of technical personnel.
Disclosure of Invention
The embodiment of the invention provides a method, a device, equipment and a storage medium for storing unmanned vehicle data.
In a first aspect, an embodiment of the present invention provides an unmanned vehicle data storage method, where the method includes:
determining scene information of the unmanned vehicle according to driving behavior information and/or driving environment information of the unmanned vehicle acquired in real time;
if the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information;
and uploading the data of the target channel to the server, and storing the received data of the target channel by the server.
In a second aspect, an embodiment of the present invention further provides an unmanned vehicle data storage device, where the device includes:
the scene information determining module is used for determining scene information of the unmanned vehicle according to the driving behavior information and/or the driving environment information of the unmanned vehicle acquired in real time;
the target channel determining module is used for selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information if the scene information of the unmanned vehicle belongs to an abnormal storage scene;
and the data reporting module is used for uploading the data of the target channel to the server and storing the received data of the target channel by the server.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the unmanned vehicle data storage method of any of the first aspects.
In a fourth aspect, an embodiment of the present invention further provides a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the unmanned vehicle data storage method according to any of the first aspects.
According to the unmanned vehicle data storage method, the unmanned vehicle data storage device, the unmanned vehicle data storage equipment and the unmanned vehicle data storage medium, the acquired driving behavior information and/or driving environment information of the unmanned vehicle are analyzed in real time, scene information of the unmanned vehicle is determined, and the scene of the unmanned vehicle is judged; and when the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information, taking the channel as a target channel to be stored, and uploading the data of the target channel to the server for storage. Compared with the existing unmanned vehicle data storage mode, the scheme has the advantages that when data are required to be stored, the data of different channels are stored according to different scenes where the unmanned vehicle is located, other useless data are abandoned, occupation of network resources by data transmission in the unmanned vehicle is reduced, and further the network resources of the unmanned vehicle are saved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a flowchart of an unmanned vehicle data storage method according to a first embodiment of the present invention;
fig. 2 is a flowchart of an unmanned vehicle data storage method according to a second embodiment of the present invention;
fig. 3 is a flowchart of an unmanned vehicle data storage method according to a third embodiment of the present invention;
fig. 4 is a block diagram illustrating a configuration of an unmanned vehicle data storage device according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an apparatus provided in the fifth embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the embodiments of the invention and that no limitation of the invention is intended. It should be further noted that, for convenience of description, only some structures, not all structures, relating to the embodiments of the present invention are shown in the drawings.
Example one
Fig. 1 is a flowchart of an unmanned vehicle data storage method according to an embodiment of the present invention. The embodiment is suitable for storing the data of the unmanned vehicle to reduce the occupation of network resources at the vehicle end. The method can be executed by the unmanned vehicle data storage device or the unmanned vehicle provided by the embodiment of the invention, the device can be realized in a software and/or hardware mode, and the device can be integrated on the unmanned vehicle or can be used as an independent device. Referring to fig. 1, the method specifically includes:
and S110, determining scene information of the unmanned vehicle according to the driving behavior information and/or the driving environment information of the unmanned vehicle collected in real time.
The driving behavior information is information describing a driving state of the unmanned vehicle, and may include at least one of a driving mode, a driving behavior, a driving speed, and the like of the unmanned vehicle. The driving environment information refers to information for describing a driving environment where the unmanned vehicle is located, and may include traffic light information, zebra crossing information, weather information, obstacle information, and the like on a road.
Optionally, the unmanned vehicle may be configured with a plurality of sensors, such as a laser radar, a camera, a GPS, a speed sensor, and a gyroscope, to cooperate with each other, so as to collect driving behavior information and driving environment information of the unmanned vehicle in real time. Specifically, the point cloud data collected by the laser radar, the image data collected by the camera, the positioning data collected by the GPS, the speed data collected by the speed sensor and the gyroscope, and the like may be fused according to a preset algorithm, so as to obtain the driving behavior information and the driving environment information of the unmanned vehicle. For example, it may be: the unmanned vehicle driving road scene is reproduced based on the combination of the point cloud data acquired by the laser radar and the image data acquired by the camera; and then, determining the driving behavior information and the driving environment information of the unmanned vehicle based on positioning data acquired by a GPS (global positioning system), speed data acquired by a speed sensor and a gyroscope and the like.
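The fusion step above can be sketched as follows. This is a minimal illustration, assuming dictionary-shaped sensor readings keyed by a timestamp `t`; the patent does not specify the fusion algorithm or any data layout, so all field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FusedFrame:
    """One fused record that downstream scene recognition consumes."""
    timestamp: float = 0.0
    position: tuple = None        # from GPS
    speed: float = None           # from speed sensor / gyroscope
    point_cloud: list = field(default_factory=list)  # from lidar
    image: bytes = b""            # from camera

def fuse(readings):
    """Merge per-sensor readings (dicts with a 't' timestamp) into one frame."""
    frame = FusedFrame(timestamp=max(r["t"] for r in readings))
    for reading in readings:
        for key, value in reading.items():
            if key != "t" and hasattr(frame, key):
                setattr(frame, key, value)
    return frame

frame = fuse([{"t": 1.0, "speed": 12.0},
              {"t": 1.1, "position": (30.0, 120.0)}])
```

The fused frame carries the most recent timestamp and the latest value each sensor contributed, which is one simple way to align streams sampled at slightly different times.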
The scene information refers to a scene of unmanned vehicle driving, and can be a straight line passing scene, an overspeed scene, a yielding scene, an automatic driving scene, an intersection scene or a takeover scene and the like.
Specifically, after the driving behavior information and driving environment information of the unmanned vehicle are acquired in real time through the cooperation of the multiple sensors configured on the unmanned vehicle, such as the laser radar, camera, GPS and gyroscope, a pre-constructed scene recognition model can be used to recognize the scene information of the unmanned vehicle. Alternatively, the scene information of the unmanned vehicle can be detected by detection equipment configured on the unmanned vehicle, or a specific statistical analysis algorithm can be applied to the driving behavior information and/or the driving environment information to determine the scene information of the unmanned vehicle.
For example, the driving behavior information of the unmanned vehicle collected in real time can be analyzed to determine the scene information of the unmanned vehicle; or analyzing the driving environment information of the unmanned vehicle collected in real time to determine the scene information of the unmanned vehicle; in order to ensure the comprehensiveness of the determined scene information, the scene information of the unmanned vehicle can be determined by comprehensively analyzing the driving behavior information and the driving environment information of the unmanned vehicle, which are acquired in real time.
For example, according to the driving behavior information and/or the driving environment information of the unmanned vehicle collected in real time, the determining of the scene information of the unmanned vehicle may be: the driving behavior information and/or the driving environment information of the unmanned vehicle collected in real time are input into a scene recognition model which is constructed in advance based on machine learning, and the scene information of the unmanned vehicle is determined according to the output result of the scene recognition model. The scene recognition model is obtained by training a machine learning model based on sample driving behavior information and/or sample driving environment information and sample scene information.
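A minimal sketch of such a scene recognition step is shown below, with a hand-rolled nearest-centroid classifier standing in for the trained machine learning model. The feature set, scene labels, and centroid values are all illustrative assumptions, not the patent's actual model.

```python
def extract_features(driving_behavior, driving_env):
    """Flatten behavior/environment dicts into a fixed feature vector."""
    return [
        driving_behavior.get("speed", 0.0),
        1.0 if driving_behavior.get("mode") == "autonomous" else 0.0,
        1.0 if driving_env.get("at_intersection") else 0.0,
    ]

class NearestCentroidSceneModel:
    """Toy stand-in for the pre-trained scene recognition model."""
    def __init__(self, centroids):
        self.centroids = centroids  # {scene_label: feature vector}

    def predict(self, features):
        def sq_dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self.centroids, key=lambda label: sq_dist(self.centroids[label]))

model = NearestCentroidSceneModel({
    "overspeed":    [25.0, 1.0, 0.0],
    "intersection": [5.0, 1.0, 1.0],
})
features = extract_features({"speed": 24.0, "mode": "autonomous"},
                            {"at_intersection": False})
scene = model.predict(features)
```

In practice the centroids would be replaced by a model trained on the sample driving behavior, sample driving environment, and sample scene information mentioned above.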
And S120, if the scene information of the unmanned vehicle belongs to the abnormal storage scene, selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information.
Abnormal storage scene information refers to pre-stored information describing abnormal driving scenes of the unmanned vehicle, and may include at least one of a takeover scene, a straight-line traffic scene, a yield scene, an overspeed scene, an intersection scene, an obstacle hazard scene, and the like. Optionally, determining whether the scene information of the unmanned vehicle belongs to an abnormal storage scene may include: if the preset abnormal storage scene information comprises the scene information of the unmanned vehicle, determining that the scene information of the unmanned vehicle belongs to the abnormal storage scene.
In this embodiment, a channel is a carrier for transmitting data; data can be transmitted from one functional module to at least one other functional module only through a channel. Correspondingly, each functional module integrated in the vehicle-mounted system of the unmanned vehicle acts as a node in the data transmission topology, and can receive the content transmitted on a channel at run time only if it has subscribed to that channel in advance. Specifically, the number of available channels in the vehicle-mounted system of the unmanned vehicle is limited, while the data transmitted are unlimited. Channels in the system can therefore be subscribed to according to the transmission relationships of data in the vehicle-mounted system, and when the system runs, the corresponding content is acquired through the subscribed channels.
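The channel mechanism described above can be sketched as a minimal publish/subscribe pattern. The `Channel` class and its methods are illustrative, not the on-vehicle middleware's actual API.

```python
class Channel:
    """A named carrier: subscribers receive everything published on it."""
    def __init__(self, name):
        self.name = name
        self._subscribers = []

    def subscribe(self, callback):
        # A module must subscribe in advance to receive this channel's content.
        self._subscribers.append(callback)

    def publish(self, message):
        for callback in self._subscribers:
            callback(message)

# Usage: only the module that subscribed sees the data.
received = []
lidar_channel = Channel("lidar")
lidar_channel.subscribe(received.append)
lidar_channel.publish({"points": 4096})
```

A module that never subscribed gets nothing, which is exactly why subscriptions can be arranged according to the data transmission relationships in the vehicle-mounted system.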
Illustratively, different channels in the unmanned vehicle store different data; the target channel is a channel for acquiring data required by the unmanned vehicle in an abnormal storage scene when the scene information of the unmanned vehicle belongs to the abnormal storage scene; optionally, the number of the target channels may not exceed the total number of channels in the unmanned vehicle, and may be one or multiple. The scene information of the unmanned vehicle is different, the required data is also different, and the corresponding target channels are different.
Specifically, after scene information of the unmanned vehicle is determined according to driving behavior information and/or driving environment information of the unmanned vehicle collected in real time, the scene of the unmanned vehicle is judged; if the preset abnormal storage scene information is detected to include the scene information of the unmanned vehicle, determining that the scene information of the unmanned vehicle belongs to the abnormal storage scene; and selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information as a target channel to be stored. If the preset abnormal storage scene information is detected to not include the scene information of the unmanned vehicle, the scene information of the unmanned vehicle is determined not to belong to the abnormal storage scene, and in order to reduce occupation of vehicle-end resources of the unmanned vehicle, data acquired in the scene can be deleted.
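The selection step can be sketched as a lookup against a pre-configured scene-to-channel table. The scene labels, channel names, and table contents below are illustrative assumptions; the patent does not publish the actual mapping.

```python
# Hypothetical pre-stored abnormal storage scene set and scene->channel table.
ABNORMAL_STORAGE_SCENES = {"takeover", "overspeed", "yield", "intersection"}

SCENE_TO_CHANNELS = {
    "takeover":     ["camera_front", "planning_log", "control_log"],
    "overspeed":    ["gps", "speed_sensor"],
    "yield":        ["lidar", "camera_front"],
    "intersection": ["camera_front", "camera_left", "camera_right"],
}

def select_target_channels(scene, all_channels):
    """Return the channels to persist, or [] when the scene need not be stored."""
    if scene not in ABNORMAL_STORAGE_SCENES:
        return []  # non-abnormal scene: data may be discarded to free resources
    return [c for c in SCENE_TO_CHANNELS.get(scene, []) if c in all_channels]
```

Different scenes map to different target channels, so only the data relevant to the current abnormal scene is kept.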
Optionally, the scene information may include acquisition time information, frequency point information, and the like. Therefore, after determining that the scene information of the unmanned vehicle belongs to the abnormal storage scene; and selecting a target channel to be stored from each channel of the unmanned vehicle directly according to the frequency point information and the acquisition time information in the scene information.
S130, uploading the data of the target channel to the server, and storing the received data of the target channel by the server.
The data of the target channel is the data which is acquired by the target channel and is needed when the unmanned vehicle is in a certain abnormal storage scene. The server may be a device or software system having data storage, information processing capabilities, and the like.
Specifically, if the scene information of the unmanned vehicle belongs to an abnormal storage scene, after a target channel to be stored is selected from the channels of the unmanned vehicle according to the scene information, the unmanned vehicle uploads the data of the target channel to the server, so that the server stores the data of the target channel locally after receiving it, facilitating subsequent queries. The unmanned vehicle forgoes uploading the data of the non-target channels to the server, so as to save its network resources; in addition, the data of the non-target channels can be discarded locally at the vehicle end, releasing the storage space of the unmanned vehicle.
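The reporting step can be sketched as follows, with `upload` standing in for the vehicle-to-server transport, which the patent does not specify; channel names are illustrative.

```python
def report_scene_data(target_channels, channel_buffers, upload):
    """Upload buffered data for target channels; discard the rest locally."""
    uploaded, discarded = [], []
    for channel, data in channel_buffers.items():
        if channel in target_channels:
            upload(channel, data)      # server persists this for later queries
            uploaded.append(channel)
        else:
            discarded.append(channel)  # dropped to free vehicle-end resources
    return uploaded, discarded

# Usage with a fake transport that records what reaches the server.
sent = {}
uploaded, discarded = report_scene_data(
    {"gps", "camera_front"},
    {"gps": [1, 2], "lidar": [3], "camera_front": [4]},
    lambda channel, data: sent.__setitem__(channel, data),
)
```

Only the target channels' buffers cross the network, which is the mechanism by which the scheme saves the unmanned vehicle's network resources.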
It should be noted that, in this embodiment, when the unmanned vehicle needs the server to store data, the data of the target channel to be stored is selected from the channels through the scene information, and is transmitted to the server, so that the server performs local storage according to the received data. Compared with the existing unmanned vehicle data storage mode, the scheme reduces the occupation of data transmission in the unmanned vehicle on network resources by selectively storing data, and further saves the network resources of the unmanned vehicle.
According to the technical scheme provided by the embodiment of the invention, the acquired driving behavior information and/or driving environment information of the unmanned vehicle are analyzed in real time, the scene information of the unmanned vehicle is determined, and the scene of the unmanned vehicle is judged; and when the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information, taking the channel as a target channel to be stored, and uploading the data of the target channel to the server for storage. Compared with the existing unmanned vehicle data storage mode, the scheme has the advantages that when data are required to be stored, the data of different channels are stored according to different scenes where the unmanned vehicle is located, other useless data are abandoned, occupation of network resources by data transmission in the unmanned vehicle is reduced, and further the network resources of the unmanned vehicle are saved.
Optionally, selecting a target channel to be stored from the channels of the unmanned vehicle may further include:
if the unmanned vehicle is communicated through the wireless local area network, the first channel information included in the scene information is used as a target channel to be stored; and if the unmanned vehicle is communicated through the mobile network, the second channel information included in the scene information is used as a target channel to be stored.
The first channel is a channel used for acquiring data required by a specific scene in the unmanned vehicle under the wireless local area network, and may include one or more channels; correspondingly, the second channel refers to a channel for collecting data required by a specific scene in the unmanned vehicle in the mobile network communication mode, and may include one or more channels. The first channel information refers to information of a channel associated with the first channel, and may include a name, a number, frequency point information, and the like of the channel; correspondingly, the second channel information refers to information of a channel associated with the second channel, and may include a name, a number, frequency point information, and the like of the channel. It should be noted that the first channel information and the second channel information are different.
Specifically, after determining that the scene information of the unmanned vehicle belongs to an abnormal storage scene, if it is detected that the communication between the unmanned vehicle and the server is realized based on the wireless local area network, the first channel information included in the scene information can be used as a target channel to be stored; if it is detected that the communication between the unmanned vehicle and the server is realized based on the mobile network, the second channel information included in the scene information may be used as a target channel to be stored.
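The network-dependent selection can be sketched as below. The assumption, consistent with the goal of saving network resources, is that the scene information carries two pre-configured channel lists: a richer one for the wireless LAN and a leaner one for the metered mobile network. The dictionary keys and channel names are illustrative.

```python
def target_channels_for_network(scene_info, network_type):
    """Pick the channel set matching the current communication mode."""
    if network_type == "wlan":
        return scene_info["first_channels"]   # richer set over Wi-Fi
    if network_type == "mobile":
        return scene_info["second_channels"]  # leaner set over cellular
    raise ValueError("unknown network type: " + network_type)

scene_info = {
    "first_channels": ["lidar", "camera_front", "planning_log"],
    "second_channels": ["planning_log"],
}
wlan_targets = target_channels_for_network(scene_info, "wlan")
mobile_targets = target_channels_for_network(scene_info, "mobile")
```

Keeping the second channel set smaller than the first reflects the cost difference between the two links; the first and second channel information must differ, as noted above.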
In this embodiment, a mode of selecting a target channel based on a communication system is provided, and then data of the corresponding target channel is reported to a server, so that occupation of network resources by data transmission in an unmanned vehicle is reduced to a certain extent, and further the network resources of the unmanned vehicle are saved.
Example two
Fig. 2 is a flowchart of a method for storing data of an unmanned vehicle according to a second embodiment of the present invention. On the basis of the first embodiment, this embodiment further explains determining the scene information of the unmanned vehicle according to the driving behavior information of the unmanned vehicle collected in real time. Referring to fig. 2, the method specifically includes:
and S210, determining scene information of the unmanned vehicle according to at least one of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time.
According to whether a person participates, the driving modes may include an automatic (unmanned) driving mode, a manual driving mode and a takeover-point mode; from the perspective of driving safety, they may further include a safe driving mode, other user-defined modes, and the like. The takeover mode refers to the situation in which the unmanned vehicle changes from the autonomous driving mode to the manual driving mode. Driving behavior may include, but is not limited to: driving straight, steering, reversing, turning around, changing lanes, and the like; steering may be to the left or to the right; correspondingly, a lane change may be a left lane change, a right lane change, or consecutive left-right lane changes. The driving speed state may be in motion, constant-speed driving, variable-speed (non-constant-speed) driving, overspeed driving, non-overspeed driving, stopped, and the like.
Specifically, the driving mode, the driving behavior and the driving speed of the unmanned vehicle can be acquired in real time through the mutual cooperation of a plurality of sensors, such as a laser radar, a camera, a GPS (global positioning system), a gyroscope and the like, which are configured on the unmanned vehicle; and analyzing the unmanned vehicle according to one or more of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time, and determining the scene where the unmanned vehicle is located.
For example, determining the scene information where the unmanned vehicle is located according to at least one of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time may include one or more of the following situations:
1) if the driving mode of the unmanned vehicle is detected to be the takeover point mode, determining that the unmanned vehicle is in a takeover scene;
specifically, if it is detected that the driving mode of the unmanned vehicle is switched to the takeover point mode, it is determined that the unmanned vehicle is in a takeover scene.
2) If the driving mode of the unmanned vehicle is detected to be the automatic driving mode, determining that the unmanned vehicle is in the automatic driving scene;
3) if the driving mode of the unmanned vehicle is detected to be the safe driving mode, determining that the unmanned vehicle is in the safe driving scene;
4) if the driving behavior of the unmanned vehicle is steering, turning around or lane changing, determining that the unmanned vehicle is in a yielding scene;
specifically, determining that the unmanned vehicle is in the yielding scene may include: if the driving behavior of the unmanned vehicle is detected to be left turning or right turning, determining that the unmanned vehicle is in a yielding scene; if the driving behavior of the unmanned vehicle is detected to be turning around, determining that the unmanned vehicle is in a yielding scene; and if the driving behavior of the unmanned vehicle is detected to be left lane changing, right lane changing or left and right continuous lane changing, determining that the unmanned vehicle is in a yielding scene.
5) If the driving behavior of the unmanned vehicle is straight, determining that the unmanned vehicle is in a passing scene;
6) if the driving speed of the unmanned vehicle is detected to be greater than the road speed limit value, determining that the unmanned vehicle is in an overspeed scene;
the road speed limit value is an upper limit value of a driving speed of a vehicle allowed on a certain section of road preset by a traffic control department according to the road environment condition; optionally, the road speed limit value may be obtained from map data, or may be obtained from image data acquired by a camera. Specifically, the driving speed of the unmanned vehicle acquired by the gyroscope can be compared with the road speed limit value of the road section where the unmanned vehicle is located, and if the driving speed of the unmanned vehicle is detected to be greater than the road speed limit value, the unmanned vehicle is determined to be in an overspeed scene; and if the driving speed of the unmanned vehicle is detected to be less than or equal to the road speed limit value, determining that the unmanned vehicle is in a non-overspeed scene.
7) If the driving speed of the unmanned vehicle is detected to be constant within a preset time period, determining that the unmanned vehicle is in a uniform speed scene;
8) if the driving speed of the unmanned vehicle varies over time within the preset time period, determining that the unmanned vehicle is in a speed change scene;
9) and if the driving speed of the unmanned vehicle is detected to be zero in the preset time period, determining that the unmanned vehicle is in a stop scene.
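The rules in items 1) to 9) above can be sketched in Python. This is a minimal illustrative sketch, not the patent's implementation; all names (`determine_scenes`, the mode and behavior strings) are assumptions introduced for illustration:

```python
# Hypothetical sketch of scene determination from driving behavior information.
# Mode/behavior label strings and the function name are illustrative only.

def determine_scenes(driving_mode, driving_behavior, speeds, road_speed_limit):
    """Return the set of scene labels implied by the behavior information.

    speeds: driving-speed samples collected over the preset time period.
    """
    scenes = set()
    # Items 1)-3): scenes derived from the driving mode.
    if driving_mode == "takeover_point":
        scenes.add("takeover")
    elif driving_mode == "safe_driving":
        scenes.add("safe_driving")
    # Items 4)-5): scenes derived from the driving behavior.
    if driving_behavior in {"left_turn", "right_turn", "u_turn",
                            "left_lane_change", "right_lane_change",
                            "continuous_lane_change"}:
        scenes.add("yielding")
    elif driving_behavior == "straight":
        scenes.add("passing")
    # Items 6)-9): scenes derived from the driving speed.
    if speeds:
        if max(speeds) > road_speed_limit:
            scenes.add("overspeed")
        else:
            scenes.add("non_overspeed")
        if all(v == 0 for v in speeds):
            scenes.add("stop")          # zero speed over the whole period
        elif len(set(speeds)) == 1:
            scenes.add("uniform_speed") # constant speed over the period
        else:
            scenes.add("speed_change")  # speed differs between samples
    return scenes
```

A vehicle can match several scene labels at once (e.g. turning around while over the speed limit), which is why the sketch returns a set rather than a single label.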
And S220, if the scene information of the unmanned vehicle belongs to the abnormal storage scene, selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information.
And S230, uploading the data of the target channel to the server, and storing the received channel data by the server.
According to the technical scheme provided by the embodiment of the invention, the acquired driving behavior information of the unmanned vehicle is analyzed in real time, the scene information of the unmanned vehicle is determined, and the scene of the unmanned vehicle is judged; and when the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information, taking the channel as a target channel to be stored, and uploading the data of the target channel to the server for storage. Compared with the existing unmanned vehicle data storage mode, the scheme has the advantages that when data are required to be stored, the data of different channels are stored according to different scenes where the unmanned vehicle is located, other useless data are abandoned, occupation of network resources by data transmission in the unmanned vehicle is reduced, and further the network resources of the unmanned vehicle are saved.
EXAMPLE III
Fig. 3 is a flowchart of an unmanned vehicle data storage method according to a third embodiment of the present invention; on the basis of the above embodiments, this embodiment further explains determining the scene information of the unmanned vehicle according to the driving environment information of the unmanned vehicle collected in real time. Referring to fig. 3, the method specifically includes:
and S310, determining scene information of the unmanned vehicle according to traffic light information and/or obstacle information detected by the unmanned vehicle in real time.
The traffic light information refers to the state information of the traffic lights on the road, and may include: a red light being on, a green light turning red, a red light turning green, and the like. The obstacle information is information on objects other than the host vehicle on the road on which the unmanned vehicle is driving, and may include the obstacle type, the driving behavior of the obstacle, and the relative position and relative speed of the obstacle with respect to the vehicle. Obstacle types may include static obstacles and dynamic obstacles: static obstacles may include stopped pedestrians, vehicles (bicycles, motorcycles, automobiles, unmanned vehicles, etc.), trees, curbs, and the like; dynamic obstacles may include moving pedestrians, vehicles, and the like. The driving behavior of an obstacle may be remaining static, driving against traffic, going straight, changing lanes, turning around, and the like. The relative position of an obstacle with respect to the vehicle may be the opposite lane, the front lane, the rear lane, another lane, a relative distance, and the like. The relative speed of an obstacle with respect to the vehicle may be a change (acceleration, deceleration, becoming relatively faster or slower, etc.) or a comparison (slower than the unmanned vehicle, faster than the unmanned vehicle, the same speed, etc.).
Specifically, the traffic light information and/or the obstacle information can be detected in real time through the cooperation of a plurality of sensors mounted on the unmanned vehicle, such as a laser radar, a camera, a GPS (global positioning system) and a gyroscope; the information detected in real time is then analyzed to determine the scene in which the unmanned vehicle is located.
For example, determining the scene information where the unmanned vehicle is located according to the traffic light information and/or the obstacle information detected by the unmanned vehicle in real time may include: determining scene information of the unmanned vehicle according to the obstacle information detected by the unmanned vehicle in real time; determining scene information of the unmanned vehicle according to traffic light information detected by the unmanned vehicle in real time; and determining scene information of the unmanned vehicle according to traffic light information and obstacle information detected by the unmanned vehicle in real time.
Optionally, determining scene information where the unmanned vehicle is located according to obstacle information detected by the unmanned vehicle in real time may include: and determining scene information of the unmanned vehicle according to the type of the obstacle detected by the unmanned vehicle in real time, the relative position and the relative speed of the obstacle and the vehicle.
For example, if it is detected that the obstacle is a moving vehicle, that it is located in the lane behind the unmanned vehicle at a distance less than the safe driving distance, and that it is accelerating relative to the unmanned vehicle, it is determined that the unmanned vehicle is in the obstacle hidden danger scene. Alternatively, if the obstacle is detected to be a moving vehicle and the relative time determined from the relative position and relative speed of the obstacle and the vehicle (i.e., the estimated time for the obstacle to reach the vehicle) is less than a time threshold, it is determined that the unmanned vehicle is in the obstacle hidden danger scene.
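The relative-time check described above resembles a time-to-collision threshold. A minimal sketch, assuming the relative time is relative distance divided by closing speed (the function name, units, and 3-second default threshold are all illustrative, not taken from the patent):

```python
# Hypothetical sketch of the obstacle hidden-danger check.
def in_obstacle_danger_scene(rel_distance_m, rel_speed_mps, time_threshold_s=3.0):
    """rel_speed_mps > 0 means the obstacle is closing in on the vehicle."""
    if rel_speed_mps <= 0:
        return False                      # obstacle not approaching: no danger
    relative_time = rel_distance_m / rel_speed_mps
    return relative_time < time_threshold_s
```

For example, an obstacle 10 m behind and closing at 5 m/s gives a relative time of 2 s, below the assumed 3 s threshold, so the hidden danger scene would be flagged.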
Optionally, determining the scene information of the unmanned vehicle according to the traffic light information detected by the unmanned vehicle in real time may include: determining, according to the traffic light information detected by the unmanned vehicle in real time, that the unmanned vehicle is in an intersection scene, such as a crossroads or a T-junction.
And S320, if the scene information of the unmanned vehicle belongs to the abnormal storage scene, selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information.
S330, uploading the data of the target channel to the server, and storing the received channel data by the server.
According to the technical scheme provided by the embodiment of the invention, the acquired driving environment information of the unmanned vehicle is analyzed in real time, the scene information of the unmanned vehicle is determined, and the scene of the unmanned vehicle is judged; and when the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information, taking the channel as a target channel to be stored, and uploading the data of the target channel to the server for storage. Compared with the existing unmanned vehicle data storage mode, the scheme has the advantages that when data are required to be stored, the data of different channels are stored according to different scenes where the unmanned vehicle is located, other useless data are abandoned, occupation of network resources by data transmission in the unmanned vehicle is reduced, and further the network resources of the unmanned vehicle are saved.
Example four
Fig. 4 is a block diagram of an unmanned vehicle data storage device according to a fourth embodiment of the present invention; the device can execute the unmanned vehicle data storage method according to any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method. As shown in fig. 4, the apparatus may include:
the scene information determining module 410 is configured to determine scene information where the unmanned vehicle is located according to driving behavior information and/or driving environment information of the unmanned vehicle collected in real time;
the target channel determining module 420 is configured to select a target channel to be stored from each channel of the unmanned vehicle according to the scene information if the scene information of the unmanned vehicle belongs to an abnormal storage scene;
the data reporting module 430 is configured to upload data of the target channel to the server, and the server stores the received data of the target channel.
According to the technical scheme provided by the embodiment of the invention, the acquired driving behavior information and/or driving environment information of the unmanned vehicle are analyzed in real time, the scene information of the unmanned vehicle is determined, and the scene of the unmanned vehicle is judged; and when the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a channel for acquiring data required by the unmanned vehicle in the scene from all channels of the unmanned vehicle according to the scene information, taking the channel as a target channel to be stored, and uploading the data of the target channel to the server for storage. Compared with the existing unmanned vehicle data storage mode, the scheme has the advantages that when data are required to be stored, the data of different channels are stored according to different scenes where the unmanned vehicle is located, other useless data are abandoned, occupation of network resources by data transmission in the unmanned vehicle is reduced, and further the network resources of the unmanned vehicle are saved.
Illustratively, the context information determining module 410 includes:
the first scene information determining unit is used for determining scene information of the unmanned vehicle according to at least one of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time.
For example, the first context information determining unit may specifically be configured to:
if the driving mode of the unmanned vehicle is detected to be the takeover point mode, determining that the unmanned vehicle is in a takeover scene;
if the driving behavior of the unmanned vehicle is steering, turning around or lane changing, determining that the unmanned vehicle is in a yielding scene;
and if the driving speed of the unmanned vehicle is detected to be greater than the road speed limit value, determining that the unmanned vehicle is in an overspeed scene.
Illustratively, the context information determining module 410 further includes:
and the second scene information determining unit is used for determining the scene information of the unmanned vehicle according to the traffic light information and/or the obstacle information detected by the unmanned vehicle in real time.
For example, when determining the scene information of the unmanned vehicle according to the obstacle information detected by the unmanned vehicle in real time, the second scene information determining unit is specifically configured to:
and determining scene information of the unmanned vehicle according to the type of the obstacle detected by the unmanned vehicle in real time, the relative position and the relative speed of the obstacle and the vehicle.
For example, when determining whether the scene information of the unmanned vehicle belongs to the abnormal storage scene, the target channel determining module 420 is specifically configured to:
if the preset abnormal storage scene information comprises the scene information of the unmanned vehicle, determining that the scene information of the unmanned vehicle belongs to an abnormal storage scene; the abnormal storage scene information comprises at least one of a take-over scene, a yield scene, an overspeed scene, an intersection scene and an obstacle hidden danger scene.
Illustratively, the context information determining module 410 is specifically configured to:
the driving behavior information and/or the driving environment information of the unmanned vehicle collected in real time are input into a scene recognition model which is constructed in advance based on machine learning, and the scene information of the unmanned vehicle is determined according to the output result of the scene recognition model.
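The patent does not specify the model type or feature layout; a minimal sketch of how such a pre-built scene recognition model might be queried, with a stand-in rule in place of a real learned model (the class name, `recognize` method, and feature keys are all hypothetical):

```python
# Illustrative interface for a pre-built scene recognition model.
class SceneRecognitionModel:
    def __init__(self, predict_fn):
        self._predict_fn = predict_fn   # e.g. a model trained offline

    def recognize(self, behavior_info, environment_info):
        # Merge behavior and environment features into one feature dict,
        # then return the model's output scene label.
        features = {**behavior_info, **environment_info}
        return self._predict_fn(features)

# Usage with a stand-in rule instead of a trained model:
model = SceneRecognitionModel(
    lambda f: "overspeed" if f.get("speed", 0) > f.get("limit", 120) else "normal")
```

In practice `predict_fn` would be replaced by a classifier trained on labeled driving logs; the sketch only shows the querying interface.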
For example, when selecting a target channel to be stored from the channels of the unmanned vehicle, the target channel determining module 420 may be specifically configured to:
if the unmanned vehicle is communicated through the wireless local area network, the first channel information included in the scene information is used as a target channel to be stored;
and if the unmanned vehicle is communicated through the mobile network, the second channel information included in the scene information is used as a target channel to be stored.
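The channel selection rule above can be sketched as a simple dispatch on the network type: over a wireless local area network, the scene's first (typically larger) channel set is used, while over a metered mobile network the second set is used. A minimal sketch with hypothetical names (`select_target_channels`, the key names, and the channel labels are not from the patent):

```python
# Hypothetical sketch of target-channel selection by network type.
def select_target_channels(scene_info, network_type):
    if network_type == "wlan":
        return scene_info["first_channels"]    # first channel information
    if network_type == "mobile":
        return scene_info["second_channels"]   # second channel information
    raise ValueError(f"unknown network type: {network_type}")

# Example scene info: more channels over Wi-Fi, fewer over a mobile network.
scene_info = {"first_channels": ["camera", "lidar", "planning"],
              "second_channels": ["planning"]}
```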
EXAMPLE five
Fig. 5 is a schematic structural diagram of an apparatus according to a fifth embodiment of the present invention, and fig. 5 shows a block diagram of an exemplary apparatus suitable for implementing the embodiment of the present invention. The device 12 shown in fig. 5 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present invention. As shown in FIG. 5, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40, having a set (at least one) of program modules 42, may be stored in, for example, system memory 28; such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may comprise an implementation of a networking environment. Program modules 42 generally carry out the functions and/or methodologies of embodiments described herein.
Device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with device 12, and/or with any devices (e.g., network card, modem, etc.) that enable device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running the programs stored in the system memory 28, for example, implementing the unmanned vehicle data storage method provided by the embodiments of the present invention.
EXAMPLE six
The sixth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program (or referred to as computer-executable instructions) is stored, where the computer program, when executed by a processor, can implement the method for storing the data of the unmanned vehicle according to any of the above embodiments.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the embodiments of the present invention have been described in more detail through the above embodiments, the embodiments of the present invention are not limited to the above embodiments, and many other equivalent embodiments may be included without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An unmanned vehicle data storage method is characterized by comprising the following steps:
determining scene information of the unmanned vehicle according to driving behavior information and/or driving environment information of the unmanned vehicle acquired in real time;
if the scene information of the unmanned vehicle belongs to an abnormal storage scene, selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information; the abnormal storage scene information comprises at least one of a take-over scene, a yield scene, an overspeed scene, an intersection scene and an obstacle hidden danger scene;
uploading the data of the target channel to a server, and storing the received data of the target channel by the server;
the method for selecting the target channel to be stored from the channels of the unmanned vehicle comprises the following steps:
if the unmanned vehicle is communicated through a wireless local area network, the first channel information included in the scene information is used as a target channel to be stored;
if the unmanned vehicle is communicated through the mobile network, the second channel information included in the scene information is used as a target channel to be stored; the first channel information and the second channel information are different.
2. The method according to claim 1, wherein the determining scene information of the unmanned vehicle according to the driving behavior information of the unmanned vehicle collected in real time comprises:
and determining scene information of the unmanned vehicle according to at least one of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time.
3. The method of claim 2, wherein determining scene information of the unmanned vehicle according to at least one of the driving mode, the driving behavior and the driving speed of the unmanned vehicle acquired in real time comprises:
if the driving mode of the unmanned vehicle is detected to be the takeover point mode, determining that the unmanned vehicle is in a takeover scene;
if the driving behavior of the unmanned vehicle is steering, turning around or lane changing, determining that the unmanned vehicle is in a yielding scene;
and if the driving speed of the unmanned vehicle is detected to be greater than the road speed limit value, determining that the unmanned vehicle is in an overspeed scene.
4. The method of claim 1, wherein determining scene information of the unmanned vehicle according to the driving environment information of the unmanned vehicle collected in real time comprises:
and determining scene information of the unmanned vehicle according to traffic light information and/or obstacle information detected by the unmanned vehicle in real time.
5. The method of claim 4, wherein determining scene information of the unmanned vehicle according to the obstacle information detected by the unmanned vehicle in real time comprises:
and determining scene information of the unmanned vehicle according to the type of the obstacle detected by the unmanned vehicle in real time, the relative position and the relative speed of the obstacle and the vehicle.
6. The method of claim 1, wherein determining whether the scene information of the unmanned vehicle belongs to an abnormal storage scene comprises:
and if the preset abnormal storage scene information comprises the scene information of the unmanned vehicle, determining that the scene information of the unmanned vehicle belongs to the abnormal storage scene.
7. The method according to claim 1, wherein determining scene information of the unmanned vehicle according to the driving behavior information and/or the driving environment information of the unmanned vehicle collected in real time comprises:
the method comprises the steps of inputting driving behavior information and/or driving environment information of the unmanned vehicle collected in real time into a scene recognition model which is constructed in advance based on machine learning, and determining scene information of the unmanned vehicle according to an output result of the scene recognition model.
8. An unmanned vehicle data storage device, comprising:
the scene information determining module is used for determining scene information of the unmanned vehicle according to the driving behavior information and/or the driving environment information of the unmanned vehicle acquired in real time;
the target channel determining module is used for selecting a target channel to be stored from all channels of the unmanned vehicle according to the scene information if the scene information of the unmanned vehicle belongs to an abnormal storage scene; the abnormal storage scene information comprises at least one of a take-over scene, a yield scene, an overspeed scene, an intersection scene and an obstacle hidden danger scene;
the data reporting module is used for uploading data of a target channel to the server side, and the server side stores the received channel data;
when the target channel determination module selects a target channel to be stored from channels of the unmanned vehicle, the target channel determination module is specifically configured to:
if the unmanned vehicle is communicated through a wireless local area network, the first channel information included in the scene information is used as a target channel to be stored;
and if the unmanned vehicle is communicated through a mobile network, using second channel information included in the scene information as a target channel to be stored, wherein the first channel information is different from the second channel information.
9. An apparatus, characterized in that the apparatus comprises:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the unmanned vehicle data storage method of any one of claims 1-7.
10. A storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the unmanned vehicle data storage method according to any one of claims 1-7.
CN201811270051.0A 2018-10-29 2018-10-29 Unmanned vehicle data storage method, device, equipment and storage medium Active CN109472884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811270051.0A CN109472884B (en) 2018-10-29 2018-10-29 Unmanned vehicle data storage method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811270051.0A CN109472884B (en) 2018-10-29 2018-10-29 Unmanned vehicle data storage method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109472884A CN109472884A (en) 2019-03-15
CN109472884B true CN109472884B (en) 2022-02-18

Family

ID=65666330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811270051.0A Active CN109472884B (en) 2018-10-29 2018-10-29 Unmanned vehicle data storage method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109472884B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110275510B (en) * 2019-06-14 2021-12-07 百度在线网络技术(北京)有限公司 Processing frequency adjusting method and device of vehicle control model and electronic equipment
CN111582018B (en) * 2020-03-24 2024-02-09 北京掌行通信息技术有限公司 Unmanned vehicle dynamic interaction scene judging method, unmanned vehicle dynamic interaction scene judging system, unmanned vehicle dynamic interaction scene judging terminal and storage medium
CN111694362B (en) * 2020-06-23 2024-01-12 北京京东乾石科技有限公司 Driving path planning method and device, storage medium and electronic equipment
CN112017325B (en) * 2020-08-06 2022-12-13 广州小鹏自动驾驶科技有限公司 Message processing method and device, vehicle and storage medium
CN113761306A (en) * 2020-09-30 2021-12-07 北京京东乾石科技有限公司 Vehicle-end data processing method and device
CN112200616A (en) * 2020-10-26 2021-01-08 新石器慧义知行智驰(北京)科技有限公司 Investigation method and device, electronic equipment and storage medium
CN113799795B (en) * 2020-10-30 2023-08-04 北京京东乾石科技有限公司 Unmanned vehicle control method, storage medium and electronic equipment
CN112287566B (en) * 2020-11-24 2024-05-07 北京亮道智能汽车技术有限公司 Automatic driving scene library generation method and system and electronic equipment
CN113033684A (en) * 2021-03-31 2021-06-25 浙江吉利控股集团有限公司 Vehicle early warning method, device, equipment and storage medium
CN113947893A (en) * 2021-09-03 2022-01-18 网络通信与安全紫金山实验室 Method and system for restoring driving scene of automatic driving vehicle
CN116760891B (en) * 2023-08-21 2023-11-03 西安华创马科智能控制系统有限公司 Data processing method and device for downhole multi-equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202026426U (en) * 2011-04-07 2011-11-02 上海迈迅威视觉科技有限公司 Vehicle-mounted video monitor
CN105978701A (en) * 2016-05-11 2016-09-28 惠州市凯越电子有限公司 Information interaction system and method between vehicle-mounted system and self-established channel of intelligent mobile terminal
CN106469477A (en) * 2016-08-31 2017-03-01 北京汇通天下物联科技有限公司 A kind of driving event recording method and system
CN107042824A (en) * 2015-10-23 2017-08-15 哈曼国际工业有限公司 System and method for detecting the accident in vehicle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8898443B2 (en) * 2010-10-01 2014-11-25 Z124 Multi-operating system
CN106303918B (en) * 2015-11-24 2020-06-02 北京智谷睿拓技术服务有限公司 Inter-device communication method, inter-device communication resource allocation method, and device thereof
US9889859B2 (en) * 2015-12-21 2018-02-13 Intel Corporation Dynamic sensor range in advanced driver assistance systems
CN105721544B (en) * 2016-01-19 2019-07-23 福州华鹰重工机械有限公司 Vehicle border information sharing method and device based on content
US10309794B2 (en) * 2016-03-04 2019-06-04 GM Global Technology Operations LLC Progressive map maintenance at a mobile navigation unit

Also Published As

Publication number Publication date
CN109472884A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109472884B (en) Unmanned vehicle data storage method, device, equipment and storage medium
US11548516B2 (en) Data acquisition method, apparatus, device and computer-readable storage medium
CN109345829B (en) Unmanned vehicle monitoring method, device, equipment and storage medium
US20200349057A1 (en) Using divergence to conduct log-based simulations
CN113196103A (en) Object motion classification for autonomous vehicles
CN111739344B (en) Early warning method and device and electronic equipment
CN110796007B (en) Scene recognition method and computing device
JP2022084758A (en) Vehicle monitoring method, device, electronic apparatus, storage medium, computer program, cloud control platform, and vehicle load cooperation system
CN113741485A (en) Control method and device for cooperative automatic driving of vehicle and road, electronic equipment and vehicle
CN108766031B (en) Method and device for detecting lane obstacle
US11480964B2 (en) Distributed system execution using a serial timeline
CN113511204B (en) Vehicle lane changing behavior identification method and related equipment
CN109635861B (en) Data fusion method and device, electronic equipment and storage medium
CN111553319A (en) Method and device for acquiring information
US20190268402A1 (en) Distributed computing of vehicle data
CN113033463A (en) Deceleration strip detection method and device, electronic equipment and storage medium
CN110765224A (en) Processing method of electronic map, vehicle vision repositioning method and vehicle-mounted equipment
CN114492022A (en) Road condition sensing data processing method, device, equipment, program and storage medium
WO2020139959A1 (en) Architecture for simulation of distributed systems
CN114179829A (en) Multi-end cooperative vehicle driving method, device, system and medium
CN114771576A (en) Behavior data processing method, control method of automatic driving vehicle and automatic driving vehicle
CN112923948B (en) Vehicle navigation broadcasting method and device, electronic equipment and storage medium
CN114619949B (en) Highway lane change prompting method and device, vehicle, electronic equipment and medium
CN114596704A (en) Traffic event processing method, device, equipment and storage medium
CN112885087A (en) Method, apparatus, device and medium for determining road condition information and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant