CN113761306A - Vehicle-end data processing method and device - Google Patents


Info

Publication number
CN113761306A
CN113761306A (application CN202011065298.6A)
Authority
CN
China
Prior art keywords
vehicle
end data
detection result
scene
vehicle end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011065298.6A
Other languages
Chinese (zh)
Inventor
段亚男
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN202011065298.6A
Publication of CN113761306A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/906: Clustering; Classification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques

Abstract

The invention discloses a vehicle-end data processing method and device, and relates to the field of computer technology. The method comprises the following steps: acquiring vehicle-end data generated while the vehicle drives autonomously; based on the type of the vehicle-end data, calling a corresponding scene detection model to detect the vehicle-end data and determine a scene detection result, where the scene detection model detects scenes arising in the automatic driving process, and the scene detection result is either a first-class detection result, indicating that an abnormal scene was detected, or a second-class detection result, indicating that no abnormal scene was detected; and storing the vehicle-end data corresponding to the first-class detection result while deleting all other vehicle-end data. The embodiment reduces labor and network-resource costs, saves storage resources, and improves the update and iteration efficiency of the automatic driving algorithm.

Description

Vehicle-end data processing method and device
Technical Field
The invention relates to the technical field of computers, in particular to a vehicle-end data processing method and device.
Background
The automatic driving technology replaces the traditional driver operation through a computer system, and is a new trend of future development of the automobile industry.
To update and iterate an automatic driving algorithm, the algorithm's problem data must first be located, after which problem regression and algorithm iteration can be performed. Currently, during operation, the problem data of an automatic driving algorithm is determined mainly in one of two ways: an on-site operator records problems and their time points in real time, screens the required data records, and iterates the algorithm after the data is returned; or the on-site operator records nothing, all data is returned, and the required problem data is searched out of the full data set later to iterate the algorithm.
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
real-time on-site recording by operators makes the labor cost of automatic driving operation too high; and because the storage resources of an autonomous vehicle are limited, retaining all data wastes storage, transmitting the retained data back occupies substantial network bandwidth at high cost, and locating the problem data within the full data set afterwards is too inefficient.
Disclosure of Invention
In view of this, the invention provides a vehicle-end data processing method and device, which can reduce the costs of manpower and network resources, save storage resources, and improve the efficiency of updating and iterating an automatic driving algorithm.
To achieve the above object, according to one aspect of the present invention, a vehicle-end data processing method is provided.
The vehicle end data processing method comprises the following steps:
acquiring vehicle end data generated in the automatic driving process of the vehicle;
based on the type of the vehicle end data, calling a corresponding scene detection model to detect the vehicle end data and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise a first class detection result and a second class detection result, the first class detection result is used for indicating that an abnormal scene is detected, and the second class detection result is used for indicating that the abnormal scene is not detected;
and storing the vehicle end data corresponding to the first type of detection result, and deleting the vehicle end data except the vehicle end data corresponding to the first type of detection result.
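The three claimed steps can be sketched in Python as follows. This is an illustrative sketch only: the `detectors` mapping, field names, and thresholds are hypothetical stand-ins, since the patent does not specify data formats.

```python
def process_frames(frames, detectors):
    """Run the per-type detector on each frame and keep only frames
    whose detection result is first-class (abnormal scene detected)."""
    kept = []
    for frame in frames:
        detect = detectors[frame["type"]]  # model chosen by data type (step 2)
        if detect(frame):                  # first-class result: abnormal scene
            kept.append(frame)             # store (step 3)
        # second-class result: frame is not stored, i.e. deleted
    return kept

# Hypothetical detectors keyed by vehicle-end data type.
detectors = {
    "chassis": lambda f: f["speed"] > 60.0,     # stand-in "overspeed" check
    "obstacle": lambda f: f["distance"] < 0.5,  # stand-in "collision" check
}

frames = [
    {"type": "chassis", "speed": 45.0},     # normal scene: deleted
    {"type": "chassis", "speed": 72.5},     # abnormal scene: kept
    {"type": "obstacle", "distance": 0.2},  # abnormal scene: kept
]
saved = process_frames(frames, detectors)
```

Only the two abnormal frames survive; the normal frame is discarded rather than stored.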
Optionally, the obtaining vehicle-end data generated in the automatic driving process of the vehicle includes:
receiving and storing vehicle end data generated in the automatic driving process of the vehicle; after a trigger instruction is received, a vehicle end data identifier carried by the trigger instruction is analyzed, and vehicle end data corresponding to the vehicle end data identifier is obtained.
The method further comprises the following steps:
and after the vehicle end data corresponding to the vehicle end data identification is obtained, judging whether the obtained data is successful, and if not, reporting error information.
Optionally, the calling a corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type, and determining a scene detection result includes:
according to configuration information of a scene detection model, vehicle end data required by the model are selected from vehicle end data corresponding to the vehicle end data identification; and inputting the vehicle end data required by the model into the scene detection model to detect the vehicle end data and determine a scene detection result.
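The configuration-driven input selection described above might look like the following sketch; `MODEL_CONFIG`, the model names, and the category names are assumptions for illustration, not taken from the patent.

```python
# Hypothetical per-model configuration: which vehicle-end data
# categories each scene detection model requires as input.
MODEL_CONFIG = {
    "collision": ["obstacle", "localization"],
    "red_light": ["traffic_light", "localization", "trajectory"],
}

def select_inputs(model_name, record):
    """Select from a full vehicle-end data record only the fields the
    named model's configuration requires."""
    return {key: record[key] for key in MODEL_CONFIG[model_name]}

record = {
    "obstacle": "ob-data",
    "localization": "loc-data",
    "traffic_light": "tl-data",
    "trajectory": "pnc-data",
}
collision_inputs = select_inputs("collision", record)
```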
Optionally, the scene detection model includes:
a collision scene detection model or a red light running scene detection model.
Optionally, the method further comprises:
and setting a category label for the vehicle end data according to the scene detection result, storing the vehicle end data corresponding to the first type detection result according to the category label, and deleting the vehicle end data except the vehicle end data corresponding to the first type detection result.
Optionally, the method further comprises:
and returning the stored vehicle-side data to the cloud database so as to optimize the automatic driving algorithm according to the returned vehicle-side data.
According to still another aspect of the present invention, a vehicle-end data processing apparatus is provided.
The vehicle-end data processing device of the present invention includes:
the acquisition module is used for acquiring vehicle end data generated in the automatic driving process of the vehicle;
the evaluation module is used for calling a corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise a first class detection result and a second class detection result, the first class detection result is used for indicating that an abnormal scene is detected, and the second class detection result is used for indicating that the abnormal scene is not detected;
and the data processing module is used for storing the vehicle end data corresponding to the first type of detection result and deleting the vehicle end data except the vehicle end data corresponding to the first type of detection result.
According to another aspect of the invention, an electronic device for vehicle-end data processing is provided.
The vehicle-end data processing electronic device of the present invention includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the vehicle-end data processing method provided by the invention.
According to still another aspect of the present invention, there is provided a computer-readable medium on which a computer program is stored, the program, when executed by a processor, implementing the vehicle-end data processing method provided by the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
by carrying out model detection on the field transmission data and storing the vehicle-end data of the abnormal scene, the technical problems of high manpower and network cost, limited storage resources and low efficiency of searching problem data are solved, and the technical effects of reducing the manpower and network resource cost, saving the storage resources and improving the updating iteration efficiency of the automatic driving algorithm are further achieved.
Further effects of the optional embodiments mentioned above are described below in connection with specific embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a schematic diagram of a main flow of a vehicle-end data processing method according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of a detailed flow of a vehicle-end data processing method according to a second embodiment of the invention;
fig. 3 is a schematic diagram of main blocks of a vehicle-end data processing apparatus according to a third embodiment of the present invention;
FIG. 4 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
FIG. 5 is a schematic block diagram of a computer system suitable for use with the electronic device to implement an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic diagram of a main flow of a vehicle-end data processing method according to an embodiment of the present invention, and as shown in fig. 1, the vehicle-end data processing method according to the embodiment of the present invention includes:
and step S101, vehicle end data generated in the automatic driving process of the vehicle are obtained.
For example, an autonomous vehicle is provided with a vehicle-end operation control device that controls the vehicle's driving process. While an autonomous vehicle is driving, different operation scenes occur, including normal scenes and fault scenes. Normal scenes include normal driving, parking, starting, and temporary stops (e.g., at a red light); fault scenes include collision, running a red light, sudden stop (sudden stall), overspeed, and brake failure.
In this step, the vehicle-end data generated during the automatic driving of the vehicle can be received in real time. The vehicle end data generated in the automatic driving process of the vehicle can comprise: and the vehicle end data under the normal scene or the vehicle end data under the abnormal scene.
Step S102, calling a corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type, and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise first detection results and second detection results, the first detection results are used for indicating that abnormal scenes are detected, and the second detection results are used for indicating that the abnormal scenes are not detected.
Illustratively, the scene detection model is used for detecting scenes in an automatic driving process in real time, wherein the scenes comprise normal scenes and abnormal scenes. In this step, a scene detection model is pre-configured in the vehicle-end operation control device, and based on the obtained vehicle-end data type, the scene detection model is called to perform real-time detection on vehicle-end data generated in the automatic driving process of the vehicle, so as to obtain a scene detection result, including: a first type detection result and a second type detection result.
Further, the first-class detection result indicates that an abnormal scene was detected, that is, a fault occurred; the second-class detection result indicates that no abnormal scene was detected, that is, no fault occurred.
And S103, storing the vehicle end data corresponding to the first type detection result, and deleting the vehicle end data except the vehicle end data corresponding to the first type detection result.
In this step, the first-class detection result indicates that an abnormal scene was detected and corresponds to vehicle-end data of a "fault" scene; the second-class detection result indicates that no abnormal scene was detected and corresponds to vehicle-end data of a "no fault" scene. In the invention, the vehicle-end data of the "fault" scene is stored in real time, and the vehicle-end data of the "no fault" scene is deleted.
In the embodiment of the invention, by carrying out scene model detection on the field transmission data, the problem data can be detected in real time and only the required vehicle-end data is stored, so that the cost of manpower and network resources is reduced, the storage resources are saved, and the updating iteration efficiency of the automatic driving algorithm is improved.
Fig. 2 is a schematic diagram of a detailed flow of a vehicle-end data processing method according to a second embodiment of the invention. The vehicle-end data processing method provided by the embodiment of the invention can be executed by a data processing device. As shown in fig. 2, the vehicle-end data processing method according to the embodiment of the present invention includes:
and step S201, transmitting the vehicle end data to the data processing device.
In this step, the vehicle-end data generated during the running of the autonomous vehicle is transmitted to the data processing device.
Illustratively, the vehicle-end data may include a combination of at least one or more of the following types of data: positioning information, sensing traffic light information, sensing obstacle information, planning control track information, chassis information and the like. Further, in order to distinguish the vehicle-end data generated at different times, the corresponding serial numbers can be transmitted at the same time of transmitting the vehicle-end data. For example, the vehicle-end data sent at a time may include: the system comprises a serial number 1, positioning information, sensing traffic light information, sensing barrier information, planning control track information and chassis information; serial number 2, positioning information, sensing traffic light information, sensing barrier information, planning control track information and chassis information.
Step S202, the data processing device receives and stores the vehicle end data.
In this step, the data processing device stores the vehicle end data after receiving the vehicle end data.
Further, when the vehicle-end data is stored, it can be stored according to its category. For example, vehicle-end data of the same category is stored in the same MAP (map) object, and vehicle-end data of different categories is stored in different MAP objects. A MAP is a data structure accessed directly by key; it is essentially a set of key-value pairs, generally stored in the form [key, value]. For example, the positioning information data may be stored as [serial number 1, positioning information 1, serial number 2, positioning information 2, ...], and the planning control track information data as [serial number 1, planning control track information 1, serial number 2, planning control track information 2, ...].
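The per-category MAP storage described above can be sketched as follows; the class and method names are illustrative only.

```python
from collections import defaultdict

class FrameStore:
    """One map per data category, each keyed by serial number,
    mirroring the [key, value] layout described above."""

    def __init__(self):
        self._maps = defaultdict(dict)  # category -> {serial number: payload}

    def put(self, category, serial, payload):
        self._maps[category][serial] = payload

    def get(self, category, serial):
        return self._maps[category].get(serial)

store = FrameStore()
store.put("localization", 1, "positioning information 1")
store.put("localization", 2, "positioning information 2")
store.put("trajectory", 1, "planning control track information 1")
```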
In step S203, the data processing apparatus receives a trigger instruction.
In this step, the data processing device receives the vehicle-end data and simultaneously receives a trigger instruction.
Further, the trigger instruction may include detection information of a scene. The scene may include one or more of: collision scenes, red light running scenes, overspeed scenes and the like are selected from the scenes according to the detection requirements.
Further, the data processing apparatus receives a trigger instruction through a read function.
Step S204, after receiving the trigger instruction, the data processing device analyzes the vehicle end data identifier carried by the trigger instruction, and obtains the vehicle end data corresponding to the vehicle end data identifier carried by the trigger instruction.
In this step, after receiving the trigger instruction, the data processing device immediately analyzes the trigger instruction to analyze the vehicle-end data identifier carried by the trigger instruction, and then calls the required vehicle-end data according to the vehicle-end data identifier.
Illustratively, the vehicle-end data identification includes: serial number, category identification of vehicle end data.
Further, the serial number of the vehicle-end data may be a frame number.
Further, the trigger instruction indicates which frame's scene is to be detected and may carry the corresponding vehicle-end data identifiers. For example, the trigger instruction indicates that the red-light-running scene of the 5th frame is to be detected. After receiving this instruction, the data processing apparatus immediately parses it; the vehicle-end data identifiers carried by the instruction include PNC-4, OB-6, and LC-10. Here PNC-4 means the serial number is 4 and the data category is planning control track information; OB-6 means the serial number is 6 and the data category is perceived obstacle information; LC-10 means the serial number is 10 and the data category is positioning information. Then, according to the parsed identifiers, the 4th frame of planning control track information is retrieved from the planning-control-track storage area, the 6th frame of perceived obstacle information from the perceived-obstacle storage area, and the 10th frame of positioning information from the positioning-information storage area.
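The identifier parsing in this step might be sketched as below for identifiers of the "PNC-4, OB-6, LC-10" form used in the example above; the prefix-to-category mapping and function name are assumptions for illustration.

```python
# Assumed mapping from identifier prefix to data category.
CATEGORY_BY_PREFIX = {
    "PNC": "planning control track",
    "OB": "perceived obstacle",
    "LC": "positioning",
}

def parse_identifiers(spec):
    """Split a trigger instruction's identifier list, e.g.
    'PNC-4, OB-6, LC-10', into (category, serial number) pairs."""
    pairs = []
    for token in spec.split(","):
        prefix, serial = token.strip().rsplit("-", 1)
        pairs.append((CATEGORY_BY_PREFIX[prefix], int(serial)))
    return pairs

parsed = parse_identifiers("PNC-4, OB-6, LC-10")
```

Each pair then names the storage area and the frame to retrieve from it.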
Step S205, the data processing device determines whether the vehicle-end data acquisition is successful.
In this step, the data processing device determines whether the calling of the required vehicle-end data is successful, if so, step S206 is executed; if not, go to step S209.
It should be noted that steps S201 to S205 are an alternative embodiment of "acquiring vehicle-end data generated during automatic driving of the vehicle". In other embodiments of the present invention, the vehicle-end data may be obtained immediately after receiving the vehicle-end data, or when a timing trigger condition is satisfied.
And S206, determining a scene detection result by the data processing device based on the scene detection model and the vehicle end data.
In this step, the scene detection model is used to detect a scene in the automatic driving process. After the data processing device obtains the required vehicle end data, the vehicle end data required by the model is input into the scene detection model, the algorithm of the scene detection model is operated to detect the vehicle end data, and a scene detection result is obtained after the detection is finished. Illustratively, the scene detection models may include collision scene detection models, speeding scene detection models, red light running scene detection models, and so on.
The scene detection result comprises a first type detection result or a second type detection result, the first type detection result is used for indicating that an abnormal scene is detected, and the second type detection result is used for indicating that the abnormal scene is not detected. And the data processing device determines the scene detection result to be a first type detection result or a second type detection result based on the scene detection model and the required vehicle end data.
Further, the scene detection model is a pre-constructed detection algorithm for a particular scene; the information it stores includes the parameters required for detection and the judgment conditions of the detection. For example, the overspeed scene detection model stores the parameters required for detection, such as serial number, speed information, positioning information, and planning control track information, together with the overspeed judgment condition. In a specific implementation, the values of the required parameters can be obtained through step S204. The overspeed judgment condition may be, for example: the vehicle is judged to be overspeeding when its speed exceeds a preset limit.
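The overspeed judgment condition could, for example, be realized as a simple threshold check; the default limit value below is a placeholder, not specified by the patent.

```python
def is_overspeed(speed_mps, limit_mps=16.7):
    """Overspeed judgment condition: the speed exceeds the limit.
    The default limit (about 60 km/h) is a placeholder value."""
    return speed_mps > limit_mps

fast = is_overspeed(20.0)  # exceeds the limit: overspeed
slow = is_overspeed(10.0)  # within the limit: not overspeed
```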
Further, according to actual detection requirements, if continuous detection is needed, the data processing device can obtain the required consecutive frames of vehicle-end data, run the model algorithm on them based on the scene detection model, and output a continuous detection result once the scene detection results are obtained.
And step S207, setting a category label for the vehicle end data according to the scene detection result.
In this step, the corresponding category labels may be preconfigured for different scenarios. For example, for a collision scenario, the following labels are configured: "TCS" and "NTCS". Wherein, "TCS" indicates that a collision occurs at the time point, and "NTCS" indicates that no collision occurs at the time point. For example, for the overspeed scenario, the following tags are configured: "TSD" and "NTSD". Wherein, the "TSD" indicates that the time point is overspeed, and the "NTSD" indicates that the time point is not overspeed. For example, for a red light running scene, the following tags are configured: "TRDL" and "NTRDL". Wherein, "TRDL" indicates that the red light is being run at the time point, and "NTRDL" indicates that the red light is not being run at the time point. After the data processing device determines that the scene detection result is the first type of detection result or the second type of detection result based on the scene detection model and the required vehicle-end data, the corresponding label can be selected from the pre-configured category labels according to the obtained detection result so as to mark the vehicle-end data corresponding to the scene detection result.
And S208, storing the vehicle end data corresponding to the first type detection result according to the type label, and deleting the vehicle end data except the vehicle end data corresponding to the first type detection result.
In this step, the data processing device screens the vehicle-end data according to the category label. For example, if the category tag is "TCS", it indicates that a collision occurs at the time point, that is, the collision scene detection result in step S207 is the first-type detection result, and the vehicle-side data corresponding to the first-type detection result of the collision scene is stored; if the category label is "NTCS", it indicates that no collision occurs at the time point, that is, the collision scene detection result in step S207 is the second type detection result, and the vehicle end data corresponding to the second type detection result of the collision scene is deleted.
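The label-based screening can be sketched as follows, reusing the category labels configured in step S207; the function name is illustrative.

```python
# First-class (abnormal) labels from step S207: collision, overspeed,
# and red-light-running respectively.
ABNORMAL_LABELS = {"TCS", "TSD", "TRDL"}

def screen_by_label(tagged_frames):
    """Keep frames whose category label is first-class (abnormal);
    everything else is deleted."""
    return [f for f in tagged_frames if f["label"] in ABNORMAL_LABELS]

tagged = [
    {"label": "TCS", "serial": 5},   # collision occurred: kept
    {"label": "NTCS", "serial": 6},  # no collision: deleted
    {"label": "TSD", "serial": 7},   # overspeed: kept
]
screened = screen_by_label(tagged)
```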
Further, after the data processing apparatus filters the vehicle-side data in step S208, the stored vehicle-side data may be written into a hard disk for physical storage.
Step S209, the processing flow is ended and error information is reported.
In this step, if the data processing device fails to successfully acquire the required vehicle-side data, the processing flow is ended and error information is reported.
Further, the method of the embodiment of the present invention may further include the steps of: after step S208, the saved vehicle-side data may be transmitted back to the cloud database, so as to optimize the automatic driving algorithm according to the transmitted vehicle-side data.
In the embodiment of the invention, various scenes are detected in real time through the pre-configured model library, and the vehicle-end data are screened based on the detection result, so that the problem data can be detected in real time and only the required vehicle-end data are stored, thereby not only reducing the cost of manpower and network resources and saving the storage resources, but also improving the updating iteration efficiency of the automatic driving algorithm.
The flow shown in fig. 2 is further explained below by taking a collision scenario as an example.
And step A, the data processing device receives and stores the vehicle end data.
Illustratively, the vehicle-end data may include: serial number 1, positioning information, sensing traffic light information, sensing barrier information, planning control track information, chassis information and the like.
And B, the data processing device receives a trigger instruction. For example, the trigger instruction received by the data processing apparatus is: the collision scene of frame 5 is detected.
Step C: after receiving the instruction to detect the collision scene of the 5th frame, the data processing device immediately parses it; the vehicle-end data identifiers carried by the instruction include OB-5 and LC-6. Here OB-5 means the serial number is 5 and the data category is perceived obstacle information; LC-6 means the serial number is 6 and the data category is positioning information. The data processing device then retrieves the 5th frame of perceived obstacle information from the perceived-obstacle storage area and the 6th frame of positioning information from the positioning-information storage area.
D, after the data processing device judges that the required vehicle end data are successfully acquired, executing the step E; if not, reporting error information.
And E, the data processing device operates the algorithm of the collision scene detection model to detect based on the collision scene detection model and the required vehicle end data.
The collision scene detection model stores parameters required by model detection and judgment conditions of the model detection. For example, the parameters required for the collision scenario detection model detection include: sensing barrier information and positioning information, wherein the judgment conditions of the model detection are as follows: it is detected whether the autonomous vehicle polygon intersects with the obstacle polygon. After vehicle end data required for detecting the 5 th frame collision scene is acquired, the collision scene detection model can be operated. And after the algorithm detection of the collision scene detection model is finished, obtaining the detection result of the collision scene to determine whether the 5 th frame collides.
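The judgment condition "whether the vehicle polygon intersects the obstacle polygon" could be realized with, for example, a separating-axis test for convex polygons; the patent does not prescribe a particular geometry routine, so this is one possible sketch.

```python
def _axes(poly):
    """Yield an outward edge normal for each edge of a convex polygon."""
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        yield (-(y2 - y1), x2 - x1)

def convex_polygons_intersect(a, b):
    """Separating axis theorem: two convex polygons are disjoint iff
    their projections are disjoint on some edge-normal axis."""
    for ax, ay in list(_axes(a)) + list(_axes(b)):
        proj_a = [px * ax + py * ay for px, py in a]
        proj_b = [px * ax + py * ay for px, py in b]
        if max(proj_a) < min(proj_b) or max(proj_b) < min(proj_a):
            return False  # a separating axis exists: no intersection
    return True

# Axis-aligned rectangles standing in for vehicle and obstacle footprints.
vehicle = [(0, 0), (2, 0), (2, 1), (0, 1)]
obstacle = [(1.5, 0.5), (3, 0.5), (3, 2), (1.5, 2)]
hit = convex_polygons_intersect(vehicle, obstacle)
miss = convex_polygons_intersect(vehicle, [(5, 5), (6, 5), (6, 6), (5, 6)])
```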
And F, selecting a class label aiming at the collision scene from the pre-configured class labels by the data processing device according to the detection result of the collision scene: and the TCS or NTCS is used for marking vehicle-end data corresponding to the detection result of the collision scene.
For example, if a collision is detected, the corresponding vehicle end data may be marked as "TCS"; if no collision is detected, the corresponding vehicle end data is marked as NTCS. Further, the data processing device stores the vehicle end data with the category label of "TCS". Wherein the vehicle end data with the category label of "TCS" includes: the 5 th frame perceives barrier information, and the 6 th frame locates information. And the data processing device can delete the vehicle-end data with the category label of NTCS.
In the embodiment of the invention, the collision scene can be detected in real time through the steps, the vehicle-end data are screened based on the detection result, the problem data can be detected in real time, and only the required vehicle-end data are stored, so that the cost of manpower and network resources is reduced, the storage resources are saved, and the updating iteration efficiency of the automatic driving algorithm is improved.
Fig. 3 is a schematic diagram of main blocks of a vehicle-end data processing device according to a third embodiment of the present invention. As shown in fig. 3, a vehicle-end data processing apparatus 300 according to an embodiment of the present invention includes: an acquisition module 301, an evaluation module 302 and a data processing module 303.
The obtaining module 301 is configured to obtain vehicle-end data generated in an automatic driving process of a vehicle.
For example, an autonomous vehicle is provided with a vehicle-end operation control device that controls the travel progress of the vehicle. During the driving process of the automatic driving vehicle, different operation scenes exist, including a normal scene and a fault scene. Normal scenarios are such as: normal driving, parking, starting, temporary parking (red lights, etc.), etc., and fault scenarios such as: collision, running a red light, sudden stop (sudden flameout), overspeed, brake failure, etc.
The obtaining module 301 may receive, in real time, vehicle-end data generated during the automatic driving of the vehicle. Such data may include vehicle-end data in a normal scene or vehicle-end data in an abnormal scene.
The evaluation module 302 is configured to invoke the corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type and determine a scene detection result. The scene detection model detects scenes in the automatic driving process; the scene detection result is either a first-type detection result, indicating that an abnormal scene was detected, or a second-type detection result, indicating that no abnormal scene was detected.
Illustratively, the scene detection model detects scenes, both normal and abnormal, in the automatic driving process in real time. A scene detection model is pre-configured in the vehicle-end operation control device, and the evaluation module 302 invokes it, based on the obtained vehicle-end data type, to detect the vehicle-end data generated during automatic driving in real time and obtain a scene detection result: either a first-type detection result or a second-type detection result.
Further, the first-type detection result indicates that an abnormal scene was detected, that is, a fault occurred; the second-type detection result indicates that no abnormal scene was detected, that is, no fault occurred.
The data processing module 303 is configured to store the vehicle-end data corresponding to the first-type detection result and delete the vehicle-end data other than that corresponding to the first-type detection result.
In this module, the first-type detection result indicates that an abnormal scene was detected and corresponds to the vehicle-end data of a "fault" scene; the second-type detection result indicates that no abnormal scene was detected and corresponds to the vehicle-end data of a "no fault" scene. In the present invention, the data processing module 303 stores the vehicle-end data of "fault" scenes in real time and deletes the vehicle-end data of "no fault" scenes.
In this embodiment of the invention, the obtaining module obtains the vehicle-end data generated during the automatic driving of the vehicle; the evaluation module invokes the scene detection model, based on the obtained vehicle-end data type, to detect that data in real time and obtain a scene detection result; and the data processing module stores the vehicle-end data corresponding to the first-type detection result and deletes the rest. Problem data can thus be detected in real time and only the required vehicle-end data stored, which reduces the cost of manpower and network resources, saves storage resources, and improves the update and iteration efficiency of the automatic driving algorithm.
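The three-module pipeline described for apparatus 300 can be sketched as follows. This is a minimal, hypothetical sketch: the model registry, data layout, and detection logic are stand-ins invented for illustration, not the patented models.

```python
FIRST_TYPE, SECOND_TYPE = "abnormal_detected", "no_abnormality"

def detect_collision(data):
    # Stand-in for a real collision-scene detection model.
    return FIRST_TYPE if data.get("collision") else SECOND_TYPE

# Hypothetical registry mapping vehicle-end data type to a scene detection model.
SCENE_MODELS = {"collision": detect_collision}

class VehicleDataProcessor:
    def __init__(self):
        self.store = []  # retained (abnormal-scene) vehicle-end data

    def acquire(self, data):        # obtaining module 301
        return data

    def evaluate(self, data):       # evaluation module 302
        model = SCENE_MODELS[data["type"]]
        return model(data)

    def process(self, data):        # data processing module 303
        result = self.evaluate(self.acquire(data))
        if result == FIRST_TYPE:
            self.store.append(data)  # keep fault-scene data
        # second-type (no-fault) data is simply dropped, i.e. deleted
        return result

proc = VehicleDataProcessor()
proc.process({"type": "collision", "collision": True})
proc.process({"type": "collision", "collision": False})
# proc.store now holds only the abnormal-scene record
```

The design point the sketch illustrates is that filtering happens at the vehicle end: only first-type (fault-scene) data survives to be stored or uploaded, so normal-scene data never consumes storage or network resources.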
Fig. 4 illustrates an exemplary system architecture 400 of a vehicle-end data processing device to which embodiments of the present invention may be applied.
As shown in fig. 4, the system architecture 400 may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 serves as a medium providing communication links between the terminal devices 401, 402, 403 and the server 405, and may include various types of connections, such as wired links, wireless communication links, or fiber-optic cables.
A user may use terminal devices 401, 402, 403 to interact with a server 405 over a network 404 to receive or send messages or the like. The terminal devices 401, 402, 403 may have various communication client applications installed thereon, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, social platform software, and the like.
The terminal devices 401, 402, 403 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 405 may be a server that provides various services, such as a background management server that supports various websites browsed by the user using the terminal devices 401, 402, and 403. The background management server can analyze and process the received data such as the product information inquiry request and feed back the processing result to the terminal equipment.
It should be noted that the vehicle-end data processing method provided by the embodiment of the present invention is generally executed by the server 405, and accordingly, the vehicle-end data processing apparatus is generally disposed in the server 405.
It should be understood that the number of terminal devices, networks, and servers in fig. 4 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 5, shown is a block diagram of a computer system 500 suitable for use with a terminal device implementing an embodiment of the present invention. The terminal device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU)501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage portion 508 including a hard disk and the like; and a communication portion 509 including a network interface card such as a LAN card or a modem. The communication portion 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read therefrom is installed into the storage portion 508 as needed.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented in software or hardware. The described modules may also be provided in a processor, which may be described as: a processor including an obtaining module, an evaluation module, and a data processing module. In some cases, the names of these modules do not constitute a limitation on the modules themselves; for example, the obtaining module may also be described as a "module that sends a data acquisition request to a connected server".
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be separate and not incorporated into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to comprise: acquiring vehicle end data generated in the automatic driving process of the vehicle; based on the vehicle end data type, calling a corresponding scene detection model to detect the vehicle end data and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise a first class detection result and a second class detection result, the first class detection result is used for indicating that an abnormal scene is detected, and the second class detection result is used for indicating that the abnormal scene is not detected; and storing the vehicle end data corresponding to the first type of detection result, and deleting the vehicle end data except the vehicle end data corresponding to the first type of detection result.
According to the technical scheme of the embodiment of the invention, the technical effects of reducing the cost of manpower and network resources, saving storage resources and improving the updating and iteration efficiency of the automatic driving algorithm can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A vehicle-end data processing method, characterized by comprising the following steps:
acquiring vehicle end data generated in the automatic driving process of the vehicle;
based on the type of the vehicle end data, calling a corresponding scene detection model to detect the vehicle end data and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise a first class detection result and a second class detection result, the first class detection result is used for indicating that an abnormal scene is detected, and the second class detection result is used for indicating that the abnormal scene is not detected;
and storing the vehicle end data corresponding to the first type of detection result, and deleting the vehicle end data except the vehicle end data corresponding to the first type of detection result.
2. The method of claim 1, wherein the obtaining of vehicle-end data generated during automatic driving of the vehicle comprises:
receiving and storing vehicle end data generated in the automatic driving process of the vehicle; after receiving the trigger instruction, analyzing the vehicle end data identification carried by the trigger instruction, and acquiring the vehicle end data corresponding to the vehicle end data identification.
3. The method of claim 2, wherein the method further comprises:
after the vehicle-end data corresponding to the vehicle-end data identification is obtained, judging whether the data is obtained successfully, and if not, reporting error information.
4. The method of claim 2, wherein the calling of the corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type, and the determining of the scene detection result comprises:
according to configuration information of a scene detection model, vehicle end data required by the model are selected from vehicle end data corresponding to the vehicle end data identification; and inputting the vehicle end data required by the model into the scene detection model to detect the vehicle end data and determine a scene detection result.
5. The method of claim 4, wherein the scene detection model comprises: a collision scene detection model or a red light running scene detection model.
6. The method of claim 1, wherein the method further comprises:
and setting a category label for the vehicle end data according to the scene detection result, storing the vehicle end data corresponding to the first type detection result according to the category label, and deleting the vehicle end data except the vehicle end data corresponding to the first type detection result.
7. The method of claim 6, wherein the method further comprises:
and returning the stored vehicle-side data to the cloud database so as to optimize the automatic driving algorithm according to the returned vehicle-side data.
8. A vehicle-end data processing device is characterized by comprising:
the acquisition module is used for acquiring vehicle end data generated in the automatic driving process of the vehicle;
the evaluation module is used for calling a corresponding scene detection model to detect the vehicle-end data based on the vehicle-end data type and determining a scene detection result; the scene detection model is used for detecting scenes in the automatic driving process, the scene detection results comprise a first class detection result and a second class detection result, the first class detection result is used for indicating that an abnormal scene is detected, and the second class detection result is used for indicating that the abnormal scene is not detected;
and the data processing module is used for storing the vehicle end data corresponding to the first type of detection result and deleting the vehicle end data except the vehicle end data corresponding to the first type of detection result.
9. An electronic device for processing vehicle-end data, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202011065298.6A 2020-09-30 2020-09-30 Vehicle-end data processing method and device Pending CN113761306A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011065298.6A CN113761306A (en) 2020-09-30 2020-09-30 Vehicle-end data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011065298.6A CN113761306A (en) 2020-09-30 2020-09-30 Vehicle-end data processing method and device

Publications (1)

Publication Number Publication Date
CN113761306A true CN113761306A (en) 2021-12-07

Family

ID=78785793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011065298.6A Pending CN113761306A (en) 2020-09-30 2020-09-30 Vehicle-end data processing method and device

Country Status (1)

Country Link
CN (1) CN113761306A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023226733A1 (en) * 2022-05-27 2023-11-30 中国第一汽车股份有限公司 Vehicle scene data acquisition method and apparatus, storage medium and electronic device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination