CN115439957B - Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium - Google Patents


Info

Publication number
CN115439957B
CN115439957B (application number CN202211117297.0A)
Authority
CN
China
Prior art keywords
vehicle
data acquisition
data
rule
intelligent driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211117297.0A
Other languages
Chinese (zh)
Other versions
CN115439957A (en)
Inventor
施亮
王丰源
左锐
郑骁栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC Volkswagen Automotive Co Ltd
Original Assignee
SAIC Volkswagen Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC Volkswagen Automotive Co Ltd filed Critical SAIC Volkswagen Automotive Co Ltd
Priority to CN202211117297.0A priority Critical patent/CN115439957B/en
Publication of CN115439957A publication Critical patent/CN115439957A/en
Application granted granted Critical
Publication of CN115439957B publication Critical patent/CN115439957B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0841 Registering performance data
    • G07C 5/085 Registering performance data using electronic data carriers
    • G07C 5/0866 Registering performance data using electronic data carriers, the electronic data carrier being a digital video recorder in combination with video camera
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 40/00 IoT characterised by the purpose of the information processing
    • G16Y 40/10 Detection; Monitoring
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to an intelligent driving data acquisition method, acquisition device, acquisition equipment, and computer readable storage medium. The method comprises: S1, editing data acquisition requirements at a local end and generating data acquisition rules based on the data acquisition requirements; S2, publishing the data acquisition rules to a cloud release platform, which issues them to the vehicles specified by the rules; S3, the vehicle adds or updates the data acquisition rules; S4, the vehicle acquires perception data based on a scene recognition algorithm; and S5, the vehicle sends the acquired perception data to a data center. The intelligent driving data acquisition method, device, equipment, and computer readable storage medium provided by the application can effectively acquire the perception data of a vehicle.

Description

Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium
Technical Field
The application relates to the technical field of vehicle big data, in particular to an intelligent driving data acquisition method, an intelligent driving data acquisition device, intelligent driving data acquisition equipment and a computer readable storage medium.
Background
Almost all OEMs (vehicle manufacturers) and intelligent driving solution suppliers worldwide are currently trying to acquire intelligent driving data; by continuously training on these data and improving the performance of intelligent driving algorithms, they develop more reliable and more comfortable intelligent driving functions. It can be said that vehicle big data is the source for improving the performance and functional experience of intelligent driving software.
The current mainstream method for acquiring intelligent driving data is to deploy a series of intelligent driving sensors on one or more vehicles (hereinafter, the vehicle end), including cameras, millimeter wave radars, laser radars, ultrasonic radars, inertial measurement units (IMU), wheel speed sensors, high-precision positioning and the like, which sense the environmental information and data around the vehicle (including surrounding vehicles, pedestrians, obstacles, lane lines, traffic lights, passable areas and so on). This information is then collected and stored in real time and transmitted to a data center via a wireless network. As the number of vehicle sensors increases, the amount of data collected per unit time grows larger and larger, and the storage capacity of the data center is challenged more and more. Furthermore, as software algorithms mature, algorithm training needs more extreme scenarios (corner cases) rather than a large number of normal scenarios (normal cases) to improve performance. This also means that the efficiency of vehicle-end data collection becomes lower and lower over time, and that mining valuable scenarios from the limited data becomes more and more difficult. Therefore, how to find an intelligent driving data acquisition method that reduces cost and improves efficiency has become one of the hot spots of current research.
Disclosure of Invention
In order to solve the above problems in the prior art, the present application provides an intelligent driving data acquisition method, an intelligent driving data acquisition device, intelligent driving data acquisition equipment, and a computer readable storage medium, which can effectively acquire the perception data of a vehicle.
Specifically, the application provides an intelligent driving data acquisition method, which comprises the following steps:
S1, editing data acquisition requirements at a local end, and generating data acquisition rules based on the data acquisition requirements, wherein data acquisition refers to acquisition of perception data from the vehicle-end intelligent driving sensors, the data acquisition requirements specify the various acquisition-scene conditions that the data acquisition must satisfy, and the data acquisition rules are the rule sets for data acquisition corresponding to those requirements;
S2, publishing the data acquisition rules to a cloud release platform, which issues them to the vehicles specified by the rules;
S3, the vehicle adds or updates the data acquisition rule;
S4, based on a scene recognition algorithm, the vehicle acquires the perception data of the acquisition scene specified by the data acquisition rule, the scene recognition algorithm being used for recognizing, according to the perception data of the vehicle-end intelligent driving sensors, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule;
and S5, the vehicle sends the acquired perception data to a data center.
According to one embodiment of the present application, in step S1, the vehicle-end intelligent driving sensor includes at least a vehicle-mounted camera, a millimeter wave radar, a laser radar, an ultrasonic radar, an inertial measurement unit IMU, a wheel speed sensor, and a high-precision positioning sensor, and the sensing data includes at least an image, a point cloud, a radar echo, a vehicle position, a speed, a vehicle target, a pedestrian, an obstacle, a lane line, a traffic light, passable area information, and related attributes acquired by the vehicle-end intelligent driving sensor.
According to one embodiment of the present application, in step S4, the process of acquiring the perception data by the vehicle includes the steps of:
recording in full the perception data of all vehicle-end intelligent driving sensors;
and screening the perception data based on the scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule.
According to one embodiment of the present application, in step S4, the process of acquiring the perception data by the vehicle includes the steps of:
analyzing, based on the scene recognition algorithm, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule;
if they match, recording the perception data of the corresponding vehicle-end intelligent driving sensors according to the data acquisition rule; if not, stopping recording.
According to one embodiment of the present application, in step S5, the vehicle directly transmits the collected sensing data to the data center, or stores the sensing data and then transmits the stored sensing data to the data center.
The application also provides an intelligent driving data acquisition device, which comprises:
the local terminal module is used for editing data acquisition requirements and generating data acquisition rules based on the data acquisition requirements, wherein data acquisition refers to acquisition of perception data from the vehicle-end intelligent driving sensors, the data acquisition requirements specify the various acquisition-scene conditions that the data acquisition must satisfy, and the data acquisition rules are the rule sets for data acquisition corresponding to those requirements;
the cloud sub-module is used for receiving and storing the data acquisition rules generated by the local terminal module and issuing the data acquisition rules to the vehicles designated by the rules;
the vehicle terminal module is arranged on the vehicle and is used for receiving the data acquisition rules issued by the cloud sub-module and adding or updating the data acquisition rules; and for acquiring, based on a scene recognition algorithm, the perception data of the acquisition scene matching the data acquisition rule, wherein the scene recognition algorithm is used for recognizing, according to the perception data of the vehicle-end intelligent driving sensors, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule;
and the data center sub-module is used for receiving the perception data acquired by the vehicle terminal sub-module.
According to one embodiment of the application, the local terminal module comprises a vehicle information visualization interface and a rule editing visualization interface;
the vehicle information visualization interface is used for displaying vehicle information, where the vehicle information includes the position of a vehicle and its current data acquisition state; the vehicle terminal module reports the vehicle information to the cloud sub-module, and the local terminal module obtains the vehicle information through the cloud sub-module;
and the rule editing visualization interface is used for editing the corresponding data acquisition rules.
According to one embodiment of the application, the cloud sub-module comprises a vehicle management module, a rule storage module and a rule issuing module;
the vehicle management module is used for managing the vehicle information;
the rule storage module is used for storing the received data acquisition rule;
the rule issuing module is used for issuing the data acquisition rule to the appointed vehicle.
According to one embodiment of the application, the vehicle terminal module comprises a communication management module, a rule management module, a recording management module and a human-computer interface;
the communication management module is used for reporting the vehicle information to the cloud sub-module, receiving the data acquisition rule sent by the cloud sub-module and forwarding the data acquisition rule to the rule management module;
the rule management module updates, saves and synchronizes the data acquisition rule to the recording management module and the man-machine interface;
the recording management module records, based on the scene recognition algorithm, the perception data of the acquisition scene matching the data acquisition rule; the recording management module supports a manual recording mode and an automatic recording mode, wherein in the manual recording mode the perception data of all vehicle-end intelligent driving sensors are recorded in full and then screened based on the scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule, and in the automatic recording mode the scene recognition algorithm analyzes whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule, and if so, the perception data of the corresponding vehicle-end intelligent driving sensors are recorded according to the data acquisition rule, and if not, recording stops;
the man-machine interface is used for displaying the acquisition and recording state and the data acquisition rules, and provides a recording mode switching button for switching between the manual recording mode and the automatic recording mode.
The application also provides intelligent driving data acquisition equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the steps of the intelligent driving data acquisition method are realized when the processor executes the computer program.
The application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the steps of the intelligent driving data collection method provided by the application.
The intelligent driving data acquisition method, the intelligent driving data acquisition device, the intelligent driving data acquisition equipment and the computer readable storage medium provided by the application can be used for effectively acquiring the perception data of the vehicle.
It is to be understood that both the foregoing general description and the following detailed description of the present application are exemplary and explanatory and are intended to provide further explanation of the application as claimed.
Drawings
The accompanying drawings, which are included to provide a further explanation of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the accompanying drawings:
fig. 1 shows a flow chart of a method of intelligent driving data collection according to an embodiment of the application.
Fig. 2 shows a schematic structural diagram of an intelligent driving data acquisition device according to an embodiment of the present application.
Detailed Description
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the application, its application, or uses. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present application unless it is specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description. Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
In addition, the terms "first", "second", etc. used to designate components are only for convenience in distinguishing the corresponding components; unless otherwise stated, they have no special meaning and should not be construed as limiting the scope of the present application. Furthermore, although the terms used in the present application are selected from publicly known and commonly used terms, some terms mentioned in this specification may have been selected by the applicant at his or her discretion, and their detailed meanings are described in the relevant parts of the description herein. Furthermore, the present application should be understood not simply through the actual terms used but through the meaning that each term carries.
Fig. 1 shows a flow chart of a method of intelligent driving data collection according to an embodiment of the application. As shown in the figure, the intelligent driving data acquisition method comprises the following steps:
s1, editing data acquisition requirements at a local end, generating data acquisition rules based on the data acquisition requirements, wherein the data acquisition refers to acquisition of perception data of an intelligent driving sensor at a vehicle end, and the data acquisition requirements refer to data acquisition to meet various requirements of an acquisition scene. For example, the user may specify that the acquisition scenario includes the time at which data acquisition occurs, configuration requirements of the vehicle-side intelligent driving sensor, vehicle location, weather, lane conditions, a target number threshold, road conditions, and the like. The data collection rule is a rule set of data collection corresponding to each requirement.
S2, publishing the data acquisition rules to a cloud release platform, which issues them to the designated vehicles. For example, a data acquisition rule may be issued to a given vehicle at a given time.
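For illustration only, publishing a rule to the cloud release platform could look like the following sketch. The HTTP endpoint path (/rules) and the platform address are assumptions; the application does not fix a particular API, and the platform is then responsible for forwarding the rule to the vehicles named in it. The usage comment reuses the illustrative acquisition_rule from the previous sketch.

```python
import json
import urllib.request

def publish_rule(platform_url: str, rule: dict) -> int:
    """Send a data acquisition rule to the cloud release platform.

    The endpoint path and payload layout are assumptions for illustration.
    """
    req = urllib.request.Request(
        url=f"{platform_url}/rules",
        data=json.dumps(rule).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # e.g. 200 when the platform accepted the rule

# Usage (hypothetical platform address):
# publish_rule("https://fleet-platform.example.com", acquisition_rule)
```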
S3, the vehicle adds or updates data acquisition rules. It is easy to understand that if a new data acquisition rule is provided, the vehicle adds it; if a new version of an existing data acquisition rule is provided, the vehicle updates that rule.
S4, based on a scene recognition algorithm, the vehicle acquires the perception data of the acquisition scene specified by the data acquisition rule; the scene recognition algorithm determines, from the perception data of the vehicle-end intelligent driving sensors, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule. Through the scene recognition algorithm, the vehicle can acquire data on time or on demand, collecting only the perception data that conforms to the data acquisition rule. For example, the scene recognition algorithm can obtain the vehicle's current geographic position and lane information through high-precision positioning, the current traffic flow information through the camera and the millimeter wave radar, and traffic light information through V2X. This perception information is fused and matched against the acquisition-scene requirements of the data acquisition rule, so that effective perception data can be acquired and the data storage cost is reduced.
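The sketch below illustrates the kind of check this matching amounts to: fused perception attributes of the current moment are compared against the scene conditions of a rule. It is a simplified assumption of how such a check could be written (field names follow the illustrative rule schema above); in a vehicle, the attributes would be fused upstream from camera, radar, positioning, and V2X outputs.

```python
def scene_matches(rule_scene: dict, perceived: dict) -> bool:
    """Return True if the currently perceived scene satisfies the rule's
    acquisition-scene conditions. Field names are illustrative assumptions."""
    allowed_weather = rule_scene.get("weather")
    if allowed_weather and perceived.get("weather") not in allowed_weather:
        return False
    allowed_roads = rule_scene.get("road_type")
    if allowed_roads and perceived.get("road_type") not in allowed_roads:
        return False
    if perceived.get("target_count", 0) < rule_scene.get("min_target_count", 0):
        return False
    return True

# Example of fused perception attributes at one moment:
current_scene = {"weather": "rain", "road_type": "intersection", "target_count": 7}
```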
S5, the vehicle sends the acquired sensing data to a data center.
Preferably, in step S1, the vehicle-end intelligent driving sensor at least includes a vehicle-mounted camera, a millimeter wave radar, a laser radar, an ultrasonic radar, an inertial measurement unit IMU, a wheel speed sensor, and a high-precision positioning sensor, and the sensing data at least includes an image, a point cloud, a radar echo, a vehicle position, a vehicle speed, a vehicle target object, a pedestrian, an obstacle, a lane line, a traffic light, passable area information, and related attributes acquired by the vehicle-end intelligent driving sensor.
Preferably, in step S4, the process of acquiring the perception data by the vehicle includes the steps of:
recording the perception data of all vehicle-end intelligent driving sensors in full, that is, acquiring all perception data without considering the data acquisition rule;
and screening the perception data based on a scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule, which is equivalent to removing the perception data that does not conform to the data acquisition rule, leaving only effective perception data.
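A minimal sketch of this screening step, under the assumption that the full recording has already been split into short segments, each tagged with fused perception attributes; scene_matches is the illustrative check sketched above for step S4.

```python
def screen_recording(segments: list[dict], rule_scene: dict) -> list[dict]:
    """Keep only those fully recorded segments whose perceived scene matches
    the rule's acquisition scene; all other segments are discarded."""
    # scene_matches is the illustrative check sketched earlier for step S4.
    return [seg for seg in segments if scene_matches(rule_scene, seg["perceived"])]
```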
Preferably, in step S4, the process of acquiring the perception data by the vehicle includes the steps of:
analyzing, based on the scene recognition algorithm, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule. The scene recognition algorithm obtains scene information about the road on which the vehicle is currently travelling by analyzing the vehicle-end intelligent driving sensors; for example, road scene information can be acquired in real time through a camera, and the captured video stream is passed to the engine of the scene recognition algorithm. The engine determines the category of the vehicle's scene (e.g., intersection, sunny day) by analyzing and processing the video stream over a configurable window (which may be seconds or minutes). If the category of the vehicle scene does not conform to the acquisition scene required by the data acquisition rule, the matching fails; if it conforms, the matching succeeds.
If they match, the perception data of the corresponding vehicle-end intelligent driving sensors are recorded according to the data acquisition rule; furthermore, before the data acquisition action starts, the data from a period of time before the trigger (which may be seconds or minutes) are backtracked and included. If they do not match, recording stops. It is easy to understand that the engine of the scene recognition algorithm is always running and monitors whether the vehicle's current scene matches the acquisition scene required by the data acquisition rule.
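The backtracking described above is commonly realised with a rolling pre-trigger buffer: recent sensor frames are always retained, and once the scene matches, the buffered frames are flushed into the recording before live frames are appended. The sketch below is one possible, simplified way to express this; the buffer length, the frame format, and the scene_matches check are assumptions carried over from the earlier sketches.

```python
from collections import deque

class AutoRecorder:
    """Illustrative automatic-recording loop with a pre-trigger (backtracking) buffer."""

    def __init__(self, pre_trigger_frames: int = 100):
        self.buffer = deque(maxlen=pre_trigger_frames)  # rolling backtrack window
        self.recording = []                             # frames kept for upload
        self.active = False

    def on_frame(self, frame: dict, rule_scene: dict) -> None:
        # scene_matches is the illustrative check sketched earlier for step S4.
        matched = scene_matches(rule_scene, frame["perceived"])
        if matched:
            if not self.active:
                # Scene just matched: backtrack by flushing the pre-trigger buffer first.
                self.recording.extend(self.buffer)
                self.buffer.clear()
                self.active = True
            self.recording.append(frame)
        else:
            self.active = False        # stop recording when the match ends
            self.buffer.append(frame)  # keep the rolling backtrack window warm
```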
Preferably, in step S5, the vehicle directly transmits the acquired sensing data to the data center, or stores the sensing data and then transmits the stored sensing data to the data center.
Fig. 2 shows a schematic structural diagram of an intelligent driving data acquisition device according to an embodiment of the present application. As shown in the figure, an intelligent driving data acquisition device 200 mainly includes a local terminal module 201, a cloud sub-module 202, a vehicle terminal module 203, and a data center sub-module 204.
The local terminal module 201 may be understood as a local client of the cloud sub-module 202. The local terminal module 201 is configured to edit a data collection requirement, and generate a data collection rule based on the data collection requirement. The data acquisition refers to acquisition of perception data of an intelligent driving sensor at a vehicle end, the data acquisition requirement refers to that the data acquisition is required to meet various requirements of an acquisition scene, and the data acquisition rule is a rule set of the data acquisition corresponding to the various requirements.
The cloud sub-module 202 is configured to receive and store the data acquisition rules generated by the local terminal module 201 and to issue the rules to the vehicles they designate.
The vehicle terminal module 203 is provided on the vehicle. The vehicle terminal module 203 is configured to receive the data acquisition rules issued by the cloud sub-module 202 and to add or update the data acquisition rules. The vehicle terminal module 203 acquires, based on a scene recognition algorithm, the perception data of the acquisition scene matching the data acquisition rule; the scene recognition algorithm determines, from the perception data of the vehicle-end intelligent driving sensors, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule.
The data center sub-module 204 is configured to receive the sensing data acquired by the vehicle terminal module 203.
Preferably, the local terminal module 201 includes a vehicle information visualization interface 2011 and a rule editing visualization interface 2012. The vehicle information visualization interface 2011 is used for displaying vehicle information, which includes the position of the vehicle and the current data acquisition state. The vehicle terminal module 203 reports the vehicle information to the cloud sub-module 202, and the local terminal module 201 obtains the vehicle information through the cloud sub-module 202. The rule editing visualization interface 2012 is used for editing the corresponding data acquisition rules; the editing operations mainly include adding, deleting, modifying, and viewing data acquisition rules.
Preferably, the cloud sub-module 202 includes a vehicle management module 2021, a rule storage module 2022, and a rule issuing module 2023. The vehicle management module 2021 is used to manage vehicle information. The rule storage module 2022 is used to store received data acquisition rules, typically in a rule database. The rule issuing module 2023 is used to issue data acquisition rules to the vehicles they designate. The HTTP protocol may be used for data transfer between the cloud sub-module 202 and the local terminal module 201, or a TCP-based or similar data transfer mechanism may be used.
Preferably, the vehicle terminal module 203 includes a communication management module 2031, a rule management module 2032, a recording management module 2033, and a human-machine interface 2034. The communication management module 2031 is configured to report vehicle information to the cloud sub-module 202: when the vehicle-end data acquisition status changes, the communication management module 2031 reports the vehicle information of its own vehicle to the cloud sub-module 202. The communication management module 2031 also receives the data acquisition rules issued by the cloud sub-module 202 and forwards them to the rule management module 2032. The vehicle terminal module 203 and the cloud sub-module 202 use the HTTP protocol for data transmission; by way of example and not limitation, they may also use the MQTT protocol, TCP, or similar data transfer mechanisms. A transport-agnostic sketch of this module follows.
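The sketch below illustrates the behaviour just described, with the transport abstracted behind a send callable (for example an HTTP POST or an MQTT publish), so that no particular protocol binding is implied; the channel names, payload fields, and the rule_manager interface are assumptions.

```python
import json
import time
from typing import Callable

class CommunicationManager:
    """Illustrative sketch of the vehicle-side communication management module.
    The transport is abstracted as `send(channel, payload)`; channel names and
    payload fields are assumptions, not a prescribed interface."""

    def __init__(self, vin: str, send: Callable[[str, str], None], rule_manager):
        self.vin = vin
        self.send = send
        self.rule_manager = rule_manager
        self._last_state = None

    def report_status(self, position: tuple, collection_state: str) -> None:
        # Report only when the data-collection state changes, as described above.
        if collection_state != self._last_state:
            payload = {"vin": self.vin, "position": position,
                       "collection_state": collection_state, "timestamp": time.time()}
            self.send("vehicle/status", json.dumps(payload))
            self._last_state = collection_state

    def on_rule_received(self, raw_rule: str) -> None:
        # Forward rules pushed by the cloud sub-module to the rule management module.
        self.rule_manager.handle_rule(json.loads(raw_rule))
```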
The rule management module 2032 updates, saves, and synchronizes the data collection rules to the record management module 2033 and the human-machine interface 2034. The rule management module 2032 has corresponding sub-function modules including a vehicle end operator confirmation module, a rule update module, a rule save module, and a rule synchronization module. After receiving the data collection rule, the communication management module 2031 interacts with the vehicle end operator through the human-computer interface 2034, and the vehicle end operator can confirm whether to accept the data collection rule. If the vehicle end operator refuses to accept, the vehicle end operator confirmation module discards the received data acquisition rule and does not do any operation; if the vehicle-end operator accepts the new data acquisition rule, the data acquisition rule is stored through the rule storage module, and if the new version of the existing data acquisition rule is received, the data acquisition rule is updated through the rule updating module. The rule synchronization module is configured to synchronize the data collection rules to the record management module 2033 and the human-machine interface 2034.
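The acceptance and versioning decision described above can be condensed into a small function. The sketch below assumes that rules carry rule_id and version fields (as in the illustrative schema earlier) and that operator confirmation has already been collected through the human-machine interface; it is not a prescribed implementation of the rule management module.

```python
def apply_incoming_rule(store: dict, incoming: dict, operator_accepts: bool) -> dict:
    """Add a new rule, update an existing rule to a newer version, or discard
    the rule if the vehicle-end operator rejected it. Field names are assumptions."""
    if not operator_accepts:
        return store                      # operator rejected: discard, do nothing
    rule_id = incoming["rule_id"]
    existing = store.get(rule_id)
    if existing is None or incoming.get("version", 0) > existing.get("version", 0):
        store[rule_id] = incoming         # new rule, or newer version of an existing one
    return store
```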
The recording management module 2033 records (collects), based on the scene recognition algorithm, the perception data of the acquisition scene matching the data acquisition rule. The recording management module 2033 supports a manual recording mode and an automatic recording mode. In the manual recording mode, the perception data of all vehicle-end intelligent driving sensors are recorded in full without considering the data acquisition rule, and the perception data are then screened based on the scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule. In the automatic recording mode, the scene recognition algorithm analyzes whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule; if so, the perception data of the corresponding vehicle-end intelligent driving sensors are recorded according to the data acquisition rule, and if not, recording stops. The recording management module 2033 synchronizes the recording status to the human-machine interface 2034. Specifically, the recording management module 2033 includes a synchronization status module, a manual recording module, an automatic recording module, and a scene matching module. The synchronization status module is configured to obtain the recording status and synchronize it to the human-machine interface 2034. The manual recording module is used for executing manual recording. The automatic recording module is used for executing automatic recording. The scene matching module is used for analyzing whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule.
The human-machine interface 2034 is used for displaying the acquisition and recording state and the data acquisition rules, and provides a recording mode switch button for switching between the manual recording mode and the automatic recording mode. The acquisition and recording state includes the recording mode currently used for data acquisition, the storage location of the acquired data, and so on. The data acquisition rules may be displayed, for example, as a description of the corresponding acquisition scene. The vehicle-end operator can select a specific recording mode through the recording mode switch button.
Preferably, the vehicle terminal module 203 can transfer the acquired perception data to the data center sub-module 204 via hard disk transfer, or over a wireless network such as Wi-Fi, Bluetooth, or cellular communication.
The application also provides intelligent driving data acquisition equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the steps of the intelligent driving data acquisition method are realized when the processor executes the computer program.
The application also provides a computer readable storage medium, on which a computer program is stored, which when being executed by a processor implements the steps of the aforementioned intelligent driving data acquisition method.
The specific implementation manner and technical effects of the intelligent driving data acquisition device and the computer readable storage medium can be referred to the embodiment of the intelligent driving data acquisition method provided by the application, and are not described herein.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The various illustrative logical modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The intelligent driving data acquisition method, the intelligent driving data acquisition device, the intelligent driving data acquisition equipment and the computer readable storage medium provided by the application have the following advantages:
1. The data acquisition rules can be edited and issued in real time from a local visual interface according to the acquisition requirements, and the data acquisition rules of each vehicle in the fleet can be edited in real time, which improves the diversity and flexibility of data acquisition.
2. The data acquisition rules are issued to the vehicle end through the cloud and then executed by the vehicle end; no engineer is required to locate the designated vehicle and edit the data acquisition rules from inside the vehicle, which saves human resources.
3. The vehicle end deploys a scene recognition algorithm that matches the vehicle's environment against the data acquisition rules. If the matching succeeds, data can be acquired on time and on demand. Effective data can be screened out through the different recording modes, realising lightweight storage of the data.
4. Continuous full recording and automatic recording can operate simultaneously, which improves the flexibility of data acquisition.
It will be apparent to those skilled in the art that various modifications and variations can be made to the above-described exemplary embodiments of the present application without departing from the spirit and scope of the application. Therefore, it is intended that the present application cover the modifications and variations of this application provided they come within the scope of the appended claims and their equivalents.

Claims (10)

1. An intelligent driving data acquisition method comprises the following steps:
S1, editing data acquisition requirements at a local end, and generating data acquisition rules based on the data acquisition requirements, wherein data acquisition refers to acquisition of perception data from the vehicle-end intelligent driving sensors, the data acquisition requirements specify the various acquisition-scene conditions that the data acquisition must satisfy, and the data acquisition rules are the rule sets for data acquisition corresponding to those requirements;
S2, publishing the data acquisition rules to a cloud release platform, which issues them to the vehicles specified by the rules;
S3, the vehicle adds or updates the data acquisition rule;
S4, based on a scene recognition algorithm, the vehicle acquires the perception data of the acquisition scene specified by the data acquisition rule, the scene recognition algorithm being used for recognizing, according to the perception data of the vehicle-end intelligent driving sensors, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule;
s5, the vehicle sends the acquired sensing data to a data center;
in step S1, the vehicle-end intelligent driving sensor includes a vehicle-mounted camera, a millimeter wave radar, a laser radar, an ultrasonic radar, an inertial measurement unit IMU, a wheel speed sensor and a high-precision positioning sensor, and the sensing data includes an image, a point cloud, a radar echo, a vehicle position, a vehicle speed, a vehicle target object, a pedestrian, an obstacle, a lane line, a traffic light and passable area information acquired by the vehicle-end intelligent driving sensor.
2. The intelligent driving data collection method according to claim 1, wherein in step S4, the process of the vehicle acquiring the perception data includes the steps of:
recording in full the perception data of all vehicle-end intelligent driving sensors;
and screening the perception data based on the scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule.
3. The intelligent driving data collection method according to claim 1, wherein in step S4, the process of the vehicle acquiring the perception data includes the steps of:
analyzing, based on the scene recognition algorithm, whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule;
if they match, recording the perception data of the corresponding vehicle-end intelligent driving sensors according to the data acquisition rule; if not, stopping recording.
4. The intelligent driving data collection method according to claim 1, wherein in step S5, the vehicle directly transmits the collected sensing data to a data center, or stores the sensing data and then transmits the stored sensing data to the data center.
5. An intelligent driving data acquisition device, comprising:
the local terminal module is used for editing data acquisition requirements and generating data acquisition rules based on the data acquisition requirements, wherein data acquisition refers to acquisition of perception data from the vehicle-end intelligent driving sensors, the data acquisition requirements specify the various acquisition-scene conditions that the data acquisition must satisfy, and the data acquisition rules are the rule sets for data acquisition corresponding to those requirements; the vehicle-end intelligent driving sensors comprise a vehicle-mounted camera, a millimeter wave radar, a laser radar, an ultrasonic radar, an inertial measurement unit (IMU), a wheel speed sensor, and a high-precision positioning sensor, and the perception data comprise the images, point clouds, radar echoes, vehicle positions, speeds, vehicle targets, pedestrians, obstacles, lane lines, traffic lights, and passable area information acquired by the vehicle-end intelligent driving sensors;
yun Duanzi module for receiving and storing the data collection rule generated by the local terminal module and issuing the data collection rule to the vehicle designated by the data collection rule;
the vehicle terminal module is arranged on the vehicle and is used for receiving the data acquisition rule issued by the cloud terminal sub-module and adding or updating the data acquisition rule; acquiring sensing data of an acquisition scene matched with the data acquisition rule based on a scene recognition algorithm, wherein the scene recognition algorithm is used for recognizing whether the scene environment where the vehicle is positioned is matched with the acquisition scene matched with the data acquisition rule according to the sensing data of the intelligent driving sensor at the vehicle end;
and the data center sub-module is used for receiving the perception data acquired by the vehicle terminal sub-module.
6. The intelligent driving data collection apparatus of claim 5, wherein the local terminal module comprises a vehicle information visualization interface and a rule editing visualization interface;
the vehicle information visualization interface is used for displaying vehicle information, where the vehicle information includes the position of a vehicle and its current data acquisition state; the vehicle terminal module reports the vehicle information to the cloud sub-module, and the local terminal module obtains the vehicle information through the cloud sub-module;
and the rule editing visualization interface is used for editing the corresponding data acquisition rules.
7. The intelligent driving data acquisition device of claim 6, wherein the cloud sub-module comprises a vehicle management module, a rule storage module, and a rule issuing module;
the vehicle management module is used for managing the vehicle information;
the rule storage module is used for storing the received data acquisition rule;
the rule issuing module is used for issuing the data acquisition rule to the appointed vehicle.
8. The intelligent driving data acquisition device according to claim 5, wherein the vehicle terminal module comprises a communication management module, a rule management module, a recording management module and a man-machine interface;
the communication management module is used for reporting the vehicle information to the cloud sub-module, receiving the data acquisition rule sent by the cloud sub-module and forwarding the data acquisition rule to the rule management module;
the rule management module updates, saves and synchronizes the data acquisition rule to the recording management module and the man-machine interface;
the recording management module records, based on the scene recognition algorithm, the perception data of the acquisition scene matching the data acquisition rule; the recording management module supports a manual recording mode and an automatic recording mode, wherein in the manual recording mode the perception data of all vehicle-end intelligent driving sensors are recorded in full and then screened based on the scene recognition algorithm to obtain the perception data matching the acquisition scene of the data acquisition rule, and in the automatic recording mode the scene recognition algorithm analyzes whether the scene environment in which the vehicle is located matches the acquisition scene of the data acquisition rule, and if so, the perception data of the corresponding vehicle-end intelligent driving sensors are recorded according to the data acquisition rule, and if not, recording stops;
the man-machine interface is used for displaying the acquisition and recording state and the data acquisition rules, and provides a recording mode switching button for switching between the manual recording mode and the automatic recording mode.
9. An intelligent driving data collection device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the intelligent driving data collection method according to any one of claims 1-4 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the intelligent driving data collection method according to any one of claims 1-4.
CN202211117297.0A 2022-09-14 2022-09-14 Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium Active CN115439957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211117297.0A CN115439957B (en) 2022-09-14 2022-09-14 Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211117297.0A CN115439957B (en) 2022-09-14 2022-09-14 Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN115439957A CN115439957A (en) 2022-12-06
CN115439957B (en) 2023-12-08

Family

ID=84246696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211117297.0A Active CN115439957B (en) 2022-09-14 2022-09-14 Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN115439957B (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345527A (en) * 2013-07-23 2013-10-09 深圳市博瑞得科技有限公司 Intelligent data statistical system
JP2015220530A (en) * 2014-05-15 2015-12-07 株式会社Nttドコモ Device, program and system for identifying audience quality
CN107436773A (en) * 2016-05-25 2017-12-05 全球能源互联网研究院 A kind of rule-based scene adaptive method of Android
CN109947800A (en) * 2018-08-01 2019-06-28 日海智能科技股份有限公司 Data processing method, device, equipment and medium
CN110794189A (en) * 2019-12-06 2020-02-14 杭州和利时自动化有限公司 Data acquisition method and device and related equipment
CN111599183A (en) * 2020-07-22 2020-08-28 中汽院汽车技术有限公司 Automatic driving scene classification and identification system and method
CN111619482A (en) * 2020-06-08 2020-09-04 武汉光庭信息技术股份有限公司 Vehicle driving data acquisition and processing system and method
CN111785057A (en) * 2020-06-23 2020-10-16 大众问问(北京)信息科技有限公司 Method and device for prompting emergency and vehicle
CN112256584A (en) * 2020-10-30 2021-01-22 深圳无域科技技术有限公司 Internet number making method and system
CN112740725A (en) * 2020-03-31 2021-04-30 华为技术有限公司 Driving data acquisition method and device
CN113138906A (en) * 2021-05-13 2021-07-20 北京优特捷信息技术有限公司 Call chain data acquisition method, device, equipment and storage medium
CN114168632A (en) * 2021-12-07 2022-03-11 泰康保险集团股份有限公司 Abnormal data identification method and device, electronic equipment and storage medium
CN114167857A (en) * 2021-11-08 2022-03-11 北京三快在线科技有限公司 Control method and device of unmanned equipment
CN114265411A (en) * 2021-12-28 2022-04-01 上汽大众汽车有限公司 Method for solving problem that vehicle prediction model performance is limited by perception data performance
CN114407652A (en) * 2022-01-19 2022-04-29 亿咖通(湖北)技术有限公司 Information display method, device and equipment
CN114495057A (en) * 2022-01-21 2022-05-13 亿咖通(湖北)技术有限公司 Data acquisition method, electronic device and storage medium
CN114743170A (en) * 2022-04-24 2022-07-12 重庆长安汽车股份有限公司 Automatic driving scene labeling method based on AI algorithm
CN114792111A (en) * 2022-02-28 2022-07-26 浙江大华技术股份有限公司 Data acquisition method and device, electronic equipment and storage medium
CN114936122A (en) * 2022-03-23 2022-08-23 联合汽车电子有限公司 Vehicle monitoring system, method and readable storage medium
CN114979216A (en) * 2022-05-26 2022-08-30 重庆长安汽车股份有限公司 Server data acquisition configuration method and system thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7359131B2 (en) * 2020-11-16 2023-10-11 トヨタ自動車株式会社 data recording device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Point Cloud Target Detection Methods Based on Geometric Features; Liang Qiong; China Master's Theses Full-text Database, Information Science and Technology (2019, No. 12); I138-462 *

Also Published As

Publication number Publication date
CN115439957A (en) 2022-12-06

Similar Documents

Publication Publication Date Title
JP7467485B2 (en) Generating ground truth for machine learning from time series elements
CN106154834B (en) Method and apparatus for controlling automatic driving vehicle
DE112018006665T5 (en) PROCEDURE FOR ACCESSING ADDITIONAL PERCEPTIONAL DATA FROM OTHER VEHICLES
WO2022156520A1 (en) Cloud-road collaborative automatic driving model training method and system, and cloud-road collaborative automatic driving model calling method and system
WO2022141506A1 (en) Method for constructing simulation scene, simulation method and device
US10262537B1 (en) Autonomous optimization of parallel parking space utilization
CN109241373B (en) Method and apparatus for collecting data
US11206561B2 (en) Analysis method of vehicle-to-object communication system and analysis system using the same
US20090051568A1 (en) Method and apparatus for traffic control using radio frequency identification tags
CN114153220B (en) Remote control method for automatic driving based on artificial intelligence Internet of things platform
CN105849790A (en) Road condition information acquisition method
CN113479195A (en) Method for automatic valet parking and system for carrying out said method
DE102020128153A1 (en) SAMPLING OF DRIVING SCENARIOS FOR TRAINING/COORDINATION OF MACHINE LEARNING MODELS FOR VEHICLES
CN110696826B (en) Method and device for controlling a vehicle
CN111516690B (en) Control method and device of intelligent automobile and storage medium
CN114419572B (en) Multi-radar target detection method and device, electronic equipment and storage medium
CN114492022A (en) Road condition sensing data processing method, device, equipment, program and storage medium
CN115439957B (en) Intelligent driving data acquisition method, acquisition device, acquisition equipment and computer readable storage medium
CN112598908B (en) Driver red light running recognition method and device, electronic equipment and storage medium
CN110324369A (en) A kind of information sharing method and system based on road conditions
US20230256994A1 (en) Assessing relative autonomous vehicle performance via evaluation of other road users
CN112150807B (en) Vehicle early warning method and device, storage medium and electronic equipment
CN115359671A (en) Intersection vehicle cooperative control method and related equipment
CN113763704A (en) Vehicle control method, device, computer readable storage medium and processor
CN112698372A (en) Spatio-temporal data processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant