WO2021114608A1 - Data labeling method and apparatus
- Publication number: WO2021114608A1 (application PCT/CN2020/098210)
- Authority: WIPO (PCT)
- Prior art keywords: labeling, target, time, state data, environmental state
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/906—Clustering; Classification
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
Description
- the present application relates to the technical field of data processing, and in particular to a data labeling method and apparatus.
- Vehicles which are capable of autonomous driving may be referred to as autonomous vehicles or self-driving vehicles.
- In this disclosure, the term autonomous vehicle applies to both fully autonomous vehicles and partially autonomous vehicles, which are capable of driving autonomously some of the time, or in certain conditions, with the back-up of a human driver for certain scenarios or more challenging driving conditions.
- An autonomous driving system may include three modules: a sensing module, a decision module, and an execution module.
- the sensing module is configured to collect environmental state data in real time from a number of sensors and recognize the data.
- the decision module is configured to generate a driving command according to the collected environmental state data and deliver the driving command to the execution module.
- the execution module is configured to execute a corresponding driving operation according to the driving command delivered by the decision module.
- a large amount of sample data is needed to train the sensing module and the decision module so as to realize accurate sensing and decision-making. Because an amount of data collected by the sensors during a traveling process of an autonomous vehicle is very large, if all the collected data are used as sample data to train the sensing module and the decision module, the training time will be too long. In addition, the collected data may include invalid data which may complicate the training process.
- a first aspect of the present disclosure provides a data labeling method, executed by a vehicle-mounted terminal of an autonomous vehicle, the method comprising:
- receiving, by the vehicle-mounted terminal, environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp indicates a collection time of the environmental state data;
- determining, by the vehicle-mounted terminal, a target labeling type and a target time for target environmental state data included in the collected environmental state data, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and wherein the target time refers to a time at which the target labeling type is determined; and
- generating, by the vehicle-mounted terminal, labeling information for the target environmental state data, wherein the labeling information includes the target labeling type and the target time determined by the vehicle-mounted terminal, and wherein the determined target time is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
- a second aspect of the present disclosure provides a data labeling apparatus suitable for use as a vehicle-mounted terminal of an autonomous vehicle, the data labeling apparatus comprising:
- a first receiving module configured to receive environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
- a determination module configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined;
- a generation module configured to generate, according to the target labeling type, labeling information for target environmental state data included in the received environmental state data, wherein the labeling information includes the target labeling type and target time determined by the determination module and wherein the determination module is configured to determine a target time which is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
- a third aspect of the present disclosure provides a non-transitory computer readable storage medium storing instructions which are executable by a processor to perform the method of the first aspect of the present disclosure.
- Further features and aspects of the present disclosure are disclosed in the appended claims. The approach described above may help to enrich the content of labeling and to avoid or reduce incomplete labeling information, which may occur if data is labeled manually by labeling personnel after the journey is completed.
- Examples of the present disclosure are described below, by way of example only, with reference to the accompanying drawings, in which:
- FIG. 1 is an architectural diagram of an example of a system in which a data labeling method according to the present application may be used;
- FIG. 2 is a flowchart of an example data labeling method according to the present application.
- FIG. 3 is a schematic diagram of an example fusion result provided in an embodiment of the present application.
- FIG. 4 is a schematic diagram of a data labeling interface provided in an embodiment of the present application.
- FIG. 5 is a schematic diagram of another data labeling interface provided in an embodiment of the present application.
- FIG. 6 is a schematic diagram of still another data labeling interface provided in an embodiment of the present application.
- FIG. 7 is a schematic structural diagram of a data labeling apparatus provided in an embodiment of the present application.
- FIG. 8 is a schematic structural diagram of another data labeling apparatus provided in an embodiment of the present application.
- a neural network for recognizing environmental state data may be deployed in a sensing module and a neural network for making a driving decision may be deployed on a decision module.
- the sensing module and the decision module need to be trained with different sample data, so that the sensing module can recognize data content comprised in the collected data, and the decision module can make a driving decision according to the data content comprised in the collected data.
- the collected data may include a lot of invalid data.
- the collected data may be labeled, so that it can be classified and filtered according to labeling information.
- One approach is to upload the collected data from the journey as a whole to a user terminal after the journey is completed. Labeling personnel can then browse the collected data on the user terminal and label the collected data a piece at a time. However, this may result in incomplete labeling, as the personnel may miss or fail to recognize all of the relevant items of collected data.
- the present disclosure proposes a data labeling method, which may be executed by a vehicle-mounted terminal of an autonomous vehicle in order to more accurately or completely label data collected by the vehicle sensors during a journey of the vehicle.
- the vehicle-mounted terminal may for example be a computing device located in the vehicle.
- the vehicle mounted terminal receives environmental state data collected in real-time by a vehicle-mounted sensor of the autonomous vehicle.
- the environmental state data may be received by the vehicle-mounted terminal in real-time and labelled in real-time by a user of the vehicle-mounted terminal.
- the environmental state data may be received and labeled by the vehicle-mounted terminal during a journey of the autonomous vehicle. In this way a user of the vehicle-mounted terminal can verify or enter the labels based upon observing the environment of the vehicle. This may help to ensure accurate labelling and/or identify invalid data.
- the user may be assisted by a recognition function, such as a neural network, of the vehicle-mounted terminal, which may identify objects to be labeled and/or suggest labels for objects or scenarios. This may help to ensure more complete labeling of data and make it less likely for the user to miss data which should be labeled.
- the vehicle-mounted terminal receives both the collected environmental state data and a time stamp corresponding to the environmental state data.
- the time stamp indicates a collection time of the environmental state data.
- the vehicle-mounted terminal determines a target labeling type and a target time for target environmental state data included in the collected environmental state data.
- Target environmental state data is environmental state data which is to be labeled.
- the target labeling type is determined according to an environment in which the autonomous vehicle is located (for example based on user-selection of the target labeling type, or a combination of user selection and machine-recognition) .
- the target time refers to a time at which the target labeling type is determined.
- the vehicle mounted terminal generates labeling information for the target environmental state data including the determined target labeling type and target time.
- the environmental state data may be labeled.
- the determined target time is the same as the collection time indicated by the time stamp corresponding to the target environmental state data. In this way the label is tied to the time stamp of the environmental data.
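- As a purely illustrative sketch (not part of the claimed method), the relationship between the labeling information and the time-stamped environmental state data may be expressed as follows, where the structures and field names (EnvironmentalStateData, LabelingInfo, and so on) are assumptions introduced only for this example:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EnvironmentalStateData:
    """One piece of sensor data together with its collection-time stamp."""
    sensor_id: str
    payload: bytes          # e.g. a camera frame or a radar sweep
    collection_time: float  # time stamp assigned when the data was collected

@dataclass
class LabelingInfo:
    """Labeling information: the target labeling type plus the target time."""
    target_labeling_type: str  # e.g. "traffic light", "truck", "abnormality"
    target_time: float         # time at which the labeling type was determined

def generate_labeling_info(
    target_labeling_type: str,
    target_time: float,
    received: List[EnvironmentalStateData],
) -> Tuple[LabelingInfo, List[EnvironmentalStateData]]:
    """Build labeling information and select the target environmental state
    data, i.e. every received piece whose collection time equals the target
    time (several pieces from different sensors may share that time)."""
    info = LabelingInfo(target_labeling_type, target_time)
    targets = [d for d in received if d.collection_time == target_time]
    return info, targets
```

- In this sketch, pausing the displayed time (described below) is what keeps the target time equal to the collection time even if the user takes a moment to choose a label.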
- the target labeling type is determined based on a user selection or user input.
- Because the user selection or user input may take some time, the time at which the user actually completes the input or selection may be somewhat delayed compared to the collection time of the environmental state data they are labeling.
- the vehicle-mounted terminal may suspend (e.g. pause) the target time so that the target time is the same as the collection time of the target environmental state data being labelled.
- the vehicle-mounted terminal performs recognition on the collected environmental state data using a neural network, and displays a proposed target labeling type for the target environmental state data based on a result of the recognition.
- the vehicle mounted terminal may receive a user selection either confirming the proposed target labeling type or selecting an alternative target labeling type.
- the user may input a target labeling type or select a target labeling type from a list of options, without receiving a specific proposed target labeling type.
- the vehicle mounted terminal may receive a plurality of pieces of environmental state data and corresponding time stamps indicating collection times of the plurality of pieces of environmental state data.
- the vehicle-mounted terminal may apply the same labeling information to a plurality of pieces of environmental state data which have a same collection time.
- FIG. 1 is an architectural diagram of an example system 100 for labeling data collected by one or more sensors of an autonomous vehicle 1 according to an example of the present disclosure.
- the system 100 comprises a vehicle-mounted sensor 101 and a vehicle-mounted terminal 102.
- the vehicle-mounted sensor 101 is to collect environmental state data from an environment of the autonomous vehicle, while the vehicle mounted terminal 102 is to label the collected environmental state data.
- the vehicle-mounted sensor 101 may for example be a camera, a laser radar, a millimeter wave radar etc., or other type of vehicle-mounted sensor.
- the system may include a plurality of vehicle-mounted sensors, and further examples of sensors are given below with reference to FIG. 8.
- the vehicle mounted terminal 102 may be any suitable computing device, such as but not limited to a personal computer, desktop computer, client computer terminal, tablet computer, mobile computing device, smart-phone etc.
- the vehicle mounted terminal 102 includes a processor and may also include a display and user interface. When performing the data labeling according to the present disclosure the vehicle mounted terminal 102 should be located in the autonomous vehicle 1.
- the vehicle mounted terminal 102 may be a computing device permanently or temporarily fixed to the autonomous vehicle.
- the vehicle-mounted terminal may be a portable computing device, such as a smart phone or tablet computer, which may be carried on and off the autonomous vehicle.
- the vehicle-mounted sensor 101 and the vehicle-mounted terminal 102 are connected in a wireless or wired manner for communication.
- the system 100 may also include a server 103, such as but not limited to a server or a server cluster.
- the server 103 may be used to store environmental state data which has been collected and/or labelled by the vehicle mounted terminal.
- the vehicle-mounted terminal 102 and the server 103 may be connected in a wireless manner so that the vehicle-mounted terminal 102 may send data wirelessly to the server 103.
- Wireless communication may make it possible to transmit data from the vehicle-mounted terminal 102 to the server 103 in real-time.
- the vehicle-mounted terminal 102 may be connected in a wired manner to the server 103, for instance by connecting the vehicle-mounted terminal to the server in a wired manner after the autonomous vehicle has completed a journey.
- the vehicle-mounted sensor 101 is a sensor mounted on an autonomous vehicle.
- the vehicle-mounted sensor 101 can collect surrounding environmental state data in real time, and send the environmental state data and a time stamp corresponding to the environmental state data to the vehicle-mounted terminal 102 while collecting the environmental state data.
- the vehicle-mounted sensor 101 may collect a plurality of pieces of environmental state data and each piece of environmental state data may be time stamped with a collection time, which is a time at which the piece of environmental state data was collected.
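- By way of a hedged illustration only, the sensor-side behavior described above might resemble the following sketch, in which sensor.read() and terminal_link.send() are hypothetical interfaces standing in for the vehicle-mounted sensor and its connection to the vehicle-mounted terminal:

```python
import time

def sensor_loop(sensor, terminal_link):
    """Illustrative sensor-side loop: each piece of environmental state data is
    stamped with its collection time and sent to the vehicle-mounted terminal
    while it is being collected."""
    while True:
        piece = sensor.read()      # one piece of environmental state data
        time_stamp = time.time()   # collection time used as the time stamp
        terminal_link.send({"data": piece, "time_stamp": time_stamp})
```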
- a neural network for recognizing data may be deployed on the vehicle-mounted terminal 102.
- the vehicle-mounted terminal 102 can receive the environmental state data collected by the vehicle-mounted sensor 101 in real time and a time stamp corresponding to each piece of environmental state data.
- the vehicle-mounted terminal may recognize the received environmental state data using the neural network. If the recognition is successful then a recognition result is produced.
- the neural network may recognize one or more objects in the piece of environmental state data in which case the recognition result includes an unlabeled object.
- the neural network may recognize not only the existence of an object, but also a type of object or a scenario and in that case the recognition result may include a proposed label for the object or scenario.
- the vehicle-mounted terminal 102 can fuse the plurality of recognition results. Fuse means to combine a plurality of recognition results together.
- the fused recognition results may be referred to as a “fusion result” .
- the vehicle-mounted terminal may display the fusion result.
- the fused recognition result may be a video frame including a plurality of recognized objects.
- the fusing the recognition results may include combining the video data and radar data together.
- the fusion result may include an object identified from the video data and a distance to the object based on the radar data.
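- A minimal sketch of such fusion is given below; it assumes the camera and radar recognition results have already been associated one-to-one, and the dictionary keys are illustrative names rather than terms defined in this disclosure:

```python
def fuse(camera_results, radar_results):
    """Combine per-object camera recognition results (object labels) with radar
    recognition results (distance, size, shape) for the same collection time."""
    fusion_result = []
    for cam, radar in zip(camera_results, radar_results):
        fusion_result.append({
            "object": cam["label"],           # e.g. "car", from the image data
            "distance_m": radar["distance"],  # from the radar data
            "size": radar["size"],
            "shape": radar["shape"],
        })
    return fusion_result

# Example: one recognized car, 12.5 m ahead.
print(fuse([{"label": "car"}],
           [{"distance": 12.5, "size": "4.5 m x 1.8 m", "shape": "box"}]))
```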
- the vehicle-mounted terminal 102 may determine a target labeling type; use, as a target time, a time at which the target labeling type is determined; and generate labeling information for target environmental state data according to the target labeling type.
- a labeling instruction is received at the target time, which may be defined to be the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
- the labeling information comprises the target labeling type and the target time.
- the vehicle-mounted terminal 102 may display the labeling information, receive an instruction for the displayed labeling information, and perform, on the selected labeling information, an operation indicated by the instruction.
- the vehicle-mounted terminal 102 may further send the received environmental state data and the generated labeling information to the server 103.
- the server 103 may receive the labeling information generated by the vehicle-mounted terminal 102 and the environmental state data sent by the vehicle-mounted terminal 102, determine, from the received environmental state data according to the target time comprised in the labeling information, the target environmental state data whose corresponding time stamp indicates a time that is the same as the target time, and correspondingly store the labeling information and the target environmental state data.
- the server 103 may further classify and store the environmental state data and the corresponding labeling information according to the target labeling type comprised in the labeling information corresponding to the environmental state data, so as to subsequently acquire different types of data to train different neural network models.
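- The server-side matching and classification described above can be sketched as follows, reusing the illustrative EnvironmentalStateData and LabelingInfo structures from the earlier example; this is an assumption-laden sketch, not the actual server implementation:

```python
from collections import defaultdict

class LabelStore:
    """Illustrative server-side storage, keyed by target labeling type so that
    each type of data can later be retrieved to train a different model."""
    def __init__(self):
        self._by_type = defaultdict(list)

    def store(self, labeling_info, received_data):
        """Match the labeling information to the environmental state data whose
        time stamp equals the target time, then file the pairs by labeling type."""
        targets = [d for d in received_data
                   if d.collection_time == labeling_info.target_time]
        for d in targets:
            self._by_type[labeling_info.target_labeling_type].append((labeling_info, d))

    def samples_for(self, labeling_type):
        """Return all (labeling information, data) pairs stored under one type."""
        return self._by_type[labeling_type]
```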
- FIG. 2 is a flowchart of a data labeling method provided in an embodiment of the present application.
- the method may be applied to a terminal, and the terminal may refer to the vehicle-mounted terminal 102 in FIG. 1.
- the method comprises the following blocks:
- Block 201 receiving environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time and a time stamp corresponding to each piece of environmental state data.
- the time stamp is used to indicate a collection time of corresponding environmental state data.
- the vehicle-mounted sensor can collect environmental state data around the autonomous vehicle in real time, and acquire a collection time of the environmental state data while collecting the environmental state data, wherein the collection time of the environmental state data is used as a time stamp corresponding to the environmental state data. Then, the vehicle-mounted sensor can send the environmental state data and the time stamp corresponding to the environmental state data to the vehicle-mounted terminal.
- After receiving the environmental state data collected in real time and the time stamp corresponding to each piece of environmental state data sent by the vehicle-mounted sensor, the vehicle-mounted terminal can recognize the received environmental state data, and then fuse the recognition results and display a fusion result.
- the environmental state data received by the vehicle-mounted terminal is data collected by a single type of vehicle-mounted sensor.
- if a plurality of recognition results are obtained after recognition is performed on certain environmental state data, the plurality of recognition results are fused to obtain a fusion result, and the fusion result is displayed, wherein the fusion result is used to indicate data content comprised in the certain environmental state data.
- the vehicle-mounted terminal may input the environmental state data into a neural network model deployed on the vehicle-mounted terminal for data content recognition, recognize the environmental state data by means of the neural network model, and output a plurality of recognition results corresponding to the environmental state data. Then, the vehicle-mounted terminal may fuse the plurality of recognition results to obtain a fusion result, and display the fusion result.
- the neural network model may be a neural network trained with a large amount of sample data.
- For example, where the received environmental state data is image data, the vehicle-mounted terminal can input the image data into the neural network model, and recognize the image data by means of the neural network model. It is assumed that a plurality of recognition results output by the neural network model are: a traffic light, a lane line, and a car. The vehicle-mounted terminal can then fuse the traffic light, the lane line and the car to obtain a fusion result, and display the fusion result.
- the environmental state data received by the vehicle-mounted terminal may be data collected by different types of vehicle-mounted sensors.
- the vehicle-mounted terminal may recognize a plurality of different types of environmental state data collected at the same time to obtain a plurality of recognition results corresponding to the plurality of different types of environmental state data. Then, the vehicle-mounted terminal may fuse the plurality of recognition results to obtain a fusion result, and display the fusion result.
- the vehicle-mounted terminal may recognize different types of environmental state data collected by the camera and the radar at the same time.
- the neural network model may be used to recognize image data collected by the camera, where an obtained recognition result is a car; and also recognize data collected by the radar, where an obtained recognition result is a distance between the autonomous vehicle and an obstacle and a shape and size of the obstacle.
- the vehicle-mounted terminal can fuse the recognition result obtained based on the image data and the recognition result obtained based on the radar data to obtain a fusion result.
- the vehicle-mounted terminal can display a fusion result comprising a distance between the autonomous vehicle and the car and a size and a shape of the car.
- the vehicle-mounted terminal fuses a plurality of recognition results of environmental state data and displays the obtained fusion result, so that labeling personnel on the autonomous vehicle can determine, according to the actual situation of the environment in which the autonomous vehicle is located, whether the fusion result displayed by the vehicle-mounted terminal is correct.
- For example, if the labeling personnel sees a truck in front of the autonomous vehicle, but the fusion result illustrated by the vehicle-mounted terminal is a car, the labeling personnel can determine that the fusion result for the environmental state data displayed by the vehicle-mounted terminal is wrong.
- Block 202 determining a target labeling type and a target time.
- the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined.
- the vehicle-mounted terminal may display a plurality of labeling options, wherein each of the plurality of labeling options is used to indicate a labeling type; receive a labeling instruction triggered for a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; and determine the labeling type carried by the labeling instruction as the target labeling type, and determine a receiving time of the labeling instruction as the target time.
- the vehicle-mounted terminal may display a labeling option interface, a plurality of labeling options may be displayed in the labeling option interface, and each labeling option may be used to indicate a labeling type.
- a user may trigger a labeling instruction by selecting a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option.
- the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use, as the target time, a time at which the labeling instruction is received.
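- For illustration only, a handler for such a labeling instruction might look like the following sketch, where the function name and parameters are hypothetical:

```python
import time
from typing import Optional, Tuple

def on_labeling_option_selected(
    option_label: str,
    paused_display_time: Optional[float] = None,
) -> Tuple[str, float]:
    """Handle a labeling instruction triggered by selecting a labeling option
    (e.g. "expressway", "street", "traffic light"): the labeling type carried by
    the instruction becomes the target labeling type, and the receiving time of
    the instruction (or the paused displayed time, if one is supplied) becomes
    the target time."""
    target_labeling_type = option_label
    target_time = paused_display_time if paused_display_time is not None else time.time()
    return target_labeling_type, target_time
```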
- For example, the vehicle-mounted terminal may display a labeling option interface comprising a plurality of labeling options 401, each of which is used to indicate a labeling type, such as an expressway, a street, or a traffic light.
- a plurality of labeling types for which labeling needs to be performed have been determined before the labeling personnel starts labeling.
- When the labeling personnel sees, in the environment in which the autonomous vehicle is located, an object indicated by a target labeling type in the plurality of labeling types, and a currently displayed fusion result also comprises the object indicated by the target labeling type, the labeling personnel can select a target labeling option in the plurality of labeling options, so as to trigger a labeling instruction corresponding to the target labeling option.
- the labeling instruction carries a labeling type indicated by the target labeling option.
- the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use a receiving time of the labeling instruction as the target time.
- the target labeling type is a content type used to indicate data content comprised in target environmental state data.
- the plurality of labeling options comprised in the labeling option interface are: a traffic light, a lane line, a truck, weather, and an abnormality.
- Labeling types indicated by the labeling options of the traffic light, lane line, truck, weather, and abnormality are content types.
- the labeling personnel wants to label the traffic light.
- the labeling personnel can select the traffic light option in the option interface, so as to trigger a labeling instruction.
- the labeling instruction carries a labeling type of the traffic light that is indicated by the traffic light option.
- the terminal may use the labeling type of the traffic light as the target labeling type, and use, as the target time, a time at which the labeling instruction is received.
- the vehicle-mounted terminal can further display a time in real time above the plurality of labeling options while displaying the plurality of labeling options, wherein the time is synchronized with a time at which the vehicle-mounted sensor collects environmental state data.
- When labeling is triggered, the vehicle-mounted terminal can pause the advance of the displayed time, and use the time displayed at the moment of pausing as the target time.
- a display of the vehicle-mounted terminal may display environmental state data, a time and a plurality of labeling options.
- the displayed time may be synchronized to the collection time at which the displayed environmental state data was collected.
- the displayed time may be paused so that the target time at which the target label type was determined is the same as the collection time of the target environmental state data being labeled.
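- One possible way to realize this pause behavior, offered only as an illustrative sketch with hypothetical names, is a small clock object that tracks the collection time of the displayed data and can be frozen when labeling starts:

```python
class LabelingClock:
    """A clock shown above the labeling options: it normally tracks the
    collection time of the environmental state data currently displayed and can
    be paused when labeling starts, so that the target time equals the
    collection time of the data being labeled."""
    def __init__(self):
        self._displayed_time = None
        self._paused = False

    def sync(self, collection_time: float) -> None:
        """Called for each newly displayed piece of data; ignored while paused."""
        if not self._paused:
            self._displayed_time = collection_time

    def pause(self) -> float:
        """Pause the displayed time and return it for use as the target time."""
        self._paused = True
        return self._displayed_time

    def resume(self) -> None:
        self._paused = False
```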
- the labeling option interface of the vehicle-mounted terminal may further display a time 402 in real time while displaying the plurality of labeling options 401.
- the time 402 pauses at 08:00:00.
- the labeling personnel may find that a fusion result illustrated by the vehicle-mounted terminal is different from an actual situation in front of the autonomous vehicle.
- the labeling personnel may determine a labeling type according to an actual situation of an environment in which the autonomous vehicle is located, and then select a target labeling option corresponding to the labeling type in the plurality of labeling options, so as to trigger a labeling instruction.
- the labeling instruction carries a labeling type indicated by the target labeling option.
- the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use a receiving time of the labeling instruction as the target time.
- the target labeling type is a content type.
- When the labeling personnel finds that a fusion result illustrated by the vehicle-mounted terminal is different from the actual situation in front of the autonomous vehicle, it may be directly considered that the current situation is abnormal.
- the labeling personnel may directly select a target labeling option corresponding to an abnormality type in the plurality of labeling options, so as to trigger a labeling instruction.
- the plurality of labeling options described above are still used as an example.
- For example, suppose there is a truck in front of the autonomous vehicle, and the fusion result illustrated by the vehicle-mounted terminal is a car in front of the autonomous vehicle.
- the labeling personnel can determine that the fusion result for environmental state data that is displayed by the vehicle-mounted terminal is wrong.
- the labeling personnel may determine a labeling type as a truck according to the truck in front of the autonomous vehicle, and then select a truck option in the labeling option interface, so as to trigger a labeling instruction.
- the labeling instruction carries a labeling type of the truck that is indicated by the truck option.
- the labeling personnel may directly select an abnormality type option in the labeling option interface, so as to trigger a labeling instruction.
- the labeling instruction will carry an abnormality type used to indicate a data abnormality.
- the vehicle-mounted terminal may detect a signal of the autonomous vehicle in real time.
- When an abnormal signal is detected, an abnormality type used to indicate a data abnormality is determined as the target labeling type, and a detection time of the abnormal signal is determined as the target time.
- the abnormal signal is used to indicate that a running state of the autonomous vehicle is abnormal.
- the abnormal running state refers to an abnormal behavior of the autonomous vehicle, occurring without human intervention, that affects normal traveling of the autonomous vehicle, for example, automatic exit of the automatic driving system or a failure in detecting a positioning signal.
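- A hedged sketch of this automatic abnormality labeling is shown below; signal_source.poll() and label_sink are assumed interfaces, and the abnormality record is a plain dictionary introduced only for the example:

```python
import time

def monitor_vehicle_signals(signal_source, label_sink):
    """Real-time monitoring sketch: when an abnormal signal is detected (for
    example, automatic exit of the automatic driving system or a failure in
    detecting a positioning signal), an abnormality labeling type is generated
    automatically and the detection time is used as the target time."""
    while True:
        signal = signal_source.poll()
        if signal is not None and signal.get("abnormal"):
            detection_time = time.time()
            label_sink.append({
                "target_labeling_type": "abnormality",
                "target_time": detection_time,
            })
```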
- Block 203 generating, according to the target labeling type, labeling information for target environmental state data in the received environmental state data.
- a collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
- the vehicle-mounted terminal may generate the corresponding labeling information for the target environmental state data according to the target labeling type.
- the labeling information comprises the target labeling type and target time.
- Among the plurality of pieces of environmental state data, the environmental state data whose collection time indicated by its time stamp is the same as the target time is the target environmental state data.
- For example, suppose the target labeling type determined by the vehicle-mounted terminal is a traffic light and the target time is 08:00:00 on June 6, 2018.
- A piece of labeling information can then be generated according to the target labeling type and the target time: 08:00:00 on June 6, 2018, a traffic light.
- The environmental state data labeled with this labeling information is the target environmental state data whose collection time indicated by its time stamp is 08:00:00 on June 6, 2018, among the plurality of pieces of environmental state data.
- environmental state data to be labeled may be indicated by a target time comprised in labeling information, so that labeling for environmental state data can be more flexible and rich.
- the labeling information and the target environmental state data may also be displayed correspondingly, which is not limited in this embodiment of the present application.
- the vehicle-mounted terminal may further display a metadata setting interface, as shown in FIG. 5.
- the metadata setting interface comprises a plurality of metadata setting items, and the labeling personnel can input corresponding configuration information for each metadata setting item.
- the vehicle-mounted terminal can store the configuration information of the plurality of pieces of metadata.
- the metadata setting items comprise a load, a driver, a task, weather, a route, and a software version of the current automatic driving system.
- the configuration information described above can be used as configuration metadata of all environmental state data in this driving process.
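- For example, the session-level configuration metadata might be represented as in the following sketch; the field names mirror the setting items listed above, while the values are placeholders rather than data from this disclosure:

```python
# Placeholder configuration metadata for one driving process; the same metadata
# is attached to all environmental state data collected during that process.
drive_metadata = {
    "load": "test equipment",
    "driver": "driver A",
    "task": "data collection",
    "weather": "clear",
    "route": "route 12",
    "software_version": "autodrive-1.0",
}

def attach_metadata(labeled_pieces, metadata=drive_metadata):
    """Attach the session-level configuration metadata to every labeled piece
    of environmental state data (each piece represented here as a dictionary)."""
    return [{**piece, "metadata": metadata} for piece in labeled_pieces]
```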
- the vehicle-mounted terminal may display the labeling information; receive a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type; and modify the target labeling type comprised in the labeling information to the specified labeling type.
- the vehicle-mounted terminal may detect a selection operation for any one of a plurality of pieces of displayed labeling information, use the labeling information indicated by the selection operation as the target labeling information, then receive a modification instruction for modifying the target labeling information, acquire a specified labeling type carried in the modification instruction, and modify the target labeling type comprised in the target labeling information to the specified labeling type.
- labeling information indicated by a selection operation is target labeling information 403.
- the user can perform a selection operation for a modification option 404 and input a specified labeling type in an edit box 405, and then click a submit option 406, so as to trigger a modification instruction.
- the modification instruction carries the specified labeling type.
- the vehicle-mounted terminal may modify a target labeling type comprised in the target labeling information 403 to the specified labeling type.
- the vehicle-mounted terminal may further receive a deletion instruction for deleting the target labeling information, and then delete the target labeling information according to the deletion instruction.
- For example, the labeling information indicated by a selection operation is the target labeling information 403; the user may then perform a selection operation on a deletion option 407, so as to trigger a deletion instruction. After receiving the deletion instruction, the vehicle-mounted terminal may delete the target labeling information 403.
- the vehicle-mounted terminal may further receive an adding instruction for adding to the target labeling information, acquire a to-be-added specified labeling type carried in the adding instruction, and add the to-be-added specified labeling type to the labeling types comprised in the target labeling information.
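- The modify, delete, and add operations on labeling information can be sketched as follows, assuming for illustration that each piece of labeling information is stored as a dictionary in a list:

```python
def modify_label(labeling_info_list, index, specified_labeling_type):
    """Modify: replace the target labeling type of the selected labeling information."""
    labeling_info_list[index]["target_labeling_type"] = specified_labeling_type

def delete_label(labeling_info_list, index):
    """Delete: remove the selected labeling information."""
    labeling_info_list.pop(index)

def add_label_type(labeling_info_list, index, extra_labeling_type):
    """Add: attach a further labeling type to the selected labeling information."""
    labeling_info_list[index].setdefault("labeling_types", []).append(extra_labeling_type)
```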
- the vehicle-mounted terminal may further store the labeled data in a server, so as to subsequently extract sample data from the labeled data.
- the vehicle-mounted terminal may classify and store the plurality of pieces of labeling information and the environmental state data corresponding to each piece of labeling information according to the target labeling type comprised in the labeling information, and transfer the plurality of pieces of environmental state data and the corresponding labeling information that are classified and stored to a mobile storage medium, such as a portable hard disk or a portable solid state drive.
- the plurality pieces of environmental state data and the corresponding labeling information may then be transferred to the server by the mobile storage medium.
- communication can be performed between the vehicle-mounted terminal and the server, for example over a wireless communication link.
- Each time the vehicle-mounted terminal generates a piece of labeling information, i.e., after labeling is performed on corresponding environmental state data through the labeling information, the environmental state data and the labeling information may be sent to the server. The server can then correspondingly store the received environmental state data and labeling information sent by the vehicle-mounted terminal.
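- A minimal sketch of sending each generated piece of labeling information together with the corresponding environmental state data to the server is given below; the endpoint URL and payload fields are assumptions made only for this example:

```python
import json
import urllib.request

def upload_label(server_url, labeling_info, environmental_state_data):
    """Send one generated piece of labeling information together with the
    corresponding environmental state data to the server, e.g. over a wireless
    link, each time labeling is performed. Both arguments are assumed here to
    be JSON-serializable dictionaries."""
    payload = json.dumps({
        "labeling_info": labeling_info,
        "environmental_state_data": environmental_state_data,
    }).encode("utf-8")
    request = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```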
- a vehicle-mounted terminal can receive environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time, wherein each piece of environmental state data corresponds to a time stamp, and the time stamp is used to indicate a collection time of the environmental state data.
- the vehicle-mounted terminal can generate labeling information according to a determined target labeling type and a target time, and the labeling information is used as labeling information for environmental state data with the same target time.
- the target labeling type is determined according to an environment in which the autonomous vehicle is located.
- labeling information can be generated, according to an environment in which an autonomous vehicle is located, for received environmental state data collected in real time, which helps to avoid the related-art problem of incomplete labeling information that arises when labeling is performed only by recognizing the collected environmental state data, and enriches the content of the labeling.
- the data labeling apparatus, shown by way of example in FIG. 7, may be applied to an autonomous vehicle and comprises:
- a first receiving module 701 configured to receive environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to each piece of environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
- a determination module 702 configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined;
- a generation module 703 configured to generate, according to the target labeling type, labeling information for target environmental state data in the received environmental state data, wherein a collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
- the determination module 702 may be further configured to: display a plurality of labeling options, wherein each of the plurality of labeling options is used to indicate a labeling type; receive a labeling instruction triggered for a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; determine the labeling type carried by the labeling instruction as the target labeling type; and determine a receiving time of the labeling instruction as the target time.
- the apparatus may further comprise:
- a recognition module configured to recognize the received environmental state data
- a fusion module configured to fuse a plurality of recognition results to obtain a fusion result if the plurality of recognition results are obtained after recognition is performed on certain environmental state data
- a first display module configured to display the fusion result, wherein the fusion result is used to indicate data content comprised in the certain environmental state data.
- the determination module 702 may be specifically configured to: detect a signal of the autonomous vehicle in real time; and when detecting an abnormal signal, determine an abnormality type used to indicate a data abnormality as the target labeling type, and determine a detection time of the abnormal signal as the target time.
- the labeling information comprises the target labeling type and the target time; and the apparatus may further comprise:
- a second display module configured to display the labeling information
- a second receiving module configured to receive a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type
- a modification module configured to modify the target labeling type comprised in the labeling information to the specified labeling type.
- a vehicle-mounted terminal can receive environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time, wherein each piece of environmental state data corresponds to a time stamp, and the time stamp is used to indicate a collection time of the environmental state data.
- the vehicle-mounted terminal can generate labeling information according to a determined target labeling type and a target time, and the labeling information is used as labeling information for environmental state data with the same target time.
- the target labeling type is determined according to an environment in which the autonomous vehicle is located. It can be seen that labeling information can be generated, according to an environment in which an autonomous vehicle is located, for received environmental state data collected in real time. This may help to avoid incomplete labeling information and may enrich the content of the labeling.
- the division of functional modules is merely used as an example for illustration. In practical applications, the functions may be allocated to different functional modules for completion according to requirements, i.e., an internal structure of the device is divided into different functional modules to complete all or some of the functions described above.
- the data labeling apparatus provided in the embodiments described above and the data labeling method embodiment belong to the same concept, and for a specific implementation process thereof, reference can be made to the method embodiment, which will not be repeated here.
- FIG. 8 is a structural block diagram of a data labeling terminal 800 according to an exemplary embodiment.
- the terminal 800 may be a notebook computer, a desktop computer, etc.
- the terminal 800 comprises: a processor 801 and a memory 802.
- the processor 801 may comprise one or more processing cores, such as a 4-core processor, an 8-core processor, etc.
- the processor 801 may be implemented in the form of at least one of the following hardware: a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA).
- the processor 801 may also comprise a main processor and a coprocessor.
- the main processor is a processor for processing data in a wake-up state and also referred to as a central processing unit (CPU) .
- the coprocessor is a low-power processor for processing data in a standby state.
- the processor 801 may be integrated with a graphics processing unit (GPU) , and the GPU is configured to render and draw content that needs to be displayed on a display screen.
- the processor 801 may further comprise an artificial intelligence (AI) processor, and the AI processor is configured to process computing operations related to machine learning.
- the memory 802 may comprise one or more computer-readable storage media which may be non-transitory.
- the memory 802 may further comprise a high-speed random access memory and a non-volatile memory, such as one or more disk storage devices and flash storage devices.
- the non-transitory computer-readable storage medium in the memory 802 is configured to store at least one instruction, wherein the at least one instruction is configured to be executed by the processor 801 to implement the data labeling method provided in the method embodiment of the present application.
- the terminal 800 further optionally comprises: a peripheral device interface 803 and at least one peripheral device.
- the processor 801, the memory 802, and the peripheral device interface 803 may be connected by means of a bus or a signal line.
- Each peripheral device may be connected to the peripheral device interface 803 by means of a bus, a signal line, or a circuit board.
- the peripheral device comprises: at least one of a radio frequency circuit 804, a display screen 805, a camera assembly 806, an audio circuit 807, a positioning assembly 808, and a power supply 809.
- the peripheral device interface 803 may be configured to connect at least one peripheral device related to input/output (I/O) to the processor 801 and the memory 802.
- the processor 801, the memory 802, and the peripheral device interface 803 are integrated on the same chip or circuit board.
- any one or two of the processor 801, the memory 802, and the peripheral device interface 803 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the radio frequency circuit 804 is configured to receive and transmit radio frequency (RF) signals which are also referred to as electromagnetic signals.
- the radio frequency circuit 804 communicates with a communication network and other communication devices through electromagnetic signals.
- the radio frequency circuit 804 converts an electrical signal into an electromagnetic signal for sending, or converts a received electromagnetic signal into an electrical signal.
- the radio frequency circuit 804 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chip group, a user identity module card, etc.
- the radio frequency circuit 804 can communicate with other terminals by means of at least one wireless communication protocol.
- the wireless communication protocol includes but is not limited to: the world wide web, the metropolitan area network, the Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G) , wireless local area networks and/or Wireless Fidelity (WiFi) networks.
- the radio frequency circuit 804 may further comprise a circuit related to near field communication (NFC) , which is not limited in the present application.
- the display screen 805 is configured to display a user interface (UI) .
- the UI may comprise a graphic, a text, an icon, a video, and any combination thereof.
- the display screen 805 also has the ability to collect touch signals on or above the surface of the display screen 805.
- the touch signal may be input to the processor 801 as a control signal for processing.
- the display screen 805 can be further configured to provide virtual buttons and/or virtual keyboards which are also referred to as soft buttons and/or soft keyboards.
- the display screen 805 may be a flexible display screen which is arranged on the curved surface or the folding surface of the terminal 800.
- the display screen 805 can even be arranged in a non-rectangular irregular shape, i.e., a profiled screen.
- the display screen 805 may be manufactured by using materials such as a liquid crystal display (LCD) and an organic light-emitting diode (OLED) .
- the camera assembly 806 is configured to collect images or videos.
- the camera assembly 806 comprises a front camera and a rear camera.
- the front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back surface of the terminal.
- there are at least two rear cameras, each of which is any one of a main camera, a depth-of-field camera, a wide-angle camera, and a long-focus camera, so that the main camera and the depth-of-field camera can be fused to realize a bokeh function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and virtual reality (VR) shooting functions or other fusion shooting functions.
- the camera assembly 806 may further comprise a flashlight.
- the flashlight may be a single-color temperature flashlight or a dual-color temperature flashlight.
- the dual-color temperature flashlight refers to a combination of a warm light flashlight and a cold light flashlight, which can be used for light compensation at different color temperatures.
- the audio circuit 807 may comprise a microphone and a speaker.
- the microphone is configured to collect sound waves of a user and an environment, and convert the sound waves into electrical signals and input them to the processor 801 for processing, or input them to the radio frequency circuit 804 to implement voice communication.
- the microphone can further be an array microphone or an omnidirectional acquisition microphone.
- the speaker is configured to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves.
- the speaker may be a traditional thin film speaker or a piezoelectric ceramic speaker.
- the audio circuit 807 may further comprise a headphone jack.
- the positioning assembly 808 is configured to locate a current geographic location of the terminal 800 to implement navigation or a location based service (LBS) .
- the positioning assembly 808 may be a positioning assembly based on the global positioning system (GPS) of the United States, the Beidou system of China, or the Galileo system of the European Union.
- the power supply 809 is configured to supply power to various assemblies in the terminal 800.
- the power supply 809 may be an alternating current battery, a direct current battery, a disposable battery, or a rechargeable battery.
- the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
- the wired rechargeable battery is a battery charged by means of a wired line
- the wireless rechargeable battery is a battery charged by means of a wireless coil.
- the rechargeable battery can be further configured to support fast charging technologies.
- the terminal 800 further comprises one or more sensors 810.
- the one or more sensors 810 include but are not limited to: an acceleration sensor 811, a gyroscope sensor 812, a pressure sensor 813, a fingerprint sensor 814, an optical sensor 815, and a proximity sensor 816.
- the acceleration sensor 811 can detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with the terminal 800.
- the acceleration sensor 811 can be configured to detect components of gravitational acceleration on the three coordinate axes.
- the processor 801 may control the display screen 805 to display the user interface in a landscape view or a portrait view according to a gravity acceleration signal collected by the acceleration sensor 811.
- the acceleration sensor 811 can be further configured to collect game data or motion data of a user.
- the gyroscope sensor 812 can detect a body direction and a rotation angle of the terminal 800, and the gyroscope sensor 812 can cooperate with the acceleration sensor 811 to collect a 3D action of a user to the terminal 800.
- the processor 801 can realize the following functions according to data collected by the gyroscope sensor 812: action sensing (for example, changing the UI based on a tilt operation of the user) , image stabilization during shooting, game control, and inertial navigation.
- the pressure sensor 813 may be arranged at a side frame of the terminal 800 and/or a lower layer of the display screen 805.
- the pressure sensor can detect a grip signal of a user to the terminal 800, and the processor 801 performs left and right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 813.
- the processor 801 controls an operability control on the UI interface according to a pressure operation of the user to the display screen 805.
- the operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
- the fingerprint sensor 814 is configured to collect a fingerprint of a user, and the processor 801 recognizes a user identity according to the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 recognizes the user identity according to the collected fingerprint.
- When the user identity is recognized as trusted, the processor 801 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, etc.
- the fingerprint sensor 814 may be arranged on the front surface, back surface, or side surface of the terminal 800. When a physical button or a manufacturer logo is arranged on the terminal 800, the fingerprint sensor 814 may be integrated with the physical button or manufacturer logo.
- the optical sensor 815 is configured to collect ambient light intensity.
- the processor 801 may control display luminance of the display screen 805 according to the ambient light intensity collected by the optical sensor 815. Specifically, when the ambient light intensity is high, the display luminance of the display screen 805 is turned up; and when the ambient light intensity is low, the display luminance of the display screen 805 is turned down.
- the processor 801 can further dynamically adjust a shooting parameter of the camera assembly 806 according to the ambient light intensity collected by the optical sensor 815.
- the proximity sensor 816, also referred to as a distance sensor, is usually arranged on the front panel of the terminal 800.
- the proximity sensor 816 is configured to collect a distance between a user and the front surface of the terminal 800.
- when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually decreases, the processor 801 controls the display screen 805 to switch from a screen-on state to a screen-off state.
- when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually increases, the processor 801 controls the display screen 805 to switch from the screen-off state to the screen-on state.
- the embodiments of the present application provide a terminal comprising a processor and a memory used for storing instructions that can be executed by the processor, wherein the processor is configured to carry out the data labeling method shown in FIG. 2.
- the embodiments of the application further provide a computer-readable storage medium which stores a computer program that, when executed by a processor, implements the data labeling method shown in FIG. 2.
- the embodiments of the present application further provide a computer program product containing an instruction, and when the computer program product runs on a computer, the computer is enabled to carry out the data labeling method shown in FIG. 2.
- a person of ordinary skill in the art may understand that all or some of the blocks for implementing the embodiments described above may be completed by hardware, or may be completed by a program instructing related hardware.
- the program may be stored in a computer-readable storage medium.
- the above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, etc.
Abstract
A data labeling method and apparatus. In the method, a vehicle-mounted terminal receives environmental state data collected in real time by a vehicle-mounted sensor of an autonomous vehicle and a time stamp corresponding to the collected environmental state data (201). The vehicle-mounted terminal determines a target labeling type and a target time for target environmental state data included in the collected environmental state data (202). The target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined. The vehicle-mounted terminal generates labeling information for the target environmental state data, including the target labeling type and the target time (203).
Description
The present application relates to the technical field of data processing, and in particular to a data labeling method and apparatus.
With the development of science and technology, research into autonomous driving technologies has become a key focus in recent years. Vehicles which are capable of autonomous driving may be referred to as autonomous vehicles or self-driving vehicles. In the context of this disclosure the term autonomous vehicle applies to both fully autonomous vehicles and partially autonomous vehicles, which are capable of driving autonomously some of the time, or in certain conditions, with the back-up of a human driver for certain scenarios or more challenging driving conditions.
An autonomous driving system may include three modules: a sensing module, a decision module, and an execution module. The sensing module is configured to collect environmental state data in real time from a number of sensors and recognize the data. The decision module is configured to generate a driving command according to the collected environmental state data and deliver the driving command to the execution module. The execution module is configured to execute a corresponding driving operation according to the driving command delivered by the decision module.
A large amount of sample data is needed to train the sensing module and the decision module so as to realize accurate sensing and decision-making. Because an amount of data collected by the sensors during a traveling process of an autonomous vehicle is very large, if all the collected data are used as sample data to train the sensing module and the decision module, the training time will be too long. In addition, the collected data may include invalid data which may complicate the training process.
SUMMARY
A first aspect of the present disclosure provides a data labeling method, executed by a vehicle-mounted terminal of an autonomous vehicle, the method comprising:
receiving, by the vehicle mounted terminal, environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp indicates a collection time of the environmental state data;
determining, by the vehicle mounted terminal, a target labeling type and a target time for target environmental state data included in the collected environmental state data, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and wherein the target time refers to a time at which the target labeling type is determined; and
generating, by the vehicle mounted terminal, labeling information for the target environmental state data, wherein the labeling information includes the target labeling type and target time determined by the vehicle mounted terminal and wherein the determined target time is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
A second aspect of the present disclosure provides a data labeling apparatus suitable for use as a vehicle-mounted terminal of an autonomous vehicle, the data labeling apparatus comprising:
a first receiving module configured to receive environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
a determination module configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined; and
a generation module configured to generate, according to the target labeling type, labeling information for target environmental state data included in the received environmental state data, wherein the labeling information includes the target labeling type and target time determined by the determination module and wherein the determination module is configured to determine a target time which is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
A third aspect of the present disclosure provides a non-transitory computer readable storage medium storing instructions which are executable by a processor to perform the method of the first aspect of the present disclosure.
Further features and aspects of the present disclosure are disclosed in the appended claims. The above-described approach may help to enrich the content of labelling and may help to avoid or reduce incomplete labeling information, which may occur if data is labelled manually by labelling personnel after the journey is completed.
Examples of the present disclosure are described below, by way of example only, with reference to the accompanying drawings in which:
FIG. 1 is an architectural diagram of an example of a system in which a data labeling method according to the present application may be used;
FIG. 2 is a flowchart of an example data labeling method according to the present application;
FIG. 3 is a schematic diagram of an example fusion result provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a data labeling interface provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of another data labeling interface provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of still another data labeling interface provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a data labeling apparatus provided in an embodiment of the present application; and
FIG. 8 is a schematic structural diagram of another data labeling apparatus provided in an embodiment of the present application.
DETAILED DESCRIPTION OF PREFERRED IMPLEMENTATIONS
Examples of the present disclosure are described in further detail below with reference to the accompanying drawings.
In the field of automatic driving, a neural network for recognizing environmental state data may be deployed in a sensing module and a neural network for making a driving decision may be deployed on a decision module. In this case, the sensing module and the decision module need to be trained with different sample data, so that the sensing module can recognize data content comprised in the collected data, and the decision module can make a driving decision according to the data content comprised in the collected data.
In order to improve the accuracy of a neural network model, a large amount of sample data is required to train the neural network model. In the context of autonomous driving, a difficulty arises because the amount of data collected by sensors during a journey of the autonomous vehicle is very large. Furthermore, the collected data may include a lot of invalid data. In order to accurately extract sample data from the collected data, the collected data may be labeled, so that it can be classified and filtered according to labeling information.
One approach is to upload the collected data from the journey as a whole to a user terminal after the journey is completed. Labeling personnel can then browse the collected data on the user terminal and label the collected data a piece at a time. However, this may result in incomplete labeling as the personnel may miss or fail to recognize all of the relevant items of collected data.
Accordingly the present disclosure proposes a data labeling method, which may be executed by a vehicle-mounted terminal of an autonomous vehicle in order to more accurately or completely label data collected by the vehicle sensors during a journey of the vehicle.
The vehicle-mounted terminal may for example be a computing device located in the vehicle. The vehicle mounted terminal receives environmental state data collected in real-time by a vehicle-mounted sensor of the autonomous vehicle. In some examples the environmental state data may be received by the vehicle-mounted terminal in real-time and labelled in real-time by a user of the vehicle-mounted terminal. For instance the environmental state data may be received and labeled by the vehicle-mounted terminal during a journey of the autonomous vehicle. In this way a user of the vehicle-mounted terminal can verify or enter the labels based upon observing the environment of the vehicle. This may help to ensure accurate labelling and/or identify invalid data.
In some implementations, the user may be assisted by a recognition function, such as a neural network, of the vehicle-mounted terminal, which may identify objects to be labeled and/or suggest labels for objects or scenarios. This may help to ensure more complete labeling of data and make it less likely for the user to miss data which should be labeled.
The vehicle-mounted terminal receives both the collected environmental state data and a time stamp corresponding to the environmental state data. The time stamp indicates a collection time of the environmental state data. The vehicle-mounted terminal determines a target labeling type and a target time for target environmental state data included in the collected environmental state data. Target environmental state data is environmental state data which is to be labeled. The target labeling type is determined according to an environment in which the autonomous vehicle is located (for example based on user selection of the target labeling type, or a combination of user selection and machine recognition). The target time refers to a time at which the target labeling type is determined.
The vehicle mounted terminal generates labeling information for the target environmental state data including the determined target labeling type and target time. In this way the environmental state data may be labeled. The determined target time is the same as the collection time indicated by the time stamp corresponding to the target environmental state data. In this way the label is tied to the time stamp of the environmental data.
In some examples, the target labeling type is determined based on a user selection or user input. As the user selection or user input may take some time, the time at which the user actually completes the user input or selection process may be somewhat delayed compared to the collection time of the environmental state data they are labeling. In order to compensate for this, the vehicle-mounted terminal may suspend (e.g. pause) the displayed time so that the target time is the same as the collection time of the target environmental state data being labelled.
In one example, the vehicle mounted terminal performs recognition on the collected environmental state data, using a neural network and displays a proposed target labeling type for the target environmental state data based on a result of the recognition. The vehicle mounted terminal may receive a user selection either confirming the proposed target labeling type or selecting an alternative target labeling type.
In other examples, the user may input a target labeling type or select a target labeling type from a list of options, without receiving a specific proposed target labeling type. The vehicle mounted terminal may receive a plurality of pieces of environmental state data and corresponding time stamps indicating collection times of the plurality of pieces of environmental state data. The vehicle-mounted terminal may apply the same labeling information to a plurality of pieces of environmental state data which have a same collection time.
A system architecture related to the data labeling method provided in this embodiment of the present application will now be described, followed by examples of the data labeling method.
FIG. 1 is an architectural diagram of an example system 100 for labeling data collected by one or more sensors of an autonomous vehicle 1 according to an example of the present disclosure. As shown in FIG. 1, the system 100 comprises a vehicle-mounted sensor 101 and a vehicle-mounted terminal 102. The vehicle-mounted sensor 101 is to collect environmental state data from an environment of the autonomous vehicle, while the vehicle-mounted terminal 102 is to label the collected environmental state data.
The vehicle-mounted sensor 101 may for example be a camera, a laser radar, a millimeter wave radar, etc., or other type of vehicle-mounted sensor. The system may include a plurality of vehicle-mounted sensors and further examples of sensors are given below with reference to FIG. 8.
The vehicle mounted terminal 102 may be any suitable computing device, such as but not limited to a personal computer, desktop computer, client computer terminal, tablet computer, mobile computing device, smart-phone etc. The vehicle mounted terminal 102 includes a processor and may also include a display and user interface. When performing the data labeling according to the present disclosure the vehicle mounted terminal 102 should be located in the autonomous vehicle 1. In some examples the vehicle mounted terminal 102 may be a computing device permanently or temporarily fixed to the autonomous vehicle. However, in other examples the vehicle-mounted terminal may be a portable computing device, such as a smart phone or tablet computer, which may be carried on and off the autonomous vehicle. The vehicle-mounted sensor 101 and the vehicle-mounted terminal 102 are connected in a wireless or wired manner for communication.
The system 100 may also include a server 103, such as but not limited to a server or a server cluster. The server 103 may be used to store environmental state data which has been collected and/or labelled by the vehicle-mounted terminal. The vehicle-mounted terminal 102 and the server 103 may be connected in a wireless manner so that the vehicle-mounted terminal 102 may send data wirelessly to the server 103. Wireless communication may make it possible to transmit data from the vehicle-mounted terminal 102 to the server 103 in real-time. In other examples the vehicle-mounted terminal 102 may be connected in a wired manner to the server 103, for instance by connecting the vehicle-mounted terminal to the server in a wired manner after the autonomous vehicle has completed a journey.
The vehicle-mounted sensor 101 is a sensor mounted on an autonomous vehicle. The vehicle-mounted sensor 101 can collect surrounding environmental state data in real time, and send the environmental state data and a time stamp corresponding to the environmental state data to the vehicle-mounted terminal 102 while collecting the environmental state data. For example, the vehicle-mounted sensor 101 may collect a plurality of pieces of environmental state data and each piece of environmental state data may be time stamped with a collection time, which is a time at which the piece of environmental state data was collected.
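By way of non-limiting illustration, the pairing of each piece of environmental state data with its collection time might be sketched as follows in Python; the TimestampedData structure and the read_sensor callable are hypothetical names introduced here for illustration only and do not form part of the disclosed system.

```python
import time
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class TimestampedData:
    """One piece of environmental state data plus the time stamp of its collection."""
    payload: Any            # raw sensor output, e.g. an image frame or a radar sweep
    collection_time: float  # collection time used as the time stamp


def collect_and_stamp(read_sensor: Callable[[], Any]) -> TimestampedData:
    """Read one piece of data from the sensor and stamp it with its collection time."""
    payload = read_sensor()  # hypothetical callable that reads the sensor once
    return TimestampedData(payload=payload, collection_time=time.time())
```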
A neural network for recognizing data may be deployed on the vehicle-mounted terminal 102. The vehicle-mounted terminal 102 can receive the environmental state data collected by the vehicle-mounted sensor 101 in real time and a time stamp corresponding to each piece of environmental state data. The vehicle-mounted terminal may recognize the received environmental state data using the neural network. If the recognition is successful then a recognition result is produced. For example, the neural network may recognize one or more objects in the piece of environmental state data, in which case the recognition result includes an unlabeled object. In some cases the neural network may recognize not only the existence of an object, but also a type of object or a scenario, and in that case the recognition result may include a proposed label for the object or scenario. When a plurality of recognition results are obtained after recognition is performed on the environmental state data, the vehicle-mounted terminal 102 can fuse the plurality of recognition results. To fuse means to combine a plurality of recognition results together. The fused recognition results may be referred to as a “fusion result”. The vehicle-mounted terminal may display the fusion result. For example, if the vehicle-mounted sensor is a camera, then the fused recognition result may be a video frame including a plurality of recognized objects. In another example, if a first vehicle-mounted sensor is a camera and a second vehicle-mounted sensor is a radar, then fusing the recognition results may include combining the video data and radar data together. For example, the fusion result may include an object identified from the video data and a distance to the object based on the radar data.
In a process of receiving the environmental state data collected by the vehicle-mounted sensor in real time, the vehicle-mounted terminal 102 may determine a target labeling type; use, as a target time, a time at which the target labeling type is determined; and generate labeling information for target environmental state data according to the target labeling type. A labeling instruction may be received at the target time, which may be defined to be the same as a collection time indicated by a time stamp corresponding to the target environmental state data. The labeling information comprises the target labeling type and the target time. After the labeling information is generated for the target environmental state data, the vehicle-mounted terminal 102 may display the labeling information, receive an instruction for the displayed labeling information, and perform, on the selected labeling information, an operation indicated by the instruction. The vehicle-mounted terminal 102 may further send the received environmental state data and the generated labeling information to the server 103.
The server 103 may receive the labeling information generated by the vehicle-mounted terminal 102 and the environmental state data that are sent by the vehicle-mounted terminal 102, determine, from the received environmental state data according to the target time comprised in the labeling information, the target environmental state data with a time indicated by a corresponding time stamp that is the same as the target time, and correspondingly store the labeling information and the target environmental state data. In addition, the server 103 may further classify and store the environmental state data and the corresponding labeling information according to the target labeling type comprised in the labeling information corresponding to the environmental state data, so as to subsequently acquire different types of data to train different neural network models.
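A minimal sketch of this server-side matching and classified storage is given below, assuming for illustration that environmental state data and labeling information are exchanged as simple dictionaries; the field names used here are hypothetical.

```python
from collections import defaultdict


def store_labeled_data(environmental_data, labeling_infos):
    """Match labeling information to environmental state data by target time,
    then group the matched pairs by target labeling type.

    environmental_data: list of dicts like {"collection_time": t, "payload": ...}
    labeling_infos:     list of dicts like {"target_time": t, "target_labeling_type": s}
    """
    by_time = {d["collection_time"]: d for d in environmental_data}
    classified = defaultdict(list)  # target labeling type -> list of (data, labeling info)
    for info in labeling_infos:
        data = by_time.get(info["target_time"])
        if data is not None:  # only store data whose collection time equals the target time
            classified[info["target_labeling_type"]].append((data, info))
    return classified
```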
Then, the data labeling method provided in this embodiment of the present application is introduced.
FIG. 2 is a flowchart of a data labeling method provided in an embodiment of the present application. The method may be applied to a terminal, and the terminal may refer to the vehicle-mounted terminal 102 in FIG. 1. As shown in FIG. 2, the method comprises the following blocks:
Block 201: receiving environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time and a time stamp corresponding to each piece of environmental state data.
The time stamp is used to indicate a collection time of corresponding environmental state data.
It should be noted that the vehicle-mounted sensor can collect environmental state data around the autonomous vehicle in real time, and acquire a collection time of the environmental state data while collecting the environmental state data, wherein the collection time of the environmental state data is used as a time stamp corresponding to the environmental state data. Then, the vehicle-mounted sensor can send the environmental state data and the time stamp corresponding to the environmental state data to the vehicle-mounted terminal.
After receiving the environmental state data collected in real time and the time stamp corresponding to each piece of environmental state data that are sent by the vehicle-mounted sensor, the vehicle-mounted terminal can recognize the received environment state data, and then fuse recognition results to display a fusion result.
In some embodiments, the environmental state data received by the vehicle-mounted terminal is data collected by a single type of vehicle-mounted sensor. In this case, if a plurality of recognition results are obtained after recognition is performed on certain environmental state data, the plurality of recognition results are fused to obtain a fusion result; and the fusion result is displayed, wherein the fusion result is used to indicate data content comprised in the certain environmental state data.
As an example, when the environmental state data is image data, the vehicle-mounted terminal may input the environmental state data into a neural network model deployed on the vehicle-mounted terminal for data content recognition, recognize the environmental state data by means of the neural network model, and output a plurality of recognition results corresponding to the environmental state data. Then, the vehicle-mounted terminal may fuse the plurality of recognition results to obtain a fusion result, and display the fusion result.
The neural network model may be a neural network trained with a large amount of sample data.
Exemplarily, an example in which the sensor is a camera is used. In this example, the environmental state data collected by the camera is image data, and the vehicle-mounted terminal can input the image data into the neural network model and recognize the image data by means of the neural network model. It is assumed that a plurality of recognition results output by the neural network model are: a traffic light, a lane line, and a car. The vehicle-mounted terminal can fuse the traffic light, the lane line and the car to obtain a fusion result, and display the fusion result.
In other embodiments, the environmental state data received by the vehicle-mounted terminal may be data collected by different types of vehicle-mounted sensors. In this case, the vehicle-mounted terminal may recognize a plurality of different types of environmental state data collected at the same time to obtain a plurality of recognition results corresponding to the plurality of different types of environmental state data. Then, the vehicle-mounted terminal may fuse the plurality of recognition results to obtain a fusion result, and display the fusion result.
Exemplarily, when the vehicle-mounted sensors comprise a camera and a radar, the vehicle-mounted terminal may recognize different types of environmental state data collected by the camera and the radar at the same time. Specifically, the neural network model may be used to recognize image data collected by the camera, where an obtained recognition result is a car; and also recognize data collected by the radar, where an obtained recognition result is a distance between the autonomous vehicle and an obstacle and a shape and size of the obstacle. Then, the vehicle-mounted terminal can fuse the recognition result obtained based on the image data and the recognition result obtained based on the radar data to obtain a fusion result. Then, as shown in FIG. 3, the vehicle-mounted terminal can display a fusion result comprising a distance between the autonomous vehicle and the car and a size and a shape of the car.
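By way of illustration only, fusing a camera-based recognition result with a radar-based recognition result as in the example above might be sketched as follows; the dictionary fields are hypothetical and chosen purely for readability.

```python
def fuse_recognition_results(camera_result, radar_result):
    """Combine a camera-based recognition result with a radar-based recognition result."""
    return {
        "object": camera_result.get("object"),         # object type recognized from the image data
        "distance_m": radar_result.get("distance_m"),  # distance to the object from the radar data
        "shape": radar_result.get("shape"),
        "size_m": radar_result.get("size_m"),
    }


# Example corresponding to the scenario above: a car recognized in the image data,
# with distance, shape and size obtained from the radar data.
fusion_result = fuse_recognition_results(
    {"object": "car"},
    {"distance_m": 12.5, "shape": "box", "size_m": (4.5, 1.8)},
)
print(fusion_result)
```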
In this embodiment of the present application, the vehicle-mounted terminal fuses a plurality of recognition results of environmental state data and displays an obtained fusion result, so that labeling personnel on the autonomous vehicle can determine, according to an actual situation of an environment in which the autonomous vehicle is located, whether the fusion result displayed by the vehicle-mounted terminal is correct.
Exemplarily, taking the fusion result shown in FIG. 3 as an example, the labeling personnel sees a truck in front of the autonomous vehicle, but the fusion result illustrated by the vehicle-mounted terminal is a car. In this case, the labeling personnel can determine that the fusion result for environmental state data that is displayed by the vehicle-mounted terminal is wrong.
Block 202: determining a target labeling type and a target time.
The target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined.
In some embodiments, the vehicle-mounted terminal may display a plurality of labeling options, wherein each of the plurality of labeling options is used to indicate a labeling type; receive a labeling instruction triggered for a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; and determine the labeling type carried by the labeling instruction as the target labeling type, and determine a receiving time of the labeling instruction as the target time.
The vehicle-mounted terminal may display a labeling option interface, a plurality of labeling options may be displayed in the labeling option interface, and each labeling option may be used to indicate a labeling type. A user may trigger a labeling instruction by selecting a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option. Correspondingly, the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use, as the target time, a time at which the labeling instruction is received.
Exemplarily, as shown in FIG. 4, the vehicle-mounted terminal may display a labeling option interface, the labeling option interface comprises a plurality of labeling options 401, and each of the plurality of labeling options 401 is used to indicate a labeling type, such as an expressway, a street, and a traffic light.
In a possible case, before the labeling personnel starts labeling, a plurality of labeling types for which labeling needs to be performed have been determined. On this basis, when the labeling personnel sees an object indicated by a target labeling type in the plurality of labeling types in an environment in which the autonomous vehicle is located, and a currently displayed fusion result also comprises the object indicated by the target labeling type, the labeling personnel can select a target labeling option in the plurality of labeling options, so as to trigger a labeling instruction corresponding to the target labeling option. The labeling instruction carries a labeling type indicated by the target labeling option. Correspondingly, the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use a receiving time of the labeling instruction as the target time. In this case, the target labeling type is a content type used to indicate data content comprised in target environmental state data.
Exemplarily, it is assumed that the plurality of labeling options comprised in the labeling option interface are: a traffic light, a lane line, a truck, weather, and an abnormality. Labeling types indicated by the labeling options of the traffic light, lane line, truck, weather, and abnormality are content types. The labeling personnel wants to label the traffic light. In this case, when there is a traffic light in front of the autonomous vehicle and a fusion result illustrated by the vehicle-mounted terminal also comprises the traffic light, the labeling personnel can select the traffic light option in the option interface, so as to trigger a labeling instruction. In this case, the labeling instruction carries a labeling type of the traffic light that is indicated by the traffic light option. After receiving the labeling instruction, the terminal may use the labeling type of the traffic light as the target labeling type, and use, as the target time, a time at which the labeling instruction is received.
It should be noted that the vehicle-mounted terminal can further display a time in real time above the plurality of labeling options while displaying the plurality of labeling options, wherein the time is synchronized with a time at which the vehicle-mounted sensor collects environmental state data. On this basis, when receiving a labeling instruction, the vehicle-mounted terminal can pause the displayed time, and use the time displayed at the moment of pausing as the target time. Thus a display of the vehicle-mounted terminal may display environmental state data, a time and a plurality of labeling options. The displayed time may be synchronized to the collection time at which the displayed environmental state data was collected. Upon a user inputting a labeling instruction (e.g. by selecting one of the labeling options), the displayed time may be paused so that the target time at which the target labeling type was determined is the same as the collection time of the target environmental state data being labeled.
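The pause-and-capture behaviour described above might be sketched as follows; the SynchronizedClock class is a hypothetical illustration under the stated assumptions, not a definitive implementation of the vehicle-mounted terminal.

```python
class SynchronizedClock:
    """Displayed time that tracks the collection time of the displayed data and
    can be paused when a labeling instruction is received."""

    def __init__(self):
        self.displayed_time = None  # updated as each piece of data is displayed
        self.paused = False

    def on_data_displayed(self, collection_time):
        # Keep the displayed time synchronized with the collection time of the displayed data.
        if not self.paused:
            self.displayed_time = collection_time

    def on_labeling_instruction(self):
        # Pause the displayed time; the paused value is used as the target time,
        # so it equals the collection time of the environmental state data being labeled.
        self.paused = True
        return self.displayed_time

    def resume(self):
        self.paused = False
```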
Exemplarily, as shown in FIG. 4, the labeling option interface of the vehicle-mounted terminal may further display a time 402 in real time while displaying the plurality of labeling options 401. In this case, the time 402 pauses at 08:00:00.
In another possible case, the labeling personnel may find that a fusion result illustrated by the vehicle-mounted terminal is different from an actual situation in front of the autonomous vehicle. In this case, the labeling personnel may determine a labeling type according to an actual situation of an environment in which the autonomous vehicle is located, and then select a target labeling option corresponding to the labeling type in the plurality of labeling options, so as to trigger a labeling instruction. In this case, the labeling instruction carries a labeling type indicated by the target labeling option. Correspondingly, the vehicle-mounted terminal may receive the labeling instruction, use the labeling type carried in the labeling instruction as the target labeling type, and use a receiving time of the labeling instruction as the target time. In this case, the target labeling type is a content type.
Optionally, when the labeling personnel finds that a fusion result illustrated by the vehicle-mounted terminal is different from the actual situation in front of the autonomous vehicle, it may be directly considered that the current situation is abnormal. In this case, the labeling personnel may directly select a target labeling option corresponding to an abnormality type in the plurality of labeling options, so as to trigger a labeling instruction.
Exemplarily, the plurality of labeling options described above are still used as an example. Currently, there is a truck in front of the autonomous vehicle, and a fusion result illustrated by the vehicle-mounted terminal is a car in front of the autonomous vehicle. In this case, the labeling personnel can determine that the fusion result for environmental state data that is displayed by the vehicle-mounted terminal is wrong. In this case, the labeling personnel may determine a labeling type as a truck according to the truck in front of the autonomous vehicle, and then select a truck option in the labeling option interface, so as to trigger a labeling instruction. In this case, the labeling instruction carries a labeling type of the truck that is indicated by the truck option. Optionally, after the labeling personnel determines that a fusion result for environmental state data that is displayed by the vehicle-mounted terminal is wrong, it may be directly considered that the current situation is abnormal. In this case, the labeling personnel may directly select an abnormality type option in the labeling option interface, so as to trigger a labeling instruction. In this case, the labeling instruction will carry an abnormality type used to indicate a data abnormality.
In other embodiments, the vehicle-mounted terminal may detect a signal of the autonomous vehicle in real time. When an abnormal signal is detected, an abnormality type used to indicate a data abnormality is determined as the target labeling type, and a detection time of the abnormal signal is determined as the target time.
The abnormal signal is used to indicate that a running state of the autonomous vehicle is abnormal. The abnormal running state refers to an abnormal behavior of the autonomous vehicle, occurring without human intervention, that affects normal traveling of the autonomous vehicle, for example, automatic exit of an automatic driving system, a failure in detecting a positioning signal, etc.
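A minimal sketch of such abnormal-signal detection is given below, assuming for illustration that the monitored signals are exposed as simple boolean flags; the flag names are hypothetical.

```python
import time

ABNORMALITY_TYPE = "abnormality"  # labeling type used to indicate a data abnormality


def check_vehicle_signals(signals):
    """Return (target_labeling_type, target_time) when an abnormal signal is detected,
    otherwise None.

    signals: dict of monitored flags, e.g.
        {"autonomous_mode_active": True, "positioning_signal_ok": True}
    """
    automatic_driving_exited = not signals.get("autonomous_mode_active", True)
    positioning_signal_lost = not signals.get("positioning_signal_ok", True)
    if automatic_driving_exited or positioning_signal_lost:
        return ABNORMALITY_TYPE, time.time()  # detection time of the abnormal signal
    return None
```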
Block 203: generating, according to the target labeling type, labeling information for target environmental state data in the received environmental state data.
A collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
After determining the target labeling type and the target time, the vehicle-mounted terminal may generate the corresponding labeling information for the target environmental state data according to the target labeling type. The labeling information comprises the target labeling type and the target time. In this case, environmental state data whose collection time indicated by a time stamp is the same as the target time, among a plurality of pieces of environmental state data, is the target environmental state data.
Exemplarily, the target labeling type determined by the vehicle-mounted terminal is a traffic light, and the target time is 08:00:00 on June 6, 2018. A piece of labeling information can be generated according to the target labeling type and the target time: 08:00:00 on June 6, 2018, a traffic light. In this case, environmental state data for which the labeling information is labeled is target environmental state data whose collection time indicated by a time stamp is 08:00:00 on June 6, 2018, among a plurality of pieces of environmental state data.
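By way of illustration, generating a piece of labeling information and locating the target environmental state data whose collection time matches the target time might be sketched as follows; the dictionary fields are hypothetical and times are shown as plain strings purely for readability.

```python
def generate_labeling_information(target_labeling_type, target_time):
    """Build a piece of labeling information from the determined type and time."""
    return {"target_labeling_type": target_labeling_type, "target_time": target_time}


def find_target_data(environmental_data, labeling_information):
    """Select the environmental state data whose collection time equals the target time."""
    return [d for d in environmental_data
            if d["collection_time"] == labeling_information["target_time"]]


info = generate_labeling_information("traffic light", "2018-06-06 08:00:00")
data = [
    {"collection_time": "2018-06-06 08:00:00", "payload": "frame A"},
    {"collection_time": "2018-06-06 08:00:01", "payload": "frame B"},
]
print(find_target_data(data, info))  # only the first piece of data matches
```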
It should be noted that, in this embodiment of the present application, instead of directly labeling the collected environmental state data, environmental state data to be labeled may be indicated by a target time comprised in labeling information, so that labeling for environmental state data can be more flexible and rich.
Certainly, in this embodiment of the present application, after the vehicle-mounted terminal generates the labeling information, the labeling information and the target environmental state data may also be displayed correspondingly, which is not limited in this embodiment of the present application.
Optionally, in this embodiment of the present application, before generating the labeling information for the target environmental state data according to the target labeling type, the vehicle-mounted terminal may further display a metadata setting interface, as shown in FIG. 5. The metadata setting interface comprises a plurality of metadata setting items, and the labeling personnel can input corresponding configuration information to each metadata setting item. After receiving the configuration information input by the labeling personnel, the vehicle-mounted terminal can store configuration information of a plurality of pieces of metadata. The metadata setting items comprise a load, a driver, a task, weather, a route and a software version of the current automatic driving system. In this case, after the vehicle-mounted terminal completes data labeling for the autonomous vehicle in one traveling process, the configuration information described above can be used as configuration metadata of all environmental state data in this driving process.
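A minimal sketch of such configuration metadata is given below; the item values are hypothetical examples and the attach_metadata helper is introduced here for illustration only.

```python
# Hypothetical configuration metadata entered once via the metadata setting interface;
# it is then associated with all environmental state data collected in that traveling process.
journey_metadata = {
    "load": "test equipment",
    "driver": "driver A",
    "task": "urban data collection",
    "weather": "sunny",
    "route": "route 3",
    "software_version": "autodrive 1.2.0",
}


def attach_metadata(labeling_information, metadata=journey_metadata):
    """Return labeling information augmented with the journey's configuration metadata."""
    return {**labeling_information, "metadata": metadata}
```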
After generating the labeling information for the environmental state data, the vehicle-mounted terminal may display the labeling information; receive a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type; and modify the target labeling type comprised in the labeling information to the specified labeling type.
In this embodiment of the present application, after displaying the labeling information for the environmental state data, the vehicle-mounted terminal may detect a selection operation for any one of a plurality of pieces of displayed labeling information, and use labeling information indicated by the selection operation as the target labeling information, then receive a modification instruction for modifying the target labeling information, acquire a specified labeling type carried in the modification instruction, and modify the target labeling type comprised in the target labeling information to the specified labeling type.
Exemplarily, referring to FIG. 6, labeling information indicated by a selection operation is target labeling information 403. Then, the user can perform a selection operation for a modification option 404 and input a specified labeling type in an edit box 405, and then click a submit option 406, so as to trigger a modification instruction. The modification instruction carries the specified labeling type. After receiving the modification instruction, the vehicle-mounted terminal may modify a target labeling type comprised in the target labeling information 403 to the specified labeling type.
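By way of illustration, the modification operation might be sketched as follows, assuming labeling information is held as a simple dictionary; the field names are hypothetical.

```python
def modify_labeling_information(target_labeling_information, specified_labeling_type):
    """Replace the target labeling type in the selected labeling information with the
    specified labeling type carried by the modification instruction."""
    target_labeling_information["target_labeling_type"] = specified_labeling_type
    return target_labeling_information


# Example: correcting a wrong label after comparing the displayed fusion result
# with the actual situation in front of the vehicle.
info = {"target_labeling_type": "car", "target_time": "2018-06-06 08:00:00"}
print(modify_labeling_information(info, "truck"))
```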
In one example, after detecting a selection operation for a plurality of pieces of labeling information displayed in a display interface and using labeling information indicated by the selection operation as the target labeling information, the vehicle-mounted terminal may further receive a deletion instruction for deleting the target labeling information, and then delete the target labeling information according to the deletion instruction.
In one example, referring to FIG. 6, labeling information indicated by a selection operation is the target labeling information 403, and then the user may perform a selection operation for a deletion option 407, so as to trigger a deletion instruction. After receiving the deletion instruction, the vehicle-mounted terminal may delete the target labeling information 403.
In one example, after detecting a selection operation for a plurality of pieces of labeling information displayed in a display interface and using labeling information indicated by the selection operation as the target labeling information, the vehicle-mounted terminal may further receive an adding instruction for adding to the target labeling information, then acquire a to-be-added specified labeling type carried in the adding instruction, and add the to-be-added specified labeling type to the labeling types comprised in the target labeling information.
The vehicle-mounted terminal may further store the labeled data in a server, so as to subsequently extract sample data from the labeled data.
In a possible scenario, it may not be possible to perform communication between the vehicle-mounted terminal and the server during the journey of the automated vehicle. In that case, the vehicle-mounted terminal may classify and store the plurality of pieces of labeling information and the environmental state data corresponding to each piece of labeling information according to a target labeling type comprised in the labeling information, and transfer the plurality of pieces of environmental state data and corresponding labeling information that are classified and stored to a mobile storage medium, such as a portable hard disk or portable solid state drive. The plurality of pieces of environmental state data and the corresponding labeling information may then be transferred to the server by the mobile storage medium.
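A minimal sketch of this classify-and-store step is given below, assuming for illustration that each labeled item is a JSON-serializable dictionary and that the mobile storage medium is mounted at a given directory; the file layout shown is hypothetical.

```python
import json
from collections import defaultdict
from pathlib import Path


def export_classified_data(labeled_items, out_dir):
    """Group labeled items by target labeling type and write one JSON file per type,
    e.g. onto a portable hard disk mounted at out_dir.

    labeled_items: list of dicts like
        {"labeling_information": {"target_labeling_type": ..., "target_time": ...},
         "environmental_state_data": ...}
    """
    groups = defaultdict(list)
    for item in labeled_items:
        groups[item["labeling_information"]["target_labeling_type"]].append(item)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for labeling_type, items in groups.items():
        (out / f"{labeling_type}.json").write_text(json.dumps(items))
```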
In another possible case, communication can be performed between the vehicle-mounted terminal and the server, for example over a wireless communication link. In an automated driving process, each time the vehicle-mounted terminal generates a piece of labeling information, i.e., after labeling is performed on corresponding environmental state data through labeling information, the environmental state data and the labeling information may be sent to the server. Then, the server can correspondingly store the received environmental state data and the labeling information which are sent by the vehicle-mounted terminal.
In this embodiment of the present application, a vehicle-mounted terminal can receive environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time, wherein each piece of environmental state data corresponds to a time stamp, and the time stamp is used to indicate a collection time of the environmental state data. On this basis, the vehicle-mounted terminal can generate labeling information according to a determined target labeling type and a target time, and the labeling information is used as labeling information for environmental state data with the same target time. The target labeling type is determined according to an environment in which the autonomous vehicle is located. It can be seen that, in this embodiment of the present application, labeling information can be generated, according to an environment in which an autonomous vehicle is located, for received environmental state data collected in real time, which helps to avoid the related-art problem of incomplete labeling information that arises when labeling is performed only by recognizing the collected environmental state data, and enriches the content of the labeling.
Referring to FIG. 7, an embodiment of the present application provides a data labeling apparatus 700. The data labeling apparatus may be applied to an autonomous vehicle and comprises:
a first receiving module 701 configured to receive environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to each piece of environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
a determination module 702 configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined; and
a generation module 703 configured to generate, according to the target labeling type, labeling information for target environmental state data in the received environmental state data, wherein a collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
The determination module 702 may be further configured to:
display a plurality of labeling options, wherein each of the plurality of labeling options is used to indicate a labeling type;
receive a labeling instruction triggered for a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; and
determine the labeling type carried by the labeling instruction as the target labeling type, and determine a receiving time of the labeling instruction as the target time.
The apparatus may further comprise:
a recognition module configured to recognize the received environmental state data;
a fusion module configured to fuse a plurality of recognition results to obtain a fusion result if the plurality of recognition results are obtained after recognition is performed on certain environmental state data; and
a first display module configured to display the fusion result, wherein the fusion result is used to indicate data content comprised in the certain environmental state data.
The determination module 702 may be specifically configured to:
determine, as the target labeling type, an abnormality type used to indicate a data abnormality when detecting an abnormal signal, and determine a detection time of the abnormal signal as the target time, wherein the abnormal signal is used to indicate that a running state of the autonomous vehicle is abnormal.
In one example, the labeling information comprises the target labeling type and the target time; and the apparatus may further comprise:
a second display module configured to display the labeling information;
a second receiving module configured to receive a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type; and
a modification module configured to modify the target labeling type comprised in the labeling information to the specified labeling type.
In conclusion, in some examples of the present application, a vehicle-mounted terminal can receive environmental state data collected by a vehicle-mounted sensor of an autonomous vehicle in real time, wherein each piece of environmental state data corresponds to a time stamp, and the time stamp is used to indicate a collection time of the environmental state data. On this basis, the vehicle-mounted terminal can generate labeling information according to a determined target labeling type and a target time, and the labeling information is used as labeling information for environmental state data with the same target time. The target labeling type is determined according to an environment in which the autonomous vehicle is located. It can be seen that labeling information can be generated, according to an environment in which an autonomous vehicle is located, for received environmental state data collected in real time. This may help to avoid incomplete labeling information and may enrich the content of the labeling.
It should be noted that, when the data labeling apparatus provided in the embodiment described above performs data labeling, the division of functional modules is merely used as an example for illustration. In practical applications, the functions may be allocated to different functional modules for completion according to requirements, i.e., an internal structure of the device is divided into different functional modules to complete all or some of the functions described above. In addition, the data labeling apparatus provided in the embodiments described above and the data labeling method embodiment belong to the same concept, and for a specific implementation process thereof, reference can be made to the method embodiment, which will not be repeatedly described here.
FIG. 8 is a structural block diagram of a data labeling terminal 800 according to an exemplary embodiment. The terminal 800 may be a notebook computer, a desktop computer, etc.
Generally, the terminal 800 comprises: a processor 801 and a memory 802.
The processor 801 may comprise one or more processing cores, such as a 4-core processor, an 8-core processor, etc. The processor 801 may be implemented in the form of at least one of the following hardware: a digital signal processor (DSP) , a field-programmable gate array (FPGA) , and a programmable logic array (PLA) . The processor 801 may also comprise a main processor and a coprocessor. The main processor is a processor for processing data in a wake-up state and is also referred to as a central processing unit (CPU) . The coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 801 may be integrated with a graphics processing unit (GPU) , and the GPU is configured to render and draw content that needs to be displayed on a display screen. In some embodiments, the processor 801 may further comprise an artificial intelligence (AI) processor, and the AI processor is configured to process computing operations related to machine learning.
The memory 802 may comprise one or more computer-readable storage media which may be non-transitory. The memory 802 may further comprise a high-speed random access memory and a non-volatile memory, such as one or more disk storage devices and flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 802 is configured to store at least one instruction, wherein the at least one instruction is configured to be executed by the processor 801 to implement the data labeling method provided in the method embodiment of the present application.
In some embodiments, the terminal 800 further optionally comprises: a peripheral device interface 803 and at least one peripheral device. The processor 801, the memory 802, and the peripheral device interface 803 may be connected by means of a bus or a signal line. Each peripheral device may be connected to the peripheral device interface 803 by means of a bus, a signal line, or a circuit board. Specifically, the peripheral device comprises: at least one of a radio frequency circuit 804, a display screen 805, a camera assembly 806, an audio circuit 807, a positioning assembly 808, and a power supply 809.
The peripheral device interface 803 may be configured to connect at least one peripheral device related to input/output (I/O) to the processor 801 and the memory 802. In some embodiments, the processor 801, the memory 802, and the peripheral device interface 803 are integrated on the same chip or circuit board. In some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral device interface 803 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 804 is configured to receive and transmit radio frequency (RF) signals which are also referred to as electromagnetic signals. The radio frequency circuit 804 communicates with a communication network and other communication devices through electromagnetic signals. The radio frequency circuit 804 converts an electrical signal into an electromagnetic signal for sending, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 804 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chip group, a user identity module card, etc. The radio frequency circuit 804 can communicate with other terminals by means of at least one wireless communication protocol. The wireless communication protocol includes but is not limited to: the world wide web, the metropolitan area network, the Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G) , wireless local area networks and/or Wireless Fidelity (WiFi) networks. In some embodiments, the radio frequency circuit 804 may further comprise a circuit related to near field communication (NFC) , which is not limited in the present application.
The display screen 805 is configured to display a user interface (UI) . The UI may comprise a graphic, a text, an icon, a video, and any combination thereof. When the display screen 805 is a touch display screen, the display screen 805 also has the ability to collect touch signals on or above the surface of the display screen 805. The touch signal may be input to the processor 801 as a control signal for processing. In this case, the display screen 805 can be further configured to provide virtual buttons and/or virtual keyboards which are also referred to as soft buttons and/or soft keyboards. In some embodiments, there may be one display screen 805 arranged on the front panel of the terminal 800. In other embodiments, there may be at least two display screens 805 which are separately arranged on different surfaces of the terminal 800 or have a folding design. In still other embodiments, the display screen 805 may be a flexible display screen which is arranged on the curved surface or the folding surface of the terminal 800. The display screen 805 can even be arranged in a non-rectangular irregular shape, i.e., a profiled screen. The display screen 805 may be manufactured by using materials such as a liquid crystal display (LCD) and an organic light-emitting diode (OLED) . It should be noted that in this embodiment of the present application, when the terminal 800 is a landscape terminal, the aspect ratio of the display screen of the terminal 800 is greater than 1, for example, 16 : 9 or 4 : 3. When the terminal 800 is a portrait terminal, the aspect ratio of the display screen of the terminal 800 is less than 1, for example, 9 : 18 or 3 : 4.
The camera assembly 806 is configured to collect images or videos. Optionally, the camera assembly 806 comprises a front camera and a rear camera. Generally, the front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back surface of the terminal. In some embodiments, there are at least two rear cameras which are separately any one of a main camera, a depth-of-field camera, a wide-angle camera, and a long-focus camera, so as to fuse the main camera and the depth-of-field camera to realize a bokeh function, and fuse the main camera and the wide-angle camera to realize a panoramic shooting and virtual reality (VR) shooting function or other fusion shooting functions. In some embodiments, the camera assembly 806 may further comprise a flashlight. The flashlight may be a single-color temperature flashlight or a dual-color temperature flashlight. The dual-color temperature flashlight refers to a combination of a warm light flashlight and a cold light flashlight, which can be used for light compensation at different color temperatures.
The audio circuit 807 may comprise a microphone and a speaker. The microphone is configured to collect sound waves of a user and an environment, and convert the sound waves into electrical signals and input them to the processor 801 for processing, or input them to the radio frequency circuit 804 to implement voice communication. For the purpose of stereophony collection or noise reduction, there may be a plurality of microphones which are separately arranged at different parts of the terminal 800. The microphone can further be an array microphone or an omnidirectional acquisition microphone. The speaker is configured to convert electrical signal from the processor 801 or the radio frequency circuit 804 into sound waves. The speaker may be a traditional thin film speaker or a piezoelectric ceramic speaker. When the speaker is the piezoelectric ceramic speaker, electrical signals can not only be converted into sound waves audible to humans, but also be converted into sound waves inaudible to humans for the purpose of ranging. In some embodiments, the audio circuit 807 may further comprise a headphone jack.
The positioning assembly 808 is configured to locate a current geographic location of the terminal 800 to implement navigation or a location based service (LBS) . The positioning assembly 808 may be a positioning assembly based on the global positioning system (GPS) of the United States, the Beidou system of China, or the Galileo system of the European Union.
The power supply 809 is configured to supply power to various assemblies in the terminal 800. The power supply 809 may be an alternating current battery, a direct current battery, a disposable battery, or a rechargeable battery. When the power supply 809 comprises the rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged by means of a wired line, and the wireless rechargeable battery is a battery charged by means of a wireless coil. The rechargeable battery can be further configured to support fast charging technologies.
In some embodiments, the terminal 800 further comprises one or more sensors 810. The one or more sensors 810 include but are not limited to: an acceleration sensor 811, a gyroscope sensor 812, a pressure sensor 813, a fingerprint sensor 814, an optical sensor 815, and a proximity sensor 816.
The acceleration sensor 811 can detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with the terminal 800. For example, the acceleration sensor 811 can be configured to detect components of gravitational acceleration on the three coordinate axes. The processor 801 may control the display screen 805 to display the user interface in a landscape view or a portrait view according to a gravity acceleration signal collected by the acceleration sensor 811. The acceleration sensor 811 can be further configured to collect game data or motion data of a user.
The gyroscope sensor 812 can detect a body direction and a rotation angle of the terminal 800, and the gyroscope sensor 812 can cooperate with the acceleration sensor 811 to collect a 3D action of a user to the terminal 800. The processor 801 can realize the following functions according to data collected by the gyroscope sensor 812: action sensing (for example, changing the UI based on a tilt operation of the user) , image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 813 may be arranged at a side frame of the terminal 800 and/or a lower layer of the display screen 805. When the pressure sensor 813 is arranged at the side frame of the terminal 800, the pressure sensor can detect a grip signal of a user to the terminal 800, and the processor 801 performs left and right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 813. When the pressure sensor 813 is arranged at the lower layer of the display screen 805, the processor 801 controls an operability control on the UI interface according to a pressure operation of the user to the display screen 805. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 814 is configured to collect a fingerprint of a user, and the processor 801 recognizes a user identity according to the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 recognizes the user identity according to the collected fingerprint. When the user identity is recognized as a trusted identity, the processor 801 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 814 may be arranged on the front surface, back surface, or side surface of the terminal 800. When a physical button or a manufacturer logo is arranged on the terminal 800, the fingerprint sensor 814 may be integrated with the physical button or manufacturer logo.
The optical sensor 815 is configured to collect ambient light intensity. In one embodiment, the processor 801 may control display luminance of the display screen 805 according to the ambient light intensity collected by the optical sensor 815. Specifically, when the ambient light intensity is high, the display luminance of the display screen 805 is turned up; and when the ambient light intensity is low, the display luminance of the display screen 805 is turned down. In another embodiment, the processor 801 can further dynamically adjust a shooting parameter of the camera assembly 806 according to the ambient light intensity collected by the optical sensor 815.
The proximity sensor 816, also referred to as a distance sensor, is usually arranged on the front panel of the terminal 800. The proximity sensor 816 is configured to collect the distance between the user and the front surface of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually decreases, the processor 801 controls the display screen 805 to switch from a screen-on state to a screen-off state; when the proximity sensor 816 detects that the distance gradually increases, the processor 801 controls the display screen 805 to switch from the screen-off state to the screen-on state.
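The following sketch illustrates the screen-state switching described above using two assumed distance thresholds forming a hysteresis band; the threshold values are not taken from the disclosure.

```python
# Hedged sketch: toggles the screen state from a proximity (distance) reading.
# NEAR_CM and FAR_CM are assumed values chosen only for this example.

NEAR_CM = 3.0   # below this distance the screen switches off
FAR_CM = 6.0    # above this distance the screen switches back on

def next_screen_state(current: str, distance_cm: float) -> str:
    """Return 'on' or 'off' given the current state and the measured distance."""
    if current == "on" and distance_cm < NEAR_CM:
        return "off"    # the user is approaching the front surface
    if current == "off" and distance_cm > FAR_CM:
        return "on"     # the user has moved away again
    return current       # inside the hysteresis band: keep the current state

if __name__ == "__main__":
    state = "on"
    for distance in (10.0, 4.0, 2.0, 5.0, 8.0):
        state = next_screen_state(state, distance)
        print(f"{distance:4.1f} cm -> screen {state}")
```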
That is, the embodiments of the present application provide a terminal comprising a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to carry out the data labeling method shown in FIG. 2. In addition, the embodiments of the present application further provide a computer-readable storage medium storing a computer program that, when executed by a processor, implements the data labeling method shown in FIG. 2.
The embodiments of the present application further provide a computer program product containing instructions which, when the computer program product runs on a computer, enable the computer to carry out the data labeling method shown in FIG. 2.
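For readability, the following sketch outlines, in Python, the kind of time-stamp matching that the data labeling method shown in FIG. 2 relies on: labeling information carrying a target labeling type and a target time is attached to the piece(s) of environmental state data whose collection time stamp equals the target time. The data structures and function names are assumptions of this example, not the vehicle-mounted terminal's actual interfaces.

```python
# A minimal sketch, assuming simple in-memory structures; it is not the
# vehicle-mounted terminal's implementation.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EnvironmentalStateData:
    collection_time: float            # time stamp assigned at collection
    payload: Dict                     # raw sensor content
    labels: List[Dict] = field(default_factory=list)

def label_data(received: List[EnvironmentalStateData],
               target_labeling_type: str,
               target_time: float) -> None:
    """Attach labeling information to every piece of data whose collection
    time matches the target time at which the labeling type was determined."""
    for item in received:
        if item.collection_time == target_time:
            item.labels.append({"type": target_labeling_type,
                                "time": target_time})

if __name__ == "__main__":
    buffer = [EnvironmentalStateData(100.0, {"camera": "frame_a"}),
              EnvironmentalStateData(100.0, {"lidar": "scan_a"}),
              EnvironmentalStateData(101.0, {"camera": "frame_b"})]
    label_data(buffer, target_labeling_type="abnormal_braking", target_time=100.0)
    for item in buffer:
        print(item.collection_time, item.labels)
```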
A person of ordinary skill in the art may understand that all or some of the blocks for implementing the embodiments described above may be implemented by hardware, or by a program instructing related hardware. The program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above embodiments are merely optional embodiments of the present application and are not intended to limit the present application. Any modifications, equivalent replacements, improvements, etc. made within the spirit and principles of the present application shall fall within the scope of protection of the present application.
Claims (17)
- A data labeling method, executed by a vehicle-mounted terminal of an autonomous vehicle, the method comprising:
  receiving, by the vehicle mounted terminal, environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp indicates a collection time of the environmental state data;
  determining, by the vehicle mounted terminal, a target labeling type and a target time for target environmental state data included in the collected environmental state data, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and wherein the target time refers to a time at which the target labeling type is determined; and
  generating, by the vehicle mounted terminal, labeling information for the target environmental state data, wherein the labeling information includes the target labeling type and target time determined by the vehicle mounted terminal and wherein the determined target time is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
- The method of claim 1, wherein determining the target labeling type comprises receiving, by the vehicle mounted terminal, a user selection of a target labeling type.
- The method of claim 2, comprising:
  performing recognition, by the vehicle mounted terminal, on the collected environmental state data using a neural network;
  displaying, by the vehicle mounted terminal, a proposed target labeling type for the target environmental state data based on a result of the recognition; and
  receiving, by the vehicle mounted terminal, a user selection either confirming the proposed target labeling type or selecting an alternative target labeling type.
- The method of any one of the above claims, wherein, in response to receiving a user input commencing user selection of a target labeling type, the vehicle-mounted terminal suspends the target time so that the target time is the same as the collection time of the target environmental state data being labeled.
- The method of any one of the above claims wherein the vehicle mounted terminal receives a plurality of pieces of environmental state data and corresponding time stamps indicating collection times of the plurality of pieces of environmental state data, and wherein the vehicle mounted terminal applies the same labeling information to a plurality of pieces of environmental state data which have a same collection time.
- The method according to claim 1, wherein said determining a target labeling type and a target time comprises:
  displaying a plurality of labeling options, wherein each of the plurality of labeling options is used to indicate a labeling type;
  receiving a labeling instruction triggered for a target labeling option in the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; and
  determining the labeling type carried by the labeling instruction as the target labeling type, and determining a receiving time of the labeling instruction as the target time.
- The method according to claim 6, wherein after said receiving environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, the method further comprises:
  performing recognition on the received environmental state data to obtain a plurality of recognition results;
  fusing the plurality of recognition results; and
  displaying the fused recognition results, wherein the fused recognition results are indicative of data content included in the collected environmental state data.
- The method according to any of the above claims, wherein said determining a target labeling type and a target time comprises:
  detecting an abnormal signal indicating that a running state of the autonomous vehicle is abnormal; and
  in response to detecting the abnormal signal, determining, as the target labeling type, an abnormality type which indicates a data abnormality, and setting a detection time of the abnormal signal as the target time.
- The method according to any of the above claims, wherein the method further comprises:
  displaying the labeling information generated according to the target labeling type;
  receiving a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type; and
  modifying the target labeling type comprised in the labeling information to the specified labeling type.
- A data labeling apparatus suitable for use as a vehicle-mounted terminal of an autonomous vehicle, the data labeling apparatus comprising:
  a first receiving module configured to receive environmental state data collected by a vehicle-mounted sensor of the autonomous vehicle in real time and a time stamp corresponding to the environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
  a determination module configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the autonomous vehicle is located, and the target time refers to a time at which the target labeling type is determined; and
  a generation module configured to generate, according to the target labeling type, labeling information for target environmental state data included in the received environmental state data, wherein the labeling information includes the target labeling type and target time determined by the determination module and wherein the determination module is configured to determine a target time which is the same as the collection time indicated by the time stamp corresponding to the target environmental state data.
- The apparatus according to claim 10, wherein the determination module is configured to:
  display a plurality of labeling options, wherein each of the plurality of labeling options indicates a labeling type;
  receive a labeling instruction triggered for a target labeling option among the plurality of labeling options, wherein the labeling instruction carries a labeling type indicated by the target labeling option; and
  determine the labeling type carried by the labeling instruction as the target labeling type, and determine a receiving time of the labeling instruction as the target time.
- The apparatus according to claim 11, further comprising:
  a recognition module configured to recognize the received environmental state data;
  a fusion module configured to fuse a plurality of recognition results after recognition is performed on the environmental state data; and
  a first display module configured to display the fused recognition results, wherein the fused recognition results indicate a data content included in the environmental state data.
- The apparatus according to claim 10, wherein the determination module is configured to:
  determine, as the target labeling type, an abnormality type used to indicate a data abnormality in response to detecting an abnormal signal indicating that a running state of the autonomous vehicle is abnormal; and
  set a detection time of the abnormal signal as the target time.
- The apparatus according to any of claims 10 to 13, wherein the labeling information comprises the target labeling type and the target time; and the apparatus further comprises:
  a second display module configured to display the labeling information;
  a second receiving module configured to receive a modification instruction used for modifying the labeling information, wherein the modification instruction carries a specified labeling type; and
  a modification module configured to modify the target labeling type comprised in the labeling information to the specified labeling type.
- A non-transitory computer-readable storage medium storing computer readable instructions that are executable by a processor to perform the method according to any of claims 1 to 9.
- A data labeling method, being applied to a vehicle-mounted terminal of an automatic driving vehicle and comprising:
  receiving environmental state data collected by a vehicle-mounted sensor of the automatic driving vehicle in real time and a time stamp corresponding to each piece of environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
  determining a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the automatic driving vehicle is located, and the target time refers to a time at which the target labeling type is determined; and
  generating, according to the target labeling type, labeling information for target environmental state data in the received environmental state data, wherein a collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
- A data labeling apparatus, being applied to a vehicle-mounted terminal of an automatic driving vehicle and comprising:
  a first receiving module configured to receive environmental state data collected by a vehicle-mounted sensor of the automatic driving vehicle in real time and a time stamp corresponding to each piece of environmental state data, wherein the time stamp is used to indicate a collection time of the corresponding environmental state data;
  a determination module configured to determine a target labeling type and a target time, wherein the target labeling type is determined according to an environment in which the automatic driving vehicle is located, and the target time refers to a time at which the target labeling type is determined; and
  a generation module configured to generate, according to the target labeling type, labeling information for target environmental state data in the received environmental state data, wherein a collection time indicated by a time stamp corresponding to the target environmental state data is the same time as the target time.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911268162.2A CN111125442B (en) | 2019-12-11 | 2019-12-11 | Data labeling method and device |
CN201911268162.2 | 2019-12-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021114608A1 (en) | 2021-06-17 |
Family
ID=70498624
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/098210 WO2021114608A1 (en) | 2019-12-11 | 2020-06-24 | Data labeling method and apparatus |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111125442B (en) |
WO (1) | WO2021114608A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111125442B (en) * | 2019-12-11 | 2022-11-15 | 苏州智加科技有限公司 | Data labeling method and device |
CN114755673A (en) * | 2020-12-25 | 2022-07-15 | 欧特明电子股份有限公司 | Multi-sensor automatic driving system |
CN113435498A (en) * | 2021-06-25 | 2021-09-24 | 上海商汤临港智能科技有限公司 | Data labeling method, device, equipment and storage medium |
CN113392804B (en) * | 2021-07-02 | 2022-08-16 | 昆明理工大学 | Multi-angle-based traffic police target data set scene construction method and system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106503653A (en) * | 2016-10-21 | 2017-03-15 | 深圳地平线机器人科技有限公司 | Area marking method, device and electronic equipment |
CN107093210A (en) * | 2017-04-20 | 2017-08-25 | 北京图森未来科技有限公司 | A kind of laser point cloud mask method and device |
CN107483911A (en) * | 2017-08-25 | 2017-12-15 | 秦山 | A kind of signal processing method and system based on more mesh imaging sensors |
US20180157920A1 (en) * | 2016-12-01 | 2018-06-07 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for recognizing obstacle of vehicle |
US20190317507A1 (en) * | 2018-04-13 | 2019-10-17 | Baidu Usa Llc | Automatic data labelling for autonomous driving vehicles |
CN111125442A (en) * | 2019-12-11 | 2020-05-08 | 苏州智加科技有限公司 | Data labeling method and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11328219B2 (en) * | 2018-04-12 | 2022-05-10 | Baidu Usa Llc | System and method for training a machine learning model deployed on a simulation platform |
CN110148294B (en) * | 2018-06-07 | 2021-08-03 | 腾讯大地通途(北京)科技有限公司 | Road condition state determining method and device |
CN109358614A (en) * | 2018-08-30 | 2019-02-19 | 深圳市易成自动驾驶技术有限公司 | Automatic Pilot method, system, device and readable storage medium storing program for executing |
- 2019-12-11: CN CN201911268162.2A patent/CN111125442B/en active Active
- 2020-06-24: WO PCT/CN2020/098210 patent/WO2021114608A1/en active Application Filing
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11734363B2 (en) | 2018-07-31 | 2023-08-22 | Marvell Asia Pte, Ltd. | Storage edge controller with a metadata computational engine |
US11748418B2 (en) | 2018-07-31 | 2023-09-05 | Marvell Asia Pte, Ltd. | Storage aggregator controller with metadata computation control |
US20210183173A1 (en) * | 2019-12-13 | 2021-06-17 | Marvell Asia Pte Ltd. | Automotive Data Processing System with Efficient Generation and Exporting of Metadata |
CN114172915A (en) * | 2021-11-05 | 2022-03-11 | 中汽创智科技有限公司 | Message synchronization method, automatic driving system, storage medium and electronic equipment |
CN114172915B (en) * | 2021-11-05 | 2023-10-31 | 中汽创智科技有限公司 | Message synchronization method, automatic driving system, storage medium and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN111125442B (en) | 2022-11-15 |
CN111125442A (en) | 2020-05-08 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20898008; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20898008; Country of ref document: EP; Kind code of ref document: A1