CN112967359B - Data labeling method, device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN112967359B
CN112967359B (application CN202110340495.2A)
Authority
CN
China
Prior art keywords
data
picture
marked
frame
labeling
Prior art date
Legal status
Active
Application number
CN202110340495.2A
Other languages
Chinese (zh)
Other versions
CN112967359A (en)
Inventor
李炜
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp
Priority to CN202110340495.2A
Publication of CN112967359A
Application granted
Publication of CN112967359B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/60 - Editing figures and text; Combining figures or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application belongs to the technical field of artificial intelligence and provides a data labeling method, a device, terminal equipment and a storage medium. The method comprises the following steps: displaying a picture to be annotated; if a preset event for the picture to be annotated is detected, determining a data annotation start point and a data annotation end point according to the preset event; generating and displaying a data annotation frame according to the determined start point and end point; and, if a selection instruction for any data annotation frame is received, labeling the selected data annotation frame and generating annotation data corresponding to the picture according to the labeled frame. By listening for the preset event on the picture to be annotated, the method can reliably determine the annotation start and end points, automatically generate the corresponding data annotation frame, and improve the accuracy of labeling the selected frame.

Description

Data labeling method, device, terminal equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to a data labeling method, a device, a terminal device, and a storage medium.
Background
Training an AI algorithm model usually requires a large amount of usable data, so data labeling becomes a vital step after data acquisition. Picture classification and object detection are especially common labeling tasks. For object detection, the core early work is to annotate the labels and classes of objects in a large number of collected pictures and to generate the corresponding annotation data from those results for model training. Because the quality of the annotation data determines the accuracy of the trained model, data labeling receives increasing attention.
In the existing data labeling process, pictures are labeled entirely by hand, so the accuracy of the labeling is low, which in turn reduces the accuracy of the generated annotation data.
Disclosure of Invention
In view of this, embodiments of the present application provide a data labeling method, apparatus, terminal device, and storage medium, to solve the problem in the prior art that purely manual data labeling yields low labeling accuracy.
A first aspect of an embodiment of the present application provides a data labeling method, including:
displaying a picture to be annotated;
if a preset event for the picture to be annotated is detected, determining a data annotation start point and a data annotation end point according to the preset event;
generating and displaying a data annotation frame according to the determined data annotation start point and data annotation end point;
and, if a selection instruction for any data annotation frame is received, labeling the selected data annotation frame and generating annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
Further, if a preset event for the picture to be annotated is detected, determining a data annotation start point and a data annotation end point according to the preset event includes:
if a click event for the picture to be annotated is detected, obtaining the coordinate point to which the click event points on the picture to obtain the data annotation start point, where the click event is used to select a coordinate point on the picture, and then listening for a release event on the picture, where the release event is used to deselect a coordinate point on the picture;
and, if the release event for the picture to be annotated is detected, obtaining the coordinate point deselected by the release event on the picture to obtain the data annotation end point.
Further, after obtaining the coordinate point to which the click event points on the picture to be annotated, the method further includes:
listening for a move event on the picture to be annotated, where the move event is used to move a coordinate point on the picture;
if the move event for the picture to be annotated is detected, obtaining the coordinate point to which the move event points on the picture to obtain an annotation moving point;
and generating a candidate annotation frame according to the annotation moving point and the data annotation start point.
Further, the generating and displaying a data annotation frame according to the determined data annotation start point and data annotation end point includes:
obtaining the annotation moving point corresponding to the data annotation end point, and setting the candidate annotation frame corresponding to the obtained annotation moving point as the data annotation frame.
Further, after generating and displaying the data annotation frame according to the determined data annotation start point and data annotation end point, the method further includes:
obtaining the annotation region of the data annotation frame on the picture to be annotated, and performing region-overlap detection between the obtained annotation region and the annotation regions corresponding to the displayed data annotation frames;
and, if an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, deleting the data annotation frame corresponding to the obtained annotation region.
Further, after it is determined that an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, the method further includes:
if the number of overlapping regions is smaller than a first number threshold and the area of the overlapping regions is larger than an area threshold, issuing an annotation-frame replacement prompt;
and, if a confirmation instruction for the annotation-frame replacement prompt is received, deleting the displayed data annotation frame corresponding to the overlapping region and displaying the generated data annotation frame.
Further, the generating annotation data corresponding to the picture to be annotated according to the labeled data annotation frame includes:
obtaining the center-point coordinates, the picture height, the picture width and preset origin coordinates of the picture to be annotated, and obtaining the label information to which the label points;
calculating a coordinate conversion rule according to the obtained center-point coordinates, picture height, picture width and preset origin coordinates;
obtaining the display coordinates of the data annotation frame on the picture to be annotated, and converting the display coordinates according to the coordinate conversion rule;
and storing the converted display coordinates and the label information of the same data annotation frame in correspondence to obtain the annotation data.
A second aspect of an embodiment of the present application provides a data labeling apparatus, including:
a picture display unit, configured to display a picture to be annotated;
an event listening unit, configured to, if a preset event for the picture to be annotated is detected, determine a data annotation start point and a data annotation end point according to the preset event;
an annotation frame generation unit, configured to generate and display a data annotation frame according to the determined data annotation start point and data annotation end point;
and an annotation data generation unit, configured to, if a selection instruction for any data annotation frame is received, label the selected data annotation frame and generate annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
A third aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the data labeling method provided by the first aspect when executing the computer program.
A fourth aspect of the embodiments of the present application provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the data labeling method provided by the first aspect.
The data labeling method, apparatus, terminal device and storage medium provided by the embodiments of the application have the following beneficial effects:
By listening for the preset event on the picture to be annotated, the data annotation start point and end point on the picture can be determined reliably, the corresponding data annotation frame can be generated automatically from those two points, and the generated frame is displayed so that the user can review it conveniently. If a selection instruction for any data annotation frame is received, the selected frame is labeled, which improves the accuracy of labeling, avoids the low accuracy of purely manual labeling, and therefore improves the accuracy of the generated annotation data.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of an implementation of a data labeling method provided in an embodiment of the present application;
Fig. 2 is a flowchart of an implementation of a data labeling method provided in another embodiment of the present application;
Fig. 3 is a flowchart of an implementation of a data labeling method provided in another embodiment of the present application;
Fig. 4 is a schematic diagram of the picture A to be annotated provided in the embodiment of Fig. 3;
Fig. 5 is a block diagram of a data labeling apparatus provided in an embodiment of the present application;
Fig. 6 is a block diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The data labeling method according to the embodiment of the present application may be performed by a control device or a terminal (hereinafter referred to as a "mobile terminal").
Referring to Fig. 1, Fig. 1 shows a flowchart of an implementation of a data labeling method provided in an embodiment of the present application, including:
and step S10, displaying the picture to be marked.
The picture to be marked can be displayed on any application with a picture display function, and preferably, in the step, the picture to be marked is displayed on a canvas (canvas) tool.
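The patent does not prescribe a concrete display implementation; the following is a minimal sketch, assuming a browser environment, of drawing the picture to be annotated onto an HTML canvas. The element id "annotation-canvas" and the function name are illustrative assumptions.

```typescript
// Minimal sketch (assumed browser environment): draw the picture to be annotated
// onto an HTML <canvas>. The element id and function name are illustrative.
function showPictureToAnnotate(imageUrl: string): Promise<CanvasRenderingContext2D> {
  return new Promise((resolve, reject) => {
    const canvas = document.getElementById("annotation-canvas") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d");
    if (!ctx) {
      reject(new Error("2D canvas context unavailable"));
      return;
    }
    const img = new Image();
    img.onload = () => {
      // Size the canvas to the picture and draw the picture at the origin.
      canvas.width = img.width;
      canvas.height = img.height;
      ctx.drawImage(img, 0, 0);
      resolve(ctx);
    };
    img.onerror = () => reject(new Error(`failed to load ${imageUrl}`));
    img.src = imageUrl;
  });
}
```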
Specifically, in this step, a zoom instruction or a picture deletion instruction from the user may also be received: the displayed picture can be zoomed in or out according to the zoom instruction, or a picture region on the displayed picture can be deleted according to the deletion instruction.
Further, in this step, the picture size of the picture to be annotated is obtained and compared with a preset size, and the picture is scaled automatically according to the comparison result, so that the display size of the picture is set automatically according to the user's size requirement; one possible scaling rule is sketched below.
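The patent leaves the scaling rule open; the following sketch assumes the picture is simply fitted inside the preset display size while preserving its aspect ratio, and the shrink-only policy and interface names are assumptions.

```typescript
// Sketch of one possible automatic-scaling rule: fit the picture inside a preset
// display size while preserving its aspect ratio. Shrink-only is an assumed policy.
interface Size { width: number; height: number; }

function autoScaleFactor(picture: Size, preset: Size): number {
  const fit = Math.min(preset.width / picture.width, preset.height / picture.height);
  return Math.min(fit, 1); // never enlarge pictures that already fit the preset size
}

// Example: a 4000 x 3000 picture shown against a 1280 x 720 preset is scaled by 0.24.
const factor = autoScaleFactor({ width: 4000, height: 3000 }, { width: 1280, height: 720 });
```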
Further, in this step, if a picture move instruction for the picture to be annotated is received, the region to be moved in the picture is determined according to the instruction, and that region is moved according to the movement information carried by the instruction, where the movement information includes a movement direction and a movement distance.
Step S20: if a preset event for the picture to be annotated is detected, determine a data annotation start point and a data annotation end point according to the preset event.
The preset event can be configured as required and is used to trigger the determination of the data annotation start point and end point on the picture to be annotated.
For example, when a click operation, a voice command or a touch command for the picture to be annotated is detected, the data annotation start point and end point on the picture are determined according to that event.
Step S30: generate and display a data annotation frame according to the determined data annotation start point and data annotation end point.
The coordinate points of the data annotation start point and end point on the picture to be annotated are obtained to give the start-point and end-point coordinates; the distances between them give the length and width of the annotation frame, and a preset shape is drawn from the start point towards the end point with that length and width to obtain the data annotation frame.
Specifically, in this step, the preset shape can be configured as required, for example as a rectangle, an ellipse, a triangle or any polygon; the sketch below illustrates the rectangular case.
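The following is a sketch of the rectangular case under the assumption of axis-aligned rectangles drawn on a 2D canvas; the Point/Frame shapes and function names are illustrative, not part of the patent.

```typescript
// Sketch for the rectangular case: build an axis-aligned data annotation frame from
// the annotation start point and end point, then stroke it on the canvas.
interface Point { x: number; y: number; }
interface Frame { x: number; y: number; width: number; height: number; }

function buildAnnotationFrame(start: Point, end: Point): Frame {
  // min()/abs() normalize the two points, so the user may drag in any direction.
  return {
    x: Math.min(start.x, end.x),
    y: Math.min(start.y, end.y),
    width: Math.abs(end.x - start.x),
    height: Math.abs(end.y - start.y),
  };
}

function drawFrame(ctx: CanvasRenderingContext2D, frame: Frame): void {
  ctx.strokeStyle = "red"; // illustrative styling
  ctx.lineWidth = 2;
  ctx.strokeRect(frame.x, frame.y, frame.width, frame.height);
}
```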
Step S40: if a selection instruction for any data annotation frame is received, label the selected data annotation frame, and generate annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
The selection instruction is used to select the corresponding data annotation frame and set it to the active state; at this time, all other data annotation frames on the picture are in the inactive state.
In this step, setting the selected data annotation frame to the active state effectively prevents labels from being attached to unselected frames by mistake, which further improves labeling accuracy; labeling attaches the intended label information to the data annotation frame that is currently in the active state.
Specifically, in this step, generating the annotation data corresponding to the picture to be annotated according to the labeled data annotation frame includes:
obtaining the center-point coordinates, the picture height, the picture width and preset origin coordinates of the picture to be annotated, and obtaining the label information to which the label points, where the center-point coordinates are the coordinates of the center of the displayed picture and the preset origin coordinates can be configured as required; in this step the upper-left corner of the picture is set as the preset origin (0, 0), the center-point coordinates are (x1, y1), the obtained picture height is picH, the picture width is picW, and the label information contains preset object classes such as pedestrian, vehicle, animal or building;
calculating a coordinate conversion rule according to the obtained center-point coordinates, picture height, picture width and preset origin coordinates, where the picture displayed on the canvas tool uses a coordinate system whose origin is the picture center while data annotation uses a coordinate system whose origin is the top-left vertex of the picture, so the coordinate conversion rule allows the coordinates of the data annotation frame to be converted correctly and prevents coordinate errors in the generated annotation data;
obtaining the display coordinates of the data annotation frame on the picture to be annotated and converting them according to the coordinate conversion rule, where, for a display coordinate point (x, y) measured from the picture center, the converted coordinates (x2, y2) are given by x2 = x + picW / 2 and y2 = picH / 2 - y;
and storing the converted display coordinates and the label information of the same data annotation frame in correspondence to obtain the annotation data; a conversion sketch follows below.
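A sketch of the conversion and storage step, following the relationship x2 = x + picW / 2, y2 = picH / 2 - y stated above; the record shape, id field and helper names are assumptions for illustration.

```typescript
// Sketch of the coordinate conversion and storage: canvas coordinates are measured
// from the picture centre, annotation data uses the top-left corner as origin (0, 0).
// The record shape, the id field and the helper names are illustrative assumptions.
interface CenterCoords { x: number; y: number; } // measured from the picture centre

interface AnnotationRecord {
  id: string;
  label: string;     // e.g. "pedestrian", "vehicle"
  x: number;         // top-left-origin coordinates
  y: number;
}

function convertToTopLeftOrigin(p: CenterCoords, picW: number, picH: number) {
  // Relationship stated in the description: x2 = x + picW / 2, y2 = picH / 2 - y
  return { x: p.x + picW / 2, y: picH / 2 - p.y };
}

function buildRecord(id: string, label: string, corner: CenterCoords,
                     picW: number, picH: number): AnnotationRecord {
  const converted = convertToTopLeftOrigin(corner, picW, picH);
  return { id, label, x: converted.x, y: converted.y };
}
```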
Further, in this step, after the selected data annotation frame has been labeled, the method further includes:
obtaining the number of labeled data annotation frames on the picture to be annotated;
if this number is greater than or equal to a second number threshold, issuing a stop-annotation prompt for the picture, where the second number threshold can be configured as required and is used to detect whether too many labeled data annotation frames already exist on the picture; when the number of labeled frames reaches the threshold, it is judged that there are too many labeled frames, and the stop-annotation prompt is issued to remind the user to stop generating data annotation frames and/or attaching labels;
specifically, in this step, when the labeling of a data annotation frame is completed, a unique id of the form feature-timestamp is assigned to the labeled frame and the mark count is incremented to m + 1; when any labeled data annotation frame is deleted, the corresponding database record is removed according to the id of the deleted frame and the mark count is decremented to m - 1. The sketch below illustrates this bookkeeping.
In this embodiment, by listening for the preset event on the picture to be annotated, the data annotation start point and end point can be determined reliably, the corresponding data annotation frame can be generated automatically from those two points, and the generated frame is displayed so that the user can review it conveniently. If a selection instruction for any data annotation frame is received, the selected frame is labeled, which improves labeling accuracy, avoids the low accuracy of purely manual labeling, and improves the accuracy of the generated annotation data.
Referring to Fig. 2, Fig. 2 is a flowchart of an implementation of a data labeling method according to another embodiment of the present application. Compared with the embodiment corresponding to Fig. 1, the data labeling method provided in this embodiment further refines step S20 and includes:
and S21, if a click event aiming at the picture to be marked is monitored, acquiring a coordinate point pointed by the click event on the picture to be marked, obtaining the data marking starting point, and monitoring the lifting event of the picture to be marked.
The click event is used for selecting a coordinate point on the picture to be marked, and the lifting event is used for reversely selecting the coordinate point on the picture to be marked.
Specifically, in the step, by monitoring a click event of a left button of a user mouse, whether the image to be marked triggers the acquisition of a data marking starting point is judged, when the click event of the left button of the user mouse is monitored, a coordinate point clicked by the user mouse on the image to be marked is acquired, the data marking starting point is obtained, and a lifting event of the user mouse is monitored.
Optionally, in this step, after the coordinate point to which the click event points has been obtained and the data annotation start point determined, the method further includes:
listening for a move event on the picture to be annotated, where the move event is used to move a coordinate point on the picture; in this step, move events are detected by monitoring the motion state of the user's mouse;
if the move event for the picture to be annotated is detected, obtaining in real time the coordinate point to which the move event points on the picture to obtain the annotation moving point;
and generating a candidate annotation frame according to the annotation moving point and the data annotation start point, which lets the user see the picture region currently framed; preferably, after the candidate frame is generated, the color or pattern of the region inside it is modified according to a preset fill color and/or pattern, which makes the framed region even easier to see.
Step S22: if the release event for the picture to be annotated is detected, obtain the coordinate point deselected by the release event on the picture to obtain the data annotation end point.
If the left mouse button is detected to switch from the pressed state to the released state, it is judged that a release event has occurred on the picture, and the coordinate point at which the button was released is obtained as the data annotation end point. A sketch of this mouse-event flow follows below.
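A sketch of the listener flow for steps S21 and S22, assuming standard DOM mouse events (mousedown, mousemove, mouseup) on the canvas; the function and callback names are illustrative.

```typescript
// Sketch of the listener flow in steps S21-S22 using standard DOM mouse events:
// mousedown gives the annotation start point, mousemove gives the moving point used
// for the candidate frame, mouseup gives the annotation end point.
interface Point { x: number; y: number; }

function attachAnnotationListeners(
  canvas: HTMLCanvasElement,
  onPreview: (start: Point, moving: Point) => void, // draw the candidate frame
  onFrame: (start: Point, end: Point) => void,      // promote it to a data frame
): void {
  let start: Point | null = null;

  const toCanvasPoint = (e: MouseEvent): Point => {
    const rect = canvas.getBoundingClientRect();
    return { x: e.clientX - rect.left, y: e.clientY - rect.top };
  };

  canvas.addEventListener("mousedown", (e) => {
    if (e.button === 0) start = toCanvasPoint(e);  // left button pressed: start point
  });
  canvas.addEventListener("mousemove", (e) => {
    if (start) onPreview(start, toCanvasPoint(e)); // moving point while dragging
  });
  canvas.addEventListener("mouseup", (e) => {
    if (start) {
      onFrame(start, toCanvasPoint(e));            // release: end point fixes the frame
      start = null;
    }
  });
}
```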
Further, in this embodiment, for step S30 in the embodiment of Fig. 1, generating and displaying a data annotation frame according to the determined data annotation start point and end point includes:
obtaining the annotation moving point corresponding to the data annotation end point, and setting the candidate annotation frame corresponding to that moving point as the data annotation frame; because the coordinate point of the release event is the data annotation end point, the candidate frame corresponding to that moving point matches the end point finally chosen by the user, so promoting it to the data annotation frame reflects the user's final selection.
In this embodiment, listening for a click event on the picture to be annotated determines whether acquisition of the data annotation start point is triggered; when the click event is detected, the coordinate point to which it points is obtained as the data annotation start point, and by then listening for the release event the coordinate point it deselects can be obtained as the data annotation end point.
Referring to Fig. 3, Fig. 3 is a flowchart of an implementation of a data labeling method according to another embodiment of the present application. Compared with the embodiment corresponding to Fig. 1, the data labeling method provided in this embodiment further refines the steps after step S30 and further includes:
and S50, acquiring a labeling area of the data labeling frame on the picture to be labeled, and performing area overlapping detection on the acquired labeling area and the labeling area corresponding to the displayed data labeling frame.
And detecting whether the currently generated data annotation frame and the displayed data annotation frame are overlapped or not by performing region overlapping detection on the obtained annotation region and the annotation region corresponding to the displayed data annotation frame.
For example, please refer to fig. 4, which is a schematic structural diagram of a to-be-marked picture a provided in this embodiment, where the to-be-marked picture a includes a currently generated data marking frame B and a displayed data marking frame C, a displayed data marking frame D, and a displayed data marking frame E, in this step, marking areas of the data marking frame B on the to-be-marked picture a are obtained respectively, marking areas corresponding to the displayed data marking frame C, the displayed data marking frame D, and the displayed data marking frame E are obtained, and the obtained marking areas B are detected with the marking areas C, D, and E in an overlapping manner.
Step S60: if an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, delete the data annotation frame corresponding to the obtained annotation region.
If such an overlapping region exists, it is judged that the currently generated data annotation frame overlaps a displayed one; deleting the newly generated frame prevents object-marking errors caused by overlapping annotation frames and so improves the accuracy of the generated annotation data. A sketch of this check follows below.
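A sketch of the overlap check in steps S50 and S60 for axis-aligned rectangular frames; the helper names are assumptions, and redrawing the canvas after a frame is rejected is omitted.

```typescript
// Sketch of the overlap check in steps S50-S60 for axis-aligned rectangular frames.
interface Frame { x: number; y: number; width: number; height: number; }

function overlaps(a: Frame, b: Frame): boolean {
  // Two axis-aligned rectangles overlap when they intersect on both axes.
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

function acceptNewFrame(newFrame: Frame, displayed: Frame[]): boolean {
  // The newly generated frame is deleted (rejected) if it overlaps any displayed frame.
  return !displayed.some((shown) => overlaps(newFrame, shown));
}
```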
Optionally, in this step, after it is determined that an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, the method further includes:
if the number of overlapping regions is smaller than a first number threshold and the area of the overlapping regions is larger than an area threshold, issuing an annotation-frame replacement prompt, where the first number threshold and the area threshold can be configured as required; when both conditions hold, it is judged that the overlap between the obtained annotation region and the displayed data annotation frames satisfies the preset prompt condition;
specifically, in this step, when the preset prompt condition is satisfied, the replacement prompt is issued to ask the user whether the displayed data annotation frame should be replaced by the currently generated one;
and, if a confirmation instruction for the replacement prompt is received, deleting the displayed data annotation frame corresponding to the overlapping region and displaying the generated data annotation frame, so that the newly generated frame replaces the displayed one, which makes it convenient for the user to replace annotation frames on the picture. The sketch below illustrates one possible replacement flow.
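A sketch of the optional replacement flow; the threshold values, the use of window.confirm as the prompt, and the helper names are all illustrative assumptions rather than details given in the patent.

```typescript
// Sketch of the optional replacement flow with assumed thresholds and prompt.
interface Frame { x: number; y: number; width: number; height: number; }

function overlapArea(a: Frame, b: Frame): number {
  const w = Math.min(a.x + a.width, b.x + b.width) - Math.max(a.x, b.x);
  const h = Math.min(a.y + a.height, b.y + b.height) - Math.max(a.y, b.y);
  return w > 0 && h > 0 ? w * h : 0;
}

function maybeReplaceOverlapped(
  newFrame: Frame,
  displayed: Frame[],
  countThreshold = 3,   // first number threshold (assumed value)
  areaThreshold = 400,  // area threshold in square pixels (assumed value)
): Frame[] {
  const overlapped = displayed.filter((f) => overlapArea(newFrame, f) > 0);
  const totalArea = overlapped.reduce((sum, f) => sum + overlapArea(newFrame, f), 0);

  if (overlapped.length > 0 && overlapped.length < countThreshold && totalArea > areaThreshold) {
    // A real UI would show a dedicated dialog; confirm() keeps the sketch self-contained.
    if (window.confirm("Replace the displayed annotation frame(s) with the new frame?")) {
      return [...displayed.filter((f) => !overlapped.includes(f)), newFrame];
    }
  }
  return displayed; // otherwise keep the existing frames unchanged
}
```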
In this embodiment, region-overlap detection between the obtained annotation region and the annotation regions of the displayed data annotation frames checks whether the currently generated frame overlaps a displayed one; if an overlapping region exists, deleting the newly generated frame prevents object-marking errors caused by overlapping annotation frames and improves the accuracy of the generated annotation data.
Referring to fig. 5, fig. 5 is a block diagram illustrating a data labeling apparatus 100 according to an embodiment of the present application. The data labeling device 100 in this embodiment includes units for executing the steps in the embodiments corresponding to fig. 1, 2, and 3. Refer specifically to fig. 1, fig. 2, fig. 3, and the related descriptions in the embodiments corresponding to fig. 1, fig. 2, fig. 3. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 5, comprising: a picture display unit 10, an event monitor unit 11, a annotation frame generation unit 12, and an annotation data generation unit 13, wherein:
and the picture display unit 10 is used for displaying the picture to be marked.
The event monitoring unit 11 is configured to determine a data annotation start point and a data annotation end point according to a preset event if the preset event for the picture to be annotated is monitored.
Wherein the event listening unit 11 is further configured to: if a click event aiming at the picture to be marked is monitored, a coordinate point pointed by the click event on the picture to be marked is obtained to obtain the data marking starting point, the click event is used for selecting the coordinate point on the picture to be marked, monitoring of a lifting event is carried out on the picture to be marked, and the lifting event is used for carrying out counter selection of the coordinate point on the picture to be marked;
and if the lifting event aiming at the picture to be marked is monitored, acquiring a coordinate point reversely selected by the lifting event on the picture to be marked, and obtaining the data marking end point.
Further, the event listening unit 11 is further configured to: monitoring a moving event for the picture to be marked, wherein the moving event is used for moving a coordinate point on the picture to be marked;
if the moving event aiming at the picture to be marked is monitored, acquiring a coordinate point pointed by the moving event on the picture to be marked, and obtaining a marked moving point;
and generating a candidate annotation frame according to the annotation moving point and the data annotation starting point.
The annotation frame generation unit 12 is configured to generate and display a data annotation frame according to the determined data annotation start point and data annotation end point.
The annotation frame generation unit 12 is further configured to: obtain the annotation moving point corresponding to the data annotation end point, and set the candidate annotation frame corresponding to the obtained annotation moving point as the data annotation frame.
The annotation data generation unit 13 is configured to, if a selection instruction for any data annotation frame is received, label the selected data annotation frame and generate annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
The annotation data generation unit 13 is further configured to: obtain the center-point coordinates, the picture height, the picture width and preset origin coordinates of the picture to be annotated, and obtain the label information to which the label points;
calculate a coordinate conversion rule according to the obtained center-point coordinates, picture height, picture width and preset origin coordinates;
obtain the display coordinates of the data annotation frame on the picture to be annotated, and convert the display coordinates according to the coordinate conversion rule;
and store the converted display coordinates and the label information of the same data annotation frame in correspondence to obtain the annotation data.
Optionally, the data labeling apparatus 100 further includes:
an overlap detection unit 14, configured to obtain the annotation region of the data annotation frame on the picture to be annotated and perform region-overlap detection between the obtained annotation region and the annotation regions corresponding to the displayed data annotation frames;
and, if an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, delete the data annotation frame corresponding to the obtained annotation region.
Optionally, the overlap detection unit 14 is further configured to: if the number of overlapping regions is smaller than a first number threshold and the area of the overlapping regions is larger than an area threshold, issue an annotation-frame replacement prompt;
and, if a confirmation instruction for the replacement prompt is received, delete the displayed data annotation frame corresponding to the overlapping region and display the generated data annotation frame.
In this embodiment, by listening for the preset event on the picture to be annotated, the data annotation start point and end point can be determined reliably, the corresponding data annotation frame can be generated automatically from those two points, and the generated frame is displayed so that the user can review it conveniently. If a selection instruction for any data annotation frame is received, the selected frame is labeled, which improves labeling accuracy, avoids the low accuracy of purely manual labeling, and improves the accuracy of the generated annotation data.
Fig. 6 is a block diagram of a terminal device 2 according to another embodiment of the present application. As shown in Fig. 6, the terminal device 2 of this embodiment includes a processor 20, a memory 21 and a computer program 22, such as a program implementing the data labeling method, stored in the memory 21 and executable on the processor 20. When the processor 20 executes the computer program 22, the steps of the embodiments of the data labeling method described above are implemented, such as S10 to S40 shown in Fig. 1, S21 to S22 shown in Fig. 2, or S50 to S60 shown in Fig. 3. Alternatively, when executing the computer program 22, the processor 20 may implement the functions of the units in the embodiment corresponding to Fig. 5, for example the functions of units 10 to 14 shown in Fig. 5; refer to the related description of that embodiment, which is not repeated here.
Illustratively, the computer program 22 may be divided into one or more units, which are stored in the memory 21 and executed by the processor 20 to complete the present application. The one or more units may be a series of computer program instruction segments capable of performing specific functions, used to describe the execution of the computer program 22 in the terminal device 2. For example, the computer program 22 may be divided into the picture display unit 10, the event listening unit 11, the annotation frame generation unit 12, the annotation data generation unit 13 and the overlap detection unit 14, each unit functioning as described above.
The terminal device may include, but is not limited to, the processor 20 and the memory 21. It will be appreciated by those skilled in the art that Fig. 6 is merely an example of the terminal device 2 and does not constitute a limitation of it; the terminal device may include more or fewer components than illustrated, combine certain components, or use different components, and may further include input/output devices, network access devices, buses and the like.
The processor 20 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 21 may be an internal storage unit of the terminal device 2, such as a hard disk or a memory of the terminal device 2. The memory 21 may be an external storage device of the terminal device 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 2. Further, the memory 21 may also include both an internal storage unit and an external storage device of the terminal device 2. The memory 21 is used for storing the computer program as well as other programs and data required by the terminal device. The memory 21 may also be used for temporarily storing data that has been output or is to be output.
An embodiment of the application also provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the data labeling method provided above.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (8)

1. A data labeling method, comprising:
displaying a picture to be annotated;
if a preset event for the picture to be annotated is detected, determining a data annotation start point and a data annotation end point according to the preset event;
generating and displaying a data annotation frame according to the determined data annotation start point and data annotation end point;
obtaining an annotation region of the data annotation frame on the picture to be annotated, and performing region-overlap detection between the obtained annotation region and annotation regions corresponding to displayed data annotation frames;
detecting whether an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, and, if the number of overlapping regions is smaller than a first number threshold and the area of the overlapping regions is larger than an area threshold, issuing an annotation-frame replacement prompt; if a confirmation instruction for the annotation-frame replacement prompt is received, deleting the displayed data annotation frame corresponding to the overlapping region and displaying the generated data annotation frame;
and, if a selection instruction for any data annotation frame is received, labeling the selected data annotation frame and generating annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
2. The data labeling method according to claim 1, wherein, if a preset event for the picture to be annotated is detected, determining a data annotation start point and a data annotation end point according to the preset event comprises:
if a click event for the picture to be annotated is detected, obtaining the coordinate point to which the click event points on the picture to obtain the data annotation start point, where the click event is used to select a coordinate point on the picture, and listening for a release event on the picture, where the release event is used to deselect a coordinate point on the picture;
and, if the release event for the picture to be annotated is detected, obtaining the coordinate point deselected by the release event on the picture to obtain the data annotation end point.
3. The data labeling method according to claim 2, wherein, after obtaining the coordinate point to which the click event points on the picture to be annotated and obtaining the data annotation start point, the method further comprises:
listening for a move event on the picture to be annotated, where the move event is used to move a coordinate point on the picture;
if the move event for the picture to be annotated is detected, obtaining the coordinate point to which the move event points on the picture to obtain an annotation moving point;
and generating a candidate annotation frame according to the annotation moving point and the data annotation start point.
4. The data labeling method according to claim 3, wherein the generating and displaying a data annotation frame according to the determined data annotation start point and data annotation end point comprises:
obtaining the annotation moving point corresponding to the data annotation end point, and setting the candidate annotation frame corresponding to the obtained annotation moving point as the data annotation frame.
5. The data labeling method according to claim 1, wherein the generating annotation data corresponding to the picture to be annotated according to the labeled data annotation frame comprises:
obtaining the center-point coordinates, the picture height, the picture width and preset origin coordinates of the picture to be annotated, and obtaining the label information to which the label points;
calculating a coordinate conversion rule according to the obtained center-point coordinates, picture height, picture width and preset origin coordinates;
obtaining display coordinates of the data annotation frame on the picture to be annotated, and converting the display coordinates according to the coordinate conversion rule;
and storing the converted display coordinates and the label information of the same data annotation frame in correspondence to obtain the annotation data.
6. A data labeling apparatus, comprising:
a picture display unit, configured to display a picture to be annotated;
an event listening unit, configured to, if a preset event for the picture to be annotated is detected, determine a data annotation start point and a data annotation end point according to the preset event;
an annotation frame generation unit, configured to generate and display a data annotation frame according to the determined data annotation start point and data annotation end point;
an overlap detection unit, configured to obtain an annotation region of the data annotation frame on the picture to be annotated and perform region-overlap detection between the obtained annotation region and annotation regions corresponding to displayed data annotation frames; detect whether an overlapping region exists between the obtained annotation region and an annotation region corresponding to a displayed data annotation frame, and, if the number of overlapping regions is smaller than a first number threshold and the area of the overlapping regions is larger than an area threshold, issue an annotation-frame replacement prompt; and, if a confirmation instruction for the annotation-frame replacement prompt is received, delete the displayed data annotation frame corresponding to the overlapping region and display the generated data annotation frame;
and an annotation data generation unit, configured to, if a selection instruction for any data annotation frame is received, label the selected data annotation frame and generate annotation data corresponding to the picture to be annotated according to the labeled data annotation frame.
7. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when the computer program is executed.
8. A storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
CN202110340495.2A 2021-03-30 2021-03-30 Data labeling method, device, terminal equipment and storage medium Active CN112967359B (en)

Priority Applications (1)

Application Number: CN202110340495.2A (granted as CN112967359B)
Priority Date: 2021-03-30
Filing Date: 2021-03-30
Title: Data labeling method, device, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202110340495.2A (granted as CN112967359B)
Priority Date: 2021-03-30
Filing Date: 2021-03-30
Title: Data labeling method, device, terminal equipment and storage medium

Publications (2)

CN112967359A (en), published 2021-06-15
CN112967359B (en), granted 2023-12-19

Family

ID=76279698

Family Applications (1)

Application Number: CN202110340495.2A
Title: Data labeling method, device, terminal equipment and storage medium
Priority Date: 2021-03-30
Filing Date: 2021-03-30
Status: Active (granted as CN112967359B)

Country Status (1)

Country Link
CN (1) CN112967359B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115570B (en) * 2023-10-25 2023-12-29 成都数联云算科技有限公司 Canvas-based image labeling method and Canvas-based image labeling system
CN117437382B (en) * 2023-12-19 2024-03-19 成都电科星拓科技有限公司 Updating method and system for data center component

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109492635A (en) * 2018-09-20 2019-03-19 第四范式(北京)技术有限公司 Obtain method, apparatus, equipment and the storage medium of labeled data
CN110865756A (en) * 2019-11-12 2020-03-06 苏州智加科技有限公司 Image labeling method, device, equipment and storage medium


Also Published As

CN112967359A (en), published 2021-06-15

Similar Documents

Publication Publication Date Title
TWI729331B (en) Image annotation information processing method, device, server and system
CN112967359B (en) Data labeling method, device, terminal equipment and storage medium
CN108520229B (en) Image detection method, image detection device, electronic equipment and computer readable medium
US8805081B2 (en) Object detection in an image
CN113094770B (en) Drawing generation method and device, computer equipment and storage medium
WO2020140608A1 (en) Image data processing method, apparatus, and computer readable storage medium
CN112508109B (en) Training method and device for image recognition model
CN109492686A (en) A kind of picture mask method and system
WO2018103024A1 (en) Intelligent guidance method and apparatus for visually handicapped person
US9395813B2 (en) Method for controlling operation and electronic device thereof
CN111292341B (en) Image annotation method, image annotation device and computer storage medium
CN112383734A (en) Video processing method, video processing device, computer equipment and storage medium
CN112651315A (en) Information extraction method and device of line graph, computer equipment and storage medium
CN115457119B (en) Bus bar labeling method, device, computer equipment and readable storage medium
WO2023024959A1 (en) Image labeling method and system, and device and storage medium
CN111832549B (en) Data labeling method and device
CN115890004A (en) Laser marking method based on image recognition and related device
US20230196840A1 (en) Work management device and work state determination method
CN115035032A (en) Neural network training method, related method, device, terminal and storage medium
CN113505763A (en) Key point detection method and device, electronic equipment and storage medium
CN112819885A (en) Animal identification method, device and equipment based on deep learning and storage medium
CN113094240A (en) Application program abnormity monitoring method, mobile terminal and storage medium
CN112036441A (en) Feedback marking method and device for machine learning object detection result and storage medium
JP5174771B2 (en) Method for confirming completion of input of handwritten data in electronic blackboard system
CN113177607B (en) Method for labeling training object and client

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant