CN111950618A - Water area image data labeling method, device, equipment and storage medium - Google Patents


Info

Publication number
CN111950618A
CN111950618A (application number CN202010779118.4A)
Authority
CN
China
Prior art keywords
target
water area
candidate
attribute information
area image
Prior art date
Legal status
Pending
Application number
CN202010779118.4A
Other languages
Chinese (zh)
Inventor
余化
Current Assignee
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
CCB Finetech Co Ltd
Priority date
Filing date
Publication date
Application filed by China Construction Bank Corp, CCB Finetech Co Ltd filed Critical China Construction Bank Corp
Priority to CN202010779118.4A priority Critical patent/CN111950618A/en
Publication of CN111950618A publication Critical patent/CN111950618A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/40 Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements

Abstract

Embodiments of the invention disclose a water area image data labeling method, device, equipment and storage medium. The water area image data labeling method comprises the following steps: determining a candidate labeling result in a water area image to be labeled based on features of the target categories to be labeled, the candidate labeling result comprising a candidate-labeled target category and/or target attribute information; receiving a judgment instruction from a user on the candidate labeling result and determining a final labeling result according to the judgment instruction, the final labeling result comprising the finally labeled target category and/or target attribute information; and determining the labeling result of the water area image to be labeled according to the final labeling result. By determining candidate labeling results, embodiments of the invention improve the efficiency of water area image data labeling and remove the need to label water area images manually one by one, and by receiving the user's judgment instructions on the candidate labeling results they improve the accuracy of water area image data labeling.

Description

Water area image data labeling method, device, equipment and storage medium
Technical Field
Embodiments of the invention relate to the field of Internet technology, and in particular to a water area image data labeling method, device, equipment and storage medium.
Background
The prior art mainly addresses image annotation for land vehicles. In that field, although annotation shapes, methods and classifications differ, research organizations and individuals are relatively consistent in their image annotation and data acquisition practices and follow certain standards, so their annotation data can be shared and reused.
However, labeling classifications for the water area field are not clearly defined, and individual images are labeled manually one by one, so labeling efficiency is low.
Disclosure of Invention
Embodiments of the invention provide a water area image data labeling method, device, equipment and storage medium, which improve the efficiency of water area image data labeling.
In a first aspect, an embodiment of the present invention provides a method for annotating water area image data, including:
determining candidate labeling results in the water area images to be labeled based on the characteristics of the target types to be labeled; the candidate labeling result comprises a target category and/or target attribute information of the candidate label;
receiving a judgment instruction of a user for the candidate labeling result, and determining a final labeling result according to the judgment instruction; the final labeling result comprises the final labeled target category and/or target attribute information;
and determining the marking result of the water area image to be marked according to the final marking result.
Optionally, the target category to be labeled and the target attribute information are determined by presetting, and the target attribute information is associated with the target category.
Optionally, the target category to be labeled includes at least one of the following items: ships, travelable water areas, coastlines, obstacles in water, unrelated objects and own ships;
Optionally, the target attribute information of the ship includes at least one of: boundary information, a shielding proportion and a truncation proportion, wherein the attribute information of the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship at least comprises the boundary information.
Optionally, the shielding proportion refers to the proportion of the ship that is shielded by other objects in the water area image;
the truncation proportion refers to the proportion of the ship that is cut off by the edge of the water area image.
Optionally, the coastline is included in the body of travelable water.
Optionally, the instruction for judging the candidate labeling result by the user at least includes one of the following items:
a determination instruction of the user for the target category and/or target attribute information of the candidate label;
a negation instruction of the user for the target category and/or target attribute information of the candidate label;
and a modification instruction of the user for the target category and/or target attribute information of the candidate label.
Optionally, receiving a judgment instruction of the user for the candidate labeling result, and determining a final labeling result according to the judgment instruction includes:
if a determination instruction of the user for the target category and/or target attribute information of the candidate label is received, determining the target category and/or target attribute information of the candidate label associated with the determination instruction as the finally labeled target category and/or target attribute information;
if a negation instruction of the user for the target category and/or target attribute information of the candidate label is received, deleting the target category and/or target attribute information of the candidate label associated with the negation instruction;
and if a modification instruction of the user for the target category and/or target attribute information of the candidate label is received, modifying the associated target category and/or target attribute information of the candidate label according to the modification instruction, and determining the modified target category and/or target attribute information as the finally labeled target category and/or target attribute information.
Optionally, determining a candidate labeling result in the water area image to be labeled based on the feature of the target category to be labeled includes:
and determining candidate labeling results in the water area image to be labeled by an image recognition technology based on the characteristics of the target category to be labeled.
Optionally, the image recognition technique includes edge extraction.
Optionally, if the target category of any object to be labeled in the water area image does not belong to any one of the ship, the travelable water area, the coastline, the unrelated object and the own ship, the target category of that object is determined to be an obstacle in water, so that no object to be labeled in the water area image is omitted.
Optionally, receiving a judgment instruction of the user on the candidate labeling result includes:
and receiving a judgment instruction of the user on the candidate labeling result according to a preset labeling standard.
In a second aspect, an embodiment of the present invention further provides a device for annotating image data of a water area, including:
the candidate result determining module is used for determining a candidate marking result in the water area image to be marked based on the characteristics of the target category to be marked; the candidate labeling result comprises a target category and/or target attribute information of the candidate label;
the final result determining module is used for receiving a judgment instruction of the user on the candidate labeling result and determining a final labeling result according to the judgment instruction; the final labeling result comprises the final labeled target category and/or target attribute information;
and the water area image marking module is used for determining the marking result of the water area image to be marked according to the final marking result.
Optionally, the target category to be labeled and the target attribute information are determined by presetting, and the target attribute information is associated with the target category.
Optionally, the target category to be labeled includes at least one of the following items: ships, travelable water areas, coastlines, obstacles in water, unrelated objects and own ships;
Optionally, the target attribute information of the ship includes at least one of: boundary information, a shielding proportion and a truncation proportion, wherein the attribute information of the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship at least comprises the boundary information.
Optionally, the shielding proportion refers to the proportion of the ship that is shielded by other objects in the water area image;
the truncation proportion refers to the proportion of the ship that is cut off by the edge of the water area image.
Optionally, the coastline is included in the body of travelable water.
Optionally, the instruction for judging the candidate labeling result by the user at least includes one of the following items:
a determination instruction of the user for the target category and/or target attribute information of the candidate label;
a negation instruction of the user for the target category and/or target attribute information of the candidate label;
and a modification instruction of the user for the target category and/or target attribute information of the candidate label.
Optionally, the final result determining module is specifically configured to:
if a determination instruction of the user for the target category and/or target attribute information of the candidate label is received, determining the target category and/or target attribute information of the candidate label associated with the determination instruction as the finally labeled target category and/or target attribute information;
if a negation instruction of the user for the target category and/or target attribute information of the candidate label is received, deleting the target category and/or target attribute information of the candidate label associated with the negation instruction;
and if a modification instruction of the user for the target category and/or target attribute information of the candidate label is received, modifying the associated target category and/or target attribute information of the candidate label according to the modification instruction, and determining the modified target category and/or target attribute information as the finally labeled target category and/or target attribute information.
Optionally, the candidate result determining module is specifically configured to:
and determining candidate labeling results in the water area image to be labeled by an image recognition technology based on the characteristics of the target category to be labeled.
Optionally, the image recognition technique includes edge extraction.
Optionally, if the target category of any object to be labeled in the water area image does not belong to any one of the ship, the travelable water area, the coastline, the unrelated object and the own ship, the target category of that object is determined to be an obstacle in water, so that no object to be labeled in the water area image is omitted.
Optionally, the final result determining module is specifically configured to:
and receiving a judgment instruction of the user on the candidate labeling result according to a preset labeling standard.
In a third aspect, an embodiment of the present invention further provides an apparatus, including:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the water area image data annotation method according to any embodiment of the invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the water area image data annotation method according to any embodiment of the present invention.
Based on the features of the target categories to be labeled, embodiments of the invention obtain a candidate labeling result from the water area image to be labeled, correct the candidate labeling result according to the user's judgment instructions to obtain a final labeling result, and finally obtain the labeling result of the water area image to be labeled from the final labeling result. By determining candidate labeling results, embodiments of the invention improve the efficiency of water area image data labeling and remove the need to label water area images manually one by one, and by receiving the user's judgment instructions on the candidate labeling results they improve the accuracy of water area image data labeling.
Drawings
FIG. 1 is a flowchart of a water area image data annotation method according to a first embodiment of the present invention;
FIG. 2A is a flowchart of a water area image data annotation method according to a second embodiment of the present invention;
FIG. 2B is a schematic diagram of an example of vessel labeling in a water area image according to a second embodiment of the present invention;
FIG. 2C is a schematic diagram illustrating an example of the marking of the travelable water area in the water area image according to the second embodiment of the present invention;
fig. 2D is a schematic diagram of an example of the sea-shore line labeling in the water area image according to the second embodiment of the present invention;
FIG. 2E is a schematic diagram of an example of the annotation of the obstacle in the water area image according to the second embodiment of the present invention;
FIG. 2F is a schematic diagram illustrating an example of labeling an unrelated object in a water area image according to a second embodiment of the present invention;
FIG. 2G is a schematic diagram of an example of vessel labeling in a water area image according to a second embodiment of the present invention;
fig. 2H is a diagram illustrating an example of marking a travelable water area by bypassing an object according to a second embodiment of the present invention;
fig. 2I is a diagram illustrating an example of labeling a travelable water area without bypassing an object according to a second embodiment of the present invention;
FIG. 2J is a schematic illustration of a labeled example of a vessel in accordance with a second embodiment of the invention;
FIG. 2K is a schematic illustration of an example of the multiplexing of the navigable water area and the coastline marking in the second embodiment of the invention;
fig. 2L is a schematic diagram of an example of shoreline marking of a ship at the shore according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a water area image data annotation device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an apparatus in the fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a water area image data annotation method according to a first embodiment of the present invention, which is applicable to a case of performing data annotation on a water area image. The method can be executed by a water area image data annotation device, which can be implemented in software and/or hardware and can be configured in a device, for example, the device can be a device with communication and computing capabilities, such as a background server. As shown in fig. 1, the method specifically includes:
step 101, determining candidate labeling results in a water area image to be labeled based on the characteristics of the target category to be labeled; and the candidate labeling result comprises the target category and/or target attribute information of the candidate labeling.
Deep learning, the key technology that has risen with artificial intelligence, is built on neural networks, which are data-driven algorithms whose accuracy depends on having a very large amount of input data. Deep learning removes complex hand-crafted intermediate steps, but it requires a large amount of data for training, and before training this data must be labeled to serve as the prior experience for machine learning. In the field of unmanned ship driving, the machine must replace the driver in recognizing the various targets in a water area and then make correct decisions according to the type, distance and other properties of those targets. Teaching a machine to recognize the many complex scenes in a water area therefore requires labeling a large number of water area images, from whose features the machine can learn to recognize targets automatically.
In the prior art, water area images are labeled manually. Training a neural network, however, requires a large amount of labeled data, so purely manual labeling is inefficient and its standards are difficult to unify.
The features of the target categories to be labeled can be determined by analyzing the characteristics of water area images: for example, the objects contained in water area images are analyzed in advance, the features of objects that affect ship travel are determined, and the categories to which those objects belong are defined.
Specifically, the features of the target categories to be labeled in the water area image are determined, and image regions that satisfy those features are identified in the water area image to be labeled as candidate labeling results. A candidate labeling result comprises the target category matched by the features of an image region and the target attribute information of that category for the region. The attribute information describes the characteristic properties of the target category in the image to be labeled, such as its boundary information in the image, from which properties such as the completeness of the target can be determined.
In this embodiment of the present invention, optionally, step 101 includes:
and determining candidate labeling results in the water area image to be labeled by an image recognition technology based on the characteristics of the target category to be labeled.
Image recognition technology refers to techniques that can recognize the features of different target categories, including, but not limited to, template matching and edge extraction. Specifically, the features of the target categories to be labeled in the water area image are determined, the water area image to be labeled is processed with an image recognition technique such as template matching or edge extraction, and the candidate labeling results are then identified from the extracted edge features.
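As an illustration of this step, the following is a minimal sketch of generating candidate regions by edge extraction. It assumes OpenCV 4 is available; the function name, thresholds and dictionary fields are illustrative assumptions rather than part of the patent.

```python
import cv2

def propose_candidates(image_path, min_area=500):
    """Propose rough polygon candidates from a water area image via edge extraction."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # edge extraction
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    candidates = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:  # drop tiny fragments
            continue
        polygon = cv2.approxPolyDP(contour, 3.0, True)
        candidates.append({
            "category": None,  # filled in later by matching against category features
            "boundary": polygon.reshape(-1, 2).tolist(),
        })
    return candidates
```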
Step 102, receiving a judgment instruction of a user on the candidate labeling result, and determining a final labeling result according to the judgment instruction; the final labeling result comprises the finally labeled target category and/or target attribute information.
After a candidate labeling result of the water area image to be labeled is obtained, it is presented to the user, who judges the candidate labeling result, generates a judgment instruction from the judgment result, and sends it. When the user's judgment instruction on the candidate labeling result is received, the user's judgment is obtained from the instruction, and the target category and/or target attribute information of the candidate label is confirmed accordingly, yielding the finally labeled target category and/or target attribute information. The user's judgment instruction may apply to the candidate labeling result of the whole water area image to be labeled, or to the target category of each individual candidate label within the candidate labeling result.
In an optional embodiment, the instruction for judging the candidate labeling result by the user at least includes one of the following items:
a determination instruction of the user for the target category and/or target attribute information of the candidate label;
a negation instruction of the user for the target category and/or target attribute information of the candidate label;
and a modification instruction of the user for the target category and/or target attribute information of the candidate label.
The user's judgment instruction on the candidate labeling result is generated from the user's judgment of that result, which consists of determining, negating or modifying the target category and/or target attribute information of the candidate label.
In an optional embodiment, receiving a judgment instruction of the user for the candidate annotation result, and determining a final annotation result according to the judgment instruction includes:
if a determination instruction of the user for the target category and/or target attribute information of the candidate label is received, determining the target category and/or target attribute information of the candidate label associated with the determination instruction as the finally labeled target category and/or target attribute information;
if a negation instruction of the user for the target category and/or target attribute information of the candidate label is received, deleting the target category and/or target attribute information of the candidate label associated with the negation instruction;
and if a modification instruction of the user for the target category and/or target attribute information of the candidate label is received, modifying the associated target category and/or target attribute information of the candidate label according to the modification instruction, and determining the modified target category and/or target attribute information as the finally labeled target category and/or target attribute information.
Each target category in the candidate labeling result of the water area image to be labeled, together with its corresponding target attribute information, is displayed to the user in turn; the user judges each one and may confirm, negate or modify any of the labeled results. Illustratively, suppose the candidate labeling results are candidate target category 1 with candidate target attribute information 1, candidate target category 2 with candidate target attribute information 2, and candidate target category 3 with candidate target attribute information 3, and the user judges each of them. If the judgment of candidate target category 1 is affirmative, indicating that its recognition result is correct, candidate target category 1 is taken as final target category 1. If the judgment of candidate target category 2 is negative, indicating that its recognition result is wrong and not worth modifying, candidate target category 2 is deleted from the final target categories. If the judgment of candidate target category 3 is a modification, indicating that its recognition result is partly wrong, the user's modification of candidate target category 3 is received, its recognition result is re-determined according to the modified content, such as a changed category name or changed target attribute information, and the modified content is taken as final target category 3.
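As an illustration of how a final labeling result can be derived from these three kinds of judgment, the following is a minimal sketch; the instruction encoding ("confirm" / "negate" / "modify") and the field names are assumptions for illustration only.

```python
def apply_judgments(candidates, judgments):
    """candidates: list of {"category": ..., "attributes": ...} dictionaries.
    judgments: list of {"action": "confirm" | "negate" | "modify", "changes": dict or None},
    aligned one-to-one with the candidates."""
    final_results = []
    for candidate, judgment in zip(candidates, judgments):
        if judgment["action"] == "confirm":
            final_results.append(candidate)                # keep the candidate label as-is
        elif judgment["action"] == "negate":
            continue                                       # delete the candidate label
        elif judgment["action"] == "modify":
            updated = dict(candidate)
            updated.update(judgment.get("changes") or {})  # apply the user's edits
            final_results.append(updated)
    return final_results
```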
The accuracy of the final labeling result can be further ensured by receiving the judgment instruction of the user on the candidate labeling result, and the accuracy of data labeling is improved.
Step 103, determining the labeling result of the water area image to be labeled according to the final labeling result.
The labeling result of the water area image to be labeled is determined from the final target categories, and their associated target attribute information, in the final labeling result determined by the user's judgment instructions on the candidate labeling result. Illustratively, the target labeling result is stored in a file in JSON format and combined with the corresponding image file of the water area image to be labeled to obtain the labeling result of that image. For example, the JSON-format file may include the position information, category information, attribute information and so on of each target category in the water area image to be labeled.
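For illustration, one possible JSON annotation record written from Python might look as follows; the patent does not fix a schema, so the file name and field names here are assumptions.

```python
import json

annotation = {
    "image": "water_area_0001.jpg",
    "targets": [
        {
            "category": "boat",                 # target category code
            "boundary": [[120, 340], [410, 335], [415, 480], [118, 486]],
            "occlusion_ratio": 0.2,             # shielding proportion
            "truncation_ratio": 0.0,            # truncation proportion
        },
        {
            "category": "shoreline",
            "boundary": [[0, 300], [640, 295]],  # polyline for the coastline
        },
    ],
}

with open("water_area_0001.json", "w", encoding="utf-8") as f:
    json.dump(annotation, f, ensure_ascii=False, indent=2)
```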
Based on the features of the target categories to be labeled, embodiments of the invention obtain a candidate labeling result from the water area image to be labeled, correct the candidate labeling result according to the user's judgment instructions to obtain a final labeling result, and finally obtain the labeling result of the water area image to be labeled from the final labeling result. By determining candidate labeling results, embodiments of the invention improve the efficiency of water area image data labeling and remove the need to label water area images manually one by one, and by receiving the user's judgment instructions on the candidate labeling results they improve the accuracy of water area image data labeling.
Example two
Fig. 2A is a flowchart of a water area image data annotation method according to a second embodiment of the present invention, which is further optimized based on the first embodiment. As shown in fig. 2A, the method includes:
step 201, determining candidate labeling results in a water area image to be labeled based on the characteristics of the target category to be labeled; the candidate labeling result comprises a candidate labeled target category and/or target attribute information, the target category and the target attribute information to be labeled are determined through presetting, and the target attribute information is associated with the target category.
The prior art mainly addresses image annotation for land-based automobiles. In that field, although annotation shapes, methods and classifications differ, research organizations and individuals are relatively consistent in image annotation and data acquisition and follow certain standards, so the data of different organizations and individuals can be shared. For the water area field, however, labeled data is scarce, the classification is unclear, and there is no unified or general standard in the industry.
Therefore, the target categories and target attribute information to be labeled are determined by presetting: the objects that affect ship travel in water area images are analyzed and uniformly classified, and a common way of determining target categories is established, so that the labeling results produced by different organizations under the labeling method of this embodiment can be shared, improving the efficiency of water area image labeling.
Specifically, the properties of water area images are analyzed, the objects in them that affect ship travel are classified in advance to obtain the target categories, and associated target attribute information is set according to the properties of each category. For example, if a ship appears in a water area image, the ship is set as a target category to be labeled, and the drawing shape used for the ship's boundary, or the type of the ship, is set as the ship's target attribute information.
After the target categories to be labeled are preset, the features of each target category are determined from its shape and position characteristics, so that the target category can be recognized in the water area image to be labeled.
In an optional embodiment, the object class to be labeled comprises at least one of the following: ships, travelable water areas, coastlines, obstacles in water, unrelated objects and own ships;
Correspondingly, the target attribute information of the ship comprises at least one of the following items: boundary information, a shielding proportion and a truncation proportion; and the attribute information of the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship at least comprises boundary information.
According to the properties of water area images, the preset target categories to be labeled comprise the ship, the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship. The ship denotes other vessels in the water area image; fig. 2B is a schematic diagram of an example of ship labeling in a water area image, the classification code of the ship may be represented as boat, and the white line in the figure is the labeling result. The travelable water area denotes the area of the water area image in which the ship can travel; fig. 2C is a schematic diagram of an example of travelable water area labeling, the classification code of the travelable water area may be represented as water_area, and the white line in the figure is the labeling result. The coastline denotes the part where the travelable area meets the land or the sky; fig. 2D is a schematic diagram of an example of coastline labeling, the classification code of the coastline may be represented as shoreline, and the white line in the figure is the labeling result. The obstacle in water denotes an obstacle located in the travelable water area that affects ship travel; fig. 2E is a schematic diagram of an example of obstacle-in-water labeling, the classification code of the obstacle in water may be represented as obstacle_in_water, and the white line in the figure is the labeling result. The unrelated object denotes an object located in the travelable water area that does not affect ship travel, or an object that cannot be identified; fig. 2F is a schematic diagram of an example of unrelated object labeling, the classification code of the unrelated object may be represented as boat_dot_care, and the white line in the figure is the labeling result. The own ship is the ship carrying the device that captured the water area image; fig. 2G is a schematic diagram of an example of own ship labeling, the classification code of the own ship may be represented as my_boat, and the white line in the figure is the labeling result. All the categories in a water area image can be represented by these target categories, which facilitates the establishment of a unified labeling standard.
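For illustration, the classification codes quoted above could be kept in a single enumeration; this is a sketch under the assumption that the codes are stored as strings, not a structure defined by the patent.

```python
from enum import Enum

class WaterAreaCategory(Enum):
    BOAT = "boat"                            # other ships
    WATER_AREA = "water_area"                # travelable water area
    SHORELINE = "shoreline"                  # coastline
    OBSTACLE_IN_WATER = "obstacle_in_water"  # obstacle that affects travel
    UNRELATED_OBJECT = "boat_dot_care"       # unrelated object (code as quoted in the text)
    MY_BOAT = "my_boat"                      # the own ship carrying the camera
```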
The target attribute information of each target category is preset according to the characteristics of that category. The target attributes of the ship include boundary information, a shielding proportion and a truncation proportion: the boundary information is used to determine the position and size of the ship, while the shielding proportion and the truncation proportion are used to express the degree to which the ship affects travel.
In the embodiment of the present invention, optionally, the shielding proportion refers to the proportion of the ship that is shielded by other objects in the water area image, and the truncation proportion refers to the proportion of the ship that is cut off by the edge of the water area image.
Occlusion (shielding) means that part of the target ship is blocked by other objects in the image; in this case the proportion of the occluded part is estimated, for example as 0.1, 0.2, 0.4, 0.6 or 0.8, and the overall extent of the target (both the visible part and the occluded part) is labeled. Truncation means that the target ship lies at the edge of the water area image, i.e. the ship is not completely inside the image; in this case only the visible part is labeled, without extending beyond the picture, and the proportion of the truncated part is estimated, for example as 0.1, 0.2, 0.4, 0.6 or 0.8.
In an embodiment of the present invention, the travelable water area optionally includes the shoreline. Since the coastline lies at the edge of the travelable water area, the coastline can be determined from the travelable water area, which improves the accuracy of coastline determination. For example, after the travelable water area is determined, the shoreline portion is determined at the edge of the travelable water area according to the features of the shoreline.
The features of each target category to be labeled are determined from the preset target categories and target attribute information. For example, the drawing shape of the boundary information of the coastline is a polyline, while the drawing shape of the boundary information of the other target categories is a polygon. The coastline is used to label the part where the travelable water area meets the sea and sky; if the coastline is partly occluded, the occluded coastline or the sea-sky junction can be estimated, and the coastlines on the two sides are connected through the sea-sky junction, so each water area image contains only one coastline category. Recognition can also be performed based on the distance features of objects: for example, objects whose distance exceeds a certain threshold, making them hard to recognize, are determined to be unrelated objects.
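For illustration, the per-category features described above could be collected in a configuration table such as the following sketch; the key names, shape names and the distance threshold value are assumptions.

```python
CATEGORY_FEATURES = {
    "boat":              {"boundary_shape": "polygon"},
    "water_area":        {"boundary_shape": "polygon"},
    "shoreline":         {"boundary_shape": "polyline", "max_instances_per_image": 1},
    "obstacle_in_water": {"boundary_shape": "polygon"},
    "boat_dot_care":     {"boundary_shape": "polygon", "min_distance_m": 500},  # beyond this, treat as unrelated
    "my_boat":           {"boundary_shape": "polygon"},
}
```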
Presetting the features of the target categories to be labeled in this way improves the accuracy of data labeling and helps form a unified labeling standard.
Step 202, receiving a judgment instruction of a user on a candidate labeling result, and determining a final labeling result according to the judgment instruction; and the final labeling result comprises the final labeled target category and/or target attribute information.
The user's judgment of the candidate labeling result can be made based on the features of the target categories, or based on a preset judgment reference.
In an optional embodiment, receiving a judgment instruction of the user for the candidate labeling result includes:
and receiving a judgment instruction of the user on the candidate labeling result according to a preset labeling standard.
The preset labeling standard establishes a unified standard for labeling water area image data. For example, key points of water area image labeling are preset: when a ship or an obstacle in water lies at the edge of the travelable water area, the complete outline of the ship or obstacle is labeled. When the travelable water area is labeled and it overlaps with an object, the labeler may either bypass the object or not; fig. 2H and 2I are schematic diagrams of labeling an object at the edge of the travelable water area, where fig. 2H labels the travelable water area by bypassing the object and fig. 2I labels it without bypassing the object. When a ship is labeled, labeling precision should be high where the ship meets the water, while the precision for the above-water part of the ship may be lower, as long as the whole ship is included; fig. 2J is a schematic diagram of a labeling example of a ship. When several ships are clustered together and individual ships are hard to distinguish, labeling is performed according to distance: if the distance exceeds a certain threshold they are set as an unrelated object, otherwise the individual ships are distinguished and labeled separately. After the user determines the labeling result of the travelable water area, two points can be added directly on the polygon of the travelable water area (or two existing points selected) to determine the boundary polyline of the coastline directly. Fig. 2K is a schematic diagram of reusing the travelable water area labeling for the coastline: the area ABCD enclosed by the polygon is the travelable water area, and the line determined directly between A and B is the coastline. In addition, when many ships are moored close to shore and the shoreline cannot be clearly determined, the line formed along the outside of the ships can be used directly as the coastline; fig. 2L is a schematic diagram of an example of shoreline labeling for ships at the shore.
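For illustration, reusing the travelable-water polygon to obtain the coastline polyline (as in the example of fig. 2K) could look like the following sketch; the function name and index convention are assumptions.

```python
def coastline_from_water_polygon(water_polygon, start_idx, end_idx):
    """water_polygon: ordered list of (x, y) vertices of the travelable water area.
    start_idx, end_idx: indices of the two chosen (or newly inserted) points, e.g. A and B.
    Returns the polyline running along the polygon between the two points."""
    if start_idx <= end_idx:
        return water_polygon[start_idx:end_idx + 1]
    # wrap around the polygon if the chosen points straddle the closing edge
    return water_polygon[start_idx:] + water_polygon[:end_idx + 1]
```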
In a feasible embodiment, if the target category of an object to be labeled in the water area image does not belong to any of the ship, the travelable water area, the coastline, the unrelated object or the own ship, its target category is determined to be an obstacle in water, so that no object to be labeled in the water area image is omitted. For example, for objects such as seabirds in the water area image, when it can be clearly determined that an object in the water is a seabird, it does not need to be labeled as an obstacle, because seabirds fly away as soon as the ship approaches. But if the object is not clearly visible and cannot be confirmed to be a seabird, it is labeled as an obstacle in water; no object that needs labeling in the water area may be missed. When several ships lean against each other at a distance at which the boundaries between them cannot be distinguished, a single polygon may be used to enclose several ships. The part of a ship submerged in water is estimated and labeled, and its shielding proportion is estimated and labeled.
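For illustration, the fallback rule described above could be expressed as the following sketch; the set of known category codes and the function name are assumptions.

```python
KNOWN_CATEGORIES = {"boat", "water_area", "shoreline", "boat_dot_care", "my_boat"}

def resolve_category(predicted_category):
    """Fall back to obstacle_in_water for anything outside the known categories,
    so that no object to be labeled is omitted."""
    if predicted_category in KNOWN_CATEGORIES:
        return predicted_category
    return "obstacle_in_water"
```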
By presetting the target categories and target attribute information of the water area image, the embodiment of the invention keeps the labeling standards of the candidate labeling result and the final labeling result consistent and improves the accuracy of the labeling result. The embodiment also provides a technical scheme for establishing and classifying training data sets, improving the generality of water area image data sets.
Step 203, determining the labeling result of the water area image to be labeled according to the final labeling result.
Based on the features of the target categories to be labeled, embodiments of the invention obtain a candidate labeling result from the water area image to be labeled, correct the candidate labeling result according to the user's judgment instructions to obtain a final labeling result, and finally obtain the labeling result of the water area image to be labeled from the final labeling result. By determining candidate labeling results, embodiments of the invention improve the efficiency of water area image data labeling and remove the need to label water area images manually one by one, and by receiving the user's judgment instructions on the candidate labeling results they improve the accuracy of water area image data labeling.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a water area image data annotation device according to a third embodiment of the present invention, which is applicable to the case of performing data annotation on a water area image. As shown in fig. 3, the apparatus includes:
a candidate result determining module 310, configured to determine a candidate labeling result in the water area image to be labeled based on the feature of the target category to be labeled; the candidate labeling result comprises a target category and/or target attribute information of the candidate label;
a final result determining module 320, configured to receive a judgment instruction of the candidate annotation result from the user, and determine a final annotation result according to the judgment instruction; the final labeling result comprises the final labeled target category and/or target attribute information;
and the water area image labeling module 330 is configured to determine a labeling result of the water area image to be labeled according to the final labeling result.
Based on the features of the target categories to be labeled, embodiments of the invention obtain a candidate labeling result from the water area image to be labeled, correct the candidate labeling result according to the user's judgment instructions to obtain a final labeling result, and finally obtain the labeling result of the water area image to be labeled from the final labeling result. By determining candidate labeling results, embodiments of the invention improve the efficiency of water area image data labeling and remove the need to label water area images manually one by one, and by receiving the user's judgment instructions on the candidate labeling results they improve the accuracy of water area image data labeling.
Optionally, the target category to be labeled and the target attribute information are determined by presetting, and the target attribute information is associated with the target category.
Optionally, the target category to be labeled includes at least one of the following items: ships, travelable water areas, coastlines, obstacles in water, unrelated objects and own ships;
Optionally, the target attribute information of the ship includes at least one of: boundary information, a shielding proportion and a truncation proportion, wherein the attribute information of the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship at least comprises the boundary information.
Optionally, the shielding proportion refers to the proportion of the ship that is shielded by other objects in the water area image;
the truncation proportion refers to the proportion of the ship that is cut off by the edge of the water area image.
Optionally, the coastline is included in the body of travelable water.
Optionally, the instruction for judging the candidate labeling result by the user at least includes one of the following items:
a determination instruction of the user for the target category and/or target attribute information of the candidate label;
a negation instruction of the user for the target category and/or target attribute information of the candidate label;
and a modification instruction of the user for the target category and/or target attribute information of the candidate label.
Optionally, the final result determining module is specifically configured to:
if a determination instruction of the user for the target category and/or target attribute information of the candidate label is received, determining the target category and/or target attribute information of the candidate label associated with the determination instruction as the finally labeled target category and/or target attribute information;
if a negation instruction of the user for the target category and/or target attribute information of the candidate label is received, deleting the target category and/or target attribute information of the candidate label associated with the negation instruction;
and if a modification instruction of the user for the target category and/or target attribute information of the candidate label is received, modifying the associated target category and/or target attribute information of the candidate label according to the modification instruction, and determining the modified target category and/or target attribute information as the finally labeled target category and/or target attribute information.
Optionally, the candidate result determining module is specifically configured to:
and determining candidate labeling results in the water area image to be labeled by an image recognition technology based on the characteristics of the target category to be labeled.
Optionally, the image recognition technique includes edge extraction.
Optionally, if the target category of any object to be labeled in the water area image does not belong to any one of the ship, the travelable water area, the coastline, the unrelated object and the own ship, the target category of that object is determined to be an obstacle in water, so that no object to be labeled in the water area image is omitted.
Optionally, the final result determining module is specifically configured to:
and receiving a judgment instruction of the user on the candidate labeling result according to a preset labeling standard.
The water area image data annotation device provided by the embodiment of the invention can execute the water area image data annotation method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the water area image data annotation method.
Example four
Fig. 4 is a schematic structural diagram of an apparatus according to a fourth embodiment of the present invention. Fig. 4 illustrates a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present invention. The device 12 shown in fig. 4 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 4, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory device 28, and a bus 18 that couples various system components including the system memory device 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory device bus or memory device controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system storage 28 may include computer system readable media in the form of volatile storage, such as Random Access Memory (RAM)30 and/or cache storage 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, and commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Storage 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in storage 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with device 12, and/or with any devices (e.g., network card, modem, etc.) that enable device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown in FIG. 4, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in conjunction with device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system storage device 28, for example implementing the water area image data annotation method provided by the embodiment of the present invention, including:
determining candidate labeling results in the water area images to be labeled based on the characteristics of the target types to be labeled; the candidate labeling result comprises a target category and/or target attribute information of the candidate label;
receiving a judgment instruction of a user for the candidate labeling result, and determining a final labeling result according to the judgment instruction; the final labeling result comprises the final labeled target category and/or target attribute information;
and determining the marking result of the water area image to be marked according to the final marking result.
EXAMPLE five
The fifth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for annotating water area image data provided by the embodiments of the present invention, the method including:
determining candidate labeling results in the water area images to be labeled based on the characteristics of the target types to be labeled; the candidate labeling result comprises a target category and/or target attribute information of the candidate label;
receiving a judgment instruction of a user for the candidate labeling result, and determining a final labeling result according to the judgment instruction; the final labeling result comprises the final labeled target category and/or target attribute information;
and determining the marking result of the water area image to be marked according to the final marking result.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (15)

1. A water area image data labeling method is characterized by comprising the following steps:
determining a candidate labeling result in a water area image to be labeled based on features of a target category to be labeled; the candidate labeling result comprises a target category and/or target attribute information of a candidate label;
receiving a judgment instruction of a user for the candidate labeling result, and determining a final labeling result according to the judgment instruction; the final labeling result comprises a finally labeled target category and/or target attribute information;
and determining a labeling result of the water area image to be labeled according to the final labeling result.
2. The method according to claim 1, wherein the target category to be labeled and the target attribute information are preset, and the target attribute information is associated with the target category.
3. The method of claim 1, wherein the target category to be labeled comprises at least one of: a ship, a travelable water area, a coastline, an obstacle in water, an unrelated object and an own ship.
4. The method of claim 3, wherein the target attribute information of the ship comprises at least one of: boundary information, an occlusion proportion and a truncation proportion, and the target attribute information of the travelable water area, the coastline, the obstacle in water, the unrelated object and the own ship comprises at least the boundary information.
5. The method according to claim 3, wherein the occlusion proportion refers to a proportion of the ship that is occluded by other objects in the water area image;
and the truncation proportion refers to a proportion of the ship that is cut off by an edge of the water area image.
6. The method of claim 3, wherein the coastline is included in the travelable water area.
7. The method according to claim 1, wherein the judgment instruction of the user for the candidate labeling result includes at least one of:
a determination instruction of the user for the target category and/or target attribute information of the candidate label;
a negation instruction of the user for the target category and/or target attribute information of the candidate label;
and a modification instruction of the user for the target category and/or target attribute information of the candidate label.
8. The method of claim 7, wherein receiving the judgment instruction of the user for the candidate labeling result and determining the final labeling result according to the judgment instruction comprises:
if a determination instruction of the user for the target category and/or target attribute information of the candidate label is received, determining the target category and/or target attribute information of the candidate label associated with the determination instruction as the finally labeled target category and/or target attribute information;
if a negation instruction of the user for the target category and/or target attribute information of the candidate label is received, deleting the target category and/or target attribute information of the candidate label associated with the negation instruction;
and if a modification instruction of the user for the target category and/or target attribute information of the candidate label is received, modifying the associated target category and/or target attribute information of the candidate label according to the modification instruction, and determining the modified target category and/or target attribute information of the candidate label as the finally labeled target category and/or target attribute information.
9. The method of claim 1, wherein determining the candidate labeling result in the water area image to be labeled based on the features of the target category to be labeled comprises:
determining the candidate labeling result in the water area image to be labeled by means of an image recognition technique based on the features of the target category to be labeled.
10. The method of claim 9, wherein the image recognition technique comprises edge extraction.
11. The method according to claim 3, wherein if the target category of any object to be labeled in the water area image does not belong to any one of a ship, a travelable water area, a coastline, an unrelated object and an own ship, the target category of the object to be labeled is determined to be an obstacle in water, so as to avoid omission of objects to be labeled in the water area image.
12. The method of claim 1, wherein receiving the judgment instruction of the user for the candidate labeling result comprises:
receiving the judgment instruction given by the user for the candidate labeling result according to a preset labeling standard.
13. A water area image data labeling device is characterized by comprising:
a candidate result determining module, used for determining a candidate labeling result in a water area image to be labeled based on features of a target category to be labeled; the candidate labeling result comprises a target category and/or target attribute information of a candidate label;
a final result determining module, used for receiving a judgment instruction of a user for the candidate labeling result and determining a final labeling result according to the judgment instruction; the final labeling result comprises a finally labeled target category and/or target attribute information;
and a water area image labeling module, used for determining a labeling result of the water area image to be labeled according to the final labeling result.
14. An apparatus, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the water area image data labeling method of any one of claims 1-12.
15. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the water area image data labeling method according to any one of claims 1 to 12.
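The following is a minimal Python sketch of how the occlusion proportion and truncation proportion named in claims 4 and 5 might be estimated from rectangular boundary information; the (x1, y1, x2, y2) box representation, the use of an estimated full-extent ship box for the truncation case and all helper names are assumptions made for illustration, not definitions taken from this application.

def box_area(box):
    x1, y1, x2, y2 = box
    return max(0, x2 - x1) * max(0, y2 - y1)

def intersection(a, b):
    return (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))

def occlusion_proportion(ship_box, occluder_box):
    """Proportion of the ship box covered by another object's box."""
    return box_area(intersection(ship_box, occluder_box)) / box_area(ship_box)

def truncation_proportion(full_ship_box, image_w, image_h):
    """Proportion of the estimated full ship extent lying outside the image."""
    visible = intersection(full_ship_box, (0, 0, image_w, image_h))
    return 1.0 - box_area(visible) / box_area(full_ship_box)

# A ship half covered by another object, and a ship whose estimated extent is
# partly cut off by the right edge of a 1920 x 1080 water area image:
print(occlusion_proportion((100, 100, 200, 160), (150, 90, 260, 170)))   # 0.5
print(truncation_proportion((1800, 300, 2000, 400), 1920, 1080))         # 0.4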
CN202010779118.4A 2020-08-05 2020-08-05 Water area image data labeling method, device, equipment and storage medium Pending CN111950618A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010779118.4A CN111950618A (en) 2020-08-05 2020-08-05 Water area image data labeling method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111950618A true CN111950618A (en) 2020-11-17

Family

ID=73338059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010779118.4A Pending CN111950618A (en) 2020-08-05 2020-08-05 Water area image data labeling method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111950618A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019137196A1 (en) * 2018-01-11 2019-07-18 阿里巴巴集团控股有限公司 Image annotation information processing method and device, server and system
CN108573279A (en) * 2018-03-19 2018-09-25 精锐视觉智能科技(深圳)有限公司 Image labeling method and terminal device
CN110428003A (en) * 2019-07-31 2019-11-08 清华大学 Modification method, device and the electronic equipment of sample class label
CN110648347A (en) * 2019-09-24 2020-01-03 北京航天宏图信息技术股份有限公司 Coastline extraction method and device based on remote sensing image
CN110852162A (en) * 2019-09-29 2020-02-28 深圳云天励飞技术有限公司 Human body integrity data labeling method and device and terminal equipment
CN110929639A (en) * 2019-11-20 2020-03-27 北京百度网讯科技有限公司 Method, apparatus, device and medium for determining position of obstacle in image
CN111046927A (en) * 2019-11-26 2020-04-21 北京达佳互联信息技术有限公司 Method and device for processing labeled data, electronic equipment and storage medium
CN110969793A (en) * 2019-12-25 2020-04-07 珠海大横琴科技发展有限公司 Method, system and storage medium for preventing ship intrusion at periphery of roundabout electronic purse net
CN110929729A (en) * 2020-02-18 2020-03-27 北京海天瑞声科技股份有限公司 Image annotation method, image annotation device and computer storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
申恩恩; 胡玉梅; 陈光; 罗攀; 朱浩: "Deep convolutional neural network for real-time object detection in intelligent driving", Journal of Automotive Safety and Energy, no. 01, 15 March 2020 (2020-03-15), pages 111-116 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113326901A (en) * 2021-06-30 2021-08-31 北京百度网讯科技有限公司 Image annotation method and device

Similar Documents

Publication Publication Date Title
CN109858555B (en) Image-based data processing method, device, equipment and readable storage medium
CN111595850B (en) Slice defect detection method, electronic device and readable storage medium
CN108734058B (en) Obstacle type identification method, device, equipment and storage medium
CN109726661B (en) Image processing method and apparatus, medium, and computing device
CN109670494B (en) Text detection method and system with recognition confidence
CN109344804A (en) A kind of recognition methods of laser point cloud data, device, equipment and medium
CN111488873B (en) Character level scene text detection method and device based on weak supervision learning
CN111462109A (en) Defect detection method, device and equipment for strain clamp and storage medium
KR102389998B1 (en) De-identification processing method and a computer program recorded on a recording medium to execute the same
KR102403169B1 (en) Method for providing guide through image analysis, and computer program recorded on record-medium for executing method therefor
CN112966685B (en) Attack network training method and device for scene text recognition and related equipment
CN114677596A (en) Remote sensing image ship detection method and device based on attention model
Zhang et al. A object detection and tracking method for security in intelligence of unmanned surface vehicles
CN110110320A (en) Automatic treaty review method, apparatus, medium and electronic equipment
CN113177957B (en) Cell image segmentation method and device, electronic equipment and storage medium
CN111582182A (en) Ship name identification method, system, computer equipment and storage medium
CN111695397A (en) Ship identification method based on YOLO and electronic equipment
CN111950618A (en) Water area image data labeling method, device, equipment and storage medium
CN114299366A (en) Image detection method and device, electronic equipment and storage medium
CN110555352B (en) Interest point identification method, device, server and storage medium
CN113936232A (en) Screen fragmentation identification method, device, equipment and storage medium
CN111353273B (en) Radar data labeling method, device, equipment and storage medium
CN114202689A (en) Point location marking method and device, electronic equipment and storage medium
CN111428724B (en) Examination paper handwriting statistics method, device and storage medium
CN114565780A (en) Target identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220920

Address after: 25 Financial Street, Xicheng District, Beijing 100033

Applicant after: CHINA CONSTRUCTION BANK Corp.

Address before: 25 Financial Street, Xicheng District, Beijing 100033

Applicant before: CHINA CONSTRUCTION BANK Corp.

Applicant before: Jianxin Financial Science and Technology Co.,Ltd.