CN112529830A - Image annotation method and device, electronic equipment and storage medium - Google Patents

Image annotation method and device, electronic equipment and storage medium

Info

Publication number
CN112529830A
CN112529830A
Authority
CN
China
Prior art keywords
hole
foreground region
image
real
binary image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910814719.1A
Other languages
Chinese (zh)
Other versions
CN112529830B (en)
Inventor
余雪兵
康勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Navinfo Co Ltd
Original Assignee
Navinfo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Navinfo Co Ltd filed Critical Navinfo Co Ltd
Priority to CN201910814719.1A priority Critical patent/CN112529830B/en
Publication of CN112529830A publication Critical patent/CN112529830A/en
Application granted granted Critical
Publication of CN112529830B publication Critical patent/CN112529830B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image annotation method, an image annotation device, electronic equipment and a storage medium, wherein the method includes the following steps: generating a foreground region corresponding to a target object according to the pixel coordinates of the target object in the marked image; generating hypothetical holes in the foreground region, and updating the foreground region according to the judgment result of whether the hypothetical holes are real holes, wherein the updated foreground region includes the real holes; and marking, in the marked image, the pixel blocks corresponding to the pixel coordinates as holes according to the pixel coordinates of the real holes in the updated foreground region. The image annotation method can determine the real holes within the region marked as the foreground of the target object, and can then re-mark the holes inside the target object, thereby improving the accuracy of image annotation.

Description

Image annotation method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image annotation technologies, and in particular, to an image annotation method and apparatus, an electronic device, and a storage medium.
Background
Image annotation is the process of classifying the pixel blocks contained in an image, that is, of assigning the pixel blocks to the different objects they belong to. For example, annotation indicates which pixel blocks belong to a person, which belong to a tree, which belong to a lane, and so on. At present, images are mostly labeled using an image annotation model, which is obtained by training on a training data set containing a large number of labeled images.
In the prior art, in order to ensure the accuracy of the trained image annotation model, images are mostly labeled manually to obtain an accurate training data set. However, an object in an image may contain holes, and the pixel blocks inside a hole belong to other objects. For example, the gaps between the branches and leaves of a tree in the image may show the sky, a house, and so on, rather than the tree. To save labor, during manual labeling the object together with all the holes it contains is labeled as the same object; for example, a tree and the holes it contains are all labeled as the tree. The accuracy of this labeling approach is low, which adversely affects the training of the image annotation model.
Disclosure of Invention
The application provides an image annotation method, an image annotation device, electronic equipment and a storage medium, which can determine a real hole in a foreground region marked as a target object, and further can re-mark the hole in the target object, so that the accuracy of image annotation is improved.
A first aspect of the present application provides an image annotation method, including:
generating a foreground area corresponding to a target object according to the pixel coordinates of the target object in the marked image;
generating a hypothetical hole in the foreground region, and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole, wherein the updated foreground region comprises the real hole;
and marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region.
Optionally, the foreground region is a foreground region in a binary image, and the generating a hypothetical hole in the foreground region and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole includes:
generating a hypothetical hole in a foreground region in the binary image, and updating the binary image according to a judgment result of whether the hypothetical hole is a real hole, wherein the foreground region in the updated binary image is the updated foreground region;
the marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region includes:
and marking the pixel block corresponding to the pixel coordinate as the hole in the marked image according to the pixel coordinate of the real hole in the updated binary image.
Optionally, the generating a hypothetical hole in the foreground region in the binary image, and updating the binary image according to a determination result of whether the hypothetical hole is a real hole includes:
A. randomly generating hypothetical holes with a preset probability in the foreground region of the binary image P_i corresponding to iteration cycle i, wherein i is an integer greater than or equal to 1;
B. judging whether the hypothetical holes on the foreground region of the P_i are real holes according to the pixel coordinates of the hypothetical holes and the correspondence, in the marked image, between the distribution of the hypothetical holes and the distribution of the real holes;
C. performing marking, erosion and dilation processing on the P_i in sequence according to the judgment result of whether the hypothetical holes on the foreground region of the P_i are real holes, to obtain an updated binary image P_i'' corresponding to iteration cycle i;
D. judging whether the i is equal to a preset value; if so, executing E; if not, taking the P_i'' as the binary image corresponding to the next iteration cycle and returning to execute A;
E. taking the updated binary image P_i'' corresponding to iteration cycle i as the updated binary image.
Optionally, the performing marking, erosion and dilation processing on the P_i in sequence according to the judgment result of whether the hypothetical holes on the foreground region of the P_i are real holes, to obtain the updated binary image P_i'' corresponding to iteration cycle i, includes:
marking the pixel blocks corresponding to the hypothetical holes on the foreground region of the P_i that are real holes as the background region, and marking the pixel blocks corresponding to the hypothetical holes on the foreground region of the P_i that are not real holes as the foreground region, to obtain a marked binary image corresponding to iteration cycle i;
performing erosion processing on the marked binary image corresponding to iteration cycle i to obtain an eroded binary image P_i' corresponding to iteration cycle i;
and performing dilation processing on the eroded binary image P_i' corresponding to iteration cycle i to obtain the P_i''.
Optionally, before labeling, according to the pixel coordinate of the real hole in the updated binary image, a pixel block corresponding to the pixel coordinate in the labeled image as a hole, the method further includes:
obtaining the pixel coordinates of the real hole in the updated binary image by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, wherein the pixel coordinates of the real hole are: the pixel coordinates corresponding to the pixel blocks that belong to the foreground region in the first iteration cycle and belong to the background region in the last iteration cycle.
Before generating the hypothetical holes in the foreground region in the binary image, the method further includes:
receiving a first instruction input by a user, wherein the first instruction is used for instructing dilation processing on the foreground region of the binary image, and the first instruction is input when the labeled outline area of the target object in the labeled image is smaller than a first preset outline area; and/or,
receiving a second instruction input by the user, wherein the second instruction is used for instructing erosion processing on the foreground region of the binary image, and the second instruction is input when the labeled outline area of the target object in the labeled image is larger than a second preset outline area, the second preset outline area being larger than the first preset outline area.
Optionally, after labeling, according to the pixel coordinates of the real hole in the updated binary image, a pixel block corresponding to the pixel coordinates in the labeled image as a hole, the method further includes:
receiving a labeling instruction input by a user, wherein the labeling instruction includes: an object corresponding to a target pixel block among the pixel blocks marked as holes in the marked image;
and marking the target pixel block as an object corresponding to the target pixel block according to the marking instruction.
A second aspect of the present application provides an image annotation apparatus comprising:
the first processing module is used for generating a foreground area corresponding to a target object according to the pixel coordinates of the target object in the marked image; generating a hypothetical hole in the foreground region, and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole, wherein the updated foreground region comprises the real hole;
and the second processing module is used for marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region.
Optionally, the foreground region is a foreground region in a binary image.
Correspondingly, the first processing module is specifically configured to generate a hypothetical void in a foreground region in the binary image, and update the binary image according to a determination result of whether the hypothetical void is a real void, where a foreground region in the updated binary image is the updated foreground region.
And the second processing module is used for marking the pixel block corresponding to the pixel coordinate as the hole in the marked image according to the pixel coordinate of the real hole in the updated binary image.
Optionally, the first processing module is specifically configured to:
A. randomly generating hypothetical holes with a preset probability in the foreground region of the binary image P_i corresponding to iteration cycle i, wherein i is an integer greater than or equal to 1;
B. judging whether the hypothetical holes on the foreground region of the P_i are real holes according to the pixel coordinates of the hypothetical holes and the correspondence, in the marked image, between the distribution of the hypothetical holes and the distribution of the real holes;
C. performing marking, erosion and dilation processing on the P_i in sequence according to the judgment result of whether the hypothetical holes on the foreground region of the P_i are real holes, to obtain an updated binary image P_i'' corresponding to iteration cycle i;
D. judging whether the i is equal to a preset value; if so, executing E; if not, taking the P_i'' as the binary image corresponding to the next iteration cycle and returning to execute A;
E. taking the updated binary image P_i'' corresponding to iteration cycle i as the updated binary image.
Optionally, the first processing module is specifically configured to: mark the pixel blocks corresponding to the hypothetical holes on the foreground region of the P_i that are real holes as the background region, and mark the pixel blocks corresponding to the hypothetical holes on the foreground region of the P_i that are not real holes as the foreground region, to obtain a marked binary image corresponding to iteration cycle i; perform erosion processing on the marked binary image corresponding to iteration cycle i to obtain an eroded binary image P_i' corresponding to iteration cycle i; and perform dilation processing on the eroded binary image P_i' corresponding to iteration cycle i to obtain the P_i''.
Optionally, the first processing module is further configured to obtain the pixel coordinates of the real hole in the updated binary image by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, where the pixel coordinates of the real hole are: the pixel coordinates corresponding to the pixel blocks that belong to the foreground region in the first iteration cycle and belong to the background region in the last iteration cycle.
Optionally, the image annotation device further includes: and a transceiver module.
The transceiver module is configured to receive a first instruction input by a user before the hypothetical holes are generated in the foreground region, where the first instruction is used to instruct dilation processing on the foreground region of the binary image, and the first instruction is input when the labeled contour area of the target object in the labeled image is smaller than a first preset contour area; and/or,
receive a second instruction input by the user, where the second instruction is used to instruct erosion processing on the foreground region of the binary image, and the second instruction is input when the labeled contour area of the target object in the labeled image is larger than a second preset contour area, the second preset contour area being larger than the first preset contour area.
Optionally, the transceiver module is further configured to receive a labeling instruction input by a user, where the labeling instruction includes: an object corresponding to a target pixel block among the pixel blocks marked as holes in the marked image.
Correspondingly, the second processing module is configured to label the target pixel block as an object corresponding to the target pixel block according to the labeling instruction.
A third aspect of the present application provides an electronic device comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored in the memory, so that the electronic equipment executes the image annotation method.
A fourth aspect of the present application provides a computer-readable storage medium, which stores computer-executable instructions, and when the computer-executable instructions are executed by a processor, the image annotation method is implemented.
The application provides an image annotation method, an image annotation device, electronic equipment and a storage medium, wherein the method includes the following steps: generating a foreground region corresponding to a target object according to the pixel coordinates of the target object in the marked image; generating hypothetical holes in the foreground region, and updating the foreground region according to the judgment result of whether the hypothetical holes are real holes, wherein the updated foreground region includes the real holes; and marking, in the marked image, the pixel blocks corresponding to the pixel coordinates as holes according to the pixel coordinates of the real holes in the updated foreground region. The image annotation method can determine the real holes within the region marked as the foreground of the target object, and can then re-mark the holes inside the target object, thereby improving the accuracy of image annotation.
Drawings
Fig. 1 is a first schematic flow chart of an image annotation method provided in the present application;
FIG. 2 is a schematic diagram of an unlabeled image provided herein;
FIG. 3 is a schematic diagram of a binary image corresponding to a labeled image provided in the present application;
FIG. 4 is a schematic diagram of an updated binary map provided herein;
FIG. 5 is a second flowchart illustrating an image annotation method provided in the present application;
FIG. 6 is a schematic structural diagram of an image annotation device provided in the present application;
fig. 7 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the embodiments of the present application, and it is obvious that the described embodiments are some but not all of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In order to solve the problem of low accuracy of image annotation in the prior art, the application provides an image annotation method, which determines a real hole of a foreground region corresponding to a target object in an annotated image, and then re-annotates the annotated image according to pixel coordinates of the real hole, so that the hole in the target object in the annotated image is annotated, and the purpose of improving the annotation accuracy is achieved.
Fig. 1 is a first flowchart of an image annotation method provided in the present application. The execution subject of the method flow shown in fig. 1 may be an image annotation device, which may be implemented by any software and/or hardware. As shown in fig. 1, the image annotation method provided in this embodiment may include:
s101, generating a foreground area corresponding to the target object according to the pixel coordinates of the target object in the marked image.
The marked image in this embodiment is a roughly labeled image. Rough labeling refers to labeling using prior-art techniques; optionally, it may be manual labeling or labeling by an image annotation model. In the marked image, the holes inside the target object are not labeled; the target object and the holes it contains are all labeled as the target object. In this embodiment, the labeling manner of the marked image is not limited.
The target object is a preset object, and the preset object is an object usually carrying a hole. Such as trees, fences, bicycles, etc. In this embodiment, after the image containing the target object is coarsely labeled, the pixel coordinates of the target object in the labeled image can be determined. For example, if the target object is a tree, then marking the tree in the image is: pixel blocks belonging to trees are labeled as trees. Correspondingly, pixel blocks belonging to the trees can be determined in the marked images, and pixel coordinates of the trees can be obtained.
In this embodiment, the foreground region of the target object may be generated according to the pixel coordinates of the target object in the labeled image. The foreground region of the target object is the region formed by the pixel blocks corresponding to the target object in the labeled image. In this embodiment, to facilitate the correspondence between the foreground region and the pixel blocks in the labeled image, the foreground region of the target object may be generated at a 1:1 ratio with the pixel blocks of the labeled image.
In a possible implementation manner, the foreground region is a foreground region in a binary image. That is, in this embodiment a binary image corresponding to the labeled image is generated, and the foreground region of the binary image consists of the pixel blocks corresponding to the target object. To facilitate the correspondence between the binary image and the pixel blocks in the labeled image, the generated binary image may have the same size as the labeled image.
It should be understood that, in the binary image, the region other than the pixel block region corresponding to the target object (the foreground region) is the background region. Optionally, the gray-scale value of the foreground region in this embodiment may be 0 or 255, that is, the foreground region may be black or white; correspondingly, the gray-scale value of the background region may be 255 or 0, that is, the background region may be white or black.
Illustratively, according to the pixel coordinates of the tree in the labeled image, a binary image with the same size as the labeled image is generated, and the foreground region of the binary image is the region formed by the pixel blocks corresponding to the pixel coordinates of the tree. Fig. 2 is a schematic diagram of an unlabeled image provided in the present application, and fig. 3 is a schematic diagram of a binary image corresponding to a labeled image provided in the present application. Fig. 2 is the unlabeled image, and the binary image corresponding to fig. 2 after rough labeling is shown in fig. 3. As shown in fig. 3, the target object is a tree, and in the prior art all the holes that the tree may contain are labeled as the tree (i.e., no holes are labeled within the tree in fig. 3). It should be understood that fig. 3 exemplarily shows the foreground region of the generated binary image as black.
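For illustration only (this is not part of the patent's claimed implementation), a minimal sketch of generating such a binary image with NumPy is given below, assuming the rough annotation is available as a per-pixel map of class ids; the names label_map and TREE_ID are assumptions made for the example, and the foreground is taken as white (255), which is one of the two options described above.

```python
import numpy as np

def build_binary_image(label_map: np.ndarray, target_label: int) -> np.ndarray:
    """Build a binary image with the same size as the labeled image.

    Pixels annotated with `target_label` (e.g. the class id used for "tree"
    in the rough labeling) become the foreground (white, 255); everything
    else becomes the background (0).
    """
    binary = np.zeros(label_map.shape, dtype=np.uint8)
    binary[label_map == target_label] = 255
    return binary

# Usage (names are illustrative): label_map is an H x W array of class ids
# produced by the rough labeling, TREE_ID the id assigned to trees.
# p1 = build_binary_image(label_map, TREE_ID)
```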
Optionally, in this embodiment, a binary image corresponding to an already labeled image (a coarsely labeled image) may be labeled finely.
Illustratively, by roughly labeling the target object in the image, the labeled contour area of the target object in the labeled image can be determined. If the rough labeling result is relatively accurate, the labeled contour area of the target object in the labeled image lies within a preset contour area range. The preset contour area range may be greater than a first preset contour area and less than a second preset contour area, where the second preset contour area is greater than the first preset contour area.
Optionally, in this embodiment, a first instruction input by a user may also be received, where the first instruction is used to instruct to perform dilation processing on a foreground region of the binary image. It should be noted that the first instruction is input when the labeled outline area of the target object in the labeled image is smaller than the first preset outline area.
If the user determines that the area of the foreground region in the binary image, that is, the labeled contour area of the target object, is smaller than the first preset contour area, the rough labeling has not labeled the target object completely, so the foreground region of the binary image is dilated to enlarge the foreground region corresponding to the target object and improve the accuracy of the rough labeling result. In this case the user can input a first instruction to instruct dilation processing on the foreground region of the binary image. Optionally, the first instruction includes a dilation factor, and the image annotation device may dilate the foreground region of the binary image accordingly.
Optionally, in this embodiment, a second instruction input by the user may also be received, where the second instruction is used to instruct to perform erosion processing on the foreground region of the binary image, and the second instruction is input when the labeled outline area of the target object in the labeled image is larger than a second preset outline area.
Illustratively, if the user determines that the area of the foreground region in the binary image, that is, the labeled contour area of the target object, is larger than the second preset contour area, the rough labeling has labeled a number of pixel blocks around the target object that do not belong to the target object as the target object, so the foreground region of the binary image needs to be eroded to shrink the foreground region corresponding to the target object and improve the accuracy of the rough labeling result. If the user determines that the labeled contour area of the target object is larger than the second preset contour area, a second instruction can be input to instruct erosion processing on the foreground region of the binary image. Optionally, the second instruction includes an erosion factor, and the image annotation device may erode the foreground region of the binary image accordingly. It should be understood that, when there is more than one target object, the first instruction and the second instruction may be received simultaneously.
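As an illustrative sketch of this optional pre-processing (under the assumption that the dilation and erosion factors are interpreted as square structuring-element sizes, which the patent does not specify), the first and second instructions could be applied with OpenCV's morphological operations as follows; the parameter names are illustrative.

```python
import cv2
import numpy as np

def adjust_foreground(binary: np.ndarray,
                      dilate_factor: int = 0,
                      erode_factor: int = 0) -> np.ndarray:
    """Optionally grow or shrink the foreground region before hole detection.

    dilate_factor stands in for the dilation factor of the first instruction
    (contour area smaller than the first preset area); erode_factor stands in
    for the erosion factor of the second instruction (contour area larger than
    the second preset area). A value of 0 means the instruction was not given.
    """
    out = binary
    if dilate_factor > 0:
        kernel = np.ones((dilate_factor, dilate_factor), np.uint8)
        out = cv2.dilate(out, kernel, iterations=1)
    if erode_factor > 0:
        kernel = np.ones((erode_factor, erode_factor), np.uint8)
        out = cv2.erode(out, kernel, iterations=1)
    return out
```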
S102, generating a hypothetical hole in the foreground area, and updating the foreground area according to the judgment result of whether the hypothetical hole is a real hole, wherein the updated foreground area comprises the real hole.
In this embodiment, an assumed hole may be generated in the foreground region, where the assumed hole is randomly generated. Optionally, in this embodiment, a preset probability may be used to generate the assumed hole in the foreground region. For example, if the preset probability is 2%, the corresponding assumed hole may be generated in the pixel block corresponding to the foreground region with a probability of 2%. If there are 100 pixel blocks corresponding to the foreground region, 2 pixel blocks are determined to be pixel blocks corresponding to the holes in the 100 pixel blocks.
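A minimal sketch of this random generation step, assuming the preset probability is applied independently to each foreground pixel (per pixel rather than per pixel block, for simplicity); the function and parameter names are illustrative.

```python
import numpy as np

def generate_hypothetical_holes(binary: np.ndarray, prob: float = 0.02,
                                rng=None) -> np.ndarray:
    """Return a boolean mask of hypothetical holes inside the foreground.

    Each foreground pixel (value 255) is independently selected as a
    hypothetical hole with probability `prob`, e.g. 0.02 for the 2% example.
    """
    rng = rng if rng is not None else np.random.default_rng()
    foreground = binary == 255
    candidates = rng.random(binary.shape) < prob
    return foreground & candidates
```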
After generating the hypothetical holes in the foreground region, it is necessary to determine whether each generated hypothetical hole is a real hole. In this embodiment, a Conditional Random Field (CRF) algorithm may be used to determine whether a hypothetical hole is a real hole. The judgment result of whether the hypothetical holes are real holes may include: which hypothetical holes are real holes and which are not.
It should be understood that, with the CRF algorithm adopted in this embodiment, once the pixel coordinates of the hypothetical holes are known, that is, once the distribution of the hypothetical holes is obtained, the distribution of the real holes can be predicted, that is, the pixel coordinates of the real holes can be determined; it can then be determined whether each hypothetical hole is a real hole, yielding the judgment result. The principle of the CRF algorithm is not described in detail in this embodiment.
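Because the patent does not spell out the CRF implementation, the following sketch only shows how the judgment result could be assembled once some CRF inference step has produced a per-pixel prediction of which pixels are holes; crf_predict_hole_mask is a hypothetical helper standing in for that inference, not an actual library call.

```python
import numpy as np

def judge_hypothetical_holes(hypothetical: np.ndarray, crf_hole_mask: np.ndarray):
    """Split the hypothetical holes into real and non-real ones.

    `hypothetical` is the boolean mask generated above; `crf_hole_mask` is a
    boolean mask of pixels predicted to be holes, e.g. obtained from a
    hypothetical crf_predict_hole_mask(image, hypothetical) step. A
    hypothetical hole is judged real only where the prediction also marks a hole.
    """
    real = hypothetical & crf_hole_mask        # judged to be real holes
    not_real = hypothetical & ~crf_hole_mask   # remain part of the foreground
    return real, not_real
```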
In this embodiment, the foreground region may be updated according to a determination result of whether the assumed hole is a real hole, so as to obtain an updated foreground region. The updating of the foreground region may be to mark, as a hole, a pixel block corresponding to an assumed hole determined as a real hole in the foreground region. It should be noted that the updated foreground region in this embodiment includes a real hole.
In a possible implementation manner, when the foreground region is a foreground region in the binary image, in the above embodiment, "generating an assumed hole in the foreground region, and updating the foreground region according to a determination result of whether the assumed hole is a real hole", is to "generate an assumed hole in the foreground region in the binary image, and update the binary image according to a determination result of whether the assumed hole is a real hole", where the foreground region in the updated binary image is the updated foreground region. Correspondingly, the above-mentioned "labeling the pixel block corresponding to the pixel coordinate as a hole in the labeled image according to the updated pixel coordinate of the real hole in the foreground region" is "labeling the pixel block corresponding to the pixel coordinate as a hole in the labeled image according to the updated pixel coordinate of the real hole in the binary image".
Fig. 4 is a schematic diagram of the updated binary image provided in the present application, corresponding to fig. 3. In this embodiment, the real holes in the tree in the foreground region can be marked in the updated binary image; the holes are shown as the white parts inside the black tree in fig. 4.
S103, marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region.
In this embodiment, after the updated foreground region is obtained, the pixel coordinates of the real hole in the updated foreground region can be determined, and the pixel block corresponding to those pixel coordinates is marked as a hole in the marked image.
It should be understood that, since the pixel blocks in the updated foreground region correspond one-to-one with the pixel blocks in the labeled image, the pixel block in the labeled image corresponding to the pixel coordinates of the real hole can be determined to be a real hole, and that pixel block can then be re-labeled as a hole.
For example, if pixel block A in the updated foreground region is a real hole, the pixel block corresponding to pixel block A in the labeled image is labeled as a hole, where the corresponding pixel block is the pixel block with the same pixel coordinates.
In a possible implementation manner, when the foreground region is a foreground region in the binary image, the "marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region" becomes: marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated binary image.
In a possible implementation manner, when the foreground region is a foreground region in a binary image, after the pixel blocks corresponding to the pixel coordinates are labeled as holes in the labeled image, whether the pixel blocks labeled as holes in the labeled image really belong to holes can be further checked by manual inspection.
In this embodiment, a labeling instruction input by a user may be received. The labeling instruction includes: the object corresponding to a target pixel block among the pixel blocks labeled as holes in the labeled image. The target pixel block is a pixel block that is labeled as a hole in the labeled image but does not actually correspond to a hole; for example, the target pixel block corresponds to a road, a house, or the like.
In this scenario, the annotated image and the pixel blocks annotated as holes in it are displayed, and the user may select a target pixel block by clicking or another selection method, which triggers sending a labeling instruction to the image annotation device. The labeling instruction includes the object corresponding to the target pixel block. Correspondingly, upon receiving the labeling instruction input by the user, the image annotation device can label the target pixel block in the annotated image as the object corresponding to it according to the labeling instruction, so the labeling accuracy of the pixel block can be improved.
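A minimal sketch of applying such a labeling instruction, assuming the instruction has already been reduced to the selected pixel block (as a boolean mask) and the class id of the object named by the user; all names are illustrative.

```python
import numpy as np

def apply_labeling_instruction(label_map: np.ndarray,
                               target_block_mask: np.ndarray,
                               object_id: int) -> np.ndarray:
    """Re-label a pixel block that was wrongly labeled as a hole.

    `target_block_mask` marks the pixel block the user selected, and
    `object_id` is the class carried by the labeling instruction
    (e.g. the id used for "road" or "house").
    """
    corrected = label_map.copy()
    corrected[target_block_mask] = object_id
    return corrected
```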
The image annotation method provided in this embodiment includes: generating a foreground region corresponding to a target object according to the pixel coordinates of the target object in the marked image; generating hypothetical holes in the foreground region, and updating the foreground region according to the judgment result of whether the hypothetical holes are real holes, where the updated foreground region includes the real holes; and marking, in the marked image, the pixel blocks corresponding to the pixel coordinates as holes according to the pixel coordinates of the real holes in the updated foreground region. The image annotation method provided in this embodiment can determine the real holes within the region marked as the foreground of the target object, and can then re-mark the holes inside the target object, thereby improving the accuracy of image annotation.
On the basis of the foregoing embodiment, the image annotation method provided by the present application is further described below with reference to fig. 5, taking the case where the foreground region is a foreground region in a binary image as an example. Fig. 5 is a second flowchart of the image annotation method provided in the present application. As shown in fig. 5, the image annotation method provided in this embodiment may include:
s501, generating a binary image according to the pixel coordinates of the target object in the labeled image, wherein the foreground area of the binary image is the target object.
In this embodiment, the assumed hole may be generated for multiple times, the foreground region of the binary image may be updated according to the determination result of whether the assumed hole generated each time is a real hole, and the updated binary image may be obtained after multiple update processes. This process is described in detail below in connection with S502-S506. It should be understood that the purpose of generating the hypothetical hole multiple times in this embodiment is that the generated hypothetical hole can completely traverse the pixel block of the foreground region, so that the real hole in the foreground region can be accurately acquired.
S502, randomly generating hypothetical holes with a preset probability in the binary image P_i corresponding to iteration cycle i, where i is an integer greater than or equal to 1.
The binary image generated from the labeled image is taken as the binary image P_1 corresponding to iteration cycle 1. It will be appreciated that the binary image obtained after updating the foreground region of P_1 according to the judgment result of whether the hypothetical holes generated in iteration cycle 1 are real holes is the binary image P_2 corresponding to iteration cycle 2.
In this embodiment, hypothetical holes may be randomly generated with a preset probability in the binary image P_i corresponding to iteration cycle i, where i is an integer greater than or equal to 1. For the manner of randomly generating hypothetical holes in the foreground region of the binary image P_i corresponding to each iteration cycle i, reference may be made to the related description in S102, which is not repeated here.
S503, judging, according to the pixel coordinates of the hypothetical holes and the correspondence between the distribution of the hypothetical holes and the distribution of the real holes in the labeled image, whether the hypothetical holes on the foreground region of P_i are real holes.
In this embodiment, once hypothetical holes are generated in the foreground region of the binary image, their pixel coordinates can be determined correspondingly. Since the binary image has the same size as the labeled image, the positions (i.e., pixel coordinates) of the hypothetical holes in the labeled image can be determined from the pixel coordinates of the hypothetical holes. According to the correspondence between the distribution of the hypothetical holes and the distribution of the real holes in the labeled image, that is, according to the CRF algorithm (which represents this correspondence), it can be determined whether the hypothetical holes in the labeled image are real holes, and correspondingly, whether the hypothetical holes on the foreground region of P_i are real holes.
S504, performing marking, erosion and dilation processing on P_i in sequence according to the judgment result of whether the hypothetical holes on the foreground region of P_i are real holes, to obtain an updated binary image P_i'' corresponding to iteration cycle i.
In this embodiment, after the judgment result of whether the hypothetical holes on the foreground region of P_i are real holes is obtained, P_i can be processed. The processing is to perform marking, erosion and dilation on P_i in sequence, so that the updated binary image P_i'' corresponding to iteration cycle i can be obtained.
Optionally, in this embodiment, the pixel blocks corresponding to the hypothetical holes on the foreground region of P_i that are real holes may be marked as the background region, and the pixel blocks corresponding to the hypothetical holes on the foreground region of P_i that are not real holes may be marked as the foreground region, to obtain the marked binary image corresponding to iteration cycle i; that is, the real holes are re-marked in the binary image.
Further, in this embodiment, erosion processing is performed on the marked binary image corresponding to iteration cycle i to obtain the eroded binary image P_i' corresponding to iteration cycle i. The erosion reduces the area of the foreground region; its purpose is to connect adjacent micro holes into larger holes, so as to avoid the useless work of a micro hole being re-marked as the foreground region after hypothetical holes are generated again in the next iteration cycle. By connecting adjacent micro holes into a large hole, the real holes marked in iteration cycle i have higher stability.
Further, in this embodiment, dilation processing may also be performed on the eroded binary image P_i' corresponding to iteration cycle i to obtain P_i''. The erosion reduces the contour area corresponding to the foreground region, and the dilation restores the contour area of the foreground region to its area before erosion. It should be understood that the dilation factor and the erosion factor used in this step are the same in this embodiment.
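A sketch of this per-iteration update (step S504), under the assumptions of the earlier sketches: real holes are set to background, and the erosion and dilation use the same structuring element, as required above; the kernel size k is an assumed parameter.

```python
import cv2
import numpy as np

def update_binary_image(binary: np.ndarray, real_holes: np.ndarray,
                        k: int = 3) -> np.ndarray:
    """Mark the real holes as background, then erode and dilate P_i with the same kernel."""
    marked = binary.copy()
    marked[real_holes] = 0   # real holes -> background (0)
    # hypothetical holes judged not real simply keep their foreground value (255)

    kernel = np.ones((k, k), np.uint8)
    eroded = cv2.erode(marked, kernel, iterations=1)    # P_i': foreground shrinks, nearby micro holes merge
    dilated = cv2.dilate(eroded, kernel, iterations=1)  # P_i'': foreground area roughly restored
    return dilated
```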
S505, judging whether i is equal to a preset value; if so, executing S506; if not, taking P_i'' as the binary image corresponding to the next iteration cycle and returning to execute S502.
In this embodiment, a preset value for the iteration cycle may be set in advance, so that the randomly generated hypothetical holes can be expected to traverse the pixel blocks corresponding to the foreground region. If it is determined that i equals the preset value, it is determined that the iteration ends, and the updated binary image P_i'' corresponding to iteration cycle i is taken as the updated binary image. If it is determined that i does not equal the preset value, it is determined that the iteration needs to continue, that is, i is increased by 1, P_i'' is taken as the binary image corresponding to the next iteration cycle, and S502 is executed again until i reaches the preset value.
S506, taking the updated binary image P_i'' corresponding to iteration cycle i as the updated binary image.
In this embodiment, when the iteration ends, that is, when iteration cycle i reaches the preset value, the updated binary image P_i'' corresponding to iteration cycle i is taken as the updated binary image.
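Putting S502 to S506 together, the whole iteration could be sketched as follows, reusing the helper functions sketched earlier; n_iters plays the role of the preset value for i, and crf_predict_hole_mask remains the hypothetical CRF step rather than a real API.

```python
def refine_binary_image(image, binary, n_iters: int = 10,
                        prob: float = 0.02, k: int = 3):
    """Iteratively refine the binary image (sketch of S502-S506)."""
    p_i = binary.copy()                               # P_1
    for i in range(1, n_iters + 1):
        hyp = generate_hypothetical_holes(p_i, prob)  # S502
        crf_mask = crf_predict_hole_mask(image, hyp)  # S503 (hypothetical CRF helper)
        real, _ = judge_hypothetical_holes(hyp, crf_mask)
        p_i = update_binary_image(p_i, real, k)       # S504, giving P_i''
    return p_i                                        # updated binary image (S505/S506)
```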
S507, obtaining the pixel coordinates of the real holes in the updated binary image by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, where the pixel coordinates of the real holes are: the pixel coordinates corresponding to the pixel blocks that belong to the foreground region in the first iteration cycle and belong to the background region in the last iteration cycle.
When the iteration is finished, comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, the pixel coordinates corresponding to the pixel blocks which belong to the foreground region in the first iteration cycle and the background region in the last iteration cycle can be determined. The pixel coordinate is the pixel coordinate of the real hole in the updated binary image. Therefore, in this embodiment, the pixel coordinates of the real hole in the updated binary image can be obtained by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle.
Illustratively, no real holes are included in the foreground region corresponding to the first iteration cycle, i.e. all the pixel blocks in the foreground region belong to the target object. In the last iteration cycle, the pixel block a marked as the background area is the real hole, and correspondingly, the pixel coordinate of the pixel block a is the pixel coordinate of the real hole.
Optionally, in this embodiment, the pixel coordinates corresponding to the pixel block belonging to the background region in the first iteration cycle and the pixel block belonging to the foreground region in the last iteration cycle may also be determined by comparing the background region corresponding to the first iteration cycle with the background region corresponding to the last iteration cycle. The pixel block corresponding to the pixel coordinate is as follows: originally labeled as a background region in the labeled image, but should be labeled as a pixel block of the target object.
Correspondingly, in this embodiment, the pixel coordinate may be obtained, and a pixel block corresponding to the pixel coordinate is labeled as the target object in the labeled image.
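A sketch of this comparison step (S507) together with the optional reverse comparison, assuming the binary images of the first and last iteration cycles have been kept; HOLE_ID and TREE_ID are illustrative class ids, not values from the patent.

```python
import numpy as np

def compare_first_and_last(p_first: np.ndarray, p_last: np.ndarray):
    """Return masks of real holes and of missed target-object pixels.

    Pixels that were foreground (255) in the first cycle but background (0)
    in the last cycle are the real holes; the reverse comparison recovers
    pixels that should have been labeled as the target object.
    """
    hole_mask = (p_first == 255) & (p_last == 0)
    object_mask = (p_first == 0) & (p_last == 255)
    return hole_mask, object_mask

# Usage (illustrative): re-label the annotation map at the returned positions.
# hole_mask, object_mask = compare_first_and_last(p1, p_final)
# label_map[hole_mask] = HOLE_ID
# label_map[object_mask] = TREE_ID
```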
In this embodiment, hypothetical holes may be generated in the foreground region of the binary image multiple times, and the foreground region of the binary image may be updated according to the judgment result of whether the hypothetical holes generated each time are real holes, so that the updated binary image is obtained after multiple updates. The purpose of generating hypothetical holes multiple times is that the generated hypothetical holes can fully traverse the pixel blocks of the foreground region, so that the real holes in the foreground region can be accurately obtained. Furthermore, after the real holes of the foreground region are determined in each iteration cycle, the foreground region can be eroded and dilated, so that adjacent micro holes are connected into a large hole; this avoids the micro holes being re-marked as the foreground region after hypothetical holes are generated in the next iteration cycle, gives the real holes marked in the iteration cycle higher stability, and improves the accuracy of hole labeling.
Fig. 6 is a schematic structural diagram of an image annotation device provided in the present application. The image annotation device may be an electronic device such as a server or a terminal (e.g., a smart phone, a tablet computer, a computer, etc.). As shown in fig. 6, the image labeling apparatus 600 includes: a first processing module 601, a second processing module 602, and a transceiver module 603.
The first processing module 601 is configured to generate a foreground region corresponding to a target object according to a pixel coordinate of the target object in the labeled image, generate a hypothetical void in the foreground region, and update the foreground region according to a determination result of whether the hypothetical void is a real void, where the updated foreground region includes the real void.
The second processing module 602 is configured to label, in the labeled image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region.
Optionally, the foreground region is a foreground region in the binary image.
Correspondingly, the first processing module 601 is specifically configured to generate a hypothetical void in a foreground region in the binary image, and update the binary image according to a determination result of whether the hypothetical void is a real void, where a foreground region in the updated binary image is an updated foreground region.
And the second processing module 602 is specifically configured to label, according to the pixel coordinates of the real hole in the updated binary image, the pixel block corresponding to the pixel coordinates in the labeled image as a hole.
Optionally, the first processing module 601 is specifically configured to:
A. randomly generating hypothetical holes with a preset probability in the foreground region of the binary image P_i corresponding to iteration cycle i, where i is an integer greater than or equal to 1;
B. judging whether the hypothetical holes on the foreground region of P_i are real holes according to the pixel coordinates of the hypothetical holes and the correspondence between the distribution of the hypothetical holes and the distribution of the real holes in the labeled image;
C. performing marking, erosion and dilation processing on P_i in sequence according to the judgment result of whether the hypothetical holes on the foreground region of P_i are real holes, to obtain an updated binary image P_i'' corresponding to iteration cycle i;
D. judging whether i is equal to a preset value; if so, executing E; if not, taking P_i'' as the binary image corresponding to the next iteration cycle and returning to execute A;
E. taking the updated binary image P_i'' corresponding to iteration cycle i as the updated binary image.
Optionally, the first processing module 601 is specifically configured to: mark the pixel blocks corresponding to the hypothetical holes on the foreground region of P_i that are real holes as the background region, and mark the pixel blocks corresponding to the hypothetical holes on the foreground region of P_i that are not real holes as the foreground region, to obtain a marked binary image corresponding to iteration cycle i; perform erosion processing on the marked binary image corresponding to iteration cycle i to obtain an eroded binary image P_i' corresponding to iteration cycle i; and perform dilation processing on the eroded binary image P_i' corresponding to iteration cycle i to obtain P_i''.
Optionally, the first processing module 601 is further configured to obtain a pixel coordinate of a real void in the updated binary image by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, where the pixel coordinate of the real void is: and pixel coordinates corresponding to pixel blocks belonging to the foreground area in the first iteration period and belonging to the background area in the last iteration period.
Optionally, the image annotation device further includes: a transceiver module 603.
A transceiver module 603, configured to receive a first instruction input by a user before the hypothetical holes are generated in the foreground region, where the first instruction is used to instruct dilation processing on the foreground region of the binary image, and the first instruction is input when the labeled contour area of the target object in the labeled image is smaller than a first preset contour area; and/or,
receive a second instruction input by the user, where the second instruction is used to instruct erosion processing on the foreground region of the binary image, and the second instruction is input when the labeled contour area of the target object in the labeled image is larger than a second preset contour area, the second preset contour area being larger than the first preset contour area.
Optionally, the transceiver module 603 is further configured to receive a labeling instruction input by a user, where the labeling instruction includes: the object corresponding to a target pixel block among the pixel blocks labeled as holes in the labeled image.
Correspondingly, the second processing module 602 is further configured to label the target pixel block as an object corresponding to the target pixel block according to the labeling instruction.
The image labeling apparatus provided in this embodiment has similar principles and technical effects to those of the image labeling method, and is not described herein again.
Fig. 7 is a schematic structural diagram of an electronic device provided in the present application. As shown in fig. 7, the electronic device 700 includes: a memory 701 and at least one processor 702.
A memory 701 for storing program instructions.
The processor 702 is configured to implement the image annotation method in this embodiment when the program instructions are executed, and specific implementation principles can be referred to the foregoing embodiments, which are not described herein again.
The electronic device 700 may also include an input/output interface 703.
The input/output interface 703 may include a separate output interface and input interface, or may be an integrated interface that integrates input and output. The output interface is used for outputting data, and the input interface is used for acquiring input data.
The present application further provides a readable storage medium, in which execution instructions are stored; when the execution instructions are executed by at least one processor of the electronic device, the image annotation method in the above embodiments is implemented.
The present application also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the electronic device may read the executable instructions from the readable storage medium, and the execution of the executable instructions by the at least one processor causes the electronic device to implement the image annotation method provided in the various embodiments described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to perform some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the above embodiments of the electronic device, it should be understood that the Processor may be a Central Processing Unit (CPU), other general-purpose processors, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present application may be embodied directly in a hardware processor, or in a combination of the hardware and software modules in the processor.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An image annotation method, comprising:
generating a foreground area corresponding to a target object according to the pixel coordinates of the target object in the marked image;
generating a hypothetical hole in the foreground region, and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole, wherein the updated foreground region comprises the real hole;
and marking, in the marked image, the pixel block corresponding to the pixel coordinates as a hole according to the pixel coordinates of the real hole in the updated foreground region.
2. The method according to claim 1, wherein the foreground region is a foreground region in a binary image corresponding to the annotated image, and the generating a hypothetical hole in the foreground region and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole comprises:
generating a hypothetical hole in the foreground region of the binary image, and updating the binary image according to a judgment result of whether the hypothetical hole is a real hole, wherein the foreground region of the updated binary image is the updated foreground region;
and the labeling, according to pixel coordinates of the real hole in the updated foreground region, the pixel block at the corresponding pixel coordinates in the annotated image as a hole comprises:
labeling, according to pixel coordinates of the real hole in the updated binary image, the pixel block at the corresponding pixel coordinates in the annotated image as a hole.
3. The method according to claim 2, wherein the generating a hypothetical hole in the foreground region of the binary image and updating the binary image according to a judgment result of whether the hypothetical hole is a real hole comprises:
A. randomly generating, with a preset probability, a hypothetical hole in the foreground region of the binary image P_i corresponding to iteration cycle i, wherein i is an integer greater than or equal to 1;
B. judging in the annotated image, according to the pixel coordinates of the hypothetical hole and the correspondence between the distribution of hypothetical holes and the distribution of real holes, whether the hypothetical hole in the foreground region of P_i is a real hole;
C. labeling, eroding and dilating P_i in sequence according to the judgment result of whether the hypothetical hole in the foreground region of P_i is a real hole, to obtain the updated binary image P_i'' corresponding to iteration cycle i;
D. judging whether i is equal to a preset value; if so, executing E; if not, taking P_i'' as the binary image corresponding to the next iteration cycle and returning to execute A;
E. taking the updated binary image P_i'' corresponding to iteration cycle i as the updated binary image.
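For illustration, a minimal sketch of steps A to E of claim 3, assuming a Python implementation with NumPy and OpenCV, a 3x3 structuring element, and an is_real_hole() oracle standing in for the correspondence check of step B; all of these choices are assumptions rather than part of the claim.

import numpy as np
import cv2


def iterate_holes(binary_image, is_real_hole, hole_probability=0.01, preset_cycles=5):
    kernel = np.ones((3, 3), np.uint8)
    p_i = binary_image.copy()                                  # uint8 mask, foreground = 1
    for i in range(1, preset_cycles + 1):
        # A: randomly generate hypothetical holes in the foreground with a preset probability.
        hypothetical = (np.random.rand(*p_i.shape) < hole_probability) & (p_i == 1)
        # B: decide which hypothetical holes are real (oracle supplied by the caller).
        real = hypothetical & is_real_hole(hypothetical)
        fake = hypothetical & ~real
        # C: label, then erode and dilate, giving the updated binary image P_i''.
        labeled = p_i.copy()
        labeled[real] = 0                                      # real holes -> background
        labeled[fake] = 1                                      # false holes stay foreground
        p_i = cv2.dilate(cv2.erode(labeled, kernel), kernel)   # P_i', then P_i''
        # D: P_i'' becomes the binary image of the next cycle until i reaches the preset value.
    return p_i                                                 # E: the updated binary image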
4. The method according to claim 3, wherein the labeling, eroding and dilating P_i in sequence according to the judgment result of whether the hypothetical hole in the foreground region of P_i is a real hole, to obtain the updated binary image P_i'' corresponding to iteration cycle i, comprises:
labeling the pixel block corresponding to the hypothetical hole that is a real hole in the foreground region of P_i as the background region, and labeling the pixel blocks corresponding to the hypothetical holes that are not real holes in the foreground region of P_i as the foreground region, to obtain the labeled binary image corresponding to iteration cycle i;
performing erosion on the labeled binary image corresponding to iteration cycle i to obtain the eroded binary image P_i' corresponding to iteration cycle i;
performing dilation on the eroded binary image P_i' corresponding to iteration cycle i to obtain said P_i''.
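The label, erode and dilate update detailed in claim 4 can be sketched as below; erosion followed by dilation with the same structuring element is, by definition, a morphological opening of the labeled binary image. The kernel size and the OpenCV calls are assumptions.

import numpy as np
import cv2


def update_binary_image(p_i, real_hole_mask, fake_hole_mask, kernel_size=3):
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    labeled = p_i.copy()
    labeled[real_hole_mask] = 0                    # real hypothetical holes -> background
    labeled[fake_hole_mask] = 1                    # non-hole hypotheses -> foreground
    p_i_eroded = cv2.erode(labeled, kernel)        # P_i'
    p_i_updated = cv2.dilate(p_i_eroded, kernel)   # P_i''
    # Equivalently: cv2.morphologyEx(labeled, cv2.MORPH_OPEN, kernel)
    return p_i_updated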
5. The method according to claim 4, wherein before the labeling, according to pixel coordinates of the real hole in the updated binary image, the pixel block at the corresponding pixel coordinates in the annotated image as a hole, the method further comprises:
obtaining the pixel coordinates of the real hole in the updated binary image by comparing the foreground region corresponding to the first iteration cycle with the foreground region corresponding to the last iteration cycle, wherein the pixel coordinates of the real hole are the pixel coordinates corresponding to the pixel blocks that belong to the foreground region in the first iteration cycle and belong to the background region in the last iteration cycle.
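Claim 5 recovers the real-hole coordinates by comparing the first and last foreground regions. The sketch below assumes, in line with the reading adopted above, that a real-hole pixel is one that belongs to the foreground region in the first iteration cycle and to the background region in the last.

import numpy as np


def real_hole_coordinates(first_foreground, last_foreground):
    # A pixel is a real hole if it belonged to the foreground region in the
    # first iteration cycle but no longer does in the last one.
    hole_mask = (first_foreground == 1) & (last_foreground == 0)
    return np.argwhere(hole_mask)   # (row, col) pixel coordinates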
6. The method according to claim 2, wherein before the generating a hypothetical hole in the foreground region of the binary image, the method further comprises:
receiving a first instruction input by a user, wherein the first instruction is used for instructing dilation of the foreground region of the binary image, and the first instruction is input when the labeled outline area of the target object in the annotated image is smaller than a first preset outline area; and/or,
receiving a second instruction input by the user, wherein the second instruction is used for instructing erosion of the foreground region of the binary image, the second instruction is input when the labeled outline area of the target object in the annotated image is larger than a second preset outline area, and the second preset outline area is larger than the first preset outline area.
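The optional pre-processing of claim 6 (dilating an under-sized annotation or eroding an over-sized one before hypothetical holes are generated) could be sketched as follows; the string encoding of the user instruction and the kernel size are assumptions.

import numpy as np
import cv2


def preprocess_foreground(binary_image, instruction, kernel_size=3):
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    if instruction == "dilate":   # labeled outline area smaller than the first preset area
        return cv2.dilate(binary_image, kernel)
    if instruction == "erode":    # labeled outline area larger than the second preset area
        return cv2.erode(binary_image, kernel)
    return binary_image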
7. The method according to claim 2, wherein after the labeling, according to pixel coordinates of the real hole in the updated binary image, the pixel block at the corresponding pixel coordinates in the annotated image as a hole, the method further comprises:
receiving a labeling instruction input by a user, wherein the labeling instruction comprises an object corresponding to a target pixel block among the pixel blocks labeled as holes in the annotated image;
and labeling the target pixel block as the object corresponding to the target pixel block according to the labeling instruction.
8. An image annotation apparatus, comprising:
the first processing module is used for generating a foreground region corresponding to a target object according to pixel coordinates of the target object in an annotated image, generating a hypothetical hole in the foreground region, and updating the foreground region according to a judgment result of whether the hypothetical hole is a real hole, wherein the updated foreground region comprises the real hole;
and the second processing module is used for labeling, according to pixel coordinates of the real hole in the updated foreground region, the pixel block at the corresponding pixel coordinates in the annotated image as a hole.
9. An electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, so that the electronic device performs the method according to any one of claims 1-7.
10. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of any one of claims 1-7.
CN201910814719.1A 2019-08-30 2019-08-30 Image labeling method, device, electronic equipment and storage medium Active CN112529830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910814719.1A CN112529830B (en) 2019-08-30 2019-08-30 Image labeling method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112529830A 2021-03-19
CN112529830B CN112529830B (en) 2023-11-14

Family

ID=74974098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910814719.1A Active CN112529830B (en) 2019-08-30 2019-08-30 Image labeling method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112529830B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1104916A1 (en) * 1999-12-04 2001-06-06 Luratech Gesellschaft für Luft-und Raumfahrt-Technologie & Multimedia mbH Method for compressing color and/or grey-level scanned documents
US20130162787A1 (en) * 2011-12-23 2013-06-27 Samsung Electronics Co., Ltd. Method and apparatus for generating multi-view
US20170220903A1 (en) * 2016-02-02 2017-08-03 Adobe Systems Incorporated Training Data to Increase Pixel Labeling Accuracy
CN107909138A (en) * 2017-11-14 2018-04-13 江苏大学 A kind of class rounded grain thing method of counting based on Android platform
US20180315180A1 (en) * 2017-04-28 2018-11-01 Fujitsu Limited Detecting portions of interest in images
CN108961246A (en) * 2018-07-10 2018-12-07 吉林大学 A kind of scanning electron microscope image hole recognition methods based on artificial intelligence
CN109308456A (en) * 2018-08-31 2019-02-05 北京字节跳动网络技术有限公司 The information of target object determines method, apparatus, equipment and storage medium
CN109583444A (en) * 2018-11-22 2019-04-05 博志生物科技有限公司 Hole region localization method, device and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Jian; He Yihu; Zhou Yuanhua: "新闻视频静态图形标识分割" [Segmentation of static graphic logos in news video], 上海交通大学学报 [Journal of Shanghai Jiao Tong University], no. 05 *

Also Published As

Publication number Publication date
CN112529830B (en) 2023-11-14

Similar Documents

Publication Publication Date Title
CN110176027B (en) Video target tracking method, device, equipment and storage medium
CN109840477B (en) Method and device for recognizing shielded face based on feature transformation
CN110245579B (en) People flow density prediction method and device, computer equipment and readable medium
CN109919002B (en) Yellow stop line identification method and device, computer equipment and storage medium
CN113344910B (en) Defect labeling image generation method and device, computer equipment and storage medium
CN112651953B (en) Picture similarity calculation method and device, computer equipment and storage medium
CN109726481B (en) Auxiliary method and device for robot construction and terminal equipment
CN111414916A (en) Method and device for extracting and generating text content in image and readable storage medium
CN113569852A (en) Training method and device of semantic segmentation model, electronic equipment and storage medium
CN112348737A (en) Method for generating simulation image, electronic device and storage medium
CN116168119A (en) Image editing method, image editing device, electronic device, storage medium, and program product
US20130121558A1 (en) Point Selection in Bundle Adjustment
JP6937782B2 (en) Image processing method and device
CN117216591A (en) Training method and device for three-dimensional model matching and multi-modal feature mapping model
CN112529830B (en) Image labeling method, device, electronic equipment and storage medium
CN113128696A (en) Distributed machine learning communication optimization method and device, server and terminal equipment
CN110414845B (en) Risk assessment method and device for target transaction
CN113379592B (en) Processing method and device for sensitive area in picture and electronic equipment
CN112861678B (en) Image recognition method and device
CN110310354B (en) Vertex identification method and device in three-dimensional scene
CN111476308A (en) Remote sensing image classification method and device based on prior geometric constraint and electronic equipment
CN111027325A (en) Model generation method, entity identification device and electronic equipment
CN116468985B (en) Model training method, quality detection device, electronic equipment and medium
CN115905135B (en) Multi-frame 3D file processing method and device
CN113554068B (en) Semi-automatic labeling method, device and readable medium for instance segmentation data set

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant