WO2020230444A1 - Information generation device, information generation method, and computer program - Google Patents
Information generation device, information generation method, and computer program
- Publication number: WO2020230444A1 (application PCT/JP2020/011714)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/162—Detection; Localisation; Normalisation using pixel segmentation or colour matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
Definitions
- This disclosure relates to an information generation device, an information generation method, and a computer program.
- Patent Document 1 discloses a technique in which the face region of a person included in an image is divided, and the state of spots and pores in the face region is determined based on the distribution of brightness values measured in each divided region.
- A non-limiting embodiment of this disclosure contributes to providing technology that reduces the workload of the user in the process of generating correct answer information.
- An information generation device according to this disclosure includes: an information management unit that manages, in association with each other, an image of a person's skin, marking information indicating marking by a first user on a characteristic portion of the skin in the image, and correction information indicating correction by a second user on the marking; and a marking UI processing unit that superimposes and displays, on the image, the marking indicated by the marking information and the correction indicated by the correction information.
- According to this disclosure, the workload of the user in the process of generating correct answer information can be reduced.
- A diagram for explaining a use example of the information generation device according to the present embodiment
- A diagram showing a configuration example of the algorithm verification system according to the present embodiment
- A diagram showing a management example of information in the information management unit according to the present embodiment
- A diagram showing an example of the face image according to the present embodiment
- A diagram showing an example of the marking according to the present embodiment
- A diagram showing an example of the ROI (Region Of Interest) according to the present embodiment
- A diagram showing an example of the correction marking according to the present embodiment
- A diagram showing an example of the correct answer marking according to the present embodiment
- A diagram showing an example of the integration of markings according to the present embodiment
- A diagram for explaining input of a marking using the marking UI according to the present embodiment
- A diagram for explaining correction of a marking using the correction UI according to the present embodiment
- A diagram for explaining modification of a marking using the marking UI according to the present embodiment
- A diagram for explaining approval of a marking using the correction UI according to the present embodiment
- A diagram showing an example of superimposed display of a worker's marking and the correct answer marking according to the present embodiment
- The information generation device 100 is a device for generating correct answer information that accurately indicates a spot area in an image obtained by photographing a person's face (hereinafter referred to as a "face image").
- A face image will be described below as an example, but the present embodiment is not limited to face images and can be applied to various images obtained by photographing a person's skin (for example, an image of an arm or an image of a leg).
- Although spots will be described below as an example, the present embodiment is not limited to spots and can be applied to various characteristic portions of the skin (for example, pores, wrinkles, bruises, and freckles).
- the correct answer information generated by the information generation device 100 is useful for verifying the accuracy of an algorithm that automatically detects a characteristic portion in an image of a person's skin.
- First, the information generation device 100 provides the marking UI 500 (see FIG. 6) to a worker, who is an example of the first user, and accepts input of markings for spots on the face image (S11).
- The worker does not have to be a skin expert, and may be, for example, an ordinary person who participates for a reward or as a volunteer.
- Each of the plurality of workers operates the marking UI 500 and inputs markings on the face image of the same person (S12).
- The information generation device 100 integrates the markings made by the workers (S13). For example, markings made by at least a predetermined ratio of the workers are kept, and markings made by fewer than the predetermined ratio of the workers are deleted. The details of the marking integration will be described later (see FIG. 5).
- Next, the information generation device 100 provides a correction UI 600 (see FIG. 7) to an expert, who is an example of the second user, and accepts corrections to the markings made by the workers (S14).
- The expert is a skin specialist who can accurately identify spots in a face image, and may be, for example, a doctor or another medical worker.
- the expert may be referred to as a corrector, manager or supervisor, etc.
- If the expert determines that the marking displayed on the correction UI 600 accurately indicates the spot on the face image, the expert approves the marking (S15: YES). In this case, the information generation device 100 generates correct answer information based on the approved marking (S16).
- If the expert determines that the marking displayed on the correction UI 600 does not accurately indicate the spot on the face image, the expert operates the correction UI 600 to correct the marking (S17).
- The information generation device 100 then provides the workers with the marking UI 500 including the correction made in S17, and accepts modification of the markings by the workers (S18).
- Each worker operates the marking UI500 and corrects the marking based on the correction (S19).
- the markings modified by each worker are reintegrated (S13) and provided to the expert through the correction UI 600 (S14).
- the target of the integration may be only the modified marking.
- FIG. 2 shows a configuration example of the algorithm verification system 10.
- the algorithm verification system 10 includes a photographing device 200, an information generation device 100, and an algorithm verification device 300.
- the photographing device 200 is, for example, a camera and includes a photographing unit 201.
- the photographing unit 201 photographs the face of a person to generate a face image, and stores the face image in the information management unit 101 included in the information generation device 100.
- the information generation device 100 includes an information management unit 101, a marking UI processing unit 102, a marking information generation unit 103, a correction UI processing unit 104, a correction information generation unit 105, and a correct answer information generation unit 106.
- The information management unit 101 manages the person ID, the face image, the shooting date and time, the marking information, the correction information, the corrected-marking information, and the correct answer information in association with each other.
- the person ID is information for identifying a person.
- the face image is the face image 20 of the person indicated by the person ID.
- the face image 20 may include spots 21.
- the shooting date and time is the date and time when the face image 20 was shot.
- The marking information is information indicating the markings 22 input by the workers for the spots 21 of the face image 20.
- The marking information may include identification information of the worker who input the marking 22 and/or the date and time when the marking 22 was input.
- The correction information is information indicating the ROI (Region Of Interest) 23, which indicates the range of the marking 22 that the expert has pointed out for correction.
- The correction information may include identification information of the expert who made the correction and/or the date and time when the correction was made.
- The corrected-marking information is information indicating the correction marking 24 input by a worker based on the ROI 23.
- The corrected-marking information may include identification information of the worker who input the correction marking 24 and/or the date and time when it was input.
- the correct answer information is information indicating the correct answer marking 25, which is an accurate marking on the stain 21 of the face image 20.
- Correct answer information is generated based on expert-approved marking information and / or correction information.
- the correct answer information may include the identification information of the expert who made the approval and / or the date and time when the approval was made.
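The associations managed by the information management unit 101, as described above, can be sketched as a small record store. All class and field names below are illustrative assumptions for this sketch, not identifiers from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MarkingRecord:
    """One association managed by unit 101 (field names are illustrative)."""
    person_id: str
    face_image_path: str
    shot_at: str                              # shooting date and time
    marking: Optional[dict] = None            # worker markings (S11-S13)
    roi_correction: Optional[dict] = None     # expert ROI + reason (S17)
    corrected_marking: Optional[dict] = None  # worker modification (S18-S19)
    correct_answer: Optional[dict] = None     # approved result (S16)

class InformationManager:
    """Keeps records keyed by (person ID, shooting date/time)."""
    def __init__(self):
        self._records = {}

    def store(self, rec: MarkingRecord) -> None:
        self._records[(rec.person_id, rec.shot_at)] = rec

    def lookup(self, person_id: str, shot_at: str) -> Optional[MarkingRecord]:
        return self._records.get((person_id, shot_at))
```

Keying by person ID and shooting date/time mirrors the fact that one person can have several face images taken at different times, each with its own marking history.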
- The marking UI processing unit 102 provides the marking UI 500, on which a worker inputs a marking 22 for a spot on the face image or inputs a correction marking 24 based on an ROI 23 set by an expert. For example, the marking UI processing unit 102 acquires the face image 20 from the information management unit 101 and displays it on the marking UI 500. Then, the marking UI processing unit 102 receives input of the marking 22 for the spot 21 of the face image 20 from the worker.
- Further, the marking UI processing unit 102 acquires the face image 20, the marking information, and the correction information associated with each other from the information management unit 101, and superimposes and displays, on the face image 20, the marking 22 indicated by the marking information and the ROI 23 indicated by the correction information.
- Then, the marking UI processing unit 102 receives input of the correction marking 24 within the ROI 23 from the worker.
- the details of the marking UI 500 will be described later (see FIGS. 6 and 8).
- The marking information generation unit 103 generates marking information based on the markings 22 input through the marking UI 500. At this time, the marking information generation unit 103 may integrate the markings 22 input by the workers to generate the marking information. The details of the marking integration will be described later.
- The marking information generation unit 103 stores the generated marking information in the information management unit 101 in association with the face image 20. Further, the marking information generation unit 103 generates corrected-marking information based on the correction marking 24 input through the marking UI 500, and stores the generated corrected-marking information in the information management unit 101.
- the correction UI processing unit 104 provides a correction UI 600 for an expert to correct and approve the marking 22 of the face image. For example, the correction UI processing unit 104 acquires the face image 20 and the marking information associated with each other from the information management unit 101, and superimposes and displays the marking 22 indicated by the marking information on the face image 20. Then, the correction UI processing unit 104 receives correction or approval for the marking 22 from an expert.
- Further, the correction UI processing unit 104 acquires the face image 20, the marking information, the correction information, and the corrected-marking information associated with each other from the information management unit 101, and superimposes and displays, on the face image 20, the marking 22 indicated by the marking information, the ROI 23 indicated by the correction information, and the correction marking 24 indicated by the corrected-marking information. Then, the correction UI processing unit 104 receives further correction or approval of the correction marking 24 from the expert. The details of the correction UI 600 will be described later (see FIGS. 7 and 9).
- the correction information generation unit 105 generates correction information based on the ROI 23 input through the correction UI 600.
- the correction information generation unit 105 stores the generated correction information in the information management unit 101 in association with the face image 20.
- the correct answer information generation unit 106 generates the correct answer marking 25 based on the marking 22 and / or the corrected marking 24 approved by the correction UI 600. Then, the correct answer information generation unit 106 generates correct answer information based on the correct answer marking 25, and stores the correct answer information in the information management unit 101 in association with the face image 20.
- the algorithm verification device 300 includes an algorithm execution unit 301 and a detection accuracy calculation unit 302.
- the algorithm execution unit 301 executes an algorithm for automatically detecting the stain 21 from the face image 20 and outputs information indicating the detection result (hereinafter referred to as “detection result information”).
- the detection accuracy calculation unit 302 acquires the correct answer information associated with the face image 20 used by the algorithm execution unit 301 from the information management unit 101. Then, the detection accuracy calculation unit 302 calculates the detection accuracy for the detection result information output from the algorithm execution unit 301 by using the correct answer information.
- The detection accuracy may be expressed, for each pixel constituting the face image 20, by the accuracy rate (accuracy), the precision rate (precision), the recall rate (recall), and/or the F value (F-measure). This makes it possible to evaluate both missed detection and excessive detection.
- the detection accuracy may be calculated for the entire face image 20 or within a predetermined range set for the face image 20.
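The pixel-level measures named above can be sketched as follows, with the correct answer and the detection result each represented as a flat 0/1 list over the evaluated pixels (a representation assumed for this sketch):

```python
def detection_accuracy(truth, detected):
    """Compute pixel-wise accuracy, precision, recall, and F-measure.

    truth: correct answer mask (1 = spot pixel), detected: algorithm output.
    """
    tp = sum(1 for t, d in zip(truth, detected) if t == 1 and d == 1)
    fp = sum(1 for t, d in zip(truth, detected) if t == 0 and d == 1)
    fn = sum(1 for t, d in zip(truth, detected) if t == 1 and d == 0)
    tn = sum(1 for t, d in zip(truth, detected) if t == 0 and d == 0)
    accuracy = (tp + tn) / len(truth)
    precision = tp / (tp + fp) if tp + fp else 0.0  # penalizes excessive detection
    recall = tp / (tp + fn) if tp + fn else 0.0     # penalizes missed detection
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return accuracy, precision, recall, f_measure
```

Because precision reacts to excessive detection and recall to missed detection, reporting both (or their harmonic mean, the F-measure) covers the two failure modes mentioned in the text.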
- Accurate correct answer information is required at the pixel level in order to prove the effect of treatment or skin care on small skin lesions such as age spots.
- the correct answer information is accurate at the pixel level because it is generated through correction by an expert who can accurately identify the stain 21 in the face image 20. Therefore, the algorithm verification device 300 can accurately calculate the detection accuracy of the algorithm for detecting the site of a small skin disease such as a spot by using the correct answer information generated by the information generation device 100.
- The marking information generation unit 103 may integrate the markings 22A, 22B, 22C, and 22D input by the plurality of workers, and generate, as the marking information, the area 401 marked by at least a predetermined ratio of the workers.
- the predetermined ratio may be arbitrarily set, for example, 20%, 50%, 80%, or the like.
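The integration step can be sketched as a per-pixel vote over binary masks, keeping a pixel only when at least the predetermined ratio of workers marked it. The 0/1-mask representation and the function name are assumptions of this sketch.

```python
def integrate_markings(masks, ratio=0.5):
    """Integrate per-worker binary marking masks by a vote threshold.

    masks: list of equally sized 2D 0/1 lists, one per worker.
    ratio: the predetermined ratio (e.g. 0.2, 0.5, 0.8).
    """
    n = len(masks)
    h, w = len(masks[0]), len(masks[0][0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            votes = sum(m[y][x] for m in masks)
            # keep the pixel when marked by at least `ratio` of the workers
            if votes >= ratio * n:
                out[y][x] = 1
    return out
```

With the four workers' markings 22A through 22D and a ratio of 0.5, a pixel survives integration only when at least two of the four workers marked it.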
- FIG. 6 shows an example of the marking UI 500. An input example of marking will be described with reference to FIG.
- The marking UI 500 includes a work area 501, a contour mode button 502, a marking mode button 503, a color change button 504, a mask button 505, a reference image display button 506, a correct marking transfer button 507, a registration button 508, a correction result display button 509, and a correction button 510.
- the marking UI processing unit 102 displays the face image 20 in the work area 501 and receives the input of the marking 22 from the operator.
- the operator inputs the marking 22 on the portion of the face image 20 that seems to be a stain 21 with a touch pen or a mouse, for example.
- When the contour mode button 502 is pressed, the marking UI processing unit 102 switches the work area 501 to the contour mode for inputting the contour of the marking 22.
- In the contour mode, the inside of the input contour 521 is automatically marked (filled in). The contour mode is useful for marking a wide spot.
- When the marking mode button 503 is pressed, the marking UI processing unit 102 switches the work area 501 to the marking mode for directly inputting the marking 22.
- In the marking mode, the touched portion is marked (filled in).
- The marking mode is useful for marking small spots.
- The size of the marking 22 (for example, the diameter of the circle to be filled in) may be enlarged or reduced with the mouse wheel.
- When the color change button 504 is pressed, the marking UI processing unit 102 changes the color of the face image 20.
- Examples of the color change include color enhancement, normalization by the color standard deviation, unsharp masking, and gamma correction. For example, when it is difficult to distinguish a spot from a shadow in the displayed face image 20, the worker can discriminate between them by changing the color.
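As one of the color changes listed above, gamma correction can be sketched as follows for 0-255 pixel values; a gamma below 1 brightens dark regions, which may help separate a spot from a shadow. The function name is illustrative.

```python
def gamma_correct(pixels, gamma):
    """Apply gamma correction to a list of 0-255 intensity values.

    Each value is normalized to [0, 1], raised to `gamma`, and rescaled;
    gamma < 1 brightens dark regions, gamma > 1 darkens them.
    """
    return [round(255 * (p / 255) ** gamma) for p in pixels]
```

In practice the same mapping would be applied per channel (or via a 256-entry lookup table) to the whole face image 20 shown in the work area 501.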
- When the mask button 505 is pressed, the marking UI processing unit 102 sets a mask 522 on a marking-prohibited area (for example, the eye area) in the face image 20.
- the marking UI processing unit 102 may automatically set the mask 522 based on the recognition result of the face part for the face image 20.
- the marking UI processing unit 102 may set the mask 522 based on the setting by an expert.
- the marking UI processing unit 102 may accept the input of the mask 522 from the operator.
- When the reference image display button 506 is pressed, the marking UI processing unit 102 displays, in a separate window, another face image of the same person as the face image 20 displayed in the work area 501.
- The other face image may differ in brightness and/or angle from the displayed face image 20.
- The other face image may be aligned with the face image 20 displayed in the work area 501 and displayed in the separate window. For example, when a part of the face image 20 is enlarged in the work area 501, the same range of the other face image may be enlarged in the separate window.
- When it is difficult to distinguish a spot from a shadow, for example, the worker can discriminate between them by referring to the other face image in the separate window.
- When the correct marking transfer button 507 is pressed, the marking UI processing unit 102 transfers the correct answer marking 25 generated for another face image of the same person onto the face image 20 displayed in the work area 501.
- For example, the marking UI processing unit 102 may transfer the correct answer marking 25 of the other face image whose shooting date and time is closest to that of the face image 20 displayed in the work area 501. This can improve the efficiency of the marking work. The details of the transfer of the correct answer marking 25 will be described later (see FIG. 11).
- When the registration button 508 is pressed, the marking UI processing unit 102 outputs the marking 22 input in the work area 501 to the marking information generation unit 103.
- the marking information generation unit 103 generates marking information based on the output marking 22, and stores it in the information management unit 101 in association with the face image 20.
- the correction result display button 509 and the correction button 510 will be described later (see FIG. 8).
- the marking UI 500 provides various functions for efficiently marking stains on the face image. The operator can efficiently mark stains through the marking UI 500.
- the marking UI 500 may display a practice image for the operator to practice marking the stain in the work area 501. In this case, the marking UI 500 may superimpose the markings made in advance by the expert on the practice image.
- FIG. 7 shows an example of the correction UI 600. An example of correcting the marking will be described with reference to FIG. 7.
- the correction UI 600 has a correction area 601, a correction reason button 602 (602A, 602B, 602C, 602D), a remand button 603, a difference display button 604, and an approval button 605.
- The correction UI processing unit 104 acquires the face image 20 and the marking information associated with it from the information management unit 101, and superimposes and displays the marking 22 indicated by the marking information on the face image 20. Then, the correction UI processing unit 104 receives the setting of the ROI 23 from the expert. The expert sets the ROI 23 over the range in which a correction of the marking 22 is to be pointed out, using, for example, a stylus or a mouse.
- the correction reason button 602 is a button for associating the correction reason with the ROI 23.
- The correction UI processing unit 104 associates the correction reason "completely excessive" with the ROI 23A, the correction reason "partially excessive" with the ROI 23B, the correction reason "partially missing" with the ROI 23C, and the correction reason "completely missing" with the ROI 23D.
- The display mode of the ROI 23 may differ for each correction reason in line shape, line thickness, and/or line color, so that the correction reason associated with each ROI 23 can be distinguished at a glance.
- For example, the ROIs 23C and 23D, associated with missing-marking correction reasons, may be displayed with solid lines, while the ROIs 23A and 23B, associated with excessive-marking correction reasons, may be displayed with broken lines.
- The correction UI 600 may display a legend of these display modes.
- When the remand button 603 is pressed, the correction UI processing unit 104 outputs the set ROI 23 and the associated correction reason to the correction information generation unit 105.
- the correction information generation unit 105 generates correction information including the output ROI 23 and the correction reason, and stores the correction information in association with the face image 20 in the information management unit 101.
- the difference display button 604 and the approval button 605 will be described later (see FIG. 9).
- the correction UI 600 provides various functions for efficiently correcting the marking 22 performed by the operator. Therefore, the expert can operate the correction UI 600 to efficiently correct the marking 22 made by the operator.
- the input of the reason for correction is not limited to the selective input by the above-mentioned reason for correction button 602.
- For example, the correction UI 600 may provide a function for inputting an arbitrary correction reason for the ROI 23.
- the marking UI 500 shown in FIG. 8 is the same as that shown in FIG.
- When the correction result display button 509 is pressed, the marking UI processing unit 102 acquires the face image 20, the marking information, the correction information, and the correction reason associated with each other from the information management unit 101, and superimposes and displays the marking 22, the ROI 23, and the correction reason on the face image 20. The worker then inputs the correction marking 24 within the ROI 23.
- For example, the worker adds the correction marking 24A in the ROIs 23C and 23D, which were pointed out with the correction reason "completely missing" or "partially missing".
- The added correction marking 24A is displayed in the work area 501 in a manner different from that of the original marking 22.
- Further, the worker deletes at least a part of the marking in the ROIs 23A and 23B, which were pointed out with the correction reason "completely excessive" or "partially excessive".
- The correction marking 24B, which indicates that the deletion was made, is displayed in the work area 501, also in a manner different from that of the original marking 22.
- When the correction button 510 is pressed, the marking UI processing unit 102 outputs the correction marking 24 to the marking information generation unit 103.
- The marking information generation unit 103 generates corrected-marking information based on the output correction marking 24, and stores the corrected-marking information in the information management unit 101 in association with the face image 20.
- The correction UI processing unit 104 acquires the face image 20, the marking information, the correction information, and the corrected-marking information associated with each other from the information management unit 101. Then, the correction UI processing unit 104 superimposes and displays, on the face image 20, the marking 22 indicated by the marking information, the ROI 23 indicated by the correction information, the correction reason, and the correction marking 24 indicated by the corrected-marking information.
- When the approval button 605 is pressed, the correction UI processing unit 104 outputs the marking 22, the ROI 23, and the correction marking 24 displayed in the correction area 601 to the correct answer information generation unit 106.
- the correct answer information generation unit 106 generates correct answer marking 25 (that is, correct answer information) based on the output marking 22 and correction marking 24, and stores it in the information management unit 101.
- Since the expert only has to confirm the corrections made by the workers within the ranges in which the ROIs 23 are set, the workers' corrections can be confirmed efficiently.
- The marking UI processing unit 102 may superimpose and display the marking 22 input by a worker and the correct answer marking 25. In this case, the marking UI processing unit 102 may feed back to the worker, as a score, whether the marking 22 input by the worker tends to be larger or smaller than the correct answer marking 25. As a result, the worker can recognize the tendency of his or her marking 22 and can mark the spot 21 more accurately from the next time.
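The feedback score suggested above could, for example, compare a worker's mask with the correct answer mask and report whether the worker tends to mark too much or too little. The score definition below is an assumption of this sketch, not the disclosed method.

```python
def marking_tendency(worker, correct):
    """Score a worker's marking against the correct answer marking.

    Both arguments are flat 0/1 lists over the same pixels. The score is
    positive when the worker tends to over-mark (marking larger than the
    correct answer) and negative when the worker tends to under-mark,
    normalized by the size of the correct answer marking.
    """
    over = sum(1 for w, c in zip(worker, correct) if w == 1 and c == 0)
    under = sum(1 for w, c in zip(worker, correct) if w == 0 and c == 1)
    total = sum(correct) or 1  # avoid division by zero on an empty answer
    return (over - under) / total
```

A score near zero would indicate markings that match the correct answer in extent; consistently positive or negative scores would reveal the worker's habit mentioned in the text.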
- a plurality of face feature points P (P1, P2, P3, P4) can be specified for the face image 20A.
- the correct answer marking 25 may be associated with the mesh region R1 composed of facial feature points P1, P2, P3, and P4, as illustrated in FIG. 11 (A).
- the correct answer marking 25 can be transferred to another face image 20B of the same person.
- The mesh region R1 shown in FIG. 11(A) and the mesh region R2 shown in FIG. 11(B) are composed of the same facial feature points P1, P2, P3, and P4. Therefore, when the correct marking transfer button 507 is pressed, the marking UI processing unit 102 transfers (superimposes and displays) the correct answer marking 25 associated with the mesh region R1 shown in FIG. 11(A) onto the mesh region R2 of the other face image 20B shown in FIG. 11(B).
- As a result, the correct answer marking 25 can be transferred to a substantially correct spot position even for another face image 20B with a different facial expression, size, and the like.
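The transfer can be sketched with barycentric coordinates: a marking point is expressed relative to a triangle of facial feature points in face image 20A and re-plotted on the matching triangle in face image 20B. Using triangles (rather than the quadrilateral mesh region R1) is a simplification assumed for this sketch.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    d = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / d
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / d
    return u, v, 1.0 - u - v

def transfer_point(p, tri_a, tri_b):
    """Map p from the feature-point triangle in image A to the one in image B."""
    u, v, w = barycentric(p, *tri_a)
    (ax, ay), (bx, by), (cx, cy) = tri_b
    # The same barycentric weights re-plot the point on the target triangle,
    # so the marking keeps its position relative to the facial feature points.
    return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)
```

Because the weights are relative to the feature points, the mapped point lands at the corresponding spot position even when the target face differs in size or expression, which is the property the transfer relies on.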
- the configuration of the information generator 100 shown in FIG. 2 is an example.
- the marking UI processing unit 102 and the marking information generation unit 103 may be implemented in a device (for example, a PC) operated by the worker.
- the correction UI processing unit 104, the correction information generation unit 105, and the correct-answer information generation unit 106 may be implemented in a device (for example, a PC) operated by the expert.
- the information management unit 101 may be implemented in a server device connected, via a communication network, to the PCs operated by the expert and the worker.
- alternatively, the information generation device 100 may be a server device connected to the Internet, in which case the marking UI processing unit 102 provides the marking UI 500 to the browser used by the worker, and the correction UI processing unit 104 provides the correction UI 600 to the browser used by the expert.
- the marking information generation unit 103 may correct each worker's marking tendency when integrating the markings input from a plurality of workers.
- a worker's marking tendency may be identified based on the difference between the marking entered by that worker and the correct-answer marking.
- in the description above, the corrections serve to make correction by an expert efficient, but the present invention is not limited to this. For example, to meet a customer's request (for example, a goal of reducing wrinkles to a certain extent), the customer (for example, a patient) may correct the marking entered by the expert (for example, the doctor in charge of the treatment). Such a correction can indicate the customer's desire to eliminate or reduce wrinkles in a particular area.
- the information generation device 100 includes an information management unit 101 that manages, in association with each other, an image of a person's skin, marking information indicating a marking made by a first user (for example, a worker) on a characteristic part of the skin in the image, and correction information indicating a correction made by a second user (for example, an expert) to the marking information, and a marking UI processing unit 102 that superimposes and displays, on the image, the marking indicated by the marking information and the correction indicated by the correction information.
- the characteristic part of the skin may be any of skin spots, pores and wrinkles.
- with this configuration, the face image, the marking made by the first user, and the second user's correction to that marking are superimposed and displayed, so the first user can easily recognize the second user's correction to the marking he or she entered.
- the marking UI processing unit 102 may accept input of markings on an image from a plurality of first users.
- the information generation device 100 may further include a marking information generation unit 103 that integrates the plurality of markings input from the plurality of first users to generate the marking information. With this configuration, the markings input from a plurality of first users are integrated, so the accuracy with which the marking information indicates the characteristic part of the skin is improved compared with the case where a single first user inputs a marking.
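One plausible way to integrate markings from several first users is a pixel-wise vote over binary masks. The patent does not specify the integration rule, so the majority-vote rule and the threshold below are assumptions for illustration:

```python
# Sketch: integrate markings entered by multiple workers into one marking.
# A pixel-wise vote with an adjustable threshold is assumed here; the
# patent itself does not name a specific integration rule.
import numpy as np

def integrate_markings(masks, threshold=0.5):
    """Keep pixels marked by at least `threshold` of the workers.

    `masks` is a list of same-shape boolean arrays (True = marked pixel).
    """
    votes = np.stack([m.astype(float) for m in masks]).mean(axis=0)
    return votes >= threshold

# Three workers mark the same 1x3 image slightly differently.
m1 = np.array([[True, True, False]])
m2 = np.array([[True, False, False]])
m3 = np.array([[True, True, True]])
print(integrate_markings([m1, m2, m3]).tolist())  # [[True, True, False]]
```

Raising the threshold keeps only pixels most workers agree on; lowering it keeps any pixel marked by at least one worker.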
- the information management unit 101 may manage corrected marking information indicating a modification of the marking made by the first user based on the correction.
- the information generation device 100 may further include a correct-answer information generation unit 106 that, when the second user gives approval, generates correct-answer information indicating a marking (for example, the correct-answer marking) on the characteristic part of the skin in the image based on the marking information and the corrected marking information. With this configuration, the correct-answer information is generated using the marking modified on the basis of the correction, so correct-answer information that accurately marks the characteristic part of the skin can be generated.
- the marking UI processing unit 102 may display the difference between the marking indicated by the marking information and the marking indicated by the correct-answer information (for example, the correct-answer marking).
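The difference display could, for example, be computed as a per-pixel label separating over-marked pixels from missed ones, which the UI could then render in distinct colors. The 0/1/2 encoding below is an illustrative assumption, not from the patent:

```python
# Sketch: per-pixel difference between a first user's marking and the
# correct-answer marking, for rendering in the marking UI.
import numpy as np

def marking_difference(worker_mask, correct_mask):
    """0 = agreement, 1 = over-marked (worker only), 2 = missed (correct only)."""
    diff = np.zeros(worker_mask.shape, dtype=np.uint8)
    diff[np.logical_and(worker_mask, ~correct_mask)] = 1
    diff[np.logical_and(~worker_mask, correct_mask)] = 2
    return diff

worker = np.array([[True, True, False]])
correct = np.array([[True, False, True]])
print(marking_difference(worker, correct).tolist())  # [[0, 1, 2]]
```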
- the information generation device 100 may further include a correction UI processing unit 104 that superimposes and displays, on the image, the marking indicated by the marking information and accepts a correction to the marking from the second user, and a correction information generation unit 105 that generates the correction information based on the input correction. With this configuration, the marking input by the first user is superimposed and displayed on the face image, so the second user can easily correct the first user's marking.
- the correction information includes information indicating a correction range (for example, an ROI) and a correction reason, and the marking UI processing unit 102 may display the correction range in a different manner for each correction reason. With this configuration, the display mode of the correction range differs depending on the correction reason, so the first user can easily recognize the reason for a correction from how its range is displayed.
- the marking UI processing unit 102 may deform a second face image of the same person, different from the first face image, into a shape that fits the face included in the first face image, and display it. With this configuration, the second face image is displayed in a form compatible with the first face image, so the first user can easily compare the characteristic part of the skin in the first face image with the second face image.
- the marking UI processing unit 102 may superimpose and display, on the first face image, the marking (for example, the correct-answer marking) indicated by the correct-answer information of a second face image of the same person, different from the first face image. With this configuration, the correct-answer marking is superimposed and displayed on the first face image, so the first user can efficiently mark the first face image with reference to the correct-answer marking.
- FIG. 12 is a diagram showing a hardware configuration of a computer that realizes the functions of each device by a program.
- the computer 2100 includes an input device 2101 such as a keyboard, mouse, touch pen, and/or touch pad; an output device 2102 such as a display or speaker; a CPU (Central Processing Unit) 2103; a GPU (Graphics Processing Unit) 2104; a ROM (Read Only Memory); a RAM (Random Access Memory) 2106; a storage device 2107 such as a hard disk device or SSD (Solid State Drive); a reading device 2108 that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory; and a transmitting/receiving device 2109 that communicates via a network. Each unit is connected by a bus 2110.
- the reading device 2108 reads the program from the recording medium on which the program for realizing the function of each of the above devices is recorded, and stores the program in the storage device 2107.
- the transmission / reception device 2109 communicates with the server device connected to the network, and stores the program downloaded from the server device for realizing the function of each device in the storage device 2107.
- the CPU 2103 copies the program stored in the storage device 2107 to the RAM 2106, and sequentially reads and executes the instructions included in the program from the RAM 2106, thereby realizing the functions of the above devices.
- LSI is an integrated circuit. The functional blocks may be individually formed into single chips, or may be integrated into one chip that includes some or all of them. Although the term LSI is used here, it may also be called an IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
- the method of making an integrated circuit is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- an FPGA (Field Programmable Gate Array) that can be programmed after manufacturing, or a reconfigurable processor in which the connections and settings of the circuit cells inside the LSI can be reconfigured, may be used.
- One aspect of the present disclosure is useful for generating information for verifying a detection algorithm.
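Verification of a detection algorithm against the generated correct-answer information could, for instance, use intersection-over-union (IoU) between the detected mask and the correct-answer mask. The metric choice is an assumption for illustration; the document does not name a specific accuracy measure used by the detection accuracy calculation unit 302:

```python
# Sketch: score a spot-detection algorithm's output mask against the
# correct-answer mask generated by the information generation device.
# IoU is assumed as the accuracy metric for illustration.
import numpy as np

def iou(pred_mask, truth_mask):
    """Intersection over union of two same-shape boolean masks."""
    inter = np.logical_and(pred_mask, truth_mask).sum()
    union = np.logical_or(pred_mask, truth_mask).sum()
    return float(inter) / float(union) if union else 1.0

pred = np.array([[True, True, False]])   # algorithm's detection
truth = np.array([[True, False, True]])  # correct-answer marking
print(round(iou(pred, truth), 3))  # 0.333
```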
Abstract
Description
The information generation device 100 is a device for generating correct-answer information that accurately indicates spot regions in an image of a person's face (hereinafter, "face image"). Although a face image is used as an example below, the present embodiment is not limited to face images and is applicable to various images of a person's skin (for example, images of arms and legs). Likewise, although spots are used as an example below, the present embodiment is not limited to spots and is applicable to various characteristic parts of the skin (for example, pores, wrinkles, bruises, and freckles). The correct-answer information generated by the information generation device 100 is useful for verifying the accuracy of an algorithm that automatically detects characteristic parts of the skin in an image of a person's skin.
FIG. 2 shows a configuration example of the algorithm verification system 10. The algorithm verification system 10 includes an imaging device 200, the information generation device 100, and an algorithm verification device 300.
The imaging device 200 is, for example, a camera, and includes an imaging unit 201. The imaging unit 201 photographs a person's face to generate a face image, and stores the face image in the information management unit 101 of the information generation device 100.
The information generation device 100 includes an information management unit 101, a marking UI processing unit 102, a marking information generation unit 103, a correction UI processing unit 104, a correction information generation unit 105, and a correct-answer information generation unit 106.
The person ID is information for identifying a person. The face image is, as shown in FIG. 4A, the face image 20 of the person indicated by the person ID. The face image 20 may contain a spot 21. The shooting date and time is the date and time when the face image 20 was taken.
The algorithm verification device 300 includes an algorithm execution unit 301 and a detection accuracy calculation unit 302.
An example of marking integration will be described with reference to FIG. 5.
FIG. 6 shows an example of the marking UI 500. An example of marking input will be described with reference to FIG. 6.
FIG. 7 shows an example of the correction UI 600. An example of marking correction will be described with reference to FIG. 7.
An example of marking modification will be described with reference to FIG. 8. The marking UI 500 shown in FIG. 8 is the same as that shown in FIG. 6.
An example of marking approval will be described with reference to FIG. 9. The correction UI 600 shown in FIG. 9 is the same as that shown in FIG. 7.
An example of feeding back to the worker the difference between the marking 22 input by the worker and the correct-answer marking 25 will be described with reference to FIG. 10.
The transfer of the correct-answer marking 25 to another face image, mentioned above, will be described in detail with reference to FIG. 11.
The configuration of the information generation device 100 shown in FIG. 2 is an example. For example, the marking UI processing unit 102 and the marking information generation unit 103 may be implemented in a device (for example, a PC) operated by the worker. The correction UI processing unit 104, the correction information generation unit 105, and the correct-answer information generation unit 106 may be implemented in a device (for example, a PC) operated by the expert. The information management unit 101 may be implemented in a server device connected, via a communication network, to the PCs operated by the expert and the worker. Alternatively, the information generation device 100 may be a server device connected to the Internet, in which case the marking UI processing unit 102 provides the marking UI 500 to the browser used by the worker, and the correction UI processing unit 104 provides the correction UI 600 to the browser used by the expert.
The information generation device 100 according to the present embodiment includes an information management unit 101 that manages, in association with each other, an image of a person's skin, marking information indicating a marking made by a first user (for example, a worker) on a characteristic part of the skin in the image, and correction information indicating a correction made by a second user (for example, an expert) to the marking information, and a marking UI processing unit 102 that superimposes and displays, on the image, the marking indicated by the marking information and the correction indicated by the correction information. The characteristic part of the skin may be any of spots, pores, and wrinkles of the skin. The target may also be, instead of a characteristic part of the skin, the detection of differences between a past and a present face photograph (for example, a change in cheek sagging), or the recognition of an object (for example, detection of a corroded part of an object).
100 information generation device
101 information management unit
102 marking UI processing unit
103 marking information generation unit
104 correction UI processing unit
105 correction information generation unit
106 correct-answer information generation unit
200 imaging device
201 imaging unit
300 algorithm verification device
301 algorithm execution unit
302 detection accuracy calculation unit
500 marking UI
501 work area
502 contour mode button
503 marking mode button
504 color change button
505 mask button
506 reference image display button
507 correct-marking transfer button
508 registration button
509 correction result display button
510 modification button
600 correction UI
601 correction area
602, 602A, 602B, 602C, 602D correction reason buttons
603 remand button
604 difference display button
605 approval button
Claims (11)
- An information generation device comprising: an information management unit that manages, in association with each other, an image of a person's skin, marking information indicating a marking made by a first user on a characteristic part of the skin in the image, and correction information indicating a correction made by a second user to the marking information; and a marking UI processing unit that superimposes and displays, on the image, the marking indicated by the marking information and the correction indicated by the correction information.
- The information generation device according to claim 1, wherein the marking UI processing unit accepts input of the marking on the image from a plurality of the first users, and the device further comprises a marking information generation unit that integrates the plurality of markings input from the plurality of first users to generate one piece of marking information.
- The information generation device according to claim 1, wherein the information management unit manages corrected marking information indicating a modification of the marking made by the first user based on the correction, and the device further comprises a correct-answer information generation unit that, when the second user gives approval, generates correct-answer information indicating a marking on the characteristic part of the skin in the image based on the marking information and the corrected marking information.
- The information generation device according to claim 3, wherein the marking UI processing unit displays the difference between the marking indicated by the marking information and the marking indicated by the correct-answer information.
- The information generation device according to claim 1, further comprising: a correction UI processing unit that superimposes and displays, on the image, the marking indicated by the marking information and accepts a correction to the marking from the second user; and a correction information generation unit that generates the correction information based on the input correction.
- The information generation device according to claim 1, wherein the correction information includes information indicating a correction range and a correction reason, and the marking UI processing unit displays the correction range in a different manner for each correction reason.
- The information generation device according to claim 1, wherein the image is a first face image of the person, and the marking UI processing unit deforms a second face image of the same person, different from the first face image, into a shape that fits the face included in the first face image, and displays it.
- The information generation device according to claim 3, wherein the image is a first face image of the person, and the marking UI processing unit superimposes and displays, on the first face image, the marking indicated by correct-answer information of a second face image of the same person, different from the first face image.
- The information generation device according to claim 1, wherein the characteristic part of the skin is any of a spot, a pore, and a wrinkle of the skin.
- An information generation method in which a device manages, in association with each other, an image of a person's skin, marking information indicating a marking made by a first user on a characteristic part of the skin in the image, and correction information indicating a correction made by a second user to the marking information, and the device superimposes and displays, on the image, the marking indicated by the marking information and the correction indicated by the correction information.
- A computer program that causes a computer to execute: managing, in association with each other, an image of a person's skin, marking information indicating a marking made by a first user on a characteristic part of the skin in the image, and correction information indicating a correction made by a second user to the marking information; and superimposing and displaying, on the image, the marking indicated by the marking information and the correction indicated by the correction information.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202080032053.5A CN113767410A (zh) | 2019-05-13 | 2020-03-17 | Information generation device, information generation method, and computer program |
JP2021519284A JP7503757B2 (ja) | 2019-05-13 | 2020-03-17 | Information generation device, information generation method, and computer program |
US17/511,609 US20220051001A1 (en) | 2019-05-13 | 2021-10-27 | Information generating apparatus, information generation method, and non-transitory computer-readable recording medium storing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019090605 | 2019-05-13 | ||
JP2019-090605 | 2019-05-13 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/511,609 Continuation US20220051001A1 (en) | 2019-05-13 | 2021-10-27 | Information generating apparatus, information generation method, and non-transitory computer-readable recording medium storing program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020230444A1 true WO2020230444A1 (ja) | 2020-11-19 |
Family
ID=73289176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/011714 WO2020230444A1 (ja) | 2019-05-13 | 2020-03-17 | Information generation device, information generation method, and computer program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220051001A1 (ja) |
JP (1) | JP7503757B2 (ja) |
CN (1) | CN113767410A (ja) |
WO (1) | WO2020230444A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3650960A1 (fr) * | 2018-11-07 | 2020-05-13 | Tissot S.A. | Method for broadcasting a message by a watch |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001216386A (ja) * | 2000-02-02 | 2001-08-10 | Nippon Telegr & Teleph Corp <Ntt> | Makeup support device |
JP2002221896A (ja) * | 2001-01-24 | 2002-08-09 | Victor Co Of Japan Ltd | Makeup simulation system |
JP2005310124A (ja) * | 2004-03-25 | 2005-11-04 | Fuji Photo Film Co Ltd | Red-eye detection device, program, and recording medium on which the program is recorded |
JP2008003724A (ja) * | 2006-06-20 | 2008-01-10 | Kao Corp | Beauty simulation system |
JP2008022154A (ja) * | 2006-07-11 | 2008-01-31 | Fujifilm Corp | Makeup support device and method |
CN107679507A (zh) * | 2017-10-17 | 2018-02-09 | 北京大学第三医院 | Facial pore detection system and method |
JP2018092351A | 2016-12-02 | 2018-06-14 | Casio Computer Co., Ltd. | Image processing device, image processing method, and program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SG11201804159XA (en) * | 2015-12-28 | 2018-06-28 | Panasonic Ip Man Co Ltd | Makeup simulation assistance apparatus, makeup simulation assistance method, and makeup simulation assistance program |
WO2018003421A1 (ja) * | 2016-06-30 | 2018-01-04 | パナソニックIpマネジメント株式会社 | 画像処理装置および画像処理方法 |
JPWO2018079255A1 (ja) * | 2016-10-24 | 2019-09-12 | パナソニックIpマネジメント株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
2020
- 2020-03-17 WO PCT/JP2020/011714 patent/WO2020230444A1/ja active Application Filing
- 2020-03-17 JP JP2021519284 patent/JP7503757B2/ja active Active
- 2020-03-17 CN CN202080032053.5A patent/CN113767410A/zh active Pending

2021
- 2021-10-27 US US17/511,609 patent/US20220051001A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN113767410A (zh) | 2021-12-07 |
JPWO2020230444A1 (ja) | 2020-11-19 |
JP7503757B2 (ja) | 2024-06-21 |
US20220051001A1 (en) | 2022-02-17 |
Legal Events
- 121 | Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20806258; Country of ref document: EP; Kind code of ref document: A1)
- ENP | Entry into the national phase (Ref document number: 2021519284; Country of ref document: JP; Kind code of ref document: A)
- NENP | Non-entry into the national phase (Ref country code: DE)
- ENP | Entry into the national phase (Ref document number: 2020806258; Country of ref document: EP; Effective date: 20211213)
- 122 | Ep: pct application non-entry in european phase (Ref document number: 20806258; Country of ref document: EP; Kind code of ref document: A1)
Ref document number: 20806258 Country of ref document: EP Kind code of ref document: A1 |