WO2020195155A1 - Image processing device, image processing method, and image processing program - Google Patents

Image processing device, image processing method, and image processing program

Info

Publication number
WO2020195155A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
similarity
image portion
image processing
user
Prior art date
Application number
PCT/JP2020/003680
Other languages
French (fr)
Japanese (ja)
Inventor
野口 幸典
Original Assignee
FUJIFILM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2020195155A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • The present invention relates to an image processing apparatus, an image processing method, and an image processing program.
  • There are image searches that find an image similar to a given image. Among these, a technique that attempts to reliably evaluate the similarity between images (Patent Document 1) and a technique that obtains a reference image similar to an input image with high accuracy (Patent Document 2) have been considered.
  • To facilitate automatic tracking of a target, Patent Document 3 moves a frame over the image to determine the initial position of a tracking frame; Patent Document 4 gradually reduces the cropping range and gradually increases the enlargement ratio of the cropped image so that switching the display of a subject image does not feel unnatural.
  • When the subjects in two images are similar, Patent Document 5 automatically sets the trimming frame of one image from the trimming frame of the other image.
  • Patent Document 1: Japanese Unexamined Patent Publication No. 2012-133459. Patent Document 2: Japanese Unexamined Patent Publication No. 2016-081131. Patent Document 3: WO2013/047066. Patent Document 4: Japanese Unexamined Patent Publication No. 2008-079166. Patent Document 5: Japanese Patent No. 5482142.
  • The object of the present invention is to find an image portion that is similar to a given single image.
  • The image processing apparatus according to the present invention comprises: a similarity calculation means for calculating the similarity between an image portion that is part of a first image and a second image; a first control means for controlling the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the range of the image portion is changed; and a detection means for detecting an image portion whose similarity, as calculated by the similarity calculation means, is equal to or greater than a first threshold value.
  • The present invention also provides an image processing method suited to the image processing apparatus. That is, the similarity calculation means calculates the similarity between an image portion that is part of a first image and a second image; the control means controls the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the range of the image portion is changed; and the detection means detects an image portion whose similarity, as calculated by the similarity calculation means, is equal to or greater than the first threshold value.
  • The present invention also provides a program for controlling the computer of the image processing apparatus and a recording medium storing the program.
  • Alternatively, the image processing apparatus may include a processor, and the processor may calculate the similarity between an image portion that is part of a first image and a second image, repeat the calculation of the similarity while changing the range of the image portion, and detect an image portion whose calculated similarity is equal to or greater than the first threshold value.
  • The detection means detects, for example, the image portion having the maximum similarity calculated by the similarity calculation means.
  • A display device that displays the image portion detected by the detection means may further be provided.
  • A second control means may further be provided that controls the similarity calculation means and the first control means so that the processing by the similarity calculation means and the processing by the first control means are repeated for a plurality of second images.
  • A third control means may further be provided that controls the similarity calculation means and the first control means so that the processing by the similarity calculation means and the processing by the first control means are repeated for a plurality of first images.
  • The similarity calculation means calculates, for example, the similarity between the composition of the image portion and the composition of the second image. An adjusting means may further be provided that matches at least one of the brightness, tint, density, sharpness, or contrast of the image portion detected by the detection means to the brightness, tint, density, sharpness, or contrast of the second image.
  • The detection means detects, for example, an image portion whose similarity calculated by the similarity calculation means is equal to or greater than the first threshold value and whose resolution is equal to or greater than a second threshold value when the image portion is enlarged to a predetermined size. The image portion preferably has the same shape as the second image (that is, it is geometrically similar to it).
  • The first control means may, for example, control the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the position information of the extraction frame of the image portion that is part of the first image is changed, or while the size of that extraction frame is changed.
  • According to the present invention, an image portion similar to the second image can be found in the first image even when the first image as a whole and the second image as a whole are not similar.
  • FIG. 1 shows an embodiment of the present invention and is a block diagram showing the electrical configuration of the image processing device 1.
  • The overall operation of the image processing device 1 is controlled by a CPU (Central Processing Unit) 2.
  • The image processing device 1 includes a display device 3 that displays images and other information on a display screen, and a communication device 4 that connects to the Internet or another network to communicate with devices other than the image processing device 1. The image processing device 1 also includes a hard disk 5, a hard disk drive 6 that accesses the hard disk 5, a memory 7 that stores data, and a keyboard 8 and a mouse 9 for entering commands. The image processing device 1 further includes a compact disc drive 10 that accesses a compact disc 11, and a memory card reader/writer 12 that writes data to a memory card 13 and reads data recorded on the memory card 13.
  • The operation program of the image processing device 1, described later, is received by the communication device 4 via the Internet, and the received operation program is installed in the image processing device 1.
  • Alternatively, instead of being received by the image processing device 1 over a network such as the Internet and installed, the operation program may be recorded on a portable recording medium such as the compact disc 11 and read from that portable recording medium; in that case, the operation program read from the portable recording medium is installed in the image processing device 1. The operation program is readable by the CPU 2 (a computer) of the image processing device 1.
  • The image processing device 1 finds, from among a plurality of images taken by a user (user images; an example of the first image, though they may instead serve as an example of the second image), an image similar to a model image (an example of the second image, though it may instead serve as an example of the first image). "An image is similar to another image" means that similar subjects included in the two images appear in similar positional relationships.
  • Feature quantities such as brightness, tint, density, sharpness, and contrast may or may not be used as criteria for judging whether images are similar.
  • FIG. 2 shows the model image IS and n user images I1 to In (n is an integer of 2 or more).
  • The model image IS is an image of lily of the valley, for example an image taken by a photography expert with a composition that an ordinary camera user who is not an expert would want to imitate.
  • The model image IS is compared with partial image portions within each of the user images I1 to In, and if an image portion similar to the model image IS exists, that image portion is found. This is feasible because, as image quality has improved, even a partial portion of an image can stand up to viewing when displayed or printed as a single image.
  • The image data representing the model image IS and the image data representing each of the user images I1 to In are read from the compact disc 11, the memory card 13, or the like and recorded on the hard disk 5. Image data representing other model images besides the one model image IS, and image data representing other user images besides the user images I1 to In, are also assumed to be recorded on the hard disk 5.
  • FIGS. 3 and 4 are flowcharts showing the processing procedure (image processing method) of the image processing device 1.
  • A plurality of model images are displayed on the display screen of the display device 3, and the user designates a model image from among them (step 21). In this embodiment, the model image IS is designated.
  • A plurality of user images are displayed on the display screen of the display device 3, and the user designates, from among them, the user images in which an image similar to the model image is to be searched for (step 22). In this embodiment, the user images I1 to In are designated.
  • An image portion similar to the model image IS is found from among the user images I1 to In. If the model image and the user images in which image portions are to be found are determined in advance, the user need not designate the model image and the user images.
  • The image data representing the first user image I1 is read from the hard disk 5 and expanded in the memory 7 (step 23). Subsequently, an extraction frame of the initial size is set, and that extraction frame is placed at the initial position of the read user image I1 (step 24).
  • FIG. 5 shows the relationship between the read user image I1 and the extraction frame F.
  • The user image I1 has a width w and a height h at a predetermined resolution, and its upper-left vertex is taken as the origin (0,0). The initial size of the extraction frame F (the frame that defines the image portion to be extracted from the user image) is a width of 6w/7 (another width may be used) and a height of 6h/7 (another height may be used). The initial position of the extraction frame F is the position at which the upper-left vertex of the extraction frame F coincides with the origin (0,0) of the user image I1; the extraction frame F is thus positioned at position P1.
  • The similarity between the image portion inside the extraction frame F (the extracted image portion, an example of a partial image portion of the first image) and the model image is calculated by the CPU 2 (an example of the similarity calculation means) (step 25).
  • The similarity may be calculated by matching the entire model image against the extracted image portion, or by detecting various feature quantities in each of the model image and the extracted image portion and comparing the detected feature quantities.
  • If the calculated similarity is equal to or greater than the first threshold value (YES in step 26), the extracted image portion is detected by the CPU 2 (detection means) and the similarity table shown in Table 1 is updated (step 27).
  • The position of the extraction frame F and the similarity at the time the similarity between the extracted image portion and the model image becomes equal to or greater than the first threshold value are stored against the user image No. assigned to each user image (an image file name may be used instead). Each time the similarity becomes equal to or greater than the first threshold value, the user image No., the position of the extraction frame, and the similarity are added to the similarity table. If the similarity is below the threshold value, the processing of step 27 is skipped.
  • The extraction frame F is then moved horizontally by a predetermined distance (step 28), from position P1 to position P2. The similarity between the extracted image portion after the move and the model image is calculated again (step 29), and if the similarity is equal to or greater than the first threshold value (YES in step 30), the similarity table is updated (a row is added) (step 31). When moving the extraction frame F further would push its right side beyond the right side of the user image I1, the extraction frame F is lowered by a predetermined distance in the vertical direction and repositioned so that its left side coincides with the left side of the user image I1. In this way, the rightward and downward movement of the extraction frame F is repeated.
  • Through this movement the extraction frame F is eventually positioned at position PE, where its lower-right vertex falls on or within the lower-right vertex of the user image I1 (the last position of the extraction frame F). Until the extraction frame F reaches the last position of the user image I1 (NO in step 32), the movement of the extraction frame F (the change of its position information) and the calculation of the similarity between the extracted image portion and the model image IS are repeated by the CPU 2 (first control means), and the similarity table is updated whenever the similarity is equal to or greater than the first threshold value.
  • When the extraction frame F reaches the last position of the user image I1, the size (range) of the extraction frame F is reduced (changed) by the CPU 2 and the frame is set at the initial position of the user image I1 (step 33).
  • FIG. 6 shows the relationship between the user image I1 and the extraction frame F.
  • the extraction frame F has a width of 5w / 7 and a height of 5h / 7. Further, the upper left vertex of the extraction frame F is positioned at the origin (0,0) of the user image I1.
  • When the size of the extraction frame F has been reduced, it is checked whether the number of pixels in the reduced extracted image portion is equal to or greater than a fixed number (step 34).
  • When a user image has many pixels, even a partial extracted image portion of it can stand up to viewing when treated as a single image. If the extracted image portion has too few pixels, however, treating it as a single image may make it unviewable. For this reason, it is checked whether the extracted image portion, when displayed, printed, or the like as a single image, has at least the number of pixels that can stand up to viewing.
  • If the number of pixels is equal to or greater than the fixed number (YES in step 34), the movement of the extraction frame F, the calculation of the similarity, and the reduction of the extraction frame F (the change of the size of the extraction frame F; a small extraction frame F could instead be enlarged before the frame is moved and the similarity calculated) are repeated (steps 24 to 33).
  • FIG. 7 shows the relationship between the user image I1 and the extraction frame F.
  • the width of the extraction frame F is reduced to 2w / 7 and the height is reduced to 2h / 7, and the extraction frame F is positioned at the initial position.
  • the image data representing the next user image is Read out (step 23). In this case, the image data representing the next user image I2 is read from the hard disk 5 and expanded in the memory 7. If the number of pixels of the extracted image portion in the extraction frame F is a certain number of pixels or more, the resolution is the second even when the extracted image portion is displayed or printed in a predetermined size set in advance.
  • FIG. 8 shows the relationship between the user image I4 and the extraction frame F.
  • The movement of the extraction frame F, the calculation of the similarity, and the reduction of the extraction frame F are repeated, and when the similarity calculated between the image portion IP1 inside the extraction frame F, as shown in FIG. 8, and the model image IS becomes equal to or greater than the first threshold value, the similarity table is updated: the user image No., the position of the extraction frame F, and the similarity are stored in the similarity table.
  • Image portions whose similarity to the model image is equal to or greater than the first threshold value are detected here, but the image portion whose similarity to the model image is the maximum may be detected instead. In that case, the image portion giving the maximum of the similarities stored in the similarity table may be detected, or the information stored in the similarity table may be overwritten whenever a larger similarity than the one stored there is obtained.
  • Image portions similar to one model image IS are detected here, but when there are a plurality of model images, image portions similar to each of the plurality of model images may be detected by the CPU 2 (second control means). To do so, when the processing for one model image IS is finished, the image data representing the next model image is read from the hard disk 5 and the above processing is repeated.
  • Image portions similar to the model image IS are detected here in a plurality of user images I1 to In by the CPU 2 (third control means), but an image portion similar to the model image IS may instead be detected in a single user image.
  • The frame shape of the extraction frame F is geometrically similar to the frame shape of the model image IS here, but it need not be. Alternatively, the main subject included in the model image IS may be detected, and a subject portion having a composition like that of the main subject may be found in the user images I1 to In.
  • Furthermore, the found image portion may be extracted and displayed by trimming, masking, or otherwise processing the parts other than the found image portion.
  • In this way, an image portion similar to the model image can be found.
  • The second embodiment adjusts the brightness, tint, density, sharpness, contrast, and the like of the image portion IP1 found as described above to the brightness, tint, density, sharpness, contrast, and the like of the model image IS.
  • The average brightness, average tint, average density, and so on of the image portion IP1 may be matched to those of the model image IS, or the average brightness, average tint, average density, and so on of the main subject of the image portion IP1 may be matched to those of the main subject of the model image IS.
  • The sharpness, contrast, and so on of the subject boundaries included in the image portion IP1 may be matched to those of the subject boundaries included in the model image IS; for example, the sharpness and contrast of the boundary between the main subject and the background included in the image portion IP1 may be matched to those of the boundary between the main subject and the background included in the model image IS.
  • The found image portion IP1 was not shot as a single image; it is only a part of the user image I4. For this reason, when the found image portion IP1 is treated as one image, its brightness, tint, density, sharpness, and contrast may be inappropriate; for example, the brightness of the user image I4 as a whole may be appropriate while the found image portion IP1 alone is too bright or too dark. Moreover, matching the brightness and the like to the model image IS makes the portion still more similar to the model image IS.
  • In the first embodiment, brightness and the like may be used among the feature quantities, but when the processing of the second embodiment is performed, it is preferable not to use the quantities to be adjusted, such as brightness, tint, density, sharpness, and contrast, as feature quantities for calculating the similarity.
  • In the second embodiment, the image processing device 1 shown in FIG. 1 is used, as in the first embodiment. It is further assumed that the user images I1 to In are used and that the image portion IP1 included in the user image I4 shown in FIG. 8 has been found as an image portion similar to the model image IS shown in FIG. 2.
  • FIG. 9 is a flowchart showing the processing procedure of the image processing device 1.
  • FIG. 10 shows the extracted image portion IP1, the image portion IP2 as it was before conversion by the conversion function of the user image, and the image portion IP3 after conversion by the conversion function of the model image.
  • The image portion IP1 (see FIG. 10) found as in the first embodiment is extracted (step 41), and the conversion function of the user image I4 containing the extracted image portion IP1 (a conversion function for brightness, tint, density, sharpness, contrast, and the like) is found by the CPU 2 (step 42). Assuming that the same conversion function was used for the n user images I1 to In, it is estimated what conversion function was applied to them to adjust brightness, tint, density, sharpness, and contrast. Similarly, assuming that the same conversion function was used for a plurality of model images, it is estimated what conversion function was applied to adjust their brightness, tint, density, sharpness, contrast, and the like.
  • FIG. 11 shows an example of the conversion functions; the horizontal axis is the input and the vertical axis is the output. The conversion function f1 is the conversion function estimated to have been used for the user images I1 to In, and the conversion function f2 is the conversion function estimated to have been used for the model image IS.
  • When the conversion function f1 presumed to have been used for the user images I1 to In and the conversion function f2 presumed to have been used for the model image IS have been found, the inverse of the conversion function f1 is applied by the CPU 2 (an example of the adjusting means) to the extracted image portion IP1 (step 43), as shown in FIG. 10(A), and the image portion IP2 as it was before conversion by the conversion function f1 of the user image I4 is obtained. The pre-conversion image portion of the user image I4 is then converted by the CPU 2 (an example of the adjusting means) using the conversion function f2 of the model image IS, so that the image portion IP3, converted in the same way as the model image IS, is obtained (step 44). It is preferable to display at least one of the obtained image portions IP1, IP2, and IP3 on the display screen of the display device 3. (A sketch of this two-step adjustment appears after this list.)
  • Such processing is performed for brightness, tint, density, sharpness, contrast, and the like. In this way, the same adjustments as those made to the brightness, tint, density, sharpness, contrast, and the like of the model image IS can be performed, and an image portion IP3 having brightness, tint, density, sharpness, contrast, and the like similar to those of the model image IS is obtained.
  • The image processing device 1 in the embodiments described above may be built as a dedicated device, but it can also be implemented with a smartphone, a personal computer, a tablet terminal, or the like.
  • The processing unit that executes the above processing may be a programmable logic device such as an FPGA (field-programmable gate array), whose circuit configuration can be changed after manufacture, or a dedicated electric circuit such as an ASIC (application-specific integrated circuit), a processor having a circuit configuration specially designed to execute specific processing.
  • One processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a CPU 2 and an FPGA).
  • As typified by computers such as client computers and servers, one processor may be configured as a combination of one or more CPUs 2 and software, and this processor may function as a plurality of processing units. Alternatively, a processor may be used that realizes the functions of an entire system including a plurality of processing units on a single IC (integrated circuit) chip.
  • In this way, the various processing units are configured using one or more of the various processors as a hardware structure. More specifically, the hardware structure of these various processors is an electric circuit combining circuit elements such as semiconductor elements.
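The two-step adjustment of steps 43 and 44 described above (undo the user-image conversion function f1, then apply the model-image conversion function f2) can be sketched in code. This is a minimal illustration, assuming the conversion functions have already been estimated (step 42) and are given as monotonically increasing 256-entry tone curves; the function names and the lookup-table representation are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def apply_curve(img: np.ndarray, curve: np.ndarray) -> np.ndarray:
    # Apply a tone curve given as a 256-entry uint8 lookup table.
    return curve[img]

def invert_curve(curve: np.ndarray) -> np.ndarray:
    # Invert a monotonically increasing tone curve by interpolating
    # the swapped (output, input) pairs.
    levels = np.arange(256)
    return np.rint(np.interp(levels, curve, levels)).astype(np.uint8)

def adjust_to_model(ip1: np.ndarray, f1: np.ndarray, f2: np.ndarray):
    # Steps 43-44: undo the user-image curve f1 on the found image
    # portion IP1 (a uint8 array), then apply the model-image curve f2.
    ip2 = apply_curve(ip1, invert_curve(f1))  # portion before conversion by f1
    ip3 = apply_curve(ip2, f2)                # converted like the model image IS
    return ip2, ip3
```

The same pattern would be repeated per adjusted quantity (brightness, tint per channel, and so on), with a curve pair estimated for each.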

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an image processing device, an image processing method, and an image processing program with which it is possible to locate an image portion that is similar to one given image. An extraction frame (F) is moved in a user image (I1), and the degree of similarity between an image portion of the user image (I1) that is present inside of the extraction frame (F) and an example image is calculated. The size of the extraction frame (F) is reduced, and the processes of moving the extraction frame (F) and calculating the degree of similarity between an image portion inside of the extraction frame (F) and the example image are repeated. An image portion for which the degree of similarity is greater than or equal to a first threshold value is located. These processes are performed on a plurality of user images. It is possible to locate a portion of a user image that is similar to the example image even though the user image may not be similar to the example image as a whole.

Description

Image processing device, image processing method, and image processing program
 The present invention relates to an image processing apparatus, an image processing method, and an image processing program.
 There are image searches that find an image similar to a given image. For this purpose, a technique that attempts to reliably evaluate the similarity between images (Patent Document 1) and a technique that obtains a reference image similar to an input image with high accuracy (Patent Document 2) have been considered.
 Further considered are a technique that moves a frame over the image to determine the initial position of a tracking frame in order to facilitate automatic tracking of a target (Patent Document 3), a technique that gradually reduces the cropping range and gradually increases the enlargement ratio of the cropped image so that switching the display of a subject image does not feel unnatural (Patent Document 4), and a technique that, when the subjects included in two images are similar, automatically sets the trimming frame of one image from the trimming frame of the other (Patent Document 5).
 Patent Document 1: Japanese Unexamined Patent Publication No. 2012-133459. Patent Document 2: Japanese Unexamined Patent Publication No. 2016-081131. Patent Document 3: WO2013/047066. Patent Document 4: Japanese Unexamined Patent Publication No. 2008-079166. Patent Document 5: Japanese Patent No. 5482142.
 When finding an image similar to a given image, it is common to compare whole images, but comparing whole images sometimes fails to find a similar image. The invention described in Patent Document 1 compares partial regions with each other, so it cannot find an image similar to a given image as a whole. The invention described in Patent Document 2 calculates the similarity between an input image and a reference image based on the feature quantities of the input image and the feature quantities of partial regions of the reference image, and obtains a reference image similar to the input image based on each similarity. Because Patent Document 2 seeks a reference image similar to the whole input image, it cannot find a part of the input image when that part is similar to the whole reference image. The inventions described in Patent Documents 3 and 4 do not contemplate finding similar images, and the invention described in Patent Document 5, which sets trimming, does not contemplate it either.
 The object of the present invention is to find an image portion that is similar to a given single image.
 The image processing apparatus according to the present invention comprises: a similarity calculation means for calculating the similarity between an image portion that is part of a first image and a second image; a first control means for controlling the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the range of the image portion is changed; and a detection means for detecting an image portion whose similarity, as calculated by the similarity calculation means, is equal to or greater than a first threshold value.
 The present invention also provides an image processing method suited to the image processing apparatus. That is, the similarity calculation means calculates the similarity between an image portion that is part of a first image and a second image; the control means controls the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the range of the image portion is changed; and the detection means detects an image portion whose similarity, as calculated by the similarity calculation means, is equal to or greater than the first threshold value.
 The present invention also provides a program for controlling the computer of the image processing apparatus and a recording medium storing the program.
 Alternatively, the image processing apparatus may include a processor, and the processor may calculate the similarity between an image portion that is part of a first image and a second image, repeat the calculation of the similarity while changing the range of the image portion, and detect an image portion whose calculated similarity is equal to or greater than the first threshold value.
 The detection means detects, for example, the image portion having the maximum similarity calculated by the similarity calculation means.
 A display device that displays the image portion detected by the detection means may further be provided.
 A second control means may further be provided that controls the similarity calculation means and the first control means so that the processing by the similarity calculation means and the processing by the first control means are repeated for a plurality of second images.
 A third control means may further be provided that controls the similarity calculation means and the first control means so that the processing by the similarity calculation means and the processing by the first control means are repeated for a plurality of first images.
 The similarity calculation means calculates, for example, the similarity between the composition of the image portion and the composition of the second image. An adjusting means may further be provided that matches at least one of the brightness, tint, density, sharpness, or contrast of the image portion detected by the detection means to the brightness, tint, density, sharpness, or contrast of the second image.
 The detection means detects, for example, an image portion whose similarity calculated by the similarity calculation means is equal to or greater than the first threshold value and whose resolution is equal to or greater than a second threshold value when the image portion is enlarged to a predetermined size.
 The image portion preferably has the same shape as the second image (that is, it is geometrically similar to it).
 The first control means may, for example, control the similarity calculation means so that the calculation of the similarity between the image portion and the second image is repeated while the position information of the extraction frame of the image portion that is part of the first image is changed, or while the size of that extraction frame is changed.
 According to the present invention, an image portion similar to the second image can be found in the first image even when the first image as a whole and the second image as a whole are not similar.
 FIG. 1 is a block diagram showing the electrical configuration of the image processing device.
 FIG. 2 shows an example of a model image and user images.
 FIG. 3 is a flowchart showing the processing procedure of the image processing device.
 FIG. 4 is a flowchart showing the processing procedure of the image processing device.
 FIG. 5 shows the relationship between a user image and the extraction frame.
 FIG. 6 shows the relationship between a user image and the extraction frame.
 FIG. 7 shows the relationship between a user image and the extraction frame.
 FIG. 8 shows the relationship between a user image and the extraction frame.
 FIG. 9 is a flowchart showing the processing procedure of the image processing device.
 FIG. 10 shows the extracted image portions.
 FIG. 11 shows an example of a conversion function.
[First Embodiment]
 FIG. 1 shows an embodiment of the present invention and is a block diagram showing the electrical configuration of the image processing device 1.
 The overall operation of the image processing device 1 is controlled by a CPU (Central Processing Unit) 2.
 The image processing device 1 includes a display device 3 that displays images and other information on a display screen, and a communication device 4 that connects to the Internet or another network to communicate with devices other than the image processing device 1. The image processing device 1 also includes a hard disk 5, a hard disk drive 6 that accesses the hard disk 5, a memory 7 that stores data, and a keyboard 8 and a mouse 9 for entering commands. The image processing device 1 further includes a compact disc drive 10 that accesses a compact disc 11, and a memory card reader/writer 12 that writes data to a memory card 13 and reads data recorded on the memory card 13.
 The operation program of the image processing device 1, described later, is received by the communication device 4 via the Internet, and the received operation program is installed in the image processing device 1. Alternatively, instead of being received by the image processing device 1 over a network such as the Internet and installed, the operation program may be recorded on a portable recording medium such as the compact disc 11 and read from that portable recording medium; in that case, the operation program read from the portable recording medium is installed in the image processing device 1. The operation program is readable by the CPU 2 (a computer) of the image processing device 1.
 The image processing device 1 according to this embodiment finds, from among a plurality of images taken by a user (called user images; an example of the first image, though they may instead serve as an example of the second image), an image similar to a model image (an example of the second image, though it may instead serve as an example of the first image). "An image is similar to another image" means that similar subjects included in the two images appear in similar positional relationships. In the first embodiment, feature quantities such as brightness, tint, density, sharpness, and contrast may or may not be used as criteria for judging similarity.
 FIG. 2 shows the model image IS and n user images I1 to In (n is an integer of 2 or more).
 The model image IS is an image of lily of the valley, for example an image taken by a photography expert with a composition that an ordinary camera user who is not an expert would want to imitate. In this embodiment, the model image IS is compared with partial image portions within each of the user images I1 to In, and if an image portion similar to the model image IS exists, that image portion is found. This is feasible because, as image quality has improved, even a partial portion of an image can stand up to viewing when displayed or printed as a single image.
 The image data representing the model image IS and the image data representing each of the user images I1 to In are assumed to have been read from the compact disc 11, the memory card 13, or the like and recorded on the hard disk 5. Not only the one model image IS but also image data representing other model images, and not only the user images I1 to In but also image data representing other user images, are assumed to be recorded on the hard disk 5.
 FIGS. 3 and 4 are flowcharts showing the processing procedure (image processing method) of the image processing device 1.
 A plurality of model images are displayed on the display screen of the display device 3, and the user designates a model image from among them (step 21). In this embodiment, the model image IS is designated.
 Similarly, a plurality of user images are displayed on the display screen of the display device 3, and the user designates, from among them, the user images in which an image similar to the model image is to be searched for (step 22). In this embodiment, the user images I1 to In are designated.
 An image portion similar to the model image IS is then found from among the user images I1 to In. If the model image and the user images in which image portions are to be found are determined in advance, the user need not designate the model image and the user images.
 Of the designated user images I1 to In, the image data representing the first user image I1 is read from the hard disk 5 and expanded in the memory 7 (step 23). Subsequently, an extraction frame of the initial size is set, and that extraction frame is placed at the initial position of the read user image I1 (step 24).
 FIG. 5 shows the relationship between the read user image I1 and the extraction frame F.
 The user image I1 has a width w and a height h at a predetermined resolution, and its upper-left vertex is taken as the origin (0,0). The initial size of the extraction frame F (the frame that defines the image portion to be extracted from the user image) is a width of 6w/7 (another width may be used) and a height of 6h/7 (another height may be used). The initial position of the extraction frame F is the position at which the upper-left vertex of the extraction frame F coincides with the origin (0,0) of the user image I1; the extraction frame F is thus positioned at position P1.
 When the extraction frame F has been positioned, the similarity between the image portion inside the extraction frame F (the extracted image portion, an example of a partial image portion of the first image) and the model image is calculated by the CPU 2 (an example of the similarity calculation means) (step 25). The similarity may be calculated by matching the entire model image against the extracted image portion, or by detecting various feature quantities in each of the model image and the extracted image portion and comparing the detected feature quantities. If the calculated similarity is equal to or greater than the first threshold value (YES in step 26), that extracted image portion is detected by the CPU 2 (detection means) and the similarity table shown in Table 1 is updated (step 27).
 [Table 1: similarity table storing, for each detection, the user image No., the position of the extraction frame, and the similarity]
 The position of the extraction frame F and the similarity at the time the similarity between the extracted image portion and the model image becomes equal to or greater than the first threshold value are stored against the user image No. assigned to each user image (an image file name may be used instead). Each time the similarity becomes equal to or greater than the first threshold value, the user image No., the position of the extraction frame, and the similarity are added to the similarity table. If the similarity is below the threshold value, the processing of step 27 is skipped.
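The patent leaves the similarity metric open (whole-image matching or comparison of feature quantities), so any concrete choice is illustrative. Below is a minimal sketch of one possible metric for step 25, assuming grayscale uint8 arrays and the NumPy and Pillow libraries; the function name is an assumption:

```python
import numpy as np
from PIL import Image

def similarity(crop: np.ndarray, model: np.ndarray) -> float:
    # Resize the extracted image portion to the model image's size, then
    # compare with zero-mean normalized cross-correlation, mapped to [0, 1].
    h, w = model.shape
    resized = np.asarray(
        Image.fromarray(crop).resize((w, h), Image.BILINEAR), dtype=np.float64)
    a = resized - resized.mean()
    b = model.astype(np.float64) - model.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0
    return (float((a * b).sum() / denom) + 1.0) / 2.0
```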
 Subsequently, the extraction frame F is moved horizontally by a predetermined distance (step 28), from position P1 to position P2. The similarity between the extracted image portion after the move and the model image is calculated again (step 29), and if the similarity is equal to or greater than the first threshold value (YES in step 30), the similarity table is updated (a row is added) (step 31). When moving the extraction frame F further would push its right side beyond the right side of the user image I1, the extraction frame F is lowered by a predetermined distance in the vertical direction and repositioned so that its left side coincides with the left side of the user image I1. In this way, the rightward and downward movement of the extraction frame F is repeated.
 Through this movement the extraction frame F is eventually positioned at position PE, where its lower-right vertex falls on or within the lower-right vertex of the user image I1 (the last position of the extraction frame F). Until the extraction frame F reaches the last position of the user image I1 (NO in step 32), the movement of the extraction frame F (the change of the position information of the extraction frame F) and the calculation of the similarity between the extracted image portion and the model image IS are repeated by the CPU 2 (first control means), and the similarity table is updated whenever the similarity is equal to or greater than the first threshold value.
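The raster scan of steps 28 to 32 amounts to enumerating top-left positions of the frame. A minimal sketch, assuming a fixed step size (the patent says only "a predetermined distance") and positions that keep the frame fully inside the image:

```python
def frame_positions(img_w, img_h, frame_w, frame_h, step):
    # Scan left to right, then drop down and return to the left edge,
    # until the frame reaches the bottom-right of the user image.
    y = 0
    while y + frame_h <= img_h:
        x = 0
        while x + frame_w <= img_w:
            yield x, y
            x += step
        y += step
```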
 When the extraction frame F reaches the last position of the user image I1 (YES in step 32), the size (range) of the extraction frame F is reduced (changed) by the CPU 2 and the frame is set at the initial position of the user image I1 (step 33).
 FIG. 6 shows the relationship between the user image I1 and the extraction frame F.
 The extraction frame F has been reduced to a width of 5w/7 and a height of 5h/7, and its upper-left vertex is positioned at the origin (0,0) of the user image I1.
 When the size of the extraction frame F has been reduced, it is checked whether the number of pixels in the reduced extracted image portion is equal to or greater than a fixed number (step 34). When a user image has many pixels, even a partial extracted image portion of it can stand up to viewing when treated as a single image. If the extracted image portion has too few pixels, however, treating it as a single image may make it unviewable. For this reason, it is checked whether the extracted image portion, when displayed, printed, or the like as a single image, has at least the number of pixels that can stand up to viewing.
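The check of step 34 can equivalently be phrased against the intended output size: the portion passes if, enlarged to the predetermined display or print size, its resolution stays at or above a threshold. A small sketch, where the function name and the 150 dpi figure are illustrative assumptions (the patent speaks only of a "second threshold"):

```python
def viewable(crop_w_px, crop_h_px, out_w_inches, out_h_inches, min_dpi=150):
    # Resolution at the predetermined output size must stay above the
    # threshold for the extracted portion to stand up to viewing.
    return (crop_w_px / out_w_inches >= min_dpi
            and crop_h_px / out_h_inches >= min_dpi)
```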
 If the number of pixels in the extracted image portion is equal to or greater than the fixed number (YES in step 34), the movement of the extraction frame F, the calculation of the similarity, and the reduction of the extraction frame F (the change of the size of the extraction frame F; a small extraction frame F could instead be enlarged before the frame is moved and the similarity calculated) are repeated (steps 24 to 33).
 FIG. 7 shows the relationship between the user image I1 and the extraction frame F.
 Here the extraction frame F has been reduced to a width of 2w/7 and a height of 2h/7 and positioned at the initial position. When the number of pixels of the extracted image portion inside the extraction frame F falls below the fixed number (NO in step 34) and there is a next user image (YES in step 35), the image data representing the next user image is read out (step 23). In this case, the image data representing the next user image I2 is read from the hard disk 5 and expanded in the memory 7. As long as the number of pixels of the extracted image portion inside the extraction frame F is equal to or greater than the fixed number, the resolution remains at or above the second threshold value even when the extracted image portion is displayed or printed at a predetermined size set in advance, so it can stand up to viewing. If the only aim is to detect image portions similar to the model image, however, an extracted image portion may be detected even when its number of pixels is below the fixed number. Since the number of pixels can differ from user image to user image, the smallest extraction frame F at which the number of pixels of the image portion inside it falls below the fixed number can also differ from user image to user image.
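Putting the pieces together, the per-image search of steps 23 to 34 might look like the sketch below, which reuses similarity() and frame_positions() from the earlier sketches. The 1/7 size decrements follow the figures (6w/7, then 5w/7, down to 2w/7); the threshold, step size, and minimum pixel count are illustrative values, not taken from the patent:

```python
def search_user_image(user_img, model_img, image_no, table,
                      threshold=0.8, min_pixels=1_000_000):
    # Steps 23-34 for one user image (a grayscale uint8 array): scan
    # ever-smaller extraction frames over the image and record every
    # portion whose similarity to the model image reaches the threshold.
    img_h, img_w = user_img.shape
    for k in range(6, 0, -1):            # 6w/7 x 6h/7, then 5w/7 x 5h/7, ...
        fw, fh = img_w * k // 7, img_h * k // 7
        if fw * fh < min_pixels:         # step 34: too few pixels to view
            break                        # go on to the next user image
        step = max(1, fw // 8)
        for x, y in frame_positions(img_w, img_h, fw, fh, step):
            s = similarity(user_img[y:y + fh, x:x + fw], model_img)
            if s >= threshold:           # steps 26/30: update Table 1
                table.append({"image_no": image_no,
                              "frame": (x, y, fw, fh),
                              "similarity": s})
```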
 FIG. 8 shows the relationship between the user image I4 and the extraction frame F.
 The movement of the extraction frame F, the calculation of the similarity, and the reduction of the extraction frame F are repeated as described above; when the similarity calculated between the image portion IP1 inside the extraction frame F, shown in FIG. 8, and the model image IS becomes equal to or greater than the first threshold value, the similarity table is updated, and the user image No., the position of the extraction frame F, and the similarity are stored in the similarity table. When the entire model image IS is compared with the entire user image I4, the two are unlikely to be judged similar; yet the entire model image IS and the image portion IP1 inside the extraction frame F turn out to be similar, and the similarity table shows the position of that image portion IP1 and which of the user images I1 to In contains it.
 When the processing of steps 23 to 34 has finished for all of the user images I1 to In designated by the user and there is no next user image (NO in step 35), the image portions represented by the information stored in the similarity table are displayed on the display screen of the display device 3 (step 36).
 In the embodiment described above, image portions whose similarity to the model image is equal to or greater than the first threshold value are detected, but the image portion whose similarity to the model image is the maximum may be detected instead. In that case, the image portion giving the maximum of the similarities stored in the similarity table may be detected, or the information stored in the similarity table may be overwritten whenever a larger similarity than the one stored there is obtained.
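For the maximum-similarity variant, the similarity table assumed in the sketch above can simply be reduced to its best row (or kept at size one by overwriting on each larger similarity). A one-line sketch:

```python
def best_portion(table):
    # Return the single most similar recorded portion, or None if empty.
    return max(table, key=lambda row: row["similarity"], default=None)
```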
 また,上述の実施例においては,1枚のお手本画像ISに類似した画像部分を検出しているが,お手本画像ISが複数枚ある場合には,複数枚のお手本画像のそれぞれのお手本画像に類似した画像部分がCPU2(第2の制御手段)によって検出されるようにしてもよい。複数枚のお手本画像ISについて類似した画像部分を検出するためには,上述のように1枚のお手本画像ISについての処理が終了すると,次のお手本画像ISを表す画像データをハード・ディスク5から読み取り,上述した処理を繰り返せばよい。 Further, in the above-described embodiment, an image portion similar to one model image IS is detected, but when there are a plurality of model image ISs, they are similar to each model image of the plurality of model images. The image portion may be detected by the CPU 2 (second control means). In order to detect similar image parts for a plurality of model image ISs, when the processing for one model image IS is completed as described above, the image data representing the next model image IS is transferred from the hard disk 5 from the hard disk 5. It may be read and the above processing may be repeated.
 Furthermore, in the embodiment described above, image portions similar to the model image IS are detected by the CPU 2 (third control means) for a plurality of user images I1 to In, but an image portion similar to the model image IS may instead be detected for a single user image.
 In the embodiment described above, the frame shape of the extraction frame F is similar in shape to the frame shape of the model image IS, but the two need not be similar. Alternatively, the main subject included in the model image IS may be detected, and a subject portion having a composition resembling that of the main subject may be found from among the user images I1 to In.
 Furthermore, the found image portion may be extracted and displayed by trimming, masking, or otherwise removing the portions other than the found image portion.
 According to the first embodiment, an image portion similar to the model image can be found even when no similar image could be found by comparing entire images with each other.
 [Second Embodiment]
 FIGS. 9 to 11 relate to the second embodiment.
 In the second embodiment, the brightness, tint, density, sharpness, contrast, and the like of the image portion IP1 found as described above are matched to the brightness, tint, density, sharpness, contrast, and the like of the model image IS. The average brightness, average tint, average density, and so on of the image portion IP1 may be made to match the average brightness, average tint, average density, and so on of the model image IS, or the average brightness, average tint, average density, and so on of the main subject of the image portion IP1 may be made to match those of the main subject of the model image IS. Likewise, the sharpness, contrast, and the like of the boundary portions of the subject included in the image portion IP1 may be made to match the sharpness, contrast, and the like of the boundary portions of the subject included in the model image IS. For example, the sharpness, contrast, and the like of the boundary between the main subject and the background in the image portion IP1 may be made to match those of the boundary between the main subject and the background in the model image IS.
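 As a simplified sketch of one such adjustment, matching the average brightness of the image portion IP1 to that of the model image IS might look as follows, assuming 8-bit image arrays and a plain gain correction (the embodiment itself does not prescribe this method):

```python
import numpy as np

def match_mean_brightness(ip1, model):
    """Scale ip1 (uint8 array) so its mean level equals that of the model image."""
    gain = (float(model.mean()) + 1e-6) / (float(ip1.mean()) + 1e-6)
    return np.clip(ip1.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```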
 The found image portion IP1 was not captured as a single photograph; it is merely a part of the user image I4. For this reason, when the found image portion IP1 is treated as a single image, its brightness, tint, density, sharpness, or contrast may be inappropriate. For example, the user image I4 as a whole may have appropriate brightness, yet the found image portion IP1 considered on its own may be too bright or too dark. Moreover, matching the brightness and the like to the model image IS makes the portion resemble the model image IS more closely.
 In calculating the similarity in the first embodiment, brightness and the like may also be used as one of the feature quantities; when the processing of the second embodiment is performed, however, it is preferable not to use, as feature quantities for the similarity calculation, those of brightness, tint, density, sharpness, contrast, and the like that are to be adjusted.
 In the second embodiment, too, the image processing device 1 shown in FIG. 1 is used, as in the first embodiment. It is also assumed that the user images I1 to In are used and that the image portion IP1 included in the user image I4 shown in FIG. 8 has been found as an image portion similar to the model image IS shown in FIG. 2.
 FIG. 9 is a flowchart showing the processing procedure of the image processing device 1. FIG. 10 shows the extracted image portion IP1, the image portion IP2 before conversion by the conversion function of the user images, and the image portion IP3 after conversion by the conversion function of the model image.
 The image portion IP1 (see FIG. 10) found as in the first embodiment is extracted (step 41), and the conversion function of the user image I4 containing the extracted image portion IP1 (the conversion function for brightness, tint, density, sharpness, contrast, and the like) is found by the CPU 2 (step 42). Assuming that the same conversion function has been applied to all n user images I1 to In, it is estimated what conversion function was used to adjust the brightness, tint, density, sharpness, contrast, and the like of the user images I1 to In. Similarly, assuming that the same conversion function has been applied to a plurality of model images, it is estimated what conversion function was used to adjust their brightness, tint, density, sharpness, contrast, and the like.
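 The disclosure does not specify how the conversion functions are estimated. Purely as an assumed example, each conversion function could be modeled as a gamma curve whose exponent is estimated from the mean level of the images to which it is presumed to have been applied:

```python
import numpy as np

def estimate_gamma(images, reference_mean=0.5):
    """Estimate a gamma exponent such that reference_mean ** gamma reproduces
    the observed mean of the normalized images (all assumed to share one curve)."""
    observed = float(np.mean([im.astype(np.float32).mean() / 255.0
                              for im in images]))
    observed = min(max(observed, 1e-3), 1.0 - 1e-3)   # keep the log well defined
    return float(np.log(observed) / np.log(reference_mean))
```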
 FIG. 11 shows the conversion functions.
 In FIG. 11, the horizontal axis represents the input and the vertical axis represents the output.
 The conversion function f1 is the conversion function estimated to have been applied to the user images I1 to In, and the conversion function f2 is the conversion function estimated to have been applied to the model image IS.
 Returning to FIG. 9, once the conversion function f1 estimated to have been applied to the user images I1 to In and the conversion function f2 estimated to have been applied to the model image IS have each been found, the inverse of the conversion function f1 of the user images I1 to In is applied to the extracted image portion IP1 by the CPU 2 (an example of the adjusting means) (step 43), as shown in FIG. 10(A), yielding the image portion IP2 as it was before conversion by the conversion function f1 of the user image I4, as shown in FIG. 10. Thereafter, the image portion of the user image I4 before conversion is converted by the CPU 2 (an example of the adjusting means) using the conversion function f2 of the model image IS, yielding the image portion IP3 to which the same conversion as that of the model image IS has been applied (step 44). It is preferable to display at least one of the obtained image portions IP1, IP2, and IP3 on the display screen of the display device 3.
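 Under the gamma-curve assumption above, steps 43 and 44 amount to composing the inverse of f1 with f2; a sketch:

```python
import numpy as np

def undo_then_redo(ip1, gamma1, gamma2):
    """Step 43: invert f1 (gamma1) to get IP2; step 44: apply f2 (gamma2) to get IP3."""
    x = np.clip(ip1.astype(np.float32) / 255.0, 0.0, 1.0)
    ip2 = x ** (1.0 / gamma1)     # image portion before the user-image conversion
    ip3 = ip2 ** gamma2           # same conversion as applied to the model image IS
    return (ip3 * 255.0 + 0.5).astype(np.uint8)
```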
 Such processing is performed for brightness, tint, density, sharpness, contrast, and the like.
 For the image portion IP1 found based on the model image IS, the same adjustments as those made to the brightness, tint, density, sharpness, contrast, and the like of the model image IS can thus be performed, and an image portion IP3 having brightness, tint, density, sharpness, contrast, and the like similar to those of the model image IS is obtained.
 The image processing device 1 in the embodiments described above may be configured as a dedicated device, but it can also be configured using a smartphone, a personal computer, a tablet terminal, or the like.
 The processing units that execute the processing described above include, in addition to the CPU 2, which executes software to function as various processing units, a programmable logic device whose circuit configuration can be changed after manufacture, such as an FPGA (field-programmable gate array), and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing specific processing, such as an ASIC (application specific integrated circuit).
 One processing unit may be configured from one of these various processors, or from a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of the CPU 2 and an FPGA). As examples of configuring a plurality of processing units with a single processor, first, there is a form in which one processor is configured from a combination of one or more CPUs 2 and software, as typified by computers such as client computers and servers, and this processor functions as the plurality of processing units. Second, as typified by a system on chip, there is a form in which a processor that realizes the functions of the entire system including the plurality of processing units with a single IC (integrated circuit) chip is used. In this way, the various processing units are configured, as a hardware structure, using one or more of the various processors.
 More specifically, the hardware structure of these various processors is electric circuitry in which circuit elements such as semiconductor elements are combined.
1: image processing device, 2: CPU, 3: display device, 4: communication device, 5: hard disk, 6: hard disk drive, 7: memory, 8: keyboard, 9: mouse, 10: compact disc drive, 11: compact disc, 12: memory card reader/writer, 13: memory card, F: extraction frame, I1-In: user images, IP1-IP3: image portions, IS: model image, f1-f2: conversion functions

Claims (13)

  1.  An image processing device comprising:
     a similarity calculation means for calculating a similarity between an image portion that is a part of a first image and a second image;
     a first control means for controlling the similarity calculation means so as to repeat the calculation of the similarity between the image portion and the second image while changing the range of the image portion; and
     a detection means for detecting an image portion for which the similarity calculated by the similarity calculation means is equal to or greater than a first threshold value.
  2.  The image processing device according to claim 1, wherein the detection means detects the image portion having the maximum similarity calculated by the similarity calculation means.
  3.  The image processing device according to claim 1 or 2, further comprising a display device that displays the image portion detected by the detection means.
  4.  The image processing device according to any one of claims 1 to 3, further comprising a second control means for controlling the similarity calculation means and the first control means so as to repeat the processing by the similarity calculation means and the processing by the first control means for each of a plurality of the second images.
  5.  The image processing device according to any one of claims 1 to 4, further comprising a third control means for controlling the similarity calculation means and the first control means so as to repeat the processing by the similarity calculation means and the processing by the first control means for each of a plurality of the first images.
  6.  The image processing device according to any one of claims 1 to 5, wherein the similarity calculation means calculates a similarity between the composition of the image portion and the composition of the second image, the image processing device further comprising an adjusting means for matching at least one of the brightness, tint, density, sharpness, and contrast of the image portion detected by the detection means to the corresponding brightness, tint, density, sharpness, or contrast of the second image.
  7.  The image processing device according to any one of claims 1 to 6, wherein the detection means detects an image portion for which the similarity calculated by the similarity calculation means is equal to or greater than the first threshold value and whose resolution is equal to or greater than a second threshold value when the image portion is enlarged to a predetermined size.
  8.  The image processing device according to any one of claims 1 to 7, wherein the image portion is similar in shape to the second image.
  9.  The image processing device according to claim 1, wherein the first control means controls the similarity calculation means so as to repeat the calculation of the similarity between the image portion and the second image while changing position information of an extraction frame for the image portion that is a part of the first image.
  10.  The image processing device according to claim 1, wherein the first control means controls the similarity calculation means so as to repeat the calculation of the similarity between the image portion and the second image while changing the size of an extraction frame for the image portion that is a part of the first image.
  11.  An image processing method comprising:
     calculating, by a similarity calculation means, a similarity between an image portion that is a part of a first image and a second image;
     controlling, by a control means, the similarity calculation means so as to repeat the calculation of the similarity between the image portion and the second image while changing the range of the image portion; and
     detecting, by a detection means, an image portion for which the similarity calculated by the similarity calculation means is equal to or greater than a first threshold value.
  12.  A computer-readable program that controls a computer of an image processing device so as to:
     calculate a similarity between an image portion that is a part of a first image and a second image;
     control the calculation of the similarity so that the calculation of the similarity between the image portion and the second image is repeated while the range of the image portion is changed; and
     detect an image portion for which the calculated similarity is equal to or greater than a first threshold value.
  13.  A recording medium storing the program according to claim 12.
PCT/JP2020/003680 2019-03-27 2020-01-31 Image processing device, image processing method, and image processing program WO2020195155A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-059720 2019-03-27
JP2019059720 2019-03-27

Publications (1)

Publication Number Publication Date
WO2020195155A1 true WO2020195155A1 (en) 2020-10-01

Family

ID=72609969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/003680 WO2020195155A1 (en) 2019-03-27 2020-01-31 Image processing device, image processing method, and image processing program

Country Status (1)

Country Link
WO (1) WO2020195155A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party

JP 2005-111081 A * (Olympus Corp; priority 2003-10-09, published 2005-04-28): Endoscopic image display processor
JP 2008-109344 A * (Fujifilm Corp; priority 2006-10-25, published 2008-05-08): Method of detecting specific object image and digital camera
JP 2013-109658 A * (Nippon Telegraph & Telephone Corp; priority 2011-11-22, published 2013-06-06): Image retrieval device, image retrieval method, and program
JP 2013-152543 A * (Fujitsu Ltd; priority 2012-01-24, published 2013-08-08): Image storage program, method and device



Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 20778157; country of ref document: EP; kind code of ref document: A1)
NENP: non-entry into the national phase (ref country code: DE)
122 EP: PCT application non-entry in European phase (ref document number: 20778157; country of ref document: EP; kind code of ref document: A1)
NENP: non-entry into the national phase (ref country code: JP)