WO2007108200A1 - Camera and image processing program - Google Patents
- Publication number
- WO2007108200A1 (PCT/JP2007/000163)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image data
- information
- processing program
- image processing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00326—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
- H04N1/00328—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus with an apparatus processing optically-read information
- H04N1/00331—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus with an apparatus processing optically-read information with an apparatus performing optical character recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32144—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
- H04N1/32149—Methods relating to embedding, encoding, decoding, detection or retrieval operations
- H04N1/32203—Spatial or amplitude domain methods
- H04N1/32229—Spatial or amplitude domain methods with selective or adaptive application of the additional information, e.g. in selected regions of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32144—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title embedded in the image data, i.e. enclosed or integrated in the image, e.g. watermark, super-imposed logo or stamp
- H04N1/32149—Methods relating to embedding, encoding, decoding, detection or retrieval operations
- H04N1/32203—Spatial or amplitude domain methods
- H04N1/32251—Spatial or amplitude domain methods in multilevel data, e.g. greyscale or continuous tone data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
- H04N5/275—Generation of keying signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3212—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
- H04N2201/3214—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3253—Position information, e.g. geographical position at time of capture, GPS data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3261—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal
- H04N2201/3266—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of multimedia information, e.g. a sound signal of text or character information, e.g. text accompanying an image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3273—Display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3274—Storage or retrieval of prestored additional information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
Definitions
- the present invention relates to a camera and an image processing program for inserting an insertion image into image data.
- Some cameras have a function of inserting an insertion image such as a date into image data generated by imaging.
- a technique has been considered for preventing the main image portion and the inserted image from overlapping, by determining the position at which the inserted image is inserted based on the focal position in the imaging screen.
- Patent Document 1: Japanese Patent Laid-Open No. 2001-136424
- however, the area at the focal position is not necessarily a main image portion including a main subject.
- as a result, the main image and the inserted image may overlap.
- in addition, when the insertion position is determined based on the focal position, a position with low visibility may be determined as the insertion position.
- the present invention has been made in view of the above problems, and an object thereof is to provide a camera and an image processing program capable of inserting an insertion image at an appropriate position of image data.
- the camera of the present invention includes an imaging unit that captures an image of a subject to generate image data, a recognition unit that performs face recognition processing on the image data to generate distribution information indicating a distribution of main subjects in the image of the image data, and a determination unit that determines an insertion position for inserting an insertion image into the image data based on at least one of the distribution information and contrast information of the subject.
- another camera of the present invention includes an imaging unit that captures an image of a subject to generate image data, and a determination unit that limits an insertion range of the insertion image in the image data based on attribute information of the insertion image to be inserted into the image data, and determines an insertion position for inserting the insertion image from the insertion range based on information on the subject in the image data.
- the image processing apparatus may further include an insertion unit that inserts the insertion image at the insertion position of the image data determined by the determination unit.
- the camera may further include a recording unit that records the image data generated by the imaging unit, the image data of the insertion image, and the insertion position determined by the determination unit on a recording medium in association with each other.
- the inserted image may be an image indicating a shooting condition of the imaging unit.
- the determination unit may determine the insertion position based on both the distribution information and the contrast information.
- the determination unit may weight the distribution information and the contrast information, and determine the insertion position based on the distribution information and the contrast information in consideration of the weighting.
- the camera may further include a setting unit that sets any one of a plurality of shooting modes. The imaging unit captures the image of the subject under shooting conditions according to the shooting mode set by the setting unit to generate the image data, and the determination unit determines the weighting of the distribution information and the contrast information based on the type of the shooting mode set by the setting unit, and determines the insertion position based on the distribution information and the contrast information in consideration of the weighting.
- the determination unit may limit an insertion range of the insertion image in the image data based on the attribute information of the insertion image, and determine an insertion position for inserting the insertion image from the insertion range.
- FIG. 1 is a diagram showing a configuration of an electronic camera 1 according to a first embodiment.
- FIG. 2 is a flowchart showing the operation of the electronic camera 1 of the first embodiment.
- FIG. 3 is a flowchart (continued) showing the operation of the electronic camera 1 of the first embodiment.
- FIG. 4 is a diagram illustrating determination of an insertion position and insertion of an insertion image.
- FIG. 5 is a diagram (continued) for explaining the determination of the insertion position and the insertion of the inserted image.
- FIG. 6 is a diagram illustrating a configuration of a computer 100 according to the second embodiment.
- FIG. 7 is a flowchart showing the operation of the computer 100 according to the second embodiment.
- FIG. 8 is a diagram showing a display example of a preview image.
- FIG. 1 is a diagram illustrating a configuration of the electronic camera 1 according to the first embodiment.
- the electronic camera 1 has an optical system 2, an optical system control unit 3, an image sensor 4, an image processing unit 5, a memory 6, a recording unit 7, a recording medium 8, an operation unit 9, a GPS (Global Positioning System) unit 10, a clock circuit 11, and a CPU 12.
- the optical system 2 includes a lens and a diaphragm.
- the optical system 2 is exchangeable.
- the optical system control unit 3 controls the diaphragm and the focal position of the lens.
- the image sensor 4 converts the subject image formed via the optical system 2 into image data and outputs the image data to the image processing unit 5.
- the image processing unit 5 performs various types of digital signal processing such as color signal generation, matrix conversion processing, γ conversion processing, correction processing, and image processing for inserting an insertion image indicating shooting conditions. Note that a description of the specific method of each image process is omitted.
- the memory 6 temporarily records the image data after the image processing by the image processing unit 5.
- the recording unit 7 records the image data temporarily recorded in the memory 6 on the recording medium 8.
- the recording medium 8 is a removable memory such as a memory card.
- the operation unit 9 includes a power button, a release button, a shooting mode selection dial, etc. (not shown).
- the GPS 10 acquires the position information of the electronic camera 1.
- the clock circuit 11 has information on the shooting date and time.
- the CPU 12 controls each unit in an integrated manner.
- the CPU 12 determines the insertion position of the insertion image. Details of the determination will be described later.
- the CPU 12 records a program for executing each process in advance.
- the electronic camera 1 has a plurality of shooting modes, including "portrait mode", "landscape mode", and "auto mode".
- the CPU 12 captures images under conditions according to the set shooting mode.
- the electronic camera 1 also has a function of inserting an insertion image into image data generated by imaging.
- the inserted image includes shooting-related information and stamp information.
- shooting-related information is information related to shooting such as shooting conditions (exposure value, shutter speed, etc.), shooting date and time, and position information.
- the shooting date and time is acquired by the clock circuit 11.
- the position information is acquired by the GPS 10.
- the stamp information is information for performing additional expressions such as text messages, symbols, and diagrams.
- the CPU 12 records a plurality of pieces of stamp information in advance. The designation regarding insertion of the insertion image described above is performed in advance by the user.
- the user operates the operation unit 9 to designate what kind of inserted image is to be inserted.
- the operation of the CPU 12 during shooting in the electronic camera 1 will be described with reference to the flowcharts of FIG. 2 and FIG. 3.
- Step S1 The CPU 12 determines whether or not an instruction to start shooting is given. Then, when instructed to start photographing, the CPU 12 proceeds to step S2. The user operates the release button on the operation unit 9 to give an instruction to start shooting.
- Step S2 The CPU 12 controls each unit to capture an image of the subject via the optical system 2 and generate image data. Then, the CPU 12 temporarily records the generated image data in the memory 6.
- Figure 4A shows an example of the generated image data.
- Step S3 The CPU 12 determines whether or not insertion is specified. If the CPU 12 determines that insertion is specified, the process proceeds to step S4. On the other hand, if it determines that insertion is not designated, the CPU 12 proceeds to step S20 described later.
- Step S4 The CPU 12 calculates a face distribution evaluation value Dfx.
- the CPU 12 converts the color space of the image data temporarily recorded in the memory 6 from, for example, an RGB color space to a Lab color space.
- the CPU 12 detects a skin color area using a table showing a predetermined skin color.
- CPU 12 determines whether or not an eye exists in the detected skin color area. Specifically, the CPU 12 determines whether or not the eye exists using a table of information (color, size, etc.) indicating the eye determined in advance.
- the CPU 12 obtains the accuracy of each skin color area based on the detection result of the skin color area and the determination of whether or not an eye is present in the detected skin color area. For example, when both eyes are present in a certain skin color region, the accuracy is 100%. If there is only one eye in a certain skin color area, the accuracy is 60%. If there is no eye in a skin color area, the accuracy is 30%.
- Figure 4B shows an example of accuracy.
- Figure 4C shows an example of the face distribution evaluation value Dfx.
- the face distribution evaluation value Dfx indicates a larger value as the probability that the subject's face exists in the small area is higher.
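The face-distribution step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the accuracy rules (100% for two eyes, 60% for one, 30% for none) and the 5×5 split are from the text, while the region representation and all function names are our assumptions.

```python
# Sketch of step S4: score skin-colour regions by eye count, then project
# the scores onto a 5x5 grid of small areas to form Dfx.
# Illustrative only; region boxes and helper names are assumptions.

GRID = 5  # the embodiment divides the screen into 25 small areas

def region_accuracy(num_eyes: int) -> int:
    """Accuracy of a skin-colour region, per the rules in the text."""
    return {2: 100, 1: 60}.get(num_eyes, 30)

def face_distribution(regions, width, height):
    """regions: list of (x0, y0, x1, y1, num_eyes) skin-colour boxes.
    Returns a GRID x GRID matrix of face-distribution values Dfx."""
    dfx = [[0] * GRID for _ in range(GRID)]
    cw, ch = width / GRID, height / GRID
    for x0, y0, x1, y1, eyes in regions:
        acc = region_accuracy(eyes)
        for r in range(GRID):
            for c in range(GRID):
                cx0, cy0 = c * cw, r * ch          # cell rectangle
                cx1, cy1 = cx0 + cw, cy0 + ch
                # a region raises Dfx in every cell it overlaps
                if x0 < cx1 and cx0 < x1 and y0 < cy1 and cy0 < y1:
                    dfx[r][c] = max(dfx[r][c], acc)
    return dfx

# One face with both eyes in the centre of a 500x500 image:
d = face_distribution([(220, 220, 280, 280, 2)], 500, 500)
assert d[2][2] == 100 and d[0][0] == 0
```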
- Step S5 The CPU 12 calculates a contrast distribution evaluation value Dcx.
- the CPU 12 obtains the contrast distribution evaluation value Dcx for each of the 25 small areas of the image data temporarily recorded in the memory 6.
- the CPU 12 divides the entire screen into 25 small areas, as in step S4.
- the CPU 12 detects the maximum value among the luminance values of each pixel in the small area.
- the CPU 12 detects the minimum value among the luminance values of each pixel in the small area.
- the L value of the image data in the Lab color space obtained in step S4 is used.
- Contrast distribution evaluation value Dcx = {(maximum luminance value) − (minimum luminance value)} ÷ (number of gradations) × 100 … (Equation 2)
- Figure 4D shows an example of the contrast distribution evaluation value Dcx.
- the contrast distribution evaluation value Dcx indicates a larger value as the contrast in the small area is higher. Note that the value of the G component of the image data in the RGB color space may be used instead of the luminance value in the above formula.
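Equation 2 can be applied per small area as in the sketch below, assuming 8-bit luminance data (256 gradations) and representing the image as a plain list of rows; the 5×5 split and the formula follow the text, the rest is illustrative.

```python
# Sketch of step S5 / Equation 2: Dcx of each small area is
# (max luminance - min luminance) / (number of gradations) * 100.

GRID = 5
LEVELS = 256  # number of gradations assumed for 8-bit data

def contrast_distribution(lum):
    """lum: list of rows of luminance values; returns GRID x GRID Dcx."""
    h, w = len(lum), len(lum[0])
    ch, cw = h // GRID, w // GRID
    dcx = [[0.0] * GRID for _ in range(GRID)]
    for r in range(GRID):
        for c in range(GRID):
            cell = [lum[y][x]
                    for y in range(r * ch, (r + 1) * ch)
                    for x in range(c * cw, (c + 1) * cw)]
            dcx[r][c] = (max(cell) - min(cell)) / LEVELS * 100  # Equation 2
    return dcx

# A flat grey 10x10 image with one bright pixel in the top-left cell:
img = [[128] * 10 for _ in range(10)]
img[0][0] = 255
d = contrast_distribution(img)
assert d[0][0] > d[1][1] == 0.0
```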
- Step S6 The CPU 12 determines whether or not the shooting mode is "portrait mode". If the CPU 12 determines that the shooting mode is "portrait mode", the process proceeds to step S10 described later. On the other hand, if it determines that the shooting mode is other than "portrait mode", the CPU 12 proceeds to step S7.
- Step S7 The CPU 12 determines whether or not the shooting mode is "landscape mode". If the CPU 12 determines that the shooting mode is "landscape mode", it proceeds to step S11 described later. On the other hand, if it determines that the shooting mode is other than "landscape mode", the CPU 12 proceeds to step S8.
- Step S8 The CPU 12 calculates the face region area S.
- the CPU 12 obtains the sum of the areas of the skin color regions with an accuracy of 50% or more in step S4, and sets it as the face area S.
- Step S9 The CPU 12 compares the face area S with the threshold values Tl and Th. If the CPU 12 determines that the face area S ≥ the threshold value Th, the process proceeds to step S10. If the CPU 12 determines that the face area S ≤ the threshold value Tl, the process proceeds to step S11 described later. If the CPU 12 determines that the threshold value Tl < the face area S < the threshold value Th, the CPU 12 proceeds to step S12 described later. Note that the threshold values Tl and Th are predetermined threshold values.
- Step S11 When the shooting mode is "landscape mode" (or when the face area S is small), the CPU 12 sets the weighting coefficients Wf and Wc so as to emphasize the contrast distribution.
- Step S13 The CPU 12 calculates the subject distribution evaluation value Dx from the face distribution evaluation value Dfx calculated in step S4, the contrast distribution evaluation value Dcx calculated in step S5, and the weighting coefficients Wf and Wc determined in steps S10 to S12.
- the weighting factors Wf and Wc are the same value in all small areas.
- Fig. 5A shows an example of the subject distribution evaluation value Dx.
- the subject distribution evaluation value Dx indicates a larger value as the probability that the main subject is present in the small area is higher.
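Steps S6 through S13 can be sketched as the weight selection plus weighted sum below. The text only says that portrait-like scenes emphasize the face distribution and landscape-like scenes the contrast; the concrete weight values and thresholds here are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of steps S6-S13: choose Wf, Wc from the shooting mode
# (or from the face area S against thresholds Tl < Th), then combine
# Dx = Wf*Dfx + Wc*Dcx cell by cell. Weight values are assumptions.

def choose_weights(mode, face_area_s=0.0, tl=0.1, th=0.4):
    if mode == "portrait" or (mode == "auto" and face_area_s >= th):
        return 0.8, 0.2          # emphasise face distribution (step S10)
    if mode == "landscape" or (mode == "auto" and face_area_s <= tl):
        return 0.2, 0.8          # emphasise contrast (step S11)
    return 0.5, 0.5              # intermediate case (step S12)

def subject_distribution(dfx, dcx, wf, wc):
    """Dx = Wf*Dfx + Wc*Dcx, cell by cell (step S13)."""
    return [[wf * f + wc * c for f, c in zip(frow, crow)]
            for frow, crow in zip(dfx, dcx)]

wf, wc = choose_weights("portrait")
assert wf > wc
dx = subject_distribution([[100, 0]], [[10, 50]], wf, wc)
```

In "auto mode" the face area S stands in for the mode: a large S is treated like a portrait, a small S like a landscape, which matches the effect described later in the document.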
- Step S14 The CPU 12 determines whether or not insertion of shooting-related information is specified. If the CPU 12 determines that insertion of shooting-related information is specified, the process proceeds to step S15. On the other hand, if it determines that insertion of shooting-related information is not specified, the CPU 12 proceeds to step S17 described later.
- Step S15 The CPU 12 determines the insertion position of the shooting-related information based on the subject distribution evaluation value Dx calculated in step S13.
- the insertion position of the shooting-related information is selected from the small areas included in the first and fifth rows.
- the small areas included in the first and fifth rows are preferable insertion positions because of the nature of the shooting-related information.
- the CPU 12 selects the small area having the smallest subject distribution evaluation value Dx from the small areas included in the first and fifth rows, and sets it as the insertion position of the shooting-related information. Note that the small areas that are candidates for the insertion position of the shooting-related information may be displayed so that the user can select one. Further, the CPU 12 may be configured to select the insertion position candidates in accordance with the attributes of the shooting-related information.
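The selection rule of steps S15 and S18 amounts to a minimum search restricted to the rows a given kind of insertion image is allowed to use (rows 1 and 5 for shooting-related information, rows 2 to 4 for stamp information, per the text). A sketch, with 1-based row numbers as in the document and an illustrative helper name:

```python
# Sketch of steps S15/S18: among the allowed rows of the 5x5 grid,
# pick the cell with the smallest subject distribution value Dx.

def insertion_cell(dx, allowed_rows):
    """Return (row, col), 1-based, of the smallest Dx in the allowed rows."""
    best = None
    for r in allowed_rows:
        for c, v in enumerate(dx[r - 1], start=1):
            if best is None or v < best[0]:
                best = (v, r, c)
    return best[1], best[2]

dx = [[9, 1, 9, 9, 9],
      [9, 9, 9, 9, 9],
      [9, 9, 0, 9, 9],
      [9, 9, 9, 9, 9],
      [9, 9, 9, 9, 9]]
assert insertion_cell(dx, allowed_rows=(1, 5)) == (1, 2)    # shooting info
assert insertion_cell(dx, allowed_rows=(2, 3, 4)) == (3, 3)  # stamp info
```

With this example Dx map, the two results reproduce the positions used in the document's Figure 5B example (first row second column, and third row third column).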
- Step S16 The CPU 12 controls the image processing unit 5 to insert the shooting-related information at the insertion position determined in step S15 into the image data temporarily recorded in the memory 6.
- Fig. 5B shows an example of an image with shooting-related information inserted.
- in this example, the small area in the second column of the first row is determined as the insertion position, and the shooting date and time and the position information of the electronic camera 1 (i1) are inserted as shooting-related information.
- the CPU 12 performs the insertion so that the center of the small area in the second column of the first row, which is the insertion position, matches the center of the insertion image of the shooting-related information.
- the CPU 12 controls the GPS 10 to acquire position information of the electronic camera 1.
- the CPU 12 acquires the shooting date / time information from the clock circuit 11.
- the CPU 12 temporarily records the image data with the shooting-related information inserted in the memory 6.
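The centering rule used in step S16 (and again in step S19) is simple coordinate arithmetic: place the insertion image so its center coincides with the center of the chosen small area. A sketch; only the "center matches center" rule is from the text, the coordinate math and names are ours.

```python
# Sketch of the centring rule: given a 1-based (row, col) cell in a
# GRID x GRID split of a width x height image, compute the top-left
# pixel at which to paste an overlay of size ow x oh.

GRID = 5

def paste_origin(row, col, width, height, ow, oh):
    cw, ch = width / GRID, height / GRID
    cx = (col - 0.5) * cw   # cell centre, x
    cy = (row - 0.5) * ch   # cell centre, y
    return int(round(cx - ow / 2)), int(round(cy - oh / 2))

# 500x500 image, 80x20 overlay centred in row 1, column 2:
# the cell centre is (150, 50), so the paste origin is (110, 40).
assert paste_origin(1, 2, 500, 500, 80, 20) == (110, 40)
```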
- Step S17 The CPU 12 determines whether or not insertion of stamp information is designated. If the CPU 12 determines that insertion of stamp information has been designated, the process proceeds to step S18. On the other hand, if it determines that insertion of stamp information has not been designated, the CPU 12 proceeds to step S20 described later.
- Step S18 The CPU 12 determines the insertion position of the stamp information based on the subject distribution evaluation value Dx calculated in step S13.
- the insertion position of the stamp information is selected from the small areas included in the second to fourth rows.
- the small areas included in the second to fourth rows are preferable insertion positions because of the nature of the stamp information.
- the CPU 12 selects the small area with the smallest subject distribution evaluation value Dx from the small areas included in the second to fourth rows, and sets it as the insertion position of the stamp information. The small areas that are candidates for the insertion position of the stamp information may also be displayed so that the user can select one. Further, the CPU 12 may be configured to select the insertion position candidates according to the attributes of the stamp information.
- Step S19 The CPU 12 controls the image processing unit 5 to insert the stamp information at the insertion position determined in step S18 into the image data temporarily recorded in the memory 6.
- Figure 5B shows an example of an image with stamp information inserted.
- in this example, the small area in the third row and third column is the insertion position, and the message "I'm fine" (i2) is inserted as stamp information.
- the CPU 12 performs the insertion so that the center of the small area in the third row and third column, which is the insertion position, matches the center of the insertion image of the stamp information.
- the CPU 12 temporarily records the image data with the stamp information inserted in the memory 6.
- Step S20 The CPU 12 controls each part to record the image data temporarily recorded in the memory 6 on the recording medium 8. Note that the CPU 12 may record the image before insertion of the insertion image and the image after insertion on the recording medium 8 in association with each other. In addition, the CPU 12 may record the image data generated by imaging, the image data of the insertion image, and the insertion position on the recording medium 8 in association with each other, without inserting the insertion image.
- alternatively, the CPU 12 may record the image data generated by imaging, the image data of the insertion image, and the insertion information (part or all of the face distribution evaluation value Dfx, the contrast distribution evaluation value Dcx, the weighting coefficients Wf and Wc, and the subject distribution evaluation value Dx) on the recording medium 8 in association with each other, without determining the insertion position.
- an image of a subject is captured to generate image data, face recognition processing is performed on the image data, and distribution information indicating the main subject distribution in the image of the image data is generated.
- an insertion position for inserting the inserted image into the image data is determined. Therefore, it is possible to insert an insertion image at an appropriate position in the image data.
- by determining the insertion position according to the subject distribution in the screen, it is possible to avoid problems such as the main image portion and the insertion image overlapping, or a position with low visibility being determined as the insertion position.
- the image of the subject is captured to generate image data.
- the insertion range of the insertion image in the image of the image data is limited based on the attribute information of the insertion image.
- the insertion position for inserting the insertion image is determined from the insertion range. Therefore, the insertion image can be inserted at an appropriate position according to the attributes of the insertion image.
- the insertion image is inserted at the determined insertion position in the image data. Therefore, the insertion image can be inserted into the image data only when insertion is actually required.
- the generated image data, the image data of the insertion image, and the determined insertion position are associated and recorded on the recording medium. Therefore, it is possible to insert an insertion image at an appropriate position at a desired timing after photographing.
- the distribution information and the contrast information are weighted such that the weight of the distribution information relative to that of the contrast information increases as the area ratio of the main subject to the entire subject increases.
- the insertion position is determined based on the weighted distribution information and contrast information. Therefore, when the area ratio of the main subject to the entire subject is large, it is determined that the image has a large face area, that is, that a person is the main subject, and the insertion position can be determined with emphasis on the face distribution in the image. Conversely, when the area ratio of the main subject to the entire subject is small, it is determined that the image has a small face area on the screen, that is, that the main subject is, for example, a landscape, and the insertion position can be determined with emphasis on the contrast of the image.
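The weighted combination described above can be illustrated with a minimal sketch. The linear weight rule and the numeric values below are assumptions; the text only states the qualitative behavior (face weight grows with the face-area ratio), not the exact function:

```python
# Minimal sketch of the weighting, assuming a simple linear rule:
# as the face-area ratio grows, the face-distribution term Dfx
# dominates; otherwise the contrast term Dcx dominates.

def subject_distribution_value(dfx, dcx, face_area_ratio):
    wf = face_area_ratio          # weight Wf for the face distribution
    wc = 1.0 - face_area_ratio    # weight Wc for the contrast distribution
    return wf * dfx + wc * dcx    # Dx = Wf*Dfx + Wc*Dcx

# Portrait-like frame: faces cover 60% -> face distribution dominates
portrait = subject_distribution_value(dfx=0.9, dcx=0.2, face_area_ratio=0.6)
# Landscape-like frame: faces cover 5% -> contrast dominates
landscape = subject_distribution_value(dfx=0.9, dcx=0.2, face_area_ratio=0.05)
```

For the same Dfx and Dcx, the portrait-like frame yields the larger combined value, so the face term carries more influence there.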
- any one of a plurality of shooting modes is selected by setting means, and an image of the subject is captured under the shooting conditions corresponding to the shooting mode set by the setting means to generate image data. The weighting of the distribution information and the contrast information is determined based on the type of shooting mode set by the setting means, and the insertion position is determined based on the weighted distribution information and contrast information. Therefore, the insertion position can be determined automatically and appropriately according to the type of shooting mode.
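A mode-dependent weighting of this kind can be sketched as a small lookup table. The mode names and weight values here are representative assumptions, not the values used by the patent: a portrait-type mode trusts the face distribution, a landscape-type mode trusts the contrast information.

```python
# Sketch of mode-dependent weighting (Wf, Wc); the concrete values
# are illustrative assumptions, not taken from the patent text.

MODE_WEIGHTS = {
    "portrait":  (0.8, 0.2),   # emphasize face distribution
    "landscape": (0.2, 0.8),   # emphasize contrast distribution
    "auto":      (0.5, 0.5),   # balanced default
}

def weights_for_mode(mode):
    # Fall back to the balanced "auto" weights for unknown modes
    return MODE_WEIGHTS.get(mode, MODE_WEIGHTS["auto"])
```

Selecting a mode then fixes the weights without any per-image analysis, which is why the insertion position can be chosen "automatically and appropriately" once the mode is known.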
- the shooting mode is determined (step S6 and step S7 in FIG. 2), and the face area ratio is evaluated.
- the insertion position candidates for the imaging related information and the stamp information are determined in advance, but the present invention is not limited to this example.
- the priority order of each inserted image may be determined, and the insertion position may be determined so that the inserted images do not overlap according to the priority order.
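The priority-ordered, non-overlapping placement mentioned above can be sketched as follows. The candidate list and priority numbers are illustrative assumptions: higher-priority inserted images claim candidate positions first, and lower-priority ones take the next free candidate.

```python
# Hedged sketch of priority-ordered placement so inserted images
# do not overlap. Candidates and priorities are assumptions.

def assign_positions(inserts, candidates):
    """inserts: list of (priority, name), lower number = higher priority.
    candidates: ordered list of candidate positions.
    Returns {name: position} with no position used twice."""
    taken = set()
    placement = {}
    for _, name in sorted(inserts):
        for pos in candidates:
            if pos not in taken:
                taken.add(pos)
                placement[name] = pos
                break
    return placement

placement = assign_positions(
    [(2, "stamp"), (1, "date")],          # date has higher priority
    candidates=[(2, 2), (2, 0), (0, 0)],  # preferred areas, best first
)
```

The higher-priority "date" image gets the best candidate area, and the "stamp" image is pushed to the next free one.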
- the insertion position candidates may be determined in advance, and only the small areas corresponding to the insertion position candidates may be processed in steps S4, S5, and S13 in FIGS.
- the case where the insertion image is inserted at the time of imaging has been described as an example, but the insertion image may instead be inserted at the time of reproducing an image that was generated by imaging and recorded on the recording medium 8.
- an insertion position may be determined at the time of imaging, and an insertion image may be inserted according to a user instruction at the time of reproduction.
- the calculation of the face distribution evaluation value Dfx in step S4 of FIG. 2, the calculation of the contrast distribution evaluation value Dcx in step S5 of FIG. 2, the determination of the weighting factors Wf and Wc in step S12, and the calculation method of the subject distribution evaluation value Dx in step S13 of FIG. 3 are merely examples, and the invention is not limited to the examples of this embodiment.
- the calculation method and determination method in each step may be configured to be specified by the user.
- FIG. 6 is a diagram illustrating a configuration of the computer 100 according to the second embodiment.
- the computer 100 has a memory 107, a recording unit 108, an operation unit 110, a CPU 113, an acquisition unit 120, a display control unit 121, and a display unit 122.
- the memory 107 temporarily stores the image data acquired by the acquisition unit 120.
- the recording unit 108 records image data temporarily recorded in the memory 107.
- the operation unit 110 includes a power button, a mouse, a keyboard, and the like.
- the CPU 113 controls each part in an integrated manner.
- the CPU 113 determines the insertion position of the inserted image. Details of the determination will be described later. Further, the CPU 113 stores in advance a program for executing each process.
- the acquisition unit 120 acquires image data from an external device such as an electronic camera or a recording medium via a wired, wireless, or recording medium drive.
- the display control unit 121 controls the display of images on the display unit 122.
- the display unit 122 includes an image display element such as a liquid crystal display element.
- Step S31 The CPU 113 determines whether or not an instruction has been given by the user. If the CPU 113 determines that an instruction has been given, the process proceeds to step S32.
- Step S32 The CPU 113 determines whether acquisition of image data is instructed. When determining that acquisition of image data is instructed, the CPU 113 proceeds to step S34 described later. On the other hand, if an instruction other than acquisition of image data is given, the CPU 113 proceeds to step S33.
- Step S33 The CPU 113 performs processing according to the instruction. Since the specific processing method is the same as in the known technique, its description is omitted. When the CPU 113 has performed the processing according to the instruction, it returns to step S31.
- Step S34 The CPU 113 controls the acquisition unit 120 to acquire image data from an external device or a recording medium. At this time, the CPU 113 acquires tag information together with the image data. Then, the CPU 113 temporarily records the acquired image data and tag information in the memory 107.
- the tag information includes the type of shooting mode, inserted image information, image data of the inserted image, insertion information, and the like.
- the type of shooting mode is information indicating the shooting mode used when the image data was shot.
- the inserted image information is information that serves as the basis for creating an inserted image, such as the shooting date, shooting position, and shooting conditions.
- the inserted image information is, for example, code data indicating characters.
- the image data of the inserted image is image data indicating shooting-related information and stamp information similar to those in the first embodiment.
- the insertion information includes the insertion position information described in the first embodiment, the face distribution evaluation value Dfx, the contrast distribution evaluation value Dcx, the weighting factors Wf and Wc, and the subject distribution evaluation value Dx.
- the insertion information also includes information indicating the priority order of each inserted image when there are a plurality of inserted images.
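One way to picture the tag information read in step S34 is as a plain record. The field names below are illustrative assumptions, not the actual tag keys written by the camera:

```python
# Illustrative layout of the tag information; key names are assumptions.

tag_info = {
    "shooting_mode": "portrait",
    "inserted_image_info": "2006-03-23",  # basis for creating the image
    "inserted_image_data": None,          # pre-rendered image, if present
    "insertion_info": {
        "position": (2, 2),
        "Dfx": 0.4, "Dcx": 0.3,
        "Wf": 0.8, "Wc": 0.2,
        "priority": [1],                  # order among multiple inserts
    },
}

def has_insertable_content(tag):
    # Mirrors the check in step S35: either the source information for
    # building an inserted image, or ready-made image data, must exist.
    return bool(tag.get("inserted_image_info") or tag.get("inserted_image_data"))
```

With such a record, step S35 reduces to a presence check on two fields, and step S36 to a presence check on `insertion_info["position"]`.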
- Step S35 The CPU 113 determines whether or not the inserted image information or the image data of the inserted image exists in the tag information. If the CPU 113 determines that the inserted image information or the image data of the inserted image exists in the tag information, the CPU 113 proceeds to step S36. On the other hand, when determining that neither exists, the CPU 113 proceeds to step S38 described later.
- Step S36 The CPU 113 determines whether or not insertion position information exists in the tag information. When determining that the insertion position information is present in the tag information, the CPU 113 proceeds to step S40 described later. On the other hand, when determining that the insertion position information does not exist, the CPU 113 proceeds to step S37.
- the CPU 113 may determine whether or not the face distribution evaluation value Dfx, the contrast distribution evaluation value Dcx, the weighting factors Wf and Wc, the subject distribution evaluation value Dx, and the like exist in the tag information. Then, if the subject distribution evaluation value Dx does not exist, the CPU 113 calculates the subject distribution evaluation value Dx and proceeds to step S40 described later.
- Step S37 The CPU 113 determines the insertion position.
- the CPU 113 determines the insertion position as in the first embodiment described above. This determination may be made as appropriate according to the information included in the tag information of the image data acquired in step S34.
- the CPU 113 performs the same processing as steps S4 to S13 in the flowcharts of FIGS. 2 and 3 to calculate the subject distribution evaluation value Dx.
- the CPU 113 performs the same processing as step S14, step S15, step S17, and step S18 in the flowchart shown in FIG. 3 to determine the insertion positions of the shooting-related information and the stamp information.
- Step S38 The CPU 113 determines whether or not insertion has been designated. If the CPU 113 determines that insertion is designated, the process proceeds to step S39. On the other hand, if it determines that insertion is not designated, the CPU 113 ends the series of processing. Note that the user designates insertion by operating the operation unit 110.
- Step S39 The CPU 113 recognizes the position designated by the user in step S38 and determines the insertion position.
- Step S40 The CPU 113 generates a preview image to be superimposed on the image data temporarily recorded in the memory 107.
- the preview image is an image in which the insertion image is arranged at the insertion position determined in step S37 or step S39, or at the insertion position stored in the tag information. If the CPU 113 recognizes that the image data of the inserted image exists in the tag information, it inserts this as the inserted image. If the CPU 113 recognizes that the inserted image information is present in the tag information, it generates and inserts an inserted image based on the inserted image information. When inserting a plurality of inserted images, the plurality of inserted images may be arranged simultaneously in the preview image. When the CPU 113 has generated the preview image, it temporarily records it in the memory 107.
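The preview composition can be sketched as pasting the inserted image into a copy of the base image at the chosen position. The representation below (2D lists of characters instead of pixel buffers) is an illustrative simplification:

```python
# Minimal sketch of preview generation: paste the inserted "image"
# into a copy of the base image at the insertion position. A real
# implementation would blend pixel buffers instead of characters.

def compose_preview(base, insert, top, left):
    preview = [row[:] for row in base]          # keep the original intact
    for r, row in enumerate(insert):
        for c, value in enumerate(row):
            preview[top + r][left + c] = value  # overwrite at the position
    return preview

base = [["." for _ in range(4)] for _ in range(3)]
# Insert a 1x2 "date stamp" at the bottom-left small area
preview = compose_preview(base, [["D", "T"]], top=2, left=0)
```

Because the base is copied, the preview can be regenerated after each adjustment operation (step S44) without touching the image data held in memory.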
- Step S41 The CPU 113 controls the display control unit 121 to display the preview image superimposed on the image data on the display unit 122.
- FIG. 8 shows a display example; the case where the shooting-related information is inserted is taken as an example.
- the CPU 113 displays the image data acquired in step S34 in area A1 of the display unit 122, with the preview image superimposed on it.
- the CPU 113 displays, in area A2 of the display unit 122, a message prompting the user to perform the insertion execution or adjustment operation described later.
- the user can confirm the insertion position of the inserted image by viewing the display unit 122.
- Step S42 The CPU 113 determines whether or not insertion execution is instructed. When determining that insertion execution has been instructed, the CPU 113 proceeds to step S45 described later. On the other hand, if it determines that insertion execution is not instructed, the CPU 113 proceeds to step S43.
- the user instructs insertion execution by operating the operation unit 110.
- it may be configured such that insertion can be instructed for each inserted image.
- Step S43 The CPU 113 determines whether or not an adjustment operation has been performed. If the CPU 113 determines that an adjustment operation has been performed, it proceeds to step S44. On the other hand, when determining that no adjustment operation has been performed, the CPU 113 returns to step S42.
- the adjustment operation is an operation for the user to adjust the insertion position. The user performs an adjustment operation by operating the operation unit 110 while viewing the preview image displayed on the display unit 122.
- it may be configured so that an adjustment operation can be performed for each inserted image.
- Step S44 The CPU 113 changes the insertion position in accordance with the adjustment operation performed in step S43, and returns to step S40.
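The adjustment step can be sketched as moving the insertion position by the user's offset while keeping the inserted image inside the frame. The clamping rule, frame size, and insert size below are illustrative assumptions:

```python
# Sketch of step S44: apply the user's adjustment offset, clamped so
# the inserted image never leaves the frame. Sizes are assumptions.

def adjust_position(pos, delta, frame, insert_size):
    x = min(max(pos[0] + delta[0], 0), frame[0] - insert_size[0])
    y = min(max(pos[1] + delta[1], 0), frame[1] - insert_size[1])
    return (x, y)

# A large upward drag is clamped to the top edge of a 480x640 frame
moved = adjust_position((10, 10), (-30, 5), frame=(480, 640), insert_size=(32, 120))
```

After each adjustment the flow returns to step S40, so the preview is rebuilt at the clamped position before the user confirms insertion.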
- Step S45 The CPU 113 inserts the insertion image into the image data temporarily recorded in the memory 107, as in the first embodiment described above. When inserting a plurality of inserted images, they may be inserted simultaneously. The CPU 113 temporarily records the image data with the inserted image in the memory 107.
- Step S46 The CPU 113 records the image data temporarily recorded in the memory 107 in the recording unit 108.
- by causing the computer to execute the image processing program that realizes the image processing on the image data to be processed, the same effect as in the first embodiment can be obtained.
- a confirmation image for the user to confirm the determined insertion position is displayed on the display unit of the computer. Therefore, the user can confirm the insertion position of the insertion image by visually observing the display unit before executing the insertion of the insertion image.
- a confirmation image for the user to confirm the determined insertion position is displayed on the display unit of the computer, and after the display starts, an insertion execution instruction by the user is accepted.
- when the insertion execution instruction is accepted, the insertion image is inserted at the insertion position in the image data to be processed. Therefore, the user can confirm the insertion position of the inserted image by viewing the display unit before executing the insertion, and can issue the insertion execution instruction at a desired timing after confirmation.
- a configuration may be adopted in which a plurality of inserted images are processed in order.
- a configuration may be adopted in which a plurality of inserted images are processed simultaneously.
- the image data image and the preview image may be displayed in a superimposed manner and an insertion execution instruction may be received.
- the control program executed by the computer 100 of the second embodiment may be recorded in the electronic camera, and the same processing as that of the second embodiment may be performed.
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2007800102889A CN101406040B (zh) | 2006-03-23 | 2007-03-02 | 摄像机及图像处理装置和方法 |
EP07713545A EP1995949B1 (en) | 2006-03-23 | 2007-03-02 | Camera and image processing program |
AT07713545T ATE534234T1 (de) | 2006-03-23 | 2007-03-02 | Kamera und bildverarbeitungsprogramm |
JP2008506171A JP5083206B2 (ja) | 2006-03-23 | 2007-03-02 | カメラおよび画像処理プログラム |
US12/225,159 US8199242B2 (en) | 2006-03-23 | 2007-03-02 | Camera and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006079966 | 2006-03-23 | ||
JP2006-079966 | 2006-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007108200A1 true WO2007108200A1 (ja) | 2007-09-27 |
Family
ID=38522236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/000163 WO2007108200A1 (ja) | 2006-03-23 | 2007-03-02 | カメラおよび画像処理プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US8199242B2 (ja) |
EP (2) | EP2405646A1 (ja) |
JP (3) | JP5083206B2 (ja) |
CN (2) | CN102325243A (ja) |
AT (1) | ATE534234T1 (ja) |
WO (1) | WO2007108200A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010026170A1 (en) | 2008-09-02 | 2010-03-11 | Ecole Polytechnique Federale De Lausanne (Epfl) | Image annotation on portable devices |
JP2010200267A (ja) * | 2009-02-27 | 2010-09-09 | Toshiba Corp | 表示システムおよび表示方法 |
JP2010200268A (ja) * | 2009-02-27 | 2010-09-09 | Toshiba Corp | 表示システム及び表示方法 |
JP2011076340A (ja) * | 2009-09-30 | 2011-04-14 | Ntt Docomo Inc | 情報処理装置及びプログラム |
JP2014038601A (ja) * | 2012-08-16 | 2014-02-27 | Naver Corp | イメージ分析によるイメージ自動編集装置、方法およびコンピュータ読み取り可能な記録媒体 |
JP5753945B2 (ja) * | 2012-05-16 | 2015-07-22 | 楽天株式会社 | 画像処理装置、画像処理装置の制御方法、プログラム、及び情報記憶媒体 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4911191B2 (ja) * | 2009-04-08 | 2012-04-04 | 株式会社ニコン | 画像処理装置および画像処理プログラム |
CN101846864B (zh) * | 2010-04-13 | 2011-05-11 | 世大光电(深圳)有限公司 | 物体追迹光学系统及方法 |
US9584736B2 (en) * | 2011-09-23 | 2017-02-28 | Disney Enterprises, Inc. | Automatic repositioning of video elements |
US9916538B2 (en) | 2012-09-15 | 2018-03-13 | Z Advanced Computing, Inc. | Method and system for feature detection |
US8873813B2 (en) | 2012-09-17 | 2014-10-28 | Z Advanced Computing, Inc. | Application of Z-webs and Z-factors to analytics, search engine, learning, recognition, natural language, and other utilities |
US11914674B2 (en) | 2011-09-24 | 2024-02-27 | Z Advanced Computing, Inc. | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
US11074495B2 (en) | 2013-02-28 | 2021-07-27 | Z Advanced Computing, Inc. (Zac) | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
US11195057B2 (en) | 2014-03-18 | 2021-12-07 | Z Advanced Computing, Inc. | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
RU2015137077A (ru) * | 2013-02-01 | 2017-03-06 | Руф Машиненбау Гмбх Унд Ко Кг | Подающее устройство для подвода штучного материала в брикетный пресс |
US20140378810A1 (en) * | 2013-04-18 | 2014-12-25 | Digimarc Corporation | Physiologic data acquisition and analysis |
JP7069751B2 (ja) * | 2018-01-29 | 2022-05-18 | カシオ計算機株式会社 | 印刷装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10247135A (ja) * | 1997-03-05 | 1998-09-14 | Hitachi Ltd | メッセージ表示装置および表示方法 |
JP2001008064A (ja) * | 1999-06-24 | 2001-01-12 | Casio Comput Co Ltd | 電子カメラ装置及び重畳表示情報配置方法 |
JP2001136424A (ja) | 1999-11-02 | 2001-05-18 | Canon Inc | 撮影装置及び撮影制御方法 |
JP2002010066A (ja) * | 2000-06-26 | 2002-01-11 | Olympus Optical Co Ltd | 画像再生装置 |
JP2006033370A (ja) * | 2004-07-15 | 2006-02-02 | Fuji Photo Film Co Ltd | 撮像装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5986706A (en) * | 1993-07-22 | 1999-11-16 | Canon Kabushiki Kaisha | Electronic apparatus |
JPH09211679A (ja) | 1996-02-02 | 1997-08-15 | Nikon Corp | カメラのデータ写し込み装置 |
JPH1026967A (ja) | 1996-07-12 | 1998-01-27 | Toshiba Corp | 画像処理装置 |
JP4456202B2 (ja) * | 1999-08-23 | 2010-04-28 | 富士フイルム株式会社 | カメラシステム及びプリンタ付きカメラ |
US6940545B1 (en) * | 2000-02-28 | 2005-09-06 | Eastman Kodak Company | Face detecting camera and method |
JP2002016833A (ja) * | 2000-06-28 | 2002-01-18 | Ricoh Co Ltd | 画像入力装置、デジタルカメラ、携帯情報入力装置、画像入力システム、画像入力方法、およびコンピュータが読取可能な記録媒体 |
JP3957479B2 (ja) * | 2001-06-07 | 2007-08-15 | 三菱電機株式会社 | 映像信号編集装置 |
US7298412B2 (en) * | 2001-09-18 | 2007-11-20 | Ricoh Company, Limited | Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program |
US7068309B2 (en) * | 2001-10-09 | 2006-06-27 | Microsoft Corp. | Image exchange with image annotation |
JP2004046591A (ja) | 2002-07-12 | 2004-02-12 | Konica Minolta Holdings Inc | 画像評価装置 |
JP2004147174A (ja) * | 2002-10-25 | 2004-05-20 | Make Softwear:Kk | 写真自動販売機、画像入力方法、および画像入力プログラム |
US20040207743A1 (en) * | 2003-04-15 | 2004-10-21 | Nikon Corporation | Digital camera system |
JP3946676B2 (ja) | 2003-08-28 | 2007-07-18 | 株式会社東芝 | 撮影画像処理装置及びその方法 |
JP2005184778A (ja) | 2003-11-27 | 2005-07-07 | Fuji Photo Film Co Ltd | 撮像装置 |
JP2005173282A (ja) * | 2003-12-12 | 2005-06-30 | Canon Inc | カメラ |
US7528868B2 (en) * | 2003-12-18 | 2009-05-05 | Eastman Kodak Company | Image metadata attachment |
JP2005215750A (ja) * | 2004-01-27 | 2005-08-11 | Canon Inc | 顔検知装置および顔検知方法 |
US20050231613A1 (en) * | 2004-04-16 | 2005-10-20 | Vincent Skurdal | Method for providing superimposed video capability on a digital photographic device |
JP2005338278A (ja) * | 2004-05-25 | 2005-12-08 | Nikon Corp | 撮影用照明装置およびカメラシステム |
JP4210254B2 (ja) * | 2004-10-19 | 2009-01-14 | 富士フイルム株式会社 | プリンタ及びプリント方法 |
US7936919B2 (en) * | 2005-01-18 | 2011-05-03 | Fujifilm Corporation | Correction of color balance of face images depending upon whether image is color or monochrome |
-
2007
- 2007-03-02 EP EP11183723A patent/EP2405646A1/en not_active Ceased
- 2007-03-02 WO PCT/JP2007/000163 patent/WO2007108200A1/ja active Application Filing
- 2007-03-02 CN CN2011103042587A patent/CN102325243A/zh active Pending
- 2007-03-02 US US12/225,159 patent/US8199242B2/en not_active Expired - Fee Related
- 2007-03-02 CN CN2007800102889A patent/CN101406040B/zh not_active Expired - Fee Related
- 2007-03-02 AT AT07713545T patent/ATE534234T1/de active
- 2007-03-02 JP JP2008506171A patent/JP5083206B2/ja not_active Expired - Fee Related
- 2007-03-02 EP EP07713545A patent/EP1995949B1/en not_active Not-in-force
-
2011
- 2011-11-07 JP JP2011243331A patent/JP2012065338A/ja active Pending
-
2012
- 2012-04-09 JP JP2012088423A patent/JP5392372B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10247135A (ja) * | 1997-03-05 | 1998-09-14 | Hitachi Ltd | メッセージ表示装置および表示方法 |
JP2001008064A (ja) * | 1999-06-24 | 2001-01-12 | Casio Comput Co Ltd | 電子カメラ装置及び重畳表示情報配置方法 |
JP2001136424A (ja) | 1999-11-02 | 2001-05-18 | Canon Inc | 撮影装置及び撮影制御方法 |
JP2002010066A (ja) * | 2000-06-26 | 2002-01-11 | Olympus Optical Co Ltd | 画像再生装置 |
JP2006033370A (ja) * | 2004-07-15 | 2006-02-02 | Fuji Photo Film Co Ltd | 撮像装置 |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010026170A1 (en) | 2008-09-02 | 2010-03-11 | Ecole Polytechnique Federale De Lausanne (Epfl) | Image annotation on portable devices |
CN102204238A (zh) * | 2008-09-02 | 2011-09-28 | 瑞士联邦理工大学,洛桑(Epfl) | 便携式设备上的图像标注 |
JP2012507761A (ja) * | 2008-09-02 | 2012-03-29 | エコール ポリテクニーク フェデラル ドゥ ローザンヌ(エーペーエフエル) | ポータブル・デバイス上での画像アノテーション |
US9953438B2 (en) | 2008-09-02 | 2018-04-24 | Ecole Polytechnic Federale De Lausanne (Epfl) | Image annotation on portable devices |
JP2010200267A (ja) * | 2009-02-27 | 2010-09-09 | Toshiba Corp | 表示システムおよび表示方法 |
JP2010200268A (ja) * | 2009-02-27 | 2010-09-09 | Toshiba Corp | 表示システム及び表示方法 |
JP2011076340A (ja) * | 2009-09-30 | 2011-04-14 | Ntt Docomo Inc | 情報処理装置及びプログラム |
JP5753945B2 (ja) * | 2012-05-16 | 2015-07-22 | 楽天株式会社 | 画像処理装置、画像処理装置の制御方法、プログラム、及び情報記憶媒体 |
US9996516B2 (en) | 2012-05-16 | 2018-06-12 | Rakuten, Inc. | Image processing device for determining a display position of an annotation |
JP2014038601A (ja) * | 2012-08-16 | 2014-02-27 | Naver Corp | イメージ分析によるイメージ自動編集装置、方法およびコンピュータ読み取り可能な記録媒体 |
JP2018110011A (ja) * | 2012-08-16 | 2018-07-12 | ネイバー コーポレーションNAVER Corporation | イメージ分析によるイメージ自動編集装置、方法およびコンピュータ読み取り可能な記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
US20090185046A1 (en) | 2009-07-23 |
CN102325243A (zh) | 2012-01-18 |
JP5392372B2 (ja) | 2014-01-22 |
JP5083206B2 (ja) | 2012-11-28 |
EP1995949A1 (en) | 2008-11-26 |
JP2012178844A (ja) | 2012-09-13 |
JPWO2007108200A1 (ja) | 2009-08-06 |
EP2405646A1 (en) | 2012-01-11 |
JP2012065338A (ja) | 2012-03-29 |
CN101406040B (zh) | 2011-12-28 |
EP1995949B1 (en) | 2011-11-16 |
CN101406040A (zh) | 2009-04-08 |
US8199242B2 (en) | 2012-06-12 |
ATE534234T1 (de) | 2011-12-15 |
EP1995949A4 (en) | 2010-05-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5392372B2 (ja) | カメラおよび画像処理プログラム | |
US7672580B2 (en) | Imaging apparatus and method for controlling display device | |
CN101325658B (zh) | 成像设备和成像方法 | |
US8077216B2 (en) | Image pickup apparatus with a human face detecting function, method and program product for detecting a human face | |
JP4732303B2 (ja) | 撮像装置 | |
JP5467992B2 (ja) | 撮像装置 | |
JP5129683B2 (ja) | 撮像装置及びその制御方法 | |
JP2005318554A (ja) | 撮影装置及びその制御方法及びプログラム及び記憶媒体 | |
JP5604285B2 (ja) | 撮像装置 | |
KR101046041B1 (ko) | 촬상 장치 | |
JP2007148691A (ja) | 画像処理装置 | |
JP4760496B2 (ja) | 画像データ生成装置、画像データ生成方法 | |
JP2018007082A (ja) | 画像再生装置およびその制御方法およびプログラム | |
JP4632417B2 (ja) | 撮像装置、及びその制御方法 | |
JP2007259004A (ja) | デジタルカメラ、画像処理装置及び画像処理プログラム | |
JP5029765B2 (ja) | 画像データ生成装置、画像データ生成方法 | |
JP2009089083A (ja) | 年齢推定撮影装置及び年齢推定撮影方法 | |
JP5002311B2 (ja) | 撮像装置、撮像装置の制御方法、プログラム、及び記憶媒体 | |
JP5158438B2 (ja) | 画像生成装置、画像生成方法、及びプログラム | |
JP2010050874A (ja) | 画像処理装置、画像処理方法及びデジタルスチルカメラ | |
JP2010021674A (ja) | 撮像装置 | |
JP2006311451A (ja) | 画像再生装置、及び電子カメラ | |
JP5185803B2 (ja) | 特定対象画像判別システム | |
JP2009017134A (ja) | 撮像装置、撮像方法 | |
JP2013131970A (ja) | 撮像装置、撮像方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07713545 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2008506171 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12225159 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007713545 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200780010288.9 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |