JP2007150491A - Camera, and image control program - Google Patents

Camera, and image control program

Info

Publication number
JP2007150491A
JP2007150491A (application JP2005339698A)
Authority
JP
Japan
Prior art keywords
image
image data
similarity
data
image file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2005339698A
Other languages
Japanese (ja)
Other versions
JP4665731B2 (en)
Inventor
Hiroshi Sano
央 佐野
Original Assignee
Nikon Corp
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp, 株式会社ニコン filed Critical Nikon Corp
Priority to JP2005339698A priority Critical patent/JP4665731B2/en
Publication of JP2007150491A publication Critical patent/JP2007150491A/en
Application granted granted Critical
Publication of JP4665731B2 publication Critical patent/JP4665731B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

An image is classified according to the similarity of image data.
An image file generation unit 104a generates an image file comprising an additional information part and an image data part generated from original image data captured by an image sensor 103. A similarity determination unit 104b takes, as comparison target data, a plurality of items of one type of target data chosen from among the image data part, the original image data, and image data estimated from the image file, and determines their degree of similarity by comparing the patterns expressed by the comparison target data. An image classification unit 104c classifies the image file based on the similarity determination result from the similarity determination unit 104b.
[Selection] Figure 1

Description

  The present invention relates to a camera that captures an image and an image management program for managing an image captured by the camera.

Patent Document 1 discloses an electronic camera in which shooting data is recorded in a folder selected in advance by the user.

JP 2001-169222 A

However, because the conventional electronic camera requires the user to select a recording destination folder in advance, a user who wants to record shooting data in different folders according to the captured images must select the recording destination folder each time, which is inconvenient.

In the present invention, one of the following is set as comparison target data: an image data part generated based on original image data captured by an image sensor, the original image data itself, original image data estimated from an image file, intermediate image data generated in the process of producing the image file from the original image data, or intermediate image data estimated from the image file. The patterns expressed by the comparison target data are compared to determine their similarity, and the image files are classified based on the determination result.
In determining the similarity, the similarity may be determined after matching the pixel counts of the items of comparison target data, or after reducing the pixel counts of the comparison target data to shrink the images.
Image files determined to have a high degree of similarity may be classified by storing them in the same folder, or by giving each of them a file name indicating that their similarity is high. The classification result may also be stored as a similarity management table.
When determining the similarity of images, it is preferable to first compensate for the exposure deviation between the items of comparison target data caused by at least one of the shooting sensitivity, shutter speed, and aperture value used when capturing them.
The image data part may include a thumbnail image generated based on the image data, and the similarity may be determined based on the thumbnail image.

According to the present invention, because image files are classified according to the similarity of the patterns expressed by the comparison target data, the files are automatically classified by image pattern, improving convenience for the user.

FIG. 1 is a block diagram showing the configuration of one embodiment of a camera according to the present invention, here a digital camera. The digital camera 100 includes an input switch 101, a lens 102, an image sensor 103, a control device 104, a memory 105, a memory card slot 106, and a monitor 107.

  The input switch 101 includes various input members operated by the user. For example, the input switch 101 includes a release button and various operation buttons for operating the digital camera 100.

The lens 102 is composed of a plurality of optical lens groups but is represented by a single lens in FIG. 1. The image sensor 103 is, for example, a CCD; each time the user presses the release button included in the input switch 101, it captures the subject image input through the lens 102 and outputs the captured image data to the control device 104.

In the image sensor 103, R (red), G (green), and B (blue) color filters, the most common arrangement for a single-chip color image sensor, are arranged in a Bayer array. The image data captured by the image sensor 103 through the lens 102 is assumed to be expressed in the RGB color system, with each pixel carrying the color information of exactly one of the R, G, and B color components. That is, the image is assumed to be a Bayer image.

Here, the color information of each of the R, G, and B color components is, for example, an intensity signal represented by a level from 0 to 255, and the luminance of each pixel is determined from the intensity signals of the R, G, and B colors. Note that the image data (original image data) output from the image sensor 103 to the control device 104 is the raw Bayer image data captured by the image sensor 103, that is, RAW data. It is also assumed that all image data output from the image sensor 103 to the control device 104 has the same image size.
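The Bayer representation described above can be sketched as follows: a minimal illustration in which each RAW pixel carries exactly one of the R, G, and B intensities (0 to 255). The RGGB layout and the function names are assumptions for illustration; the document does not specify which Bayer variant is used.

```python
import numpy as np

def bayer_channel(y, x):
    # Return which color component the pixel at (y, x) carries,
    # assuming an RGGB Bayer arrangement.
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

def make_bayer(h, w, scene):
    # Sample a full-color scene (H, W, 3) down to a single-channel
    # Bayer mosaic, keeping only one component per pixel.
    idx = {"R": 0, "G": 1, "B": 2}
    raw = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            raw[y, x] = scene[y, x, idx[bayer_channel(y, x)]]
    return raw
```

The interpolation processing described later reconstructs the two missing components at each pixel from its neighbors.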

  The control device 104 includes a CPU and other peripheral circuits, and functionally includes an image file generation unit 104a, a similarity determination unit 104b, and an image classification unit 104c. Note that the functions of 104a to 104c included in the control device 104 may be realized by hardware or may be realized by software. In the present embodiment, an example will be described in which the control device 104 implements each function of 104a to 104c in software by executing an image processing program stored in the memory 105.

The image file generation unit 104a generates an image file including an image data part and an additional information part based on the RAW data input from the image sensor 103. The format of the generated image file is assumed to be set in advance by the user: in this embodiment, the user operates the input switch 101 to select either the RAW format or the JPEG format.

The image file generation unit 104a applies image processing corresponding to the user-selected file format to the RAW data input from the image sensor 103 to generate the image data part. It then appends an additional information part, composed of information that affects exposure at the time of image capture, such as imaging date information, shooting sensitivity information, aperture setting information of the lens 102, and shutter speed information, to produce an image file in the set format.

When the RAW format is set as the image file format, the image file generation unit 104a performs no image processing on the RAW data input from the image sensor 103; it uses the RAW data as-is as the image data part and adds the additional information part to generate a RAW-format image file.

On the other hand, when the JPEG format is set as the image file format, the image file generation unit 104a temporarily stores the RAW data input from the image sensor 103 in the memory 105, then executes the image processing described later on that RAW data and compresses the result into JPEG format. The compressed JPEG image data becomes the image data part, to which the additional information part described above is added to generate a JPEG-format image file. In this way, the original image data used to generate the image data part, that is, the RAW data, is saved in the memory 105 before the JPEG-format image file is generated.

For the image processing that compresses RAW data into JPEG-format image data, a technique such as that described in Japanese Patent Laid-Open No. 2000-23083 is used. Since this image processing is a known technique, a detailed description is omitted, but the processing flow is as follows.

First, the image file generation unit 104a performs interpolation processing on the RAW data input from the image sensor 103, that is, the Bayer image data, so that each pixel constituting the image data carries color information for all of the R, G, and B color components, converting it into color image data (RGB color system image data). White balance is then adjusted by applying a gain to each of the R, G, and B values of each pixel. The white-balanced image data is smoothed by applying, for example, a 3 × 3 smoothing filter (matrix), after which tone correction is performed.

Then, to convert the RGB color system image data into YCbCr color system image data, a conversion matrix is applied to the tone-corrected image data. Subjecting the resulting image data to compression at a predetermined compression rate yields JPEG-format image data.
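The white-balance, smoothing, tone-correction, and YCbCr-conversion steps above can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation: the gain values, the gamma exponent, and the function names are assumptions, the demosaicing (interpolation) step is omitted, and the BT.601 matrix is one common choice for pre-JPEG conversion.

```python
import numpy as np

def white_balance(rgb, gains=(1.2, 1.0, 1.1)):
    # Apply a per-channel gain to each pixel; clip to the 0-255 range.
    return np.clip(rgb * np.asarray(gains), 0, 255)

def smooth_3x3(rgb):
    # Apply a 3x3 box smoothing filter to each channel.
    h, w = rgb.shape[0], rgb.shape[1]
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    out = np.zeros(rgb.shape, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0

def tone_correct(rgb, gamma=1.0 / 2.2):
    # Simple gamma-style gradation correction on 0-255 data.
    return 255.0 * (rgb / 255.0) ** gamma

def rgb_to_ycbcr(rgb):
    # ITU-R BT.601 RGB-to-YCbCr matrix, commonly used before JPEG compression.
    m = np.array([[0.299, 0.587, 0.114],
                  [-0.168736, -0.331264, 0.5],
                  [0.5, -0.418688, -0.081312]])
    ycbcr = rgb @ m.T
    ycbcr[..., 1:] += 128.0  # center the chroma channels
    return ycbcr
```

After these steps, the YCbCr data would be handed to the DCT/quantization stages of the JPEG encoder proper.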

The similarity determination unit 104b determines, based on the RAW data, the similarity of the patterns of images captured successively by the image sensor 103. The image classification unit 104c then classifies the image files based on that determination so that images with high similarity fall into the same group. In the present embodiment, the image classification unit 104c classifies images by storing those with high similarity in the same folder on the memory card inserted in the memory card slot 106. Note that the classification result may instead be stored in the memory 105 rather than on the memory card.

Here, the similarity of the patterns of images means the similarity of the subject and composition in each image. If this similarity is at or above a predetermined level, the images can be judged to have been captured in the same scene. For example, when the user of the digital camera 100 continuously captures several frames of the same subject while changing the aperture setting and shutter speed of the lens 102, the images of those frames will be judged to have high similarity.

  First, the similarity determination unit 104b acquires RAW data as target data used at the time of similarity determination according to the format of the image file generated by the image file generation unit 104a. That is, when a RAW format image file is generated by the image file generation unit 104a, RAW data is acquired from the image data unit included in the image file. On the other hand, when an image file in JPEG format is generated by the image file generation unit 104a, RAW data temporarily stored in the memory 105 is acquired.

The similarity determination unit 104b then determines whether the acquired RAW data should serve as the reference for similarity determination (reference image data) or as image data to be compared against already-set reference image data (target image data).

Specifically, the similarity determination unit 104b determines whether reference image data already exists in the memory 105 and whether a predetermined time or more has elapsed since the previous image was captured. If no reference image data exists in the memory 105, or if the elapsed time since the previous capture is at or above the predetermined time, the similarity determination unit 104b sets the acquired RAW data as reference image data and stores it in the memory 105.

At the same time, the image classification unit 104c creates a folder on the memory card for storing, as one group, the image file corresponding to the newly set reference image data (the reference image file) together with the image files (target image files) corresponding to target image data that the similarity determination unit 104b later judges, in the processing described below, to have high similarity to that reference. The reference image file corresponding to the RAW data set as the reference image data is then stored in the created folder.

On the other hand, if reference image data exists in the memory 105 and the elapsed time since the previous capture is less than the predetermined time, the similarity determination unit 104b sets the acquired RAW data as target image data.
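The reference/target decision just described can be sketched as follows: newly acquired RAW data becomes a new reference when no reference exists or when too long has passed since the previous capture, and otherwise becomes target data. The timeout value, the state dictionary, and the function name are illustrative assumptions; the patent leaves the "predetermined time" unspecified.

```python
# "Predetermined time" threshold in seconds; the actual value is an assumption.
REFERENCE_TIMEOUT_S = 10.0

def classify_capture(state, raw, now):
    """Decide whether newly acquired RAW data is 'reference' or 'target'.

    state holds the current reference data and the previous capture time;
    it is updated in place.
    """
    if (state.get("reference") is None
            or now - state.get("last_capture", now) >= REFERENCE_TIMEOUT_S):
        # No reference yet, or too much time has passed: start a new reference.
        state["reference"] = raw
        role = "reference"
    else:
        # A recent reference exists: compare this capture against it.
        role = "target"
    state["last_capture"] = now
    return role
```

A continuous burst therefore keeps one reference, while a pause longer than the timeout starts a fresh one.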

When the acquired RAW data is set as target image data, the similarity determination unit 104b treats the reference image data stored in the memory 105 and the target image data as comparison target data, and determines the degree of similarity between the pattern expressed by the reference image data and the pattern expressed by the target image data.

To determine the similarity of the patterns expressed by the comparison target data, the similarity determination unit 104b performs subtraction processing on the reference image data and the target image data. Specifically, the differences ΔR, ΔG, and ΔB between the color information of the R, G, and B color components of each pixel of the reference image data and the corresponding color information of the target image data are calculated by the following equations (1) to (3). In these equations, the R, G, and B component color information of the reference image data is denoted Ri, Gi, and Bi, that of the target image data is denoted R′i, G′i, and B′i, and i indicates the pixel position within each image data.

ΔR = Σ_i | Ri − R′i |   … (1)
ΔG = Σ_i | Gi − G′i |   … (2)
ΔB = Σ_i | Bi − B′i |   … (3)

If the calculation results of all of equations (1) to (3) are smaller than a predetermined threshold value, the similarity determination unit 104b determines that the similarity of the patterns expressed by the comparison target data is high; that is, the pattern expressed by the reference image data and the pattern expressed by the target image data are highly similar, and the two are judged to have been captured in the same scene.
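The threshold test on equations (1) to (3) can be sketched as follows: sum the absolute per-pixel differences for each of R, G, and B, and declare the patterns similar only if all three sums fall below the threshold. The threshold value is an assumption; the patent does not specify one.

```python
import numpy as np

def is_similar(reference, target, threshold=500000.0):
    # reference, target: (H, W, 3) arrays of RGB color information (0-255).
    # Compute the per-channel sums of absolute differences, i.e. the
    # delta-R, delta-G, delta-B of equations (1)-(3).
    diffs = np.abs(reference.astype(np.int64)
                   - target.astype(np.int64)).sum(axis=(0, 1))
    # High similarity only when every channel's difference is below threshold.
    return bool((diffs < threshold).all())
```

A single channel exceeding the threshold is enough to reject the match, mirroring the "any one of the formulas" condition in the text below.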

When the similarity determination unit 104b determines that the similarity between the pattern expressed by the reference image data and the pattern expressed by the target image data is high, the image classification unit 104c saves the target image file corresponding to the target image data in the memory-card folder where the reference image file is stored.

On the other hand, if the calculation result of any one of equations (1) to (3) is at or above the predetermined threshold, the similarity determination unit 104b determines that the similarity between the pattern expressed by the reference image data and the pattern expressed by the target image data is low. It then newly sets the RAW data currently set as target image data as the reference image data and stores it in the memory 105; that is, this RAW data becomes the reference for determining similarity with the next image captured by the image sensor 103.

The image classification unit 104c then creates a new folder on the memory card for storing, as one group, the newly set reference image data and any target image data later judged highly similar to it, and stores the reference image file corresponding to that RAW data in the created folder. In other words, the file that was the target image file in the processing described above is stored in the newly created folder as the reference image file for the new reference image data.

As a result, when the similarity between the patterns expressed by the reference image data and the target image data is high, the corresponding image files are stored in the same folder as one group, so image files captured in the same scene are classified into the same folder.

Conversely, when the similarity between the patterns is low, the image files are stored in different folders so that the reference image data and the target image data fall into different groups. Furthermore, because the target image data is then set as the new reference image data, subsequently captured images are classified by comparison with this new reference, allowing folders to be organized scene by scene.

FIG. 2 is a flowchart showing the processing of the digital camera 100 in the present embodiment. The processing shown in FIG. 2 is executed by the control device 104, which reads the image processing program from the memory 105 and starts it when the user presses the release button included in the input switch 101.

In step S10, the image file generation unit 104a generates an image file including an image data part and an additional information part based on the RAW data input from the image sensor 103. The process then proceeds to step S20, in which the similarity determination unit 104b acquires RAW data as the target data used for similarity determination according to the image file generated by the image file generation unit 104a, and then proceeds to step S30.

In step S30, the similarity determination unit 104b determines, based on whether reference image data already exists in the memory 105 and whether a predetermined time or more has elapsed since the previous image was captured, whether the acquired RAW data should be set as reference image data. If it should, the process proceeds to step S80, where the similarity determination unit 104b sets the acquired RAW data as reference image data and stores it in the memory 105. The process then proceeds to step S90.

In step S90, the image classification unit 104c creates a folder on the memory card for storing, as one group, the reference image file corresponding to the set reference image data and the target image files corresponding to target image data judged highly similar to it. The process then proceeds to step S100, in which the image classification unit 104c stores the reference image file in the created folder, and the processing ends.

On the other hand, if it is determined in step S30 that the acquired RAW data should not be set as reference image data, the process proceeds to step S40, in which the similarity determination unit 104b sets the acquired RAW data as target image data and proceeds to step S50. In step S50, the similarity determination unit 104b treats the reference image data stored in the memory 105 and the target image data as comparison target data and determines the degree of similarity between the patterns they express. The process then proceeds to step S60.

In step S60, the image classification unit 104c determines, based on the result from the similarity determination unit 104b, whether the similarity between the patterns expressed by the reference image data and the target image data is high or low. If the similarity is low, the process proceeds to step S80 described above. If it is high, the process proceeds to step S70, in which the image classification unit 104c saves the target image file corresponding to the target image data in the memory-card folder where the reference image file is stored, and the processing ends.

According to the present embodiment described above, the following operational effects can be obtained.
(1) The degree of similarity is determined by comparing the patterns expressed by the reference image data and the target image data, and the image files are classified based on the result. Image files are thus classified automatically according to image similarity, sparing the user the trouble of classifying images manually.

(2) The image files classified based on the similarity determination result are divided into folders so that image files with high similarity are stored in the same folder. Thus, the user can easily access the classified image file only by selecting a folder.

(3) When generating a JPEG-format image file, the image file generation unit 104a temporarily stores the RAW data input from the image sensor 103 in the memory 105, and the similarity determination unit 104b acquires this temporarily stored RAW data to perform the similarity determination described above. Because the image data part of a JPEG-format image file, that is, the JPEG image data, has been compressed after undergoing various image processing, using it for similarity determination could be affected by that processing and might not give good results. Performing the determination on RAW data that has not undergone image processing avoids this problem.

(4) RAW data acquired from the image sensor 103 is set as reference image data when no reference image data exists in the memory 105 or when the elapsed time since the previous capture is at or above a predetermined time, and target image data captured thereafter is compared against that reference. Because continuously captured images are likely to belong to the same scene, the comparison can be made between one reference image and one target image, which greatly reduces the processing load compared with, for example, using all previously captured image data as references against each newly captured target image.

(5) To determine the similarity of the patterns expressed by the comparison target data, the differences between the R, G, and B color information of each pixel of the reference image data and that of the target image data are calculated by equations (1) to (3), and the similarity is judged high when all three results are smaller than a predetermined threshold. Because the per-pixel color differences between two images captured in the same scene are generally small, this allows the similarity of the patterns expressed by the comparison target data to be determined accurately.

-Modification-
The digital camera according to the above-described embodiment can be modified as follows.
(1) In the above embodiment, the image classification unit 104c classifies images with high pattern similarity into the same group, based on the determination result of the similarity determination unit 104b, by storing them in the same folder on the memory card inserted in the memory card slot 106. The present invention is not limited to this: images with high similarity may be grouped and the result stored as data, for example as a similarity management table. Alternatively, each image file may be given a file name indicating high pattern similarity, for example by having highly similar images share part of their file names.
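The file-name and management-table alternatives just mentioned can be sketched as follows. The naming scheme (a shared `sceneNNN` prefix) and the table layout are assumptions for illustration; the patent does not prescribe either.

```python
def grouped_name(group_id, seq, ext="jpg"):
    # Files judged similar share a group prefix in their names.
    return "scene%03d_%04d.%s" % (group_id, seq, ext)

def add_to_table(table, group_id, filename):
    # A similarity management table: group id -> list of member files.
    table.setdefault(group_id, []).append(filename)
    return table
```

Either mechanism records the grouping without requiring separate folders on the memory card.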

(2) The above embodiment assumed that all image data output from the image sensor 103 to the control device 104 has the same image size. The present invention is not limited to this and can also be applied when the image size differs from image to image, for example when the user changes the recording size during continuous shooting. In that case, when the reference image data and the target image data differ in image size, the similarity determination unit 104b first matches their pixel counts to a common image size and then determines the similarity of the patterns expressed by each image data.

As a result, the similarity determination can be performed accurately even when the reference image data and the target image data differ in image size. In that case, enlarging the smaller image to match the larger one increases the amount of data entering the calculations of equations (1) to (3), which can also improve the accuracy of the determination result.
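Matching the pixel counts before comparison can be sketched as follows, here by enlarging the smaller image to the larger one's size with nearest-neighbor resampling. This is a minimal illustration; a real implementation would likely use proper interpolation, and the function names are assumptions.

```python
import numpy as np

def match_sizes(a, b):
    """Resize two (H, W, 3) images to a common (max) size so that the
    per-pixel differences of equations (1)-(3) can be computed."""
    def resize_nn(img, h, w):
        # Nearest-neighbor resampling via integer index mapping.
        ys = np.arange(h) * img.shape[0] // h
        xs = np.arange(w) * img.shape[1] // w
        return img[ys][:, xs]
    h = max(a.shape[0], b.shape[0])
    w = max(a.shape[1], b.shape[1])
    return resize_nn(a, h, w), resize_nn(b, h, w)
```

Reducing both images to a common smaller size instead, as the next paragraph suggests, only requires taking the minimum rather than the maximum dimensions.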

Alternatively, the image size may be reduced before determining similarity, regardless of the size of the image data output from the image sensor 103 to the control device 104; that is, the pixel counts of the reference image data and the target image data may both be decreased to a common size before the similarity determination. In this way, even if the reference and target image data differ in sharpness because of differences in focus, aperture, edge enhancement, and so on, reducing both image sizes lessens the influence of that sharpness difference on the pattern similarity determination.

(3) In the above embodiment, the similarity determination unit 104b performs similarity determination based on the image data part included in the image file or on the original image data, that is, the RAW data, saved in the memory 105. The present invention is not limited to this; the information in the additional information part of the image file may also be taken into account. For example, based on information that affects exposure at capture time (exposure parameters), such as the shooting sensitivity information, aperture setting information of the lens 102, and shutter speed information recorded in the additional information part, each image data may be scaled so that the reference image data and the target image data correspond to the same exposure. That is, the similarity determination may be performed after compensating for the exposure deviation between the comparison target data caused by at least one of the shooting sensitivity, shutter speed, and aperture value at the time of capturing the comparison target data.

Specifically, if the shutter speed when capturing the reference image data was 1/500 and that for the target image data was 1/250, each pixel value of the reference image data is doubled, or alternatively each pixel value of the target image data is halved, before the similarity determination is performed. In this way, even when the reference and target image data were captured with different exposure parameters, the resulting data difference can be compensated for, and the similarity of the patterns expressed by the two sets of image data can be determined accurately.
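The shutter-speed compensation in the 1/500 versus 1/250 example above can be sketched as follows: pixel levels scale with exposure time, so scaling by the ratio of the two shutter times brings one image to the other's exposure. Folding sensitivity and aperture into the same ratio would work analogously; the function name and the clipping at 255 are assumptions.

```python
def compensate_shutter(pixels, shutter_ref, shutter_target):
    """Scale pixel levels captured at exposure time shutter_ref (seconds)
    to match an image captured at shutter_target (seconds)."""
    # Doubling the exposure time doubles each pixel level, so the
    # correction factor is the ratio of the two exposure times.
    ratio = shutter_target / shutter_ref
    return [min(255.0, p * ratio) for p in pixels]
```

For the example in the text, `compensate_shutter(levels, 1/500, 1/250)` doubles the reference image's levels before the difference calculation.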

(4) In the above embodiment, regardless of whether the image file format is RAW or JPEG, the similarity determination unit 104b determines the similarity of the patterns expressed by the RAW data, based on that RAW data. The present invention is not limited to this: when a JPEG-format image file is generated by the image file generation unit 104a, the similarity of the patterns may instead be determined based on the image data part of the generated file, that is, on the JPEG-format image data.

Further, the similarity determination unit 104b may determine pattern similarity based on intermediate image data generated partway through producing a JPEG-format image file from RAW data. For example, the image file generation unit 104a performs interpolation processing on the RAW data and stores the interpolated RAW data in the memory 105; the similarity determination unit 104b then compares the interpolated RAW data as reference image data and target image data, as described above.

Alternatively, a thumbnail image may be generated and stored in the additional information part of the image file, and the similarity of the patterns expressed by the thumbnail images may be determined based on them. The thumbnail image is generated from the image data of the image data part by a known technique, so its generation method is not described here.

(5) In the above embodiment, when generating a JPEG-format image file, the image file generation unit 104a temporarily stores the RAW data input from the image sensor 103 in the memory 105, and the similarity determination unit 104b acquires this temporarily stored RAW data to perform the similarity determination. The present invention is not limited to this: when a JPEG-format image file has been generated by the image file generation unit 104a, the similarity determination unit 104b may estimate, from that JPEG-format image file, the interpolated RAW data (intermediate image data) from which the file was generated. The image file generation unit 104a also produces interpolated RAW data (intermediate image data) by color interpolation in the course of generating the JPEG file from the RAW data, and the similarity determination may then be executed using the estimated interpolated RAW data and this generated interpolated RAW data.

  In this case, the similarity determination unit 104b estimates the interpolated RAW data from the JPEG format image data by executing, in reverse, the various image processes that the image file generation unit 104a performed to compress the RAW data into JPEG format image data. Note that compression of RAW data into JPEG data is generally irreversible, so even if those processes are executed in reverse, the interpolated RAW data is not reproduced exactly. Nevertheless, when performing the similarity determination process of the above-described embodiment, image data can be estimated with sufficient accuracy for that process.

  As a result, when a JPEG format image file is generated, the similarity determination process can be performed by estimating the original interpolated RAW data from the generated JPEG format image file, without temporarily storing the RAW data input from the image sensor 103 in the memory 105. Note that the RAW data before interpolation, that is, the original image data, may be estimated instead of the interpolated RAW data.
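To illustrate why a lossy round trip still supports similarity determination, the sketch below models the forward processing as a gamma curve followed by coarse quantization (both assumptions standing in for the camera's actual processing chain, which is not reproduced in this excerpt) and estimates the interpolated RAW data by running the invertible steps in reverse:

```python
import numpy as np

def forward(interpolated_raw):
    """Model of the forward processing from interpolated RAW to JPEG-like
    data: a gamma curve, then coarse quantization standing in for the lossy
    part of JPEG compression (both are illustrative assumptions)."""
    gamma = (interpolated_raw / 255.0) ** (1 / 2.2) * 255.0
    return np.round(gamma / 8) * 8          # irreversible quantization step

def estimate_interpolated_raw(jpeg_like):
    """Run the forward steps in reverse to estimate the interpolated RAW.
    The quantization cannot be undone, so the estimate is only approximate."""
    return (jpeg_like / 255.0) ** 2.2 * 255.0

rng = np.random.default_rng(1)
raw = rng.integers(16, 240, size=(32, 32, 3)).astype(float)
estimate = estimate_interpolated_raw(forward(raw))

err = float(np.abs(estimate - raw).mean())
print(err)  # small but nonzero: accurate enough for similarity comparison
```

The residual error is exactly the point made in the text: reversing the processing does not reproduce the interpolated RAW data completely, but the estimate tracks the original closely enough for pattern comparison.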

(6) In the above-described embodiment, an example was described in which the similarity determination unit 104b calculates, according to equations (1) to (3), the differences between the color information of the R, G, and B color components of each pixel of the reference image data and the color information of the corresponding color components of each pixel of the target image data, and determines that the similarity between the pattern expressed by the reference image data and the pattern expressed by the target image data is high if all the calculation results are smaller than the threshold value. However, the present invention is not limited to this, and other methods may be used to determine the similarity between the patterns expressed by the reference image and the target image.
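A per-pixel check in the spirit of equations (1) to (3) can be sketched as follows; since the equations and the threshold value are not reproduced in this excerpt, the absolute-difference metric and threshold below are an illustrative reconstruction rather than the patent's exact formulas:

```python
import numpy as np

def patterns_similar(reference, target, threshold=12.0):
    """Compute the difference of the R, G, B color information for each
    pixel of the reference and target image data, and judge the patterns
    similar only if every difference is below the threshold (illustrative
    reconstruction of the embodiment's per-pixel check)."""
    diff = np.abs(reference.astype(float) - target.astype(float))
    return bool((diff < threshold).all())

rng = np.random.default_rng(2)
ref = rng.integers(0, 256, size=(16, 16, 3))
same_scene = np.clip(ref + rng.integers(-3, 4, size=ref.shape), 0, 255)

print(patterns_similar(ref, same_scene))  # True
print(patterns_similar(ref, 255 - ref))   # False
```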

  For example, the average value of the color information for each of the R, G, and B color components over the entire reference image data may be compared with the corresponding average value over the entire target image data, and if the differences are within a predetermined threshold, the similarity between the patterns expressed by the two images may be determined to be high. Alternatively, the histogram of the reference image data may be compared with the histogram of the target image data, and if the difference is within a predetermined threshold, the similarity between the patterns expressed by the two images may be determined to be high. At least two of the above determination methods may also be combined, in which case the similarity between the patterns expressed by the two images is determined to be high only when every method executed in combination determines that the similarity is high.
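The whole-image average variant, the histogram variant, and their combination might look as follows; the choice of luminance for the histogram, the bin count, and both thresholds are illustrative assumptions, not values from the patent:

```python
import numpy as np

def similar_by_channel_means(a, b, threshold=6.0):
    # Variant (a): compare the average R, G, B color information over the
    # whole image; similar if every per-channel difference is within threshold.
    return bool(np.all(np.abs(a.mean(axis=(0, 1)) - b.mean(axis=(0, 1))) < threshold))

def similar_by_histogram(a, b, bins=32, threshold=0.25):
    # Variant (b): compare normalized luminance histograms; similar if their
    # total absolute difference is within threshold.
    ha, _ = np.histogram(a.mean(axis=2), bins=bins, range=(0, 255))
    hb, _ = np.histogram(b.mean(axis=2), bins=bins, range=(0, 255))
    return bool(np.abs(ha / ha.sum() - hb / hb.sum()).sum() < threshold)

def similar_combined(a, b):
    # Combination: similar only if every combined method reports similarity.
    return similar_by_channel_means(a, b) and similar_by_histogram(a, b)

grad = np.tile(np.arange(256.0), (4, 1))       # horizontal luminance ramp
a = np.stack([grad, grad, grad], axis=2)       # (4, 256, 3) test image
b = a.copy()
b[:2, :8] += 30.0                              # same scene, small local change
dark = a * 0.3                                 # much darker pattern

print(similar_combined(a, b))     # True
print(similar_combined(a, dark))  # False
```

Requiring all combined methods to agree makes the classification stricter, which is the design intent the text describes.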

(7) In the above-described embodiment, the similarity determination unit 104b determines the similarity by comparing reference image data based on one image with target image data based on one image. However, the present invention is not limited to this. All the image data captured in the past may be used as reference image data, the single most recently captured image may be used as the target image data, and the plural pieces of reference image data may each be compared with the one piece of target image data. For example, when the user captures two images of the same scene at a time interval, the similarity determination process of the above-described embodiment may classify the two images into different folders. By comparing against all past images, this can be avoided, so that images captured in the same scene are reliably stored in the same folder.
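A sketch of this many-references scheme follows; the folder naming and the simple per-pixel similarity stand-in are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def patterns_similar(a, b, threshold=12.0):
    # Stand-in per-pixel similarity check for the embodiment's determination.
    return bool((np.abs(a.astype(float) - b.astype(float)) < threshold).all())

def classify(target, folders):
    """Compare the most recent shot against every previously captured image.
    The target joins the first folder holding any similar reference image;
    otherwise a new folder is opened (folder names here are illustrative)."""
    for name, references in folders.items():
        if any(patterns_similar(ref, target) for ref in references):
            references.append(target)
            return name
    name = f"folder_{len(folders) + 1}"
    folders[name] = [target]
    return name

rng = np.random.default_rng(3)
scene_a = rng.integers(0, 200, size=(8, 8, 3))
scene_b = np.clip(scene_a + 60, 0, 255)                      # different pattern
later_a = scene_a + rng.integers(-3, 4, size=scene_a.shape)  # same scene, later

folders = {}
print(classify(scene_a, folders))  # folder_1
print(classify(scene_b, folders))  # folder_2
print(classify(later_a, folders))  # folder_1: rejoins the earlier scene
```

Even with a different scene captured in between, the later shot of the first scene lands in the first folder, which is exactly the behavior the paragraph above argues for.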

(8) In the above-described embodiment, an example was described in which the processing shown in the flowchart of FIG. 2 is installed in the digital camera 100 and is activated and executed by the control device 104. However, the present invention is not limited to this, and the image management program may be installed in another device, such as a personal computer, and executed on the CPU of that personal computer.

  Note that the present invention is not limited to the configurations in the above-described embodiments as long as the characteristic functions of the present invention are not impaired.

FIG. 1 is a block diagram showing the configuration of an embodiment of a digital camera. FIG. 2 is a flowchart showing the processing of the digital camera.

Explanation of symbols

100: digital camera; 101: input switch; 102: lens; 103: image sensor; 104: control device; 104a: image file generation unit; 104b: similarity determination unit; 104c: image classification unit; 105: memory; 106: memory card slot; 107: monitor

Claims (16)

  1. A camera that classifies image files each including an image data portion generated based on original image data captured by an image sensor, comprising:
    determination means for taking, as comparison target data, any of the image data portion, the original image data, the original image data estimated from the image file, intermediate image data generated in the process of generating the image file from the original image data, and the intermediate image data estimated from the image file, comparing patterns expressed by the comparison target data, and determining the degree of similarity; and
    classification means for classifying the image file based on a determination result by the determination means.
  2. The camera according to claim 1, wherein
    the determination means determines the similarity after matching the numbers of pixels of the comparison target data.
  3. The camera according to claim 1 or 2, wherein
    the determination means determines the similarity after reducing the number of pixels of the comparison target data to reduce the images.
  4. The camera according to claim 1, wherein
    the classification means classifies the image files by storing image files determined by the determination means to have a high similarity in the same folder.
  5. The camera according to claim 1, wherein
    the classification means classifies the image files by setting, for image files determined by the determination means to have a high similarity, file names indicating that the similarity is high.
  6. The camera according to claim 1, wherein
    the classification means stores a classification result of the image files as a similarity management table.
  7. The camera according to claim 1, wherein
    the determination means determines the similarity after compensating for an exposure deviation between the comparison target data caused by at least one of the photographing sensitivity, shutter speed, and aperture value at the time the comparison target data were captured.
  8. The camera according to claim 1, wherein
    the image data portion includes a thumbnail image generated based on the image data, and
    the determination means determines the similarity based on the thumbnail image.
  9. An image management program for classifying image files each including an image data portion generated based on original image data captured by an image sensor, the program causing a computer to execute:
    a determination procedure of taking, as comparison target data, any of the image data portion, the original image data, the original image data estimated from the image file, intermediate image data generated in the process of generating the image file from the original image data, and the intermediate image data estimated from the image file, comparing patterns expressed by the comparison target data, and determining the degree of similarity; and
    a classification procedure of classifying the image file based on a determination result obtained by the determination procedure.
  10. The image management program according to claim 9, wherein
    the determination procedure determines the similarity after matching the numbers of pixels of the comparison target data.
  11. The image management program according to claim 9 or 10, wherein
    the determination procedure determines the similarity after reducing the number of pixels of the comparison target data to reduce the images.
  12. The image management program according to claim 9, wherein
    the classification procedure classifies the image files by storing image files determined by the determination procedure to have a high similarity in the same folder.
  13. The image management program according to claim 9, wherein
    the classification procedure classifies the image files by setting, for image files determined by the determination procedure to have a high similarity, file names indicating that the similarity is high.
  14. The image management program according to claim 9, wherein
    the classification procedure stores a classification result of the image files as a similarity management table.
  15. The image management program according to claim 9, wherein
    the determination procedure determines the similarity after compensating for an exposure deviation between the comparison target data caused by at least one of the photographing sensitivity, shutter speed, and aperture value at the time the comparison target data were captured.
  16. The image management program according to claim 9, wherein
    the image data portion includes a thumbnail image generated based on the image data, and
    the determination procedure determines the similarity based on the thumbnail image.
JP2005339698A 2005-11-25 2005-11-25 Camera and image management program Expired - Fee Related JP4665731B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005339698A JP4665731B2 (en) 2005-11-25 2005-11-25 Camera and image management program


Publications (2)

Publication Number Publication Date
JP2007150491A (en) 2007-06-14
JP4665731B2 (en) 2011-04-06

Family

ID=38211392

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005339698A Expired - Fee Related JP4665731B2 (en) 2005-11-25 2005-11-25 Camera and image management program

Country Status (1)

Country Link
JP (1) JP4665731B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009246450A (en) * 2008-03-28 2009-10-22 Casio Hitachi Mobile Communications Co Ltd Image processing apparatus and program
JP2011170424A (en) * 2010-02-16 2011-09-01 Nec Corp Mobile terminal and image classification method
JP2013127819A (en) * 2013-03-14 2013-06-27 Canon Inc Image processing apparatus and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004064700A (en) * 2002-07-31 2004-02-26 Sony Corp Image classification apparatus, image classification method, program and recording medium, and image classification system
JP2005020409A (en) * 2003-06-26 2005-01-20 Casio Comput Co Ltd Image photographing apparatus, image arrangement apparatus, and program
JP2005080056A (en) * 2003-09-02 2005-03-24 Fuji Photo Film Co Ltd Moving picture reproducing device and electronic camera
JP2005107885A (en) * 2003-09-30 2005-04-21 Casio Comput Co Ltd Image classifying device and program
JP2005151089A (en) * 2003-11-14 2005-06-09 Seiko Epson Corp Generation of parameter for image correction for image generation apparatus




Similar Documents

Publication Publication Date Title
US10142536B2 (en) Camera using preview image to select exposure
US8666124B2 (en) Real-time face tracking in a digital image acquisition device
CA2812540C (en) High dynamic range transition
US8983148B2 (en) Color segmentation
EP1737247B1 (en) Image sensing apparatus and image processing method
US7995116B2 (en) Varying camera self-determination based on subject motion
US7397502B2 (en) Imaging apparatus including control device for controlling white balance
JP4294896B2 (en) Image processing method and apparatus, and program therefor
US7424171B2 (en) Image processing method and apparatus, computer program, and computer readable storage medium
CN101325659B (en) Imaging device, imaging method
JP4746295B2 (en) Digital camera and photographing method
JP4973098B2 (en) Image processing apparatus, image processing method, and program
US7551797B2 (en) White balance adjustment
US7835550B2 (en) Face image recording apparatus, image sensing apparatus and methods of controlling same
US7903168B2 (en) Camera and method with additional evaluation image capture based on scene brightness changes
JP4837365B2 (en) Image processing system and image processing program
JP5898466B2 (en) Imaging device, control method thereof, and program
JP4315971B2 (en) Imaging device
US8872937B2 (en) Image capture apparatus and image capturing method
US7474341B2 (en) Portable digital camera with red eye filter
US8107764B2 (en) Image processing apparatus, image processing method, and image processing program
US8350926B2 (en) Imaging apparatus, method of processing imaging result, image processing apparatus, program of imaging result processing method, recording medium storing program of imaging result processing method, and imaging result processing system
US7834915B2 (en) Image processing apparatus, imaging apparatus, imaging processing method, and computer program
JP5791336B2 (en) Image processing apparatus and control method thereof
EP1834302B1 (en) Automatic white balancing of colour gain values

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081014

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20100625

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100629

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20100830

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100830

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100921

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20101122

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20101214


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20101227

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140121

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150


R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250


LAPS Cancellation because of no payment of annual fees