WO2006077703A1 - Imaging device, image processing device, and image recording device - Google Patents

Imaging device, image processing device, and image recording device Download PDF

Info

Publication number
WO2006077703A1
WO2006077703A1 (PCT/JP2005/023007)
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
shooting
information
scene
Prior art date
Application number
PCT/JP2005/023007
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroaki Takano
Takeshi Nakajima
Daisuke Sato
Tsukasa Ito
Original Assignee
Konica Minolta Photo Imaging, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Photo Imaging, Inc. filed Critical Konica Minolta Photo Imaging, Inc.
Publication of WO2006077703A1 publication Critical patent/WO2006077703A1/en

Links

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/76: Television signal recording
    • H04N5/765: Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77: Interface circuits between a recording apparatus and a television camera
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/79: Processing of colour television signals in connection with recording
    • H04N9/80: Transformation of the television signal for recording, e.g. modulation, frequency changing; inverse transformation for playback
    • H04N9/804: Transformation involving pulse code modulation of the colour picture signal components
    • H04N9/8042: Transformation involving pulse code modulation of the colour picture signal components involving data reduction
    • H04N9/8047: Transformation involving pulse code modulation of the colour picture signal components involving data reduction using transform coding

Definitions

  • Imaging apparatus, image processing apparatus, and image recording apparatus
  • the present invention relates to an imaging apparatus such as a digital camera, and an image processing apparatus and an image recording apparatus that perform an optimization process for forming an appreciation image on an output medium on captured image data obtained by the imaging apparatus.
  • Conventionally, captured image data obtained by an imaging device has been recorded on a recording medium such as a CD-R (Compact Disc Recordable), a floppy (registered trademark) disk, or a memory card, or distributed via communication networks.
  • The data is displayed on display devices such as CRT (Cathode Ray Tube), liquid crystal, and plasma displays, or on the small liquid crystal monitors of mobile phones, or printed as a hard copy image using an output device such as a digital printer, inkjet printer, or thermal printer.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2004-96500
  • An object of the present invention is to suppress information loss in the captured image information and to enable the image processing apparatus or image recording apparatus side to generate appreciation image reference data of higher quality than the image data obtained by the imaging device.
  • Scene reference raw data generation means for generating scene reference raw data depending on the characteristics of the imaging device by imaging, and optimizing the scene reference raw data for forming an appreciation image on an output medium
  • Reproduction auxiliary data generating means for generating reproduction auxiliary data when performing image processing to generate appreciation image reference data, and recording control means for attaching the reproduction auxiliary data to the scene reference raw data and recording it on a recording medium
  • An imaging apparatus comprising:
  • Image processing means for performing image processing on the scene reference raw data using the reproduction auxiliary data to create the viewing image reference data, and based on the viewing image reference data created by the image processing means.
  • the imaging apparatus according to 1, comprising an image forming means for forming an appreciation image on an output medium.
  • the image processing means standardizes the scene reference raw data to create scene reference data, and performs image processing on the scene reference data using the viewing image reference data restoration information to generate the viewing image reference data;
  • the imaging apparatus according to 2, wherein the reproduction assistance data includes the viewing image reference data restoration information.
  • the imaging apparatus further comprising a shooting condition adjusting unit, wherein the reproduction assistance data includes processing process reproduction information indicating a history of shooting conditions adjusted before the main shooting by the shooting condition adjusting unit.
  • processing process reproduction information includes an index value for determining the validity of the imaging condition.
  • index value includes a value that specifies at least one of user characteristics, light source conditions, and exposure conditions during shooting.
  • shooting information data generating means for generating shooting information data indicating shooting condition setting at the time of shooting
  • the recording control means records the shooting information data attached to the scene reference raw data on a recording medium.
  • An image processing apparatus comprising: reference data generation means.
  • the reproduction auxiliary data includes appreciation image reference data restoration information for restoring appreciation image reference data in the imaging device when generating appreciation image reference data on the output medium.
  • the reproduction auxiliary data includes processing process reproduction information for reproducing the generation process of the appreciation image reference data in the imaging device when generating the appreciation image reference data on the output medium.
  • processing process reproduction information includes an index value for determining validity of imaging conditions in the imaging apparatus.
  • the index values include user characteristics, light source conditions, and exposure conditions when shooting with the imaging device.
  • the input means inputs photographing condition data indicating photographing condition settings at the time of photographing
  • the appreciation image reference data generating means inputs the reproduction auxiliary data inputted with respect to the inputted scene reference raw data.
  • the image processing apparatus according to any one of 9 to 14, wherein the image processing apparatus generates an appreciation image reference data by performing an optimization process based on the shooting condition data.
  • An image recording apparatus comprising: an image forming unit.
  • the term "generation" means that a program or processing circuit operating in the imaging apparatus, image processing apparatus, or image recording apparatus according to the present invention newly produces image signals and data; "create" may be used as a synonym.
  • imaging device refers to a device including an imaging element (image sensor) having a photoelectric conversion function, and includes a so-called digital camera and scanner.
  • image sensors include CCD (Charge Coupled Device) image sensors, which combine a charge transfer mechanism with a checkered-pattern color filter to provide color sensitivity, and CMOS (Complementary Metal-Oxide Semiconductor) type image sensors.
  • the output current of these image sensors is digitized by A/D conversion.
  • the contents of each color channel at this stage are signal intensities based on the spectral sensitivity unique to the image sensor.
  • "scene reference raw data depending on the characteristics of the imaging device" is the direct raw output signal of the imaging device, recording information faithful to the subject; it means the data digitized by the A/D converter and subjected to noise correction such as fixed-pattern noise and dark-current noise correction. For this scene reference raw data, image processing that modifies the data content to improve the effect at the time of image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, is omitted, as is the process of mapping the signal intensity of each color channel to a standardized color space such as RIMM RGB or sRGB.
  • the information amount (for example, the number of gradations) of the scene reference raw data is preferably equal to or greater than the information amount (for example, the number of gradations) required for the viewing image reference data, in accordance with the performance of the A/D converter.
  • the number of gradations of the scene reference raw data is preferably 12 bits or more, more preferably 14 bits or more, and even more preferably 16 bits or more.
  • the "output medium" is, for example, a display device such as a CRT (Cathode Ray Tube), LCD (Liquid Crystal Display), or plasma display, or a paper for generating a hard copy image, such as silver halide photographic paper, inkjet paper, or thermal printer paper.
  • "Viewing image reference data" means digital image data used for display on display devices such as CRTs, LCDs, and plasma displays, and for generating hard copy images on output media such as silver halide photographic paper, inkjet paper, and thermal printer paper. Appreciation image reference data is subjected to an "optimization process" so that an optimal image is obtained on these display devices and output media; in this respect it differs from the scene reference raw data.
  • the "recording medium" is a storage medium used for storing the "scene reference raw data" and "reproduction assistance data" output from the imaging apparatus, such as a CompactFlash (registered trademark), Memory Stick (registered trademark), SmartMedia (registered trademark), multimedia card, hard disk, floppy (registered trademark) disk, magneto-optical storage medium (MO), or CD-R.
  • the unit that writes to the recording medium may be integrated with the camera, or may be any unit installed in an independent or remote location and connected via communication means such as the Internet, including wirelessly.
  • the file format when recording to a recording medium is not limited to a format specific to the imaging device; a standardized general-purpose format such as TIFF (Tagged Image File Format), JPEG (Joint Photographic Experts Group), or Exif (Exchangeable Image File Format) may be used.
  • the "shooting information data" is a record of the shooting condition settings at the time of shooting, and may include the same tag information written in the header part of an Exif file. Specifically, it includes tags (codes) indicating information such as exposure time, shutter speed, aperture value (F-number), ISO sensitivity, brightness value, subject distance range, light source, flash on/off, subject area, white balance, zoom magnification, shooting scene, amount of reflected light from the strobe light source, shooting saturation, and the type and configuration of the subject.
  • shooting information data is classified into values obtained at the time of shooting by sensors provided in the camera, data processed from those sensor values, and camera shooting conditions set based on the sensor values; in addition, it also includes information set manually by the photographer, such as the shooting mode dial (for example, portrait, sport, or macro shooting mode) and the forced-flash setting switch.
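  As a concrete illustration of how such shooting information data might be carried, the sketch below attaches Exif-style tags to the scene reference raw data's header. It is a hypothetical illustration, not the patent's actual recording format; the tag names follow common Exif conventions but the helper functions are ours.

```python
# Illustrative sketch: shooting information data as Exif-style header tags.
# Tag names follow common Exif conventions; the structure is hypothetical.

def make_shooting_info(exposure_time_s, f_number, iso, flash_fired, white_balance):
    """Collect shooting condition settings at the time of shooting as a tag dictionary."""
    return {
        "ExposureTime": exposure_time_s,    # e.g. 1/125 s
        "FNumber": f_number,                # aperture value
        "ISOSpeedRatings": iso,
        "Flash": 1 if flash_fired else 0,   # flash on/off
        "WhiteBalance": white_balance,      # 0 = auto, 1 = manual (Exif convention)
    }

def attach_to_header(raw_data: bytes, shooting_info: dict) -> dict:
    """Bundle scene reference raw data with its header, as a recording control might."""
    return {"header": {"ShootingInfo": shooting_info}, "raw": raw_data}

file_obj = attach_to_header(b"\x00\x01", make_shooting_info(1 / 125, 2.8, 100, True, 0))
```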
  • "standardized scene reference image data" means image data in which the signal intensity of each color channel, based on at least the spectral sensitivity of the image sensor itself, has been mapped to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB, and in which image processing that modifies the data content to improve the effect at the time of image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, is omitted.
  • it is preferable that the scene reference image data is corrected for the photoelectric conversion characteristics of the imaging device (the opto-electronic conversion function defined in ISO 14524; see, for example, Corona, "Fine Imaging and Digital Photography", published by the Publishing Committee of the Society of Photographic Science and Technology of Japan, page 449).
  • the amount of information (for example, the number of gradations) of the standardized scene reference image data preferably conforms to the performance of the A/D converter and is equal to or greater than the amount of information (for example, the number of gradations) required for the viewing image reference data.
  • while the number of gradations of the viewing image reference data is normally 8 bits per channel, the number of gradations of the scene reference image data is preferably 12 bits or more, more preferably 14 bits or more, and even more preferably 16 bits or more.
  • Imaging device characteristic correction for generating standardized scene reference image data means a process of converting “scene reference raw data depending on characteristics of the imaging device” to “standardized scene reference image data”.
  • the content of this process depends on the state of the "scene reference raw data depending on the characteristics of the imaging device", but at a minimum it maps the signal intensity of each color channel, based on the spectral sensitivity unique to the imaging device, to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB. For example, when the "scene reference raw data depending on the characteristics of the imaging device" has not been subjected to interpolation processing based on the color filter array, that processing needs to be performed additionally.
  • "optimization processing" means processing for obtaining an optimal image on a display device such as a CRT, LCD, or plasma display, or on an output medium such as silver halide photographic paper, inkjet paper, or thermal printer paper.
  • for example, when display on an sRGB-standard display device is assumed, processing is performed so that optimal color reproduction is obtained within the color gamut of the sRGB standard;
  • when output to silver halide photographic paper is assumed, processing is performed to obtain optimal color reproduction within the color gamut of silver halide photographic paper.
  • in addition to color gamut compression, it also includes gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and handling of the output characteristics (LUT) of the output device.
  • further, image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, and dodging and burning is performed.
  • the degree of white balance adjustment can be relaxed and the color balance can be adjusted specially.
  • the distance between the photographer and the subject is estimated based on the "amount of reflected light from the strobe light source" information, and can be reflected in, for example, setting of image processing conditions that suppress over-exposed skin.
  • by relaxing the degree of sharpening and strengthening the smoothing processing, wrinkles in the skin can be made inconspicuous.
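  The 16-bit to 8-bit gradation compression mentioned above can be sketched as a lookup table (LUT). This is a minimal illustration assuming NumPy, with an arbitrary gamma curve standing in for an actual output device characteristic:

```python
# Minimal sketch: 16-bit to 8-bit gradation compression via a lookup table.
# The gamma value is illustrative, not taken from the patent.
import numpy as np

def build_lut(gamma=1 / 2.2):
    """Map every 16-bit input level to an 8-bit output level with a gamma curve."""
    x = np.arange(65536) / 65535.0
    return np.clip(np.round(255.0 * x ** gamma), 0, 255).astype(np.uint8)

def compress_gradation(img16, lut):
    """Apply the LUT to a 16-bit image, yielding an 8-bit image."""
    return lut[img16]

lut = build_lut()
img16 = np.array([[0, 32768, 65535]], dtype=np.uint16)
img8 = compress_gradation(img16, lut)  # black and white endpoints are preserved
```

  Precomputing the table means the per-pixel work is a single indexed load, which is why LUTs are the usual vehicle for fixed output-device characteristics.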
  • the image recording apparatus may include, in addition to a mechanism for performing the image processing according to the present invention on digital image data acquired from the imaging apparatus according to the present invention, a film scanner that inputs frame image information of photographic photosensitive materials recorded by an analog camera, such as color negative film, color reversal film, black-and-white negative film, and black-and-white reversal film, and a flatbed scanner that inputs image information reproduced on color paper, i.e., silver halide photographic paper.
  • digital image data can also be obtained via means for reading digital image data stored on any known portable "recording medium" such as a CompactFlash (registered trademark), Memory Stick (registered trademark), SmartMedia (registered trademark), multimedia card, floppy (registered trademark) disk, magneto-optical storage medium (MO), or CD-R, or from a remote location via communication means.
  • the apparatus may also include processing means for forming an appreciation image on any known output medium, such as a display device (CRT, LCD, plasma display) or paper for generating hard copy images (silver halide photographic paper, inkjet paper, thermal printer paper).
  • reproduction auxiliary data for generating appreciation image reference data is generated on the imaging apparatus side, and the reproduction auxiliary data is attached to the scene reference raw data depending on the characteristics of the imaging apparatus.
  • on the image processing apparatus or image recording apparatus side, the viewing image reference data is generated using the reproduction auxiliary data, so that information loss of the captured image information is suppressed and appreciation image reference data of higher quality than the image data obtained by the imaging device can be obtained.
  • further, by attaching the shooting information data, which records the shooting condition settings at the time of shooting, to the scene reference raw data, the image processing apparatus or image recording apparatus side generates the viewing image reference data using the shooting information data in addition to the reproduction assistance data, and appreciation image reference data of even higher quality can be obtained.
  • FIG. 1 is a block diagram showing a main part configuration of an imaging apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an internal configuration of a reproduction auxiliary data generation unit.
  • FIG. 3 is a diagram showing the data structure of a data file recorded on the recording medium of the storage device in FIG.
  • FIG. 4 is a diagram for explaining the contents of the imaging device characteristic correction information.
  • FIG. 5 is a diagram showing the tone conversion characteristics of scene reference image data (a), and the tone conversion characteristics of scene reference image data and appreciation image reference data (b).
  • FIG. 6 A diagram showing the conversion characteristics of scene reference image data into appreciation image reference data (a), and a diagram showing gradation conversion characteristics of appreciation image reference data for each light source condition and exposure condition (b).
  • FIG. 7 is a diagram showing an example of the data structure of processing process reproduction information.
  • FIG. 9 is a flowchart showing image data recording processing executed in the imaging apparatus of the embodiment.
  • FIG. 10 is a flowchart showing a shooting scene determination process executed in the imaging apparatus.
  • FIG. 12 is a diagram showing an example of a program for converting RGB data into the HSV color system.
  • FIG. 13 is a diagram showing the lightness (V) - hue (H) plane and the regions r1 and r2 on the V-H plane.
  • FIG. 14 is a diagram showing the brightness (V) - hue (H) plane and the regions r3 and r4 on the V-H plane.
  • FIG. 15 is a diagram showing a curve representing a first coefficient by which the first occupancy ratio is multiplied for calculating the index α.
  • FIG. 16 is a diagram showing a curve representing a second coefficient by which the first occupancy ratio is multiplied for calculating the index β.
  • FIG. 17 is a flowchart showing a second occupancy ratio calculation process for calculating a second occupancy ratio based on the composition of captured image data.
  • FIG. 18 is a diagram showing regions nl to n4 determined according to the distance from the outer edge of the screen of captured image data.
  • FIG. 20 is a flowchart showing a bias amount calculation process.
  • FIG. 21 is a block diagram showing a main part configuration of an imaging apparatus in a modification of the present embodiment.
  • FIG. 22 is a diagram showing a data structure of a data file recorded on a recording medium of the storage device of FIG.
  • FIG. 23 is a flowchart showing image data recording processing executed in an imaging apparatus according to a modification of the present embodiment.
  • FIG. 24 is an external view of an image recording apparatus according to an embodiment of the present invention.
  • FIG. 25 is a block diagram showing the internal configuration of the image recording apparatus.
  • FIG. 26 is a block diagram showing the internal configuration of the image processing apparatus and output unit.
  • FIG. 27 is a flowchart showing image processing executed in the image recording apparatus.
  • FIG. 1 shows a main part configuration of an imaging apparatus 100 according to an embodiment of the present invention.
  • the imaging device 100 includes a lens 1, an aperture 2, a CCD (Charge Coupled Device) 3, an analog processing circuit 4, an A/D converter 5, a temporary storage memory 6, an image processing unit 7, a header information processing unit 8, a storage device 9, a CCD drive circuit 10, a control unit 11, a reproduction auxiliary data generation unit 12, an operation unit 14, a display unit 15, a strobe drive circuit 16, a strobe 17, a focal length adjustment circuit 18, an autofocus drive circuit 19, and a motor 20.
  • the optical system of the imaging apparatus 100 includes a lens 1, a diaphragm 2, and a CCD 3.
  • the lens 1 adjusts the focus and forms an optical image of the subject.
  • Aperture 2 adjusts the amount of light beam that has passed through lens 1.
  • the CCD 3 photoelectrically converts the subject light imaged on the light receiving surface by the lens 1 into an electrical signal (imaging signal) of an amount corresponding to the amount of incident light for each sensor in the CCD 3. Then, the CCD 3 is sequentially controlled by the timing pulse input from the CCD drive circuit 10, and sequentially outputs this imaging signal to the analog processing circuit 4.
  • the analog processing circuit 4 performs RGB signal amplification, noise reduction processing, and the like on the image pickup signal input from the CCD 3.
  • the processing in the analog processing circuit 4 is switched ON/OFF via the control unit 11 in accordance with an operation signal from the operation unit 14.
  • the A/D converter 5 converts the imaging signal input from the analog processing circuit 4 into a digital signal and outputs the digital signal.
  • the digital signal obtained by the A/D converter 5 will be described as scene reference raw data.
  • the temporary storage memory 6 is a buffer memory or the like, and temporarily stores the image data output from the A/D converter 5.
  • the image processing unit 7 performs, on the image data stored in the temporary storage memory 6, image quality improvement processing such as tone correction, spectral sensitivity crosstalk correction, dark current noise suppression, sharpening, white balance adjustment, and saturation adjustment for display on the display unit 15, as well as processing such as image size change, trimming, and aspect conversion.
  • the processing in the image processing unit 7 is switched ON/OFF via the control unit 11 in accordance with an operation signal from the operation unit 14.
  • the header information processing unit 8 adds the reproduction auxiliary data (details will be described later) generated by the reproduction auxiliary data generation unit 12 to the file header (header area) of the scene reference raw data stored in the temporary storage memory 6. Attach and create the attached data file (see Figure 3).
  • the storage device 9 is configured by a nonvolatile semiconductor memory or the like, and stores a control program for the imaging apparatus 100.
  • the storage device 9 includes a mounting unit for mounting a recording medium such as a memory card, and in accordance with a control signal input from the control unit 11, reading of recorded data of a recording medium mounted on the mounting unit, Write data to the recording media.
  • the CCD drive circuit 10 outputs a timing pulse in accordance with a control signal input from the control unit 11, and controls the drive of the CCD 3.
  • the control unit 11 is configured by a CPU (Central Processing Unit) or the like; it reads out the control program of the imaging device 100 stored in the storage device 9 and controls the entire imaging device 100 in accordance with the read control program.
  • the control unit 11 controls, in accordance with an operation signal from the operation unit 14, the autofocus drive circuit 19, which controls the motor 20 that adjusts the focus of the lens 1, and the focal length adjustment circuit 18, which controls the motor 20 that adjusts the focal length of the lens 1.
  • the control unit 11 instructs the reproduction auxiliary data generation unit 12 to generate reproduction auxiliary data (details will be described later) to be attached to the file header of the scene reference raw data obtained by shooting. Instructs the recording medium to record a data file with supplementary reproduction data attached to the scene reference raw data.
  • the reproduction auxiliary data generation unit 12 generates reproduction auxiliary data necessary for generating appreciation image reference data by performing an optimization process for appreciation image formation on the output medium.
  • the reproduction assistance data is output to the header information processing unit 8.
  • the internal configuration of the reproduction auxiliary data generation unit 12 will be described in detail later with reference to FIG. 2.
  • the operation unit 14 includes various function buttons such as a shutter button, a power ON/OFF button, and a zoom button, as well as cursor keys, and outputs an operation signal corresponding to each button and key to the control unit 11. The display unit 15 is composed of a display or the like and performs the required display processing in accordance with a display control signal input from the control unit 11.
  • the display unit 15 displays information for the user of the imaging apparatus 100 to confirm the shooting conditions, or displays the viewing image reference data generated for display on the display unit 15.
  • the display unit 15 also has a function as a finder that continuously displays images captured by the CCD 3 in the shooting mode.
  • the strobe driving circuit 16 controls the strobe 17 to emit light when the subject brightness is low, based on a control signal input from the control unit 11.
  • the strobe 17 boosts the battery voltage to a predetermined high voltage and stores it as a charge in a capacitor.
  • the strobe 17 is driven by the strobe driving circuit 16 to emit light from its xenon tube using the charge stored in the capacitor, and irradiates the subject with auxiliary light.
  • the focal length adjustment circuit 18 controls the motor 20 for adjusting the focal length by moving the lens 1 by the control signal input from the control unit 11.
  • the automatic focus driving circuit 19 controls the motor 20 for moving the lens 1 and adjusting the focus (focus) by the control signal input from the control unit 11.
  • FIG. 2 shows an internal configuration of the reproduction assistance data generation unit 12.
  • the reproduction auxiliary data generation unit 12 includes an imaging device characteristic correction information generation unit 121, an appreciation image reference data restoration information generation unit 122, and a processing process reproduction information generation unit 123.
  • the imaging device characteristic correction information generation unit 121 generates the information (imaging device characteristic correction information) necessary for the imaging device characteristic correction processing that generates standardized scene reference image data from the scene reference raw data. As shown in FIG. 4, the imaging device characteristic correction processing includes filter interpolation calculation (a), matrix calculation (b), and photoelectric conversion characteristic and gain correction (c) applied to the scene reference raw data.
  • the filter interpolation calculation shown in FIG. 4 (a) interpolates image data having a filter arrangement of one color per pixel into linear image data of three colors (RGB) per pixel.
  • the filter array is an array pattern of color filters for CCD color discrimination, and an array of RGB primary colors is generally used.
  • as the filter interpolation method, it is possible to apply the nearest neighbor method, the bilinear interpolation method, the bicubic convolution method, and the like.
  • in the nearest neighbor method, the pixel nearest to the target position is selected and used as-is in accordance with the enlargement/reduction size, so that pixels are interpolated only where necessary.
  • in the bilinear method, linear density interpolation is performed from the density values of the four pixels surrounding the target pixel, according to its coordinates (real values).
  • in the bicubic method, in order to achieve higher accuracy than the bilinear method, interpolation is performed using a cubic function from the density values of the 16 pixels surrounding the pixel of interest.
  • the formula used for interpolation is sin(πx)/(πx), which is theoretically the ideal density interpolation formula (by the sampling theorem); this is approximated using terms up to the third order of x and used as the interpolation formula.
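  The bilinear method described above can be sketched as follows. This is a minimal illustration assuming NumPy; the function and variable names are ours, not the patent's:

```python
# Minimal sketch of bilinear density interpolation: the value at a
# real-valued coordinate is linearly interpolated from the density
# values of the four surrounding pixels.
import numpy as np

def bilinear(img, y, x):
    """Interpolate a 2-D array img at real coordinates (y, x)."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, img.shape[0] - 1)   # clamp to the image border
    x1 = min(x0 + 1, img.shape[1] - 1)
    fy, fx = y - y0, x - x0              # fractional offsets
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

img = np.array([[0.0, 10.0], [20.0, 30.0]])
val = bilinear(img, 0.5, 0.5)  # the centre: the average of the four neighbours
```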
  • the matrix operation shown in Fig. 4 (b) is an operation for correcting the difference in which the same subject color is recorded as a different signal due to the difference in the spectral characteristics of the color filter and the spectral sensitivity characteristics of the image sensor. It is a process and is converted from tristimulus values of RGB to XYZ color system by this matrix operation.
  • Fig. 4 (c) shows processing for correcting the difference due to photoelectric conversion characteristics (linear conversion characteristics) and gain correction (translation) of the image sensor. This processing is shown in Fig. 4 (c As shown in (), the characteristic is that UogY (Y: stimulus value of XYZ color system) is linearly converted to 1 ogE (E: exposure amount).
  • By these processes, the scene reference raw data, which depends on the characteristics of the imaging device, is converted into scene reference image data expressed as values in a standardized color space such as XYZ.
  • the color filter array pattern applied in the imaging device characteristic correction process, the matrix coefficients a to i for the matrix calculation, and the coefficient values for correcting differences associated with the photoelectric conversion characteristics and gain correction all differ for each imaging device.
  • when the appreciation image reference data is generated from the scene reference image data in the imaging device 100, the appreciation image reference data restoration information generation unit 122 generates encoded information for restoring the appreciation image reference data (appreciation image reference data restoration information).
  • FIG. 5 (a) shows the tone conversion characteristics of scene reference image data.
  • this tone conversion characteristic is such that logY (Y: stimulus value of the XYZ color system) is linear with respect to logE (E: exposure amount).
  • FIG. 5(b) shows, as the gradation conversion characteristics of the scene reference image data and the appreciation image reference data in the sRGB color space, the relationship between L* of the L*a*b* color system and logE (E: exposure amount).
  • in contrast, the appreciation image reference data has a characteristic that L* changes linearly with respect to logE.
  • FIG. 6(a) shows the conversion characteristics (the curve in the figure) from the scene reference image data to the viewing image reference data.
  • the conversion characteristics shown in FIG. 6(a) are corrected according to the shooting scene at the time of shooting (front light, backlight, under, flash proximity, etc.).
  • FIG. 6(b) shows the gradation conversion characteristics of the appreciation image reference data for correcting light source conditions such as backlighting and flash close-up photography, and under/over exposure conditions.
  • the combination of the conversion characteristics shown in FIG. 6(a) and those shown in FIG. 6(b) constitutes the viewing image reference data restoration information.
  • when the appreciation image reference data is generated from the scene reference image data in the imaging apparatus 100, the processing process reproduction information generation unit 123 generates information for reproducing the generation process (processing process reproduction information).
  • FIG. 7 shows an example of processing process reproduction information.
  • the processing process reproduction information indicates history information of the shooting conditions from the actual shooting back to a predetermined time before it (for example, 2500 ms before). Specifically, as shown in FIG. 7, it holds history information for indices 1 to 3 indicating the results of the shooting scene discrimination process (front light, backlight, under, flash proximity, etc.), the total index 4 calculated from indices 1 to 3, the evaluation value 1 calculated from the total index, the evaluation value 2 for the camera shake level, and the validity of the shooting conditions.
  • the camera shake level indicates the degree of camera shake on, for example, a 10-step scale (an integer from 1 to 10); the greater the camera shake, the higher the value of the camera shake level.
  • Indices 1 to 3 are numerical values for specifying a shooting scene (forward light, backlight, under, flash proximity, etc.) at the time of shooting.
  • Fig. 8 shows the discrimination map for discriminating the shooting scene.
  • the index 1 is the strobe degree, the index 2 is the backlight degree, and the index 3 is the under degree.
  • on the discrimination map, backlight and front light are discriminated from the area determined by index 1 (strobe degree) and index 2 (backlight degree), and under and flash proximity are discriminated from the remaining area of the map. The shooting scene discrimination process will be described in detail later with reference to FIGS. 10 to 20.
  • Total index 4 = 10 − [Σ(index 1 + index 2 + index 3 …] (1)
  • Evaluation value 1 and evaluation value 2 indicate the light source condition/exposure condition evaluation value and the camera shake evaluation value, respectively, and are defined as in equations (2) and (3).
  • Evaluation value 1 = Total index 4 × 3.4 − 5 (2)
  • Evaluation value 2 = −0.54 × Camera shake level + 8.5 (3)
  • In equations (2) and (3), each of indices 1 to 3 is clipped so that a value of −6 or less is treated as −6 and a value of +6 or more is treated as +6.
  • the value of the shooting condition validity increases the more correctly the shooting is performed.
  • the photographing conditions (light source conditions, exposure conditions, etc.) are adjusted so that the numerical value of the shooting condition validity increases over time.
  • in FIGS. 8(a) and 8(b), the history of the indices 1 to 3 indicated by the processing process reproduction information shown in FIG. 7 is plotted as diamonds.
  • the history information on the discrimination map shown in FIG. 8 may be displayed on the display unit 15.
  • in the present embodiment, the camera shake level is an integer value of 1 to 10, and the numerical range of the indices 1 to 3 is −6 or more and +6 or less, but these numerical ranges are not particularly limited. If the numerical ranges change, the definitions of total index 4, evaluation value 1, and evaluation value 2 change accordingly.
  • Various user mode settings may be used as user characteristics other than camera shake.
  • the recording of history information as processing process reproduction information may start when the shooting mode is designated by the operation unit 14, with the time the shutter button is pressed taken as the time of actual shooting; alternatively, recording may start when the shutter button of the operation unit 14 is pressed halfway, with the time the shutter button is fully pressed taken as the time of actual shooting.
  • FIG. 3 shows the data structure of a data file recorded on the recording medium of the storage device 9.
  • as shown in FIG. 3, the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information generated by the reproduction auxiliary data generation unit 12 are attached by the header information processing unit 8 to the file header as reproduction auxiliary data, creating an attached data file, which is recorded on the recording medium of the storage device 9.
  • scene reference image data used simply for display on the display unit 15, together with the appreciation image reference data generated in the imaging apparatus 100, may be recorded on the recording medium in association with each other, or a thumbnail image of the appreciation image reference data may be recorded on the recording medium as metadata of the scene reference raw data.
  • when the shooting mode is designated by the operation unit 14, preliminary imaging is performed (step S1), and the image obtained by the preliminary imaging (hereinafter, the preliminary captured image) is displayed on the display unit 15.
  • appreciation image reference data is formed from the preliminary captured image (step S2), and the formed appreciation image reference data is displayed on the display unit 15 (step S3).
  • preliminary imaging may be performed when the shutter button is pressed halfway by the operation unit 14.
  • a shooting scene discrimination process (see FIGS. 10 to 20) is performed, a value indicating the validity of the shooting conditions is calculated from the camera shake level and from the indices 1 to 3 obtained as the result of the discrimination process, and the validity of the shooting conditions is determined (step S4).
  • the shooting conditions are adjusted so that the value of the shooting condition validity increases (step S5).
  • the processing of steps S1 to S5 is repeated until actual shooting is instructed by pressing the shutter button of the operation unit 14, and history information of the shooting conditions up to a predetermined time before the actual shooting is generated as processing process reproduction information.
  • when the shutter button of the operation unit 14 is pressed and actual shooting is instructed (step S6; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate scene reference raw data (step S7), and at the same time the reproduction auxiliary data generation unit 12 generates the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information as reproduction auxiliary data (step S8).
  • the processing process reproduction information generated in step S8 is obtained by adding information at the time of actual shooting to the history information generated before actual shooting by the processing of steps S1 to S5. Note that a step of creating appreciation image reference data to be recorded on the recording medium may be added after the actual shooting.
  • the header information processing unit 8 attaches the reproduction auxiliary data generated in step S8 as tag information to the file header of the scene reference raw data generated in step S7 (step S9), and an attached data file (see FIG. 3) is created (step S10).
  • the attached data file is recorded and stored in the recording medium of the storage device 9 (step S11), and the image data recording process is completed.
  • for the captured image data (for example, scene reference raw data), occupation ratio calculation processing is performed to calculate occupation ratios (a first occupation ratio and a second occupation ratio) indicating the ratio of each divided area to the entire captured image data (step T1). Details of the occupation ratio calculation process will be described later with reference to FIGS.
  • in step T2, a bias amount calculation process for calculating a bias amount indicating the bias of the gradation distribution of the captured image data is performed.
  • the bias amount calculation process in step T2 will be described in detail later with reference to FIG.
  • indices 1 to 3 for specifying the shooting scene are calculated based on the occupation ratios calculated in step T1 and coefficients set in advance according to the shooting conditions (step T3), and the shooting scene discrimination process ends. The index calculation method in step T3 will be described in detail later.
  • the RGB values of the captured image data are converted into the HSV color system (step T10).
  • FIG. 12 shows an example of a conversion program (HSV conversion program), written in program code (C language), for obtaining hue values, saturation values, and brightness values by conversion from RGB to the HSV color system.
  • in the HSV conversion program shown in FIG. 12, the digital image data values of the input image data are denoted InR, InG, and InB; the calculated hue value OutH is on a scale of 0 to 360; and the saturation value OutS and the brightness value OutV are on a scale of 0 to 255.
  • the captured image data is divided into regions each defined by a combination of predetermined brightness and hue, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step T11).
  • the area division of the captured image data will be described in detail.
  • Lightness (V) is divided into 7 regions: lightness values 0 to 25 (v1), 26 to 50 (v2), 51 to 84 (v3), 85 to 169 (v4), 170 to 199 (v5), 200 to 224 (v6), and 225 to 255 (v7).
  • Hue (H) is divided into four areas: a flesh color hue range (H1 and H2) with hue values of 0 to 39 and 330 to 359, a green hue range (H3) with hue values of 40 to 160, a blue hue range (H4) with hue values of 161 to 250, and a red hue range (H5). Note that the red hue region (H5) is not used in the following calculations because it contributes little to the determination of shooting conditions.
  • the flesh color hue area is further divided into a flesh color area (H1) and another area (H2). The hue Hue′(H) that satisfies formula (5) defines the flesh color area (H1), and the area that does not satisfy formula (5) is the other area (H2), where Hue′(H) is given by:
  • Hue′(H) = Hue(H) + 60 (when 0 ≤ Hue(H) < 300),
  • Hue′(H) = Hue(H) − 300 (when 300 ≤ Hue(H) < 360).
  • Luminance (Y) = InR × 0.30 + InG × 0.59 + InB × 0.11 (A)
  • The luminance (Y) of equation (A) may also be used in place of the brightness (V).
  • a first occupancy ratio indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step T12).
  • the first occupancy rate calculation process ends.
  • Table 1 shows the first occupation ratio for each divided area, where Rij is the first occupation ratio calculated for the divided area defined by the combination of the lightness region vi and the hue region Hj.
  • Table 2 shows, for each divided area, the first coefficient necessary for calculating the index α, which quantitatively indicates the accuracy of strobe shooting, that is, the brightness state of the face area during strobe shooting.
  • the coefficient of each divided area shown in Table 2 is a weighting coefficient by which the first occupancy Rij of each divided area shown in Table 1 is multiplied, and is set in advance according to the photographing conditions.
  • FIG. 13 shows the brightness (v) —hue (H) plane.
  • a positive (+) coefficient is used for the first occupation ratio calculated from the region (r1) distributed in the high-brightness flesh color hue region in FIG. 13, and a negative (−) coefficient is used for the first occupation ratio calculated from the blue hue region (r2), which belongs to the other hues.
  • FIG. 15 shows the first coefficient in the flesh color area (H1) and the first coefficient in another area (the green hue area (H3)) as curves (coefficient curves) that change continuously over the entire brightness range.
  • the sign of the first coefficient in the flesh color region (H1) is positive (+), while in the other regions (e.g., the green hue region (H3)) the sign of the first coefficient is negative (−); the signs of the two differ.
  • Index α = (sum of H1 region) + (sum of H2 region) + (sum of H3 region) + (sum of H4 region) + 4.424 (7)
  • Table 3 shows, for each divided region, the second coefficient necessary for calculating the index β, which quantitatively indicates the accuracy of backlight shooting, that is, the brightness state of the face area during backlight shooting.
  • the coefficient of each divided area shown in Table 3 is a weighting coefficient by which the first occupancy ratio Rij of each divided area shown in Table 1 is multiplied, and is set in advance according to the shooting conditions.
  • FIG. 14 shows the brightness (v) —hue (H) plane.
  • a negative (−) coefficient is used for the occupation ratio calculated from the area (r4) distributed in the middle brightness of the flesh color hue area in FIG. 14, and a positive (+) coefficient is used for the occupation ratio calculated from the low-brightness (shadow) area (r3) of the flesh color hue area.
  • FIG. 16 shows the second coefficient in the flesh color region (H1) as a curve (coefficient curve) that changes continuously over the entire brightness range. According to Table 3 and FIG. 16, the sign of the second coefficient in the lightness value range of 85 to 169 (v4) of the flesh color hue region is negative (−), while the sign in the low-lightness (shadow) region of lightness values 26 to 84 (v2, v3) is positive (+); the signs of the coefficients in the two regions differ.
  • Sum of H2 region = R12 × 0.0 + R22 × 4.7 + (omitted) … + R72 × (−8.5) (8-2)
  • Sum of H3 region = R13 × 0.0 + R23 × 0.0 + (omitted) … + R73 × 0.0 (8-3)
  • Index β = (sum of H1 region) + (sum of H2 region) + (sum of H3 region) + (sum of H4 region) + 1.554 (9)
  • since the index α and the index β are calculated based on the brightness and hue distributions of the captured image data, they are effective for discriminating the shooting scene when the captured image data is a color image.
  • the RGB values of the photographed image data are converted into the HSV color system (step T20).
  • the captured image data is divided into regions each defined by a combination of the distance from the outer edge of the captured image screen and the brightness, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step T21).
  • the area division of the captured image data will be described in detail.
  • FIGS. 18(a) to 18(d) show four areas n1 to n4 divided according to the distance from the outer edge of the screen of the captured image data. The area n1 shown in FIG. 18(a) is the outer frame, the area n2 shown in FIG. 18(b) is the area inside the outer frame, the area n3 shown in FIG. 18(c) is the area further inside the area n2, and the area n4 shown in FIG. 18(d) is the area at the center of the captured image screen.
  • a second occupancy ratio indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step T22).
  • this completes the second occupancy calculation process. Letting Qij be the second occupation ratio calculated for the divided area defined by the combination of the brightness region vi and the screen area nj, the second occupation ratio for each divided area is expressed as shown in Table 4.
  • Table 5 shows, for each divided region, the third coefficient necessary for calculating the index γ.
  • the coefficient of each divided area shown in Table 5 is a weighting coefficient by which the second occupancy Qij of each divided area shown in Table 4 is multiplied, and is set in advance according to the photographing conditions.
  • FIG. 19 shows the third coefficient in the screen areas n1 to n4 as curves (coefficient curves) that change continuously over the entire brightness range.
  • Sum of n2 region = Q12 × (−14.8) + Q22 × (−10.5) + (omitted) … + Q72 × 0.0 (10-2)
  • Sum of n3 region = Q13 × 24.6 + Q23 × 12.1 + (omitted) … + Q73 × 10.1 (10-3)
  • Sum of n4 region = Q14 × 1.5 + Q24 × (−32.9) + (omitted) … + Q74 × (−52.2) (10-4)
  • Using the sums of the n1 to n4 regions shown in equations (10-1) to (10-4), the index γ is defined as in equation (11).
  • since the index γ is calculated based on a compositional characteristic of the captured image data (the position of the brightness distribution relative to the distance from the outer edge of the screen), it is likewise effective for discriminating the shooting scene.
  • next, the bias amount calculation process of step T2 in FIG. 10 will be described. First, the luminance Y (brightness) of each pixel is calculated from the RGB (Red, Green, Blue) values of the captured image data using equation (A), and the standard deviation (x1) of the luminance is calculated (step T23). The standard deviation (x1) of luminance is defined as in equation (12):
  • x1 = √( Σ (pixel luminance value − average luminance value)² / total number of pixels ) (12)
  • here, the pixel luminance value is the luminance of each pixel of the captured image data, the average luminance value is the average of the luminances of the entire captured image data, and the total number of pixels is the number of pixels of the entire captured image data.
  • next, a luminance difference value (x2) is calculated (step T24):
  • Luminance difference value (x2) = (maximum luminance value − average luminance value) / 255 (13)
  • the maximum luminance value is the maximum luminance value of the captured image data.
  • next, the average luminance value (x3) of the flesh color region at the center of the screen of the captured image data is calculated (step T25), and then the average luminance value (x4) at the center of the screen is calculated (step T26).
  • here, the center of the screen is, for example, the area composed of the area n3 and the area n4 in FIG. 18.
  • then, the flesh color luminance distribution value (x5) is calculated (step T27), and the bias amount calculation process ends.
  • letting Yskin_max be the maximum luminance value of the skin color area of the captured image data, Yskin_min the minimum luminance value of the skin color area, and Yskin_ave the average luminance value of the skin color area, the skin color luminance distribution value (x5) is defined as in equation (14):
  • x5 = (Yskin_max − Yskin_min) / 2 − Yskin_ave (14)
  • let x6 be the average luminance value of the skin color area at the center of the screen of the captured image data.
  • the center of the screen is, for example, an area composed of the area n2, the area n3, and the area n4 in FIG.
  • the index 1 shown in FIGS. 7 and 8 is defined as in equation (15) using the index α, the index γ, and x6, and the index 2 is defined as in equation (16) using the index β, the index γ, and x6:
  • Index 1 = 0.46 × index α + 0.61 × index γ + 0.01 × x6 − 0.79 (15)
  • the index 3 shown in FIGS. 7 and 8 is obtained by multiplying the deviation amounts (x1) to (x5) calculated in the bias amount calculation process by fourth coefficients set in advance according to the shooting conditions. Table 6 shows the fourth coefficients, which are the weighting coefficients by which the respective deviation amounts are multiplied.
  • Index 3 = x1 × 0.02 + x2 × 1.13 + x3 × 0.06 + x4 × (−0.01) + x5 × 0.03 − 6.49 (17)
  • This index 3 expresses luminance histogram distribution information derived only from the compositional features of the screen of the captured image data, and is particularly effective for discriminating between strobe (flash proximity) shooting scenes and under shooting scenes.
  • as described above, reproduction auxiliary data for generating the viewing image reference data is generated on the imaging apparatus side, and the reproduction auxiliary data reflects the characteristics of the imaging apparatus.
  • FIG. 21 shows a configuration of an imaging apparatus 101 as a modification of the imaging apparatus 100.
  • in FIG. 21, the same components as those of the imaging apparatus 100 are denoted by the same reference numerals, and their description is omitted.
  • the photographing information data generation unit 13 generates photographing information data that is a photographing condition setting at the time of photographing.
  • This shooting information data includes, for example, information directly related to the camera type (model) such as the camera name and code number, as well as the exposure time, shutter speed, aperture value (F-number), ISO sensitivity, brightness value, subject distance range, light source, presence or absence of strobe flash, subject area, white balance, zoom magnification, subject composition, shooting scene, amount of reflected light from the strobe light source, shooting saturation, and information on the type of the subject.
  • FIG. 22 shows the data structure of a data file recorded on the recording medium of the storage device 9 of the imaging apparatus 101.
  • the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information generated by the reproduction auxiliary data generation unit 12, together with the shooting information data generated by the shooting information data generation unit 13, are attached by the header information processing unit 8 to the file header of the scene reference raw data as reproduction auxiliary data, creating an attached data file, which is recorded on the recording medium of the storage device 9.
  • preliminary imaging is performed (step S20), and the image obtained by the preliminary imaging (hereinafter, the preliminary captured image) is displayed on the display unit 15.
  • appreciation image reference data is formed from the preliminary captured image (step S21), and the formed appreciation image reference data is displayed on the display unit 15 (step S22).
  • preliminary imaging may be performed when the shutter button is pressed halfway by the operation unit 14.
  • a shooting scene discrimination process is performed, the camera shake level and the indices 1 to 3 are calculated as its result, the shooting condition validity value is calculated from the calculation result, and the validity of the shooting conditions is determined (step S23).
  • the shooting conditions are adjusted so that the value of the shooting condition validity increases (step S24).
  • the processing of steps S20 to S24 is repeated until actual shooting is instructed by pressing the shutter button of the operation unit 14, and history information of the shooting conditions up to a predetermined time before the actual shooting is generated as processing process reproduction information.
  • when the shutter button of the operation unit 14 is pressed and actual shooting is instructed (step S25; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate scene reference raw data (step S26). The reproduction auxiliary data generation unit 12 generates the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information as reproduction auxiliary data (step S27), and the shooting information data generation unit 13 generates shooting information data.
  • the processing process reproduction information generated in step S27 is obtained by adding the information at the time of actual shooting to the history information generated before actual shooting by the processing of steps S20 to S24. Note that a step of creating appreciation image reference data for recording on a recording medium may be added after the actual shooting.
  • the reproduction auxiliary data generated in step S27 and the shooting information data generated in step S28 are attached as tag information to the file header of the scene reference raw data generated in step S26 (step S29), and an attached data file (see FIG. 22) is created (step S30).
  • the attached data file is recorded and saved on the recording medium of the storage device 9 (step S31), and the image data recording process is completed.
  • as described above, by attaching the shooting information data, which is the shooting condition settings at the time of shooting, to the file header of the scene reference raw data and recording it on the recording medium, appreciation image reference data that accords with the shooting situation can be generated when the data recorded on the recording medium is output to an output medium.
  • the recording medium on which the attached data file is recorded is taken out of the imaging apparatus main body and mounted on an external apparatus such as an image processing apparatus or an image recording apparatus, where image processing optimized for viewing image formation on the output medium is performed to generate the viewing image reference data.
  • FIG. 24 shows an external configuration of the image recording apparatus 201 according to the embodiment of the present invention.
  • the image recording apparatus 201 is provided with a magazine loading unit 203 on one side surface of a main body 202.
  • the main body 202 is provided with an exposure processing unit 204 that exposes silver salt photographic paper as an output medium, and a print creation unit 205 that creates a print by developing and drying the exposed photographic paper.
  • the print created by the print creation unit 205 is discharged to a tray 206 provided on the other side of the main body 202.
  • a control unit 207 that controls each unit constituting the image recording apparatus 201 is provided inside the main body 202.
  • On the upper part of the main body 202, a display unit 208, a film scanner unit 209 serving as a transparent original reading device, a reflective original input device 210, and an operation unit 211 are arranged. Further, the main body 202 is provided with an image reading unit 214 that can read image data recorded on various recording media, and an image writing unit 215 that writes image data to various recording media.
  • a photographic photosensitive material is used as the original read by the film scanner unit 209 or the reflective original input device 210. Examples of the photographic material include a color negative film, a color reversal film, a black-and-white negative film, and a black-and-white reversal film, on which frame image information captured by an analog camera is recorded.
  • the film scanner unit 209 converts the frame image information recorded on the photographic photosensitive material into digital image data to obtain frame image data.
  • when the photographic photosensitive material is color paper, i.e., silver salt photographic paper, the reflective original input device 210 converts the frame image information recorded on it into frame image data using a flatbed scanner.
  • the image reading unit 214 includes a PC card adapter 214a and a floppy (registered trademark) disk adapter 214b, into which a PC card 213a and a floppy (registered trademark) disk 213b can be respectively inserted.
  • the PC card 213a has, for example, a memory in which a plurality of frame image data is stored after being captured by a digital camera.
  • the floppy (registered trademark) disk 213b for example, a plurality of frame image data captured by a digital camera is recorded.
  • Recording media on which frame image data is recorded, other than the PC card 213a and the floppy (registered trademark) disk 213b, include, for example, a multimedia card (registered trademark), a memory stick (registered trademark), MD data, a CD-ROM, and the like.
  • the image writing unit 215 includes a floppy (registered trademark) disk adapter 215a, an MO adapter 215b, and an optical disk adapter 215c.
  • in the present embodiment, the operation unit 211, the display unit 208, the film scanner unit 209, the reflective original input device 210, and the image reading unit 214 are integrally provided in the main body 202, but one or more of these may be provided separately.
  • the print creation method is not limited to this; for example, a method such as an inkjet method, an electrophotographic method, a thermal method, or a sublimation method may be used.
  • FIG. 25 shows the internal configuration of the image recording apparatus 201.
  • the image recording apparatus 201 includes a control unit 207, an exposure processing unit 204, a print creation unit 205, a film scanner unit 209, a reflective original input device 210, an image reading unit 214, an image writing unit 215, data storage means 271, template storage means 272, an operation unit 211, and a display unit 208.
  • the control unit 207 is constituted by a microcomputer, and controls the operation of each unit constituting the image recording apparatus 201 through the cooperation of various control programs stored in a storage unit (not shown) such as a ROM (Read-Only Memory) with a CPU (Central Processing Unit, not shown).
• The control unit 207 has an image processing unit 270, which, based on an input signal from the information input means 212 of the operation unit 211, applies image processing to image data read from the film scanner unit 209 or the reflection original input device 210, image data read from the image reading unit 214, or image data input from an external device via communication means (not shown), and forms image information for exposure.
• The image processing unit 270 also performs conversion processing according to the output form on the processed image data, and outputs the result to a designated output destination.
  • the output destination of the image processing unit 270 includes a display unit 208, an image writing unit 215, a communication unit, and the like.
  • the exposure processing unit 204 exposes an image on the photosensitive material, and outputs the photosensitive material to the print creating unit 205.
  • the print creating unit 205 develops the exposed photosensitive material and dries it to create prints Pl, P2, and P3.
• The print P1 is of a service size, a high-vision size, a panorama size, or the like.
  • the print P2 is an A4 size print
  • the print P3 is a business card size print.
  • the film scanner unit 209 reads a frame image recorded on a transparent original such as a developed negative film or reversal film imaged by an analog camera, and obtains a digital image signal of the frame image.
  • the reflection original input device 210 reads an image on the print P (photo print, document, various printed materials) by a flat bed scanner, and acquires a digital image signal.
  • the operation unit 211 is provided with information input means 212.
  • the information input unit 212 includes a touch panel or the like, and outputs an operation signal of the information input unit 212 to the control unit 207.
  • the operation unit 211 may be configured to include a keyboard and a mouse.
  • the image reading unit 214 reads the frame image information recorded on the PC card 213a or the floppy (registered trademark) disk 213b and transfers the frame image information to the control unit 207.
  • the image reading unit 214 includes, as the image transfer means 230, a PC card adapter 214a, a floppy (registered trademark) disk adapter 214b, and the like.
• The image reading unit 214 reads the frame image information recorded on the PC card 213a inserted into the PC card adapter 214a or on the floppy disk 213b inserted into the floppy disk adapter 214b, and transfers it to the control unit 207.
  • a PC card reader, a PC card slot, or the like is used as the PC card adapter 214a.
  • the image writing unit 215 includes a floppy (registered trademark) disk adapter 215a, an MO adapter 215b, and an optical disk adapter 215c as the image transport unit 231.
• The image writing unit 215 writes the image data generated by the image processing method according to the present invention to a floppy disk 216a inserted into the floppy disk adapter 215a, an MO 216b inserted into the MO adapter 215b, or an optical disk 216c inserted into the optical disk adapter 215c.
• The data storage means 271 stores image information and the corresponding order information (information on how many prints are to be created from which frame images, print size information, etc.) and sequentially accumulates them.
• The template storage means 272 stores, corresponding to the sample identification information D1, D2, and D3, sample image data (data indicating a background image, an illustration image, etc.) and at least one set of template data for setting a synthesis area with the sample image data.
• When a predetermined template is selected by an operator's operation (based on an instruction from the client) from the plurality of templates stored in advance in the template storage means 272, the control unit 207 combines the frame image information with the selected template. When the sample identification information D1, D2, or D3 is designated by an operator's operation (also based on the client's instruction), the corresponding sample image data is selected, and the selected sample image data is combined with the image data and/or character data ordered by the client to create a print based on the desired sample image data.
• This template synthesis is performed by the well-known chroma key method.
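The chroma key composition step can be sketched as follows. This is a minimal illustration of the general technique, not the patent's implementation; the key color, tolerance, and array shapes are assumptions.

```python
import numpy as np

def chroma_key_composite(template, image, key_color=(0, 255, 0), tol=30):
    """Replace pixels of `template` that match `key_color` (within `tol`
    per channel) with the corresponding pixels of `image`."""
    template = np.asarray(template, dtype=np.int16)
    image = np.asarray(image, dtype=np.int16)
    # Boolean mask of the synthesis area: pixels close to the key color.
    mask = np.all(np.abs(template - np.array(key_color)) <= tol, axis=-1)
    out = template.copy()
    out[mask] = image[mask]
    return out.astype(np.uint8)
```

In practice the template's synthesis area would be pre-marked with the key color, and the ordered frame image would be resized to that area before compositing.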
• The sample identification information is not limited to the three types D1, D2, and D3; there may be more or fewer than three types.
• The sample identification information D1, D2, and D3 for specifying a print sample is configured to be input from the operation unit 211. However, since the sample identification information is recorded on the print sample or on an order sheet, it can also be read by reading means such as an OCR (Optical Character Reader), or input by the operator from a keyboard.
• Since sample image data is recorded corresponding to the sample identification information D1 for specifying a print sample, the sample identification information D1 is input, sample image data is selected based on it, and the selected sample image data is combined with the image data and/or character data based on the order to create a print based on the specified sample. Users can thus order prints from samples they have actually seen, which meets the diverse requirements of a wide range of users.
• Further, the first sample identification information D2 designating a first sample and the image data of that first sample are stored, and the second sample identification information D3 designating a second sample and the image data of that second sample are stored. The sample image data selected based on the designated first and second sample identification information D2 and D3 is combined with the image data and/or character data based on the order, and a print based on the specified samples is created; therefore, a wider variety of images can be synthesized, and prints meeting a wider variety of user requirements can be created.
  • the display unit 208 includes a display such as a CRT or LCD, and performs display processing in accordance with a display control signal input from the control unit 207.
• The image processing unit 270 of the control unit 207 can also receive, via communication means (not shown), image data representing captured images and work instructions such as printing from another computer in the facility where the image recording apparatus 201 is installed, or from a distant computer via a communication network such as the Internet, and perform image processing and create prints remotely.
• The image processing unit 270 can also send, using communication means (not shown), image data representing the captured image after the image processing of the present invention and the accompanying order information to another computer in the facility, or to a distant computer via the Internet.
• As described above, the image recording apparatus 201 includes: input means for taking in image information from various recording media and image information obtained by dividing and photometrically reading an image original; image processing means for processing the input image information by acquiring or estimating information such as "the size of the output image" and "the size of the main subject in the output image" so as to obtain an image that gives a favorable impression when observed on the output medium; image output means for displaying the processed image, printing it, or writing it to a recording medium; and communication means for transmitting the image data and the accompanying order information to another computer in the facility via a communication line, or to a remote computer via the Internet.
  • FIG. 26 shows an internal configuration when the image recording apparatus 201 is divided into an image processing apparatus 301 and an output unit 302 that outputs image data processed by the image processing apparatus 301.
• The image processing apparatus 301 includes an input unit 303, a header information analysis unit 304, an imaging device characteristic correction processing unit 305, an appreciation image reference data restoration condition generation unit 306, a processing process reproduction unit 307, and an optimization processing unit 308.
• The input unit 303 includes the image reading unit 214 of FIG. 25 and has a mounting unit for mounting a recording medium. When a recording medium is mounted in the mounting unit, the input unit 303 reads the data file recorded on the recording medium and outputs it to the header information analysis unit 304. In the present embodiment, the input unit 303 is described as reading the data file from the mounted recording medium; however, the input unit 303 may instead include wired or wireless communication means and input the data file via that communication means.
• The header information analysis unit 304 analyzes the data file input from the input unit 303 and divides it into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, appreciation image reference data restoration information, and processing process reproduction information), and shooting information data. It outputs the scene reference raw data to the scene reference image data generation unit 311 in the imaging device characteristic correction processing unit 305, the imaging device characteristic correction information to the device characteristic correction processing unit 309, and the shooting information data to the shooting information data processing unit 313 in the optimization processing unit 308.
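As a rough sketch of what such a data file might look like, the following packs scene reference raw data together with a header carrying the reproduction auxiliary data and shooting information data, and splits it again in the manner of the header information analysis unit 304. The byte layout and field names are hypothetical; the patent does not fix a container format.

```python
import json
import struct

def pack_data_file(raw_bytes, reproduction_aux, shooting_info):
    """Bundle scene reference raw data with a JSON header holding the
    reproduction auxiliary data and shooting information data.
    (Hypothetical container; a 4-byte big-endian length prefixes the header.)"""
    header = json.dumps({"reproduction_aux": reproduction_aux,
                         "shooting_info": shooting_info}).encode("utf-8")
    return struct.pack(">I", len(header)) + header + raw_bytes

def analyze_header(data_file):
    """Split a packed file back into (raw data, aux data, shooting info),
    mirroring the role of the header information analysis unit 304."""
    (hlen,) = struct.unpack(">I", data_file[:4])
    header = json.loads(data_file[4:4 + hlen].decode("utf-8"))
    raw = data_file[4 + hlen:]
    return raw, header["reproduction_aux"], header["shooting_info"]
```

A real implementation would more likely embed these fields as tags in an Exif-style header area rather than JSON, but the analysis step is the same: locate the header, decode its fields, and pass each part to the appropriate processing unit.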
  • the imaging device characteristic correction processing unit 305 includes a device characteristic correction processing unit 309, a processing condition table 310, a scene reference image data generation unit 311, and a temporary storage memory 312.
• The device characteristic correction processing unit 309 determines the scene reference image data generation conditions based on the imaging device characteristic correction information input from the header information analysis unit 304 and the processing condition table 310.
  • the processing condition table 310 is a table that stores processing conditions for generating scene reference image data for each characteristic of the imaging apparatus.
• The scene reference image data generation unit 311 performs imaging device characteristic correction processing on the scene reference raw data input from the header information analysis unit 304 in accordance with the generation conditions determined by the device characteristic correction processing unit 309, generates scene reference image data standardized with respect to the characteristics of the imaging device, and outputs it to the temporary storage memory 312. Specifically, the imaging device characteristic correction processing includes at least mapping the signal intensity of each color channel, based on the spectral sensitivity unique to the imaging device that generated the scene reference raw data, to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB.
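The color-space mapping described above, taking each channel's device-dependent signal intensity into a standardized space, reduces in the simplest linear case to a 3×3 matrix multiply per pixel. The matrix values below are invented for illustration; a real profile would be calibrated per camera model against the target space.

```python
import numpy as np

# Hypothetical 3x3 matrix characterizing one camera's spectral sensitivity;
# rows sum to 1 so that neutral (gray) pixels are preserved.
DEVICE_TO_STANDARD = np.array([
    [0.8, 0.15, 0.05],
    [0.1, 0.80, 0.10],
    [0.0, 0.10, 0.90],
])

def map_to_standard_color_space(raw_rgb, matrix=DEVICE_TO_STANDARD):
    """Map per-pixel device color channel intensities to a standardized
    color space (in the spirit of the RIMM RGB / ERIMM RGB / scRGB mapping)."""
    raw = np.asarray(raw_rgb, dtype=np.float64)
    # Apply the matrix to the last axis (the color channels).
    return raw @ matrix.T
```

Because the rows sum to one, a neutral pixel such as (0.5, 0.5, 0.5) maps to itself, which is the usual white-preservation constraint on such device matrices.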
  • the temporary storage memory 312 temporarily stores the scene reference image data generated by the scene reference image data generation unit 311.
• The appreciation image reference data restoration condition generation unit 306 determines, based on the appreciation image reference data restoration information input from the header information analysis unit 304, restoration conditions for restoring the appreciation image reference data generated in the imaging device.
• The processing process reproduction unit 307 determines, based on the processing process reproduction information, reproduction conditions for reproducing the generation process of the appreciation image reference data in the imaging device.
• The optimization processing unit 308 includes the shooting information data processing unit 313, the appreciation image reference data generation unit 314, and the temporary storage memory 315.
  • the shooting information data processing unit 313 determines a generation condition for generating appreciation image reference data corresponding to the shooting condition based on the shooting information data input from the header information analysis unit 304.
• The appreciation image reference data generation unit 314 reads the scene reference image data from the temporary storage memory 312 and, based on the generation conditions determined by the shooting information data processing unit 313, the restoration conditions determined by the appreciation image reference data restoration condition generation unit 306, the reproduction conditions determined by the processing process reproduction unit 307, and information on the output destination designated by the setting input unit 316 (storage device 318, output device 317, or display unit 208), applies optimization processing for obtaining an optimal image at the output destination to generate appreciation image reference data, which it outputs to the temporary storage memory 315 together with the operation information of the setting input unit 316.
• Optimization processing includes, for example, compression to the color gamut of the output destination, gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and processing corresponding to the output characteristics (LUT) of the output device or display device. It further includes image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, and dodging.
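One of the listed optimization steps, gradation compression from 16 bits to 8 bits, can be sketched with a simple gamma curve standing in for the output device's LUT. The gamma value is an assumption for illustration, not a value taken from the patent.

```python
import numpy as np

def compress_gradation_16_to_8(img16, gamma=2.2):
    """Compress 16-bit linear scene data to 8-bit output data with a
    simple gamma curve (a stand-in for the output device's LUT)."""
    x = np.asarray(img16, dtype=np.float64) / 65535.0
    y = np.clip(x, 0.0, 1.0) ** (1.0 / gamma)
    return np.round(y * 255.0).astype(np.uint8)
```

An actual optimization step would use the measured characteristic curve of the destination device rather than a fixed power function, but the shape of the operation — normalize, apply a nonlinear tone curve, requantize — is the same.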
• The temporary storage memory 315 temporarily stores the appreciation image reference data generated by the appreciation image reference data generation unit 314 and outputs it to the output destination (storage device 318, output device 317, or display unit 208) set by the setting input unit 316.
• The setting input unit 316 is an input device for designating the output destination of the appreciation image reference data generated by the appreciation image reference data generation unit 314, and corresponds to the operation unit 211 in FIG. 25.
• Components other than the input unit 303 and the setting input unit 316 are included in the image processing unit 270 shown in FIG. 25.
• The output unit 302 includes the display unit 208, an output device 317 corresponding to the exposure processing unit 204 and the print creation unit 205 in FIG. 25, and a storage device 318 corresponding to the image writing unit 215 in FIG. 25.
• When data is input to the input unit 303 (that is, when a recording medium is mounted in the mounting unit) (step S40), the data file recorded on the recording medium is read, and the header information analysis unit 304 analyzes the contents of the data file (step S41), dividing it into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, appreciation image reference data restoration information, and processing process reproduction information), and shooting information data.
• The shooting information data processing unit 313 determines, based on the shooting information data, generation conditions for generating appreciation image reference data according to the shooting conditions (step S42). The appreciation image reference data restoration condition generation unit 306 determines, based on the appreciation image reference data restoration information, restoration conditions for restoring the appreciation image reference data generated in the imaging device (step S44). The processing process reproduction unit 307 determines, based on the processing process reproduction information, reproduction conditions for reproducing the generation process of the appreciation image reference data in the imaging device (step S45).
• The generation conditions for the scene reference image data are determined by referring to the imaging device characteristic correction information and the processing condition table 310, and imaging device characteristic correction processing is applied to the scene reference raw data in accordance with the determined generation conditions (step S43), generating scene reference image data (step S46).
• The scene reference image data generated in step S46 is then subjected to optimization processing (step S47) based on the various image processing conditions (appreciation image reference data generation conditions, restoration conditions, and reproduction conditions) determined in steps S42, S44, and S45, and appreciation image reference data is generated (step S48).
• The appreciation image reference data is subjected to processing specific to the output destination set in the setting input unit 316 (operation unit 211) (step S49), the image is output from the output destination device (step S50), and this image processing ends.
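The flow of steps S40 to S50 can be summarized as a single pipeline. Every operation below is a deliberately trivial stand-in for the corresponding processing unit of FIG. 26, and all field names are hypothetical.

```python
def process_data_file(data_file, output_destination):
    """Skeleton of the flow of FIG. 27 (steps S40-S50), with toy
    stand-ins for each processing unit of FIG. 26."""
    # S41: header analysis splits the file into its three parts.
    raw = data_file["scene_reference_raw"]
    aux = data_file["reproduction_aux"]
    shooting_info = data_file["shooting_info"]
    # S42/S44/S45: determine the image processing conditions.
    conditions = {
        "generation": shooting_info.get("scene", "standard"),
        "restoration": aux.get("restore_info"),
        "reproduction": aux.get("process_info"),
    }
    # S43/S46: device characteristic correction -> scene reference image data
    # (here just a per-sample gain; the real step maps to a standard space).
    gain = aux.get("device_gain", 1.0)
    scene_ref = [v * gain for v in raw]
    # S47/S48: optimization for the output destination (toy clamp to 0..255).
    appreciation = [min(255, max(0, round(v))) for v in scene_ref]
    # S49/S50: attach destination-specific packaging and hand off for output.
    return {"destination": output_destination,
            "conditions": conditions,
            "appreciation_image_reference_data": appreciation}
```

The value of the structure is that each condition (generation, restoration, reproduction) is derived independently from its own section of the file header before the single optimization pass in steps S47 to S48 consumes them all.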
• When shooting information data is not attached to the data file, step S42 in FIG. 27 is not performed.
• As described above, optimized appreciation image reference data is generated based on the reproduction auxiliary data and the shooting information data attached to the file header of the scene reference raw data output from the imaging device.
• Even when the reproduction auxiliary data does not contain processing process reproduction information, sufficiently high-quality appreciation image reference data can still be obtained.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Studio Devices (AREA)

Abstract

It is possible to suppress information loss of acquired image data and to create, at an image processing device or an image recording device, image-to-be-enjoyed reference data of higher quality than the image data obtained by an imaging device. The imaging device (100) performs imaging to generate scene reference raw data dependent on the characteristics of the imaging device. A re-creation auxiliary data generation unit (12) generates re-creation auxiliary data used when the scene reference raw data is subjected to image processing optimized for forming an image-to-be-enjoyed on an output medium so as to generate the image-to-be-enjoyed reference data. A header information processing unit (8) attaches the re-creation auxiliary data to the file header (header area) of the scene reference raw data to prepare an attached data file, which is recorded on a recording medium of a storage device (9).

Description

Specification
Imaging apparatus, image processing apparatus, and image recording apparatus
Technical field
[0001] The present invention relates to an imaging apparatus such as a digital camera, and to an image processing apparatus and an image recording apparatus that apply optimization processing to captured image data obtained by the imaging apparatus so as to form an appreciation image on an output medium.
Background art
[0002] In recent years, captured image data obtained by an imaging device such as a digital camera may be recorded on recording media such as a CD-R (Compact Disc Recordable), a floppy (registered trademark) disk, or a memory card; distributed via a communication network such as the Internet; displayed on display devices such as CRT (Cathode Ray Tube), liquid crystal, or plasma display monitors or the small liquid crystal monitors of mobile phones; or printed as a hard copy image using an output device such as a digital printer, an inkjet printer, or a thermal printer. Output methods have thus become highly diverse.
[0003] When captured image data is output for viewing purposes, various kinds of image processing, typified by gradation adjustment, brightness adjustment, color balance adjustment, and sharpness enhancement, are generally applied to the captured image data so that the desired image quality is obtained on the display monitor or hard copy used for viewing. Patent Document 1 discloses a technique in which auxiliary data necessary for the correction processing that generates standardized scene reference image data from the scene reference raw data obtained by an imaging device (raw data that has not undergone image processing and faithfully represents subject information) is attached to the scene reference raw data and recorded on a recording medium, and the image processing apparatus or image recording apparatus performs image processing using the auxiliary data attached to the scene reference raw data, whereby an optimized appreciation image can be created without information loss of the captured image information.
Patent Document 1: Japanese Patent Application Laid-Open No. 2004-96500
Disclosure of the invention
Problems to be solved by the invention
[0004] In the conventional image processing described above, information loss of the captured image data is suppressed and the degree of freedom of image quality adjustment is improved; however, there is a problem in that the appreciation image reference data optimized within the imaging device cannot be stably reproduced on the image processing apparatus or image recording apparatus side. In addition, although the history information of the various sensors and computation processes in the imaging device leading up to the actual shooting is useful information for image quality adjustment, it cannot be obtained on the image processing apparatus or image recording apparatus side, so appreciation image reference data of higher quality than the image data obtained by the imaging device cannot be generated.
[0005] An object of the present invention is to suppress information loss of captured image information and to make it possible, on the image processing apparatus or image recording apparatus side, to generate appreciation image reference data of higher quality than the image data obtained by the imaging device.
Means for solving the problem
[0006] The problems of the present invention can be solved by the following means.
1. An imaging apparatus comprising: scene reference raw data generation means for generating, by imaging, scene reference raw data dependent on the characteristics of the imaging apparatus; reproduction auxiliary data generation means for generating reproduction auxiliary data used when the scene reference raw data is subjected to image processing that optimizes it for forming an appreciation image on an output medium so as to generate appreciation image reference data; and recording control means for attaching the reproduction auxiliary data to the scene reference raw data and recording them on a recording medium.
2. The imaging apparatus according to 1, further comprising: image processing means for applying image processing to the scene reference raw data using the reproduction auxiliary data to create the appreciation image reference data; and image forming means for forming an appreciation image on an output medium based on the appreciation image reference data created by the image processing means.
3. The imaging apparatus according to 2, wherein the image processing means standardizes the scene reference raw data to create scene reference data and applies image processing to the scene reference data using appreciation image reference data restoration information to generate the appreciation image reference data, and the reproduction auxiliary data includes the appreciation image reference data restoration information.
4. The imaging apparatus according to 2, further comprising shooting condition adjustment means for performing preliminary shooting before the actual shooting and adjusting the shooting conditions according to the result of the preliminary shooting, wherein the reproduction auxiliary data includes processing process reproduction information indicating the history of the shooting conditions adjusted by the shooting condition adjustment means before the actual shooting.
5. The imaging apparatus according to 4, wherein the processing process reproduction information includes an index value for determining the validity of the shooting conditions.
6. The imaging apparatus according to 5, wherein the index value includes a value specifying at least one of user characteristics at the time of shooting, light source conditions, and exposure conditions.
7. The imaging apparatus according to 6, wherein the light source conditions and the exposure conditions are determination results of a shooting scene determination process at the time of shooting.
8. The imaging apparatus according to any one of 1 to 7, further comprising shooting information data generation means for generating shooting information data indicating the shooting condition settings at the time of shooting, wherein the recording control means attaches the shooting information data to the scene reference raw data and records them on the recording medium.
9. An image processing apparatus comprising: input means for inputting scene reference raw data dependent on the characteristics of an imaging apparatus, together with reproduction auxiliary data used when the scene reference raw data is subjected to image processing that optimizes it for forming an appreciation image on an output medium so as to generate appreciation image reference data; and appreciation image reference data generation means for generating appreciation image reference data by applying optimization processing to the input scene reference raw data based on the input reproduction auxiliary data.
10. The image processing apparatus according to 9, wherein the reproduction auxiliary data includes appreciation image reference data restoration information for restoring the appreciation image reference data generated in the imaging apparatus when appreciation image reference data for the output medium is generated.
11. The image processing apparatus according to 9, wherein the reproduction auxiliary data includes processing process reproduction information for reproducing the generation process of the appreciation image reference data in the imaging apparatus when appreciation image reference data for the output medium is generated.
12. The image processing apparatus according to 11, wherein the processing process reproduction information includes an index value for determining the validity of the shooting conditions in the imaging apparatus.
13. The image processing apparatus according to 12, wherein the index value includes a value specifying at least one of user characteristics at the time of shooting by the imaging apparatus, light source conditions, and exposure conditions.
14. The image processing apparatus according to 13, wherein the light source conditions and the exposure conditions are determination results of a shooting scene determination process at the time of shooting.
15. The image processing apparatus according to any one of 9 to 14, wherein the input means inputs shooting condition data indicating the shooting condition settings at the time of shooting, and the appreciation image reference data generation means generates the appreciation image reference data by applying optimization processing to the input scene reference raw data based on the input reproduction auxiliary data and the input shooting condition data.
16. An image recording apparatus comprising: the image processing apparatus according to any one of 9 to 15; and image forming means for forming an appreciation image on an output medium using the appreciation image reference data generated by the appreciation image reference data generation means of the image processing apparatus.
<Explanation of terms>
Hereinafter, terms used in this specification will be explained.
[0007] In the description of this specification, "generation" means that a program or processing circuit operating in the imaging apparatus, image processing apparatus, or image recording apparatus according to the present invention newly produces an image signal or data. "Creation" may be used as a synonym.
[0008] In the description of this specification, an "imaging apparatus" is an apparatus provided with an imaging element (image sensor) having a photoelectric conversion function, and includes so-called digital cameras and scanners. Examples of the imaging element include a CCD-type imaging element, in which a CCD (Charge Coupled Device), a charge transfer mechanism, and a checkered-pattern color filter are combined to provide color sensitivity, and a CMOS (Complementary Metal-Oxide Semiconductor) type imaging element. The output currents of these imaging elements are digitized by an A/D converter. The content of each color channel at this stage is a signal intensity based on the spectral sensitivity unique to the imaging element.
[0009] "Scene reference raw data dependent on the characteristics of the imaging device" is the direct raw output signal of an imaging device recording information faithful to the subject, meaning the data digitized by the A/D converter itself, or that data after noise corrections such as fixed-pattern noise and dark-current noise correction. This scene reference raw data omits image processing that modifies the data content to enhance the effect at image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, and also omits the processing that maps the signal intensities of each color channel, based on the spectral sensitivity specific to the image sensor, into a standardized color space such as RIMM RGB or sRGB. The information content (for example, the number of gradations) of the scene reference raw data depends on the performance of the A/D converter, and is preferably equal to or greater than the information content (for example, the number of gradations) required for the viewing image reference data. For example, when the viewing image reference data (described later) has 8 bits of gradation per channel, the scene reference raw data preferably has 12 bits or more, more preferably 14 bits or more, and still more preferably 16 bits or more.
[0010] An "output medium" is, for example, a display device such as a CRT (Cathode Ray Tube), LCD (Liquid Crystal Display), or plasma display, or paper for generating hard-copy images, such as silver halide photographic paper, inkjet paper, or thermal printer paper.
[0011] "Viewing image reference data" means digital image data used on display devices such as CRTs, LCDs, and plasma displays, or used for generating hard-copy images on output media such as silver halide photographic paper, inkjet paper, and thermal printer paper. Viewing image reference data differs from scene reference raw data in that it has undergone "optimization processing" so that an optimal image is obtained on such display devices and output media.
[0012] A "recording medium" is a storage medium used to store the "scene reference raw data" and "reproduction auxiliary data" output by the imaging device, and may be any of CompactFlash (registered trademark), Memory Stick (registered trademark), SmartMedia (registered trademark), MultiMediaCard, hard disk, floppy (registered trademark) disk, magneto-optical storage medium (MO), CD-R, and the like. The unit that writes to the recording medium may be integrated with the imaging device, may be a writing unit connected by wire via a cord, or may be an independent or remotely located unit connected wirelessly via a communication network such as the Internet. The file format used when recording to the recording medium is preferably not a format specific to the imaging device, but a standardized general-purpose file format such as TIFF (Tagged Image File Format), JPEG (Joint Photographic Experts Group), or Exif (Exchangeable Image File Format).
[0013] "Shooting information data" is a record of the shooting condition settings at the time of shooting, and may include the same tag information written in the header portion of an Exif file. Specifically, it includes tags (codes) indicating exposure time, shutter speed, aperture value (F-number), ISO sensitivity, brightness value, subject distance range, light source, presence or absence of strobe emission, subject area, white balance, zoom magnification, shooting scene, amount of reflected strobe light, shooting saturation, type of subject, information on subject composition, and the like.
[0014] "Shooting information data" can be classified into values obtained at shooting time from sensors provided in the camera for automating the exposure setting and focusing functions of the imaging device, data processed from those sensor values, and camera shooting conditions set on the basis of those sensor values. In addition, it also includes information set manually by the photographer using, for example, the shooting mode dial (portrait, sports, macro shooting mode, etc.) and the forced-flash setting switch provided on the imaging device.
[0015] "Standardized scene reference image data" means image data in which at least the signal intensities of each color channel based on the spectral sensitivity of the image sensor itself have already been mapped into a standard color space such as the aforementioned RIMM RGB, ERIMM RGB, or scRGB, and in which image processing that modifies the data content to enhance the effect at image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, is omitted. The scene reference image data is preferably data in which the photoelectric conversion characteristic of the imaging device (the opto-electronic conversion function defined in ISO 14524; see, for example, "Fine Imaging and Digital Photography", Corona Publishing, edited by the Publishing Committee of the Society of Photographic Science and Technology of Japan, p. 449) has been corrected. The information content (for example, the number of gradations) of the standardized scene reference image data depends on the performance of the A/D converter, and is preferably equal to or greater than the information content (for example, the number of gradations) required for the viewing image reference data. For example, when the viewing image reference data has 8 bits of gradation per channel, the scene reference image data preferably has 12 bits or more, more preferably 14 bits or more, and still more preferably 16 bits or more.
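As an illustrative sketch only, correcting the photoelectric conversion characteristic described above amounts to inverting the camera's opto-electronic conversion function (OECF) so that code values become proportional to scene luminance. A simple power-law OECF, a gamma of 2.2, and a 12-bit code range are assumptions for illustration, not values fixed by this specification:

```python
# Sketch: undo a power-law opto-electronic conversion function (OECF)
# to recover scene-linear signal values. The gamma value and 12-bit
# range are illustrative; a real camera's OECF is measured per ISO 14524.

def inverse_oecf(code_value, max_code=4095, gamma=2.2):
    """Map a 12-bit encoded code value back to a scene-linear value in [0, 1]."""
    normalized = code_value / max_code
    return normalized ** gamma

def forward_oecf(linear, max_code=4095, gamma=2.2):
    """Encode a scene-linear value in [0, 1] to a 12-bit code value."""
    return round((linear ** (1.0 / gamma)) * max_code)

# Round-tripping a mid-tone: encoding then decoding recovers the
# scene-linear value (to within quantization error).
code = forward_oecf(0.18)      # "18% grey" scene reflectance
recovered = inverse_oecf(code)
```

The round trip is lossy only by the 12-bit quantization, which is why the specification asks for more raw bits than the 8 bits of the viewing image reference data.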
[0016] "Imaging device characteristic correction processing for generating standardized scene reference image data" means processing that converts "scene reference raw data dependent on the characteristics of the imaging device" into "standardized scene reference image data". The content of this processing depends on the state of the scene reference raw data, but it includes at least the processing that maps the signal intensities of each color channel, based on the spectral sensitivity specific to the image sensor, into a standard color space such as RIMM RGB, ERIMM RGB, or scRGB. For example, when the scene reference raw data has not undergone interpolation processing based on the color filter array, that processing must be performed in addition. (Details of interpolation processing based on the color filter array are described, for example, in "Fine Imaging and Digital Photography", Corona Publishing, edited by the Publishing Committee of the Society of Photographic Science and Technology of Japan, p. 51.) As a result, "standardized scene reference image data" is obtained that has almost the same information content as the "scene reference raw data" but in which the differences in signal values between different "imaging devices" have been corrected.
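The color space mapping step described above can be sketched as a 3x3 matrix applied to linear camera RGB. The coefficients below are hypothetical placeholders; a real characterization matrix is derived from the sensor's measured spectral sensitivities:

```python
# Sketch: map linear camera RGB into a standardized colour space with a
# 3x3 characterization matrix. The coefficients are hypothetical; each
# row sums to 1.0 so that neutral (grey) inputs stay neutral.

CAMERA_TO_STANDARD = [
    [ 0.90,  0.15, -0.05],
    [ 0.05,  0.85,  0.10],
    [-0.02,  0.07,  0.95],
]

def to_standard_rgb(rgb):
    """Apply the 3x3 characterization matrix to one linear RGB triple."""
    return tuple(
        sum(CAMERA_TO_STANDARD[row][col] * rgb[col] for col in range(3))
        for row in range(3)
    )
```

Because the rows sum to one, a grey input such as (0.5, 0.5, 0.5) maps to itself, a common white-preserving normalization for such matrices.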
[0017] "Optimization processing" is processing for obtaining an optimal image on a display device such as a CRT, LCD, or plasma display, or on an output medium such as silver halide photographic paper, inkjet paper, or thermal printer paper. For example, when display on a CRT display monitor conforming to the sRGB standard is assumed, the data is processed so that optimal color reproduction is obtained within the color gamut of the sRGB standard; when output to silver halide photographic paper is assumed, the data is processed so that optimal color reproduction is obtained within the color gamut of silver halide photographic paper. Besides gamut compression, this also includes gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and handling of the output characteristics (LUT) of the output device. It goes without saying that image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, and dodging and burning is also performed.
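Of the operations listed above, the 16-bit to 8-bit gradation compression can be sketched as follows. The gamma 1/2.2 curve is an illustrative stand-in for whatever rendering curve or LUT a real implementation would use:

```python
# Sketch: gradation compression from 16-bit linear data to 8-bit
# display data, one step of the "optimization processing". A simple
# gamma 1/2.2 curve stands in for a real rendering curve.

def compress_16_to_8(value16, gamma=2.2):
    """Compress a 16-bit value (0..65535) to an 8-bit value (0..255)."""
    linear = value16 / 65535.0
    rendered = linear ** (1.0 / gamma)      # brighten mid-tones for display
    return min(255, max(0, round(rendered * 255)))
```

Note that the curve, not just truncation, does the work: a mid-level input lands well above the 8-bit midpoint, which is why simply dropping the low 8 bits would discard shadow detail.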
[0018] Examples of optimizing the viewing image reference data using the "shooting information data" are given below.
[0019] Based on the "subject composition" information, it becomes possible, for example, to apply saturation enhancement processing partially, or to apply dodging processing in scenes with a wide dynamic range.
[0020] Based on the result of shooting scene discrimination, it becomes possible, for example, in night scene shooting to relax the degree of white balance adjustment and to adjust the color balance specially.
[0021] From the "amount of reflected strobe light" information, the distance between the photographer and the subject can be estimated and reflected, for example, in the setting of image processing conditions that suppress blown-out highlights in skin.
[0022] Based on the "type of subject" information, it is possible, for example, in portrait shooting to relax the degree of sharpness and strengthen smoothing processing, making skin wrinkles less noticeable.
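The examples above amount to selecting rendering parameters from shooting-information tags. A minimal sketch, in which the tag names and numeric values are hypothetical illustrations rather than values defined by this specification:

```python
# Sketch: derive viewing-image rendering parameters from shooting
# information data. Tag names and numbers are illustrative only.

DEFAULTS = {"white_balance_strength": 1.0, "sharpness": 1.0, "smoothing": 0.0}

def rendering_parameters(shooting_info):
    """Choose rendering parameters from a shooting-information dict."""
    params = dict(DEFAULTS)
    if shooting_info.get("scene") == "night":
        # Night scene: relax the degree of white balance adjustment.
        params["white_balance_strength"] = 0.3
    if shooting_info.get("subject") == "person":
        # Portrait: soften sharpening and strengthen smoothing so that
        # skin wrinkles become less noticeable.
        params["sharpness"] = 0.5
        params["smoothing"] = 0.7
    return params
```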
[0023] Besides the mechanism for applying the image processing according to the present invention to digital image data acquired from the imaging device according to the present invention, the image recording apparatus according to the present invention may include a film scanner for inputting frame image information of photographic materials recorded by an analog camera, such as color negative film, color reversal film, black-and-white negative film, and black-and-white reversal film, and a flatbed scanner for inputting image information reproduced on color paper, i.e., silver halide photographic paper. It may also include means for reading digital image data that was acquired by a digital camera other than the imaging device of the present invention and stored on any known portable "recording medium" such as CompactFlash (registered trademark), Memory Stick (registered trademark), SmartMedia (registered trademark), MultiMediaCard (registered trademark), floppy (registered trademark) disk, magneto-optical storage medium (MO), or CD-R, or means for acquiring digital image data from a remote location via communication means, together with processing means for forming a viewing image on any known output medium, such as a display device including a CRT, LCD, or plasma display, or paper for generating hard-copy images including silver halide photographic paper, inkjet paper, and thermal printer paper.
Effects of the Invention
[0024] According to the present invention, reproduction auxiliary data for generating viewing image reference data is generated on the imaging device side, and this reproduction auxiliary data is attached to the scene reference raw data dependent on the characteristics of the imaging device and recorded on a recording medium. When the data recorded on the recording medium is output to an output medium, the viewing image reference data is generated using the reproduction auxiliary data, which suppresses loss of the captured image information and makes it possible to obtain viewing image reference data of higher quality than the image data obtained by the imaging device.
[0025] Furthermore, by attaching to the scene reference raw data not only the reproduction auxiliary data but also the shooting information data, i.e., the shooting condition settings at the time of shooting, the image processing apparatus or the image recording apparatus generates the viewing image reference data using both the reproduction auxiliary data and the shooting information data, making it possible to obtain viewing image reference data of even higher quality.
Brief Description of the Drawings
[0026]
[FIG. 1] A block diagram showing the main configuration of an imaging device according to an embodiment of the present invention.
[FIG. 2] A block diagram showing the internal configuration of the reproduction auxiliary data generation unit.
[FIG. 3] A diagram showing the data structure of a data file recorded on the recording medium of the storage device in FIG. 1.
[FIG. 4] A diagram for explaining the content of the imaging device characteristic correction information.
[FIG. 5] A diagram (a) showing the gradation conversion characteristics of scene reference image data, and a diagram (b) showing the gradation conversion characteristics of scene reference image data and viewing image reference data.
[FIG. 6] A diagram (a) showing the conversion characteristics from scene reference image data to viewing image reference data, and a diagram (b) showing the gradation conversion characteristics of the viewing image reference data for each light source condition and exposure condition.
[FIG. 7] A diagram showing an example of the data structure of the processing process reproduction information.
[FIG. 8] A diagram showing a discrimination map for discriminating the shooting scene.
[FIG. 9] A flowchart showing the image data recording processing executed in the imaging device of the embodiment.
[FIG. 10] A flowchart showing the shooting scene discrimination processing executed in the imaging device.
[FIG. 11] A flowchart showing the first occupancy ratio calculation processing for calculating a first occupancy ratio for each region of brightness and hue.
[FIG. 12] A diagram showing an example of a program for converting RGB to the HSV color system.
[FIG. 13] A diagram showing the brightness (V)-hue (H) plane and regions r1 and r2 on the V-H plane.
[FIG. 14] A diagram showing the brightness (V)-hue (H) plane and regions r3 and r4 on the V-H plane.
[FIG. 15] A diagram showing curves representing a first coefficient by which the first occupancy ratio is multiplied to calculate an index α.
[FIG. 16] A diagram showing curves representing a second coefficient by which the first occupancy ratio is multiplied to calculate an index β.
[FIG. 17] A flowchart showing the second occupancy ratio calculation processing for calculating a second occupancy ratio based on the composition of the captured image data.
[FIG. 18] A diagram showing regions n1 to n4 determined according to the distance from the outer edge of the screen of the captured image data.
[FIG. 19] A diagram showing, for each region (n1 to n4), curves representing a third coefficient by which the second occupancy ratio is multiplied to calculate an index γ.
[FIG. 20] A flowchart showing the bias amount calculation processing.
[FIG. 21] A block diagram showing the main configuration of an imaging device in a modification of the present embodiment.
[FIG. 22] A diagram showing the data structure of a data file recorded on the recording medium of the storage device in FIG. 21.
[FIG. 23] A flowchart showing the image data recording processing executed in the imaging device of the modification of the present embodiment.
[FIG. 24] An external view of an image recording apparatus according to an embodiment of the present invention.
[FIG. 25] A block diagram showing the internal configuration of the image recording apparatus.
[FIG. 26] A block diagram showing the internal configuration of the image processing apparatus and the output unit.
[FIG. 27] A flowchart showing the image processing executed in the image recording apparatus.
Explanation of Reference Numerals
5 A/D converter
8 Header information processing unit
9 Storage device
12 Reproduction auxiliary data generation unit
121 Imaging device characteristic correction information generation unit
122 Viewing image reference data restoration information generation unit
123 Processing process reproduction information generation unit
13 Shooting information data generation unit
15 Display unit
100, 101 Imaging device
201 Image recording apparatus
301 Image processing apparatus
303 Input unit
306 Viewing image reference data restoration condition generation unit
307 Processing process reproduction unit
313 Shooting information data processing unit
314 Viewing image reference data generation unit
BEST MODE FOR CARRYING OUT THE INVENTION
[0028] Embodiments of the present invention will be described in detail below with reference to the drawings.
[0029] First, the configuration of the present embodiment will be described.
<Configuration of the Imaging Device>
FIG. 1 shows the main configuration of an imaging device 100 according to an embodiment of the present invention. As shown in FIG. 1, the imaging device 100 includes a lens 1, an aperture 2, a CCD (Charge Coupled Device) 3, an analog processing circuit 4, an A/D converter 5, a temporary storage memory 6, an image processing unit 7, a header information processing unit 8, a storage device 9, a CCD drive circuit 10, a control unit 11, a reproduction auxiliary data generation unit 12, an operation unit 14, a display unit 15, a strobe drive circuit 16, a strobe 17, a focal length adjustment circuit 18, an autofocus drive circuit 19, a motor 20, and the like.
[0030] The optical system of the imaging device 100 includes the lens 1, the aperture 2, and the CCD 3.
[0031] The lens 1 adjusts the focus and forms an optical image of the subject. The aperture 2 adjusts the amount of the light flux that has passed through the lens 1. The CCD 3 photoelectrically converts the subject light imaged on its light-receiving surface by the lens 1 into electrical signals (imaging signals) whose amounts correspond to the amount of light incident on each sensor element in the CCD 3. Controlled by timing pulses input from the CCD drive circuit 10, the CCD 3 sequentially outputs these imaging signals to the analog processing circuit 4.
[0032] The analog processing circuit 4 performs processing such as amplification of the RGB signals and noise reduction on the imaging signals input from the CCD 3. The processing in the analog processing circuit 4 is switched ON/OFF via the control unit 11 in accordance with operation signals from the operation unit 14.
[0033] The A/D converter 5 converts the imaging signals input from the analog processing circuit 4 into digital signals and outputs them. In the following, the digital signals obtained by the A/D converter 5 are described as scene reference raw data.
[0034] The temporary storage memory 6 is a buffer memory or the like, and temporarily stores the image data output from the A/D converter 5.
[0035] The image processing unit 7 applies, to the image data stored in the temporary storage memory 6, image quality improvement processing for display on the display unit 15, such as gradation correction, spectral sensitivity crosstalk correction, dark-current noise suppression, sharpening, white balance adjustment, and saturation adjustment, as well as processing such as image resizing, trimming, and aspect conversion. The processing in the image processing unit 7 is switched ON/OFF via the control unit 11 in accordance with operation signals from the operation unit 14.
[0036] The header information processing unit 8 attaches the reproduction auxiliary data (described in detail later) generated by the reproduction auxiliary data generation unit 12 to the file header (header area) of the scene reference raw data stored in the temporary storage memory 6, creating an attached data file (see FIG. 3).
[0037] The storage device 9 is configured from a non-volatile semiconductor memory or the like, and stores the control program of the imaging device 100. The storage device 9 also includes a mounting section for mounting a recording medium such as a memory card, and, in accordance with control signals input from the control unit 11, reads data recorded on the recording medium mounted in the mounting section and writes data to the recording medium.
[0038] The CCD drive circuit 10 outputs timing pulses in accordance with control signals input from the control unit 11, and controls the driving of the CCD 3.
[0039] The control unit 11 is configured from a CPU (Central Processing Unit) or the like, reads the control program of the imaging device 100 stored in the storage device 9, and controls the entire imaging device 100 in accordance with the read control program.
[0040] For example, in accordance with operation signals from the operation unit 14, the control unit 11 executes the shooting operation by controlling the autofocus drive circuit 19, which controls the motor 20 that adjusts the focal length and focus of the lens 1, as well as the focal length adjustment circuit 18, the CCD drive circuit 10, the analog processing circuit 4, the temporary storage memory 6, the image processing unit 7, the display unit 15, and the strobe drive circuit 16. The control unit 11 also instructs the reproduction auxiliary data generation unit 12 to generate the reproduction auxiliary data (described in detail later) to be attached to the file header of the scene reference raw data obtained by shooting, and instructs the storage device 9 to record, on the recording medium, a data file in which the reproduction auxiliary data is attached to the scene reference raw data.
[0041] The reproduction auxiliary data generation unit 12 generates the reproduction auxiliary data required when viewing image reference data is generated by applying optimization processing for forming a viewing image on an output medium, and outputs the generated reproduction auxiliary data to the header information processing unit 8. The internal configuration of the reproduction auxiliary data generation unit 12 will be described in detail later with reference to FIG. 2.
[0042] The operation unit 14 includes various function buttons such as a shutter button, a power ON/OFF button, and a zoom button, cursor keys, and the like, and outputs operation signals corresponding to the buttons and keys to the control unit 11.
[0043] The display unit 15 is configured from a display device such as an LCD, and performs the required display processing in accordance with display control signals input from the control unit 11. For example, the display unit 15 displays information for the user of the imaging device 100 to check settings and conditions related to shooting, and displays viewing image reference data generated for display on the display unit 15. In shooting mode, the display unit 15 also functions as a finder that continuously displays the images captured by the CCD 3.
[0044] The strobe drive circuit 16 drives and controls the strobe 17 so that it emits light when the subject brightness is low, in accordance with control signals input from the control unit 11.
[0045] The strobe 17 boosts the battery voltage to a predetermined high voltage and stores it as charge in a capacitor. Driven by the strobe drive circuit 16, the strobe 17 causes its flash tube to emit light using the charge stored in the capacitor, irradiating the subject with auxiliary light.
[0046] The focal length adjustment circuit 18 controls the motor 20 for moving the lens 1 to adjust the focal length, in accordance with control signals input from the control unit 11.
[0047] The autofocus drive circuit 19 controls the motor 20 for moving the lens 1 to adjust the focus, in accordance with control signals input from the control unit 11.
[0048] FIG. 2 shows the internal configuration of the reproduction auxiliary data generation unit 12. As shown in FIG. 2, the reproduction auxiliary data generation unit 12 is composed of an imaging device characteristic correction information generation unit 121, a viewing image reference data restoration information generation unit 122, and a processing process reproduction information generation unit 123.
[0049] 撮像装置特性補正情報生成部 121は、シーン参照生データから標準化されたシーン参照画像データを生成するための撮像装置特性補正処理に必要な情報（撮像装置特性補正情報）を生成する。撮像装置特性補正処理には、図 4に示すように、シーン参照生データに対するフィルタ補間演算 (a)と、マトリックス演算 (b)と、光電変換特性及びゲイン補正 (c)がある。 The imaging device characteristic correction information generation unit 121 generates information (imaging device characteristic correction information) necessary for the imaging device characteristic correction processing that generates standardized scene reference image data from the scene reference raw data. As shown in FIG. 4, the imaging device characteristic correction processing includes a filter interpolation operation (a), a matrix operation (b), and photoelectric conversion characteristic and gain correction (c) applied to the scene reference raw data.
[0050] 図 4 (a)に示すフィルタ補間演算は、1画素 1色のフィルタ配列を有する画像データから、1画素 3色 (RGB)のリニア画像データに補間する処理である。ここで、フィルタ配列とは、CCDの色弁別用カラーフィルタの配列パターンであり、RGB原色のベイヤー配列が一般的に用いられる。フィルタ補間方法としては、ニアレストネイバー(nearest neighbor)法、バイリニア(bi-linear interpolation)法、バイキュービック(bi-cubic convolution)法等を適用することが可能である。 The filter interpolation operation shown in FIG. 4 (a) is processing that interpolates image data having a one-color-per-pixel filter array into linear image data of three colors (RGB) per pixel. Here, the filter array is the array pattern of the color filters for color discrimination of the CCD, and a Bayer array of RGB primary colors is generally used. As the filter interpolation method, the nearest neighbor method, the bi-linear interpolation method, the bi-cubic convolution method, or the like can be applied.
[0051] ニアレストネイバー法では、着目画素の最近傍の画素を選択し、拡大・縮小のサイズに合わせてピクセルをそのまま拡大・縮小し、必要な箇所のみピクセルを補間する。バイリニア法では、着目画素の周囲の 4つの画素の濃度値から、その座標（実数値）に応じて線形の濃度補間を行う。バイキュービック法では、バイリニア法よりも高精度で補間を行うために、着目画素の周囲の 16個の画素の濃度値から、3次関数を用いて補間する。補間に用いる式は sin(πx)/(πx) で、理論（サンプリング定理）的には最も完全な濃度補間式である。これをテイラー展開で xの 3次の項で近似し、補間式として用いる。 In the nearest neighbor method, the pixel nearest to the pixel of interest is selected, pixels are enlarged or reduced as they are in accordance with the enlargement/reduction size, and pixels are interpolated only where necessary. In the bi-linear method, linear density interpolation is performed from the density values of the four pixels surrounding the pixel of interest according to its coordinates (real values). In the bi-cubic method, in order to interpolate with higher accuracy than the bi-linear method, interpolation is performed using a cubic function from the density values of the 16 pixels surrounding the pixel of interest. The formula used for interpolation is sin(πx)/(πx), which is theoretically (by the sampling theorem) the most complete density interpolation formula; it is approximated by the third-order term of x in a Taylor expansion and used as the interpolation formula.
[0052] 図 4 (b)に示すマトリックス演算は、カラーフィルタの分光特性や撮像素子の分光感度特性の違いから、同一の被写体色が異なる信号として記録される差異を補正するための演算処理であり、このマトリックス演算により RGBから XYZ表色系の三刺激値に変換される。 The matrix operation shown in FIG. 4 (b) is arithmetic processing for correcting the differences by which the same subject color is recorded as different signals due to differences in the spectral characteristics of the color filters and the spectral sensitivity characteristics of the image sensor; this matrix operation converts RGB into tristimulus values of the XYZ color system.
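参考として、このマトリックス演算に相当する 3×3 行列積の形を C言語のスケッチで示す。本文のとおり行列係数 a〜i は撮像装置毎に異なるため、ここでは例として sRGB (D65) の公称 RGB→XYZ 係数を仮置きしている。 For reference, a C sketch of the 3×3 matrix product corresponding to this operation; since the coefficients a to i differ per imaging device, nominal sRGB (D65) coefficients are used here purely as placeholders.

```c
/* 図4(b)のマトリックス演算に相当する 3x3 行列積のスケッチ。
   行列係数 a〜i は撮像装置毎に異なるため、ここでは仮に
   sRGB(D65) の公称 RGB→XYZ 係数を置いている。 */
void rgb_to_xyz(double r, double g, double b, double xyz[3])
{
    static const double m[3][3] = {
        {0.4124, 0.3576, 0.1805},   /* a b c */
        {0.2126, 0.7152, 0.0722},   /* d e f */
        {0.0193, 0.1192, 0.9505},   /* g h i */
    };
    double in[3] = {r, g, b};
    for (int row = 0; row < 3; row++) {
        xyz[row] = 0.0;
        for (int col = 0; col < 3; col++)
            xyz[row] += m[row][col] * in[col];  /* X,Y,Z の各成分を積和で求める */
    }
}
```

リニアな RGB値 (1, 1, 1)（基準白）を入力すると Y = 1.0 となる。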
[0053] 図 4 (c)は、撮像素子の光電変換特性（リニア変換特性）やゲイン補正（平行移動）に伴う差異を補正するための処理を表すもので、この処理は、図 4 (c)に示すように、logE (E：露光量)に対し logY (Y：XYZ表色系の刺激値)がリニアに変換される特性となっている。 FIG. 4 (c) shows processing for correcting differences caused by the photoelectric conversion characteristics (linear conversion characteristics) and gain correction (translation) of the image sensor; as shown in FIG. 4 (c), this processing has a characteristic in which logY (Y: stimulus value of the XYZ color system) is linearly converted with respect to logE (E: exposure amount).
[0054] 図 4 (a)〜(c)に示すような撮像装置特性補正処理により、撮像装置の特性に依存したシーン参照生データから、シーン参照画像データ等の標準化された色空間における XYZ値へと変換される。なお、撮像装置特性補正処理で適用されるカラーフィルタの配列パターン、マトリクス演算の行列係数 a〜i、光電変換特性やゲイン補正に伴う差異を補正する係数値は、全て撮像装置毎に異なるものである。 Through the imaging device characteristic correction processing shown in FIGS. 4 (a) to (c), the scene reference raw data, which depends on the characteristics of the imaging device, is converted into XYZ values in a standardized color space, such as scene reference image data. Note that the color filter array pattern applied in the imaging device characteristic correction processing, the matrix coefficients a to i of the matrix operation, and the coefficient values for correcting differences caused by photoelectric conversion characteristics and gain correction all differ for each imaging device.
[0055] 鑑賞画像参照データ復元情報生成部 122は、シーン参照画像データから鑑賞画 像参照データを生成する際に撮像装置 100での鑑賞画像参照データを復元するた めの復元情報 (鑑賞画像参照データ復元情報)のエンコード情報を生成する。 [0055] The appreciation image reference data restoration information generation unit 122 restores the appreciation image reference data in the imaging device 100 when generating the appreciation image reference data from the scene reference image data. Encoding information of the restoration information (viewing image reference data restoration information) is generated.
[0056] 図 5 (a)に、シーン参照画像データの階調変換特性を示す。この階調変換特性は、 図 5 (a)に示すように、 logE (E :露光量)に対し logY(Y:XYZ表色系の刺激値)がリニ ァに変換する特性となっている。  FIG. 5 (a) shows the tone conversion characteristics of scene reference image data. As shown in FIG. 5 (a), this tone conversion characteristic is a characteristic that logY (Y: stimulus value of XYZ color system) is converted to linear with respect to logE (E: exposure amount).
[0057] 図 5 (b)に、シーン参照画像データと鑑賞画像参照データの sRGB色空間における 階調変換特性として、均等色空間である L*a*b*表色系の L*と logE (E:露光量)の関 係を示す。図 5 (b)に示した階調変換特性では、今度は鑑賞画像参照データが、 log Eに対し L*がリニアに変化する特性となっている。  [0057] Figure 5 (b) shows the L * a * b * color system L * and logE (same color space) as the gradation conversion characteristics in the sRGB color space of scene reference image data and appreciation image reference data. E: Exposure amount). In the gradation conversion characteristics shown in Fig. 5 (b), the appreciation image reference data now has a characteristic that L * changes linearly with respect to log E.
[0058] 図 6 (a)に、シーン参照画像データから鑑賞画像参照データの変換特性(図中の曲 線)を示す。図 6 (a)に示す変換特性に、撮影時の撮影シーン (順光、逆光、アンダー 、フラッシュ近接等)に伴う補正を加えることができる。図 6 (b)に、逆光、フラッシュ近 接撮影の光源条件、アンダー、オーバーの露出条件を補正するための鑑賞画像参 照データの階調変換特性を示す。図 6 (a)の変換特性と、図 6 (b)の変換特性を合成 したものが、鑑賞画像参照データ復元情報となる。  [0058] Fig. 6 (a) shows the conversion characteristics (curved line in the figure) of the viewing image reference data from the scene reference image data. The conversion characteristics shown in Fig. 6 (a) can be corrected with the shooting scene at the time of shooting (front light, backlight, under, flash proximity, etc.). Figure 6 (b) shows the gradation conversion characteristics of the appreciation image reference data for correcting the light conditions for backlighting, flash close-up photography, and under / over exposure conditions. The combination of the conversion characteristics shown in Fig. 6 (a) and the conversion characteristics shown in Fig. 6 (b) is the viewing image reference data restoration information.
[0059] 処理プロセス再現情報生成部 123は、シーン参照画像データから鑑賞画像参照データを生成する際に撮像装置 100での鑑賞画像参照データの生成過程を再現するための情報（処理プロセス再現情報）を生成する。 The processing process reproduction information generation unit 123 generates information (processing process reproduction information) for reproducing the process by which the viewing image reference data was generated in the imaging apparatus 100 when generating the viewing image reference data from the scene reference image data.
[0060] 図 7に、処理プロセス再現情報の一例を示す。処理プロセス再現情報は、本撮影から所定時間前（例えば、2500ms前）までの撮影条件の履歴情報を示し、具体的には、図 7に示すように、ユーザ特性を示す手ぶれレベルと、撮影時の撮影シーン（順光、逆光、アンダー、フラッシュ近接等）の判別処理の処理結果を示す指標 1〜3と、指標 1〜3から算出される合計指標 4と、算出された指標に対する評価値 1、手ぶれレベルの評価値 2及び撮影条件正当性の各項目についての履歴情報を示している。 FIG. 7 shows an example of the processing process reproduction information. The processing process reproduction information indicates history information of the shooting conditions from the actual shooting back to a predetermined time before it (for example, 2500 ms before). Specifically, as shown in FIG. 7, it indicates history information for each of the following items: the camera shake level indicating a user characteristic, indices 1 to 3 indicating the results of the discrimination processing of the shooting scene at the time of shooting (front light, backlight, under, flash proximity, etc.), total index 4 calculated from indices 1 to 3, evaluation value 1 for the calculated indices, evaluation value 2 for the camera shake level, and the shooting condition validity.
[0061] 処理プロセス再現情報において、手ぶれレベルは、手ぶれの度合いを、例えば 10 段階(1〜10の整数)で示したものであり、手ぶれの度合いが高いほど手ぶれレベル の数値が高くなる。  In the processing process reproduction information, the camera shake level indicates the degree of camera shake in, for example, 10 levels (an integer of 1 to 10). The higher the camera shake level, the higher the value of the camera shake level.
[0062] 指標 1〜3は、撮影時の撮影シーン (順光、逆光、アンダー、フラッシュ近接等)を特 定するための数値である。図 8に、撮影シーンを判別するための判別マップを示す。 本実施形態では、図 8に示すように、指標 1をストロボ度 (ストロボの度合い)、指標 2を 逆光度 (逆光の度合 、)、指標 3をアンダー度 (アンダーの度合 、)とする。図 8 (a)に 示すように、指標 1 (ストロボ度)と指標 2 (逆光度)で決定される領域力も逆光と順光が 判別され、指標 1 (ストロボ度)と指標 3 (アンダー度)で決定される領域から、アンダー とフラッシュ近接が判別される。撮影シーン判別処理については、後に図 10〜図 20 を参照して詳細に説明する。 [0062] Indices 1 to 3 are numerical values for specifying a shooting scene (forward light, backlight, under, flash proximity, etc.) at the time of shooting. Fig. 8 shows the discrimination map for discriminating the shooting scene. In this embodiment, as shown in FIG. 8, the index 1 is the strobe degree (strobe degree), the index 2 is the backlight intensity (backlight degree), and the index 3 is the under degree (under degree). As shown in Fig. 8 (a), the area force determined by index 1 (strobe degree) and index 2 (backlight intensity) is also distinguished from backlight and direct light, and index 1 (strobe degree) and index 3 (under degree). Under and flash proximity are discriminated from the area determined by. The shooting scene discrimination process will be described in detail later with reference to FIGS.
[0063] 合計指標 4の定義を式 (1)に示す。
合計指標 4 = 10 − [{(指標 1 + 指標 2 + 指標 3 + 18)/36} × 10] … (1)
また、評価値 1、評価値 2は、それぞれ、光源条件・露出条件評価値、手ぶれ評価値を示しており、式 (2)、(3)のように定義される。
評価値 1 = 合計指標 4 × 3.4 − 5 … (2)
評価値 2 = −0.54 × 手ぶれレベル + 8.5 … (3)
ここで、式 (2)及び式 (3)では、指標 1〜3のうち、−6以下の値を −6、+6以上の値を +6としている。 The definition of total index 4 is given by equation (1): total index 4 = 10 − [{(index 1 + index 2 + index 3 + 18)/36} × 10] … (1). Evaluation value 1 and evaluation value 2 indicate the light source condition / exposure condition evaluation value and the camera shake evaluation value, respectively, and are defined as in equations (2) and (3): evaluation value 1 = total index 4 × 3.4 − 5 … (2); evaluation value 2 = −0.54 × camera shake level + 8.5 … (3). Here, in equations (2) and (3), among indices 1 to 3, values of −6 or less are set to −6 and values of +6 or more are set to +6.
[0064] 撮影条件正当性は、撮影時の環境や撮影者の技術レベル等、撮影が正しく行われた程度を示す数値であり、評価値 1及び評価値 2を用いて式 (4)のように定義される。
撮影条件正当性 = 評価値 1 × 0.8 + 評価値 2 × 0.2 … (4)
撮影が正しく行われるほど撮影条件正当性の数値は高くなる。図 7の処理プロセス再現情報に示すように、時間の経過とともに撮影条件正当性の数値が高くなるように撮影条件（光源条件、露出条件等）が調整される。 The shooting condition validity is a numerical value indicating the degree to which shooting was performed correctly, reflecting the shooting environment, the skill level of the photographer, and the like, and is defined by equation (4) using evaluation value 1 and evaluation value 2: shooting condition validity = evaluation value 1 × 0.8 + evaluation value 2 × 0.2 … (4). The more correctly the shooting is performed, the higher the value of the shooting condition validity. As shown in the processing process reproduction information of FIG. 7, the shooting conditions (light source conditions, exposure conditions, etc.) are adjusted so that the value of the shooting condition validity increases over time.
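式 (1)〜(4)の計算手順は、次のような C言語のスケッチで表せる。関数名は説明用の仮のものであり、指標 1〜3の ±6 クリップを式 (1)の前に適用する点は本文からの解釈である。 The calculations of equations (1) to (4) can be sketched in C as follows; the function names are hypothetical, and applying the ±6 clipping of indices 1 to 3 before equation (1) is an interpretation of the text.

```c
/* 式(1)〜(4)の計算スケッチ。関数名は説明用。
   指標1〜3は ±6 にクリップしてから式(1)に入れるものと解釈した。 */
static double clip6(double v) { return v < -6.0 ? -6.0 : (v > 6.0 ? 6.0 : v); }

double total_index4(double i1, double i2, double i3)   /* 式(1): 合計指標4 */
{
    double s = clip6(i1) + clip6(i2) + clip6(i3);
    return 10.0 - ((s + 18.0) / 36.0) * 10.0;          /* 0〜10 の範囲になる */
}

double eval1(double total4) { return total4 * 3.4 - 5.0; }            /* 式(2): 評価値1 */
double eval2(double shake)  { return -0.54 * shake + 8.5; }           /* 式(3): 評価値2 */
double validity(double e1, double e2) { return e1 * 0.8 + e2 * 0.2; } /* 式(4) */
```

例えば指標 1〜3 がいずれも 0 のとき、合計指標 4 は 5 となる。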
[0065] 図 8 (a)及び (b)では、図 7に示す処理プロセス再現情報で示された指標 1〜3の履歴を菱形で示している。図 8に示す判別マップ上の履歴情報を表示部 15に表示するようにしてもよい。 In FIGS. 8 (a) and (b), the history of indices 1 to 3 indicated in the processing process reproduction information of FIG. 7 is shown by diamonds. The history information on the discrimination map shown in FIG. 8 may be displayed on the display unit 15.
[0066] なお、上述では、手ぶれレベルを 1〜10の整数値、指標 1〜3の数値範囲を −6以上 +6以下としたが、これらの数値範囲は特に限定されない。数値範囲の変更に伴い、合計指標 4や評価値 1、評価値 2の定義は変更される。また、手ぶれ以外のユーザ特性として、ユーザによる各種のモード設定を用いるようにしてもよい。また、処理プロセス再現情報に記録される履歴情報として、シャッタースピード、絞り値等を含めるようにしてもよい。 [0067] また、処理プロセス再現情報としての履歴情報の記録は、操作部 14により撮影モードが指定された後に開始し、シャッターボタンが押されたときを本撮影時としてもよいし、操作部 14のシャッターボタンが半押し状態になった後に履歴情報の記録を開始し、シャッターボタンが強く押されたときを本撮影時としてもよい。 In the above description, the camera shake level is an integer from 1 to 10 and the numerical range of indices 1 to 3 is from −6 to +6, but these numerical ranges are not particularly limited. When the numerical ranges are changed, the definitions of total index 4, evaluation value 1, and evaluation value 2 change accordingly. Various mode settings made by the user may be used as user characteristics other than camera shake, and the shutter speed, aperture value, and the like may also be included in the history information recorded in the processing process reproduction information. [0067] The recording of the history information as the processing process reproduction information may start after the shooting mode is designated via the operation unit 14, with the moment the shutter button is pressed taken as the time of actual shooting; alternatively, the recording may start after the shutter button of the operation unit 14 is pressed halfway, with the moment the shutter button is fully pressed taken as the time of actual shooting.
[0068] 図 3に、記憶デバイス 9の記録メディアに記録されるデータファイルのデータ構造を示す。再現補助データ生成部 12で生成される撮像装置特性補正情報、鑑賞画像参照データ復元情報及び処理プロセス再現情報は、図 3に示すように、ヘッダ情報処理部 8により、シーン参照生データのファイルヘッダに再現補助データとして添付されて添付済みのデータファイルが作成され、この添付済みのデータファイルが、記憶デバイス 9の記録メディアに記録されることになる。 FIG. 3 shows the data structure of a data file recorded on the recording medium of the storage device 9. As shown in FIG. 3, the imaging device characteristic correction information, the viewing image reference data restoration information, and the processing process reproduction information generated by the reproduction assistance data generation unit 12 are attached as reproduction assistance data to the file header of the scene reference raw data by the header information processing unit 8 to create an attached data file, and this attached data file is recorded on the recording medium of the storage device 9.
[0069] なお、撮像装置 100内で生成された鑑賞画像参照データを、表示部 15での表示用に用いるだけでなく、シーン参照画像データと、この鑑賞画像参照データとを関連付けて記録メディアに記録したり、この鑑賞画像参照データのサムネイル画像をシーン参照生データのメタデータとして添付して記録メディアに記録するようにしてもよい。 Note that the viewing image reference data generated in the imaging apparatus 100 may be used not only for display on the display unit 15; the scene reference image data and this viewing image reference data may be associated with each other and recorded on the recording medium, or a thumbnail image of this viewing image reference data may be attached as metadata of the scene reference raw data and recorded on the recording medium.
[0070] 次に、撮像装置 100における動作について説明する。 Next, the operation in the imaging apparatus 100 will be described.
[0071] 図 9のフローチャートを参照して、撮像装置 100において実行される画像データ記 録処理について説明する。  With reference to the flowchart of FIG. 9, the image data recording process executed in the imaging apparatus 100 will be described.
[0072] 操作部 14により撮影モードが指定されると、予備撮像が行われ (ステップ S1)、予 備撮像で得られた画像 (以下、予備撮像画像という。)を表示部 15に表示するための 鑑賞画像参照データが形成され (ステップ S2)、その形成された鑑賞画像参照デー タが表示部 15に表示される (ステップ S3)。なお、ステップ S1では、操作部 14により シャッターボタンが半押し状態になったときに予備撮像が行われるようにしてもよい。  [0072] When the shooting mode is designated by the operation unit 14, preliminary imaging is performed (step S1), and an image obtained by preliminary imaging (hereinafter referred to as preliminary imaging image) is displayed on the display unit 15. The viewing image reference data is formed (step S2), and the formed viewing image reference data is displayed on the display unit 15 (step S3). In step S1, preliminary imaging may be performed when the shutter button is pressed halfway by the operation unit 14.
[0073] 次いで、処理プロセス再現情報生成部 123において、撮影シーン判別処理（図 10〜図 20参照）が行われ、手ぶれレベルや、撮影シーン判別処理の処理結果としての指標 1〜3等から、撮影条件正当性を示す値が算出され、算出結果から撮影条件の正当性が判断される (ステップ S4)。そして、撮影条件正当性の値が高くなるように撮影条件が調整される (ステップ S5)。ステップ S1〜S5の処理は、操作部 14のシャッターボタンが押されて本撮影が指示されるまで繰り返され、本撮影より所定時間前までの撮影条件の履歴情報が処理プロセス再現情報として生成される。 Next, the processing process reproduction information generation unit 123 performs the shooting scene discrimination processing (see FIGS. 10 to 20), calculates a value indicating the shooting condition validity from the camera shake level, indices 1 to 3 obtained as results of the shooting scene discrimination processing, and the like, and judges the validity of the shooting conditions from the calculation result (step S4). The shooting conditions are then adjusted so that the value of the shooting condition validity increases (step S5). The processing of steps S1 to S5 is repeated until the shutter button of the operation unit 14 is pressed and actual shooting is instructed, and history information of the shooting conditions up to a predetermined time before the actual shooting is generated as the processing process reproduction information.
[0074] 操作部 14のシャッターボタンが押され、本撮影が指示されると (ステップ S6; YES)、CCD3から得られた撮像信号が A/D変換器 5によりデジタル信号に変換されることによりシーン参照生データが生成されると同時に (ステップ S7)、再現補助データ生成部 12では、再現補助データとして、撮像装置特性補正情報、鑑賞画像参照データ復元情報、処理プロセス再現情報が生成される (ステップ S8)。ステップ S8で生成される処理プロセス再現情報は、ステップ S1〜S5の処理で生成された本撮影前の履歴情報に、本撮影時の情報を追加したものである。なお、本撮影の後に、記録メディアへ記録するための鑑賞画像参照データを作成するステップを追加してもよい。 When the shutter button of the operation unit 14 is pressed and actual shooting is instructed (step S6; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate the scene reference raw data (step S7); at the same time, the reproduction assistance data generation unit 12 generates the imaging device characteristic correction information, the viewing image reference data restoration information, and the processing process reproduction information as reproduction assistance data (step S8). The processing process reproduction information generated in step S8 is obtained by adding information at the time of actual shooting to the pre-shooting history information generated by the processing of steps S1 to S5. A step of creating viewing image reference data to be recorded on the recording medium may be added after the actual shooting.
[0075] 次いで、ヘッダ情報処理部 8において、ステップ S7で生成されたシーン参照生データのファイルヘッダに、ステップ S8で生成された再現補助データがタグ情報として添付され (ステップ S9)、添付済みデータファイル（図 3参照）が作成される (ステップ S10)。この添付済みデータファイルは、記憶デバイス 9の記録メディアに記録、保存され (ステップ S11)、本画像データ記録処理が終了する。 Next, in the header information processing unit 8, the reproduction assistance data generated in step S8 is attached as tag information to the file header of the scene reference raw data generated in step S7 (step S9), and an attached data file (see FIG. 3) is created (step S10). This attached data file is recorded and saved on the recording medium of the storage device 9 (step S11), and this image data recording processing ends.
〈撮影シーン判別処理〉
次に、図 10のフローチャートを参照して、撮影条件の正当性を判断する際に必要な撮影シーン判別処理について説明する。 <Shooting Scene Discrimination Processing> Next, the shooting scene discrimination processing necessary for judging the validity of the shooting conditions will be described with reference to the flowchart of FIG. 10.
[0076] まず、撮影画像データ (例えば、シーン参照生データ)が所定の画像領域に分割さ れ、各分割領域が撮影画像データ全体に占める割合を示す占有率 (第 1の占有率、 第 2の占有率)を算出する占有率算出処理が行われる (ステップ T1)。占有率算出処 理の詳細は、後に図 11、図 17を参照して説明する。  First, captured image data (for example, scene reference raw data) is divided into predetermined image areas, and an occupation ratio (a first occupation ratio, a second occupation ratio) indicating a ratio of each divided area to the entire captured image data. Occupancy ratio calculation processing is performed (step T1). Details of the occupation rate calculation process will be described later with reference to FIGS.
[0077] 次いで、撮影画像データの階調分布の偏りを示す偏倚量を算出する偏倚量算出処理が行われる (ステップ T2)。ステップ T2の偏倚量算出処理については、後に図 20を参照して詳細に説明する。 Next, bias amount calculation processing for calculating a bias amount indicating the bias of the gradation distribution of the captured image data is performed (step T2). The bias amount calculation processing in step T2 will be described in detail later with reference to FIG. 20.
[0078] 次いで、ステップ T1で算出された占有率と、撮影条件に応じて予め設定された係数に基づいて撮影シーンを特定するための指標 1〜3が算出され (ステップ T3)、本撮影シーン判別処理が終了する。ステップ T3における指標の算出方法は、後に詳細に説明する。 Next, indices 1 to 3 for specifying the shooting scene are calculated based on the occupancy calculated in step T1 and coefficients set in advance according to the shooting conditions (step T3), and this shooting scene discrimination processing ends. The method of calculating the indices in step T3 will be described in detail later.
[0079] 次に、図 11のフローチャートを参照して、第 1の占有率算出処理について詳細に説明する。 Next, the first occupancy calculation processing will be described in detail with reference to the flowchart of FIG. 11.
[0080] まず、撮影画像データの RGB値が HSV表色系に変換される (ステップ T10)。図 12は、RGBから HSV表色系に変換することにより色相値、彩度値、明度値を得る変換プログラム (HSV変換プログラム)の一例を、プログラムコード (C言語)により示したものである。図 12に示す HSV変換プログラムでは、入力画像データであるデジタル画像データの値を InR、InG、InBと定義し、算出された色相値を OutHとし、そのスケールを 0〜360と定義し、彩度値を OutS、明度値を OutVとし、その単位を 0〜255と定義している。 First, the RGB values of the captured image data are converted into the HSV color system (step T10). FIG. 12 shows, as program code (C language), an example of a conversion program (HSV conversion program) that obtains hue, saturation, and brightness values by converting from RGB to the HSV color system. In the HSV conversion program shown in FIG. 12, the values of the digital image data serving as input image data are defined as InR, InG, and InB; the calculated hue value is defined as OutH on a scale of 0 to 360; and the saturation value OutS and brightness value OutV are defined in units of 0 to 255.
[0081] 次いで、撮影画像データが、所定の明度と色相の組み合わせからなる領域に分割され、分割領域毎に累積画素数を算出することにより 2次元ヒストグラムが作成される (ステップ T11)。以下、撮影画像データの領域分割について詳細に説明する。 Next, the captured image data is divided into regions each formed by a combination of predetermined brightness and hue, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step T11). The division of the captured image data into regions will be described in detail below.
[0082] 明度 (V)は、明度値が 0〜25 (v1)、26〜50 (v2)、51〜84 (v3)、85〜169 (v4)、170〜199 (v5)、200〜224 (v6)、225〜255 (v7)の 7つの領域に分割される。色相 (H)は、色相値が 0〜39、330〜359の肌色色相領域（H1及び H2）、色相値が 40〜160の緑色色相領域 (H3)、色相値が 161〜250の青色色相領域 (H4)、赤色色相領域 (H5)の 4つの領域に分割される。なお、赤色色相領域 (H5)は、撮影条件の判別への寄与が少ないとの知見から、以下の計算では用いていない。肌色色相領域は、更に、肌色領域 (H1)と、それ以外の領域 (H2)に分割される。以下、肌色色相領域 (H = 0〜39、330〜359)のうち、下記の式 (5)を満たす色相' (H)を肌色領域 (H1)とし、式 (5)を満たさない領域を (H2)とする。 Brightness (V) is divided into seven regions with brightness values of 0 to 25 (v1), 26 to 50 (v2), 51 to 84 (v3), 85 to 169 (v4), 170 to 199 (v5), 200 to 224 (v6), and 225 to 255 (v7). Hue (H) is divided into four regions: a flesh color hue region (H1 and H2) with hue values of 0 to 39 and 330 to 359, a green hue region (H3) with hue values of 40 to 160, a blue hue region (H4) with hue values of 161 to 250, and a red hue region (H5). The red hue region (H5) is not used in the following calculations, based on the finding that it contributes little to the discrimination of shooting conditions. The flesh color hue region is further divided into a flesh color region (H1) and the remaining region (H2). In the following, within the flesh color hue region (H = 0 to 39, 330 to 359), hue' (H) satisfying equation (5) below is defined as the flesh color region (H1), and the region not satisfying equation (5) is defined as (H2).
[0083] 10 < 彩度 (S) < 175、
色相' (H) = 色相 (H) + 60 （0 ≤ 色相 (H) < 300 のとき）、
色相' (H) = 色相 (H) − 300 （300 ≤ 色相 (H) < 360 のとき）、
輝度 (Y) = InR × 0.30 + InG × 0.59 + InB × 0.11 (A) として、
色相' (H) / 輝度 (Y) < 3.0 × (彩度 (S) / 255) + 0.7 … (5)
従って、撮影画像データの分割領域の数は 4 × 7 = 28個となる。なお、式 (5)において明度 (V)を用いることも可能である。
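式 (5)による肌色領域 (H1)の判定は、次のような C言語のスケッチで表せる（関数名は説明用の仮のもの）。 The flesh color region (H1) test of equation (5) can be sketched in C as follows (the function name is hypothetical).

```c
/* 式(5)による肌色領域(H1)判定のスケッチ。h: 色相 0〜359, s: 彩度 0〜255,
   in_r/in_g/in_b: 0〜255 を仮定。関数名は説明用。 */
int is_skin_region(double h, double s, double in_r, double in_g, double in_b)
{
    /* 判定対象は肌色色相領域 (0〜39, 330〜359) のみ */
    if (!((h >= 0.0 && h <= 39.0) || (h >= 330.0 && h <= 359.0)))
        return 0;
    if (!(10.0 < s && s < 175.0))             /* 10 < 彩度(S) < 175 */
        return 0;
    double hp = (h < 300.0) ? h + 60.0 : h - 300.0;        /* 色相'(H) */
    double y = in_r * 0.30 + in_g * 0.59 + in_b * 0.11;    /* 輝度(Y), 式(A) */
    if (y <= 0.0)
        return 0;                             /* 真っ黒な画素は除外（仮定） */
    return hp / y < 3.0 * (s / 255.0) + 0.7;  /* 式(5) */
}
```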
[0084] 2次元ヒストグラムが作成されると、分割領域毎に算出された累積画素数の全画素数（撮影画像全体）に占める割合を示す第 1の占有率が算出され (ステップ T12)、本第 1の占有率算出処理が終了する。明度領域 vi、色相領域 Hjの組み合わせからなる分割領域において算出された第 1の占有率を Rijとすると、各分割領域における第 1の占有率は表 1のように表される。 When the two-dimensional histogram has been created, the first occupancy, which indicates the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image), is calculated (step T12), and this first occupancy calculation processing ends. Let Rij be the first occupancy calculated in the divided region formed by the combination of brightness region vi and hue region Hj; the first occupancy in each divided region is then expressed as in Table 1.
[0085] [表 1]  [0085] [Table 1]
[第 1 の占有率]  [1st occupancy]
Figure imgf000022_0001
Figure imgf000022_0001
[0086] 次に、指標の算出方法について説明する。  Next, an index calculation method will be described.
[0087] 表 2に、ストロボ撮影としての確度、即ち、ストロボ撮影時の顔領域の明度状態を定量的に示す指標 αを算出するために必要な第 1の係数を分割領域別に示す。表 2に示された各分割領域の係数は、表 1に示した各分割領域の第 1の占有率 Rijに乗算する重み係数であり、撮影条件に応じて予め設定されている。 Table 2 shows, for each divided region, the first coefficients necessary for calculating index α, which quantitatively indicates the likelihood of strobe shooting, that is, the brightness state of the face region during strobe shooting. The coefficient of each divided region shown in Table 2 is a weighting coefficient by which the first occupancy Rij of that divided region shown in Table 1 is multiplied, and is set in advance according to the shooting conditions.
[0088] [表 2]  [0088] [Table 2]
[第 1 の係数]  [First factor]
Figure imgf000022_0002
Figure imgf000022_0002
[0089] 図 13に、明度 (v)-色相 (H)平面を示す。表 2によると、図 13において高明度の肌色色相領域に分布する領域 (r1)から算出される第 1の占有率には、正 (+)の係数が用いられ、それ以外の色相である青色色相領域 (r2)から算出される第 1の占有率には、負 (-)の係数が用いられる。図 15は、肌色領域 (H1)における第 1の係数と、その他の領域（緑色色相領域 (H3)）における第 1の係数を、明度全体に渡って連続的に変化する曲線（係数曲線）として示したものである。表 2及び図 15によると、高明度 (v = 170〜224)の領域では、肌色領域 (H1)における第 1の係数の符号は正 (+)であり、その他の領域（例えば、緑色色相領域 (H3)）における第 1の係数の符号は負 (-)であり、両者の符号が異なっていることがわかる。 FIG. 13 shows the brightness (v)-hue (H) plane. According to Table 2, a positive (+) coefficient is used for the first occupancy calculated from the region (r1) distributed in the high-brightness flesh color hue region in FIG. 13, and a negative (-) coefficient is used for the first occupancy calculated from the blue hue region (r2), which is a different hue. FIG. 15 shows the first coefficient in the flesh color region (H1) and the first coefficient in another region (the green hue region (H3)) as curves (coefficient curves) that change continuously over the entire brightness range. According to Table 2 and FIG. 15, in the high-brightness region (v = 170 to 224), the sign of the first coefficient in the flesh color region (H1) is positive (+), while the sign of the first coefficient in the other region (for example, the green hue region (H3)) is negative (-), showing that the two signs differ.
[0090] 明度領域 vi、色相領域 Hjにおける第 1の係数を Cijとすると、指標 αを算出するため の Hk領域の和は、式(6)のように定義される。  [0090] If the first coefficient in the lightness region vi and the hue region Hj is Cij, the sum of the Hk regions for calculating the index α is defined as in equation (6).
[0091] [数 1] [0091] [Equation 1]
Hk領域の和 = Σ Rik × Cik … (6)　（Σ は明度領域 i = 1〜7 にわたる和）
[0092] 従って、H1〜H4領域の和は、下記の式 (6-1)〜式 (6-4)のように表される。 Accordingly, the sums over the H1 to H4 regions are expressed by equations (6-1) to (6-4) below.
[0093] H1領域の和 = R11 × (−44.0) + R21 × (−16.0) + (中略) + R71 × (−11.3) … (6-1)
H2領域の和 = R12 × 0.0 + R22 × 8.6 + (中略) + R72 × (−11.1) … (6-2)
H3領域の和 = R13 × 0.0 + R23 × (−6.3) + (中略) + R73 × (−10.0) … (6-3)
H4領域の和 = R14 × 0.0 + R24 × (−1.8) + (中略) + R74 × (−14.6) … (6-4)
指標 αは、式 (6-1)〜(6-4)で示された H1〜H4領域の和を用いて、式 (7)のように定義される。 Index α is defined by equation (7) using the sums over the H1 to H4 regions given by equations (6-1) to (6-4).
[0094] 指標 α = H1領域の和 + H2領域の和 + H3領域の和 + H4領域の和 + 4.424 … (7)
表 3に、逆光撮影としての確度、即ち、逆光撮影時の顔領域の明度状態を定量的に示す指標 βを算出するために必要な第 2の係数を分割領域別に示す。表 3に示された各分割領域の係数は、表 1に示した各分割領域の第 1の占有率 Rijに乗算する重み係数であり、撮影条件に応じて予め設定されている。 Index α = sum of H1 region + sum of H2 region + sum of H3 region + sum of H4 region + 4.424 … (7). Table 3 shows, for each divided region, the second coefficients necessary for calculating index β, which quantitatively indicates the likelihood of backlit shooting, that is, the brightness state of the face region during backlit shooting. The coefficient of each divided region shown in Table 3 is a weighting coefficient by which the first occupancy Rij of that divided region shown in Table 1 is multiplied, and is set in advance according to the shooting conditions.
[0095] [表 3]  [0095] [Table 3]
[第 2の係数]  [Second factor]
Figure imgf000023_0001
Figure imgf000023_0001
[0096] 図 14に、明度 (v)-色相 (H)平面を示す。表 3によると、図 14において肌色色相領域の中間明度に分布する領域 (r4)から算出される占有率には負 (-)の係数が用いられ、肌色色相領域の低明度（シャドー）領域 (r3)から算出される占有率には正 (+)の係数が用いられる。図 16は、肌色領域 (H1)における第 2の係数を、明度全体に渡って連続的に変化する曲線（係数曲線）として示したものである。表 3及び図 16によると、肌色色相領域の、明度値が 85〜169 (v4)の中間明度領域の第 2の係数の符号は負 (-)であり、明度値が 26〜84 (v2, v3)の低明度（シャドー）領域の第 2の係数の符号は正 (+)であり、両領域での係数の符号が異なっていることがわかる。 FIG. 14 shows the brightness (v)-hue (H) plane. According to Table 3, a negative (-) coefficient is used for the occupancy calculated from the region (r4) distributed in the middle brightness range of the flesh color hue region in FIG. 14, and a positive (+) coefficient is used for the occupancy calculated from the low-brightness (shadow) region (r3) of the flesh color hue region. FIG. 16 shows the second coefficient in the flesh color region (H1) as a curve (coefficient curve) that changes continuously over the entire brightness range. According to Table 3 and FIG. 16, the sign of the second coefficient is negative (-) in the middle brightness region of the flesh color hue region with brightness values of 85 to 169 (v4), and positive (+) in the low-brightness (shadow) region with brightness values of 26 to 84 (v2, v3), showing that the signs of the coefficients differ between the two regions.
[0097] 明度領域 vi、色相領域 Hjにおける第 2の係数を Dijとすると、指標 βを算出するため の Hk領域の和は、式(8)のように定義される。  [0097] If the second coefficient in the lightness region vi and the hue region Hj is Dij, the sum of the Hk regions for calculating the index β is defined as in equation (8).
[0098] [数 2]  [0098] [Equation 2]
Hk領域の和 = Σ Rik × Dik … (8)
[0099] 従って、 H1〜H4領域の和は、下記の式 (8-1)〜式 (8- 4)のように表される。 [0099] Therefore, the sum of the H1 to H4 regions is expressed by the following equations (8-1) to (8-4).
[0100] H1領域の和 = R11 × (−27.0) + R21 × 4.5 + (中略) + R71 × (−24.0) … (8-1)
H2領域の和 = R12 × 0.0 + R22 × 4.7 + (中略) + R72 × (−8.5) … (8-2)
H3領域の和 = R13 × 0.0 + R23 × 0.0 + (中略) + R73 × 0.0 … (8-3)
H4領域の和 = R14 × 0.0 + R24 × (−5.1) + (中略) + R74 × 7.2 … (8-4)
指標 βは、式 (8-1)〜(8-4)で示された H1〜H4領域の和を用いて、式 (9)のように定義される。 Index β is defined by equation (9) using the sums over the H1 to H4 regions given by equations (8-1) to (8-4).
[0101] 指標 j8 = H1領域の和 + H2領域の和 + H3領域の和 + H4領域の和 + 1.554 (9)  [0101] Indicator j8 = Sum of H1 region + Sum of H2 region + Sum of H3 region + Sum of H4 region + 1.554 (9)
指標 α及び指標 βは、撮影画像データの明度と色相の分布量に基づいて算出さ れるため、撮影画像データがカラー画像である場合の撮影シーンの判別に有効であ る。  Since the index α and the index β are calculated based on the brightness and hue distribution amount of the captured image data, they are effective for determining a captured scene when the captured image data is a color image.
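The index computation of equations (8) and (9) is simply a coefficient-weighted sum over the occupancy table plus a bias. The Python sketch below shows the pattern; the function and variable names are illustrative, and since Table 3 is not reproduced in this excerpt, every coefficient not quoted in equations (8-1) to (8-4) is left at a placeholder value of 0.0.

```python
def index_beta(R, D, bias=1.554):
    """R, D: 7x4 nested lists. R[i][j] is the first occupancy rate of
    lightness region i+1 and hue region j+1; D[i][j] is the matching
    second coefficient. Implements equations (8) and (9)."""
    return sum(R[i][j] * D[i][j] for i in range(7) for j in range(4)) + bias

# Coefficients actually quoted in equations (8-1) to (8-4); all others
# are unknown here and left at the placeholder value 0.0.
D = [[0.0] * 4 for _ in range(7)]
D[0][0], D[1][0], D[6][0] = -27.0, 4.5, -24.0   # H1 terms of (8-1)
D[1][1], D[6][1] = 4.7, -8.5                    # H2 terms of (8-2)
D[1][3], D[6][3] = -5.1, 7.2                    # H4 terms of (8-4)
R = [[1.0 / 28] * 4 for _ in range(7)]          # dummy uniform occupancies
print(round(index_beta(R, D), 4))               # → -0.1674
```

The same pattern, applied to the second occupancy rates Qij with the third coefficients Eij and the constant of equation (11), yields index γ.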
[0102] Next, the second occupancy calculation process executed to calculate index γ will be described in detail with reference to the flowchart of FIG. 17.
[0103] First, the RGB values of the captured image data are converted into the HSV color system (step T20). Next, the captured image data is divided into regions defined by combinations of distance from the outer edge of the captured-image frame and lightness, and a two-dimensional histogram is created by calculating the cumulative pixel count for each divided region (step T21). The region division of the captured image data is described in detail below.
[0104] FIGS. 18(a) to (d) show the four regions n1 to n4 divided according to the distance from the outer edge of the captured-image frame. Region n1 shown in FIG. 18(a) is the outer frame; region n2 shown in FIG. 18(b) is the region inside the outer frame; region n3 shown in FIG. 18(c) is the region further inside region n2; and region n4 shown in FIG. 18(d) is the central region of the captured-image frame. The lightness is divided into the seven regions v1 to v7 as described above. Accordingly, when the captured image data is divided into regions defined by combinations of distance from the outer edge and lightness, the number of divided regions is 4 × 7 = 28.
[0105] Once the two-dimensional histogram has been created, the second occupancy, i.e. the ratio of the cumulative pixel count of each divided region to the total pixel count (the entire captured image), is calculated (step T22), and the second occupancy calculation process ends. If the second occupancy calculated for the divided region defined by the combination of lightness region vi and frame region nj is denoted Qij, the second occupancy of each divided region is expressed as shown in Table 4.
[0106] [Table 4: Second occupancy rates Qij] (The table is reproduced only as an image in the source; the individual values are not recoverable.)
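The division described in paragraphs [0103] to [0105] can be sketched as follows. This is a hypothetical Python illustration: the ring geometry and the seven lightness boundaries are stand-ins (the patent defines the actual regions via FIG. 18 and its lightness ranges), and only the 4 × 7 = 28-bin histogram structure and the normalization to occupancy rates follow the text.

```python
def second_occupancy(v, rings=4, v_bounds=(25, 84, 169, 199, 224, 229)):
    """v: 2D list of lightness values (0-255). Classify each pixel by an
    edge-distance ring n1..n4 (ring 0 = outer frame, ring 3 = center) and
    a lightness region v1..v7, then normalize counts by the total pixel
    count to obtain the second occupancy rates Q[ring][vbin]."""
    h, w = len(v), len(v[0])
    counts = [[0] * (len(v_bounds) + 1) for _ in range(rings)]
    for y in range(h):
        for x in range(w):
            border = min(x, y, w - 1 - x, h - 1 - y)   # distance to edge
            ring = min(border * 2 * rings // min(h, w), rings - 1)
            val = v[y][x]
            vbin = sum(val > b for b in v_bounds)      # lightness region
            counts[ring][vbin] += 1
    total = h * w
    return [[c / total for c in row] for row in counts]

Q = second_occupancy([[10, 200, 90], [50, 255, 120], [0, 30, 180]])
assert abs(sum(map(sum, Q)) - 1.0) < 1e-9   # occupancy rates sum to 1
```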
[0107] Next, the method of calculating index γ from the second occupancy will be described.
[0108] Table 5 shows the third coefficients, required for calculating index γ, for each divided region. Each coefficient in Table 5 is a weighting coefficient by which the second occupancy Qij of the corresponding divided region in Table 4 is multiplied, and is set in advance according to the shooting conditions.
[0109] [Table 5: Third coefficients] (The table is reproduced only as an image in the source; the individual values are not recoverable.)
[0110] FIG. 19 shows the third coefficients for frame regions n1 to n4 as curves (coefficient curves) that vary continuously over the entire lightness range.
[0111] If the third coefficient for lightness region vi and frame region nj is denoted Eij, the sum for the nk region (frame region nk) used to calculate index γ is defined by equation (10).
[0112] [Equation 3]
nk region sum = Σi (Qik × Eik)   (10)
[0113] Accordingly, the sums for the n1 to n4 regions are expressed by equations (10-1) to (10-4) below.
[0114] n1 region sum = Q11 × 40.1 + Q21 × 37.0 + ... + Q71 × 22.0   (10-1)
n2 region sum = Q12 × (-14.8) + Q22 × (-10.5) + ... + Q72 × 0.0   (10-2)
n3 region sum = Q13 × 24.6 + Q23 × 12.1 + ... + Q73 × 10.1   (10-3)
n4 region sum = Q14 × 1.5 + Q24 × (-32.9) + ... + Q74 × (-52.2)   (10-4)
Index γ is defined by equation (11) using the n1 to n4 region sums given by equations (10-1) to (10-4).
[0115] Index γ = (n1 region sum) + (n2 region sum) + (n3 region sum) + (n4 region sum) - 12.6201   (11)
Because index γ is calculated from a compositional feature of the captured image data, namely the position of the lightness distribution relative to the distance from the outer edge of the frame, it is effective for discriminating the shooting conditions of monochrome images as well as color images.
[0116] Next, the deviation amount calculation process (step T2 in FIG. 10) will be described with reference to the flowchart of FIG. 20.
[0117] First, the luminance Y (brightness) of each pixel is calculated from the RGB (Red, Green, Blue) values of the captured image data using equation (A), and the standard deviation of luminance (x1) is calculated (step T23). The standard deviation of luminance (x1) is defined by equation (12). [0118] [Equation 4]
x1 = √( Σ (pixel luminance value - average luminance value)² / total pixel count )   (12)
[0119] In equation (12), the pixel luminance value is the luminance of each pixel of the captured image data, and the average luminance value is the average of the luminance over the captured image data. The total pixel count is the number of pixels in the entire captured image data.
[0120] Next, the luminance difference value (x2) is calculated as shown in equation (13) (step T24).
[0121] Luminance difference value (x2) = (maximum luminance value - average luminance value) / 255   (13)
In equation (13), the maximum luminance value is the maximum luminance of the captured image data.
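Equations (12) and (13) can be computed directly from the pixel data, as in the sketch below. Equation (A) for the luminance Y is not reproduced in this excerpt, so the common Rec. 601 weighting is assumed here; that weighting, and the function name, are assumptions rather than the patent's definitions.

```python
import math

def deviation_x1_x2(rgb_pixels):
    """rgb_pixels: iterable of (R, G, B) tuples with 0-255 components.
    Returns the luminance standard deviation x1 (eq. (12)) and the
    normalized luminance difference x2 (eq. (13))."""
    # Assumed stand-in for equation (A): Rec. 601 luma weighting.
    ys = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in rgb_pixels]
    mean = sum(ys) / len(ys)
    x1 = math.sqrt(sum((y - mean) ** 2 for y in ys) / len(ys))  # eq. (12)
    x2 = (max(ys) - mean) / 255                                 # eq. (13)
    return x1, x2

x1, x2 = deviation_x1_x2([(255, 255, 255), (0, 0, 0), (128, 128, 128)])
```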
[0122] Next, the average luminance value (x3) of the skin-color region in the central portion of the captured-image frame is calculated (step T25), and then the average luminance value (x4) of that central portion is calculated (step T26). Here, the central portion of the frame is, for example, the region composed of regions n3 and n4 in FIG. 18.
[0123] Next, the skin-color luminance distribution value (x5) is calculated (step T27), and the deviation amount calculation process ends. If the maximum luminance value of the skin-color region of the captured image data is Yskin_max, its minimum luminance value is Yskin_min, and its average luminance value is Yskin_ave, the skin-color luminance distribution value (x5) is defined by equation (14).
[0124] x5 = (Yskin_max - Yskin_min) / 2 - Yskin_ave   (14)
Let x6 be the average luminance value of the skin-color region in the central portion of the captured-image frame, where the central portion here is, for example, the region composed of regions n2, n3, and n4 in FIG. 18. Index 1 shown in FIGS. 7 and 8 is then defined by equation (15) using index α, index γ, and x6, and index 2 is defined by equation (16) using index β, index γ, and x6.
[0125] Index 1 = 0.46 × index α + 0.61 × index γ + 0.01 × x6 - 0.79   (15)
Index 2 = 0.58 × index β + 0.18 × index γ + (-0.03) × x6 + 3.34   (16)
Here, the weighting coefficients by which the indices are multiplied in equations (15) and (16) are set in advance according to the shooting conditions.
[0126] Index 3 shown in FIGS. 7 and 8 is obtained by multiplying the deviation amounts (x1) to (x5) calculated in the deviation amount calculation process by fourth coefficients set in advance according to the shooting conditions. Table 6 shows the fourth coefficients, i.e. the weighting coefficients by which the respective deviation amounts are multiplied.
[0127] [Table 6: Fourth coefficients] (The table is reproduced only as an image in the source; the individual values are not recoverable.)
[0128] Index 3 is defined by equation (17).
[0129] Index 3 = x1 × 0.02 + x2 × 1.13 + x3 × 0.06 + x4 × (-0.01) + x5 × 0.03 - 6.49   (17)
Index 3 carries luminance-histogram distribution information in addition to the compositional features of the captured-image frame, and is therefore particularly effective for discriminating between strobe (close-range flash) shooting scenes and underexposed shooting scenes.
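Equations (15) to (17) are fixed linear combinations and transcribe directly into code. The sketch below uses only the coefficients printed in the text; how index α, index β, index γ, and the deviation amounts x1 to x6 are obtained is described in the preceding sections, and the function name is illustrative.

```python
def scene_indices(alpha, beta, gamma, x):
    """alpha, beta, gamma: indices computed from the occupancy rates.
    x: dict with keys 'x1'..'x6' holding the deviation amounts.
    Returns indices 1-3 per equations (15)-(17)."""
    index1 = 0.46 * alpha + 0.61 * gamma + 0.01 * x['x6'] - 0.79   # (15)
    index2 = 0.58 * beta + 0.18 * gamma - 0.03 * x['x6'] + 3.34    # (16)
    index3 = (x['x1'] * 0.02 + x['x2'] * 1.13 + x['x3'] * 0.06
              + x['x4'] * (-0.01) + x['x5'] * 0.03 - 6.49)         # (17)
    return index1, index2, index3

# With all deviation amounts zero, only the index terms and constants remain.
i1, i2, i3 = scene_indices(1.0, 1.0, 1.0,
                           {'x1': 0, 'x2': 0, 'x3': 0, 'x4': 0, 'x5': 0, 'x6': 0})
```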
[0130] As described above, according to the imaging device 100 of the present embodiment, reproduction auxiliary data for generating viewing-image reference data is generated on the imaging device side, attached to the scene-reference raw data that depends on the characteristics of the imaging device, and recorded on a recording medium. When the data recorded on the recording medium is output onto an output medium, the viewing-image reference data is generated using this reproduction auxiliary data, which suppresses loss of captured-image information and makes it possible to obtain viewing-image reference data of higher quality than the image data obtained by the imaging device alone.
<Modification of the imaging device>
Next, an imaging device 101 will be described, which is configured by adding a shooting information data generation unit 13 to the configuration of the imaging device 100 of FIG. 1 in order to obtain a still more preferable image at the output destination of the image data recorded on the recording medium of the imaging device. FIG. 21 shows the configuration of the imaging device 101 as a modification of the imaging device 100. In the imaging device 101 of FIG. 21, components identical to those of the imaging device 100 of FIG. 1 are given the same reference numerals, and their description is omitted.
[0131] The shooting information data generation unit 13 generates shooting information data, i.e. the shooting condition settings at the time of shooting. The shooting information data includes, for example, information directly related to the camera type (model), such as the camera name and code number; shooting condition settings such as exposure time, shutter speed, aperture value (F-number), ISO sensitivity, luminance value, subject distance range, light source, presence or absence of strobe emission, subject area, white balance, zoom magnification, subject composition, shooting scene, amount of reflected strobe light, and shooting saturation; and information on the type of subject.
[0132] FIG. 22 shows the data structure of a data file recorded on the recording medium of the storage device 9 of the imaging device 101. The imaging device characteristic correction information, viewing-image reference data restoration information, and processing process reproduction information generated by the reproduction auxiliary data generation unit 12, together with the shooting information data generated by the shooting information data generation unit 13, are attached by the header information processing unit 8 to the file header of the scene-reference raw data as reproduction auxiliary data to create an attached data file, and this attached data file is recorded on the recording medium of the storage device 9.
[0133] Next, the operation of the imaging device 101 will be described.
[0134] The image data recording process executed in the imaging device 101 will be described with reference to the flowchart of FIG. 23.
[0135] When the shooting mode is designated via the operation unit 14, preliminary imaging is performed (step S20), viewing-image reference data for displaying the image obtained by the preliminary imaging (hereinafter, the preliminary captured image) on the display unit 15 is formed (step S21), and the formed viewing-image reference data is displayed on the display unit 15 (step S22). In step S20, the preliminary imaging may instead be performed when the shutter button of the operation unit 14 is pressed halfway.
[0136] Next, in the processing process reproduction information generation unit 123, shooting-condition validity values, such as the camera-shake level and indices 1 to 3 resulting from the scene discrimination process, are calculated, and the validity of the shooting conditions is judged from the calculation results (step S23). The shooting conditions are then adjusted so that the shooting-condition validity values increase (step S24). The processing of steps S20 to S24 is repeated until the shutter button of the operation unit 14 is pressed and the main shooting is instructed, and history information of the shooting conditions from a predetermined time before the main shooting is generated as processing process reproduction information.
[0137] When the shutter button of the operation unit 14 is pressed and the main shooting is instructed (step S25; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate scene-reference raw data (step S26). At the same time, the reproduction auxiliary data generation unit 12 generates imaging device characteristic correction information, viewing-image reference data restoration information, and processing process reproduction information as reproduction auxiliary data (step S27), and the shooting information data generation unit 13 generates shooting information data (step S28). The processing process reproduction information generated in step S27 is the pre-shooting history information generated in steps S20 to S24 with the information at the time of the main shooting added. A step of creating viewing-image reference data for recording on the recording medium may be added after the main shooting.
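The data file of FIG. 22 pairs the scene-reference raw data with header tag information. The following Python sketch is only an illustration of that structure; all field names and value formats are hypothetical, since the patent specifies the header contents (device characteristic correction information, viewing-image reference data restoration information, processing process reproduction information, and shooting information data) but not a concrete encoding.

```python
def build_attached_file(scene_raw: bytes, aux: dict, shooting: dict) -> dict:
    """Assemble an attached data file in the spirit of Fig. 22:
    scene-reference raw data plus reproduction auxiliary data and
    shooting information data attached as header tag information."""
    return {
        "header": {
            "device_characteristic_correction": aux["device_correction"],
            "viewing_image_restoration": aux["restoration"],
            "process_reproduction": aux["process_history"],
            "shooting_information": shooting,
        },
        "scene_reference_raw_data": scene_raw,   # unrendered sensor data
    }

# Illustrative contents only; real devices would carry richer metadata.
aux = {
    "device_correction": {"sensor_gain": 1.2},                     # hypothetical
    "restoration": {"target_gamma": 2.2},                          # hypothetical
    "process_history": [{"shutter": "1/60"}, {"shutter": "1/125"}],
}
shooting = {"exposure_time": "1/125", "f_number": 2.8, "iso": 100}
f = build_attached_file(b"rawdata", aux, shooting)
```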
[0138] Next, in the header information processing unit 8, the reproduction auxiliary data generated in step S27 and the shooting information data generated in step S28 are attached as tag information to the file header of the scene-reference raw data generated in step S26 (step S29), and an attached data file (see FIG. 22) is created (step S30). This attached data file is recorded and saved on the recording medium of the storage device 9 (step S31), and the image data recording process ends.
[0139] As described above, according to the imaging device 101 shown in FIG. 21, by attaching the shooting information data, i.e. the shooting condition settings at the time of shooting, to the file header of the scene-reference raw data in addition to the reproduction auxiliary data and recording it on the recording medium, viewing-image reference data suited to the shooting situation can be generated when the data recorded on the recording medium is output onto an output medium.
[0140] The recording medium on which the attached data file is recorded is removed from the imaging device body and mounted in an external device such as an image processing device or an image recording device, where image processing optimized for forming a viewing image on an output medium is applied to generate viewing-image reference data.
[0141] Next, an image recording device that applies image processing optimized for forming a viewing image on an output medium to the image data recorded on the recording medium will be described.
<Configuration of the image recording device>
FIG. 24 shows the external configuration of the image recording apparatus 201 according to the embodiment of the present invention.
[0142] The image recording apparatus 201 has a magazine loading unit 203 on one side of a main body 202. Inside the main body 202 are an exposure processing unit 204, which exposes silver-halide photographic paper serving as the output medium, and a print creation unit 205, which develops and dries the exposed paper to create prints. The prints created by the print creation unit 205 are discharged to a tray 206 provided on the other side of the main body 202. A control unit 207 that controls the units constituting the image recording apparatus 201 is also provided inside the main body 202.
[0143] A display unit 208, a film scanner unit 209 serving as a transparent-original reading device, a reflective-original input device 210, and an operation unit 211 are arranged on the upper part of the main body 202. The main body 202 is further provided with an image reading unit 214 capable of reading image data recorded on various recording media, and an image writing unit 215 that writes image data to various recording media.
[0144] Photographic photosensitive materials are among the originals read by the film scanner unit 209 and the reflective-original input device 210. Examples of such photosensitive materials include color negative film, color reversal film, black-and-white negative film, and black-and-white reversal film, on which frame image information captured by an analog camera is recorded. The film scanner unit 209 converts the frame image information recorded on the photographic photosensitive material into digital image data to obtain frame image data. When the photographic photosensitive material is color paper, i.e. silver-halide photographic paper, the reflective-original input device 210 converts the frame image information recorded on the paper into frame image data using a flatbed scanner.
[0145] The image reading unit 214 includes a PC card adapter 214a and a floppy (registered trademark) disk adapter 214b, into which a PC card 213a and a floppy disk 213b, respectively, can be inserted. The PC card 213a has, for example, a memory in which a plurality of frame image data captured by a digital camera are stored. On the floppy disk 213b, for example, a plurality of frame image data captured by a digital camera are recorded. Recording media on which frame image data can be recorded, other than the PC card 213a and the floppy disk 213b, include, for example, MultiMediaCards (registered trademark), Memory Sticks (registered trademark), MD data, and CD-ROMs.
[0146] The image writing unit 215 includes a floppy (registered trademark) disk adapter 215a, an MO adapter 215b, and an optical disk adapter 215c, into which a floppy disk 216a, an MO 216b, and an optical disk 216c, respectively, can be inserted. Examples of the optical disk 216c include CD-R and DVD-R.
[0147] Although FIG. 24 shows a structure in which the operation unit 211, the display unit 208, the film scanner unit 209, the reflective-original input device 210, and the image reading unit 214 are provided integrally with the main body 202, any one or more of them may be provided as separate units.
[0148] The image recording apparatus 201 shown in FIG. 24 creates prints by exposing and developing a photosensitive material, but the print creation method is not limited to this; for example, an inkjet, electrophotographic, thermal, or dye-sublimation method may be used.
<Internal configuration of the image recording device>
FIG. 25 shows the internal configuration of the image recording apparatus 201. As shown in FIG. 25, the image recording apparatus 201 comprises the control unit 207, the exposure processing unit 204, the print creation unit 205, the film scanner unit 209, the reflective-original input device 210, the image reading unit 214, the image writing unit 215, data storage means 271, template storage means 272, the operation unit 211, and the display unit 208.
[0149] The control unit 207 is constituted by a microcomputer and controls the operation of the units constituting the image recording apparatus 201 through the cooperation of various control programs stored in a storage section (not shown) such as a ROM (Read Only Memory) and a CPU (Central Processing Unit) (not shown).
[0150] The control unit 207 also has an image processing unit 270. Based on input signals from the information input means 212 of the operation unit 211, it applies image processing to image data read by the film scanner unit 209 or the reflective-original input device 210, image data read by the image reading unit 214, or image data input from an external device via communication means (not shown) to form exposure image information, which it outputs to the exposure processing unit 204. The image processing unit 270 also applies conversion processing corresponding to the output form to the processed image data and outputs the result to a designated output destination, such as the display unit 208, the image writing unit 215, or the communication means.
[0151] The exposure processing unit 204 exposes an image onto the photosensitive material and outputs the photosensitive material to the print creation unit 205. The print creation unit 205 develops and dries the exposed photosensitive material to create prints P1, P2, and P3. The print P1 is a print of service size, high-vision size, panorama size, or the like; the print P2 is an A4-size print; and the print P3 is a business-card-size print. [0152] The film scanner unit 209 reads frame images recorded on transparent originals, such as developed negative film or reversal film captured by an analog camera, and acquires digital image signals of the frame images. The reflective-original input device 210 reads images on prints P (photographic prints, documents, and various printed materials) with a flatbed scanner and acquires digital image signals.
[0153] The operation unit 211 is provided with information input means 212. The information input means 212 comprises, for example, a touch panel and outputs its operation signals to the control unit 207. The operation unit 211 may also be configured with a keyboard and a mouse.
[0154] The image reading unit 214 reads the frame image information recorded on the PC card 213a or the floppy (registered trademark) disk 213b and transfers it to the control unit 207. The image reading unit 214 has, as image transfer means 230, the PC card adapter 214a, the floppy disk adapter 214b, and the like; it reads the frame image information recorded on the PC card 213a inserted into the PC card adapter 214a or on the floppy disk 213b inserted into the floppy disk adapter 214b and transfers it to the control unit 207. As the PC card adapter 214a, for example, a PC card reader or a PC card slot is used.
[0155] The image writing unit 215 includes, as image conveyance means 231, the floppy (registered trademark) disk adapter 215a, the MO adapter 215b, and the optical disk adapter 215c. In accordance with a write signal input from the control unit 207, the image writing unit 215 writes the image data generated by the image processing method of the present invention to the floppy disk 216a inserted into the floppy disk adapter 215a, the MO 216b inserted into the MO adapter 215b, or the optical disk 216c inserted into the optical disk adapter 215c.
[0156] The data storage means 271 stores image information and the corresponding order information (information on how many prints are to be created from which frame images, print size information, and the like) and accumulates them sequentially.
[0157] The template storage means 272 stores sample image data (data representing background images, illustration images, and the like) corresponding to the sample identification information D1, D2, and D3, and also stores at least one set of template data that defines a compositing area with the sample image data. When a predetermined template is selected from the plurality of templates stored in advance in the template storage means 272 by an operator's operation (which is based on the client's instructions), the control unit 207 combines the frame image information with the selected template. Then, when sample identification information D1, D2, or D3 is designated by an operator's operation (likewise based on the client's instructions), the control unit 207 selects sample image data on the basis of the designated sample identification information D1, D2, or D3, combines the selected sample image data with the image data and/or character data ordered by the client, and as a result creates a print based on the sample image data desired by the client. This template compositing is performed by the well-known chroma-key method.
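The chroma-key compositing mentioned above can be sketched as follows. The function name, the key colour, and the nested-list pixel representation are illustrative assumptions, not details from the specification.

```python
# Minimal sketch of chroma-key compositing: wherever the template shows the
# key colour, the corresponding pixel of the ordered image is shown instead.
# The key colour and all names here are illustrative assumptions.

def chroma_key_composite(template, image, key=(0, 255, 0)):
    """Overlay `image` onto `template` wherever `template` holds the key colour.

    Both inputs are equal-sized lists of rows of (R, G, B) tuples.
    """
    out = []
    for t_row, i_row in zip(template, image):
        out.append([i_px if t_px == key else t_px
                    for t_px, i_px in zip(t_row, i_row)])
    return out

# A 2x2 template whose key-coloured pixels expose the frame image underneath.
template = [[(0, 255, 0), (10, 10, 10)],
            [(20, 20, 20), (0, 255, 0)]]
frame = [[(200, 100, 50), (200, 100, 50)],
         [(200, 100, 50), (200, 100, 50)]]
composed = chroma_key_composite(template, frame)
```

A real implementation would operate on pixel buffers and tolerate a range of key colours, but the selection logic is the same.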
[0158] The sample identification information is not limited to the three types D1, D2, and D3; there may be more or fewer than three types.
[0159] The sample identification information D1, D2, and D3 for designating a print sample is configured to be input from the operation unit 211. Alternatively, since the sample identification information D1, D2, and D3 is recorded on the print sample or on the order sheet, it can be read by reading means such as an OCR (Optical Character Reader), or it can be entered by the operator from a keyboard.
[0160] In this way, sample image data is recorded in correspondence with the sample identification information D1 that designates a print sample; when the sample identification information D1 is input, sample image data is selected on the basis of the input D1, and the selected sample image data is combined with the order-based image data and/or character data to create a print based on the designated sample. The user can therefore order prints while actually holding full-size samples of various kinds, so that the diverse demands of a wide range of users can be met.
[0161] Furthermore, first sample identification information D2 designating a first sample and the image data of the first sample are stored, and second sample identification information D3 designating a second sample and the image data of the second sample are stored. Sample image data selected on the basis of the designated first and second sample identification information D2 and D3 is combined with the order-based image data and/or character data to create a print based on the designated samples, so that an even wider variety of images can be composited and prints meeting the diverse demands of an even wider range of users can be created.
[0162] The display unit 208 is composed of a display such as a CRT or an LCD, and performs display processing in accordance with display control signals input from the control unit 207.
[0163] Using communication means (not shown), the image processing unit 270 of the control unit 207 can also receive image data representing captured images, together with work orders such as print instructions, from another computer in the facility where the image recording apparatus 201 is installed or from a distant computer via a communication network such as the Internet, and can carry out image processing and create prints by remote operation.
[0164] Also, using communication means (not shown), the image processing unit 270 can send image data representing a captured image that has undergone the image processing of the present invention, together with the accompanying order information, to another computer in the facility or to a distant computer via the Internet or the like.
[0165] As described above, the image recording apparatus 201 has: input means for taking in images from various recording media and image information obtained by divisional photometry of an image original; image processing means for acquiring or estimating, for the input image taken in from the input means, information such as "the size of the output image" and "the size of the main subject in the output image", and processing the image so that it gives a favorable impression when viewed on the output medium; image output means for displaying the processed image, printing it out, or writing it to a recording medium; and communication means for transmitting the image data and accompanying order information to another computer in the facility via a communication line, or to a distant computer via the Internet or the like.
[0166] Fig. 26 shows the internal configuration when the image recording apparatus 201 is divided into an image processing apparatus 301 and an output unit 302 that outputs the image data processed by the image processing apparatus 301.
[0167] As shown in Fig. 26, the image processing apparatus 301 is composed of an input unit 303, a header information analysis unit 304, an imaging device characteristic correction processing unit 305, a viewing image reference data restoration condition generation unit 306, a processing process reproduction unit 307, and an optimization processing unit 308.
[0168] The input unit 303 includes the image reading unit 214 of Fig. 25 and has a mounting section in which a recording medium is mounted. When a recording medium is mounted in the mounting section, the input unit 303 reads the data file recorded on the recording medium and outputs it to the header information analysis unit 304. In the present embodiment, the input unit 303 is described as reading the data file from the mounted recording medium, but it may instead be provided with wired or wireless communication means and input the data file via those communication means.
[0169] The header information analysis unit 304 analyzes the data file input from the input unit 303 and divides it into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, viewing image reference data restoration information, and processing process reproduction information), and shooting information data. It then outputs the scene reference raw data to the scene reference image data generation unit 311 in the imaging device characteristic correction processing unit 305, the imaging device characteristic correction information to the device characteristic correction processing unit 309, the viewing image reference data restoration information to the viewing image reference data restoration condition generation unit 306, the processing process reproduction information to the processing process reproduction unit 307, and the shooting information data to the shooting information data processing unit 313 in the optimization processing unit 308.
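The splitting performed by the header information analysis unit 304 can be illustrated with a small sketch. Every field name below is an assumption made for illustration only; the specification does not define a concrete file layout.

```python
# Illustrative sketch of the header-analysis step: one data file is split into
# scene reference raw data, the three kinds of reproduction auxiliary data,
# and (optionally) shooting information data. All key names are hypothetical.

def analyze_header(data_file):
    """Split a parsed data file into the pieces routed to each processing unit."""
    header = data_file["header"]
    return {
        "scene_raw": data_file["scene_reference_raw"],
        "device_correction": header.get("imaging_device_correction"),
        "restore_info": header.get("viewing_image_restore_info"),
        "process_info": header.get("processing_process_info"),
        # Shooting information data may be absent (the Fig. 3 file format).
        "shooting_info": header.get("shooting_info"),
    }

sample = {
    "scene_reference_raw": b"\x00\x01raw-sensor-bytes",
    "header": {
        "imaging_device_correction": {"spectral_matrix": "id-1"},
        "viewing_image_restore_info": {"gamma": 2.2},
        "processing_process_info": {"scene": "portrait"},
    },
}
parts = analyze_header(sample)
```

Each entry of `parts` corresponds to one arrow out of the header information analysis unit 304 in Fig. 26.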
[0170] The imaging device characteristic correction processing unit 305 is composed of a device characteristic correction processing unit 309, a processing condition table 310, a scene reference image data generation unit 311, and a temporary storage memory 312.
[0171] The device characteristic correction processing unit 309 determines the generation conditions for the scene reference image data by referring to the imaging device characteristic correction information input from the header information analysis unit 304 and to the processing condition table 310. The processing condition table 310 is a table that stores, for each set of imaging device characteristics, the processing conditions for generating scene reference image data.
[0172] The scene reference image data generation unit 311 applies imaging device characteristic correction processing to the scene reference raw data input from the header information analysis unit 304 in accordance with the generation conditions determined by the device characteristic correction processing unit 309, thereby generating standardized scene reference image data that does not depend on the characteristics of the imaging device, and outputs it to the temporary storage memory 312. Specifically, the imaging device characteristic correction processing includes at least a process of mapping the signal strength of each color channel, which reflects the spectral sensitivity specific to the image sensor of the imaging device that generated the scene reference raw data, into a standard color space such as RIMM RGB, ERIMM RGB, or scRGB.

[0173] The temporary storage memory 312 temporarily stores the scene reference image data generated by the scene reference image data generation unit 311.
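The colour-space mapping in paragraph [0172] is, at its core, a linear transform of the device-specific channel signals. The sketch below shows the shape of that transform; the matrix coefficients are placeholders, not the actual coefficients for RIMM RGB, ERIMM RGB, or scRGB, which would come from the processing condition table for the particular sensor.

```python
# Minimal sketch of imaging-device characteristic correction: device RGB
# signal strengths are mapped into a standardized colour space via a 3x3
# matrix. The matrix values below are illustrative placeholders only.

DEVICE_TO_STANDARD = [
    [0.90, 0.10, 0.00],
    [0.05, 0.90, 0.05],
    [0.00, 0.10, 0.90],
]

def to_standard_rgb(pixel, matrix=DEVICE_TO_STANDARD):
    """Map one (R, G, B) device sample into the standard colour space."""
    r, g, b = pixel
    return tuple(round(row[0] * r + row[1] * g + row[2] * b, 6)
                 for row in matrix)

std = to_standard_rgb((100.0, 50.0, 25.0))
```

Applying `to_standard_rgb` to every pixel of the scene reference raw data yields data in the standardized scene-referred space, independent of the sensor's spectral sensitivity.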
[0174] The viewing image reference data restoration condition generation unit 306 determines, on the basis of the viewing image reference data restoration information input from the header information analysis unit 304, the restoration conditions for restoring the viewing image reference data produced by the imaging device.
[0175] The processing process reproduction unit 307 determines, on the basis of the processing process reproduction information input from the header information analysis unit 304, the reproduction conditions for reproducing the process by which the viewing image reference data was generated in the imaging device.
[0176] The optimization processing unit 308 is composed of a shooting information data processing unit 313, a viewing image reference data generation unit 314, a temporary storage memory 315, and a setting input unit 316.
[0177] The shooting information data processing unit 313 determines, on the basis of the shooting information data input from the header information analysis unit 304, the generation conditions for generating viewing image reference data suited to the shooting conditions.
[0178] The viewing image reference data generation unit 314 reads the scene reference image data from the temporary storage memory 312 and, on the basis of the generation conditions for viewing image reference data determined by the shooting information data processing unit 313, the restoration conditions determined by the viewing image reference data restoration condition generation unit 306, the reproduction conditions determined by the processing process reproduction unit 307, and the information on the output destination (storage device 318, output device 317, or display unit 208) designated from the setting input unit 316, applies optimization processing to the scene reference image data so as to obtain an optimal image at the output destination, thereby generating viewing image reference data, which it outputs to the temporary storage memory 315 together with the operation information from the setting input unit 316. The optimization processing includes, for example, compression into the color gamut of the output destination, gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and processing adapted to the output characteristics (LUT) of the output device or display device. It further includes image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, and dodging processing.
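One of the optimization steps listed above, gradation compression from 16 bits to 8 bits, can be sketched as follows. A simple linear mapping is shown for illustration; an actual implementation would apply the output device's LUT rather than a straight line.

```python
# Sketch of 16-bit to 8-bit gradation compression for an output device.
# A linear mapping is assumed here purely for illustration; a real device
# would use its own output characteristic (LUT) instead.

def compress_16_to_8(value16):
    """Map a 16-bit sample (0..65535) linearly onto 8 bits (0..255)."""
    if not 0 <= value16 <= 65535:
        raise ValueError("expected a 16-bit sample")
    return value16 * 255 // 65535

lo = compress_16_to_8(0)
mid = compress_16_to_8(32768)
hi = compress_16_to_8(65535)
```

The same per-sample structure applies to the other listed operations (gamut compression, pixel-count reduction), each with its own mapping.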
[0179] The temporary storage memory 315 temporarily stores the viewing image reference data generated by the viewing image reference data generation unit 314 and outputs it to the output destination (the storage device 318, the output device 317, or the display unit 208) set through the setting input unit 316.
[0180] The setting input unit 316 is an input device for designating the output destination of the viewing image reference data generated by the viewing image reference data generation unit 314, and corresponds to the operation unit 211 in Fig. 25.
[0181] Of the components of the image processing apparatus 301, all components other than the input unit 303 and the setting input unit 316 are included in the image processing unit 270 shown in Fig. 25.
[0182] The output unit 302 is composed of the display unit 208, an output device 317 corresponding to the exposure processing unit 204 and the print creation unit 205 in Fig. 25, and a storage device 318 corresponding to the image writing unit 215 in Fig. 25.
[0183] Next, the operation of the image recording apparatus 201 will be described.
[0184] The image processing executed in the image recording apparatus 201 will be described with reference to Fig. 27.
[0185] When data is input to the input unit 303 (that is, when a recording medium is mounted in the mounting section) (step S40), the data file recorded on the recording medium is read out, and the header information analysis unit 304 analyzes the contents of the data file (step S41) and divides it into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, viewing image reference data restoration information, and processing process reproduction information), and shooting information data.
[0186] On the basis of the shooting information data, the shooting information data processing unit 313 determines the generation conditions for generating viewing image reference data suited to the shooting conditions (step S42); on the basis of the viewing image reference data restoration information, the viewing image reference data restoration condition generation unit 306 determines the restoration conditions for restoring the viewing image reference data produced by the imaging device (step S44); and on the basis of the processing process reproduction information, the processing process reproduction unit 307 determines the reproduction conditions for reproducing the process by which the viewing image reference data was generated in the imaging device (step S45).
[0187] The device characteristic correction processing unit 309 determines the generation conditions for the scene reference image data by referring to the imaging device characteristic correction information and the processing condition table 310; the scene reference raw data is subjected to imaging device characteristic correction processing in accordance with the determined generation conditions (step S43), whereby scene reference image data is generated (step S46).
[0188] Next, on the basis of the various image processing conditions determined in steps S42, S44, and S45 (the generation conditions, restoration conditions, and reproduction conditions for the viewing image reference data), optimization processing is applied to the scene reference image data generated in step S46 (step S47), and viewing image reference data is generated (step S48).
[0189] The viewing image reference data is then subjected to processing specific to the output destination set through the setting input unit 316 (operation unit 211) (step S49) and is output from the device at that output destination (step S50), whereupon this image processing ends.
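The flow of steps S40 to S50 can be sketched as a single pipeline. Every helper operation below is a trivial stand-in for the corresponding unit of Fig. 26, and all names and numeric values are illustrative assumptions.

```python
# The Fig. 27 flow (steps S40-S50) sketched end to end. Each stage is a
# deliberately simplified stand-in; all field names are hypothetical.

def image_processing_pipeline(data_file, output_target="display"):
    # S41: split the file into raw data, auxiliary data, and shooting info.
    raw = data_file["raw"]
    aux = data_file.get("aux", {})
    shooting = data_file.get("shooting")

    # S42: generation conditions from shooting info (skipped when absent).
    gen_cond = {"scene": shooting["scene"]} if shooting else {}
    # S44 / S45: restoration and reproduction conditions from auxiliary data.
    restore_cond = aux.get("restore", {})
    reproduce_cond = aux.get("process", {})

    # S43 / S46: device characteristic correction -> scene reference image data
    # (modelled here as a single per-sample gain).
    scene_ref = [v * aux.get("gain", 1.0) for v in raw]

    # S47 / S48: optimization toward the chosen output target
    # (modelled here as clipping into the target's range).
    conditions = {**gen_cond, **restore_cond, **reproduce_cond,
                  "target": output_target}
    viewing = [min(v, 255.0) for v in scene_ref]

    # S49 / S50: output-specific processing, then hand-off to the device.
    return {"pixels": viewing, "conditions": conditions}

result = image_processing_pipeline(
    {"raw": [100.0, 200.0], "aux": {"gain": 1.5}}, "printer")
```

Note that, as in paragraph [0190], the step-S42 branch simply contributes nothing when the file carries no shooting information data.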
[0190] The flowchart of Fig. 27 describes the image processing for a data file containing shooting information data as shown in Fig. 22; for a data file not containing shooting information data, as shown in Fig. 3, the processing of step S42 in Fig. 27 is not performed.
[0191] As described above, according to the image recording apparatus 201 of the present embodiment, by generating viewing image reference data optimized on the basis of the reproduction auxiliary data and shooting information data attached to the file header of the scene reference raw data output from the imaging device, it is possible to obtain viewing image reference data of higher quality than the image data obtained by the imaging device alone, without any loss of the captured image information.
[0192] Even if the reproduction auxiliary data does not contain processing process reproduction information, viewing image reference data of sufficiently high quality can be obtained; when the processing process reproduction information is included, however, viewing image reference data of even higher quality can be obtained.

Claims

[1] An imaging apparatus comprising:
scene reference raw data generation means for generating, by imaging, scene reference raw data that depends on the characteristics of the imaging apparatus;
reproduction auxiliary data generation means for generating reproduction auxiliary data used when the scene reference raw data is subjected to image processing optimized for forming a viewing image on an output medium so as to generate viewing image reference data; and
recording control means for attaching the reproduction auxiliary data to the scene reference raw data and recording them on a recording medium.
[2] The imaging apparatus according to claim 1, further comprising:
image processing means for applying image processing to the scene reference raw data using the reproduction auxiliary data to create the viewing image reference data; and
image forming means for forming a viewing image on an output medium on the basis of the viewing image reference data created by the image processing means.
[3] The imaging apparatus according to claim 2, wherein the image processing means standardizes the scene reference raw data to create scene reference data and applies image processing to the scene reference data using viewing image reference data restoration information to generate the viewing image reference data, and
the reproduction auxiliary data includes the viewing image reference data restoration information.
[4] The imaging apparatus according to claim 2, further comprising shooting condition adjustment means for performing preliminary shooting before the actual shooting and adjusting the shooting conditions in accordance with the result of the preliminary shooting,
wherein the reproduction auxiliary data includes processing process reproduction information indicating the history of the shooting conditions adjusted by the shooting condition adjustment means before the actual shooting.
[5] The imaging apparatus according to claim 4, wherein the processing process reproduction information includes an index value for judging the appropriateness of the shooting conditions.
[6] The imaging apparatus according to claim 5, wherein the index value includes a value specifying at least one of user characteristics, light source conditions, and exposure conditions at the time of shooting.
[7] The imaging apparatus according to claim 6, wherein the light source conditions and exposure conditions are the results of a shooting scene discrimination process performed at the time of shooting.
[8] The imaging apparatus according to any one of claims 1 to 7, further comprising shooting information data generation means for generating shooting information data indicating the shooting condition settings at the time of shooting,
wherein the recording control means attaches the shooting information data to the scene reference raw data and records them on a recording medium.
[9] An image processing apparatus comprising:
input means for inputting scene reference raw data that depends on the characteristics of an imaging apparatus, together with reproduction auxiliary data used when the scene reference raw data is subjected to image processing optimized for forming a viewing image on an output medium so as to generate viewing image reference data; and
viewing image reference data generation means for generating viewing image reference data by applying optimization processing to the input scene reference raw data on the basis of the input reproduction auxiliary data.
[10] The image processing apparatus according to claim 9, wherein the reproduction auxiliary data includes viewing image reference data restoration information for restoring the viewing image reference data produced by the imaging apparatus when viewing image reference data for the output medium is generated.
[11] The image processing apparatus according to claim 9, wherein the reproduction auxiliary data includes processing process reproduction information for reproducing the process by which the viewing image reference data was generated in the imaging apparatus when viewing image reference data for the output medium is generated.
[12] The image processing apparatus according to claim 11, wherein the processing process reproduction information includes an index value for judging the appropriateness of the shooting conditions in the imaging apparatus.
[13] The image processing apparatus according to claim 12, wherein the index value includes a value specifying at least one of user characteristics, light source conditions, and exposure conditions at the time of shooting by the imaging apparatus.
[14] The image processing apparatus according to claim 13, wherein the light source conditions and exposure conditions are the results of a shooting scene discrimination process performed at the time of shooting.
[15] The image processing apparatus according to any one of claims 9 to 14, wherein the input means inputs shooting condition data indicating the shooting condition settings at the time of shooting, and the viewing image reference data generation means generates viewing image reference data by applying optimization processing to the input scene reference raw data on the basis of the input reproduction auxiliary data and the shooting condition data.
[16] An image recording apparatus comprising:
the image processing apparatus according to any one of claims 9 to 15; and
image forming means for forming a viewing image on an output medium using the viewing image reference data generated by the viewing image reference data generation means of the image processing apparatus.
PCT/JP2005/023007 2005-01-20 2005-12-15 Imaging device, image processing device, and image recording device WO2006077703A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-013274 2005-01-20
JP2005013274A JP2006203573A (en) 2005-01-20 2005-01-20 Imaging apparatus, image processor, and image recording apparatus

Publications (1)

Publication Number Publication Date
WO2006077703A1 (en) 2006-07-27

Family

ID=36692097

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/023007 WO2006077703A1 (en) 2005-01-20 2005-12-15 Imaging device, image processing device, and image recording device

Country Status (2)

Country Link
JP (1) JP2006203573A (en)
WO (1) WO2006077703A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5610860B2 (en) * 2009-08-07 2014-10-22 キヤノン株式会社 Imaging apparatus and information processing apparatus
JP5550333B2 (en) * 2009-12-28 2014-07-16 キヤノン株式会社 Imaging apparatus, development method, and program
JP6409938B2 (en) * 2017-10-13 2018-10-24 ソニー株式会社 Imaging apparatus and image processing method

Citations (3)

Publication number Priority date Publication date Assignee Title
JPH11127415A (en) * 1997-10-24 1999-05-11 Nikon Corp Electronic camera, method for controlling electronic camera and storage medium
JP2001251551A (en) * 2000-03-08 2001-09-14 Fuji Photo Film Co Ltd Electronic camera
JP2004096500A (en) * 2002-08-30 2004-03-25 Konica Minolta Holdings Inc Image pickup apparatus, image processing apparatus, and image recording apparatus


Also Published As

Publication number Publication date
JP2006203573A (en) 2006-08-03

WO2006033234A1 (en) Image processing method, image processing device, imaging device, and image processing program
JP4292873B2 (en) Image processing method, image processing apparatus, and image recording apparatus
JP2006203567A (en) Imaging apparatus, image processing apparatus, and image recording apparatus

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 05816842

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 5816842

Country of ref document: EP