WO2006077703A1 - Imaging device, image processing device and image recording device - Google Patents

Imaging device, image processing device and image recording device

Info

Publication number
WO2006077703A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
shooting
information
scene
Prior art date
Application number
PCT/JP2005/023007
Other languages
English (en)
Japanese (ja)
Inventor
Hiroaki Takano
Takeshi Nakajima
Daisuke Sato
Tsukasa Ito
Original Assignee
Konica Minolta Photo Imaging, Inc.
Priority date
Filing date
Publication date
Application filed by Konica Minolta Photo Imaging, Inc.
Publication of WO2006077703A1

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • H04N9/8047Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction using transform coding

Definitions

  • Imaging apparatus, image processing apparatus, and image recording apparatus
  • The present invention relates to an imaging apparatus such as a digital camera, and to an image processing apparatus and an image recording apparatus that perform, on captured image data obtained by the imaging apparatus, an optimization process for forming an appreciation image on an output medium.
  • Captured image data obtained by an imaging device has been recorded on a recording medium such as a CD-R (Compact Disc Recordable), a floppy (registered trademark) disk, or a memory card, or distributed via communication networks such as the Internet. There are various output methods: displaying the data on a display device such as a CRT (Cathode Ray Tube), liquid crystal, or plasma display, or on the small liquid crystal monitor of a mobile phone, and printing it as a hard copy image using an output device such as a digital printer, inkjet printer, or thermal printer.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2004-96500
  • An object of the present invention is to suppress information loss of captured image information and to enable the image processing apparatus or image recording apparatus side to generate appreciation image reference data of higher quality than the image data obtained by the imaging device.
  • An imaging apparatus comprising: scene reference raw data generation means for generating, by imaging, scene reference raw data dependent on the characteristics of the imaging device; reproduction auxiliary data generation means for generating reproduction auxiliary data used when performing image processing that optimizes the scene reference raw data for forming an appreciation image on an output medium to generate appreciation image reference data; and recording control means for attaching the reproduction auxiliary data to the scene reference raw data and recording them on a recording medium.
  • The imaging apparatus according to 1, further comprising: image processing means for performing image processing on the scene reference raw data using the reproduction auxiliary data to create the appreciation image reference data; and image forming means for forming an appreciation image on an output medium based on the appreciation image reference data created by the image processing means.
  • The imaging apparatus according to 2, wherein the image processing means standardizes the scene reference raw data to create scene reference data and generates the appreciation image reference data by performing image processing on the scene reference data using appreciation image reference data restoration information, and the reproduction auxiliary data includes the appreciation image reference data restoration information.
  • The imaging apparatus further comprises a shooting condition adjusting unit, wherein the reproduction auxiliary data includes processing process reproduction information indicating a history of the shooting conditions adjusted by the shooting condition adjusting unit before the main shooting.
  • The processing process reproduction information includes an index value for determining the validity of the shooting conditions.
  • The index value includes a value that specifies at least one of user characteristics, light source conditions, and exposure conditions at the time of shooting.
  • Shooting information data generation means for generating shooting information data indicating the shooting condition settings at the time of shooting is further provided, and the recording control means records the shooting information data, attached to the scene reference raw data, on the recording medium.
  • An image processing apparatus comprising: reference data generation means.
  • The reproduction auxiliary data includes appreciation image reference data restoration information for restoring, when the appreciation image reference data for the output medium is generated, the appreciation image reference data in the imaging device.
  • The reproduction auxiliary data includes processing process reproduction information for reproducing, when the appreciation image reference data for the output medium is generated, the generation process of the appreciation image reference data in the imaging device.
  • The processing process reproduction information includes an index value for determining the validity of the shooting conditions in the imaging apparatus.
  • The index values include user characteristics, light source conditions, and exposure conditions at the time of shooting with the imaging device.
  • The image processing apparatus according to any one of 9 to 14, wherein the input means inputs shooting condition data indicating the shooting condition settings at the time of shooting, and the appreciation image reference data generation means generates the appreciation image reference data by performing, on the input scene reference raw data, an optimization process based on the input reproduction auxiliary data and the shooting condition data.
  • An image recording apparatus comprising: an image forming unit.
  • The term "generation" means that a program or processing circuit operating in the imaging apparatus, image processing apparatus, or image recording apparatus according to the present invention newly produces an image signal or data.
  • "Create" may be used as a synonym.
  • The "imaging device" refers to a device including an imaging element (image sensor) having a photoelectric conversion function, and includes so-called digital cameras and scanners.
  • Examples of image sensors include the CCD (Charge Coupled Device) image sensor, which obtains color sensitivity by combining a charge transfer mechanism with a checkerboard-pattern color filter, and the CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • The output current of these image sensors is digitized by A/D conversion.
  • the contents of each color channel at this stage are signal intensities based on the spectral sensitivity unique to the image sensor.
  • "Scene reference raw data dependent on the characteristics of the imaging device" means the direct raw output signal of the imaging device, recording information faithful to the subject: data digitized by the A/D converter and subjected to noise correction such as fixed-pattern noise and dark-current noise correction. In this scene reference raw data, image processing that modifies the data content to improve the effect at the time of image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, is omitted, as is the process of mapping the signal intensity of each color channel to a standardized color space such as RIMM RGB or sRGB.
  • The information amount (for example, the number of gradations) of the scene reference raw data preferably conforms to the performance of the A/D converter and is equal to or greater than the information amount (for example, the number of gradations) required for the appreciation image reference data.
  • The number of gradations of the scene reference raw data is preferably 12 bits or more, more preferably 14 bits or more, and even more preferably 16 bits or more.
  • The "output medium" is, for example, a display device such as a CRT (Cathode Ray Tube), LCD (Liquid Crystal Display), or plasma display, or paper for generating a hard copy image, such as silver halide photographic paper, inkjet paper, or thermal printer paper.
  • "Appreciation image reference data" means digital image data used for display on display devices such as CRTs, LCDs, and plasma displays, or for generating hard copy images on output media such as silver halide photographic paper, inkjet paper, and thermal printer paper. Unlike the scene reference raw data, the appreciation image reference data has been subjected to an "optimization process" so that an optimal image is obtained on such display devices and output media.
  • The "recording medium" is a storage medium used to store the "scene reference raw data" and "reproduction auxiliary data" output from the imaging apparatus, such as a compact flash (registered trademark), memory stick (registered trademark), SmartMedia (registered trademark), multimedia card, hard disk, floppy (registered trademark) disk, magneto-optical storage medium (MO), or CD-R.
  • The unit that writes to the recording medium may be integrated with the camera, may be an independent unit connected to it by wire or wirelessly, or may be installed at a remote location connected via communication means such as the Internet.
  • The file format used when recording to the recording medium is preferably not a format specific to the imaging device but a standardized general-purpose file format such as TIFF (Tagged Image File Format), JPEG (Joint Photographic Experts Group), or Exif (Exchangeable Image File Format).
  • The "shooting information data" is a record of the shooting condition settings at the time of shooting, and may include the same tag information as is written in the header portion of an Exif file. Specifically, it is a tag (code) indicating information such as exposure time, shutter speed, aperture value (F-number), ISO sensitivity, brightness value, subject distance range, light source, flash on/off, subject area, white balance, zoom magnification, shooting scene, amount of reflected light from the strobe light source, shooting saturation, and the type and configuration of the subject.
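  • As a rough illustration of such tag-based shooting information data, it could be modeled as a simple mapping; the tag names below echo common Exif tag names but are used here purely as a sketch, not as exact Exif tag IDs:

```python
# Hypothetical sketch of "shooting information data": a record of
# shooting-condition settings, similar to tags in an Exif file header.
# Tag names and values are illustrative, not real Exif tag IDs.
shooting_info = {
    "ExposureTime": "1/125",       # shutter speed in seconds
    "FNumber": 5.6,                # aperture value
    "ISOSpeedRatings": 100,        # ISO sensitivity
    "Flash": False,                # flash on/off
    "WhiteBalance": "auto",
    "SubjectDistanceRange": "macro",
    "SceneCaptureType": "portrait",
}

def tag(info, name, default=None):
    """Look up one shooting-condition tag, with a default for absent tags."""
    return info.get(name, default)

print(tag(shooting_info, "FNumber"))         # 5.6
print(tag(shooting_info, "ZoomRatio", 1.0))  # absent tag -> default 1.0
```

An image processing apparatus receiving such a record could branch on tags like `SceneCaptureType` when choosing optimization parameters, as the description suggests.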
  • Shooting information data is classified into values obtained at the time of shooting by sensors provided in the camera, data processed from those sensor values, and camera shooting conditions set based on the sensor values. In addition, it also includes information set manually by the photographer, such as the shooting mode dial (for example, portrait, sports, or macro shooting mode) and a forced-flash setting switch.
  • "Standardized scene reference image data" means image data in which the signal intensity of each color channel, based on at least the spectral sensitivity of the image sensor itself, has been mapped to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB, and in which image processing that modifies the data content to improve the effect at the time of image viewing, such as gradation conversion, sharpness enhancement, and saturation enhancement, is omitted.
  • The scene reference image data is preferably corrected for the photoelectric conversion characteristics of the imaging device (the opto-electronic conversion function (OECF) defined in ISO 14524; see, for example, "Fine Imaging and Digital Photography", Corona Publishing, Japan Photographic Society Publishing Committee, p. 449).
  • It is preferable that the information amount (for example, the number of gradations) of the standardized scene reference image data conforms to the performance of the A/D converter and is equal to or greater than the information amount (for example, the number of gradations) required for the appreciation image reference data.
  • For example, when the number of gradations of the appreciation image reference data is 8 bits per channel, the number of gradations of the scene reference image data is preferably 12 bits or more, more preferably 14 bits or more, and even more preferably 16 bits or more.
  • Imaging device characteristic correction for generating standardized scene reference image data means a process of converting “scene reference raw data depending on characteristics of the imaging device” to “standardized scene reference image data”.
  • The content of this process depends on the state of the "scene reference raw data dependent on the characteristics of the imaging device", but it at least maps the signal intensity of each color channel, based on the spectral sensitivity unique to the imaging device, to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB. For example, when the "scene reference raw data dependent on the characteristics of the imaging device" has not undergone interpolation processing based on the color filter array, this processing needs to be performed additionally.
  • "Optimization processing" means processing for obtaining an optimal image on a display device such as a CRT, LCD, or plasma display, or on an output medium such as silver halide photographic paper, inkjet paper, or thermal printer paper. For example, when display on a CRT display monitor conforming to the sRGB standard is assumed, processing is performed so that optimal color reproduction is obtained within the color gamut of the sRGB standard; when output to silver halide photographic paper is assumed, processing is performed so that optimal color reproduction is obtained within the color gamut of silver halide photographic paper.
  • In addition to color gamut compression, it also includes gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and handling of the output characteristics (LUT) of the output device.
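  • One of these steps, gradation compression from 16 bits to 8 bits, can be sketched as follows. This is a minimal illustration assuming the standard sRGB transfer function; the patent itself does not prescribe this particular curve:

```python
def srgb_encode(linear):
    """sRGB opto-electronic transfer function for a linear value in [0, 1]."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def compress_16_to_8(value16):
    """Compress a 16-bit linear sample (0..65535) to an 8-bit sRGB code value."""
    linear = value16 / 65535.0
    return round(255 * srgb_encode(linear))

print(compress_16_to_8(0))      # 0
print(compress_16_to_8(65535))  # 255
```

In practice this step would be combined with the gamut mapping and output-device LUT handling mentioned above, which are device-specific.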
  • Furthermore, image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, masking, and burning-in is performed.
  • the degree of white balance adjustment can be relaxed and the color balance can be adjusted specially.
  • the distance between the photographer and the subject is estimated based on the "amount of reflected light from the strobe light source" information, and can be reflected in, for example, setting of image processing conditions that suppress over-exposed skin.
  • The degree of sharpening is relaxed and smoothing processing is strengthened, so that skin wrinkles can be made inconspicuous.
  • The image recording apparatus may include, in addition to a mechanism for performing the image processing according to the present invention on digital image data acquired from the imaging apparatus according to the present invention, a film scanner that inputs frame image information of photographic photosensitive materials recorded by an analog camera, such as color negative film, color reversal film, black-and-white negative film, and black-and-white reversal film, and a flatbed scanner that inputs image information reproduced on color paper (silver halide photographic paper).
  • Alternatively, the apparatus may include means for reading digital image data stored on any known portable "recording medium" such as a compact flash (registered trademark), memory stick (registered trademark), SmartMedia (registered trademark), multimedia card (registered trademark), floppy (registered trademark) disk, magneto-optical storage medium (MO), or CD-R, or may obtain digital image data from a remote location via communication means.
  • It may also include processing means for forming an appreciation image on any known output medium, such as a display device (CRT, LCD, plasma display, etc.) or paper for generating hard copy images (silver halide photographic paper, inkjet paper, thermal printer paper, etc.).
  • In the present invention, reproduction auxiliary data for generating appreciation image reference data is generated on the imaging apparatus side, and the reproduction auxiliary data is attached to the scene reference raw data dependent on the characteristics of the imaging apparatus. On the image processing apparatus or image recording apparatus side, the appreciation image reference data is generated using the reproduction auxiliary data; this reduces information loss of the captured image information and makes it possible to obtain appreciation image reference data of higher quality than the image data obtained by the imaging device.
  • Furthermore, by attaching the shooting information data, which records the shooting condition settings at the time of shooting, to the scene reference raw data together with the reproduction auxiliary data, the image processing apparatus or image recording apparatus side can also generate the appreciation image reference data using the shooting information data, making it possible to obtain appreciation image reference data of even higher quality.
  • FIG. 1 is a block diagram showing a main part configuration of an imaging apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an internal configuration of a reproduction auxiliary data generation unit.
  • FIG. 3 is a diagram showing the data structure of a data file recorded on the recording medium of the storage device in FIG.
  • FIG. 4 is a diagram for explaining the contents of the imaging device characteristic correction information.
  • FIG. 5 shows the tone conversion characteristics of scene reference image data (a), and the tone conversion characteristics of scene reference image data and appreciation image reference data (b).
  • FIG. 6 shows the conversion characteristics from scene reference image data to appreciation image reference data (a), and the gradation conversion characteristics of the appreciation image reference data for each light source condition and exposure condition (b).
  • FIG. 7 is a diagram showing an example of the data structure of processing process reproduction information.
  • FIG. 9 is a flowchart showing image data recording processing executed in the imaging apparatus of the embodiment.
  • FIG. 10 is a flowchart showing a shooting scene determination process executed in the imaging apparatus.
  • FIG. 12 is a diagram showing an example of a program for converting RGB into the HSV color system.
  • FIG. 13 is a diagram showing the brightness (V)-hue (H) plane and the regions r1 and r2 on the V-H plane.
  • FIG. 14 is a diagram showing the brightness (V)-hue (H) plane and the regions r3 and r4 on the V-H plane.
  • FIG. 15 is a diagram showing a curve representing a first coefficient by which the first occupancy ratio is multiplied to calculate the index α.
  • FIG. 16 is a diagram showing a curve representing a second coefficient by which the first occupancy ratio is multiplied to calculate the index β.
  • FIG. 17 is a flowchart showing a second occupancy ratio calculation process for calculating a second occupancy ratio based on the composition of captured image data.
  • FIG. 18 is a diagram showing regions nl to n4 determined according to the distance from the outer edge of the screen of captured image data.
  • FIG. 20 is a flowchart showing a bias amount calculation process.
  • FIG. 21 is a block diagram showing a main part configuration of an imaging apparatus in a modification of the present embodiment.
  • FIG. 22 is a diagram showing a data structure of a data file recorded on a recording medium of the storage device of FIG.
  • FIG. 23 is a flowchart showing image data recording processing executed in an imaging apparatus according to a modification of the present embodiment.
  • FIG. 24 is an external view of an image recording apparatus according to an embodiment of the present invention.
  • FIG. 25 is a block diagram showing the internal configuration of the image recording apparatus.
  • FIG. 26 is a block diagram showing the internal configuration of the image processing apparatus and output unit.
  • FIG. 27 is a flowchart showing image processing executed in the image recording apparatus.
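  • FIG. 12 above is described as an example of a program for converting RGB into the HSV color system. The patent's own listing is not reproduced here; a minimal equivalent sketch using Python's standard `colorsys` module might be:

```python
import colorsys

def rgb_to_hsv(r, g, b):
    """Convert RGB (each 0..1) to HSV; hue is returned in degrees 0..360."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0, s, v

print(rgb_to_hsv(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0) -> pure red
```

HSV is convenient for the shooting-scene determination described later, since hue (H) and brightness (V) occupancy ratios can be computed directly from these values.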
  • FIG. 1 shows a main part configuration of an imaging apparatus 100 according to an embodiment of the present invention.
  • The imaging apparatus 100 includes a lens 1, an aperture 2, a CCD (Charge Coupled Device) 3, an analog processing circuit 4, an A/D converter 5, a temporary storage memory 6, an image processing unit 7, a header information processing unit 8, a storage device 9, a CCD drive circuit 10, a control unit 11, a reproduction auxiliary data generation unit 12, an operation unit 14, a display unit 15, a strobe drive circuit 16, a strobe 17, a focal length adjustment circuit 18, an autofocus drive circuit 19, and a motor 20.
  • the optical system of the imaging apparatus 100 includes a lens 1, a diaphragm 2, and a CCD 3.
  • the lens 1 adjusts the focus and forms an optical image of the subject.
  • Aperture 2 adjusts the amount of light beam that has passed through lens 1.
  • the CCD 3 photoelectrically converts the subject light imaged on the light receiving surface by the lens 1 into an electrical signal (imaging signal) of an amount corresponding to the amount of incident light for each sensor in the CCD 3. Then, the CCD 3 is sequentially controlled by the timing pulse input from the CCD drive circuit 10, and sequentially outputs this imaging signal to the analog processing circuit 4.
  • the analog processing circuit 4 performs RGB signal amplification, noise reduction processing, and the like on the image pickup signal input from the CCD 3.
  • The processing in the analog processing circuit 4 is switched ON/OFF via the control unit 11 in accordance with an operation signal from the operation unit 14.
  • The A/D converter 5 converts the imaging signal input from the analog processing circuit 4 into a digital signal and outputs it.
  • Hereinafter, the digital signal obtained by the A/D converter 5 will be described as scene reference raw data.
  • The temporary storage memory 6 is a buffer memory or the like, and temporarily stores the image data output from the A/D converter 5.
  • The image processing unit 7 performs, on the image data stored in the temporary storage memory 6, image quality improvement processing for display on the display unit 15, such as gradation correction, spectral sensitivity crosstalk correction, dark current noise suppression, sharpening, white balance adjustment, and saturation adjustment, as well as processing such as image size change, trimming, and aspect conversion.
  • The processing in the image processing unit 7 is switched ON/OFF via the control unit 11 in accordance with an operation signal from the operation unit 14.
  • The header information processing unit 8 attaches the reproduction auxiliary data (described in detail later) generated by the reproduction auxiliary data generation unit 12 to the file header (header area) of the scene reference raw data stored in the temporary storage memory 6, creating an attached data file (see FIG. 3).
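  • The patent does not specify a byte-level layout for this attached data file, but the idea of prefixing the scene reference raw data with a header carrying the reproduction auxiliary data can be sketched roughly as follows; the length-prefixed JSON header is a hypothetical choice for illustration only:

```python
import json
import struct

def attach_header(raw_data: bytes, aux: dict) -> bytes:
    """Prepend reproduction auxiliary data as a length-prefixed JSON header."""
    header = json.dumps(aux).encode("utf-8")
    return struct.pack(">I", len(header)) + header + raw_data

def split_header(blob: bytes):
    """Recover the auxiliary data and the scene reference raw data."""
    (n,) = struct.unpack(">I", blob[:4])
    aux = json.loads(blob[4:4 + n].decode("utf-8"))
    return aux, blob[4 + n:]

blob = attach_header(b"\x00\x01raw", {"matrix": [1, 0, 0]})
aux, raw = split_header(blob)
print(aux["matrix"], raw)
```

A real implementation would instead use a standardized container (e.g. Exif/TIFF tags), as the description recommends for interoperability.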
  • the storage device 9 is configured by a nonvolatile semiconductor memory or the like, and stores a control program for the imaging apparatus 100.
  • The storage device 9 includes a mounting unit for mounting a recording medium such as a memory card and, in accordance with control signals input from the control unit 11, reads recorded data from and writes data to the recording medium mounted on the mounting unit.
  • the CCD drive circuit 10 outputs a timing pulse in accordance with a control signal input from the control unit 11, and controls the drive of the CCD 3.
  • The control unit 11 is configured by a CPU (Central Processing Unit) or the like; it reads out the control program of the imaging apparatus 100 stored in the storage device 9 and controls the entire imaging apparatus 100 in accordance with the read control program.
  • In accordance with operation signals from the operation unit 14, the control unit 11 controls the autofocus drive circuit 19, which drives the motor 20 to adjust the focus of the lens 1, and the focal length adjustment circuit 18, which drives the motor 20 to adjust the focal length.
  • The control unit 11 instructs the reproduction auxiliary data generation unit 12 to generate the reproduction auxiliary data (described in detail later) to be attached to the file header of the scene reference raw data obtained by shooting, and instructs the storage device 9 to record on the recording medium a data file in which the reproduction auxiliary data is attached to the scene reference raw data.
  • the reproduction auxiliary data generation unit 12 generates reproduction auxiliary data necessary for generating appreciation image reference data by performing an optimization process for appreciation image formation on the output medium.
  • the reproduction assistance data is output to the header information processing unit 8.
  • The internal configuration of the reproduction auxiliary data generation unit 12 will be described in detail later with reference to FIG. 2.
  • The operation unit 14 includes various function buttons such as a shutter button, a power ON/OFF button, and a zoom button, cursor keys, and the like, and outputs an operation signal corresponding to each button or key to the control unit 11. The display unit 15 is configured by a display or the like and performs the required display processing in accordance with display control signals input from the control unit 11.
  • The display unit 15 displays information for the user of the imaging apparatus 100 to confirm the shooting settings, and displays the appreciation image reference data generated for display on the display unit 15.
  • the display unit 15 also has a function as a finder that continuously displays images captured by the CCD 3 in the shooting mode.
  • the strobe driving circuit 16 controls the strobe 17 to emit light when the subject brightness is low, based on a control signal input from the control unit 11.
  • the strobe 17 boosts the battery voltage to a predetermined high voltage and stores it as a charge in a capacitor.
  • The strobe 17 is driven by the strobe drive circuit 16 to emit light from its xenon tube using the electric charge stored in the capacitor, irradiating the subject with auxiliary light.
  • the focal length adjustment circuit 18 controls the motor 20 for adjusting the focal length by moving the lens 1 by the control signal input from the control unit 11.
  • the automatic focus driving circuit 19 controls the motor 20 for moving the lens 1 and adjusting the focus (focus) by the control signal input from the control unit 11.
  • FIG. 2 shows an internal configuration of the reproduction assistance data generation unit 12.
  • the reproduction auxiliary data generation unit 12 includes an imaging device characteristic correction information generation unit 121, an appreciation image reference data restoration information generation unit 122, and a processing process reproduction information generation unit 123.
  • The imaging device characteristic correction information generation unit 121 generates the information (imaging device characteristic correction information) necessary for the imaging device characteristic correction processing that generates standardized scene reference image data from the scene reference raw data. As shown in FIG. 4, the imaging device characteristic correction processing includes filter interpolation calculation (a), matrix calculation (b), and photoelectric conversion characteristic and gain correction (c) on the scene reference raw data.
  • The filter interpolation calculation shown in FIG. 4(a) interpolates image data in which, due to the filter arrangement, each pixel carries one color, into linear image data with three colors (RGB) per pixel.
  • the filter array is an array pattern of color filters for CCD color discrimination, and an array of RGB primary colors is generally used.
  • As the filter interpolation method, it is possible to apply the nearest neighbor method, the bilinear interpolation method, the bicubic (bi-cubic convolution) method, and the like.
  • In the nearest neighbor method, the pixel nearest to the target position is selected and used as it is in accordance with the enlargement/reduction size, with pixels interpolated only where necessary.
  • In the bilinear method, linear density interpolation is performed from the density values of the four pixels surrounding the target position, according to its coordinates (real values).
  • In the bicubic method, interpolation is performed using a cubic function from the density values of the 16 pixels surrounding the target position, in order to achieve higher accuracy than the bilinear method.
  • The formula used for interpolation is, in theory (by the sampling theorem), the ideal density interpolation formula sin(πx)/(πx); this is approximated by a cubic (third-order) polynomial in x and used as the interpolation formula.
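  • The bilinear method described above (linear density interpolation from the four surrounding pixels) can be sketched for a single channel as follows; `img` here is a hypothetical row-major grid of density values:

```python
def bilinear(img, x, y):
    """Linearly interpolate a density value at real coordinates (x, y)
    from the four surrounding pixels, as in the bilinear method above."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bottom = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

img = [[0, 100],
       [100, 200]]
print(bilinear(img, 0.5, 0.5))  # 100.0 (average of the four neighbors)
```

The bicubic method replaces each of these linear weights with a cubic kernel over a 4x4 neighborhood, trading computation for accuracy.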
  • The matrix operation shown in FIG. 4(b) is a process for correcting the differences whereby the same subject color is recorded as different signals, owing to differences in the spectral characteristics of the color filters and the spectral sensitivity characteristics of the image sensor; by this matrix operation, RGB tristimulus values are converted into the XYZ color system.
  • FIG. 4(c) shows processing for correcting differences due to the photoelectric conversion characteristics (linear conversion characteristics) and gain correction (translation) of the image sensor. As shown in FIG. 4(c), the characteristic is that logY (Y: stimulus value of the XYZ color system) is converted so as to be linear with respect to logE (E: exposure amount).
  • By these processes, the scene reference raw data, which depends on the characteristics of the imaging device, is converted into XYZ values in a standardized color space as scene reference image data.
  • The color filter array pattern applied in the imaging device characteristic correction process, the matrix coefficients a to i for the matrix calculation, and the coefficient values for correcting differences associated with the photoelectric conversion characteristics and gain correction all differ for each imaging device.
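  • To make the matrix operation of Fig. 4(b) concrete, here is a minimal sketch. The device-specific coefficients a to i are not given in the text, so the standard linear-RGB (D65) to XYZ matrix is used purely as a stand-in:

```python
# Matrix operation of Fig. 4(b): RGB tristimulus values -> XYZ color system.
# The 3x3 coefficients (a..i) below are the standard linear-sRGB/D65 matrix,
# used only as a placeholder; each real imaging device has its own values.
M = [[0.4124, 0.3576, 0.1805],   # a b c
     [0.2126, 0.7152, 0.0722],   # d e f
     [0.0193, 0.1192, 0.9505]]   # g h i

def rgb_to_xyz(r, g, b):
    """Convert one (linear) RGB triple to XYZ by a 3x3 matrix product."""
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in M)
```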
  • When the appreciation image reference data is generated from the scene reference image data, the appreciation image reference data restoration information generation unit 122 generates information (appreciation image reference data restoration information) for restoring the appreciation image reference data generated in the imaging device 100.
  • FIG. 5 (a) shows the tone conversion characteristics of scene reference image data.
  • This tone conversion characteristic is such that logY (Y: stimulus value of the XYZ color system) is linear with respect to logE (E: exposure amount).
  • Figure 5(b) shows the gradation conversion characteristics of the scene reference image data and the appreciation image reference data in the sRGB color space, as the relation between L* of the L*a*b* color system and logE (E: exposure amount).
  • The appreciation image reference data has a characteristic that L* changes linearly with respect to logE.
  • Fig. 6 (a) shows the conversion characteristics (curved line in the figure) of the viewing image reference data from the scene reference image data.
  • The conversion characteristics shown in Fig. 6(a) are corrected according to the shooting scene at the time of shooting (front light, backlight, under, flash proximity, etc.).
  • Figure 6 (b) shows the gradation conversion characteristics of the appreciation image reference data for correcting the light conditions for backlighting, flash close-up photography, and under / over exposure conditions.
  • the combination of the conversion characteristics shown in Fig. 6 (a) and the conversion characteristics shown in Fig. 6 (b) is the viewing image reference data restoration information.
  • When the appreciation image reference data is generated from the scene reference image data, the processing process reproduction information generation unit 123 generates information for reproducing the process by which the appreciation image reference data was generated in the imaging apparatus 100 (processing process reproduction information).
  • FIG. 7 shows an example of processing process reproduction information.
  • The processing process reproduction information indicates history information of the shooting conditions from a predetermined time before the actual shooting (for example, 2500 ms before) up to the actual shooting. Specifically, as shown in FIG. 7, it shows history information of the indices 1 to 3 indicating the processing results of the shooting scene discrimination process (front light, backlight, under, flash proximity, etc.), the total index 4 calculated from the indices 1 to 3, the evaluation value 1 calculated from the indices, the evaluation value 2 calculated from the camera shake level, and the shooting condition validity.
  • The camera shake level indicates the degree of camera shake in, for example, 10 levels (an integer from 1 to 10); the larger the camera shake, the higher the value of the camera shake level.
  • Indices 1 to 3 are numerical values for specifying a shooting scene (forward light, backlight, under, flash proximity, etc.) at the time of shooting.
  • Fig. 8 shows the discrimination map for discriminating the shooting scene.
  • The index 1 is the strobe degree, the index 2 is the backlight degree, and the index 3 is the under degree.
  • Backlight and front light are discriminated from the region determined by index 1 (strobe degree) and index 2 (backlight degree), and under and flash proximity are discriminated from the region determined by index 3 (under degree). The shooting scene discrimination process will be described in detail later with reference to FIGS. 10 to 20.
  • Total index 4 = 10 − [Σ(index 1 + index 2 + index 3 …)] (1)
  • Evaluation value 1 and evaluation value 2 indicate the light source condition / exposure condition evaluation value and the camera shake evaluation value, respectively, and are defined as in equations (2) and (3).
  • Evaluation value 1 = Total index 4 × 3.4 − 5 (2)
  • Evaluation value 2 = −0.54 × camera shake level + 8.5 (3)
  • In equations (2) and (3), among the indices 1 to 3, a value of −6 or less is treated as −6 and a value of +6 or more is treated as +6.
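  • Equations (2) and (3), together with the ±6 clipping just described, can be sketched as follows (the clipping helper and all names are illustrative assumptions about how the limits are applied, not part of the patent text):

```python
def clip_index(v):
    """Clip an index value to the range -6 .. +6, as stated above."""
    return max(-6.0, min(6.0, v))

def evaluation_value_1(total_index_4):
    # Equation (2): light source / exposure condition evaluation value
    return total_index_4 * 3.4 - 5

def evaluation_value_2(camera_shake_level):
    # Equation (3): camera shake evaluation value (level is an integer 1..10)
    return -0.54 * camera_shake_level + 8.5
```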
  • The shooting condition validity takes a larger value as the shooting is performed more appropriately.
  • the photographing conditions (light source conditions, exposure conditions, etc.) are adjusted so that the numerical value of the photographing condition validity increases as time passes.
  • In FIGS. 8(a) and 8(b), the history of the indices 1 to 3 indicated by the processing process reproduction information shown in FIG. 7 is plotted with diamonds.
  • the history information on the discrimination map shown in FIG. 8 may be displayed on the display unit 15.
  • In the above, the camera shake level is an integer value from 1 to 10 and the numerical range of the indices 1 to 3 is from −6 to +6; however, these numerical ranges are not particularly limited.
  • The definitions of the total index 4, the evaluation value 1, and the evaluation value 2 change accordingly as the numerical ranges change.
  • Various user mode settings may be used as user characteristics other than camera shake.
  • The recording of history information as processing process reproduction information may start when the shooting mode is designated by the operation unit 14, with the time when the shutter button is pressed taken as the time of actual shooting; alternatively, the recording of history information may start when the shutter button of the operation unit 14 is pressed halfway, with the time when the shutter button is pressed fully taken as the time of actual shooting.
  • FIG. 3 shows the data structure of a data file recorded on the recording medium of the storage device 9.
  • The imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information generated by the reproduction auxiliary data generation unit 12 are attached by the header information processing unit 8, as shown in FIG. 3, to the file header of the scene reference raw data as reproduction auxiliary data, and the attached data file thus created is recorded on the recording medium of the storage device 9.
  • Scene reference raw data may be recorded on the recording medium in association with the appreciation image reference data generated in the imaging apparatus 100 simply for display on the display unit 15, or a thumbnail image of the appreciation image reference data may be recorded on the recording medium as metadata of the scene reference raw data.
  • When the shooting mode is designated by the operation unit 14, preliminary imaging is performed (step S1), and an image obtained by the preliminary imaging (hereinafter referred to as the preliminary captured image) is displayed on the display unit 15.
  • Viewing image reference data is formed from the preliminary captured image (step S2), and the formed viewing image reference data is displayed on the display unit 15 (step S3).
  • preliminary imaging may be performed when the shutter button is pressed halfway by the operation unit 14.
  • Next, a shooting scene discrimination process (see FIGS. 10 to 20) is performed, a value indicating the shooting condition validity is calculated from the camera shake level and the indices 1 to 3 obtained as results of the shooting scene discrimination process, and the validity of the calculated shooting conditions is determined (step S4).
  • the shooting condition is adjusted so that the value of the shooting condition validity is increased (step S5).
  • The processing in steps S1 to S5 is repeated until actual shooting is instructed by pressing the shutter button of the operation unit 14, and history information of the shooting conditions up to a predetermined time before the actual shooting is generated as processing process reproduction information.
  • When the shutter button of the operation unit 14 is pressed and actual shooting is instructed (step S6; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate scene reference raw data (step S7); at the same time, the reproduction auxiliary data generation unit 12 generates the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information as reproduction auxiliary data (step S8).
  • The processing process reproduction information generated in step S8 is obtained by adding information at the time of actual shooting to the history information before actual shooting generated by the processing of steps S1 to S5. Note that a step of creating appreciation image reference data to be recorded on the recording medium may be added after the actual shooting.
  • The header information processing unit 8 attaches the reproduction auxiliary data generated in step S8 as tag information to the file header of the scene reference raw data generated in step S7 (step S9), and an attached data file (see FIG. 3) is created (step S10).
  • the attached data file is recorded and stored in the recording medium of the storage device 9 (step S11), and the image data recording process is completed.
  • First, an occupancy ratio calculation process is performed on the captured image data (for example, scene reference raw data) to calculate occupancy ratios (a first occupancy ratio and a second occupancy ratio) indicating the proportion of each divided area in the entire captured image data (step T1). Details of the occupancy ratio calculation process will be described in detail later.
  • Next, in step T2, a bias amount calculation process for calculating a bias amount indicating the bias of the gradation distribution of the captured image data is performed.
  • The bias amount calculation process in step T2 will be described in detail later.
  • Next, indices 1 to 3 for specifying the shooting scene are calculated based on the occupancy ratios calculated in step T1 and coefficients set in advance according to the shooting conditions (step T3), and this shooting scene discrimination process ends.
  • The method of calculating the indices in step T3 will be described in detail later.
  • the RGB values of the captured image data are converted into the HSV color system (step T10).
  • Figure 12 shows an example of a conversion program (HSV conversion program), written in program code (C language), that converts from RGB to the HSV color system to obtain hue values, saturation values, and brightness values.
  • In the HSV conversion program shown in Fig. 12, the digital image data values of the input image data are defined as InR, InG, and InB; the calculated hue value is defined as OutH with a scale of 0 to 360; and the saturation value OutS and the brightness value OutV are defined with units of 0 to 255.
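  • The Fig. 12 program itself is not reproduced in the text, but the same conversion can be sketched with Python's standard colorsys module, mapping the results to the scales defined above (OutH: 0 to 360, OutS and OutV: 0 to 255); the function name is illustrative:

```python
import colorsys

def rgb_to_hsv_scaled(InR, InG, InB):
    """Convert 8-bit RGB values to hue (0-360) and saturation /
    brightness (each 0-255), matching the scales of the HSV program."""
    h, s, v = colorsys.rgb_to_hsv(InR / 255.0, InG / 255.0, InB / 255.0)
    return h * 360.0, s * 255.0, v * 255.0
```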
  • The captured image data is divided into regions each having a combination of predetermined brightness and hue, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step T11).
  • the area division of the captured image data will be described in detail.
  • Lightness (V) is divided into 7 regions by lightness value: 0 to 25 (v1), 26 to 50 (v2), 51 to 84 (v3), 85 to 169 (v4), 170 to 199 (v5), 200 to 224 (v6), and 225 to 255 (v7).
  • Hue (H) is divided into four areas: a flesh color hue range (H1 and H2) with hue values of 0 to 39 and 330 to 359, a green hue range (H3) with hue values of 40 to 160, a blue hue range (H4) with hue values of 161 to 250, and a red hue range (H5). Note that the red hue range (H5) is not used in the following calculations because it has been found to contribute little to the determination of shooting conditions.
  • The flesh color hue range is further divided into a flesh color area (H1) and another area (H2).
  • The area whose converted hue value Hue'(H), defined below, satisfies formula (5) is taken as the flesh color area (H1), and the area that does not satisfy formula (5) is taken as the other area (H2):
  • Hue'(H) = Hue(H) + 60 (when 0 ≤ Hue(H) < 300),
  • Hue'(H) = Hue(H) − 300 (when 300 ≤ Hue(H) < 360),
  • Luminance (Y) = InR × 0.30 + InG × 0.59 + InB × 0.11 (A)
  • Note that the luminance (Y) defined by equation (A) may also be used in place of the brightness (V).
  • a first occupancy ratio indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step T12).
  • the first occupancy rate calculation process ends.
  • Table 1 shows the first occupancy ratio in each divided area, where Rij denotes the first occupancy ratio calculated for the divided area formed by the combination of the lightness area vi and the hue area Hj.
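  • A hedged sketch of how the first occupancy ratios Rij of Table 1 could be computed. The region boundaries follow the v1 to v7 and hue splits described above; the finer H1/H2 flesh split of formula (5) is left out, and all function names are illustrative:

```python
def brightness_region(v):
    """Map a brightness value (0-255) to the index of region v1..v7."""
    for i, upper in enumerate([25, 50, 84, 169, 199, 224, 255]):
        if v <= upper:
            return i
    return 6

def hue_region(h):
    """Map a hue value (0-360) to a coarse hue region index.
    0: flesh range (H1/H2), 1: green (H3), 2: blue (H4), 3: red (H5)."""
    if h <= 39 or h >= 330:
        return 0
    if h <= 160:
        return 1
    if h <= 250:
        return 2
    return 3

def first_occupancy(pixels):
    """pixels: iterable of (hue, brightness) pairs. Returns {(vi, Hj): Rij},
    each region's cumulative pixel count divided by the total pixel count."""
    counts, total = {}, 0
    for h, v in pixels:
        key = (brightness_region(v), hue_region(h))
        counts[key] = counts.get(key, 0) + 1
        total += 1
    return {k: c / total for k, c in counts.items()}
```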
  • Table 2 shows, for each divided area, the first coefficient necessary for calculating the index α that quantitatively indicates the accuracy of strobe shooting, that is, the brightness state of the face area during strobe shooting.
  • the coefficient of each divided area shown in Table 2 is a weighting coefficient by which the first occupancy Rij of each divided area shown in Table 1 is multiplied, and is set in advance according to the photographing conditions.
  • FIG. 13 shows the brightness (v) —hue (H) plane.
  • A positive (+) coefficient is used for the first occupancy calculated from the region (r1) distributed in the high-brightness flesh color hue region in Fig. 13, and a negative (−) coefficient is used for the first occupancy calculated from the blue hue region (r2), which belongs to the other hues.
  • Figure 15 shows the first coefficient in the flesh color area (H1) and the first coefficient in another area (the green hue area (H3)) as curves (coefficient curves) that change continuously over the entire brightness range.
  • The sign of the first coefficient in the flesh color region (H1) is positive (+), while in the other regions (e.g., the green hue region (H3)) the sign of the first coefficient is negative (−); the signs of the two differ.
  • Index α = (sum of H1 region) + (sum of H2 region) + (sum of H3 region) + (sum of H4 region) + 4.424 (7)
  • Table 3 shows, for each divided region, the second coefficient necessary for calculating the index β that quantitatively indicates the accuracy of backlight shooting, that is, the brightness state of the face area during backlight shooting.
  • the coefficient of each divided area shown in Table 3 is a weighting coefficient by which the first occupancy ratio Rij of each divided area shown in Table 1 is multiplied, and is set in advance according to the shooting conditions.
  • FIG. 14 shows the brightness (v) —hue (H) plane.
  • A negative (−) coefficient is used for the occupancy calculated from the area (r4) distributed in the middle brightness of the flesh color hue area, and a positive (+) coefficient is used for the occupancy calculated from the low-brightness (shadow) area (r3) of the flesh color hue area.
  • Fig. 16 shows the second coefficient in the flesh color region (H1) as a curve (coefficient curve) that changes continuously over the entire brightness range.
  • According to Table 3 and Fig. 16, the sign of the second coefficient in the lightness value range of 85 to 169 (v4) of the flesh color hue region is negative (−), while the sign in the low-lightness (shadow) region of 26 to 84 (v2, v3) is positive (+); the signs of the coefficients in the two regions differ.
  • Sum of H2 region = R12 × 0.0 + R22 × 4.7 + (omitted) … + R72 × (−8.5) (8-2)
  • Sum of H3 region = R13 × 0.0 + R23 × 0.0 + (omitted) … + R73 × 0.0 (8-3)
  • Index β = (sum of H1 region) + (sum of H2 region) + (sum of H3 region) + (sum of H4 region) + 1.554 (9)
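  • The pattern common to equations (7) through (9) is a weighted sum of the occupancy table plus a constant term. A minimal sketch (the coefficient table in the test is a placeholder, since most coefficients are elided as "(omitted)" in the text):

```python
def weighted_index(R, C, constant):
    """Sum Rij * Cij over all divided regions (lightness i, hue j) and add
    the constant term: 4.424 for index alpha (eq. 7), 1.554 for index
    beta (eq. 9). R and C are same-shaped 2D lists."""
    return sum(r * c
               for row_r, row_c in zip(R, C)
               for r, c in zip(row_r, row_c)) + constant
```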
  • Since the index α and the index β are calculated based on the brightness and hue distributions of the captured image data, they are effective for discriminating the shooting scene when the captured image data is a color image.
  • the RGB values of the photographed image data are converted into the HSV color system (step T20).
  • Next, the captured image data is divided into regions determined by combinations of the distance from the outer edge of the captured image screen and the brightness, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step T21).
  • the area division of the captured image data will be described in detail.
  • FIGS. 18(a) to 18(d) show four areas n1 to n4 divided according to the distance from the outer edge of the screen of the captured image data.
  • The area n1 shown in FIG. 18(a) is the outermost frame, the area n2 shown in FIG. 18(b) is the area inside n1, the area n3 shown in FIG. 18(c) is the area further inside n2, and the area n4 shown in FIG. 18(d) is the area at the center of the captured image screen.
  • a second occupancy ratio indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step T22).
  • Thus, the second occupancy rate calculation process ends. Assuming that Qij is the second occupancy ratio calculated for the divided area formed by the combination of the brightness area vi and the screen area nj, the second occupancy ratio in each divided area is expressed as shown in Table 4.
  • Table 5 shows the third coefficient necessary for calculating the index γ for each divided region.
  • the coefficient of each divided area shown in Table 5 is a weighting coefficient by which the second occupancy Qij of each divided area shown in Table 4 is multiplied, and is set in advance according to the photographing conditions.
  • Fig. 19 shows the third coefficient in the screen areas nl to n4 as a curve (coefficient curve) that continuously changes over the entire brightness.
  • Sum of n2 region = Q12 × (−14.8) + Q22 × (−10.5) + (omitted) … + Q72 × 0.0 (10-2)
  • Sum of n3 region = Q13 × 24.6 + Q23 × 12.1 + (omitted) … + Q73 × 10.1 (10-3)
  • Sum of n4 region = Q14 × 1.5 + Q24 × (−32.9) + (omitted) … + Q74 × (−52.2) (10-4)
  • Using the sums of the n1 to n4 regions shown in (10-1) to (10-4), the index γ is defined as in equation (11).
  • Since the index γ is calculated based on a compositional characteristic of the captured image data (the distance of the brightness distribution position from the outer edge of the screen), it is effective for discriminating shooting scenes not only of color images but also of monochrome images.
  • Next, the bias amount calculation process in step T2 of FIG. 10 will be described.
  • First, the luminance Y (brightness) of each pixel is calculated from the RGB (Red, Green, Blue) values of the captured image data using equation (A), and the standard deviation (x1) of the luminance is calculated (step T23).
  • The standard deviation (x1) of luminance is defined as in equation (12): x1 = √( Σ(pixel luminance value − average luminance value)² / total number of pixels ) (12)
  • In equation (12), the pixel luminance value is the luminance of each pixel of the captured image data, the average luminance value is the average luminance of the captured image data, and the total number of pixels is the number of pixels of the entire captured image data.
  • a luminance difference value (x2) is calculated (step T24).
  • Luminance difference value (x2) = (maximum luminance value − average luminance value) / 255 (13)
  • the maximum luminance value is the maximum luminance value of the captured image data.
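  • The bias amounts x1 and x2 of equations (12) and (13), together with the luminance of equation (A), can be sketched as follows (function names are illustrative assumptions):

```python
def luminance(r, g, b):
    # Equation (A): Y = InR * 0.30 + InG * 0.59 + InB * 0.11
    return r * 0.30 + g * 0.59 + b * 0.11

def bias_amounts_x1_x2(rgb_pixels):
    """x1: standard deviation of luminance (eq. 12);
    x2: (maximum luminance - average luminance) / 255 (eq. 13)."""
    ys = [luminance(r, g, b) for r, g, b in rgb_pixels]
    mean = sum(ys) / len(ys)
    x1 = (sum((y - mean) ** 2 for y in ys) / len(ys)) ** 0.5
    x2 = (max(ys) - mean) / 255.0
    return x1, x2
```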
  • Next, the average luminance value (x3) of the flesh color region in the center of the screen of the captured image data is calculated (step T25), and then the average luminance value (x4) in the center of the screen is calculated (step T26).
  • Here, the center of the screen is, for example, an area composed of the area n3 and the area n4 in FIG. 18.
  • Next, the flesh color luminance distribution value (x5) is calculated (step T27), and this bias amount calculation process ends.
  • When the maximum luminance value of the flesh color area of the captured image data is Yskin_max, the minimum luminance value of the flesh color area is Yskin_min, and the average luminance value of the flesh color area is Yskin_ave, the flesh color luminance distribution value (x5) is defined as in equation (14):
  • x5 = (Yskin_max − Yskin_min) / 2 − Yskin_ave (14)
  • Let x6 be the average luminance value of the flesh color area in the center of the screen of the captured image data.
  • Here, the center of the screen is, for example, an area composed of the area n2, the area n3, and the area n4 in FIG. 18.
  • The index 1 shown in FIGS. 7 and 8 is defined as in equation (15) using the index α, the index γ, and x6, and the index 2 is defined as in equation (16) using the index β, the index γ, and x6.
  • Index 1 = 0.46 × index α + 0.61 × index γ + 0.01 × x6 − 0.79 (15)
  • The index 3 shown in FIGS. 7 and 8 is obtained by multiplying each of the bias amounts (x1) to (x5) calculated in the bias amount calculation process by a fourth coefficient set in advance according to the shooting conditions. Table 6 shows the fourth coefficients, which are the weighting coefficients by which each bias amount is multiplied.
  • Index 3 = x1 × 0.02 + x2 × 1.13 + x3 × 0.06 + x4 × (−0.01) + x5 × 0.03 − 6.49 (17)
  • This index 3 is composed only of luminance histogram distribution information, without using compositional features of the screen of the captured image data, and is particularly effective for discriminating between strobe (flash proximity) shooting scenes and under shooting scenes.
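  • Index 3 and the flesh color luminance distribution value it uses can be sketched directly from equations (14) and (17); the function names are illustrative, and the coefficients are read from equation (17) as printed:

```python
def skin_brightness_distribution(yskin_max, yskin_min, yskin_ave):
    # Equation (14): x5 = (Yskin_max - Yskin_min) / 2 - Yskin_ave
    return (yskin_max - yskin_min) / 2 - yskin_ave

def index_3(x1, x2, x3, x4, x5):
    # Equation (17): weighted sum of the bias amounts x1..x5
    return (x1 * 0.02 + x2 * 1.13 + x3 * 0.06
            + x4 * (-0.01) + x5 * 0.03 - 6.49)
```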
  • As described above, in the imaging apparatus 100, reproduction auxiliary data for generating the viewing image reference data is generated on the imaging apparatus side, and this reproduction auxiliary data reflects the characteristics of the imaging apparatus.
  • FIG. 21 shows a configuration of an imaging apparatus 101 as a modification of the imaging apparatus 100.
  • In FIG. 21, the same components as those of the imaging apparatus 100 are denoted by the same reference numerals, and description thereof is omitted.
  • the photographing information data generation unit 13 generates photographing information data that is a photographing condition setting at the time of photographing.
  • This shooting information data includes, for example, information directly related to the camera type (model) such as the camera name and code number, as well as the exposure time, shutter speed, aperture value (F number), ISO sensitivity, brightness value, subject distance range, light source, strobe firing, subject area, white balance, zoom magnification, subject composition, shooting scene type, amount of reflected light from the strobe light source, shooting saturation, and information on the type of the subject.
  • FIG. 22 shows the data structure of a data file recorded on the recording medium of the storage device 9 of the imaging apparatus 101.
  • The imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information generated by the reproduction auxiliary data generation unit 12, together with the shooting information data generated by the shooting information data generation unit 13, are attached by the header information processing unit 8 to the file header of the scene reference raw data as reproduction auxiliary data, and the attached data file thus created is recorded on the recording medium of the storage device 9.
  • When the shooting mode is designated by the operation unit 14, preliminary imaging is performed (step S20), and an image obtained by the preliminary imaging (hereinafter referred to as the preliminary captured image) is displayed on the display unit 15.
  • Viewing image reference data is formed from the preliminary captured image (step S21), and the formed viewing image reference data is displayed on the display unit 15 (step S22).
  • preliminary imaging may be performed when the shutter button is pressed halfway by the operation unit 14.
  • In addition, the camera shake level and the indices 1 to 3 resulting from the shooting scene discrimination process are calculated, the shooting condition validity value is calculated from these results, and the validity of the shooting conditions is determined (step S23).
  • the shooting condition is adjusted so that the value of the shooting condition validity is increased (step S 24).
  • The processing of steps S20 to S24 is repeated until actual shooting is instructed by pressing the shutter button of the operation unit 14, and history information of the shooting conditions up to a predetermined time before the actual shooting is generated as processing process reproduction information.
  • When the shutter button of the operation unit 14 is pressed and actual shooting is instructed (step S25; YES), the imaging signal obtained from the CCD 3 is converted into a digital signal by the A/D converter 5 to generate scene reference raw data (step S26).
  • The reproduction auxiliary data generation unit 12 generates the imaging device characteristic correction information, the appreciation image reference data restoration information, and the processing process reproduction information as reproduction auxiliary data (step S27), and the shooting information data generation unit 13 generates the shooting information data (step S28).
  • the processing process reproduction information generated in step S27 is obtained by adding the information at the time of main photographing to the history information before main photographing generated by the processing of steps S20 to S24. Note that a step of creating appreciation image reference data for recording on a recording medium may be added after the main photographing.
  • The reproduction auxiliary data generated in step S27 and the shooting information data generated in step S28 are attached as tag information to the file header of the scene reference raw data generated in step S26 (step S29), and an attached data file (see FIG. 22) is created (step S30).
  • the attached data file is recorded and saved on the recording medium of the storage device 9 (step S31), and the image data recording process is completed.
  • As described above, according to the imaging apparatus 101, the shooting information data, which records the shooting condition settings at the time of shooting, is attached to the file header of the scene reference raw data and recorded on the recording medium, so that appreciation image reference data suited to the shooting situation can be generated when the data recorded on the recording medium is output to an output medium.
  • The recording medium on which the attached data file is recorded is taken out of the imaging apparatus main body and mounted on an external apparatus such as an image processing apparatus or an image recording apparatus, where image processing optimized for forming a viewing image on the output medium is performed to generate the viewing image reference data.
  • FIG. 24 shows an external configuration of the image recording apparatus 201 according to the embodiment of the present invention.
  • the image recording apparatus 201 is provided with a magazine loading unit 203 on one side surface of a main body 202.
  • Inside the main body 202, an exposure processing unit 204 that exposes silver halide photographic paper as an output medium, and a print creation unit 205 that creates a print by developing and drying the exposed photographic paper, are provided.
  • the print created by the print creation unit 205 is discharged to a tray 206 provided on the other side of the main body 202.
  • a control unit 207 that controls each unit constituting the image recording apparatus 201 is provided inside the main body 202.
  • On the upper part of the main body 202, a display unit 208, a film scanner unit 209 that is a transparent original reading device, a reflective original input device 210, and an operation unit 211 are arranged. Further, the main body 202 is provided with an image reading unit 214 that can read image data recorded on various recording media, and an image writing unit 215 that writes image data on various recording media.
  • A photographic photosensitive material is used as an original read by the film scanner unit 209 or the reflective original input device 210.
  • Examples of the photographic material include color negative film, color reversal film, black-and-white negative film, and black-and-white reversal film, on which frame image information captured by an analog camera is recorded.
  • the film scanner unit 209 converts the frame image information recorded on the photographic photosensitive material into digital image data to obtain frame image data.
  • When the photographic photosensitive material is color paper, that is, silver halide photographic paper, the reflective original input device 210 converts the frame image information recorded on the color paper into frame image data with a flatbed scanner.
  • the image reading unit 214 includes a PC card adapter 214a and a floppy (registered trademark) disk adapter 214b, into which a PC card 213a and a floppy (registered trademark) disk 213b can be respectively inserted.
  • the PC card 213a has, for example, a memory in which a plurality of frame image data is stored after being captured by a digital camera.
  • the floppy (registered trademark) disk 213b for example, a plurality of frame image data captured by a digital camera is recorded.
  • Recording media on which frame image data is recorded, other than the PC card 213a and the floppy (registered trademark) disk 213b, include, for example, a multimedia card (registered trademark), a memory stick (registered trademark), MD data, a CD-ROM, and the like.
  • the image writing unit 215 includes a floppy (registered trademark) disk adapter 215a, an MO adapter 215b, and an optical disk adapter 215c.
  • In this embodiment, the operation unit 211, the display unit 208, the film scanner unit 209, the reflective original input device 210, and the image reading unit 214 are provided integrally in the main body 202, but one or more of them may be provided separately.
  • The print creation method is not limited to this; for example, a method such as an inkjet method, an electrophotographic method, a thermal method, or a sublimation method may be used.
  • FIG. 25 shows the internal configuration of the image recording apparatus 201.
  • the image recording apparatus 201 includes a control unit 207, an exposure processing unit 204, a print creation unit 205, a film scanner unit 209, a reflection original input device 210, an image reading unit 214, and an image writing unit 215. , Data storage means 271, template storage means 272, operation section 211, and display section 208.
  • The control unit 207 is constituted by a microcomputer, and controls the operation of each unit constituting the image recording apparatus 201 through the cooperation of various control programs stored in a storage unit (not shown) such as a ROM (Read Only Memory) with a CPU (Central Processing Unit) (not shown).
  • the control unit 207 has an image processing unit 270, which, based on an input signal from the information input means 212 of the operation unit 211, performs image processing on image data read from the film scanner unit 209 or the reflection original input device 210, image data read from the image reading unit 214, or image data input from an external device via communication means (not shown), and forms image information for exposure.
  • the image processing unit 270 performs conversion processing according to the output form on the image-processed image data, and outputs it to the designated output destination.
  • the output destination of the image processing unit 270 includes a display unit 208, an image writing unit 215, a communication unit, and the like.
  • the exposure processing unit 204 exposes an image on the photosensitive material, and outputs the photosensitive material to the print creating unit 205.
  • the print creating unit 205 develops the exposed photosensitive material and dries it to create prints Pl, P2, and P3.
  • the print P1 is a service-size, high-vision-size, panorama-size, or similar print
  • the print P2 is an A4 size print
  • the print P3 is a business card size print.
  • the film scanner unit 209 reads a frame image recorded on a transparent original such as a developed negative film or reversal film imaged by an analog camera, and obtains a digital image signal of the frame image.
  • the reflection original input device 210 reads an image on the print P (photo print, document, various printed materials) by a flat bed scanner, and acquires a digital image signal.
  • the operation unit 211 is provided with information input means 212.
  • the information input means 212 includes a touch panel or the like, and outputs an operation signal of the information input means 212 to the control unit 207.
  • the operation unit 211 may be configured to include a keyboard and a mouse.
  • the image reading unit 214 reads the frame image information recorded on the PC card 213a or the floppy (registered trademark) disk 213b and transfers the frame image information to the control unit 207.
  • the image reading unit 214 includes, as the image transfer means 230, a PC card adapter 214a, a floppy (registered trademark) disk adapter 214b, and the like.
  • the image reading unit 214 reads the frame image information recorded on the PC card 213a inserted into the PC card adapter 214a or on the floppy disk 213b inserted into the floppy disk adapter 214b, and transfers it to the control unit 207.
  • a PC card reader, a PC card slot, or the like is used as the PC card adapter 214a.
  • the image writing unit 215 includes a floppy (registered trademark) disk adapter 215a, an MO adapter 215b, and an optical disk adapter 215c as the image transport unit 231.
  • the image writing unit 215 writes the image data generated by the image processing method according to the present invention to the floppy disk 216a inserted into the floppy disk adapter 215a, the MO 216b inserted into the MO adapter 215b, or the optical disk 216c inserted into the optical disk adapter 215c.
  • the data storage means 271 stores image information and corresponding order information (information on how many prints are to be created from which frame image, print size information, etc.) and sequentially accumulates them.
  • the template storage means 272 stores, in correspondence with the sample identification information D1, D2, and D3, sample image data (data indicating a background image, an illustration image, etc.) and at least one piece of template data for setting a synthesis area with the sample image data.
  • when a predetermined template is selected by an operator's operation (based on an instruction from the client) from the plurality of templates stored in advance in the template storage means 272, the control unit 207 merges the image information with the selected template; when the sample identification information D1, D2, or D3 is designated by the operator's operation (likewise based on the client's instruction), the corresponding sample image data is selected, the selected sample image data is combined with the image data and/or character data ordered by the client, and a print based on the desired sample image data is created.
  • this template synthesis is performed by the well-known chroma key method.
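The chroma key synthesis mentioned above can be illustrated with a minimal sketch: pixels of the template close to a key colour mark the synthesis area and are replaced by the client's image. The blue key colour and the tolerance value here are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def chroma_key_composite(template, image, key_rgb=(0, 0, 255), tol=60):
    """Replace the key-coloured synthesis area of `template` with pixels
    from `image` (minimal chroma-key sketch; key colour and tolerance
    are assumed values, not specified by the patent)."""
    # Per-pixel Manhattan distance to the key colour
    diff = template.astype(np.int16) - np.array(key_rgb, dtype=np.int16)
    mask = np.abs(diff).sum(axis=-1) < tol   # True inside the synthesis area
    out = template.copy()
    out[mask] = image[mask]                  # composite the ordered image in
    return out
```

A production system would typically soften the mask edge (alpha blending) rather than make a hard binary replacement.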
  • sample identification information is not limited to the three types of sample identification information Dl, D2, and D3, but may be more or less than three types.
  • the sample identification information D1, D2, and D3 for specifying a print sample is configured to be input from the operation unit 211; however, since the sample identification information is recorded on the print sample or on the order sheet, it can also be read by reading means such as an OCR (Optical Character Reader), or entered by the operator from a keyboard.
  • since sample image data is recorded in correspondence with the sample identification information D1 for specifying a print sample, and since, when the sample identification information D1 is input, sample image data is selected based on this sample identification information D1 and the selected sample image data is combined with the image data and/or character data based on the order to create a print based on the specified sample, users can order prints of samples they have actually seen, which meets the diverse requirements of a wide range of users.
  • since the first sample identification information D2 designating the first sample and the image data of the first sample are stored, the second sample identification information D3 designating the second sample and the image data of the second sample are stored, and the sample image data selected based on the designated first and second sample identification information D2 and D3 is combined with the image data and/or character data based on the order to create a print based on the specified samples, a wider variety of images can be synthesized, and prints that meet a wider variety of user requirements can be created.
  • the display unit 208 includes a display such as a CRT or LCD, and performs display processing in accordance with a display control signal input from the control unit 207.
  • the image processing unit 270 of the control unit 207 can also receive, via communication means (not shown), image data representing captured images and work instructions such as printing from another computer in the facility where the image recording apparatus 201 is installed, or from a distant computer over a communication network such as the Internet, and can perform image processing and create prints remotely.
  • the image processing unit 270 can also use communication means (not shown) to send image data representing the captured image after the image processing of the present invention, together with the accompanying order information, to another computer in the facility or, via the Internet, to a distant computer.
  • as described above, the image recording apparatus 201 comprises input means for taking in image information recorded on various recording media and image information obtained by dividing and photometrically scanning an image original; image processing means for processing the image information of the input image, by acquiring or estimating information such as "the size of the output image" and "the size of the main subject in the output image", so as to obtain an image that gives a favorable impression when observed on the output medium; image output means for displaying the processed image, printing it out, or writing it to a recording medium; and communication means for transmitting the image data and the accompanying order information to another computer in the facility via a communication line or to a distant computer via the Internet.
  • FIG. 26 shows an internal configuration when the image recording apparatus 201 is divided into an image processing apparatus 301 and an output unit 302 that outputs image data processed by the image processing apparatus 301.
  • the image processing apparatus 301 includes an input unit 303, a header information analysis unit 304, an imaging device characteristic correction processing unit 305, an appreciation image reference data restoration condition generation unit 306, a processing process reproduction unit 307, and an optimization processing unit 308.
  • the input unit 303 includes the image reading unit 214 of Fig. 25 and has a mounting unit for mounting a recording medium. When a recording medium is mounted in the mounting unit, the input unit 303 reads the data file recorded on the recording medium and outputs it to the header information analysis unit 304. In the present embodiment, the input unit 303 is described as reading the data file from the mounted recording medium; however, the input unit 303 may include a wired or wireless communication unit and input the data file via that communication unit.
  • the header information analysis unit 304 analyzes the data file input from the input unit 303, divides it into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, appreciation image reference data restoration information, processing process reproduction information), and shooting information data, and then outputs the scene reference raw data to the scene reference image data generation unit 311 in the imaging device characteristic correction processing unit 305, the imaging device characteristic correction information to the device characteristic correction processing unit 309, and the shooting information data to the shooting information data processing unit 313 in the optimization processing unit 308.
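The split performed by the header information analysis unit can be sketched as follows. The byte-level layout used here (a 4-byte big-endian header length, a JSON header carrying the auxiliary data, then the raw data) is a hypothetical stand-in, since the patent does not define the file format at that level; the section names are likewise illustrative.

```python
import json
import struct

def split_data_file(blob):
    """Split an attached data file into scene reference raw data and the
    auxiliary sections (hypothetical layout: 4-byte big-endian header
    length, JSON header, then the raw data)."""
    (hdr_len,) = struct.unpack(">I", blob[:4])
    header = json.loads(blob[4:4 + hdr_len].decode("utf-8"))
    return {
        "scene_reference_raw": blob[4 + hdr_len:],
        "device_correction": header.get("device_correction"),
        "restoration_info": header.get("restoration_info"),
        "process_reproduction": header.get("process_reproduction"),
        "shooting_info": header.get("shooting_info"),
    }
```

Each returned section corresponds to one of the outputs the analysis unit routes onward (raw data to unit 311, correction information to unit 309, shooting information to unit 313).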
  • the imaging device characteristic correction processing unit 305 includes a device characteristic correction processing unit 309, a processing condition table 310, a scene reference image data generation unit 311, and a temporary storage memory 312.
  • the device characteristic correction processing unit 309 determines scene reference image data generation conditions based on the imaging device characteristic correction information input from the header information analysis unit 304 and the processing condition table 310.
  • the processing condition table 310 is a table that stores processing conditions for generating scene reference image data for each characteristic of the imaging apparatus.
  • the scene reference image data generation unit 311 performs imaging device characteristic correction processing on the scene reference raw data input from the header information analysis unit 304 in accordance with the generation conditions determined by the device characteristic correction processing unit 309, generates scene reference image data standardized independently of the characteristics of the imaging device, and outputs it to the temporary storage memory 312. Specifically, the imaging device characteristic correction processing includes at least mapping the signal intensity of each color channel, which is based on the spectral sensitivity unique to the imaging device that generated the scene reference raw data, to a standard color space such as RIMM RGB, ERIMM RGB, or scRGB.
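The mapping from device-specific channel intensities to a standard colour space amounts to a per-pixel linear transform. The sketch below uses a made-up 3x3 characterisation matrix; a real matrix would come from profiling the spectral sensitivity of the specific imaging device, and the clipped [0, 1] working range is also an assumption for illustration.

```python
import numpy as np

# Hypothetical matrix from this camera's sensor RGB to a standard linear
# colour space (a real one comes from device characterisation).
CAMERA_TO_STANDARD = np.array([
    [0.90, 0.10, 0.00],
    [0.05, 0.90, 0.05],
    [0.00, 0.10, 0.90],
])

def to_scene_reference(raw_rgb):
    """Map per-channel sensor intensities into a standardised colour space
    (in the spirit of RIMM RGB / scRGB); values stay linear and are
    clipped to the [0, 1] working range."""
    raw = np.asarray(raw_rgb, dtype=np.float64)
    mapped = raw @ CAMERA_TO_STANDARD.T   # apply the 3x3 transform per pixel
    return np.clip(mapped, 0.0, 1.0)
```

Because the data stays linear ("scene-referred"), no tone curve is applied at this stage; rendering for viewing happens later in the optimization processing.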
  • the temporary storage memory 312 temporarily stores the scene reference image data generated by the scene reference image data generation unit 311.
  • the appreciation image reference data restoration condition generation unit 306 determines, based on the appreciation image reference data restoration information input from the header information analysis unit 304, restoration conditions for restoring the appreciation image reference data generated in the imaging device.
  • the processing process reproduction unit 307 determines, based on the processing process reproduction information, reproduction conditions for reproducing the generation process of the appreciation image reference data in the imaging device.
  • the optimization processing unit 308 includes the shooting information data processing unit 313, the appreciation image reference data generation unit 314, and the temporary storage memory 315.
  • the shooting information data processing unit 313 determines a generation condition for generating appreciation image reference data corresponding to the shooting condition based on the shooting information data input from the header information analysis unit 304.
  • the appreciation image reference data generation unit 314 reads the scene reference image data from the temporary storage memory 312 and, based on the generation conditions determined by the shooting information data processing unit 313, the restoration conditions determined by the appreciation image reference data restoration condition generation unit 306, the reproduction conditions determined by the processing process reproduction unit 307, and the output destination designated by the setting input unit 316 (storage device 318, output device 317, or display unit 208), subjects the scene reference image data to optimization processing for obtaining an optimal image at the output destination, generates appreciation image reference data, and outputs it, together with the operation information of the setting input unit 316, to the temporary storage memory 315.
  • the optimization processing includes, for example, compression to the color gamut of the output destination, gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and processing adapted to the output characteristics (LUT) of output devices and display devices. Furthermore, image processing such as noise suppression, sharpening, color balance adjustment, saturation adjustment, and dodging processing is included.
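One of the optimization steps named above, gradation compression from 16 bits to 8 bits, can be sketched with a simple gamma curve standing in for the measured output LUT of a real device; the 1/2.2 exponent is an illustrative assumption, not a value from the patent.

```python
import numpy as np

def compress_gradation(img16, gamma=1 / 2.2):
    """Compress 16-bit linear scene data to 8-bit output data with a
    gamma curve (a sketch; real devices would use a measured output LUT
    rather than a fixed gamma)."""
    x = img16.astype(np.float64) / 65535.0   # normalise to [0, 1]
    y = np.power(x, gamma)                   # lift mid-tones for display
    return np.round(y * 255.0).astype(np.uint8)
```

The same structure extends to the other listed steps: gamut compression and device LUTs are likewise per-pixel mappings applied before the bit-depth reduction.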
  • the temporary storage memory 315 temporarily stores the appreciation image reference data generated by the appreciation image reference data generation unit 314, and outputs it to the output destination (storage device 318, output device 317, or display unit 208) set by the setting input unit 316.
  • the setting input unit 316 is an input device for designating the output destination of the appreciation image reference data generated by the appreciation image reference data generation unit 314, and corresponds to the operation unit 211 in Fig. 25.
  • components other than the input unit 303 and the setting input unit 316 are included in the image processing unit 270 shown in FIG.
  • the output unit 302 includes the display unit 208, the output device 317 corresponding to the exposure processing unit 204 and the print creation unit 205 in Fig. 25, and the storage device 318 corresponding to the image writing unit 215 in Fig. 25.
  • when data is input to the input unit 303 (that is, when a recording medium is mounted in the mounting unit) (step S40), the data file recorded on the recording medium is read, its contents are analyzed by the header information analysis unit 304 (step S41), and it is divided into scene reference raw data, reproduction auxiliary data (imaging device characteristic correction information, appreciation image reference data restoration information, processing process reproduction information), and shooting information data.
  • the shooting information data processing unit 313 determines, based on the shooting information data, generation conditions for generating appreciation image reference data according to the shooting conditions (step S42); the appreciation image reference data restoration condition generation unit 306 determines, based on the appreciation image reference data restoration information, restoration conditions for restoring the appreciation image reference data generated in the imaging device (step S44); and the processing process reproduction unit 307 determines, based on the processing process reproduction information, reproduction conditions for reproducing the generation process of the appreciation image reference data in the imaging device (step S45).
  • the generation conditions for the scene reference image data are determined by referring to the imaging device characteristic correction information and the processing condition table 310, the scene reference raw data is subjected to imaging device characteristic correction processing in accordance with the determined generation conditions (step S43), and scene reference image data is generated (step S46).
  • based on the various image processing conditions determined in steps S42, S44, and S45 (the appreciation image reference data generation conditions, restoration conditions, and reproduction conditions), the scene reference image data generated in step S46 is subjected to optimization processing (step S47), and appreciation image reference data is generated (step S48).
  • the appreciation image reference data is subjected to processing specific to the output destination set in the setting input unit 316 (operation unit 211) (step S49), the image is output from the device at the output destination (step S50), and this image processing ends.
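The flow of steps S40 to S50 described above can be summarised as a small orchestration sketch. The four callables are hypothetical stand-ins for the analysis unit 304, the correction processing unit 305, the optimization processing unit 308, and the output section 302; their names and signatures are assumptions made for illustration.

```python
def run_pipeline(data_file, analyse, correct_device, optimise, output):
    """Sketch of steps S40-S50: analyse the file, generate scene reference
    image data, optimise it for the output destination, and output it."""
    parts = analyse(data_file)                       # S41: split the file
    scene_ref = correct_device(parts["raw"],         # S43/S46: device correction
                               parts["correction"])
    viewing = optimise(scene_ref,                    # S42/S44/S45, S47/S48
                       parts["restoration"],
                       parts["reproduction"],
                       parts["shooting"])
    return output(viewing)                           # S49/S50: output-specific step
```

The point of the structure is the ordering: device correction always precedes optimisation, so the standardised scene reference image data exists as an intermediate regardless of the output destination.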
  • in this case, step S42 in Fig. 27 is not performed.
  • as described above, optimized appreciation image reference data is obtained based on the reproduction auxiliary data and the shooting information data attached to the file header of the scene reference raw data output from the imaging device; even when the reproduction auxiliary data does not contain processing process reproduction information, it is possible to obtain sufficiently high-quality appreciation image reference data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Studio Devices (AREA)

Abstract

Loss of information in captured image data is suppressed, and appreciation image reference data of a quality higher than that of the image data obtained by an imaging device is created in an image processing device or an image recording device. The imaging device (100) captures an image and generates scene reference raw data dependent on the characteristics of the imaging device. A reproduction auxiliary data generation unit (12) generates reproduction auxiliary data used for subjecting the scene reference raw data to image processing optimized for forming an image on an output medium and for generating the appreciation image reference data. A header information processing unit (8) attaches the reproduction auxiliary data to the file header (header area) of the scene reference raw data to prepare an attached data file, which is recorded on a recording medium of a storage device (9).
PCT/JP2005/023007 2005-01-20 2005-12-15 Dispositif d’imagerie, dispositif de traitement d’image et dispositif d’enregistrement d’image WO2006077703A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-013274 2005-01-20
JP2005013274A JP2006203573A (ja) 2005-01-20 2005-01-20 撮像装置、画像処理装置及び画像記録装置

Publications (1)

Publication Number Publication Date
WO2006077703A1 true WO2006077703A1 (fr) 2006-07-27

Family

ID=36692097

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/023007 WO2006077703A1 (fr) 2005-01-20 2005-12-15 Dispositif d’imagerie, dispositif de traitement d’image et dispositif d’enregistrement d’image

Country Status (2)

Country Link
JP (1) JP2006203573A (fr)
WO (1) WO2006077703A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5610860B2 (ja) * 2009-08-07 2014-10-22 キヤノン株式会社 撮像装置及び情報処理装置
JP5550333B2 (ja) * 2009-12-28 2014-07-16 キヤノン株式会社 撮像装置、現像方法及びプログラム
JP6409938B2 (ja) * 2017-10-13 2018-10-24 ソニー株式会社 撮像装置および画像処理方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11127415A (ja) * 1997-10-24 1999-05-11 Nikon Corp 電子カメラ、電子カメラの制御方法、および、記録媒体
JP2001251551A (ja) * 2000-03-08 2001-09-14 Fuji Photo Film Co Ltd 電子カメラ
JP2004096500A (ja) * 2002-08-30 2004-03-25 Konica Minolta Holdings Inc 撮像装置、画像処理装置及び画像記録装置


Also Published As

Publication number Publication date
JP2006203573A (ja) 2006-08-03

Similar Documents

Publication Publication Date Title
US7312824B2 (en) Image-capturing apparatus, image processing apparatus and image recording apparatus
US7076119B2 (en) Method, apparatus, and program for image processing
JP2004173010A (ja) 撮像装置、画像処理装置、画像記録装置、画像処理方法、プログラム及び記録媒体
US20040247175A1 (en) Image processing method, image capturing apparatus, image processing apparatus and image recording apparatus
WO2006123492A1 (fr) Procede et dispositif de traitement d'images, dispositif d'imagerie et programme de traitement d'images
JP2004128809A (ja) 画像処理方法及び装置、及び撮像装置
JPWO2005079056A1 (ja) 画像処理装置、撮影装置、画像処理システム、画像処理方法及びプログラム
JP2005026800A (ja) 画像処理方法、撮像装置、画像処理装置及び画像記録装置
US20040041926A1 (en) Image-capturing apparatus, imager processing apparatus and image recording apparatus
JP2004096506A (ja) 画像形成方法、画像処理装置及び画像記録装置
WO2005112428A1 (fr) Procédé de traitement d’images, dispositif de traitement d’images, enregistreur d’images, et programme de traitement d’images
US7324702B2 (en) Image processing method, image processing apparatus, image recording apparatus, program, and recording medium
WO2006077702A1 (fr) Dispositif d’imagerie, dispositif de traitement d’image et méthode de traitement d’image
JP2004336521A (ja) 画像処理方法、画像処理装置及び画像記録装置
WO2006077703A1 (fr) Dispositif d’imagerie, dispositif de traitement d’image et dispositif d’enregistrement d’image
WO2006033235A1 (fr) Procede de traitement d'images, dispositif de traitement d'images, dispositif d'imagerie et programme de traitement d'images
JP2006203571A (ja) 撮像装置、画像処理装置及び画像記録装置
US6801296B2 (en) Image processing method, image processing apparatus and image recording apparatus
WO2006033236A1 (fr) Procede de traitement d'images, dispositif de traitement d'images, dispositif d'imagerie et programme de traitement d'images
JP2004096508A (ja) 画像処理方法、画像処理装置、画像記録装置、プログラム及び記録媒体
JP3664582B2 (ja) ホワイトバランス調整方法および装置
JP4623024B2 (ja) 電子カメラ
JP2004328530A (ja) 撮像装置、画像処理装置及び画像記録装置
WO2006033234A1 (fr) Procede de traitement d'images, dispositif de traitement d'images, dispositif d'imagerie et programme de traitement d'images
JP4292873B2 (ja) 画像処理方法、画像処理装置及び画像記録装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05816842

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 5816842

Country of ref document: EP