WO2005115016A1 - Image processing device, image processing and editing device, image file reproducing device, image processing method, image processing and editing method, and image file reproducing method - Google Patents

Image processing device, image processing and editing device, image file reproducing device, image processing method, image processing and editing method, and image file reproducing method

Info

Publication number
WO2005115016A1
WO2005115016A1 (PCT/JP2005/007866)
Authority
WO
WIPO (PCT)
Prior art keywords
image
data
information
image file
generating
Prior art date
Application number
PCT/JP2005/007866
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Hideaki Yoshida
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2004130128A external-priority patent/JP4642375B2/ja
Priority claimed from JP2004130127A external-priority patent/JP4589651B2/ja
Application filed by Olympus Corporation filed Critical Olympus Corporation
Priority to CN200580013159.6A priority Critical patent/CN1947431B/zh
Priority to EP05737293.0A priority patent/EP1742488B1/de
Publication of WO2005115016A1 publication Critical patent/WO2005115016A1/ja
Priority to US11/586,079 priority patent/US8155431B2/en
Priority to US13/412,082 priority patent/US8693764B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/32Image data format
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3214Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3215Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3245Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of image modifying data, e.g. handwritten addenda, highlights or augmented reality information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3274Storage or retrieval of prestored additional information
    • H04N2201/3277The additional information being stored in the same storage device as the image data

Definitions

  • Image processing device, image processing and editing device, image file reproducing device, image processing method, image processing and editing method, and image file reproducing method
  • The present invention relates to an image processing apparatus, an image processing/editing apparatus, an image file reproducing apparatus, an image processing method, an image processing/editing method, and an image file reproducing method suitable for capturing and editing a stereo image.
  • the system controller includes a stereo adapter detector that detects whether the stereo adapter is attached, and an automatic exposure (AE) controller that analyzes the subject image signal related to the photometric area and calculates the photometric information required for exposure control.
  • A photometry area setting unit for setting the above photometry area is provided, and the photometry area setting unit has a function of setting different photometry areas for the normal shooting mode and the stereo shooting mode. This discloses a technique for setting an optimal photometric area for each of the normal shooting mode and the stereo shooting mode.
  • There is a method of projecting two images of the same subject (each hereinafter referred to as a monocular image), mutually shifted according to parallax, onto the left and right sides of one image frame.
  • A single image in which the monocular images are arranged on the left and right in this way is hereinafter referred to as an integrated image.
  • During reproduction, the left and right monocular images in the integrated image are each observed with the corresponding eye.
  • The image perceived in this way (hereinafter referred to as a fused image) gives perspective to each part according to the amount of displacement of the corresponding parts of the left and right monocular images.
  • The Exif standard follows the JPEG standard file (JPEG file) format. That is, the image data itself is treated as image data conforming to the JPEG standard, and Exif data (metadata) is arranged in the header of the JPEG file.
  • Exif standard metadata is described in TIFF format and includes information such as the shooting date and time and a thumbnail.
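  • For illustration only (not part of the patent text), the following minimal Python sketch walks the JPEG segment markers of a file and reports where the Exif (APP1) metadata segment sits in the header, ahead of the compressed image data; the file name in the usage comment is hypothetical.

```python
import struct

def find_exif_segment(path):
    """Return (offset, segment_length) of the APP1/Exif segment in a JPEG file, or None."""
    with open(path, "rb") as f:
        if f.read(2) != b"\xff\xd8":                 # SOI marker: every JPEG starts with it
            raise ValueError("not a JPEG file")
        while True:
            marker = f.read(2)
            if len(marker) < 2 or marker[0] != 0xFF or marker[1] == 0xDA:
                return None                          # end of header or start of image data (SOS)
            (seg_len,) = struct.unpack(">H", f.read(2))
            if marker[1] == 0xE1 and f.read(6) == b"Exif\x00\x00":
                return f.tell() - 10, seg_len        # offset of the APP1 marker itself
            # skip the rest of this segment (its length field counts itself)
            f.seek(seg_len - 2 - (6 if marker[1] == 0xE1 else 0), 1)

# Usage (hypothetical file name):
# print(find_exif_segment("stereo_shot.jpg"))
```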
  • the left and right monocular images captured using a stereo adapter may actually have vignetting at the boundary or a shift in the imaging position.
  • a predetermined portion in each of the left and right regions for correcting the vignetting and misalignment is trimmed to set an effective monocular image range (hereinafter referred to as an image frame). Since the range of the image frame affects the sense of depth of the stereoscopic image, it is necessary to appropriately set the image frame when processing and editing the stereoscopic image.
  • By describing information related to trimming and the like in the metadata it is possible to appropriately set an image frame and give an appropriate sense of depth to the fused image.
  • Processing/editing of such an image is not always performed using processing/editing software that supports stereoscopic images (hereinafter referred to as 3D-compatible software); it may also be performed using processing/editing software that does not support stereoscopic images (hereinafter referred to as 3D-incompatible software).
  • For example, if 3D-incompatible software enlarges the image without updating the metadata, 3D-compatible software will subsequently perform trimming that was suited to the stereoscopic image before the enlargement. As a result, the sense of depth of the fused image increases, and the sense of discomfort between the actual image and the perceived image increases.
  • FIG. 17 is an explanatory diagram for explaining a problem in this case.
  • Reference numerals 17a to 17d in FIG. 17 denote cases where the integrated image is trimmed by 3D-compatible software
  • reference numerals 17e to 17h denote an example in which the integrated image is trimmed by 3D-incompatible software and then trimmed by 3D-compatible software.
  • Reference numerals 17a and 17e denote integrated images in which monocular images of the same subject (a person) are arranged on the left and right. As described above, this integrated image is subjected to trimming that sets the effective image frame to be used for the fused image, in order to correct vignetting and displacement. In the Exif standard, information on the appropriate trimming position is described in the metadata in advance.
  • trimming at the optimal trimming position can be automatically performed by using this metadata.
  • Reference numeral 17b indicates the optimum trimming position by a broken line.
  • Reference numeral 17c indicates the image clipped by the 3D-compatible software.
  • In this case, the metadata is automatically corrected according to the clipped image. That is, information such as the optimum trimming position and the maximum pop-out amount, which determines the maximum value of the sense of depth of the image, is also updated.
  • Assume that the image is then further cut out by 3D-compatible software.
  • In this case, the optimal trimming position has been updated to correspond to the image of reference numeral 17c, and the image cut out by the 3D-compatible software is as indicated by reference numeral 17d.
  • Reference numeral 17g denotes an image cut out by 3D-incompatible software. In this case, the metadata is not modified.
  • Assume that 3D-compatible software then performs a further cutout from the image of reference numeral 17g.
  • The 3D-compatible software performs automatic trimming with reference to the metadata. That is, as indicated by reference numeral 17h, clipping is performed at the optimal trimming position (broken line) set in the metadata. In this case, the subject falls outside the cutout range, and an appropriate integrated image cannot be obtained from the cutout image.
  • The present invention provides an image processing apparatus, an image processing/editing apparatus, an image file reproducing apparatus, an image processing method, an image processing/editing method, and an image file reproducing method that allow a fused image to be properly recognized even when a stereoscopic image is processed or edited by a device or software that does not support stereoscopic images.
  • An image processing apparatus of the present invention comprises: stereo image data generating means for generating stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax; accompanying data generating means for generating accompanying data relating to the stereo image data; accompanying data date/time information generating means for generating information on the date and time when the accompanying data was generated or updated; and image file generating means for generating an image file by combining the stereo image data generated by the stereo image data generating means with the accompanying data generated by the accompanying data generating means, further adding the information generated by the accompanying data date/time information generating means and information on the date and time when the image file is generated or updated, and converting the result into a predetermined file format.
  • An image processing/editing apparatus of the present invention comprises: image processing/editing means for performing image processing/editing based on the stereo image data in an image file that was generated by combining stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax with accompanying data relating to the stereo image data, further adding information on the date and time when the accompanying data was generated or updated and information on the date and time when the image file was generated or updated, and converting the result into a predetermined file format; accompanying data updating means for updating the accompanying data; accompanying data update date/time information generating means for generating information on the date and time when the accompanying data was updated; and image file generating means for generating an image file by combining the stereo image data processed/edited by the image processing/editing means with the accompanying data updated by the accompanying data updating means, further adding the update date/time information generated by the accompanying data update date/time information generating means and information on the date and time when the image file is generated or updated, and converting the result into the predetermined file format.
  • An image file reproducing apparatus of the present invention comprises: reproducing means for reproducing an image file that was generated by combining stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax with accompanying data relating to the stereo image data, further adding information on the date and time when the accompanying data was generated or updated and information on the date and time when the image file was generated or updated, and converting the result into a predetermined file format; determining means for determining whether the information on the date and time when the accompanying data reproduced by the reproducing means was generated or updated matches the information on the date and time when the image file was generated or updated; and control means for prohibiting continuation of the reproducing operation by the reproducing means when the determining means determines that they do not match.
  • Another image processing apparatus of the present invention comprises: stereo image data generating means for generating stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax; accompanying data generating means for generating accompanying data relating to the stereo image data; stereo image size information generating means for generating information on the image size of the stereo image data; and image file generating means for generating an image file by combining the stereo image data generated by the stereo image data generating means with the accompanying data generated by the accompanying data generating means, adding the image size information generated by the stereo image size information generating means both inside and outside the accompanying data, and converting the result into a predetermined file format.
  • Another image processing/editing apparatus of the present invention comprises: image processing/editing means for performing image processing/editing based on the stereo image data in an image file that was generated by combining stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax with accompanying data relating to the stereo image data, further adding information on the image size of the stereo image data both inside and outside the accompanying data, and converting the result into a predetermined file format; accompanying data updating means for updating, based on the contents of the image processing/editing, the accompanying data that existed before the image processing/editing; stereo image size information updating means for updating, based on the contents of the image processing/editing, the image size information of the stereo image data added inside and outside the accompanying data before the image processing/editing; and image file generating means for generating an image file by combining the stereo image data processed/edited by the image processing/editing means with the accompanying data updated by the accompanying data updating means, adding the image size information updated by the stereo image size information updating means both inside and outside the accompanying data, and converting the result into the predetermined file format.
  • Another image file reproducing apparatus of the present invention comprises: reproducing means for reproducing an image file that was generated by combining stereo image data based on a plurality of monocular images of the same subject obtained with a predetermined parallax with accompanying data relating to the stereo image data, further adding information on the image size of the stereo image data both inside and outside the accompanying data, and converting the result into a predetermined file format; determining means for determining whether the image size information inside the accompanying data reproduced by the reproducing means matches the image size information outside the accompanying data; and control means for prohibiting continuation of the reproducing operation by the reproducing means when the determining means determines that they do not match.
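  • For orientation only, the following Python sketch (not part of the claims; all field and type names are hypothetical) models an image file as described above and the two consistency determinations it enables: the match between the accompanying-data date/time and the image-file date/time, and the match between the image size recorded inside and outside the accompanying data.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Tuple

@dataclass
class AccompanyingData:                       # accompanying data (metadata) placed in the file
    shooting_datetime: datetime
    trimming_info: Tuple[int, int, int, int]  # effective image frame of the monocular images
    metadata_datetime: datetime               # date/time the accompanying data was generated/updated
    image_size_3d: Tuple[int, int]            # image size recorded inside the accompanying data

@dataclass
class StereoImageFile:
    file_datetime: datetime                   # date/time the image file was generated/updated
    image_size: Tuple[int, int]               # image size recorded outside the accompanying data
    accompanying: AccompanyingData
    image_data: bytes                         # compressed stereo (integrated) image data

def dates_consistent(f: StereoImageFile) -> bool:
    # The apparatus writes both dates as the same value, so a mismatch suggests that a
    # 3D-incompatible device or software has since updated the file.
    return f.file_datetime == f.accompanying.metadata_datetime

def sizes_consistent(f: StereoImageFile) -> bool:
    # Same idea using the image size recorded inside and outside the accompanying data.
    return f.image_size == f.accompanying.image_size_3d
```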
  • FIG. 1 is a block diagram showing an electronic camera incorporating an image processing device according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing a specific configuration of an image file generation unit 50c in FIG.
  • FIG. 3 is an explanatory diagram showing an example of an image file generated by an image file generation unit 50c
  • FIG. 4 is a flowchart showing an operation flow of an image file generation unit 50c.
  • FIG. 5 is a block diagram showing an image processing / editing apparatus according to a second embodiment of the present invention.
  • FIG. 6 is a flowchart showing a processing flow of image file generation.
  • FIG. 7 is a block diagram showing an image file reproducing device according to a third embodiment of the present invention.
  • FIG. 8 is a flowchart showing a processing flow of an image file.
  • FIG. 9 is a block diagram showing an electronic camera incorporating an image processing device according to a fourth embodiment of the present invention.
  • FIG. 10 is a block diagram showing a specific configuration of an image file generation unit 50c in FIG.
  • FIG. 11 is an explanatory diagram showing an example of an image file generated by an image file generation unit 50c.
  • FIG. 12 is a flowchart showing an operation flow of an image file generation unit 50c.
  • FIG. 13 is a block diagram showing an image processing / editing apparatus according to a fifth embodiment of the present invention.
  • FIG. 14 is a flowchart showing a processing flow of image file generation.
  • FIG. 15 is a block diagram showing an image file reproducing device according to a sixth embodiment of the present invention.
  • FIG. 16 is a flowchart showing a processing flow of an image file.
  • FIG. 17 is an explanatory diagram for explaining a problem of the background art.
  • FIG. 1 is a block diagram showing an electronic camera in which the image processing device according to the first embodiment of the present invention is incorporated. This embodiment is applied to an image file recording device.
  • the electronic camera includes a camera body 1, a lens unit 5 having a lens barrel, and a stereo adapter 10 for capturing a stereo image.
  • a mirror type stereo adapter 10 is detachably attached to the lens unit 5.
  • The stereo adapter 10 has a configuration in which mirrors 11 and 12 are arranged at positions separated by the parallax, and mirrors 13 and 14 for guiding the light reflected by the mirrors 11 and 12 to the camera side are also arranged.
  • The light that has passed through the mirrors 11 and 13 and the mirrors 12 and 14 in the stereo adapter 10 is guided through an imaging lens group 21 and an exposure control mechanism 22 in the lens unit 5 to a half mirror 31 in the camera body 1.
  • the lens unit 5 includes a photographic lens group 21, an exposure control mechanism 22, a lens driving mechanism 23, a lens driver 24, and an exposure control driver 25.
  • The imaging lens group 21 is a main imaging optical system capable of performing normal (monocular) imaging when the stereo adapter 10 is not attached, and is driven by the lens driving mechanism 23 so that focusing and zooming are adjusted.
  • the lens driving mechanism 23 is controlled by a lens driver 24.
  • the exposure control mechanism 22 controls the aperture of the photographing lens group 21 and a shutter device (not shown).
  • the exposure control mechanism 22 is controlled by an exposure control driver 25.
  • The light guided from the lens unit 5 to the camera body 1 passes through the half mirror 31 and is guided to the CCD color image sensor 34 via the low-pass and infrared cut filter system 32, where an image is formed.
  • the CCD color image sensor 34 is driven and controlled by a CCD driver 35 to convert an optical image of a subject into an electric signal.
  • As the CCD color imaging element 34, for example, an interline type with a vertical overflow drain structure and progressive (sequential) scanning is employed.
  • the light incident on the left-eye viewing mirror 11 of the stereo adapter 10 forms an image on an area L of an imaging surface (not shown) of the CCD color imaging device 34 via the mirror 13 and the imaging lens group 21.
  • The light that has entered the right-eye viewing mirror 12 forms an image on the area R of the imaging surface (not shown) of the CCD color imaging device 34 via the mirror 14 and the imaging lens group 21.
  • The signal photoelectrically converted by the CCD color imaging device 34 is passed through a pre-processing circuit 36 that performs A/D conversion and the like, and is given to a digital processing circuit 39 that performs color signal generation processing, matrix conversion processing, and other various digital processing.
  • color image data is generated by processing the digitized image signal.
  • An LCD display unit 40 is connected, and a memory card 42 such as a CompactFlash (CF) card or SmartMedia is connected via a card interface (IF) 41.
  • the LCD display section 40 performs display based on color image data, and the memory card 42 stores color image data.
  • the memory card 42 can be loaded into an external personal computer 60. Then, the image recorded on the memory card 42 can be displayed on the personal computer 60 and image processing can be performed. Further, the image recorded on the memory card 42 can be printed out by a printer (not shown).
  • the half mirror 31 is configured to partially reflect an incident subject image, and thus guides reflected light to the AF sensor module 45.
  • the AF sensor module 45 performs focus detection based on a light beam that has entered through the photographing lens group 21.
  • the AF sensor module 45 includes a separator lens 46 for dividing a pupil and an AF sensor 47 including a line sensor.
  • The system controller 50, which includes a CPU and the like, controls each part in the camera body 1 and the lens unit 5 in an integrated manner.
  • To the system controller 50, a lens driver 24, an exposure control driver 25, a CCD driver 35, a pre-process circuit 36, a digital process circuit 39, an AF sensor module 45, an operation switch section 52, an operation display unit 53, a nonvolatile memory (EEPROM) 51, and a stereo switching switch (SW) 54 are connected.
  • The operation switch unit 52 includes various switches such as a release switch and a shooting mode setting switch.
  • the operation display unit 53 is a display unit for displaying an operation state, a mode state, and the like of the camera.
  • the EEPROM 51 is a memory for storing various setting information and the like.
  • The stereo switching switch 54 is a switch for switching modes when the stereo adapter 10 is mounted on the lens unit 5. Note that, although the shooting mode is switched here by operating the stereo switching switch 54, the switching is not limited to this. For example, a detection function may be provided in the stereo adapter 10 so that the shooting mode is switched automatically.
  • the system controller 50 controls the driving of the CCD color image sensor 34 by the exposure control mechanism 22 and the CCD driver 35 to perform exposure (charge accumulation) and signal reading.
  • the system controller 50 supplies the output of the CCD 34 to the digital process circuit 39 via the pre-process circuit 36 to perform various kinds of signal processing, and records the signals on the memory card 42 via the card interface 41.
  • The strobe 57 emits flash light and is controlled by the system controller 50 via the exposure control driver 25 in the lens unit 5.
  • the system controller 50 further includes an exposure control unit 50d and a photometric area setting unit 50e.
  • the exposure control unit 50d analyzes the subject image signal relating to the photometric area and calculates exposure information necessary for exposure control.
  • the photometric area setting unit 50e sets a photometric area for the exposure control unit 50d.
  • the system controller 50 further includes a metadata generation unit 50a, a stereo image generation unit 50b, and an image file generation unit 50c.
  • The metadata generation unit 50a, serving as accompanying data generating means, generates various types of accompanying data (hereinafter referred to as metadata) related to the captured image and provides them to the image file generation unit 50c.
  • the metadata generation unit 50a generates information on the shooting date and time, and when the shot image is a stereoscopic image, generates various information (stereoscopic image information) related to the stereoscopic image.
  • Examples of the stereoscopic image information include trimming information.
  • The trimming information is information indicating the effective image frame of each monocular image, and indicates the area used for recognition of the fused image during reproduction.
  • The stereo image generation unit 50b is configured to generate a stereo image based on a plurality of input monocular images. For example, when a single image in which a plurality of monocular images are arranged is formed on the imaging surface of the CCD 34, the stereo image generation unit 50b generates a stereo image using the input image as it is. When a plurality of monocular images are captured and input separately, the stereo image generation unit 50b generates a stereo image based on each of the input monocular images (see the sketch below).
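  • As a minimal illustration of generating a stereo (integrated) image from two separately input monocular images, the following NumPy sketch (an assumption for illustration; the patent does not prescribe an implementation) places two equal-sized left and right monocular images side by side.

```python
import numpy as np

def make_integrated_image(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Arrange two equal-sized monocular images (H x W x 3) side by side into one integrated image."""
    if left.shape != right.shape:
        raise ValueError("monocular images must have the same shape")
    return np.hstack((left, right))          # integrated image of size H x 2W x 3

# Usage with dummy data: two 480x640 RGB frames become one 480x1280 integrated image.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.zeros((480, 640, 3), dtype=np.uint8)
assert make_integrated_image(left, right).shape == (480, 1280, 3)
```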
  • the image file generation unit 50c can convert the stereo image generated by the stereo image generation unit 50b into an electronic image file in a predetermined format and output the electronic image file.
  • For example, the image file generation unit 50c converts one integrated image into an image format such as the JPEG standard or the TIFF standard to form one image file.
  • Alternatively, the image file generation unit 50c allocates each monocular image to a separate page in the TIFF format to form one image file.
  • the image file generating unit 50c performs a compression process on the stereo image as necessary, and converts the stereo image into a digital image file of a predetermined format with attached data (metadata).
  • FIG. 2 is a block diagram showing a specific configuration of the image file generation unit 50c in FIG.
  • FIG. 3 is an explanatory diagram illustrating an example of an image file generated by the image file generation unit 50c.
  • Image data of a stereo image is input to the compression unit 71 of the image file generation unit 50c.
  • the compression section 71 performs predetermined compression processing on the image data of the stereo image and outputs the data to the data synthesis section 72.
  • the metadata is also input to the data combining unit 72, and the data combining unit 72 combines the image data and the metadata.
  • the image file generation unit 50c includes a metadata date and time setting unit 73 and an image file date and time setting unit 74.
  • the metadata date and time setting unit 73 can output information on the date and time of creation of metadata regarding various types of stereoscopic image information to the data synthesizing unit 72.
  • the image file date / time setting section 74 can output information on the date / time of creation of the image file to the data synthesizing section 72. Both the metadata date and time setting unit 73 and the image file date and time setting unit 74 set the date and time when the image file was last updated, that is, in this case, the same date and time.
  • the data synthesizing unit 72 synthesizes the input data and supplies the synthesized data to the format unit 75.
  • the format unit 75 arranges the input data according to a predetermined image format and outputs the data as one image file.
  • the image file has a header section and an image data section.
  • image data compressed by the compression section 71 is arranged in the image data section.
  • the header section contains the file name of the image file, the file creation (update) date and time, and also metadata.
  • the file creation (update) date and time information is set by the image file date and time setting unit 74.
  • the metadata includes metadata identification information, information on the shooting date and time of the image, and information on the stereoscopic image.
  • the information on the shooting date and time is information generated by the metadata generation unit 50a and included in the metadata input to the data synthesis unit 72.
  • the stereoscopic image information includes various information related to the stereoscopic image when the image data arranged in the image data section is a stereoscopic image.
  • the trimming information is included in the metadata generated by the metadata generation unit 50a and input to the data synthesis unit 72.
  • the metadata creation (update) date and time is information set by the metadata date and time setting unit 73, and includes the metadata creation date and time for various types of stereoscopic image information.
  • The file format in Fig. 3 is one example of an image file. Fig. 3 illustrates an example in which the metadata creation (update) date and time is arranged as a part of the stereoscopic image information, but it may be placed in another area of the metadata.
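  • Expressed as a nested structure, the Fig. 3 layout described above might look as follows (a Python sketch for illustration only; the key names and example values are hypothetical).

```python
from datetime import datetime

now = datetime(2004, 4, 26, 10, 0, 0)       # example timestamp only

image_file = {
    "header": {
        "file_name": "IMG0001.JPG",                  # hypothetical file name
        "file_created_updated": now,                 # set by the image file date/time setting unit 74
        "metadata": {
            "identification": "stereo-metadata",     # metadata identification information
            "shooting_datetime": now,                # from the metadata generation unit 50a
            "stereoscopic_image_info": {
                "trimming_info": (16, 8, 624, 472),  # effective image frame (illustrative values)
                "metadata_created_updated": now,     # set by the metadata date/time setting unit 73
            },
        },
    },
    "image_data": b"...compressed integrated image...",
}
```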
  • The controller 50 has been described as being 3D-compatible. However, if the controller 50 does not support 3D, a predetermined initial value may be set in the description area for the stereoscopic image information in the metadata area, or no information may be described there.
  • FIG. 4 is a flowchart showing an operation flow of the image file generation unit 50c.
  • the subject optical image incident via the stereo adapter 10 forms an image on the imaging surface of the CCD color imaging device 34 via the imaging lens group 21, the exposure control mechanism 22, the half mirror 31, and the filter system 32.
  • The CCD color image sensor 34 outputs data of one image including the left and right monocular images L and R.
  • the image signal from the CCD color image sensor 34 is input to the controller 50 via the pre-processing circuit 36.
  • the stereo image generation unit 50b generates an integrated image in which monocular images are arranged on the left and right based on the input image signal.
  • the metadata generation unit 50a generates shooting date / time information and stereoscopic image information regarding the generated integrated image.
  • In step S1 of Fig. 4, the image file generation unit 50c performs a predetermined compression process on the generated integrated image (stereo image).
  • the metadata date and time setting unit 73 in the image file generation unit 50c generates information on the date and time of metadata creation regarding the stereoscopic image information generated by the metadata generation unit 50a.
  • the metadata date / time setting unit 73 may set the current time as the metadata creation date / time.
  • the image file date and time setting unit 74 in the image file generation unit 50c generates the same date and time as the metadata creation date and time as the information of the image file creation date and time.
  • The image file generation unit 50c combines the stereo image, the metadata, the metadata creation date and time, and the image file creation date and time information in the data combining unit 72 (step S4), and generates an image file having the data format shown in FIG. 3 (step S5).
  • the image file generated by the image file generation unit 50c is provided to the digital process circuit 39.
  • The digital process circuit 39 can display the integrated image on the display screen of the LCD 40 based on the input electronic image file. The digital process circuit 39 can also provide the input electronic image file to the memory card 42 via the card IF 41 so that it is recorded.
  • In this way, as long as the image file is generated by the image processing apparatus of the present embodiment, the information on the creation date and time of the image file and the information on the creation date and time of the metadata are made to match.
  • As described above, the image file generated by the image file generation unit contains not only the creation (update) date and time of the image file itself but also the creation (update) date and time of the metadata relating to the stereoscopic image information.
  • Since the creation (update) date and time of the metadata relating to the stereoscopic image information is arranged separately from the creation (update) date and time of the image file, the creation (update) of the image file and the creation (update) of the metadata relating to the stereoscopic image information can be managed separately.
  • In the present embodiment, the creation (update) date and time of the metadata is made to match the creation (update) date and time of the image file. However, since the actual creation (update) date and time of the metadata and that of the image file substantially match, or their difference is relatively small, the actual creation (update) dates and times may instead be set as they are. In this case, whether or not the image file has been created (updated) by a 3D-incompatible device or software can be determined by checking whether the difference between these dates and times is smaller than a threshold value, as sketched below.
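  • A minimal sketch of such a threshold-based determination (Python; the threshold value is an illustrative assumption, not one specified by the patent):

```python
from datetime import datetime, timedelta

def created_by_3d_incompatible(file_dt: datetime, metadata_dt: datetime,
                               threshold: timedelta = timedelta(seconds=2)) -> bool:
    """Return True if the gap between the two recorded dates suggests a 3D-incompatible update."""
    return abs(file_dt - metadata_dt) > threshold

# Example: the file was re-saved hours after the metadata was last written.
print(created_by_3d_incompatible(datetime(2004, 4, 26, 18, 0), datetime(2004, 4, 26, 10, 0)))  # True
```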
  • In the above description, an image file is generated by an image processing apparatus having the image file generation unit.
  • However, the image file generation unit may obviously also be implemented by software having the same functions. That is, by using a computer that can execute a program having the same functions as the flowchart of FIG. 4, an image file generation unit similar to that of FIG. 2 can be realized.
  • In this case, the image file date and time setting unit 74 may be implemented by the operating system of the computer or the like; in general, the creation (update) date and time of an image file is automatically recorded when the file is created (updated).
  • In the above description, data such as the file name and the file creation (update) date and time have been described as being arranged in the header of the image file; however, such data may also be placed in a directory entry area managed by the file system.
  • Although an example in which the present invention is applied to an electronic camera has been described, the invention can also be applied to a stand-alone image processing apparatus that processes an image captured by an electronic camera, and similar functions can be achieved by a program on a personal computer or the like that processes a captured image.
  • FIG. 5 is a block diagram showing an image processing / editing apparatus according to the second embodiment of the present invention
  • FIG. 6 is a flowchart showing a processing flow of image file generation. This embodiment is applied to an image file editing apparatus. It should be noted that this embodiment can also be realized by software having the same functions as those of the apparatus in FIG. 5.
  • An image file having the data format shown in Fig. 3 is input to the decoding unit 81.
  • the decoding unit 81 decodes the input image file and extracts various data included in the image file.
  • the decoding unit 81 gives the image data in the image file to the image decompression unit 82, and gives the metadata to the metadata processing unit 84.
  • the image expanding section 82 expands the input image data to obtain image data before compression. This image data is provided to the image processing / editing unit 83.
  • The image processing/editing unit 83 performs processing/editing on the input image according to the user's processing/editing operation (step S11), and outputs the processed/edited image to the image file generation unit 86. The image processing/editing unit 83 can perform the processing/editing with reference to the metadata of the image to be processed/edited.
  • The metadata processing unit 84 updates the metadata based on the processing/editing performed by the image processing/editing unit 83. For example, when the processing/editing performed by the image processing/editing unit 83 is an enlargement process for a stereoscopic image, the metadata processing unit 84 updates the trimming information so that it indicates the optimal cutout range corresponding to the enlargement process.
  • the image file generating unit 86 combines the image data from the image processing / editing unit 83 and the metadata from the metadata processing unit 84 to generate an image file in the file format shown in FIG.
  • the date and time updating section 87 in the image file generating section 86 is given the information on the current time from the timer 88 and sets the information on the file updating date and time (step S12).
  • the image file generating unit 86 rewrites the file creation (update) date and time (see FIG. 3) in the file format to the information of the update date and time. Further, the image file generation unit 86 rewrites the metadata creation (update) date and time (see FIG. 3) in the file format to information of the same date and time as the file creation (update) date (step S13). In this way, the image file generating unit 86 generates and outputs an image file in which the file creation (update) date and the metadata creation (update) date are changed to the same date and time (step S14).
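  • A minimal sketch of steps S12 to S14 (Python; the dictionary keys are hypothetical and only illustrate rewriting both dates to the same current time when saving the edited file):

```python
from datetime import datetime

def save_after_editing(image_file: dict, edited_image: bytes, updated_metadata: dict) -> dict:
    """Rewrite both date fields to the same current time when saving an edited stereo image file."""
    now = datetime.now()                                   # from the timer 88 via the date/time updating unit 87
    image_file["image_data"] = edited_image
    updated_metadata["stereoscopic_image_info"]["metadata_created_updated"] = now  # metadata date (step S13)
    image_file["header"]["metadata"] = updated_metadata
    image_file["header"]["file_created_updated"] = now     # file creation (update) date and time (step S13)
    return image_file                                      # step S14: output the rewritten image file
```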
  • As described above, the metadata is updated along with the image processing/editing, and its update date and time is changed to the same current date and time as the image file creation (update) date and time.
  • In this way, as long as processing/editing is performed using the editing apparatus according to the present embodiment, the information on the creation (update) date and time of the image file and the creation (update) date and time of the metadata always match.
  • the information on the update date and time of the metadata and the update date and time of the image file need not always be based on the actual update date and time as long as they are the same date and time.
  • FIG. 7 is a block diagram showing an image file reproducing device according to the third embodiment of the present invention
  • FIG. 8 is a flowchart showing a processing flow of an image file.
  • This embodiment is applied to an image file reproducing apparatus. It should be noted that this embodiment can also be implemented by software having the same functions as the device in FIG. 7.
  • An image file having the file format shown in Fig. 3 is input to the decoding unit 91.
  • the decoding unit 91 decodes the input image file and extracts various data included in the image file.
  • the decoding unit 91 gives the image data in the image file to the image decompression unit 92, and gives the metadata to the switching unit 96, the metadata invalidation unit 95, and the date / time determination unit 94.
  • the image expansion unit 92 expands the input image data to obtain image data before compression. This image data is provided to the reproduction processing unit 93.
  • the date and time determination unit 94 acquires information on the date and time of file creation (update) (see FIG. 3) and information on the date and time of metadata creation (update) (step S 21). The date and time determination unit 94 determines whether the acquired file creation (update) date and time and the metadata creation (update) date and time match or not (step S22). The date and time determination unit 94 outputs the determination result to the switching unit 96 and the warning unit 97.
  • Note that the date and time determination unit 94 may determine that the two pieces of date and time information match when their difference is within a predetermined threshold. However, considering that erroneous determinations may occur depending on the setting of the threshold value, it is preferable to determine a match only when they are exactly the same.
  • the metadata invalidating unit 95 invalidates the input metadata (step S24). For example, the metadata invalidating unit 95 invalidates the metadata relating to the stereoscopic image by deleting the parameter or setting the parameter to an initial value.
  • The switching unit 96 selectively supplies either the metadata from the decoding unit 91 or the metadata from the metadata invalidating unit 95 to the reproduction processing unit 93 based on the determination result of the date and time determination unit 94. For example, when the switching unit 96 receives a determination result indicating that the dates and times match, it provides the metadata from the decoding unit 91 to the reproduction processing unit 93; when a determination result indicating that the dates and times do not match is given, the reproduction process is prohibited (step S24).
  • the reproduction processing unit 93 performs predetermined reproduction processing on the image data using the metadata from the switching unit 96 (step S25).
  • Alternatively, instead of prohibiting reproduction, the metadata set by the metadata invalidation unit 95 may be given to the reproduction processing unit 93.
  • the warning unit 97 issues a warning to the user, for example, by displaying a warning or the like.
  • As described above, in the present embodiment, for an image file in which the creation (update) dates and times of the file and of the metadata are described independently, it is judged that there is no correlation between the image data and the metadata when the dates and times do not match, and the reproduction process is either prohibited or performed using invalidated metadata.
  • This prevents the reproduction process from being performed using metadata that has no correlation with the image data.
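  • A minimal sketch of this reproduction-side check (Python; function and key names are hypothetical and only illustrate the flow of FIG. 8):

```python
def invalidate(metadata: dict) -> dict:
    # metadata invalidating unit 95: reset the stereoscopic parameters to an initial value
    cleared = dict(metadata)
    cleared["stereoscopic_image_info"] = None
    return cleared

def render(image_data: bytes, metadata: dict):
    return (image_data, metadata)            # placeholder for the reproduction processing unit 93

def reproduce(image_file: dict, strict: bool = True):
    """Use the metadata only when the two recorded dates match; otherwise warn and invalidate or stop."""
    header = image_file["header"]
    metadata = header["metadata"]
    stereo_info = metadata["stereoscopic_image_info"]
    if header["file_created_updated"] == stereo_info["metadata_created_updated"]:
        return render(image_file["image_data"], metadata)            # step S25: normal reproduction
    print("warning: metadata may not correspond to the image data")  # warning unit 97
    if strict:
        return None                                                  # step S24: prohibit reproduction
    return render(image_file["image_data"], invalidate(metadata))    # or reproduce with invalidated metadata
```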
  • FIG. 9 is a block diagram showing an electronic camera in which the image processing device according to the fourth embodiment of the present invention is incorporated. This embodiment is applied to an image file recording device. In FIG. 9, the same components as those in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
  • the camera main body 101 of the present embodiment is different from the camera main body 1 of the first embodiment only in that a system controller 150 is employed instead of the system controller 50.
  • the system controller 150 differs from the system controller 50 in that a metadata generation unit 150a and an image file generation unit 150c are used instead of the metadata generation unit 50a and the image file generation unit 50c, respectively.
  • the metadata generation unit 150a as the associated data generation unit generates various types of associated data (hereinafter, referred to as metadata) related to the captured image and supplies the generated data to the image file generation unit 150c.
  • the metadata generation unit 150a generates information on the shooting date and time, and when the shot image is a stereoscopic image, generates various information (stereoscopic image information) related to the stereoscopic image.
  • the stereoscopic image information for example, there is an optimum trimming position which is trimming information.
  • the information on the optimum trimming position is information indicating an effective image frame of each monocular image, and indicates an area used for recognition of a fused image during reproduction.
  • the stereoscopic image information there is also information on the maximum pop-out amount that defines the maximum value of the sense of depth of the stereoscopic image.
  • The image file generation unit 150c can convert the stereo image generated by the stereo image generation unit 50b into an electronic image file in a predetermined format and output the electronic image file.
  • For example, the image file generation unit 150c converts one integrated image into an image format such as the JPEG standard or the TIFF standard to form one image file.
  • the image file generating unit 150c allocates each monocular image to each page of the TIFF format to form one image file.
  • the image file generating unit 150c performs a compression process on the stereo image as necessary, and converts the stereo image into a digital image file of a predetermined format with attached data (metadata).
  • FIG. 10 is a block diagram showing a specific configuration of the image file generating unit 150c in FIG.
  • FIG. 11 is an explanatory diagram showing an example of an image file generated by the image file generation unit 150c.
  • Image data of a stereo image is input to the compression unit 171 of the image file generation unit 150c.
  • the compression section 171 performs predetermined compression processing on the image data of the stereo image, and outputs the data to the data synthesis section 172.
  • the metadata is also input to the data combining unit 172, and the data combining unit 172 combines the image data and the metadata.
  • image file generating section 150c includes metadata size setting section 173 and image file size setting section 174.
  • The metadata size setting unit 173 can output information on the image size (number of pixels) of the stereoscopic image, i.e., the 3D image size, to the data synthesizing unit 172.
  • the image file size setting section 174 can output information on the image size (the number of pixels) of the image based on the image data to the data synthesizing section 172.
  • The metadata size setting unit 173 and the image file size setting unit 174 set image size information of an image based on the same image data. Therefore, when the image data is data of a stereoscopic image, the outputs of these setting units 173 and 174 become information of the same image size.
  • the data synthesizing unit 172 synthesizes the input data and supplies the synthesized data to the format unit 175.
  • the format unit 175 arranges the input data according to a predetermined image format and outputs the data as one image file.
  • the image file has a header part and an image data part.
  • In the image data section, image data compressed by the compression section 171 is arranged.
  • In the header part, information on the file name of the image file and the image size is arranged, and metadata is also arranged.
  • the image size information is set by the image file size setting unit 174.
  • the metadata includes metadata identification information, information on the shooting date and time of the image, and information on the stereoscopic image.
  • the information on the shooting date and time is information generated by the metadata generation unit 150a and included in the metadata input to the data synthesis unit 172.
  • the three-dimensional image information includes various information related to the three-dimensional image.
  • the information on the optimal trimming position and the information on the maximum pop-out amount are generated by the metadata generation unit 150a and are included in the metadata input to the data synthesis unit 172.
  • the information on the 3D image size is information set by the metadata size setting unit 173, and indicates the image size of a stereoscopic image based on the image data.
  • the information of the 3D image size recorded as metadata and the information of the image size recorded in the header part of the image file have the same contents.
  • The file format in Fig. 11 is one example of an image file. Fig. 11 illustrates an example in which the 3D image size is arranged as a part of the stereoscopic image information, but it may be placed in another area of the metadata.
  • The controller 150 has been described as being 3D-compatible. However, if the controller 150 is not 3D-compatible, a predetermined initial value may be set in the description area for the stereoscopic image information in the metadata area, or no information may be described there.
  • FIG. 12 is a flowchart showing the operation flow of the image file generation unit 150c.
  • the subject optical image incident via the stereo adapter 10 forms an image on the imaging surface of the CCD color imaging device 34 via the imaging lens group 21, the exposure control mechanism 22, the half mirror 31, and the filter system 32.
  • The CCD color image sensor 34 outputs data of one image including the left and right monocular images L and R.
  • the image signal from the CCD color image sensor 34 is input to the controller 150 via the pre-processing circuit 36.
  • The stereo image generation unit 50b generates an integrated image in which monocular images are arranged on the left and right based on the input image signal.
  • the metadata generation unit 150a generates shooting date and time information and stereoscopic image information relating to the generated integrated image.
  • In step S101 of Fig. 12, the image file generating unit 150c performs a predetermined compression process on the generated integrated image (stereo image).
  • The metadata size setting unit 173 in the image file generation unit 150c generates information on the image size of the stereoscopic image stored in the image data section (the 3D image size).
  • In step S103, the image file size setting section 174 in the image file generation section 150c generates information on the image size of the image stored in the image data section. That is, in this case, the 3D image size set by the metadata size setting unit 173 and the image size set by the image file size setting unit 174 have the same value.
  • The image file generation unit 150c combines the stereo image, the metadata, the 3D image size, and the image size information in the data combining unit 172 (step S104), and generates an image file having the data format shown in FIG. 11 (step S105).
  • the image file generated by the image file generation unit 150c is provided to the digital process circuit 39.
  • the digital process circuit 39 can display the integrated image on the display screen of the LCD 40 based on the input electronic image file.
  • the digital process circuit 39 can also provide the input electronic image file to the memory card 42 via the card IF 41 and record it.
  • As described above, in the image file generated by the image file generation unit, the 3D image size is arranged in the metadata in addition to the image size information of the image based on the image data in the image file.
  • That is, the information on the image size of the stereoscopic image is arranged separately from the image size information in the header part of the image file, and these pieces of information can be managed separately. When an image file is created (updated) by a 3D-incompatible device or software, the 3D image size in the metadata is not generated or updated; therefore, by comparing these pieces of image size information, it can be determined whether the image file has been created (updated) by a 3D-incompatible device or software, as sketched below.
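  • A minimal sketch of that comparison (Python; the sizes are illustrative values, not from the patent):

```python
from typing import Tuple

def edited_by_3d_incompatible(header_image_size: Tuple[int, int],
                              metadata_3d_image_size: Tuple[int, int]) -> bool:
    """A 3D-incompatible tool resizes the image and the header size but leaves the 3D image size stale."""
    return header_image_size != metadata_3d_image_size

# Example: the image was enlarged to 1280x960 without the metadata being updated.
print(edited_by_3d_incompatible((1280, 960), (640, 480)))   # True
```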
  • In the above description, an image file is generated by an image processing apparatus having the image file generation unit. However, the image file generation unit may obviously also be implemented by software having the same functions. That is, by using a computer capable of executing a program having the same functions as the flowchart of FIG. 12, an image file generation unit similar to that of FIG. 10 can be realized.
  • In this case, the image file size setting unit 174 may be implemented by the operating system of the computer or the like; in general, the image size in an image file is automatically described when the file is created (updated).
  • Although an example in which the present invention is applied to an electronic camera has been described, the invention can also be applied to a stand-alone image processing apparatus that processes an image captured by an electronic camera, and similar functions can be achieved by a program on a personal computer or the like that processes a captured image.
  • FIG. 13 is a block diagram showing an image processing/editing apparatus according to the fifth embodiment of the present invention.
  • FIG. 14 is a flowchart showing the processing flow of image file generation in this embodiment. This embodiment is applied to the image processing/editing apparatus shown in FIG. 13. Note that this embodiment can also be realized by software having the same functions as those of the apparatus in FIG. 13.
  • An image file having the data format shown in Fig. 11 is input to the decoding unit 181.
  • The decoding unit 181 decodes the input image file and extracts the various data included in the image file.
  • The decoding unit 181 gives the image data in the image file to the image decompression unit 182, and gives the metadata to the metadata processing unit 184.
  • The image decompression unit 182 decompresses the input image data to obtain the image data before compression. This image data is provided to the image processing/editing unit 183.
  • The image processing/editing unit 183 processes and edits the input image data according to the user's processing/editing operation (step S111), and outputs the processed/edited image data to the image file generation unit 186.
  • The image processing/editing unit 183 can perform the processing/editing with reference to the metadata of the image data to be processed/edited.
  • The metadata processing unit 184 updates the metadata based on the processing/editing performed by the image processing/editing unit 183. For example, when the editing process of the image processing/editing unit 183 is a trimming process for a stereoscopic image, the metadata processing unit 184 determines an optimum cutout range corresponding to the trimming process, and updates the information on the trimming position and the information on the maximum protrusion amount.
  • The image file generation unit 186 combines the image data from the image processing/editing unit 183 with the metadata from the metadata processing unit 184 to generate an image file in the file format shown in FIG. 11.
  • The size updating unit 187 in the image file generation unit 186 has the same functions as the metadata size setting unit 173 and the image file size setting unit 174 in FIG. 10, and generates the image size information of the processed/edited image (step S112).
  • The image file generation unit 186 rewrites the image size information in the image file of this file format (see FIG. 11) with the information generated by the size updating unit 187, and further rewrites the 3D image size information (see FIG. 11) in the metadata of the file format with the information generated by the size updating unit 187 (step S113).
  • The image file generation unit 186 then generates and outputs an image file in which the image size in the image file and the 3D image size in the metadata have been changed to the same information (step S114).
  • In this manner, the metadata is updated along with the image processing/editing.
  • As a result, the image size in the image file and the 3D image size information in the metadata remain matched as long as the image is processed and edited using the processing/editing apparatus according to the present embodiment. A simplified sketch of this size-updating step is shown below.
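As a rough illustration of how the size updating unit keeps the two sizes consistent after a trimming edit, the sketch below crops a decoded image and rewrites both the header image size and the 3D image size in the metadata with the same new value. The dictionary-based in-memory representation and key names are assumptions made for illustration and are not the file format of Fig. 11.

```python
# Minimal sketch of the size-updating step of the fifth embodiment: after a
# trimming (cropping) edit, both the header image size and the 3D image size in
# the stereo metadata are rewritten to the new, identical value (steps S112-S113).
from copy import deepcopy


def trim_and_update_sizes(image_file: dict, left: int, top: int,
                          new_width: int, new_height: int) -> dict:
    """Crop the decoded image and keep header size and 3D image size consistent."""
    edited = deepcopy(image_file)
    pixels = edited["pixels"]                      # decoded rows of pixel values
    edited["pixels"] = [row[left:left + new_width]
                        for row in pixels[top:top + new_height]]
    # Regenerate the image size information of the processed/edited image.
    edited["header"]["width"], edited["header"]["height"] = new_width, new_height
    edited["metadata"]["stereo"]["3d_image_width"] = new_width
    edited["metadata"]["stereo"]["3d_image_height"] = new_height
    return edited


if __name__ == "__main__":
    src = {"header": {"width": 8, "height": 4},
           "metadata": {"stereo": {"3d_image_width": 8, "3d_image_height": 4}},
           "pixels": [[0] * 8 for _ in range(4)]}
    out = trim_and_update_sizes(src, left=2, top=1, new_width=4, new_height=2)
    assert out["header"]["width"] == out["metadata"]["stereo"]["3d_image_width"] == 4
    print("sizes kept consistent:", out["header"], out["metadata"]["stereo"])
```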
  • FIG. 15 is a block diagram showing an image file reproducing apparatus according to the sixth embodiment of the present invention.
  • FIG. 16 is a flowchart showing the reproduction processing flow of an image file. This embodiment is applied to the image file reproducing apparatus shown in FIG. 15. Note that this embodiment can also be realized by software having the same functions as the apparatus in FIG. 15.
  • An image file having the file format shown in Fig. 11 is input to the decoding unit 191.
  • The decoding unit 191 decodes the input image file and extracts the various data included in the image file.
  • The decoding unit 191 gives the image data in the image file to the image decompression unit 192, and gives the metadata to the switching unit 196, the metadata invalidating unit 195, and the size determination unit 194.
  • The image decompression unit 192 decompresses the input image data to obtain the image data before compression. This image data is provided to the reproduction processing unit 193.
  • The size determination unit 194 acquires the image size information (see Fig. 11) in the image file and the 3D image size information in the metadata (step S121), and determines whether the image size in the file matches the 3D image size in the metadata (step S122). The size determination unit 194 outputs the determination result to the switching unit 196 and the warning unit 197.
  • When the sizes do not match, the metadata invalidating unit 195 invalidates the input metadata (step S124).
  • For example, the metadata invalidating unit 195 deletes the stereoscopic-image-related parameters of the metadata, or performs the invalidation by resetting them to their initial values.
  • The switching unit 196 selectively supplies either the metadata from the decoding unit 191 or the metadata from the metadata invalidating unit 195 to the reproduction processing unit 193, based on the determination result of the size determination unit 194. For example, when the determination result indicates that the image sizes match, the switching unit 196 supplies the metadata from the decoding unit 191 to the reproduction processing unit 193.
  • The reproduction processing unit 193 uses the metadata from the switching unit 196 to perform predetermined reproduction processing on the image data (step S125). When a determination result indicating that the image sizes do not match is given by the size determination unit 194, the metadata set by the metadata invalidating unit 195 may be given to the reproduction processing unit 193 instead.
  • In this case, the warning unit 197 issues a warning to the user, for example, by displaying a warning message or the like.
  • As described above, when the image size in the image file and the 3D image size in the metadata do not match, the reproduction process is prohibited because no correlation between the image data and the metadata is guaranteed, or the reproduction process is performed using the invalidated metadata. This makes it possible to prevent reproduction processing from being performed using metadata that can no longer be correlated with the image data after the image has been processed or edited by a 3D-incompatible device or software. A simplified sketch of this reproduction-side check is given below.
  • Although the stereo system has been described above as a binocular system corresponding to the left and right eyes, the same of course applies to a general multi-view stereo system having three or more viewpoints.
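The sketch below is a minimal, assumption-laden illustration of the reproduction-side check: the header image size is compared with the 3D image size in the metadata; on a mismatch the stereo-related metadata is invalidated (here, simply removed) and a warning is issued, while on a match the metadata is used unchanged. The key names and the use of Python's warnings module stand in for the size determination, metadata invalidating, switching, and warning units; this is not the actual implementation of the embodiment.

```python
# Minimal sketch of the reproduction-side size check (steps S121-S125): compare
# the image size in the file header with the 3D image size in the metadata, and
# invalidate the stereo metadata plus warn the user when they do not match.
import warnings


def reproduce_image_file(image_file: dict) -> dict:
    """Return the metadata actually used for reproduction after the size check."""
    header = image_file["header"]
    stereo = image_file["metadata"].get("stereo", {})
    sizes_match = (header["width"] == stereo.get("3d_image_width") and
                   header["height"] == stereo.get("3d_image_height"))
    if sizes_match:
        effective_metadata = image_file["metadata"]   # switching: use original metadata
    else:
        # Metadata invalidation: drop the stereo-related parameters, then warn.
        effective_metadata = {k: v for k, v in image_file["metadata"].items()
                              if k != "stereo"}
        warnings.warn("3D metadata does not match the image data; "
                      "reproducing without stereoscopic information.")
    # A reproduction processing step would render the image here using effective_metadata.
    return effective_metadata


if __name__ == "__main__":
    edited_by_2d_tool = {"header": {"width": 4, "height": 2},
                         "metadata": {"stereo": {"3d_image_width": 8,
                                                 "3d_image_height": 4}}}
    print(reproduce_image_file(edited_by_2d_tool))
```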

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Television Signal Processing For Recording (AREA)
PCT/JP2005/007866 2004-04-26 2005-04-26 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法 WO2005115016A1 (ja)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN200580013159.6A CN1947431B (zh) 2004-04-26 2005-04-26 图像处理装置、图像加工/编辑装置、图像文件再现装置、图像处理方法、图像加工/编辑方法及图像文件再现方法
EP05737293.0A EP1742488B1 (de) 2004-04-26 2005-04-26 Vorrichtung und Verfahren zur Bilddateiwiedergabe
US11/586,079 US8155431B2 (en) 2004-04-26 2006-10-24 Image file processing apparatus which generates an image file to include stereo image data, collateral data related to the stereo image data, information of a date and time at which the collateral data is updated, and information of a date and time at which the image file is generated or updated, and corresponding image file processing method
US13/412,082 US8693764B2 (en) 2004-04-26 2012-03-05 Image file processing apparatus which generates an image file to include stereo image data and collateral data related to the stereo image data, and information related to an image size of the stereo image data, and corresponding image file processing method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004-130127 2004-04-26
JP2004130128A JP4642375B2 (ja) 2004-04-26 2004-04-26 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法
JP2004130127A JP4589651B2 (ja) 2004-04-26 2004-04-26 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法
JP2004-130128 2004-04-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/586,079 Continuation US8155431B2 (en) 2004-04-26 2006-10-24 Image file processing apparatus which generates an image file to include stereo image data, collateral data related to the stereo image data, information of a date and time at which the collateral data is updated, and information of a date and time at which the image file is generated or updated, and corresponding image file processing method

Publications (1)

Publication Number Publication Date
WO2005115016A1 true WO2005115016A1 (ja) 2005-12-01

Family

ID=35428693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/007866 WO2005115016A1 (ja) 2004-04-26 2005-04-26 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法

Country Status (3)

Country Link
US (2) US8155431B2 (de)
EP (2) EP1742488B1 (de)
WO (1) WO2005115016A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7929027B2 (en) 2006-12-27 2011-04-19 Fujifilm Corporation Image management method

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007122643A (ja) * 2005-10-31 2007-05-17 Toshiba Corp データ検索システム、メタデータ同期方法およびデータ検索装置
JP4993578B2 (ja) * 2007-01-15 2012-08-08 オリンパスイメージング株式会社 画像ファイル再生装置,画像ファイル加工編集装置
US7788267B2 (en) * 2007-02-26 2010-08-31 Seiko Epson Corporation Image metadata action tagging
DE202007010389U1 (de) * 2007-07-24 2007-09-27 Maier, Florian Vorrichtung zur automatischen Positionierung von gekoppelten Kameras zur plastischen Bilddarstellung
JP4672764B2 (ja) * 2007-10-03 2011-04-20 富士フイルム株式会社 誤消去判断装置、方法及びプログラム
JP4913085B2 (ja) * 2008-03-04 2012-04-11 オリンパスイメージング株式会社 マルチ画像ファイル編集装置、マルチ画像ファイル編集プログラム、マルチ画像ファイル編集方法
KR101547151B1 (ko) * 2008-12-26 2015-08-25 삼성전자주식회사 영상 처리 방법 및 장치
JP5266126B2 (ja) * 2009-03-31 2013-08-21 富士フイルム株式会社 画像表示装置および方法並びにプログラム
JP2010268184A (ja) * 2009-05-14 2010-11-25 Hoya Corp 撮像装置
JP5577623B2 (ja) * 2009-05-14 2014-08-27 リコーイメージング株式会社 撮像装置
US9524700B2 (en) * 2009-05-14 2016-12-20 Pure Depth Limited Method and system for displaying images of various formats on a single display
JP5604173B2 (ja) * 2010-04-30 2014-10-08 三洋電機株式会社 再生装置、表示装置、記録装置及び格納媒体
US9402065B2 (en) 2011-09-29 2016-07-26 Qualcomm Incorporated Methods and apparatus for conditional display of a stereoscopic image pair
US9152646B2 (en) * 2013-04-05 2015-10-06 Dropbox, Inc. Ordering content items
JP6498064B2 (ja) * 2015-07-27 2019-04-10 キヤノン株式会社 電子機器及びその制御方法
JP6991830B2 (ja) * 2017-10-25 2022-01-13 オリンパス株式会社 画像処理装置、画像処理方法、画像処理プログラム

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11355733A (ja) * 1998-06-04 1999-12-24 Sony Corp データ送受信装置及びデータ送受信方法
JP2001160068A (ja) * 1999-11-12 2001-06-12 Ricoh Co Ltd 文書管理システムにおいて問い合わせを処理するための方法及び装置
WO2001097531A2 (en) 2000-06-12 2001-12-20 Vrex, Inc. Electronic stereoscopic media delivery system
JP2002082775A (ja) * 2000-07-06 2002-03-22 Hitachi Ltd 計算機システム
JP2002165210A (ja) * 2000-08-04 2002-06-07 Matsushita Electric Ind Co Ltd データ送信端末及びデータ送受信装置
US20020071616A1 (en) 2000-08-29 2002-06-13 Olympus Optical Co., Ltd. Method and apparatus of generating three dimensional image data having one file structure and recording the image data on a recording medium, and recording medium for storing the three dimensional image data having one file structure
EP1235143A2 (de) 2000-12-27 2002-08-28 Microsoft Corporation Verfahren und Vorrichtung zur Erzeugung und Erhaltung von versionsspezifischen Eigenschaften in einer Datei
WO2004008768A1 (en) 2002-07-16 2004-01-22 Electronics And Telecommunications Research Institute Apparatus and method for adapting 2d and 3d stereoscopic video signal

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6023277A (en) * 1996-07-03 2000-02-08 Canon Kabushiki Kaisha Display control apparatus and method
CN100338630C (zh) * 1997-12-18 2007-09-19 富士胶片株式会社 图像合成系统、装置和方法、分离方法及客户机
US6708309B1 (en) * 1999-03-11 2004-03-16 Roxio, Inc. Method and system for viewing scalable documents
US7053937B1 (en) * 1999-05-21 2006-05-30 Pentax Corporation Three-dimensional image capturing device and recording medium
US6775665B1 (en) 1999-09-30 2004-08-10 Ricoh Co., Ltd. System for treating saved queries as searchable documents in a document management system
EP1122710B1 (de) * 2000-02-03 2007-01-24 SANYO ELECTRIC Co., Ltd. Bildelement-Taktgenerator für eine Anzeige
US6766430B2 (en) 2000-07-06 2004-07-20 Hitachi, Ltd. Data reallocation among storage systems
US7106885B2 (en) * 2000-09-08 2006-09-12 Carecord Technologies, Inc. Method and apparatus for subject physical position and security determination
JP2002150315A (ja) * 2000-11-09 2002-05-24 Minolta Co Ltd 画像処理装置および記録媒体
JP3945160B2 (ja) * 2000-12-25 2007-07-18 日本電気株式会社 情報提供サーバ、クライアント、情報提供システムの処理方法、及びプログラムを記録した記録媒体
JP2002218506A (ja) 2001-01-18 2002-08-02 Olympus Optical Co Ltd 撮像装置
US7340383B2 (en) * 2001-12-20 2008-03-04 Ricoh Company, Ltd. Control device, method and computer program product for browsing data
US7302118B2 (en) 2002-02-07 2007-11-27 Microsoft Corporation Transformation of images
US6993196B2 (en) 2002-03-18 2006-01-31 Eastman Kodak Company Digital image storage method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11355733A (ja) * 1998-06-04 1999-12-24 Sony Corp データ送受信装置及びデータ送受信方法
JP2001160068A (ja) * 1999-11-12 2001-06-12 Ricoh Co Ltd 文書管理システムにおいて問い合わせを処理するための方法及び装置
WO2001097531A2 (en) 2000-06-12 2001-12-20 Vrex, Inc. Electronic stereoscopic media delivery system
JP2002082775A (ja) * 2000-07-06 2002-03-22 Hitachi Ltd 計算機システム
JP2002165210A (ja) * 2000-08-04 2002-06-07 Matsushita Electric Ind Co Ltd データ送信端末及びデータ送受信装置
US20020071616A1 (en) 2000-08-29 2002-06-13 Olympus Optical Co., Ltd. Method and apparatus of generating three dimensional image data having one file structure and recording the image data on a recording medium, and recording medium for storing the three dimensional image data having one file structure
EP1235143A2 (de) 2000-12-27 2002-08-28 Microsoft Corporation Verfahren und Vorrichtung zur Erzeugung und Erhaltung von versionsspezifischen Eigenschaften in einer Datei
WO2004008768A1 (en) 2002-07-16 2004-01-22 Electronics And Telecommunications Research Institute Apparatus and method for adapting 2d and 3d stereoscopic video signal

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DIGITAL IMAGING GROUP,INC.: ""DIG35 Specification -Metadata for Digital Images-" version 1.0", 30 August 2000 (2000-08-30)
KAWAI T. ET AL: "Rittai Eizo no Non-linear Henshuyo Software no Kaihatsu", THE JOURNAL OF THE INSTITUTE OF IMAGE INFORMATION AND TELEVISION ENGINEERS, 2003, pages 249 - 249, XP002997650 *
See also references of EP1742488A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7929027B2 (en) 2006-12-27 2011-04-19 Fujifilm Corporation Image management method
US8400524B2 (en) 2006-12-27 2013-03-19 Fujifilm Corporation Image management method

Also Published As

Publication number Publication date
EP2442576A2 (de) 2012-04-18
US8155431B2 (en) 2012-04-10
EP2442576A3 (de) 2013-08-21
EP1742488A1 (de) 2007-01-10
US20120163705A1 (en) 2012-06-28
US8693764B2 (en) 2014-04-08
US20070036444A1 (en) 2007-02-15
EP1742488B1 (de) 2014-10-15
EP1742488A4 (de) 2008-10-22

Similar Documents

Publication Publication Date Title
WO2005115016A1 (ja) 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法
JP4993578B2 (ja) 画像ファイル再生装置,画像ファイル加工編集装置
US8743175B2 (en) 3D image file, photographing apparatus, image reproducing apparatus, and image processing apparatus
JP3841630B2 (ja) 画像取り扱い装置
JP4720785B2 (ja) 撮像装置、画像再生装置、撮像方法及びプログラム
US20130113892A1 (en) Three-dimensional image display device, three-dimensional image display method and recording medium
JP2007295547A (ja) デジタルカメラ
WO2005112475A9 (ja) 画像処理装置
JP2009129420A (ja) 画像処理装置および方法並びにプログラム
CN1947431B (zh) 图像处理装置、图像加工/编辑装置、图像文件再现装置、图像处理方法、图像加工/编辑方法及图像文件再现方法
JP2008288798A (ja) 撮像装置
JP4642375B2 (ja) 画像処理装置、画像加工・編集装置、画像ファイル再生装置、画像処理方法、画像加工・編集方法及び画像ファイル再生方法
JP4668602B2 (ja) 立体視画像生成装置および方法並びにプログラム
JP5264426B2 (ja) 撮像装置及びその制御方法、並びにプログラム
US20130120374A1 (en) Image processing device, image processing method, and image processing program
JP2009177316A (ja) 撮像装置
JP5144782B2 (ja) 撮像装置、画像再生装置、撮像方法及びプログラム
JP2012165247A (ja) 画像処理装置、撮影装置および画像処理プログラム
JP2009094727A (ja) 画像記録装置及び画像記録方法
JP2002330450A (ja) 撮像装置
JP2012114549A (ja) 画像処理装置、撮像装置および画像処理プログラム
JP2013021637A (ja) 撮像装置

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005737293

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 11586079

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 200580013159.6

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWP Wipo information: published in national office

Ref document number: 2005737293

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 11586079

Country of ref document: US