WO2012132626A1 - Image display device and image display method - Google Patents

Image display device and image display method

Info

Publication number
WO2012132626A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
image
display
image data
data
Prior art date
Application number
PCT/JP2012/053983
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
学 船田
啓 松岡
Original Assignee
オリンパスイメージング株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパスイメージング株式会社
Publication of WO2012132626A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00323 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a measuring, monitoring or signaling apparatus, e.g. for transmitting measured information to a central location
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N 5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/907 Television signal recording using static stores, e.g. storage tubes or semiconductor memories
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00 Aspects of data communication
    • G09G 2370/04 Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00002 Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
    • H04N 1/00026 Methods therefor
    • H04N 1/00068 Calculating or estimating
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2101/00 Still video cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/0077 Types of the still picture apparatus
    • H04N 2201/0084 Digital still camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3212 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N 2201/3214 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3212 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N 2201/3215 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3225 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N 2201/325 Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3225 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N 2201/3253 Position information, e.g. geographical position at time of capture, GPS data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3274 Storage or retrieval of prestored additional information

Definitions

  • The present invention relates to an image display device and an image display method capable of displaying the photographing position information of an image.
  • When an image is photographed, photographing position information can be recorded in association with the photographed image.
  • By recording the shooting position information in association with the shot image, the user can, for example, search for a desired image using the shooting position information, or have the shooting position displayed on a map image.
  • Japanese Patent Application Laid-Open No. 10-210337 proposes a technique for recording shooting position information even in a situation where positioning by GPS cannot be performed: in that case, the previous positioning data is associated with the photographed image.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an image display device and an image display method that can correctly present photographing position information to the user even when photographing position information was not obtained.
  • To this end, an image display device according to the present invention includes: a determination unit that determines whether or not shooting position information is associated with image data obtained by shooting; a shooting position estimation unit that, when the shooting position information is not associated with the image data, estimates a region including the shooting position at which the image data was obtained, from the shooting date / time information associated with the image data, from position information obtained at timings before and after that shooting date / time, and from date / time information indicating those before-and-after timings; a display data generation unit that generates display data so as to display the shooting position indicated by the shooting position information on a map image when the shooting position information is associated with the image data, and so as to display the estimated region on a map image when it is not; and a display control unit that displays the map image on a display unit based on the generated display data.
  • In an image display method according to the present invention, the determination unit determines whether or not shooting position information is associated with image data obtained by shooting. When the determination unit determines that the shooting position information is not associated with the image data, the shooting position estimation unit estimates a region indicating the shooting position of the image data, from the shooting date / time information associated with the image data, from position information obtained at timings before and after that shooting date / time, and from date / time information indicating those before-and-after timings. When the determination unit determines that the shooting position information is associated with the image data, the display data generation unit generates display data so as to display the shooting position indicated by the shooting position information on a map image; when the determination unit determines that the shooting position information is not associated with the image data, the display data generation unit generates display data so as to display the estimated region on the map image. The display control unit then displays the map image on the display unit based on the generated display data.
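  • Read as a processing pipeline, the claimed units compose roughly as in the sketch below. The unit roles mirror the claim language; the Python names and the placeholder region estimate are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the claimed flow: determine whether image data carries shooting
# position information; if not, estimate a region from the fixes before and
# after the shot; then generate map display data accordingly.
from typing import Optional, Tuple

Position = Tuple[float, float]  # (longitude, latitude)

def estimate_region(image: dict) -> Tuple[Position, float]:
    # Placeholder estimate: midpoint of the fixes before and after the shot.
    # The description refines this with (Equation 1) and (Equation 2) below.
    before, after = image["fix_before"], image["fix_after"]
    center = ((before[0] + after[0]) / 2, (before[1] + after[1]) / 2)
    return center, 0.01  # (center, radius in degrees); radius assumed

def display_image_position(image: dict) -> str:
    position: Optional[Position] = image.get("shooting_position")
    if position is not None:          # determination unit: position present
        return f"map with a pin at {position}"
    region = estimate_region(image)   # shooting position estimation unit
    return f"map with circular region {region}"  # display data generation

print(display_image_position({"shooting_position": (139.7, 35.7)}))
print(display_image_position({"shooting_position": None,
                              "fix_before": (139.6, 35.6),
                              "fix_after": (139.7, 35.7)}))
```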
  • FIG. 1 is a block diagram showing a configuration of a digital camera as an example of an image display apparatus according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing the operation in the photographing mode.
  • FIG. 3 is a diagram illustrating an example of the structure of an image file.
  • FIG. 4 is a flowchart showing the operation in the playback mode.
  • FIG. 5 is a diagram illustrating an example of a GPS log.
  • FIG. 6 is a diagram illustrating an example of the interpolation GPS log.
  • FIG. 7 is a diagram showing a first example of a map image reproduced in one embodiment of the present invention.
  • FIG. 8 is a diagram showing a second example of a map image reproduced in one embodiment of the present invention.
  • FIG. 9 is a diagram showing a third example of the map image reproduced in the embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a display example of a map image in consideration of water depth information.
  • FIG. 11 is a diagram illustrating an example of a GPS log including altitude information.
  • FIG. 12 is a diagram illustrating a display example of a map image in consideration of altitude information.
  • FIG. 1 is a block diagram illustrating a configuration of a digital camera (hereinafter referred to as a camera) as an example of an image display apparatus according to an embodiment of the present invention.
  • The camera 100 illustrated in FIG. 1 has the function of an image display device and also the function of an imaging device.
  • The camera 100 includes an imaging unit 101, an SDRAM 102, an image processing unit 103, a display driver (LCD driver) 104, a display unit (LCD) 105, a display data processing unit 106, a memory interface (I/F) 107, a recording medium 108, a communication unit 109, a gyro sensor 110, a pressure sensor 111, a microcomputer 112, an operation unit 113, a flash memory 114, and an interface (I/F) 115.
  • The imaging unit 101 includes a photographing lens, an image sensor, and an imaging signal processing circuit.
  • The photographing lens is an optical system for condensing an image of a subject (not shown) on the photoelectric conversion surface of the image sensor, and includes a plurality of lenses, such as a focus lens and a zoom lens, and an aperture.
  • The image sensor has a photoelectric conversion surface for receiving the image of the subject condensed by the photographing lens. The photoelectric conversion surface is composed of pixels including photoelectric conversion elements (for example, photodiodes), and a Bayer color filter is arranged on its surface. The image sensor converts the light collected by the photographing lens into an electric signal (image signal).
  • The imaging signal processing circuit performs various analog signal processing, such as CDS processing and AGC processing, on the imaging signal input from the image sensor. The CDS processing removes dark-current noise components from the imaging signal; the AGC processing is amplification for obtaining a desired dynamic range of the imaging signal. The imaging signal processing circuit then converts the analog-processed imaging signal into image data, i.e., digital data.
  • The SDRAM 102 is a storage unit that temporarily stores the image data obtained by the imaging unit 101 and the various data processed by the image processing unit 103 and the display data processing unit 106.
  • The image processing unit 103 performs various image processing on the image data stored in the SDRAM 102.
  • The image processing includes white balance (WB) processing, demosaic processing, color conversion processing, gradation conversion processing, edge enhancement processing, and compression / decompression processing.
  • The LCD driver 104 drives the LCD 105 under the control of the microcomputer 112, which functions as a display control unit, and displays various images on the LCD 105.
  • The LCD 105 as a display unit is, for example, a liquid crystal display provided on the back surface of the camera 100, and displays various images, such as images based on the image data processed in the image processing unit 103.
  • The display data processing unit 106 generates display data for displaying shooting position information corresponding to image data, and includes a position interpolation processing unit 1061, a position information generation unit 1062, and a display data generation unit 1063.
  • The position interpolation processing unit 1061, which functions as a determination unit, determines whether or not shooting position information is associated with image data and, when it is not, acquires from the microcomputer 112 the information necessary for interpolating the shooting position information.
  • The position information generation unit 1062, which functions as a shooting position estimation unit, interpolates the shooting position information for image data that has no associated shooting position information, according to the information received from the position interpolation processing unit 1061.
  • The display data generation unit 1063 generates display data for displaying the shooting position information corresponding to the image data on a map image, according to the shooting position information interpolated by the position information generation unit 1062. The display data generation unit 1063 stores map image data associated with position information (longitude / latitude information) and icon data for displaying various icons superimposed on the map image.
  • The memory I/F 107 is an interface through which the microcomputer 112 performs data communication with the recording medium 108, and the recording medium 108 is the medium on which the image files obtained by the shooting operation are recorded.
  • The communication unit 109 is for the camera 100 to communicate with various external devices, and includes a GPS communication unit 1091 and a wireless communication unit 1092. The GPS communication unit 1091 is a GPS receiver that receives radio waves from GPS satellites; the wireless communication unit 1092 is a communication module, such as a wireless LAN module, for wireless communication between the camera 100 and an external device.
  • The gyro sensor 110 is a sensor for detecting changes in the posture of the camera 100; from its output, the direction in which the camera 100 has moved can be detected.
  • The pressure sensor 111 detects the pressure around the camera 100; from its output, the water depth or the altitude of the place where the camera 100 is located can be detected.
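  • For illustration, a pressure reading can be converted into water depth or altitude with the standard hydrostatic and barometric relations; the constants and function names below are textbook values and assumptions, not taken from the patent.

```python
# Rough sketch: water depth and altitude from a pressure reading.
SEA_LEVEL_PA = 101_325.0   # standard sea-level pressure [Pa]
RHO_WATER = 1000.0         # fresh-water density [kg/m^3]
G = 9.80665                # gravitational acceleration [m/s^2]

def water_depth_m(pressure_pa: float, surface_pa: float = SEA_LEVEL_PA) -> float:
    """Hydrostatic depth: ~9.8 kPa above surface pressure per metre of water."""
    return max(0.0, (pressure_pa - surface_pa) / (RHO_WATER * G))

def altitude_m(pressure_pa: float) -> float:
    """International barometric formula (valid to roughly 11 km)."""
    return 44330.0 * (1.0 - (pressure_pa / SEA_LEVEL_PA) ** (1.0 / 5.255))

# e.g. 131 kPa underwater -> about 3 m depth; 90 kPa in air -> about 1 km
print(round(water_depth_m(131_000.0), 1), round(altitude_m(90_000.0)))
```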
  • The microcomputer 112 comprehensively controls the various sequences of the camera 100; when any control of the operation unit 113 is operated, the microcomputer 112 controls the corresponding blocks shown in FIG. 1.
  • The microcomputer 112 in this embodiment also has a function of storing GPS logs. The GPS log is a log that records, each time shooting is performed, at least the shooting frame number of the image obtained at that time, the GPS data at the time of shooting, and the shooting date and time.
  • The operation unit 113 includes the operation members with which the user performs the various operations of the camera 100. In this embodiment, it includes at least an operation unit for setting the operation mode of the camera 100 to the GPS mode, various operation units related to the reproduction mode, such as a reproduction button and a reproduction end button, and various operation units related to the shooting mode, such as a release button. The operation unit 113 also includes an operation unit for turning off the power of the camera 100, an operation unit for switching the operation mode of the camera 100 between the shooting mode and the reproduction mode, and the like.
  • The flash memory 114 is a memory that stores the processing programs and setting data necessary for the microcomputer 112 to execute various processes.
  • The I/F 115 is an interface that mediates wired communication between the camera 100 and various external devices.
  • FIG. 2 is a flowchart showing the operation of the camera 100 in the shooting mode.
  • The camera 100 according to the present embodiment can set its operation mode to the GPS mode, in which shooting position information obtained by GPS positioning is associated with the image data obtained by shooting. In the following, it is assumed that the operation mode of the camera 100 has been set to the GPS mode in advance.
  • In the shooting mode, the microcomputer 112 first performs through image display processing (step S101): it operates the image sensor of the imaging unit 101 continuously and controls the LCD driver 104 so that images corresponding to the image data sequentially obtained by the continuous operation of the image sensor are displayed on the LCD 105. Through this through image display, the user can determine the composition from the image displayed on the LCD 105.
  • The microcomputer 112 then determines whether or not the release button has been pressed (step S102). When the release button has been pressed, the microcomputer 112 executes a photographing operation; to do so, it first determines whether the current operation mode is a shooting mode for shooting a still image (step S103).
  • When the current operation mode is a shooting mode for shooting a still image, the microcomputer 112 executes still image shooting (step S104).
  • In still image shooting, the microcomputer 112 performs a focusing process to adjust the focal position of the shooting lens. In parallel with the focusing process, the microcomputer 112 determines the exposure conditions (aperture opening amount, exposure time / sensitivity of the image sensor) for the shooting operation in accordance with the subject brightness and the like. Thereafter, the microcomputer 112 causes the image sensor to perform imaging according to the determined exposure conditions. The microcomputer 112 also acquires shooting date / time information using a built-in clock.
  • The microcomputer 112 also starts the operation of the GPS communication unit 1091 to receive GPS radio waves from GPS satellites (not shown), and generates GPS data, represented for example by a longitude X and a latitude Y, from the received radio waves.
  • The microcomputer 112 stores the acquired GPS data in the GPS log in association with the shooting date / time and the shooting frame number.
  • When GPS radio waves cannot be received, no GPS data is recorded in the GPS log, but the shooting frame number and the shooting date / time are still recorded.
  • When the current operation mode is a shooting mode for shooting a moving image, the microcomputer 112 executes moving image shooting (step S105).
  • In moving image shooting, the microcomputer 112 performs focusing processing in the same manner as in still image shooting to adjust the focal position of the shooting lens, determines the exposure conditions (aperture opening amount, exposure time / sensitivity of the image sensor) in parallel with the focusing process in accordance with the subject brightness and the like, and then causes the image sensor to perform imaging according to the determined exposure conditions. Imaging by the image sensor is repeated until the user instructs the end of moving image shooting, and the microcomputer 112 generates or updates the GPS log in the same manner as in still image shooting.
  • After still image shooting or moving image shooting, the microcomputer 112 causes the image processing unit 103 to perform image processing on the image data stored in the SDRAM 102 as a result of the shooting, adds attribute data such as shooting information to the processed image data to generate an image file, and records the generated image file on the recording medium 108 (step S106).
  • FIG. 3 is a diagram showing an example of the structure of the image file recorded in step S106.
  • As shown in FIG. 3, the image file includes an attribute data recording unit and an image data recording unit.
  • The attribute data recording unit conforms to, for example, the Exif (Exchangeable image file format) standard. In it, various attribute data, such as the shooting conditions at the time of still image or moving image shooting, shooting position information, and shooting date / time information, are recorded in a metadata format.
  • The shooting position information in the present embodiment includes not only the GPS data described above but also the interpolated GPS data described later.
  • In the image data recording unit, the still image data obtained by still image shooting or the moving image data obtained by moving image shooting is recorded in compressed form.
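  • A minimal sketch of this file layout, with an attribute (metadata) section in the spirit of Exif plus a compressed image data section; the field names are illustrative assumptions, not the actual Exif tag set.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class AttributeData:
    shooting_datetime: str                     # e.g. "2012:02:20 10:15:30"
    shooting_conditions: dict = field(default_factory=dict)
    # GPS or interpolated GPS position; None when positioning failed
    shooting_position: Optional[Tuple[float, float]] = None  # (lon, lat)
    position_is_interpolated: bool = False

@dataclass
class ImageFile:
    attributes: AttributeData                  # attribute data recording unit
    image_data: bytes                          # image data recording unit

photo = ImageFile(
    AttributeData("2012:02:20 10:15:30",
                  {"exposure_time_s": 1 / 250, "aperture": 5.6},
                  shooting_position=(139.69, 35.69)),
    image_data=b"...compressed still or moving image data...",
)
print(photo.attributes.shooting_position)
```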
  • The microcomputer 112 then determines whether or not to change the operation mode to the reproduction mode (step S107); for example, it determines that the operation mode is to be changed when the user instructs this by operating the operation unit 113.
  • When an instruction to change the operation mode is given in step S107, the microcomputer 112 executes the operation in the reproduction mode described later.
  • When no instruction to change the operation mode is given in step S107, the microcomputer 112 determines whether or not to turn off the power of the camera 100 (step S108); for example, the power is turned off when the user so instructs or when a no-operation state continues for a predetermined time. When the power is not to be turned off, the microcomputer 112 performs the through image display process of step S101 again; when it is, the microcomputer 112 performs a power-off process and ends the processing of FIG. 2.
  • FIG. 4 is a flowchart showing the operation of the camera 100 in the playback mode.
  • In the playback mode in the present embodiment, not only can the selected image file simply be played back, but a map image can also be displayed according to the shooting position information recorded as attribute data of the selected image file. Whether or not to display the map image can be set by the user through the operation unit 113.
  • In the playback mode, the microcomputer 112 first controls the LCD driver 104 to display on the LCD 105 thumbnail images showing a list of the image files recorded on the recording medium 108 (step S201). An index may be attached to each thumbnail image so that still image files and moving image files can be distinguished.
  • The microcomputer 112 then determines whether any of the image files displayed as thumbnails has been selected by a user operation (step S202). When an image file has been selected, the microcomputer 112 determines whether the operation mode of the camera 100 has been set, by a user operation, to the map display mode in which a map image is displayed (step S203).
  • When the map display mode is set in step S203, the microcomputer 112 instructs the display data processing unit 106 to generate display data for map display.
  • In response, the display data processing unit 106 reads the attribute data of the image file selected by the user through the position interpolation processing unit 1061 (step S204), and the position interpolation processing unit 1061 determines whether or not shooting position information is recorded in the attribute data (step S205).
  • When shooting position information (GPS data or interpolated GPS data) is recorded in the read attribute data, the display data processing unit 106 generates display data for displaying a map using it, and the microcomputer 112 controls the LCD driver 104 so that a map image based on the display data is displayed on the LCD 105 (step S206).
  • The microcomputer 112 then determines whether or not to end the reproduction of the map image (step S207); for example, reproduction ends when the user so instructs through the operation unit 113. The microcomputer 112 stands by, repeating the determination of step S207, until it determines that the reproduction of the map image is to be ended.
  • When shooting position information is not recorded in the read attribute data (step S205), the display data processing unit 106 acquires the GPS log from the microcomputer 112 through the position interpolation processing unit 1061 (step S208).
  • FIG. 5 is a diagram illustrating an example of a GPS log.
  • In the example shown in FIG. 5, the GPS log records, for each shooting frame, the GPS data, still image / moving image identification data, the shooting date / time, and the water depth.
  • For frames shot underwater, GPS data is not recorded, because GPS radio waves cannot be received underwater, and accordingly no shooting position information is recorded as attribute data in the corresponding image files.
  • In the present embodiment, the shooting position information of shooting frames for which GPS data could not be obtained is interpolated from the GPS data of the preceding and following shooting frames.
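  • Pictured as data, such a GPS log might look like the sketch below, which also shows how the frames needing interpolation would be found; all field names and values are illustrative, not the patent's.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class GpsLogEntry:
    frame: str                               # shooting frame number / label
    is_movie: bool                           # still / moving image flag
    shot_at: datetime                        # shooting date / time
    position: Optional[Tuple[float, float]]  # (longitude, latitude) or None
    depth_m: float = 0.0                     # water depth, 0 on land

log = [
    GpsLogEntry("B", False, datetime(2012, 2, 20, 10, 0), (139.60, 35.60)),
    GpsLogEntry("C", False, datetime(2012, 2, 20, 10, 30), None, 2.0),
    GpsLogEntry("D", False, datetime(2012, 2, 20, 11, 0), None, 3.0),
    GpsLogEntry("E", False, datetime(2012, 2, 20, 11, 30), None, 1.0),
    GpsLogEntry("F", False, datetime(2012, 2, 20, 12, 0), (139.70, 35.70)),
]

# Frames whose shooting position must be interpolated:
print([e.frame for e in log if e.position is None])  # ['C', 'D', 'E']
```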
  • From the acquired GPS log, the position interpolation processing unit 1061 of the display data processing unit 106 identifies the shooting frames obtained at the timings before and after the shooting date / time of the selected image file, that is, before and after the loss of GPS data, and calculates the date / time difference ΔT as well as the differences ΔX (longitude difference) and ΔY (latitude difference) between the shooting positions before and after the loss of GPS data.
  • Here, ΔT is the difference between the shooting date / time tF of shooting frame F and the shooting date / time tB of shooting frame B, that is, (tF − tB); ΔX is (xF − xB); and ΔY is (yF − yB).
  • The position interpolation processing unit 1061 determines whether or not the calculated ΔT is within a predetermined time (for example, 5 hours) (step S209). When ΔT is within the predetermined time, the position interpolation processing unit 1061 causes the position information generation unit 1062 to estimate the shooting region (step S210).
  • The concept of estimating the shooting region is as follows, using the example of FIG. 5.
  • Suppose that the image file corresponding to shooting frame C is selected, and that the user moved at a constant linear velocity from the shooting position PB (xB, yB) of shooting frame B to the shooting position PF (xF, yF) of shooting frame F.
  • Under that assumption, the shooting position PC (xC, yC) of shooting frame C lies on the straight line connecting the shooting position PB and the shooting position PF, at the position given by the following (Equation 1):

    PC = (xB + ΔX × (tC − tB) / ΔT, yB + ΔY × (tC − tB) / ΔT) (Equation 1)

  • Since the actual movement need not have been uniform, the shooting position PC of shooting frame C is represented by a circular area centered on the position given by (Equation 1).
  • The radius rC of this circular area can be determined according to the time difference from the shooting position PB to the shooting position PC, that is, by the following (Equation 2), where r is a coefficient:

    rC = r × (tC − tB) (Equation 2)
  • In (Equation 2), the radius of the circular area is determined according to the time difference from the immediately preceding shooting frame, but it may instead be determined according to the time difference from the immediately following shooting frame (that is, shooting frame D).
  • Even if the user's movement from the shooting position PB to the shooting position PF was not uniform linear motion, there is a high possibility that the circular shooting area estimated from (Equation 1) and (Equation 2) includes the actual shooting position of shooting frame C.
  • (Equation 1) and (Equation 2) apply to the case where the image file corresponding to shooting frame C is selected, but the shooting areas of shooting frame D and shooting frame E can be obtained in the same way: the center position PD (xD, yD) of the shooting area for shooting frame D is calculated by replacing tC in (Equation 1) with tD, and the center position PE (xE, yE) for shooting frame E by replacing tC with tE.
  • Likewise, the radius rD of the shooting area for shooting frame D is obtained by multiplying the time difference (tD − tC) from the immediately preceding shooting date / time tC by the coefficient r, and the radius rE for shooting frame E by multiplying the time difference (tE − tD) from tD by the coefficient r.
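  • A compact sketch of this estimation under the uniform-motion assumption; the value of the coefficient r, the units, and all names are freely chosen for illustration.

```python
# (Equation 1): linear interpolation between the last and next GPS fixes.
# (Equation 2): radius grows with the time since the preceding frame.
from datetime import datetime
from typing import Tuple

def estimate_area(
    pb: Tuple[float, float], tb: datetime,  # last fix before the loss (frame B)
    pf: Tuple[float, float], tf: datetime,  # first fix after the loss (frame F)
    tc: datetime,                           # shooting time of the lost frame
    t_prev: datetime,                       # time of the immediately preceding frame
    r: float = 1e-5,                        # radius coefficient [deg/s], assumed
) -> Tuple[Tuple[float, float], float]:
    """Return (center, radius) of the estimated circular shooting area."""
    frac = (tc - tb).total_seconds() / (tf - tb).total_seconds()  # (Equation 1)
    center = (pb[0] + (pf[0] - pb[0]) * frac,
              pb[1] + (pf[1] - pb[1]) * frac)
    radius = r * (tc - t_prev).total_seconds()                    # (Equation 2)
    return center, radius

tb, tf = datetime(2012, 2, 20, 10, 0), datetime(2012, 2, 20, 12, 0)
center, radius = estimate_area((139.60, 35.60), tb, (139.70, 35.70), tf,
                               tc=datetime(2012, 2, 20, 10, 30), t_prev=tb)
print(center, radius)  # quarter-way point between B and F; radius = r * 1800 s
```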
  • An interpolation GPS log as shown in FIG. 6 can be generated by estimating the shooting area as described above.
  • The position information generation unit 1062 records the generated interpolated GPS log in its own memory, and records the interpolated GPS data corresponding to the image file selected by the user (in the above example, (xC, yC)) in the attribute data recording unit of that image file (step S211). Because this information is recorded, the next time the same image file is selected, step S205 branches to step S206, and the processing of steps S208 to S211 need not be performed again.
  • After the interpolated GPS log has been recorded, the display data generation unit 1063 generates display data for displaying a map image using the generated interpolated GPS data, and the microcomputer 112 controls the LCD driver 104 so that a map image based on the display data is displayed on the LCD 105 (step S212).
  • The microcomputer 112 then determines whether or not to end the reproduction of the map image (step S213); for example, reproduction ends when the user so instructs through the operation unit 113. The microcomputer 112 stands by, repeating the determination of step S213, until it determines that the reproduction of the map image is to be ended.
  • FIG. 7 shows an example of the map image reproduced in step S212.
  • In this reproduction, an image showing the shooting area corresponding to the image file selected by the user is superimposed on a map image 201 of the vicinity of that shooting area.
  • FIG. 7 shows a case where the image file of the shooting frame C is selected.
  • In this case, the display data generation unit 1063 selects map image data representing the map image 201 in the vicinity of the shooting area of shooting frame C, and generates display data so that an image RC of the circular area indicating the shooting area of shooting frame C is displayed on the map image 201. The display data may also be generated so as to display an index image PC indicating the center position of the circular area image RC.
  • As in FIG. 6, if GPS data for shooting frames other than shooting frame C is also recorded in the interpolated GPS log, information on shooting positions other than that of the selected shooting frame can be displayed at the same time. A display example for this case is shown in FIG. 8: for shooting frames A, B, and F, whose shooting position information is associated with the image data, index images PA, PB, and PF directly indicating the shooting positions may be superimposed, while for shooting frames D and E, whose shooting position information is not associated with the image data, circular regions RD and RE indicating the shooting areas may be superimposed, as with shooting frame C.
  • In addition to the map image 201, a thumbnail image 202 indicating the image files in the recording medium 108, an index 202a for distinguishing moving images from still images, a scroll icon 203, and the like may be displayed. The user's movement trajectory 204 may also be displayed.
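  • As one way to picture how such display data could be assembled, the sketch below maps longitude / latitude to map-image pixels with a simple equirectangular scaling. The patent specifies no projection or drawing API, so everything here is an assumption.

```python
# Sketch: place the circular area RC and index image PC on a map image.
from typing import Tuple

class MapView:
    def __init__(self, west: float, south: float, east: float, north: float,
                 width_px: int, height_px: int):
        self.west, self.south, self.east, self.north = west, south, east, north
        self.w, self.h = width_px, height_px

    def to_px(self, lon: float, lat: float) -> Tuple[int, int]:
        x = (lon - self.west) / (self.east - self.west) * self.w
        y = (self.north - lat) / (self.north - self.south) * self.h
        return int(x), int(y)

    def radius_px(self, radius_deg: float) -> int:
        return int(radius_deg / (self.east - self.west) * self.w)

view = MapView(139.55, 35.55, 139.75, 35.75, 800, 800)
cx, cy = view.to_px(139.625, 35.625)  # center from (Equation 1)
rr = view.radius_px(0.018)            # radius from (Equation 2)
print((cx, cy), rr)  # draw circle RC at (cx, cy), index icon PC at its center
```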
  • If ΔT exceeds the predetermined time (for example, 5 hours) in step S209, the display data processing unit 106 notifies the microcomputer 112 accordingly, and the microcomputer 112 controls the LCD driver 104 so that a warning indicating that map display will not be performed is displayed on the LCD 105 (step S214).
  • This is because, when ΔT is long, the shots before and after the loss of GPS data were taken during different stages of the user's movement, and an estimated shooting area would have little meaning. In this case, the shooting area is not displayed on the map image.
  • When information on the shooting positions of a plurality of shooting frames is displayed simultaneously as in FIG. 8, the information on the shooting position of such a shooting frame may simply be omitted, without issuing the warning of step S214.
  • When the operation mode of the camera 100 is not set to the map display mode in step S203, the microcomputer 112 causes the image processing unit 103 to decompress the image file selected by the user, and controls the LCD driver 104 so that an image based on the decompressed image data is displayed on the LCD 105 (step S215).
  • The microcomputer 112 then determines whether or not to end the reproduction of the image (step S216); for example, reproduction ends when the user so instructs through the operation unit 113. The microcomputer 112 stands by, repeating the determination of step S216, until it determines that the reproduction is to be ended.
  • When no image file is selected in step S202, the microcomputer 112 determines whether or not to change the operation mode to the shooting mode (step S217); for example, it determines that the operation mode is to be changed when the user so instructs through the operation unit 113.
  • When an instruction to change the operation mode is given in step S217, the microcomputer 112 executes the operation in the shooting mode of FIG. 2.
  • When no instruction to change the operation mode is given in step S217, the microcomputer 112 determines whether or not to turn off the power of the camera 100 (step S218); for example, the power is turned off when the user so instructs or when a no-operation state continues for a predetermined time. When the power is not to be turned off, the microcomputer 112 performs the thumbnail display of step S201 again; when it is, the microcomputer 112 performs a power-off process and ends the processing of FIG. 4.
  • As described above, in the present embodiment, the shooting position at a timing when GPS data was lost is estimated from the shooting date / time information and the position information acquired before and after that timing.
  • In the present embodiment, the reproduction of the image file selected by the user and the reproduction of the map image are performed separately, but they may also be performed simultaneously.
  • The radius of the circular area can also be calculated by multiplying the time difference (tF − tE) by the coefficient r, and displaying the area in this way is desirable in such a case; a display example is shown in FIG. 9.
  • The display examples in FIGS. 7 to 9 show the shooting area on a two-dimensional map image, but water depth and altitude can also be taken into account.
  • For example, in the GPS log of FIG. 5, shooting frame D is an underwater shot at a water depth of 3 m, so the actual shooting area is 3 m below the circular area estimated by the calculations of (Equation 1) and (Equation 2). If a three-dimensional map image can be displayed, it is therefore desirable to display the true circular area RD′ at a position 3 m below the estimated circular area RD, as shown in FIG. 10.
  • Similarly, if the GPS log includes altitude information as in FIG. 11, the shooting area can be displayed on a three-dimensional map image in consideration of the altitude, as shown in FIG. 12.
  • Note that the circular area estimated by the calculations of (Equation 1) and (Equation 2) is parallel to flat ground; when the estimated circular area falls on a mountain slope or the like, it is desirable to incline the circular area according to the inclination angle of the slope.
  • The coefficient r in (Equation 2) may be a fixed value, but is preferably varied according to the moving speed of the camera 100. For example, the moving speed of the camera 100 can be calculated by integrating the output signal of the gyro sensor 110 once, so the shooting region can be estimated more accurately by making the value of the coefficient r variable according to that output signal. The moving direction of the camera 100 can also be determined from the output signal of the gyro sensor 110, and by using this, curve interpolation can be applied to the shooting position at the timing when the GPS data was lost. Such interpolation can be based not only on the gyro sensor 110 but also on the output of an electronic compass or the like.
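  • A hedged sketch of such a speed-dependent coefficient r: the speed estimate below integrates sampled inertial readings once, which is assumed here for illustration rather than specified by the description, and all constants are freely chosen.

```python
# Sketch: scale the radius coefficient r of (Equation 2) with moving speed.
from typing import Sequence

DEG_PER_METER = 1.0 / 111_000.0  # rough longitude/latitude degrees per metre

def estimated_speed_mps(accel_samples: Sequence[float], dt_s: float) -> float:
    """Naive speed estimate: integrate acceleration samples once."""
    return abs(sum(a * dt_s for a in accel_samples))

def radius_coefficient(speed_mps: float) -> float:
    """r in (Equation 2), in degrees per second of elapsed time."""
    return max(speed_mps, 0.1) * DEG_PER_METER  # floor avoids a zero radius

speed = estimated_speed_mps([0.2, 0.3, 0.1, 0.0], dt_s=1.0)
print(radius_coefficient(speed))  # ~0.6 m/s -> ~5.4e-6 deg/s
```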
  • The position display by (Equation 1) and (Equation 2) above is not limiting, and may be modified within a range that satisfies the purpose of the present embodiment.
  • For example, since the acceleration can be obtained as described above, the position may be displayed as a circle of radius rC centered on the center of gravity of the shooting position PB (xB, yB) and the estimated shooting position PC (xC, yC) obtained by (Equation 1).
  • The position may also be obtained by a calculation formula that further takes into account a coefficient related to the moving speed.
  • The GPS log described above records GPS data each time shooting is performed, but GPS data may also be recorded while shooting is not being performed; shortening the GPS data acquisition interval in this way makes it possible to estimate the shooting area more accurately.
  • In that case, however, GPS data acquisition would be attempted repeatedly even in situations, such as underwater, where GPS data cannot be acquired. Therefore, for example, when it is determined from the output signal of the pressure sensor 111 that the camera 100 is located in water, it is desirable to suspend the acquisition of GPS data.
  • In the present embodiment, interpolated GPS data is generated when shooting position information is not recorded in the selected image file, but interpolated GPS data may instead be generated for all shooting frames that lack GPS data in the GPS log.
  • In the embodiment described above, a digital camera is used as an example of an image display device, but the technology of this embodiment can be applied to various image display devices other than a digital camera, as long as they have a map image display function.
  • The configuration shown in FIG. 1 may also be implemented as a control circuit (ASIC) in which a plurality of blocks are integrated on a single chip.
  • The above-described embodiment includes inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed constituent elements. For example, even if some constituent elements are deleted from all of the constituent elements shown in the embodiment, as long as the problem described above can be solved and the effects described above can be obtained, the configuration from which those constituent elements are deleted can also be extracted as an invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Library & Information Science (AREA)
  • Computer Graphics (AREA)
  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/JP2012/053983 2011-03-25 2012-02-20 Image display device and image display method WO2012132626A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011067171A JP5638993B2 (ja) 2011-03-25 Image display device and image display method
JP2011-067171 2011-03-25

Publications (1)

Publication Number Publication Date
WO2012132626A1 (ja) 2012-10-04

Family

ID=46930387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/053983 WO2012132626A1 (ja) 2012-02-20 Image display device and image display method

Country Status (2)

Country Link
JP (1) JP5638993B2 (ja)
WO (1) WO2012132626A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022253045A1 (zh) * 2021-06-04 2022-12-08 Oppo广东移动通信有限公司 Photo shooting method, apparatus, terminal, and storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0637491Y2 (ja) 1986-03-31 1994-09-28 株式会社アイラブユ− Exchange
JP6476082B2 (ja) * 2015-06-29 2019-02-27 京セラ株式会社 Electronic device, image data saving method, and image data saving program
KR101692643B1 (ko) * 2015-11-18 2017-01-03 재단법인 다차원 스마트 아이티 융합시스템 연구단 Low-power wireless camera and sensor system
JP6808497B2 (ja) * 2017-01-05 2021-01-06 キヤノン株式会社 Image processing apparatus, method for controlling image processing apparatus, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11122638A (ja) * 1997-10-15 1999-04-30 Oki Electric Ind Co Ltd Image processing apparatus, image processing method, and computer-readable information recording medium storing an image processing program
JP2006033273A (ja) * 2004-07-14 2006-02-02 Fuji Photo Film Co Ltd Printing apparatus, printing method, and program
JP2006157810A (ja) * 2004-12-01 2006-06-15 Olympus Corp Display control apparatus, camera, display control method, and program
JP2010062704A (ja) * 2008-09-02 2010-03-18 Panasonic Corp Data creation apparatus and data creation method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3671478B2 (ja) * 1995-11-09 2005-07-13 株式会社デンソー Vehicle solar radiation detection device and vehicle air conditioner
JP4380609B2 (ja) * 2005-09-02 2009-12-09 トヨタ自動車株式会社 Driving support device

Also Published As

Publication number Publication date
JP5638993B2 (ja) 2014-12-10
JP2012205038A (ja) 2012-10-22

Similar Documents

Publication Publication Date Title
JP5550989B2 (ja) Imaging apparatus, control method therefor, and program
US8756009B2 (en) Portable apparatus
CN101800850B (zh) Imaging apparatus, imaging method, and recording medium
US9554028B2 (en) Imaging device, imaging system, imaging method, and computer-readable recording medium associating image data with responsibility acceptance or abandonment information
US7978254B2 (en) Image capturing apparatus, its controlling method, and program
US20120268621A1 (en) Imaging apparatus, azimuth recording method, and program
TWI492618B (zh) Imaging apparatus and computer-readable recording medium
JP5638993B2 (ja) Image display device and image display method
JP2010245607A (ja) Image recording apparatus and electronic camera
US8547454B2 (en) Digital image photographing apparatuses and methods of controlling the same to provide location information
JP4807582B2 (ja) Image processing apparatus, imaging apparatus, and program therefor
JP5942260B2 (ja) Imaging apparatus and image reproduction apparatus
JP4702220B2 (ja) Imaging apparatus and imaging method
JP2007221711A (ja) Sensor unit and electronic apparatus
JP4888829B2 (ja) Moving image processing apparatus, moving image shooting apparatus, and moving image shooting program
JP5869046B2 (ja) Imaging apparatus, control method therefor, and program
JP2012113578A (ja) Search device, search method, search program, camera device using the same, and search system
JP2012085223A (ja) Shooting condition generation apparatus, imaging apparatus, and shooting condition generation program
JP5962974B2 (ja) Imaging apparatus, imaging method, and program
JP2008283477A (ja) Image processing apparatus and image processing method
JP2010130590A (ja) Imaging apparatus and imaging method
JP2010259050A (ja) Electronic camera and development processing program
JP6121004B2 (ja) Imaging apparatus, control method therefor, and program
US20130258158A1 (en) Electronic device
JP6729583B2 (ja) Image processing apparatus and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12764487

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12764487

Country of ref document: EP

Kind code of ref document: A1