WO2013114732A1 - Electronic device - Google Patents

Electronic device Download PDF

Info

Publication number
WO2013114732A1
WO2013114732A1 · PCT/JP2012/081876 · JP2012081876W
Authority
WO
WIPO (PCT)
Prior art keywords
image
data
unit
distance
positioning
Prior art date
Application number
PCT/JP2012/081876
Other languages
French (fr)
Japanese (ja)
Inventor
松本 慎也 (Shinya Matsumoto)
Original Assignee
株式会社ザクティ (Xacti Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ザクティ (Xacti Corporation)
Publication of WO2013114732A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00: Position-fixing by co-ordinating two or more direction or position line determinations; position-fixing by co-ordinating two or more distance determinations
    • G01S 5/16: Position-fixing by co-ordinating two or more direction or position line determinations using electromagnetic waves other than radio waves
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00: Satellite radio beacon positioning systems; determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/40: Correcting position, velocity or attitude
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00: Details of cameras or camera bodies; accessories therefor
    • G03B 17/24: Details of cameras or camera bodies with means for separately producing marks on the film, e.g. title, time of exposure

Definitions

  • the present invention relates to an electronic device such as an imaging device.
  • In recent years, electronic devices capable of recording positional information obtained using the global positioning system (GPS) together with a captured image have been increasing. If the GPS function is used, the approximate current location of the user and the electronic device can be grasped.
  • In a known system, a mobile phone transmits current location information to a server device, and the server device extracts an image related to the current location information and sends it back to the mobile phone (for example, see Patent Document 1 below).
  • However, the position detection accuracy of the GPS function is limited, and the detected position may contain errors that are unacceptable to the user. Note that the method of Patent Document 1 does not contribute to reducing this error.
  • an object of the present invention is to provide an electronic device that can accurately correct positioning data.
  • A first electronic device according to the present invention includes: an imaging unit having an imaging element that outputs an image signal of a subject; a distance data generation unit that uses the output of the imaging element to generate distance data representing the distance between the electronic device and the subject; a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit; a positioning processing unit that performs positioning based on signals from satellites and generates original positioning data; a transmission unit that transmits a reference image based on the output of the imaging element to an external device; a reception unit that receives position data based on the reference image from the external device; and a positioning data correction unit that corrects the original positioning data based on the received position data, the distance data, and the shooting direction data.
  • In the first electronic device, data relating to the position associated with a target image corresponding to the reference image may be received as the position data by the reception unit, and the position data may include location data of an object included in the target image.
  • In a second electronic device, the transmission unit may transmit the reference image and the original positioning data to the external device, and the received position data may be position data based on the reference image and the original positioning data.
  • A third electronic device according to the present invention includes: an imaging unit having an imaging element that outputs an image signal of a subject; a distance data generation unit that uses the output of the imaging element to generate distance data representing the distance between the electronic device and the subject; a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit; a positioning processing unit that performs positioning based on signals from satellites and generates original positioning data; a transmission unit that transmits the original positioning data to an external device; a reception unit that receives, from the external device, a target image and position data based on the original positioning data; an arithmetic processing unit that compares the target image with a reference image based on the output of the imaging element and extracts the position data from the received content of the reception unit according to the comparison result; and a positioning data correction unit that corrects the original positioning data based on the extracted position data, the distance data, and the shooting direction data.
  • the position data may include location data of an object included in the target image.
  • In the third electronic device, the reception unit may receive a target image group and a position data group based on the original positioning data from the external device; the arithmetic processing unit may then extract an image corresponding to the reference image from the target image group as the target image, and extract the position data associated with that target image from the position data group.
  • a distance image based on the distance data may be used as the reference image.
  • The imaging unit may include a distance measuring light source that irradiates the subject with light, and the distance data may be generated using the distance measuring light source.
  • The imaging unit may include a plurality of imaging elements as the imaging element and a plurality of distance measuring light sources corresponding to the plurality of imaging elements; each distance measuring light source may irradiate the subject with light having a plurality of wavelengths, and the distance data generation unit may generate the distance data using the output of each imaging element, which depends on the reflected light from the subject.
  • A warning notification may be issued according to the amount of correction applied to the original positioning data by the positioning data correction unit.
  • FIG. 1 is a schematic overall block diagram of a positioning system according to an embodiment of the present invention. The remaining drawings comprise: a diagram defining the X, Y, and Z axes according to the embodiment; diagrams (a) and (b) for explaining the azimuth angle and posture angle of the imaging device; diagrams (a) to (c) for explaining the distance measuring light source that can be provided in the imaging unit; a diagram showing the structure of an image file; an operation flowchart of the positioning system according to the first embodiment; diagrams (a) and (b) showing examples of the normal captured image and distance image acquired by the imaging device; an internal block diagram of the server device according to the first embodiment; a flowchart showing the procedure of the similar image search process; a diagram showing a plurality of evaluation target images and a plurality of position data; two diagrams for explaining the positioning data correction process, each showing an example of the positional relationship between the imaging device and the object; correction image diagrams (a) and (b) of positioning data by the positioning data correction process; and a diagram showing a modification of part of the steps of FIG. 6.
  • FIG. 1 is a schematic block diagram showing the overall configuration of a positioning system according to an embodiment of the present invention.
  • the positioning system includes an electronic device 1 and a server device SV that is an external device of the electronic device 1 and includes a computer or the like.
  • the electronic device 1 and the server device SV are connected to a network NET formed including the Internet and the like, and can communicate arbitrary data via the network NET.
  • the electronic device 1 includes each part referred to by reference numerals 11 to 21.
  • the electronic device 1 has a photographing function. Therefore, hereinafter, the electronic device 1 is referred to as an imaging device (digital camera).
  • An imaging device is a kind of electronic equipment.
  • In the imaging apparatus 1, functions other than the imaging function (for example, a telephone function, an Internet connection function, an e-mail transmission/reception function, and a music playback function) may be further realized.
  • The electronic device 1 may also be a device other than an imaging apparatus (for example, a mobile phone or an information terminal).
  • The imaging unit 11 has an optical system, an aperture, and an imaging element (solid-state imaging element) IS including a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, and photographs the subject using the imaging element IS.
  • the positioning processing unit 12 receives a signal transmitted from a satellite that forms a global positioning system (GPS), and performs positioning of the imaging device 1 based on the received signal.
  • The positioning of the imaging device 1 refers to processing for obtaining the position (location) of the imaging device 1, and the obtained position includes the longitude, latitude, and altitude of the imaging device 1. However, the altitude need not be included in the obtained position.
  • The result of positioning by the positioning processing unit 12, that is, data including the obtained longitude, latitude, and altitude of the imaging device 1, is output to the main control unit 21 as original positioning data. As shown in the drawings, an X axis, a Y axis, and a Z axis that are orthogonal to one another and intersect at the origin O are defined.
  • the X and Y axes are parallel to the horizontal plane, and the Z axis is parallel to the direction of gravity.
  • The longitude, latitude, and altitude of the position (location) of an arbitrary object are expressed as coordinate values x, y, and z on the X, Y, and Z axes, respectively. Data indicating the position (location) of the object, corresponding to the coordinate values x, y, and z, is referred to as position data or expressed as (x, y, z).
  • the coordinate values x, y, and z of the imaging device 1 indicated by the original positioning data are represented by symbols x M , y M, and z M, respectively. Therefore, the position of the imaging device 1 represented by the original positioning data is (x M , y M , z M ).
  • the positive directions of the X and Y axes correspond to east and north, respectively, and the positive direction of the Z axis corresponds to the upward direction (the direction opposite to the direction of gravity).
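  • As an illustrative aside (not part of the patent text): since the text treats x, y, and z as east, north, and up coordinates in meters derived from longitude, latitude, and altitude, the conversion can be sketched as below, assuming a flat-earth (equirectangular) approximation near a reference point; all names are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in meters

def geodetic_to_local_xyz(lon_deg, lat_deg, alt_m,
                          ref_lon_deg, ref_lat_deg, ref_alt_m=0.0):
    """Approximate (longitude, latitude, altitude) as local (x, y, z) meters.

    x grows toward east, y toward north, z upward, matching the axes the
    text defines. A flat-earth approximation, adequate only near the
    reference point."""
    dlon = math.radians(lon_deg - ref_lon_deg)
    dlat = math.radians(lat_deg - ref_lat_deg)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(ref_lat_deg))  # east
    y = EARTH_RADIUS_M * dlat                                        # north
    z = alt_m - ref_alt_m                                            # up
    return (x, y, z)
```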
  • the photographing direction detection unit 13 generates photographing direction data by detecting the photographing direction of the imaging unit 11 (that is, the optical axis direction of the imaging unit 11).
  • The shooting direction data includes an azimuth angle θ and a posture angle φ. Considering that the imaging device 1 is disposed at the origin O, the significance of the azimuth angle θ and the posture angle φ will be described with reference to the drawings. The azimuth angle θ is an angle representing the azimuth of the shooting direction, and is detected using an electronic compass or the like.
  • The posture angle φ is an angle representing the difference in the posture of the imaging apparatus 1 from a reference posture, and is detected using a magnetic sensor, a gyro sensor, or the like. The posture of the imaging device 1 when the shooting direction is parallel to the horizontal plane is the reference posture. Accordingly, the angle formed by the shooting direction (that is, the optical axis direction) and the horizontal plane can be defined as the posture angle φ: φ > 0° when the shooting direction has an upward component, and φ < 0° when the shooting direction has a downward component.
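  • Given the azimuth angle θ and the posture angle φ as just defined, the shooting direction can be expressed as a unit vector in the same x (east), y (north), z (up) frame. A sketch follows; the convention that θ is measured clockwise from north is an assumption, since the text does not fix the zero direction.

```python
import math

def shooting_direction_vector(theta_deg, phi_deg):
    """Unit vector of the shooting direction in (x=east, y=north, z=up).

    theta_deg: azimuth angle theta (assumed clockwise from north here).
    phi_deg:   posture angle phi (> 0 when pointing above the horizon).
    """
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    horizontal = math.cos(phi)             # horizontal component of the unit vector
    return (horizontal * math.sin(theta),  # east
            horizontal * math.cos(theta),  # north
            math.sin(phi))                 # up
```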
  • the photographing direction recording unit 14 includes a semiconductor memory or the like, and temporarily records the photographing direction data generated by the photographing direction detection unit 13 as necessary.
  • the memory unit 15 includes a semiconductor memory, a magnetic disk, or the like, and records arbitrary information generated by the imaging device 1 or acquired by the imaging device 1. Part or all of the memory unit 15 functions as the database 16.
  • the display unit 17 is a display device having a display screen such as a liquid crystal display panel, and displays an arbitrary video under the control of the main control unit 21.
  • the operation unit 18 includes a shutter button 18a that receives a still image shooting instruction, a zoom button 18b that receives a zoom magnification change instruction, and the like, and receives various operations from the user. The details of the operation on the operation unit 18 are transmitted to the main control unit 21.
  • The buttons 18a and 18b may be buttons on a touch panel that can be provided on the display unit 17.
  • the transmission unit 19 and the reception unit 20 form a communication unit.
  • the transmission unit 19 is connected to the network NET and can transmit arbitrary information to any device other than the imaging device 1 including the server device SV.
  • the receiving unit 20 is connected to the network NET and can receive any information from any device other than the imaging device 1 including the server device SV.
  • the main control unit 21 is formed by a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and comprehensively controls the operation of each part in the imaging apparatus 1.
  • the main control unit 21 includes an image processing unit 31, a distance data generation unit 32, and a positioning data correction unit 33.
  • the image processing unit 31, the distance data generation unit 32, or the positioning data correction unit 33 may be considered to be provided outside the main control unit 21.
  • the image processing unit 31 generates a color image by performing predetermined image processing (noise reduction processing, demosaicing processing, color correction processing, edge enhancement processing, etc.) on the image signal from the image sensor IS.
  • the color image is a two-dimensional image representing the intensity and color of light from the subject, and is a kind of photographed image of the imaging unit 11.
  • the color image generated by the image processing unit 31 is hereinafter referred to as a normal captured image.
  • an image signal or image data representing an arbitrary image is also simply referred to as an image.
  • The distance data generation unit (distance image generation unit) 32 detects the subject distance of each subject located in the imaging region of the imaging unit 11 using the output image signal of the imaging element IS, and generates distance data representing the detection result.
  • the subject distance of a certain subject refers to a distance in real space between the subject and the imaging device 1 or any imaging device that photographs the subject.
  • the distance data generation unit 32 can generate a distance image as distance data.
  • the distance image is a grayscale image in which the detection value of the subject distance is given to the pixel value of each pixel.
  • a process for generating distance data and a distance image based on an output image signal of the imaging element IS is called a distance data generation process or a distance image generation process.
  • a distance measuring light source LT for irradiating light on a subject may be provided in the imaging unit 11, and distance data may be generated using the distance measuring light source LT.
  • the distance measuring light source LT may be formed of a plurality of light emitting elements.
  • distance data may be generated using so-called TOF (Time Of Flight) ranging.
  • a ranging light source LT is formed by an LED (Light Emitting Diode) that emits near-infrared light, and the emitted light of the LED is modulated at about 10 MHz (megahertz) and projected onto a subject.
  • the reflected light from the subject with respect to the light emitted from the LED is received by the image sensor IS.
  • the distance data generation unit 32 can obtain the phase difference between the emitted light and the reflected light for each pixel based on the output image signal of the image sensor IS, and can calculate the subject distance from the phase difference for each pixel.
  • The emitted light of the LED may be modulated with a plurality of sine waves having different wavelengths (for example, several tens of sine waves) and projected onto the subject. If only the phase difference for a single wavelength is examined, a plurality of distances can give the same phase (resulting in a large distance measurement error); distance measurement accuracy can therefore be improved by projecting modulated waves having a plurality of wavelengths. If the phase difference is continuously measured by the image sensor IS, distance data can be generated in real time.
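  • In this continuous-wave TOF scheme, a phase difference Δφ at modulation frequency f corresponds to a one-way distance d = c·Δφ / (4π·f), ambiguous modulo the unambiguous range c / (2f); with the 10 MHz modulation mentioned above, that range is roughly 15 m, which is why multiple modulation wavelengths help. A minimal per-pixel sketch (function names are hypothetical):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_rad, mod_freq_hz=10e6):
    """Subject distance from the phase difference between emitted and
    reflected light: d = c * phase / (4 * pi * f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz=10e6):
    """Distances repeat every c / (2f); about 15 m at 10 MHz, which is
    why the text suggests projecting waves at several wavelengths."""
    return C / (2.0 * mod_freq_hz)
```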
  • Alternatively, a laser may be employed as the distance measuring light source LT, and laser light scanning may be performed so that the light from the laser is sequentially irradiated onto each subject in the imaging region.
  • reflected light from the subject with respect to the emitted light of the laser is received by the image sensor IS, and the distance data generation unit 32 can obtain distance data based on the output image signal of the image sensor IS.
  • Alternatively, the distance image generation process may be realized by a so-called pattern irradiation method. That is, the distance measuring light source LT may be formed so as to irradiate the subject with laser light having a predetermined pattern (for example, a matrix pattern), and the distance data may be obtained based on the output image signal of the imaging element IS representing the reflected laser light (that is, based on the degree of distortion of the reflected pattern).
  • the positioning data correction unit 33 corrects the original positioning data as necessary based on the data obtained by using the transmission unit 19 and the reception unit 20, and outputs the corrected original positioning data as corrected positioning data.
  • the coordinate values x, y, and z of the imaging device 1 indicated by the corrected positioning data are represented by symbols x C , y C, and z C, respectively. That is, the position of the imaging device 1 indicated by the corrected positioning data is (x C , y C , z C ). The correction method will be described in detail later.
  • the main control unit 21 can perform image recording processing.
  • In the image recording process, a normal captured image based on an image signal read from the image sensor IS in response to a pressing operation of the shutter button 18a (corresponding to a full-press operation described later) is recorded in the memory unit 15 (for example, in a non-volatile memory in the memory unit 15).
  • an image file including a header area and a main body area is formed in the memory unit 15, and additional data and a normal photographed image are stored in the header area and the main body area, respectively.
  • the image file can be made to conform to the file format of Exif (Exchangeable image file format).
  • The additional data can include the positioning data of the shooting point of the normal captured image and the focal length, F value, ISO sensitivity, and so on at the time of shooting, and may further include the distance image. An image file in which a distance image is stored in the main body area instead of a normal captured image may also be formed.
  • the positioning data included in the additional data is original positioning data (x M , y M , z M ) or corrected positioning data (x C , y C , z C ).
  • The original positioning data (x M , y M , z M ) or corrected positioning data (x C , y C , z C ) obtained one after another may be accumulated and recorded in the database 16 in time series.
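  • A minimal sketch of the image-file layout described above (header area with additional data, main body area with the image), using a plain data structure rather than a real Exif writer; the field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AdditionalData:
    """Header-area contents described in the text (Exif-style tags)."""
    positioning: Tuple[float, float, float]  # (x, y, z): original or corrected
    focal_length_mm: Optional[float] = None
    f_number: Optional[float] = None
    iso_sensitivity: Optional[int] = None

@dataclass
class ImageFile:
    header: AdditionalData   # additional data
    body: bytes              # normal captured image (or a distance image)
```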
  • the main control unit 21 can perform through display using the display unit 17.
  • Through display refers to a process of reading out image signals from the image sensor IS at a predetermined frame rate, sequentially generating normal captured images, and updating and displaying the sequentially generated normal captured images on the display unit 17. Note that the resolution of the normal captured image generated for the purpose of through display may be lower than the resolution of the normal captured image acquired and generated in the image recording process.
  • The main control unit 21 can perform a log path recording process.
  • The log path recording process refers to a process of acquiring positioning data periodically or intermittently and recording the sequentially acquired positioning data in the memory unit 15 (database 16) in time series. As a result, the movement route of a user who moves while holding the imaging device 1 is recorded.
  • The positioning data acquired and recorded in the log path recording process is in principle the original positioning data (x M , y M , z M ); however, when correction by the positioning data correction unit 33 has been performed, the corrected positioning data (x C , y C , z C ) is recorded instead.
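  • As a sketch, the log path recording process amounts to appending time-stamped positioning data and replacing an entry when corrected positioning data becomes available (the names below are hypothetical):

```python
import time

class LogPathRecorder:
    """Accumulates positioning data in time series (the movement route)."""

    def __init__(self):
        self.path = []  # list of (timestamp, (x, y, z)) entries

    def record(self, position_xyz):
        """Called periodically or intermittently with positioning data."""
        self.path.append((time.time(), position_xyz))

    def correct_latest(self, corrected_xyz):
        """Replace the newest entry when the positioning data correction
        unit produces corrected positioning data."""
        if self.path:
            timestamp, _ = self.path[-1]
            self.path[-1] = (timestamp, corrected_xyz)
```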
  • the main control unit 21 can execute positioning data correction processing for correcting the positioning result of the positioning processing unit 12 using the positioning data correction unit 33.
  • the CPU of the main control unit 21 reads the correction application software stored in the nonvolatile program memory in the memory unit 15, and the CPU executes the correction application software, thereby realizing the positioning data correction process.
  • the positioning data correction process may be realized only by hardware.
  • FIG. 6 is an operation flowchart of the positioning system according to the first embodiment.
  • In step S11, the imaging apparatus 1 and the correction application software are activated by turning on the imaging apparatus 1.
  • Thereafter, the processes of steps S12 to S18 are sequentially executed.
  • the through display can be interrupted as appropriate.
  • the log path recording process is continuously executed after the imaging apparatus 1 is activated (the same applies to other examples described later).
  • In step S12, the positioning processing unit 12 and the shooting direction detection unit 13 acquire original positioning data (x M , y M , z M ) and shooting direction data (θ, φ).
  • the main control unit 21 may acquire a focal length to be included in the additional data (see FIG. 5).
  • In step S13, the image processing unit 31 acquires a normal captured image from the output image signal of the imaging element IS, and the distance data generation unit 32 acquires a distance image by the distance image generation process.
  • the obtained normal photographed image is used for through display or recorded by image recording processing.
  • the shutter button 18a can be pressed in two stages. The operation of pushing the shutter button 18a by half is called a half-pressing operation, and the operation of pushing the shutter button 18a completely is called a full-pressing operation.
  • The image processing unit 31 and the distance data generation unit 32 may generate the normal captured image and the distance image based on an image signal read from the image sensor IS in response to a half-press operation or a full-press operation.
  • When a touch panel is used, the above-described half-press operation or full-press operation can be replaced with an operation of touching the shutter button 18a on the touch panel with an operating body (a finger or an operation pen), or with an operation of releasing the operating body from the shutter button 18a on the touch panel.
  • the normal captured image and the distance image acquired in step S13 are the images 310 and 320 in FIGS. 7A and 7B, respectively.
  • The normal captured image 310 and the distance image 320 are images obtained by photographing an object SUB that is one of the subjects. In FIGS. 7A and 7B, the point SUB P represents the center of gravity of the object SUB in the normal captured image 310 and the distance image 320.
  • In step S14, the main control unit 21 sets the distance image acquired in step S13 as a reference image, and under the control of the main control unit 21, the transmission unit 19 uploads (transmits) the reference image together with the original positioning data (x M , y M , z M ) at the time of capturing the reference image (that is, the capture position of the reference image).
  • Upload refers to transmitting arbitrary information from the transmission unit 19 to the server device SV (or an arbitrary site or the like) via the network NET.
  • In step S15, the server device SV executes a similar image search process based on the reference image and the original positioning data uploaded in step S14, thereby determining object position data (x T , y T , z T ). Ideally, the object position data (x T , y T , z T ) accurately indicates the location of the object SUB included in the reference image (the same applies to other examples described later).
  • FIG. 8 is an internal block diagram of the server device SVA serving as the server device SV according to the first embodiment.
  • FIG. 9 is a flowchart showing the procedure of the similar image search process.
  • the similar image search process includes the processes of steps S31 to S34.
  • The server device SVA includes each part referred to by reference numerals 51 to 54. However, all or part of the database 51 may be provided in a device other than the server device SVA, connected to the network NET.
  • FIG. 10 shows an image group 340 composed of a plurality of evaluation target images and a position data group 341 composed of a plurality of position data.
  • The image group 340 and the position data group 341 are stored in the database 51, for example in the form of a plurality of image files conforming to a file format such as Exif.
  • the i-th evaluation object image in the image group 340 is referred to by a symbol Q i (i is an integer).
  • Position data is associated with the evaluation target image for each evaluation target image.
  • the position data associated with the evaluation target image Q i is represented by (x i , y i , z i ).
  • the position data (x i , y i , z i ) represents the location of the object included in the evaluation target image Q i .
  • each evaluation object image includes only one object.
  • each evaluation target image is also a distance image for the corresponding target object.
  • In step S31, the image limitation selection unit 52 narrows down, based on the original positioning data (x M , y M , z M ), the evaluation target images in the image group 340 that are to be supplied to the similarity evaluation unit 53. That is, the selection unit 52 selects from the image group 340 only the images of objects having locations near the position (x M , y M , z M ), and supplies the selected plurality of evaluation target images to the similarity evaluation unit 53. Assume that the selected evaluation target images are the evaluation target images Q 1 to Q n .
  • The shooting direction data may also be uploaded and given to the selection unit 52, and the selection may be performed using the shooting direction data (further narrowing may be performed using the shooting direction data). Alternatively, only the reference image may be uploaded in step S14 of FIG. 6, without the original positioning data; in that case, however, the narrowing in step S31 (FIG. 9) is not performed.
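  • The narrowing in step S31 can be sketched as a simple proximity filter over the position data group; the 500 m radius below is an illustrative choice, not a value from the text:

```python
import math

def narrow_by_position(image_group, position_group, origin_xyz, radius_m=500.0):
    """Keep only evaluation target images whose associated object location
    lies within radius_m of the position (x_M, y_M, z_M) given by the
    original positioning data."""
    selected = []
    for image, xyz in zip(image_group, position_group):
        if math.dist(xyz, origin_xyz) <= radius_m:
            selected.append((image, xyz))
    return selected
```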
  • In step S32, the similarity evaluation unit 53 performs a similarity evaluation process for each of the evaluation target images Q 1 to Q n .
  • In the similarity evaluation process, the similarity between the reference image and the evaluation target image is evaluated based on the two images.
  • A higher similarity indicates that the evaluation target image Q i is more similar to the reference image.
  • The similarity determined for the evaluation target image Q i is represented by the symbol SIM i .
  • The similarity evaluation unit 53 extracts image feature quantities of each of the reference image and the evaluation target image Q i (the shape, depth, edges, histogram, and so on of each object) from their image data, and obtains the similarity SIM i by comparing the image feature quantities between the reference image and the evaluation target image Q i .
  • In step S33, the similarity evaluation unit 53 extracts, from the obtained similarities SIM 1 to SIM n , a similarity equal to or greater than a predetermined threshold, and extracts the evaluation target image corresponding to the extracted similarity as a similar image of the reference image.
  • When a plurality of similarities are equal to or greater than the threshold, the evaluation target image corresponding to the maximum similarity can be specified as the similar image of the reference image.
  • An evaluation target image specified and extracted as a similar image of the reference image is referred to as a specific similar image (extraction target image).
  • In step S34, the object position data acquisition unit 54 extracts the position data associated with the specific similar image from the position data (x 1 , y 1 , z 1 ) to (x n , y n , z n ) (in other words, from the position data group 341) as the object position data (x T , y T , z T ).
  • For example, if the similarity SIM 2 is extracted as a similarity equal to or greater than the predetermined threshold, the evaluation target image Q 2 is extracted as the specific similar image, and the data relating to the position associated with the evaluation target image Q 2 , that is, the position data (x 2 , y 2 , z 2 ) representing the location of the object in the evaluation target image Q 2 , is extracted as the object position data (x T , y T , z T ).
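  • Steps S32 to S34 can be sketched as follows, using histogram intersection as a simple stand-in for the unspecified image feature comparison (the text names shape, depth, edges, and histograms); the threshold value is illustrative:

```python
import numpy as np

def histogram_similarity(ref_img, eval_img, bins=64):
    """Histogram intersection in [0, 1]; a stand-in for the image feature
    comparison described in the text."""
    h1, _ = np.histogram(ref_img, bins=bins, range=(0, 255))
    h2, _ = np.histogram(eval_img, bins=bins, range=(0, 255))
    h1 = h1 / max(h1.sum(), 1)
    h2 = h2 / max(h2.sum(), 1)
    return float(np.minimum(h1, h2).sum())

def search_similar_image(ref_img, eval_imgs, positions, threshold=0.8):
    """Outline of steps S32-S34: compute SIM_i for each Q_i, take the best
    match if it clears the threshold, and return its position data as the
    object position data (x_T, y_T, z_T); None if no image qualifies."""
    if not eval_imgs:
        return None
    sims = [histogram_similarity(ref_img, q) for q in eval_imgs]
    best = int(np.argmax(sims))
    return positions[best] if sims[best] >= threshold else None
```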
  • In step S16, which follows step S15, the object position data (x T , y T , z T ) determined by the similar image search process of FIG. 9 is downloaded (received) by the receiving unit 20.
  • Download refers to receiving arbitrary information from the server device SV via the network NET.
  • In step S17, the positioning data correction unit 33 performs a positioning data correction process that corrects the original positioning data (x M , y M , z M ) based on the object position data (x T , y T , z T ), the distance image as the distance data, and the shooting direction data (θ, φ).
  • In this positioning data correction process, the object position data (x T , y T , z T ), the distance image, and the shooting direction data (θ, φ) obtained in steps S16, S13, and S12 can be used.
  • the unit of coordinate values x, y, and z is assumed to be m (meter).
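  • One geometric reading of this correction (the patent describes the exact procedure with reference to drawings not reproduced here) is that the camera sits at the object's known location pulled back along the shooting direction by the measured subject distance: (x C , y C , z C ) = (x T , y T , z T ) minus d·v, where d is taken from the distance image and v is the unit vector from the earlier direction sketch. A hypothetical helper:

```python
def correct_positioning(target_xyz, subject_distance_m, theta_deg, phi_deg):
    """Corrected positioning data (x_C, y_C, z_C): the object's location
    (x_T, y_T, z_T) minus subject_distance_m along the shooting direction.
    Reuses shooting_direction_vector() from the earlier sketch."""
    vx, vy, vz = shooting_direction_vector(theta_deg, phi_deg)
    xt, yt, zt = target_xyz
    return (xt - subject_distance_m * vx,
            yt - subject_distance_m * vy,
            zt - subject_distance_m * vz)
```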
  • the main control unit 21 may correct the positioning data in the image file from (x M , y M , z M ) to (x C , y C , z C ).
  • the moving route recorded in the log route recording process can be corrected.
  • FIGS. 13A and 13B are correction image diagrams of positioning data.
  • the correction information may be displayed on the display unit 17 while performing through display.
  • the correction information is, for example, corrected positioning data, an error between original positioning data and corrected positioning data, or an accurate distance from the imaging apparatus 1 to the target SUB based on the corrected positioning data and target position data.
  • If the positioning processing unit 12 is used, the current location of the user and the imaging device 1 can be grasped, but the position detection accuracy is limited. With the correction method according to the present embodiment, the position detection error can be corrected accurately not only from an image obtained in response to a shutter operation but also from an image obtained for through display, so the current location can be known accurately. Moreover, since images of various objects exist on the network, the positioning data can be corrected even when the object photographed by the imaging apparatus 1 is not a famous object (such as a building at a tourist attraction) but, for example, an anonymous building in an exhibition hall or a shopping mall, or an alley store.
  • When the distance image is used as the reference image, the object can be determined accurately even under circumstances in which object recognition is difficult, such as when the shooting environment of the imaging apparatus 1 is dark or the object is confused with a background of similar colors; as a result, the positioning data can be corrected accurately.
  • Step S14 may be replaced with step S14′ (see the drawing showing the modification of part of the steps of FIG. 6). That is, in step S14′, the main control unit 21 may set the normal captured image acquired in step S13 as the reference image.
  • In this case, each evaluation target image is a color image of the corresponding object (a two-dimensional image representing the intensity and color of the light that travels from the object and is incident on the image sensor of the imaging device photographing the object) (the same applies to other examples described later).
  • When the normal captured image is set as the reference image, it becomes difficult to accurately determine the object under the circumstances described above (for example, when the shooting environment of the imaging apparatus 1 is dark); as a result, the correction accuracy of the positioning data may be lower than when the distance image is used as the reference image.
  • the display of the correction information in step S18 may include display of warning information (hereinafter referred to as warning display).
  • The main control unit 21 can obtain a distance ΔERR (a distance in real space) between the position indicated by the original positioning data and the position indicated by the corrected positioning data.
  • The distance ΔERR represents the positioning error of the positioning processing unit 12 and corresponds to the amount of correction applied to the original positioning data by the positioning data correction unit 33.
  • The main control unit 21 may cause the display unit 17 to display warning information (characters, figures, icons, and so on) indicating that the positioning error is large when the distance ΔERR is greater than or equal to a predetermined threshold.
  • The warning display is a type of warning notification suggesting a large positioning error.
  • The warning notification may be any notification that appeals to the human senses (particularly sight, hearing, and touch). That is, when the distance ΔERR is greater than or equal to the predetermined threshold, the main control unit 21 may notify the user that the positioning error is large by video output using the display unit 17, by audio output using a speaker (not shown) in the imaging device 1, by vibration of the imaging device 1, or the like (the same applies to other embodiments described later). A "large positioning error" in the warning notification means that the positioning error (that is, the distance ΔERR) is larger than a predetermined allowable error.
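  • The warning condition can be sketched as below; the 50 m threshold is illustrative only, since the text leaves the allowable error unspecified:

```python
import math

def positioning_error(original_xyz, corrected_xyz):
    """Delta_ERR: real-space distance between the positions indicated by
    the original and the corrected positioning data."""
    return math.dist(original_xyz, corrected_xyz)

def maybe_warn(original_xyz, corrected_xyz, threshold_m=50.0, notify=print):
    """Issue a warning notification when the correction amount suggests a
    large positioning error."""
    err = positioning_error(original_xyz, corrected_xyz)
    if err >= threshold_m:
        notify(f"Positioning error is large: {err:.1f} m")
    return err
```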
  • FIG. 15 is an operation flowchart of the positioning system according to the second embodiment.
  • In step S51, the imaging apparatus 1 and the correction application software are activated by turning on the imaging apparatus 1; thereafter, through display is continuously performed while the processes in steps S52 to S59 are sequentially executed.
  • In step S52, the positioning processing unit 12 acquires original positioning data (x M , y M , z M ).
  • the transmission unit 19 uploads (transmits) the original positioning data (x M , y M , z M ) acquired in step S52 under the control of the main control unit 21.
  • The image limitation selection unit 52 (see FIG. 8) provided in the server device SV then narrows down the evaluation target images in the image group 340 that are to be supplied to the similarity evaluation unit 53, following the method described in the first embodiment.
  • That is, the selection unit 52 selects from the image group 340 only the images of objects having locations near the position (x M , y M , z M ). As in the first embodiment, it is assumed that the plurality of evaluation target images selected here are the evaluation target images Q 1 to Q n . Note that the shooting direction data may also be uploaded and given to the selection unit 52, and the selection may be performed using the shooting direction data (further narrowing may be performed using the shooting direction data).
  • the server device SV transmits the selected plurality of evaluation target images to the imaging device 1 via the network NET together with the plurality of position data corresponding to the selected plurality of evaluation target images.
  • In step S54, the receiving unit 20 downloads (receives) the evaluation target images Q 1 to Q n for the objects around the position (x M , y M , z M ) together with the corresponding position data (x 1 , y 1 , z 1 ) to (x n , y n , z n ).
  • the normal captured image 310 and the distance image 320 are acquired by the acquisition process in step S55 (see FIGS. 7A and 7B).
  • The acquisition process in step S55 is the same as that in step S13 (see FIG. 6); however, the acquisition in step S55 may instead be performed at some point during the processes of steps S51 to S54.
  • In step S56, the positioning processing unit 12 and the shooting direction detection unit 13 acquire the latest original positioning data (x M , y M , z M ) and shooting direction data (θ, φ).
  • the main control unit 21 may acquire a focal length to be included in the additional data (see FIG. 5).
  • the object position data (x T , y T , z T ) is determined by executing the similar image search process in step S57 on the imaging device 1 side.
  • the main control unit 21 may be provided with a similarity evaluation unit 53 and an object position data acquisition unit 54 (see also FIG. 8).
  • a part including the similarity evaluation unit 53 and the object position data acquisition unit 54 included in the main control unit 21 may be called an arithmetic processing unit 34.
  • The evaluation target images Q 1 to Q n and the position data (x 1 , y 1 , z 1 ) to (x n , y n , z n ) downloaded in step S54 are supplied to the arithmetic processing unit 34, and the arithmetic processing unit 34 performs steps S32 to S34 of FIG. 9. Thereby, in step S57, the arithmetic processing unit 34 evaluates the similarity between the reference image and each evaluation target image (that is, derives the similarity indicating the comparison result by comparing the reference image with each evaluation target image).
  • Further, a specific similar image (extraction target image) that is a similar image of the reference image is specified and extracted from the evaluation target images Q 1 to Q n , and the position data associated with the specific similar image is extracted from the position data (x 1 , y 1 , z 1 ) to (x n , y n , z n ) as the object position data (x T , y T , z T ).
  • the main control unit 21 desirably sets the distance image acquired in step S55 as the reference image in step S57.
  • each evaluation object image is also a distance image for the corresponding object.
  • the normal captured image acquired in step S55 may be set as the reference image in step S57.
  • In step S58, the positioning data correction unit 33 performs the positioning data correction process.
  • The positioning data correction process in step S58 is the same as that in step S17 of FIG. 6; however, in step S58, the object position data (x T , y T , z T ), the distance image, and the shooting direction data (θ, φ) obtained in steps S57, S55, and S56 can be used. Thereafter, the process of step S59 is performed.
  • the process of step S59 is the same as that of step S18 of FIG.
  • the same effect as the first embodiment can be obtained by the second embodiment.
  • In addition, a similar image can be searched for on the imaging device 1 side without burdening the server device SV and without depending on the search capability of the server device SV, so faster processing is also expected.
  • A normal captured image and a distance image obtained by shooting, as well as arbitrary images (including evaluation target images) acquired via the network NET, may be accumulated and recorded in the database 16 together with the corresponding original positioning data, corrected positioning data, or position data (the same applies to the first embodiment described above and to other embodiments described later).
  • For example, the distance image acquired by the imaging device 1 and the corrected positioning data may be associated with each other and recorded in the database 16 (the same applies to the first embodiment described above and to other embodiments described later).
  • Information stored and recorded in the database 16 may be used for subsequent positioning data correction processing.
  • FIG. 17 is an operation flowchart of the positioning system according to the third embodiment.
  • In step S71, the imaging apparatus 1 and the correction application software are activated by turning on the imaging apparatus 1. Thereafter, through display is continuously executed while the processes in steps S72 to S80 are sequentially executed.
  • In step S72, the positioning processing unit 12 acquires original positioning data (x M , y M , z M ).
  • the transmission unit 19 uploads (transmits) the original positioning data (x M , y M , z M ) acquired in step S72 under the control of the main control unit 21.
  • Based on the uploaded original positioning data (x M , y M , z M ), the server device SV returns to the imaging device 1 a recommended shooting spot having a location near the position (x M , y M , z M ), and recommends that the user shoot at the recommended shooting spot.
  • Specifically, based on the uploaded original positioning data (x M , y M , z M ), the server device SV extracts from the database 51 an evaluation target image of an object having a location in the vicinity of the position (x M , y M , z M ).
  • the evaluation target image extracted here is called a recommended shooting image.
  • the position data stored in the database 51 and associated with the recommended shooting image is referred to as recommended shooting position data.
  • the recommended shooting image and the recommended shooting position data are downloaded (received) together with the corresponding navigation information by the receiving unit 20 in step S74.
  • The recommended shooting image is an image whose shooting is recommended for correcting the positioning data, and an image that the user is likely to want to shoot. Therefore, it is desirable that the object in the recommended shooting image be as famous an object as possible (for example, a temple at a tourist attraction).
  • each evaluation target image is composed of a color image for the target object and a distance image for the target object.
  • Accordingly, the recommended shooting image to be downloaded includes the recommended shooting color image 410 and the recommended shooting distance image 420 shown in the drawings.
  • the navigation information corresponding to the recommended shooting image is information for guiding the user to a position suitable for shooting the recommended shooting image, and includes, for example, map information.
  • The user moves to a position suitable for shooting the recommended shooting image according to the navigation information while referring to the recommended shooting color image 410 downloaded and displayed on the display unit 17, and then presses the shutter button 18a.
  • In step S75, the normal captured image 310 and the distance image 320 are acquired by the imaging device 1 (see FIGS. 7A and 7B).
  • In step S76, the positioning processing unit 12 and the shooting direction detection unit 13 acquire the latest original positioning data (x M , y M , z M ) and shooting direction data (θ, φ).
  • the main control unit 21 may acquire a focal length to be included in the additional data (see FIG. 5).
  • a similarity evaluation unit 53 and an object position data acquisition unit 54 are provided in the main control unit 21.
  • In step S77, the similarity evaluation unit 53 of the main control unit 21 sets the distance image 320 as the reference image, and then compares the reference image with the recommended shooting distance image 420 to evaluate the similarity between them.
  • Alternatively, the similarity evaluation unit 53 of the main control unit 21 may set the normal captured image 310 as the reference image, and then compare the reference image with the recommended shooting color image 410 to evaluate the similarity between them.
  • the method for deriving the similarity between two images is as described in the first embodiment.
  • The object position data acquisition unit 54 of the main control unit 21 checks whether the similarity corresponding to the above comparison result is equal to or greater than a predetermined threshold, thereby checking whether the image 410 or 420 corresponds to a similar image of the reference image.
  • When the similarity obtained in step S77 is equal to or greater than the predetermined threshold, the object position data acquisition unit 54 determines in step S78 that the image 410 or 420 is a similar image of the reference image, extracts the recommended shooting position data from the received content of the receiving unit 20, and sets the recommended shooting position data as the object position data (x T , y T , z T ).
  • The recommended shooting position data set as the object position data (x T , y T , z T ) is data representing the exact location of the object in the recommended shooting image. If the similarity in step S77 is not equal to or greater than the predetermined threshold, the process may return to step S75 to prompt re-shooting of the normal captured image and the distance image.
  • In step S79, the positioning data correction unit 33 performs the positioning data correction process.
  • The positioning data correction process in step S79 is the same as that in step S17 of FIG. 6; however, in step S79, the object position data (x T , y T , z T ), the distance image, and the shooting direction data (θ, φ) set or acquired in steps S78, S75, and S76 can be used. Thereafter, the process of step S80 is performed.
  • the process of step S80 is the same as that of step S18 of FIG.
  • According to the third embodiment as well, accurate correction of the positioning data is possible.
  • When an image uploaded by a friend of the user of the imaging apparatus 1 is downloaded as the recommended shooting image, it becomes possible to shoot at the same location as the friend's shooting location. Then, by uploading an image captured by the imaging device 1 to an image sharing site (for example, Flickr (registered trademark), Facebook (registered trademark), or Twitter (registered trademark)) on the network NET, the user of the imaging device 1 and the user's friends can also share image information in real time.
  • Two or more recommended shooting images may be extracted based on the original positioning data (x M , y M , z M ).
  • When two or more recommended shooting images are extracted, two or more sets of "recommended shooting image, recommended shooting position data, and navigation information" are downloaded by the receiving unit 20. The imaging apparatus 1 then displays the plurality of recommended shooting color images on the display unit 17 and prompts the user to select one of them. After the user selects one recommended shooting color image from the plurality of sets, the movement of the user and the processing of the imaging device 1 from step S74 onward proceed as described above.
  • the image pickup unit 11 is provided with a plurality of sets of image pickup devices, optical systems, and ranging light sources.
  • For example, the imaging unit 11 is provided with an image sensor IS 1 , an optical system and a distance measuring light source LT 1 for the image sensor IS 1 , an image sensor IS 2 , and an optical system and a distance measuring light source LT 2 for the image sensor IS 2 .
  • Each of the image sensors IS 1 and IS 2 is the same as the image sensor IS described above.
  • Each of the ranging light sources LT 1 and LT 2 may be the same as the above-described ranging light source LT.
  • The imaging elements IS 1 and IS 2 are arranged at different positions and can generate images having parallax with each other; the imaging regions of the image sensors IS 1 and IS 2 partially overlap.
  • The image sensors IS 1 and IS 2 function as the left eye and right eye of the imaging apparatus 1, respectively.
  • the distance measuring light source LT 1 irradiates light on the subject in the imaging region of the image sensor IS 1
  • the distance measuring light source LT 2 irradiates light on the subject in the imaging region of the image sensor IS 2 .
  • Each distance measuring light source may be formed of a plurality of light emitting elements.
  • The distance measuring light sources LT 1 and LT 2 may irradiate the subjects in the imaging regions of the imaging elements IS 1 and IS 2 with light having a plurality of wavelengths. More specifically, for example, each of the distance measuring light sources LT 1 and LT 2 is formed by an LED (Light Emitting Diode) that emits near-infrared light, and the emitted light of each LED may be modulated with a plurality of sine waves having different wavelengths (for example, several tens of sine waves) and projected onto the subject. The reflected light from the subject with respect to the light emitted from each LED is received by the imaging elements IS 1 and IS 2 .
  • The distance data generation unit 32 obtains the phase difference between the emitted light and the reflected light for each pixel based on the output image signal of the image sensor IS 1 , and calculates the subject distance from the phase difference for each pixel, so that a left-eye distance image based on the output image signal of the image sensor IS 1 can be generated. Similarly, the distance data generation unit 32 obtains the phase difference between the emitted light and the reflected light for each pixel based on the output image signal of the image sensor IS 2 , and obtains the subject distance from the phase difference for each pixel, so that a right-eye distance image based on the output image signal of the image sensor IS 2 can be generated.
  • the distance data generation unit 32 can generate a final distance image in which the influence of occlusion is suppressed by integrating the left eye distance image and the right eye distance image.
  • The final distance image may serve as the distance image (for example, the distance image 320 in FIG. 7B) to be generated by the distance data generation unit 32 in the first to third embodiments described above or in other embodiments described later.
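  • One simple way to integrate the two distance images, sketched below, is to fill pixels that one eye cannot see (occlusions) from the other eye and to average where both have valid depth. Representing invalid pixels as NaN, and assuming the two images are already registered to a common viewpoint, are both simplifying assumptions:

```python
import numpy as np

def merge_distance_images(left, right):
    """Integrate left-eye and right-eye distance images (float arrays with
    NaN marking pixels without a valid measurement) into a final distance
    image in which the influence of occlusion is suppressed."""
    merged = np.where(np.isnan(left), right, left)   # fill left-eye holes
    both = ~np.isnan(left) & ~np.isnan(right)
    merged[both] = 0.5 * (left[both] + right[both])  # average where both see
    return merged
```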
  • The image processing unit 31 of FIG. 1 can generate a left-eye normal captured image from the output image signal of the image sensor IS 1 , and a right-eye normal captured image from the output image signal of the image sensor IS 2 . In the process of generating these images, noise removal processing using a median filter or the like may be performed.
  • The left-eye or right-eye normal captured image may serve as the normal captured image (for example, the normal captured image 310 of FIG. 7A) to be generated by the image processing unit 31 in the first to third embodiments described above or in other embodiments described later.
  • the image processing unit 31 obtains the center-of-gravity coordinates of the object SUB in each of the left eye and right eye normal captured images.
  • The main control unit 21 can derive an accurate distance between the imaging device 1 and the object SUB from the barycentric coordinates and the distance data, and can perform the positioning data correction process using the derived distance.
  • As described above, the influence of occlusion can be suppressed by performing distance measurement using a plurality of image sensors. Also, when only the phase difference for a single wavelength is examined, there may be multiple distances that give the same phase (and as a result a large error may occur in distance measurement), but by projecting modulated waves having a plurality of wavelengths, the accuracy of distance measurement can be improved. If the phase difference is continuously measured by the image sensors, distance data can be generated in real time.
  • In the fourth embodiment, the imaging unit 11 is provided with two sets of the imaging element, the optical system, and the distance measuring light source, but three or more sets may be provided.
  • the distance data generation unit 32 may generate distance data using the principle of triangulation based on output image signals of two or more image sensors.
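  • Triangulation from two image sensors relates stereo disparity to subject distance by d = f·B / disparity, with f the focal length in pixels and B the baseline between the sensors; a minimal sketch:

```python
def triangulate_distance(disparity_px, focal_length_px, baseline_m):
    """Subject distance from stereo disparity: d = f * B / disparity.
    Returns None for zero disparity (subject effectively at infinity)."""
    if disparity_px == 0:
        return None
    return focal_length_px * baseline_m / disparity_px
```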
  • FIG. 20 is a schematic block diagram showing the overall configuration of the positioning system according to the fifth embodiment; the positioning system of FIG. 20 is formed by adding an intermediary device CC to the positioning system of FIG. 1.
  • The intermediary device CC is a device that has a communication function and can be connected to the network NET; it is, for example, a mobile phone (including a so-called smartphone), an information terminal, or a personal computer. According to this embodiment, even when the imaging device 1 is a device that cannot always connect to the network NET (for example, a Wi-Fi (registered trademark) certified device), correction of the positioning data and the like is possible via the intermediary device.
  • Although any image (including evaluation target images) and position data acquired via the network NET may be accumulated and recorded in the database 16, in the fifth embodiment the database 16 may instead be provided in the intermediary device CC. This reduces the data storage load on the imaging device 1.
  • FIG. 21 shows an access point 500 (hereinafter, AP 500) as an example of a specific object.
  • The imaging device 1 is, for example, a Wi-Fi (registered trademark) certified device and can connect to the network NET via the AP 500.
  • The AP 500 has a memory that stores AP position data (xT2, yT2, zT2) representing the location of the AP 500.
  • The AP 500 can transmit the AP position data (xT2, yT2, zT2) to the imaging apparatus 1, and the receiving unit 20 of the imaging apparatus 1 can receive that data.
  • The imaging apparatus 1 can notify the user, using the display unit 17 or the like, that the AP 500 should be photographed.
  • The user operates the imaging device 1 to photograph the AP 500 as an object.
  • A captured image of the AP 500 as an object is thereby obtained by the imaging device 1.
  • The captured image obtained here includes a distance image of the imaging region containing the AP 500.
  • The positioning data correction unit 33 can perform positioning data correction processing for correcting the original positioning data (xM, yM, zM) based on the AP position data (xT2, yT2, zT2) serving as the object position data, the distance image obtained by photographing the AP 500 as the object, and the shooting direction data at the time that distance image was captured. The specific contents of the positioning data correction processing are as described in the above embodiments.
  • The positioning data correction can thus be completed in a short time without requiring the server device SV.
  • The AP 500 may also transmit to the imaging apparatus 1 information indicating its own shape (for example, an image of the AP 500 itself) or information for identifying the AP 500 (for example, an identification mark unique to the AP 500 that can be observed by the imaging apparatus 1).
  • The shooting direction data detected and recorded by the shooting direction detection unit 13 and the shooting direction recording unit 14 may include a roll angle.
  • The roll angle is the angle of the imaging device 1 in the roll direction and is detected using a magnetic sensor, a gyro sensor, or the like.
  • The roll angle is the rotation angle produced by rotating the housing of the imaging device 1 around the optical axis while the optical axis of the imaging unit 11 is kept unchanged.
  • When the horizontal direction in a captured image (normal captured image or distance image) of the imaging device 1 is parallel to the horizontal plane, the roll angle is 0 degrees; as the absolute value of the roll angle increases, the horizontal direction of the captured image tilts further from the horizontal plane.
  • The main control unit 21 may correct the distance image or the normal captured image using the roll angle (that is, correct the tilt of the image's horizontal direction with respect to the horizontal plane), and the tilt-corrected image can be set as the reference image.
  • The original positioning data, the corrected positioning data, and any position data may omit the Z-axis component, that is, the altitude component. In that case, the posture angle φ may be omitted from the shooting direction data.
  • Each of the imaging device 1 and the server device SV can be configured by hardware, or by a combination of hardware and software.
  • A block diagram of a part realized by software represents a functional block diagram of that part.
  • A function realized using software may be described as a program, and the function may be realized by executing that program on a program execution device (for example, a computer).
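
By way of illustration only (this sketch is not part of the patent disclosure; the function and parameter names are hypothetical, and rectified stereo images with a known focal length and baseline are assumed), the triangulation-based alternative mentioned in the list above could look like this in Python:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic two-view triangulation: depth Z = f * B / d.

    disparity_px : per-pixel horizontal disparity between the left and
                   right images, in pixels (<= 0 means no match found)
    focal_px     : focal length expressed in pixels
    baseline_m   : distance between the two optical centers, in meters
    Returns a depth map in meters, with np.inf where disparity is invalid.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full(d.shape, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Example: a 10-pixel disparity with f = 1000 px and a 5 cm baseline
# corresponds to a subject distance of 1000 * 0.05 / 10 = 5 m.
```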

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Cameras In General (AREA)
  • Automatic Focus Adjustment (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

An image pick-up device (1) uploads original positioning data (xM, yM, zM) and a reference image (such as a distance image) based on the output of an image pick-up element (IS). A server device (SV) extracts from a database a plurality of evaluation target images (such as distance images) of subjects located near the position (xM, yM, zM), retrieves an image similar to the reference image from the extracted images, and sends back to the image pick-up device (1) the position data of the subject associated with the similar image. A positioning data correction unit (33) corrects the original positioning data on the basis of the received position data, the distance image, and image pick-up direction data.

Description

Electronic device

The present invention relates to an electronic device such as an imaging device.

Electronic devices, such as imaging devices, that can record positional information obtained using the Global Positioning System (GPS) together with a captured image are becoming increasingly common. Using the GPS function, the approximate current location of the user and the electronic device can be grasped. There is also a method in which the current location information of a mobile phone is transmitted to a server device, and the server device extracts an image related to that current location information and sends it back to the mobile phone (see, for example, Patent Document 1 below).

JP 2010-277326 A

The accuracy of position detection using the GPS function is limited, and the detected position may contain an error that is unacceptable to the user. The method of Patent Document 1 does not contribute to reducing this error.

An object of the present invention is therefore to provide an electronic device capable of accurately correcting positioning data.
A first electronic device according to the present invention includes: an imaging unit having an imaging element that outputs an image signal of a subject; a distance data generation unit that uses the output of the imaging element to generate distance data representing the distance between the electronic device and the subject; a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit; a positioning processing unit that performs positioning based on signals from satellites to generate original positioning data; a transmission unit that transmits a reference image based on the output of the imaging element to an external device; a reception unit that receives position data based on the reference image from the external device; and a positioning data correction unit that corrects the original positioning data based on the received position data, the distance data, and the shooting direction data.

Accurate correction of the positioning data can thereby be expected.

Specifically, for example, in the first electronic device, data relating to a position associated with a target image corresponding to the reference image may be received as the position data by the reception unit, and the data relating to the position associated with the target image may include location data of an object included in the target image.

Also, for example, in the first electronic device, the transmission unit may transmit the reference image and the original positioning data to the external device, and the received position data may be position data based on the reference image and the original positioning data.

By providing the original positioning data to the external device, the processing load for determining the position data to be returned to the electronic device can be reduced.
A second electronic device according to the present invention includes: an imaging unit having an imaging element that outputs an image signal of a subject; a distance data generation unit that uses the output of the imaging element to generate distance data representing the distance between the electronic device and the subject; a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit; a positioning processing unit that performs positioning based on signals from satellites to generate original positioning data; a transmission unit that transmits the original positioning data to an external device; a reception unit that receives, from the external device, a target image and position data based on the original positioning data; an arithmetic processing unit that compares the target image with a reference image based on the output of the imaging element and, according to the comparison result, extracts the position data from the content received by the reception unit; and a positioning data correction unit that corrects the original positioning data based on the extracted position data, the distance data, and the shooting direction data.

This, too, can be expected to enable accurate correction of the positioning data.

Specifically, for example, in the second electronic device, the position data may include location data of an object included in the target image.

Also, for example, in the second electronic device, the reception unit may receive a target image group and a position data group based on the original positioning data from the external device, and the arithmetic processing unit may extract, from the target image group, the image corresponding to the reference image as the target image, and extract the position data associated with that target image from the position data group.

Also, for example, in the first or second electronic device, a distance image based on the distance data may be used as the reference image.
Also, for example, in the first or second electronic device, the imaging unit may include a distance-measuring light source that irradiates the subject with light, and the distance data may be generated using the distance-measuring light source.

Also, for example, in the first or second electronic device, the imaging unit may include a plurality of imaging elements as the imaging element and a plurality of distance-measuring light sources corresponding to the plurality of imaging elements; the plurality of distance-measuring light sources may irradiate the subject with light having a plurality of wavelengths, and the distance data generation unit may generate the distance data using the output of each imaging element, which depends on the reflected light from the subject.

This promises more accurate distance data in which the influence of occlusion is suppressed.

Also, for example, in the first or second electronic device, a warning notification may be issued according to the amount of correction applied to the original positioning data by the positioning data correction unit.

According to the present invention, it is possible to provide an electronic device capable of accurately correcting positioning data.
FIG. 1 is a schematic overall block diagram of a positioning system according to an embodiment of the present invention. FIG. 2 is a diagram defining the X, Y, and Z axes according to the embodiment. FIGS. 3A and 3B are diagrams for explaining the azimuth angle and the posture angle of the imaging device. FIGS. 4A to 4C are diagrams for explaining distance-measuring light sources that can be provided in the imaging unit. FIG. 5 is a diagram showing the structure of an image file. FIG. 6 is an operation flowchart of the positioning system according to the first embodiment of the present invention. FIGS. 7A and 7B are diagrams showing examples of a normal captured image and a distance image acquired by the imaging device. FIG. 8 is an internal block diagram of the server device according to the first embodiment. FIG. 9 is a flowchart showing the procedure of similar image search processing. FIG. 10 is a diagram showing a plurality of evaluation target images and a plurality of position data. FIGS. 11 and 12 are diagrams for explaining the positioning data correction processing, each showing an example of the positional relationship between the imaging device and an object. FIGS. 13A and 13B are conceptual diagrams of positioning data corrected by the positioning data correction processing. FIG. 14 is a diagram showing a modification of some of the steps of FIG. 6. FIG. 15 is an operation flowchart of the positioning system according to the second embodiment of the present invention. FIG. 16 is an internal block diagram of the main control unit according to the second embodiment. FIG. 17 is an operation flowchart of the positioning system according to the third embodiment of the present invention. FIGS. 18A and 18B are diagrams showing a recommended-shooting color image and a recommended-shooting distance image that are downloaded, according to the third embodiment. FIG. 19 is a diagram showing the structure of the imaging unit according to the fourth embodiment of the present invention. FIG. 20 is a schematic overall block diagram of the positioning system according to the fifth embodiment of the present invention. FIG. 21 is a diagram showing an imaging device and an access point according to the sixth embodiment of the present invention.
Hereinafter, examples of embodiments of the present invention will be described concretely with reference to the drawings. In the drawings referred to, the same parts are denoted by the same reference numerals, and duplicate descriptions of the same parts are omitted in principle. In this specification, for simplicity of description, the name of a piece of information, a signal, a physical quantity, a state quantity, a member, or the like may be omitted or abbreviated by writing only the symbol or reference numeral that refers to it.

FIG. 1 is a schematic block diagram showing the overall configuration of a positioning system according to an embodiment of the present invention. The positioning system includes an electronic device 1 and a server device SV, which is an external device of the electronic device 1 and consists of a computer or the like. The electronic device 1 and the server device SV are connected to a network NET formed to include the Internet and the like, and can exchange arbitrary data via the network NET. The electronic device 1 includes the parts referred to by reference numerals 11 to 21. The electronic device 1 has a photographing function; hence, the electronic device 1 is hereinafter referred to as an imaging device (digital camera). An imaging device is a kind of electronic device. The imaging apparatus 1 may further realize functions other than the imaging function (for example, a telephone function, an Internet connection function, an e-mail transmission/reception function, and a music playback function), in which case the imaging apparatus 1 may be classified as a device other than an imaging apparatus (for example, a mobile phone or an information terminal).

The imaging unit 11 has an optical system, an aperture, and an imaging element (solid-state image sensor) IS such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, and photographs a subject using the imaging element IS. The imaging element IS photoelectrically converts the optical image of the subject incident through the optical system and outputs the electrical signal obtained by the photoelectric conversion, that is, the image signal of the subject.

The positioning processing unit 12 receives signals transmitted from satellites forming the Global Positioning System (GPS) and performs positioning of the imaging device 1 based on the received signals. Positioning of the imaging device 1 refers to processing for obtaining the position (location) of the imaging device 1; the obtained position includes the longitude, latitude, and altitude of the imaging device 1, although the altitude may be omitted. The result of positioning by the positioning processing unit 12, that is, data including the obtained longitude, latitude, and altitude of the imaging device 1, is output to the main control unit 21 as original positioning data. As shown in FIG. 2, an X axis, a Y axis, and a Z axis that are mutually orthogonal and intersect at an origin O are defined. The X and Y axes are parallel to the horizontal plane, and the Z axis is parallel to the direction of gravity. In the following, the longitude, latitude, and altitude of the position (location) of an arbitrary object are expressed as coordinate values x, y, and z on the X, Y, and Z axes, respectively, and data indicating the position (location) of the object corresponding to the coordinate values x, y, and z is called position data, or written as (x, y, z). The coordinate values x, y, and z of the imaging device 1 indicated by the original positioning data are denoted by the symbols xM, yM, and zM, respectively; thus, the position of the imaging device 1 represented by the original positioning data is (xM, yM, zM). With the origin O as the reference, the positive directions of the X and Y axes correspond to east and north, respectively, and the positive direction of the Z axis corresponds to the upward direction (opposite to the direction of gravity).
The shooting direction detection unit 13 generates shooting direction data by detecting the shooting direction of the imaging unit 11 (that is, the optical axis direction of the imaging unit 11). The shooting direction data includes an azimuth angle θ and a posture angle φ. Assuming that the imaging device 1 is placed at the origin O, the meanings of the azimuth angle θ and the posture angle φ are explained with reference to FIGS. 3A and 3B. The azimuth angle θ is an angle representing the azimuth of the shooting direction, and is detected using an electronic compass or the like. For example, as shown in FIG. 3A, the angle formed between a line segment extending north from the origin O and a line segment 301 extending from the origin O along the optical axis toward the subject can be defined as the azimuth angle θ, where θ = 90° when the line segment 301 extends west from the origin O and θ = 270° when it extends east. The posture angle φ is an angle representing the difference of the attitude of the imaging device 1 from a reference attitude, and is detected using a magnetic sensor, a gyro sensor, or the like. The attitude of the imaging device 1 when the shooting direction is parallel to the horizontal plane is the reference attitude. Therefore, as shown in FIG. 3B, the angle formed between the shooting direction (that is, the optical axis direction) and the horizontal plane can be defined as the posture angle φ, where φ > 0° when the shooting direction has an upward component and φ < 0° when it has a downward component.

The shooting direction recording unit 14 consists of a semiconductor memory or the like and temporarily records as much of the shooting direction data generated by the shooting direction detection unit 13 as needed. The memory unit 15 consists of a semiconductor memory, a magnetic disk, or the like, and records arbitrary information generated or acquired by the imaging device 1. Part or all of the memory unit 15 functions as a database 16. The display unit 17 is a display device having a display screen such as a liquid crystal display panel, and displays arbitrary video under the control of the main control unit 21. The operation unit 18 includes a shutter button 18a that receives still-image shooting instructions, a zoom button 18b that receives zoom magnification change instructions, and the like, and accepts various operations from the user. The contents of operations on the operation unit 18 are transmitted to the main control unit 21. The buttons 18a and 18b may be buttons on a touch panel that can be provided on the display unit 17.

The transmission unit 19 and the reception unit 20 form a communication unit. The transmission unit 19 is connected to the network NET and can transmit arbitrary information to any device other than the imaging device 1, including the server device SV. The reception unit 20 is connected to the network NET and can receive arbitrary information from any device other than the imaging device 1, including the server device SV.

The main control unit 21 is formed by a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and comprehensively controls the operation of each part in the imaging apparatus 1. The main control unit 21 includes an image processing unit 31, a distance data generation unit 32, and a positioning data correction unit 33; any of these may instead be considered to be provided outside the main control unit 21.
The image processing unit 31 generates a color image by applying predetermined image processing (noise reduction processing, demosaicing processing, color correction processing, edge enhancement processing, and so on) to the image signal from the imaging element IS. The color image is a two-dimensional image representing the intensity and color of light from the subject, and is one kind of image captured by the imaging unit 11. To distinguish it from the distance image described later, the color image generated by the image processing unit 31 is hereinafter called a normal captured image. In this specification, for simplicity of description, an image signal or image data representing an arbitrary image is also simply called an image.

The distance data generation unit (distance image generation unit) 32 uses the output image signal of the imaging element IS to detect the subject distance of each subject located within the shooting region of the imaging unit 11, and generates distance data representing the detection result. The subject distance of a given subject is the real-space distance between that subject and the imaging device 1 (or any imaging device photographing the subject). The distance data generation unit 32 can generate a distance image as the distance data. The distance image is a grayscale image in which the pixel value of each pixel holds the detected value of the subject distance. The processing for generating distance data and a distance image based on the output image signal of the imaging element IS is called distance data generation processing or distance image generation processing.

Any known method can be used to detect the subject distance and generate the distance data using the output of the imaging element IS. As shown in FIG. 4A, a distance-measuring light source LT that irradiates the subject with light may be provided in the imaging unit 11, and the distance data may be generated using the distance-measuring light source LT. The distance-measuring light source LT may be formed of a plurality of light-emitting elements.

The distance data may be generated using so-called TOF (Time Of Flight) ranging. More specifically, for example, the distance-measuring light source LT is formed by an LED (Light Emitting Diode) that emits near-infrared light, and the emitted light of the LED is modulated at about 10 MHz (megahertz) and projected onto the subject. The reflected light from the subject in response to the light emitted from the LED is received by the imaging element IS. The distance data generation unit 32 obtains the phase difference between the emitted light and the reflected light for each pixel based on the output image signal of the imaging element IS, and can calculate the subject distance from the phase difference for each pixel. Further, the emitted light of the LED may be modulated with a plurality of sine waves having mutually different wavelengths (for example, several tens of sine waves) and projected onto the subject. Looking only at the phase difference for a single wavelength, multiple distances can give the same phase (and consequently a large error can occur in distance measurement); projecting modulated waves having a plurality of wavelengths can improve the ranging accuracy. If the phase difference is measured continuously by the imaging element IS, distance data can be generated in real time.
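To make the arithmetic above concrete, the following is a minimal, hedged Python sketch (not the implementation of the distance data generation unit 32; the modulation frequencies, search range, and brute-force disambiguation are illustrative assumptions) of how a per-pixel phase difference maps to distance, and how measuring at two modulation frequencies can resolve the wrap-around ambiguity noted above:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """Single-frequency TOF: d = c * phi / (4 * pi * f).
    Unambiguous only up to c / (2 * f), about 15 m at 10 MHz."""
    return C * phase_rad / (4.0 * np.pi * mod_freq_hz)

def disambiguate_two_freqs(phi1, f1, phi2, f2, max_range_m=100.0):
    """Resolve phase wrap-around by trying integer wrap counts for each
    frequency and keeping the candidate pair whose distances agree best."""
    r1, r2 = C / (2 * f1), C / (2 * f2)          # unambiguous ranges
    d1, d2 = distance_from_phase(phi1, f1), distance_from_phase(phi2, f2)
    best, best_err = None, np.inf
    for k1 in range(int(max_range_m / r1) + 1):
        for k2 in range(int(max_range_m / r2) + 1):
            c1, c2 = d1 + k1 * r1, d2 + k2 * r2
            err = abs(c1 - c2)
            if err < best_err:
                best, best_err = 0.5 * (c1 + c2), err
    return best
```

Applying distance_from_phase per pixel to a phase map yields the grayscale distance image described above.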
Alternatively, as shown in FIG. 4B, a laser may be adopted as the distance-measuring light source LT, and the laser light may be scanned so that it sequentially irradiates each subject within the shooting region. In this case, the reflected light from the subject in response to the laser's emitted light is received by the imaging element IS, and the distance data generation unit 32 can obtain the distance data based on the output image signal of the imaging element IS. Further alternatively, the distance image generation processing may be realized by a so-called pattern irradiation method. That is, as shown in FIG. 4C, the distance-measuring light source LT is formed so that the subject is irradiated with laser light having a predetermined pattern (for example, a matrix pattern), and the distance data may be obtained based on the output image signal of the imaging element IS representing the reflected laser light (that is, based on the degree of distortion of the reflected pattern).

The positioning data correction unit 33 corrects the original positioning data as necessary, based on data obtained using the transmission unit 19 and the reception unit 20, and outputs the corrected original positioning data as corrected positioning data. The coordinate values x, y, and z of the imaging device 1 indicated by the corrected positioning data are denoted by the symbols xC, yC, and zC, respectively; that is, the position of the imaging device 1 indicated by the corrected positioning data is (xC, yC, zC). The correction method is described in detail later.
[Image recording processing]
The main control unit 21 can perform image recording processing. Image recording processing refers to recording, in the memory unit 15 (for example, a nonvolatile memory within the memory unit 15), a normal captured image based on the image signal read from the imaging element IS in response to a press of the shutter button 18a (corresponding to the full-press operation described later), together with additional data. At this time, for example, as shown in FIG. 5, an image file consisting of a header area and a body area is formed in the memory unit 15, and the additional data and the normal captured image are stored in the header area and the body area, respectively. The image file can conform, for example, to the Exif (Exchangeable image file format) file format. The additional data can include the positioning data of the shooting point of the normal captured image, as well as the focal length, F-number, ISO sensitivity, and so on at the time of shooting, and may further include the distance image. An image file in which a distance image, rather than a normal captured image, is stored in the body area may also be formed. The positioning data included in the additional data is either the original positioning data (xM, yM, zM) or the corrected positioning data (xC, yC, zC). The original positioning data (xM, yM, zM) or corrected positioning data (xC, yC, zC) obtained one after another may be accumulated and recorded in the database 16 in time series.
[Through display]
The main control unit 21 can perform through display using the display unit 17. Through display refers to processing that reads image signals from the imaging element IS at a predetermined frame rate, sequentially generates normal captured images, and updates the display unit 17 with the sequentially generated normal captured images. The resolution of the normal captured images generated for the purpose of through display may be lower than that of the normal captured images acquired and generated in the image recording processing.
[Log route recording processing]
The main control unit 21 can perform log route recording processing. Log route recording processing refers to processing that acquires positioning data periodically or intermittently and records the sequentially acquired positioning data in the memory unit 15 (database 16) in time series. As a result, the movement route of the user carrying the imaging device 1 is recorded. The positioning data acquired and recorded in the log route recording processing is, in principle, the original positioning data (xM, yM, zM); however, when correction by the positioning data correction unit 33 has been performed, it is the corrected positioning data (xC, yC, zC).
[Positioning data correction processing]
The main control unit 21 can execute positioning data correction processing, which corrects the positioning result of the positioning processing unit 12, using the positioning data correction unit 33. Specifically, for example, the CPU of the main control unit 21 reads correction application software stored in a nonvolatile program memory within the memory unit 15, and the positioning data correction processing is realized by the CPU executing the correction application software. However, the positioning data correction processing may also be realized by hardware alone.
More detailed embodiments of the positioning system based on the above configuration and operation are described below. As long as no contradiction arises, two or more of the embodiments described below may be combined with each other, and matters described in one embodiment may be applied to other embodiments.
<< First Embodiment >>
A first embodiment of the positioning system will be described. FIG. 6 is an operation flowchart of the positioning system according to the first embodiment. First, in step S11, power is turned on to the imaging apparatus 1, whereby the imaging apparatus 1 and the correction application software start up; thereafter, through display is executed continuously while the processing of steps S12 to S18 is executed in sequence (the through display may, however, be interrupted as appropriate). Here, it is assumed that the log route recording processing is executed continuously after the imaging apparatus 1 starts up (the same applies to the other embodiments described later).
In step S12, the positioning processing unit 12 and the shooting direction detection unit 13 acquire the original positioning data (xM, yM, zM) and the shooting direction data (θ, φ). At this time, the main control unit 21 may also acquire the focal length to be included in the additional data (see FIG. 5).

In step S13, the image processing unit 31 acquires a normal captured image from the output image signal of the imaging element IS, and the distance data generation unit 32 acquires a distance image by distance image generation processing. The obtained normal captured image is used for through display or recorded by the image recording processing. The shutter button 18a can be pressed in two stages: pressing the shutter button 18a halfway is called a half-press operation, and pressing it fully is called a full-press operation. In step S13, the image processing unit 31 and the distance data generation unit 32 may generate the normal captured image and the distance image based on the image signal read from the imaging element IS in response to a half-press or full-press operation. When the shutter button 18a is a button on a touch panel, the half-press or full-press operation may be replaced by an operation of touching the shutter button 18a on the touch panel with an operating body (a finger or a stylus), or an operation of touching the shutter button 18a with the operating body and then releasing it from the touch panel. Suppose now that the normal captured image and the distance image acquired in step S13 are the images 310 and 320 of FIGS. 7A and 7B, respectively. The normal captured image 310 and the distance image 320 are images obtained by photographing an object SUB, which is one of the subjects. In FIGS. 7A and 7B, the point SUBP represents the center of gravity of the object SUB in the normal captured image 310 and the distance image 320.

In step S14, the main control unit 21 sets the distance image acquired in step S13 as the reference image, and under the control of the main control unit 21, the transmission unit 19 uploads (transmits) the reference image together with its shooting position (that is, the original positioning data (xM, yM, zM) at the time the reference image was captured). Uploading refers to transmitting arbitrary information from the transmission unit 19 via the network NET to the server device SV (or an arbitrary site or the like).

In step S15, the server device SV executes similar image search processing based on the reference image and the original positioning data uploaded in step S14, thereby determining object position data (xT, yT, zT). Ideally, the object position data (xT, yT, zT) accurately indicates the location of the object SUB included in the reference image (the same applies to the other embodiments described later).
The similar image search processing will be described with reference to FIGS. 8 to 10. FIG. 8 is an internal block diagram of a server device SVA serving as the server device SV according to the first embodiment. FIG. 9 is a flowchart showing the procedure of the similar image search processing, which consists of steps S31 to S34. The server device SVA includes the parts referred to by reference numerals 51 to 54. All or part of the database 51 may, however, be provided in a device other than the server device SVA that is connected to the network NET. FIG. 10 shows an image group 340 consisting of a plurality of evaluation target images and a position data group 341 consisting of a plurality of position data. In practice, the image group 340 and the position data group 341 are stored in the database 51 after being stored in a plurality of image files conforming to a file format such as Exif. The i-th evaluation target image in the image group 340 is referred to by the symbol Qi (i is an integer). Position data is associated with each evaluation target image; the position data associated with the evaluation target image Qi is denoted (xi, yi, zi) and represents the location of the object included in the evaluation target image Qi.

For concreteness and simplicity of explanation, it is assumed that the reference image contains only one object and that each evaluation target image also contains only one object. As described above, when the reference image is a distance image captured by the imaging apparatus 1 (see step S14 of FIG. 6), each evaluation target image is also a distance image of the corresponding object.

The original positioning data (xM, yM, zM) transmitted from the imaging device 1 is given to the image limitation selection unit 52. In step S31, the image limitation selection unit 52 narrows down, based on the original positioning data (xM, yM, zM), the evaluation target images to be supplied from the image group 340 to the similarity evaluation unit 53. That is, the selection unit 52 selects only a limited part of the image group 340, namely the images of objects located near the position (xM, yM, zM), and supplies the selected evaluation target images to the similarity evaluation unit 53. The shooting direction data may also be uploaded and given to the selection unit 52, and the selection may be performed using the shooting direction data as well (further narrowing may be performed using the shooting direction data). Alternatively, only the reference image may be uploaded in step S14 of FIG. 6, without the original positioning data; in this case, however, the narrowing and selection of step S31 (FIG. 9) are not performed.
Suppose now that evaluation target images Q1 to Qn are selected and supplied to the similarity evaluation unit 53 (n is an integer of 2 or more). In step S32, the similarity evaluation unit 53 performs similarity evaluation processing for each of the evaluation target images Q1 to Qn. In the similarity evaluation processing, the similarity between the reference image and an evaluation target image is evaluated based on the two images; the greater the similarity of the evaluation target image Qi, the more similar Qi is to the reference image. The similarity obtained for the evaluation target image Qi is denoted SIMi. Specifically, for example, in the similarity evaluation processing the similarity evaluation unit 53 extracts image feature quantities (the shape, depth, edges, histogram, and so on of each object, including the target object) of the reference image and the evaluation target image Qi from their image data, and obtains the similarity SIMi by comparing the image feature quantities between the reference image and the evaluation target image Qi.

In step S33, the similarity evaluation unit 53 extracts, from the obtained similarities SIM1 to SIMn, those equal to or greater than a predetermined threshold, and extracts the evaluation target image corresponding to an extracted similarity as a similar image of the reference image. When a plurality of similarities are equal to or greater than the predetermined threshold, the evaluation target image corresponding to the maximum similarity can be specified as the similar image of the reference image. The evaluation target image specified and extracted as the similar image of the reference image is called the specific similar image (extracted target image).

In step S34, the object position data acquisition unit 54 extracts, from the position data (x1, y1, z1) to (xn, yn, zn) (in other words, from the position data group 341), the position data associated with the specific similar image as the object position data (xT, yT, zT). Thus, for example, when the similarity SIM2 is extracted as a similarity equal to or greater than the predetermined threshold and the evaluation target image Q2 is therefore extracted as the specific similar image, the data relating to the position associated with the evaluation target image Q2, that is, the position data (x2, y2, z2) representing the location of the object in the evaluation target image Q2, is extracted as the object position data (xT, yT, zT).
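Putting steps S31 to S34 together, the server-side search can be sketched schematically in Python (the similarity function, search radius, and threshold are placeholders; the text leaves the concrete feature comparison open):

```python
import math

def similar_image_search(reference, entries, cam_xyz, similarity,
                         radius_m=500.0, threshold=0.8):
    """entries: list of (evaluation_image, (x, y, z)) pairs from the database.

    S31: keep only images whose object location lies near the uploaded
         original positioning data.
    S32: score each surviving image against the reference image.
    S33: pick the best score, if it reaches the threshold.
    S34: return the position data associated with that image.
    """
    candidates = [(img, pos) for img, pos in entries
                  if math.dist(pos, cam_xyz) <= radius_m]                    # S31
    scored = [(similarity(reference, img), pos) for img, pos in candidates]  # S32
    if not scored:
        return None
    best_sim, best_pos = max(scored, key=lambda t: t[0])                     # S33
    return best_pos if best_sim >= threshold else None                      # S34
```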
Returning to FIG. 6: the object position data (xT, yT, zT) determined by the similar image search processing of FIG. 9 is downloaded (received) by the reception unit 20 in step S16, which follows step S15. Downloading refers to receiving arbitrary information from the server device SV via the network NET.

Thereafter, in step S17, the positioning data correction unit 33 performs positioning data correction processing that corrects the original positioning data (xM, yM, zM) based on the object position data (xT, yT, zT), the distance image serving as the distance data, and the shooting direction data (θ, φ). In the positioning data correction processing of step S17, the object position data (xT, yT, zT), distance image, and shooting direction data (θ, φ) obtained in steps S16, S13, and S12 can be used.

The positioning data correction processing will be explained with reference to FIG. 11 and related figures. Suppose, as shown in FIG. 11, that the object SUB is located 10 m east of the imaging device 1 (hence θ = 270°), so that the subject distance of the center of gravity SUBP of the object SUB in the distance image 320 is 10 m, and suppose that φ = 0° in the example of FIG. 11. In this case, the positioning data correction unit 33 regards the object position data (xT, yT, zT) as position data representing the exact location of the object SUB, and obtains the corrected positioning data according to (xC, yC, zC) = (xT - 10, yT, zT). That is, the positioning data correction unit 33 corrects the original positioning data (xM, yM, zM) from the positioning processing unit 12 to (xC, yC, zC) = (xT - 10, yT, zT). The unit of the coordinate values x, y, and z is meters.

Likewise, as shown in FIG. 12, if θ = 270° and 0° < φ < 90°, and the subject distance of the center of gravity SUBP of the object SUB in the distance image 320 is 10 m, the positioning data correction unit 33 regards the object position data (xT, yT, zT) as position data representing the exact location of the object SUB and obtains the corrected positioning data according to (xC, yC, zC) = (xT - 10·cos φ, yT, zT - 10·sin φ). That is, the positioning data correction unit 33 corrects the original positioning data (xM, yM, zM) from the positioning processing unit 12 to (xC, yC, zC) = (xT - 10·cos φ, yT, zT - 10·sin φ).
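Generalizing the two worked examples above to arbitrary θ and φ (this generalization is ours, derived from the axis and angle conventions defined earlier, and is not spelled out in the text), the correction amounts to back-projecting from the object's known location to the camera:

```python
import math

def corrected_position(obj_xyz, distance_m, azimuth_deg, posture_deg):
    """Conventions from the text: +X = east, +Y = north, +Z = up;
    azimuth theta is 0 deg toward north and 90 deg toward west;
    posture phi > 0 means the camera is tilted upward.
    The camera-to-object unit vector is then
    (-sin(theta)*cos(phi), cos(theta)*cos(phi), sin(phi)),
    and the camera sits at object - distance * that vector."""
    th, ph = math.radians(azimuth_deg), math.radians(posture_deg)
    ux = -math.sin(th) * math.cos(ph)
    uy = math.cos(th) * math.cos(ph)
    uz = math.sin(ph)
    xT, yT, zT = obj_xyz
    return (xT - distance_m * ux, yT - distance_m * uy, zT - distance_m * uz)

# Sanity checks against the text:
# FIG. 11: theta = 270, phi = 0, d = 10  ->  (xT - 10, yT, zT)
# FIG. 12: theta = 270, 0 < phi < 90     ->  (xT - 10*cos(phi), yT, zT - 10*sin(phi))
```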
 測位データ補正処理の後、ステップS18において、主制御部21は、画像ファイル内の測位データを(x,y,z)から(x,y,z)に補正することができると共に、ログ経路記録処理において記録される上記移動経路を補正することができる。図13(a)及び(b)は、測位データの補正イメージ図である。また、ステップS18において、スルー表示を行いつつ補正情報を表示部17に表示するようにしても良い。補正情報は、例えば、補正測位データ、原測位データ及び補正測位データ間の誤差、又は、補正測位データ及び対象物位置データに基づく撮像装置1から対象物SUBまでの正確な距離である。 After the positioning data correction process, in step S18, the main control unit 21 may correct the positioning data in the image file from (x M , y M , z M ) to (x C , y C , z C ). In addition, the moving route recorded in the log route recording process can be corrected. FIGS. 13A and 13B are correction image diagrams of positioning data. In step S18, the correction information may be displayed on the display unit 17 while performing through display. The correction information is, for example, corrected positioning data, an error between original positioning data and corrected positioning data, or an accurate distance from the imaging apparatus 1 to the target SUB based on the corrected positioning data and target position data.
 測位処理部12を用いれば、ユーザ及び撮像装置1の概ねの現在地を把握することができるが、位置の検出精度には限界がある。本実施例による補正方法を用いれば、シャッタ操作に応答して得た画像はもちろん、スルー表示用に得た画像からでも、位置の検出誤差を正確に修正することができ、ユーザ及び撮像装置1の現在地を正確に知ることが可能となる。ネットワーク上には様々な対象物についての画像が存在するため、撮像装置1にとっての対象物が有名な対象物(観光名所の建造物等)でなくても(例えば、展示会会場、ショッピングモール、無名のビル、路地裏の店舗であっても)、測位データの補正が可能である。また、距離画像を基準画像に設定し、距離画像を元に類似画像検索処理を行うことにより、撮像装置1の撮影環境が暗い状況下や、対象物が同系色の背景に紛れていて物体認識が困難な状況下においても、正確に対象物を判別することができ、結果、正確に測位データを補正することが可能である。 If the positioning processing unit 12 is used, the current location of the user and the imaging device 1 can be grasped, but the position detection accuracy is limited. If the correction method according to the present embodiment is used, the position detection error can be accurately corrected not only from the image obtained in response to the shutter operation but also from the image obtained for the through display. It becomes possible to know the current location of Since there are images of various objects on the network, even if the object for the imaging apparatus 1 is not a famous object (such as a building of a tourist attraction) (for example, an exhibition hall, a shopping mall, Positioning data can be corrected even if it is an anonymous building or an alley store. Further, by setting a distance image as a reference image and performing a similar image search process based on the distance image, the object recognition is performed when the shooting environment of the imaging apparatus 1 is dark or the object is confused with a background of similar colors. Even under difficult circumstances, it is possible to accurately determine the object, and as a result, it is possible to correct the positioning data accurately.
Step S14 may, however, be replaced with step S14' in FIG. 14. That is, in step S14 (step S14'), the main control unit 21 may set the normal captured image acquired in step S13 as the reference image. When the normal captured image is the reference image, each evaluation target image is a color image of the corresponding object (a two-dimensional image representing the intensity and color of the light traveling from the object into the image sensor of the device photographing it); the same applies to the other examples described later. However, when the normal captured image is set as the reference image, it becomes difficult to identify the object accurately in the situations mentioned above (for example, when the shooting environment of the imaging device 1 is dark), and as a result the correction accuracy of the positioning data may be lower than when the distance image is used as the reference image.
The display of correction information in step S18 may also include the display of warning information (hereinafter, warning display). In step S18, the main control unit 21 can obtain the distance ΔERR (a distance in real space) between the position indicated by the original positioning data and the position indicated by the corrected positioning data. The distance ΔERR represents the positioning error of the positioning processing unit 12 and corresponds to the amount by which the positioning data correction unit 33 corrects the original positioning data. When the distance ΔERR is equal to or greater than a predetermined threshold, the main control unit 21 may cause the display unit 17 to display warning information (characters, figures, icons, and so on) indicating that the positioning error is large. The warning display is one kind of warning notification suggesting a large positioning error, and the warning notification may be any notification that appeals to the human senses (particularly sight, hearing, and touch). That is, when the distance ΔERR is equal to or greater than the predetermined threshold, the main control unit 21 may notify the user that the positioning error is large by video output using the display unit 17, audio output using a speaker (not shown) in the imaging device 1, vibration of the imaging device 1, or the like (the same applies to the other examples described later). In the warning notification, "the positioning error is large" means that the positioning error (that is, the distance ΔERR) exceeds a predetermined allowable error.
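A minimal sketch of this warning logic follows; the 50 m threshold is a hypothetical placeholder, since the patent does not specify a value.

```python
import math

def positioning_error(original, corrected):
    """Distance dERR in real space between the positions indicated by the
    original and the corrected positioning data."""
    return math.dist(original, corrected)  # Python 3.8+

def warn_if_large(original, corrected, tolerance_m=50.0):
    """Issue the warning notification when the correction amount exceeds
    the allowable error (tolerance_m is a placeholder, not a value taken
    from the patent)."""
    d_err = positioning_error(original, corrected)
    if d_err >= tolerance_m:
        # Stand-in for the video/audio/vibration notification described above.
        print(f"WARNING: positioning error is large ({d_err:.1f} m)")
    return d_err
```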
<<Second Example>>
A second example of the positioning system will be described. FIG. 15 is an operation flowchart of the positioning system according to the second example. First, in step S51, power is applied to the imaging device 1, which starts the imaging device 1 and the correction application software; thereafter, while the through display runs continuously, the processes of steps S52 to S59 are executed in order.
In step S52, the positioning processing unit 12 acquires the original positioning data (xM, yM, zM). In the following step S53, the transmission unit 19 uploads (transmits) the original positioning data (xM, yM, zM) acquired in step S52, under the control of the main control unit 21. Based on the uploaded original positioning data (xM, yM, zM), the image limiting selection unit 52 provided in the server device SV (see FIG. 8) narrows down, following the method described in the first example, the evaluation target images in the image group 340 that are to be supplied to the similarity evaluation unit 53. That is, the selection unit 52 selects a limited part of the image group 340 so that only images of objects located near the position (xM, yM, zM) are supplied to the similarity evaluation unit 53. As in the first example, let the plurality of evaluation target images selected here be the evaluation target images Q1 to Qn. The shooting direction data may also be given to the selection unit 52 by uploading it, and the selection may use the shooting direction data as well (further narrowing may be performed using the shooting direction data).
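The server-side narrowing can be illustrated with the following sketch, in which the database structure, the search radius, and the field-of-view filter for the optional shooting-direction narrowing are all assumptions introduced here for illustration.

```python
import math

def select_candidates(image_db, origin, radius_m=500.0, heading=None, fov_deg=90.0):
    """Narrow the image group to objects located near the reported position.

    image_db  -- iterable of (image, (x, y, z)) records, standing in for
                 the server-side database 51 (structure is an assumption)
    origin    -- original positioning data (xM, yM, zM)
    radius_m  -- search radius around the reported position (placeholder)
    heading   -- optional azimuth theta, in degrees, for the further
                 narrowing by shooting direction mentioned above
    """
    xM, yM, zM = origin
    picked = []
    for image, (x, y, z) in image_db:
        if math.dist((x, y, z), (xM, yM, zM)) > radius_m:
            continue
        if heading is not None:
            # Keep only objects roughly within the camera's field of view.
            # The bearing convention here is an assumption.
            bearing = math.degrees(math.atan2(x - xM, y - yM)) % 360.0
            if abs((bearing - heading + 180.0) % 360.0 - 180.0) > fov_deg / 2:
                continue
        picked.append((image, (x, y, z)))
    return picked
```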
The server device SV according to the second example transmits the selected evaluation target images, together with the position data corresponding to them, to the imaging device 1 via the network NET. Seen from the other side, in step S54 the reception unit 20 downloads (receives) the evaluation target images Q1 to Qn of objects around the position (xM, yM, zM) together with the corresponding position data (x1, y1, z1) to (xn, yn, zn). Thereafter, the normal captured image 310 and the distance image 320 are acquired by the acquisition process of step S55 (see FIGS. 7(a) and 7(b)). The acquisition process of step S55 is the same as that of step S13 (see FIG. 6), although the acquisition in step S55 may also take place between the processes of steps S51 to S54.
In the following step S56, the positioning processing unit 12 and the shooting direction detection unit 13 acquire the latest original positioning data (xM, yM, zM) and shooting direction data (θ, φ). At this point, the main control unit 21 may also acquire the focal length to be included in the additional data mentioned above (see FIG. 5). In the second example, the object position data (xT, yT, zT) is determined by executing the similar image search process of step S57 on the imaging device 1 side. To realize this, as shown in FIG. 16, the main control unit 21 is preferably provided with a similarity evaluation unit 53 and an object position data acquisition unit 54 (see also FIG. 8). The part of the main control unit 21 consisting of the similarity evaluation unit 53 and the object position data acquisition unit 54 may be called the arithmetic processing unit 34. The evaluation target images Q1 to Qn and the position data (x1, y1, z1) to (xn, yn, zn) downloaded in step S54 are supplied to the arithmetic processing unit 34, which performs the processes of steps S32 to S34 of FIG. 9. Thereby, in step S57 the arithmetic processing unit 34 evaluates the similarity between the reference image and each evaluation target image (that is, the reference image is compared with each evaluation target image and a similarity expressing the comparison result is derived), a specific similar image (extraction target image) that is the similar image of the reference image is identified and extracted from the evaluation target images Q1 to Qn, and the position data associated with that specific similar image is extracted from the position data (x1, y1, z1) to (xn, yn, zn) as the object position data (xT, yT, zT).
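For illustration, a sketch of the device-side search of step S57 follows. The similarity metric here is a toy stand-in (negated mean absolute difference between same-size distance images), not the patent's actual comparison method.

```python
import numpy as np

def similarity(reference, candidate):
    """Toy similarity between two distance images (2-D numpy arrays of the
    same shape): the negated mean absolute difference, so larger means more
    similar. A stand-in for the comparison described in the first example."""
    a = reference.astype(np.float64)
    b = candidate.astype(np.float64)
    return -float(np.mean(np.abs(a - b)))

def extract_object_position(reference, candidates):
    """Pick the candidate Qi most similar to the reference image and return
    its associated position data as the object position data (xT, yT, zT).

    candidates -- list of (image, (xi, yi, zi)) pairs, e.g. from step S54
    """
    _, best_pos = max(candidates, key=lambda c: similarity(reference, c[0]))
    return best_pos
```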
The main control unit 21 desirably sets the distance image acquired in step S55 as the reference image for step S57. In this case, as described in the first example, each evaluation target image is also a distance image of the corresponding object. Alternatively, the normal captured image acquired in step S55 may be set as the reference image for step S57.
In step S58 following step S57, the positioning data correction unit 33 performs the positioning data correction process. The positioning data correction process of step S58 is the same as that of step S17 in FIG. 6, except that in step S58 the object position data (xT, yT, zT), the distance image, and the shooting direction data (θ, φ) obtained in steps S57, S55, and S56 can be used. Thereafter, the process of step S59 is performed; it is the same as that of step S18 in FIG. 6.
The second example yields the same effects as the first example. In the second example, similar images can be searched for on the imaging device 1 side without burdening the server device SV and without depending on the search capability of the server device SV, and in some cases a shorter search time can also be expected.
In the imaging device 1, the normal captured images and distance images obtained by shooting, as well as any images acquired via the network NET (including evaluation target images), may be accumulated in the database 16 together with the corresponding original positioning data, corrected positioning data, or position data (the same applies to the first example above and to the other examples described later). Also, when a user operation registering the shooting position of a normal captured image and distance image as a favorite place is performed on the operation unit 18, the distance image captured by the imaging device 1 and the corrected positioning data may be stored in the database 16 in association with each other (again, the same applies to the other examples). The information accumulated in the database 16 (including images and data) may be used in subsequent positioning data correction processes.
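A minimal sketch of such a favorite-place cache, in which an in-memory dict merely stands in for the database 16:

```python
favorites = {}  # stands in for database 16; a dict keyed by a user label

def register_favorite(label, distance_image, corrected_pos):
    """Store a distance image together with its corrected positioning data
    so that later positioning data correction runs can reuse them."""
    favorites[label] = {"distance_image": distance_image,
                        "position": corrected_pos}

def lookup_favorite(label):
    """Retrieve a previously registered favorite place, or None."""
    return favorites.get(label)
```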
<<Third Example>>
A third example of the positioning system will be described. FIG. 17 is an operation flowchart of the positioning system according to the third example. First, in step S71, power is applied to the imaging device 1, which starts the imaging device 1 and the correction application software; thereafter, while the through display runs continuously, the processes of steps S72 to S80 are executed in order.
In step S72, the positioning processing unit 12 acquires the original positioning data (xM, yM, zM). In the following step S73, the transmission unit 19 uploads (transmits) the original positioning data (xM, yM, zM) acquired in step S72, under the control of the main control unit 21.
Based on the uploaded original positioning data (xM, yM, zM), the server device SV returns to the imaging device 1 a recommended shooting spot located near the position (xM, yM, zM), recommending that the user shoot at that spot.
Specifically, based on the uploaded original positioning data (xM, yM, zM), the server device SV extracts from the database 51 an evaluation target image of an object located near the position (xM, yM, zM). The evaluation target image extracted here is called the recommended shooting image. The position data stored in the database 51 and associated with the recommended shooting image is called the recommended shooting position data.
In step S74, the recommended shooting image and the recommended shooting position data are downloaded (received) by the reception unit 20 together with the corresponding navigation information. The recommended shooting image is an image whose shooting is recommended for correcting the positioning data and which the user is likely to want to shoot. The object in the recommended shooting image should therefore be as famous as possible (for example, a temple at a tourist attraction). In the third example, each evaluation target image consists of a color image of the object and a distance image of the object. To make the description concrete, it is assumed below that the downloaded recommended shooting image consists of the recommended shooting color image 410 and the recommended shooting distance image 420 of FIGS. 18(a) and 18(b).
The navigation information corresponding to the recommended shooting image is information for guiding the user to a position suited to shooting the recommended shooting image, and includes, for example, map information. While referring to the recommended shooting color image 410 that has been downloaded and displayed on the display unit 17, the user moves, following the navigation information, to a position suited to shooting the recommended shooting image, and then presses the shutter button 18a. As a result, in step S75 the imaging device 1 acquires the normal captured image 310 and the distance image 320 (see FIGS. 7(a) and 7(b)). In the following step S76, the positioning processing unit 12 and the shooting direction detection unit 13 acquire the latest original positioning data (xM, yM, zM) and shooting direction data (θ, φ). At this point, the main control unit 21 may also acquire the focal length to be included in the additional data mentioned above (see FIG. 5).
In the third example as well, the similarity evaluation unit 53 and the object position data acquisition unit 54 are provided in the main control unit 21, as shown in FIG. 16. In step S77, the similarity evaluation unit 53 of the main control unit 21 sets the distance image 320 as the reference image and evaluates the similarity between the reference image and the recommended shooting distance image 420 by comparing the two. Alternatively, in step S77, the similarity evaluation unit 53 may set the normal captured image 310 as the reference image and evaluate the similarity between the reference image and the recommended shooting color image 410 by comparing them. The method of deriving the similarity between two images is as described in the first example. The object position data acquisition unit 54 of the main control unit 21 then checks whether the similarity corresponding to the comparison result is equal to or greater than a predetermined threshold, thereby verifying whether the image 410 or 420 qualifies as a similar image of the reference image.
When the similarity of step S77 is equal to or greater than the predetermined threshold, the object position data acquisition unit 54 of the main control unit 21 judges in step S78 that the image 410 or 420 is a similar image of the reference image, extracts the recommended shooting position data from the content received by the reception unit 20, and sets the recommended shooting position data as the object position data (xT, yT, zT). As is clear from the description above, the recommended shooting position data set as the object position data (xT, yT, zT) represents the exact location of the object in the recommended shooting image. When the similarity of step S77 is below the predetermined threshold, the flow may return to step S75 to prompt a reshoot of the normal captured image and the distance image.
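The threshold test of steps S77/S78 might look like the following sketch, where the 0.8 threshold and the normalized score are illustrative assumptions rather than values from the patent.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # placeholder for the predetermined threshold

def normalized_similarity(a, b):
    """Toy score in [0, 1]; 1.0 means identical same-shape image arrays."""
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    denom = float(np.abs(a).max() + np.abs(b).max()) or 1.0
    return 1.0 - float(np.mean(np.abs(a - b))) / denom

def match_recommended(reference, recommended_image, recommended_pos):
    """Adopt the recommended shooting position data as the object position
    data (xT, yT, zT) only when the captured reference image really shows
    the recommended object; otherwise signal that a reshoot is needed."""
    if normalized_similarity(reference, recommended_image) >= SIMILARITY_THRESHOLD:
        return recommended_pos
    return None  # caller returns to step S75 and prompts the user to reshoot
```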
In step S79 following step S78, the positioning data correction unit 33 performs the positioning data correction process. The positioning data correction process of step S79 is the same as that of step S17 in FIG. 6, except that in step S79 the object position data (xT, yT, zT), the distance image, and the shooting direction data (θ, φ) set or acquired in steps S78, S75, and S76 can be used. Thereafter, the process of step S80 is performed; it is the same as that of step S18 in FIG. 6.
As described above, in the third example the positioning data can be corrected within a flow designed for the user's convenience: recommended shooting spots at tourist sites and elsewhere (for example, best shooting points recommended by professional photographers) are presented to the user, who is then guided to them. Also, when an image uploaded by a friend of the user of the imaging device 1 is downloaded as the recommended shooting image, shooting can be done at the very place where the friend shot. Then, by uploading the images captured by the imaging device 1 to an image sharing site on the network NET (for example, Flickr (registered trademark), Facebook (registered trademark), or Twitter (registered trademark)), the user of the imaging device 1 and the user's friends can share image information in real time.
In the server device SV, two or more recommended shooting images may be extracted based on the original positioning data (xM, yM, zM). When two or more recommended shooting images are extracted, the reception unit 20 downloads two or more sets of "recommended shooting image, recommended shooting position data, and navigation information", and the imaging device 1 displays the recommended shooting color images of the sets on the display unit 17 and prompts the user to select one of them. In response, the user selects one recommended shooting color image from the sets. After this selection, the user's movement and the processing of the imaging device 1 following step S74 proceed as described above.
<<Fourth Example>>
A fourth example of the positioning system will be described. In the fourth example, the imaging unit 11 is provided with a plurality of sets of an image sensor, an optical system, and a distance-measuring light source. For example, as shown in FIG. 19, the imaging unit 11 is provided with the image sensor IS1 together with an optical system and a distance-measuring light source LT1 for the image sensor IS1, and with the image sensor IS2 together with an optical system and a distance-measuring light source LT2 for the image sensor IS2. A distance image can be generated even monocularly (that is, even when the number of image sensors is one), but, as is well known, a single eye cannot generate distance data for portions affected by occlusion. Using compound eyes suppresses the influence of occlusion.
Each of the image sensors IS1 and IS2 is the same as the image sensor IS described above, and each of the distance-measuring light sources LT1 and LT2 may be the same as the distance-measuring light source LT described above. The image sensors IS1 and IS2 are arranged at mutually different positions and can generate images having parallax with respect to each other, with their shooting regions partially overlapping. The image sensors IS1 and IS2 function as the left eye and the right eye of the imaging device 1, respectively. The distance-measuring light source LT1 irradiates subjects in the shooting region of the image sensor IS1 with light, and the distance-measuring light source LT2 irradiates subjects in the shooting region of the image sensor IS2 with light. Each distance-measuring light source may be formed of a plurality of light-emitting elements.
The distance-measuring light sources LT1 and LT2 may irradiate the subjects in the shooting regions of the image sensors IS1 and IS2 with light having a plurality of wavelengths. More specifically, for example, each of the distance-measuring light sources LT1 and LT2 may be formed of LEDs (Light Emitting Diodes) that emit near-infrared light, and the light emitted from each LED may be modulated with a plurality of sine waves of mutually different wavelengths (for example, several tens of sine waves) and projected onto the subject. The light reflected from the subject in response to the light emitted from each LED is received by the image sensors IS1 and IS2. The distance data generation unit 32 obtains, for each pixel, the phase difference between the emitted light and the reflected light based on the output image signal of the image sensor IS1, and calculates the subject distance from that phase difference for each pixel, thereby generating a left-eye distance image based on the output image signal of the image sensor IS1. Similarly, the distance data generation unit 32 obtains, for each pixel, the phase difference between the emitted light and the reflected light based on the output image signal of the image sensor IS2 and derives the subject distance from that phase difference for each pixel, thereby generating a right-eye distance image based on the output image signal of the image sensor IS2. The distance data generation unit 32 can then integrate the left-eye distance image and the right-eye distance image to generate a final distance image in which the influence of occlusion is suppressed. The final distance image may serve as the distance image to be generated by the distance data generation unit 32 in the first to third examples above or in the other examples described later (for example, the distance image 320 of FIG. 7(b)).
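For illustration, the per-pixel phase-based distance computation and the left/right fusion can be sketched as follows. The relation d = c·Δφ/(4πf) is the standard continuous-wave time-of-flight formula; the fusion function is a simplified stand-in for the integration described above, with None marking occluded pixels (an assumption about the data representation).

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """Per-pixel subject distance from the phase difference between the
    emitted and the reflected modulated light: d = c * dphi / (4 * pi * f).

    A single modulation frequency is unambiguous only up to
    c / (2 * mod_freq_hz); combining several frequencies, as described
    above, resolves that ambiguity."""
    return (C * phase_rad) / (4.0 * math.pi * mod_freq_hz)

def fuse_distance_images(left, right):
    """Merge left-eye and right-eye distance images so that pixels occluded
    in one view are filled from the other (None marks missing data)."""
    return [[l if l is not None else r for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]
```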
The image processing unit 31 of FIG. 1 can generate a left-eye normal captured image from the output image signal of the image sensor IS1 and a right-eye normal captured image from the output image signal of the image sensor IS2. In generating these images, noise removal using a median filter or the like may be performed. The left-eye or right-eye normal captured image may serve as the normal captured image to be generated by the image processing unit 31 in the first to third examples above or in the other examples described later (for example, the normal captured image 310 of FIG. 7(a)). The image processing unit 31 obtains the centroid coordinates of the object SUB in each of the left-eye and right-eye normal captured images. The main control unit 21 (distance data generation unit 32) can derive the exact distance between the imaging device 1 and the object SUB from the centroid coordinates and the distance data it has obtained, and can perform the positioning data correction process using the derived distance.
As described above, performing distance measurement with a plurality of image sensors suppresses the influence of occlusion. Also, if only the phase difference for a single wavelength is considered, multiple distances can produce the same phase (and, as a result, a large ranging error can arise), but projecting modulated waves of multiple wavelengths improves the ranging accuracy. If the image sensors measure the phase difference continuously, distance data can be generated in real time.
Although an example in which the imaging unit 11 is provided with two sets of an image sensor, an optical system, and a distance-measuring light source was described above, the imaging unit 11 may be provided with three or more such sets.
When the imaging unit 11 is provided with two or more image sensors having parallax with respect to each other, the distance-measuring light source can also be omitted from the imaging unit 11. In this case, the distance data generation unit 32 may generate the distance data using the principle of triangulation based on the output image signals of the two or more image sensors.
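A sketch of that triangulation, using the standard stereo relation Z = f·B/d; the symbols and units are assumptions for illustration, not values from the patent.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation: Z = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two image sensors, in meters
    disparity_px -- horizontal shift of the same scene point between the
                    left and right images, in pixels (must be positive)
    """
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: point at infinity "
                         "or a mismatched correspondence")
    return focal_px * baseline_m / disparity_px

# Example: f = 1200 px, baseline 6 cm, disparity 24 px -> Z = 3.0 m.
```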
<<Fifth Example>>
A fifth example of the positioning system will be described. The communication unit consisting of the transmission unit 19 and the reception unit 20 may be connected to the network NET via a mediation device. FIG. 20 is a schematic block diagram showing the overall configuration of the positioning system according to the fifth example; the positioning system of FIG. 20 is formed by adding the mediation device CC to the positioning system of FIG. 1. The mediation device CC is a device that has a communication function and can connect to the network NET, for example a mobile phone (including a so-called smartphone), an information terminal, or a personal computer. According to this example, even when the imaging device 1 is a device that cannot stay connected to the network NET at all times (for example, a Wi-Fi (registered trademark) certified device), the positioning data can still be corrected, among other things, via the mediation device.
It was noted above that any images acquired via the network NET (including evaluation target images) and position data may be accumulated in the database 16; in the fifth example, the database 16 may instead be provided in the mediation device CC. This reduces the data storage load on the imaging device 1.
<<Sixth Example>>
A sixth example will be described. The sixth example assumes that a specific object capable of transmitting its own position information to the imaging device 1 is located in the vicinity of the imaging device 1. FIG. 21 shows an access point 500 (hereinafter, AP 500) as an example of such a specific object. The imaging device 1 is, for example, a Wi-Fi (registered trademark) certified device and can connect to the network NET via the AP 500.
The AP 500 has a memory holding AP position data (xT2, yT2, zT2) representing the location of the AP 500. While communication between the imaging device 1 and the AP 500 is possible, the AP 500 can transmit the AP position data (xT2, yT2, zT2) to the imaging device 1, and the reception unit 20 of the imaging device 1 can receive it. Upon receiving the AP position data, the imaging device 1 can notify the user, using the display unit 17 or the like, that the AP 500 should be photographed. In response, the user operates the imaging device 1 and has it photograph the AP 500 as the object. As a result, the imaging device 1 obtains a captured image of the AP 500 as the object; the captured image obtained here includes a distance image of the shooting region containing the AP 500.
The positioning data correction unit 33 can perform the positioning data correction process, which corrects the original positioning data (xM, yM, zM), based on the object position data given by the AP position data (xT2, yT2, zT2), the distance image obtained by photographing the AP 500 as the object, and the shooting direction data at the time that distance image was shot. The specific content of the positioning data correction process is as described in the examples above.
The sixth example also yields the same effects as the first example. In the sixth example, the positioning data correction can be completed in a short time without even requiring the server device SV. In the positioning data correction process, so that the imaging device 1 can recognize exactly which object in the distance image is the AP 500, the AP 500 may transmit to the imaging device 1 information indicating its own shape and the like (for example, an image of the AP 500 itself) or information identifying the AP 500 (for example, an identification mark unique to the AP 500 that the imaging device 1 can observe).
<<Modifications and the Like>>
The embodiments of the present invention can be modified in various ways as appropriate within the scope of the technical ideas set forth in the claims. The embodiments above are merely examples of embodiments of the present invention, and the meanings of the terms of the present invention and of its constituent elements are not limited to those described in the embodiments above. The specific numerical values given in the description above are mere examples and, naturally, can be changed to various other values. Notes 1 to 3 below apply to the embodiments described above; the contents of the notes can be combined arbitrarily as long as no contradiction arises.
[Note 1]
The shooting direction data detected and recorded by the shooting direction detection unit 13 and the shooting direction recording unit 14 may include a roll angle. The roll angle is the angle of the imaging device 1 in the roll direction and is detected using a magnetic sensor, a gyro sensor, or the like; it is the rotation angle of the motion that rotates the housing of the imaging device 1 about the optical axis while the optical axis of the imaging unit 11 stays fixed. When the horizontal direction of a captured image (normal captured image or distance image) of the imaging device 1 is parallel to the horizontal plane, the roll angle is 0 degrees, and as the absolute value of the roll angle increases, the horizontal direction of the captured image tilts away from the horizontal plane. In each of the examples above, including the first to third examples, the main control unit 21 can set as the reference image a distance image or normal captured image corrected by the roll angle (an image in which the tilt of the horizontal direction of the distance image or normal captured image relative to the horizontal plane has been corrected). This makes the similar image search process, the similarity evaluation process, and the positioning data correction process independent of the roll angle.
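A sketch of such a roll compensation, assuming numpy-array images and the availability of scipy.ndimage (neither is specified by the patent):

```python
from scipy import ndimage

def deroll(image, roll_deg):
    """Rotate a captured image (distance image or normal captured image)
    back by the measured roll angle so that its horizontal axis becomes
    parallel to the horizontal plane.

    image    -- 2-D numpy array (distance image) or HxWx3 color array
    roll_deg -- roll angle from the magnetic or gyro sensor, in degrees
    """
    # Rotating by -roll cancels the camera's rotation about the optical axis;
    # reshape=False keeps the original frame, order=1 is bilinear resampling.
    return ndimage.rotate(image, -roll_deg, reshape=False, order=1)
```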
[Note 2]
The posture angle φ may be removed from the shooting direction data. Also, the original positioning data, the corrected positioning data, and any position data (including the object position data) may lack the Z-axis component, which is the altitude component. When the Z-axis component is not included in the original positioning data, the corrected positioning data, and the position data, the posture angle φ may, in particular, be removed from the shooting direction data.
[Note 3]
Each of the imaging device 1 and the server device SV can be configured by hardware or by a combination of hardware and software. When the imaging device 1 or the server device SV is configured using software, a block diagram of a part realized by software represents a functional block diagram of that part. A function realized using software may be described as a program, and the function may be realized by executing the program on a program execution device (for example, a computer).
DESCRIPTION OF REFERENCE SIGNS
1 Imaging device
11 Imaging unit
12 Positioning processing unit
13 Shooting direction detection unit
19 Transmission unit
20 Reception unit
32 Distance data generation unit
33 Positioning data correction unit
34 Arithmetic processing unit
52 Image limiting selection unit
53 Similarity evaluation unit
54 Object position data acquisition unit
IS, IS1, IS2 Image sensor
LT, LT1, LT2 Distance-measuring light source
NET Network
SV, SVA Server device

Claims (10)

1. An electronic device comprising:
an imaging unit having an image sensor that outputs an image signal of a subject;
a distance data generation unit that generates, using the output of the image sensor, distance data representing the distance between the electronic device and the subject;
a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit;
a positioning processing unit that performs positioning based on signals from satellites to generate original positioning data;
a transmission unit that transmits a reference image based on the output of the image sensor to an external device;
a reception unit that receives position data based on the reference image from the external device; and
a positioning data correction unit that corrects the original positioning data based on the received position data, the distance data, and the shooting direction data.

2. The electronic device according to claim 1, wherein data relating to a position associated with a target image corresponding to the reference image is received as the position data by the reception unit, and the data relating to the position associated with the target image includes location data of an object included in the target image.

3. The electronic device according to claim 1 or 2, wherein the transmission unit transmits the reference image and the original positioning data to the external device, and the received position data is position data based on the reference image and the original positioning data.

4. An electronic device comprising:
an imaging unit having an image sensor that outputs an image signal of a subject;
a distance data generation unit that generates, using the output of the image sensor, distance data representing the distance between the electronic device and the subject;
a shooting direction detection unit that generates shooting direction data by detecting the shooting direction of the imaging unit;
a positioning processing unit that performs positioning based on signals from satellites to generate original positioning data;
a transmission unit that transmits the original positioning data to an external device;
a reception unit that receives, from the external device, a target image and position data based on the original positioning data;
an arithmetic processing unit that compares the target image with a reference image based on the output of the image sensor and extracts the position data from the content received by the reception unit according to the comparison result; and
a positioning data correction unit that corrects the original positioning data based on the extracted position data, the distance data, and the shooting direction data.

5. The electronic device according to claim 4, wherein the position data includes location data of an object included in the target image.

6. The electronic device according to claim 4 or 5, wherein the reception unit receives a target image group and a position data group based on the original positioning data from the external device, and the arithmetic processing unit extracts, as the target image, an image corresponding to the reference image from the target image group and extracts the position data associated with the target image from the position data group.

7. The electronic device according to any one of claims 1 to 6, wherein a distance image based on the distance data is used as the reference image.

8. The electronic device according to any one of claims 1 to 7, wherein the imaging unit has a distance-measuring light source that irradiates the subject with light, and the distance data is generated using the distance-measuring light source.

9. The electronic device according to any one of claims 1 to 7, wherein the imaging unit has a plurality of image sensors as the image sensor and a plurality of distance-measuring light sources corresponding to the plurality of image sensors, the plurality of distance-measuring light sources irradiate the subject with light having a plurality of wavelengths, and the distance data generation unit generates the distance data using the output of each image sensor, which depends on light reflected from the subject.

10. The electronic device according to any one of claims 1 to 9, wherein a warning notification is issued in accordance with the amount of correction applied to the original positioning data by the positioning data correction unit.
PCT/JP2012/081876 2012-01-30 2012-12-10 Electronic device WO2013114732A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012016108A JP2013156111A (en) 2012-01-30 2012-01-30 Electronic device
JP2012-016108 2012-01-30

Publications (1)

Publication Number Publication Date
WO2013114732A1 true WO2013114732A1 (en) 2013-08-08

Family

ID=48904793

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/081876 WO2013114732A1 (en) 2012-01-30 2012-12-10 Electronic device

Country Status (2)

Country Link
JP (1) JP2013156111A (en)
WO (1) WO2013114732A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08184436A (en) * 1994-12-28 1996-07-16 Konica Corp Position measuring equipment and camera with information recording function
JP2001257920A (en) * 2000-03-13 2001-09-21 Fuji Photo Film Co Ltd Camera system
JP2006093861A (en) * 2004-09-21 2006-04-06 Olympus Corp Electronic imaging apparatus
JP2011234257A (en) * 2010-04-30 2011-11-17 Canon Marketing Japan Inc Imaging apparatus, captured image display system, control method, captured image display method and program


Also Published As

Publication number Publication date
JP2013156111A (en) 2013-08-15

Similar Documents

Publication Publication Date Title
US9565364B2 (en) Image capture device having tilt and/or perspective correction
CN107113415B (en) The method and apparatus for obtaining and merging for more technology depth maps
KR102143456B1 (en) Depth information acquisition method and apparatus, and image collection device
US9159169B2 (en) Image display apparatus, imaging apparatus, image display method, control method for imaging apparatus, and program
US9699375B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
JP6398472B2 (en) Image display system, image display apparatus, image display method, and program
JP2010057057A (en) Place name registration device and place name registration method
JP2014112302A (en) Prescribed area management system, communication method, and program
JP2008099268A (en) Apparatus and method for tagging id on photograph on the basis of geographical relative position
KR20090019184A (en) Image reproducing apparatus which uses the image files comprised in the electronic map, image reproducing method for the same, and recording medium which records the program for carrying the same method
JP2006003132A (en) Three-dimensional surveying apparatus and electronic storage medium
WO2015183490A1 (en) Methods and apparatus for position estimation
JP2020510928A (en) Image display method and electronic device
JP2011058854A (en) Portable terminal
KR101296601B1 (en) The camera control system and method for producing the panorama of map information
JP2015231101A (en) Imaging condition estimation apparatus and method, terminal device, computer program and recording medium
JP2017090420A (en) Three-dimensional information restoration device and three-dimensional information restoration method
JP6623568B2 (en) Imaging device, imaging control method, and program
JP6115113B2 (en) Predetermined area management system, predetermined area management method, and program
WO2013114732A1 (en) Electronic device
JP6610741B2 (en) Image display system, image display apparatus, image display method, and program
KR101733792B1 (en) Method and apparatus for correcting position
CN115699075A (en) Generating panoramas using a mobile camera
JP2017005370A (en) Imaging device and imaging method
US20200005832A1 (en) Method for calculating position coordinates and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12867629

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12867629

Country of ref document: EP

Kind code of ref document: A1