WO2016157406A1 - Image acquisition device, image file generation method, and image file generation program - Google Patents

Image acquisition device, image file generation method, and image file generation program

Info

Publication number
WO2016157406A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
position information
image
acquisition device
image acquisition
Prior art date
Application number
PCT/JP2015/060106
Other languages
English (en)
Japanese (ja)
Inventor
清水 宏
鈴木 基之
橋本 康宣
西島 英男
荒井 郁也
Original Assignee
日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority to PCT/JP2015/060106
Publication of WO2016157406A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor

Definitions

  • The present invention relates to an image acquisition apparatus that captures and stores images, and in particular to an apparatus having a function of storing, together with the captured image, the positional information of the image acquisition apparatus at the time of shooting.
  • Digital image acquisition devices are widely used that obtain a two-dimensional photographed image by projecting an image of a subject through a lens onto a camera sensor (an array of pixels composed of semiconductors) and measuring the amount of light reaching each pixel.
  • The captured image data produced by such an image acquisition device is compressed by a predetermined image compression method to reduce the size of the image file, and attribute information in a format called Exif (Exchangeable image file format) is added to the compressed image file.
  • The attribute information includes, in addition to information on the shooting conditions of the image data (the image acquisition device and lens used, the focal length and aperture value of the lens, the shutter speed, the sensor sensitivity, and so on), the position information of the image acquisition device at the time of shooting; this attribute information is stored in the image file.
  • Patent Document 1 discloses a technique in which a camera includes a GPS position acquisition function and acquires, in addition to position information at the time of shooting, the movement locus of the user carrying the camera.
  • Artificial satellite information is received, particularly at the initial activation of the GPS position acquisition function, and position information can be calculated according to the operation schedule of the satellites at the current time. The camera position can therefore be obtained at any point on the earth where radio waves from the satellites can be received.
  • However, the position information that can be acquired in this way is only that of the camera, that is, of the photographer; the position information of the subject cannot be acquired.
  • An object of the present invention is to provide a technique that allows the subject position information of a subject to be added to an image file together with the current position information of the image acquisition device at the time the subject is photographed.
  • In order to achieve this object, an image acquisition device according to the present invention includes: a unit that acquires current position information of the image acquisition device; a subject position information acquisition unit that identifies the subject by collating a shooting region captured by the image acquisition device with subject electronic information including coordinates constituting the subject, and acquires subject position information of the identified subject; and an image file generation unit that generates an image file including the current position information and the subject position information acquired by the subject position information acquisition unit.
  • According to the present invention, the subject position information of the subject can be added to the image file together with the current position information of the image acquisition device at the time the subject is photographed.
  • FIG. 1 is a configuration diagram of a communication system including an image acquisition device having a position information acquisition function according to Embodiment 1.
  • FIG. 2 is a hardware configuration diagram of the image acquisition device according to Embodiment 1.
  • FIG. 3 is a diagram illustrating an example of a viewfinder image, the screen shown when a subject is observed with the image acquisition apparatus according to Embodiment 1.
  • FIG. 4 is a software configuration diagram of the image acquisition apparatus according to Embodiment 1.
  • FIG. 5 is an explanatory diagram of the shooting axis of the image acquisition apparatus according to Embodiment 1.
  • FIG. 6 is an explanatory diagram showing the contents of an image file handled by the image acquisition apparatus according to Embodiment 1.
  • FIG. 7 is an explanatory diagram illustrating an operation example of the image acquisition device according to Embodiment 1.
  • FIG. 8 is an explanatory diagram illustrating an operation example of the image acquisition device according to Embodiment 1.
  • FIGS. 9A to 9C are explanatory diagrams showing an operation example of the image acquisition apparatus according to Embodiment 1.
  • FIG. 6 is an explanatory diagram illustrating an example of data in which a maximum distance to a subject selected by a lock-on mark is set for each subject by the image acquisition device according to the first embodiment.
  • FIG. 3 is a diagram showing an overview of overall processing of the image acquisition device according to the first embodiment.
  • FIGS. (a) to (c) are explanatory diagrams showing a method according to Embodiment 2 for selecting a subject that is not near the center of the finder.
  • FIGS. (a) to (c) are diagrams showing an overview of the corresponding processing.
  • FIG. (a) is a diagram schematically showing an overview of a configuration example, and FIG. (b) is a schematic diagram of a configuration example of an optical-finder camera.
  • Embodiment 1 of the present invention will now be described in detail with reference to the drawings. (Embodiment 1) <System configuration>
  • FIG. 1 is a configuration diagram of a communication system including an image acquisition apparatus 1000 having a position information acquisition function according to the first embodiment.
  • the communication system includes an image acquisition device 1000, a wireless router 1050, and a database 1010 connected to the image acquisition device 1000 via the wireless router 1050.
  • The image acquisition device 1000 is a digital camera: an image of the subject 1020 is focused on an image sensor by an optical lens, and the luminance and color of the projected subject image are detected by the plurality of pixels constituting the sensor, yielding a digital image composed of multiple pixels. The present invention is applicable not only to such a camera but also to a conventional film camera, provided it has a storage medium that stores position information in electronic form for each frame of the film.
  • The image acquisition apparatus 1000 is equipped with a GPS unit that acquires current position information indicating the position from which the image acquisition apparatus 1000 shoots.
  • the GPS unit receives radio waves from the GPS satellite 1040 and acquires current position information of the camera.
  • The GPS unit 3050 can receive radio waves from a plurality of GPS satellites; it receives radio waves from at least three satellites, and the basic information obtained from the received signals is the exact time and the coordinates of each satellite at that time (satellite orbit information). The current position is then calculated from the distance to each satellite, which is obtained from the signal travel time and the propagation speed of radio waves (300,000 km/s).
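The distance-based position calculation described above can be sketched as a small multilateration example. This is an illustrative toy, not the apparatus's actual algorithm: satellite coordinates are invented, the receiver clock is assumed exact (as the passage assumes), and a fourth satellite is used so that subtracting the first sphere equation from the others yields a linear 3x3 system.

```python
import math

def trilaterate(sats, dists):
    """Recover (x, y, z) from four satellite positions and measured
    distances. Subtracting the first sphere equation from the rest
    linearizes the system into A p = b, solved by Gaussian elimination."""
    (x0, y0, z0), d0 = sats[0], dists[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(sats[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0**2 - di**2
                 + xi**2 - x0**2 + yi**2 - y0**2 + zi**2 - z0**2)
    # Gaussian elimination with partial pivoting on the 3x3 system.
    n = 3
    M = [row + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        pivot = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[pivot] = M[pivot], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    p = [0.0] * n
    for r in range(n - 1, -1, -1):
        p[r] = (M[r][n] - sum(M[r][k] * p[k]
                              for k in range(r + 1, n))) / M[r][r]
    return tuple(p)

# Toy data: true receiver position (1, 2, 3), four invented satellites.
true_pos = (1.0, 2.0, 3.0)
sats = [(10, 0, 20), (0, 12, 18), (-8, -5, 25), (5, 9, 30)]
dists = [math.dist(s, true_pos) for s in sats]
print(trilaterate(sats, dists))  # approximately (1.0, 2.0, 3.0)
```

In a real receiver the measured distances also contain a shared clock-bias term, which is precisely why a fourth satellite is required in practice.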
  • The image acquisition apparatus 1000 can also determine its approximate current position using radio waves from the wireless router 1050 or from a mobile phone base station (not shown). Furthermore, the image acquisition apparatus 1000 can receive, via short-range wireless communication such as Bluetooth (registered trademark), the current position obtained by a nearby GPS-capable portable information terminal, and use the received position as the current position information at the time of shooting.
  • When the subject 1020 is photographed with the image acquisition device 1000, there are cases where it is actually more useful to store the location of the subject 1020 (subject position information 1002) rather than the position information of the image acquisition device. One approach is to acquire the subject position information 1002 with a separate device near the subject, transmit it to the image acquisition device 1000, and store it in association with the captured image. However, this requires a device other than the image acquisition apparatus 1000 to be placed near the subject, and the photographer must approach the subject 1020 at least once; for a distant subject such as a mountain, it therefore becomes difficult to acquire the position of the subject 1020 at the time of shooting.
  • the image acquisition apparatus 1000 can photograph a predetermined range as the photographing region 1060 around the photographing axis 1030 that coincides with the center line of the lens.
  • The subject 1020 on the photographing axis 1030 is treated as the main photographing target within the photographing region 1060, and the subject position information 1002 of this subject 1020 is acquired.
  • the imaging axis 1030 is a line connecting the center of the lens of the image acquisition apparatus 1000 and the center of the imaging area 1060.
  • the database 1010 stores subject electronic information in which the name of the subject 1020, subject position information 1002, and three-dimensional image data of the subject 1020 are associated with each other.
  • the three-dimensional image data is data for drawing the subject 1020 and is composed of coordinates for specifying the position of each point constituting the subject 1020 in the three-dimensional space.
  • the database 1010 may be included in the image acquisition apparatus 1000.
  • the image acquisition apparatus 1000 may be configured to be able to be updated to the latest data by periodically acquiring data stored in the database 1010 from an external server (database).
  • The image acquisition apparatus 1000 receives the three-dimensional image data and the subject position information 1002 of the subject 1020 on the shooting axis 1030 from the database 1010 via the wireless router 1050. The apparatus then determines whether the received 3D image data matches the shape of the subject 1020 in the shooting area 1060, and generates an image file based on the subject position information 1002, the name, and other attributes of the matched subject 1020. Note that the image acquisition apparatus 1000 may also receive the three-dimensional image data, the subject position information 1002, and so on directly from the database 1010 without passing through the wireless router 1050. <Hardware configuration of image acquisition device>
  • FIG. 2 is a hardware configuration diagram of the image acquisition apparatus 1000 according to the first embodiment.
  • The image acquisition apparatus 1000 includes a CPU 3000, which is the central information processing unit; a shutter button 3010 that is pressed during shooting; a sensor (camera sensor) 3020; a signal processing DSP 3030; and an encoder/decoder 3040.
  • the image acquisition apparatus 1000 is basically configured as a computer system as an example.
  • Sensor 3020 is an image sensor that converts an optical image collected by a lens (not shown) into an electrical signal.
  • the signal processing DSP 3030 performs signal processing of the sensor 3020.
  • The sensor 3020, the signal processing DSP 3030, and the encoder/decoder 3040 are not only connected to the bus 3001; the output signal from the sensor 3020 may also be sent directly to the signal processing DSP 3030 and the encoder/decoder 3040 for video signal processing. In this way the large video signal does not pass through the bus 3001, the bus is not occupied by image data, and the camera can perform other operations while compressing a captured image.
  • the encoder / decoder 3040 compresses the video signal composed of RGB obtained by the signal processing DSP 3030 using a compression method such as discrete cosine transform or Huffman coding. Note that the encoder / decoder 3040 may have a function of compressing not only a captured still image but also a moving image.
  • the GPS unit 3050 acquires position information indicating the current position of the image acquisition apparatus 1000.
  • The G sensor 3060 measures the elevation angle of the image acquisition device 1000 from its orientation and from the acceleration generated when the device is moved.
  • the geomagnetic sensor 3070 measures the azimuth angle of the image acquisition device 1000, for example.
  • the wireless LAN 3080 performs wireless communication between the camera and an external device such as a portable information terminal, or obtains the current position using a signal of a wireless communication base station.
  • the flash memory 3090 stores a program for controlling the entire camera and basic constants.
  • the SD-RAM 3100 is a work memory for program execution, and stores GPS satellite orbit information that is sequentially updated, position information that is acquired by GPS, and the like.
  • The clock 3110 is used for attaching a time code to the image information stored at the time of photographing and for measuring position information by GPS.
  • the operation switch 3130 accepts various operations of the image acquisition apparatus 1000 such as changing the setting contents of the image acquisition apparatus 1000, for example.
  • the infrared light receiving unit 3151 receives an instruction from the outside such as a shutter operation of the image acquisition apparatus 1000 by an infrared remote controller or the like.
  • the remote control I / F 3150 converts the output signal output from the infrared light receiving unit 3151 into digital data for use as a control signal for the image acquisition apparatus 1000.
  • the short-range wireless communication unit 3160 performs communication between the image acquisition apparatus 1000 and an external device such as a portable information terminal via short-range wireless (for example, Bluetooth (registered trademark)).
  • the EVF / LCD (display) 3120 displays a finder image (described later, FIG. 3) of the subject received by the sensor 3020 during shooting.
  • the EVF / LCD (display) 3120 is used for visually confirming image data that has already been taken and stored in an external memory 3141 to be described later.
  • the EVF / LCD (display) 3120 is used for confirming / changing the setting contents of the image acquisition apparatus 1000.
  • a finder image displayed on the EVF / LCD 3120 will be described with reference to FIG.
  • the EVF / LCD displays a viewfinder image 7000 that is a screen when the photographer observes the subject with the camera.
  • In the viewfinder image 7000, the subject candidates are displayed: a first subject 4030, a second subject 4040, and a third subject 4050.
  • a lock-on mark 7010 is displayed on the viewfinder image 7000, and this lock-on mark 7010 is usually displayed at a position where the subject can be most easily captured, that is, near the center of the screen of the viewfinder image 7000.
  • The photographer aligns the lock-on mark 7010 with the target subject; for example, the photographer selects the second subject 4040 and presses the shutter button halfway to set AF (Auto Focus) on the second subject 4040. The second subject 4040 is thereby selected as the subject for which the photographer wants to record subject position information, and pressing the shutter button fully acquires the photographed image data. Thereafter, an image file in which shooting information including the current position information and the subject position information is added to the image data compressed by the encoder/decoder is stored in the external memory.
  • FIG. 4 is a software configuration diagram of the image acquisition apparatus 1000 according to the first embodiment.
  • the image acquisition apparatus 1000 includes an image data acquisition unit 210, a subject position information acquisition unit 220, a matching processing unit 230, and an image file generation unit 240.
  • The subject position information acquisition unit 220 identifies the subject by collating the photographing region photographed by the image acquisition apparatus 1000 with three-dimensional image data composed of coordinates that specify, in three-dimensional space, the position of each point constituting a subject stored in the database, and acquires the subject position information of the identified subject.
  • the subject position information acquisition unit 220 requests all the subject position information and three-dimensional image data included in the imaging region specified by the image data acquisition unit 210 from the database.
  • the database transmits subject position information and three-dimensional image data to the image acquisition apparatus 1000 in response to a request.
  • the image acquisition apparatus 1000 receives subject position information and three-dimensional image data transmitted from the database.
  • the subject position information acquisition unit 220 acquires subject position information and three-dimensional image data transmitted from the database.
  • the subject position information acquisition unit 220 calculates a shooting axis line that connects the center of the lens of the image acquisition apparatus 1000 and the center of the shooting area.
  • The imaging axis will be described in detail with reference to FIG. 5.
  • FIG. 5 is a diagram showing a vector of the imaging axis 1030.
  • In FIG. 5, the X-axis 6010 points north, the Y-axis 6020 points east, and the Z-axis 6030 points upward.
  • In this X-Y-Z space, the vector of the imaging axis 1030 can be expressed by two angles: the direction angle 6060 measured from the X-axis 6010 and the elevation angle 6050.
  • the imaging axis 1030 is based on the current position information (origin 6000) of the image acquisition device 1000 acquired by the GPS unit, the direction angle 6060 measured by the geomagnetic sensor, and the elevation angle 6050 acquired by the G sensor. Calculated.
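The two-angle representation of the imaging axis can be sketched as follows. The conventions follow FIG. 5 (X north, Y east, Z up; direction angle measured from the X-axis toward Y, elevation measured up from the horizontal plane); the function name is hypothetical.

```python
import math

def shooting_axis_vector(azimuth_deg, elevation_deg):
    """Unit vector of the imaging axis in the X(north)/Y(east)/Z(up)
    frame of FIG. 5, from the direction angle and elevation angle."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),   # north component
            math.cos(el) * math.sin(az),   # east component
            math.sin(el))                  # up component

# A camera pointing due east (azimuth 90 degrees) with no elevation
# yields a vector of approximately (0, 1, 0).
print(shooting_axis_vector(90, 0))
```

Anchored at the current position (origin 6000), this vector plus the GPS position fully determines the imaging axis as a ray in space.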
  • Whether a subject is the target subject is determined from the relationship between the imaging axis 1030 and the constituent planes of the subject's 3D image data (that is, whether the axis has an intersection with a constituent plane).
  • The subject position information acquisition unit 220 calculates the imaging axis based on the current position information of the image acquisition apparatus 1000, the elevation angle, and the direction angle. Then, the subject position information acquisition unit 220 identifies a subject that intersects the shooting axis (a subject containing an intersection with the shooting axis). Specifically, the subject position information acquisition unit 220 selects, from the 3D image data sets transmitted from the database, those that intersect the calculated imaging axis. For example, when any of the coordinates constituting the shooting axis coincides with any of the coordinates constituting a set of three-dimensional image data, the unit identifies that three-dimensional image data as the 3D image data of the subject.
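For the rectangular-parallelepiped models used later (FIG. 7/8), the intersection test between the imaging axis and the constituent surfaces can be illustrated with a standard ray versus axis-aligned-box "slab" test. This is a common-graphics technique substituted for illustration, not the patent's own coordinate-coincidence procedure; the coordinates and tolerance are invented.

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the ray origin + t * direction (t >= 0) hit the
    axis-aligned box [box_min, box_max]? Returns the entry distance t
    along the ray, or None when there is no intersection."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:   # ray parallel to this slab and outside it
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:     # slabs no longer overlap: miss
                return None
    return t_near

# Camera at the origin looking north (+X); a toy building 50-60 m ahead.
print(ray_hits_box((0, 0, 0), (1, 0, 0), (50, -5, 0), (60, 5, 30)))  # 50.0
```

The returned entry distance also gives the distance from the camera to each intersected subject, which is useful when the closest candidate must be chosen.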
  • Depending on the type of subject electronic information, the subject position information acquisition unit 220 acquires either the subject position information associated with the subject electronic information whose image data matches that of a subject included in the shooting area, or the subject position information of a subject that overlaps a pre-designated range around the photographing axis connecting the center of the lens of the image acquisition device and the center of the photographing region.
  • the matching processing unit 230 converts the 3D image data specified by the subject position information acquisition unit 220 into 2D image data.
  • The matching processing unit 230 extracts the image data of the subject to be matched from the captured image data; for example, it extracts the image data of a subject imaged within a predetermined range from the center of the imaging region. Then, the matching processing unit 230 determines whether the image based on the converted two-dimensional image data and the image based on the image data extracted from the captured image data match or approximate each other.
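The 3D-to-2D conversion that precedes the matching check can be sketched with a simple pinhole projection. This is a minimal assumption-laden sketch: the camera is taken to sit at `cam_pos` looking along +X with no rotation, the focal-length scale is normalized, and the actual match/approximate comparison against the extracted image region is omitted.

```python
def project_point(point, cam_pos, f=1.0):
    """Pinhole projection of a world point into normalized image
    coordinates, assuming a camera at cam_pos looking along +X
    with Y to the right and Z up (illustrative convention)."""
    x = point[0] - cam_pos[0]          # depth along the imaging axis
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if x <= 0:
        return None                    # point is behind the camera
    return (f * y / x, f * z / x)

# Project the four front corners of the toy 50 m building: the projected
# outline is what would be compared against the captured subject image.
corners = [(50, -5, 0), (50, 5, 0), (50, -5, 30), (50, 5, 30)]
print([project_point(c, (0, 0, 0)) for c in corners])
```

A real implementation would additionally apply the camera's rotation (from the direction and elevation angles) and lens calibration before comparing the projected silhouette with the extracted region.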
  • The image data acquisition unit 210 identifies the imaging region to be imaged by the image acquisition device 1000 based on the current position information acquired by the GPS unit, the elevation angle measured by the G sensor, and the azimuth angle measured by the geomagnetic sensor.
  • the image data acquisition unit 210 acquires captured image data for drawing an image in the imaging region. Then, the image data acquisition unit 210 generates a thumbnail image based on the captured image data. The captured image data is compressed by an encoder / decoder to generate compressed image data.
  • The image file generation unit 240 generates shooting information A and shooting information B, the latter including the current position information and the subject position information. The image file generation unit 240 then generates an image file (described later, FIG. 6) containing the generated shooting information A, shooting information B, a thumbnail image, and the compressed image data, and stores the generated image file in the external memory. <Image information file>
  • FIG. 6 is an explanatory diagram showing the contents of the image file 2010 handled by the image acquisition apparatus 1000 according to the first embodiment of the present invention.
  • the image file 2010 includes shooting information A2020, shooting information B2030, a thumbnail image 2040, and compressed image data 2050.
  • Shooting information A2020 indicates the type of information related to the shot image 2000 stored in the image file 2010.
  • the thumbnail image 2040 is a reduced image of the captured image 2000.
  • The compressed image data 2050 is produced by compressing the captured image 2000 with a transform/encoding method such as the discrete cosine transform combined with Huffman coding, which reduces the data amount and increases storage and read efficiency.
  • The information about the captured image 2000 shown in the shooting information B2030 includes, for example, the shooting date and time, the storage date and time, the camera name used for shooting, the lens name used for shooting, the shutter speed, the aperture value, the film mode (for example, reversal mode or black-and-white mode), the ISO sensitivity indicating the gain applied to the sensor output at the time of shooting, the current position information indicating the position from which the image acquisition device 1000 shot the captured image 2000, the subject position information, and a subject name indicating the name of the subject.
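As an illustration only, the FIG. 6 layout (shooting information A, shooting information B, thumbnail, compressed image data) can be mimicked with a length-prefixed container. Real Exif files use TIFF-style tags rather than JSON, `zlib` stands in for the DCT/Huffman compression, and all field values below are invented.

```python
import json
import struct
import zlib

def build_image_file(raw_image, shooting_info_b):
    """Toy container mirroring FIG. 6: shooting information A (a table
    of contents), shooting information B (conditions plus current and
    subject position), a 'thumbnail', and 'compressed' image data.
    Each section is length-prefixed so it can be read back on its own."""
    compressed = zlib.compress(raw_image)   # stand-in for DCT + Huffman
    thumbnail = raw_image[:16]              # stand-in for a reduced image
    info_a = {"sections": ["info_b", "thumbnail", "image"]}
    a = json.dumps(info_a).encode()
    b = json.dumps(shooting_info_b).encode()
    parts = [a, b, thumbnail, compressed]
    return b"".join(struct.pack(">I", len(p)) + p for p in parts)

info_b = {
    "datetime": "2015:03:31 10:00:00",          # hypothetical values
    "camera": "example-camera",
    "current_position": (35.6, 139.7, 40.0),    # lat, lon, altitude
    "subject_position": (35.36, 138.73, 3776.0),
    "subject_name": "example-subject",
}
blob = build_image_file(b"\x00" * 1024, info_b)
print(len(blob))
```

Because each section carries its own length prefix, shooting information B alone can be read back without decompressing the image, which parallels the extraction of only the shooting information B2030 discussed below.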
  • Since the shooting information A2020, the shooting information B2030, the thumbnail image 2040, and the compressed image data 2050 are handled together as one image file 2010, they can be copied as a unit from the image acquisition apparatus 1000 to another device. Because the related information is kept together, the image file 2010 can be handled without losing the shooting information B2030. It is also possible to extract from the image file only the shooting information B2030, which indicates at what date and time, from which location, and in which direction the subject was viewed. <Camera operation>
  • FIGS. 7 and 8 are explanatory diagrams illustrating an operation example of the image acquisition apparatus 1000 according to Embodiment 1. FIG. 7 shows the layout of the image acquisition device 1000, the first subject 4030, and the second subject 4040 observed from the side, and FIG. 8 shows the same layout observed from an oblique direction.
  • The image acquisition apparatus 1000 is positioned so that images of the first subject 4030 and the second subject 4040 can be acquired, with the imaging axis 1030 extending in the lens direction.
  • the current position 4001 of the image acquisition apparatus 1000 can be indicated by coordinate data represented by three numerical values of latitude, longitude, and altitude on the earth.
  • The three-dimensional image 4010 drawn based on the three-dimensional image data has a rectangular parallelepiped shape and consists of six constituent surfaces 4011 in total.
  • the subject position information acquisition unit 220 calculates the presence / absence of an intersection between the configuration surface 4011 constituting the three-dimensional image 4010 and the imaging axis 1030. Then, the subject position information acquisition unit 220 identifies the first subject 4030 and the second subject 4040 where the intersection is present as subject candidates.
  • The matching processing unit 230 converts the three-dimensional image data for drawing the first subject 4030 specified by the subject position information acquisition unit 220, and the three-dimensional image data for drawing the second subject 4040, into two-dimensional image data. Then, the matching processing unit 230 performs a matching check to determine whether the image of each subject as viewed along the imaging axis 1030 of the image acquisition apparatus 1000 matches or approximates the image of the corresponding building included in the image captured by the photographer.
  • The photographer mainly photographs the main subject, that is, the subject to which the subject position information is to be added. The main subject is, for example, the AF-locked target or the target set by the lock-on mark described later.
  • The matching processing unit 230 selects the subject that matches or approximates as a result of the matching check. The matching check makes it possible to determine whether the main subject is the first subject 4030 or the second subject 4040.
  • the subject position information acquisition unit 220 acquires subject position information corresponding to three-dimensional image data that matches or approximates the image data of the subject included in the imaging region. As a result, an appropriate one of the first subject position 4031 that is the position of the first subject 4030 and the second subject position 4041 that is the position of the second subject 4040 can be acquired as subject position information. Then, more appropriate subject position information can be added to the image file.
  • The first subject 4030 is the subject candidate closest to the image acquisition apparatus 1000, and the second subject 4040, at the second subject position 4041, is the next-closest candidate. Note that, instead of the subject determined to match or approximate as a result of the matching check, the subject position information acquisition unit 220 may select as the subject the first subject 4030, whose intersection with the imaging axis is closest to the image acquisition device 1000.
  • FIGS. 9A to 9C are explanatory diagrams illustrating an operation example of the image acquisition apparatus 1000 according to the first embodiment.
  • the imaging area 1060 includes a first subject 4030, a second subject 4040, and a third subject 4050 in front of the image acquisition apparatus 1000.
  • the first subject 4030, the second subject 4040, and the third subject 4050 are included in the range of the lock-on mark 7010.
  • The subject position information acquisition unit identifies, as a subject for which subject position information is to be obtained, any subject that partially or entirely overlaps a predetermined range (for example, the range in which the lock-on mark 7010 is displayed) centered on the shooting axis connecting the center of the lens of the image acquisition device and the center of the shooting region.
  • the subject position information acquisition unit calculates whether or not a part of or all of the lock-on mark 7010 overlaps the three-dimensional configuration surface of each subject. Then, the subject position information acquisition unit identifies the first subject 4030, the second subject 4040, and the third subject 4050, which are partially or entirely overlapped with the lock-on mark 7010, as subject candidates.
  • the subject position information acquisition unit specifies a subject having a short distance from the image acquisition device as a target for acquiring subject position information.
  • the subject position information acquisition unit calculates the distance from the lens of the image acquisition device 1000 to the first subject 4030, the distance from the lens to the second subject 4040, and the distance from the lens to the third subject 4050. Then, the subject position information acquisition unit selects the first subject 4030, which has the smallest distance value (is the closest), as the subject.
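A minimal sketch of this nearest-candidate selection, assuming each candidate overlapping the lock-on mark has known 3-D coordinates; the subject names, positions, and plain Euclidean-distance metric below are illustrative assumptions, not values from the embodiment.

```python
import math

def select_nearest_subject(camera_pos, candidates):
    """Return the candidate whose coordinates are closest to the lens.

    camera_pos: (x, y, z) of the lens.
    candidates: list of (name, (x, y, z)) for the subjects that partially
    or entirely overlap the lock-on mark. All values are illustrative.
    """
    def dist(point):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(camera_pos, point)))
    return min(candidates, key=lambda c: dist(c[1]))

# Camera at the origin; three hypothetical candidates in front of it.
candidates = [
    ("first_subject", (0.0, 30.0, 0.0)),
    ("second_subject", (5.0, 80.0, 0.0)),
    ("third_subject", (-4.0, 120.0, 0.0)),
]
print(select_nearest_subject((0.0, 0.0, 0.0), candidates)[0])  # first_subject
```

If the camera moves, the same rule can flip the selection to a different candidate, which is exactly the behavior described for FIG. 9B.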
  • In FIG. 9B, as in FIG. 9A, all the subjects that partially overlap the lock-on mark 7010 are selected as candidates.
  • In addition, a wire frame image 4042 (an image based on the subject electronic information) generated from the three-dimensional image data of the subject is superimposed and displayed on the image of the selected subject (the subject from which subject position information is acquired at the time of shooting).
  • the photographer can easily see which subject is selected when the shutter button is pressed.
  • the layout of the imaging region 1060, the imaging axis 1030, the lock-on mark 7010, the first subject 4030, and the second subject 4040 of the image acquisition apparatus 1000 is as shown in FIG. 9A.
  • the height of the first subject 4030 is lower than the height of the second subject 4040.
  • Either the first subject 4030 or the second subject 4040 may end up selected at the moment the shutter button is pressed, owing to slight vertical movement of the image acquisition apparatus 1000 during shooting. In the example shown in FIG. 9B, the second subject 4040, which is the closest to the lens of the image acquisition device 1000 among the subjects partially or entirely overlapping the lock-on mark 7010, is selected as the subject.
  • This state is displayed on the EVF/LCD.
  • By displaying the wire frame image 4042 on the image of the selected subject, the selection can be presented to the photographer more clearly than by merely placing the lock-on mark on the subject.
  • the photographer can select and shoot the subject intended by the photographer with more certainty.
  • As shown in FIG. 9B, even when the first subject 4030 and the second subject 4040 are close to each other and the photographing axis 1030 lies on the borderline between the positions of the subjects, it can be clearly shown to the photographer which subject has definitely been selected.
  • FIG. 9C is a view of the state in which the wire frame image 4042 is superimposed and displayed on the image of the selected subject, as in FIG. 9B, as viewed from above.
  • the example shown in FIG. 9C differs in that the subject position information of the collated subject is generated by the image acquisition apparatus 1000 instead of using the subject position information stored in the database. This is effective when the position information of the subject has a large error, or when the subject is not a very large facility, for example, when one of several subjects composed of a plurality of buildings is selected.
  • By calculating the centroid point 9000 of the two-dimensional figure obtained by projecting, from above, the three-dimensional shape collated with the subject to be photographed, and acquiring the calculated centroid point 9000 as the coordinates of the subject, it is possible to generate subject position information that is far more consistent with the photographed subject than the position of an attached facility or the representative position of a large subject.
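The centroid of the projected top-view footprint can be computed with the standard shoelace formula; the rectangular footprint below is made-up example data, not coordinates from the embodiment.

```python
def footprint_centroid(polygon):
    """Centroid of a simple 2-D polygon (the top-down footprint of the
    matched 3-D shape), via the shoelace formula.

    polygon: list of (x, y) vertices in order (either winding works).
    """
    area2 = cx = cy = 0.0
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        cross = x0 * y1 - x1 * y0  # twice the signed area of this edge's triangle
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a = area2 / 2.0
    return (cx / (6.0 * a), cy / (6.0 * a))

# A 4 x 2 rectangular footprint: centroid at (2, 1).
print(footprint_centroid([(0, 0), (4, 0), (4, 2), (0, 2)]))  # (2.0, 1.0)
```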
  • The lock-on mark 7010 has so far been described as a circle, but for calculating the positional relationship with the three-dimensional data, a rectangular-parallelepiped or square shape is simpler to compute, and using such a shape has little adverse effect on the photographer's ease of shooting. Further, in selecting subjects, it is not necessary to treat every subject within the range of the lock-on mark 7010 as a candidate; depending on the focal length of the lens, the distance to the farthest candidate can be limited, for example to a short range when the subject to be photographed is a general house and to a long range when it is a mountain or a lake, so that unnecessary extraction calculations are avoided.
  • The three-dimensional image data of the subject, the related subject position information and subject name, and the map information that aggregates them may be downloaded to the image acquisition device 1000 each time they are acquired from the database, with the processing performed by the image acquisition device 1000 or by a computer; conversely, all of the above three-dimensional data may be stored in the built-in memory so that all processing is performed within the image acquisition apparatus 1000.
  • The depression angle may likewise be stored in association with the captured image.
  • The captured image data may be transmitted from the image acquisition device 1000 to the computer (image file generation device) in association with the current position information of the image acquisition device 1000 for specifying the vector of the imaging axis, the azimuth angle measured by the geomagnetic sensor, and the elevation angle acquired by the G sensor.
  • In that case, subject selection and addition of the subject position information of the subject may be performed as post-processing.
  • the image file generation device includes a subject position information acquisition unit, a matching processing unit, and an image file generation unit.
  • the image file generation device does not have a configuration such as a shutter button, a sensor (camera sensor), a GPS unit, a G sensor, or a geomagnetic sensor.
  • The current position information of the image acquisition device that captured the subject, the elevation angle of the image acquisition device, and the azimuth angle of the image acquisition device are input to the image file generation device via an external memory. Then, the subject position information acquisition unit of the image file generation device calculates the imaging region and the imaging axis based on the input current position information, elevation angle, and azimuth angle of the image acquisition device.
  • The subject position information acquisition unit of the image file generation device identifies a subject by collating the shooting area it has calculated with the three-dimensional image data, stored in the database, composed of coordinates for specifying the position in three-dimensional space of each point constituting the subject, and acquires the subject position information of the identified subject.
  • the image file generation unit of the image file generation device generates an image file including the current position information and the subject position information.
  • The size of a subject in the finder is determined to some extent by the focal length of the lens; a subject that is too small relative to the entire EVF/LCD, or too large to fit within it, becomes difficult to present on the EVF/LCD. Therefore, it is desirable for the subject position information acquisition unit 220 of the image acquisition apparatus 1000 to set the range within which it acquires subject position coordinates according to the size of the subject. Depending on the subject, subject position information may not be stored in the database, or the subject position information stored in the database may be inaccurate. Therefore, it is desirable to change the method for specifying the subject position coordinates according to the size of the subject.
  • the subject position information acquisition unit changes the distance from the image acquisition device that acquires the subject position information to the subject according to the type of the subject electronic information. The following description is based on the assumption that a standard lens (such as a 50 mm lens at 35 mm full size) is used as the lens.
  • the subject position information acquisition unit acquires only the subject position coordinates of a subject whose distance from the current position of the image acquisition apparatus 1000 is within 100 m.
  • subject position information corresponding to a three-dimensional image may not be stored in the database.
  • the subject position information acquisition unit acquires, as subject position information, coordinates near the center of gravity of the two-dimensional plan view of the building (an ordinary house, a condominium, or the like) seen from above.
  • the subject position information acquisition unit acquires only the subject position coordinates of a subject whose distance from the current position of the image acquisition apparatus 1000 is within 500 m.
  • the subject position information acquisition unit acquires subject position information stored in the database in addition to the coordinates near the center of gravity position.
  • the subject position information acquisition unit acquires only the subject position coordinates of a subject whose distance from the current position of the image acquisition apparatus 1000 is within 5 km. In addition to the subject position information stored in the database, the subject position information acquisition unit may acquire, as the subject coordinates, the intersection (lock-on position) between the shooting axis and the plane of the three-dimensional image data facing the camera.
  • the subject position information acquisition unit acquires only the subject position coordinates of a subject whose distance from the current position of the image acquisition apparatus 1000 is within 20 km.
  • the subject position information acquisition unit may acquire, for example, the summit in the case of a mountain, the vicinity of the center of gravity in the case of a lake, or the center of the island in the case of an island, as well as the lock-on position, the position near the center of gravity, or the subject position information stored in the database.
  • For a mountain or an island, the shape of the subject is not a rectangular parallelepiped; therefore, the subject may be specified by calculating the intersection coordinates directly against the many small surfaces used to generate the shape of the mountain or island.
  • the acquisition distance shown in FIG. 10 and the position information setting location differ depending on the focal length of the lens, and can be changed as appropriate based on customized data individually set by the photographer.
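The per-category acquisition ranges described above (100 m, 500 m, 5 km, 20 km) can be organized as a lookup table keyed by subject type; the category keys and anchor descriptions below are illustrative simplifications of the rules in FIG. 10, not the embodiment's exact data.

```python
# Acquisition-range table sketched from the description (standard lens);
# categories and anchor rules are illustrative simplifications.
ACQUISITION_RULES = {
    "house":    {"max_range_m": 100,    "anchor": "footprint centroid"},
    "building": {"max_range_m": 500,    "anchor": "centroid + database entry"},
    "landmark": {"max_range_m": 5_000,  "anchor": "database entry or lock-on intersection"},
    "mountain": {"max_range_m": 20_000, "anchor": "summit"},
    "lake":     {"max_range_m": 20_000, "anchor": "centroid"},
    "island":   {"max_range_m": 20_000, "anchor": "center of island"},
}

def in_acquisition_range(category, distance_m):
    """True if a subject of this category is close enough to the camera
    for its position coordinates to be acquired."""
    return distance_m <= ACQUISITION_RULES[category]["max_range_m"]

print(in_acquisition_range("house", 80))         # True
print(in_acquisition_range("house", 300))        # False
print(in_acquisition_range("mountain", 12_000))  # True
```

As the text notes, such thresholds would in practice be scaled by the lens focal length and photographer customization.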
  • Since the captured image data is digital information, the acquired current position information and subject position information may be added to and stored in the captured image data as they are, or may be stored as separate data (on the same device or another device) associated with the captured image data; even when shooting on film, they may be stored as digital data that can be associated with the film.
  • In the case of a film camera, a sensor (camera sensor) captures an electronic image of the subject in order to collate it with the three-dimensional data. The subject is specified based on the sensor image and the three-dimensional data, and subject position information is calculated. The calculated subject position information is stored in an electronic information storage medium, such as a memory attached to the film case, separate from the film itself. The sensor may have its own optical system, independent of the optical system that exposes the film; if it is installed so as to capture substantially the same view, the operation shown in the present invention can be performed.
  • A portable information communication terminal, such as a camera-equipped mobile phone, that can capture and store images and has functions and capabilities for referencing and uploading external data through communication and for performing image processing within the main unit, is one of the devices suitable for implementing the present invention.

<Overall processing>
  • FIG. 11 is a diagram showing an overview of the overall processing of the image acquisition apparatus 1000 according to the first embodiment. Note that the entire process is started when the image acquisition apparatus 1000 starts shooting the captured image 2000.
  • the GPS unit 3050 acquires the current position information of the image acquisition apparatus 1000.
  • the G sensor 3060 measures the elevation angle of the image acquisition apparatus 1000.
  • the geomagnetic sensor 3070 measures the azimuth angle of the image acquisition device 1000.
  • the image data acquisition unit 210 identifies the imaging region 1060 based on the current position information acquired in S1101, the elevation angle measured in S1102, and the azimuth angle.
  • the subject position information acquisition unit 220 requests the database 1010 for all subject position information and three-dimensional image data included in the imaging region 1060 specified in S1103. As a result, the subject position information acquisition unit 220 acquires subject position information and three-dimensional image data transmitted from the database 1010.
  • the image data acquisition unit 210 acquires captured image data for rendering an image in the imaging region 1060. Further, the image data acquisition unit 210 generates a thumbnail image 2040 based on the captured image data. The captured image data is compressed by an encoder / decoder 3040 to generate compressed image data 2050.
  • the subject position information acquisition unit 220 calculates the shooting axis 1030 connecting the center of the lens of the image acquisition apparatus 1000 and the center of the shooting area 1060 specified in S1103. Specifically, the subject position information acquisition unit 220 calculates the imaging axis 1030 based on the current position information of the image acquisition apparatus 1000 acquired by the GPS unit in S1101, the azimuth angle measured by the geomagnetic sensor in S1102, and the elevation angle 6050 acquired by the G sensor in S1102.
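The imaging-axis computation can be sketched as a direction vector derived from the two sensor angles; the east-north-up frame and the convention that azimuth is measured clockwise from north are assumptions for illustration, not specified by the embodiment.

```python
import math

def shooting_axis(azimuth_deg, elevation_deg):
    """Unit direction vector of the shooting axis in a local
    east-north-up frame, from the azimuth (geomagnetic sensor) and
    elevation (G sensor). Assumed convention: azimuth 0 deg = north,
    measured clockwise; elevation 0 deg = level."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    north = math.cos(el) * math.cos(az)
    up = math.sin(el)
    return (east, north, up)

# Camera level and facing due north:
print(shooting_axis(0.0, 0.0))  # (0.0, 1.0, 0.0)
```

Together with the GPS position as the ray origin, this vector defines the axis used in the intersection step that follows.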
  • the subject position information acquisition unit 220 identifies a subject that intersects the shooting axis 1030. Specifically, the subject position information acquisition unit 220 specifies, from the three-dimensional image data acquired in S1104, the three-dimensional image data that intersects the imaging axis 1030 calculated in S1107. For example, when any of the coordinates constituting the imaging axis 1030 matches any of the coordinates constituting a piece of three-dimensional image data, the subject position information acquisition unit 220 specifies the three-dimensional image data including those coordinates.
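As an illustration of this axis-intersection test, a common simplification is to intersect the shooting axis, treated as a ray, with an axis-aligned bounding box around each subject's three-dimensional data; the slab method below is a generic stand-in for the coordinate matching described above, with made-up box coordinates.

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does a ray (the shooting axis) intersect an
    axis-aligned bounding box around a subject's 3-D data?

    origin, direction: 3-tuples; box_min, box_max: opposite box corners.
    """
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            # Ray parallel to this slab: must already lie inside it.
            if o < lo or o > hi:
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return False
    return True

# Axis pointing north from the origin; a building 50-60 m away on-axis.
print(ray_hits_box((0, 0, 0), (0, 1, 0), (-5, 50, 0), (5, 60, 20)))   # True
print(ray_hits_box((0, 0, 0), (0, 1, 0), (20, 50, 0), (30, 60, 20)))  # False
```

A real implementation would then refine the hit against the individual surfaces of the three-dimensional image data.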
  • the subject position information acquisition unit 220 extracts subject position information of the subject corresponding to the three-dimensional image data specified in S1108 from each subject position information acquired in S1104.
  • the matching processing unit 230 converts the 3D image data specified in S1108 into 2D image data.
  • the matching processing unit 230 extracts image data of the subject to be photographed from the photographed image data acquired in S1106. For example, the matching processing unit 230 extracts image data of a subject imaged within a predetermined range from the center of the imaging region 1060.
  • the matching processing unit 230 determines whether the image based on the two-dimensional image data converted in S1110 matches the image based on the image data extracted in S1111.
  • When the matching processing unit 230 determines that the image based on the two-dimensional image data converted in S1110 and the image based on the image data extracted in S1111 neither match nor approximate each other (S1112: No), the process proceeds to S1116.
  • When the matching processing unit 230 determines that the image based on the two-dimensional image data converted in S1110 matches the image based on the image data extracted in S1111 (S1112: Yes), the process proceeds to S1113.
  • the image file generation unit 240 generates shooting information B2030 including the current position information acquired in S1101 and subject position information extracted in S1109, and shooting information A2020.
  • the image file generation unit 240 generates an image file 2010 including the shooting information A2020 generated in S1113, the shooting information B2030, the thumbnail image 2040 generated in S1106, and the compressed image data 2050.
  • the image file generation unit 240 stores the image file 2010 generated in S1114 in the external memory 3141, and ends the entire process.
  • If the determination in S1112 is No, the image file generation unit 240 generates, in S1116, shooting information B2030 including the current position information acquired in S1101, together with shooting information A2020. The subject position information field of the shooting information B2030 stores information indicating that subject position information could not be acquired. Alternatively, the image file generation unit 240 may generate shooting information B2030 that does not include subject position information.
  • the image file generation unit 240 generates an image file 2010 including the shooting information A2020, the shooting information B2030 generated in S1116, the thumbnail image 2040 generated in S1106, and the compressed image data 2050.
  • the image file generation unit 240 stores the image file 2010 generated in S1117 in the external memory 3141, and ends the entire process.
  • the subject position information acquisition unit 220 may specify the subject by performing pattern matching between the two-dimensional image data obtained by converting the three-dimensional image data and the image in the shooting region 1060, and extracting the image that matches or approximates.
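A minimal stand-in for this pattern-matching step is exhaustive template matching by sum of squared differences; real implementations would use normalized correlation or feature matching, and the grayscale values below are invented toy data.

```python
def best_match(image, template):
    """Slide a 2-D template (the projected 3-D shape) over the image
    (the shot region) and return the (x, y) offset with the smallest
    sum of squared differences. Both are lists of rows of grayscale
    values; all values here are illustrative."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best = None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = sum(
                (image[y + j][x + i] - template[j][i]) ** 2
                for j in range(th) for i in range(tw)
            )
            if best is None or ssd < best[0]:
                best = (ssd, (x, y))
    return best[1]

img = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
print(best_match(img, [[9, 8], [7, 9]]))  # (1, 1)
```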
  • For a small house, a condominium, an office building, and the like, the shape of the subject as seen from the camera is clear; when there are many objects, performing the calculation using only the shape as seen from the camera can reduce the amount of computation.
  • the subject position information of the subject can be added to the image file together with the current position information of the image acquisition device when the subject is photographed.
  • In Embodiment 1, the subject position information of the subject displayed near the center of the finder image when the shutter button is pressed, together with the photographed image data, is acquired and stored in an external memory.
  • In the present embodiment, the subject position information of the subject displayed near the center of the finder image when the shutter button is half-pressed, and the captured image data taken when the shutter button is fully pressed, are acquired and stored in the external memory. Accordingly, it is possible to acquire subject position information of a subject that is not at the center of the finder image 7000 when the shutter button 3010 is pressed.
  • the second embodiment of the present invention will be described below with reference to FIGS. 12 (a) to 12 (c), mainly with respect to differences from the first embodiment.
  • FIGS. 12A to 12C show the case where the subject to be selected is not near the center of the viewfinder where the lock-on mark is located when the photographer performs framing (specifies the shooting range) with the viewfinder A method for selecting a subject will be described.
  • the subject that the photographer wants to photograph is the first subject 4030.
  • The finder image 7000 to be framed is the image shown in FIG. 12(a); in FIG. 12(b), the image acquisition apparatus has been manually panned leftward with respect to the state shown in FIG. 12(a).
  • a lock-on mark 7010 is superimposed on the first subject 4030 selected by the photographer.
  • In this state of the viewfinder image 7000, the shutter button 3010 is half-pressed.
  • half-pressing the shutter button 3010 is an operation used for performing exposure and AF and locking the exposure amount and focus position of the image acquisition apparatus 1000 in that state. This may be an operation for aligning exposure or AF with the first subject 4030. Simultaneously with this operation, a selection operation as a target for acquiring position information is performed on the first subject 4030.
  • FIG. 12 (c) shows a state where the image acquisition apparatus 1000 once shaken to the left is manually returned to perform the lock-on, and the scene to be photographed is framed.
  • the viewfinder image 7000 includes a first subject 4030 and a second subject 4040.
  • The first subject 4030 selected in the state shown in FIG. 12(b) is displayed on the left side of the screen.
  • The first subject 4030 remains selected as the subject. Therefore, when a mark equivalent to the lock-on mark 7010 can be superimposed in the finder image 7000, as with an EVF (Electronic View Finder), a subject tracking mark 7020 is displayed as shown in FIG. 12(c). The subject tracking mark 7020 moves in the viewfinder image 7000 while tracking the first subject 4030.
  • the photographer can recognize that the first subject 4030 is selected as the subject, so the photographer can change the orientation of the image acquisition apparatus 1000 with confidence.
  • a predetermined viewfinder image 7000 can be taken, and a subject that is not in the center of the viewfinder image 7000 can be selected when the shutter button 3010 is pressed.
  • The subject position information of the subject selected when the shutter button 3010 was half-pressed, the image taken when the shutter button 3010 is fully pressed, and the current position information of the image acquisition device 1000 are stored in the external memory in association with each other.
  • FIGS. 13A to 13C are diagrams showing an outline of a configuration example of a portable information terminal (for example, a camera-equipped mobile phone or a camera-equipped tablet terminal) that is an image acquisition apparatus according to the third embodiment.
  • FIG. 13A shows the portable information terminal 1200 observed from the side, and FIGS. 13B and 13C show the portable information terminal 1200 observed from the front.
  • The portable information terminal 1200 includes a display 1230 that is provided on the front surface side of the portable information terminal 1200 and is used as a finder as in a normal camera, a rear camera 1210 provided on the opposite side of the display 1230, and a front camera 1220 provided on the front side of the portable information terminal 1200.
  • the front camera 1220 photographs the display 1230 side, for example, the photographer and the background of the photographer.
  • the front camera 1220 is used when a portable information terminal is used as a videophone.
  • FIG. 13B shows a state in which the subject is photographed using the rear camera 1210.
  • the display 1230 displays a subject photographed by the rear camera 1210 as a real-time moving image, and displays a lock-on mark 1240 superimposed on the subject.
  • the portable information terminal 1200 acquires the subject position coordinates of the subject at the position of the lock-on mark 1240.
  • the portable information terminal 1200 recognizes the subject at the location where the lock-on mark 1240 was present as the subject to be photographed, and places the lock-on mark 1240 on the subject. The monitor screen continues to be displayed in real time. After that, when the shutter button 1260 is pressed, the portable information terminal 1200 acquires the subject image and the subject position information of the subject indicated by the lock-on mark 1240.
  • FIG. 13C shows an example of a state in which when a photographer is photographed using the front camera 1220, the photographing result is displayed on the display 1230 of the portable information terminal 1200.
  • the display 1230 displays a subject 1270 together with the photographer on the captured image.
  • an additional information display area 1280 is displayed so as to be superimposed on the screen.
  • In the additional information display area 1280, the subject name of the subject 1270 and the subject position information of the subject, expressed as latitude and longitude, are displayed.
  • In this case, since the front camera 1220 is used, the real-time monitor display at the time of shooting is more convenient for the photographer when it is reversed left-right and displayed as a mirror image.
  • For collation, the three-dimensional data stored in the database is collated with the photographed image data before that data is horizontally reversed for display on the display unit; alternatively, the three-dimensional image data may be reversed left-right and then collated with the mirror-image subject image.
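Either variant of the flip-before-matching idea reduces to a horizontal reversal of the image rows; the sketch below uses plain lists of pixel values rather than a real image buffer, and the data is invented.

```python
def mirror_rows(image):
    """Horizontally flip a 2-D image (list of rows). Used either to
    undo the mirror-image preview before matching, or to flip the
    projected 3-D data the same way as the preview."""
    return [list(reversed(row)) for row in image]

frame = [[1, 2, 3],
         [4, 5, 6]]
print(mirror_rows(frame))                        # [[3, 2, 1], [6, 5, 4]]
print(mirror_rows(mirror_rows(frame)) == frame)  # True: flipping twice restores
```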
  • FIG. 14A is a diagram schematically showing an outline of a configuration example of a single-lens reflex camera 1400 that is an image acquisition device according to the fourth embodiment, and FIG. 14B is a diagram schematically showing an outline of a configuration example of an optical viewfinder camera that is an image acquisition device according to the fourth embodiment.
  • Light from the subject entering along the photographing optical axis 1410 passes through the lens, is reflected upward by 90 degrees by the mirror 1430, and is then focused on the focusing glass 1420.
  • The subject image formed on the focusing glass 1420 is repeatedly reflected by the pentaprism 1460 and then guided to the viewfinder 1440, so that the photographer can visually observe the optical image of the subject.
  • At the time of shooting, the mirror 1430 rises to the vicinity of the focusing glass 1420 so that the light passing along the shooting optical axis 1410 is focused on the fourth camera sensor 1454 without being blocked by the mirror 1430, and the focal plane shutter 1470 in front of the sensor opens and closes so that exposure is performed at the set shutter speed.
  • Although a camera sensor is shown in this figure, a film camera may be used instead of the camera sensor.
  • During framing, the subject light is not focused on the fourth camera sensor 1454 as it is in a general digital camera, so the fourth camera sensor 1454 cannot be used to collate the shape of the subject with the three-dimensional shape.
  • Possible methods include acquiring the shape of the subject by making part of the mirror 1430 a half mirror and forming an image, via the sub mirror 1431, on a camera sensor at the camera floor position, or imaging the light on a camera sensor at the viewfinder 1440; alternatively, the third camera sensor 1453 may be installed parallel to the photographic optical axis 1410, completely separate from the photographic optical system.
  • FIG. 14B shows a case where the image acquisition device is an optical viewfinder camera 1500 provided with an optical viewfinder separately from the photographing lens. Similarly to FIG. 14A, exposure is performed at the shutter speed at the time of shooting by opening and closing the focal plane shutter 1520 in front of the first camera sensor 1510.
  • a camera sensor is used, but a camera using a film may be used instead of the camera sensor.
  • a method is shown in which a mirror is provided in the optical viewfinder in the same manner as in FIG. 14A and light is focused on the second camera sensor 1530 on the viewfinder.
  • With any of these configurations, the shape of the subject can be acquired and collated with the three-dimensional data, and the subject position information can be obtained.
  • The invention made by the present inventors has been specifically described above based on the embodiments; however, it goes without saying that the present invention is not limited to these embodiments and can be variously modified without departing from the scope of the invention.
  • a part of the configuration of one embodiment may be replaced with the configuration of another embodiment.
  • the configuration of another embodiment may be added to the configuration of a certain embodiment. These all belong to the category of the present invention.
  • The present invention can also be applied to a device having a function of acquiring and storing both device position information and subject position information.
  • A portable information communication terminal, such as a camera-equipped mobile phone, that can capture and store images and has functions for referencing and uploading external data through communication and for image processing inside the main unit, is one of the devices suitable for implementing the present invention.
  • 210 ... image data acquisition unit, 220 ... subject position information acquisition unit, 230 ... matching processing unit, 240 ... image file generation unit, 1000 ... image acquisition device, 1010 ... database, 1030 ... shooting axis, 1400 ... single-lens reflex camera, 1500 ... optical viewfinder camera, 2010 ... image file, 2030 ... shooting information B, 7010 ... lock-on mark.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to an image acquisition device that acquires current position information of the image acquisition device and that comprises: a subject position information acquisition unit that identifies a subject by comparing an image capture area, in which the image acquisition device captures an image, with subject electronic information including the coordinates constituting the subject, and then acquires subject position information of the identified subject; and an image file generation unit that generates an image file including the current position information and the subject position information acquired by the subject position information acquisition unit.
PCT/JP2015/060106 2015-03-31 2015-03-31 Dispositif d'acquisition d'images, procédé de génération de fichiers d'images, et programme de génération de fichiers d'images WO2016157406A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/060106 WO2016157406A1 (fr) 2015-03-31 2015-03-31 Dispositif d'acquisition d'images, procédé de génération de fichiers d'images, et programme de génération de fichiers d'images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/060106 WO2016157406A1 (fr) 2015-03-31 2015-03-31 Dispositif d'acquisition d'images, procédé de génération de fichiers d'images, et programme de génération de fichiers d'images

Publications (1)

Publication Number Publication Date
WO2016157406A1 true WO2016157406A1 (fr) 2016-10-06

Family

ID=57005317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/060106 WO2016157406A1 (fr) 2015-03-31 2015-03-31 Image acquisition device, image file generation method, and image file generation program

Country Status (1)

Country Link
WO (1) WO2016157406A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019149791A (ja) * 2018-11-08 2019-09-05 Kyocera Corporation Electronic device, control method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1186035A (ja) * 1997-07-11 1999-03-30 Nippon Telegr & Teleph Corp <Ntt> Distance-referenced landscape labeling device and system
JP2002344789A (ja) * 2001-05-16 2002-11-29 Fuji Photo Film Co Ltd Imaging device and position information detection system
JP2009017540A (ja) * 2007-05-31 2009-01-22 Panasonic Corp Image capturing device, additional information providing server, and additional information filtering system
JP4488233B2 (ja) * 2003-04-21 2010-06-23 NEC Corporation Video object recognition device, video object recognition method, and video object recognition program
JP2012244562A (ja) * 2011-05-24 2012-12-10 Nikon Corp Digital camera
JP2014224861A (ja) * 2013-05-15 2014-12-04 Olympus Imaging Corp Display device and imaging device

Similar Documents

Publication Publication Date Title
CN109064545B (zh) Method and device for data collection and model generation for a house
WO2017088678A1 (fr) Long-exposure panoramic image shooting apparatus and method
WO2017221659A1 (fr) Image capture device, display device, and image capture and display system
KR101720190B1 (ko) Digital photographing apparatus and control method thereof
CN110022444B (zh) Panoramic photographing method for an unmanned aerial vehicle and unmanned aerial vehicle using the same
JP4396500B2 (ja) Imaging device, image orientation adjustment method, and program
WO2013069048A1 (fr) Image generation device and image generation method
US8339477B2 (en) Digital camera capable of detecting name of captured landmark and method thereof
WO2015192547A1 (fr) Method for capturing a three-dimensional image based on a mobile terminal, and mobile terminal
CN106791483B (zh) Image transmission method and device, and electronic device
CN104243800A (zh) Control device and storage medium
JPWO2014141522A1 (ja) Image determination device, imaging device, three-dimensional measurement device, image determination method, and program
KR20120012201A (ko) Panoramic photographing method
JP5750696B2 (ja) Display device and display program
KR100943548B1 (ko) Pose guide method and apparatus for a photographing device
JP2011058854A (ja) Mobile terminal
JP2019110434A (ja) Image processing device, image processing system, and program
JP6741498B2 (ja) Imaging device, display device, and imaging display system
CN104169795B (zh) Image display device, photographing device equipped with the image display device as a viewfinder device, and image display method
JP5248951B2 (ja) Camera device, image shooting support device, image shooting support method, and image shooting support program
JP7306089B2 (ja) Image processing system, imaging system, image processing device, imaging device, and program
WO2016157406A1 (fr) Image acquisition device, image file generation method, and image file generation program
JP2009111827A (ja) Imaging device and image file providing system
JP2009060338A (ja) Display device, electronic camera
JP2004088607A (ja) Imaging device, imaging method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 15887559
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 15887559
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP