WO2018180214A1 - Image processing device, camera device, and image processing method - Google Patents

Image processing device, camera device, and image processing method Download PDF

Info

Publication number
WO2018180214A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
captured images
images
overlapping
Prior art date
Application number
PCT/JP2018/008125
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroaki Kikuchi (菊池 浩明)
Original Assignee
Fujifilm Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corporation
Priority to JP2019509066A priority Critical patent/JP6712358B2/en
Publication of WO2018180214A1 publication Critical patent/WO2018180214A1/en

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • H04N5/915Television signal processing therefor for field- or frame-skip recording or reproducing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • H04N5/92Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

Definitions

  • the present invention relates to an image processing device, a camera device, and an image processing method, and more particularly, to an image processing device, a camera device, and an image processing method in a case where a composite image is generated by synthesizing images having continuous and overlapping regions.
  • a technique for panoramic synthesis of a plurality of continuously shot images has been proposed.
  • as a technique for efficiently synthesizing panoramic images, a technique has been used in which the shooting position at which each captured image is acquired is measured by a GPS (Global Positioning System) receiver or a rangefinder.
  • Patent Document 1 describes a camera with a gyro sensor that captures images for panoramic synthesis: the movement of the camera is detected by the gyro sensor, and an indicator instructs the user on the correct shooting position for the next scene.
  • Patent Document 2 describes a technique for panoramic synthesis of a plurality of captured images. In the technique of Patent Document 2, the similarity at the synthesis position of the Nth and (N+1)th images is evaluated based on the spatial frequency or the number of feature points, and when the movement amount between the Nth image and the (N+1)th image cannot be calculated, panorama composition is performed using the movement amount calculated from two consecutive images before the Nth image or after the (N+1)th image. In this way, the overlap of the image group composed of the plurality of captured images can be determined spatially.
  • however, Patent Documents 1 and 2 do not mention storing information indicating an overlapping region between captured images.
  • the present invention has been made in view of such circumstances, and an object thereof is to provide an image processing device, a camera device, and an image processing method that can spatially arrange a plurality of captured images based on position information indicating overlapping regions, without measuring the shooting position with sensors, that suppress the time and calculation cost required for detecting corresponding points when performing synthesis processing, and that can store the plurality of captured images efficiently.
  • an image processing apparatus according to a first aspect of the present invention comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; an overlapping area measurement unit that measures the overlapping region between the input first image and second image; a storage control unit that stores the plurality of input captured images in a storage unit and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the plurality of spatially arranged captured images and detects corresponding points between the images of the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • according to this aspect, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and the position information indicating the overlapping region is stored by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors. Then, the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Therefore, according to this aspect, the corresponding point detection processing is performed only on limited areas, so that the time and calculation cost required for detecting the corresponding points can be suppressed. Further, since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • An image processing apparatus according to another aspect of the present invention comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; a first image cutout unit that cuts out the first image from the plurality of input captured images; an overlapping area measurement unit that measures the overlapping region between the cut-out first image and the plurality of input captured images; a second image cutout unit that cuts out the second image from the plurality of captured images when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the plurality of input captured images in a storage unit and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the plurality of spatially arranged captured images and detects corresponding points between the images of the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • according to this aspect, the overlapping area measurement unit measures the overlapping region between the cut-out first image and the plurality of input captured images, and the storage control unit stores the position information indicating the overlapping region in the storage unit in association with the second image. Thereby, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors. Then, the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Therefore, according to this aspect, the corresponding point detection processing is performed only on limited areas, so that the time and calculation cost required for detecting the corresponding points can be suppressed. Further, since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • An image processing apparatus according to still another aspect of the present invention comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; an overlapping area measurement unit that measures the overlapping region between the input first image and second image; an image extraction unit that extracts the first image and the second image when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the extracted first image and second image in a storage unit and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the plurality of spatially arranged captured images and detects corresponding points between the images of the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • according to this aspect, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and the position information indicating the measured overlapping region is stored in the storage unit by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors. Then, the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Therefore, according to this aspect, the corresponding point detection processing is performed only on limited areas, so that the time and calculation cost required for detecting the corresponding points can be suppressed. Further, since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • preferably, the synthesis processing unit generates an ortho image by the synthesis.
  • according to this aspect, since the synthesis processing unit generates an ortho image by the synthesis, the time and calculation cost for detecting corresponding points of the overlapping regions can be suppressed in the generation of the ortho image.
  • preferably, the storage control unit records the position information in an EXIF tag attached to the second image.
  • according to this aspect, since the storage control unit records the position information in the EXIF tag attached to the second image, the position information can be stored efficiently in association with the second image.
  • preferably, the corresponding point detection unit also detects an overlapping region of the first image with an image other than the second image, and detects corresponding points between the images of that detected overlapping region.
  • according to this aspect, since the synthesis is performed with more corresponding points than a synthesis using only the corresponding points of the overlapping region between the first image and the second image, a more accurate synthesis can be performed.
  • preferably, the position information indicating the overlapping region is the first coordinates of the first image and the second coordinates of the second image corresponding to the first coordinates.
  • according to this aspect, since the position information indicating the overlapping region consists of the first coordinates of the first image and the corresponding second coordinates of the second image, more accurate position information of the overlapping region can be stored in the storage unit.
  • preferably, the position information indicating the overlapping region is two or fewer sets of first and second coordinates.
  • according to this aspect, since the position information indicating the overlapping region is two or fewer sets of first and second coordinates, the amount of position information is kept small, and the storage capacity of the storage unit can be used effectively.
  • a camera apparatus according to another aspect of the present invention comprises: a captured image acquisition unit that sequentially acquires, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; an overlapping area measurement unit that measures the overlapping region between the first image and a moving image captured by the captured image acquisition unit; a shooting control unit that causes the captured image acquisition unit to acquire the second image when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the plurality of acquired captured images in a storage unit and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the plurality of spatially arranged captured images and detects corresponding points between the images of the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • according to this aspect, the overlapping area measurement unit measures the overlapping region between the first image and the moving image captured by the captured image acquisition unit, and the storage control unit stores the position information indicating the measured overlapping region in the storage unit in association with the second image. Thereby, the spatial arrangement of the plurality of acquired captured images is calculated without measuring the shooting position with sensors. Then, the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Therefore, according to this aspect, the corresponding point detection processing is performed only on limited areas, so that the time and calculation cost required for detecting the corresponding points can be suppressed. Further, since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • An image processing method according to another aspect of the present invention comprises: a step of sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; a step of measuring the overlapping region between the input first image and second image; a step of storing the plurality of input captured images in a storage unit and storing, in the storage unit, position information indicating the measured overlapping region in association with the second image; a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a step of detecting overlapping regions of the plurality of spatially arranged captured images; a step of detecting corresponding points between the images of the detected overlapping regions; and a step of generating a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • An image processing method according to another aspect of the present invention comprises: a step of sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; a step of cutting out the first image from the plurality of input captured images; a step of measuring the overlapping region between the cut-out first image and the plurality of input captured images; a step of cutting out the second image from the plurality of captured images when the measurement result reaches a predetermined value; a step of storing the plurality of input captured images in a storage unit and storing, in the storage unit, position information indicating the measured overlapping region in association with the second image; a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a step of detecting overlapping regions of the plurality of spatially arranged captured images; a step of detecting corresponding points between the images of the detected overlapping regions; and a step of generating a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • An image processing method according to still another aspect of the present invention comprises: a step of sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image that are adjacent in the time series have an overlapping region; a step of measuring the overlapping region between the input first image and second image; a step of extracting the first image and the second image when the measurement result of the overlapping region has reached a predetermined value; a step of storing the extracted first image and second image in a storage unit and storing, in the storage unit, position information indicating the measured overlapping region in association with the second image; a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit; a step of detecting overlapping regions of the plurality of spatially arranged captured images; a step of detecting corresponding points between the images of the detected overlapping regions; and a step of generating a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • according to the present invention, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and the position information indicating the overlapping region is stored by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors, and the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Since the corresponding point detection processing is performed only on limited areas, the time and calculation cost required for detecting the corresponding points can be suppressed, and since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • FIG. 1 is a perspective view showing the structure of a bridge, one of the structures to be photographed, as viewed from below. FIG. 2 is a conceptual diagram showing a photographing system. The remaining figures are, for each embodiment, diagrams showing a functional configuration example of the image processing apparatus and flowcharts illustrating the operation of the image processing apparatus and the image processing method.
  • FIG. 1 is a perspective view showing the structure of a bridge, which is one of the structures to be photographed, and is a perspective view of the bridge as viewed from below.
  • the object to be photographed used in the present invention is not particularly limited.
  • in the present embodiment, the present invention is applied to the case of synthesizing captured images that are used when an inspection for detecting damage to a structure is performed.
  • more specifically, the present invention is applied to the case where the floor slab 6 in each panel is dividedly photographed and the captured images obtained by the divided photographing are combined.
  • the bridge 1 shown in FIG. 1 has main girders 2, cross girders 3, anti-tilt frames 4, and horizontal frames 5, which are connected by bolts, rivets, or welding.
  • a floor slab 6 on which vehicles and the like travel is provided on the upper portion of the main girders 2 and the like.
  • the floor slab 6 is generally made of reinforced concrete.
  • the main girder 2 is a member that is spanned between abutments or piers and supports the load of vehicles and the like on the floor slab 6.
  • the cross girder 3 is a member that connects the main girders 2 so that the load is supported by the plurality of main girders 2.
  • the anti-tilt frames 4 and the horizontal frames 5 are members that connect the main girders 2 to each other to resist lateral loads of wind and earthquake, respectively.
  • FIG. 2 is a conceptual diagram showing a photographing system.
  • the imaging system 500 includes a robot apparatus 100 and a computer 300.
  • the robot apparatus 100 includes an imaging apparatus 200, and moves and captures images with the imaging apparatus 200 under the control of the computer 300.
  • the robot apparatus 100 is not particularly limited as long as it can move and shoot under the control of the computer 300.
  • Other examples of the robot apparatus 100 include an apparatus called a traveling robot, a small helicopter, a multicopter, a drone, or UAV (Unmanned Aerial Vehicles).
  • the computer 300 and the robot apparatus 100 can communicate with each other, and the computer 300 can remotely control the movement of the robot apparatus 100 and the imaging control of the imaging apparatus 200.
  • FIG. 3 is a perspective view showing an appearance of the robot apparatus 100 including the camera which is an embodiment of the imaging apparatus 200, and shows a state where the robot apparatus 100 is installed between the main girders 2 of the bridge 1.
  • FIG. 4 is a cross-sectional view of a main part of the robot apparatus 100 shown in FIG.
  • the robot apparatus 100 includes the imaging apparatus 200, controls the position (imaging position) of the imaging apparatus 200 in three-dimensional space and the imaging direction of the imaging apparatus 200, and photographs an arbitrary inspection member of the bridge 1, which is composed of a plurality of members, when the bridge 1 is inspected.
  • the robot apparatus 100 includes a main frame 102, a vertical extension arm 104, a housing 106 in which a drive unit and various control units for the vertical extension arm 104 are disposed, an X-direction drive unit 108 that moves the housing 106 in the longitudinal direction of the main frame 102 (the X direction, orthogonal to the longitudinal direction of the main girder 2), a Y-direction drive unit 110 (FIG. 6) that moves the entire robot apparatus 100 in the longitudinal direction of the main girder 2 (the Y direction), and a Z-direction drive unit 112 (FIG. 6) that extends and retracts the vertical extension arm 104 in the vertical direction (the Z direction).
  • the X-direction drive unit 108 includes a ball screw 108A disposed in the longitudinal direction (X direction) of the main frame 102, a ball nut 108B disposed in the housing 106, and a motor 108C that rotates the ball screw 108A.
  • the casing 106 is moved in the X direction by rotating the ball screw 108A forward or backward by the motor 108C.
  • the Y-direction drive unit 110 includes tires 110A and 110B disposed at both ends of the main frame 102, and motors (not shown) disposed in the tires 110A and 110B. By driving the motor, the entire robot apparatus 100 is moved in the Y direction.
  • the robot apparatus 100 is installed in such a manner that the tires 110A and 110B at both ends of the main frame 102 are placed on the lower flanges of two main girders 2 so as to sandwich the main girders 2. Thereby, the robot apparatus 100 can travel along the main girders 2 while being suspended from their lower flanges.
  • the main frame 102 is configured such that its length can be adjusted according to the interval between the main girders 2.
  • the vertical extension arm 104 is disposed in the housing 106 of the robot apparatus 100 and moves in the X direction and the Y direction together with the housing 106. Further, the vertical extension arm 104 is expanded and contracted in the Z direction by a Z direction driving unit 112 (FIG. 6) provided in the housing 106.
  • a camera installation unit 104A is provided at the tip of the vertical extension arm 104, and a camera 202 that can be rotated in the pan direction and the tilt direction by a pan/tilt mechanism 120 is installed in the camera installation unit 104A.
  • the camera 202 is rotated by the pan/tilt mechanism 120, to which a driving force is applied from a pan/tilt driving unit 206 (FIG. 6), around a pan axis P coaxial with the vertical extension arm 104 or around a horizontal tilt axis T. As a result, the camera 202 can shoot in an arbitrary posture (in an arbitrary shooting direction).
  • FIG. 6 is a block diagram illustrating a functional configuration example of the robot apparatus 100.
  • the robot apparatus 100 includes, on the robot apparatus 100 side, a robot control unit 130, the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112, and, on the imaging apparatus 200 side, the camera 202, an imaging control unit 204, a pan/tilt control unit 210, the pan/tilt driving unit 206, and a robot-side communication unit 230.
  • the robot-side communication unit 230 performs two-way wireless communication with the computer 300, receives various commands transmitted from the computer 300, such as a movement command for controlling the movement of the robot apparatus 100, a pan/tilt command for controlling the pan/tilt mechanism 120, and a shooting command for controlling the camera 202, and outputs the received commands to the corresponding control units. Details of the computer 300 will be described later.
  • the robot control unit 130 controls the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112 based on the movement command input from the robot-side communication unit 230, moves the robot apparatus 100 in the X and Y directions, and extends and retracts the vertical extension arm 104 in the Z direction (see FIG. 3).
  • the pan / tilt control unit 210 operates the pan / tilt mechanism 120 in the pan direction and the tilt direction via the pan / tilt driving unit 206 based on the pan / tilt command input from the robot side communication unit 230 to pan / tilt the camera 202 in a desired direction (see FIG. 5).
  • the imaging control unit 204 causes the imaging unit 202A of the camera 202 to capture a live view image or an inspection image based on the imaging command input from the robot side communication unit 230.
  • as the photographed images, captured images acquired by the imaging unit 202A, or a moving image including live view images, are used.
  • Image data indicating an image K taken by the imaging unit 202A of the camera 202 when the bridge 1 is inspected is transmitted to the computer 300 via the robot side communication unit 230.
  • the computer 300 is composed of a tablet computer as shown in FIG.
  • the computer 300 includes a flat casing having a rectangular outline, and includes a touch panel display 302 that also serves as a display unit 326 (FIG. 6) and an input unit 328 (FIG. 6).
  • FIG. 7 is a block diagram showing a system configuration of the computer 300.
  • the computer 300 includes a CPU (Central Processing Unit) 310 that controls the overall operation of the computer 300, and, connected to the CPU 310 via the system bus 312, a main memory 314, a nonvolatile memory 316, a mobile communication unit 318, a wireless LAN communication unit 320, a short-range wireless communication unit 322, a wired communication unit 324, a display unit 326, an input unit 328, a key input unit 330, an audio processing unit 332, an image processing unit 334, and the like.
  • the CPU 310 reads out the operation program (an OS (Operating System) and application programs that operate on the OS), standard data, and the like stored in the nonvolatile memory 316, expands them in the main memory 314, and executes the operation program, thereby functioning as a control unit that controls the operation of the entire computer.
  • the main memory 314 is composed of, for example, a RAM (Random Access Memory) and functions as a work memory for the CPU 310.
  • the non-volatile memory 316 is composed of, for example, a flash EEPROM (Electrically Erasable Programmable Read-Only Memory), and stores the above-described operation program and various fixed data.
  • the nonvolatile memory 316 functions as a storage unit of the computer 300 and stores various data.
  • the mobile communication unit 318 transmits and receives data to and from the nearest base station (not shown) via the antenna 318A, based on a third-generation mobile communication system compliant with the IMT-2000 standard (International Mobile Telecommunication-2000) or a fourth-generation mobile communication system compliant with the IMT-Advanced standard (International Mobile Telecommunications-Advanced).
  • the wireless LAN communication unit 320 performs wireless LAN communication via the antenna 320A with a wireless LAN access point or an external device capable of wireless LAN communication, according to a predetermined wireless LAN communication standard (for example, the IEEE 802.11a/b/g/n standard).
  • the short-range wireless communication unit 322 transmits and receives data via the antenna 322A to and from other Bluetooth (registered trademark) devices within, for example, Class 2 range (a radius of about 10 m).
  • the wired communication unit 324 communicates, according to a predetermined communication standard, with an external device connected by a cable via the external connection terminal 306; for example, USB (Universal Serial Bus) communication is performed.
  • the display unit 326 includes a color LCD (liquid crystal display) panel that constitutes the display portion of the touch panel display 302, and its driving circuit, and displays various images.
  • the input unit 328 constitutes a touch panel portion of the touch panel display 302.
  • the input unit 328 is configured integrally with the color LCD panel using a transparent electrode.
  • the key input unit 330 includes a plurality of operation buttons provided on the casing of the computer 300 and a drive circuit thereof.
  • the audio processing unit 332 converts the digital audio data provided via the system bus 312 into an analog signal and outputs it from the speaker 304.
  • the image processing unit 334 digitizes an analog image signal output from the built-in camera 305 including a photographing lens and an image sensor, performs necessary signal processing, and outputs it.
  • FIG. 8 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the first embodiment. Note that the image processing apparatus 400 is provided in the computer 300.
  • the image processing apparatus 400 includes a captured image input unit 402, an overlapping area measurement unit 404, a storage control unit 406, a storage unit 408, a space arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • the photographed image input unit 402 sequentially receives a plurality of photographed images in the time-series order used for image synthesis; among the plurality of photographed images, an image (first image) and the next image (second image) that are adjacent in the time series have an overlapping region. In other words, the photographed image input unit 402 receives a plurality of photographed images that constitute a composite image and in which each preceding image and the following image have an overlapping region.
  • FIG. 9 is a diagram conceptually showing a captured image input to the captured image input unit 402.
  • the eight captured images from the image 601 to the image 608 constitute a composite image and are captured in chronological order from the image 601 to the image 608. The image 601 to the image 608 have overlapping regions with each other, and a part of each preceding image and a part of the following image overlap.
  • in the relationship between the image 601 and the image 602, the image 601 is the preceding image and the image 602 is the following image, and the image 601 and the image 602 have an overlapping region; likewise, the image 602 and the image 603 have an overlapping region. A similar relationship holds for the subsequent images up to the image 608. The dotted-line regions in the figure are the overlapping regions.
  • the overlapping area measurement unit 404 measures the overlapping region between the preceding image and the following image input to the captured image input unit 402. Specifically, the overlapping area measurement unit 404 measures information indicating the size of the overlapping region, such as the area of the region overlapping the preceding image or the length of the overlap in the following image. In addition, the overlapping area measurement unit 404 calculates position information indicating the overlapping region together with the measurement of the overlapping region.
  • FIG. 10 is a diagram for explaining the measurement of the overlapping region of the front image U and the rear image V.
  • the front image U and the rear image V have an overlapping region: the overlapping region P1 of the front image U overlaps the rear image V, and the overlapping region P2 of the rear image V overlaps the front image U. The subject H, the subject I, and the subject J appear in both images as common subjects.
  • the overlapping area measurement unit 404 determines the overlapping regions by, for example, detecting pairs of feature point coordinates between the front image U and the rear image V, and measures the overlapping regions P1 and P2. Specifically, the overlapping area measurement unit 404 extracts feature points O5 to O8 in the rear image and detects the feature points O1 to O4 corresponding to the feature points O5 to O8, respectively, in the front image.
  • alternatively, overlapping portions whose matching score is equal to or greater than a threshold may be adopted by template matching.
  • the overlapping area measurement unit 404 measures the size (area, length) of the overlapping region based on the feature points detected in this way, and calculates information (coordinates) indicating the position of the overlapping region in each image.
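  • as a minimal illustrative sketch (not part of the embodiment), such a measurement could be realized in Python with OpenCV as follows; ORB features, the two-pair limit, and the translation-only overlap model are assumptions made for the sketch:

```python
import cv2

def measure_overlap(prev_img, next_img, max_pairs=2):
    """Estimate the overlapping region between two consecutive captured
    images by matching feature-point pairs, in the manner of the
    overlapping area measurement unit 404."""
    orb = cv2.ORB_create(1000)                      # any feature detector works
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(next_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    # Keep only a few strong pairs: the embodiment stores two or fewer
    # coordinate sets as outline position information.
    pairs = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:max_pairs]]
    # Rough overlap width under a translation-only model: the horizontal
    # shift implied by the best pair, subtracted from the image width.
    (x1, _), (x2, _) = pairs[0]
    overlap_width = prev_img.shape[1] - abs(x1 - x2)
    return pairs, max(int(overlap_width), 0)
```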
  • the storage control unit 406 stores the plurality of input captured images in the storage unit 408, and causes the storage unit 408 to store the position information indicating the overlapping region measured in association with the second image. Specifically, the image 601 to the image 608 shown in FIG. 9 are stored in the storage unit 408, and the position information indicating each overlapping region is stored in association with the following image.
  • FIG. 11 shows a data configuration example of the storage unit 408. Note that FIG. 11 describes the image 601, the image 602, and the image 603, and other images are also stored in the same manner.
  • FIG. 11 (A) shows an example of the storage data structure of the shooting file name.
  • the image 601 is stored in the storage unit 408 with the shooting file name “filename1”, the image 602 is “filename2”, and the image 603 is “filename3”.
  • FIG. 11B shows an example of the storage data structure of the previous image index having an overlapping area. Since the image 601 is the foremost image, there is no previous image with an overlapping area, and “NONE” is stored, and since the image 602 has an overlapping area with the image 601, “1” indicating the image 601 is stored. Since the image 603 has an overlapping area with the image 602, “2” indicating the image 602 is stored in the storage unit 408.
  • FIG. 11C shows an example of the storage data configuration of the pixels of the previous image. Since the image 601 is the foremost image, “NONE” is stored as the pixel information of the previous image having an overlapping area; since the image 602 has an overlapping area with the image 601, the pixel information “(X11, Y11), ... (X1n, Y1n)” of the previous image (image 601) is stored; and since the image 603 has an overlapping area with the image 602, the pixel information “(X21, Y21), ... (X2n, Y2n)” of the previous image (image 602) is stored.
  • FIG. 11 (D) shows a configuration example of storage data of corresponding pixels. Since the image 601 is the foremost image, the corresponding pixel information is “NONE”; for the image 602, the pixel information “(X21-I, Y21-I), ... (X2n-I, Y2n-I)” corresponding to the pixels shown in part (C) of FIG. 11 is stored, and for the image 603, the corresponding pixels “(X31-I, Y31-I), ... (X3n-I, Y3n-I)” are stored.
  • as the position information indicating the overlapping region, the information on the pixels of the preceding image (part (C) of FIG. 11) and the information on the pixels corresponding to those pixels (part (D) of FIG. 11) are stored. That is, the first coordinates of the preceding image (part (C) of FIG. 11) and the second coordinates of the following image corresponding to them (part (D) of FIG. 11) are stored as the position information indicating the overlapping region.
  • the position information indicating the overlapping region may be two or fewer sets of first and second coordinates; it is sufficient that the position information serves as outline information.
  • here, the outline information is rough information on the overlap between images, and information that is small in size and easy to compute is preferable.
  • information forms other than those described above may also be employed as the position information indicating the overlapping region. By storing the following image and the position information indicating the overlapping region in association with each other in the storage unit in this way, the captured images constituting the image group are associated with one another, which makes management of the files of the image group more efficient. Further, storing the position information in association with the following image makes it unnecessary to measure the shooting position with a GPS receiver or a rangefinder.
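  • as an illustration of the storage layout of FIG. 11, the following sketch models one record of the storage unit 408 as a Python data class; the class and field names are assumptions, and the coordinate values are made-up placeholders:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Coordinate = Tuple[float, float]

@dataclass
class StoredImageRecord:
    """One entry of the storage unit 408, mirroring FIG. 11: file name,
    index of the preceding image with an overlapping region, and up to
    two sets of first/second coordinates as outline position information."""
    filename: str                                   # FIG. 11 (A), e.g. "filename2"
    prev_index: Optional[int] = None                # FIG. 11 (B); None plays the role of "NONE"
    first_coords: List[Coordinate] = field(default_factory=list)   # FIG. 11 (C): pixels of the preceding image
    second_coords: List[Coordinate] = field(default_factory=list)  # FIG. 11 (D): corresponding pixels of this image

# The foremost image 601 has no preceding image.
records = [
    StoredImageRecord("filename1"),
    StoredImageRecord("filename2", prev_index=1,
                      first_coords=[(110.0, 42.0)], second_coords=[(14.0, 40.0)]),
]
```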
  • FIG. 12 is a diagram for explaining the case where the position information of the overlapping region is stored in an EXIF (Exchangeable image file format) tag attached to the following image.
  • the position information of the overlapping region is stored in the “MakerNote” part of the “EXIF IFD” of the EXIF tag. Specifically, the file name of the preceding image having the overlapping region, the pixel coordinates of the preceding image, and the corresponding pixel coordinates are stored in the “MakerNote” portion of the EXIF tag.
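  • a minimal sketch of writing this information with the piexif library is shown below; the JSON layout of the payload is an assumption, since the embodiment only specifies that the preceding file name and the coordinate pairs are recorded in “MakerNote”:

```python
import json
import piexif

def attach_overlap_info(jpeg_path, prev_filename, first_coords, second_coords):
    """Write the overlap position information into the MakerNote entry
    of the EXIF IFD of the following image, as described for FIG. 12."""
    payload = json.dumps({
        "prev": prev_filename,           # file name of the preceding image
        "first_coords": first_coords,    # pixel coordinates in the preceding image
        "second_coords": second_coords,  # corresponding pixel coordinates in this image
    }).encode("ascii")
    exif_dict = piexif.load(jpeg_path)
    exif_dict["Exif"][piexif.ExifIFD.MakerNote] = payload
    piexif.insert(piexif.dump(exif_dict), jpeg_path)
```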
  • the space arrangement unit 410 spatially arranges a plurality of photographed images based on the plurality of photographed images and position information stored in the storage unit 408. That is, the space arrangement unit 410 arranges the captured image in space based on the position information of the overlapping area that is the outline information.
  • FIG. 13 is a conceptual diagram when images 601 to 608 are spatially arranged based on the position information of the overlapping area.
  • Each pair of adjacent images from the image 601 to the image 608 overlaps at the overlapping region 480, the overlapping region 482, the overlapping region 484, the overlapping region 486, the overlapping region 488, the overlapping region 490, or the overlapping region 492.
  • positional information regarding the overlapping area 480, the overlapping area 482, the overlapping area 484, the overlapping area 486, the overlapping area 488, the overlapping area 490, and the overlapping area 492 is stored in association with the image to be the subsequent image.
  • the spatial arrangement unit 410 spatially arranges each captured image based on the position information indicating each overlapping area stored in the storage unit 408.
  • for example, the spatial arrangement unit 410 spatially arranges the image 601 and the image 602 based on the position information indicating the overlapping region 480 of the image 602. Similarly, the spatial arrangement unit 410 spatially arranges the image 602 and the image 603, the image 603 and the image 604, the image 604 and the image 605, the image 605 and the image 606, the image 606 and the image 607, and the image 607 and the image 608.
  • the spatial arrangement unit 410 can calculate the spatial arrangement of the entire photographed image input to the photographed image input unit 402 by sequentially arranging each image spatially.
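  • a minimal sketch of such a sequential spatial arrangement is shown below, assuming a pure-translation model and the StoredImageRecord entries from the earlier sketch; the actual spatial arrangement unit 410 is not limited to this model:

```python
import numpy as np

def spatially_arrange(records):
    """Chain the per-pair offsets into global image positions, as the
    spatial arrangement unit 410 does; `records` are StoredImageRecord
    entries from the earlier sketch, in time-series order."""
    positions = {0: np.zeros(2)}              # the first image anchors the mosaic
    for i, rec in enumerate(records[1:], start=1):
        prev = rec.prev_index - 1             # FIG. 11 (B) indices are 1-based
        # Offset of this image relative to its predecessor: where a shared
        # pixel sits in the preceding image minus where it sits here.
        first = np.array(rec.first_coords[0])
        second = np.array(rec.second_coords[0])
        positions[i] = positions[prev] + (first - second)
    return positions
```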
  • the corresponding point detection unit 412 detects overlapping regions of the plurality of spatially arranged captured images, and detects corresponding points between the images of the detected overlapping regions. That is, the corresponding point detection unit 412 detects corresponding points in the overlapping regions based on the position information indicating the overlapping regions, or in overlapping regions newly produced as a result of the spatial arrangement.
  • as the method for detecting the corresponding points, a known method such as SIFT (Scale-Invariant Feature Transform) or SURF (Speeded-Up Robust Features) is employed.
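  • the following sketch restricts SIFT detection and matching to the overlap regions predicted by the spatial arrangement, which is what keeps detection time and cost down; the ROI format and the ratio-test threshold are assumptions:

```python
import cv2
import numpy as np

def detect_corresponding_points(img_a, img_b, roi_a, roi_b, ratio=0.75):
    """Run SIFT only inside the predicted overlap regions.
    Each roi is (x, y, w, h) in its own image."""
    sift = cv2.SIFT_create()

    def mask_for(img, roi):
        m = np.zeros(img.shape[:2], np.uint8)
        x, y, w, h = roi
        m[y:y + h, x:x + w] = 255
        return m

    kp_a, des_a = sift.detectAndCompute(img_a, mask_for(img_a, roi_a))
    kp_b, des_b = sift.detectAndCompute(img_b, mask_for(img_b, roi_b))
    raw = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    good = [p[0] for p in raw if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in good]
```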
  • FIG. 14 is a diagram showing the overlap of the image 601 to the image 608 after the spatial arrangement, and shows that a new overlapping area occurs after the spatial arrangement is performed using the position information of the overlapping area.
  • the corresponding point detection unit 412 detects an overlapping area between the previous image and the subsequent image and an overlapping area between the previous image and an image other than the subsequent image, and detects corresponding points between the images in the detected overlapping area.
  • for example, the corresponding point detection unit 412 performs corresponding point detection on the image 602 not only for the overlapping region 480 with the image 601 and the overlapping region 482 with the image 603, but also for the overlapping region 494 with the image 607 and the overlapping region 495 with the image 608. As a result, more corresponding points can be detected than just those between the image 601 and the image 602 and between the image 602 and the image 603.
  • similarly, the corresponding point detection unit 412 performs corresponding point detection on the image 601 not only for the overlapping region 480 with the image 602, but also for the overlapping region 496 with the image 608. As a result, more corresponding points can be detected than just those between the image 601 and the image 602.
  • the composition processing unit 414 generates a composite image by combining a plurality of captured images based on the detected corresponding points.
  • the composition processing unit 414 can generate various composite images by combining a plurality of images.
  • the composition processing unit 414 generates a panorama composite image and an ortho image.
  • the ortho image is an image obtained by converting an aerial photograph so as to be free of positional displacement and given correct position information.
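  • as an illustrative sketch, one common way to synthesize an image pair from the detected corresponding points is to estimate a homography and warp one image onto the other; the embodiment does not fix the transformation model, and the canvas handling below is deliberately simplified:

```python
import cv2
import numpy as np

def composite_pair(base, other, point_pairs):
    """Warp `other` onto `base` using a homography estimated from the
    detected corresponding points (at least four pairs are required)."""
    dst = np.float32([p_base for p_base, _ in point_pairs])
    src = np.float32([p_other for _, p_other in point_pairs])
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = base.shape[:2]
    canvas = cv2.warpPerspective(other, H, (2 * w, 2 * h))  # generous canvas
    canvas[:h, :w] = np.where(base > 0, base, canvas[:h, :w])
    return canvas
```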
  • FIG. 15 is a flowchart showing the operation and the image processing method of the image processing apparatus 400 of the present embodiment.
  • a plurality of photographed images in time series used for composition are input by the photographed image input unit 402 (step S10).
  • the image group captured by the robot apparatus 100 is input, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300, to the captured image input unit 402 realized by the CPU 310 of the computer 300.
  • the overlapping area of each image is measured by the overlapping area measuring unit 404 (step S11).
  • the overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • the captured images input by the captured image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored in the storage unit 408, realized by the main memory 314 of the computer 300, by the storage control unit 406 realized by the CPU 310 of the computer 300 (step S12).
  • the spatial arrangement unit 410 realized by the CPU 310 of the computer 300 arranges the captured image in a spatial manner based on the position information of the overlapping area, which is the outline information (step S13).
  • the corresponding point detection unit 412 realized by the CPU 310 of the computer 300 detects the corresponding point in the overlapping region (step S14).
  • since the overlapping regions after the captured images are spatially arranged include not only the overlapping regions between preceding and following images but also overlapping regions with images that are not adjacent in time series (for example, the overlapping regions 494 to 496 shown in FIG. 14), corresponding points can also be detected in those regions.
  • the composite processing unit 414 realized by the CPU 310 of the computer 300 generates a composite image based on the detected corresponding points (step S15).
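  • the steps of FIG. 15 can be summarized by the following sketch, which simply chains the illustrative functions introduced above; it stands in for the cooperation of the units of FIG. 8 and is not the embodiment itself:

```python
def run_pipeline(images):
    """Steps S10 to S15 of FIG. 15 chained together, reusing the
    sketches above; the unit classes of FIG. 8 and all error handling
    are omitted."""
    records = [StoredImageRecord("filename1")]            # S10/S12: input and store
    for i in range(1, len(images)):
        pairs, _ = measure_overlap(images[i - 1], images[i])   # S11: measure overlap
        records.append(StoredImageRecord(
            f"filename{i + 1}", prev_index=i,             # 1-based index of the preceding image
            first_coords=[pairs[0][0]], second_coords=[pairs[0][1]]))
    positions = spatially_arrange(records)                # S13: spatial arrangement
    # S14/S15 would follow: detect_corresponding_points on each overlap
    # predicted from `positions`, then composite_pair over those points.
    return positions
```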
  • the hardware structure of the processing units that execute the various processes described above is any of the following various processors.
  • the various processors include a CPU, which is a general-purpose processor that executes software (programs) to function as the various processing units; a programmable logic device (PLD), such as an FPGA (Field-Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing a specific process, such as an ASIC (Application-Specific Integrated Circuit).
  • One processing unit may be configured by one of these various processors, or by two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, as typified by computers such as clients and servers, one processor is configured by a combination of one or more CPUs and software, and this processor functions as the plurality of processing units. As a second example, as typified by a System on Chip (SoC), there is a form of using a processor that realizes the functions of an entire system including the plurality of processing units with a single IC (Integrated Circuit) chip.
  • in this way, the various processing units are configured using one or more of the various processors as a hardware structure. More specifically, the hardware structure of these various processors is circuitry in which circuit elements such as semiconductor elements are combined.
  • next, a second embodiment will be described. In the second embodiment, a moving image is input to the captured image input unit 402.
  • FIG. 16 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the second embodiment.
  • parts already described with reference to FIG. 8 are given the same reference numerals, and their description is omitted.
  • the image processing apparatus 400 includes a captured image input unit 402, a first image cutout unit 421, an overlapping area measurement unit 404, a second image cutout unit 423, a storage control unit 406, a storage unit 408, and a space arrangement unit. 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • in the present embodiment, a moving image is input to the photographed image input unit 402. That is, a plurality of captured images in time-series order used for image synthesis are sequentially input as a moving image composed of a plurality of captured images in which the first image and the second image that are adjacent in the time series have an overlapping region.
  • the first image cutout unit 421 cuts out the first image from the plurality of shot images input to the shot image input unit 402.
  • a moving image is input to the captured image input unit 402, and one of a plurality of frame images constituting the input moving image is cut out as a previous image.
  • the overlapping area measurement unit 404 measures the overlapping region between the cut-out first image and the plurality of input captured images. That is, the overlapping area measurement unit 404 measures, for the frame images of the input moving image, the overlapping region with the cut-out preceding image. For example, the overlapping area measurement unit 404 may measure the overlapping region for the frame images that follow the cut-out preceding image in chronological order.
  • the second image cutout unit 423 cuts out the second image from the plurality of captured images when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value. That is, when the overlapping region measured by the overlapping area measurement unit 404 reaches the predetermined value, the corresponding frame image is cut out as the second image. For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the second image cutout unit 423 cuts out, as the second image, the frame image whose measured area reaches the predetermined value.
  • the storage control unit 406 stores the input moving image and the position information of the overlapping area measured by the overlapping area measuring unit 404 in the storage unit 408.
  • the storage control unit 406 may cause the storage unit 408 to store the front image, the rear image, and the position information of the overlapping area.
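  • a minimal sketch of the first and second image cutout behavior for a moving image is shown below, reusing the measure_overlap sketch from above; the 30% target overlap ratio and the OpenCV video-reading loop are assumptions:

```python
import cv2

def cut_out_key_frames(video_path, target_ratio=0.3):
    """Walk the frames of the input moving image and cut out the next
    key frame whenever the measured overlap with the current key frame
    shrinks to the predetermined value."""
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()                     # first image cutout unit 421
    if not ok:
        raise IOError("could not read the moving image")
    key_frames = [first]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        _, overlap_w = measure_overlap(key_frames[-1], frame)   # unit 404
        if overlap_w <= target_ratio * frame.shape[1]:
            key_frames.append(frame)           # second image cutout unit 423
    cap.release()
    return key_frames
```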
  • FIG. 17 is a flowchart showing the operation and the image processing method of the image processing apparatus 400 of the present embodiment.
  • a plurality of photographed images in time series used for composition are input by the photographed image input unit 402 (step S20).
  • an image group (moving image) captured by the robot apparatus 100 is input, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300, to the captured image input unit 402 realized by the CPU 310 of the computer 300. In the present embodiment, a moving image is input to the captured image input unit 402.
  • the first image cutout unit 421 cuts out a previous image from a plurality of frame images constituting the moving image (step S21).
  • the first image cutout unit 421 is realized by the CPU 310 of the computer 300.
  • the overlapping area measuring unit 404 measures an overlapping area between a plurality of frame images constituting the moving image and the previous image (step S22).
  • the overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • thereafter, the second image cutout unit 423 cuts out, from the moving image, the frame image serving as the following image when the measurement result of the overlapping area measurement unit 404 reaches the threshold value (step S23).
  • the second image cutout unit 423 is realized by the CPU 310 of the computer 300.
  • the moving image input by the photographed image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored in the storage unit 408, realized by the main memory 314 of the computer 300, by the storage control unit 406 realized by the CPU 310 of the computer 300 (step S24).
  • the spatial arrangement unit 410 realized by the CPU 310 of the computer 300 arranges the captured image in a spatial arrangement based on the position information of the overlapping area that is the outline information (step S25).
  • the corresponding point detection unit 412 realized by the CPU 310 of the computer 300 detects the corresponding point in the overlapping region (step S26).
  • the overlapping area after the captured image is spatially arranged is not only the overlapping area of the relationship between the previous image and the subsequent image, but also the overlapping area with an image not in the context of the image (for example, the overlapping areas 494 and 496 shown in FIG. 14). ) Can also detect corresponding points.
  • the composite processing unit 414 realized by the CPU 310 of the computer 300 generates a composite image based on the detected corresponding points (step S27).
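For step S25, the spatial arrangement amounts to accumulating stored offsets; below is a minimal sketch under the assumption that each overlapping region's position information has already been reduced to a per-image (dx, dy) offset, which is a simplification for illustration.

```python
def arrange_spatially(images, offsets):
    # Each later image is placed relative to its predecessor using the
    # offset derived from the stored overlapping-region position
    # information; positions are accumulated starting from the first image.
    positions = [(0.0, 0.0)]
    for dx, dy in offsets:  # one offset per image after the first
        px, py = positions[-1]
        positions.append((px + dx, py + dy))
    return list(zip(images, positions))
```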
  • Next, a third embodiment will be described.
  • In the third embodiment, a group of still images is input to the captured image input unit 402, and the first image and the second image are extracted from the image group and stored in the storage unit 408.
  • FIG. 18 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the third embodiment.
  • Parts already described with reference to FIG. 8 are given the same reference numerals, and their description is omitted.
  • The image processing apparatus 400 includes a captured image input unit 402, an overlapping area measurement unit 404, an image extraction unit 431, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • The captured image input unit 402 receives the group of still images. That is, it sequentially receives a plurality of time-series captured images used for image synthesis, in which the first image and the second image, adjacent in time-series order, have a mutually overlapping region.
  • The image extraction unit 431 extracts the first image and the second image when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value. That is, the image extraction unit 431 extracts the first image and the second image from the still image group input to the captured image input unit 402 when the measurement result of the overlapping area measurement unit 404 reaches that value.
  • For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the image extraction unit 431 extracts the first image and the second image when the measured area equals the predetermined value.
  • The storage control unit 406 stores the extracted first image and second image in the storage unit 408, and also causes the storage unit 408 to store, in association with the second image, position information indicating the measured overlapping region.
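As an illustration of what gets stored per extracted image, a minimal record type is sketched below; the field names and the idea of keeping one corresponding coordinate pair per image are assumptions for illustration, loosely matching the position-information format described later in this document.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class StoredCapture:
    # One extracted image together with the position information of its
    # overlapping region, kept in association with the second image (the
    # first image of the sequence carries no overlap entry).
    image_path: str
    first_xy: Optional[Tuple[float, float]] = None   # point in the preceding image
    second_xy: Optional[Tuple[float, float]] = None  # same point in this image
```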
  • FIG. 19 is a flowchart showing the operation of the image processing apparatus 400 of the present embodiment and the corresponding image processing method.
  • First, a plurality of time-series captured images (a group of still images) used for synthesis are input via the captured image input unit 402 (step S30).
  • The image group captured by the robot apparatus 100 is input to the captured image input unit 402, which is realized by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300.
  • Next, the overlapping region between successive images is measured by the overlapping area measurement unit 404 (step S31).
  • The overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • The image extraction unit 431 then extracts the first image and the second image (step S32).
  • The image extraction unit 431 is realized by the CPU 310 of the computer 300.
  • The captured images input via the captured image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored, by the storage control unit 406 realized by the CPU 310 of the computer 300, in the storage unit 408 realized in the main memory 314 of the computer 300 (step S33).
  • The spatial arrangement unit 410, realized by the CPU 310 of the computer 300, spatially arranges the captured images based on the position information of the overlapping regions, which serves as rough placement information (step S34).
  • The corresponding point detection unit 412, realized by the CPU 310 of the computer 300, detects corresponding points in the overlapping regions (step S35).
  • Once the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping regions between first and second images, but also in overlapping regions shared with images that are not adjacent in time-series order (for example, the overlapping regions 494 and 496 shown in FIG. 14).
  • The synthesis processing unit 414, realized by the CPU 310 of the computer 300, generates a composite image based on the detected corresponding points (step S36).
  • Next, a fourth embodiment will be described. In this embodiment, the first image and the second image are automatically acquired by the captured image acquisition unit 401.
  • FIG. 20 is a diagram illustrating a functional configuration example of the present embodiment.
  • Parts already described with reference to FIG. 8 are given the same reference numerals, and their description is omitted.
  • The camera device 403 includes a captured image acquisition unit 401, an overlapping area measurement unit 404, an imaging control unit 441, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • The camera device 403 is provided in the robot apparatus 100.
  • The captured image acquisition unit 401 is realized by, for example, the imaging device 200 of the robot apparatus 100, while the overlapping area measurement unit 404, the imaging control unit 441, the storage control unit 406, the storage unit 408, the spatial arrangement unit 410, the corresponding point detection unit 412, and the synthesis processing unit 414 are realized by, for example, the imaging control unit 204 of the robot apparatus 100.
  • The captured image acquisition unit 401 sequentially acquires a plurality of time-series captured images used for image synthesis, in which the first image and the second image, adjacent in time-series order, have a mutually overlapping region.
  • The overlapping area measurement unit 404 measures the overlapping region between the first image and the moving image being captured by the captured image acquisition unit 401.
  • For example, the overlapping area measurement unit 404 measures the overlapping region between the acquired first image and the live view image captured by the camera.
  • The imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value. That is, the overlapping region between the first image and the live view image is measured, and the second image is acquired automatically when the overlap reaches the predetermined value. For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and when the measured area equals the predetermined value, the imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image.
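A minimal sketch of this automatic acquisition loop follows; the `camera` object with `capture`, `live_view_frame`, and `is_moving` methods is a hypothetical interface introduced only for illustration, and `measure_overlap` is the illustrative helper sketched earlier in this document.

```python
def auto_capture(camera, predetermined=0.3):
    # Acquire the first image, then watch the live view; fire the shutter
    # automatically whenever the overlap with the last captured image
    # falls to the predetermined value (illustrative trigger condition).
    last = camera.capture()
    captured = [last]
    while camera.is_moving():
        preview = camera.live_view_frame()
        if measure_overlap(last, preview) <= predetermined:
            last = camera.capture()
            captured.append(last)
    return captured
```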

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

Provided are an image processing device, a camera device, and an image processing method with which the time and calculation cost required for corresponding point detection during synthesis can be suppressed and a plurality of captured images can be stored efficiently, by calculating a spatial arrangement of the captured images on the basis of position information indicating overlapping areas, without measuring the imaging position with sensors. The image processing device 400 comprises: a captured image input unit 402; an overlapping area measurement unit 404 that measures overlapping areas; a storage control unit 406 that stores, in a storage unit, the measured position information indicating overlapping areas in association with a second image; a spatial arrangement unit 410 that spatially arranges the captured images; a corresponding point detection unit 412 that detects overlapping areas among the captured images and detects corresponding points between images in the detected overlapping areas; and a synthesis processing unit 414 that synthesizes the plurality of captured images on the basis of the detected corresponding points to generate a composite image.

Description

Image processing device, camera device, and image processing method

The present invention relates to an image processing device, a camera device, and an image processing method, and more particularly to an image processing device, a camera device, and an image processing method for generating a composite image by synthesizing consecutive images that have mutually overlapping regions.
Techniques for panoramic synthesis of a plurality of continuously captured images have long been proposed. As one approach to synthesizing panoramic images efficiently, the shooting position at which each captured image was acquired has been measured with a GPS (Global Positioning System) receiver or a rangefinder.

For example, Patent Document 1 describes a technique for capturing images for panoramic synthesis with a camera equipped with a gyro sensor. In Patent Document 1, when the camera is moved from a first shooting scene to an adjacent second shooting scene, the movement of the camera is detected by the gyro sensor, and an indicator instructs the user on the correct shooting position for the second scene.

Methods have also been proposed that aim to perform panoramic synthesis efficiently without equipping the camera with position sensors or gyro sensors.

For example, Patent Document 2 describes a technique for panoramic synthesis of a plurality of captured images. In the panoramic image synthesis method of Patent Document 2, the similarity of the joint between the N-th and (N+1)-th images is evaluated from the spatial frequency or the number of feature points; when the similarity is below a threshold and the amount of movement between the N-th image and the (N+1)-th image cannot be calculated, panoramic synthesis is performed using a movement amount calculated from two consecutive images preceding the N-th image or following the (N+1)-th image.
Patent Document 1: Japanese Patent Application Laid-Open No. 06-105214
Patent Document 2: Japanese Patent Application Laid-Open No. 2011-188035
Here, when a plurality of captured images overlap one another, information about how each captured image forms its overlapping regions makes it possible to calculate spatially how the group of captured images fits together.

When captured images that have mutually overlapping regions in time-series order are synthesized, corresponding points are detected in the overlapping regions and the synthesis is performed based on the detection result. If the spatial arrangement of each captured image is known, the image processing for synthesis can therefore be performed efficiently: corresponding points need to be detected only within the known overlapping regions, which suppresses both processing time and calculation cost.

Patent Documents 1 and 2 make no mention of storing information that indicates the overlapping regions between captured images.

The present invention has been made in view of such circumstances, and its object is to provide an image processing device, a camera device, and an image processing method that require no sensors for measuring the shooting position and that, by calculating the spatial arrangement of a plurality of captured images from position information indicating their overlapping regions, suppress the time and calculation cost required for corresponding point detection during synthesis and store the plurality of captured images efficiently.
To achieve the above object, an image processing device according to one aspect of the present invention comprises: a captured image input unit that sequentially receives a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; an overlapping area measurement unit that measures the overlapping region between the input first image and second image; a storage control unit that stores the plurality of input captured images in a storage unit and stores, in association with the second image, position information indicating the measured overlapping region; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions among the spatially arranged captured images and detects corresponding points between images within the detected overlapping regions; and a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.

According to this aspect, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and position information indicating the overlapping region is stored in the storage unit in association with the second image by the storage control unit. As a result, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors. The corresponding point detection unit then detects corresponding points between images in the overlapping regions based on the calculated spatial arrangement. Since corresponding point detection is thus performed only in limited regions, the time and calculation cost it requires can be suppressed. In addition, because the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.
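To illustrate why the stored overlap saves computation, here is a minimal sketch that restricts feature detection to the known overlap rectangles using detection masks; the use of OpenCV ORB features, the rectangle format, and the matcher settings are illustrative choices, not details specified by this aspect.

```python
import cv2
import numpy as np

def match_in_overlap(img1, img2, rect1, rect2, n_features=500):
    # rect = (x, y, w, h): the overlapping region of each image, taken
    # from the stored position information. The masks confine the
    # corresponding-point search to those regions only.
    orb = cv2.ORB_create(nfeatures=n_features)
    mask1 = np.zeros(img1.shape[:2], np.uint8)
    mask2 = np.zeros(img2.shape[:2], np.uint8)
    x, y, w, h = rect1
    mask1[y:y + h, x:x + w] = 255
    x, y, w, h = rect2
    mask2[y:y + h, x:x + w] = 255
    kp1, des1 = orb.detectAndCompute(img1, mask1)
    kp2, des2 = orb.detectAndCompute(img2, mask2)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return kp1, kp2, matcher.match(des1, des2)
```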
An image processing device according to another aspect of the present invention comprises: a captured image input unit that sequentially receives a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; a first image cutout unit that cuts out the first image from the plurality of input captured images; an overlapping area measurement unit that measures the overlapping region between the cut-out first image and the input captured images; a second image cutout unit that cuts out the second image from the plurality of captured images when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the plurality of input captured images in a storage unit and stores, in association with the second image, position information indicating the measured overlapping region; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions among the spatially arranged captured images and detects corresponding points between images within the detected overlapping regions; and a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.

According to this aspect, the overlapping region between the cut-out first image and the input captured images is measured by the overlapping area measurement unit, and position information indicating the overlapping region is stored in the storage unit in association with the second image by the storage control unit. As a result, the spatial arrangement of the plurality of captured images input to the captured image input unit is calculated without measuring the shooting position with sensors, and the corresponding point detection unit detects corresponding points between images in the overlapping regions based on that calculated arrangement. Since corresponding point detection is performed only in limited regions, the time and calculation cost it requires can be suppressed, and because the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.

An image processing device according to another aspect of the present invention comprises: a captured image input unit that sequentially receives a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; an overlapping area measurement unit that measures the overlapping region between the input first image and second image; an image extraction unit that extracts the first image and the second image when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the extracted first image and second image in a storage unit and stores, in association with the second image, position information indicating the measured overlapping region; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions among the spatially arranged captured images and detects corresponding points between images within the detected overlapping regions; and a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.

According to this aspect, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and position information indicating the measured overlapping region is stored in the storage unit in association with the second image by the storage control unit. As in the aspects above, the spatial arrangement of the input captured images is calculated without measuring the shooting position with sensors, corresponding point detection is confined to the limited overlapping regions, which suppresses its time and calculation cost, and the captured images are stored efficiently together with the position information.
Preferably, the synthesis processing unit generates an orthoimage by the synthesis.

According to this aspect, since the synthesis processing unit generates an orthoimage by the synthesis, the time and calculation cost of detecting corresponding points in the overlapping regions during orthoimage generation can be suppressed.

Preferably, the storage control unit records the position information in an EXIF tag attached to the second image.

According to this aspect, the storage control unit records the position information in an EXIF tag attached to the second image, so the position information can be stored efficiently in association with the second image.
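A minimal sketch of writing the position information into the second image's EXIF data follows; the use of the piexif package, the UserComment tag, and the JSON payload are all illustrative assumptions, since this aspect only specifies that an EXIF tag carries the data.

```python
import json
import piexif
import piexif.helper

def record_overlap_exif(jpeg_path, first_xy, second_xy):
    # Serialize the corresponding coordinate pair and store it in the
    # EXIF UserComment tag of the second image, rewriting the file in place.
    payload = json.dumps({"first_xy": first_xy, "second_xy": second_xy})
    exif = piexif.load(jpeg_path)
    exif["Exif"][piexif.ExifIFD.UserComment] = piexif.helper.UserComment.dump(payload)
    piexif.insert(piexif.dump(exif), jpeg_path)
```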
Preferably, the corresponding point detection unit also detects overlapping regions between the first image and images other than the second image, and detects corresponding points between the images in those detected overlapping regions.

According to this aspect, the corresponding point detection unit also detects overlapping regions between the first image and images other than the second image, and detects corresponding points there. Because the synthesis is then based on more corresponding points than a synthesis using only the corresponding points in the overlapping region between the first and second images, a more accurate synthesis can be performed.

Preferably, the position information indicating the overlapping region consists of first coordinates in the first image and corresponding second coordinates in the second image.

According to this aspect, since the position information indicating the overlapping region consists of first coordinates in the first image and corresponding second coordinates in the second image, more accurate position information about the overlapping region can be stored in the storage unit.

Preferably, the position information indicating the overlapping region consists of at most two pairs of first and second coordinates.

According to this aspect, since the position information indicating the overlapping region consists of at most two coordinate pairs, the amount of position information is kept small and the storage capacity of the storage unit can be used effectively.
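With a single stored coordinate pair, the relative placement of the two images follows directly, as in this small sketch; translation-only placement is an assumption made for illustration, and a second pair would additionally constrain rotation and scale.

```python
def offset_from_pair(first_xy, second_xy):
    # The same scene point appears at first_xy in the first image and at
    # second_xy in the second image, so the second image's origin sits at
    # this offset in the first image's coordinate system.
    (x1, y1), (x2, y2) = first_xy, second_xy
    return (x1 - x2, y1 - y2)
```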
A camera device according to another aspect of the present invention comprises: a captured image acquisition unit that sequentially acquires a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; an overlapping area measurement unit that measures the overlapping region between the first image and the moving image being captured by the captured image acquisition unit; an imaging control unit that causes the captured image acquisition unit to acquire the second image when the measurement result of the overlapping area measurement unit reaches a predetermined value; a storage control unit that stores the plurality of acquired captured images in a storage unit and stores, in association with the second image, position information indicating the measured overlapping region; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions among the spatially arranged captured images and detects corresponding points between images within the detected overlapping regions; and a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.

According to this aspect, the overlapping region between the first image and the moving image captured by the captured image acquisition unit is measured by the overlapping area measurement unit, and position information indicating the measured overlapping region is stored in the storage unit in association with the second image by the storage control unit. The spatial arrangement of the acquired captured images is thus calculated without measuring the shooting position with sensors; corresponding point detection is confined to the limited overlapping regions, which suppresses its time and calculation cost; and the captured images are stored efficiently together with the position information.
An image processing method according to another aspect of the present invention includes the steps of: sequentially inputting a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; measuring the overlapping region between the input first image and second image; storing the plurality of input captured images in a storage unit and storing, in association with the second image, position information indicating the measured overlapping region; spatially arranging the plurality of captured images based on the captured images and position information stored in the storage unit; detecting overlapping regions among the spatially arranged captured images and detecting corresponding points between images within the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.

An image processing method according to another aspect of the present invention includes the steps of: sequentially inputting a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; cutting out the first image from the plurality of input captured images; measuring the overlapping region between the cut-out first image and the input captured images; cutting out the second image from the plurality of captured images when the measurement result of the measuring step reaches a predetermined value; storing the plurality of input captured images in a storage unit and storing, in association with the second image, position information indicating the measured overlapping region; spatially arranging the plurality of captured images based on the captured images and position information stored in the storage unit; detecting overlapping regions among the spatially arranged captured images and detecting corresponding points between images within the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.

An image processing method according to another aspect of the present invention includes the steps of: sequentially inputting a plurality of time-series captured images used for image synthesis, in which a first image and a second image adjacent in time-series order have a mutually overlapping region; measuring the overlapping region between the input first image and second image; extracting the first image and the second image when the measurement result of the measuring step reaches a predetermined value; storing the extracted first image and second image in a storage unit and storing, in association with the second image, position information indicating the measured overlapping region; spatially arranging the plurality of captured images based on the captured images and position information stored in the storage unit; detecting overlapping regions among the spatially arranged captured images and detecting corresponding points between images within the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.

According to the present invention, the overlapping region between the input first image and second image is measured by the overlapping area measurement unit, and position information indicating the overlapping region is stored in the storage unit in association with the second image by the storage control unit. The spatial arrangement of the plurality of captured images input to the captured image input unit is thereby calculated without measuring the shooting position with sensors, and the corresponding point detection unit detects corresponding points between images in the overlapping regions based on the calculated spatial arrangement. Since corresponding point detection is performed only in limited regions, the time and calculation cost it requires can be suppressed, and because the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.
FIG. 1 is a perspective view showing the structure of a bridge, one of the structures to be photographed.
FIG. 2 is a conceptual diagram showing the imaging system.
FIG. 3 is a perspective view showing the appearance of the robot apparatus.
FIG. 4 is a cross-sectional view of the main part of the robot apparatus shown in FIG. 3.
FIG. 5 is a diagram showing the camera.
FIG. 6 is a block diagram showing a functional configuration example of the robot apparatus.
FIG. 7 is a block diagram showing the system configuration of the computer.
FIG. 8 is a diagram showing a functional configuration example of the image processing apparatus.
FIG. 9 is a diagram conceptually showing images input to the captured image input unit.
FIG. 10 is a diagram explaining the measurement of overlapping regions.
FIG. 11 shows a data configuration example of the storage unit.
FIG. 12 is a diagram explaining that position information of an overlapping region is stored in an EXIF tag.
FIG. 13 is a conceptual diagram of images spatially arranged based on the position information of overlapping regions.
FIG. 14 is a diagram showing the overlap of images after spatial arrangement.
FIG. 15 is a flowchart showing the operation of the image processing apparatus and the image processing method.
FIG. 16 is a diagram showing a functional configuration example of the image processing apparatus.
FIG. 17 is a flowchart showing the operation of the image processing apparatus and the image processing method.
FIG. 18 is a diagram showing a functional configuration example of the image processing apparatus.
FIG. 19 is a flowchart showing the operation of the image processing apparatus and the image processing method.
FIG. 20 is a diagram showing a functional configuration example of the image processing apparatus.
Hereinafter, preferred embodiments of an image processing device, a camera device, and an image processing method according to the present invention will be described with reference to the accompanying drawings.

FIG. 1 is a perspective view showing the structure of a bridge, one of the structures to be photographed, as viewed from below. The object photographed in the captured images used in the present invention is not particularly limited; the present invention applies, for example, to synthesizing captured images used for damage detection inspection of a structure. Specifically, when damage detection inspection of the floor slab 6 is performed panel by panel, the floor slab 6 within each panel is photographed in sections, and the present invention is applied when synthesizing the resulting sectional captured images.

The bridge 1 shown in FIG. 1 has main girders 2, cross beams 3, sway bracing 4, and lateral bracing 5, which are connected by bolts, rivets, or welding. A floor slab 6 on which vehicles and the like travel is cast on top of the main girders 2 and related members. The floor slab 6 is generally made of reinforced concrete.

The main girders 2 span between abutments or piers and are members that support the load of vehicles and the like on the floor slab 6. The cross beams 3 are members that connect the main girders 2 so that a load is supported by the plurality of main girders 2. The sway bracing 4 and the lateral bracing 5 are members that connect the main girders 2 to one another to resist lateral loads from wind and earthquakes, respectively.
FIG. 2 is a conceptual diagram showing the imaging system.

The imaging system 500 includes the robot apparatus 100 and the computer 300.

The robot apparatus 100 includes the imaging device 200, and moves and photographs with the imaging device 200 under the control of the computer 300. The robot apparatus 100 is not particularly limited as long as it can move and photograph under the control of the computer 300; other examples include traveling robots, small helicopters, multicopters, drones, and devices called UAVs (Unmanned Aerial Vehicles).

The computer 300 and the robot apparatus 100 can communicate with each other, and the computer 300 can remotely control the movement of the robot apparatus 100 and the imaging performed by the imaging device 200.

FIG. 3 is a perspective view showing the appearance of the robot apparatus 100, which includes a camera as an embodiment of the imaging device 200, installed between the main girders 2 of the bridge 1. FIG. 4 is a cross-sectional view of the main part of the robot apparatus 100 shown in FIG. 3.

As shown in FIGS. 3 and 4, the robot apparatus 100 includes the imaging device 200, controls the position (shooting position) of the imaging device 200 in three-dimensional space and its shooting direction, and photographs any inspection member of the bridge 1, which is composed of a plurality of members, during inspection of the bridge 1.
As described in detail later, the robot apparatus 100 includes a main frame 102, a vertically extendable arm 104, a housing 106 in which a drive unit and various control units for the vertical arm 104 are arranged, an X-direction drive unit 108 (FIG. 6) that moves the housing 106 in the longitudinal direction of the main frame 102 (the direction orthogonal to the longitudinal direction of the main girder 2; the X direction), a Y-direction drive unit 110 (FIG. 6) that moves the entire robot apparatus 100 in the longitudinal direction of the main girder 2 (the Y direction), and a Z-direction drive unit 112 (FIG. 6) that extends and retracts the vertical arm 104 in the vertical direction (the Z direction).

The X-direction drive unit 108 includes a ball screw 108A arranged in the longitudinal direction (X direction) of the main frame 102, a ball nut 108B arranged in the housing 106, and a motor 108C that rotates the ball screw 108A; rotating the ball screw 108A forward or backward with the motor 108C moves the housing 106 in the X direction.

The Y-direction drive unit 110 includes tires 110A and 110B arranged at both ends of the main frame 102 and motors (not shown) arranged inside the tires 110A and 110B; driving the tires with the motors moves the entire robot apparatus 100 in the Y direction.

The robot apparatus 100 is installed with the tires 110A and 110B at both ends of the main frame 102 placed on the lower flanges of two main girders 2 so as to sandwich the main girders 2. In this way, the robot apparatus 100 can hang from the lower flanges of the main girders 2 and move (self-propel) along them. Although not shown, the main frame 102 is configured so that its length can be adjusted to match the spacing of the main girders 2.

The vertical arm 104 is arranged on the housing 106 of the robot apparatus 100 and moves in the X and Y directions together with the housing 106. The vertical arm 104 also extends and retracts in the Z direction by means of the Z-direction drive unit 112 (FIG. 6) provided inside the housing 106.

As shown in FIG. 5, a camera mount 104A is provided at the tip of the vertical arm 104, and a camera 202 that can be rotated in the pan and tilt directions by a pan/tilt mechanism 120 is mounted on it.

The captured images are acquired as "inspection images" to be attached to the inspection report.

The camera 202 is rotated about a pan axis P coaxial with the vertical arm 104, or about a horizontal tilt axis T, by the pan/tilt mechanism 120 driven by a pan/tilt drive unit 206 (FIG. 6). This allows the camera 202 to shoot in any posture (in any shooting direction).
FIG. 6 is a block diagram showing a functional configuration example of the robot apparatus 100.

As shown in FIG. 6, the robot apparatus 100 comprises, on the robot side, a robot control unit 130, the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112; on the imaging device 200 side, the camera 202, an imaging control unit 204, a pan/tilt control unit 210, and the pan/tilt drive unit 206; and a robot-side communication unit 230.

The robot-side communication unit 230 performs two-way wireless communication with the computer 300, receives the various commands transmitted from the computer 300, such as movement commands for controlling the movement of the robot apparatus 100, pan/tilt commands for controlling the pan/tilt mechanism 120, and shooting commands for controlling the camera 202, and outputs the received commands to the corresponding control units. Details of the computer 300 will be described later.

The robot control unit 130 controls the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112 based on movement commands input from the robot-side communication unit 230, moving the robot apparatus 100 in the X and Y directions and extending or retracting the vertical arm 104 in the Z direction (see FIG. 3).

The pan/tilt control unit 210 operates the pan/tilt mechanism 120 in the pan and tilt directions via the pan/tilt drive unit 206 based on pan/tilt commands input from the robot-side communication unit 230, panning and tilting the camera 202 in the desired direction (see FIG. 5).

The imaging control unit 204 causes the imaging unit 202A of the camera 202 to capture live view images or inspection images in accordance with shooting commands input from the robot-side communication unit 230. As the captured images of the present invention, still images acquired by the imaging unit 202A or a moving image including live view images are used.

Image data representing the images K captured by the imaging unit 202A of the camera 202 during inspection of the bridge 1 is transmitted to the computer 300 via the robot-side communication unit 230.
As shown in FIG. 2, the computer 300 is a tablet computer. It has a flat plate-like housing with a rectangular outline and includes a touch panel display 302 that serves as both the display unit 326 (FIG. 6) and the input unit 328 (FIG. 6).

FIG. 7 is a block diagram showing the system configuration of the computer 300.

As shown in FIG. 7, the computer 300 includes a CPU (Central Processing Unit) 310 that controls the overall operation of the computer 300, to which a main memory 314, a nonvolatile memory 316, a mobile communication unit 318, a wireless LAN communication unit 320, a short-range wireless communication unit 322, a wired communication unit 324, a display unit 326, an input unit 328, a key input unit 330, an audio processing unit 332, an image processing unit 334, and so on are connected via a system bus 312.

The CPU 310 reads the operation programs (an OS (Operating System) and application programs running on the OS), fixed-form data, and the like stored in the nonvolatile memory 316, loads them into the main memory 314, and executes the operation programs, thereby functioning as a control unit that controls the operation of the entire computer.

The main memory 314 is composed of, for example, a RAM (Random Access Memory) and functions as the work memory of the CPU 310.

The nonvolatile memory 316 is composed of, for example, a flash EEPROM (Electrically Erasable Programmable Read Only Memory) and stores the above-mentioned operation programs and various fixed-form data. The nonvolatile memory 316 also functions as the storage unit of the computer 300 and stores various data.

The mobile communication unit 318 transmits and receives data to and from the nearest base station (not shown) via the antenna 318A, based on third-generation mobile communication systems compliant with the IMT-2000 standard (International Mobile Telecommunication-2000) and fourth-generation mobile communication systems compliant with the IMT-Advanced standard (International Mobile Telecommunications-Advanced).

The wireless LAN communication unit 320 performs wireless LAN communication compliant with a predetermined wireless LAN communication standard (for example, the IEEE 802.11a/b/g/n standard) with a wireless LAN access point or an external device capable of wireless LAN communication via the antenna 320A.

The short-range wireless communication unit 322 transmits and receives data via the antenna 322A to and from other Bluetooth (registered trademark) devices within, for example, Class 2 range (a radius of about 10 m).

The wired communication unit 324 performs communication compliant with a predetermined communication standard, for example USB (Universal Serial Bus) communication, with an external device connected by cable via the external connection terminal 306.

The display unit 326 includes a color LCD (liquid crystal display) panel that constitutes the display portion of the touch panel display 302 and its drive circuit, and displays various images.

The input unit 328 constitutes the touch panel portion of the touch panel display 302 and is formed integrally with the color LCD panel using transparent electrodes.

The key input unit 330 includes a plurality of operation buttons provided on the housing of the computer 300 and their drive circuit.

The audio processing unit 332 converts digital audio data supplied via the system bus 312 to analog and outputs it from the speaker 304.

The image processing unit 334 digitizes the analog image signal output from the built-in camera 305, which includes a shooting lens and an image sensor, applies the required signal processing, and outputs the result.
 <First Embodiment>
 FIG. 8 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the first embodiment. The image processing apparatus 400 is provided in the computer 300.
 The image processing apparatus 400 includes a captured image input unit 402, an overlapping area measurement unit 404, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
 The captured image input unit 402 sequentially receives a plurality of captured images, in time-series order, to be used for image synthesis; among these images, the first image (the preceding image) and the second image (the succeeding image), which are adjacent in time-series order, share an overlapping region. In other words, the captured image input unit 402 receives a plurality of captured images that will constitute a composite image, in which each preceding image and its succeeding image overlap.
 FIG. 9 conceptually shows the captured images input to the captured image input unit 402. The eight captured images 601 to 608 constitute a composite image and were captured in time-series order from image 601 to image 608. Adjacent images share overlapping regions, with part of each preceding image overlapping part of its succeeding image. For example, between images 601 and 602, image 601 is the preceding image and image 602 the succeeding image, and the two share an overlapping region; between images 602 and 603, image 602 is the preceding image and image 603 the succeeding image, again with an overlapping region. The same relationship holds through images 604 to 608. The dotted regions in the figure are the overlapping regions, and the arrows indicate the overlapping relationships.
 The overlapping area measurement unit 404 measures the overlapping region between the preceding image and the succeeding image input to the captured image input unit 402. Specifically, it measures information indicating the size of the overlapping region, such as the area, or the length, of the region of the succeeding image that overlaps the preceding image. Together with this measurement, the overlapping area measurement unit 404 also calculates position information indicating where the overlapping region lies.
 FIG. 10 illustrates the measurement of the overlapping region between the preceding image U and the succeeding image V.
 The preceding image U and the succeeding image V share an overlapping region: in the preceding image U it is the region P1, and in the succeeding image V it is the region P2. Subjects H, I, and J appear in both images as common subjects. The overlapping area measurement unit 404 detects pairs of feature-point coordinates between the preceding image U and the succeeding image V as overlapping locations, and measures the overlapping regions P1 and P2. Specifically, it extracts feature points O5 to O8 in the succeeding image and detects the corresponding feature points O1 to O4 in the preceding image. Among the detected correspondences, only those whose template-matching score is at or above a threshold may be adopted, for example. From the feature points detected in this way, the overlapping area measurement unit 404 measures the size (area, length) of the overlapping region, or the information (coordinates) indicating the position of the overlapping region in each image.
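 As a reference, the following is a minimal sketch of this overlap measurement; AKAZE features with brute-force Hamming matching are an assumed choice (the disclosure does not fix a detector), and all names are illustrative.

```python
# A minimal sketch of the overlap measurement, assuming AKAZE features and
# brute-force Hamming matching. All names are illustrative.
import cv2
import numpy as np

def measure_overlap(prev_img, next_img, ratio=0.75):
    """Return matched point pairs and the overlap bounding box in next_img."""
    akaze = cv2.AKAZE_create()
    kp1, des1 = akaze.detectAndCompute(prev_img, None)
    kp2, des2 = akaze.detectAndCompute(next_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        # Lowe's ratio test keeps only distinctive correspondences
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            matches.append(pair[0])
    if not matches:                      # no correspondences: treat as no overlap
        return np.empty((0, 2)), np.empty((0, 2)), (0, 0, 0, 0)
    pts_prev = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts_next = np.float32([kp2[m.trainIdx].pt for m in matches])
    # The bounding box of the matched points approximates the region P2
    return pts_prev, pts_next, cv2.boundingRect(pts_next)
```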
 The storage control unit 406 stores the plurality of input captured images in the storage unit 408, and stores in the storage unit 408 the position information indicating the measured overlapping region in association with the second image. Specifically, the images 601 to 608 shown in FIG. 9 are stored in the storage unit 408, and the position information indicating each overlapping region is stored in association with the corresponding succeeding image.
 FIG. 11 shows an example of the data structure of the storage unit 408. FIG. 11 is described with reference to images 601, 602, and 603; the other images are stored in the same manner.
 Part (A) of FIG. 11 shows an example of how the shooting file names are stored. For example, image 601 is saved in the storage unit 408 under the shooting file name "filename1", image 602 under "filename2", and image 603 under "filename3".
 Part (B) of FIG. 11 shows an example of how the index of the preceding image having an overlapping region is stored. Since image 601 is the first image, there is no preceding image with an overlapping region, so "NULL" is stored. Image 602 has an overlapping region with image 601, so "1", indicating image 601, is stored; image 603 has an overlapping region with image 602, so "2", indicating image 602, is stored in the storage unit 408.
 Part (C) of FIG. 11 shows an example of how the pixels of the preceding image are stored. Since image 601 is the first image, the pixel information of a preceding image sharing an overlapping region is stored as "NULL". Image 602 shares an overlapping region with image 601, so the pixel information of the preceding image (image 601), "(X11, Y11), ... (X1n, Y1n)", is stored. Likewise, image 603 shares an overlapping region with image 602, so the pixel information of the preceding image (image 602) is stored as "(X21, Y21), ... (X2n, Y2n)".
 Part (D) of FIG. 11 shows an example of how the corresponding pixels are stored. Since image 601 is the first image, its corresponding pixel information is "NULL". For image 602, the pixel information "(X21-I, Y21-I), ... (X2n-I, Y2n-I)" corresponding to the pixels shown in part (C) of FIG. 11 is stored, and for image 603 the corresponding pixels "(X31-I, Y31-I), ... (X3n-I, Y3n-I)" are stored.
 In the example shown in FIG. 11, the pixel information of the preceding image (part (C) of FIG. 11) and the information of the corresponding pixels (part (D) of FIG. 11) are stored as the position information indicating the overlapping region. That is, first coordinates of the preceding image (part (C) of FIG. 11) and second coordinates of the succeeding image corresponding to the preceding image (part (D) of FIG. 11) are stored. The position information indicating the overlapping region is preferably two or fewer pairs of first and second coordinates: it only needs to serve as rough information, that is, coarse information about how the images overlap, and information that is light in volume and easy to compute is preferable. Information forms other than the one described above can also be adopted as position information indicating the overlapping region. By storing the succeeding image and, in association with it, the position information indicating the overlapping region in the storage unit, the captured images constituting the image group are linked to one another, which makes managing the files of the image group more efficient. It also removes the need to measure the shooting position with a GPS or a rangefinder.
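 A storage record mirroring parts (A) to (D) of FIG. 11 could look as follows; this is a sketch only, and the field names and coordinate values are assumptions rather than a format prescribed by the disclosure. Only two coordinate pairs are kept per image, in line with the preference for two or fewer pairs stated above.

```python
# Illustrative records for images 601 and 602 in the style of FIG. 11:
# file name (A), preceding-image index (B), first coordinates in the
# preceding image (C), and the corresponding second coordinates (D).
records = [
    {"file": "filename1", "prev_index": None,
     "first_coords": None, "second_coords": None},   # image 601, no preceding image
    {"file": "filename2", "prev_index": 1,
     "first_coords": [(110, 45), (630, 52)],         # pixels in image 601
     "second_coords": [(12, 40), (530, 47)]},        # corresponding pixels in image 602
]
```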
 FIG. 12 illustrates the case where the position information of the overlapping region is stored in an EXIF (Exchangeable image file format) tag attached to the succeeding image.
 As shown in FIG. 12, the position information of the overlapping region is stored, for example, in the "MakerNote" field of the "EXIF IFD" of the EXIF tag. That is, the file name of the preceding image having the overlapping region, the pixel coordinates of the preceding image, and the coordinates of the corresponding pixels are stored in the "MakerNote" field.
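 As one possible realization, the overlap information can be serialized into the MakerNote tag with the third-party piexif library; the library choice and the JSON payload format are assumptions for illustration, since the disclosure only specifies where the information is stored.

```python
# A minimal sketch of writing the overlap position information into the EXIF
# MakerNote of the succeeding JPEG image. piexif and JSON are assumed choices.
import json
import piexif

def store_overlap_in_exif(jpeg_path, prev_file, first_coords, second_coords):
    exif_dict = piexif.load(jpeg_path)
    payload = json.dumps({"prev_file": prev_file,
                          "first_coords": first_coords,
                          "second_coords": second_coords})
    exif_dict["Exif"][piexif.ExifIFD.MakerNote] = payload.encode("ascii")
    piexif.insert(piexif.dump(exif_dict), jpeg_path)   # rewrite the JPEG in place
```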
 The spatial arrangement unit 410 spatially arranges the plurality of captured images stored in the storage unit 408 based on those images and the position information. That is, the spatial arrangement unit 410 places the captured images in space based on the position information of the overlapping regions, which serves as the rough information.
 FIG. 13 is a conceptual diagram of images 601 to 608 spatially arranged based on the position information of the overlapping regions.
 Images 601 to 608 overlap at the overlapping regions 480, 482, 484, 486, 488, 490, and 492. The storage unit 408 stores the position information of these overlapping regions in association with the respective succeeding images. Based on the position information indicating each overlapping region stored in the storage unit 408, the spatial arrangement unit 410 spatially places each captured image. Specifically, it places images 601 and 602 based on the position information indicating the overlapping region 480 of image 602, and likewise places images 602 and 603, images 603 and 604, images 604 and 605, images 605 and 606, images 606 and 607, and images 607 and 608. By placing the images one after another in this way, the spatial arrangement unit 410 can calculate the spatial arrangement of all the captured images input to the captured image input unit 402.
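 Because each record ties a succeeding image to its preceding image through a small number of coordinate pairs, the arrangement can be computed by chaining offsets, as in the following sketch; a pure-translation placement is an assumption made for illustration, and the names follow the record sketch above.

```python
# A minimal sketch of the spatial arrangement: each image is placed by chaining
# the mean offset between its stored coordinate pairs, so only the coarse
# overlap information is needed.
import numpy as np

def arrange(records):
    """Return a global top-left position for each image, keyed by list index."""
    positions = {0: np.zeros(2)}              # the first image anchors the frame
    for i, rec in enumerate(records[1:], start=1):
        prev = rec["prev_index"] - 1          # stored indices are 1-based
        # A point at first_coords in the preceding image appears at
        # second_coords here, so the image shifts by their mean difference.
        offset = (np.float32(rec["first_coords"]) -
                  np.float32(rec["second_coords"])).mean(axis=0)
        positions[i] = positions[prev] + offset
    return positions
```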
 The corresponding point detection unit 412 detects the overlapping regions of the spatially arranged captured images and detects corresponding points between the images in each detected overlapping region. That is, it detects corresponding points both in the regions that overlap according to the stored position information and in the overlapping regions that newly appear as a result of the spatial arrangement. A known method is used for detecting the corresponding points; for example, methods using local features that are robust to scaling, rotation, and illumination changes between images, such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), and AKAZE (Accelerated KAZE) features, are known.
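 When the overlap region is already known from the position information, detection can be restricted to that region with a detector mask, for example as follows; this is a sketch, AKAZE again being an assumed choice, since the disclosure only requires that a known method such as those above be used.

```python
# A minimal sketch restricting corresponding-point detection to a known
# overlap bounding box via a detector mask.
import cv2
import numpy as np

def detect_in_overlap(img, bbox):
    """Detect keypoints and descriptors only inside the box (x, y, w, h)."""
    x, y, w, h = bbox
    mask = np.zeros(img.shape[:2], dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255              # detector considers this area only
    return cv2.AKAZE_create().detectAndCompute(img, mask)
```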
 FIG. 14 shows how images 601 to 608 overlap after the spatial arrangement; it illustrates that new overlapping regions appear once the images have been spatially arranged using the position information of the overlapping regions. That is, the corresponding point detection unit 412 detects not only the overlapping region between each preceding image and its succeeding image, but also the overlapping regions between the preceding image and images other than its succeeding image, and detects corresponding points between the images in each detected overlapping region.
 Specifically, in image 602 the corresponding point detection unit 412 performs corresponding point detection not only on the overlapping region 480 with image 601 and the overlapping region 482 with image 603, but also on the overlapping region 494 with image 607 and the overlapping region 495 with image 608. More corresponding points can thus be detected than with detection between images 601 and 602 and between images 602 and 603 alone.
 Likewise, in image 601 the corresponding point detection unit 412 performs corresponding point detection not only on the overlapping region 480 with image 602 but also on the overlapping region 496 with image 608, again yielding more corresponding points than detection between images 601 and 602 alone.
 The synthesis processing unit 414 synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image. By combining a plurality of images, the synthesis processing unit 414 can generate various composite images, for example a panoramic composite image or an orthoimage. Here, an orthoimage is an image obtained by converting an aerial photograph into an image free of positional displacement and assigning correct position information to it.
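 For reference, a pairwise composition from the detected corresponding points might look as follows; the RANSAC homography and the simple overwrite blending are assumptions, and a production panorama or orthoimage pipeline would add bundle adjustment, seam blending, and projection correction.

```python
# A minimal sketch of composing one image onto a base image from matched
# points. Overwrite blending is used for brevity.
import cv2
import numpy as np

def compose_pair(base, img, pts_img, pts_base):
    # Homography mapping img into the base frame; RANSAC rejects outliers
    H, _ = cv2.findHomography(pts_img, pts_base, cv2.RANSAC, 5.0)
    h, w = base.shape[:2]
    canvas = cv2.warpPerspective(img, H, (2 * w, 2 * h))
    mask = base > 0
    canvas[:h, :w][mask] = base[mask]         # keep base pixels where present
    return canvas
```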
 FIG. 15 is a flowchart showing the operation of the image processing apparatus 400 of this embodiment and the image processing method.
 First, the captured image input unit 402 receives the plurality of captured images, in time-series order, to be used for synthesis (step S10). For example, a group of images captured by the robot apparatus 100 is input to the captured image input unit 402, implemented by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300.
 Next, the overlapping area measurement unit 404, implemented for example by the CPU 310 of the computer 300, measures the overlapping region of each image (step S11). The captured images input by the captured image input unit 402 and the position information measured by the overlapping area measurement unit 404 are then stored, by the storage control unit 406 implemented by the CPU 310, in the storage unit 408 implemented by the main memory 314 of the computer 300 (step S12).
 Next, the spatial arrangement unit 410, implemented by the CPU 310 of the computer 300, spatially arranges the captured images based on the position information of the overlapping regions, which serves as the rough information (step S13). The corresponding point detection unit 412, implemented by the CPU 310, then detects corresponding points in the overlapping regions (step S14). After the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping regions between preceding and succeeding images, but also in the overlapping regions with images that are not adjacent in time series (for example, the overlapping regions 494 to 496 shown in FIG. 14).
 Finally, the synthesis processing unit 414, implemented by the CPU 310 of the computer 300, generates a composite image based on the detected corresponding points (step S15).
 In the above embodiment, the hardware structure of the processing units that execute the various kinds of processing is any of the following processors: a CPU, which is a general-purpose processor that executes software (a program) to function as the various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
 One processing unit may be configured by one of these various processors, or by two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, one processor may be configured by a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor functions as the plurality of processing units. As a second example, as typified by a system on chip (SoC), a processor may be used that realizes the functions of an entire system including the plurality of processing units with a single IC (Integrated Circuit) chip. In this way, the various processing units are configured, as a hardware structure, using one or more of the various processors described above.
 More specifically, the hardware structure of these various processors is electric circuitry combining circuit elements such as semiconductor elements.
 Each of the configurations and functions described above can be realized as appropriate by any hardware, software, or a combination of both. For example, the present invention can also be applied to a program that causes a computer to execute the processing steps (processing procedure) described above, to a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or to a computer on which such a program can be installed.
 <Second Embodiment>
 Next, a second embodiment will be described. In this embodiment, a moving image is input to the captured image input unit 402.
 FIG. 16 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the second embodiment. Parts already described with reference to FIG. 8 are given the same reference numerals and their description is omitted.
 The image processing apparatus 400 of this embodiment includes a captured image input unit 402, a first image cutting unit 421, an overlapping area measurement unit 404, a second image cutting unit 423, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
 A moving image is input to the captured image input unit 402. That is, a plurality of captured images in time-series order to be used for image synthesis are input sequentially, and the first and second images adjacent in time-series order among the input images share an overlapping region; the input moving image is composed of such captured images.
 The first image cutting unit 421 cuts out the first image from the plurality of captured images input to the captured image input unit 402. That is, in this embodiment a moving image is input to the captured image input unit 402, and one of the frame images constituting the input moving image is cut out as the preceding image.
 The overlapping area measurement unit 404 measures the overlapping region between the cut-out first image and the input captured images. That is, it measures, for each frame image of the input moving image, the overlapping region with the cut-out preceding image. For example, the overlapping area measurement unit 404 may measure the overlapping region for the frame images that follow the cut-out preceding image in time-series order.
 The second image cutting unit 423 cuts out the second image from the plurality of captured images when the measurement result of the overlapping area measurement unit 404 reaches a preset value. That is, when the overlapping region measured by the overlapping area measurement unit 404 reaches the preset value, the second image cutting unit 423 cuts out that frame image as the second image. For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the second image cutting unit 423 cuts out, as the second image, the frame image for which the measured area reaches the preset value.
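 The frame selection described here can be sketched as follows, reusing measure_overlap from the sketch in the first embodiment; reading the preset value as the overlap area falling to a fraction of the frame (0.3 here) is an interpretive assumption.

```python
# A minimal sketch of cutting first/second images from a moving image: keep
# the first frame, then cut each frame whose overlap with the last kept frame
# has fallen to the preset fraction. Reuses measure_overlap from above.
import cv2

def select_frames(video_path, overlap_target=0.3):
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()                    # the first image cut from the video
    if not ok:
        return []
    kept = [first]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        _, _, (x, y, w, h) = measure_overlap(kept[-1], frame)
        if w * h <= overlap_target * frame.shape[0] * frame.shape[1]:
            kept.append(frame)                # overlap reached the preset value
    cap.release()
    return kept
```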
 The storage control unit 406 stores the input moving image and the position information of the overlapping regions measured by the overlapping area measurement unit 404 in the storage unit 408. Alternatively, the storage control unit 406 may store the preceding image, the succeeding image, and the position information of the overlapping region in the storage unit 408.
 FIG. 17 is a flowchart showing the operation of the image processing apparatus 400 of this embodiment and the image processing method.
 First, the captured image input unit 402 receives the plurality of captured images, in time-series order, to be used for synthesis (step S20). For example, an image group (a moving image) captured by the robot apparatus 100 is input to the captured image input unit 402, implemented by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300. In this embodiment, a moving image is input to the captured image input unit 402.
 Then, the first image cutting unit 421, implemented for example by the CPU 310 of the computer 300, cuts out the preceding image from the frame images constituting the moving image (step S21).
 Next, the overlapping area measurement unit 404, implemented for example by the CPU 310, measures the overlapping region between the frame images constituting the moving image and the preceding image (step S22). When the measurement result of the overlapping area measurement unit 404 reaches the threshold, the second image cutting unit 423, implemented for example by the CPU 310, cuts out from the moving image the frame image that becomes the succeeding image (step S23).
 Thereafter, the moving image input by the captured image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored, by the storage control unit 406 implemented by the CPU 310 of the computer 300, in the storage unit 408 implemented by the main memory 314 of the computer 300 (step S24).
 Next, the spatial arrangement unit 410, implemented by the CPU 310 of the computer 300, spatially arranges the captured images based on the position information of the overlapping regions, which serves as the rough information (step S25). The corresponding point detection unit 412, implemented by the CPU 310, then detects corresponding points in the overlapping regions (step S26). After the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping regions between preceding and succeeding images, but also in the overlapping regions with images that are not adjacent in time series (for example, the overlapping regions 494 and 496 shown in FIG. 14).
 Finally, the synthesis processing unit 414, implemented by the CPU 310 of the computer 300, generates a composite image based on the detected corresponding points (step S27).
 <Third Embodiment>
 Next, a third embodiment will be described. In this embodiment, a group of still images is input to the captured image input unit 402, and a preceding image and a succeeding image are extracted from the image group and stored in the storage unit 408.
 FIG. 18 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the third embodiment. Parts already described with reference to FIG. 8 are given the same reference numerals and their description is omitted.
 The image processing apparatus 400 of this embodiment includes a captured image input unit 402, an overlapping area measurement unit 404, an image extraction unit 431, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
 A group of still images is input to the captured image input unit 402. That is, the captured image input unit 402 sequentially receives a plurality of captured images, in time-series order, to be used for image synthesis, in which the first image and the second image adjacent in time-series order share an overlapping region.
 The image extraction unit 431 extracts the preceding image and the succeeding image when the measurement result of the overlapping area measurement unit 404 reaches a preset value. That is, when the measurement result reaches the preset value, the image extraction unit 431 extracts the preceding and succeeding images from the group of still images input via the captured image input unit 402. For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the preceding and succeeding images are extracted when the measured area reaches the preset value.
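 Written as a generator over the still-image group, the extraction might look like this, again reusing measure_overlap and the same interpretive assumption about the preset value.

```python
# A minimal sketch of the extraction step: whenever the overlap with the
# current first image falls to the preset fraction, yield the pair and advance.
def extract_pairs(images, overlap_target=0.3):
    first = images[0]
    for candidate in images[1:]:
        pts1, pts2, (x, y, w, h) = measure_overlap(first, candidate)
        if w * h <= overlap_target * candidate.shape[0] * candidate.shape[1]:
            yield first, candidate, (pts1, pts2)   # extracted first/second pair
            first = candidate                      # the second image becomes the new first
```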
 The storage control unit 406 stores the extracted preceding and succeeding images in the storage unit 408, and stores in the storage unit 408 the position information indicating the measured overlapping region in association with the succeeding image.
 FIG. 19 is a flowchart showing the operation of the image processing apparatus 400 of this embodiment and the image processing method.
 First, the captured image input unit 402 receives the plurality of captured images (a group of still images), in time-series order, to be used for synthesis (step S30). For example, a group of images captured by the robot apparatus 100 is input to the captured image input unit 402, implemented by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300.
 Next, the overlapping area measurement unit 404, implemented for example by the CPU 310 of the computer 300, measures the overlapping region of each image (step S31).
 Then, when the measurement result of the overlapping area measurement unit 404 reaches the preset value, the image extraction unit 431, implemented for example by the CPU 310 of the computer 300, extracts the preceding and succeeding images (step S32).
 Thereafter, the captured images input by the captured image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored, by the storage control unit 406 implemented by the CPU 310 of the computer 300, in the storage unit 408 implemented by the main memory 314 of the computer 300 (step S33).
 Next, the spatial arrangement unit 410, implemented by the CPU 310 of the computer 300, spatially arranges the captured images based on the position information of the overlapping regions, which serves as the rough information (step S34). The corresponding point detection unit 412, implemented by the CPU 310, then detects corresponding points in the overlapping regions (step S35). After the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping regions between preceding and succeeding images, but also in the overlapping regions with images that are not adjacent in time series (for example, the overlapping regions 494 and 496 shown in FIG. 14).
 Finally, the synthesis processing unit 414, implemented by the CPU 310 of the computer 300, generates a composite image based on the detected corresponding points (step S36).
 <Fourth Embodiment>
 Next, a fourth embodiment will be described. In this embodiment, the first image and the second image are automatically acquired by the captured image acquisition unit 401.
 FIG. 20 is a diagram illustrating a functional configuration example of the camera device 403 according to the fourth embodiment. Parts already described with reference to FIG. 8 are given the same reference numerals and their description is omitted.
 The camera device 403 includes a captured image acquisition unit 401, an overlapping area measurement unit 404, an imaging control unit 441, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414. The camera device 403 is provided, for example, in the robot apparatus 100: the captured image acquisition unit 401 is realized, for example, by the imaging device 200 of the robot apparatus 100, and the overlapping area measurement unit 404, the imaging control unit 441, the storage control unit 406, the storage unit 408, the spatial arrangement unit 410, the corresponding point detection unit 412, and the synthesis processing unit 414 are realized, for example, by the imaging control unit 204 of the robot apparatus 100.
 The captured image acquisition unit 401 sequentially acquires a plurality of captured images, in time-series order, to be used for image synthesis; among these images, the preceding and succeeding images adjacent in time-series order share an overlapping region.
 The overlapping area measurement unit 404 measures the overlapping region between the preceding image and the moving image captured by the captured image acquisition unit 401; for example, it measures the overlapping region between the acquired preceding image and the live view image obtained by the camera.
 The imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image when the measurement result of the overlapping area measurement unit 404 reaches a preset value. That is, the overlapping region between the preceding image and the live view image is monitored, and the succeeding image is acquired automatically when the overlapping region reaches the preset value. For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and when the measured area reaches the preset value, the imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image.
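 The trigger condition monitored by the imaging control unit 441 reduces to a small predicate, sketched below; the fractional threshold and the reading of the preset value as a shrinking overlap area are illustrative assumptions.

```python
# A minimal sketch of the capture trigger: fire when the overlap bounding box
# measured against the live view frame has shrunk to the preset fraction.
def should_capture(bbox, frame_shape, overlap_target=0.3):
    x, y, w, h = bbox
    return w * h <= overlap_target * (frame_shape[0] * frame_shape[1])
```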
 The examples of the present invention have been described above; however, the present invention is not limited to the embodiments described above, and it goes without saying that various modifications are possible without departing from the spirit of the present invention.
DESCRIPTION OF SYMBOLS
1    bridge
2    main girder
3    cross beam
4    sway bracing
5    lateral bracing
6    floor slab
100  robot apparatus
102  main frame
104  vertical extension arm
104A camera mounting section
106  housing
108  X-direction drive unit
108A ball screw
108B ball nut
108C motor
110  Y-direction drive unit
110A tire
110B tire
112  Z-direction drive unit
120  pan-tilt mechanism
130  robot control unit
200  imaging device
202  camera
202A imaging unit
204  imaging control unit
206  pan-tilt drive unit
210  pan-tilt control unit
230  robot-side communication unit
300  computer
302  touch panel display
304  speaker
305  built-in camera
306  external connection terminal
310  CPU
312  system bus
314  main memory
316  non-volatile memory
318  mobile communication unit
318A antenna
320  wireless LAN communication unit
320A antenna
322  short-range wireless communication unit
322A antenna
324  wired communication unit
326  display unit
328  input unit
330  key input unit
332  audio processing unit
334  image processing unit
400  image processing apparatus
401  captured image acquisition unit
402  captured image input unit
403  camera device
404  overlapping area measurement unit
406  storage control unit
408  storage unit
410  spatial arrangement unit
412  corresponding point detection unit
414  synthesis processing unit
421  first image cutting unit
423  second image cutting unit
431  image extraction unit
441  imaging control unit
500  imaging system
S10-S15 steps of the image processing method of the first embodiment
S20-S27 steps of the image processing method of the second embodiment
S30-S36 steps of the image processing method of the third embodiment

Claims (12)

  1. An image processing apparatus comprising:
     a captured image input unit that sequentially inputs a plurality of captured images in time-series order to be used for image synthesis, wherein a first image and a second image that are adjacent in the time-series order among the plurality of captured images have an overlapping region with each other;
     an overlapping area measurement unit that measures the overlapping region between the input first image and the input second image;
     a storage control unit that stores the input plurality of captured images in a storage unit, and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image;
     a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a corresponding point detection unit that detects the overlapping regions of the spatially arranged plurality of captured images and detects corresponding points between the images of the detected overlapping regions; and
     a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.
  2. An image processing apparatus comprising:
     a captured image input unit that sequentially inputs a plurality of captured images in time-series order to be used for image synthesis, wherein a first image and a second image that are adjacent in the time-series order among the plurality of captured images have an overlapping region with each other;
     a first image cutting unit that cuts out the first image from the input plurality of captured images;
     an overlapping area measurement unit that measures the overlapping region between the cut-out first image and the input plurality of captured images;
     a second image cutting unit that cuts out the second image from the plurality of captured images when a measurement result of the overlapping area measurement unit reaches a preset value;
     a storage control unit that stores the input plurality of captured images in a storage unit, and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image;
     a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a corresponding point detection unit that detects the overlapping regions of the spatially arranged plurality of captured images and detects corresponding points between the images of the detected overlapping regions; and
     a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.
  3. An image processing apparatus comprising:
     a captured image input unit that sequentially inputs a plurality of captured images in time-series order to be used for image synthesis, wherein a first image and a second image that are adjacent in the time-series order among the plurality of captured images have an overlapping region with each other;
     an overlapping area measurement unit that measures the overlapping region between the input first image and the input second image;
     an image extraction unit that extracts the first image and the second image when a measurement result of the overlapping area measurement unit reaches a preset value;
     a storage control unit that stores the extracted first image and second image in a storage unit, and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image;
     a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a corresponding point detection unit that detects the overlapping regions of the spatially arranged plurality of captured images and detects corresponding points between the images of the detected overlapping regions; and
     a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.
  4. The image processing apparatus according to any one of claims 1 to 3, wherein the synthesis processing unit generates an orthoimage by the synthesis.
  5. The image processing apparatus according to any one of claims 1 to 4, wherein the storage control unit records the position information in an EXIF tag attached to the second image.
  6. The image processing apparatus according to any one of claims 1 to 5, wherein the corresponding point detection unit also detects the overlapping region between the first image and an image other than the second image, and detects corresponding points between the images of the detected overlapping region.
  7. The image processing apparatus according to any one of claims 1 to 6, wherein the position information indicating the overlapping region is first coordinates of the first image and second coordinates of the second image corresponding to the first image.
  8. The image processing apparatus according to claim 7, wherein the position information indicating the overlapping region is two or fewer pairs of the first coordinates and the second coordinates.
  9. A camera device comprising:
     a captured image acquisition unit that sequentially acquires a plurality of captured images in time-series order to be used for image synthesis, wherein a first image and a second image that are adjacent in the time-series order among the plurality of captured images have an overlapping region with each other;
     an overlapping area measurement unit that measures an overlapping region between the first image and a moving image captured by the captured image acquisition unit;
     an imaging control unit that causes the captured image acquisition unit to acquire the second image when a measurement result of the overlapping area measurement unit reaches a preset value;
     a storage control unit that stores the acquired plurality of captured images in a storage unit, and stores, in the storage unit, position information indicating the measured overlapping region in association with the second image;
     a spatial arrangement unit that spatially arranges the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a corresponding point detection unit that detects the overlapping regions of the spatially arranged plurality of captured images and detects corresponding points between the images of the detected overlapping regions; and
     a synthesis processing unit that synthesizes the plurality of captured images based on the detected corresponding points to generate a composite image.
  10. An image processing method comprising:
     a step of sequentially inputting a plurality of captured images in time-series order to be used for image synthesis, wherein a first image and a second image that are adjacent in the time-series order among the plurality of captured images have an overlapping region with each other;
     a step of measuring the overlapping region between the input first image and the input second image;
     a step of storing the input plurality of captured images in a storage unit and storing, in the storage unit, position information indicating the measured overlapping region in association with the second image;
     a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a step of detecting the overlapping regions of the spatially arranged plurality of captured images and detecting corresponding points between the images of the detected overlapping regions; and
     a step of synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.
  11.  An image processing method comprising:
     a step of sequentially inputting a plurality of captured images in time-series order to be used for image combination, wherein, among the plurality of captured images, a first image and a second image that are consecutive in the time-series order have an overlapping area with each other;
     a step of cutting out the first image from the input plurality of captured images;
     a step of measuring the overlapping area between the cut-out first image and the input plurality of captured images;
     a step of cutting out the second image from the plurality of captured images when the measurement result in the step of measuring the overlapping area reaches a predetermined value;
     a step of storing the input plurality of captured images in a storage unit and storing, in the storage unit, position information indicating the measured overlapping area in association with the second image;
     a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a step of detecting the overlapping areas of the spatially arranged plurality of captured images and detecting corresponding points between the images in the detected overlapping areas; and
     a step of combining the plurality of captured images based on the detected corresponding points to generate a composite image.
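Claim 11 differs in that the overlap is measured between an already cut-out first image and every newly input frame, not between a fixed consecutive pair. One assumed way to prototype that measurement is normalized cross-correlation on an edge strip of the first image; where the strip lands in the current frame implies how much of the first image is still in view. The 0.5 score gate and the strip width are illustrative choices, and the sketch presumes a roughly horizontal camera sweep.

```python
import cv2


def measure_overlap(first_gray, frame_gray, strip=64):
    """Find the right-edge strip of the cut-out first image inside the
    current frame; its matched x-position implies the overlap fraction."""
    template = first_gray[:, -strip:]  # right-edge strip of the first image
    result = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
    _min_val, score, _min_loc, (x, y) = cv2.minMaxLoc(result)
    if score < 0.5:  # the strip is no longer reliably visible
        return 0.0, None
    w = frame_gray.shape[1]
    overlap = (x + strip) / w  # frame columns shared with the first image
    return overlap, (x, y)
```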
  12.  An image processing method comprising:
     a step of sequentially inputting a plurality of captured images in time-series order to be used for image combination, wherein, among the plurality of captured images, a first image and a second image that are consecutive in the time-series order have an overlapping area with each other;
     a step of measuring the overlapping area between the input first image and the input second image;
     a step of extracting the first image and the second image when the measurement result of the step of measuring the overlapping area reaches a predetermined value;
     a step of storing the extracted first image and second image in a storage unit and storing, in the storage unit, position information indicating the measured overlapping area in association with the second image;
     a step of spatially arranging the plurality of captured images based on the plurality of captured images and the position information stored in the storage unit;
     a step of detecting the overlapping areas of the spatially arranged plurality of captured images and detecting corresponding points between the images in the detected overlapping areas; and
     a step of combining the plurality of captured images based on the detected corresponding points to generate a composite image.
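Claim 12's variant extracts and stores only the first and second images together with the position information, rather than every input frame, which keeps the recording proportional to the number of keyframes instead of the frame rate. A minimal sketch of such storage follows; the file layout and field names are assumptions, not anything the publication specifies.

```python
import json
from pathlib import Path

import cv2


def store_extracted(out_dir, index, image, position):
    """Persist one extracted image and, associated with it, the position
    information (the measured overlap offset) as a JSON sidecar file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    stem = f"img_{index:04d}"
    cv2.imwrite(str(out / f"{stem}.png"), image)
    (out / f"{stem}.json").write_text(
        json.dumps({"dx": position[0], "dy": position[1]}))
```

The later spatial-arrangement step can then reload each image alongside its sidecar and place it at the accumulated offset before corresponding-point refinement.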
PCT/JP2018/008125 2017-03-28 2018-03-02 Image processing device, camera device, and image processing method WO2018180214A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019509066A JP6712358B2 (en) 2017-03-28 2018-03-02 Image processing device, camera device, and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017063546 2017-03-28
JP2017-063546 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018180214A1 (en) 2018-10-04

Family

ID=63675340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/008125 WO2018180214A1 (en) 2017-03-28 2018-03-02 Image processing device, camera device, and image processing method

Country Status (2)

Country Link
JP (1) JP6712358B2 (en)
WO (1) WO2018180214A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023127313A1 (en) * 2021-12-28 2023-07-06 富士フイルム株式会社 Image capture supporting device, image capture supporting method, and program
WO2023135910A1 (en) * 2022-01-17 2023-07-20 富士フイルム株式会社 Image-capturing device, image-capturing method, and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09322055A (en) * 1996-05-28 1997-12-12 Canon Inc Electronic camera system
JPH1188811A (en) * 1997-09-04 1999-03-30 Sony Corp Camcorder and photographing method
JP2000134537A (en) * 1998-10-28 2000-05-12 Ricoh Co Ltd Image input device and its method
JP2000292166A (en) * 1999-04-07 2000-10-20 Topcon Corp Image forming device
WO2008087721A1 (en) * 2007-01-18 2008-07-24 Fujitsu Limited Image synthesizer, image synthesizing method, and program
JP2008288798A (en) * 2007-05-16 2008-11-27 Nikon Corp Imaging apparatus

Also Published As

Publication number Publication date
JP6712358B2 (en) 2020-06-17
JPWO2018180214A1 (en) 2019-12-12

Similar Documents

Publication Publication Date Title
JP6560366B2 (en) Structure identifying device and method for structure
US10356301B2 (en) Imaging system, angle-of-view adjustment method, and angle-of-view adjustment program
JP6712330B2 (en) Imaging control device, imaging control method and program
JP6733267B2 (en) Information processing program, information processing method, and information processing apparatus
US11100671B2 (en) Image generation apparatus, image generation system, image generation method, and image generation program
US10951821B2 (en) Imaging control device, imaging system, and imaging control method
JP5663352B2 (en) Image processing apparatus, image processing method, and image processing program
JP6507268B2 (en) Photography support apparatus and photography support method
US11991477B2 (en) Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium
WO2021168804A1 (en) Image processing method, image processing apparatus and image processing system
JP4680033B2 (en) Monitoring system and monitoring device
WO2018180214A1 (en) Image processing device, camera device, and image processing method
JP2020037475A (en) Construction machine control system, construction machine control method, and program
CN114616820B (en) Image pickup support device, image pickup system, image pickup support method, and storage medium
WO2020162264A1 (en) Photographing system, photographing spot setting device, photographing device, and photographing method
JP6779368B2 (en) Image processing equipment, image processing methods, and programs
JP6770826B2 (en) Automatic collimation method and automatic collimation device for measuring the placement position of structures
CN114821544A (en) Perception information generation method and device, vehicle, electronic equipment and storage medium
JP2021155179A (en) Photographing system and program for crane
JP6715340B2 (en) Imaging plan generation device, imaging plan generation method, and program
JP2021103410A (en) Mobile body and imaging system
CN113646606A (en) Control method, control equipment, unmanned aerial vehicle and storage medium
KR20130055447A (en) Device and method of generating mosaic images
WO2023135910A1 (en) Image-capturing device, image-capturing method, and program
US20240155223A1 (en) Imaging control device, imaging system, imaging control method, and imaging control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18777813; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2019509066; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18777813; Country of ref document: EP; Kind code of ref document: A1