WO2018180214A1 - Image processing device, camera device, and image processing method

Image processing device, camera device, and image processing method

Info

Publication number
WO2018180214A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
captured images
images
overlapping
Prior art date
Application number
PCT/JP2018/008125
Other languages
English (en)
Japanese (ja)
Inventor
浩明 菊池
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 (FUJIFILM Corporation)
Priority to JP2019509066A (patent JP6712358B2)
Publication of WO2018180214A1

Classifications

    • G PHYSICS
        • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
            • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
                • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
                • G03B 37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 5/00 Image enhancement or restoration
                    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N 23/60 Control of cameras or camera modules
                • H04N 5/00 Details of television systems
                    • H04N 5/76 Television signal recording
                        • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
                        • H04N 5/77 Interface circuits between a recording apparatus and a television camera
                        • H04N 5/91 Television signal processing therefor
                            • H04N 5/915 Television signal processing for field- or frame-skip recording or reproducing
                            • H04N 5/92 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback

Definitions

  • the present invention relates to an image processing device, a camera device, and an image processing method, and more particularly, to an image processing device, a camera device, and an image processing method in a case where a composite image is generated by synthesizing images having continuous and overlapping regions.
  • Techniques for the panoramic synthesis of a plurality of continuously shot images have been proposed. To synthesize panoramic images efficiently, techniques have been used in which the photographing position at which each captured image is acquired is measured by a GPS (Global Positioning System) or a rangefinder.
  • Patent Document 1 describes a camera with a gyro sensor intended to capture images for panoramic synthesis: the movement of the camera is detected by the gyro sensor, and an indicator shows the user the correct shooting position of the next scene.
  • Patent Document 2 describes a technique for the panoramic synthesis of a plurality of captured images. The similarity at the synthesis position of the Nth and (N+1)th images is evaluated based on spatial frequency or the number of feature points, and when the movement amount between the Nth image and the (N+1)th image cannot be calculated, panorama composition is performed using a movement amount calculated from two consecutive images before the Nth image or after the (N+1)th image.
  • Here, if information indicating the overlapping region between captured images is stored, the overlap of an image group composed of a plurality of captured images can be calculated spatially. However, Patent Documents 1 and 2 do not mention storing information indicating an overlapping area between captured images.
  • The present invention has been made in view of such circumstances, and its object is to provide an image processing device, a camera device, and an image processing method that can spatially arrange a plurality of photographed images based on position information indicating the overlapping regions, without measuring the photographing position with sensors, and that can efficiently store the plurality of captured images while suppressing the time and calculation cost required for detecting corresponding points during the synthesis processing.
  • To achieve the above object, an image processing apparatus according to one aspect of the present invention comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping region; an overlapping region measurement unit that measures the overlapping region between the input first image and second image; a storage control unit that stores the plurality of input captured images in a storage unit and causes the storage unit to store position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the spatially arranged captured images and detects corresponding points between the images in the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • According to this aspect, the overlapping region between the input first image and second image is measured by the overlapping region measurement unit, and position information indicating the overlapping region is stored by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of photographed images input to the photographed image input unit is calculated without measuring the photographing position with sensors. The corresponding point detection unit then detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Since the corresponding point detection is thus performed only on limited areas, the time and calculation cost required for detecting corresponding points can be suppressed. Further, since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • An image processing apparatus according to another aspect comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping region; a first image cutout unit that cuts out the first image from the plurality of input captured images; an overlapping region measurement unit that measures the overlapping region between the cut-out first image and the plurality of input captured images; a second image cutout unit that cuts out the second image from the plurality of captured images when the measurement result of the overlapping region measurement unit reaches a predetermined value; a storage control unit that stores the plurality of input captured images in a storage unit and causes the storage unit to store position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the spatially arranged captured images and detects corresponding points between the images in the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • According to this aspect, the overlapping region between the cut-out first image and the plurality of input captured images is measured by the overlapping region measurement unit, and position information indicating the overlapping region is stored in the storage unit by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of photographed images input to the photographed image input unit is calculated without measuring the photographing position with sensors, and the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement. Since the corresponding point detection is performed only on limited areas, the time and calculation cost required for detecting corresponding points can be suppressed, and since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.
  • An image processing apparatus according to still another aspect comprises: a captured image input unit that sequentially inputs, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping region; an overlapping region measurement unit that measures the overlapping region between the input first image and second image; an image extraction unit that extracts the first image and the second image when the measurement result of the overlapping region measurement unit reaches a predetermined value; a storage control unit that stores the extracted first image and second image in a storage unit and causes the storage unit to store position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the spatially arranged captured images and detects corresponding points between the images in the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • According to this aspect, the overlapping region between the input first image and second image is measured by the overlapping region measurement unit, and position information indicating the measured overlapping region is stored in the storage unit by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of photographed images input to the photographed image input unit is calculated without measuring the photographing position with sensors, and the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement. Since the corresponding point detection is performed only on limited areas, the time and calculation cost required for detecting corresponding points can be suppressed, and since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.
  • Preferably, the synthesis processing unit generates an ortho image by the synthesis. In this case, the time and calculation cost for detecting corresponding points in the overlapping regions during generation of the ortho image can be suppressed.
  • Preferably, the storage control unit records the position information in an EXIF tag attached to the second image. Since the position information is recorded in the EXIF tag attached to the second image, the position information can be stored efficiently in association with the second image.
  • Preferably, the corresponding point detection unit also detects overlapping areas of the first image with images other than the second image and detects corresponding points between the images in the detected overlapping areas. Since the synthesis is then performed with more corresponding points than a synthesis using only the corresponding points of the overlapping region between the first image and the second image, a more accurate synthesis is possible.
  • Preferably, the position information indicating the overlapping area consists of first coordinates in the first image and corresponding second coordinates in the second image. In this case, more accurate position information on the overlapping area can be stored in the storage unit.
  • Preferably, the position information indicating the overlapping region is two or fewer pairs of first and second coordinates. This keeps the amount of position information small and makes effective use of the storage capacity of the storage unit.
  • A camera apparatus according to another aspect comprises: a captured image acquisition unit that sequentially acquires, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping region; an overlapping region measurement unit that measures the overlapping region between the first image and the moving image captured by the captured image acquisition unit; a shooting control unit that causes the captured image acquisition unit to acquire the second image; a storage control unit that stores the acquired captured images in a storage unit and causes the storage unit to store position information indicating the measured overlapping region in association with the second image; a spatial arrangement unit that spatially arranges the plurality of captured images based on the captured images and the position information stored in the storage unit; a corresponding point detection unit that detects overlapping regions of the spatially arranged captured images and detects corresponding points between the images in the detected overlapping regions; and a synthesis processing unit that generates a composite image by synthesizing the plurality of captured images based on the detected corresponding points.
  • According to this aspect, the overlapping region between the first image and the moving image captured by the captured image acquisition unit is measured by the overlapping region measurement unit, and position information indicating the measured overlapping region is stored in the storage unit by the storage control unit in association with the second image. Thereby, the spatial arrangement of the plurality of photographed images is calculated without measuring the photographing position with sensors, and the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement. Since the corresponding point detection is performed only on limited areas, the time and calculation cost required for detecting corresponding points can be suppressed, and since the position information indicating the overlapping regions is also stored in the storage unit, the captured images can be stored efficiently.
  • An image processing method according to another aspect comprises the steps of: sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping area; measuring the overlapping area between the input first image and second image; storing the plurality of input captured images in a storage unit and storing in the storage unit position information indicating the measured overlapping area in association with the second image; spatially arranging the plurality of captured images based on the captured images and the position information stored in the storage unit; detecting overlapping regions of the spatially arranged captured images; detecting corresponding points between the images in the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.
  • An image processing method according to another aspect comprises the steps of: sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping area; cutting out the first image from the plurality of input captured images; measuring the overlapping area between the cut-out first image and the plurality of input captured images; cutting out the second image when the measurement result reaches a predetermined value; storing the captured images and position information indicating the measured overlapping area in a storage unit; spatially arranging the plurality of captured images based on the stored captured images and position information; detecting overlapping regions of the spatially arranged captured images; detecting corresponding points between the images in the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.
  • An image processing method according to still another aspect comprises the steps of: sequentially inputting, in time-series order, a plurality of captured images used for image synthesis, in which a first image and a second image adjacent in the time series have an overlapping area; measuring the overlapping area between the input first image and second image; extracting the first image and the second image when the measurement result has reached a predetermined value; storing the extracted first image and second image in a storage unit and storing in the storage unit position information indicating the overlapping area measured in association with the second image; spatially arranging the plurality of captured images based on the stored captured images and position information; detecting overlapping regions of the spatially arranged captured images; detecting corresponding points between the images in the detected overlapping regions; and synthesizing the plurality of captured images based on the detected corresponding points to generate a composite image.
  • According to the present invention, the overlapping area between the input first image and second image is measured by the overlapping area measurement unit, and position information indicating the overlapping area is stored by the storage control unit in association with the second image. The spatial arrangement of the plurality of photographed images input to the photographed image input unit is therefore calculated without measuring the photographing position with sensors, and the corresponding point detection unit detects corresponding points between the images in the overlapping regions based on the calculated spatial arrangement of the plurality of captured images. Since the corresponding point detection is performed only on limited areas, the time and calculation cost required for detecting corresponding points can be suppressed, and since the position information indicating the overlapping areas is also stored in the storage unit, the captured images can be stored efficiently in the storage unit.
  • FIG. 1 is a perspective view showing the structure of a bridge, one of the structures to be photographed, as seen from below. FIG. 2 is a conceptual diagram showing the photographing system, and FIGS. 3 to 7 show the robot apparatus, the camera, and their functional and system configurations. The remaining figures alternate between diagrams showing functional configuration examples of the image processing apparatus and flowcharts illustrating the operation of the image processing apparatus and the image processing method.
  • FIG. 1 is a perspective view showing the structure of a bridge, which is one of the structures to be photographed, and is a perspective view of the bridge as viewed from below.
  • The object to be photographed in the present invention is not particularly limited. In the embodiments described below, the present invention is applied to the synthesis of photographed images used for a damage detection inspection of a structure: the floor slab 6 is photographed in divided sections, and the captured images obtained by the divided photographing are combined.
  • The bridge 1 shown in FIG. 1 has main girders 2, cross girders 3, anti-tilt frames 4, and horizontal frames 5, which are connected by bolts, rivets, or welding.
  • A floor slab 6 on which vehicles and the like travel is provided on the upper portion of the main girders 2.
  • the floor slab 6 is generally made of reinforced concrete.
  • The main girder 2 is a member that spans between abutments or piers and supports the load of vehicles and the like on the floor slab 6. The cross girder 3 is a member that connects the main girders 2 so that the load is supported by a plurality of main girders 2. The anti-tilt frames 4 and the horizontal frames 5 are members that connect the main girders 2 to each other to resist lateral loads from wind and earthquakes, respectively.
  • FIG. 2 is a conceptual diagram showing a photographing system.
  • the imaging system 500 includes a robot apparatus 100 and a computer 300.
  • the robot apparatus 100 includes an imaging apparatus 200, and moves and captures images with the imaging apparatus 200 under the control of the computer 300.
  • the robot apparatus 100 is not particularly limited as long as it can move and shoot under the control of the computer 300.
  • Other examples of the robot apparatus 100 include an apparatus called a traveling robot, a small helicopter, a multicopter, a drone, or UAV (Unmanned Aerial Vehicles).
  • the computer 300 and the robot apparatus 100 can communicate with each other, and the computer 300 can remotely control the movement of the robot apparatus 100 and the imaging control of the imaging apparatus 200.
  • FIG. 3 is a perspective view showing an appearance of the robot apparatus 100 including the camera which is an embodiment of the imaging apparatus 200, and shows a state where the robot apparatus 100 is installed between the main girders 2 of the bridge 1.
  • FIG. 4 is a cross-sectional view of a main part of the robot apparatus 100 shown in FIG.
  • The robot apparatus 100 includes the imaging apparatus 200, controls the position (imaging position) of the imaging apparatus 200 in three-dimensional space and the imaging direction of the imaging apparatus 200, and photographs an arbitrary inspection member of the bridge 1, which is composed of a plurality of members, when the bridge 1 is inspected.
  • The robot apparatus 100 includes a main frame 102, a vertical extension arm 104, a housing 106 in which a drive unit and various control units for the vertical extension arm 104 are disposed, an X-direction drive unit 108 that moves the housing 106 in the longitudinal direction of the main frame 102 (the direction orthogonal to the longitudinal direction of the main girder 2; the X direction), a Y-direction drive unit 110 (FIG. 6) that moves the robot apparatus in the longitudinal direction of the main girder 2 (the Y direction), and a Z-direction drive unit 112 (FIG. 6) that expands and contracts the vertical extension arm 104 in the vertical direction (the Z direction).
  • the X-direction drive unit 108 includes a ball screw 108A disposed in the longitudinal direction (X direction) of the main frame 102, a ball nut 108B disposed in the housing 106, and a motor 108C that rotates the ball screw 108A.
  • the casing 106 is moved in the X direction by rotating the ball screw 108A forward or backward by the motor 108C.
  • the Y-direction drive unit 110 includes tires 110A and 110B disposed at both ends of the main frame 102, and motors (not shown) disposed in the tires 110A and 110B. By driving the motor, the entire robot apparatus 100 is moved in the Y direction.
  • The robot apparatus 100 is installed with the tires 110A and 110B at both ends of the main frame 102 placed on the lower flanges of two main girders 2, sandwiching the main girders 2 between them. Thereby, the robot apparatus 100 can travel along the main girders 2 while suspended from their lower flanges.
  • the main frame 102 is configured such that its length can be adjusted according to the interval between the main girders 2.
  • the vertical extension arm 104 is disposed in the housing 106 of the robot apparatus 100 and moves in the X direction and the Y direction together with the housing 106. Further, the vertical extension arm 104 is expanded and contracted in the Z direction by a Z direction driving unit 112 (FIG. 6) provided in the housing 106.
  • A camera installation unit 104A is provided at the tip of the vertical extension arm 104, and a camera 202 that can be rotated in the pan direction and the tilt direction by a pan/tilt mechanism 120 is installed on the camera installation unit 104A.
  • The camera 202 is rotated around a pan axis P coaxial with the vertical extension arm 104, or around a horizontal tilt axis T, by the pan/tilt mechanism 120, to which a driving force is applied from a pan/tilt driving unit 206 (FIG. 6). As a result, the camera 202 can shoot in an arbitrary posture (in an arbitrary shooting direction).
  • FIG. 6 is a block diagram illustrating a functional configuration example of the robot apparatus 100.
  • The robot apparatus 100 includes, on the robot apparatus 100 side, a robot control unit 130, the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112 and, on the imaging apparatus 200 side, the camera 202, an imaging control unit 204, a pan/tilt control unit 210, and a pan/tilt driving unit 206, as well as a robot-side communication unit 230.
  • The robot-side communication unit 230 performs two-way wireless communication with the computer 300, receives various commands transmitted from the computer 300, such as a movement command for controlling the movement of the robot apparatus 100, a pan/tilt command for controlling the pan/tilt mechanism 120, and a shooting command for controlling the camera 202, and outputs the received commands to the corresponding control units. Details of the computer 300 will be described later.
  • The robot control unit 130 controls the X-direction drive unit 108, the Y-direction drive unit 110, and the Z-direction drive unit 112 based on the movement command input from the robot-side communication unit 230, moves the robot apparatus 100 in the X and Y directions, and expands and contracts the vertical extension arm 104 in the Z direction (see FIG. 3).
  • the pan / tilt control unit 210 operates the pan / tilt mechanism 120 in the pan direction and the tilt direction via the pan / tilt driving unit 206 based on the pan / tilt command input from the robot side communication unit 230 to pan / tilt the camera 202 in a desired direction (see FIG. 5).
  • the imaging control unit 204 causes the imaging unit 202A of the camera 202 to capture a live view image or an inspection image based on the imaging command input from the robot side communication unit 230.
  • As the captured image, a still image or a moving image including a live view image acquired by the imaging unit 202A is used.
  • Image data indicating an image K taken by the imaging unit 202A of the camera 202 when the bridge 1 is inspected is transmitted to the computer 300 via the robot side communication unit 230.
  • The computer 300 is composed of a tablet computer.
  • the computer 300 includes a flat casing having a rectangular outline, and includes a touch panel display 302 that also serves as a display unit 326 (FIG. 6) and an input unit 328 (FIG. 6).
  • FIG. 7 is a block diagram showing a system configuration of the computer 300.
  • The computer 300 includes a CPU (Central Processing Unit) 310 that controls the overall operation of the computer 300; a main memory 314, a nonvolatile memory 316, a mobile communication unit 318, a wireless LAN communication unit 320, a short-range wireless communication unit 322, a wired communication unit 324, a display unit 326, an input unit 328, a key input unit 330, an audio processing unit 332, an image processing unit 334, and the like are connected to the CPU 310 via a system bus 312.
  • The CPU 310 reads the operation programs (an OS (Operating System) and application programs that run on the OS), standard data, and the like stored in the nonvolatile memory 316, expands them in the main memory 314, and executes them, thereby functioning as a control unit that controls the operation of the entire computer.
  • the main memory 314 is composed of, for example, a RAM (Random Access Memory) and functions as a work memory for the CPU 310.
  • the non-volatile memory 316 is composed of, for example, a flash EEPROM (EEPROM: Electrically Erasable Programmable Read Only Memory), and stores the above-described operation program and various fixed data.
  • the nonvolatile memory 316 functions as a storage unit of the computer 300 and stores various data.
  • The mobile communication unit 318 transmits and receives data to and from the nearest base station (not shown) via the antenna 318A, based on a third-generation mobile communication system compliant with the IMT-2000 (International Mobile Telecommunication-2000) standard or a fourth-generation mobile communication system compliant with the IMT-Advanced (International Mobile Telecommunications-Advanced) standard.
  • The wireless LAN communication unit 320 performs wireless LAN communication via the antenna 320A with a wireless LAN access point or an external device capable of wireless LAN communication, according to a predetermined wireless LAN communication standard (for example, the IEEE 802.11a/b/g/n standards).
  • The short-range wireless communication unit 322 transmits and receives data via the antenna 322A to and from other Bluetooth (registered trademark) equipment within, for example, Class 2 range (a radius of about 10 m).
  • The wired communication unit 324 communicates with an external device connected by a cable via the external connection terminal 306 according to a predetermined communication standard, for example, USB (Universal Serial Bus).
  • The display unit 326 includes a color LCD (liquid crystal display) panel that constitutes the display portion of the touch panel display 302, together with its driving circuit, and displays various images.
  • the input unit 328 constitutes a touch panel portion of the touch panel display 302.
  • the input unit 328 is configured integrally with the color LCD panel using a transparent electrode.
  • the key input unit 330 includes a plurality of operation buttons provided on the casing of the computer 300 and a drive circuit thereof.
  • the audio processing unit 332 converts the digital audio data provided via the system bus 312 into an analog signal and outputs it from the speaker 304.
  • the image processing unit 334 digitizes an analog image signal output from the built-in camera 305 including a photographing lens and an image sensor, performs necessary signal processing, and outputs it.
  • FIG. 8 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the first embodiment. Note that the image processing apparatus 400 is provided in the computer 300.
  • the image processing apparatus 400 includes a captured image input unit 402, an overlapping area measurement unit 404, a storage control unit 406, a storage unit 408, a space arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • The photographed image input unit 402 sequentially receives, in time-series order, a plurality of photographed images used for image synthesis, in which each image and the next image (the first image and the second image) have an overlapping area. In other words, the photographed image input unit 402 receives a plurality of photographed images that constitute a composite image and in which each front image and the following rear image have an overlapping area.
  • FIG. 9 is a diagram conceptually showing a captured image input to the captured image input unit 402.
  • The eight captured images, from the image 601 to the image 608, constitute a composite image and are captured in chronological order from the image 601 to the image 608. The images 601 to 608 have overlapping areas with one another: a part of each front image and a part of the following rear image overlap.
  • For example, in the case of the image 601 and the image 602, the image 601 is the front image and the image 602 is the rear image, and the two have an overlapping area; likewise, the image 602 and the image 603 have an overlapping area. The same relationship holds for the subsequent images up to the image 608. The dotted-line regions in the figure are the overlapping areas.
  • The overlapping area measurement unit 404 measures the overlapping area between the front image and the rear image input to the captured image input unit 402. Specifically, the overlapping area measurement unit 404 measures information indicating the size of the overlapping area, such as the area of the overlap with the front image or the length of the overlapping area in the rear image, and also calculates position information indicating the overlapping area.
  • FIG. 10 is a diagram for explaining the measurement of the overlapping areas of the front image U and the rear image V.
  • The front image U and the rear image V have overlapping areas: the overlapping area P1 of the front image U overlaps the rear image V, and the overlapping area P2 of the rear image V overlaps the front image U. The subject H, the subject I, and the subject J appear in both as common subjects.
  • The overlapping area measurement unit 404 detects pairs of feature point coordinates between the front image U and the rear image V, determines the overlapping areas, and thereby measures the overlapping areas P1 and P2. Specifically, the overlapping area measurement unit 404 extracts feature points O5 to O8 in the rear image and detects the corresponding feature points O1 to O4 in the front image. Alternatively, overlapping portions whose similarity is equal to or greater than a threshold may be found by template matching. Based on the feature points detected in this way, the overlapping area measurement unit 404 measures the size (area, length) of the overlapping areas and obtains information (coordinates) indicating the position of the overlapping area in each image.
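  • As a concrete illustration, the following is a minimal sketch of how such an overlap measurement might be implemented via feature matching. The use of OpenCV's ORB detector and a pure-translation model are assumptions for illustration only; the patent does not prescribe a specific algorithm.

```python
# Minimal sketch: measuring the overlapping area of a front image and a rear
# image via ORB feature matching (an assumption; the patent only describes
# feature-point pairs or template matching in general terms).
import cv2
import numpy as np

def measure_overlap(img_front, img_rear, max_matches=50):
    """Return matched (front, rear) coordinate pairs and an estimated
    overlap ratio of the rear image with the front image."""
    orb = cv2.ORB_create()
    kp_f, des_f = orb.detectAndCompute(img_front, None)
    kp_r, des_r = orb.detectAndCompute(img_rear, None)
    if des_f is None or des_r is None:
        return [], 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)
    pairs = [(kp_f[m.queryIdx].pt, kp_r[m.trainIdx].pt)
             for m in matches[:max_matches]]
    if not pairs:
        return [], 0.0
    # Median shift between matched points, assuming a pure translation.
    shift = np.median([np.subtract(p, q) for p, q in pairs], axis=0)
    h, w = img_rear.shape[:2]
    overlap_w = max(0.0, w - abs(shift[0]))
    overlap_h = max(0.0, h - abs(shift[1]))
    return pairs, (overlap_w * overlap_h) / float(w * h)
```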
  • The storage control unit 406 stores the plurality of input captured images in the storage unit 408 and causes the storage unit 408 to store the position information indicating the overlapping area measured in association with the second image. Specifically, the images 601 to 608 shown in FIG. 9 are stored in the storage unit 408, and the position information indicating each overlapping area is stored in association with the corresponding rear image.
  • FIG. 11 shows a data configuration example of the storage unit 408. Note that FIG. 11 describes the image 601, the image 602, and the image 603, and other images are also stored in the same manner.
  • FIG. 11 (A) shows an example of the storage data structure of the shooting file name.
  • The image 601 is stored in the storage unit 408 under the shooting file name "filename1", the image 602 under "filename2", and the image 603 under "filename3".
  • FIG. 11(B) shows an example of the stored index of the previous image having an overlapping area. Since the image 601 is the first image, there is no previous image with an overlapping area and "NONE" is stored; since the image 602 has an overlapping area with the image 601, "1" indicating the image 601 is stored; and since the image 603 has an overlapping area with the image 602, "2" indicating the image 602 is stored in the storage unit 408.
  • FIG. 11(C) shows an example of the stored pixel coordinates in the previous image. Since the image 601 is the first image, "NONE" is stored; since the image 602 has an overlapping area with the image 601, the pixel information "(X11, Y11), ... (X1n, Y1n)" of the previous image (image 601) is stored; and since the image 603 has an overlapping area with the image 602, the pixel information "(X21, Y21), ... (X2n, Y2n)" of the previous image (image 602) is stored.
  • FIG. 11(D) shows an example of the stored corresponding pixels. For the image 601, the first image, the corresponding pixel information is "NONE"; for the image 602, the pixel information "(X21-I, Y21-I), ... (X2n-I, Y2n-I)" corresponding to the pixels shown in FIG. 11(C) is stored; and for the image 603, the corresponding pixels "(X31-I, Y31-I), ... (X3n-I, Y3n-I)" are stored.
  • As the position information indicating the overlapping area, the information on the pixels in the previous image (FIG. 11(C)) and the information on the pixels corresponding to them (FIG. 11(D)) are stored. That is, the first coordinates in the previous image (FIG. 11(C)) and the corresponding second coordinates in the rear image (FIG. 11(D)) are stored as position information indicating the overlapping area. The position information indicating the overlapping region may be two or fewer pairs of first and second coordinates; it is sufficient for the position information to serve as outline information. Outline information is rough information on how the images overlap, and information with a small data size that is easy to compute is preferable. Information forms other than those described above may also be employed as position information indicating overlapping areas. By storing the rear image together with the position information indicating the overlapping area in the storage unit in this way, the captured images constituting the image group are associated with one another, which makes management of the files of the image group more efficient, and there is no need to measure the shooting position with a GPS or a rangefinder. A sketch of such a record is shown below.
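  • The following is a minimal sketch of the per-image record illustrated in FIG. 11. The field names and example values are illustrative assumptions, not taken from the patent.

```python
# Sketch of the per-image record illustrated in FIG. 11; the field names are
# illustrative, not the patent's own.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Coord = Tuple[float, float]

@dataclass
class CapturedImageRecord:
    filename: str                     # shooting file name (FIG. 11(A))
    prev_index: Optional[int] = None  # index of the overlapping previous image (FIG. 11(B)); None means "NONE"
    prev_coords: List[Coord] = field(default_factory=list)  # first coordinates in the previous image (FIG. 11(C))
    own_coords: List[Coord] = field(default_factory=list)   # corresponding second coordinates in this image (FIG. 11(D))

# Image 601, the first image, has no previous image:
rec_601 = CapturedImageRecord(filename="filename1")
# Image 602 overlaps image 601; two coordinate pairs suffice as outline
# information (the coordinate values below are hypothetical):
rec_602 = CapturedImageRecord(
    filename="filename2", prev_index=1,
    prev_coords=[(620.0, 40.0), (620.0, 400.0)],
    own_coords=[(20.0, 38.0), (20.0, 398.0)],
)
```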
  • FIG. 12 is a diagram for explaining the case where the position information of the overlapping area is stored in an EXIF (Exchangeable image file format) tag attached to the rear image. The position information of the overlapping area is stored in the "MakerNote" portion of the "EXIF IFD" of the EXIF tag: the file name of the previous image having the overlapping area, the pixel coordinates of the previous image, and the corresponding pixel coordinates are stored there. One way this could be done is sketched below.
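  • A minimal sketch of writing such position information into the MakerNote field with the piexif library follows. Serializing the record as JSON is an assumption; the patent only states which items go into MakerNote.

```python
# Sketch: writing the overlap position information into the EXIF "MakerNote"
# field of the rear image using the piexif library. The JSON serialization is
# an assumption; the patent only states that the previous file name, the
# previous-image pixels, and the corresponding pixels are stored in MakerNote.
import json
import piexif

def store_overlap_in_exif(rear_path, prev_filename, prev_coords, own_coords):
    payload = json.dumps({
        "prev_file": prev_filename,
        "prev_coords": prev_coords,  # pixel coordinates in the previous image
        "own_coords": own_coords,    # corresponding pixels in the rear image
    }).encode("utf-8")
    exif_dict = piexif.load(rear_path)
    exif_dict["Exif"][piexif.ExifIFD.MakerNote] = payload
    piexif.insert(piexif.dump(exif_dict), rear_path)  # rewrite the JPEG in place
```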
  • The spatial arrangement unit 410 spatially arranges the plurality of photographed images based on the photographed images and the position information stored in the storage unit 408. That is, the spatial arrangement unit 410 arranges the captured images in space based on the position information of the overlapping areas, which serves as outline information.
  • FIG. 13 is a conceptual diagram when images 601 to 608 are spatially arranged based on the position information of the overlapping area.
  • Each image from image 601 to image 608 is overlapped by overlapping region 480, overlapping region 482, overlapping region 484, overlapping region 486, overlapping region 488, overlapping region 490, and overlapping region 492.
  • positional information regarding the overlapping area 480, the overlapping area 482, the overlapping area 484, the overlapping area 486, the overlapping area 488, the overlapping area 490, and the overlapping area 492 is stored in association with the image to be the subsequent image.
  • the spatial arrangement unit 410 spatially arranges each captured image based on the position information indicating each overlapping area stored in the storage unit 408.
  • For example, the spatial arrangement unit 410 spatially arranges the image 601 and the image 602 based on the position information indicating the overlapping region 480 of the image 602. Similarly, the spatial arrangement unit 410 spatially arranges the images 602 and 603, 603 and 604, 604 and 605, 605 and 606, 606 and 607, and 607 and 608.
  • By sequentially arranging each image in space in this way, the spatial arrangement unit 410 can calculate the spatial arrangement of all the photographed images input to the photographed image input unit 402, as the following sketch illustrates.
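  • This chaining can be sketched as follows, reusing the illustrative CapturedImageRecord above. A pure-translation model and chronologically ordered records are assumptions.

```python
# Sketch of the spatial arrangement: each image's global offset is obtained by
# chaining the pairwise offsets implied by the stored coordinate pairs.
import numpy as np

def arrange_spatially(records):
    """Return one global (x, y) offset per record."""
    offsets = [np.zeros(2)]
    for rec in records[1:]:
        if rec.prev_index is None or not rec.prev_coords:
            offsets.append(offsets[-1].copy())  # no overlap info available
            continue
        # A point at p in the previous image and q in this image satisfies
        # offset_prev + p == offset_own + q, so offset_own = offset_prev + (p - q).
        rel = np.mean([np.subtract(p, q)
                       for p, q in zip(rec.prev_coords, rec.own_coords)], axis=0)
        offsets.append(offsets[rec.prev_index - 1] + rel)  # prev_index is 1-based
    return offsets
```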
  • The corresponding point detection unit 412 detects overlapping areas of the plurality of spatially arranged captured images and detects corresponding points between the images in the detected overlapping areas. That is, the corresponding point detection unit 412 detects corresponding points in the overlapping areas given by the position information, and also in overlapping areas newly produced as a result of the spatial arrangement.
  • A known method is employed for detecting the corresponding points, for example SIFT (Scale-Invariant Feature Transform) or SURF (Speeded-Up Robust Features).
  • FIG. 14 is a diagram showing the overlap of the image 601 to the image 608 after the spatial arrangement, and shows that a new overlapping area occurs after the spatial arrangement is performed using the position information of the overlapping area.
  • the corresponding point detection unit 412 detects an overlapping area between the previous image and the subsequent image and an overlapping area between the previous image and an image other than the subsequent image, and detects corresponding points between the images in the detected overlapping area.
  • For example, the corresponding point detection unit 412 performs corresponding point detection for the image 602 not only on the overlapping area 480 with the image 601 and the overlapping area 482 with the image 603, but also on the overlapping area 494 with the image 607 and the overlapping area 495 with the image 608. As a result, more corresponding points can be detected than between the images 601 and 602 and between the images 602 and 603 alone.
  • Similarly, the corresponding point detection unit 412 performs corresponding point detection for the image 601 not only on the overlapping region 480 with the image 602 but also on the overlapping region 496 with the image 608, so that more corresponding points can be detected than between the images 601 and 602 alone. A sketch of region-restricted detection follows.
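  • The following sketch restricts SIFT-based detection to the overlap regions; modelling each overlap as an axis-aligned rectangle (x, y, w, h) is an illustrative simplification. Restricting detection to these regions is what keeps the detection time and calculation cost low, as described above.

```python
# Sketch: corresponding-point detection restricted to an overlapping region.
# SIFT is one of the known methods named above.
import cv2

def detect_corresponding_points(img_a, img_b, roi_a, roi_b, ratio=0.75):
    xa, ya, wa, ha = roi_a
    xb, yb, wb, hb = roi_b
    crop_a = img_a[ya:ya + ha, xa:xa + wa]  # search only inside the overlap
    crop_b = img_b[yb:yb + hb, xb:xb + wb]
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(crop_a, None)
    kp_b, des_b = sift.detectAndCompute(crop_b, None)
    if des_a is None or des_b is None:
        return []
    pairs = []
    for match in cv2.BFMatcher().knnMatch(des_a, des_b, k=2):
        if len(match) < 2:
            continue
        m, n = match
        if m.distance < ratio * n.distance:  # Lowe's ratio test
            pa, pb = kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt
            # Map crop coordinates back to full-image coordinates.
            pairs.append(((pa[0] + xa, pa[1] + ya), (pb[0] + xb, pb[1] + yb)))
    return pairs
```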
  • the composition processing unit 414 generates a composite image by combining a plurality of captured images based on the detected corresponding points.
  • the composition processing unit 414 can generate various composite images by combining a plurality of images.
  • the composition processing unit 414 generates a panorama composite image and an ortho image.
  • An ortho image is an image obtained by converting an aerial photograph so that positional displacement is removed and correct position information is given.
  • FIG. 15 is a flowchart showing the operation and the image processing method of the image processing apparatus 400 of the present embodiment.
  • First, a plurality of photographed images in time-series order used for composition are input by the photographed image input unit 402 (step S10). The image group captured by the robot apparatus 100 is input to the captured image input unit 402, realized by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300.
  • the overlapping area of each image is measured by the overlapping area measuring unit 404 (step S11).
  • the overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • Thereafter, the captured images input by the captured image input unit 402 and the position information measured by the overlapping region measurement unit 404 are stored in the storage unit 408, realized in the main memory 314 of the computer 300, by the storage control unit 406 realized by the CPU 310 of the computer 300 (step S12).
  • Thereafter, the spatial arrangement unit 410 realized by the CPU 310 of the computer 300 spatially arranges the captured images based on the position information of the overlapping areas, which is the outline information (step S13).
  • the corresponding point detection unit 412 realized by the CPU 310 of the computer 300 detects the corresponding point in the overlapping region (step S14).
  • After the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping areas between front and rear images but also in overlapping areas with images that are not adjacent in time series (for example, the overlapping areas 494 to 496 shown in FIG. 14).
  • Then, the composite processing unit 414 realized by the CPU 310 of the computer 300 generates a composite image based on the detected corresponding points (step S15). The whole flow is sketched below.
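  • The following sketch ties steps S10 to S15 together, reusing the illustrative helpers defined in the earlier sketches (measure_overlap, CapturedImageRecord, arrange_spatially); all names and interfaces are assumptions, not the patent's own API.

```python
# Sketch of steps S10 to S15 using the hypothetical helpers above.
def process(images):
    records = []
    for i, img in enumerate(images):                     # S10: input images
        rec = CapturedImageRecord(filename=f"filename{i + 1}")
        if i > 0:                                        # S11: measure overlap
            pairs, _ = measure_overlap(images[i - 1], img)
            rec.prev_index = i                           # 1-based previous index
            rec.prev_coords = [p for p, _ in pairs[:2]]  # two pairs or fewer
            rec.own_coords = [q for _, q in pairs[:2]]
        records.append(rec)                              # S12: store image + info
    offsets = arrange_spatially(records)                 # S13: spatial arrangement
    # S14: detect corresponding points in every overlap found from `offsets`
    # (including overlaps between non-adjacent images), then S15: synthesize.
    # Both steps are outside this sketch.
    return records, offsets
```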
  • The hardware structure of the processing units that execute the various processes described above is realized by the following types of processors.
  • The various processors include a CPU, which is a general-purpose processor that functions as various processing units by executing software (programs); an FPGA (Field-Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture, that is, a programmable logic device (PLD); and a dedicated electric circuit such as an ASIC (Application-Specific Integrated Circuit), which is a processor having a circuit configuration designed specifically to execute a specific process.
  • One processing unit may be configured by one of these various processors, or by two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, one processor may be configured by a combination of one or more CPUs and software, as represented by computers such as clients and servers, and this processor may function as the plurality of processing units.
  • As a second example, as represented by a system-on-chip (SoC), a processor may be used that realizes the functions of the entire system, including the plurality of processing units, with a single IC (Integrated Circuit) chip.
  • In this way, the various processing units are configured using one or more of the various processors above as a hardware structure. More specifically, the hardware structure of these processors is circuitry in which circuit elements such as semiconductor elements are combined.
  • In the second embodiment, described next, a moving image is input to the captured image input unit 402.
  • FIG. 16 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the second embodiment.
  • Parts already described in FIG. 8 are given the same reference numerals, and their description is omitted.
  • The image processing apparatus 400 includes a captured image input unit 402, a first image cutout unit 421, an overlapping area measurement unit 404, a second image cutout unit 423, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • A moving image is input to the photographed image input unit 402. That is, a plurality of captured images in time-series order used for image synthesis are sequentially input, as a moving image composed of a plurality of captured images in which the first image and the second image adjacent in the time series have an overlapping area.
  • the first image cutout unit 421 cuts out the first image from the plurality of shot images input to the shot image input unit 402.
  • That is, a moving image is input to the captured image input unit 402, and one of the frame images constituting the input moving image is cut out as the front image.
  • The overlapping area measurement unit 404 measures the overlapping area between the cut-out first image and the plurality of input captured images. That is, the overlapping area measurement unit 404 measures, for each frame image of the input moving image, the overlapping area with the cut-out front image. For example, the overlapping area measurement unit 404 may measure the overlapping areas of the frame images that follow the cut-out front image in chronological order.
  • The second image cutout unit 423 cuts out the second image from the plurality of captured images when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value. That is, the second image cutout unit 423 cuts out a frame image as the second image when the overlapping area measured by the overlapping area measurement unit 404 reaches the predetermined value.
  • For example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the second image cutout unit 423 cuts out, as the second image, the frame image whose measured area equals the predetermined value.
  • the storage control unit 406 stores the input moving image and the position information of the overlapping area measured by the overlapping area measuring unit 404 in the storage unit 408.
  • Alternatively, the storage control unit 406 may cause the storage unit 408 to store only the front image, the rear image, and the position information of the overlapping area. The cutout logic is sketched below.
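  • The following sketch shows the cutout logic of this embodiment: frames are read from the moving image, and a frame is cut out as the next second image when its measured overlap with the last cut-out image falls to a predetermined value. The threshold value and the reuse of the hypothetical measure_overlap() above are assumptions.

```python
# Sketch of the second embodiment's frame cutout from a moving image.
import cv2

def cut_out_from_video(video_path, overlap_target=0.3):
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        return []
    cut_images = [first]             # front image (first image cutout unit)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        _, ratio = measure_overlap(cut_images[-1], frame)
        if ratio <= overlap_target:  # measurement reached the predetermined value
            cut_images.append(frame) # cut out as the second image
    cap.release()
    return cut_images
```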
  • FIG. 17 is a flowchart showing the operation and the image processing method of the image processing apparatus 400 of the present embodiment.
  • a plurality of photographed images in time series used for composition are input by the photographed image input unit 402 (step S20).
  • The image group (moving image) captured by the robot apparatus 100 is input to the captured image input unit 402, realized by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300. In the present embodiment, a moving image is input to the captured image input unit 402.
  • the first image cutout unit 421 cuts out a previous image from a plurality of frame images constituting the moving image (step S21).
  • the first image cutout unit 421 is realized by the CPU 310 of the computer 300.
  • the overlapping area measuring unit 404 measures an overlapping area between a plurality of frame images constituting the moving image and the previous image (step S22).
  • the overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • Thereafter, the second image cutout unit 423 cuts out a frame image serving as the rear image from the moving image when the measurement result of the overlapping area measurement unit 404 reaches the predetermined value (step S23).
  • the second image cutout unit 423 is realized by the CPU 310 of the computer 300.
  • Thereafter, the moving image input by the photographed image input unit 402 and the position information measured by the overlapping area measurement unit 404 are stored in the storage unit 408, realized in the main memory 314 of the computer 300, by the storage control unit 406 realized by the CPU 310 of the computer 300 (step S24).
  • the spatial arrangement unit 410 realized by the CPU 310 of the computer 300 spatially arranges the captured images based on the position information of the overlapping areas, which serves as approximate arrangement information (step S25).
  • the corresponding point detection unit 412 realized by the CPU 310 of the computer 300 detects the corresponding point in the overlapping region (step S26).
  • because the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping areas between a first image and its second image, but also in overlapping areas between images that are not adjacent in time-series order (for example, the overlapping areas 494 and 496 shown in FIG. 14).
  • the composite processing unit 414 realized by the CPU 310 of the computer 300 generates a composite image based on the detected corresponding points (step S27).
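As a rough illustration of steps S25 and S26, the sketch below uses the stored translation to mask each image to its overlapping region and runs feature matching only there, rather than over the full frames. ORB features, brute-force Hamming matching, and the mask geometry are assumptions of the sketch; the patent does not prescribe a particular detector.

```python
import cv2
import numpy as np

def detect_corresponding_points(img1, img2, shift):
    """Detect corresponding points only inside the overlapping region,
    given the stored translation (dx, dy) of img2 relative to img1.
    Both images are assumed to come from the same camera and size."""
    dx, dy = int(shift[0]), int(shift[1])
    h, w = img1.shape[:2]
    # mask of the overlapping region in img1's coordinates
    mask1 = np.zeros((h, w), np.uint8)
    mask1[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)] = 255
    # the same region seen from img2 lies on the opposite side
    mask2 = np.zeros((h, w), np.uint8)
    mask2[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)] = 255
    orb = cv2.ORB_create()
    gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
    kp1, des1 = orb.detectAndCompute(gray1, mask1)
    kp2, des2 = orb.detectAndCompute(gray2, mask2)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches
```

Restricting detection to the masked regions is what saves the corresponding-point computation time and cost referred to above; the same routine can be applied to overlaps between images that are not adjacent in time-series order once the spatial arrangement is known.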
  • next, a third embodiment will be described.
  • in the third embodiment, a group of still images is input to the captured image input unit 402, and the first image and the second image are extracted from the image group and stored in the storage unit 408.
  • FIG. 18 is a diagram illustrating a functional configuration example of the image processing apparatus 400 according to the third embodiment.
  • parts already described with reference to FIG. 8 are given the same reference numerals, and their description is omitted.
  • the image processing apparatus 400 includes a captured image input unit 402, an overlapping area measurement unit 404, an image extraction unit 431, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a synthesis processing unit 414.
  • the captured image input unit 402 receives a group of still images. That is, the captured image input unit 402 sequentially receives a plurality of time-series captured images used for image synthesis, in which a first image and a second image that are adjacent in time-series order have an overlapping area.
  • the image extraction unit 431 extracts the first image and the second image from the still image group input by the captured image input unit 402 when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value.
  • for example, the overlapping area measurement unit 404 measures the area of the overlapping region, and the image extraction unit 431 extracts the first image and the second image when the measured area reaches the predetermined value.
  • the storage control unit 406 stores the extracted first image and second image in the storage unit 408, and causes the storage unit 408 to store position information indicating the measured overlapping area in association with the second image.
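A corresponding sketch for the still-image case follows, reusing overlap_ratio and OVERLAP_RATIO from the first sketch above. The record layout standing in for the storage unit 408 is purely illustrative.

```python
def extract_pairs(still_images):
    """Scan a time-series group of still images and keep first/second
    image pairs whose overlap has shrunk to the predetermined value,
    together with the position information of the overlapping area."""
    records = []                     # stands in for the storage unit 408
    first = still_images[0]
    for candidate in still_images[1:]:
        ratio, shift = overlap_ratio(first, candidate)
        if ratio <= OVERLAP_RATIO:   # measurement result reached the value
            records.append({"first": first,
                            "second": candidate,
                            "overlap_shift": shift})
            first = candidate        # the second image becomes the next first
    return records
```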
  • FIG. 19 is a flowchart showing an operation and an image processing method of the image processing apparatus 400 of the present embodiment.
  • the captured image input unit 402 receives a plurality of time-series captured images (a group of still images) used for synthesis (step S30).
  • the image group captured by the robot apparatus 100 is input to the captured image input unit 402, which is realized by the CPU 310 of the computer 300, via the mobile communication unit 318, the wireless LAN communication unit 320, or the short-range wireless communication unit 322 of the computer 300.
  • the overlapping areas between the images are measured by the overlapping area measuring unit 404 (step S31).
  • the overlapping area measurement unit 404 is realized by the CPU 310 of the computer 300.
  • the image extraction unit 431 extracts the first image and the second image (step S32).
  • the image extraction unit 431 is realized by the CPU 310 of the computer 300.
  • the storage control unit 406, realized by the CPU 310 of the computer 300, stores the captured images input by the captured image input unit 402 and the position information measured by the overlapping region measuring unit 404 in the storage unit 408, which is realized in the main memory 314 of the computer 300 (step S33).
  • the spatial arrangement unit 410 realized by the CPU 310 of the computer 300 spatially arranges the captured images based on the position information of the overlapping areas, which serves as approximate arrangement information (step S34).
  • the corresponding point detection unit 412 realized by the CPU 310 of the computer 300 detects the corresponding point in the overlapping region (step S35).
  • because the captured images have been spatially arranged, corresponding points can be detected not only in the overlapping areas between a first image and its second image, but also in overlapping areas between images that are not adjacent in time-series order (for example, the overlapping areas 494 and 496 shown in FIG. 14).
  • the composite processing unit 414 realized by the CPU 310 of the computer 300 generates a composite image based on the detected corresponding points (step S36).
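One common way to realize the synthesis step (S27/S36) from the detected corresponding points is a RANSAC-estimated homography followed by warping, sketched below. This is an assumption of the sketch, not necessarily the patent's synthesis method, and the oversized canvas is a simplification.

```python
import cv2
import numpy as np

def composite(img1, img2, kp1, kp2, matches):
    """Warp img2 onto img1 using a homography estimated from the
    corresponding points detected in the overlapping region."""
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h1, w1 = img1.shape[:2]
    h2, w2 = img2.shape[:2]
    # canvas large enough for both images; a full implementation would
    # compute the warped corners exactly instead of over-allocating
    canvas = cv2.warpPerspective(img2, H, (w1 + w2, h1 + h2))
    canvas[:h1, :w1] = img1          # keep the first image unblended
    return canvas
```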
  • next, an embodiment in which the first image and the second image are automatically acquired by the captured image acquisition unit 401 will be described.
  • FIG. 20 is a diagram illustrating a functional configuration example of the camera device 403 according to this embodiment.
  • parts already described with reference to FIG. 8 are given the same reference numerals, and their description is omitted.
  • the camera device 403 includes a captured image acquisition unit 401, an overlapping area measurement unit 404, an imaging control unit 441, a storage control unit 406, a storage unit 408, a spatial arrangement unit 410, a corresponding point detection unit 412, and a composition processing unit 414.
  • the camera device 403 is provided in the robot device 100.
  • the captured image acquisition unit 401 is realized by the imaging device 200 of the robot apparatus 100, and the overlapping area measurement unit 404, the imaging control unit 441, the storage control unit 406, the storage unit 408, the spatial arrangement unit 410, the corresponding point detection unit 412, and the synthesis processing unit 414 are realized by, for example, the imaging control unit 204 of the robot apparatus 100.
  • the captured image acquisition unit 401 sequentially acquires a plurality of time-series captured images used for image synthesis, and acquires the plurality of captured images such that a first image and a second image that are adjacent in time-series order have an overlapping area.
  • the overlapping area measurement unit 404 measures the overlapping area between the first image and the moving image acquired by the captured image acquisition unit 401.
  • that is, the overlapping area measurement unit 404 measures the overlapping area between the acquired first image and the live view image acquired by the camera.
  • the imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image when the measurement result of the overlapping area measurement unit 404 reaches a predetermined value. That is, the imaging control unit 441 measures the overlapping area between the first image and the live view image, and automatically acquires the second image when the overlapping area reaches the predetermined value. For example, the overlapping area measuring unit 404 measures the area of the overlapping region, and when the measured area reaches the predetermined value, the imaging control unit 441 causes the captured image acquisition unit 401 to acquire the second image.
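The following loop sketches the imaging-control behaviour just described, assuming an OpenCV-accessible camera; the camera index and max_images are illustrative, and overlap_ratio / OVERLAP_RATIO are the helpers assumed in the first sketch.

```python
import cv2

def auto_capture(camera_index=0, max_images=20):
    """Watch the live view and automatically record a new still each
    time the overlap with the last recorded image reaches the
    predetermined value."""
    cap = cv2.VideoCapture(camera_index)
    ok, previous = cap.read()            # the first image
    captured = [previous] if ok else []
    while ok and len(captured) < max_images:
        ok, live = cap.read()            # live view image
        if not ok:
            break
        ratio, _ = overlap_ratio(previous, live)
        if ratio <= OVERLAP_RATIO:
            captured.append(live)        # second image acquired automatically
            previous = live              # it becomes the next first image
    cap.release()
    return captured
```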

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention provides an image processing device, a camera device, and an image processing method with which the time and computational cost required for corresponding point detection during synthesis can be reduced, and a plurality of captured images can be stored efficiently, by computing a spatial arrangement of the plurality of captured images on the basis of position information indicating overlapping areas, without measuring the imaging position with a sensor or the like. The image processing device 400 comprises: a captured image input unit 402; an overlapping area measurement unit 404 that measures overlapping areas; a storage control unit 406 that stores, in a storage unit, the measured position information indicating overlapping areas in association with a second image; a spatial arrangement unit 410 that spatially arranges the captured images; a corresponding point detection unit 412 that detects overlapping areas in the captured images and detects corresponding points between images in the detected overlapping areas; and a synthesis unit 414 that synthesizes the plurality of captured images on the basis of the detected corresponding points and generates a composite image.
PCT/JP2018/008125 2017-03-28 2018-03-02 Dispositif de traitement d'images, dispositif de type caméra et procédé de traitement d'images WO2018180214A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019509066A JP6712358B2 (ja) 2017-03-28 2018-03-02 画像処理装置、カメラ装置、および画像処理方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017063546 2017-03-28
JP2017-063546 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018180214A1 true WO2018180214A1 (fr) 2018-10-04

Family

ID=63675340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/008125 WO2018180214A1 (fr) 2017-03-28 2018-03-02 Dispositif de traitement d'images, dispositif de type caméra et procédé de traitement d'images

Country Status (2)

Country Link
JP (1) JP6712358B2 (fr)
WO (1) WO2018180214A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09322055A (ja) * 1996-05-28 1997-12-12 Canon Inc 電子カメラシステム
JPH1188811A (ja) * 1997-09-04 1999-03-30 Sony Corp カメラ一体型ビデオレコーダおよび撮影方法
JP2000134537A (ja) * 1998-10-28 2000-05-12 Ricoh Co Ltd 画像入力装置及びその方法
JP2000292166A (ja) * 1999-04-07 2000-10-20 Topcon Corp 画像形成装置
WO2008087721A1 (fr) * 2007-01-18 2008-07-24 Fujitsu Limited Synthétiseur d'image, procédé et programme de synthèse d'image
JP2008288798A (ja) * 2007-05-16 2008-11-27 Nikon Corp 撮像装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023127313A1 (fr) * 2021-12-28 2023-07-06 富士フイルム株式会社 Dispositif de support de capture d'image, procédé de support de capture d'image et programme
WO2023135910A1 (fr) * 2022-01-17 2023-07-20 富士フイルム株式会社 Dispositif et procédé de capture d'image, et programme associé

Also Published As

Publication number Publication date
JPWO2018180214A1 (ja) 2019-12-12
JP6712358B2 (ja) 2020-06-17

Similar Documents

Publication Publication Date Title
JP6560366B2 (ja) 構造物の部材特定装置及び方法
US10356301B2 (en) Imaging system, angle-of-view adjustment method, and angle-of-view adjustment program
JP6712330B2 (ja) 撮影制御装置、撮影制御方法及びプログラム
JP6733267B2 (ja) 情報処理プログラム、情報処理方法および情報処理装置
US11100671B2 (en) Image generation apparatus, image generation system, image generation method, and image generation program
US10951821B2 (en) Imaging control device, imaging system, and imaging control method
JP5663352B2 (ja) 画像処理装置、画像処理方法、及び画像処理プログラム
JP6507268B2 (ja) 撮影支援装置及び撮影支援方法
US9071819B2 (en) System and method for providing temporal-spatial registration of images
US11991477B2 (en) Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium
JP4680033B2 (ja) 監視システム及び監視装置
WO2018180214A1 (fr) Dispositif de traitement d'images, dispositif de type caméra et procédé de traitement d'images
WO2021168804A1 (fr) Procédé de traitement d'image, appareil de traitement d'image et programme de traitement d'image
WO2020162264A1 (fr) Système de photographie, dispositif de réglage de point de photographie, dispositif de photographie et procédé de photographie
CN114616820A (zh) 摄像支援装置、摄像装置、摄像系统、摄像支援系统、摄像支援方法及程序
JP6779368B2 (ja) 画像処理装置、画像処理方法、およびプログラム
JP6770826B2 (ja) 構造物の配置位置測定用の自動視準方法及び自動視準装置
CN114821544A (zh) 感知信息生成方法、装置、车辆、电子设备及存储介质
KR101323099B1 (ko) 모자이크 영상 생성 장치
JP2021155179A (ja) クレーン用撮影システム及びプログラム
JP6715340B2 (ja) 撮影計画生成装置、撮影計画生成方法、およびプログラム
CN113646606A (zh) 一种控制方法、设备、无人机及存储介质
JP2020005111A (ja) 情報処理装置、制御方法及びプログラム
WO2023135910A1 (fr) Dispositif et procédé de capture d'image, et programme associé
US20240155223A1 (en) Imaging control device, imaging system, imaging control method, and imaging control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777813

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019509066

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777813

Country of ref document: EP

Kind code of ref document: A1