WO2023135910A1 - Image-capturing device, image-capturing method, and program - Google Patents
Image-capturing device, image-capturing method, and program
- Publication number
- WO2023135910A1 (PCT/JP2022/041381)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- area
- imaging device
- feature point
- imaging
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Definitions
- the technology of the present disclosure relates to an imaging device, an imaging method, and a program.
- JP 2010-045587 discloses a camera device having an image capturing unit, an image display unit, a shake detection unit, an image recording unit, a relative relationship calculation unit, a display control unit, an overlap calculation unit, a notification unit, and a shooting control unit.
- the image capturing unit captures an image.
- the image display unit has at least a screen that displays an image.
- the shake detection section detects apparatus shake during image shooting by the image shooting section.
- the image recording section records information of an image captured by the image capturing section.
- the relative relationship calculation unit obtains a relative relationship degree parameter representing at least the relative positional relationship between the photographing range of a first image photographed immediately before by the image photographing unit and recorded in the image recording unit, and the photographing range of a second image photographed by the image photographing unit subsequent to the first image.
- the display control unit generates, from the relative relationship degree parameter obtained by the relative relationship calculation unit, an image that clearly indicates the relative positional relationship between the photographing ranges, and displays it together with the second image on the screen of the image display unit.
- the overlap calculation unit obtains a degree of overlap parameter representing the degree of overlap between the imaging range of the first image and the imaging range of the second image.
- the notification unit gives a predetermined notification to the photographer according to the degree of overlap parameter obtained by the overlap calculation unit.
- Japanese Patent Application Laid-Open No. 2013-186853 describes an image processing device that includes acquisition means, division means, feature point detection means, vector calculation means, area selection means, and generation means.
- the acquisition means acquires a plurality of images continuously captured with movement of the imaging unit.
- the dividing means divides the image acquired by the acquiring means into a plurality of regions.
- the feature point detection means detects feature points in each of the plurality of areas divided by the division means.
- the vector calculation means calculates a plurality of vectors corresponding to a plurality of areas between adjacent images of the plurality of images based on the feature points detected by the feature point detection means.
- the area selection means selects a specific area from the plurality of divided areas based on the plurality of vectors calculated by the vector calculation means.
- the generation means generates a composite image by combining adjacent images based on the specific area selected by the area selection means.
- also disclosed is a photographing control device for controlling photographing by a moving body equipped with a camera, the photographing control device including a wide-angle image acquisition unit, a photographing information acquisition unit, a margin information acquisition unit, an area information acquisition unit, a photographing area calculation unit, and a control unit.
- the wide-angle image acquisition unit acquires a wide-angle image obtained by wide-angle imaging of an entire image of an imaging target.
- the photographing information acquisition unit acquires photographing information regarding the number of shots or the photographing angle of view of a plurality of divided images obtained by closely photographing parts of the whole image of the photographing target with the camera of the moving body.
- the margin information acquisition unit acquires margin information relating to the margin used when a composite image of the photographing target is generated by synthesizing the plurality of divided images.
- the area information acquisition unit acquires photographing target area information about the area of the whole image of the photographing target. Based on the photographing information, the margin information, and the photographing target area information, the photographing area calculation unit calculates, within the wide-angle image, each photographing area of the divided images constituting the composite image such that the margin is secured.
- the control unit moves the moving body, causes the camera to take close-up shots of each of the calculated photographing areas, and acquires the captured close-up images as the divided images.
- the control unit also compares the image corresponding to each photographing area of the acquired wide-angle image with the image captured by the camera at close range, and controls the position of the moving body so that the camera takes close-up shots of each photographing area.
- Japanese Unexamined Patent Application Publication No. 2007-174301 discloses an image capturing device comprising a memory, motion detection means, guide image creation means, superimposed image creation means, and current image acquisition means.
- a memory stores the first captured image.
- the motion detection means detects motion of the imaging device and calculates a motion direction and a motion distance.
- the guide image creating means displays the first captured image on the display screen, shifts the displayed first captured image relative to the display screen in the direction opposite to the movement direction by an amount corresponding to the movement distance, fixes the display there, and displays the fixed image remaining on the display screen after the shift as a guide image.
- the superimposed image creating means superimposes and displays the current image and the guide image currently being captured by the image capturing device on the display screen.
- the current image obtaining means stores the current image in the memory as a second photographed image when it is detected that the photographing button has been pressed.
- Japanese Patent Laying-Open No. 2008-252934 discloses a camera device having monitor means for monitoring an image that enters the field of view and means for electronically recording the captured image on a recording medium.
- the camera device comprises means for displaying on the monitor means an image showing the relative positional relationship between the image displayed on the monitor means and the image taken immediately before, together with the image within the field of view of the camera device.
- One embodiment according to the technology of the present disclosure provides an imaging device, an imaging method, and a program capable of reducing the processing load on the processor when determining the overlap rate, as compared with a case where the overlap rate between a first image and a second image is determined based on the result of comparing first feature points included in the entire area of the first image with second feature points included in the entire area of the second image.
- a first aspect of the technology of the present disclosure is an imaging device including a processor, in which the processor acquires a first image, acquires a moving direction of a moving body on which the imaging device is mounted, acquires a second image captured by the imaging device while the moving body moves, and determines an overlap rate between the first image and the second image based on a result of comparing first feature points included in the first image with second feature points included in the second image, and, in determining the overlap rate, changes a first image area, which is an image area included in the first image and used for obtaining the first feature points, and/or a second image area, which is an image area included in the second image and used for obtaining the second feature points, according to the moving direction.
- a second aspect of the technology of the present disclosure is the imaging device according to the first aspect, wherein the processor causes the display device to display the second image as a display image.
- a third aspect of the technology of the present disclosure is the imaging device according to the first aspect or the second aspect, in which the first image area is a partial image area of the first image, and the second image area is a partial image area of the second image.
- a fourth aspect of the technology of the present disclosure is the imaging device according to any one of the first to third aspects, in which the first image area is an area located on the front side in the moving direction in the first image, and the second image area is an area located on the rear side in the moving direction in the second image.
- a fifth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fourth aspects, in which the processor selects the first image area from a plurality of first divided areas obtained by dividing the first image, and selects the second image area from a plurality of second divided areas obtained by dividing the second image.
- a sixth aspect of the technology of the present disclosure is the imaging device according to the fifth aspect, in which the first image area includes two or more first divided areas among the plurality of first divided areas, and the second image area includes two or more second divided areas among the plurality of second divided areas.
- a seventh aspect of the technology of the present disclosure is the imaging device according to the fifth aspect, in which the first image area is one first divided area of the plurality of first divided areas, and the second image area is one second divided area of the plurality of second divided areas.
- An eighth aspect of the technology of the present disclosure is the imaging device according to any one of the first to seventh aspects, in which the first image area and the second image area are rectangular areas.
- a ninth aspect of the technology of the present disclosure is the imaging device according to any one of the first to seventh aspects, in which, when the moving direction is a direction inclined with respect to the horizontal direction or the vertical direction of the second image, the processor sets, as the first image area, a first triangular area including the corner located on the front side in the moving direction among the four corners of the first image, and sets, as the second image area, a second triangular area including the corner located on the rear side in the moving direction among the four corners of the second image.
- a tenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to ninth aspects, in which the processor acquires, from the image sensor, only the image data used for the comparison of the first feature points with the second feature points.
- An eleventh aspect of the technology of the present disclosure is the imaging device according to any one of the first to tenth aspects, in which the first feature points are feature points included only in the first image area, and the second feature points are feature points included only in the second image area.
- a twelfth aspect of the technology of the present disclosure is the imaging device according to any one of the first to eleventh aspects, in which the processor changes the area of the first image area and/or the area of the second image area according to the moving speed of the moving body.
- a thirteenth aspect of the technology of the present disclosure is the imaging device according to the twelfth aspect, in which the processor increases the area of the first image area and/or the area of the second image area as the moving speed of the moving body increases.
- a fourteenth aspect of the technology of the present disclosure is the imaging device according to the twelfth aspect, in which the processor reduces the area of the first image area and/or the area of the second image area as the moving speed of the moving body increases.
- a fifteenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fourteenth aspects, in which, when the moving body is moving, the processor extracts the first feature points included in the first image area, which is a partial image area of the first image.
- a sixteenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fifteenth aspects, in which, when the moving body is stationary, the processor extracts feature points included in the entire image area of the first image.
- a seventeenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to sixteenth aspects, in which the area of the first image area and the area of the second image area are each set to be larger than an overlap area, which is the area over which adjacent first images overlap when the first images are synthesized.
- An eighteenth aspect of the technology of the present disclosure is an imaging device according to any one of the first to seventeenth aspects, wherein the moving object is a flying object.
- a nineteenth aspect according to the technology of the present disclosure is an imaging method including: acquiring a first image; acquiring a moving direction of a moving body on which an imaging device is mounted; acquiring a second image captured by the imaging device while the moving body moves; determining an overlap rate between the first image and the second image based on a result of comparing first feature points included in the first image with second feature points included in the second image; and, in determining the overlap rate, changing a first image area, which is an image area included in the first image and used for obtaining the first feature points, and/or a second image area, which is an image area included in the second image and used for obtaining the second feature points, according to the moving direction.
- a twentieth aspect of the technology of the present disclosure is a program for causing a computer to execute processing including: acquiring a first image; acquiring a moving direction of a moving body on which an imaging device is mounted; acquiring a second image captured by the imaging device while the moving body moves; determining an overlap rate between the first image and the second image based on a result of comparing first feature points included in the first image with second feature points included in the second image; and, in determining the overlap rate, changing a first image area, which is an image area included in the first image and used for obtaining the first feature points, and/or a second image area, which is an image area included in the second image and used for obtaining the second feature points, according to the moving direction.
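- For illustration only, the following minimal Python sketch (not part of the disclosure; `extract_features` and `estimate_overlap_rate` are hypothetical placeholders, and splitting a 3x3 grid into thirds is merely one way to realize the direction-dependent image areas) outlines the flow described by the aspects above: feature points are taken only from sub-regions of the two images chosen according to the moving direction before the overlap rate is determined.

```python
# Illustrative sketch only. extract_features() and estimate_overlap_rate() are
# hypothetical placeholders; images are assumed to be numpy arrays of shape
# (height, width[, channels]).

def imaging_step(first_image, moving_direction, second_image,
                 extract_features, estimate_overlap_rate, threshold=0.5):
    """Determine the overlap rate using only direction-dependent image areas."""
    h, w = first_image.shape[:2]

    # First image area: the side facing the moving direction (front side).
    # Second image area: the opposite side (rear side in the moving direction).
    if moving_direction == "right":
        first_area = first_image[:, 2 * w // 3:]      # right third of the first image
        second_area = second_image[:, :w // 3]        # left third of the second image
    elif moving_direction == "down":
        first_area = first_image[2 * h // 3:, :]      # bottom third
        second_area = second_image[:h // 3, :]        # top third
    else:
        first_area, second_area = first_image, second_image  # fallback: whole images

    first_points = extract_features(first_area)
    second_points = extract_features(second_area)
    overlap_rate = estimate_overlap_rate(first_points, second_points)

    # The next image for synthesis would be captured when the overlap rate
    # has fallen to (or below) the predetermined threshold.
    return overlap_rate <= threshold
```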
- FIG. 20 is an explanatory diagram illustrating an example of a fifth operation of the processor;
- FIG. 20 is an explanatory diagram illustrating an example of a sixth operation of the processor;
- FIG. 20 is an explanatory diagram illustrating an example of a seventh operation of the processor;
- FIG. 22 is an explanatory diagram illustrating an example of an eighth operation of the processor;
- FIG. 22 is an explanatory diagram illustrating an example of a ninth operation of the processor;
- FIG. 22 is an explanatory diagram illustrating an example of a tenth operation of the processor;
- FIG. 22 is an explanatory diagram illustrating an example of an eleventh operation of the processor;
- It is a perspective view showing an example of the moving direction of the flight imaging device.
- FIG. 4 is an explanatory diagram illustrating a first example of a relationship between an image region in which feature points are compared between an image for synthesis and an image for display, and a moving direction of the flight imaging device;
- I/F is an abbreviation for "Interface".
- RAM is an abbreviation for "Random Access Memory".
- EEPROM is an abbreviation for "Electrically Erasable Programmable Read-Only Memory".
- CPU is an abbreviation for "Central Processing Unit".
- HDD is an abbreviation for "Hard Disk Drive".
- SSD is an abbreviation for "Solid State Drive".
- DRAM is an abbreviation for "Dynamic Random Access Memory".
- SRAM is an abbreviation for "Static Random Access Memory".
- CMOS is an abbreviation for "Complementary Metal Oxide Semiconductor".
- LiDAR is an abbreviation for "Light Detection And Ranging".
- GPU is an abbreviation for "Graphics Processing Unit".
- TPU is an abbreviation for "Tensor Processing Unit".
- USB is an abbreviation for "Universal Serial Bus".
- ASIC is an abbreviation for "Application Specific Integrated Circuit".
- FPGA is an abbreviation for "Field-Programmable Gate Array".
- PLD is an abbreviation for "Programmable Logic Device".
- SoC is an abbreviation for "System-on-a-chip".
- IC is an abbreviation for "Integrated Circuit".
- "perpendicular" means not only perfectly perpendicular but also perpendicular in a sense including an error that is generally allowed in the technical field to which the technology of the present disclosure belongs and that does not go against the gist of the technology of the present disclosure.
- "match" means not only a perfect match but also a match in a sense including an error that is generally allowed in the technical field to which the technology of the present disclosure belongs and that does not go against the gist of the technology of the present disclosure.
- the "vertical direction" means not only the perfectly vertical direction but also the vertical direction in a sense including an error that is generally allowed in the technical field to which the technology of the present disclosure belongs and that does not go against the gist of the technology of the present disclosure.
- the flight imaging device 1 has a flight function and an imaging function, and images the wall surface 2A of the object 2 while flying.
- the concept of “flight” includes not only the meaning that the flight imaging device 1 moves in the air, but also the meaning that the flight imaging device 1 stops in the air.
- the wall surface 2A is flat as an example.
- a plane refers to a two-dimensional surface (that is, a surface along a two-dimensional direction).
- the concept of "flat surface” does not include the meaning of a mirror surface.
- the wall surface 2A is a plane defined horizontally and vertically (that is, a plane extending horizontally and vertically).
- the wall surface 2A includes irregularities.
- the unevenness referred to here includes, for example, unevenness caused by flaws and/or defects, in addition to unevenness attributable to the material forming the wall surface 2A.
- the object 2 having the wall surface 2A is a pier provided on a bridge.
- the piers are made of reinforced concrete, for example.
- although a bridge pier is given as an example of the object 2 here, the object 2 may be an object other than a bridge pier (for example, a tunnel or a dam).
- the flight function of the flight imaging device 1 (hereinafter also simply referred to as "flight function”) is a function of the flight imaging device 1 flying based on a flight instruction signal.
- a flight instruction signal refers to a signal that instructs the flight imaging device 1 to fly.
- the flight instruction signal is transmitted, for example, from a transmitter 20 for operating the flight imaging device 1 .
- the transmitter 20 is operated by a user (not shown).
- the transmitter 20 includes a control unit 22 for operating the flight imaging device 1 and a display device 24 for displaying images captured by the flight imaging device 1 .
- the display device 24 is, for example, a liquid crystal display.
- the flight instruction signal is classified into a plurality of instruction signals, including a movement instruction signal that instructs the flight imaging device 1 to move and specifies its moving direction, and a stationary instruction signal that instructs the flight imaging device 1 to stand still.
- the imaging function of the flight imaging device 1 (hereinafter also simply referred to as “imaging function”) is a function of the flight imaging device 1 imaging a subject (as an example, the wall surface 2A of the target object 2).
- the flight imaging device 1 includes a flying object 10 and an imaging device 30.
- the flying object 10 is, for example, an unmanned aerial vehicle such as a drone.
- a flight function is realized by the aircraft 10 .
- the flying object 10 has a plurality of propellers 12 and flies by rotating the plurality of propellers 12 . Flying of the aircraft 10 is synonymous with flying of the flight imaging device 1 .
- the flying object 10 is an example of the “moving object” and the “flying object” according to the technology of the present disclosure.
- the imaging device 30 is, for example, a digital camera or a video camera. An imaging function is realized by the imaging device 30 .
- the imaging device 30 is an example of an “imaging device” according to the technology of the present disclosure.
- the imaging device 30 is mounted on the aircraft 10 . Specifically, the imaging device 30 is provided below the aircraft 10 .
- the imaging device 30 may be provided in the upper part or the front part of the flying object 10 or the like.
- the flight imaging device 1 sequentially images a plurality of areas 3 of the wall surface 2A.
- a region 3 is a region determined by the angle of view of the flight imaging device 1 .
- a rectangular area is shown as an example of the area 3 .
- a plurality of images for synthesis 92 are obtained by the imaging device 30 sequentially capturing images of the plurality of regions 3.
- a composite image 90 is generated by synthesizing the plurality of images for synthesis 92.
- a plurality of images for synthesis 92 are synthesized so that adjacent images for synthesis 92 partially overlap each other.
- the process of synthesizing a plurality of images for synthesis 92 may be performed by the flight imaging device 1 or may be performed by an external device (not shown) communicably connected to the flight imaging device 1 .
- the composite image 90 is used, for example, for inspecting or surveying the wall surface 2A of the object 2 .
- the plurality of images for synthesis 92 used to generate the composite image 90 also include images that have undergone projective transformation.
- an image that has undergone projective transformation refers to, for example, an image in which an image area distorted into a trapezoid or the like due to the posture (for example, a depression angle or an elevation angle) of the imaging device 30 has been corrected.
- the projective transformation is processing performed on an image obtained by the imaging device 30 imaging the wall surface 2A in a state in which the posture of the imaging device 30 is tilted with respect to the wall surface 2A (that is, a state in which the optical axis OA of the imaging device 30 is tilted with respect to the wall surface 2A).
- the distortion of the image caused by the depression angle or the elevation angle is corrected by performing the projective transformation. That is, an image obtained with the posture of the imaging device 30 tilted with respect to the wall surface 2A is converted as if it were an image obtained by imaging from a position facing the wall surface 2A.
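- as a rough illustration of such a correction (assuming OpenCV is available; the corner coordinates below are made-up values, not values from the disclosure), a trapezoidally distorted image region can be re-mapped to a fronto-parallel rectangle with a projective (perspective) transformation:

```python
import cv2
import numpy as np

# Hypothetical corner coordinates: where the wall region appears in the tilted
# image (src) and where it should map to in a fronto-parallel view (dst).
src = np.float32([[120, 80], [520, 60], [560, 440], [100, 400]])
dst = np.float32([[0, 0], [480, 0], [480, 360], [0, 360]])

image = cv2.imread("tilted_capture.jpg")               # image taken at a depression/elevation angle
H = cv2.getPerspectiveTransform(src, dst)              # 3x3 homography matrix
corrected = cv2.warpPerspective(image, H, (480, 360))  # as if imaged facing the wall surface
```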
- each area 3 is imaged by the imaging device 30 with the optical axis OA of the imaging device 30 perpendicular to the wall surface 2A.
- the plurality of regions 3 are imaged such that adjacent regions 3 partially overlap each other. This is for synthesizing adjacent images for synthesis 92 by using the feature points included in the overlapping portions.
- the ratio of the area of the overlapping portion to the total area of each region 3 is called the overlap ratio.
- the overlap ratio is set to the default overlap ratio.
- the default overlap rate is set to a rate (e.g., 30%) at which a sufficient amount of feature points for combining adjacent images for synthesis 92 can be obtained.
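- as a simple numeric illustration (the region sizes below are invented values), the overlap ratio of two adjacent regions 3 can be computed as the overlapping area divided by the area of one region; a minimal sketch:

```python
def overlap_ratio(region_a, region_b):
    """Ratio of the overlapping area to the area of one region.

    Each region is given as (x, y, width, height) in wall-surface coordinates.
    """
    ax, ay, aw, ah = region_a
    bx, by, bw, bh = region_b
    overlap_w = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    overlap_h = max(0, min(ay + ah, by + bh) - max(ay, by))
    return (overlap_w * overlap_h) / (aw * ah)

# Two adjacent regions shifted horizontally by 70% of their width
# overlap by 30%, matching the example default overlap rate above.
print(overlap_ratio((0, 0, 100, 80), (70, 0, 100, 80)))  # 0.3
```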
- the plurality of regions 3 are classified into regions 3 that have already been imaged by the flight imaging device 1 and regions 3 that have not yet been imaged by the flight imaging device 1.
- hereinafter, among the plurality of regions 3, a region 3 that has not yet been imaged is referred to as an imaging target region 3A, and a region 3 that has already been imaged is referred to as an imaged region 3B.
- the flight imaging device 1 images the plurality of regions 3 while moving in a zigzag manner by alternately repeating horizontal movement and vertical movement.
- the flight imaging device 1 images each region 3 in an order such that a part of the imaging target region 3A overlaps a part of the imaged region 3B imaged immediately before (for example, one frame before) the imaging target region 3A.
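- for illustration (the grid size and the exact ordering are assumptions, not taken from the disclosure), a zigzag imaging order over a grid of regions 3 can be generated as follows:

```python
def zigzag_order(n_cols, n_rows):
    """Row-by-row traversal that reverses the horizontal direction on every
    other row, giving a zigzag imaging order over a grid of regions 3."""
    order = []
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else range(n_cols - 1, -1, -1)
        order.extend((col, row) for col in cols)
    return order

# Example: 4 columns x 3 rows of regions.
print(zigzag_order(4, 3))
# [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (2, 1), (1, 1), (0, 1), (0, 2), ...]
```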
- the imaging device 30 includes a computer 32, a communication device 34, an image sensor 36, an image sensor driver 38, an imaging lens 40, an image memory 42, and an input/output I/F 44.
- the computer 32 includes a processor 46, a storage 48, and a RAM 50.
- the processor 46, the storage 48, and the RAM 50 are interconnected via a bus 52, and the bus 52 is connected to the input/output I/F 44.
- the input/output I/F 44 is connected with the communication device 34, the image sensor driver 38, the imaging lens 40, and the image memory 42.
- the computer 32 is an example of a "computer” according to the technology of the present disclosure.
- the processor 46 is an example of a "processor” according to the technology of the present disclosure.
- the processor 46 has, for example, a CPU and controls the imaging device 30 as a whole.
- the storage 48 is a nonvolatile storage device that stores various programs, various parameters, and the like.
- the storage 48 includes, for example, an HDD and/or a flash memory (e.g., an EEPROM and/or an SSD).
- the RAM 50 is a memory that temporarily stores information, and is used by the processor 46 as a work memory. Examples of the RAM 50 include DRAM and/or SRAM.
- the communication device 34 is communicably connected to the transmitter 20 as an example.
- the communication device 34 is wirelessly communicably connected to the transmitter 20 according to a predetermined wireless communication standard.
- the predefined wireless communication standard includes, for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark).
- the communication device 34 is in charge of exchanging information with the transmitter 20 .
- communication device 34 transmits information to transmitter 20 in response to a request from processor 46 .
- Communication device 34 also receives information transmitted from transmitter 20 and outputs the received information to processor 46 via bus 52 .
- the communication device 34 may be communicably connected to the transmitter 20 and/or the flying object 10.
- the image sensor 36 is connected with an image sensor driver 38 .
- Image sensor driver 38 controls image sensor 36 according to instructions from processor 46 .
- Image sensor 36 is, for example, a CMOS image sensor. Although a CMOS image sensor is exemplified as the image sensor 36 here, the technology of the present disclosure is not limited to this, and other image sensors may be used.
- the image sensor 36 captures an image of a subject (for example, the wall surface 2A of the target object 2) under the control of an image sensor driver 38, and outputs image data obtained by capturing the image.
- the imaging lens 40 is arranged on the subject side (object side) of the image sensor 36 .
- the imaging lens 40 captures subject light, which is reflected light from the subject, and forms an image of the captured subject light on the imaging surface of the image sensor 36 .
- the imaging lens 40 includes a plurality of optical elements (not shown) such as a focus lens, a zoom lens, and an aperture.
- the imaging lens 40 is connected to the computer 32 via an input/output I/F 44 .
- the plurality of optical elements included in the imaging lens 40 are connected to the input/output I/F 44 via a driving mechanism (not shown) having a power source.
- a plurality of optical elements included in imaging lens 40 operate under the control of computer 32 . In the imaging device 30, by operating a plurality of optical elements included in the imaging lens 40, adjustment of focus, optical zoom, shutter speed, and the like are realized.
- the image memory 42 temporarily stores image data generated by the image sensor 36 .
- the processor 46 acquires image data from the image memory 42 and uses the acquired image data to perform various processes.
- the storage 48 stores an imaging program 60 .
- the imaging program 60 is an example of a "program" according to the technology of the present disclosure.
- the processor 46 reads the imaging program 60 from the storage 48 and executes the read imaging program 60 on the RAM 50 .
- the processor 46 performs imaging processing for imaging a plurality of areas 3 (see FIG. 1) according to an imaging program 60 executed on the RAM 50 .
- the imaging process is realized by the processor 46 operating, according to the imaging program 60 executed on the RAM 50, as a first imaging control unit 62, a first feature point information generation unit 64, a first movement determination unit 66, a movement direction information generation unit 68, a second imaging control unit 70, a display control unit 72, a second feature point information generation unit 74, a feature point comparison unit 76, an overlap determination unit 78, and a second movement determination unit 80.
- the flying object 10 receives the movement instruction signal transmitted from the transmitter 20 in response to the user's operation, and moves to the initial imaging position based on the received movement instruction signal. Also, the flying object 10 receives a stationary instruction signal transmitted from the transmitter 20 in response to an operation by the user, and stands still at the initial imaging position based on the received stationary instruction signal.
- when the imaging device 30 receives the imaging start signal transmitted from the transmitter 20 in response to the user's operation, the imaging device 30 executes the imaging processing described below.
- the first imaging control unit 62 outputs a first imaging instruction signal to the image sensor 36 to cause the image sensor 36 to image the imaging target region 3A.
- Image data for synthesis is obtained by imaging the imaging target region 3A with the image sensor 36 under the control of the first imaging control section 62 .
- the synthesizing image data is image data representing the synthesizing image 92 .
- the image data for synthesis is stored in the storage 48 .
- the image for synthesis 92 indicated by the image data for synthesis shown in FIG. 4 is the first image for synthesis.
- the composition image 92 is an example of the "first image” according to the technology of the present disclosure.
- the synthesizing image 92 includes characteristic points corresponding to the unevenness of the imaging target area 3A.
- the feature point included in the image for synthesis 92 will be referred to as a "first feature point".
- the first feature point information generation unit 64 acquires the image for synthesis 92 based on the image data for synthesis stored in the storage 48, and divides the acquired image for synthesis 92 into a plurality of first divided regions 96.
- in this example, the image for synthesis 92 is evenly divided into the plurality of first divided regions 96.
- in this example, the number of the plurality of first divided regions 96 is nine.
- hereinafter, when the nine first divided regions 96 need to be distinguished from one another, they are referred to as the first divided regions 96-1 to 96-9 by using branch numbers 1 to 9.
- each of the plurality of first divided regions 96 is a rectangular region.
- the first feature point information generation unit 64 extracts, for each first divided region 96, the first feature points included in the entire area of the image for synthesis 92, and generates first feature point information indicating the coordinates of the extracted first feature points for each first divided region 96.
- the first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48 .
- the coordinates of the first feature point indicated by the first feature point information are derived, for example, by performing image processing (for example, high-frequency component extraction processing, etc.) on the synthesis image data.
- the coordinates of the first feature point are, for example, coordinates based on any one of the four vertices of the imaging target area 3A.
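- a minimal sketch of dividing an image into nine equal divided regions and collecting feature point coordinates per region (OpenCV's corner detector is used here only as a stand-in; the disclosure does not prescribe a particular feature extraction method):

```python
import cv2
import numpy as np

def divide_into_regions(image, rows=3, cols=3):
    """Evenly divide an image into rows x cols rectangular divided regions."""
    h, w = image.shape[:2]
    regions = {}
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            regions[(r, c)] = (x0, y0, x1, y1)
    return regions

def feature_points_per_region(gray, regions):
    """Extract feature point coordinates for each divided region of a grayscale image."""
    info = {}
    for key, (x0, y0, x1, y1) in regions.items():
        pts = cv2.goodFeaturesToTrack(gray[y0:y1, x0:x1], maxCorners=50,
                                      qualityLevel=0.01, minDistance=5)
        if pts is None:
            info[key] = np.empty((0, 2))
        else:
            # Shift back to whole-image coordinates (cf. coordinates referenced
            # to a vertex of the imaging target region 3A in the description).
            info[key] = pts.reshape(-1, 2) + np.array([x0, y0])
    return info
```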
- the aircraft 10 when the aircraft 10 receives the movement instruction signal transmitted from the transmitter 20 in response to the user's operation, the aircraft 10 moves based on the received movement instruction signal.
- the flying object 10 is moving horizontally based on the movement instruction signal.
- the moving direction of the flying object 10 is the right direction.
- the first movement determination unit 66 determines whether the flying object 10 is moving. When the imaging device 30 receives the movement instruction signal transmitted from the transmitter 20 in response to the user's operation, the first movement determination unit 66 determines, based on the movement instruction signal, that the flying object 10 is moving.
- the movement instruction signal includes information for instructing the movement direction of the aircraft 10 .
- the movement direction information generation unit 68 acquires the moving direction of the flying object 10 based on the movement instruction signal, and generates movement direction information indicating the acquired moving direction.
- the flying object 10 continues to move based on the received movement instruction signal while receiving the movement instruction signal transmitted from the transmitter 20 in response to the user's operation.
- the second imaging control unit 70 outputs a second imaging instruction signal to the image sensor 36 to cause the image sensor 36 to image the imaging target region 3A.
- Display image data is obtained by capturing an image of the image capturing target area 3A with the image sensor 36 under the control of the second image capturing control section 70 .
- the display image data is image data representing the display image 94 .
- the display image 94 is obtained by being imaged by the imaging device 30 when the flying object 10 moves from the position where the synthesis image 92 was obtained.
- the image data for display is stored in the image memory 42 .
- the display image 94 is an example of the "second image” and the "display image” according to the technology of the present disclosure.
- the display image 94 includes characteristic points corresponding to the unevenness of the imaging target area 3A.
- the feature points included in the display image 94 are referred to as "second feature points". Further, hereinafter, when it is not necessary to distinguish between the "first feature points" and the "second feature points", they are also simply referred to as "feature points".
- the display control unit 72 acquires display image data stored in the image memory 42 and executes various processes on the display image data. The display control unit 72 then outputs the display image data to the transmitter 20 .
- the transmitter 20 receives the display image data, and displays the display image 94 (that is, the live view image) on the display device 24 based on the received display image data.
- the second feature point information generation unit 74 acquires the display image 94 based on the display image data stored in the image memory 42 and divides the acquired display image 94 into a plurality of second divided areas 100 .
- the display image 94 is equally divided into a plurality of second divided regions 100 as an example.
- the number of the plurality of second divided regions 100 is the same as the number of the plurality of first divided regions 96. That is, in the example shown in FIG. 6, the number of the plurality of second divided regions 100 is nine.
- hereinafter, when the nine second divided regions 100 need to be distinguished from one another, they are referred to as the second divided regions 100-1 to 100-9 by using branch numbers 1 to 9.
- each of the plurality of second divided regions 100 is a rectangular region.
- the second feature point information generation unit 74 selects, as the second image area 102, the second divided regions 100 from which second feature points are to be extracted from among the plurality of second divided regions 100 obtained by dividing the display image 94. Specifically, the second feature point information generation unit 74 acquires the moving direction of the flying object 10 (hereinafter also simply referred to as the "moving direction") based on the movement direction information generated by the movement direction information generation unit 68, and selects, as the second image area 102, the second divided regions 100 located on the rear side in the moving direction from among the plurality of second divided regions 100. In the example shown in FIG. 6, three second divided regions 100 (as an example, the second divided region 100-1, the second divided region 100-4, and the second divided region 100-7) are selected as the second image area 102.
- the second feature point information generation unit 74 extracts second feature points included in the second image area 102 for each second image area 102 .
- a second feature point is a feature point included only in the second image area 102 .
- the second feature point information generation unit 74 generates second feature point information indicating the coordinates of the second feature points for each of the extracted second image regions 102 .
- the second feature point information generated by the second feature point information generating section 74 is stored in the storage 48 .
- the coordinates of the second feature points extracted by the second feature point information generating section 74 are derived by the same method as the coordinates of the first feature points extracted by the first feature point information generating section 64 .
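- a minimal sketch (the grid indexing and direction names are assumptions) of selecting, from a 3x3 grid of divided regions, the regions on the rear side of the moving direction for the display image 94 and the regions on the front side for the image for synthesis 92:

```python
def select_divided_regions(moving_direction, side, rows=3, cols=3):
    """Return (row, col) indices of the divided regions on the given side
    ('front' or 'rear') of the moving direction in a rows x cols grid."""
    front = {"right": [(r, cols - 1) for r in range(rows)],
             "left":  [(r, 0) for r in range(rows)],
             "down":  [(rows - 1, c) for c in range(cols)],
             "up":    [(0, c) for c in range(cols)]}
    opposite = {"right": "left", "left": "right", "down": "up", "up": "down"}
    direction = moving_direction if side == "front" else opposite[moving_direction]
    return front[direction]

# Moving right: the second image area is the rear (left) column, corresponding
# to regions 100-1, 100-4, 100-7, and the first image area is the front (right)
# column, corresponding to regions 96-3, 96-6, 96-9.
print(select_divided_regions("right", side="rear"))   # [(0, 0), (1, 0), (2, 0)]
print(select_divided_regions("right", side="front"))  # [(0, 2), (1, 2), (2, 2)]
```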
- FIG. 7 shows the operation of the feature point comparison unit 76 when the first image for synthesis is obtained as the image for synthesis 92.
- in this case, the first feature point information stored in the storage 48 in correspondence with the first image for synthesis is information indicating the coordinates of the first feature points for each first divided region 96 over the entire area of the image for synthesis 92.
- the feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. Further, when the first image for synthesis is obtained as the image for synthesis 92, the feature point comparison unit 76 acquires the movement direction information generated by the movement direction information generation unit 68. Then, based on the acquired movement direction information, the feature point comparison unit 76 selects, as the first image area 98, the first divided regions 96 located on the front side in the moving direction from among the plurality of first divided regions 96 of the image for synthesis 92. In the example shown in FIG. 7, three first divided regions 96 (as an example, the first divided region 96-3, the first divided region 96-6, and the first divided region 96-9) are selected as the first image area 98.
- the feature point comparison unit 76 extracts the first feature points included in the first image area 98 from the first feature point information.
- the first feature point is a feature point included only in the first image area 98 .
- the feature point comparison unit 76 compares the first feature points included in the first image area 98 with the second feature points included in the second image area 102. That is, of the image for synthesis 92 and the display image 94 captured at different times, the feature point comparison unit 76 compares the first feature points included in the first image area 98 of the image for synthesis 92 with the second feature points included in the second image area 102 of the display image 94.
- the feature point comparison unit 76 generates comparison result information indicating a result of comparing the first feature point and the second feature point.
- the comparison result information is specifically information about the result of comparing the coordinates of the first feature point and the coordinates of the second feature point.
- in this case, the area of the three first image regions 98 and the area of the three second image regions 102 are each set to be larger than the overlap area, which is the area over which adjacent images for synthesis 92 overlap when they are synthesized.
- the overlap determination unit 78 acquires the comparison result information generated by the feature point comparison unit 76, and determines, based on the acquired comparison result information, whether or not the overlap rate of the overlap region 95, which is the region where the image for synthesis 92 and the display image 94 overlap, is equal to or less than a predetermined threshold value.
- the overlap ratio refers to, for example, the ratio of the area of the overlap region 95 to the area of one frame (for example, the area of the image for synthesis 92 or the area of the image for display 94).
- An overlap region 95 between the image for synthesis 92 and the image for display 94 corresponds to the overlap region 5 where the regions 3 overlap each other.
- the predetermined threshold value is set in consideration of the efficiency of sequentially imaging the plurality of regions 3 and the aforementioned default overlap rate for synthesizing adjacent images for synthesis 92 (see FIG. 1).
- the predetermined threshold is set to 50% or less in consideration of the efficiency when sequentially imaging a plurality of regions 3 .
- further, the predetermined threshold value is set to a value greater than the default overlap rate (30% as an example). Note that the moving speed of the flying object 10 is set to a speed at which the overlap determination unit 78 can perform at least one determination before the overlap rate between the image for synthesis 92 and the display image 94 falls to the default overlap rate.
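- for illustration (a translation-only model that assumes corresponding feature points have already been matched; the disclosure does not fix a particular comparison or matching method), the overlap rate between the image for synthesis 92 and the display image 94 could be estimated from the compared coordinates roughly as follows:

```python
import numpy as np

def estimate_overlap_rate(first_pts, second_pts, image_w, image_h):
    """Estimate the overlap rate from matched feature point coordinates,
    assuming a pure translation between the two images."""
    shift = np.median(second_pts - first_pts, axis=0)  # robust per-axis shift estimate
    dx, dy = np.abs(shift)
    overlap_w = max(0.0, image_w - dx)
    overlap_h = max(0.0, image_h - dy)
    return (overlap_w * overlap_h) / (image_w * image_h)

def should_capture_next(first_pts, second_pts, image_w, image_h, threshold=0.5):
    """True when the estimated overlap rate is at or below the predetermined threshold."""
    return estimate_overlap_rate(first_pts, second_pts, image_w, image_h) <= threshold
```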
- when the overlap determination unit 78 determines that the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value, the first imaging control unit 62 causes the image sensor 36 to image the imaging target region 3A by outputting the first imaging instruction signal to the image sensor 36.
- Image data for synthesis is obtained by imaging the imaging target region 3A with the image sensor 36 under the control of the first imaging control section 62 .
- the image for synthesis 92 indicated by the image data for synthesis shown in FIG. 9 is an image for synthesis 92 obtained after the above-described first image for synthesis (see FIG. 4).
- an image for synthesis obtained after the first image for synthesis is hereinafter referred to as the N-th image for synthesis (where N ≥ 2).
- the image data for synthesis representing the Nth image for synthesis is stored in the storage 48 .
- note that the first imaging control unit 62 may cause the imaging device 30 to transmit an imaging permission signal to the transmitter 20. Further, when the imaging permission signal is received, the transmitter 20 may display, on the display device 24, characters or the like indicating that imaging is permitted. Then, when the imaging device 30 receives an imaging instruction signal transmitted from the transmitter 20 by the user in accordance with the characters or the like displayed on the display device 24, the first imaging control unit 62 may cause the image sensor 36 to image the imaging target region 3A by outputting the first imaging instruction signal to the image sensor 36.
- the aircraft 10 continues to move based on the received movement instruction signal while receiving the movement instruction signal as the flight instruction signal.
- the second movement determination unit 80 determines whether the flying object 10 continues to move, based on the movement instruction signal received by the imaging device 30 and the movement direction information generated by the movement direction information generation unit 68. When the movement instruction signal is received by the imaging device 30 and the moving direction indicated by the movement instruction signal matches the moving direction indicated by the movement direction information, the second movement determination unit 80 determines that the flying object 10 continues to move.
- when the second movement determination unit 80 determines that the flying object 10 continues to move, the first feature point information generation unit 64 acquires the image for synthesis 92 (that is, the N-th image for synthesis) based on the latest image data for synthesis stored in the storage 48, and divides the acquired image for synthesis 92 into a plurality of first divided regions 96.
- in this case, the image for synthesis 92 is evenly divided into the plurality of first divided regions 96.
- the number of the plurality of first divided regions 96 is the same as the number of divisions of the first image for synthesis. That is, in the example shown in FIG. 10, the number of the plurality of first divided regions 96 is nine.
- each of the plurality of first divided regions 96 is a rectangular region.
- when the second movement determination unit 80 determines that the flying object 10 continues to move, the first feature point information generation unit 64 selects, as the first image area 98, the first divided regions 96 from which first feature points are to be extracted from among the plurality of first divided regions 96 obtained by dividing the image for synthesis 92. Specifically, the first feature point information generation unit 64 acquires the moving direction of the flying object 10 based on the movement direction information generated by the movement direction information generation unit 68, and selects, as the first image area 98, the first divided regions 96 located on the front side in the moving direction from among the plurality of first divided regions 96. In the example shown in FIG. 10, three first divided regions 96 (as an example, the first divided region 96-3, the first divided region 96-6, and the first divided region 96-9) are selected as the first image area 98.
- the first feature point information generation unit 64 extracts first feature points included in the first image area 98 for each first image area 98 .
- the first feature point is a feature point included only in the first image area 98 .
- in this manner, when the flying object 10 is moving, the first feature point information generation unit 64 extracts, for each first image area 98, the first feature points included in the first image area 98, which is a partial image area of the image for synthesis 92.
- the first feature point information generating section 64 generates first feature point information indicating the coordinates of the first feature points for each of the extracted first image regions 98 .
- the first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48 .
- FIG. 11 shows the operation of the feature point comparison unit 76 when the N-th image for synthesis (see FIG. 9) is obtained as the image for synthesis 92 while the flying object 10 continues to move.
- in this case, the first feature point information corresponding to the N-th image for synthesis is information indicating the coordinates of the first feature points for each first image area 98, which is the first divided region 96 located on the front side in the moving direction among the plurality of first divided regions 96.
- the second feature point information is information indicating the coordinates of the second feature points for each second image area 102, which is the second divided region 100 located on the rear side in the moving direction among the plurality of second divided regions 100.
- the feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. Then, when the flying object 10 is moving, the feature point comparison unit 76 compares, based on the first feature point information and the second feature point information, the first feature points included in the first image area 98 with the second feature points included in the second image area 102, and generates comparison result information indicating the result of the comparison.
- when the overlap determination unit 78 determines that the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value, the first imaging control unit 62 causes the image sensor 36 to image the imaging target region 3A by outputting the first imaging instruction signal to the image sensor 36. Image data for synthesis is obtained by the image sensor 36 imaging the imaging target region 3A under the control of the first imaging control unit 62. The image for synthesis 92 indicated by the image data for synthesis shown in FIG. 11 is the N-th image for synthesis. The image data for synthesis representing the N-th image for synthesis is stored in the storage 48.
- in response to the user's operation, a stationary instruction signal is transmitted from the transmitter 20 to the flying object 10 as the flight instruction signal.
- the second movement determination unit 80 determines that the flying object 10 has not continued to move, that is, the flying object 10 has stopped moving.
- when the second movement determination unit 80 determines that the flying object 10 has stopped moving, the first feature point information generation unit 64 acquires the image for synthesis 92 (that is, the N-th image for synthesis) based on the latest image data for synthesis stored in the storage 48, and divides the acquired image for synthesis 92 into a plurality of first divided regions 96.
- The image for synthesis 92 is evenly divided into a plurality of first divided regions 96.
- The number of the plurality of first divided regions 96 is the same as the number of divisions of the first image for synthesis. That is, in the example shown in FIG. 12, the number of the plurality of first divided regions 96 is nine.
- each of the plurality of first divided regions 96 is a rectangular region.
- The first feature point information generation unit 64 extracts the first feature points included in the entire area of the image for synthesis 92 for each first divided area 96, and generates first feature point information indicating the coordinates of the first feature points for each of the extracted first divided areas 96.
- the first feature point information generator 64 extracts the first feature points included in the entire area of the synthesis image 92 for each first divided area 96 when the flying object 10 stops moving.
- the first feature point information generating section 64 generates first feature point information indicating the coordinates of the first feature point for each of the extracted first divided areas 96 .
- the first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48 .
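- A minimal sketch of the even nine-way division described above, assuming the image is held as a NumPy array; the helper pairs naturally with the per-region extraction sketch shown earlier:

```python
def split_into_grid(image, rows=3, cols=3):
    """Evenly divide an image into rows x cols rectangular regions and return
    their (row_slice, col_slice) pairs, mirroring the nine first divided
    regions 96 of this example."""
    h, w = image.shape[:2]
    slices = []
    for r in range(rows):
        for c in range(cols):
            rs = slice(r * h // rows, (r + 1) * h // rows)
            cs = slice(c * w // cols, (c + 1) * w // cols)
            slices.append((rs, cs))
    return slices


# While the moving body is stopped, features can be taken over all nine cells:
# features = extract_region_feature_points(synthesis_image, split_into_grid(synthesis_image))
```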
- FIG. 14 shows the operation of the feature point comparison unit 76 when the flying object 10 temporarily stops moving after the N-th image for synthesis (see FIG. 12) is obtained as the image for synthesis 92 and then starts moving again.
- the flying object 10 is moving vertically based on the movement instruction signal. Specifically, the moving direction of the flying object 10 is downward.
- the first feature point information corresponding to the Nth image for synthesis indicates the coordinates of the first feature point for each first divided area 96 for the entire area of the image for synthesis 92.
- The second feature point information is information indicating the coordinates of the second feature points for each second image region 102, which is the second segmented region 100 located on the rear side in the moving direction among the plurality of second segmented regions 100.
- The feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. Further, when the flying object 10 starts moving after having stopped, the feature point comparison unit 76 acquires the movement direction information generated by the movement direction information generation unit 68 and, based on the acquired movement direction information, selects the first divided areas 96 located on the front side in the moving direction of the image for synthesis 92 from among the plurality of first divided areas 96 as the first image area 98.
- Three first segmented regions 96 (for example, the first segmented region 96-7, the first segmented region 96-8, and the first segmented region 96-9) are selected as the first image area 98.
- The feature point comparison unit 76 extracts the first feature points included in the first image area 98 from the first feature point information. Then, the feature point comparison unit 76 compares the first feature points included in the first image area 98 with the second feature points included in the second image area 102, and generates comparison result information indicating the comparison result.
- the flight imaging device 1 may image the wall surface 2A while moving in all directions along the wall surface 2A of the object 2 .
- the movement directions of the flight imaging device 1 are classified into eight directions along the wall surface 2A of the object 2.
- rightward, leftward, upward, downward, upper right, lower right, upper left, and lower left directions are shown as an example of the moving direction of the flight imaging device 1 . That is, the flight imaging device 1 moves along the designated direction among the right direction, left direction, upward direction, downward direction, upper right direction, lower right direction, upper left direction, and lower left direction.
- The relationship between the first image area 98 and the second image area 102 whose feature points are compared by the feature point comparison unit 76 according to the movement direction of the flying object 10 is shown.
- When the moving direction of the flying object 10 is the rightward direction, the feature point comparison unit 76 compares the first feature points included in the first segmented regions 96-3, 96-6, and 96-9 as the first image region 98 with the second feature points included in the second segmented regions 100-1, 100-4, and 100-7 as the second image region 102.
- When the moving direction of the flying object 10 is the leftward direction, the feature point comparison unit 76 compares the first feature points included in the first segmented regions 96-1, 96-4, and 96-7 as the first image region 98 with the second feature points included in the second segmented regions 100-3, 100-6, and 100-9 as the second image region 102.
- When the moving direction of the flying object 10 is the upward direction, the feature point comparison unit 76 compares the first feature points included in the first segmented regions 96-1, 96-2, and 96-3 as the first image region 98 with the second feature points included in the second segmented regions 100-7, 100-8, and 100-9 as the second image region 102.
- When the moving direction of the flying object 10 is the downward direction, the feature point comparison unit 76 compares the first feature points included in the first segmented regions 96-7, 96-8, and 96-9 as the first image region 98 with the second feature points included in the second segmented regions 100-1, 100-2, and 100-3 as the second image region 102.
- The relationship between the first image area 98 and the second image area 102 whose feature points are compared by the feature point comparison unit 76 according to the movement direction of the flying object 10 is likewise shown for the oblique movement directions.
- When the moving direction of the flying object 10 is the upper right direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-2, 96-3, and 96-6 as the first image region 98 with the second feature points included in the second divided regions 100-4, 100-7, and 100-8 as the second image region 102. Further, when the moving direction of the flying object 10 is the lower right direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-6, 96-8, and 96-9 as the first image region 98 with the second feature points included in the second divided regions 100-1, 100-2, and 100-4 as the second image region 102.
- When the moving direction of the flying object 10 is the upper left direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-1, 96-2, and 96-4 as the first image region 98 with the second feature points included in the second divided regions 100-6, 100-8, and 100-9 as the second image region 102.
- When the moving direction of the flying object 10 is the lower left direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-4, 96-7, and 96-8 as the first image region 98 with the second feature points included in the second divided regions 100-2, 100-3, and 100-6 as the second image region 102. In this way, the first image region 98 and the second image region 102 whose feature points are compared by the feature point comparison unit 76 are changed according to the moving direction of the flying object 10.
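- The direction-to-region mapping described above can be summarized as a lookup table. In the sketch below, cells are numbered 1 to 9 in row-major order (1 at the upper left, 9 at the lower right), mirroring how the first divided regions 96-1 to 96-9 and the second divided regions 100-1 to 100-9 are referenced; the dictionary names are illustrative only:

```python
# Grid cells are numbered 1..9 in row-major order (1 = upper left, 9 = lower right).
FRONT_CELLS = {   # first image area 98: front side in the moving direction
    "right":       (3, 6, 9),
    "left":        (1, 4, 7),
    "up":          (1, 2, 3),
    "down":        (7, 8, 9),
    "upper_right": (2, 3, 6),
    "lower_right": (6, 8, 9),
    "upper_left":  (1, 2, 4),
    "lower_left":  (4, 7, 8),
}
REAR_CELLS = {    # second image area 102: rear side in the moving direction
    "right":       (1, 4, 7),
    "left":        (3, 6, 9),
    "up":          (7, 8, 9),
    "down":        (1, 2, 3),
    "upper_right": (4, 7, 8),
    "lower_right": (1, 2, 4),
    "upper_left":  (6, 8, 9),
    "lower_left":  (2, 3, 6),
}
```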
- FIG. 18 shows an example of the flow of imaging processing according to this embodiment.
- In step ST10, the first imaging control unit 62 causes the image sensor 36 to image the imaging target area 3A (see FIG. 4). Image data for synthesis is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the first imaging control unit 62. After the process of step ST10 is executed, the imaging process proceeds to step ST12.
- In step ST12, the first feature point information generation unit 64 extracts the first feature points included in the entire area of the image for synthesis 92 for each first divided area 96 based on the image data for synthesis acquired in step ST10, and generates first feature point information indicating the coordinates of the first feature points for each of the extracted first divided areas 96 (see FIG. 4). After the process of step ST12 is executed, the imaging process proceeds to step ST14.
- In step ST14, the first movement determination unit 66 determines whether or not the flying object 10 is moving (see FIG. 5). In step ST14, if the flying object 10 is moving, the determination is affirmative, and the imaging process proceeds to step ST16. In step ST14, if the flying object 10 is not moving, the determination is negative, and the determination of step ST14 is executed again.
- In step ST16, the movement direction information generation unit 68 acquires the movement direction of the flying object 10 and generates movement direction information indicating the acquired movement direction (see FIG. 5). After the process of step ST16 is executed, the imaging process proceeds to step ST18.
- In step ST18, the second imaging control unit 70 causes the image sensor 36 to image the imaging target area 3A (see FIG. 6). Display image data is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the second imaging control unit 70. After the process of step ST18 is executed, the imaging process proceeds to step ST20.
- In step ST20, the display control unit 72 causes the display device 24 to display the display image 94 (that is, the live view image) based on the display image data obtained in step ST18 (see FIG. 6). After the process of step ST20 is executed, the imaging process proceeds to step ST22.
- In step ST22, the second feature point information generation unit 74 generates second feature point information indicating the coordinates of the second feature points for each second image region 102 of the display image 94, based on the movement direction information obtained in step ST16 and the display image data obtained in step ST18 (see FIG. 6). After the process of step ST22 is executed, the imaging process proceeds to step ST24.
- In step ST24, the feature point comparison unit 76 compares the first feature points included in the first image region 98 of the image for synthesis 92 with the second feature points included in the second image region 102 of the display image 94, based on the first feature point information generated in step ST12 and the second feature point information generated in step ST22, and generates comparison result information indicating the comparison result (see FIG. 7).
- the above description is for the case where the process of step ST24 is the first process. If the process of step ST24 is the second or subsequent process, the feature point comparison unit 76 compares the first feature point information generated in step ST32 or step ST34 described later and the second feature point information generated in step ST22. Based on the point information, the first feature point and the second feature point are compared, and comparison result information indicating the comparison result is generated (see FIG. 11 or FIG. 14). After the process of step ST24 is executed, the imaging process proceeds to step ST26.
- In step ST26, the overlap determination unit 78 acquires the comparison result information generated in step ST24 and, based on the acquired comparison result information, determines whether or not the overlap ratio between the image for synthesis 92 and the image for display 94 is equal to or less than the predetermined threshold value (see FIG. 8). In step ST26, if the overlap ratio between the image for synthesis 92 and the image for display 94 is equal to or less than the predetermined threshold value, the determination is affirmative, and the imaging process proceeds to step ST28. In step ST26, if the overlap ratio between the image for synthesis 92 and the image for display 94 exceeds the predetermined threshold value, the determination is negative, and the imaging process proceeds to step ST16.
- In step ST28, the first imaging control unit 62 causes the image sensor 36 to image the imaging target area 3A (see FIG. 9 or FIG. 12). Image data for synthesis is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the first imaging control unit 62. After the process of step ST28 is executed, the imaging process proceeds to step ST30.
- In step ST30, the second movement determination unit 80 determines whether or not the flying object 10 continues to move, based on the movement instruction signal received by the imaging device 30 and the movement direction information generated in step ST16 (see FIG. 10 or FIG. 13). In step ST30, if the flying object 10 continues to move, the determination is affirmative, and the imaging process proceeds to step ST32. In step ST30, if the flying object 10 has not continued to move, that is, if the flying object 10 has stopped moving, the determination is negative, and the imaging process proceeds to step ST34.
- In step ST32, the first feature point information generation unit 64 generates first feature point information indicating the coordinates of the first feature points for each first image region 98 of the image for synthesis 92, based on the moving direction information obtained in step ST16 and the image data for synthesis obtained in step ST28 (see FIG. 10).
- In step ST34, the first feature point information generation unit 64 extracts the first feature points included in the entire area of the image for synthesis 92 for each first divided area 96 based on the image data for synthesis acquired in step ST28, and generates first feature point information indicating the coordinates of the first feature points for each of the extracted first divided areas 96 (see FIG. 13). After the process of step ST34 is executed, the imaging process proceeds to step ST36.
- In step ST36, the processor 46 determines whether or not a condition for ending the imaging process (end condition) is satisfied. Examples of the end condition include a condition that the user has given the imaging device 30 an instruction to end the imaging process, or a condition that the number of images for synthesis 92 has reached a number specified by the user. In step ST36, if the end condition is not satisfied, the determination is negative, and the imaging process proceeds to step ST14. In step ST36, if the end condition is satisfied, the determination is affirmative, and the imaging process ends.
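- The ST10 to ST36 flow can be condensed into the following simplified sketch. Every callable passed in is a hypothetical stand-in for one of the units described above (imaging control units, feature point information generation units, comparison unit, determination units), not an API of the disclosure:

```python
def imaging_process(capture_synth, capture_live, show, extract_all,
                    extract_front, extract_rear, compare, overlap_of,
                    get_direction, wait_until_moving, still_moving,
                    end_condition, save, threshold=0.3):
    """Simplified sketch of the ST10-ST36 flow; all callables are assumed helpers."""
    synth = capture_synth()                                   # ST10
    first_info = extract_all(synth)                           # ST12
    while True:
        wait_until_moving()                                   # ST14
        while True:
            direction = get_direction()                       # ST16
            live = capture_live()                             # ST18
            show(live)                                        # ST20 (live view)
            second_info = extract_rear(live, direction)       # ST22
            result = compare(first_info, second_info)         # ST24
            if overlap_of(result) <= threshold:               # ST26: time for the next capture
                break
        synth = capture_synth()                               # ST28
        save(synth)
        if still_moving():                                    # ST30
            first_info = extract_front(synth, direction)      # ST32
        else:
            first_info = extract_all(synth)                   # ST34
        if end_condition():                                   # ST36
            return
```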
- the imaging method described as the operation of the flight imaging device 1 described above is an example of the “imaging method” according to the technology of the present disclosure.
- The processor 46 compares the first feature points included in the image for synthesis 92 with the second feature points included in the image for display 94, and determines the overlap ratio between the image for synthesis 92 and the image for display 94 based on the result of the comparison.
- When determining the overlap ratio, the processor 46 changes, according to the moving direction, the first image region 98, which is an image region included in the image for synthesis 92 and used for obtaining the first feature points, and the second image region 102, which is an image region included in the image for display 94 and used for obtaining the second feature points. Therefore, the load required for the processing of the processor 46 when determining the overlap ratio can be reduced.
- The processor 46 determines the overlap ratio between the image for synthesis 92 and the image for display 94 based on the result of comparing the first feature points included in the first image area 98 of the image for synthesis 92 with the second feature points included in the second image area 102 of the image for display 94. Therefore, the determination accuracy of the overlap ratio can be ensured in the same way as, for example, when the overlap ratio between the image for synthesis 92 and the image for display 94 is determined based on the result of comparing the first feature points included in the entire area of the image for synthesis 92 with the second feature points included in the entire area of the image for display 94.
- the processor 46 causes the display device 24 to display the display image 94 . Therefore, based on the display image 94, the position of the imaging target area 3A can be confirmed.
- The first image area 98 is a partial image area of the image for synthesis 92, and the second image area 102 is a partial image area of the image for display 94. Therefore, for example, compared with the case of comparing the first feature points included in the entire area of the first image with the second feature points included in the entire area of the second image, the load required for the processing of the processor 46 when comparing the first feature points with the second feature points can be reduced.
- The first image area 98 is an area located on the front side in the movement direction in the image for synthesis 92, and the second image area 102 is an area located on the rear side in the movement direction in the image for display 94. Therefore, the overlap ratio between the image for synthesis 92 and the image for display 94 can be determined by comparing the first feature points included in the first image area 98 with the second feature points included in the second image area 102.
- The processor 46 selects the first image region 98 from a plurality of first divided regions 96 obtained by dividing the image for synthesis 92. Therefore, for example, compared with setting the first image region 98 from the image for synthesis 92 in a state in which the image for synthesis 92 is not divided into the plurality of first divided regions 96, the first image region 98 can be set by simpler processing.
- The processor 46 selects the second image region 102 from a plurality of second divided regions 100 obtained by dividing the image for display 94. Therefore, for example, compared with setting the second image region 102 from the image for display 94 in a state in which the image for display 94 is not divided into the plurality of second divided regions 100, the second image region 102 can be set by simpler processing.
- The first image region 98 includes three first divided regions 96 out of the plurality of first divided regions 96, and the second image region 102 includes three second divided regions 100 out of the plurality of second divided regions 100. Therefore, for example, compared with the case where the first image region 98 includes fewer than three of the plurality of first divided regions 96 and the second image region 102 includes fewer than three of the plurality of second divided regions 100, the determination accuracy of the overlap ratio can be improved.
- The first image area 98 and the second image area 102 are each rectangular areas. Therefore, for example, the first feature points and the second feature points can be compared with simpler processing than when the first image area 98 and the second image area 102 are areas with shapes other than rectangles.
- The first feature points are feature points included only in the first image area 98, and the second feature points are feature points included only in the second image area 102. Therefore, for example, the load required for the processing of the processor 46 when determining the overlap ratio can be reduced.
- When the flying object 10 is moving, the processor 46 extracts the first feature points included in the first image area 98, which is a partial image area of the image for synthesis 92. Therefore, for example, compared with the case where the processor 46 extracts the first feature points included in the entire area of the image for synthesis 92, the load required for the processing of the processor 46 can be reduced.
- When the flying object 10 is stationary, the processor 46 extracts the feature points included in the entire area of the image for synthesis 92. Therefore, the first feature points included in the first image area 98 of the image for synthesis 92 can be extracted regardless of the direction in which the flying object 10 moves from the stationary state.
- The area of the first image area 98 and the area of the second image area 102 are each set larger than the area that overlaps when adjacent images for synthesis 92 are synthesized (hereinafter simply referred to as the "overlap area"). Therefore, determination accuracy can be ensured when determining the overlap ratio between the image for synthesis 92 and the image for display 94 based on the first feature points and the second feature points.
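- As a small illustrative check of this sizing rule, the helper below compares the combined area of the selected grid cells with an intended overlap band; the overlap fraction is an assumed design value, not taken from the disclosure:

```python
def regions_cover_overlap(image_height, image_width, cells, grid=(3, 3),
                          overlap_fraction=0.3):
    """Check that the combined area of the selected grid cells is at least as
    large as the intended overlap band between adjacent images for synthesis.
    `overlap_fraction` is an assumed design value, not from the disclosure."""
    rows, cols = grid
    cell_area = (image_height // rows) * (image_width // cols)
    selected_area = cell_area * len(cells)
    overlap_area = int(image_height * image_width * overlap_fraction)
    return selected_area >= overlap_area
```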
- the imaging device 30 is mounted on the aircraft 10 . Therefore, the imaging device 30 can image the imaging target region 3A while the aircraft 10 is flying.
- In the above embodiment, the first image region 98 includes three first divided regions 96 out of the plurality of first divided regions 96, and the second image region 102 includes three second divided regions 100 out of the plurality of second divided regions 100. However, as shown in FIG. 19 as an example, the first image region 98 may be one first divided region 96 of the plurality of first divided regions 96, and the second image region 102 may be one second divided region 100 of the plurality of second divided regions 100. In this case, the feature point comparison unit 76 may compare the first feature points included in the first image region 98 that is one first divided region 96 with the second feature points included in the second image region 102 that is one second divided region 100, and generate comparison result information indicating the comparison result.
- According to this example, compared with the case where the feature point comparison unit 76 compares the first feature points included in a first image area 98 consisting of three first segmented areas 96 with the second feature points included in a second image area 102 consisting of three second segmented areas 100, the load required for the processing of the processor 46 when comparing the first feature points with the second feature points can be reduced.
- Also, the first image region 98 may be two first divided regions 96 of the plurality of first divided regions 96, and the second image region 102 may be two second divided regions 100 of the plurality of second divided regions 100. In this case, the feature point comparison unit 76 may compare the first feature points included in the first image region 98 that is two first divided regions 96 with the second feature points included in the second image region 102 that is two second divided regions 100, and generate comparison result information indicating the comparison result.
- According to this example as well, compared with the case where the feature point comparison unit 76 compares the first feature points included in a first image region 98 consisting of three first divided regions 96 with the second feature points included in a second image region 102 consisting of three second divided regions 100, the load required for the processing of the processor 46 when comparing the first feature points with the second feature points can be reduced.
- Further, the first image region 98 may include four or more first divided regions 96 out of the plurality of first divided regions 96, and the second image region 102 may include four or more second divided regions 100 out of the plurality of second divided regions 100.
- In the above embodiment, the first image area 98 and the second image area 102 are each rectangular areas. However, when the moving direction is inclined with respect to the horizontal or vertical direction of the image, the processor 46 may set, as the first image region 98, one first triangular area 104 out of the plurality of first triangular areas 104 including the four corners of the image for synthesis 92 (that is, the first triangular area 104 containing the corner located on the front side in the moving direction among the four corners of the image for synthesis 92). Similarly, the processor 46 may set, as the second image area 102, one second triangular area 106 out of the plurality of second triangular areas 106 including the four corners of the display image 94 (that is, the second triangular area 106 containing the corner located on the rear side in the moving direction among the four corners of the display image 94).
- Also in this case, the feature point comparison unit 76 can generate comparison result information by comparing the first feature points included in the first image area 98 with the second feature points included in the second image area 102.
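- One possible way to realize such triangular areas in software is a point-in-triangle mask; how the triangular areas 104 and 106 are actually delimited is defined by the figures of the disclosure, so the helper below only shows how feature extraction could be restricted to a triangle once its vertices are known:

```python
import numpy as np


def triangle_mask(height, width, vertices):
    """Boolean mask of the triangle spanned by three (x, y) vertices.
    Purely illustrative: use it to keep only feature points that fall
    inside a chosen corner triangle of the image."""
    ys, xs = np.mgrid[0:height, 0:width]
    (x0, y0), (x1, y1), (x2, y2) = vertices

    def edge_sign(ax, ay, bx, by):
        # z-component of the cross product (b - a) x (p - a) for every pixel p
        return (bx - ax) * (ys - ay) - (by - ay) * (xs - ax)

    d0 = edge_sign(x0, y0, x1, y1)
    d1 = edge_sign(x1, y1, x2, y2)
    d2 = edge_sign(x2, y2, x0, y0)
    has_neg = (d0 < 0) | (d1 < 0) | (d2 < 0)
    has_pos = (d0 > 0) | (d1 > 0) | (d2 > 0)
    return ~(has_neg & has_pos)  # inside the triangle (edges included)
```

- For example, a triangle containing the lower right corner of an h x w image could be masked with triangle_mask(h, w, [(w, h), (0, h), (w, 0)]), and feature points would then be detected only where the mask is true.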
- In the above embodiment, the second imaging control unit 70 causes the image sensor 36 to output display image data from its entire area. However, the second imaging control unit 70 may acquire the movement direction of the flying object 10 based on the movement instruction signal and, using the cropping function of the image sensor 36, cause the image data for display to be output only from the area 36A located on the front side in the moving direction among all the pixels of the image sensor 36. That is, when acquiring the display image 94, the second imaging control unit 70 may cause the image sensor 36 to output only the image data used for comparing the first feature points with the second feature points.
- In this case, the second feature point information generation unit 74 may generate the second feature point information regarding the second image region 102 of the display image 94 based on the display image data output only from the area 36A located on the front side in the movement direction within the entire area of the image sensor 36.
- According to this example, compared with the case where the display image data is output from the entire area of the image sensor 36, the load required for the processing of the processor 46 when generating the second feature point information regarding the second image area 102 of the display image 94 can be reduced.
- Alternatively, when acquiring the display image 94, the processor 46 may acquire, out of the display image data output from the entire area of the image sensor 36, only the image data used for comparing the first feature points with the second feature points. According to this example, compared with the case where the second feature point information about the entire area of the display image 94 is generated based on the display image data output from the entire area of the image sensor 36, the load required for the processing of the processor 46 when generating the second feature point information can be reduced.
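- A software-side analogue of this cropping is to slice the display frame down to the band that actually contains the second image area 102 before any feature extraction; the sensor-side crop of the area 36A achieves the same saving in hardware. The band fraction below is an assumption:

```python
def rear_band(frame, direction, fraction=1 / 3):
    """Keep only the band of the display image on the rear side of the moving
    direction (where the second image area 102 lies). `fraction` is an assumed
    value, not taken from the disclosure."""
    h, w = frame.shape[:2]
    band_h, band_w = int(h * fraction), int(w * fraction)
    if direction == "right":
        return frame[:, :band_w]        # rear side = left band
    if direction == "left":
        return frame[:, w - band_w:]    # rear side = right band
    if direction == "down":
        return frame[:band_h, :]        # rear side = top band
    if direction == "up":
        return frame[h - band_h:, :]    # rear side = bottom band
    return frame                        # diagonal cases omitted for brevity
```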
- The first feature point information generation unit 64 may change the area of the first image region 98 in the image for synthesis 92 according to the moving speed of the flying object 10. Likewise, the second feature point information generation unit 74 may change the area of the second image area 102 in the display image 94 according to the moving speed of the flying object 10.
- The first feature point information generation unit 64 may increase the area of the first image region 98 as the moving speed of the flying object 10 increases, and the second feature point information generation unit 74 may increase the area of the second image area 102 as the moving speed of the flying object 10 increases. According to the example shown in FIG. 22, even if the moving speed of the flying object 10 increases, the determination accuracy of the overlap ratio can be improved compared with the case where the area of the first image area 98 and the area of the second image area 102 are constant.
- Conversely, the first feature point information generation unit 64 may reduce the area of the first image region 98 as the moving speed of the flying object 10 increases, and the second feature point information generation unit 74 may reduce the area of the second image area 102 as the moving speed of the flying object 10 increases. According to the example shown in FIG. 23, even if the moving speed of the flying object 10 increases, the load required for the processing of the processor 46 when determining the overlap ratio can be reduced.
- In these examples, both the area of the first image area 98 and the area of the second image area 102 are changed; however, only one of the area of the first image area 98 and the area of the second image area 102 may be changed.
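- A hedged sketch of such a speed-dependent sizing rule; the numeric values are assumptions, and only the monotonic behaviour (growing with speed, as in the FIG. 22 style example) is taken from the description above:

```python
def area_fraction_for_speed(speed, base=1 / 3, gain=0.1, maximum=2 / 3):
    """Width (or height) fraction of the image assigned to the first and second
    image areas as a function of moving speed in m/s. The shrinking variant of
    FIG. 23 would decrease this value with speed instead. All numbers here are
    assumptions, not values from the disclosure."""
    return min(maximum, base + gain * speed)
```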
- In the above embodiment, when determining the overlap ratio, the processor 46 changes both the first image area 98 included in the image for synthesis 92 and the second image area 102 included in the image for display 94 according to the movement direction. However, when determining the overlap ratio, the processor 46 may change only one of the first image area 98 included in the image for synthesis 92 and the second image area 102 included in the image for display 94 according to the moving direction.
- In the above embodiment, the number of the plurality of first divided regions 96 is nine as an example, but the number of the plurality of first divided regions 96 may be other than nine (for example, four, six, eight, or sixteen).
- Similarly, the number of the plurality of second segmented regions 100 is nine as an example, but the number of the plurality of second segmented regions 100 may be other than nine (for example, four, six, eight, or sixteen).
- the number of the plurality of first segmented regions 96 and the number of the plurality of second segmented regions 100 may be different.
- the movement direction information generation unit 68 acquires the movement instruction signal transmitted from the transmitter 20 and generates movement direction information based on the acquired movement instruction signal.
- the movement direction information generator 68 may, for example, acquire a movement signal generated by the aircraft 10 and generate movement direction information based on the acquired movement signal.
- the movement direction information generation unit 68 may generate movement direction information based on information acquired from the imaging device 30 and/or various devices mounted on the aircraft 10 .
- Various devices include, for example, satellite positioning systems (eg, global positioning systems), accelerometers, three-dimensional sensors (eg, LiDAR or stereo cameras), and the like.
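- As an illustration of deriving the movement direction from such devices, the sketch below classifies two consecutive planar position fixes (for example, projected satellite positioning results) into the eight directions used above; the function name and the y-up convention are assumptions:

```python
import math


def direction_from_positions(prev_xy, curr_xy):
    """Classify movement into the eight directions used above from two
    consecutive planar position fixes. Assumes y increases upward; with
    image coordinates (y down) the vertical directions would flip."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    names = ["right", "upper_right", "up", "upper_left",
             "left", "lower_left", "down", "lower_right"]
    return names[int(((angle + 22.5) % 360) // 45)]
```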
- The imaging device 30 can also be mounted on various moving bodies other than the flying object 10 (for example, a gondola, an automatic guided robot, an unmanned guided vehicle, or a high-place inspection vehicle) or the like.
- the image for synthesis 92 is acquired by the imaging device 30, but the image for synthesis 92 may be acquired by an imaging device other than the imaging device 30. Then, the overlap rate between the synthesis image 92 acquired by another imaging device and the display image 94 acquired by the imaging device 30 may be determined.
- the overlap ratio between the live view image as the display image 94 and the composition image 92 is determined.
- a rate may be determined.
- the overlap ratio between the image for synthesis 92 and the image for display 94 is determined.
- a rate may be determined.
- Although the processor 46 is illustrated in each of the above embodiments, at least one other CPU, at least one GPU, and/or at least one TPU may be used in place of, or together with, the processor 46.
- the imaging program 60 may be stored in a portable non-temporary computer-readable storage medium such as an SSD or USB memory (hereinafter simply referred to as "non-temporary storage medium").
- An imaging program 60 stored in a non-temporary storage medium is installed in the computer 32 of the imaging device 30 , and the processor 46 executes processing according to the imaging program 60 .
- Alternatively, the imaging program 60 may be stored in another computer connected to the imaging device 30 via a network or in a storage device such as a server device, and the imaging program 60 may be downloaded in response to a request from the imaging device 30 and installed in the computer 32.
- It is not necessary to store all of the imaging program 60 in a storage device such as another computer or server device connected to the imaging device 30, or in the storage 48; a part of the imaging program 60 may be stored instead.
- Although the computer 32 is built into the imaging device 30, the technology of the present disclosure is not limited to this, and the computer 32 may be provided outside the imaging device 30, for example.
- Although the computer 32 including the processor 46, the storage 48, and the RAM 50 is illustrated in each of the above embodiments, the technology of the present disclosure is not limited to this; a device including an ASIC, an FPGA, and/or a PLD may be applied instead of the computer 32. A combination of a hardware configuration and a software configuration may also be used instead of the computer 32.
- The following various processors can be used as hardware resources for executing the various processes described in each of the above embodiments. Examples of the processors include a CPU, which is a general-purpose processor that functions as a hardware resource executing the various processes by executing software, that is, a program. Examples of the processors also include dedicated electronic circuits such as an FPGA, a PLD, and an ASIC, which are processors having circuit configurations specially designed to execute specific processing. A memory is built into or connected to each processor, and each processor executes the various processes by using the memory.
- The hardware resource that executes the various processes may be configured with one of these various processors, or with a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). The hardware resource that executes the various processes may also be a single processor.
- As an example of configuring the hardware resource with a single processor, first, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as the hardware resource that executes the various processes. Second, as typified by an SoC, there is a form in which a processor that implements the functions of an entire system including a plurality of hardware resources with a single IC chip is used.
- In this specification, "A and/or B" is synonymous with "at least one of A and B." That is, "A and/or B" means only A, only B, or a combination of A and B. Also, in this specification, the same idea as "A and/or B" is applied when three or more matters are expressed by connecting them with "and/or".
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
Abstract
According to the present invention, a processor acquires a first image, acquires a movement direction of a moving body on which an image-capturing device is mounted, acquires a second image obtained by performing image-capturing by the image-capturing device in a case in which the moving body has moved, determines an overlap ratio of the first image and the second image on the basis of a result of comparing a first feature point included in the first image and a second feature point included in the second image, and in a case of determining the overlap ratio, changes a first image region that is an image region included in the first image and is an image region for obtaining the first feature point and/or a second image region that is an image region included in the second image and is an image region for obtaining the second feature point, in accordance with the movement direction.
Description
The technology of the present disclosure relates to an imaging device, an imaging method, and a program.
JP 2010-045587 A discloses a camera device having an image capturing unit, an image display unit, a shake detection unit, an image recording unit, a relative relationship calculation unit, a display control unit, an overlap calculation unit, a notification unit, and a shooting control unit. The image capturing unit captures an image. The image display unit has at least a screen that displays an image. The shake detection unit detects apparatus shake during image shooting by the image capturing unit. The image recording unit records information of an image captured by the image capturing unit. The relative relationship calculation unit obtains a relative relationship parameter representing at least the relative positional relationship between the shooting range of the first image shot immediately before by the image capturing unit and recorded in the image recording unit, and the shooting range of the second image shot by the image capturing unit after the first image. The display control unit generates an image for clearly indicating the relative positional relationship between the shooting ranges from the relative relationship parameter obtained by the relative relationship calculation unit, and displays the image together with the second image on the screen of the image display unit. The overlap calculation unit obtains an overlap degree parameter representing the degree of overlap between the shooting range of the first image and the shooting range of the second image. The notification unit gives a predetermined notification to the photographer according to the overlap degree parameter obtained by the overlap calculation unit. The shooting control unit causes the image capturing unit to shoot an image when the overlap degree parameter obtained by the overlap calculation unit falls within a predetermined threshold range and it can be determined from the detection output of the shake detection unit that there is substantially no apparatus shake at the time of image shooting by the image capturing unit.
Japanese Patent Application Laid-Open No. 2013-186853 describes an image processing device that includes acquisition means, division means, feature point detection means, vector calculation means, area selection means, and generation means. The acquisition means acquires a plurality of images continuously captured with movement of the imaging unit. The dividing means divides the image acquired by the acquiring means into a plurality of regions. The feature point detection means detects feature points in each of the plurality of areas divided by the division means. The vector calculation means calculates a plurality of vectors corresponding to a plurality of areas between adjacent images of the plurality of images based on the feature points detected by the feature point detection means. The area selection means selects a specific area from the plurality of divided areas based on the plurality of vectors calculated by the vector calculation means. The generation means generates a composite image by combining adjacent images based on the specific area selected by the area selection means.
International Publication No. WO 2018/168406 discloses an imaging control device that controls imaging by a moving body equipped with a camera, the device including a wide-angle image acquisition unit, an imaging information acquisition unit, a margin information acquisition unit, an area information acquisition unit, an imaging area calculation unit, and a control unit. The wide-angle image acquisition unit acquires a wide-angle image obtained by wide-angle imaging of the entire image of an imaging target. The imaging information acquisition unit acquires imaging information regarding the number of shots or the imaging angle of view of a plurality of divided images obtained by close-up imaging of parts of the entire image of the imaging target with the camera of the moving body. The margin information acquisition unit acquires margin information relating to the margin used when a composite image of the imaging target is generated by synthesizing the plurality of divided images. The area information acquisition unit acquires imaging target area information about the area of the entire image of the imaging target. Based on the imaging information, the margin information, and the imaging target area information, the imaging area calculation unit calculates the imaging area of each of the divided images constituting the composite image, that is, each imaging area in the wide-angle image for which the margin is secured. The control unit moves the moving body, causes the camera to take close-up images of each of the calculated imaging areas, and acquires the captured close-up images as divided images. The control unit compares the image corresponding to each imaging area of the acquired wide-angle image with the image captured at close range by the camera, and controls the position of the moving body at which the camera takes a close-up image of each imaging area.
Japanese Unexamined Patent Application Publication No. 2007-174301 discloses an image capturing device comprising a memory, motion detection means, guide image creation means, superimposed image creation means, and current image acquisition means. A memory stores the first captured image. The motion detection means detects motion of the imaging device and calculates a motion direction and a motion distance. The guide image creating means displays the first photographed image on the display screen, and shifts the displayed first photographed image in a direction opposite to the movement direction with respect to the display screen according to the movement distance. to fix the display, and the fixed image remaining on the display screen after the shift is displayed as a guide image. The superimposed image creating means superimposes and displays the current image and the guide image currently being captured by the image capturing device on the display screen. The current image obtaining means stores the current image in the memory as a second photographed image when it is detected that the photographing button has been pressed.
JP 2008-252934 A discloses a camera device having monitor means for monitoring an image that enters the field of view and means for electronically recording the captured image on a recording medium. The camera device includes means for displaying, on the monitor means, an image showing the relative positional relationship between the image displayed on the monitor means and the image captured immediately before, together with the image within the field of view of the camera device. The means for displaying the image showing the relative positional relationship displays, on the monitor means, an image of an image frame moved relative to the image displayed on the monitor means by a direction and an amount corresponding to the relative positional relationship, as the image showing the relative positional relationship.
One embodiment according to the technology of the present disclosure provides an imaging device, an imaging method, and a program capable of reducing the load required for the processing of a processor when determining an overlap ratio between a first image and a second image, compared with, for example, the case where the overlap ratio is determined based on the result of comparing first feature points included in the entire area of the first image with second feature points included in the entire area of the second image.
A first aspect of the technology of the present disclosure is an imaging device including a processor, in which the processor acquires a first image, acquires a moving direction of a moving body on which the imaging device is mounted, acquires a second image obtained by imaging with the imaging device in a case where the moving body has moved, determines an overlap ratio between the first image and the second image based on the result of comparing a first feature point included in the first image with a second feature point included in the second image, and, when determining the overlap ratio, changes, according to the moving direction, a first image region that is an image region included in the first image and used for obtaining the first feature point and/or a second image region that is an image region included in the second image and used for obtaining the second feature point.
A second aspect of the technology of the present disclosure is the imaging device according to the first aspect, wherein the processor causes the display device to display the second image as a display image.
A third aspect of the technology of the present disclosure is the imaging device according to the first or second aspect, in which the first image region is a partial image region of the first image, and the second image region is a partial image region of the second image.
A fourth aspect of the technology of the present disclosure is the imaging device according to any one of the first to third aspects, in which the first image region is a region located on the front side in the moving direction in the first image, and the second image region is a region located on the rear side in the moving direction in the second image.
A fifth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fourth aspects, in which the processor selects the first image region from a plurality of first divided regions obtained by dividing the first image, and selects the second image region from a plurality of second divided regions obtained by dividing the second image.
A sixth aspect of the technology of the present disclosure is the imaging device according to the fifth aspect, in which the first image region includes two or more first divided regions among the plurality of first divided regions, and the second image region includes two or more second divided regions among the plurality of second divided regions.
A seventh aspect of the technology of the present disclosure is the imaging device according to the fifth aspect, in which the first image region is one first divided region of the plurality of first divided regions, and the second image region is one second divided region of the plurality of second divided regions.
An eighth aspect of the technology of the present disclosure is the imaging device according to any one of the first to seventh aspects, in which the first image region and the second image region are each rectangular regions.
A ninth aspect of the technology of the present disclosure is the imaging device according to any one of the first to seventh aspects, in which, when the moving direction is a direction inclined with respect to the horizontal or vertical direction of the second image, the processor sets, as the first image region, a first triangular region including the corner located on the front side in the moving direction among the four corners of the first image, and sets, as the second image region, a second triangular region including the corner located on the rear side in the moving direction among the four corners of the second image.
A tenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to ninth aspects, in which, when acquiring the second image, the processor acquires from the image sensor only the image data used for comparing the first feature point with the second feature point.
An eleventh aspect of the technology of the present disclosure is the imaging device according to any one of the first to tenth aspects, in which the first feature point is a feature point included only in the first image region, and the second feature point is a feature point included only in the second image region.
A twelfth aspect of the technology of the present disclosure is the imaging device according to any one of the first to eleventh aspects, in which the processor changes the area of the first image region and/or the area of the second image region according to the moving speed of the moving body.
A thirteenth aspect of the technology of the present disclosure is the imaging device according to the twelfth aspect, in which the processor increases the area of the first image region and/or the area of the second image region as the moving speed of the moving body increases.
A fourteenth aspect of the technology of the present disclosure is the imaging device according to the twelfth aspect, in which the processor reduces the area of the first image region and/or the area of the second image region as the moving speed of the moving body increases.
A fifteenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fourteenth aspects, in which, when the moving body is moving, the processor extracts the first feature point included in the first image region, which is a partial image region of the first image.
A sixteenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to fifteenth aspects, in which, when the moving body is stopped, the processor extracts feature points included in the entire area of the first image.
A seventeenth aspect of the technology of the present disclosure is the imaging device according to any one of the first to sixteenth aspects, in which the area of the first image region and the area of the second image region are each set larger than an overlap area, which is the area in which adjacent first images overlap when they are synthesized.
An eighteenth aspect of the technology of the present disclosure is an imaging device according to any one of the first to seventeenth aspects, wherein the moving object is a flying object.
A nineteenth aspect of the technology of the present disclosure is an imaging method including: acquiring a first image; acquiring a moving direction of a moving body on which an imaging device is mounted; acquiring a second image obtained by imaging with the imaging device in a case where the moving body has moved; determining an overlap ratio between the first image and the second image based on the result of comparing a first feature point included in the first image with a second feature point included in the second image; and, when determining the overlap ratio, changing, according to the moving direction, a first image region that is an image region included in the first image and used for obtaining the first feature point and/or a second image region that is an image region included in the second image and used for obtaining the second feature point.
A twentieth aspect of the technology of the present disclosure is a program for causing a computer to execute processing including: acquiring a first image; acquiring a moving direction of a moving body on which an imaging device is mounted; acquiring a second image obtained by imaging with the imaging device in a case where the moving body has moved; determining an overlap ratio between the first image and the second image based on the result of comparing a first feature point included in the first image with a second feature point included in the second image; and, when determining the overlap ratio, changing, according to the moving direction, a first image region that is an image region included in the first image and used for obtaining the first feature point and/or a second image region that is an image region included in the second image and used for obtaining the second feature point.
An example of embodiments of an imaging device, an imaging method, and a program according to the technology of the present disclosure will be described below with reference to the accompanying drawings.

First, the terms used in the following description will be explained.
I/F is an abbreviation for "Interface." RAM is an abbreviation for "Random Access Memory." EEPROM is an abbreviation for "Electrically Erasable Programmable Read-Only Memory." CPU is an abbreviation for "Central Processing Unit." HDD is an abbreviation for "Hard Disk Drive." SSD is an abbreviation for "Solid State Drive." DRAM is an abbreviation for "Dynamic Random Access Memory." SRAM is an abbreviation for "Static Random Access Memory." CMOS is an abbreviation for "Complementary Metal Oxide Semiconductor." LiDAR is an abbreviation for "Light Detection And Ranging." GPU is an abbreviation for "Graphics Processing Unit." TPU is an abbreviation for "Tensor Processing Unit." USB is an abbreviation for "Universal Serial Bus." ASIC is an abbreviation for "Application Specific Integrated Circuit." FPGA is an abbreviation for "Field-Programmable Gate Array." PLD is an abbreviation for "Programmable Logic Device." SoC is an abbreviation for "System-on-a-chip." IC is an abbreviation for "Integrated Circuit."
In the description of this specification, the terms "perpendicular," "match," "equal," "horizontal direction," and "vertical direction" each mean not only the exact state (perfectly perpendicular, a perfect match, and so on) but also that state within an error that is generally tolerated in the technical field to which the technology of the present disclosure belongs and that does not go against the gist of the technology of the present disclosure.
As shown in FIG. 1 as an example, a flight imaging device 1 has a flight function and an imaging function and images a wall surface 2A of an object 2 while flying. In the description of this specification, the concept of "flight" includes not only the flight imaging device 1 moving through the air but also the flight imaging device 1 standing still in the air.

As an example, the wall surface 2A is a flat surface. A flat surface refers to a two-dimensional surface (that is, a surface extending along two-dimensional directions). In the description of this specification, the concept of a "flat surface" does not include a mirror surface. In the present embodiment, for example, the wall surface 2A is a plane defined in the horizontal direction and the vertical direction (that is, a surface extending in the horizontal direction and the vertical direction). The wall surface 2A includes irregularities. The irregularities referred to here include, for example, irregularities resulting from chipping and/or defects in addition to irregularities caused by the material forming the wall surface 2A. As an example, the object 2 having the wall surface 2A is a pier provided on a bridge. The pier is made of, for example, reinforced concrete. Although a bridge pier is given here as an example of the object 2, the object 2 may be an object other than a bridge pier (for example, a tunnel or a dam).
The flight function of the flight imaging device 1 (hereinafter also simply referred to as the "flight function") is a function by which the flight imaging device 1 flies based on a flight instruction signal. The flight instruction signal refers to a signal that instructs the flight imaging device 1 to fly. The flight instruction signal is transmitted, for example, from a transmitter 20 for operating the flight imaging device 1. The transmitter 20 is operated by a user (not shown). The transmitter 20 includes a control unit 22 for operating the flight imaging device 1 and a display device 24 for displaying images obtained by imaging with the flight imaging device 1. The display device 24 is, for example, a liquid crystal display.

Specifically, the flight instruction signal is classified into a plurality of instruction signals including a movement instruction signal that instructs the flight imaging device 1 to move and specifies its movement direction, and a stationary instruction signal that instructs the flight imaging device 1 to stand still. Although an example in which the flight instruction signal is transmitted from the transmitter 20 is given here, the flight instruction signal may be transmitted from a base station (not shown) or the like that sets a flight route for the flight imaging device 1. The imaging function of the flight imaging device 1 (hereinafter also simply referred to as the "imaging function") is a function by which the flight imaging device 1 images a subject (as an example, the wall surface 2A of the object 2).
The flight imaging device 1 includes a flying object 10 and an imaging device 30. The flying object 10 is, for example, an unmanned aerial vehicle such as a drone. The flight function is realized by the flying object 10. The flying object 10 has a plurality of propellers 12 and flies by rotating the plurality of propellers 12. The flying of the flying object 10 is synonymous with the flying of the flight imaging device 1. The flying object 10 is an example of a "moving body" and a "flying object" according to the technology of the present disclosure.

The imaging device 30 is, for example, a digital camera or a video camera. The imaging function is realized by the imaging device 30. The imaging device 30 is an example of an "imaging device" according to the technology of the present disclosure. The imaging device 30 is mounted on the flying object 10. Specifically, the imaging device 30 is provided on the lower part of the flying object 10. Although an example in which the imaging device 30 is provided on the lower part of the flying object 10 is given here, the imaging device 30 may be provided on the upper part, the front part, or the like of the flying object 10.
The flight imaging device 1 sequentially images a plurality of areas 3 of the wall surface 2A. An area 3 is an area determined by the angle of view of the flight imaging device 1. In the example shown in FIG. 1, a quadrangular area is shown as an example of the area 3. A plurality of images for synthesis 92 are obtained by the imaging device 30 sequentially imaging the plurality of areas 3. A composite image 90 is generated by combining the plurality of images for synthesis 92. The plurality of images for synthesis 92 are combined so that adjacent images for synthesis 92 partially overlap each other. The processing of combining the plurality of images for synthesis 92 may be executed by the flight imaging device 1 or may be executed by an external device (not shown) communicably connected to the flight imaging device 1. The composite image 90 is used, for example, to inspect or survey the wall surface 2A of the object 2.
The plurality of images for synthesis 92 used to generate the composite image 90 also include images that have undergone projective transformation. An image that has undergone projective transformation refers to, for example, an image in which an image area distorted into a trapezoid or the like due to the attitude (for example, a depression angle or an elevation angle) of the imaging device 30 has been corrected. Projective transformation is processing performed on an image obtained by the imaging device 30 imaging the wall surface 2A in a state in which the attitude of the imaging device 30 is tilted with respect to the wall surface 2A (that is, a state in which the optical axis OA of the imaging device 30 is tilted with respect to the wall surface 2A).

The image distortion caused by the depression angle or the elevation angle is corrected by performing projective transformation. That is, an image obtained by imaging with the imaging device 30 in a state in which the attitude of the imaging device 30 is tilted with respect to the wall surface 2A is converted, by projective transformation, into an image as if it had been captured from a position directly facing the wall surface 2A.
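The present disclosure does not specify how the projective transformation is computed. As a non-authoritative illustration, the following minimal Python sketch uses OpenCV; the four corner correspondences (src_corners, dst_corners) and the output size are hypothetical inputs rather than values taken from the embodiment.

import cv2
import numpy as np

def correct_perspective(image, src_corners, dst_corners, out_size):
    # Warp an obliquely captured image so that it looks as if it had been
    # captured from a position directly facing the wall surface.
    # src_corners: four pixel corners of the distorted (trapezoidal) region, shape (4, 2)
    # dst_corners: the corresponding corners in the corrected image, shape (4, 2)
    # out_size:    (width, height) of the corrected image
    H = cv2.getPerspectiveTransform(
        np.asarray(src_corners, dtype=np.float32),
        np.asarray(dst_corners, dtype=np.float32),
    )
    return cv2.warpPerspective(image, H, out_size)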
In the example shown in FIG. 1, each area 3 is imaged by the imaging device 30 with the optical axis OA of the imaging device 30 perpendicular to the wall surface 2A. The plurality of areas 3 are imaged so that adjacent areas 3 partially overlap each other. The plurality of areas 3 are imaged in this way so that the images for synthesis 92 corresponding to adjacent areas 3 can be combined based on the feature points included in the overlapping portions of the adjacent areas 3.

Hereinafter, the partial overlapping of adjacent areas 3 may be referred to as overlap. The ratio of the area of the overlapping portion to the entire area of each area 3 is referred to as the overlap rate. The overlap rate is set to a predetermined overlap rate. The predetermined overlap rate is set, for example, to a rate (30% as an example) at which an amount of feature points sufficient to combine adjacent images for synthesis 92 is obtained.
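For two frames of equal size whose contents are offset by a simple shift, the overlap rate defined above reduces to a ratio of rectangle areas. The sketch below is only an illustration under that assumption; the frame size and shift values are hypothetical and are not taken from the present disclosure.

def overlap_rate(frame_w, frame_h, shift_x, shift_y):
    # Ratio of the overlapping area to the area of one frame, assuming two
    # equally sized frames whose contents are offset by (shift_x, shift_y) pixels.
    ox = max(0, frame_w - abs(shift_x))
    oy = max(0, frame_h - abs(shift_y))
    return (ox * oy) / (frame_w * frame_h)

# Example: a 4000 x 3000 px frame shifted 2800 px horizontally overlaps the
# previous frame by (4000 - 2800) / 4000 = 30 %.
print(overlap_rate(4000, 3000, 2800, 0))  # 0.3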
In the example shown in FIG. 1, the plurality of areas 3 include areas 3 that have already been imaged (that is, areas 3 imaged by the flight imaging device 1) and areas 3 that have not yet been imaged (that is, areas 3 about to be imaged by the flight imaging device 1). Hereinafter, when the plurality of areas 3 are described separately, an area 3 that has not yet been imaged is referred to as an "imaging target area 3A," and an area 3 that has already been imaged is referred to as an "imaged area 3B."

As an example, the flight imaging device 1 images the plurality of areas 3 while moving in a zigzag manner by alternately repeating horizontal movement and vertical movement. The flight imaging device 1 also images the plurality of areas 3 in an order such that a part of the imaging target area 3A overlaps a part of the imaged area 3B imaged immediately before (for example, one frame before) the imaging target area 3A. The following description assumes, as shown in FIG. 1 as an example, that the flight imaging device 1 images the plurality of areas 3 while moving in a zigzag manner by alternately repeating horizontal movement and vertical movement.
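Such a zigzag sweep can be written down as a simple sequence of imaging positions over a grid. The sketch below is purely illustrative; the grid dimensions and step sizes are hypothetical parameters, and no specific flight-planning routine is described in the present disclosure.

def zigzag_waypoints(cols, rows, step_x, step_y):
    # Yield (x, y) imaging positions sweeping a cols x rows grid,
    # reversing the horizontal direction on every other row (zigzag).
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cs:
            yield (c * step_x, r * step_y)

# Example: 4 columns x 3 rows of imaging positions
print(list(zigzag_waypoints(4, 3, 1.0, 1.0)))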
As shown in FIG. 2 as an example, the imaging device 30 includes a computer 32, a communication device 34, an image sensor 36, an image sensor driver 38, an imaging lens 40, an image memory 42, and an input/output I/F 44.

The computer 32 includes a processor 46, a storage 48, and a RAM 50. The processor 46, the storage 48, and the RAM 50 are connected to one another via a bus 52, and the bus 52 is connected to the input/output I/F 44. The communication device 34, the image sensor driver 38, the imaging lens 40, and the image memory 42 are also connected to the input/output I/F 44. The computer 32 is an example of a "computer" according to the technology of the present disclosure. The processor 46 is an example of a "processor" according to the technology of the present disclosure.

The processor 46 has, for example, a CPU and controls the entire imaging device 30. The storage 48 is a nonvolatile storage device that stores various programs, various parameters, and the like. Examples of the storage 48 include an HDD and/or a flash memory (for example, an EEPROM and/or an SSD).

The RAM 50 is a memory in which information is temporarily stored, and is used by the processor 46 as a work memory. Examples of the RAM 50 include a DRAM and/or an SRAM.

As an example, the communication device 34 is communicably connected to the transmitter 20. Here, the communication device 34 is connected to the transmitter 20 so as to be capable of wireless communication according to a predetermined wireless communication standard. Examples of the predetermined wireless communication standard include Wi-Fi (registered trademark) and Bluetooth (registered trademark). The communication device 34 handles the exchange of information with the transmitter 20. For example, the communication device 34 transmits information to the transmitter 20 in response to a request from the processor 46. The communication device 34 also receives information transmitted from the transmitter 20 and outputs the received information to the processor 46 via the bus 52. Although an example in which the communication device 34 is communicably connected to the transmitter 20 is given here, the communication device 34 may be communicably connected to the transmitter 20 and/or the flying object 10.
The image sensor 36 is connected to the image sensor driver 38. The image sensor driver 38 controls the image sensor 36 in accordance with instructions from the processor 46. The image sensor 36 is, for example, a CMOS image sensor. Although a CMOS image sensor is exemplified here as the image sensor 36, the technology of the present disclosure is not limited to this, and another image sensor may be used. Under the control of the image sensor driver 38, the image sensor 36 images a subject (as an example, the wall surface 2A of the object 2) and outputs image data obtained by the imaging.

The imaging lens 40 is arranged closer to the subject (the object side) than the image sensor 36. The imaging lens 40 takes in subject light, which is light reflected from the subject, and forms an image of the taken-in subject light on the imaging surface of the image sensor 36. The imaging lens 40 includes a plurality of optical elements (not shown) such as a focus lens, a zoom lens, and a stop. The imaging lens 40 is connected to the computer 32 via the input/output I/F 44. Specifically, the plurality of optical elements included in the imaging lens 40 are connected to the input/output I/F 44 via a drive mechanism (not shown) having a power source. The plurality of optical elements included in the imaging lens 40 operate under the control of the computer 32. In the imaging device 30, adjustment of the focus, the optical zoom, the shutter speed, and the like is realized by operating the plurality of optical elements included in the imaging lens 40.

Image data generated by the image sensor 36 is temporarily stored in the image memory 42. The processor 46 acquires the image data from the image memory 42 and executes various kinds of processing using the acquired image data.
As shown in FIG. 3 as an example, an imaging program 60 is stored in the storage 48. The imaging program 60 is an example of a "program" according to the technology of the present disclosure. The processor 46 reads the imaging program 60 from the storage 48 and executes the read imaging program 60 on the RAM 50. The processor 46 performs imaging processing for imaging the plurality of areas 3 (see FIG. 1) in accordance with the imaging program 60 executed on the RAM 50.

The imaging processing is realized by the processor 46 operating, in accordance with the imaging program 60, as a first imaging control unit 62, a first feature point information generation unit 64, a first movement determination unit 66, a movement direction information generation unit 68, a second imaging control unit 70, a display control unit 72, a second feature point information generation unit 74, a feature point comparison unit 76, an overlap determination unit 78, and a second movement determination unit 80.

As shown in FIG. 4 as an example, the flying object 10 receives a movement instruction signal transmitted from the transmitter 20 in response to a user operation and moves to the first imaging position based on the received movement instruction signal. The flying object 10 also receives a stationary instruction signal transmitted from the transmitter 20 in response to a user operation and stands still at the first imaging position based on the received stationary instruction signal. Then, when the imaging device 30 receives an imaging start signal transmitted from the transmitter 20 in response to a user operation, the imaging device 30 executes the imaging processing described below.
The first imaging control unit 62 outputs a first imaging instruction signal to the image sensor 36, thereby causing the image sensor 36 to image the imaging target area 3A. Image data for synthesis is obtained by the image sensor 36 imaging the imaging target area 3A under the control of the first imaging control unit 62. The image data for synthesis is image data representing an image for synthesis 92. The image data for synthesis is stored in the storage 48. The image for synthesis 92 represented by the image data for synthesis shown in FIG. 4 is the first image for synthesis. The image for synthesis 92 is an example of a "first image" according to the technology of the present disclosure. The image for synthesis 92 includes feature points corresponding to the irregularities of the imaging target area 3A. Hereinafter, the feature points included in the image for synthesis 92 are referred to as "first feature points."
When the image for synthesis 92 is the first image for synthesis, the first feature point information generation unit 64 acquires the image for synthesis 92 based on the image data for synthesis stored in the storage 48 and divides the acquired image for synthesis 92 into a plurality of first divided areas 96. As an example, the image for synthesis 92 is divided equally into the plurality of first divided areas 96. In the example shown in FIG. 4, the number of first divided areas 96 is nine. Hereinafter, when the nine first divided areas 96 need to be described separately, suffixes 1 to 9 are used, and the nine first divided areas 96 are referred to as a first divided area 96-1, a first divided area 96-2, a first divided area 96-3, a first divided area 96-4, a first divided area 96-5, a first divided area 96-6, a first divided area 96-7, a first divided area 96-8, and a first divided area 96-9, respectively. As an example, each of the plurality of first divided areas 96 is a rectangular area.

When the image for synthesis 92 is the first image for synthesis, the first feature point information generation unit 64 extracts the first feature points included in the entire area of the image for synthesis 92 for each first divided area 96 and generates first feature point information indicating the coordinates of the extracted first feature points for each first divided area 96. The first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48. The coordinates of the first feature points indicated by the first feature point information are derived, for example, by performing image processing (for example, high-frequency component extraction processing) on the image data for synthesis. The coordinates of the first feature points are, for example, coordinates based on any one of the four vertices of the imaging target area 3A.
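The present disclosure only states that the feature point coordinates are derived by image processing such as high-frequency component extraction. As one concrete, non-authoritative way to divide an image into a 3 x 3 grid of equal rectangular cells and collect per-cell feature point coordinates, the sketch below uses OpenCV corner detection as a stand-in for the unspecified extraction processing; the grid size and detector parameters are illustrative assumptions.

import cv2
import numpy as np

def feature_points_per_cell(image_bgr, grid=3, max_corners=200):
    # Divide the image into grid x grid equal rectangular cells and return
    # {cell index (1..grid*grid, row-major): [(x, y), ...]} with coordinates
    # expressed in the full-image pixel coordinate system.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    cell_h, cell_w = h // grid, w // grid
    result = {}
    for r in range(grid):
        for c in range(grid):
            y0, x0 = r * cell_h, c * cell_w
            cell = gray[y0:y0 + cell_h, x0:x0 + cell_w]
            corners = cv2.goodFeaturesToTrack(
                cell, maxCorners=max_corners, qualityLevel=0.01, minDistance=5)
            pts = [] if corners is None else \
                [(x0 + float(x), y0 + float(y)) for [[x, y]] in corners]
            result[r * grid + c + 1] = pts
    return result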
As an example, as shown in FIG. 5, when the flying object 10 receives a movement instruction signal transmitted from the transmitter 20 in response to a user operation, the flying object 10 moves based on the received movement instruction signal. In the example shown in FIG. 5, the flying object 10 moves in the horizontal direction based on the movement instruction signal. Specifically, the movement direction of the flying object 10 is the rightward direction.

The first movement determination unit 66 determines whether the flying object 10 is moving. When the imaging device 30 receives a movement instruction signal transmitted from the transmitter 20 in response to a user operation, the first movement determination unit 66 determines, based on the movement instruction signal, that the flying object 10 is moving. The movement instruction signal includes information indicating the movement direction of the flying object 10.

When the first movement determination unit 66 determines that the flying object 10 is moving, the movement direction information generation unit 68 acquires the movement direction of the flying object 10 based on the movement instruction signal and generates movement direction information indicating the acquired movement direction.

As an example, as shown in FIG. 6, while the flying object 10 is receiving a movement instruction signal transmitted from the transmitter 20 in response to a user operation, the flying object 10 continues to move based on the received movement instruction signal.
The second imaging control unit 70 outputs a second imaging instruction signal to the image sensor 36, thereby causing the image sensor 36 to image the imaging target area 3A. Display image data is obtained by the image sensor 36 imaging the imaging target area 3A under the control of the second imaging control unit 70. The display image data is image data representing a display image 94. The display image 94 is obtained by imaging with the imaging device 30 after the flying object 10 has moved from the position at which the image for synthesis 92 was obtained. The display image data is stored in the image memory 42. The display image 94 is an example of a "second image" and a "display image" according to the technology of the present disclosure. The display image 94 includes feature points corresponding to the irregularities of the imaging target area 3A. Hereinafter, the feature points included in the display image 94 are referred to as "second feature points." In the following, when it is not necessary to distinguish between the "first feature points" and the "second feature points," they are also simply referred to as "feature points."

The display control unit 72 acquires the display image data stored in the image memory 42 and executes various kinds of processing on the display image data. The display control unit 72 then outputs the display image data to the transmitter 20. The transmitter 20 receives the display image data and displays the display image 94 (that is, a live view image) on the display device 24 based on the received display image data.
The second feature point information generation unit 74 acquires the display image 94 based on the display image data stored in the image memory 42 and divides the acquired display image 94 into a plurality of second divided areas 100. As an example, the display image 94 is divided equally into the plurality of second divided areas 100. The number of second divided areas 100 is the same as the number of first divided areas 96. That is, in the example shown in FIG. 6, the number of second divided areas 100 is nine. Hereinafter, when the nine second divided areas 100 need to be described separately, suffixes 1 to 9 are used, and the nine second divided areas 100 are referred to as a second divided area 100-1, a second divided area 100-2, a second divided area 100-3, a second divided area 100-4, a second divided area 100-5, a second divided area 100-6, a second divided area 100-7, a second divided area 100-8, and a second divided area 100-9, respectively. As an example, each of the plurality of second divided areas 100 is a rectangular area.

The second feature point information generation unit 74 selects, from the plurality of second divided areas 100 obtained by dividing the display image 94, the second divided areas 100 from which the second feature points are to be extracted, as a second image area 102. Specifically, the second feature point information generation unit 74 acquires the movement direction of the flying object 10 (hereinafter also simply referred to as the "movement direction") based on the movement direction information generated by the movement direction information generation unit 68 and selects, as the second image area 102, the second divided areas 100 located on the rear side in the movement direction among the plurality of second divided areas 100. In the example shown in FIG. 6, the three second divided areas 100 located on the rear side in the movement direction (as an example, the second divided area 100-1, the second divided area 100-4, and the second divided area 100-7) are selected as the second image area 102.
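Assuming the 3 x 3 grid sketched earlier, with row-major cell indices 1 to 9, the selection of the cells on the rear side (for the second image area) or the front side (for the first image area) of a horizontal or vertical movement direction can be expressed as a small lookup. This is only an illustration; the index layout and the helper name cells_for are assumptions, not part of the present disclosure.

# Row-major 3 x 3 cell indices:
#   1 2 3
#   4 5 6
#   7 8 9
COLUMNS = {"left": [1, 4, 7], "center": [2, 5, 8], "right": [3, 6, 9]}
ROWS = {"top": [1, 2, 3], "middle": [4, 5, 6], "bottom": [7, 8, 9]}

def cells_for(direction, side):
    # direction: "right", "left", "up", or "down"
    # side: "front" (leading edge of the movement) or "rear" (trailing edge)
    leading = {"right": COLUMNS["right"], "left": COLUMNS["left"],
               "up": ROWS["top"], "down": ROWS["bottom"]}
    trailing = {"right": COLUMNS["left"], "left": COLUMNS["right"],
                "up": ROWS["bottom"], "down": ROWS["top"]}
    return leading[direction] if side == "front" else trailing[direction]

# Moving rightward: first image area = cells 3, 6, 9; second image area = cells 1, 4, 7.
print(cells_for("right", "front"), cells_for("right", "rear"))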
Subsequently, the second feature point information generation unit 74 extracts the second feature points included in the second image area 102 for each second image area 102. The second feature points are feature points included only in the second image area 102. The second feature point information generation unit 74 then generates second feature point information indicating the coordinates of the extracted second feature points for each second image area 102. The second feature point information generated by the second feature point information generation unit 74 is stored in the storage 48. The coordinates of the second feature points extracted by the second feature point information generation unit 74 are derived by the same method as the coordinates of the first feature points extracted by the first feature point information generation unit 64.

As an example, FIG. 7 shows the operation of the feature point comparison unit 76 in a case where the first image for synthesis is obtained as the image for synthesis 92. When the first image for synthesis is obtained as the image for synthesis 92, the first feature point information stored in the storage 48 for the first image for synthesis is information indicating the coordinates of the first feature points for each first divided area 96 over the entire area of the image for synthesis 92.
The feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. When the first image for synthesis is obtained as the image for synthesis 92, the feature point comparison unit 76 also acquires the movement direction information generated by the movement direction information generation unit 68. Then, based on the acquired movement direction information, the feature point comparison unit 76 selects, for the image for synthesis 92, the first divided areas 96 located on the front side in the movement direction among the plurality of first divided areas 96 as a first image area 98. In the example shown in FIG. 7, the three first divided areas 96 located on the front side in the movement direction (as an example, the first divided area 96-3, the first divided area 96-6, and the first divided area 96-9) are selected as the first image area 98.

Subsequently, the feature point comparison unit 76 extracts the first feature points included in the first image area 98 from the first feature point information. The first feature points are feature points included only in the first image area 98. The feature point comparison unit 76 then compares the first feature points included in the first image area 98 with the second feature points included in the second image area 102. That is, of the image for synthesis 92 and the display image 94, whose imaging times precede and follow each other, the feature point comparison unit 76 compares the first feature points included in the first image area 98 of the image for synthesis 92 with the second feature points included in the second image area 102 of the display image 94. The feature point comparison unit 76 generates comparison result information indicating the result of comparing the first feature points with the second feature points. Specifically, the comparison result information is information about the result of comparing the coordinates of the first feature points with the coordinates of the second feature points. Note that, in order to combine adjacent images for synthesis 92 based on the first feature points and the second feature points, the area of each of the three first image areas 98 and the area of each of the three second image areas 102 is set larger than the overlap area, which is the area where adjacent images for synthesis 92 overlap when they are combined.
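The present disclosure does not fix a particular algorithm for comparing the first and second feature points. One common approach is descriptor matching; the following sketch, offered only as an assumption, uses OpenCV ORB features and a brute-force matcher to pair up corresponding points between the first image area and the second image area (the parameter values are illustrative).

import cv2

def match_feature_points(first_region_img, second_region_img):
    # Detect ORB keypoints in both region images and return the matched
    # coordinate pairs [((x1, y1), (x2, y2)), ...], best matches first.
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(first_region_img, None)
    kp2, des2 = orb.detectAndCompute(second_region_img, None)
    if des1 is None or des2 is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches]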
As an example, as shown in FIG. 8, the overlap determination unit 78 acquires the comparison result information generated by the feature point comparison unit 76 and, based on the acquired comparison result information, determines whether the overlap rate of an overlap area 95, which is the area where the image for synthesis 92 and the display image 94 overlap, is equal to or less than a predetermined threshold value. Here, the overlap rate refers to, for example, the ratio of the area of the overlap area 95 to the area of one frame (for example, the area of the image for synthesis 92 or the area of the display image 94). The overlap area 95 between the image for synthesis 92 and the display image 94 corresponds to an overlap area 5 where the areas 3 overlap each other. The predetermined threshold value is set in consideration of the efficiency of sequentially imaging the plurality of areas 3, the above-described predetermined overlap rate for combining adjacent images for synthesis 92 (see FIG. 1), and the like. For example, the predetermined threshold value is set to 50% or less in consideration of the efficiency of sequentially imaging the plurality of areas 3. The predetermined threshold value is also set to a value larger than the above-described predetermined overlap rate (30% as an example). Note that the movement speed of the flying object 10 is set to a speed at which the determination by the overlap determination unit 78 is performed at least once before the overlap rate between the image for synthesis 92 and the display image 94 falls below the above-described predetermined overlap rate.
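One way to turn the coordinate comparison into an overlap rate is to estimate the shift between the two frames from the matched coordinate pairs and reuse the area ratio sketched earlier. The sketch below assumes a pure translation between the frames and full-frame pixel coordinates; neither assumption is mandated by the present disclosure.

import numpy as np

def estimate_overlap_rate(matched_pairs, frame_w, frame_h):
    # matched_pairs: [((x1, y1), (x2, y2)), ...] in full-frame pixel coordinates,
    # where (x1, y1) is a first feature point and (x2, y2) its matched second
    # feature point. A pure translation between the two frames is assumed.
    if not matched_pairs:
        return 0.0
    d = np.array([(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in matched_pairs])
    shift_x, shift_y = np.median(d, axis=0)  # median is robust to bad matches
    ox = max(0.0, frame_w - abs(shift_x))
    oy = max(0.0, frame_h - abs(shift_y))
    return (ox * oy) / (frame_w * frame_h)

def should_capture(matched_pairs, frame_w, frame_h, threshold=0.5):
    # True when the estimated overlap rate has dropped to the threshold or below.
    return estimate_overlap_rate(matched_pairs, frame_w, frame_h) <= threshold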
As an example, as shown in FIG. 9, when the overlap determination unit 78 determines that the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value, the first imaging control unit 62 outputs the first imaging instruction signal to the image sensor 36, thereby causing the image sensor 36 to image the imaging target area 3A. Image data for synthesis is obtained by the image sensor 36 imaging the imaging target area 3A under the control of the first imaging control unit 62. The image for synthesis 92 represented by the image data for synthesis shown in FIG. 9 is an image for synthesis 92 obtained after the above-described first image for synthesis (see FIG. 4). Hereinafter, an image for synthesis obtained after the first image for synthesis is referred to as the N-th image for synthesis (where N ≥ 2). The image data for synthesis representing the N-th image for synthesis is stored in the storage 48.

Note that, when the overlap determination unit 78 determines that the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value, the first imaging control unit 62 may cause the imaging device 30 to transmit an imaging permission signal to the transmitter 20. When the transmitter 20 receives the imaging permission signal, the transmitter 20 may display, on the display device 24, characters or the like indicating that imaging is permitted. Then, when the imaging device 30 receives an imaging instruction signal that the user has caused the transmitter 20 to transmit to the imaging device 30 in accordance with the characters or the like displayed on the display device 24, the first imaging control unit 62 may output the first imaging instruction signal to the image sensor 36, thereby causing the image sensor 36 to image the imaging target area 3A.
As an example, as shown in FIG. 10, while receiving a movement instruction signal as the flight instruction signal, the flying object 10 continues to move based on the received movement instruction signal.

The second movement determination unit 80 determines whether the flying object 10 is continuing to move based on the movement instruction signal received by the imaging device 30 and the movement direction information generated by the movement direction information generation unit 68. When the movement instruction signal is received by the imaging device 30 and the movement direction indicated by the movement instruction signal matches the movement direction indicated by the movement direction information, the second movement determination unit 80 determines that the flying object 10 is continuing to move.
When the second movement determination unit 80 determines that the flying object 10 is continuing to move, the first feature point information generation unit 64 acquires the image for synthesis 92 (that is, the N-th image for synthesis) based on the latest image data for synthesis stored in the storage 48 and divides the acquired image for synthesis 92 into a plurality of first divided areas 96. As an example, the image for synthesis 92 is divided equally into the plurality of first divided areas 96. As an example, the number of first divided areas 96 is the same as the number into which the first image for synthesis was divided. That is, in the example shown in FIG. 10, the number of first divided areas 96 is nine. As an example, each of the plurality of first divided areas 96 is a rectangular area.

When the second movement determination unit 80 determines that the flying object 10 is continuing to move, the first feature point information generation unit 64 selects, from the plurality of first divided areas 96 obtained by dividing the image for synthesis 92, the first divided areas 96 from which the first feature points are to be extracted, as a first image area 98. Specifically, the first feature point information generation unit 64 acquires the movement direction of the flying object 10 based on the movement direction information generated by the movement direction information generation unit 68 and selects, as the first image area 98, the first divided areas 96 located on the front side in the movement direction among the plurality of first divided areas 96. In the example shown in FIG. 10, the three first divided areas 96 located on the front side in the movement direction (as an example, the first divided area 96-3, the first divided area 96-6, and the first divided area 96-9) are selected as the first image area 98.
Subsequently, the first feature point information generation unit 64 extracts the first feature points included in the first image area 98 for each first image area 98. The first feature points are feature points included only in the first image area 98. In this way, when the flying object 10 is moving, the first feature point information generation unit 64 extracts, for each first image area 98, the first feature points included in the first image area 98, which is a partial image area of the image for synthesis 92. The first feature point information generation unit 64 then generates first feature point information indicating the coordinates of the extracted first feature points for each first image area 98. The first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48.

As an example, FIG. 11 shows the operation of the feature point comparison unit 76 in a case where the N-th image for synthesis is obtained as the image for synthesis 92 while the flying object 10 continues to move (see FIG. 9). When the flying object 10 is continuing to move, the first feature point information corresponding to the N-th image for synthesis is information indicating the coordinates of the first feature points for each first image area 98, that is, for each of the first divided areas 96 located on the front side in the movement direction among the plurality of first divided areas 96. On the other hand, the second feature point information is information indicating the coordinates of the second feature points for each second image area 102, that is, for each of the second divided areas 100 located on the rear side in the movement direction among the plurality of second divided areas 100.

The feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. Then, when the flying object 10 is moving, the feature point comparison unit 76 compares the first feature points included in the first image area 98 with the second feature points included in the second image area 102 based on the first feature point information and the second feature point information and generates comparison result information indicating the comparison result.
As an example, as shown in FIG. 12, also when the flying object 10 reaches a position at which the imaging device 30 images an area 3 located at an end among the plurality of areas 3, the first imaging control unit 62 outputs the first imaging instruction signal to the image sensor 36, thereby causing the image sensor 36 to image the imaging target area 3A, when the overlap determination unit 78 determines that the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value. Image data for synthesis is obtained by the image sensor 36 imaging the imaging target area 3A under the control of the first imaging control unit 62. The image for synthesis 92 represented by this image data for synthesis is the N-th image for synthesis. The image data for synthesis representing the N-th image for synthesis is stored in the storage 48.
As an example, as shown in FIG. 13, when the flying object 10 reaches a position at which the imaging device 30 images an area 3 located at an end among the plurality of areas 3, a stationary instruction signal is transmitted as the flight instruction signal from the transmitter 20 to the flying object 10 in response to a user operation.

When the imaging device 30 receives the stationary instruction signal as the flight instruction signal, the second movement determination unit 80 determines that the flying object 10 is not continuing to move, that is, that the flying object 10 has stopped moving.
When the second movement determination unit 80 determines that the flying object 10 has stopped moving, the first feature point information generation unit 64 acquires the image for synthesis 92 (that is, the N-th image for synthesis) based on the latest image data for synthesis stored in the storage 48 and divides the acquired image for synthesis 92 into a plurality of first divided areas 96. As an example, the image for synthesis 92 is divided equally into the plurality of first divided areas 96. As an example, the number of first divided areas 96 is the same as the number into which the first image for synthesis was divided. That is, in the example shown in FIG. 12, the number of first divided areas 96 is nine. As an example, each of the plurality of first divided areas 96 is a rectangular area.

Then, when the second movement determination unit 80 determines that the flying object 10 has stopped moving, the first feature point information generation unit 64 extracts the first feature points included in the entire area of the image for synthesis 92 for each first divided area 96. In this way, when the flying object 10 has stopped moving, the first feature point information generation unit 64 extracts, for each first divided area 96, the first feature points included in the entire area of the image for synthesis 92. The first feature point information generation unit 64 then generates first feature point information indicating the coordinates of the extracted first feature points for each first divided area 96. The first feature point information generated by the first feature point information generation unit 64 is stored in the storage 48.
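Putting the moving and stopped cases together, the cells whose first feature points are extracted can be switched on whether the flying object is still moving. The sketch below reuses the hypothetical cells_for helper from the earlier grid sketch and illustrates only that switching, not the embodiment's actual control flow.

def first_image_cells(is_moving, direction, grid=3):
    # Cells of the image for synthesis used to obtain first feature points:
    # only the leading-edge cells while the flying object is moving,
    # all cells of the 3 x 3 grid once it has stopped.
    if is_moving:
        return cells_for(direction, "front")   # e.g. [3, 6, 9] when moving rightward
    return list(range(1, grid * grid + 1))     # entire area: cells 1..9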
As an example, FIG. 14 shows the operation of the feature point comparison unit 76 in a case where the flying object 10 temporarily stops moving after the N-th image for synthesis (see FIG. 12) is obtained as the image for synthesis 92 and then starts moving again. In the example shown in FIG. 14, the flying object 10 moves in the vertical direction based on the movement instruction signal. Specifically, the movement direction of the flying object 10 is the downward direction. When the flying object 10 has stopped moving, the first feature point information corresponding to the N-th image for synthesis is information indicating the coordinates of the first feature points for each first divided area 96 over the entire area of the image for synthesis 92. On the other hand, the second feature point information is information indicating the coordinates of the second feature points for each second image area 102, that is, for each of the second divided areas 100 located on the rear side in the movement direction among the plurality of second divided areas 100.

The feature point comparison unit 76 acquires the first feature point information and the second feature point information stored in the storage 48. When the flying object 10 has stopped moving, the feature point comparison unit 76 also acquires the movement direction information generated by the movement direction information generation unit 68 and, based on the acquired movement direction information, selects, for the image for synthesis 92, the first divided areas 96 located on the front side in the movement direction among the plurality of first divided areas 96 as the first image area 98. In the example shown in FIG. 14, the three first divided areas 96 located on the front side in the movement direction (as an example, the first divided area 96-7, the first divided area 96-8, and the first divided area 96-9) are selected as the first image area 98.

Subsequently, the feature point comparison unit 76 extracts the first feature points included in the first image area 98 from the first feature point information. The feature point comparison unit 76 then compares the first feature points included in the first image area 98 with the second feature points included in the second image area 102 and generates comparison result information indicating the comparison result.
The above description assumes an example in which the flight imaging device 1 images the plurality of areas 3 while moving in a zigzag pattern by alternately repeating horizontal movement and vertical movement (see FIG. 1). However, the flight imaging device 1 may image the wall surface 2A of the object 2 while moving in any direction along the wall surface 2A.
As an example, as shown in FIG. 15, the movement directions of the flight imaging device 1 are classified into eight directions along the wall surface 2A of the object 2. In the example shown in FIG. 15, the rightward, leftward, upward, downward, upper-right, lower-right, upper-left, and lower-left directions are shown as examples of the movement direction of the flight imaging device 1. That is, the flight imaging device 1 moves along a designated one of the rightward, leftward, upward, downward, upper-right, lower-right, upper-left, and lower-left directions.
As an example, FIG. 16 shows the relationship between the first image region 98 and the second image region 102 whose feature points are compared by the feature point comparison unit 76 according to the movement direction of the flying object 10 when the movement direction of the flight imaging device 1 is the rightward, leftward, upward, or downward direction.
For example, when the movement direction of the flying object 10 is the rightward direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-3, 96-6, and 96-9 serving as the first image region 98 with the second feature points included in the second divided regions 100-1, 100-4, and 100-7 serving as the second image region 102.
When the movement direction of the flying object 10 is the leftward direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-1, 96-4, and 96-7 serving as the first image region 98 with the second feature points included in the second divided regions 100-3, 100-6, and 100-9 serving as the second image region 102.
When the movement direction of the flying object 10 is the upward direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-1, 96-2, and 96-3 serving as the first image region 98 with the second feature points included in the second divided regions 100-7, 100-8, and 100-9 serving as the second image region 102.
When the movement direction of the flying object 10 is the downward direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-7, 96-8, and 96-9 serving as the first image region 98 with the second feature points included in the second divided regions 100-1, 100-2, and 100-3 serving as the second image region 102.
As an example, FIG. 17 shows the relationship between the first image region 98 and the second image region 102 whose feature points are compared by the feature point comparison unit 76 according to the movement direction of the flying object 10 when the movement direction of the flight imaging device 1 is the upper-right, lower-right, upper-left, or lower-left direction.
For example, when the movement direction of the flying object 10 is the upper-right direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-2, 96-3, and 96-6 serving as the first image region 98 with the second feature points included in the second divided regions 100-4, 100-7, and 100-8 serving as the second image region 102. When the movement direction of the flying object 10 is the lower-right direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-6, 96-8, and 96-9 serving as the first image region 98 with the second feature points included in the second divided regions 100-1, 100-2, and 100-4 serving as the second image region 102.
When the movement direction of the flying object 10 is the upper-left direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-1, 96-2, and 96-4 serving as the first image region 98 with the second feature points included in the second divided regions 100-6, 100-8, and 100-9 serving as the second image region 102.
When the movement direction of the flying object 10 is the lower-left direction, the feature point comparison unit 76 compares the first feature points included in the first divided regions 96-4, 96-7, and 96-8 serving as the first image region 98 with the second feature points included in the second divided regions 100-2, 100-3, and 100-6 serving as the second image region 102. In this way, the first image region 98 and the second image region 102 whose feature points are compared by the feature point comparison unit 76 are changed according to the movement direction of the flying object 10.
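The selections described with reference to FIGS. 16 and 17 can be summarized as a lookup table. In the following sketch the region numbers assume that the first divided regions 96-1 to 96-9 and the second divided regions 100-1 to 100-9 are numbered from the top-left, left to right and top to bottom, which is an assumption about the layout of the figures.

```python
# Region selection of FIGS. 16 and 17. Region numbers assume a 3x3 grid
# numbered 1-9 from the top-left, left to right, top to bottom.
REGIONS_BY_DIRECTION = {
    # direction:    (first image region 98, second image region 102)
    "right":       ((3, 6, 9), (1, 4, 7)),
    "left":        ((1, 4, 7), (3, 6, 9)),
    "up":          ((1, 2, 3), (7, 8, 9)),
    "down":        ((7, 8, 9), (1, 2, 3)),
    "upper_right": ((2, 3, 6), (4, 7, 8)),
    "lower_right": ((6, 8, 9), (1, 2, 4)),
    "upper_left":  ((1, 2, 4), (6, 8, 9)),
    "lower_left":  ((4, 7, 8), (2, 3, 6)),
}

def select_regions(direction):
    """Return the divided-region numbers whose feature points are compared for
    a given movement direction: the front-side regions of the image for
    synthesis and the rear-side regions of the display image."""
    return REGIONS_BY_DIRECTION[direction]
```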
Next, the operation of the flight imaging device 1 according to the present embodiment will be described with reference to FIG. 18. FIG. 18 shows an example of the flow of the imaging process according to the present embodiment.
In the imaging process shown in FIG. 18, first, in step ST10, the first imaging control unit 62 causes the image sensor 36 to image the imaging target area 3A (see FIG. 4). Image data for synthesis is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the first imaging control unit 62. After the processing of step ST10 is executed, the imaging process proceeds to step ST12.
In step ST12, the first feature point information generation unit 64 extracts, for each first divided region 96, the first feature points included in the entire area of the image for synthesis 92 based on the image data for synthesis acquired in step ST10, and generates first feature point information indicating the coordinates of the extracted first feature points for each first divided region 96 (see FIG. 4). After the processing of step ST12 is executed, the imaging process proceeds to step ST14.
In step ST14, the first movement determination unit 66 determines whether the flying object 10 is moving (see FIG. 5). If the flying object 10 is moving, the determination is affirmative and the imaging process proceeds to step ST16. If the flying object 10 is not moving, the determination is negative and the processing of step ST14 is executed again.
In step ST16, the movement direction information generation unit 68 acquires the movement direction of the flying object 10 and generates movement direction information indicating the acquired movement direction (see FIG. 5). After the processing of step ST16 is executed, the imaging process proceeds to step ST18.
In step ST18, the second imaging control unit 70 causes the image sensor 36 to image the imaging target area 3A (see FIG. 6). Display image data is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the second imaging control unit 70. After the processing of step ST18 is executed, the imaging process proceeds to step ST20.
In step ST20, the display control unit 72 causes the display device 24 to display the display image 94 (that is, the live view image) based on the display image data obtained in step ST18 (see FIG. 6). After the processing of step ST20 is executed, the imaging process proceeds to step ST22.
In step ST22, the second feature point information generation unit 74 generates second feature point information indicating the coordinates of the second feature points for each second image region 102 of the display image 94, based on the movement direction information obtained in step ST16 and the display image data obtained in step ST18 (see FIG. 6). After the processing of step ST22 is executed, the imaging process proceeds to step ST24.
In step ST24, the feature point comparison unit 76 compares the first feature points included in the first image region 98 of the image for synthesis 92 with the second feature points included in the second image region 102 of the display image 94, based on the first feature point information generated in step ST12 and the second feature point information generated in step ST22, and generates comparison result information indicating the result of the comparison (see FIG. 7). The above applies when step ST24 is executed for the first time. When step ST24 is executed for the second or subsequent time, the feature point comparison unit 76 compares the first feature points with the second feature points based on the first feature point information generated in step ST32 or step ST34, described later, and the second feature point information generated in step ST22, and generates comparison result information indicating the result of the comparison (see FIG. 11 or FIG. 14). After the processing of step ST24 is executed, the imaging process proceeds to step ST26.
In step ST26, the overlap determination unit 78 acquires the comparison result information generated in step ST24 and, based on the acquired comparison result information, determines whether the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than a predetermined threshold value (see FIG. 8). If the overlap rate between the image for synthesis 92 and the display image 94 is equal to or less than the predetermined threshold value, the determination is affirmative and the imaging process proceeds to step ST28. If the overlap rate between the image for synthesis 92 and the display image 94 exceeds the predetermined threshold value, the determination is negative and the imaging process proceeds to step ST16.
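The disclosure does not specify how the overlap rate is derived from the comparison result information, so the following is only one hypothetical sketch: matched feature descriptors are used to estimate the displacement between the two images, and that displacement is converted into an overlap rate along the movement axis for the threshold check of step ST26. The function name, the use of ORB-style binary descriptors, and the linear conversion are all assumptions.

```python
# Hypothetical sketch of steps ST24/ST26: estimate the overlap rate from
# matched feature points of the first and second image regions.
import cv2
import numpy as np

def estimate_overlap_rate(desc1, pts1, desc2, pts2, image_size, axis=0):
    """desc1/pts1: descriptors and (x, y) coordinates of the first feature
    points (first image region 98); desc2/pts2: the same for the second
    feature points (second image region 102); image_size: (width, height);
    axis: 0 for horizontal movement, 1 for vertical movement."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc1, desc2)
    if not matches:
        return 0.0
    # Robust estimate of how far the scene has shifted between the two images.
    shifts = [pts1[m.queryIdx][axis] - pts2[m.trainIdx][axis] for m in matches]
    shift = abs(float(np.median(shifts)))
    return max(0.0, 1.0 - shift / image_size[axis])
```

The value returned by such a routine would then be compared with the predetermined threshold value to decide whether the next image for synthesis should be captured.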
In step ST28, the first imaging control unit 62 causes the image sensor 36 to image the imaging target area 3A (see FIG. 9 or FIG. 12). Image data for synthesis is obtained by imaging the imaging target area 3A with the image sensor 36 under the control of the first imaging control unit 62. After the processing of step ST28 is executed, the imaging process proceeds to step ST30.
In step ST30, the second movement determination unit 80 determines whether the flying object 10 is continuing to move, based on the movement instruction signal received by the imaging device 30 and the movement direction information generated in step ST16 (see FIG. 10 or FIG. 13). If the flying object 10 is continuing to move, the determination is affirmative and the imaging process proceeds to step ST32. If the flying object 10 is not continuing to move, that is, if the flying object 10 has stopped moving, the determination is negative and the imaging process proceeds to step ST34.
In step ST32, the first feature point information generation unit 64 generates first feature point information indicating the coordinates of the first feature points for each first image region 98 of the image for synthesis 92, based on the movement direction information obtained in step ST16 and the image data for synthesis acquired in step ST28 (see FIG. 10). After the processing of step ST32 is executed, the imaging process proceeds to step ST36.
In step ST34, the first feature point information generation unit 64 extracts, for each first divided region 96, the first feature points included in the entire area of the image for synthesis 92 based on the image data for synthesis acquired in step ST28, and generates first feature point information indicating the coordinates of the extracted first feature points for each first divided region 96 (see FIG. 13). After the processing of step ST34 is executed, the imaging process proceeds to step ST36.
In step ST36, the processor 46 determines whether a condition for ending the imaging process (an end condition) is satisfied. Examples of the end condition include a condition that the user has given the imaging device 30 an instruction to end the imaging process, and a condition that the number of images for synthesis 92 has reached a number designated by the user. If the end condition is not satisfied, the determination is negative and the imaging process proceeds to step ST14. If the end condition is satisfied, the determination is affirmative and the imaging process ends. The imaging method described above as the operation of the flight imaging device 1 is an example of the "imaging method" according to the technology of the present disclosure.
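The flow of FIG. 18 can be condensed into the following control-loop sketch. The helper object and its method names are hypothetical stand-ins for the processing units described above, not interfaces defined by the present disclosure.

```python
def imaging_process(io, overlap_threshold, required_count):
    """Condensed sketch of steps ST10 to ST36 of FIG. 18; `io` bundles the
    camera, drone state, display, and feature-point routines."""
    synth = io.capture()                                   # ST10
    first_info = io.feature_points_per_region(synth)       # ST12
    captured = [synth]
    while True:
        while not io.is_moving():                          # ST14
            pass
        direction = io.movement_direction()                # ST16
        live = io.capture()                                # ST18
        io.show_live_view(live)                            # ST20
        second_info = io.rear_side_feature_points(live, direction)       # ST22
        overlap = io.compare(first_info, second_info, direction)         # ST24
        if overlap > overlap_threshold:                    # ST26: enough overlap remains,
            continue                                       # so keep moving and re-check
        synth = io.capture()                               # ST28
        captured.append(synth)
        if io.is_moving():                                 # ST30
            first_info = io.front_side_feature_points(synth, direction)  # ST32
        else:
            first_info = io.feature_points_per_region(synth)             # ST34
        if io.stop_requested() or len(captured) >= required_count:       # ST36
            return captured
```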
As described above, in the flight imaging device 1 according to the present embodiment, the processor 46 determines the overlap rate between the image for synthesis 92 and the display image 94 based on the result of comparing the first feature points included in the image for synthesis 92 with the second feature points included in the display image 94. When determining the overlap rate, the processor 46 changes, according to the movement direction, the first image region 98, which is the image region of the image for synthesis 92 used to obtain the first feature points, and the second image region 102, which is the image region of the display image 94 used to obtain the second feature points. Therefore, the processing load on the processor 46 for determining the overlap rate can be reduced compared with, for example, determining the overlap rate based on the result of comparing the first feature points included in the entire area of the image for synthesis 92 with the second feature points included in the entire area of the display image 94.
In addition, the processor 46 determines the overlap rate between the image for synthesis 92 and the display image 94 based on the result of comparing the first feature points included in the first image region 98 of the image for synthesis 92 with the second feature points included in the second image region 102 of the display image 94. Therefore, the determination accuracy of the overlap rate can be secured in the same way as when, for example, the overlap rate is determined based on the result of comparing the first feature points included in the entire area of the image for synthesis 92 with the second feature points included in the entire area of the display image 94.
In addition, the processor 46 causes the display device 24 to display the display image 94. Therefore, the position of the imaging target area 3A can be confirmed based on the display image 94.
In addition, the first image region 98 is a partial image region of the image for synthesis 92, and the second image region 102 is a partial image region of the display image 94. Therefore, the processing load on the processor 46 for comparing the first feature points with the second feature points can be reduced compared with, for example, comparing the first feature points included in the entire area of the first image with the second feature points included in the entire area of the second image.
In addition, the first image region 98 is a region located on the front side in the movement direction in the image for synthesis 92, and the second image region 102 is a region located on the rear side in the movement direction in the display image 94. Therefore, the overlap rate between the image for synthesis 92 and the display image 94 can be determined by comparing the first feature points included in the first image region 98 with the second feature points included in the second image region 102.
In addition, the processor 46 selects the first image region 98 from the plurality of first divided regions 96 obtained by dividing the image for synthesis 92. Therefore, the first image region 98 can be set by simpler processing than, for example, setting the first image region 98 from the image for synthesis 92 without dividing it into the plurality of first divided regions 96. Similarly, the processor 46 selects the second image region 102 from the plurality of second divided regions 100 obtained by dividing the display image 94. Therefore, the second image region 102 can be set by simpler processing than, for example, setting the second image region 102 from the display image 94 without dividing it into the plurality of second divided regions 100.
In addition, the first image region 98 includes three of the plurality of first divided regions 96, and the second image region 102 includes three of the plurality of second divided regions 100. Therefore, the determination accuracy of the overlap rate can be improved compared with, for example, a case where the first image region 98 includes fewer than three of the first divided regions 96 and the second image region 102 includes fewer than three of the second divided regions 100.
In addition, the first image region 98 and the second image region 102 are each rectangular regions. Therefore, the first feature points and the second feature points can be compared by simpler processing than, for example, a case where the first image region 98 and the second image region 102 each have a shape other than a rectangle.
In addition, the first feature points are feature points included only in the first image region 98, and the second feature points are feature points included only in the second image region 102. Therefore, the processing load on the processor 46 for determining the overlap rate can be reduced compared with, for example, a case where the first feature points include feature points of a region other than the first image region 98 and the second feature points include feature points of a region other than the second image region 102.
In addition, when the flying object 10 is moving, the processor 46 extracts the first feature points included in the first image region 98, which is a partial image region of the display image 94. Therefore, the processing load on the processor 46 for extracting the first feature points can be reduced compared with, for example, a case where the processor 46 extracts the first feature points included in the entire area of the display image 94 while the moving body is moving.
In addition, when the flying object 10 is stationary, the processor 46 extracts the feature points included in the entire area of the display image 94. Therefore, the first feature points included in the first image region 98 of the display image 94 can be extracted regardless of the direction in which the flying object 10 subsequently moves from the stationary state.
In addition, the area of the first image region 98 and the area of the second image region 102 are each set larger than the region that overlaps when adjacent images for synthesis 92 are combined (hereinafter simply referred to as the "overlap region"). Therefore, compared with, for example, a case where the area of the first image region 98 and the area of the second image region 102 are each set smaller than the overlap region, the determination accuracy in determining the overlap rate between the image for synthesis 92 and the display image 94 based on the first feature points and the second feature points can be secured.
In addition, the imaging device 30 is mounted on the flying object 10. Therefore, the imaging target area 3A can be imaged by the imaging device 30 while the flying object 10 is flying.
In the above embodiment, the first image region 98 includes three of the plurality of first divided regions 96, and the second image region 102 includes three of the plurality of second divided regions 100. However, as shown in FIG. 19 as an example, the first image region 98 may be one of the plurality of first divided regions 96, and the second image region 102 may be one of the plurality of second divided regions 100. The feature point comparison unit 76 may then compare the first feature points included in the first image region 98 consisting of one first divided region 96 with the second feature points included in the second image region 102 consisting of one second divided region 100, and generate comparison result information indicating the result of the comparison.
According to the example shown in FIG. 19, the processing load on the processor 46 for comparing the first feature points with the second feature points can be reduced compared with, for example, a case where the feature point comparison unit 76 compares the first feature points included in a first image region 98 consisting of three first divided regions 96 with the second feature points included in a second image region 102 consisting of three second divided regions 100.
In addition, although not illustrated, the first image region 98 may be two of the plurality of first divided regions 96, and the second image region 102 may be two of the plurality of second divided regions 100. The feature point comparison unit 76 may then compare the first feature points included in the first image region 98 consisting of two first divided regions 96 with the second feature points included in the second image region 102 consisting of two second divided regions 100, and generate comparison result information indicating the result of the comparison.
According to this example as well, the processing load on the processor 46 for comparing the first feature points with the second feature points can be reduced compared with, for example, a case where the feature point comparison unit 76 compares the first feature points included in a first image region 98 consisting of three first divided regions 96 with the second feature points included in a second image region 102 consisting of three second divided regions 100.
The first image region 98 may also include four or more of the plurality of first divided regions 96, and the second image region 102 may include four or more of the plurality of second divided regions 100.
In the above embodiment, the first image region 98 and the second image region 102 are each rectangular regions. However, as shown in FIG. 20 as an example, when the movement direction of the flying object 10 is an inclined direction (that is, a direction inclined with respect to the horizontal or vertical direction of the display image 94), the processor 46 may set the first image region 98 and the second image region 102 as follows. That is, the processor 46 may set, as the first image region 98, one of a plurality of first triangular regions 104 that include the four corners of the image for synthesis 92 (that is, the first triangular region 104 that includes the corner located on the front side in the movement direction among the four corners of the image for synthesis 92). The processor 46 may also set, as the second image region 102, one of a plurality of second triangular regions 106 that include the four corners of the display image 94 (that is, the second triangular region 106 that includes the corner located on the rear side in the movement direction among the four corners of the display image 94).
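The exact geometry of the triangular regions 104 and 106 is not spelled out in the text. The sketch below assumes, purely for illustration, that each triangular region is spanned by one image corner and the midpoints of its two adjacent edges; the figures may define the regions differently.

```python
# Hypothetical construction of a corner triangle for FIG. 20.
import numpy as np

def corner_triangle_mask(height, width, corner):
    """Return a boolean mask that is True inside the assumed triangular region
    containing the given corner ('tl', 'tr', 'bl', or 'br')."""
    ys, xs = np.mgrid[0:height, 0:width]
    u = xs / (width - 1)    # 0 at the left edge, 1 at the right edge
    v = ys / (height - 1)   # 0 at the top edge, 1 at the bottom edge
    if corner == "tl":
        return u + v <= 0.5
    if corner == "tr":
        return (1.0 - u) + v <= 0.5
    if corner == "bl":
        return u + (1.0 - v) <= 0.5
    if corner == "br":
        return (1.0 - u) + (1.0 - v) <= 0.5
    raise ValueError(f"unknown corner: {corner}")
```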
In the example shown in FIG. 20 as well, the feature point comparison unit 76 can generate comparison result information by comparing the first feature points included in the first image region 98 with the second feature points included in the second image region 102.
In the above embodiment, the second imaging control unit 70 causes the display image data to be output from the entire area of the image sensor 36. However, as shown in FIG. 21 as an example, the second imaging control unit 70 may acquire the movement direction of the flying object 10 based on the movement instruction signal and, by using the crop function of the image sensor 36, cause the display image data to be output only from the region 36A located on the front side in the movement direction among all the pixels of the image sensor 36. In other words, when acquiring the display image 94, the second imaging control unit 70 may cause the image sensor 36 to output only the image data used for comparing the first feature points with the second feature points. The second feature point information generation unit 74 may then generate the second feature point information relating to the second image region 102 of the display image 94 based on the display image data output only from the region 36A, which is located on the front side in the movement direction within the entire area of the image sensor 36.
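A sketch of such a crop is shown below. Reading out exactly half of the sensor, and the mapping between the movement direction and the sensor side, are illustrative assumptions; the disclosure only states that the read-out is limited to the region 36A on the front side in the movement direction, and the actual side on the sensor depends on the optics of the imaging device 30.

```python
# Assumed read-out window for the crop of FIG. 21 (cardinal directions only).
def readout_window(sensor_width, sensor_height, direction):
    """Return (x, y, width, height) of the cropped read-out window."""
    half_w, half_h = sensor_width // 2, sensor_height // 2
    windows = {
        "right": (half_w, 0, sensor_width - half_w, sensor_height),
        "left":  (0, 0, half_w, sensor_height),
        "up":    (0, 0, sensor_width, half_h),
        "down":  (0, half_h, sensor_width, sensor_height - half_h),
    }
    return windows[direction]
```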
According to the example shown in FIG. 21, the processing load on the processor 46 for generating the second feature point information relating to the second image region 102 of the display image 94 can be reduced compared with, for example, a case where the display image data is output from the entire area of the image sensor 36.
When acquiring the display image 94, the processor 46 may also acquire, from the display image data output from the entire area of the image sensor 36, only the image data used for comparing the first feature points with the second feature points. According to this example, the processing load on the processor 46 for generating the second feature point information can be reduced compared with a case where the second feature point information relating to the entire area of the display image 94 is generated based on the display image data output from the entire area of the image sensor 36.
In addition, as shown in FIGS. 22 and 23 as an example, the first feature point information generation unit 64 may change the area of the first image region 98 in the image for synthesis 92 according to the movement speed of the flying object 10. Similarly, the second feature point information generation unit 74 may change the area of the second image region 102 in the display image 94 according to the movement speed of the flying object 10.
Specifically, in the example shown in FIG. 22, the first feature point information generation unit 64 increases the area of the first image region 98 as the movement speed of the flying object 10 increases. Similarly, the second feature point information generation unit 74 increases the area of the second image region 102 as the movement speed of the flying object 10 increases. According to the example shown in FIG. 22, even if the movement speed of the flying object 10 increases, the determination accuracy of the overlap rate can be improved compared with, for example, a case where the area of the first image region 98 and the area of the second image region 102 are kept constant.
On the other hand, in the example shown in FIG. 23, the first feature point information generation unit 64 reduces the area of the first image region 98 as the movement speed of the flying object 10 increases. Similarly, the second feature point information generation unit 74 reduces the area of the second image region 102 as the movement speed of the flying object 10 increases. According to the example shown in FIG. 23, even if the movement speed of the flying object 10 increases, the processing load on the processor 46 for determining the overlap rate can be reduced compared with, for example, a case where the area of the first image region 98 and the area of the second image region 102 are kept constant.
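FIGS. 22 and 23 only state monotonic relationships between the movement speed and the region areas. The linear scaling, its gain, and the clamping in the following sketch are assumptions made for illustration.

```python
# Illustrative speed-dependent sizing of the first/second image region.
def region_fraction(speed, base=1.0 / 3.0, gain=0.05, increase=True):
    """Return the fraction of the image (along the movement axis) used as the
    first or second image region, scaled with the movement speed."""
    delta = gain * speed
    fraction = base + delta if increase else base - delta
    return min(max(fraction, 0.1), 0.9)   # keep a usable, non-degenerate region
```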
In the examples shown in FIGS. 22 and 23, both the area of the first image region 98 and the area of the second image region 102 are changed, but only one of the area of the first image region 98 and the area of the second image region 102 may be changed.
In the above embodiment, when determining the overlap rate, the processor 46 changes both the first image region 98 included in the image for synthesis 92 and the second image region 102 included in the display image 94 according to the movement direction. However, when determining the overlap rate, the processor 46 may change only one of the first image region 98 included in the image for synthesis 92 and the second image region 102 included in the display image 94 according to the movement direction.
In the above embodiment, the number of first divided regions 96 is nine as an example, but the number of first divided regions 96 may be other than nine (for example, four, six, eight, or sixteen). Similarly, the number of second divided regions 100 is nine as an example, but the number of second divided regions 100 may be other than nine (for example, four, six, eight, or sixteen). The number of first divided regions 96 and the number of second divided regions 100 may also be different from each other.
In the above embodiment, the movement direction information generation unit 68 acquires the movement instruction signal transmitted from the transmitter 20 and generates the movement direction information based on the acquired movement instruction signal. However, the movement direction information generation unit 68 may, for example, acquire a movement signal generated by the flying object 10 and generate the movement direction information based on the acquired movement signal. The movement direction information generation unit 68 may also generate the movement direction information based on information acquired from various devices mounted on the imaging device 30 and/or the flying object 10. Examples of such devices include a satellite positioning system (for example, a global positioning system), an accelerometer, and a three-dimensional sensor (for example, a LiDAR sensor or a stereo camera).
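When the movement direction is derived from such sensors rather than from the movement instruction signal, a velocity vector can be quantized into the eight directions of FIG. 15. The following helper is a hypothetical illustration; the disclosure does not prescribe this computation.

```python
# Quantize a velocity vector along the wall surface 2A into eight directions.
import math

def classify_direction(vx, vy):
    """vx: rightward speed, vy: upward speed along the wall surface 2A.
    Returns one of the eight movement-direction labels of FIG. 15."""
    labels = ["right", "upper_right", "up", "upper_left",
              "left", "lower_left", "down", "lower_right"]
    angle = math.degrees(math.atan2(vy, vx)) % 360.0
    return labels[int(((angle + 22.5) % 360.0) // 45.0)]
```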
In the above embodiment, an example in which the imaging device 30 is mounted on the flying object 10 is described, but the imaging device 30 may be mounted on various other moving bodies (for example, a gondola, an automatic transfer robot, an automated guided vehicle, or a high-place inspection vehicle).
In the above embodiment, the image for synthesis 92 is acquired by the imaging device 30, but the image for synthesis 92 may be acquired by an imaging device other than the imaging device 30. The overlap rate between the image for synthesis 92 acquired by the other imaging device and the display image 94 acquired by the imaging device 30 may then be determined.
In the above embodiment, the overlap rate between the live view image serving as the display image 94 and the image for synthesis 92 is determined, but the overlap rate between a post-view image serving as the display image 94 and the image for synthesis 92 may be determined instead.
In the above embodiment, the overlap rate between the image for synthesis 92 and the display image 94 is determined, but the overlap rate between a first image other than the image for synthesis 92 and a second image other than the display image 94 may be determined.
In each of the above embodiments, the processor 46 is given as an example, but at least one other CPU, at least one GPU, and/or at least one TPU may be used instead of, or together with, the processor 46.
In each of the above embodiments, a form in which the imaging program 60 is stored in the storage 48 has been described as an example, but the technology of the present disclosure is not limited to this. For example, the imaging program 60 may be stored in a portable non-transitory computer-readable storage medium such as an SSD or a USB memory (hereinafter simply referred to as a "non-transitory storage medium"). The imaging program 60 stored in the non-transitory storage medium is installed in the computer 32 of the imaging device 30, and the processor 46 executes processing in accordance with the imaging program 60.
The imaging program 60 may also be stored in a storage device of another computer, a server device, or the like connected to the imaging device 30 via a network, and the imaging program 60 may be downloaded in response to a request from the imaging device 30 and installed in the computer 32.
It is not necessary to store the entire imaging program 60 in a storage device of another computer or server device connected to the imaging device 30, or in the storage 48; a part of the imaging program 60 may be stored.
Although the computer 32 is built into the imaging device 30, the technology of the present disclosure is not limited to this; for example, the computer 32 may be provided outside the imaging device 30.
In each of the above embodiments, the computer 32 including the processor 46, the storage 48, and the RAM 50 is given as an example, but the technology of the present disclosure is not limited to this; a device including an ASIC, an FPGA, and/or a PLD may be applied instead of the computer 32. A combination of a hardware configuration and a software configuration may also be used instead of the computer 32.
The following various processors can be used as hardware resources for executing the various kinds of processing described in each of the above embodiments. One example of such a processor is a CPU, which is a general-purpose processor that functions as a hardware resource for executing the various kinds of processing by executing software, that is, a program. Another example is a dedicated electronic circuit, which is a processor having a circuit configuration designed exclusively for executing specific processing, such as an FPGA, a PLD, or an ASIC. A memory is built into or connected to every processor, and every processor executes the various kinds of processing by using the memory.
The hardware resource for executing the various kinds of processing may be configured with one of these various processors, or with a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). The hardware resource for executing the various kinds of processing may also be a single processor.
As examples of a configuration using a single processor, first, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as the hardware resource for executing the various kinds of processing. Second, as typified by an SoC, there is a form in which a processor that realizes, with a single IC chip, the functions of the entire system including a plurality of hardware resources for executing the various kinds of processing is used. In this way, the various kinds of processing are realized by using one or more of the above various processors as hardware resources.
Furthermore, as the hardware structure of these various processors, more specifically, an electronic circuit in which circuit elements such as semiconductor elements are combined can be used. The imaging process described above is merely an example. Accordingly, it goes without saying that unnecessary steps may be deleted, new steps may be added, and the processing order may be changed without departing from the gist of the disclosure.
The contents described and illustrated above are detailed descriptions of the portions related to the technology of the present disclosure and are merely examples of the technology of the present disclosure. For example, the above descriptions of configurations, functions, operations, and effects are descriptions of examples of the configurations, functions, operations, and effects of the portions related to the technology of the present disclosure. Accordingly, it goes without saying that unnecessary portions may be deleted, new elements may be added, and replacements may be made with respect to the contents described and illustrated above without departing from the gist of the technology of the present disclosure. In addition, in order to avoid complication and to facilitate understanding of the portions related to the technology of the present disclosure, descriptions of common technical knowledge and the like that do not particularly require explanation for enabling the implementation of the technology of the present disclosure are omitted from the contents described and illustrated above.
In this specification, "A and/or B" is synonymous with "at least one of A and B." That is, "A and/or B" means that it may be only A, only B, or a combination of A and B. In this specification, the same concept as "A and/or B" is also applied when three or more matters are expressed by being connected with "and/or."
All documents, patent applications, and technical standards described in this specification are incorporated herein by reference to the same extent as if each individual document, patent application, or technical standard were specifically and individually indicated to be incorporated by reference.
Claims (20)
- プロセッサを備える撮像装置であって、
前記プロセッサは、
第1画像を取得し、
前記撮像装置が搭載された移動体の移動方向を取得し、
前記移動体が移動した場合に、前記撮像装置によって撮像されることで得られた第2画像を取得し、
前記第1画像に含まれる第1特徴点と前記第2画像に含まれる第2特徴点とを比較した結果に基づいて、前記第1画像と前記第2画像とのオーバーラップ率を判定し、
前記オーバーラップ率を判定する場合に、前記第1画像に含まれる画像領域であって、前記第1特徴点を得るための画像領域である第1画像領域、及び/又は前記第2画像に含まれる画像領域であって、前記第2特徴点を得るための画像領域である第2画像領域を前記移動方向に応じて変更する
撮像装置。 An imaging device comprising a processor,
The processor
obtain a first image;
Acquiring a moving direction of a moving body on which the imaging device is mounted,
Acquiring a second image obtained by being imaged by the imaging device when the moving object moves,
Determining an overlap rate between the first image and the second image based on the result of comparing the first feature points included in the first image and the second feature points included in the second image,
When determining the overlap rate, an image area included in the first image, which is an image area for obtaining the first feature point, and/or included in the second image an imaging device that changes a second image area, which is an image area for obtaining the second feature point, according to the moving direction. - 前記プロセッサは、前記第2画像を表示用画像として表示装置に表示させる
The imaging device according to claim 1, wherein the processor causes a display device to display the second image as a display image.
- The imaging device according to claim 1 or 2, wherein the first image area is a partial image area of the first image, and the second image area is a partial image area of the second image.
- The imaging device according to any one of claims 1 to 3, wherein the first image area is an area located on the front side in the movement direction in the first image, and the second image area is an area located on the rear side in the movement direction in the second image.
- The imaging device according to any one of claims 1 to 4, wherein the processor selects the first image area from a plurality of first divided areas obtained by dividing the first image, and selects the second image area from a plurality of second divided areas obtained by dividing the second image.
- The imaging device according to claim 5, wherein the first image area includes two or more of the plurality of first divided areas, and the second image area includes two or more of the plurality of second divided areas.
- The imaging device according to claim 5, wherein the first image area is one of the plurality of first divided areas, and the second image area is one of the plurality of second divided areas.
- The imaging device according to any one of claims 1 to 7, wherein the first image area and the second image area are each rectangular areas.
- The imaging device according to any one of claims 1 to 7, wherein, when the movement direction is inclined with respect to the horizontal or vertical direction of the second image, the processor sets, as the first image area, a first triangular area including the corner located on the front side in the movement direction among the four corners of the first image, and sets, as the second image area, a second triangular area including the corner located on the rear side in the movement direction among the four corners of the second image.
- The imaging device according to any one of claims 1 to 9, wherein, when acquiring the second image, the processor acquires from the image sensor only the image data used to compare the first feature point with the second feature point.
- The imaging device according to any one of claims 1 to 10, wherein the first feature point is a feature point included only in the first image area, and the second feature point is a feature point included only in the second image area.
- The imaging device according to any one of claims 1 to 11, wherein the processor changes the area of the first image area and/or the area of the second image area according to the moving speed of the moving object.
- The imaging device according to claim 12, wherein the processor increases the area of the first image area and/or the area of the second image area as the moving speed of the moving object increases.
- The imaging device according to claim 12, wherein the processor decreases the area of the first image area and/or the area of the second image area as the moving speed of the moving object increases.
- The imaging device according to any one of claims 1 to 14, wherein, when the moving object is moving, the processor extracts the first feature point included in the first image area, which is a partial image area of the first image.
- The imaging device according to any one of claims 1 to 15, wherein, when the moving object is stationary, the processor extracts feature points included in the entire area of the first image.
- The imaging device according to any one of claims 1 to 16, wherein the area of the first image area and the area of the second image area are each set to an area larger than an overlap area, which is the area in which adjacent first images overlap when they are combined.
- The imaging device according to any one of claims 1 to 17, wherein the moving object is a flying object.
- An imaging method comprising: acquiring a first image; acquiring a movement direction of a moving object on which an imaging device is mounted; acquiring a second image obtained by imaging with the imaging device after the moving object has moved; determining an overlap ratio between the first image and the second image based on a result of comparing a first feature point included in the first image with a second feature point included in the second image; and, when determining the overlap ratio, changing, according to the movement direction, a first image area, which is an image area included in the first image and is used to obtain the first feature point, and/or a second image area, which is an image area included in the second image and is used to obtain the second feature point.
- A program for causing a computer to execute a process comprising: acquiring a first image; acquiring a movement direction of a moving object on which an imaging device is mounted; acquiring a second image obtained by imaging with the imaging device after the moving object has moved; determining an overlap ratio between the first image and the second image based on a result of comparing a first feature point included in the first image with a second feature point included in the second image; and, when determining the overlap ratio, changing, according to the movement direction, a first image area, which is an image area included in the first image and is used to obtain the first feature point, and/or a second image area, which is an image area included in the second image and is used to obtain the second feature point.
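For illustration only (this sketch is not taken from the application), the following Python/NumPy code shows one way the direction-dependent image areas recited above could be built as binary masks: a strip at the leading edge of the first image, a strip at the trailing edge of the second image, triangular corner areas when the movement direction is diagonal, and the whole image while the moving object is stationary. The function name `region_mask`, the `fraction` size parameter, and the assumption that the movement direction is already expressed in image coordinates are all introduced here.

```python
import numpy as np

def region_mask(shape, direction, role, fraction=0.4):
    """Binary mask for the image area used to extract feature points.

    shape     : (height, width) of the image
    direction : (dx, dy) movement direction of the moving object, in image coordinates
    role      : "first" keeps the front-side area, "second" keeps the rear-side area
    fraction  : rough size of the kept area along the movement axis (assumed value)
    """
    h, w = shape
    dx, dy = direction
    mask = np.zeros((h, w), dtype=np.uint8)

    if dx == 0 and dy == 0:
        # Stationary: use the whole image (compare the stationary-case claim above).
        mask[:] = 255
        return mask

    if role == "second":
        dx, dy = -dx, -dy  # the rear side is simply the opposite direction

    if dx != 0 and dy != 0:
        # Diagonal movement: keep a triangular area containing the corner that
        # the (possibly flipped) direction points at.
        ys, xs = np.mgrid[0:h, 0:w]
        u = xs / (w - 1) if dx > 0 else 1 - xs / (w - 1)
        v = ys / (h - 1) if dy > 0 else 1 - ys / (h - 1)
        mask[u + v >= 2 - 2 * fraction] = 255
    elif dx != 0:
        # Horizontal movement: keep a rectangular strip at the leading edge.
        band = max(1, int(w * fraction))
        if dx > 0:
            mask[:, w - band:] = 255
        else:
            mask[:, :band] = 255
    else:
        # Vertical movement: keep a rectangular strip at the leading edge.
        band = max(1, int(h * fraction))
        if dy > 0:
            mask[h - band:, :] = 255
        else:
            mask[:band, :] = 255
    return mask
```

The exact sizing of the area is not specified by the claims, so the `fraction` value here is arbitrary.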
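Claims 5 to 7 instead select the first and second image areas from divided areas obtained by splitting each image. A minimal sketch, assuming a grid split and a hypothetical helper `pick_divided_regions`, could rank the cells along the movement direction and keep one of them (claim 7) or several (claim 6):

```python
def pick_divided_regions(shape, direction, role, grid=(3, 3), count=1):
    """Choose divided regions along the movement direction.

    Returns the (row, col) indices of the selected grid cells; the 3x3 grid and
    the count parameter are assumptions, not values from the application.
    """
    rows, cols = grid
    dx, dy = direction
    if role == "second":          # rear side: look the opposite way
        dx, dy = -dx, -dy

    # Score each cell by how far its centre lies toward the movement direction
    # (image coordinates, origin shifted to the image centre) and keep the best.
    scored = []
    for r in range(rows):
        for c in range(cols):
            cx = (c + 0.5) / cols - 0.5
            cy = (r + 0.5) / rows - 0.5
            scored.append((cx * dx + cy * dy, (r, c)))
    scored.sort(reverse=True)
    return [cell for _, cell in scored[:count]]
```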
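The overlap ratio itself is determined by comparing the first feature points with the second feature points taken only from those areas, and claims 12 to 14 allow the examined area to be enlarged or reduced with the moving speed. The sketch below reuses `region_mask` from the first example and uses OpenCV's ORB detector and brute-force matcher purely as stand-ins; the application does not specify a feature detector, a matching rule, or the speed scaling applied here.

```python
import cv2

def overlap_ratio(first_img, second_img, direction, speed, base_fraction=0.4):
    """Estimate the overlap ratio between two frames from feature points that lie
    inside the direction-dependent first and second image areas.

    first_img, second_img : grayscale uint8 arrays (assumed input format for ORB)
    direction             : (dx, dy) movement direction in image coordinates
    speed                 : moving speed of the moving object (arbitrary units)
    """
    # The examined area is grown with speed here; shrinking it is equally possible
    # under the claims, and the 0.05 factor is made up for illustration.
    fraction = min(0.9, base_fraction * (1.0 + 0.05 * speed))

    mask1 = region_mask(first_img.shape[:2], direction, "first", fraction)
    mask2 = region_mask(second_img.shape[:2], direction, "second", fraction)

    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(first_img, mask1)   # first feature points only
    kp2, des2 = orb.detectAndCompute(second_img, mask2)  # second feature points only
    if des1 is None or des2 is None:
        return 0.0

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    # Crude proxy: the share of first feature points that found a counterpart
    # in the second image is taken as the overlap ratio.
    return len(matches) / max(len(kp1), 1)
```

The returned value could then be checked against a target overlap, for example the overlap needed to combine adjacent images as in claim 17, to decide when to capture the next image.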
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2023573861A (JPWO2023135910A1) | 2022-01-17 | 2022-11-07 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2022-005320 | 2022-01-17 | |
JP2022005320 | 2022-01-17 | |
Publications (1)
Publication Number | Publication Date
---|---
WO2023135910A1 (en) | 2023-07-20
Family
ID=87278847
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/JP2022/041381 (WO2023135910A1) | Image-capturing device, image-capturing method, and program | 2022-01-17 | 2022-11-07
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023135910A1 (en) |
WO (1) | WO2023135910A1 (en) |
2022
- 2022-11-07: WO application PCT/JP2022/041381 (WO2023135910A1), status: active, Application Filing
- 2022-11-07: JP application JP2023573861A (JPWO2023135910A1), status: active, Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013015148A1 (en) * | 2011-07-27 | 2013-01-31 | Olympus Corporation | Image processing system, information processing device, program, and image processing method
WO2018180214A1 (en) * | 2017-03-28 | 2018-10-04 | FUJIFILM Corporation | Image processing device, camera device, and image processing method
WO2019087247A1 (en) * | 2017-10-30 | 2019-05-09 | OPTiM Corporation | Long image generation system, method and program
Also Published As
Publication number | Publication date |
---|---|
JPWO2023135910A1 (en) | 2023-07-20 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22919295; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 2023573861; Country of ref document: JP
 | NENP | Non-entry into the national phase | Ref country code: DE