CN114586335A - Image processing apparatus, image processing method, program, and recording medium


Info

Publication number
CN114586335A
Authority
CN
China
Prior art keywords
image
flight
moving image
flying
flying object
Prior art date
Legal status
Pending
Application number
CN202080074343.6A
Other languages
Chinese (zh)
Inventor
周杰旻
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN114586335A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The image quality of a moving image is expected to be improved by synthesizing a plurality of moving images captured by a flying object while in flight. The image processing device processes a moving image captured by an imaging unit included in the flying object. A processing unit of the image processing device specifies a flight path along which the flying object flies, causes the flying object to fly around the flight path a plurality of times, causes the imaging unit included in the flying object to capture, over the plurality of circling flights, a plurality of moving images having the same capture range, and generates a composite moving image by synthesizing the plurality of moving images captured over the plurality of circling flights.

Description

Image processing apparatus, image processing method, program, and recording medium
Technical Field
The present disclosure relates to an image processing apparatus, an image processing method, a program, and a recording medium.
Background
Conventionally, image combining techniques for combining a plurality of images are known. Combining a plurality of images can improve image quality. Patent document 1 discloses an image processing apparatus that performs image synthesis. The image processing apparatus includes a synthesizing section that synthesizes a plurality of images captured at different points in time, and a motion correcting section that corrects the images so as to reduce the influence of motion on them.
Background art documents:
[Patent Document]
[Patent Document 1] European Patent Application Publication No. 3450310
Disclosure of Invention
Technical problems to be solved by the invention:
The image processing apparatus of patent document 1 captures a plurality of still images with the position of the imaging apparatus fixed, and then synthesizes the plurality of still images (Image Stacking). However, it does not consider the case where a plurality of moving images are synthesized (Video Stacking) while the imaging device is moving, for example when the imaging device is mounted on a flying object. It is expected that the image quality of a moving image can be improved by synthesizing a plurality of moving images captured by a flying object while in flight.
Means for solving the technical problem:
in one aspect, an image processing apparatus that processes a moving image captured by an imaging unit included in a flying object includes a processing unit that specifies a flight path along which the flying object flies; causing the flying body to fly around the flight path a plurality of times; causing an imaging unit included in the flying object to capture a plurality of moving images having the same capture range by a plurality of circling flights; a plurality of moving images shot by a plurality of times of circling flight are synthesized to generate a synthesized moving image.
The dynamic image may have a plurality of image frames in time series order. The processing section may control the flying object so that each of the image frames of the plurality of dynamic images at the same relative time has the same shooting range.
The processing unit may acquire the state of the flying object in synchronization with the vertical synchronization signal of the imaging unit during the flight of the first round of the flight path, and, during the flight of the second and subsequent rounds of the flight path, control the flight of the flying object and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that imaging is performed in the same state as the state of the flying object in the first round.
The state of the flying object may include at least one of a position of the flying object, an orientation of the flying object, and an angle of a gimbal supporting the imaging unit.
The processing unit may generate a composite moving image from the first moving image obtained in the first pass and the second moving images obtained in the second and subsequent passes.
The processing unit may compare the first moving image with the second moving image for each image frame of the same relative time, and perform motion compensation of the second moving image with respect to the first moving image based on the comparison result.
The motion compensation may comprise global motion compensation.
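As a purely illustrative sketch (not part of the disclosure), the following Python/OpenCV snippet shows how an already-estimated global motion model might be applied to bring a second-round frame into the coordinate system of the corresponding first-round frame; the function name, the 2x3 affine form of the model, and the use of OpenCV are all assumptions.

```python
# Hedged sketch: apply an estimated global (affine) motion model so that a
# second-round frame overlays the first-round frame of the same relative time.
# OpenCV and the names used here are illustrative assumptions only.
import cv2
import numpy as np

def compensate_global_motion(second_frame: np.ndarray, warp: np.ndarray) -> np.ndarray:
    """Warp a second-round frame (H x W x 3) with the 2x3 affine matrix `warp`."""
    h, w = second_frame.shape[:2]
    return cv2.warpAffine(second_frame, warp, (w, h),
                          flags=cv2.INTER_LINEAR,
                          borderMode=cv2.BORDER_REPLICATE)
```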
The processing unit may generate the composite moving image from statistics of the same pixel in the image frames at the same relative time in the first moving image and the second moving image.
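A minimal sketch of such a per-pixel statistic, assuming the frames of the same relative time have already been aligned; the mean is used here, but the disclosure leaves the statistic open (a median or weighted mean would follow the same pattern):

```python
# Hedged sketch: combine aligned frames of the same relative time into one
# composite frame using a per-pixel statistic (here, the mean).
import numpy as np

def stack_frames(aligned_frames: list[np.ndarray]) -> np.ndarray:
    stack = np.stack([f.astype(np.float32) for f in aligned_frames], axis=0)
    composite = stack.mean(axis=0)  # alternative: np.median(stack, axis=0)
    return np.clip(composite, 0, 255).astype(np.uint8)
```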
The processing unit may compare the first moving image with the second moving image for each image frame of the same relative time, extract a feature region from the second moving image, and replace the region in the first moving image corresponding to the feature region with the feature region of the second moving image.
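One possible reading of this feature-region replacement, sketched under the assumption that the feature region is available as a binary mask derived from the frame comparison (how the mask is computed is not fixed by the disclosure):

```python
# Hedged sketch: paste the masked feature region of a second-round frame over the
# corresponding region of the first-round frame; frames are assumed to be aligned.
import numpy as np

def replace_feature_region(first_frame: np.ndarray,
                           second_frame: np.ndarray,
                           feature_mask: np.ndarray) -> np.ndarray:
    out = first_frame.copy()
    out[feature_mask] = second_frame[feature_mask]  # feature_mask: boolean H x W array
    return out
```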
The processing unit may acquire the number of rounds the flying object has flown around the flight path, output the moving image captured in the latest round when the acquired number of rounds is less than a threshold value, and output the composite moving image when the acquired number of rounds is greater than or equal to the threshold value.
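This output selection amounts to a simple branch on the number of completed rounds; a sketch, with the threshold value and names chosen only for illustration:

```python
# Hedged sketch: choose what to output based on how many rounds have been flown.
def select_output(videos_per_round: list, composite_video, threshold: int = 2):
    if len(videos_per_round) < threshold:
        return videos_per_round[-1]  # moving image captured in the latest round
    return composite_video           # composite moving image once enough rounds exist
```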
The processing unit may evaluate the output composite moving image, end the flight and imaging of the flying object when the evaluation result of the composite moving image satisfies a preset reference, and perform flight and imaging along the flight path for the next round when the evaluation result of the composite moving image does not satisfy the preset reference.
The processing section may acquire operation information indicating an evaluation result of the composite moving image.
The processing unit may perform image recognition on the composite moving image, and evaluate the composite moving image based on the result of the image recognition.
The image processing apparatus may be a flying object.
In one aspect, an image processing method for processing a moving image captured by an imaging unit included in a flying object includes: specifying a flight path along which the flying object flies; causing the flying object to fly around the flight path a plurality of times; causing the imaging unit included in the flying object to capture, over the plurality of circling flights, a plurality of moving images having the same capture range; and synthesizing the plurality of moving images captured over the plurality of circling flights to generate a composite moving image.
The moving image may have a plurality of image frames in time-series order. The step of capturing the plurality of moving images may include controlling the flying object so that the image frames of the plurality of moving images at the same relative time each have the same capture range.
The step of capturing the plurality of moving images may include: acquiring the state of the flying object in synchronization with the vertical synchronization signal of the imaging unit during the flight of the first round of the flight path; and controlling the flight of the flying object and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit, during the flight of the second and subsequent rounds of the flight path, so that imaging is performed in the same state as the state of the flying object in the first round.
The state of the flying object may include at least one of a position of the flying object, an orientation of the flying object, and an angle of a gimbal supporting the imaging unit.
The step of generating the composite moving image may include generating the composite moving image from the first moving image obtained in the first round and the second moving images obtained in the second and subsequent rounds.
The step of generating the composite moving image may include: comparing the first moving image with the second moving image for each image frame of the same relative time; and performing motion compensation of the second moving image with respect to the first moving image based on the comparison result.
The motion compensation may comprise global motion compensation.
The step of generating a composite dynamic image may comprise the steps of: a composite moving image is generated from the statistical values of the same pixels of the image frames at the same relative time in the first moving image and the second moving image.
The step of generating the composite moving image may include: comparing the first moving image with the second moving image for each image frame of the same relative time; extracting a feature region from the second moving image; and replacing the region in the first moving image corresponding to the feature region with the feature region of the second moving image.
The method may further include: acquiring the number of rounds the flying object has flown around the flight path; outputting the moving image captured in the latest round when the acquired number of rounds is less than a threshold value; and outputting the composite moving image when the acquired number of rounds is greater than or equal to the threshold value.
The step of capturing the plurality of moving images may include: evaluating the output composite moving image; ending the flight and imaging of the flying object when the evaluation result of the composite moving image satisfies a preset reference; and performing flight and imaging along the flight path for the next round when the evaluation result of the composite moving image does not satisfy the preset reference.
The step of evaluating the composite moving image may include acquiring operation information indicating the evaluation result of the composite moving image.
The step of evaluating the composite moving image may include: performing image recognition on the composite moving image; and evaluating the composite moving image based on the result of the image recognition.
The image processing method may be performed by an image processing apparatus. The image processing apparatus may be a flying object.
In one aspect, a program causes an image processing apparatus that processes a moving image captured by an imaging unit included in a flying object to execute: specifying a flight path along which the flying object flies; causing the flying object to fly around the flight path a plurality of times; causing the imaging unit included in the flying object to capture, over the plurality of circling flights, a plurality of moving images having the same capture range; and synthesizing the plurality of moving images captured over the plurality of circling flights to generate a composite moving image.
In one aspect, a recording medium is a computer-readable recording medium having recorded thereon a program for causing an image processing apparatus that processes a moving image captured by an imaging unit included in a flying object to execute: specifying a flight path along which the flying object flies; causing the flying object to fly around the flight path a plurality of times; causing the imaging unit included in the flying object to capture, over the plurality of circling flights, a plurality of moving images having the same capture range; and synthesizing the plurality of moving images captured over the plurality of circling flights to generate a composite moving image.
Moreover, the summary of the invention described above is not exhaustive of all features of the disclosure. Furthermore, sub-combinations of these feature sets may also constitute the invention.
Drawings
Fig. 1 is a schematic diagram showing a configuration example of a flying body system in an embodiment.
Fig. 2 is a diagram showing one example of a concrete appearance of the unmanned aerial vehicle.
Fig. 3 is a block diagram showing one example of a hardware configuration of the unmanned aerial vehicle.
Fig. 4 is a block diagram showing one example of a hardware configuration of a terminal.
Fig. 5 is a diagram showing one example of an action summary of the unmanned aerial vehicle.
Fig. 6 is a flowchart showing an example of the operation of the unmanned aerial vehicle.
Fig. 7 is a flowchart showing a first example of moving image synthesis.
Fig. 8 is a flowchart showing a second example of moving image synthesis.
Fig. 9 is a flowchart showing an example of output of a moving image.
Description of the symbols:
10 flying body system
80 terminal
81 terminal control part
83 operating part
85 communication unit
87 storage unit
88 display part
100 unmanned aircraft
110 UAV control
150 communication unit
160 storage unit
200 universal joint
210 rotor mechanism
220 image pickup part
240 GPS receiver
250 inertia measuring device
260 magnetic compass
270 barometric altimeter
280 ultrasonic sensor
290 laser measuring device
Detailed Description
The present disclosure will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of features described in the embodiments are necessarily essential to the solution of the invention.
The claims, specification, drawings, and abstract contain matter subject to copyright protection. The copyright owner has no objection to facsimile reproduction of these documents by anyone, as they appear in the patent office files or records. In all other respects, all copyright is reserved.
In the following embodiments, the flying object is exemplified by an Unmanned Aerial Vehicle (UAV). The image processing device is, for example, an unmanned aircraft, but may be another device (for example, a terminal, a transmitter, a server, or another image processing device). An image processing method defines the operations of the image processing device. In addition, a program (for example, a program for causing the image processing device to execute various processes) is recorded in the recording medium.
The "unit" or "device" described in the following embodiments is not limited to a physical configuration realized by hardware, and includes a configuration in which functions of the configuration are realized by software such as a program. Further, the functions of one configuration may be realized by two or more physical structures, or the functions of two or more structures may be realized by one physical structure, for example. The term "acquisition" in the embodiments is not limited to the operation of directly acquiring information, signals, or the like, and may include, for example, any of acquisition, that is, reception, and acquisition from a storage unit (for example, a memory) by a processing unit via a communication unit. The terms are understood and interpreted identically to those described in the claims.
Fig. 1 is a schematic diagram showing an example of the configuration of a flight body system 10 in the embodiment. The flight body system 10 includes an unmanned aerial vehicle 100 and a terminal 80. The unmanned aerial vehicle 100 and the terminal 80 can communicate with each other through wired communication or wireless communication (e.g., a wireless LAN). In fig. 1, the terminal 80 exemplifies a portable terminal (e.g., a smartphone or a tablet terminal), but may be another terminal (e.g., a PC (Personal Computer), a transmitter (proportional controller) that can operate the unmanned aircraft 100 by a joystick).
Fig. 2 is a diagram showing one example of the concrete appearance of the unmanned aerial vehicle 100. Fig. 2 shows a perspective view of the unmanned aerial vehicle 100 when flying in the moving direction STV 0. The unmanned aerial vehicle 100 is an example of a mobile body.
As shown in fig. 2, the roll axis (refer to the x-axis) is set in a direction parallel to the ground and along the moving direction STV 0. In this case, the pitch axis (see the y axis) is set in the direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (see the z axis) is set in the direction perpendicular to the ground and perpendicular to the roll axis and the pitch axis.
The unmanned aerial vehicle 100 includes a UAV main body 102, a universal joint 200, an image pickup unit 220, and a plurality of image pickup units 230.
The UAV body 102 includes a plurality of rotors. The UAV body 102 flies the unmanned aircraft 100 by controlling the rotation of the plurality of rotors. The UAV body 102 flies the unmanned aircraft 100 using, for example, four rotors. The number of rotors is not limited to four. Further, the unmanned aircraft 100 may be a fixed-wing aircraft without rotors.
The image pickup unit 220 is a camera for photographing an object included in a desired photographing range (for example, an overhead object, a landscape such as a mountain and river, or a building on the ground).
The plurality of imaging units 230 are sensing cameras that capture images of the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100. The two cameras 230 may be provided on the nose, i.e., the front face, of the unmanned aircraft 100. The other two image pickup units 230 may be provided on the bottom surface of the unmanned aircraft 100. The two image pickup portions 230 on the front side may be paired to function as a so-called stereo camera. The two image pickup portions 230 on the bottom surface side may also be paired to function as a stereo camera. The three-dimensional spatial data around the unmanned aerial vehicle 100 can be generated based on the images captured by the plurality of imaging units 230. In addition, the number of the image pickup units 230 included in the unmanned aerial vehicle 100 is not limited to four. The unmanned aerial vehicle 100 may include at least one image pickup unit 230. The unmanned aircraft 100 may include at least one camera 230 at the nose, tail, sides, bottom, and top of the unmanned aircraft 100, respectively. The angle of view settable in the image pickup section 230 may be larger than the angle of view settable in the image pickup section 220. The image pickup section 230 may have a single focus lens or a fisheye lens.
Fig. 3 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 includes a UAV control Unit 110, a communication Unit 150, a storage Unit 160, a universal joint 200, a rotor mechanism 210, an imaging Unit 220, an imaging Unit 230, a GPS receiver 240, an Inertial Measurement Unit (IMU) 250, a magnetic compass 260, an air pressure altimeter 270, an ultrasonic sensor 280, and a laser Measurement device 290.
The UAV control Unit 110 is constituted by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The UAV control unit 110 performs signal processing for controlling the operation of each unit of the unmanned aircraft 100 as a whole, input/output processing of data with respect to other units, arithmetic processing of data, and storage processing of data.
The UAV control unit 110 may control the flight of the unmanned aircraft 100 according to programs stored in the storage unit 160. The UAV control unit 110 may control the flight in accordance with instructions for flight control given by maneuvering from the terminal 80 or the like. The UAV control unit 110 may cause images (e.g., moving images or still images, such as aerial images) to be captured.
The UAV controller 110 acquires position information indicating a position of the unmanned aircraft 100. The UAV controller 110 may obtain, from the GPS receiver 240, location information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 is located. The UAV control unit 110 may acquire latitude and longitude information indicating the latitude and longitude where the unmanned aircraft 100 is located from the GPS receiver 240, and may acquire altitude information indicating the altitude where the unmanned aircraft 100 is located from the barometric altimeter 270 as position information. The UAV control unit 110 may acquire the distance between the ultrasonic wave radiation point and the ultrasonic wave reflection point generated by the ultrasonic wave sensor 280 as the altitude information.
The UAV control 110 may obtain orientation information from the magnetic compass 260 that represents the orientation of the unmanned aerial vehicle 100. The orientation information may be represented by, for example, a bearing corresponding to the orientation of the nose of the unmanned aircraft 100.
The UAV control unit 110 may acquire position information indicating a position where the unmanned aircraft 100 should exist when the imaging unit 220 images an imaging range to be imaged. The UAV control 110 may obtain from the storage 160 location information indicating a location where the unmanned aerial vehicle 100 should be present. The UAV controller 110 may acquire, from another device, the position information indicating the position where the unmanned aerial vehicle 100 should exist through the communication unit 150. The UAV control section 110 may determine a position where the unmanned aircraft 100 may exist with reference to the three-dimensional map database, and acquire the position as position information indicating a position where the unmanned aircraft 100 should exist.
The UAV control unit 110 can acquire respective imaging ranges of the imaging unit 220 and the imaging unit 230. The UAV control section 110 may acquire, from the imaging section 220 and the imaging section 230, angle-of-view information indicating an angle of view of the imaging section 220 and the imaging section 230 as a parameter for determining an imaging range. The UAV control unit 110 may acquire information indicating the shooting directions of the imaging unit 220 and the imaging unit 230 as a parameter for determining the shooting range. The UAV control unit 110 may acquire posture information indicating a posture state of the imaging unit 220 from the universal joint 200 as information indicating an imaging direction of the imaging unit 220, for example. The attitude information of the imaging unit 220 may indicate an angle of rotation of the gimbal 200 from the pitch axis and yaw axis reference rotation angles.
The UAV control unit 110 may acquire position information indicating the position of the unmanned aircraft 100 as a parameter for determining the shooting range. The UAV control unit 110 may determine the shooting range indicating the geographical range to be imaged by the imaging unit 220, based on the angles of view and shooting directions of the imaging units 220 and 230 and the position of the unmanned aircraft 100.
The UAV control unit 110 may acquire shooting range information from the storage unit 160. The UAV control unit 110 may acquire shooting range information through the communication unit 150.
The UAV control unit 110 controls the universal joint 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230. The UAV control unit 110 may control the imaging range of the imaging unit 220 by changing the imaging direction or the angle of view of the imaging unit 220. The UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
The shooting range refers to a geographical range to be shot by the image pickup unit 220 or the image pickup unit 230. The shooting range is defined by latitude, longitude, and altitude. The photographing range may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude. The shooting range may be a range of two-dimensional spatial data defined by latitude and longitude. The photographing range may be determined based on the angle of view and the photographing direction of the photographing part 220 or the photographing part 230 and the position where the unmanned aerial vehicle 100 is located. The photographing direction of the image pickup unit 220 and the image pickup unit 230 may be defined by the azimuth and depression angle of the front faces of the image pickup unit 220 and the image pickup unit 230, at which the photographing lenses are provided. The shooting direction of the camera 220 may be a direction determined by the head orientation of the unmanned aerial vehicle 100 and the attitude state of the camera 220 with respect to the gimbal 200. The shooting direction of the camera section 230 may be a direction determined from the head orientation of the unmanned aerial vehicle 100 and the position where the camera section 230 is provided.
The UAV control 110 may determine the surroundings of the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of cameras 230. The UAV control 110 may control flight based on the surroundings of the unmanned aircraft 100, such as avoiding obstacles.
The UAV control unit 110 can acquire stereo information (three-dimensional information) indicating a stereo shape (three-dimensional shape) of an object existing around the unmanned aircraft 100. The object may be, for example, part of a landscape of a building, road, vehicle, tree, etc. The stereo information is, for example, three-dimensional spatial data. The UAV control unit 110 may acquire the stereoscopic information by generating the stereoscopic information indicating the stereoscopic shape of the object existing around the unmanned aircraft 100 from each of the images acquired by the plurality of imaging units 230. The UAV control unit 110 can acquire the stereoscopic information indicating the stereoscopic shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the storage unit 160. The UAV control section 110 can acquire the stereoscopic information relating to the stereoscopic shape of the object existing around the unmanned aerial vehicle 100 by referring to the three-dimensional map database managed by the server existing on the network.
UAV control 110 controls the flight of unmanned aircraft 100 by controlling rotor mechanism 210. That is, the UAV controller 110 controls the position including the latitude, longitude, and altitude of the unmanned aerial vehicle 100 by controlling the rotor mechanism 210. The UAV control unit 110 may control the shooting range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100. The UAV control unit 110 may control an angle of view of the image pickup unit 220 by controlling a zoom lens included in the image pickup unit 220. The UAV control unit 110 may control the angle of view of the image pickup unit 220 by digital zooming using a digital zoom function of the image pickup unit 220.
When the imaging unit 220 is fixed to the unmanned aircraft 100 and the imaging unit 220 cannot be moved, the UAV control unit 110 may cause the imaging unit 220 to image a desired imaging range in a desired environment by causing the unmanned aircraft 100 to move to a particular position at a particular date and time. Alternatively, even if the imaging unit 220 does not have a zoom function and the angle of view of the imaging unit 220 cannot be changed, the UAV control unit 110 may cause the imaging unit 220 to image a desired imaging range in a desired environment by moving the unmanned aerial vehicle 100 to a particular position at a particular date and time.
The communication unit 150 communicates with the terminal 80. The communication unit 150 can perform wireless communication by any wireless communication method. The communication unit 150 can perform wired communication by any wired communication method. The communication section 150 may transmit the captured image or additional information (metadata) about the captured image to the terminal 80.
The storage unit 160 may store various information, various data, various programs, and various images. The various images may include a captured image or an image based on a captured image. The programs may include programs necessary for the UAV control unit 110 to control the universal joint 200, the rotor mechanism 210, the camera unit 220, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measurement device 290. The storage unit 160 may be a computer-readable recording medium. The storage unit 160 includes a memory, and may include a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The storage unit 160 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB (Universal Serial Bus) memory, and other memories. At least a portion of the storage unit 160 may be detachable from the unmanned aircraft 100.
The gimbal 200 can rotatably support the imaging unit 220 around a yaw axis, a pitch axis, and a roll axis. The gimbal 200 can change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 around at least one of the yaw axis, pitch axis, and roll axis.
Rotor mechanism 210 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors. The rotary wing mechanism 210 is controlled to rotate by the UAV control 110, thereby flying the unmanned aerial vehicle 100.
The image pickup section 220 picks up an object in a desired shooting range and generates data of a shot image. Data of the captured image captured by the imaging unit 220 may be stored in the memory of the imaging unit 220 or the storage unit 160.
The imaging unit 230 captures an image of the periphery of the unmanned aircraft 100 and generates data of the captured image. The image data of the image pickup section 230 may be stored in the storage section 160.
The GPS receiver 240 receives a plurality of signals indicating time and the position (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (i.e., GPS satellites). The GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the unmanned aircraft 100) based on the plurality of received signals. The GPS receiver 240 outputs the position information of the unmanned aerial vehicle 100 to the UAV control section 110. In addition, the calculation of the position information of the GPS receiver 240 may be performed by the UAV control section 110 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement unit 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110. The inertial measurement unit 250 can detect the acceleration in the three-axis directions of the front-back, left-right, and up-down of the unmanned aerial vehicle 100 and the angular velocity in the three-axis directions of the pitch axis, roll axis, and yaw axis as the attitude of the unmanned aerial vehicle 100.
The magnetic compass 260 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control section 110.
The barometric altimeter 270 detects the flying height of the unmanned aircraft 100, and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the ultrasonic waves reflected by the ground or an object, and outputs the detection result to the UAV control unit 110. The detection result may show the distance from the unmanned aerial vehicle 100 to the ground, i.e., the altitude. The detection result may show the distance from the unmanned aerial vehicle 100 to the object (subject).
The laser surveying instrument 290 irradiates a laser beam on an object, receives reflected light reflected by the object, and measures a distance between the unmanned aircraft 100 and the object (subject) by the reflected light. As an example of the laser-based distance measuring method, a time-of-flight method may be cited.
Fig. 4 is a block diagram showing one example of the hardware configuration of the terminal 80. The terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a storage unit 87, and a display unit 88. The terminal 80 may be held by a user who wishes to direct flight control of the unmanned aerial vehicle 100. The terminal 80 may indicate flight control of the unmanned aircraft 100.
The terminal control unit 81 is configured by, for example, a CPU, an MPU, or a DSP. The terminal control unit 81 performs signal processing for controlling the operations of the respective units of the terminal 80 as a whole, data input/output processing with respect to the other units, data arithmetic processing, and data storage processing.
The terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85. The terminal control unit 81 may acquire data and information input via the operation unit 83. The terminal control unit 81 may acquire data or information stored in the storage unit 87. The terminal control unit 81 can transmit data and information to the unmanned aircraft 100 via the communication unit 85. The terminal control unit 81 may transmit the data and the information to the display unit 88 and cause the display unit 88 to display the display information based on the data and the information. The information displayed by the display section 88 and the information transmitted to the unmanned aerial vehicle 100 through the communication section 85 may include information of a flight path on which the unmanned aerial vehicle 100 flies, a shooting position, a shot image, an image based on the shot image (e.g., a composite image).
The operation unit 83 receives and acquires data and information input by the user of the terminal 80. The operation section 83 may include input devices such as buttons, keys, a touch panel, and a microphone. The touch panel may be configured by an operation portion 83 and a display portion 88. In this case, the operation section 83 can accept a touch operation, a click operation, a drag operation, and the like.
The communication unit 85 performs wireless communication with the unmanned aircraft 100 by various wireless communication methods. The wireless communication means of the wireless communication may include, for example, communication based on a wireless LAN or a public wireless network. The communication unit 85 can perform wired communication by any wired communication method.
The storage unit 87 can store various information, various data, various programs, and various images. The various programs may include application programs executed by the terminal 80. The storage section 87 may be a computer-readable recording medium. The storage section 87 may include ROM, RAM, and the like. The storage section 87 may include at least one of an HDD, an SSD, an SD card, a USB memory, and another memory. At least a portion of the storage portion 87 is detachable from the terminal 80.
The storage unit 87 may store a captured image acquired from the unmanned aircraft 100 or an image based on the captured image. The storage unit 87 may store the captured image or additional information of the image based on the captured image.
The display unit 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the terminal control unit 81. For example, the display unit 88 may display a captured image or an image based on the captured image. The display unit 88 may also display various data and information related to the execution of the application program.
The operation of the unmanned aircraft 100 will be described below.
Fig. 5 is a diagram showing an example of an outline of the operation of the unmanned aerial vehicle 100.
The UAV control unit 110 specifies a flight path RT. The UAV control unit 110 acquires a shooting range for shooting a moving image while flying along the flight path RT.
The shooting range is determined by the state of the unmanned aircraft 100. The state of the unmanned aircraft 100 may include at least one of the position of the unmanned aircraft 100, the orientation (e.g., the nose direction) of the unmanned aircraft 100, and the angle (rotation angle) of the gimbal 200 supporting the imaging unit 220. The state of the unmanned aircraft 100 may also include other state information (for example, flight information or imaging information) of the unmanned aircraft 100. For example, the UAV control unit 110 may acquire the position of the imaging unit 220 by GPS technology, or may acquire the position information of the unmanned aircraft 100 with high accuracy by RTK (Real Time Kinematic) GPS technology. The shooting range can be generated and acquired by the UAV control unit 110 from the positional relationship between the flight position along the flight path RT and the subject to be imaged. The shooting range may be stored in the storage unit 160 and acquired from the storage unit 160. The shooting range may also be acquired from an external server through the communication unit 150.
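As an illustration only (none of the formulas below appear in the disclosure), the centre of the ground area covered by the imaging unit can be roughly derived from the aircraft position, its heading, and the gimbal pitch under a flat-ground assumption:

```python
# Hedged sketch: approximate the center of the imaged ground area from the
# aircraft state; a flat ground plane and a local metric frame are assumed.
import math

def footprint_center(north_m: float, east_m: float, altitude_m: float,
                     heading_deg: float, gimbal_pitch_deg: float) -> tuple[float, float]:
    pitch = math.radians(gimbal_pitch_deg)  # degrees below the horizon
    if pitch <= 0:
        raise ValueError("camera must point below the horizon to image the ground")
    ground_distance = altitude_m / math.tan(pitch)  # horizontal offset of the view center
    heading = math.radians(heading_deg)
    return (north_m + ground_distance * math.cos(heading),
            east_m + ground_distance * math.sin(heading))
```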
The UAV control 110 causes the unmanned aerial vehicle 100 to fly along the acquired flight path RT. The imaging unit 220 captures a moving image by capturing an image of the acquired imaging range while the unmanned aircraft 100 is flying along the flight path.
The unmanned aircraft 100 flies along the same flight path RT a plurality of times and captures a moving image (video). A moving image is composed of an image sequence having a plurality of image frames. The moving image may have, for example, 30 image frames per second (30 fps) or 60 image frames per second (60 fps). The UAV control unit 110 causes the unmanned aircraft 100 to fly along the same flight path RT a plurality of times, and causes the imaging unit 220 to capture a moving image of the same shooting range a plurality of times.
As shown in Fig. 5, the UAV control unit 110 acquires a first image frame gf11, a second image frame gf12, a third image frame gf13, a fourth image frame gf14, and so on, from the imaging unit 220 in time-series order in the first turn of the flight path RT. The UAV control unit 110 acquires a first image frame gf21, a second image frame gf22, a third image frame gf23, a fourth image frame gf24, and so on, from the imaging unit 220 in the second turn of the flight path RT. The UAV control unit 110 acquires a first image frame gf31, a second image frame gf32, a third image frame gf33, a fourth image frame gf34, and so on, from the imaging unit 220 in the third turn of the flight path RT. In Fig. 5, the X-th image frame is simply referred to as the X-th frame.
In image frames of the same relative time (time-series position) in each turn, the same shooting range is captured. For example, in each turn, the image ranges of the first image frames gf11, gf21, and gf31 captured at the same relative time t1 correspond to the same shooting range. In each turn, the image ranges of the second image frames gf12, gf22, and gf32 captured at the same relative time t2 correspond to the same shooting range. In each turn, the image ranges of the third image frames gf13, gf23, and gf33 captured at the same relative time t3 correspond to the same shooting range. In each turn, the image ranges of the fourth image frames gf14, gf24, and gf34 captured at the same relative time t4 correspond to the same shooting range. The shooting range is the same when the state of the unmanned aircraft 100 is the same. Thus, the unmanned aircraft 100 can acquire a plurality of image frames captured at the same position. The unmanned aircraft 100 can perform such frame-by-frame imaging continuously by repeatedly flying and imaging along the flight path RT.
The UAV control unit 110 synthesizes the image frames of the same relative time in each turn, and obtains a composite image frame for each relative time. For example, the three first image frames gf11, gf21, and gf31 are synthesized to generate a first composite image frame. Similarly, a second composite image frame and subsequent composite image frames are generated for the second and subsequent image frames. The UAV control unit 110 generates a composite moving image including the composite image frames in time-series order.
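The frame bookkeeping of Fig. 5 can be pictured as below, assuming the frames of each turn are stored as lists indexed by relative time (frame index) and that each composite frame is produced by a per-pixel statistic such as the mean; the structure is an illustrative assumption, not the disclosed implementation:

```python
# Hedged sketch: turns[k][i] is the i-th image frame (relative time i) of turn k,
# e.g. turns[0][0] = gf11, turns[1][0] = gf21, turns[2][0] = gf31.
import numpy as np

def synthesize_video(turns: list[list[np.ndarray]]) -> list[np.ndarray]:
    n_frames = min(len(t) for t in turns)
    composite = []
    for i in range(n_frames):
        stack = np.stack([t[i].astype(np.float32) for t in turns], axis=0)
        composite.append(np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8))
    return composite  # composite moving image as a time-ordered list of frames
```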
In addition, the UAV control unit 110 may store state information of the unmanned aircraft 100 at the time each image frame is captured. The time when an image frame is captured, that is, the time when the state information of the unmanned aircraft 100 is acquired, may be synchronized with the vertical synchronization signal (VSYNC signal) of the imaging unit 220. The state of the unmanned aircraft 100 may be saved at least during the shooting of the first turn. As a result, the unmanned aircraft 100 can follow, during the second and subsequent turns of flight, the state it had during the first turn, and can capture moving images whose image frames have the same shooting range in the second and subsequent turns as well.
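The record-and-replay of the aircraft state per VSYNC can be sketched as follows; `read_state_at_vsync` and `apply_state_at_vsync` stand in for flight-controller and gimbal interfaces that the disclosure does not specify, so everything here is a hypothetical illustration:

```python
# Hedged sketch: record the aircraft state once per vertical sync in the first
# turn, then steer back to the same per-frame state in later turns.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class UavState:
    position: tuple          # e.g. (latitude, longitude, altitude)
    heading_deg: float       # nose orientation
    gimbal_pitch_deg: float  # angle of the gimbal supporting the imaging unit

def record_first_turn(n_frames: int,
                      read_state_at_vsync: Callable[[int], UavState]) -> List[UavState]:
    return [read_state_at_vsync(i) for i in range(n_frames)]

def replay_turn(states: List[UavState],
                apply_state_at_vsync: Callable[[int, UavState], None]) -> None:
    for i, state in enumerate(states):
        apply_state_at_vsync(i, state)  # same state -> same shooting range per frame
```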
Fig. 6 is a flowchart showing an example of the operation of the unmanned aerial vehicle 100.
First, the UAV control unit 110 specifies a flight path RT (S11). For example, the flight path RT may be designated by the user in advance through the operation unit 83 of the terminal 80, or may be designated by being acquired through the communication unit 85 and the communication unit 150. The flight path RT may be generated and designated by the UAV control unit 110 so that one or more desired subjects can be captured. The flight path RT may be stored in the storage unit 160 in advance, and may be obtained from the storage unit 160 and designated. The flight path RT may be specified by being acquired from an external server through the communication unit 150. For example, the flight path RT is a flight path that can photograph a desired object. In addition, the UAV control unit 110 may specify the flight path RT in accordance with a manual operation (manipulation) by the operation unit 83 of the terminal 80 during the first turn flight.
The UAV control unit 110 causes the imaging unit 220 to start imaging along the flight path RT in accordance with a predetermined imaging start trigger. The imaging start trigger may include reception by the communication unit 150 of an imaging start instruction from the terminal 80, or detection that a predetermined time to start imaging has been reached. The imaging start instruction may include selection of a video composition mode as the shooting mode, for example, through the operation unit 83 of the terminal 80.
The UAV controller 110 stores the state of the unmanned aircraft 100 at the time when the shooting of the moving image is started along the flight path RT in the storage 160 (S12). The UAV control unit 110 may acquire the state of the unmanned aircraft 100 instructed by the terminal 80 through the communication unit 150, that is, the state of the unmanned aircraft 100 at the start of shooting. The UAV control unit 110 may determine the state of the unmanned aircraft 100 at the start of shooting, according to a desired object. The imaging range imaged by the imaging unit 220 is determined according to the state of the unmanned aerial vehicle 100.
The UAV control unit 110 captures a moving image along the flight path RT (S13). The UAV control unit 110 controls the flight of the unmanned aircraft 100 so as to fly along the flight path RT in each turn, and acquires each image frame of the moving image of each turn. The UAV control unit 110 synthesizes the moving images captured in the respective turns to generate a composite moving image (S14). The composition of the moving image will be described in detail later. The UAV control unit 110 outputs a moving image such as the composite moving image (S15). The output of the moving image will be described in detail later. The UAV control unit 110 may store, in the storage unit 160, information indicating which turn the current turn is, during the flight and imaging of each turn (for example, at the start of imaging of each turn). Further, similarly to S12, in S13 the UAV control unit 110 may also save the state of the unmanned aircraft 100 at least when each image frame of the first turn is acquired. Thus, the unmanned aircraft 100 may perform flight and imaging in the second and subsequent turns of the flight path RT in the same state as the state of the unmanned aircraft 100 in the first turn.
The UAV control unit 110 evaluates the output moving image (S16). The UAV control unit 110 may evaluate the output moving image when the capturing of the moving image in each turn is completed. For example, the UAV control unit 110 may determine that the capturing of the moving image is completed when the imaging and flight along the predetermined flight path RT are completed. For example, when the unmanned aircraft 100 is maneuvered by the terminal 80 in the first turn, the UAV control unit 110 may determine that the capturing of the moving image has ended when the maneuvering of the unmanned aircraft 100 by the terminal 80 ends, or when an operation instructing the end of imaging is performed through the operation unit 83 and the unmanned aircraft 100 is notified through the communication unit 85.
The UAV control unit 110 determines whether or not the evaluation result of the output moving image satisfies a preset reference (S17). The preset reference may be a subjective reference of the user or an objective reference.
When the preset reference is a subjective reference of the user, the UAV control unit 110 may transmit the output moving image to the terminal 80 through the communication unit 150, and the terminal control unit 81 of the terminal 80 may receive the output moving image through the communication unit 85 and display it on the display unit 88. The user can then check the displayed output moving image and subjectively judge whether the output moving image satisfies the preset reference. In this case, when the preset reference is satisfied, the terminal control unit 81 may acquire operation information indicating that the preset reference is satisfied through the operation unit 83 and transmit it to the unmanned aircraft 100 through the communication unit 85. On the other hand, when the preset reference is not satisfied, the terminal control unit 81 may acquire operation information indicating that the preset reference is not satisfied through the operation unit 83 and transmit it to the unmanned aircraft 100 through the communication unit 85. That is, the user may manually input the evaluation result.
When the preset reference is an objective reference, the UAV control unit 110 may perform image recognition (e.g., pattern recognition) on the output dynamic image, and evaluate the output dynamic image according to a result of the image recognition. For example, in this case, the preset reference may be a reference based on pixel values of respective pixels of respective image frames of the output moving image.
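The disclosure only states that the objective reference may be based on pixel values; one plausible instantiation, given purely as an example, estimates residual noise as the deviation of each frame from a blurred copy of itself and accepts the output once the average falls below a chosen threshold:

```python
# Hedged sketch: a pixel-value-based acceptance test; the noise measure and the
# threshold are illustrative assumptions, not the disclosed criterion.
import cv2
import numpy as np

def meets_noise_reference(frames: list[np.ndarray], max_noise: float = 2.0) -> bool:
    residuals = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        smoothed = cv2.GaussianBlur(gray, (5, 5), 0)
        residuals.append(float(np.abs(gray - smoothed).mean()))
    return float(np.mean(residuals)) <= max_noise  # True: preset reference satisfied
```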
When the output moving image satisfies the preset reference (YES in S17), the UAV control unit 110 ends the processing of Fig. 6 and ends the flight and imaging along the flight path RT.
On the other hand, when the output moving image does not satisfy the preset reference (NO in S17), the UAV control unit 110 proceeds to the next turn of flight and imaging (S18). In this case, the UAV control unit 110 acquires the state information of the unmanned aircraft 100 at the start of imaging from the storage unit 160, and sets it as the state of the unmanned aircraft 100 at the starting point of the flight path RT (S18). Thus, the unmanned aircraft 100 moves to the position where imaging of the next turn of the flight path RT starts, and the imaging unit 220 is set, at the imaging start time, in a state in which the desired shooting range can be captured.
Further, the moving image to be evaluated may be limited to only the synthesized moving image among the output moving images. For example, even if the reference moving image of the first turn is not evaluated, the quality of the synthesized moving image is not affected, and the processing time of fig. 6 can be shortened.
The unmanned aircraft 100 repeats the flight and imaging along the flight path RT at least N times. "N" is an arbitrary number of 2 or more, for example, the number of turns with which the quality of the generated composite moving image becomes higher than a predetermined quality. Even after N turns, when the evaluation result of the output moving image does not satisfy the preset reference, the flight and imaging along the flight path RT may be continued. The value of N may be specified by the user, for example, through the operation unit 83 of the terminal 80, or may be determined as an appropriate arbitrary value. The UAV control unit 110 may determine the value of N according to the scene or shooting range to be imaged.
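Pulling S11 through S18 together, the control flow can be read as a loop over turns; `fly_and_capture`, `synthesize`, `evaluate`, and `output_video` are placeholders for the operations described above, and the loop itself is an illustrative reading of Fig. 6 rather than the disclosed implementation:

```python
# Hedged sketch of the overall loop: fly the same path repeatedly, output the
# latest turn's video until N turns exist, then output and evaluate the composite.
def capture_until_good(flight_path, n_min_turns, fly_and_capture, synthesize,
                       evaluate, output_video, max_turns=10):
    turns = []
    for _ in range(max_turns):
        turns.append(fly_and_capture(flight_path))      # S13: one turn of frames
        if len(turns) < n_min_turns:
            output_video(turns[-1])                      # S15: show the latest turn's video
            continue
        composite = synthesize(turns)                    # S14: composite moving image
        output_video(composite)                          # S15
        if evaluate(composite):                          # S16/S17: preset reference met
            return composite
    return synthesize(turns)                             # stop after max_turns regardless
```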
In this way, the unmanned aircraft 100 (an example of an image processing device) processes a moving image captured by the imaging unit 220 included in the unmanned aircraft 100 (an example of a flying object). The UAV control unit 110 (an example of a processing unit) may specify the flight path RT along which the unmanned aircraft 100 flies. The UAV control unit 110 may cause the unmanned aircraft 100 to fly around the flight path RT a plurality of times. The UAV control unit 110 can cause the imaging unit 220 to capture a plurality of moving images having the same shooting range through the plurality of circling flights. The UAV control unit 110 may synthesize the plurality of moving images captured through the plurality of circling flights to generate a composite moving image.
It is difficult for the unmanned aircraft 100 to capture a moving image while staying at one location during flight. Therefore, when shooting a moving image, it is difficult to continuously capture the same shooting range, and thus difficult to synthesize images of the same shooting range. In contrast, when the unmanned aircraft 100 captures a moving image, the same shooting range can be captured repeatedly over time by performing a plurality of circling flights along the predetermined flight path RT without staying at one place. Therefore, the unmanned aircraft 100 can keep a wide variety of shooting ranges fixed across turns, and obtain a plurality of moving images whose image frames correspond to those shooting ranges. Accordingly, the unmanned aircraft 100 can obtain various advantageous imaging effects (for example, Temporal contrast, HDR (High Dynamic Range)) by synthesizing the plurality of moving images to generate a composite moving image. That is, the unmanned aircraft 100 can obtain the imaging effect of a long exposure, improve the SNR (Signal to Noise Ratio), reduce noise, and expand the dynamic range.
In addition, the moving image may have a plurality of image frames. The UAV control section 110 may control the unmanned aerial vehicle 100 so that image frames of the plurality of dynamic images at the same relative time each have the same shooting range.
In this way, the unmanned aerial vehicle 100 obtains images having the same imaging range from each of the frames of the image frames at the same relative time in each of the moving images, and thus can obtain a plurality of image frames having the same imaging range over a wide range as the entire moving image.
Further, the UAV control unit 110 may acquire the state of the unmanned aircraft 100 in synchronization with the vertical synchronization signal (VSYNC signal) of the imaging unit 220 during the flight of the first turn flight path RT. The UAV controller 110 may control the flight of the unmanned aircraft 100 and the imaging unit 220 in synchronization with the vertical synchronization signal of the imaging unit 220 during the flight of the flight path RT in the second round and thereafter so as to capture an image in the same state as the state of the unmanned aircraft 100 in the first round.
Thus, the unmanned aerial vehicle 100 can acquire the state of the unmanned aerial vehicle 100 every time one image frame is acquired by synchronizing with the vertical synchronization signal of the imaging unit 220. By storing the flight pattern and the imaging pattern of the first turn of the unmanned aircraft 100 and setting the flight pattern and the imaging pattern in the subsequent rounds to be the same as those of the first turn, the unmanned aircraft 100 can relatively easily fix the imaging range corresponding to the state of the unmanned aircraft 100 in a wide range and obtain a plurality of moving images.
In addition, the state of the unmanned aerial vehicle 100 may include at least one of information such as the position of the unmanned aerial vehicle 100, the orientation of the unmanned aerial vehicle 100, the angle of the gimbal 200 supporting the camera 220, and the like.
Thus, the unmanned aircraft 100 can acquire the image frame of the imaging range captured by the imaging unit 220 in the past by, for example, storing the state of the unmanned aircraft 100 in the storage unit 160, and acquiring and setting the state of the unmanned aircraft 100 from the storage unit 160 at a later time point.
In addition, when the evaluation of the composite dynamic image satisfies the preset reference, the UAV control section 110 may end the control of the flight and shooting of the unmanned aircraft 100. When the evaluation of the synthesized dynamic image does not satisfy the preset reference, the UAV control unit 110 may control the flying and shooting along the next round flight path RT.
Thus, the unmanned aerial vehicle 100 can continue shooting on the flight path RT until the evaluation of the composite moving image reaches the preset reference, which is expected to improve the quality of the composite moving image.
In addition, the UAV control unit 110 may acquire operation information indicating an evaluation result of the composite moving image. The operation information may be obtained from the terminal 80. Thus, the user can subjectively evaluate the composite moving image and determine whether to acquire further images on which the composite moving image is to be based.
In addition, the UAV control unit 110 may perform image recognition on the composite moving image and evaluate the composite moving image based on the result of the image recognition. Thus, the unmanned aerial vehicle 100 can objectively evaluate the composite moving image through image recognition and determine whether to fly the flight path RT again and continue acquiring the image frames on which the composite moving image is based.
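As a rough sketch of this evaluate-and-repeat loop, the following assumes a hypothetical `controller` object and an `evaluate` callback (user operation information from the terminal 80 or an image-recognition score); neither name comes from this disclosure, and the circuit limit is an assumption for illustration.

```python
def circle_until_satisfied(controller, evaluate, max_circuits: int = 10):
    """Fly circuits of the flight path RT, synthesize after each one, and stop once the
    composite moving image satisfies the preset reference (or a circuit limit is hit)."""
    composite = None
    for _ in range(max_circuits):
        moving_image = controller.fly_one_circuit_and_record()      # hypothetical API
        composite = controller.synthesize(moving_image, composite)  # corresponds to S14
        if evaluate(composite):   # user feedback or image-recognition score
            break                 # preset reference satisfied: end flight and shooting
    return composite
```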
In addition, the flight control and shooting control described above and the synthesis of the moving image may be performed mainly by the unmanned aerial vehicle 100. In this case, all of the control and processing can be performed by one apparatus, so that the processing is efficient and the processing time is shortened, and no separate device needs to be prepared for these processes. Alternatively, the processing related to shooting control and the synthesis of the moving image may be performed mainly by another device (for example, the terminal 80 or the transmitter).
Fig. 7 is a flowchart showing a first example of synthesis of a moving image. The moving image synthesis process corresponds to S14 in fig. 6. In fig. 7, it is assumed that a moving image captured during one circuit has been acquired at S13 of fig. 6.
The UAV control unit 110 determines whether or not the obtained moving image is the moving image obtained in the first circuit of the flight path RT (S21). For example, the UAV control unit 110 can determine which circuit of the flight path RT is currently being flown by referring to the storage unit 160, and may acquire information indicating the current circuit number from the storage unit 160.
When the obtained moving image is the moving image of the first circuit of the flight path RT, the UAV control unit 110 stores each image frame of the obtained moving image in the storage unit 160 as an image frame of the reference moving image (S22). In the first circuit, the state information of the flying object is also stored in the storage unit 160 each time an image frame is acquired, that is, in synchronization with the vertical synchronization signal of the imaging unit 220. This makes it possible to grasp the state of the unmanned aerial vehicle 100 at the moment each image is captured. The UAV control unit 110 also stores each image frame of the obtained moving image as an image frame of the moving image for calculation (S23).
On the other hand, if the obtained moving image is a moving image of the second or a subsequent circuit of the flight path RT, the UAV control unit 110 compares each image frame of the obtained moving image with the corresponding image frame of the reference moving image and calculates a global motion vector (S24). Corresponding image frames are image frames of the same relative time. The global motion is motion information representing the change in flight movement of the unmanned aerial vehicle 100 and in its state (attitude) over a plurality of points in time, and is represented by a motion vector (global motion vector).
The UAV control unit 110 corrects the global motion, i.e., performs global motion compensation, based on the calculated global motion vector (S25). In global motion compensation, the motion of an entire image frame can be expressed by an affine transformation and motion compensation is performed on a per-frame basis, so the encoding efficiency and the compensation efficiency are high. The UAV control unit 110 may also perform inter-frame prediction and motion compensation other than global motion compensation between image frames at the same relative time in the respective circuits. Note that the motion-compensation-related processing in S24 and S25 may be omitted.
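The disclosure does not specify how the global motion vector is computed. One possible sketch of S24 and S25, assuming OpenCV is available, estimates a single frame-wide partial affine transform by tracking sparse features and then warps the frame of the later circuit onto the reference frame; the function name and parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

def global_motion_compensate(reference_frame: np.ndarray, current_frame: np.ndarray) -> np.ndarray:
    """Align a frame from a later circuit with the reference frame of the same relative time
    by estimating one frame-wide (partial affine) transform and warping (S24/S25 sketch)."""
    ref_gray = cv2.cvtColor(reference_frame, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(current_frame, cv2.COLOR_BGR2GRAY)

    # Track sparse corner features from the reference frame into the current frame.
    ref_pts = cv2.goodFeaturesToTrack(ref_gray, maxCorners=500, qualityLevel=0.01, minDistance=8)
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(ref_gray, cur_gray, ref_pts, None)
    good = status.flatten() == 1

    # One robust affine transform represents the global motion of the whole frame.
    matrix, _ = cv2.estimateAffinePartial2D(cur_pts[good], ref_pts[good], method=cv2.RANSAC)

    h, w = reference_frame.shape[:2]
    return cv2.warpAffine(current_frame, matrix, (w, h))
```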
The UAV control unit 110 adds each image frame of the obtained moving image to the corresponding image frame of the moving image for calculation (S26). In this case, the value of each pixel in each frame of the moving image subjected to global motion compensation may be added to the value of the corresponding pixel in the corresponding image frame of the moving image for calculation.
For example, when the moving image of the second circuit is obtained in S21, the UAV control unit 110 adds the pixel values of the pixels in the first image frame gf11 of the first-circuit moving image, stored as the moving image for calculation, to the pixel values of the pixels in the first image frame gf21 of the second-circuit moving image, thereby calculating the first image frame of the new moving image for calculation. When the moving image of the third circuit is obtained in S21, the UAV control unit 110 adds the pixel values of the pixels in the first image frame of the moving image for calculation, obtained by adding the first-circuit and second-circuit moving images, to the pixel values of the pixels in the first image frame gf31 of the third-circuit moving image, and generates the first image frame of the new moving image for calculation. The same addition is performed for the subsequent circuits, and the same applies to the second and subsequent image frames.
The UAV control unit 110 calculates an average value for each image frame of the accumulated moving image for calculation (S27). In this case, the UAV control unit 110 may calculate the average of the pixel values of each pixel in each image frame of the moving image for calculation. The UAV control unit 110 generates a composite moving image composed of the image frames whose average values have been calculated (S27). Thus, from the second circuit of the flight path RT onward, the unmanned aerial vehicle 100 can capture the moving image and output (for example, transmit or display) the composite moving image.
In this way, the UAV control unit 110 can generate a composite moving image from the first moving image (for example, the reference moving image) obtained in the first circuit and the second moving image obtained in the second and subsequent circuits. As a result, the unmanned aerial vehicle 100 can generate a composite moving image in which the moving images of a plurality of circuits are combined, with the first-circuit moving image as a reference.
The UAV control unit 110 may compare the first moving image with the second moving image for each image frame of the same relative time, and perform motion compensation of the second moving image on the first moving image based on the comparison result.
As a result, the unmanned aerial vehicle 100 can perform motion compensation between the image frames at the same relative time in the first and later circuits. Therefore, the consistency of the image range of the image frames at each relative time across the plurality of moving images can be improved. The image range corresponds to the shooting range. Consequently, even if the flight environment of the unmanned aerial vehicle 100 is poor, the positional deviation between corresponding image frames of the respective moving images can be reduced and the image quality of the composite moving image improved.
Additionally, the motion compensation may include global motion compensation. Thus, the unmanned aerial vehicle 100 can improve the encoding efficiency of the compression encoding of the moving image and the efficiency of the motion compensation.
The UAV control unit 110 may generate the composite moving image from statistical values of the same pixel in the image frames of the first moving image and the second moving image at the same relative time. When the unmanned aerial vehicle 100 captures a moving image while flying, it is difficult to acquire image frames whose shooting ranges are identical. In this regard, by circling the same flight path RT, the unmanned aerial vehicle 100 can obtain multiple image frames at the same relative time. Furthermore, by taking a statistical value (for example, an average value) over the plurality of image frames, the unmanned aerial vehicle 100 can obtain a moving image with improved image quality even if a few of the image frames have low image quality.
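A minimal sketch of the addition (S26) and averaging (S27) steps is shown below, assuming the frames are NumPy arrays; the helper names are illustrative only.

```python
import numpy as np

def add_to_calculation_frame(calc_frame: np.ndarray, new_frame: np.ndarray) -> np.ndarray:
    """S26: add the pixel values of the (motion-compensated) frame of the latest circuit
    to the corresponding frame of the moving image for calculation."""
    return calc_frame.astype(np.float64) + new_frame.astype(np.float64)

def average_calculation_frame(calc_frame: np.ndarray, num_circuits: int) -> np.ndarray:
    """S27: divide the accumulated pixel values by the number of circuits to obtain one
    frame of the composite moving image."""
    return np.clip(calc_frame / num_circuits, 0, 255).astype(np.uint8)
```

Averaging over several circuits suppresses temporally uncorrelated noise, which is consistent with the long-exposure-like effects described above.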
Fig. 8 is a flowchart showing a second example of synthesis of a moving image. In fig. 8, the same step numbers are assigned to the same processing as in fig. 7, and its description is omitted or simplified.
First, the unmanned aerial vehicle 100 performs the same processing as S21, S22, S24, and S25 in fig. 7.
Then, the UAV control unit 110 extracts a feature region in each image frame of the obtained moving image (S26A). The feature region is extracted on an objective basis or according to the user's subjective judgment, and may be, for example, a region containing a feature of interest in that circuit. For example, the UAV control unit 110 may extract, as the feature region, the difference region between the image frames at the same relative time in the obtained moving image and the reference moving image. The UAV control unit 110 may also extract, as the feature region, a region in which a predetermined object exists in an image frame of the obtained moving image, or a region designated by the user from the image frame of the obtained moving image through the operation unit 83 of the terminal 80. The extraction of the feature region is performed for each image frame of the obtained moving image.
The UAV control unit 110 replaces the region (feature corresponding region) of each image frame of the reference moving image that corresponds to the feature region extracted from the corresponding image frame of the obtained moving image with the extracted feature region (S27A). In this case, the UAV control unit 110 may replace the pixel values of the respective pixels in the feature corresponding region with the pixel values of the respective pixels in the extracted feature region. The UAV control unit 110 generates a composite moving image composed of image frames in which the feature corresponding region in the reference moving image has been replaced with the feature region in the obtained moving image (S27A).
In this way, the UAV control unit 110 may compare the first moving image with the second moving image for each image frame of the same relative time, extract a feature region for the second moving image, and replace a region (feature corresponding region) corresponding to the feature region in the first moving image with the feature region in the second moving image.
In this way, the unmanned aerial vehicle 100 replaces a portion of the first moving image that has low image quality, or that is not in the state desired by the user, with the corresponding portion of an image frame of another moving image at the same relative time, thereby improving the image quality of the first moving image and obtaining a composite moving image. For example, when a tower or building is photographed as the subject, many visitors may appear around the tower or building in an image frame of the first moving image. Even in this case, if there are no visitors in the image frame at the same relative time in the second moving image, the unmanned aerial vehicle 100 extracts that portion as the feature region and replaces the feature corresponding region of the image frame in the first moving image with it. As a result, the unmanned aerial vehicle 100 can obtain a composite moving image showing the tower or building with the visitors removed.
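As one simplified way to realize S26A and S27A when the feature region is defined by the difference between frames of the same relative time, the following sketch assumes OpenCV and NumPy; the function name and the threshold value are assumptions for illustration, not part of this disclosure.

```python
import cv2
import numpy as np

def replace_feature_region(reference_frame: np.ndarray,
                           obtained_frame: np.ndarray,
                           diff_threshold: int = 30) -> np.ndarray:
    """S26A/S27A sketch: treat the difference region between frames of the same relative
    time as the feature region and copy it from the obtained frame into the reference frame."""
    diff = cv2.absdiff(reference_frame, obtained_frame)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, diff_threshold, 255, cv2.THRESH_BINARY)

    result = reference_frame.copy()
    result[mask == 255] = obtained_frame[mask == 255]  # replace the feature-corresponding region
    return result
```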
Fig. 9 is a flowchart showing an example of outputting a moving image. The moving image output process corresponds to S15 in fig. 6. In fig. 9, it is assumed that a moving image captured during one circuit has been acquired at S13 of fig. 6.
The UAV control unit 110 determines whether or not the obtained moving image is a moving image of the N-th or a subsequent circuit (S31). When the obtained moving image is from a circuit before the N-th circuit, the UAV control unit 110 outputs the moving image of the latest circuit (S32). In this case, the UAV control unit 110 may output the moving image captured in real time by the imaging unit 220, without synthesis. When the obtained moving image is from the N-th or a subsequent circuit, the UAV control unit 110 outputs the generated composite moving image (S33).
As the output of the moving image, the UAV control unit 110 may transmit the moving image to another device (for example, the terminal 80) through the communication unit 150, or may output the moving image by causing another device (for example, the terminal 80) to display it. In the latter case, the terminal control unit 81 of the terminal 80 can receive the moving image through the communication unit 85 and display it on the display unit 88. The UAV control unit 110 may also output the moving image by storing it in the storage unit 160 or in another recording medium (for example, an external recording medium).
In this way, the UAV control unit 110 may acquire the number of circuits the unmanned aerial vehicle 100 has flown on the flight path RT. When the acquired number of circuits is less than a threshold value (for example, N), the UAV control unit 110 may output the moving image captured in the latest circuit. When the acquired number of circuits is equal to or greater than the threshold value, the UAV control unit 110 may output the composite moving image.
While the number of circuits is small enough that the image quality of the composite moving image is assumed to be insufficient, unnecessary artifacts may appear in the composite moving image. In this case, by providing the unsynthesized moving image of the latest circuit, the unmanned aerial vehicle 100 can withhold the composite moving image while still providing the most recent moving image. Moreover, flying and capturing the number of circuits required to reach the threshold value may take a long time; even so, some moving image can be output and confirmed by the user in the meantime. On the other hand, once the number of circuits is large enough that the image quality of the composite moving image is assumed to be sufficient, the image quality of the composite moving image stabilizes. In this case, by providing the composite moving image, the unmanned aerial vehicle 100 is expected to provide a moving image whose image quality is improved compared with the moving image of any single circuit.
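The output decision of S31 to S33 reduces to a comparison against the threshold N; a minimal sketch follows, with the function and parameter names being illustrative assumptions.

```python
def select_output(num_circuits: int, latest_circuit_video, composite_video, n_threshold: int):
    """Output the unsynthesized moving image of the latest circuit while fewer than N
    circuits have been flown, and the composite moving image otherwise."""
    if num_circuits < n_threshold:
        return latest_circuit_video   # S32
    return composite_video            # S33
```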
The output method shown in fig. 9 is merely an example, and other output methods are also possible. For example, the UAV control unit 110 may output the composite moving image regardless of the number of circuits, even for the moving image obtained in the second circuit.
The present disclosure has been explained above using the embodiments, but the technical scope of the present disclosure is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made in the above embodiments. As is apparent from the description of the claims, the embodiments to which such changes or improvements are made are included in the technical scope of the present disclosure.
The operations, procedures, steps, and stages of each process performed by the apparatus, system, program, and method shown in the claims, specification, and drawings may be executed in any order, unless the order is explicitly indicated by "before", "prior to", or the like, or unless the output of a preceding process is used in a subsequent process. Even if the operational flow in the claims, specification, and drawings is described using "first", "next", and the like for convenience, it is not necessarily meant that the flow must be performed in this order.
In the above-described embodiment, the shooting and synthesis of a plurality of moving images while the flying object is flying have been described, but the embodiment is not limited to a flying object and may also be applied to other moving objects (for example, a vehicle or a ship). In that case, by replacing "flight" with "movement", the above-described embodiment can be applied to capturing and synthesizing a plurality of moving images while the moving object moves.

Claims (30)

  1. An image processing apparatus for processing a moving image captured by an imaging unit included in a flying object,
    comprising a processing unit, wherein
    the processing unit specifies a flight path along which the flying object flies;
    causes the flying object to fly around the flight path a plurality of times;
    causes the imaging unit included in the flying object to capture a plurality of moving images having the same shooting range through the plurality of circling flights; and
    synthesizes the plurality of moving images captured through the plurality of circling flights to generate a synthesized moving image.
  2. The image processing apparatus according to claim 1, wherein the dynamic image has a plurality of image frames in time-series order,
    the processing unit controls the flying object so that each of image frames of a plurality of moving images having the same relative time has the same shooting range.
  3. The image processing apparatus according to claim 2,
    the processing unit acquires the state of the flying object in synchronization with a vertical synchronization signal of the imaging unit during the flight of the first flight path;
    during the flight of the flight path in the second round and subsequent rounds, the flight of the flying object and the imaging unit are controlled in synchronization with the vertical synchronization signal of the imaging unit so that the image is captured in the same state as the state of the flying object in the first round.
  4. The image processing apparatus according to claim 3, wherein the state of the flying object includes at least one of information such as a position of the flying object, an orientation of the flying object, and an angle of a gimbal that supports the imaging unit.
  5. The image processing apparatus according to any one of claims 2 to 4, wherein the processing unit generates the composite moving image from a first moving image obtained in a first pass and a second moving image obtained in a second or subsequent pass.
  6. The image processing apparatus according to claim 5,
    the processing section, for each of the image frames of the same relative time,
    comparing the first dynamic image with the second dynamic image;
    and performing motion compensation on the second dynamic image on the first dynamic image according to the comparison result.
  7. The image processing apparatus according to claim 6, wherein the motion compensation comprises global motion compensation.
  8. The image processing apparatus according to any one of claims 5 to 7, wherein the processing unit generates the composite moving image based on statistical values of the same pixel in image frames of the same relative time in the first moving image and the second moving image.
  9. The image processing apparatus according to any one of claims 5 to 7,
    the processing section, for each of the image frames of the same relative time,
    comparing the first dynamic image with the second dynamic image,
    extracting a feature region for the second dynamic image,
    replacing a region in the first dynamic image corresponding to the feature region with the feature region in the second dynamic image.
  10. The image processing apparatus according to any one of claims 5 to 9,
    the processing unit acquires the number of flight rounds of the flying object on the flight path;
    outputs a dynamic image photographed in the last surround when the acquired number of surrounds is less than a threshold;
    and outputs the synthesized dynamic image when the acquired number of surrounds is greater than or equal to the threshold value.
  11. The image processing apparatus according to any one of claims 1 to 10,
    the processing unit evaluates the output composite moving image;
    ends the flight and shooting of the flying object when the evaluation result of the synthesized dynamic image meets a preset reference;
    and controls flight and shooting along the flight path of the next surround when the evaluation result of the synthesized dynamic image does not meet the preset reference.
  12. The image processing apparatus according to claim 11, wherein the processing unit acquires operation information indicating an evaluation result of the composite moving image.
  13. The image processing apparatus according to claim 11,
    the processing unit performs image recognition on the synthesized moving image;
    and evaluating the synthesized dynamic image according to the image recognition result.
  14. The image processing apparatus according to any one of claims 1 to 13, characterized in that the image processing apparatus is the flying object.
  15. An image processing method for processing a moving image captured by an imaging unit included in a flying object, comprising:
    specifying a flight path for the flight object to fly;
    flying the flying object around the flight path a plurality of times;
    capturing a plurality of moving images having the same capturing range by an imaging unit included in the flying object by a plurality of circling flights; and
    synthesizing the plurality of moving images captured through the plurality of circling flights to generate a synthesized moving image.
  16. The image processing method according to claim 15, wherein the dynamic image has a plurality of image frames in time-series order,
    the step of capturing the plurality of moving images includes the steps of: the flying object is controlled so that each of image frames of the plurality of dynamic images at the same relative time has the same shooting range.
  17. The image processing method according to claim 16, wherein the step of capturing the plurality of moving images comprises the steps of:
    acquiring a state of the flying object in synchronization with a vertical synchronization signal of the imaging unit during the flight on the first flight path; and
    during the flight of the flight path in the second round and subsequent rounds, the flight of the flying object and the imaging unit are controlled in synchronization with the vertical synchronization signal of the imaging unit so that the image is captured in the same state as the state of the flying object in the first round.
  18. The image processing method according to claim 17, wherein the state of the flying object includes at least one of information such as a position of the flying object, an orientation of the flying object, and an angle of a gimbal that supports the imaging unit.
  19. The image processing method according to any one of claims 16 to 18, wherein the step of generating the composite dynamic image includes the steps of: and generating the synthesized dynamic image according to the first dynamic image obtained in the first circle and the second dynamic image obtained in the second circle.
  20. The image processing method according to claim 19, wherein the step of generating the composite moving image includes the steps of:
    for each of the image frames at the same relative time,
    comparing the first dynamic image with the second dynamic image; and
    and performing motion compensation on the second dynamic image on the first dynamic image according to the comparison result.
  21. The image processing method of claim 20, wherein the motion compensation comprises global motion compensation.
  22. The image processing method according to any one of claims 19 to 21, wherein the step of generating the composite moving image includes the steps of: generating the composite moving image from statistics of the same pixel of the image frames of the same relative time in the first moving image and the second moving image.
  23. The image processing method according to any one of claims 19 to 21, wherein the step of generating the composite dynamic image includes the steps of:
    for each of the image frames at the same relative time,
    comparing the first dynamic image with the second dynamic image;
    extracting a feature region for the second dynamic image; and
    replacing a region in the first dynamic image corresponding to the feature region with the feature region in the second dynamic image.
  24. The image processing method according to any one of claims 19 to 23, further comprising the steps of: acquiring the number of times the flying object has flown around the flight path;
    outputting a dynamic image photographed in the last surround when the acquired number of surrounds is less than a threshold; and
    outputting the synthesized dynamic image when the acquired number of surrounds is greater than or equal to the threshold value.
  25. The image processing method according to any one of claims 15 to 24, wherein the step of capturing the plurality of moving images includes the steps of:
    evaluating the outputted composite dynamic image;
    when the evaluation result of the synthesized dynamic image meets a preset reference, ending the flying and shooting of the flying object; and
    flying and shooting along the flight path of the next surround when the evaluation result of the composite dynamic image does not meet the preset reference.
  26. The image processing method according to claim 25, wherein the step of evaluating the composite moving image comprises the steps of: operation information representing an evaluation result of the composite moving image is acquired.
  27. The image processing method according to claim 25, wherein the step of evaluating the composite moving image comprises the steps of:
    performing image recognition on the synthesized dynamic image; and
    and evaluating the synthesized dynamic image according to the image recognition result.
  28. The image processing method according to any one of claims 15 to 27, wherein the image processing method is executed by an image processing apparatus,
    the image processing device is the flying object.
  29. A program for causing an image processing apparatus for processing a moving image captured by an imaging unit included in a flying object to execute:
    specifying a flight path for the flight object to fly;
    flying the flying object around the flight path a plurality of times;
    causing an imaging unit included in the flying object to capture a plurality of moving images having the same capture range by a plurality of times of circling flight;
    synthesizing the plurality of moving images captured through the plurality of circling flights to generate a synthesized moving image.
  30. A recording medium that is a computer-readable recording medium having a program recorded thereon, the program causing an image processing apparatus that processes a moving image captured by an imaging unit included in a flying object to execute:
    specifying a flight path for the flight object to fly;
    flying the flying object around the flight path a plurality of times;
    causing an imaging unit included in the flying object to capture a plurality of moving images having the same capture range by a plurality of times of circling flight;
    synthesizing the plurality of moving images captured through the plurality of circling flights to generate a synthesized moving image.
CN202080074343.6A 2019-12-09 2020-12-03 Image processing apparatus, image processing method, program, and recording medium Pending CN114586335A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-222092 2019-12-09
JP2019222092A JP6997164B2 (en) 2019-12-09 2019-12-09 Image processing equipment, image processing methods, programs, and recording media
PCT/CN2020/133589 WO2021115192A1 (en) 2019-12-09 2020-12-03 Image processing device, image processing method, program and recording medium

Publications (1)

Publication Number Publication Date
CN114586335A true CN114586335A (en) 2022-06-03

Family ID=76311106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080074343.6A Pending CN114586335A (en) 2019-12-09 2020-12-03 Image processing apparatus, image processing method, program, and recording medium

Country Status (3)

Country Link
JP (1) JP6997164B2 (en)
CN (1) CN114586335A (en)
WO (1) WO2021115192A1 (en)

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
JP2024006072A (en) * 2022-06-30 2024-01-17 本田技研工業株式会社 Image processing device, image processing method, image processing system, and program

Citations (4)

Publication number Priority date Publication date Assignee Title
CN108419023A (en) * 2018-03-26 2018-08-17 华为技术有限公司 A kind of method and relevant device generating high dynamic range images
CN108471500A (en) * 2009-10-16 2018-08-31 奥林巴斯株式会社 Camera, method for imaging and storage medium
CN109246355A (en) * 2018-09-19 2019-01-18 北京云迹科技有限公司 The method, apparatus and robot of panoramic picture are generated using robot
JP2019080165A (en) * 2017-10-24 2019-05-23 M−Solutions株式会社 Image provision method

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP4341613B2 (en) * 2005-11-29 2009-10-07 セイコーエプソン株式会社 Control apparatus, photographing apparatus, photographing apparatus control method, and control program
JP2008186145A (en) * 2007-01-29 2008-08-14 Mitsubishi Electric Corp Aerial image processing apparatus and aerial image processing method
JP5267396B2 (en) * 2009-09-16 2013-08-21 ソニー株式会社 Image processing apparatus and method, and program
JP2014185947A (en) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd Image photographing method for three-dimensional restoration
JP6803919B2 (en) * 2016-10-17 2020-12-23 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Flight path generation methods, flight path generation systems, flying objects, programs, and recording media

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN108471500A (en) * 2009-10-16 2018-08-31 奥林巴斯株式会社 Camera, method for imaging and storage medium
JP2019080165A (en) * 2017-10-24 2019-05-23 M−Solutions株式会社 Image provision method
CN108419023A (en) * 2018-03-26 2018-08-17 华为技术有限公司 A kind of method and relevant device generating high dynamic range images
CN109246355A (en) * 2018-09-19 2019-01-18 北京云迹科技有限公司 The method, apparatus and robot of panoramic picture are generated using robot

Also Published As

Publication number Publication date
JP6997164B2 (en) 2022-01-17
JP2021093592A (en) 2021-06-17
WO2021115192A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
CN109952755B (en) Flight path generation method, flight path generation system, flight object, and recording medium
US10021339B2 (en) Electronic device for generating video data
US11722647B2 (en) Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
JP6878567B2 (en) 3D shape estimation methods, flying objects, mobile platforms, programs and recording media
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
JP6962775B2 (en) Information processing equipment, aerial photography route generation method, program, and recording medium
WO2019230604A1 (en) Inspection system
JP2017201261A (en) Shape information generating system
CN110291777B (en) Image acquisition method, device and machine-readable storage medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
WO2018214401A1 (en) Mobile platform, flying object, support apparatus, portable terminal, method for assisting in photography, program and recording medium
CN109891188B (en) Mobile platform, imaging path generation method, program, and recording medium
JP2019028560A (en) Mobile platform, image composition method, program and recording medium
CN114586335A (en) Image processing apparatus, image processing method, program, and recording medium
CN111213107B (en) Information processing device, imaging control method, program, and recording medium
JP2021096865A (en) Information processing device, flight control instruction method, program, and recording medium
JP6329219B2 (en) Operation terminal and moving body
JP7206530B2 (en) IMAGE PROCESSING SYSTEM, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
CN112313942A (en) Control device for image processing and frame body control
WO2020119572A1 (en) Shape inferring device, shape inferring method, program, and recording medium
JP6803960B1 (en) Image processing equipment, image processing methods, programs, and recording media
WO2023135910A1 (en) Image-capturing device, image-capturing method, and program
CN111615616A (en) Position estimation device, position estimation method, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination