WO2021115192A1 - Image processing device, image processing method, program, and recording medium - Google Patents

Image processing device, image processing method, program, and recording medium

Info

Publication number
WO2021115192A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
dynamic
dynamic image
flying body
image processing
Prior art date
Application number
PCT/CN2020/133589
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
周杰旻
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080074343.6A (published as CN114586335A)
Publication of WO2021115192A1

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present disclosure relates to an image processing device, an image processing method, a program, and a recording medium.
  • Patent Document 1 discloses an image processing device that performs image synthesis.
  • the image processing device includes a synthesis unit that synthesizes a plurality of images taken at different points in time, and a motion correction unit that corrects for reducing the influence of motion on the image.
  • Patent Document 1 European Patent Application Publication No. 3450310 Specification
  • the image processing device in Patent Document 1 fixes the position of the imaging device, takes a plurality of still images, and then combines them (image stacking). However, it does not consider the case where a plurality of moving images is combined (video stacking) while the imaging device is moving, as when the imaging device is mounted on a flying body. It is expected that synthesizing a plurality of moving images taken by a flying body that can shoot while flying will improve the image quality of the moving images.
  • the dynamic image may have a plurality of image frames in time series order.
  • the processing unit may control the flying body so that each image frame of the same relative time in the plurality of dynamic images has the same shooting range.
  • the processing unit acquires the state of the flying body in synchronization with the vertical synchronization signal of the imaging unit during the flight of the first circle of the flight path; during the flight of the second and subsequent circles, it controls the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit, so that shooting is performed in the same state as the state of the flying body in the first circle.
  • the status of the flying body may include at least one of the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
  • the processing unit may generate a composite dynamic image based on the first dynamic image obtained in the first circle and the second dynamic image obtained after the second circle.
  • the processing unit may compare the first dynamic image with the second dynamic image for each image frame of the same relative time; according to the comparison result, perform the motion compensation of the second dynamic image on the first dynamic image.
  • Motion compensation may include global motion compensation.
  • the processing unit may generate a composite moving image based on the statistical value of the same pixel of the image frame of the same relative time in the first moving image and the second moving image.
  • the processing unit compares the first dynamic image with the second dynamic image for each image frame of the same relative time; extracts the characteristic region from the second dynamic image; and replaces the region corresponding to the characteristic region in the first dynamic image with the characteristic region of the second dynamic image.
  • the processing unit can obtain the number of circles the flying body has flown on the flight path; when the acquired number of circles is less than a threshold, it outputs the dynamic image taken in the last circle; when the acquired number of circles is greater than or equal to the threshold, it outputs the composite dynamic image.
  • the processing unit can evaluate the output synthetic dynamic image; when the evaluation result of the synthetic dynamic image meets a preset criterion, the flight and shooting of the flying body are ended; when the evaluation result does not meet the preset criterion, flight and shooting are performed along the next circling flight path.
  • the processing unit may acquire operation information indicating the evaluation result of the synthesized moving image.
  • the processing unit can perform image recognition for the synthetic dynamic image; evaluate the synthetic dynamic image according to the result of the image recognition.
  • the image processing device may be a flying object.
  • an image processing method that processes a dynamic image taken by an imaging unit included in a flying body includes the following steps: designating a flight path of the flying body; making the flying body circle along the flight path multiple times; making the imaging unit included in the flying body capture multiple dynamic images with the same shooting range through the multiple circling flights; and synthesizing the multiple dynamic images captured through the multiple circling flights to generate a composite dynamic image.
  • the dynamic image may have a plurality of image frames in time series order.
  • the step of shooting a plurality of dynamic images may include the step of controlling the flying body so that each of the plurality of dynamic images has the same shooting range for image frames of the same relative time.
  • the step of capturing multiple dynamic images may include the following steps: during the flight of the first lap of the flight path, acquiring the state of the flying object in synchronization with the vertical synchronization signal of the camera unit; and during the flight of the flight path after the second lap, The flight of the flying object and the imaging unit are controlled in synchronization with the vertical synchronization signal of the imaging unit so that the shooting is performed in the same state as the state of the flying object in the first circle.
  • the status of the flying body may include at least one of the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
  • the step of generating a synthetic dynamic image may include the following steps: generating a synthetic dynamic image based on the first dynamic image obtained in the first circle and the second dynamic image obtained after the second circle.
  • the step of generating a synthetic dynamic image may include the following steps: comparing the first dynamic image with the second dynamic image for each image frame of the same relative time; and, according to the comparison result, performing motion compensation of the second dynamic image with respect to the first dynamic image.
  • Motion compensation may include global motion compensation.
  • the step of generating a composite moving image may include the following steps: generating a composite moving image based on the statistical values of the same pixels of the image frames of the same relative time in the first moving image and the second moving image.
  • the step of generating a synthetic dynamic image may include the following steps: comparing the first dynamic image with the second dynamic image for each image frame of the same relative time; extracting the characteristic region from the second dynamic image; and replacing the region corresponding to the characteristic region in the first dynamic image with the characteristic region of the second dynamic image.
  • the step of shooting multiple dynamic images may include the following steps: evaluating the output synthetic dynamic image; when the evaluation result of the synthetic dynamic image meets a preset criterion, ending the flight and shooting of the flying body; and when the evaluation result of the synthetic dynamic image does not meet the preset criterion, performing flight and shooting along the flight path of the next circle.
  • the step of evaluating the synthetic dynamic image may include the following steps: obtaining operation information representing the evaluation result of the synthetic dynamic image.
  • the step of evaluating the synthetic dynamic image may include the following steps: performing image recognition for the synthetic dynamic image; and evaluating the synthetic dynamic image according to the result of the image recognition.
  • the image processing method can be executed by an image processing device.
  • the image processing device may be a flying object.
  • a recording medium, which is a computer-readable recording medium on which a program is recorded for causing an image processing device that processes a moving image captured by an imaging unit included in a flying body to execute the following steps: designating the flight path of the flying body; making the flying body circle along the flight path multiple times; making the imaging unit included in the flying body capture multiple dynamic images with the same shooting range through the multiple circling flights; and synthesizing the multiple dynamic images captured through the multiple circling flights to generate a composite dynamic image.
  • FIG. 1 is a schematic diagram showing an example of the configuration of the flying body system in the embodiment.
  • Fig. 2 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • Fig. 3 is a block diagram showing an example of the hardware configuration of the unmanned aircraft.
  • Fig. 4 is a block diagram showing an example of the hardware configuration of the terminal.
  • Fig. 5 is a diagram showing an example of the operation outline of the unmanned aircraft.
  • Fig. 6 is a flowchart showing an example of the operation of the unmanned aircraft.
  • Fig. 7 is a flowchart showing a first example of dynamic image synthesis.
  • Fig. 8 is a flowchart showing a second example of dynamic image synthesis.
  • Fig. 9 is a flowchart showing an output example of a moving image.
  • the flying object is an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) as an example.
  • the image processing device is, for example, an unmanned aircraft, but it may also be another device (for example, a terminal, a transmitter, a server, and other image processing devices).
  • the image processing method is used to specify the actions of the image processing device.
  • a program for example, a program that causes the image processing apparatus to execute various processes is recorded in the recording medium.
  • the “section” or “device” described in the following embodiments is not limited to a physical structure realized by hardware, and also includes the case where the function of the structure is realized by software such as a program.
  • the function of one structure may be realized by two or more physical structures, or the function of two or more structures may also be realized by, for example, one physical structure.
  • the “acquisition” described in the embodiments is not limited to the action of directly acquiring information or signals, and also includes, for example, acquisition or reception by the processing unit through the communication unit, and acquisition from a storage unit (such as a memory). The understanding and interpretation of these terms are the same in the description of the claims.
  • FIG. 1 is a schematic diagram showing a configuration example of a flying body system 10 in the embodiment.
  • the flying body system 10 includes an unmanned aircraft 100 and a terminal 80.
  • the unmanned aircraft 100 and the terminal 80 may communicate with each other through wired communication or wireless communication (for example, a wireless LAN (Local Area Network)).
  • the terminal 80 is exemplified as a portable terminal (such as a smartphone or a tablet terminal), but it may also be another terminal, for example a PC (Personal Computer) or a transmitter (proportional controller) that can manipulate the unmanned aircraft 100 with a joystick.
  • FIG. 2 is a diagram showing an example of a specific appearance of the unmanned aircraft 100. FIG. 2 shows a perspective view of the unmanned aircraft 100 flying in the moving direction STV0. The unmanned aircraft 100 is an example of a moving body.
  • the roll axis (refer to the x-axis) is set in a direction parallel to the ground and along the moving direction STV0.
  • the pitch axis (refer to the y-axis) is set in a direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (refer to the z-axis) is set in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the unmanned aircraft 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV main body 102 includes a plurality of rotors (propellers).
  • the UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 102 uses, for example, four rotors to fly the unmanned aircraft 100.
  • the number of rotors is not limited to four.
  • the unmanned aircraft 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 is a photographing camera that photographs a subject included in a desired photographing range (for example, the sky above the subject, the scenery such as mountains and rivers, and the buildings on the ground).
  • the plurality of imaging units 230 are sensor cameras that photograph the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
  • the two camera units 230 may be installed on the nose of the unmanned aircraft 100, that is, on the front side.
  • the other two camera units 230 may be provided on the bottom surface of the unmanned aircraft 100.
  • the two imaging units 230 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging parts 230 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional space data around the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230.
  • the number of imaging units 230 included in unmanned aircraft 100 is not limited to four.
  • the unmanned aircraft 100 only needs to include at least one camera 230.
  • the unmanned aircraft 100 may include at least one camera 230 on the nose, tail, sides, bottom surface, and top surface of the unmanned aircraft 100, respectively.
  • the angle of view that can be set in the imaging unit 230 may be larger than the angle of view that can be set in the imaging unit 220.
  • the imaging part 230 may have a single focus lens or a fisheye lens.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of unmanned aircraft 100.
  • the unmanned aircraft 100 includes a UAV control unit 110, a communication unit 150, a storage unit 160, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring device 290.
  • the UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall control of the operations of each part of the unmanned aircraft 100, data input and output processing with other parts, data arithmetic processing, and data storage processing.
  • the UAV control unit 110 can control the flight of the unmanned aircraft 100 according to a program stored in the storage unit 160.
  • the UAV control unit 110 can control the flight in accordance with the flight control instructions from the terminal 80 or the like.
  • the UAV control unit 110 can control the capturing of images (for example, moving images and still images, such as aerial photographs).
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
  • the UAV control unit 110 can obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240.
  • the UAV control unit 110 can obtain the latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, and obtain the altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information.
  • the UAV control unit 110 may obtain the distance between the ultrasonic radiation point and the ultrasonic reflection point generated by the ultrasonic sensor 280 as height information.
  • the UAV control unit 110 can acquire the orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
  • the orientation information may be represented by, for example, an orientation corresponding to the orientation of the nose of the unmanned aircraft 100.
  • the UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should exist when the imaging unit 220 captures the shooting range to be captured.
  • the UAV control unit 110 may obtain position information indicating the position where the unmanned aircraft 100 should exist from the storage unit 160.
  • the UAV control unit 110 can obtain the position information indicating the position where the unmanned aerial vehicle 100 should exist from other devices through the communication unit 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to determine the possible location of the unmanned aircraft 100, and obtain the location as the location information indicating the location where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 can acquire the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may acquire the angle of view information representing the angle of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as a parameter for determining the imaging range.
  • the UAV control unit 110 may acquire information indicating the shooting direction of the camera unit 220 and the camera unit 230 as a parameter for determining the shooting range.
  • the UAV control unit 110 may obtain posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220, for example.
  • the posture information of the imaging unit 220 may indicate the rotation angles of the gimbal 200 from reference rotation angles of the pitch axis and the yaw axis.
  • the UAV control unit 110 may obtain position information indicating the location of the unmanned aircraft 100 as a parameter for determining the shooting range.
  • the UAV control unit 110 may determine the imaging range representing the geographic range captured by the imaging unit 220 based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and the location of the unmanned aircraft 100.
  • the UAV control unit 110 may acquire the shooting range information from the storage unit 160.
  • the UAV control unit 110 may obtain the shooting range information through the communication unit 150.
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or angle of view of the imaging unit 220.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the photographing range refers to the geographic range photographed by the photographing unit 220 or the photographing unit 230.
  • the shooting range is defined by latitude, longitude and altitude.
  • the shooting range may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the shooting range may be a range of two-dimensional spatial data defined by latitude and longitude.
  • the shooting range may be determined based on the angle of view and shooting direction of the camera 220 or 230 and the location where the unmanned aircraft 100 is located.
  • the shooting directions of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth of the front face of the imaging unit on which the imaging lens is provided, together with its depression angle.
  • the imaging direction of the imaging unit 220 may be a direction determined by the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 with respect to the gimbal 200.
  • the imaging direction of the imaging unit 230 may be a direction determined from the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is installed.
  • the UAV control unit 110 can determine the surrounding environment of the unmanned aircraft 100 by analyzing multiple images captured by the multiple camera units 230.
  • the UAV control unit 110 may control the flight based on the surrounding environment of the unmanned aircraft 100, such as avoiding obstacles.
  • the UAV control unit 110 can acquire stereo information (three-dimensional information) indicating the stereo shape (three-dimensional shape) of objects existing around the unmanned aircraft 100.
  • the object may be a part of a landscape such as buildings, roads, vehicles, trees, etc., for example.
  • the stereo information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate 3D information indicating the 3D shape of an object existing around the unmanned aircraft 100 based on each image acquired by the plurality of camera units 230, thereby acquiring the 3D information.
  • the UAV control unit 110 can obtain the three-dimensional information indicating the three-dimensional shape of objects existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the storage unit 160.
  • the UAV control unit 110 can acquire three-dimensional information related to the three-dimensional shape of objects existing around the unmanned aircraft 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aircraft 100 by controlling the rotor mechanism 210.
  • the UAV control unit 110 can control the shooting range of the camera unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control unit 110 can control the angle of view of the imaging unit 220 by controlling the zoom lens included in the imaging unit 220.
  • the UAV control unit 110 can use the digital zoom function of the camera unit 220 to control the angle of view of the camera unit 220 through digital zoom.
  • the UAV control unit 110 can move the camera unit 220 to the desired position by moving the unmanned aircraft 100 to a specific position at a specific date and time.
  • the UAV control unit 110 can move the unmanned aerial vehicle 100 to a specific position on a specific date and time to make the imaging unit 220 work as desired.
  • the communication unit 150 communicates with the terminal 80.
  • the communication unit 150 can perform wireless communication by any wireless communication method.
  • the communication unit 150 can perform wired communication through any wired communication method.
  • the communication unit 150 may send the captured image or additional information (metadata) related to the captured image to the terminal 80.
  • the storage unit 160 can store various types of information, various types of data, various types of programs, and various types of images.
  • the various images may include a photographed image or an image based on the photographed image.
  • the programs may include programs required for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring device 290.
  • the storage 160 may be a computer-readable recording medium.
  • the storage unit 160 includes memory, and may include ROM (Read Only Memory), RAM (Random Access Memory), and the like.
  • the storage unit 160 may include at least one of HDD (Hard Disk Drive), SSD (Solid State Drive), SD card, USB (Universal Serial bus) memory, and other memories. At least a part of the storage unit 160 can be detached from the unmanned aircraft 100.
  • the gimbal 200 may rotatably support the imaging unit 220 around the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 can change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the rotor mechanism 210 includes a plurality of rotor wings and a plurality of drive motors that rotate the plurality of rotor wings.
  • the rotor mechanism 210 is controlled by the UAV control unit 110 to rotate, so that the unmanned aircraft 100 can fly.
  • the imaging unit 220 captures a subject in a desired imaging range and generates captured image data.
  • the data of the captured image captured by the imaging unit 220 may be stored in the memory included in the imaging unit 220 or the storage unit 160.
  • the imaging unit 230 captures the surroundings of the unmanned aircraft 100 and generates captured image data.
  • the image data of the imaging unit 230 may be stored in the storage unit 160.
  • the GPS receiver 240 receives a plurality of signals transmitted from a plurality of navigation satellites (ie, GPS satellites) that indicate time and the position (coordinate) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
  • the UAV control unit 110 may replace the GPS receiver 240 to calculate the position information of the GPS receiver 240. In this case, the information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect the acceleration in the front and rear, left and right, and up and down directions of the unmanned aircraft 100 and the angular velocities in the three axis directions of the pitch axis, the roll axis, and the yaw axis as the posture of the unmanned aircraft 100.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground and objects, and outputs the detection result to the UAV control unit 110.
  • the detection result can show the distance from the unmanned aircraft 100 to the ground, that is, the height.
  • the detection result can show the distance from the unmanned aircraft 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object (subject) through the reflected light.
  • a time-of-flight method may be used.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the terminal 80.
  • the terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a storage unit 87, and a display unit 88.
  • the terminal 80 may be held by a user who wishes to instruct the flight control of the unmanned aircraft 100.
  • the terminal 80 may instruct the flight control of the unmanned aircraft 100.
  • the terminal control unit 81 is configured using, for example, a CPU, MPU, or DSP.
  • the terminal control unit 81 performs signal processing for overall control of the operation of each part of the terminal 80, data input/output processing with other parts, data arithmetic processing, and data storage processing.
  • the terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 can also acquire data and information input via the operation unit 83.
  • the terminal control unit 81 may obtain data or information stored in the storage unit 87.
  • the terminal control unit 81 can transmit data and information to the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the information displayed by the display unit 88 and the information sent to the unmanned aircraft 100 through the communication unit 85 may include the flight path of the unmanned aircraft 100, the shooting position, the captured image, and the information based on the image (for example, composite image) of the captured image. information.
  • the operation unit 83 receives and obtains data and information input by the user of the terminal 80.
  • the operation unit 83 may include input devices such as buttons, keys, a touch panel, and a microphone.
  • the operation unit 83 and the display unit 88 may be constituted by a touch panel. In this case, the operation unit 83 can accept touch operations, click operations, drag operations, and the like.
  • the communication unit 85 performs wireless communication with the unmanned aircraft 100 through various wireless communication methods.
  • the wireless communication method of the wireless communication may include communication based on a wireless LAN or a public wireless network.
  • the communication unit 85 can perform wired communication by any wired communication method.
  • the storage unit 87 can store various information, various data, various programs, and various images.
  • the various programs may include application programs executed by the terminal 80.
  • the storage section 87 may be a computer-readable recording medium.
  • the storage section 87 may include ROM, RAM, and the like.
  • the storage unit 87 may include at least one of HDD, SSD, SD card, USB memory, and other memories. At least a part of the storage part 87 can be detached from the terminal 80.
  • the storage unit 87 may store a captured image acquired from the unmanned aircraft 100 or an image based on the captured image.
  • the storage unit 87 may store additional information of the captured image or the image based on the captured image.
  • the display unit 88 is configured with an LCD (Liquid Crystal Display), for example, and displays various information and data output from the terminal control unit 81.
  • the display section 88 may display a captured image or an image based on the captured image.
  • the display unit 88 may also display various data and information related to the execution of the application program.
  • FIG. 5 is a diagram showing an example of the outline of the operation of unmanned aircraft 100.
  • the UAV control unit 110 specifies the flight path RT.
  • the UAV control unit 110 acquires the shooting range of the shooting moving image during the flight along the flight path RT.
  • the shooting range is determined by the state of the unmanned aircraft 100.
  • the state of the unmanned aircraft 100 may include information such as the position of the unmanned aircraft 100 related to the shooting, the orientation of the unmanned aircraft 100 (for example, the direction of the nose), and the angle (rotation angle) of the gimbal 200 that supports the imaging unit 220.
  • the state of the unmanned aircraft 100 may also include other state information of the unmanned aircraft 100 (for example, flight information or shooting information).
  • the UAV control unit 110 can obtain the position of the imaging unit 220 through GPS technology, or can obtain the position information of the unmanned aircraft 100 with high precision through RTK (Real Time Kinematic) technology.
  • the shooting range may be generated by the UAV control unit 110 according to the positional relationship between the flying position along the flight path RT and the shooting object, that is, the subject.
  • the shooting range may be stored in the storage part 160 and obtained from the storage part 160.
  • the shooting range can be obtained from an external server through the communication unit 150.
  • the UAV control unit 110 causes the unmanned aircraft 100 to fly along the acquired flight path RT.
  • the imaging unit 220 captures the acquired imaging range to capture a moving image.
  • the unmanned aircraft 100 flies multiple times on the same flight path RT and shoots dynamic images (videos).
  • a dynamic image consists of an image sequence with multiple image frames.
  • the dynamic image may have, for example, 30 (equivalent to 30 fps) or 60 (equivalent to 60 fps) image frames per second.
  • the UAV control unit 110 causes the unmanned aircraft 100 to fly along the same flight path RT multiple times, and causes the imaging unit 220 to capture moving images of the same shooting range multiple times.
  • the UAV control unit 110 acquires the first image frame gf11, the second image frame gf12, the third image frame gf13, the fourth image frame gf14,... from the imaging unit 220 in the first circle of the flight path RT.
  • the UAV control unit 110 acquires the first image frame gf21, the second image frame gf22, the third image frame gf23, the fourth image frame gf24,... From the imaging unit 220 in the second circle of the flight path RT.
  • the UAV control unit 110 acquires the first image frame gf31, the second image frame gf32, the third image frame gf33, the fourth image frame gf34,... From the imaging unit 220 in the third circle of the flight path RT.
  • the X-th image frame is simply described as the X-th frame.
  • In each circle of the flight path RT, the same shooting range is photographed.
  • the image ranges of the first image frames gf11, gf21, and gf31 captured at the same relative time t1 are the same.
  • the image ranges of the second image frames gf12, gf22, and gf32 captured at the same relative time t2 are the same.
  • the image ranges of the third image frames gf13, gf23, and gf33 captured at the same relative time t3 are the same.
  • the image ranges of the fourth image frames gf14, gf24, and gf34 captured at the same relative time t4 are the same.
  • When image frames of the same relative time are captured, the state of the unmanned aircraft 100 is the same.
  • unmanned aircraft 100 can acquire multiple image frames taken at the same position.
  • the unmanned aircraft 100 can continuously shoot frame by frame by repeatedly flying and shooting on the flight path RT.
  • the UAV control unit 110 synthesizes the plurality of image frames of the same relative time in each circle, obtaining one composite image frame for each relative time. For example, the three first image frames gf11, gf21, and gf31 are synthesized to generate a first composite image frame. The second and subsequent composite image frames are generated in the same way. The UAV control unit 110 generates a composite moving image including the composite image frames in time-series order.
  • the UAV control unit 110 may store the status information of the unmanned aircraft 100 when capturing an image frame.
  • the time when an image frame is photographed, that is, the time when the state information of the unmanned aircraft 100 is acquired, may be synchronized with the vertical synchronization signal (VSYNC signal) of the imaging unit 220.
  • At least the state of the unmanned aircraft 100 can be saved during the first round of shooting.
  • In the second and subsequent circles, the unmanned aircraft 100 can follow the state of the unmanned aircraft recorded during the first circle, and can thus capture moving images whose image frames have the same shooting range as in the first circle.
  • FIG. 6 is a flowchart showing an example of the operation of unmanned aircraft 100.
  • the UAV control unit 110 specifies the flight path RT (S11).
  • the flight path RT may be specified by the user through the operation unit 83 of the terminal 80 in advance, or may be obtained by the communication unit 85 and the communication unit 150 and specified.
  • the flight path RT can be generated and specified by the UAV control unit 110, so that more than one desired subject can be photographed.
  • the flight path RT may be stored in the storage unit 160 in advance, and obtained from the storage unit 160 for designation.
  • the flight path RT can be specified by obtaining it from an external server through the communication unit 150.
  • the flight path RT is a flight path that can capture a desired subject.
  • the UAV control unit 110 can designate the flight path RT in accordance with manual operation (manipulation) by the operation unit 83 of the terminal 80 during the first round of flight.
  • the UAV control unit 110 causes the imaging unit 220 to start imaging along the flight path RT according to a predetermined shooting start trigger signal.
  • the shooting start trigger signal may include: receiving a shooting start instruction from the terminal 80 through the communication unit 150, or detecting that a predetermined time to start shooting has been reached.
  • the instruction to start shooting may include, for example, selecting the video synthesis mode as the shooting mode through the operation unit 83 of the terminal 80.
  • the UAV control unit 110 stores the state of the unmanned aircraft 100 when the moving image is started along the flight path RT in the storage unit 160 (S12).
  • the UAV control unit 110 may also acquire the state of the unmanned aircraft 100 instructed by the terminal 80 through the communication unit 150, that is, the state of the unmanned aircraft 100 at the start of imaging.
  • the UAV control unit 110 may determine the state of the unmanned aircraft 100 at the start of imaging according to a desired subject.
  • the imaging range captured by the imaging unit 220 is determined according to the state of the unmanned aircraft 100.
  • the UAV control unit 110 captures a moving image along the flight path RT (S13).
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 so that it flies along the flight path RT in each circle, and acquires each image frame of the dynamic image of each circle.
  • the UAV control unit 110 synthesizes the moving images captured in each circle, and generates a synthetic moving image (S14). The composition of dynamic images will be described in detail later.
  • the UAV control unit 110 outputs moving images such as the composite moving image (S15). The output of the moving image will be described in detail later.
  • the UAV control unit 110 may store the information of the number of the circle in the storage unit 160 during the flight and shooting of each circle (for example, when the shooting of each circle starts).
  • the UAV control unit 110 may also save the state of the unmanned aircraft 100 at least when acquiring each image frame of the first circle. Therefore, after the second round of the flight path RT, the unmanned aircraft 100 can also perform flight and shooting in the state of the unmanned aircraft 100 that is the same as the state of the unmanned aircraft 100 in the first round.
  • the UAV control unit 110 evaluates the output moving image (output moving image) (S16).
  • the UAV control unit 110 may evaluate the output moving image when the shooting of the moving image in each circle ends. For example, the UAV control unit 110 may determine that the shooting of the moving image is completed when the flight and shooting of the predetermined flight path RT are completed. For example, when the unmanned aircraft 100 is manipulated by the terminal 80 in the first circle, the UAV control unit 110 can determine that the shooting of the moving image has ended when an operation instructing the end of the operation of the unmanned aircraft 100 has been performed through the operation unit 83 and the unmanned aircraft 100 has been notified through the communication unit 85.
  • the UAV control unit 110 determines whether the evaluation result of the output moving image satisfies a preset criterion (S17).
  • the preset benchmark can be a user's subjective benchmark or an objective benchmark.
  • the UAV control unit 110 can send the output dynamic image to the terminal 80 through the communication unit 150; the terminal control unit 81 of the terminal 80 receives the output dynamic image through the communication unit 85 and displays it through the display unit 88.
  • the user may also confirm the displayed output moving image, and the user may subjectively determine whether the output moving image satisfies a preset criterion.
  • the terminal control unit 81 may obtain the operation information indicating that the preset criterion is satisfied through the operation unit 83, and send it to the unmanned aircraft 100 through the communication unit 85.
  • the terminal control unit 81 may obtain the operation information that the preset criterion is not met through the operation unit 83, and send it to the unmanned aircraft 100 through the communication unit 85. That is, the user can manually input the evaluation result.
  • the UAV control unit 110 may perform image recognition (for example, pattern recognition) on the output moving image, and evaluate the output moving image according to the result of the image recognition.
  • the preset reference may be a reference based on the pixel value of each pixel of each image frame of the output dynamic image.
  • When the evaluation result of the output moving image satisfies the preset criterion, the UAV control unit 110 ends the process of FIG. 6 and ends the flight and shooting along the flight path RT.
  • When the evaluation result does not satisfy the preset criterion, the UAV control unit 110 proceeds to the next circle of flight and shooting (S18).
  • In this case, the UAV control unit 110 acquires the status information of the unmanned aircraft 100 at the start of shooting from the storage unit 160, and sets it as the state of the unmanned aircraft 100 at the starting point of the flight path RT (S18).
  • the UAV control unit 110 moves the unmanned aircraft 100 to the position where imaging of the next circle of the flight path RT starts, and puts the imaging unit 220 into a state capable of imaging the desired imaging range at the start of imaging.
  • the moving image to be evaluated may be limited to the synthesized moving image among the output moving images. For example, even if the reference dynamic image of the first circle has not been evaluated, the quality of the synthesized dynamic image will not be affected, and the processing time of FIG. 6 can be shortened.
  • the unmanned aircraft 100 repeatedly performs flight and shooting along the flight path RT at least N times.
  • N is an arbitrary number greater than or equal to 2, and is assumed to be, for example, a number of circles at which the quality of the generated synthetic moving image exceeds a predetermined quality.
  • the value of N may be designated by the user through the operation unit 83 of the terminal 80, for example, or may be appropriately determined as an arbitrary numerical value.
  • the UAV control unit 110 may also determine the value of N according to the shooting scene or the shooting range.
  • unmanned aircraft 100 processes the moving images captured by the imaging unit 220 included in the unmanned aircraft 100 (an example of a flying object).
  • the UAV control unit 110 (an example of the processing unit) can specify the flight path RT on which the unmanned aircraft 100 flies.
  • the UAV control unit 110 can make the unmanned aircraft 100 circulate along the flight path RT multiple times.
  • the UAV control unit 110 can cause the imaging unit 220 to capture a plurality of dynamic images having the same shooting range through multiple round flights.
  • the UAV control unit 110 may synthesize a plurality of moving images captured through multiple circling flights to generate a synthetic moving image.
  • It is difficult for the unmanned aircraft 100 to take dynamic images while staying in one place during flight. Therefore, when shooting moving images, it is difficult to continuously shoot the same shooting range, and hence difficult to synthesize images of the same shooting range. In this regard, the unmanned aircraft 100 does not stay in one place when shooting dynamic images, but makes multiple circles of the designated flight path RT, so that it can repeatedly shoot the same shooting range over time. The unmanned aircraft 100 can therefore fix the same, comparatively large shooting range and obtain a plurality of dynamic images having a plurality of image frames corresponding to each shooting range.
  • the unmanned aircraft 100 synthesizes the plurality of dynamic images to generate a composite dynamic image, thereby obtaining various beneficial shooting effects (for example, temporal denoising, HDR (High Dynamic Range)). That is, the unmanned aircraft 100 can obtain a long-exposure shooting effect, increase the SNR (Signal to Noise Ratio), reduce noise, and expand the dynamic range.
  • the dynamic image may have a plurality of image frames.
  • the UAV control unit 110 may control the unmanned aircraft 100 so that image frames of the same relative time in each of the plurality of dynamic images have the same shooting range.
  • the unmanned aircraft 100 obtains images with the same shooting range from the image frames of the same relative time in each moving image, so that, over the moving image as a whole, multiple image frames of the same shooting range can be obtained over a larger range.
  • the UAV control unit 110 may acquire the state of the unmanned aircraft 100 in synchronization with the vertical synchronization signal (VSYNC signal) of the imaging unit 220 during the flight of the first flight path RT.
  • the UAV control unit 110 can control the flight of the unmanned aircraft 100 and the imaging unit 220 in synchronization with the vertical synchronization signal of the imaging unit 220 during the flight of the second and subsequent circles of the flight path RT, so that shooting is performed in the same state of the unmanned aircraft 100 as in the first circle.
  • the unmanned aircraft 100 synchronizes with the vertical synchronization signal of the imaging unit 220, so that the state of the unmanned aircraft 100 can be acquired every time an image frame is acquired.
  • the unmanned aircraft 100 stores the flight mode and shooting mode of the unmanned aircraft 100 in the first circle, and sets the flight mode and shooting mode in the subsequent circles to be the same as those in the first circle, so that the shooting range corresponding to the state of the unmanned aircraft 100 can easily be fixed over a wide range and multiple dynamic images can be obtained.
  • the state of the unmanned aircraft 100 may include at least one of the position of the unmanned aircraft 100, the orientation of the unmanned aircraft 100, and the angle of the gimbal 200 supporting the imaging unit 220.
  • the unmanned aircraft 100 can store the state of the unmanned aircraft 100 in the storage unit 160, and, at a later point in time, acquire and set the state of the unmanned aircraft 100 from the storage unit 160, thereby acquiring image frames of the same imaging range as that captured by the imaging unit 220 in the past.
  • When the evaluation result of the synthetic dynamic image meets the preset criterion, the UAV control unit 110 may end the control of the flight and shooting of the unmanned aircraft 100.
  • When the evaluation result does not meet the preset criterion, the UAV control unit 110 may perform flight and shooting control along the next circle of the flight path RT.
  • the unmanned aircraft 100 can continuously shoot on the flight path RT until the evaluation of the synthetic dynamic image reaches the preset reference. Therefore, it is expected that the quality of the synthesized moving image of unmanned aircraft 100 will be improved.
  • the UAV control unit 110 may acquire operation information indicating the evaluation result of the synthesized moving image.
  • the operation information can be obtained from the terminal 80. In this way, the user can subjectively evaluate the synthetic dynamic image and can decide whether to acquire more images as the basis of the synthetic dynamic image.
  • the UAV control unit 110 may perform image recognition for the composite moving image.
  • the UAV control unit 110 may evaluate the synthesized dynamic image based on the result of the image recognition.
  • the unmanned aircraft 100 can objectively evaluate the synthetic dynamic image through image recognition, and can determine whether to fly again on the flight path RT and continue to acquire image frames that are the basis of the synthetic dynamic image.
  • the processing related to the above-mentioned flight control and shooting control, and synthesis of moving images may be mainly performed by the unmanned aircraft 100.
  • various controls and various processes can be performed by one device, which can implement efficient processing and shorten the processing time.
  • the processing related to the above-mentioned shooting control and synthesis of moving images may also be mainly performed by other devices (for example, the terminal 80 and the transmitter).
  • Fig. 7 is a flowchart showing a first example of composition of moving images.
  • the composite processing of the moving image corresponds to S14 in FIG. 6.
  • In FIG. 7, it is assumed that the moving image of one arbitrary circle has been acquired in S13 of FIG. 6.
  • the UAV control unit 110 determines whether the obtained moving image is the moving image obtained in the first circle of the flight path RT (S21). For example, by referring to the storage unit 160, the UAV control unit 110 can determine which circle of the flight path RT the current circle is. The UAV control unit 110 may obtain information indicating the number of circles of the current flight path RT from the storage unit 160.
  • When the obtained moving image is that of the first circle (S21: YES), the UAV control unit 110 stores each image frame of the obtained moving image as each image frame of the reference moving image in the storage unit 160 (S22).
  • When each image frame is acquired, that is, in synchronization with the vertical synchronization signal of the imaging unit 220, the state information of the flying body is stored in the storage unit 160. Thereby, the state of the unmanned aircraft 100 at the moment each image is taken can be grasped.
  • the UAV control unit 110 also stores each image frame of the obtained moving image as each image frame of the calculation moving image (S23).
  • When the obtained moving image is not that of the first circle (S21: NO), the UAV control unit 110 compares each image frame of the obtained moving image with each corresponding image frame of the reference moving image, and calculates the global motion vector (S24).
  • Corresponding image frames refer to image frames at the same relative time.
  • the global motion refers to motion information representing changes in the state (posture) of the unmanned aircraft 100 and the flight movement of the unmanned aircraft 100 at multiple time points.
  • the global motion is represented by a motion vector (global motion vector).
  • the UAV control unit 110 corrects the global motion based on the calculated global motion vector, that is, performs global motion compensation (S25). For example, in global motion compensation, since the motion of the entire image frame can be expressed through an affine transformation and motion compensation is performed in units of the image frame, the coding efficiency and the compensation efficiency are high. In addition, the UAV control unit 110 may also perform inter-frame prediction and motion compensation other than global motion compensation between image frames of the same relative time in each circle. The processing related to motion compensation in S24 and S25 may also be omitted.
  • the UAV control unit 110 adds each image frame of the obtained moving image to each corresponding image frame of the calculation moving image (S26).
  • the value of each pixel of each frame of the moving image subjected to global motion compensation may be added to the value of each pixel of each corresponding image frame in the moving image for calculation.
  • For example, the UAV control unit 110 adds the pixel value of each pixel of the first image frame gf11 of the first-circle moving image, which is the moving image for calculation, to the pixel value of each pixel of the first image frame gf21 of the second-circle moving image, to calculate the first image frame of the new moving image for calculation. The UAV control unit 110 then adds the pixel value of each pixel of the first image frame of the moving image for calculation, in which the first-circle and second-circle moving images have been added, to the pixel value of each pixel of the first image frame gf31 of the third-circle moving image, to generate the first image frame of the new moving image for calculation.
  • the same addition is performed for the moving images after the third circle.
  • the UAV control unit 110 calculates the average value of each image frame of the calculated moving image for calculation (S27). In this case, the UAV control unit 110 may calculate the average value of the pixel value of each pixel of each image frame of the moving image for calculation. The UAV control unit 110 generates a composite moving image having each image frame whose average value is calculated (S27). Thus, when the flight of the flight path RT is the flight after the second lap, the unmanned aircraft 100 can output (for example, transmit, display) the synthesized moving image while capturing the moving image.
• In this way, the UAV control unit 110 can generate a composite moving image based on the first moving image (for example, the reference moving image) obtained in the first lap and the second moving images obtained in the second and subsequent laps.
• The unmanned aircraft 100 can thus generate a composite moving image in which the moving images of a plurality of laps are synthesized using the first-lap moving image as a reference.
• The UAV control unit 110 may compare the first moving image with the second moving image for each image frame of the same relative time, and perform motion compensation of the second moving image with respect to the first moving image based on the comparison result.
• The unmanned aircraft 100 can thereby perform motion compensation between image frames of the same relative time in the first lap and the second and subsequent laps, improving the uniformity of the image range of the image frames at the same relative time across the plurality of moving images.
• The image range corresponds to the shooting range. Therefore, even if the flight environment of the unmanned aircraft 100 is poor, the positional deviation between corresponding image frames of the respective moving images can be reduced, improving the image quality of the composite moving image.
• The motion compensation may include global motion compensation. In this way, the unmanned aircraft 100 can improve both the coding efficiency when compression-coding the moving images and the efficiency of the motion compensation.
• The UAV control unit 110 may generate the composite moving image based on a statistical value of the same pixel in the image frames of the same relative time in the first moving image and the second moving image.
• Since the unmanned aircraft 100 captures moving images while flying, it is difficult to obtain image frames with exactly the same shooting range.
• However, the unmanned aircraft 100 can fly laps of the same flight path RT and thereby obtain multiple image frames at the same relative time.
• By taking statistical values (for example, average values) over the plurality of image frames, the unmanned aircraft 100 can improve the image quality of the image frames and obtain a higher-quality moving image even if some image frames of lower image quality are included.
  • Fig. 8 is a flowchart showing a second example of composition of moving images.
• The same step numbers are assigned to the same processing as in Fig. 7, and their description is omitted or simplified.
• The unmanned aircraft 100 performs the same processing as S21, S22, S24, and S25 in Fig. 7.
• The UAV control unit 110 extracts a characteristic area in each image frame of the obtained moving image (S26A).
• The characteristic area is extracted based on an objective criterion or on the user's subjective choice.
• The characteristic area may be, for example, an area that is of particular value within the lap.
• The UAV control unit 110 may extract, as the characteristic area, the difference area between an image frame of the obtained moving image and the image frame of the same relative time in the reference moving image.
• The UAV control unit 110 may extract, as the characteristic area, an area of an image frame of the obtained moving image in which a predetermined subject exists.
• The UAV control unit 110 may extract, as the characteristic area, an area of an image frame of the obtained moving image designated by the user through the operation unit 83 of the terminal 80. The extraction of the characteristic area is performed for each image frame of the obtained moving image.
• The UAV control unit 110 replaces the area of each image frame of the reference moving image corresponding to the characteristic area extracted from each image frame of the obtained moving image (the feature-corresponding area) with the extracted characteristic area (S27A).
• For example, the UAV control unit 110 may replace the pixel value of each pixel in the feature-corresponding area with the pixel value of each pixel in the extracted characteristic area.
• The UAV control unit 110 generates a composite moving image composed of image frames in which the feature-corresponding areas of the reference moving image have been replaced with the characteristic areas of the obtained moving image (S27A).
• In this way, the UAV control unit 110 can compare the first moving image with the second moving image for each image frame of the same relative time, extract the characteristic area from the second moving image, and replace the feature-corresponding area of the first moving image (the area corresponding to that characteristic area) with the characteristic area of the second moving image.
• The unmanned aircraft 100 thus replaces a part of the first moving image that has lower image quality, or that is not in the state expected by the user, with the corresponding part of an image frame of the same relative time in another moving image, thereby improving the quality of the first moving image and obtaining the composite moving image. For example, when photographing a given tower or building as a subject, many tourists may appear around the tower or building in an image frame of the first moving image. Even in this case, if there are no tourists in the image frame of the same relative time in the second moving image, the unmanned aircraft 100 extracts that part as the characteristic area and replaces the feature-corresponding area of the image frame in the first moving image with it. As a result, the unmanned aircraft 100 can obtain a composite moving image that includes the tower or building without the tourists.
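• A minimal sketch of the difference-area variant of S26A and S27A follows, assuming already-aligned image frames of the same relative time; the threshold value and the morphological clean-up are illustrative assumptions, not values taken from this disclosure.

    import cv2
    import numpy as np

    def replace_feature_area(reference_frame, obtained_frame, threshold=30):
        # S26A: take pixels that differ strongly between the two frames
        # of the same relative time as the characteristic area.
        diff = cv2.absdiff(reference_frame, obtained_frame)
        mask = (diff.max(axis=2) > threshold).astype(np.uint8)

        # Close small holes so the characteristic area forms contiguous regions.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel).astype(bool)

        # S27A: overwrite the feature-corresponding area of the reference
        # frame with the characteristic area of the obtained frame.
        result = reference_frame.copy()
        result[mask] = obtained_frame[mask]
        return result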
  • Fig. 9 is a flowchart showing an output example of a moving image.
• The output processing of the moving image corresponds to S15 in Fig. 6.
• In Fig. 9, it is assumed that the moving image of some lap has been acquired in S13 of Fig. 6.
• The UAV control unit 110 determines whether the obtained moving image is a moving image of the Nth or a subsequent lap (S31). When the obtained moving image is from a lap before the Nth, the UAV control unit 110 outputs the moving image of the most recent lap (S32). In this case, the UAV control unit 110 may output the moving image captured by the imaging unit 220 in real time, without synthesizing it. When the obtained moving image is from the Nth or a subsequent lap, the UAV control unit 110 outputs the generated composite moving image (S33).
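• The branch of S31 to S33 reduces to a simple selection on the lap count, sketched below; n_threshold stands in for the lap threshold N, whose value is not fixed by this disclosure.

    def select_output(lap_index, latest_lap_frames, composite_frames,
                      n_threshold):
        # S31/S32: before the Nth lap the composite is assumed not yet
        # stable, so the unsynthesized latest-lap moving image is output.
        if lap_index < n_threshold:
            return latest_lap_frames
        # S33: from the Nth lap onward, output the composite moving image.
        return composite_frames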
• The UAV control unit 110 may transmit the moving image to another device (for example, the terminal 80) through the communication unit 150 as one form of output of the moving image.
• The UAV control unit 110 may cause another device (for example, the terminal 80) to display the moving image as another form of output.
• In this case, the terminal control unit 81 of the terminal 80 can receive the moving image through the communication unit 85 and display it through the display unit 88.
• The UAV control unit 110 may also store the moving image in the storage unit 160 or another recording medium (for example, an external recording medium) as output of the moving image.
• In this way, the UAV control unit 110 can obtain the number of laps flown by the unmanned aircraft 100 on the flight path RT.
• When the number of laps is less than the threshold value (N), the UAV control unit 110 may output the moving image captured in the most recent lap.
• When the number of laps is equal to or greater than the threshold value, the UAV control unit 110 may output the composite moving image.
• When the number of laps is small, the image quality of the composite moving image is assumed to be insufficient, and undesired artifacts may appear in it. In this case, therefore, the unmanned aircraft 100 can suppress the output of the composite moving image and still provide the latest imagery by outputting the unsynthesized moving image of the most recent lap.
• It can take a long time for the unmanned aircraft 100 to fly the threshold number of laps or more while capturing moving images. Even in this case, some moving image can be output along the way, allowing the user to check it.
• When the number of laps is equal to or greater than the threshold, the image quality of the composite moving image is assumed to be sufficient and stable. In this case, by providing the composite moving image, the unmanned aircraft 100 can be expected to provide a moving image with better image quality than the moving image of any single lap.
• The output example shown in Fig. 9 is only an example, and other output methods may be used.
• For example, the UAV control unit 110 may output the composite moving image regardless of the number of laps of the obtained moving image.
• In the above embodiment, the shooting and synthesis of a plurality of moving images during the flight of a flying body have been described, but the embodiment is not limited to flying bodies and may also be applied to other moving bodies (for example, vehicles and ships). In that case, by reading "flying" as "moving", the above embodiment can likewise be applied to the shooting and synthesis of a plurality of moving images while the moving body is moving.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
PCT/CN2020/133589 2019-12-09 2020-12-03 Image processing device, image processing method, program, and recording medium WO2021115192A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080074343.6A 2019-12-09 2020-12-03 Image processing device, image processing method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-222092 2019-12-09
JP2019222092A 2019-12-09 2019-12-09 Image processing device, image processing method, program, and recording medium

Publications (1)

Publication Number Publication Date
WO2021115192A1 true WO2021115192A1 (zh) 2021-06-17

Family

ID=76311106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/133589 WO2021115192A1 (zh) Image processing device, image processing method, program, and recording medium 2019-12-09 2020-12-03

Country Status (3)

Country Link
JP (1) JP6997164B2 (ja)
CN (1) CN114586335A (ja)
WO (1) WO2021115192A1 (ja)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024006072A (ja) * 2022-06-30 2024-01-17 Honda Motor Co., Ltd. Image processing device, image processing method, image processing system, and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008186145A (ja) * 2007-01-29 2008-08-14 Mitsubishi Electric Corp Aerial image processing device and aerial image processing method
JP2011087183A (ja) * 2009-10-16 2011-04-28 Olympus Imaging Corp Imaging device, image processing device, and program
JP2014185947A (ja) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd Image capturing method for three-dimensional reconstruction
JP7021900B2 (ja) * 2017-10-24 2022-02-17 M-Solutions株式会社 Image providing method
CN108419023B (zh) * 2018-03-26 2020-09-08 Huawei Technologies Co., Ltd. Method for generating high dynamic range image and related device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070122139A1 (en) * 2005-11-29 2007-05-31 Seiko Epson Corporation Controller, photographing equipment, control method of photographing equipment, and control program
CN102210136A (zh) * 2009-09-16 2011-10-05 Sony Corporation Image processing device and method, and program
CN109952755A (zh) * 2016-10-17 2019-06-28 SZ DJI Technology Co., Ltd. Flight path generation method, flight path generation system, flying body, program, and recording medium
CN109246355A (zh) * 2018-09-19 2019-01-18 Beijing Yunji Technology Co., Ltd. Method and device for generating panoramic image using robot, and robot

Also Published As

Publication number Publication date
JP6997164B2 (ja) 2022-01-17
JP2021093592A (ja) 2021-06-17
CN114586335A (zh) 2022-06-03

Similar Documents

Publication Publication Date Title
US11722647B2 (en) Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
JP6765512B2 (ja) Flight path generation method, information processing device, flight path generation system, program, and recording medium
WO2018073879A1 (ja) Flight path generation method, flight path generation system, flying body, program, and recording medium
KR20170136750A (ko) Electronic device and operating method thereof
CN112154649A (zh) Aerial survey method, shooting control method, aircraft, terminal, system, and storage medium
WO2019080768A1 (zh) Information processing device, aerial photography path generation method, program, and recording medium
WO2019230604A1 (ja) Inspection system
CN110291777B (zh) Image acquisition method, device, and machine-readable storage medium
JP2017201261A (ja) Shape information generation system
WO2018073878A1 (ja) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying body, program, and recording medium
CN111344650B (zh) Information processing device, flight path generation method, program, and recording medium
WO2018214401A1 (zh) Mobile platform, flying body, support device, portable terminal, imaging assistance method, program, and recording medium
JP2019028560A (ja) Mobile platform, image composition method, program, and recording medium
WO2021115192A1 (zh) Image processing device, image processing method, program, and recording medium
JP2021096865A (ja) Information processing device, flight control instruction method, program, and recording medium
CN111213107B (zh) Information processing device, imaging control method, program, and recording medium
US20210092306A1 (en) Movable body, image generation method, program, and recording medium
WO2020119572A1 (zh) Shape estimation device, shape estimation method, program, and recording medium
KR101552407B1 (ko) System and method for companion photographing across time at the same place
JP7081198B2 (ja) Imaging system and imaging control device
WO2020001629A1 (zh) Information processing device, flight path generation method, program, and recording medium
JP6803960B1 (ja) Image processing device, image processing method, program, and recording medium
JP2019212961A (ja) Moving body, light amount adjustment method, program, and recording medium
WO2023047799A1 (ja) Image processing device, image processing method, and program
WO2020088397A1 (zh) Position estimation device, position estimation method, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20897999

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20897999

Country of ref document: EP

Kind code of ref document: A1