WO2019227438A1 - Image processing method and device, aircraft, system and storage medium - Google Patents

Image processing method and device, aircraft, system and storage medium

Info

Publication number
WO2019227438A1
WO2019227438A1 PCT/CN2018/089376 CN2018089376W WO2019227438A1 WO 2019227438 A1 WO2019227438 A1 WO 2019227438A1 CN 2018089376 W CN2018089376 W CN 2018089376W WO 2019227438 A1 WO2019227438 A1 WO 2019227438A1
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
image
target
pixel value
paddle
Prior art date
Application number
PCT/CN2018/089376
Other languages
English (en)
Chinese (zh)
Inventor
黄若普
曹子晟
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201880010731.0A priority Critical patent/CN110291529A/zh
Priority to PCT/CN2018/089376 priority patent/WO2019227438A1/fr
Publication of WO2019227438A1 publication Critical patent/WO2019227438A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Definitions

  • the present invention relates to the field of control technology, and in particular, to an image processing method, device, aircraft, system, and storage medium.
  • Aerial photography and camera technology have become more and more widely used, and they are most widely used on aircraft.
  • Because the photographing device is mounted under the blades of the aircraft, when the attitude of the aircraft is adjusted during flight, the field of view of the photographing device is easily affected by the blades, causing the blades to appear in the captured picture and degrading the quality of aerial photography.
  • Embodiments of the present invention provide an image processing method, device, system, and storage medium, which can automatically remove paddles in a captured image, improve the quality of the captured image, and reduce energy loss and cost.
  • an embodiment of the present invention provides an image processing method, which is applied to a photographing device, where the photographing device is mounted on an aircraft with a paddle, and the method includes: acquiring an image frame sequence obtained by the photographing device; performing paddle detection on the image frames in the image frame sequence, and detecting, from the image frame sequence, a target image frame in which a paddle image area exists; and modifying or replacing the target image frame to remove the paddle image area.
  • an embodiment of the present invention provides an image processing device, including a memory and a processor;
  • the memory is used to store program instructions
  • the processor executes the program instructions stored in the memory.
  • an embodiment of the present invention provides an aircraft, including:
  • a processor configured to obtain motor rotation information, determine a blade rotation period of the aircraft according to the motor rotation information, adjust a shooting interval duration of the photographing device according to the blade rotation period, and control the photographing device mounted on the aircraft to obtain an image frame sequence according to the shooting interval duration.
  • an embodiment of the present invention provides an image processing system, including: an image processing device and an aircraft;
  • the aircraft is configured to obtain motor rotation information, determine a blade rotation period of the aircraft according to the motor rotation information, adjust a shooting interval duration of a photographing device according to the blade rotation period, and control the shooting device mounted on the aircraft to shoot an image frame sequence according to the shooting interval duration;
  • the image processing device is configured to obtain the image frame sequence obtained by the photographing device; perform paddle detection on the image frames in the image frame sequence and detect, from the image frame sequence, a target image frame in which a paddle image area exists; and modify or replace the target image frame to remove the paddle image area.
  • an embodiment of the present invention provides a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the image processing method according to the first aspect is implemented.
  • the image processing device acquires an image frame sequence obtained by a photographing device, detects from the image frame sequence a target image frame in which a paddle image area exists, and modifies or replaces the target image frame to remove the blade image area. In this way, the quality of captured images is improved, energy consumption is reduced, and costs are saved.
  • FIG. 1 is a schematic structural diagram of an image processing system according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of an aircraft provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of an image frame sequence according to an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of another image processing method according to an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of another image processing method according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of an image processing device according to an embodiment of the present invention.
  • the image processing method provided in the embodiment of the present invention may be executed by an image processing device.
  • the image processing device may be provided on an aircraft with a paddle (such as a drone) configured with a photographing device.
  • In other embodiments, the image processing device may be provided on a device with a paddle such as a robot, or on a mobile terminal (such as a mobile phone) configured with a photographing device.
  • This solution determines the blade rotation period of the aircraft based on the motor rotation information obtained from the electronic governor of the aircraft, adjusts the shooting interval duration of the photographing device according to the blade rotation period so that a phase difference is formed between the shooting interval duration and the blade rotation period, and controls the shooting device to obtain an image frame sequence according to the shooting interval duration.
  • Forming a phase difference between the shooting interval duration and the blade rotation period avoids the situation in which, without such a phase difference, the blade is captured at the same position in every exposure and therefore appears in every frame.
  • the motor rotation information is obtained from the electronic governor of the aircraft, wherein the motor rotation information includes any one or more types of information about a blade rotation speed, a blade rotation frequency, and a blade rotation period.
  • the image processing device may determine a blade rotation period of the aircraft according to the motor rotation information obtained from the electronic governor, and adjust a shooting interval duration of the photographing device according to the blade rotation period, thereby acquiring the image frame sequence obtained by the shooting device according to the shooting interval duration.
  • the image processing device performs paddle detection on the image frames in the image frame sequence, detects, from the image frame sequence, a target image frame in which a paddle image area exists, and modifies or replaces the target image frame to remove the blade image area in the target image frame.
  • the embodiment of the present invention uses an image processing device to execute corresponding processing as an example to describe an image processing method.
  • FIG. 1 is a schematic structural diagram of an image processing system according to an embodiment of the present invention.
  • the image processing system shown in FIG. 1 includes an image processing device 11 and an aircraft 12. The image processing device 11 may be the control terminal of the aircraft 12, which may specifically be any one or more of a remote controller, a smart phone, a tablet computer, a laptop computer, a ground station, and a wearable device (such as a watch or bracelet).
  • the aircraft 12 may be a rotor-type aircraft, such as a quad-rotor, a six-rotor, an eight-rotor, or a fixed-wing aircraft.
  • the aircraft includes a power system 121 for providing flight power to the aircraft.
  • the power system 121 includes any one or more of a propeller, a motor, and an electronic governor.
  • the aircraft 12 may further include a gimbal 122 and a camera device 123.
  • the camera 123 is mounted on the main body of the aircraft through the gimbal 122.
  • the camera device 123 is used for image or video shooting during the flight of the aircraft 12, and includes, but is not limited to, a multispectral imager, a hyperspectral imager, a visible light camera, and an infrared camera.
  • the gimbal 122 is a multi-axis transmission and stabilization system.
  • the motor of the gimbal 122 compensates the shooting angle of the imaging device by adjusting the rotation angle of its rotating shafts, and prevents or reduces shake of the imaging device by means of an appropriate damping mechanism.
  • the image processing system may obtain the image frame sequence captured by the shooting device 123 through the image processing device 11, perform paddle detection on the image frames in the image frame sequence, determine from the image frame sequence a target image frame in which a paddle image area exists, and modify or replace the target image frame to remove the paddle image area.
  • the image processing system may acquire the blade rotation period of the aircraft 12 through the image processing device 11, where the blade rotation period is determined by the image processing device 11 based on the motor rotation information acquired from the electronic governor of the aircraft 12.
  • the image processing device 11 may adjust a shooting interval duration of the photographing device according to the blade rotation period, wherein the shooting interval duration is a non-integer multiple of the blade rotation period.
  • when the image processing device 11 adjusts the shooting interval duration according to the blade rotation period, it may determine a phase difference parameter, and determine the shooting interval duration according to the blade rotation period and the determined phase difference parameter.
  • the phase difference parameter is a positive number less than 1.
  • FIG. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention.
  • the method may be executed by an image processing device.
  • the specific explanation of the image processing device is as described above.
  • the method according to the embodiment of the present invention includes the following steps.
  • S201 Acquire an image frame sequence obtained by a shooting device.
  • the image processing device may acquire an image frame sequence obtained by a shooting device mounted on the aircraft.
  • the image processing device may obtain a blade rotation period of the aircraft, and adjust the shooting interval duration of the shooting device according to the blade rotation period, so as to control the photographing device to obtain the image frame sequence according to the shooting interval duration.
  • FIG. 3 is a schematic structural diagram of an aircraft provided by an embodiment of the present invention. Since the blades of the aircraft are driven by motors, each rotor of the aircraft shown in FIG. 3 includes a motor, namely a motor 31, a motor 32, a motor 33, and a motor 34. In addition, the aircraft also includes a power system 35 provided on the fuselage. The power system 35 includes an electronic governor, so the electronic governor can obtain any one or more kinds of data information about the blades of the aircraft, such as the rotational speed, rotation frequency, and rotation period.
  • the image processing device can obtain the real-time rotation speed of the motor from the electronic governor of the power system 35 to determine the rotation period of the blades of the aircraft.
  • the image processing device may determine a phase difference parameter, and determine the shooting interval duration according to the obtained blade rotation period and the determined phase difference parameter.
  • the shooting interval duration is a non-integer multiple of the blade rotation period, and the phase difference parameter is a positive number less than 1.
  • the phase difference parameter λ may be a preset value, and the shooting interval duration t may be determined from the blade rotation period T and a positive integer n, for example as t = (n + λ)·T.
  • In other embodiments, the phase difference parameter λ may be determined by performing blind adjustment on the shooting interval duration.
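  • To make the relation between the blade rotation period T, the positive integer n, and the phase difference parameter λ concrete, the following is a minimal sketch assuming the form t = (n + λ)·T; the function names, the motor speed of 6000 rpm, and the chosen values of n and λ are illustrative assumptions, not part of the described implementation.

```python
# Minimal sketch of the shooting-interval adjustment, assuming t = (n + lam) * T.
# The rpm value and parameter choices below are illustrative placeholders.

def blade_rotation_period(motor_rpm: float) -> float:
    """Blade rotation period T in seconds, derived from the reported motor speed."""
    return 60.0 / motor_rpm

def shooting_interval(period_t: float, n: int = 1, lam: float = 0.3) -> float:
    """Shooting interval t = (n + lam) * T, a non-integer multiple of T.

    `lam` is the phase difference parameter, a positive number less than 1,
    so the blade sits at a different angle in each consecutive exposure.
    """
    if not (0.0 < lam < 1.0) or n < 1:
        raise ValueError("lam must be in (0, 1) and n a positive integer")
    return (n + lam) * period_t

# Example: a motor reported at 6000 rpm gives T = 0.01 s; with n = 2 and
# lam = 0.3 the camera would be triggered every 0.023 s.
T = blade_rotation_period(6000.0)
t = shooting_interval(T, n=2, lam=0.3)
print(f"T = {T:.4f} s, shooting interval t = {t:.4f} s")
```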
  • the blade rotation period of the aircraft is obtained through this implementation manner, so that the image processing device can adjust the shooting interval duration of the shooting device according to the blade rotation period, and the shooting device shoots according to that shooting interval duration.
  • S202 Perform paddle detection on the image frames in the image frame sequence, and detect, from the image frame sequence, a target image frame in which a paddle image area exists.
  • the image processing device may perform paddle detection on the image frames in the image frame sequence obtained by the photographing device, and detect from the image frame sequence the target image frame in which the paddle image area exists.
  • Specifically, the image processing device may obtain data information of each image frame in the image frame sequence, and detect, according to the data information, whether there is an image frame including a paddle image area in the image frame sequence. If the detection result is yes, it determines that the image frame including the blade image area is a target image frame.
  • the image processing device can detect a target image frame in which a paddle image region exists from the image frame sequence, so as to replace or modify the target image frame subsequently.
  • FIG. 4 is taken as an example for illustration.
  • FIG. 4 is a schematic diagram of an image frame sequence provided by an embodiment of the present invention. It is assumed that the image frame sequence obtained by the photographing device and acquired by the image processing device according to a shooting interval duration t includes four image frames: the first image frame 41, the second image frame 42, the third image frame 43, and the fourth image frame 44 shown in FIG. 4.
  • the image processing device may separately obtain the data information of the first image frame 41, the second image frame 42, the third image frame 43, and the fourth image frame 44 in the image frame sequence, and, by comparing the data information of each image frame, detect whether there is an image frame including a paddle image area in the image frame sequence. If it is detected that the first image frame 41 includes a paddle image area 411, it determines that the first image frame 41 is a target image frame.
  • S203 Modify or replace the target image frame to remove the paddle image area.
  • the image processing device may modify or replace the target image frame to remove the paddle image area.
  • the image processing device can modify or replace the image frame in which the paddle appears, to remove the paddle in the image frame, thereby improving the quality of the captured image.
  • the image processing device may further determine the target position area in which the blade appears in the target image frame, obtain, from an adjacent image frame of the target image frame, a target image corresponding to the target position area, and use the target image to replace the image of the target position area where the paddle appears in the target image frame to obtain a replacement image frame. It should be noted that, because the shooting device captures image frames quickly and the content of adjacent image frames is similar, the adjacent image frames of the target image frame can be used to perform the replacement processing on the target image frame.
  • the target position area may be a preset target position area. In other embodiments, the target position area may also be a target position area calculated by a preset algorithm, which is not specifically limited in the embodiments of the present invention.
  • FIG. 4 can be used as an example for description.
  • Assume that the target image frame determined by the image processing device is the first image frame 41. If it is determined that the blade appears in the target position area 411 in the first image frame 41, the image processing device acquires a target image corresponding to the target position area 411 from an adjacent image frame of the first image frame 41, such as the second image frame 42, and uses the target image to replace the image of the target position area 411 where the blade appears in the first image frame 41, obtaining a replacement image frame.
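  • As a rough illustration of this region replacement, the sketch below copies the target position area from an adjacent frame into the target frame; the frames are assumed to be NumPy arrays of identical shape, and the region coordinates are hypothetical.

```python
import numpy as np

def replace_blade_region(target_frame: np.ndarray,
                         neighbor_frame: np.ndarray,
                         region: tuple) -> np.ndarray:
    """Replace the target position area (y0, y1, x0, x1) of the target frame
    with the corresponding area taken from an adjacent frame."""
    y0, y1, x0, x1 = region
    out = target_frame.copy()
    out[y0:y1, x0:x1] = neighbor_frame[y0:y1, x0:x1]
    return out

# Hypothetical usage for FIG. 4: patch area 411 of frame 41 with data from frame 42.
# area_411 = (100, 220, 80, 260)   # (y0, y1, x0, x1), illustrative coordinates
# replacement_frame = replace_blade_region(frame_41, frame_42, area_411)
```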
  • the image processing device may further select, from the image frame sequence, at least two image frames other than the target image frame, perform fusion processing on the at least two image frames to obtain a fused image frame, and use the fused image frame to replace the target image frame.
  • FIG. 4 is taken as an example for description. Assuming that the target image frame determined by the image processing device is the first image frame 41, the image processing device may select, from the image frame sequence in FIG. 4, at least two image frames other than the first image frame 41, such as the second image frame 42 and the third image frame 43, and perform fusion processing on the selected second image frame 42 and third image frame 43 to obtain a fused image frame, thereby using the fused image frame to replace the first image frame 41.
  • the image processing device obtains the image frame sequence obtained by the photographing device, performs paddle detection on the image frames in the image frame sequence, detects from the image frame sequence the target image frame in which a paddle image area exists, and modifies or replaces the target image frame to remove the paddle image area, thereby improving the quality of the captured image, reducing energy consumption, and saving costs.
  • FIG. 5 is a schematic flowchart of another image processing method according to an embodiment of the present invention.
  • the method may be executed by an image processing device.
  • the specific explanation of the image processing device is as described above.
  • the difference between the embodiment of the present invention and the embodiment shown in FIG. 2 is that the embodiment of the present invention describes in detail how to detect, according to the acquired data information of the image frames in the image frame sequence, whether there is a paddle image area in an image frame, and how to determine the image frame including the blade image area as the target image frame.
  • S501 Acquire an image frame sequence obtained by a shooting device.
  • the image processing device may acquire an image frame sequence obtained by the shooting device.
  • the example of the specific implementation process is as described above, and is not repeated here.
  • S502 Acquire data information of image frames in the image frame sequence.
  • the image processing device may acquire data information of each image frame in the image frame sequence.
  • S503 Detect whether there is an image frame including a paddle image area in the image frame according to the data information.
  • the image processing device may detect whether the image frame includes a paddle image area according to the acquired data information of the image frame, wherein the data information includes the pixel values of each image frame in the image frame sequence. In one embodiment, the image processing device may detect whether a paddle image area is included in the image frame by acquiring the pixel values of the image frame.
  • Specifically, the image processing device may determine a comparison image frame sequence from the image frame sequence, and compare the pixel values of each image frame in the determined comparison image frame sequence; if the difference in pixel values in the comparison result satisfies a determination condition, it is determined that the image frame includes a paddle image area.
  • Here, the difference in pixel values meeting the judgment condition means that the difference between the pixel values of an image frame in the comparison image frame sequence and the pixel values of the other image frames in that sequence is significantly greater than the pixel value differences among those other image frames.
  • In other embodiments, the difference in pixel values satisfying the determination condition may also refer to other determination conditions, such as the difference in pixel values being greater than a preset threshold, which is not specifically limited in this embodiment of the present invention.
  • For example, assume that the image frame sequence acquired by the image processing device includes 100 image frames.
  • 50 image frames are determined from the 100 image frames as a comparison image frame sequence.
  • The pixel values of the 50 image frames in the comparison image frame sequence are compared. If the difference between the pixel values of the first image frame and the second image frame in the comparison image frame sequence is greater than a preset difference value, the first image frame may be confirmed as an image frame in a first pixel image frame sequence, and the second image frame may be confirmed as an image frame in a second pixel image frame sequence.
  • If the pixel value difference between two compared image frames is not greater than the preset difference value, the two image frames are confirmed as image frames in the second pixel image frame sequence; if the pixel value difference between two image frames in the comparison result is greater than the preset difference value, then, since pixel values from low to high correspond to colors from black to white, the image frame with the smaller pixel value among the two is confirmed as an image frame in the first pixel image frame sequence, and the image frame with the larger pixel value is confirmed as an image frame in the second pixel image frame sequence.
  • The number of image frames confirmed as belonging to the first pixel image frame sequence and the number confirmed as belonging to the second pixel image frame sequence are then compared, and the image frames of the smaller sequence are determined to be the target image frames in which the blade appears. If the number of image frames in the first pixel image frame sequence is less than the number of image frames in the second pixel image frame sequence, the image frames in the first pixel image frame sequence are determined to be target image frames in which the paddle image area appears.
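  • A simplified sketch of this pairwise comparison is given below; the per-frame mean pixel value stands in for whatever pixel statistic the implementation actually compares, and the threshold value is an illustrative assumption.

```python
import numpy as np

def detect_blade_frames(frames, diff_threshold: float = 8.0):
    """Split a comparison image frame sequence into a darker (first) and a
    brighter (second) pixel image frame set by pairwise comparison, and return
    the indices of the smaller set as the frames in which the blade appears."""
    means = [float(np.mean(f)) for f in frames]
    first_set, second_set = set(), set()
    for i in range(len(frames)):
        for j in range(i + 1, len(frames)):
            if abs(means[i] - means[j]) > diff_threshold:
                # Lower pixel values correspond to darker content (the blade).
                lo, hi = (i, j) if means[i] < means[j] else (j, i)
                first_set.add(lo)
                second_set.add(hi)
            else:
                second_set.update((i, j))
    # The blade is assumed to appear in only a minority of the frames,
    # so the smaller of the two sets is taken as the target frames.
    smaller = first_set if len(first_set) < len(second_set) else second_set
    return sorted(smaller)
```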
  • When detecting whether a paddle image area exists in an image frame, the image processing device can also do so by comparing the preset position areas of the image frames in the image frame sequence.
  • Specifically, the image processing device may determine a preset position area of each image frame in the determined comparison image frame sequence, and compare the pixel values of the preset position areas of the image frames in the comparison image frame sequence. If the difference in pixel values in the comparison result meets the determination condition, it is determined that the image frame includes a paddle image area.
  • a preset position area of each image frame is determined from the determined comparison image frame sequence, and pixel values of the preset position area of each image frame in the comparison image frame sequence are compared.
  • If the pixel values of the preset position area of the first image frame in the comparison image frame sequence are compared with the pixel values of the preset position area of the second image frame, the difference between the pixel values is greater than the preset difference value, and the pixel value of the first image frame is smaller than that of the second image frame, the first image frame is confirmed as an image frame in the first pixel image frame sequence and the second image frame is confirmed as an image frame in the second pixel image frame sequence.
  • If the difference in pixel values between the two compared preset position areas is not greater than the preset difference value, the two image frames being compared may be confirmed as image frames in the second pixel image frame sequence.
  • In this way, the first pixel image frame sequence and the second pixel image frame sequence can be obtained, the number of image frames confirmed as belonging to the first pixel image frame sequence is compared with the number of image frames confirmed as belonging to the second pixel image frame sequence, and the image frames of the smaller sequence are determined as the target image frames in which the paddles appear.
  • When the image processing device determines whether an image frame includes a paddle image area according to the pixel values of the acquired image frame, it may also determine a reference image frame from the image frame sequence, and detect the difference between the pixel values of an image frame obtained from the image frame sequence and the pixel values of the reference image frame. If the pixel value difference satisfies a determination condition, it is determined that the image frame includes a paddle image area. Here, the pixel value difference meeting the judgment condition means that the difference between the pixel values of an image frame in the image frame sequence and the pixel values of the reference image frame is significantly greater than the corresponding differences between the other image frames in the image frame sequence and the reference image frame.
  • In other embodiments, the pixel value difference meeting the judgment condition may also refer to other judgment conditions, such as the difference in pixel values being greater than a preset threshold, which is not specifically limited in this embodiment of the present invention.
  • FIG. 4 can be used as an example for description.
  • For example, assume that the third image frame 43 is determined as the reference image frame from the image frame sequence in FIG. 4.
  • If it is detected that the difference between the pixel values of the first image frame 41 and the pixel values of the third image frame 43 is greater than the preset difference value, the first image frame 41 may be confirmed as an image frame in the first pixel image frame sequence, and the third image frame 43 is confirmed as an image frame in the second pixel image frame sequence.
  • If it is detected that the difference between the pixel values of the second image frame 42 and those of the third image frame 43 is not greater than the preset difference value, the second image frame 42 may be confirmed as an image frame in the second pixel image frame sequence; similarly, the fourth image frame 44 may be confirmed as an image frame in the second pixel image frame sequence.
  • Therefore, it may be determined that the first image frame 41 includes a blade image area.
  • In an embodiment, the image processing device may further determine a preset position area in the reference image frame, and compare the pixel values of the preset position area of each image frame obtained from the image frame sequence with the pixel values of the preset position area of the reference image frame, so as to detect whether a paddle image area exists in the image frame.
  • Specifically, the image processing device may determine a preset position area from the reference image frame, and detect the difference between the pixel values of the preset position area of an image frame obtained from the image frame sequence and the pixel values of the preset position area of the reference image frame; if the pixel value difference satisfies the determination condition, it is determined that the image frame includes a paddle image area.
  • FIG. 4 can be used as an example for description.
  • Assume that the third image frame 43 is determined as the reference image frame from the image frame sequence shown in FIG. 4 and a preset position area of the third image frame 43 is determined. If the difference between the pixel value of the preset position area in the first image frame 41 and the pixel value of the preset position area in the third image frame 43 is greater than the preset difference value, the first image frame 41 may be confirmed as an image frame in the first pixel image frame sequence, and the third image frame 43 is confirmed as an image frame in the second pixel image frame sequence.
  • If the difference between the pixel value of the preset position area in the second image frame 42 and that in the third image frame 43 is not greater than the preset difference value, the second image frame 42 is confirmed as an image frame in the second pixel image frame sequence. Similarly, if it is detected that the difference between the pixel value of the preset position area in the fourth image frame 44 and that in the third image frame 43 is smaller than the preset difference value, the fourth image frame 44 may be confirmed as an image frame in the second pixel image frame sequence.
  • Therefore, it may be determined that the first image frame 41 includes a blade image area.
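  • A corresponding sketch for the reference-frame comparison, optionally restricted to a preset position area, might look like the following; again, the mean pixel value and the threshold are assumptions rather than the described implementation.

```python
import numpy as np

def detect_against_reference(frames, ref_index: int,
                             diff_threshold: float = 8.0,
                             region=None):
    """Flag frames whose pixel values (optionally only within a preset position
    area) differ from the reference frame by more than a threshold, i.e. frames
    taken to include a blade image area."""
    def crop(frame):
        if region is None:
            return frame
        y0, y1, x0, x1 = region
        return frame[y0:y1, x0:x1]

    ref_mean = float(np.mean(crop(frames[ref_index])))
    flagged = []
    for i, frame in enumerate(frames):
        if i == ref_index:
            continue
        if abs(float(np.mean(crop(frame))) - ref_mean) > diff_threshold:
            flagged.append(i)
    return flagged

# Hypothetical usage for FIG. 4: with frame 43 (index 2) as the reference,
# frame 41 (index 0) would be the one flagged.
# target_indices = detect_against_reference([f41, f42, f43, f44], ref_index=2)
```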
  • S504 If the image processing device detects that an image frame including a paddle image area exists in the image frame sequence, determine that the image frame including the paddle image area is a target image frame.
  • S505 Modify or replace the target image frame to remove the paddle image area.
  • the image processing device may modify or replace the image of the target position area where the blade appears in the target image frame to remove the blade image area; or it may select, from the image frame sequence, at least two image frames other than the target image frame, perform fusion processing on the at least two image frames to obtain a fused image frame, and use the fused image frame to modify or replace the target image frame to remove the blade image area.
  • the example of the specific implementation process is as described above, and is not repeated here.
  • the image processing device acquires the data information of the image frames in the image frame sequence; if, according to the data information, an image frame including a paddle image area is detected in the image frame sequence, the image processing device determines that the image frame including the blade image area is a target image frame, and modifies or replaces the target image frame to remove the blade image area. In this way, the accuracy of determining the target image frame is improved.
  • FIG. 6 is a schematic flowchart of another image processing method according to an embodiment of the present invention.
  • the method may be executed by an image processing device.
  • the specific explanation of the image processing device is as described above.
  • the difference between the embodiment of the present invention and the embodiment shown in FIG. 5 is that the embodiment of the present invention illustrates a detailed process of modifying or replacing the target image frame.
  • S601 Acquire an image frame sequence obtained by a shooting device.
  • the image processing device may obtain an image frame sequence obtained by the photographing device.
  • the specific embodiments and examples are as described above, and details are not described herein again.
  • S602 Perform paddle detection on the image frames in the image frame sequence, and detect, from the image frame sequence, a target image frame in which a paddle image area is determined to exist.
  • the image processing device may perform paddle detection on the image frames in the image frame sequence acquired by the photographing device, and detect from the image frame sequence the target image frame in which a paddle image area exists; the specific embodiments and examples are described above and will not be repeated here.
  • S603 Select at least two image frames from which the target image frame is removed from the image frame sequence.
  • the image processing device may select at least two image frames from which the target image frame is removed from the acquired image frame sequence.
  • S604 Perform fusion processing on the at least two image frames to obtain a fused image frame.
  • the image processing device may perform fusion processing on the selected at least two image frames from which the target image frame is removed to obtain a fused image frame.
  • For example, assume that the determined target image frame is the first image frame 41; the second image frame 42 and the third image frame 43, i.e. the image frames other than the first image frame 41, may be selected for fusion processing to obtain a fused image frame.
  • In an embodiment, average image stacking may be used to perform image frame fusion.
  • Specifically, the image processing device may determine the target position area of the blade in the target image frame, obtain the pixel values of that target position area in the at least two image frames selected from the image frame sequence excluding the target image frame, determine an average pixel value of the target position area in the at least two image frames, and use the average pixel value to replace the pixel value of the target position area in the target image frame to obtain the fused image frame.
  • the average pixel value may be a weighted average pixel value, which is not specifically limited in the embodiment of the present invention.
  • FIG. 4 is taken as an example for description. It is assumed that the target image frame is the first image frame 41 and the paddle is in the target position area 411 in the first image frame 41.
  • For the second image frame 42 and the third image frame 43 adjacent to the first image frame 41, an average pixel value of the target position area may be determined based on the pixel values of the target position area in the second image frame 42 and in the third image frame 43.
  • The image processing device may then use the average pixel value to replace the pixel value of the target position area 411 in the first image frame 41 to obtain the fused image frame.
  • In other embodiments, the image processing device may obtain the pixel values of at least two image frames selected from the image frame sequence excluding the target image frame, determine the average pixel value of the at least two image frames, and use the average pixel value to replace the pixel values of the target image frame to obtain a fused image frame.
  • For example, assume the target image frame is the first image frame 41. If the image processing device obtains the second image frame 42 and the third image frame 43, i.e. the frames other than the first image frame 41, from the image frame sequence, the average pixel value of the second image frame 42 and the third image frame 43 may be determined according to their obtained pixel values, and the image processing device may use the average pixel value to replace the pixel values of the first image frame 41 to obtain a fused image frame.
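  • A minimal sketch of this average-based fusion is shown below; it covers both variants, averaging either only the blade's target position area or the whole frame, and assumes the frames are NumPy arrays of the same shape.

```python
import numpy as np

def fuse_and_replace(target_frame: np.ndarray,
                     other_frames,
                     region=None) -> np.ndarray:
    """Average at least two frames (excluding the target frame) and use the
    average to replace either the blade's target position area or the whole
    target frame, yielding the fused image frame."""
    stack = np.stack([f.astype(np.float32) for f in other_frames], axis=0)
    fused = np.mean(stack, axis=0).astype(target_frame.dtype)
    if region is None:
        return fused                               # whole-frame replacement
    y0, y1, x0, x1 = region
    out = target_frame.copy()
    out[y0:y1, x0:x1] = fused[y0:y1, x0:x1]        # region-only replacement
    return out

# Hypothetical usage for FIG. 4: fuse frames 42 and 43 and patch area 411 of frame 41.
# fused_41 = fuse_and_replace(frame_41, [frame_42, frame_43], region=area_411)
```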
  • In other embodiments, when performing fusion processing on the at least two image frames, the image processing device may further obtain the motion vectors and pixel values of at least two image frames that do not include the paddle image area from the image frames before and after the target image frame, determine the target pixel values of the target image frame by interpolation based on these motion vectors and pixel values, and use the determined target pixel values to replace the pixel values of the target image frame to obtain a fused image frame.
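  • As a heavily simplified sketch of such motion-based interpolation, assuming OpenCV is available and ignoring occlusion handling, one could estimate dense motion between the frames before and after the target frame and sample halfway along the motion vectors to synthesize the pixels of the target position area:

```python
import cv2
import numpy as np

def interpolate_blade_region(target_frame: np.ndarray,
                             prev_frame: np.ndarray,
                             next_frame: np.ndarray,
                             region) -> np.ndarray:
    """Fill the blade's target position area of the target frame with pixels
    sampled halfway along the motion estimated between the previous and next
    frames (a rough midpoint approximation)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow from the previous frame to the next frame.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y + 0.5 * flow[..., 1]).astype(np.float32)
    midpoint = cv2.remap(next_frame, map_x, map_y, cv2.INTER_LINEAR)

    y0, y1, x0, x1 = region
    out = target_frame.copy()
    out[y0:y1, x0:x1] = midpoint[y0:y1, x0:x1]
    return out
```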
  • In other embodiments, the image processing device may also determine the target position area of the paddle in the target image frame and delete the image of the target position area in the target image frame, thereby removing the paddle image area. Taking FIG. 4 as an example, the image processing device may delete the image of the target position area 411 in the first image frame 41 according to the target position area 411 of the blade in the first image frame 41.
  • In other embodiments, when performing fusion processing on the at least two image frames, the image processing device may also use other fusion processing methods, which are not specifically limited in this embodiment of the present invention.
  • the image processing device may use the fused image frame to replace the target image frame.
  • the specific embodiments and examples are as described above, and are not repeated here.
  • the image processing device selects, from the obtained image frame sequence, at least two image frames other than the target image frame, performs fusion processing on the at least two image frames to obtain a fused image frame, and uses the fused image frame to replace the target image frame. In this way, the target image frame in which the blade appears is replaced or modified, and the quality of the captured image is improved.
  • FIG. 7 is a schematic structural diagram of an image processing device according to an embodiment of the present invention.
  • the image processing device includes: a memory 701, a processor 702, and a data interface 703.
  • the memory 701 may include a volatile memory (volatile memory); the memory 701 may also include a non-volatile memory (non-volatile memory); and the memory 701 may further include a combination of the foregoing types of memories.
  • the processor 702 may be a central processing unit (central processing unit, CPU).
  • the processor 702 may further include a hardware chip.
  • the above hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. Specifically, it can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), or any combination thereof.
  • the memory 701 is configured to store program instructions. When the program instructions are executed, the processor 702 may call the program instructions stored in the memory 701 to perform the following steps:
  • processor 702 invokes program instructions stored in the memory 701 to perform the following steps:
  • the obtaining a blade rotation period of the aircraft includes:
  • the blade rotation period of the aircraft is determined based on the motor rotation information obtained from the electronic governor.
  • processor 702 invokes program instructions stored in the memory 701 to perform the following steps:
  • phase difference parameter which is a positive number less than 1
  • the duration of the shooting interval is determined according to the blade rotation period and the determined phase difference parameter.
  • processor 702 invokes program instructions stored in the memory 701 to perform the following steps:
  • If the detection result is yes, it is determined that the image frame including the blade image area is a target image frame.
  • processor 702 invokes program instructions stored in the memory 701 to perform the following steps:
  • the image frame includes a paddle image area.
  • processor 702 invokes program instructions stored in the memory 701 to perform the following steps:
  • the image processing device obtains the image frame sequence obtained by the photographing device, performs paddle detection on the image frames in the image frame sequence, detects from the image frame sequence the target image frame in which a paddle image area exists, and modifies or replaces the target image frame to remove the paddle image area, thereby improving the quality of the captured image, reducing energy consumption, and saving costs.
  • An embodiment of the present invention also provides an aircraft, including: a fuselage; a power system provided on the fuselage for providing flying power, the power system including a blade, a motor for driving the blade to rotate, and an electronic governor for controlling the rotation of the motor; and a processor for obtaining motor rotation information, determining the rotation period of the blade of the aircraft according to the motor rotation information, adjusting the shooting interval duration of the shooting device according to the blade rotation period, and controlling the shooting device mounted on the aircraft to shoot according to the shooting interval duration to obtain an image frame sequence.
  • the processor is configured to determine a blade rotation period of the aircraft based on the motor rotation information obtained from the electronic governor.
  • the processor is configured to adjust the duration of the shooting interval according to the rotation period of the blade, wherein the duration of the shooting interval is a non-integer multiple of the rotation period of the blade.
  • the processor is configured to determine a phase difference parameter, where the phase difference parameter is a positive number less than 1; and determine the duration of the shooting interval according to the blade rotation period and the determined phase difference parameter.
  • the aircraft may be a quad-rotor drone, a six-rotor drone, a multi-rotor drone, or the like.
  • the power system may include structures such as a motor, an ESC, and a propeller.
  • the motor is responsible for driving the aircraft propeller, and the electronic governor is responsible for controlling the rotation speed of the motor of the aircraft.
  • An embodiment of the present invention further provides an image processing system, including: an image processing device and an aircraft;
  • the aircraft is configured to obtain motor rotation information, determine a blade rotation period of the aircraft according to the motor rotation information, adjust a shooting interval duration of a photographing device according to the blade rotation period, and control the shooting device mounted on the aircraft to shoot an image frame sequence according to the shooting interval duration;
  • the image processing device is configured to obtain the image frame sequence obtained by the photographing device; perform paddle detection on the image frames in the image frame sequence and detect, from the image frame sequence, a target image frame in which a paddle image area exists; and modify or replace the target image frame to remove the paddle image area.
  • the image processing device is configured to obtain a blade rotation period of the aircraft; and adjust a shooting interval duration of the photographing device according to the blade rotation period.
  • the aircraft further comprises a motor for driving the rotation of the blades, and an electronic governor for controlling the rotation of the motor;
  • the image processing device is configured to determine a blade rotation period of the aircraft based on motor rotation information obtained from an electronic governor.
  • the image processing device is configured to determine the duration of the shooting interval according to the rotation period of the blade, wherein the duration of the shooting interval is a non-integer multiple of the rotation period of the blade.
  • the image processing device is configured to determine a phase difference parameter, where the phase difference parameter is a positive number less than 1, and determine the duration of the shooting interval according to the blade rotation period and the determined phase difference parameter.
  • the image processing device is configured to obtain data information of image frames in the image frame sequence; and detect whether there is an image frame including a paddle image area in the image frame according to the data information; If yes, the image frame including the blade image area is determined as the target image frame.
  • the image processing device is configured to determine a comparison image frame sequence from the image frame sequence; compare the pixel values of each image frame in the determined comparison image frame sequence; if the comparison The difference in pixel values in the result satisfies the determination condition, and it is determined that the image frame includes a paddle image area.
  • the image processing device is configured to determine a comparison image frame sequence from the image frame sequence; determine a first position region of the image frames in the comparison image frame sequence; compare the pixel values of the first position region of each image frame in the determined comparison image frame sequence; and, if the difference in pixel values in the comparison result meets the determination condition, determine that the image frame includes a paddle image region.
  • the image processing device is configured to determine a reference image frame from the image frame sequence; detect the difference between the pixel values of an image frame obtained from the image frame sequence and the pixel values of the reference image frame; and, if the pixel value difference satisfies a determination condition, determine that the image frame includes a paddle image area.
  • the image processing device is configured to determine a reference image frame from the image frame sequence; determine a second position region in the reference image frame; detect the difference between the pixel values of the second position region of an image frame obtained from the image frame sequence and the pixel values of the second position region of the reference image frame; and, if the pixel value difference satisfies a determination condition, determine that the image frame includes a paddle image area.
  • the image processing device is configured to determine a target position area where a paddle appears in the target image frame; obtain a target image corresponding to the target position area from an adjacent image frame of the target image frame; and use the target image to replace the image of the target position area where the blade appears in the target image frame to obtain a replacement image frame.
  • the image processing device is configured to select, from the image frame sequence, at least two image frames other than the target image frame; perform fusion processing on the at least two image frames to obtain a fused image frame; and use the fused image frame to replace the target image frame.
  • the image processing device is configured to determine a target position area of a paddle in the target image frame; acquire the pixel values of the target position area in the at least two image frames selected from the image frame sequence excluding the target image frame; determine an average pixel value of the target position area in the at least two image frames; and use the average pixel value to replace the pixel value of the target position area in the target image frame to obtain the fused image frame.
  • the image processing device is configured to obtain the pixel values of at least two image frames selected from the image frame sequence excluding the target image frame; determine an average pixel value of the at least two image frames; and use the average pixel value to replace the pixel values of the target image frame to obtain a fused image frame.
  • the image processing device is configured to determine a target position area of a paddle in the target image frame; and delete an image of the target position area in the target image frame.
  • the image processing device obtains the image frame sequence obtained by the photographing device, performs paddle detection on the image frames in the image frame sequence, detects from the image frame sequence the target image frame in which a paddle image area exists, and modifies or replaces the target image frame to remove the paddle image area, thereby improving the quality of the captured image, reducing energy consumption, and saving costs.
  • a computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, it implements the image processing method described in the embodiment corresponding to FIG. 6 of the present invention, and can also implement the image processing device according to the embodiment of the present invention described in FIG. 7; details are not described herein again.
  • the computer-readable storage medium may be an internal storage unit of the device according to any one of the foregoing embodiments, such as a hard disk or a memory of the device.
  • the computer-readable storage medium may also be an external storage device of the device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash card, and so on equipped on the device.
  • the computer-readable storage medium may further include both an internal storage unit of the device and an external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the device.
  • the computer-readable storage medium may also be used to temporarily store data that has been or will be output.
  • the program can be stored in a computer-readable storage medium, and when the program is executed, the processes of the embodiments of the methods described above may be included.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present invention relate to an image processing method and device, an aircraft, a system, and a storage medium. The method comprises: obtaining an image frame sequence obtained by a photographing device; performing blade detection on image frames in the image frame sequence, and determining by detection, from the image frame sequence, a target image frame in which a blade image area is present; and modifying or replacing the target image frame so as to remove the blade image area. With the method, the quality of a captured image is improved, energy loss is reduced, and costs are saved.
PCT/CN2018/089376 2018-05-31 2018-05-31 Procédé et dispositif de traitement d'image, aéronef, système et support de stockage WO2019227438A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880010731.0A CN110291529A (zh) 2018-05-31 2018-05-31 一种图像处理方法、设备、飞行器、系统及存储介质
PCT/CN2018/089376 WO2019227438A1 (fr) 2018-05-31 2018-05-31 Procédé et dispositif de traitement d'image, aéronef, système et support de stockage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/089376 WO2019227438A1 (fr) 2018-05-31 2018-05-31 Procédé et dispositif de traitement d'image, aéronef, système et support de stockage

Publications (1)

Publication Number Publication Date
WO2019227438A1 true WO2019227438A1 (fr) 2019-12-05

Family

ID=68001284

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/089376 WO2019227438A1 (fr) 2018-05-31 2018-05-31 Procédé et dispositif de traitement d'image, aéronef, système et support de stockage

Country Status (2)

Country Link
CN (1) CN110291529A (fr)
WO (1) WO2019227438A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112131414A (zh) * 2020-09-23 2020-12-25 北京百度网讯科技有限公司 信号灯的图像的标注方法、装置、电子设备以及路侧设备
CN114439702A (zh) * 2022-01-28 2022-05-06 华能盐城大丰新能源发电有限责任公司 一种风力发电机的叶片状态监测方法和装置

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114266773B (zh) * 2022-03-02 2022-05-20 成都数联云算科技有限公司 显示面板缺陷定位方法、装置、设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105258672A (zh) * 2015-11-26 2016-01-20 广州航新航空科技股份有限公司 旋翼椎体监测方法及椎体测量装置以及系统
US20160027313A1 (en) * 2014-07-22 2016-01-28 Sikorsky Aircraft Corporation Environmentally-aware landing zone classification
CN105425810A (zh) * 2015-12-29 2016-03-23 国家电网公司 一种巡检用无人机
CN105651780A (zh) * 2015-12-28 2016-06-08 新疆金风科技股份有限公司 通过无人机检测风机叶片状态的方法、装置及系统
CN106043722A (zh) * 2016-06-29 2016-10-26 汇星海科技(天津)有限公司 一种可全景拍摄的新型无人机
CN106488139A (zh) * 2016-12-27 2017-03-08 深圳市道通智能航空技术有限公司 一种无人机拍摄的图像补偿方法、装置及无人机
US9758246B1 (en) * 2016-01-06 2017-09-12 Gopro, Inc. Systems and methods for adjusting flight control of an unmanned aerial vehicle

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160027313A1 (en) * 2014-07-22 2016-01-28 Sikorsky Aircraft Corporation Environmentally-aware landing zone classification
CN105258672A (zh) * 2015-11-26 2016-01-20 广州航新航空科技股份有限公司 旋翼椎体监测方法及椎体测量装置以及系统
CN105651780A (zh) * 2015-12-28 2016-06-08 新疆金风科技股份有限公司 通过无人机检测风机叶片状态的方法、装置及系统
CN105425810A (zh) * 2015-12-29 2016-03-23 国家电网公司 一种巡检用无人机
US9758246B1 (en) * 2016-01-06 2017-09-12 Gopro, Inc. Systems and methods for adjusting flight control of an unmanned aerial vehicle
CN106043722A (zh) * 2016-06-29 2016-10-26 汇星海科技(天津)有限公司 一种可全景拍摄的新型无人机
CN106488139A (zh) * 2016-12-27 2017-03-08 深圳市道通智能航空技术有限公司 一种无人机拍摄的图像补偿方法、装置及无人机

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112131414A (zh) * 2020-09-23 2020-12-25 北京百度网讯科技有限公司 信号灯的图像的标注方法、装置、电子设备以及路侧设备
CN114439702A (zh) * 2022-01-28 2022-05-06 华能盐城大丰新能源发电有限责任公司 一种风力发电机的叶片状态监测方法和装置

Also Published As

Publication number Publication date
CN110291529A (zh) 2019-09-27

Similar Documents

Publication Publication Date Title
WO2020014909A1 (fr) Procédé et dispositif de photographie, et véhicule aérien sans pilote
WO2020113408A1 (fr) Procédé et dispositif de traitement d'image, véhicule aérien sans pilote, système et support de stockage
WO2017183908A1 (fr) Méthodologie et appareil de génération d'un zoom haute fidélité pour une vidéo mobile
US11258949B1 (en) Electronic image stabilization to improve video analytics accuracy
WO2019227438A1 (fr) Procédé et dispositif de traitement d'image, aéronef, système et support de stockage
US20230059888A1 (en) Combined mechanical and electronic image stabilization
WO2020057609A1 (fr) Procédé et appareil de transmission d'image, terminal d'envoi d'image et système de transmission d'image d'aéronef
US11140332B2 (en) Imaging control method, imaging device and unmanned aerial vehicle
CN108513642B (zh) 一种图像处理方法、无人机、地面控制台及其图像处理系统
WO2020019106A1 (fr) Procédé de commande de cardan et de véhicule aérien sans pilote, cardan et véhicule aérien sans pilote
AU2019212641B2 (en) Voronoi cropping of images for post field generation
WO2020014953A1 (fr) Procédé et dispositif de traitement d'image
WO2020135604A1 (fr) Procédé et dispositif de traitement d'image, et véhicule aérien sans pilote
US11920762B2 (en) Method for controlling illuminating device, and apparatus, aircraft, and system thereof
CN111712857A (zh) 图像处理方法、装置、云台和存储介质
US20200349689A1 (en) Image processing method and device, unmanned aerial vehicle, system and storage medium
WO2021253173A1 (fr) Procédé et appareil de traitement d'image, et système d'inspection
WO2023165535A1 (fr) Procédé et appareil de traitement d'image et dispositif
CN111247558A (zh) 一种图像处理方法、设备、无人机、系统及存储介质
US20210377446A1 (en) Blur correction device, imaging apparatus, monitoring system, and program
WO2020000311A1 (fr) Procédé, appareil et dispositif de traitement d'image et aéronef sans pilote
WO2018032259A1 (fr) Dispositif de traitement d'image et procédé de traitement d'image
CN110785772A (zh) 一种图像处理方法、设备、系统及存储介质
WO2021196014A1 (fr) Procédé et appareil de traitement d'image, système de photographie et appareil photographique
US11616914B2 (en) Tracking objects using sensor rotation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18921123

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18921123

Country of ref document: EP

Kind code of ref document: A1