CN112514363A - Image transmission system and method, control device and movable platform


Info

Publication number: CN112514363A (application CN201980049933.0A)
Authority: CN (China)
Prior art keywords: image received, movable platform, image, moment, control device
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 刘怀宇 (Liu Huaiyu), 陈颖 (Chen Ying)
Current assignee: SZ DJI Technology Co Ltd (the listed assignees may be inaccurate)
Original assignee: SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd
Publication of CN112514363A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules

Abstract

An image transmission system and method, a control device and a movable platform. The image transmission system comprises the movable platform (100) and the control device (200) in communication connection with the movable platform. The movable platform is configured to send the image it acquires, together with the motion information of the movable platform at the time of image acquisition, to the control device. If an abnormal area exists in the image received by the control device at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment; the abnormal area comprises a pixel information missing area or a pixel information error area. By filling the abnormal area in the image received at the current moment from the image received at the previous moment and the corresponding motion information of the movable platform, the effect and reliability of abnormality recovery in wireless image transmission can be improved.

Description

Image transmission system and method, control device and movable platform
Technical Field
The present disclosure relates to the field of image transmission, and in particular, to an image transmission system and method, a control device, and a movable platform.
Background
When a movable platform is remotely controlled, the user needs to see the images acquired by the movable platform in order to operate the control device according to the images returned by the platform and control it safely. Generally, the movable platform transmits its acquired images to the control device wirelessly. However, during wireless image transmission, adverse conditions such as wireless signal interference, occlusion of the transmission path, excessive transmission distance, and antenna pointing deviation may cause interruptions, screen corruption, mosaic artifacts, and similar problems, which greatly degrade the remote-control experience and may also create safety hazards.
Conventional wireless image transmission abnormality recovery methods are generally based on image information analysis: the motion of objects in the image is estimated from optical flow, and an erroneous or missing region in the current image is then matched against a historical image by image segmentation according to that motion. However, for objects with smooth surfaces, weak textures, repeated textures or similar textures, the motion cannot be estimated accurately, so the recovery effect is poor and the reliability is low.
Disclosure of Invention
The present application provides an image transmission system and method, a control device and a movable platform, so as to improve the effect and reliability of abnormality recovery in wireless image transmission.
According to a first aspect of embodiments of the present application, there is provided an image transmission system comprising a movable platform and a control device communicatively connected to the movable platform;
the movable platform is used for sending the image acquired by the movable platform and the motion information of the movable platform during image acquisition to the control device;
if an abnormal area exists in the image received by the control device at the current moment, the control device is used for filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, and the abnormal area comprises a pixel information missing area or a pixel information error area.
According to a second aspect of embodiments of the present application, there is provided an image processing method, the method including:
receiving an image sent by a movable platform, the image having been collected by the movable platform, together with motion information of the movable platform at the time the image was collected;
if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region includes a pixel information missing region or a pixel information error region.
According to a third aspect of embodiments of the present application, there is provided a control apparatus of a movable platform, the apparatus comprising:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
receiving an image sent by a movable platform, the image having been collected by the movable platform, together with motion information of the movable platform at the time the image was collected;
if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region includes a pixel information missing region or a pixel information error region.
According to a fourth aspect of embodiments of the present application, there is provided an image transmission method, the method including:
the method comprises the steps of sending an image acquired by a movable platform and motion information of the movable platform during image acquisition to a control device in communication connection with the movable platform, so that when an abnormal area exists in the image received by the control device at the current moment, the control device is used for filling the abnormal area according to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the last moment, wherein the abnormal area comprises a pixel information missing area or a pixel information error area.
According to a fifth aspect of embodiments of the present application, there is provided a movable platform communicatively connected to a control device, the movable platform comprising:
the shooting device is used for acquiring images;
the sensor module is used for acquiring motion information of the movable platform when the image is acquired;
a processor electrically connected to the camera and the sensor module, respectively, the processor configured to perform the following operations:
sending the image acquired by the shooting device, together with the motion information of the movable platform at the time of image acquisition, to the control device, so that when an abnormal area exists in the image received at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region includes a pixel information missing region or a pixel information error region.
According to the technical solutions provided by the embodiments of the present application, the abnormal area in the image received at the current moment is filled from the image received at the previous moment and the motion information of the movable platform corresponding to that image. This improves the effect and reliability of abnormality recovery in wireless image transmission, so that when a wireless image transmission error occurs, the impact on remote control and user experience is minimized; more information support and more accurate visual information are provided for remote control, improving control safety, effectiveness and control experience, as well as the overall visual experience of wireless image transmission.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a block diagram of an image transmission system in an embodiment of the present application;
FIG. 2 is a schematic workflow diagram of an image transmission system in an embodiment of the present application;
Fig. 3A is a schematic diagram of the image received by the control device at the previous moment and the image received at the current moment in an embodiment of the present application;
Fig. 3B is a schematic diagram of the image obtained after the control device converts Fig. 3A(1) to the shooting angle of view corresponding to Fig. 3A(2) in an embodiment of the present application;
Fig. 4A is a schematic diagram of the image received by the control device at the previous moment and the image received at the current moment in another embodiment of the present application;
Fig. 4B is a schematic diagram of the image obtained after the control device converts Fig. 4A(1) to the shooting angle of view corresponding to Fig. 4A(2) in another embodiment of the present application;
Fig. 5A is a schematic diagram of the image received by the control device at the previous moment and the image received at the current moment in another embodiment of the present application;
Fig. 5B is a schematic diagram of the image obtained after the control device converts Fig. 5A(1) to the shooting angle of view corresponding to Fig. 5A(2) in another embodiment of the present application;
FIG. 6 is a flowchart of an image processing method on the control device side according to an embodiment of the present disclosure;
fig. 7 is a block diagram of a control device in an embodiment of the present application;
FIG. 8 is a flowchart illustrating an image transmission method on the movable platform side according to an embodiment of the present disclosure;
fig. 9 is a block diagram of a movable platform according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that, in the following examples and embodiments, features may be combined with each other without conflict.
Conventional wireless image transmission abnormality recovery methods are generally based on image information analysis: the motion of objects in the image is estimated from optical flow, and an erroneous or missing region in the current image is then matched against a historical image by image segmentation according to that motion. However, for objects with smooth surfaces, weak textures, repeated textures or similar textures, the motion cannot be estimated accurately, so the recovery effect is poor and the reliability is low.
In this regard, the present application fills the abnormal area in the image received at the current moment from the image received at the previous moment and the motion information of the movable platform corresponding to that image. This improves the effect and reliability of abnormality recovery in wireless image transmission, so that when a wireless image transmission error occurs, the impact on remote control and user experience is minimized; more information support and more accurate visual information are provided for remote control, improving control safety, effectiveness and control experience, as well as the overall visual experience of wireless image transmission.
Fig. 1 is a block diagram of an image transmission system in an embodiment of the present application; referring to fig. 1, an embodiment of the present application provides an image transmission system, which may include a movable platform 100 and a control device 200 communicatively connected to the movable platform 100.
In the embodiment of the present application, the movable platform 100 may be an unmanned aerial vehicle, an unmanned vehicle, a mobile robot, a handheld device, or other types of movable platforms.
The control device 200 may be a combination of a remote controller and a mobile terminal (such as a mobile phone or tablet computer), a remote controller with a screen, a PC ground station, a combination of a remote controller and VR glasses, or a combination of a PC ground station and VR glasses, among others.
Wherein, when the control device 200 is a combination of a remote controller and a mobile terminal, the remote controller may be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of a photographing device of the movable platform 100; the mobile terminal is used for displaying the image acquired by the movable platform 100, and the user can operate the remote controller according to the displayed image to realize the control of the movable platform 100.
When the control device 200 is a remote controller with a screen or a PC ground station, the remote controller or the PC ground station may be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of a photographing device of the movable platform 100; the remote controller or the PC ground station may also display an image captured by the movable platform 100, and the user may operate the remote controller or the PC ground station according to the displayed image to implement control of the movable platform 100.
When the control device 200 is a combination of a remote controller and VR glasses, the remote controller may be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100, and/or to change the position and/or posture of a photographing device of the movable platform 100; the VR glasses may display images captured by the movable platform 100, and the user may operate the remote controller according to the displayed images to control the movable platform 100.
When the control device 200 is a combination of a PC ground station and VR glasses, the PC ground station may be used to remotely control the movable platform 100 to change the position and/or attitude of the movable platform 100 and/or to change the position and/or attitude of a camera of the movable platform 100; the VR glasses may display images captured by the movable platform 100, and the user may operate the PC ground station according to the displayed images to implement control of the movable platform 100.
In the embodiment of the present application, the movable platform 100 and the control device 200 are communicatively connected by a wireless communication method, which may be Wi-Fi, Lightbridge, OcuSync or another wireless communication method; the present application does not specifically limit the wireless communication method between the movable platform 100 and the control device 200.
Fig. 1 is a block diagram of an image transmission system in an embodiment of the present application, and Fig. 2 is a schematic workflow diagram of the image transmission system in the embodiment of the present application. Referring to Fig. 1 and Fig. 2, the movable platform 100 is configured to send the image captured by the movable platform 100, together with the motion information of the movable platform 100 at the time of capture, to the control device 200. If there is an abnormal area in the image received by the control device 200 at the current moment, the control device 200 fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment, where the abnormal area includes a pixel information missing area or a pixel information error area. By filling the abnormality in the image received at the current moment from the previous image and its corresponding motion information, the control device 200 can improve the effect and reliability of abnormality recovery in wireless image transmission. The movable platform 100 may include a photographing device for capturing images; the photographing device may be an image sensor, or an integrated imaging device such as a camera or a video camera.
The motion information of the movable platform 100 may include at least one of position information and attitude information of the photographing device of the movable platform 100. Taking as an example a movable platform that is an unmanned aerial vehicle with the photographing device mounted on it, the photographing device may include a sensor module for acquiring at least one of its position information and attitude information. Optionally, the frequency at which the sensor module acquires at least one of the position information and attitude information of the photographing device is the same as the frequency at which the photographing device captures images; that is, each time the photographing device captures a frame, the sensor module also acquires at least one of the position information and attitude information of the photographing device. Of course, the two frequencies may also differ; for example, the sensor module may acquire at least one of the position information and attitude information of the photographing device at a higher frequency than the photographing device captures images.
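When the sensor module samples pose faster than the camera captures frames, each frame needs to be paired with the motion sample taken closest to its capture time. Below is a minimal sketch of such pairing by nearest timestamp; the function name, data layout and sample rates are illustrative assumptions, not specified by the patent.

```python
def nearest_motion_sample(frame_ts, motion_samples):
    """Return the (timestamp, pose) pair whose timestamp is closest to frame_ts.

    motion_samples: non-empty list of (timestamp, pose) tuples.
    """
    return min(motion_samples, key=lambda sample: abs(sample[0] - frame_ts))

# Illustrative rates: pose sampled at 200 Hz, frames captured at 30 fps.
samples = [(i / 200.0, {"pose_id": i}) for i in range(40)]

# Pair the frame captured at t = 1/30 s with its nearest pose sample.
ts, pose = nearest_motion_sample(1 / 30.0, samples)
```

The same idea extends to interpolating between the two surrounding samples when higher accuracy is needed.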
For example, the sensor module may include an attitude sensor and/or a position sensor. The attitude sensor may be an Inertial Measurement Unit (IMU), which may be used to detect attitude information of the photographing device. The position sensor may be a Global Positioning System (GPS) positioning device, a Real-Time Kinematic (RTK) carrier-phase differential positioning device (RTK positioning device for short), or the like, and may be used to detect position information of the photographing device.
In one embodiment, when the relative positional relationship between the photographing device and the drone is fixed, the position information of the photographing device can be determined from the position information of the drone. For example, with a position sensor arranged in the body of the drone, the position information of the photographing device can be determined from the position information of the drone acquired by the position sensor together with the relative positional relationship between the drone and the photographing device.
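As a sketch of the computation just described: if the camera sits at a fixed body-frame offset from the drone's center, its world-frame position is the drone's position plus that offset rotated by the drone's attitude. The rotation matrix and mounting offset below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def camera_position(drone_pos_w, R_wb, offset_b):
    """World-frame camera position from drone position, drone attitude
    (body-to-world rotation R_wb) and the fixed body-frame mounting offset."""
    return drone_pos_w + R_wb @ offset_b

# Illustrative case: drone at the origin, yawed 90 degrees, camera mounted
# 0.1 m forward of the body center.
yaw = np.pi / 2
R_wb = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                 [np.sin(yaw),  np.cos(yaw), 0.0],
                 [0.0,          0.0,         1.0]])
p = camera_position(np.array([0.0, 0.0, 0.0]), R_wb, np.array([0.1, 0.0, 0.0]))
```

After the 90-degree yaw, the forward-mounted camera ends up displaced along the world y-axis.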
In order to reduce errors in the transmission of images and motion information, in this embodiment the movable platform 100 transmits the image it captures and the motion information of the movable platform 100 at the time of capture to the control device 200 over different signal transmission channels. Optionally, the movable platform 100 transmits the captured image to the control device 200 over a broadband wireless transmission channel, and transmits the motion information at the time of capture over a narrowband wireless transmission channel. Because the data volume of an image is large and the bandwidth of the broadband wireless transmission channel is high, transmitting images over the broadband channel ensures that the image data can be transmitted to the control device 200 in full. The data volume of the motion information is relatively small; although the bandwidth of the narrowband wireless transmission channel is small, its reliability is high, ensuring that the motion information is transmitted to the control device 200 accurately.
It should be noted that, in the embodiment of the present application, when an abnormal area exists in the image received by the control device 200 at the current moment, the process of filling the abnormal area according to the image received at the previous moment and the corresponding motion information of the movable platform 100 may be performed in real time. That is, while the control device 200 is controlling the movable platform 100 to move, the control device 200 fills abnormal areas in the images in real time, so that when a wireless image transmission error occurs, the impact on remote control and user experience is minimized; a large amount of information support and more accurate visual information are provided for remote control, improving control safety, effectiveness and control experience, as well as the overall visual experience of wireless image transmission. Of course, in other embodiments, when an abnormal area exists in the image received at the current moment, the control device 200 may instead fill the abnormal area offline according to the image received at the previous moment and the corresponding motion information of the movable platform 100.
The control device 200 of the embodiment of the present application stores each received image together with its corresponding motion information, and displays the received image or the filled image in real time, making it convenient for the user to control the movement of the movable platform 100 according to the image displayed in real time and improving the safety of the movement of the movable platform 100.
It should be understood that the abnormal region may also be other regions indicating pixel information abnormality in the image, and is not limited to the pixel information missing region and the pixel information error region listed above.
The detailed implementation process of the control device 200 filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment will be described in detail below.
The specific implementation process of the control device 200 for filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment may include the following steps:
(1) processing the image received at the previous moment according to the motion information of the movable platform 100 corresponding to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the current moment;
the processing in the above steps may include at least one of scaling processing, translation processing, and rotation processing, and the processing manner needs to be determined according to the motion information of the movable platform 100 corresponding to the image received at the previous time and the motion information of the movable platform 100 corresponding to the image received at the current time.
When at least one of the position and the attitude of the photographing device of the movable platform 100 changes, the objects in the image captured by the photographing device change correspondingly. For a position change of the photographing device, the objects in the image change as follows: if the photographing device of the movable platform 100 moves forward, the objects in the image become larger; if it moves backward, the objects become smaller. If the photographing device moves to the left, the objects in the image move to the right; if it moves to the right, the objects move to the left. If the photographing device moves upward, the objects in the image move downward; if it moves downward, the objects move upward.
For an attitude change of the photographing device, the objects in the image change as follows: if the photographing device of the movable platform 100 rotates upward in the pitch direction, the objects in the image rotate downward; if it rotates downward in the pitch direction, the objects rotate upward. If the photographing device rotates to the left in the yaw direction, the objects in the image rotate to the right; if it rotates to the right in the yaw direction, the objects rotate to the left. If the photographing device rotates to the left in the roll direction, the objects in the image rotate to the right; if it rotates to the right in the roll direction, the objects rotate to the left.
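The qualitative correspondence described in the two paragraphs above can be collected into a simple lookup table. This is only an illustrative encoding of the stated directions, with invented label strings, not part of the claimed method:

```python
# Camera motion -> apparent motion of objects in the image, as described above.
APPARENT_MOTION = {
    # Position changes
    "move_forward": "objects_grow",
    "move_backward": "objects_shrink",
    "move_left": "objects_shift_right",
    "move_right": "objects_shift_left",
    "move_up": "objects_shift_down",
    "move_down": "objects_shift_up",
    # Attitude changes
    "pitch_up": "objects_rotate_down",
    "pitch_down": "objects_rotate_up",
    "yaw_left": "objects_rotate_right",
    "yaw_right": "objects_rotate_left",
    "roll_left": "objects_rotate_right",
    "roll_right": "objects_rotate_left",
}
```

Each apparent motion is simply the opposite of the camera's own motion, which is what lets the previous frame be warped toward the current one.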
In this embodiment, the optical axis of the lens is parallel to the front-back direction of the photographing device, the optical axis of the lens is perpendicular to the left-right direction of the photographing device, and the gravity direction of the photographing device is parallel to the up-down direction of the photographing device.
In this embodiment, the photographing device of the movable platform 100 may undergo a position change, an attitude change, or both between capturing two images. In step (1), whether a position change, an attitude change, or both occurred between the two images can be determined from the motion information of the movable platform 100 corresponding to the image received at the previous moment and the motion information corresponding to the image received at the current moment; the image received at the previous moment is then processed according to the determined position change and/or attitude change. Optionally, when implementing step (1), the control device 200 is configured to determine, from the motion information of the movable platform 100 corresponding to the image received at the previous moment and that corresponding to the image received at the current moment, at least one of a relative positional relationship between the two images (i.e., the change in position of the photographing device) and a relative attitude relationship between the two images (i.e., the change in attitude of the photographing device), and to process the image received at the previous moment according to at least one of the relative positional relationship and the relative attitude relationship.
The process by which the control device 200 processes the image received at the previous moment according to at least one of the relative positional relationship and the relative attitude relationship may include: determining a conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative positional relationship and the relative attitude relationship; and processing the image received at the previous moment according to the conversion matrix. The conversion matrix may comprise a scaling matrix, a rotation matrix, a translation matrix, or a concatenation of at least two of these.
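As an illustrative sketch (not the patent's actual implementation), such a conversion matrix can be built in homogeneous pixel coordinates by concatenating scaling, rotation and translation matrices, then applied to map pixels of the previously received image into the current image's coordinates. All numeric values below are assumptions for demonstration.

```python
import numpy as np

def scaling(s):
    """Homogeneous 2D scaling matrix with uniform factor s."""
    return np.diag([s, s, 1.0])

def rotation(theta):
    """Homogeneous 2D rotation matrix, angle theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def translation(tx, ty):
    """Homogeneous 2D translation matrix."""
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

# Concatenation of at least two of the matrices: scale, then rotate, then translate.
T = translation(5.0, -2.0) @ rotation(0.0) @ scaling(2.0)

# Map a pixel of the previous image into the current image's coordinates.
prev_pixel = np.array([10.0, 4.0, 1.0])
cur_pixel = T @ prev_pixel
```

In practice the whole previous image would be warped with such a matrix (e.g., via an image-warping routine) before copying pixels into the abnormal area.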
How the control device 200 determines the conversion matrix between the image received at the current moment and the image received at the previous moment from at least one of the relative positional relationship and the relative attitude relationship is described in detail below.
In some embodiments, the control device 200 is configured to determine a scaling matrix between the image received at the current moment and the image received at the previous moment when the relative positional relationship between the two images is a forward or backward movement. When the photographing device moves forward or backward, the objects in the captured image become larger or smaller; therefore, when the relative positional relationship between the image received at the current moment and the image received at the previous moment is a forward or backward movement, a scaling transformation relationship exists between the two images.
The control device 200 may employ different strategies to determine the scaling matrix between the image received at the current moment and the image received at the previous moment. In one embodiment, when the relative positional relationship between the two images includes a forward-backward movement, the control device 200 determines the scaling matrix according to the distance of that movement. In this embodiment, the scaling factor corresponding to the scaling matrix is positively correlated with the distance moved forward or backward; that is, the larger the distance moved, the larger the scaling factor.
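One way to realize such a positive correlation is a pinhole-camera approximation, in which moving forward by delta_z toward a scene at depth Z magnifies apparent size by Z / (Z − delta_z). The function name and the default scene depth below are illustrative assumptions, not values given in the text:

```python
def scale_from_forward_motion(delta_z_m, scene_depth_m=10.0):
    """Scaling factor for a forward displacement delta_z_m (metres).

    Pinhole approximation (assumed): apparent size is inversely
    proportional to object depth, so the factor grows monotonically
    with the distance moved forward.
    """
    remaining = max(scene_depth_m - delta_z_m, 1e-6)  # avoid division by zero
    return scene_depth_m / remaining
```

Any monotone mapping from displacement to scale would satisfy the positive correlation stated above; the pinhole form is just one physically motivated choice.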
In another embodiment of determining the scaling matrix between the image received at the current moment and the image received at the previous moment, the movable platform 100 is provided with a distance detection sensor for detecting the depth information of objects around the movable platform 100. The present application does not specifically limit the type of the distance detection sensor; any distance detection sensor capable of detecting the depth information of objects around the movable platform 100 falls within the scope of the present application. The movable platform 100 of this embodiment is further configured to send the depth information corresponding to an image acquired by the movable platform 100 to the control device 200. Specifically, the movable platform 100 acquires the depth information of surrounding objects through the distance detection sensor while acquiring the image; determines the depth information corresponding to the image according to the depth information of the surrounding objects; and sends the depth information corresponding to the image to the control device 200. To reduce errors occurring during image transmission, the movable platform 100 may send the image it acquires and the corresponding depth information to the control device 200 over different signal transmission channels.
The movable platform 100 may send the depth information corresponding to an acquired image and the motion information of the movable platform 100 at the time of acquisition to the control device 200 over the same signal transmission channel; for example, both may be sent to the control device 200 over a narrow-band wireless transmission channel.
In this embodiment, when the relative positional relationship between the image received at the current moment and the image received at the previous moment includes a forward-backward movement, the control device 200 is configured to determine the scaling matrix between the two images according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment. Determining the scaling matrix based on depth information allows objects at different depths in the image to be scaled to different degrees, which is closer to the actual visual impression and further improves the effect of abnormality recovery in wireless image transmission.
Further, in this embodiment, the scaling factor corresponding to the scaling matrix corresponds to the ratio of the depth information corresponding to the image received at the current moment to the depth information corresponding to the image received at the previous moment. Optionally, the scaling factor is equal to this ratio, that is, scaling factor = (depth information corresponding to the image received at the current moment) / (depth information corresponding to the image received at the previous moment). Of course, the scaling factor need not be equal to the ratio; it may instead be determined as a function of that ratio.
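Since depth can vary across the image, the depth-based scaling factor can be evaluated per pixel from two aligned depth maps. This numpy sketch follows the ratio stated above (current depth over previous depth); it is an illustration, not the claimed implementation:

```python
import numpy as np

def per_pixel_scale(depth_curr, depth_prev, eps=1e-6):
    """Per-pixel scaling factors from two aligned H x W depth maps,
    computed as current depth / previous depth (as stated above).
    eps guards against division by zero in invalid depth pixels."""
    return np.asarray(depth_curr) / np.maximum(np.asarray(depth_prev), eps)
```

A per-pixel factor map like this is what allows objects at different depths to be scaled to different degrees, as the paragraph above describes.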
It should be appreciated that the above embodiment of determining the scaling matrix based on the distance of the forward-backward movement and the embodiment of determining it based on depth information may be combined to obtain a more accurate scaling matrix.
In other embodiments, the control device 200 is configured to determine a translation matrix between the image received at the current moment and the image received at the previous moment when the relative positional relationship between the two images includes a left-right movement and/or an up-down movement. When the photographing device moves left or right, objects in the images it captures move right or left, respectively; likewise, when the photographing device moves up or down, objects in the images move down or up. Therefore, when the relative positional relationship between the image received at the current moment and the image received at the previous moment includes a left-right or up-down movement, a translation transformation relationship exists between the two images.
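Under a pinhole model, a sideways camera displacement of d metres at scene depth Z maps to an image shift of roughly d·f/Z pixels, in the direction opposite to the camera's motion. The focal length, depth, and axis conventions below (camera x right / y up, image x right / y down) are illustrative assumptions:

```python
def image_shift_px(cam_dx_m, cam_dy_m, focal_px=1000.0, depth_m=10.0):
    """Approximate image translation (du, dv) in pixels induced by a
    sideways/vertical camera move, pinhole model. Assumed conventions:
    camera x right / y up, image x right / y down."""
    du = -cam_dx_m * focal_px / depth_m  # camera left -> scene shifts right
    dv = cam_dy_m * focal_px / depth_m   # camera up -> scene shifts down
    return du, dv
```

The resulting (du, dv) pair would populate the last column of a translation matrix such as the one described above.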
In other embodiments, the control device 200 is configured to determine a rotation matrix between the image received at the current moment and the image received at the previous moment when the relative attitude relationship between the two images includes at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change. When the photographing device undergoes at least one of these attitude changes, objects in the images it captures rotate around the corresponding axis; therefore, when the relative attitude relationship between the image received at the current moment and the image received at the previous moment includes at least one such change, a rotation transformation relationship exists between the two images.
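For the roll case specifically, a camera roll of θ induces an in-plane rotation of the image content by −θ; the sign depends on the chosen axis convention, and this helper together with its convention is an assumption for illustration:

```python
import numpy as np

def roll_to_image_rotation(roll_rad):
    """In-plane image rotation (3x3 homogeneous) induced by a camera roll.
    Assumed sign convention: rolling the camera by +theta rotates the
    image content by -theta."""
    c, s = np.cos(-roll_rad), np.sin(-roll_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
```

Small pitch and yaw changes, by contrast, are often approximated as image translations rather than in-plane rotations, which is why the roll case is the natural one to express as a pure rotation matrix.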
Furthermore, the process by which the control device 200 processes the image received at the previous moment according to the conversion matrix may include: converting, according to the conversion matrix, the image received at the previous moment to the shooting angle corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment.
When the conversion matrix is a scaling matrix, the control device 200 is configured to perform scaling processing on the image received at the previous moment according to the scaling matrix, so as to convert it into the shooting angle corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment. For example, referring to fig. 3A, diagram (1) of fig. 3A is the image received at the previous moment and diagram (2) of fig. 3A is the image received at the current moment; an abnormal region 31 exists in diagram (2). The relative positional relationship between diagram (2) and diagram (1) (i.e., the change in position of the photographing device) includes a backward movement, so objects in diagram (2) are smaller than in diagram (1). A scaling matrix between diagram (2) and diagram (1) can be determined according to the distance of the backward movement; diagram (1) is then scaled according to this matrix, converting it into the shooting angle corresponding to diagram (2) and yielding the image shown in fig. 3B. Fig. 3B is thus diagram (1) of fig. 3A after processing.
When the conversion matrix is a translation matrix, the control device 200 is configured to perform translation processing on the image received at the previous moment according to the translation matrix, so as to convert it into the shooting angle corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment. For example, referring to fig. 4A, diagram (1) of fig. 4A is the image received at the previous moment and diagram (2) of fig. 4A is the image received at the current moment; an abnormal region 41 exists in diagram (2). The relative positional relationship between diagram (2) and diagram (1) (i.e., the change in position of the photographing device) includes a leftward movement, so objects in diagram (2) are translated rightward relative to diagram (1). A translation matrix between diagram (2) and diagram (1) can be determined according to the distance of the leftward movement; diagram (1) is then translated rightward according to this matrix, converting it into the shooting angle corresponding to diagram (2) and yielding the image shown in fig. 4B. Fig. 4B is thus diagram (1) of fig. 4A after processing.
When the conversion matrix is a rotation matrix, the control device 200 is configured to perform rotation processing on the image received at the previous moment according to the rotation matrix, so as to convert it into the shooting angle corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment. For example, referring to fig. 5A, diagram (1) of fig. 5A is the image received at the previous moment and diagram (2) of fig. 5A is the image received at the current moment; an abnormal region 51 exists in diagram (2). The relative attitude relationship between diagram (2) and diagram (1) (i.e., the change in attitude of the photographing device) includes a leftward rotation (a roll attitude change), so objects in diagram (2) are rotated rightward relative to diagram (1). A rotation matrix between diagram (2) and diagram (1) can be determined according to the magnitude of the roll attitude change; diagram (1) is then rotated rightward according to this matrix, converting it into the shooting angle corresponding to diagram (2) and yielding the image shown in fig. 5B. Fig. 5B is thus diagram (1) of fig. 5A after processing.
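The three cases above are all instances of re-sampling the previous frame through the conversion matrix. A minimal inverse-mapping warp with nearest-neighbour sampling might look as follows; this is a sketch that assumes M maps previous-image pixel coordinates to current-image coordinates, and a real implementation would typically use bilinear interpolation:

```python
import numpy as np

def warp_image(img, M):
    """Warp a grayscale H x W image by a 3x3 homogeneous matrix M that maps
    previous-frame pixel coords to current-frame coords. Inverse mapping with
    nearest-neighbour sampling; unmapped destination pixels are left at 0."""
    h, w = img.shape
    Minv = np.linalg.inv(M)
    ys, xs = np.mgrid[0:h, 0:w]
    dst = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = Minv @ dst  # source coordinates for every destination pixel
    sx = np.round(src[0] / src[2]).astype(int)
    sy = np.round(src[1] / src[2]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out.ravel()[valid] = img[sy[valid], sx[valid]]
    return out
```

Whether M is a scaling, translation, rotation, or concatenated matrix, the same warp converts the previous image to the current shooting angle.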
(2) Filling the abnormal area according to the processed image received at the previous moment.
When step (2) is implemented, the control device 200 is specifically configured to determine, according to the pixel correspondence between the processed image received at the previous moment and the image received at the current moment, the area in the processed image that corresponds to the abnormal area; and to cover the abnormal area with that corresponding area. For example, referring to figs. 3A and 3B, the area 32 in fig. 3B corresponds to the abnormal area 31; covering the abnormal area 31 with the area 32 completes the filling operation. Referring to figs. 4A and 4B, the area 42 in fig. 4B corresponds to the abnormal area 41; covering the abnormal area 41 with the area 42 completes the filling operation. Referring to figs. 5A and 5B, the area 52 in fig. 5B corresponds to the abnormal area 51; covering the abnormal area 51 with the area 52 completes the filling operation.
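With the processed previous frame pixel-aligned to the current frame, the covering step reduces to a masked copy. This sketch assumes the abnormal area is available as a boolean mask; how that mask is detected is outside this snippet:

```python
import numpy as np

def fill_abnormal(current, warped_prev, abnormal_mask):
    """Replace the pixels of `current` inside `abnormal_mask` (boolean H x W)
    with the co-located pixels of the processed previous frame."""
    filled = current.copy()
    filled[abnormal_mask] = warped_prev[abnormal_mask]
    return filled
```

Copying only inside the mask leaves all correctly received pixels of the current frame untouched, which is the behaviour the covering operation above describes.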
In addition, considering that the lens of the photographing device usually introduces lens distortion, the control device 200 may, in order to eliminate its influence, first process the image received at the previous moment and the image received at the current moment according to the lens parameters, and then fill the abnormal region in the distortion-corrected image received at the current moment according to the distortion-corrected image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment, thereby improving the image display effect.
FIG. 6 is a flowchart of an image processing method on the control device side according to an embodiment of the present disclosure; referring to fig. 6, an image processing method according to an embodiment of the present application may include the following steps:
S601: receiving an image collected and sent by the movable platform, together with the motion information of the movable platform at the time the image was collected;
S602: if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area comprises a pixel information missing area or a pixel information error area.
Optionally, the motion information of the movable platform includes at least one of position information and attitude information of a camera of the movable platform.
Optionally, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment includes: processing the image received at the last moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment; and filling the abnormal area according to the processed image received at the last moment.
Optionally, the processing includes at least one of scaling processing, translation processing, and rotation processing.
Optionally, processing the image received at the previous time according to the motion information of the movable platform corresponding to the image received at the previous time and the motion information of the movable platform corresponding to the image received at the current time includes: determining at least one of a relative position relation between the image received at the last moment and the image received at the current moment and a relative posture relation between the image received at the last moment and the image received at the current moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment; and processing the image received at the last moment according to at least one of the relative position relation and the relative attitude relation.
Optionally, processing the image received at the last time according to at least one of the relative position relationship and the relative posture relationship includes: determining a conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relation and the relative attitude relation; and processing the image received at the last moment according to the conversion matrix.
Optionally, the conversion matrix comprises a scaling matrix, or a rotation matrix, or a translation matrix, or a concatenation of at least two of the scaling matrix, the rotation matrix and the translation matrix.
Optionally, determining a conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative posture relationship includes: if front-and-back position movement exists in the relative position relation corresponding to the image received at the current moment and the image received at the previous moment, determining a scaling matrix between the image received at the current moment and the image received at the previous moment.
Optionally, determining a scaling matrix between the image received at the current time and the image received at the last time includes: determining a scaling matrix between the image received at the current moment and the image received at the last moment according to the distance moved by the front and rear positions.
Optionally, the scaling factor corresponding to the scaling matrix is positively correlated to the distance moved by the front and rear positions.
Optionally, the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform; the image processing method further includes: receiving depth information corresponding to an image collected by the movable platform and sent by the movable platform; determining a scaling matrix between an image received at a current time and an image received at a previous time includes: determining a scaling matrix between the image received at the current moment and the image received at the last moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the last moment.
Optionally, the scaling factor corresponding to the scaling matrix corresponds to a ratio of depth information corresponding to the image received at the current time to depth information corresponding to the image received at the previous time.
Optionally, determining a conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative posture relationship includes: if left-right position movement and/or up-down position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the last moment, determining a translation matrix between the image received at the current moment and the image received at the last moment.
Optionally, determining a conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative posture relationship includes: if at least one of pitch attitude change, roll attitude change and yaw attitude change exists in the relative attitude relationship corresponding to the image received at the current moment and the image received at the last moment, determining a rotation matrix between the image received at the current moment and the image received at the last moment.
Optionally, processing the image received at the last time according to the conversion matrix includes: converting the image received at the previous moment into the shooting visual angle corresponding to the motion information of the movable platform corresponding to the image received at the current moment according to the conversion matrix.
Optionally, filling the abnormal region according to the processed image received at the last moment includes: determining, according to the pixel correspondence between the processed image received at the last moment and the image received at the current moment, the area in the processed image that corresponds to the abnormal area; and covering the abnormal area with that corresponding area.
Optionally, the motion information of the movable platform is obtained based on detection by an attitude sensor and/or a position sensor of the camera of the movable platform.
Optionally, the movable platform transmits the image captured by the movable platform and the motion information of the movable platform when capturing the image to the control device based on different signal transmission channels.
Optionally, the movable platform transmits the image acquired by the movable platform to the control device based on the broadband wireless transmission channel, and transmits the motion information of the movable platform when the image is acquired to the control device based on the narrowband wireless transmission channel.
For the image processing method of the foregoing embodiment, an embodiment of the present application further provides a control device for a movable platform, and fig. 7 is a block diagram of the control device in the embodiment of the present application; referring to fig. 7, the control device may include a storage device 210 and one or more first processors 220.
The storage device 210 is configured to store program instructions; the one or more first processors 220 invoke the program instructions stored in the storage device 210 and, individually or collectively, are configured to perform the following operations when the instructions are executed:
receiving an image collected and sent by the movable platform, together with the motion information of the movable platform at the time the image was collected;
if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
the abnormal area comprises a pixel information missing area or a pixel information error area.
Optionally, the motion information of the movable platform includes at least one of position information and attitude information of a camera of the movable platform.
Optionally, when filling the abnormal area according to the image received at the last time and the motion information of the movable platform corresponding to the image received at the last time, the one or more first processors 220 are further configured, individually or collectively, to perform the following operations: processing the image received at the last moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment; and filling the abnormal area according to the processed image received at the last moment.
Optionally, the processing includes at least one of scaling processing, translation processing, and rotation processing.
Optionally, the one or more first processors 220, when processing the image received at the previous time according to the motion information of the movable platform corresponding to the image received at the previous time and the motion information of the movable platform corresponding to the image received at the current time, are further configured, individually or collectively, to perform the following operations: determining at least one of a relative position relation between the image received at the last moment and the image received at the current moment and a relative posture relation between the image received at the last moment and the image received at the current moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment; and processing the image received at the last moment according to at least one of the relative position relation and the relative attitude relation.
Optionally, the one or more first processors 220, when processing the image received at the last time according to at least one of the relative position relationship and the relative pose relationship, are further configured individually or collectively to perform the following operations: determining a conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relation and the relative attitude relation; and processing the image received at the last moment according to the conversion matrix.
Optionally, the conversion matrix comprises a scaling matrix, or a rotation matrix, or a translation matrix, or a concatenation of at least two of the scaling matrix, the rotation matrix and the translation matrix.
Optionally, the one or more first processors 220, when determining the conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative pose relationship, are further configured, individually or collectively, to: if front-and-back position movement exists in the relative position relation corresponding to the image received at the current moment and the image received at the previous moment, determine a scaling matrix between the image received at the current moment and the image received at the previous moment.
Optionally, the one or more first processors 220, when determining the scaling matrix between the image received at the current time and the image received at the last time, are further configured, individually or collectively, to: and determining a scaling matrix between the image received at the current moment and the image received at the last moment according to the distance moved by the front and rear positions.
Optionally, the scaling factor corresponding to the scaling matrix is positively correlated to the distance moved by the front and rear positions.
Optionally, the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform; the one or more first processors 220, individually or collectively, are further configured to perform the following operations: receiving depth information corresponding to an image collected by a movable platform and sent by the movable platform; the one or more first processors 220, when determining the scaling matrix between the image received at the current time and the image received at the previous time, are further configured, individually or collectively, to: and determining a scaling matrix between the image received at the current moment and the image received at the last moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the last moment.
Optionally, the scaling factor corresponding to the scaling matrix corresponds to a ratio of depth information corresponding to the image received at the current time to depth information corresponding to the image received at the previous time.
Optionally, the one or more first processors 220, when determining the conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative pose relationship, are further configured, individually or collectively, to: if left-right position movement and/or up-down position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the last moment, determine a translation matrix between the image received at the current moment and the image received at the last moment.
Optionally, the one or more first processors 220, when determining the conversion matrix between the image received at the current time and the image received at the previous time according to at least one of the relative position relationship and the relative pose relationship, are further configured, individually or collectively, to: if at least one of pitch attitude change, roll attitude change and yaw attitude change exists in the relative attitude relationship corresponding to the image received at the current moment and the image received at the last moment, determine a rotation matrix between the image received at the current moment and the image received at the last moment.
Optionally, the one or more first processors 220, when processing the image received at the last time according to the conversion matrix, are further configured individually or collectively to perform the following: converting the image received at the previous moment into the shooting visual angle corresponding to the motion information of the movable platform corresponding to the image received at the current moment according to the conversion matrix.
Optionally, the one or more first processors 220, when filling the abnormal area according to the processed image received at the last time, are further configured individually or collectively to perform the following operations: determining, according to the pixel correspondence between the processed image received at the last moment and the image received at the current moment, the area in the processed image that corresponds to the abnormal area; and covering the abnormal area with that corresponding area.
Optionally, the motion information of the movable platform is obtained based on detection by an attitude sensor and/or a position sensor of the camera of the movable platform.
Optionally, the movable platform transmits the image captured by the movable platform and the motion information of the movable platform when capturing the image to the control device based on different signal transmission channels.
Optionally, the movable platform transmits the image acquired by the movable platform to the control device over a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of image acquisition to the control device over a narrowband wireless transmission channel.
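One way to picture this split is to serialise the small motion sample separately from the bulky frame so it can travel on the narrowband channel. The packet fields, the JSON encoding, and the frame-id pairing below are illustrative assumptions, not the application's wire format.

```python
import json
from dataclasses import dataclass

@dataclass
class MotionPacket:
    """Motion information sampled when a frame is captured.
    The fields and their layout are hypothetical."""
    frame_id: int    # lets the receiver pair the sample with its frame
    position: tuple  # (x, y, z)
    attitude: tuple  # (pitch, roll, yaw)

def encode_motion(pkt):
    # A few dozen bytes: small enough for a narrowband channel,
    # while the image itself travels over the broadband channel.
    return json.dumps({"f": pkt.frame_id,
                       "p": list(pkt.position),
                       "a": list(pkt.attitude)}).encode()

def decode_motion(raw):
    d = json.loads(raw)
    return MotionPacket(d["f"], tuple(d["p"]), tuple(d["a"]))
```

On the receiving side, the frame id (or a timestamp) lets the control device match each image with the motion information captured at the same moment even though the two arrive on different channels.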
The storage device may include a volatile memory, such as a random-access memory (RAM); the storage device may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage device may also comprise a combination of the above kinds of memory.
The first processor 220 may be a central processing unit (CPU). The first processor 220 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
FIG. 8 is a flowchart illustrating an image transmission method on the movable platform side according to an embodiment of the present application. Referring to FIG. 8, the image transmission method of the embodiment of the present application may include the following steps:
S801: send the image acquired by the movable platform and the motion information of the movable platform at the time of image acquisition to a control device communicatively connected to the movable platform, so that, when an abnormal area exists in the image received by the control device at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area comprises a pixel information missing area or a pixel information error area.
Optionally, the motion information of the movable platform includes at least one of position information and attitude information of a camera of the movable platform.
Optionally, the motion information of the movable platform is detected by an attitude sensor of the camera of the movable platform and/or a position sensor of the camera of the movable platform.
Optionally, the movable platform transmits the image captured by the movable platform and the motion information of the movable platform when capturing the image to the control device based on different signal transmission channels.
Optionally, the movable platform transmits the image acquired by the movable platform to the control device over a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of image acquisition to the control device over a narrowband wireless transmission channel.
With respect to the image transmission method in the foregoing embodiment, an embodiment of the present application further provides a movable platform, and fig. 9 is a block diagram of a structure of the movable platform in the embodiment of the present application. Referring to fig. 9, the movable platform may include a camera 110, a sensor module 120, and a second processor 130.
The camera 110 is used for acquiring images;
a sensor module 120 for acquiring motion information of the movable platform when acquiring the image;
a second processor 130 electrically connected to the camera 110 and the sensor module 120, respectively, wherein the second processor 130 is configured to perform the following operations: send the image acquired by the camera 110 and the motion information of the movable platform at the time of image acquisition to the control device, so that, when an abnormal area exists in the image received by the control device at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment; the abnormal area comprises a pixel information missing area or a pixel information error area.
Optionally, the motion information of the movable platform includes at least one of position information and attitude information of the camera 110 of the movable platform.
Optionally, the sensor module 120 includes a position sensor and/or an attitude sensor, and the motion information of the movable platform is detected by the position sensor and/or the attitude sensor.
Optionally, the movable platform transmits the image captured by the movable platform and the motion information of the movable platform when capturing the image to the control device based on different signal transmission channels.
Optionally, the movable platform transmits the image acquired by the movable platform to the control device over a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of image acquisition to the control device over a narrowband wireless transmission channel.
The second processor 130 may be a central processing unit (CPU). The second processor 130 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
Furthermore, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the above-mentioned image processing method or the above-mentioned image transmission method.
The computer-readable storage medium may be an internal storage unit, such as a hard disk or a memory, of the control device or the movable platform according to any of the foregoing embodiments. The computer-readable storage medium may also be an external storage device of the control device or the movable platform, such as a plug-in hard disk, a smart media card (SMC), an SD card, or a flash card provided on the device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the control device or the movable platform. The computer-readable storage medium is used for storing the computer program and the other programs and data required by the control device or the movable platform, and may also be used for temporarily storing data that has been output or is to be output.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure presents only some embodiments of the present application and certainly does not limit its scope; equivalent modifications made according to the claims of the present application therefore still fall within the scope of the present application.

Claims (67)

1. An image transmission system, comprising a movable platform and a control device communicatively coupled to the movable platform;
the movable platform is used for sending the image acquired by the movable platform and the motion information of the movable platform during image acquisition to the control device;
if an abnormal area exists in the image received by the control device at the current moment, the control device is used for filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, and the abnormal area comprises a pixel information missing area or a pixel information error area.
2. The image transmission system according to claim 1, wherein the motion information of the movable platform includes at least one of position information and posture information of a photographing device of the movable platform.
3. The image transmission system according to claim 1, wherein the control device is configured to process the image received at the previous time according to the motion information of the movable platform corresponding to the image received at the previous time and the motion information of the movable platform corresponding to the image received at the current time; and filling the abnormal area according to the processed image received at the last moment.
4. The image transmission system according to claim 3, wherein the processing includes at least one of scaling processing, translation processing, and rotation processing.
5. The image transmission system according to claim 3, wherein the control device is configured to determine at least one of a relative positional relationship corresponding to the image received at the previous time and the image received at the current time, and a relative posture relationship corresponding to the image received at the previous time and the image received at the current time, based on the motion information of the movable platform corresponding to the image received at the previous time and the motion information of the movable platform corresponding to the image received at the current time; and processing the image received at the last moment according to at least one of the relative position relation and the relative attitude relation.
6. The image transmission system according to claim 5, wherein the control device is configured to determine a transformation matrix between the image received at the current time and the image received at the previous time according to at least one of the relative positional relationship and the relative attitude relationship; and to process the image received at the previous time according to the transformation matrix.
7. The image transmission system of claim 6, wherein the transformation matrix comprises a scaling matrix, or a rotation matrix, or a translation matrix, or a concatenation of at least two of a scaling matrix, a rotation matrix, and a translation matrix.
8. The image transmission system according to claim 7, wherein the control device is configured to determine a scaling matrix between the image received at the current time and the image received at the previous time when a front-back position movement exists in the relative positional relationship corresponding to the image received at the current time and the image received at the previous time.
9. The image transmission system according to claim 8, wherein the control device is configured to determine the scaling matrix between the image received at the current time and the image received at the previous time according to the distance of the front-back position movement when the front-back position movement exists in the relative positional relationship corresponding to the image received at the current time and the image received at the previous time.
10. The image transmission system according to claim 9, wherein the scaling factor corresponding to the scaling matrix is positively correlated with the distance of the front-back position movement.
11. The image transmission system according to any one of claims 8 to 10, wherein the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform;
the movable platform is also used for sending the depth information corresponding to the image acquired by the movable platform to the control device;
the control device is configured to determine, when the front-back position movement exists in the relative positional relationship corresponding to the image received at the current moment and the image received at the previous moment, a scaling matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment.
12. The image transmission system according to claim 11, wherein the scaling factor corresponding to the scaling matrix corresponds to a ratio of the depth information corresponding to the image received at the current time to the depth information corresponding to the image received at the previous time.
13. The image transmission system according to claim 7, wherein the control device is configured to determine a translation matrix between the image received at the current time and the image received at the previous time when a left-right position movement and/or a vertical position movement exists in a relative positional relationship corresponding to the image received at the current time and the image received at the previous time.
14. The image transmission system according to claim 7, wherein the control device is configured to determine a rotation matrix between the image received at the current time and the image received at the previous time when at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change exists in the relative attitude relationship corresponding to the image received at the current time and the image received at the previous time.
15. The image transmission system according to claim 6, wherein the control device is configured to convert the image received at the previous moment, according to the transformation matrix, to the shooting perspective corresponding to the motion information of the movable platform corresponding to the image received at the current moment.
16. The image transmission system according to claim 3, wherein the control device is configured to determine the area corresponding to the abnormal area in the processed image received at the previous time according to the pixel correspondence between the processed image received at the previous time and the image received at the current time; and to cover the abnormal area with the area corresponding to the abnormal area in the processed image received at the previous time.
17. The image transmission system according to claim 1, wherein the motion information of the movable platform is detected by an attitude sensor of a camera of the movable platform and/or a position sensor of the camera of the movable platform.
18. The image transmission system according to claim 1, wherein the movable platform transmits the image captured by the movable platform and the motion information of the movable platform at the time of capturing the image to the control device based on different signal transmission channels.
19. The image transmission system according to claim 18, wherein the movable platform transmits the image captured by the movable platform to the control device based on a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of capturing the image to the control device based on a narrowband wireless transmission channel.
20. An image processing method, characterized in that the method comprises:
receiving an image collected by a movable platform and motion information of the movable platform when the image is collected, wherein the image is sent by the movable platform;
if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region includes a pixel information missing region or a pixel information error region.
21. The method of claim 20, wherein the motion information of the movable platform comprises at least one of position information and pose information of a camera of the movable platform.
22. The method of claim 20, wherein the filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment comprises:
processing the image received at the last moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment;
and filling the abnormal area according to the processed image received at the last moment.
23. The method of claim 22, wherein the processing comprises at least one of scaling, translation, and rotation.
24. The method of claim 22, wherein the processing the image received at the previous moment according to the motion information of the movable platform corresponding to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the current moment comprises:
determining at least one of a relative position relationship corresponding to the image received at the last moment and the image received at the current moment and a relative posture relationship corresponding to the image received at the last moment and the image received at the current moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment;
and processing the image received at the last moment according to at least one of the relative position relation and the relative attitude relation.
25. The method of claim 24, wherein the processing the image received at the previous time according to at least one of the relative positional relationship and the relative pose relationship comprises:
determining a transformation matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative attitude relationship;
and processing the image received at the previous moment according to the transformation matrix.
26. The method of claim 25, wherein the transformation matrix comprises a scaling matrix, or a rotation matrix, or a translation matrix, or a concatenation of at least two of a scaling matrix, a rotation matrix, and a translation matrix.
27. The method of claim 26, wherein determining a transformation matrix between the image received at the current time and the image received at the previous time according to at least one of the relative positional relationship and the relative pose relationship comprises:
when a front-back position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the previous moment, determining a scaling matrix between the image received at the current moment and the image received at the previous moment.
28. The method of claim 27, wherein determining a scaling matrix between the image received at the current time and the image received at the previous time comprises:
determining a scaling matrix between the image received at the current moment and the image received at the previous moment according to the distance of the front-back position movement.
29. The method of claim 28, wherein the scaling factor corresponding to the scaling matrix is positively correlated with the distance of the front-back position movement.
30. The method according to any one of claims 27 to 29, wherein the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform;
the method further comprises the following steps:
receiving depth information which is sent by the movable platform and corresponds to an image collected by the movable platform;
the determining a scaling matrix between the image received at the current time and the image received at the last time comprises:
determining a scaling matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment.
31. The method of claim 30, wherein the scaling factor corresponding to the scaling matrix corresponds to a ratio of depth information corresponding to the image received at the current time to depth information corresponding to the image received at the previous time.
32. The method of claim 26, wherein determining a transformation matrix between the image received at the current time and the image received at the previous time according to at least one of the relative positional relationship and the relative pose relationship comprises:
when a left-right position movement and/or an up-down position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the previous moment, determining a translation matrix between the image received at the current moment and the image received at the previous moment.
33. The method of claim 26, wherein determining a transformation matrix between the image received at the current time and the image received at the previous time according to at least one of the relative positional relationship and the relative pose relationship comprises:
when at least one of a pitch attitude change, a roll attitude change and a yaw attitude change exists in a relative attitude relationship corresponding to the image received at the current moment and the image received at the last moment, determining a rotation matrix between the image received at the current moment and the image received at the last moment.
34. The method of claim 25, wherein processing the image received at the previous time according to the transformation matrix comprises:
converting the image received at the previous moment, according to the transformation matrix, to the shooting perspective corresponding to the motion information of the movable platform corresponding to the image received at the current moment.
35. The method of claim 22, wherein filling the abnormal region according to the processed image received at the last time comprises:
determining a region corresponding to the abnormal region in the processed image received at the last moment according to the pixel corresponding relation between the processed image received at the last moment and the image received at the current moment;
and covering the abnormal area with the area corresponding to the abnormal area in the processed image received at the previous moment.
36. The method of claim 20, wherein the motion information of the movable platform is detected by an attitude sensor of a camera of the movable platform and/or a position sensor of the camera of the movable platform.
37. The method of claim 20, wherein the movable platform transmits the image captured by the movable platform and the motion information of the movable platform at the time of capturing the image to a control device communicatively coupled to the movable platform based on different signal transmission channels.
38. The method of claim 37, wherein the movable platform transmits the image captured by the movable platform to the control device based on a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of capturing the image to the control device based on a narrowband wireless transmission channel.
39. A control device for a movable platform, the device comprising:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
receiving an image collected by the movable platform and motion information of the movable platform when the image is collected, wherein the image is sent by the movable platform;
if an abnormal area exists in the image received at the current moment, filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region includes a pixel information missing region or a pixel information error region.
40. The control device of claim 39, wherein the motion information of the movable platform comprises at least one of position information and attitude information of a camera of the movable platform.
41. The control device of claim 39, wherein the one or more processors, in filling the abnormal area based on the image received at the previous time and the motion information of the movable platform corresponding to the image received at the previous time, are individually or collectively further configured to:
processing the image received at the last moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment;
and filling the abnormal area according to the processed image received at the last moment.
42. The control device of claim 41, wherein the processing comprises at least one of scaling, translation, and rotation.
43. The control device of claim 41, wherein the one or more processors, when processing the image received at the previous time according to the motion information of the movable platform corresponding to the image received at the previous time and the motion information of the movable platform corresponding to the image received at the current time, are further configured to, individually or collectively:
determining at least one of a relative position relationship corresponding to the image received at the last moment and the image received at the current moment and a relative posture relationship corresponding to the image received at the last moment and the image received at the current moment according to the motion information of the movable platform corresponding to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the current moment;
and processing the image received at the last moment according to at least one of the relative position relation and the relative attitude relation.
44. The control device of claim 43, wherein the one or more processors, when processing the image received at the last instance in accordance with at least one of the relative positional relationship and the relative pose relationship, are further configured, individually or collectively, to:
determining a transformation matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative attitude relationship;
and processing the image received at the previous moment according to the transformation matrix.
45. The control device of claim 44, wherein the transformation matrix comprises a scaling matrix, or a rotation matrix, or a translation matrix, or a concatenation of at least two of a scaling matrix, a rotation matrix, and a translation matrix.
46. The control device of claim 45, wherein the one or more processors, when determining a transformation matrix between the image received at the current time and the image received at the previous time based on at least one of the relative positional relationship and the relative pose relationship, are individually or collectively further configured to:
when a front-back position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the previous moment, determine a scaling matrix between the image received at the current moment and the image received at the previous moment.
47. The control device of claim 46, wherein the one or more processors, when determining a scaling matrix between the image received at the current time and the image received at the previous time, are further configured, individually or collectively, to:
determine a scaling matrix between the image received at the current moment and the image received at the previous moment according to the distance of the front-back position movement.
48. The control device of claim 47, wherein the scaling factor corresponding to the scaling matrix is positively correlated with the distance of the front-back position movement.
49. A control device according to any one of claims 46 to 48, wherein the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform;
the one or more processors, individually or collectively, are further configured to perform operations comprising:
receiving depth information which is sent by the movable platform and corresponds to an image collected by the movable platform;
the one or more processors, when determining a scaling matrix between the image received at the current time and the image received at the previous time, are further configured, individually or collectively, to:
determine a scaling matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment.
50. The control device of claim 49, wherein the scaling factor corresponding to the scaling matrix corresponds to a ratio of depth information corresponding to the image received at the current time to depth information corresponding to the image received at the previous time.
51. The control device of claim 45, wherein the one or more processors, when determining a transformation matrix between the image received at the current time and the image received at the previous time based on at least one of the relative positional relationship and the relative pose relationship, are individually or collectively further configured to:
when a left-right position movement and/or an up-down position movement exists in the relative position relationship corresponding to the image received at the current moment and the image received at the previous moment, determine a translation matrix between the image received at the current moment and the image received at the previous moment.
52. The control device of claim 45, wherein the one or more processors, when determining a transformation matrix between the image received at the current time and the image received at the previous time based on at least one of the relative positional relationship and the relative pose relationship, are individually or collectively further configured to:
when at least one of a pitch attitude change, a roll attitude change and a yaw attitude change exists in a relative attitude relationship corresponding to the image received at the current moment and the image received at the last moment, determining a rotation matrix between the image received at the current moment and the image received at the last moment.
53. The control device of claim 44, wherein the one or more processors, when processing the image received at the previous moment according to the transformation matrix, are individually or collectively further configured to:
convert, according to the transformation matrix, the image received at the previous moment to the shooting perspective corresponding to the motion information of the movable platform for the image received at the current moment.
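The per-pixel meaning of claim 53's conversion step can be pictured with a plain inverse warp. This is a minimal nearest-neighbour sketch under the assumption that the transformation matrix is a 3x3 homogeneous pixel transform; the function name is hypothetical, and image libraries provide production-grade equivalents:

```python
import numpy as np

def warp_to_current_view(prev_img: np.ndarray, T: np.ndarray) -> np.ndarray:
    # Inverse-map every output pixel through T**-1 and sample the previous
    # frame with nearest-neighbour lookup; pixels mapping outside stay zero.
    h, w = prev_img.shape[:2]
    out = np.zeros_like(prev_img)
    T_inv = np.linalg.inv(T)
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    src = T_inv @ pts
    src = src / src[2]  # back to inhomogeneous pixel coordinates
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[ys.ravel()[valid], xs.ravel()[valid]] = prev_img[sy[valid], sx[valid]]
    return out
```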
54. The control device of claim 41, wherein the one or more processors, when filling the abnormal region based on the processed image received at the previous moment, are individually or collectively further configured to:
determine, according to the pixel correspondence between the processed image received at the previous moment and the image received at the current moment, the region in the processed image received at the previous moment that corresponds to the abnormal region;
cover the abnormal region with that corresponding region of the processed image received at the previous moment.
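The covering step of claim 54 reduces, in sketch form, to a masked copy from the warped previous frame into the current frame. The function name and the boolean-mask representation of the abnormal region are assumptions:

```python
import numpy as np

def fill_abnormal_region(current: np.ndarray,
                         warped_prev: np.ndarray,
                         abnormal_mask: np.ndarray) -> np.ndarray:
    # Replace pixels flagged as abnormal (missing or erroneous) in the
    # current frame with the co-located pixels of the warped previous frame.
    out = current.copy()
    out[abnormal_mask] = warped_prev[abnormal_mask]
    return out
```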
55. The control device of claim 39, wherein the motion information of the movable platform is obtained through detection by an attitude sensor and/or a position sensor of the camera of the movable platform.
56. The control device of claim 39, wherein the movable platform transmits the image captured by the movable platform and the motion information of the movable platform at the time of capturing the image to the control device via different signal transmission channels.
57. The control device of claim 56, wherein the movable platform transmits the image captured by the movable platform to the control device via a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of capturing the image to the control device via a narrowband wireless transmission channel.
58. An image transmission method, characterized in that the method comprises:
sending an image acquired by a movable platform, together with the motion information of the movable platform at the time of image acquisition, to a control device communicatively connected to the movable platform, so that when an abnormal region exists in the image received by the control device at the current moment, the control device fills the abnormal region according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal region comprises a pixel information missing region or a pixel information error region.
59. The method of claim 58, wherein the motion information of the movable platform comprises at least one of position information and pose information of a camera of the movable platform.
60. The method of claim 58, wherein the motion information of the movable platform is obtained through detection by an attitude sensor and/or a position sensor of a camera of the movable platform.
61. The method of claim 58, wherein the movable platform transmits the image captured by the movable platform and the motion information of the movable platform at the time of capturing the image to the control device via different signal transmission channels.
62. The method of claim 61, wherein the movable platform transmits the image captured by the movable platform to the control device via a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of capturing the image to the control device via a narrowband wireless transmission channel.
63. A movable platform communicatively connected to a control device, the movable platform comprising:
a shooting device configured to acquire an image;
a sensor module configured to acquire motion information of the movable platform when the image is acquired; and
a processor electrically connected to the shooting device and the sensor module, respectively, the processor being configured to perform the following operations:
sending the image acquired by the shooting device and the motion information of the movable platform at the time of image acquisition to the control device, so that when an abnormal region exists in the image received at the current moment, the control device fills the abnormal region according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
wherein the abnormal region comprises a pixel information missing region or a pixel information error region.
64. The movable platform of claim 63, wherein the motion information of the movable platform comprises at least one of position information and pose information of a camera of the movable platform.
65. The movable platform of claim 63, wherein the sensor module comprises an attitude sensor and/or a position sensor, and the motion information of the movable platform is obtained through detection by the attitude sensor and/or the position sensor.
66. The movable platform of claim 63, wherein the movable platform transmits the image captured by the movable platform and the motion information of the movable platform at the time of capturing the image to the control device via different signal transmission channels.
67. The movable platform of claim 66, wherein the movable platform transmits the image captured by the movable platform to the control device via a broadband wireless transmission channel, and transmits the motion information of the movable platform at the time of capturing the image to the control device via a narrowband wireless transmission channel.
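One way to picture the two-channel scheme of claims 66 and 67 is a receiver that pairs broadband-channel frames with narrowband-channel motion records by capture timestamp. This is a hypothetical sketch; the class, field names and timestamp-keyed pairing are assumptions, not the claimed protocol:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionInfo:
    timestamp: float      # capture time shared with the matching frame
    position: tuple       # (x, y, z) of the camera (assumed fields)
    attitude: tuple       # (pitch, roll, yaw)

class Receiver:
    """Pairs frames from the broadband channel with motion records from
    the narrowband channel by capture timestamp (illustrative only)."""
    def __init__(self) -> None:
        self._motion: dict = {}

    def on_motion(self, info: MotionInfo) -> None:
        # Narrowband channel: small, frequent motion records.
        self._motion[info.timestamp] = info

    def on_frame(self, timestamp: float, frame) -> Optional[MotionInfo]:
        # Broadband channel: return the motion record captured with this
        # frame if it has arrived, else None.
        return self._motion.get(timestamp)
```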
CN201980049933.0A 2019-12-17 2019-12-17 Image transmission system and method, control device and movable platform Pending CN112514363A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/125870 WO2021119982A1 (en) 2019-12-17 2019-12-17 Image transmission system and method, control apparatus, and mobile platform

Publications (1)

Publication Number Publication Date
CN112514363A true CN112514363A (en) 2021-03-16

Family

ID=74924084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980049933.0A Pending CN112514363A (en) 2019-12-17 2019-12-17 Image transmission system and method, control device and movable platform

Country Status (2)

Country Link
CN (1) CN112514363A (en)
WO (1) WO2021119982A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113124835A (en) * 2021-04-22 2021-07-16 广州南方卫星导航仪器有限公司 Multi-lens photogrammetric data processing device for unmanned aerial vehicle
CN115205952A (en) * 2022-09-16 2022-10-18 深圳市企鹅网络科技有限公司 Online learning image acquisition method and system based on deep learning

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060285724A1 (en) * 2005-06-20 2006-12-21 Ying-Li Tian Salient motion detection system, method and program product therefor
CN101188772A (en) * 2006-11-17 2008-05-28 中兴通讯股份有限公司 A method for hiding time domain error in video decoding
CN107274342A (en) * 2017-05-22 2017-10-20 纵目科技(上海)股份有限公司 A kind of underbody blind area fill method and system, storage medium, terminal device
CN108198248A (en) * 2018-01-18 2018-06-22 维森软件技术(上海)有限公司 A kind of vehicle bottom image 3D display method
CN109656260A (en) * 2018-12-03 2019-04-19 北京采立播科技有限公司 A kind of unmanned plane geographic information data acquisition system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101193313A (en) * 2006-11-20 2008-06-04 中兴通讯股份有限公司 A method for hiding video decoding time domain error
CN100531400C (en) * 2007-07-26 2009-08-19 上海交通大学 Video error coverage method based on macro block level and pixel motion estimation
CN107241544B (en) * 2016-03-28 2019-11-26 展讯通信(天津)有限公司 Video image stabilization method, device and camera shooting terminal
CN106023192B (en) * 2016-05-17 2019-04-09 成都通甲优博科技有限责任公司 A kind of time reference real-time calibration method and system of Image-capturing platform
CN106257911A (en) * 2016-05-20 2016-12-28 上海九鹰电子科技有限公司 Image stability method and device for video image
CN106534692A (en) * 2016-11-24 2017-03-22 腾讯科技(深圳)有限公司 Video image stabilization method and device
CN107231526B (en) * 2017-06-09 2020-02-21 联想(北京)有限公司 Image processing method and electronic device
CN107682705B (en) * 2017-09-26 2020-05-12 杭州电子科技大学 Stereo video B frame error concealment method based on MV-HEVC framework
CN110337668B (en) * 2018-04-27 2021-08-31 深圳市大疆创新科技有限公司 Image stability augmentation method and device



Also Published As

Publication number Publication date
WO2021119982A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
KR102457222B1 (en) Mobile robot and method thereof
US20150042800A1 (en) Apparatus and method for providing avm image
US11523085B2 (en) Method, system, device for video data transmission and photographing apparatus
US9418628B2 (en) Displaying image data based on perspective center of primary image
CN112514363A (en) Image transmission system and method, control device and movable platform
US20190003840A1 (en) Map registration point collection with mobile drone
CN112154649A (en) Aerial survey method, shooting control method, aircraft, terminal, system and storage medium
CN110428372B (en) Depth data and 2D laser data fusion method and device and storage medium
JP2014063411A (en) Remote control system, control method, and program
JP2011028495A (en) Remote control apparatus of automatic guided vehicle
US20200007794A1 (en) Image transmission method, apparatus, and device
CN112313596A (en) Inspection method, equipment and storage medium based on aircraft
US11132586B2 (en) Rolling shutter rectification in images/videos using convolutional neural networks with applications to SFM/SLAM with rolling shutter images/videos
CN112204946A (en) Data processing method, device, movable platform and computer readable storage medium
CN108171116B (en) Auxiliary obstacle avoidance method and device for aircraft and auxiliary obstacle avoidance system
WO2020024182A1 (en) Parameter processing method and apparatus, camera device and aircraft
CN111247389A (en) Data processing method and device for shooting equipment and image processing equipment
US20220113720A1 (en) System and method to facilitate remote and accurate maneuvering of unmanned aerial vehicle under communication latency
JP2019046149A (en) Crop cultivation support apparatus
KR102015099B1 (en) Apparatus and method for providing wrap around view monitoring using dis information
CN111492409B (en) Apparatus and method for three-dimensional interaction with augmented reality remote assistance
CN108496351B (en) Unmanned aerial vehicle and control method thereof, control terminal and control method thereof
JP5864371B2 (en) Still image automatic generation system, worker information processing terminal, instructor information processing terminal, and determination device in still image automatic generation system
KR20210106422A (en) Job control system, job control method, device and instrument
US10165173B2 (en) Operating method and apparatus for detachable lens type camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210316)