WO2021119982A1 - Image transmission system and method, control device, and movable platform

Image transmission system and method, control device, and movable platform

Info

Publication number
WO2021119982A1
Authority
WO
WIPO (PCT)
Prior art keywords
image received
movable platform
image
moment
control device
Application number
PCT/CN2019/125870
Other languages
English (en)
French (fr)
Inventor
刘怀宇
陈颖
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980049933.0A (published as CN112514363A)
Priority to PCT/CN2019/125870
Publication of WO2021119982A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Description

  • This application relates to the field of image transmission, in particular to an image transmission system and method, a control device, and a movable platform.
  • When remotely controlling a movable platform, the user needs to obtain the images collected by the movable platform and operate the control device according to the images returned by the movable platform, so as to operate the movable platform safely.
  • The movable platform transmits the collected images to the control device through wireless transmission.
  • However, in wireless image transmission, adverse conditions such as frequent wireless signal interference, occlusion on the transmission path, excessive transmission distance, and antenna pointing deviation may cause interruption, blurring, mosaic and other problems, which greatly affect the remote control experience and may also cause safety hazards.
  • Traditional recovery methods for abnormal wireless image transmission are usually based on analysis of the image information itself: the motion information of objects in the image is estimated based on optical flow, and the erroneous or missing area in the current image is then matched from historical images by an image-segmentation-based method using the estimated object motion.
  • However, the optical-flow-based estimation of the motion information of the objects is often inaccurate, resulting in poor recovery of abnormal wireless image transmission and low reliability.
  • This application provides an image transmission system and method, a control device, and a movable platform to improve the effect and reliability of abnormal recovery of wireless image transmission.
  • According to one aspect, this application provides an image transmission system, including a movable platform and a control device communicatively connected with the movable platform;
  • the movable platform is used to send the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device;
  • when there is an abnormal area in the image received at the current moment, the control device is used to fill in the abnormal area based on the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area includes a pixel information missing area or a pixel information error area.
  • According to another aspect, this application provides an image processing method, including: receiving, from the movable platform, the image collected by the movable platform and the movement information of the movable platform when the image is collected; and, if there is an abnormal area in the image received at the current moment, filling in the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area includes a pixel information missing area or a pixel information error area.
  • According to another aspect, this application provides a control device for a movable platform, including:
  • a storage device for storing program instructions; and
  • one or more processors that call the program instructions stored in the storage device; when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations: receiving, from the movable platform, the image collected by the movable platform and the movement information of the movable platform when the image is collected; and, if there is an abnormal area in the image received at the current moment, filling in the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area includes a pixel information missing area or a pixel information error area.
  • According to another aspect, this application provides an image transmission method, including:
  • sending the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device communicatively connected with the movable platform, so that, when there is an abnormal area in the image received at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, wherein the abnormal area includes a pixel information missing area or a pixel information error area.
  • According to another aspect, this application provides a movable platform communicatively connected with a control device, the movable platform including:
  • a photographing device for collecting images;
  • a sensor module for collecting movement information of the movable platform when the image is collected; and
  • a processor electrically connected to the photographing device and the sensor module respectively, the processor being configured to perform the following operations:
  • sending the image collected by the photographing device and the movement information of the movable platform when the image is collected to the control device, so that, when there is an abnormal area in the image received at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
  • the abnormal area includes a pixel information missing area or a pixel information error area.
  • By filling in the abnormal area in the image received at the current moment with the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, the present application can improve the effect and reliability of abnormal recovery of wireless image transmission, minimize the impact on remote control and user experience when wireless image transmission errors occur, provide more information support and more accurate visual information for remote control, improve control safety, effectiveness and the control experience, and improve the overall visual experience of wireless image transmission.
  • Fig. 1 is a structural block diagram of an image transmission system in an embodiment of the present application
  • FIG. 2 is a schematic diagram of the work flow of the image transmission system in an embodiment of the present application.
  • FIG. 3A is a schematic diagram of the image received at the previous moment and the image received at the current moment by the control device in an embodiment of the present application;
  • FIG. 3B is a schematic diagram of the image obtained by the control device in an embodiment of the present application by converting image (1) of FIG. 3A to the shooting angle of view corresponding to image (2) of FIG. 3A;
  • FIG. 4A is a schematic diagram of the image received at the previous moment and the image received at the current moment by the control device in another embodiment of the present application;
  • FIG. 4B is a schematic diagram of the image obtained by the control device in another embodiment of the present application by converting image (1) of FIG. 4A to the shooting angle of view corresponding to image (2) of FIG. 4A;
  • FIG. 5A is a schematic diagram of the image received at the previous moment and the image received at the current moment by the control device in another embodiment of the present application;
  • FIG. 5B is a schematic diagram of the image obtained by the control device in another embodiment of the present application by converting image (1) of FIG. 5A to the shooting angle of view corresponding to image (2) of FIG. 5A;
  • FIG. 6 is a schematic diagram of the method flow of the image processing method on the control device side in an embodiment of the present application.
  • Fig. 7 is a structural block diagram of a control device in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the method flow of the image transmission method on the side of the movable platform in an embodiment of the present application.
  • Fig. 9 is a structural block diagram of a movable platform in an embodiment of the present application.
  • Traditional recovery methods for abnormal wireless image transmission are usually based on analysis of the image information itself: the motion information of objects in the image is estimated based on optical flow, and the erroneous or missing area in the current image is then matched from historical images by an image-segmentation-based method using the estimated object motion.
  • However, the optical-flow-based estimation of the motion information of the objects is often inaccurate, resulting in poor recovery of abnormal wireless image transmission and low reliability.
  • In view of this, this application uses the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment to fill in the abnormal area in the image received at the current moment, which can improve the effect and reliability of abnormal recovery of wireless image transmission, minimize the impact on remote control and user experience when wireless image transmission errors occur, provide more information support and more accurate visual information for remote control, improve control safety, effectiveness and the control experience, and improve the overall visual experience of wireless image transmission.
  • FIG. 1 is a structural block diagram of an image transmission system in an embodiment of the present application. Referring to FIG. 1, an embodiment of the present application provides an image transmission system, which may include a movable platform 100 and a control device 200 communicatively connected with the movable platform 100.
  • the movable platform 100 may be a drone, an unmanned vehicle, a mobile robot, a handheld device, or other types of movable platforms.
  • The control device 200 may be a combination of a remote control and a mobile terminal (such as a mobile phone or a tablet computer), a remote control with a screen, a PC ground station, a combination of a remote control and VR glasses, a combination of a PC ground station and VR glasses, or the like.
  • When the control device 200 is a combination of a remote control and a mobile terminal, the remote control can be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of the photographing device of the movable platform 100; the mobile terminal is used to display the image collected by the movable platform 100, and the user can operate the remote control according to the displayed image to realize control of the movable platform 100.
  • When the control device 200 is a remote control with a screen or a PC ground station, the remote control or PC ground station can be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of the photographing device of the movable platform 100; the remote control or PC ground station can also display the images collected by the movable platform 100, and the user can operate the remote control or PC ground station according to the displayed image to realize control of the movable platform 100.
  • When the control device 200 is a combination of a remote control and VR glasses, the remote control can be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of the photographing device of the movable platform 100; the VR glasses can display the images collected by the movable platform 100, and the user can operate the remote control according to the displayed images to realize control of the movable platform 100.
  • When the control device 200 is a combination of a PC ground station and VR glasses, the PC ground station can be used to remotely control the movable platform 100 to change the position and/or posture of the movable platform 100 and/or to change the position and/or posture of the photographing device of the movable platform 100; the VR glasses can display the images collected by the movable platform 100, and the user can operate the PC ground station according to the displayed images to realize control of the movable platform 100.
  • The movable platform 100 and the control device 200 are communicatively connected based on a wireless communication method, which may be Wi-Fi, Lightbridge, OcuSync, or another wireless communication method; this application does not specifically limit the wireless communication method between the movable platform 100 and the control device 200.
  • Fig. 1 is a structural block diagram of an image transmission system in an embodiment of the present application
  • Fig. 2 is a schematic diagram of a work flow of an image transmission system in an embodiment of the present application
  • Referring to FIG. 1 and FIG. 2, the image collected by the movable platform 100 and the movement information of the movable platform 100 when the image is collected are sent to the control device 200. If there is an abnormal area in the image received by the control device 200 at the current moment, the control device 200 fills in the abnormal area according to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment, where the abnormal area includes a pixel information missing area or a pixel information error area.
  • In this way, the control device 200 of the present application uses the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment to fill in the abnormal area in the image received at the current moment, which can improve the effect and reliability of abnormal recovery of wireless image transmission.
  • the movable platform 100 may include a photographing device, and the photographing device is used to collect images.
  • the photographing device may be an image sensor, or an integrated photographing device such as a camera and a video camera.
  • the movement information of the movable platform 100 may include at least one of position information and posture information of the photographing device of the movable platform 100.
  • the photographing device may include a sensor module for acquiring at least one of position information and posture information of the photographing device.
  • In some embodiments, the frequency at which the sensor module acquires at least one of the position information and the posture information of the photographing device is the same as the frequency at which the photographing device collects images, that is, each time the photographing device collects a frame of image, the sensor module also acquires at least one of the position information and the posture information of the photographing device.
  • In other embodiments, the frequency at which the sensor module acquires at least one of the position information and the posture information of the photographing device may also be different from the frequency at which the photographing device collects images; for example, the frequency at which the sensor module acquires at least one of the position information and the posture information of the photographing device is greater than the frequency at which the photographing device collects images.
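  • Where the sampling rates differ, each image frame can be associated with the motion information acquired closest to its collection time. The sketch below shows one possible nearest-timestamp pairing step; the names MotionSample and match_motion_to_frame, and the field layout, are illustrative assumptions and not part of this application.

```python
# Hedged sketch: pairing each video frame with the nearest motion sample when the
# sensor module samples pose faster than the camera captures frames.
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class MotionSample:
    t: float                                  # timestamp in seconds
    position: Tuple[float, float, float]      # x, y, z of the photographing device
    attitude: Tuple[float, float, float]      # pitch, yaw, roll in radians

def match_motion_to_frame(frame_t: float,
                          samples: List[MotionSample]) -> Optional[MotionSample]:
    """Return the motion sample whose timestamp is closest to the frame timestamp."""
    if not samples:
        return None
    times = [s.t for s in samples]            # assumed sorted by time
    i = bisect_left(times, frame_t)
    candidates = samples[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - frame_t))
```
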
  • the sensor module may include an attitude sensor and/or a position sensor.
  • the attitude sensor may be an inertial measurement unit (IMU), and the inertial measurement unit may be used to detect the attitude information of the camera.
  • the position sensor may be a global positioning system (GPS) positioning device, a real-time kinematic (RTK) carrier phase differential positioning device (RTK positioning device for short), etc., and can be used to detect the location information of the camera.
  • In some embodiments, when the movable platform 100 is a drone, the position information of the camera can be determined based on the position information of the drone. For example, a position sensor is provided in the drone body; based on the position information of the drone acquired by the position sensor and the relative position relationship between the drone and the camera, the position information of the camera can be determined.
  • In some embodiments, the movable platform 100 transmits the image collected by the movable platform 100 and the movement information of the movable platform 100 when the image is collected to the control device 200 based on different signal transmission channels. For example, the movable platform 100 transmits the image collected by the movable platform 100 to the control device 200 based on a broadband wireless transmission channel, and transmits the movement information of the movable platform 100 when the image is collected to the control device 200 based on a narrowband wireless transmission channel.
  • The data volume of the image is relatively large, and transmission based on the broadband wireless transmission channel can ensure that the image data is completely transmitted to the control device 200; the data volume of the motion information is relatively small, and although the bandwidth of the narrowband wireless transmission channel is small, its reliability is high, which ensures that the motion information is accurately transmitted to the control device 200.
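  • As a non-authoritative sketch of this split, the example below places a large image payload and a small motion-information payload on two separate logical channels; the queues, field names, and frame_id pairing are illustrative assumptions, not details taken from this application.

```python
# Hedged sketch: separating the large image payload from the small motion payload,
# mirroring the broadband / narrowband split described above. The two queues stand in
# for the two wireless transmission channels; they are illustrative only.
import json
import queue

broadband_channel: "queue.Queue[bytes]" = queue.Queue()   # image data (large)
narrowband_channel: "queue.Queue[bytes]" = queue.Queue()  # motion info (small, reliable)

def send_frame(frame_id: int, jpeg_bytes: bytes, position, attitude) -> None:
    """Queue the encoded image and its motion information on separate channels."""
    broadband_channel.put(frame_id.to_bytes(4, "big") + jpeg_bytes)
    meta = {"frame_id": frame_id, "position": list(position), "attitude": list(attitude)}
    narrowband_channel.put(json.dumps(meta).encode("utf-8"))
```
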
  • When there is an abnormal area in the image received at the current moment, the process by which the control device 200 fills the abnormal area using the image received at the previous moment and the movement information of the movable platform 100 corresponding to the image received at the previous moment can be performed in real time; that is, while the control device 200 controls the movement of the movable platform 100, the control device 200 fills the abnormal area in the image in real time, so as to recover from errors in the wireless image transmission in a timely manner.
  • Alternatively, the process by which the control device 200 fills the abnormal area in the image received at the current moment based on the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment can also be performed offline.
  • The control device 200 of the embodiment of the present application correspondingly stores the received image and motion information, and displays the received image or the filled image in real time, which is convenient for the user to control the movement of the movable platform 100 according to the real-time displayed image, thereby improving the safety of the movement of the movable platform 100.
  • The abnormal area may also be another area indicating abnormal pixel information in the image, and is not limited to the pixel information missing area and the pixel information error area listed above.
  • The process by which the control device 200 fills the abnormal area according to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment will be described in detail below.
  • In a specific implementation, the control device 200 may fill in the abnormal area through the following steps: (1) processing the image received at the previous moment according to the motion information of the movable platform 100 corresponding to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the current moment; and (2) filling the abnormal area according to the processed image received at the previous moment.
  • the processing in the above steps may include at least one of zoom processing, translation processing, and rotation processing.
  • The processing method is determined based on the movement information of the movable platform 100 corresponding to the image received at the previous moment and the movement information of the movable platform 100 corresponding to the image received at the current moment.
  • When the movable platform 100 moves, the objects in the images collected by the camera of the movable platform 100 change accordingly.
  • When the position of the camera changes, the objects in the image change as follows: if the camera of the movable platform 100 moves forward, the objects in the image become larger; if the camera of the movable platform 100 moves backward, the objects in the image become smaller. If the camera of the movable platform 100 moves to the left, the objects in the image move to the right; if the camera of the movable platform 100 moves to the right, the objects in the image move to the left. If the camera of the movable platform 100 moves upward, the objects in the image move downward; if the camera of the movable platform 100 moves downward, the objects in the image move upward.
  • When the posture of the camera changes, the objects in the image change as follows: if the camera of the movable platform 100 rotates upward in the pitch direction, the objects in the image rotate downward; if the camera of the movable platform 100 rotates downward in the pitch direction, the objects in the image rotate upward. If the camera of the movable platform 100 rotates to the left in the yaw direction, the objects in the image rotate to the right; if the camera of the movable platform 100 rotates to the right in the yaw direction, the objects in the image rotate to the left. If the camera of the movable platform 100 rotates to the left in the roll direction, the objects in the image rotate to the right; if the camera of the movable platform 100 rotates to the right in the roll direction, the objects in the image rotate to the left.
  • Here, the front-and-back direction of the camera is parallel to the optical axis of the lens, the left-and-right direction of the camera is perpendicular to the optical axis of the lens, and the up-and-down direction of the camera is parallel to the direction of gravity.
  • Between the two images collected by the camera of the movable platform 100, the camera may have a position change and/or a posture change.
  • Therefore, in step (1), based on the movement information of the movable platform 100 corresponding to the image received at the previous moment and the movement information of the movable platform 100 corresponding to the image received at the current moment, it can be determined whether a position change, a posture change, or both occurred between the two collected images, and the image received at the previous moment is then processed according to the determined position change and/or posture change.
  • In some embodiments, the control device 200 is configured to determine, according to the motion information of the movable platform 100 corresponding to the image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the current moment, at least one of the relative position relationship between the image received at the previous moment and the image received at the current moment (that is, the position change of the photographing device) and the relative posture relationship between the image received at the previous moment and the image received at the current moment (that is, the posture change of the photographing device), and to process the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship.
  • In a specific implementation, the control device 200 may process the image received at the previous moment by: determining a conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship; and processing the image received at the previous moment according to the conversion matrix.
  • the conversion matrix may include a scaling matrix, or a rotation matrix, or a translation matrix, or a cascade of at least two of the scaling matrix, the rotation matrix and the translation matrix.
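  • Before any of these matrices can be built, the relative position change and attitude change between the two frames are derived from the two motion-information samples. A minimal sketch is given below; the pose fields, axis conventions, and angle wrapping are assumptions for illustration only.

```python
# Hedged sketch: deriving the relative position change and attitude change between the
# motion information attached to the previous frame and to the current frame.
import numpy as np

def relative_motion(prev_pos, prev_att, curr_pos, curr_att):
    """Return (dx, dy, dz) position change and (dpitch, dyaw, droll) attitude change."""
    d_pos = np.asarray(curr_pos, dtype=float) - np.asarray(prev_pos, dtype=float)
    d_att = np.asarray(curr_att, dtype=float) - np.asarray(prev_att, dtype=float)
    # Wrap attitude differences to (-pi, pi] so a small rotation never looks like a full turn.
    d_att = (d_att + np.pi) % (2.0 * np.pi) - np.pi
    return d_pos, d_att
```
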
  • The manner in which the control device 200 determines the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship is described below.
  • In some embodiments, the control device 200 is configured to determine the zoom matrix between the image received at the current moment and the image received at the previous moment when the relative position relationship between the image received at the current moment and the image received at the previous moment includes a front-and-back position movement. When the camera moves forward or backward, the objects in the images collected by the camera become larger or smaller; therefore, when the relative position relationship between the image received at the current moment and the image received at the previous moment is a front-and-back position movement, there is a scaling transformation relationship between the image received at the current moment and the image received at the previous moment.
  • The control device 200 may adopt different strategies to determine the zoom matrix between the image received at the current moment and the image received at the previous moment; two of them are described below.
  • In one strategy, when the relative position relationship between the image received at the current moment and the image received at the previous moment includes a front-and-back position movement, the control device 200 determines the zoom matrix between the image received at the current moment and the image received at the previous moment according to the distance of the front-and-back position movement. The zoom factor corresponding to the zoom matrix is positively correlated with the distance of the front-and-back position movement, that is, the greater the distance of the front-and-back position movement, the larger the zoom factor corresponding to the zoom matrix.
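  • The sketch below illustrates one way such a zoom matrix could be formed: a 3x3 scaling about the image centre whose factor grows with the forward distance moved. The gain k is an assumed tuning parameter and is not specified by this application.

```python
# Hedged sketch: a 3x3 scaling matrix about the image centre whose zoom factor grows
# with the forward distance moved (a backward move gives a factor below 1).
import numpy as np

def scaling_matrix(forward_distance_m: float, width: int, height: int,
                   k: float = 0.05) -> np.ndarray:
    """Zoom about the image centre; moving forward enlarges objects, moving back shrinks them."""
    s = 1.0 + k * forward_distance_m          # positively correlated with the distance moved
    cx, cy = width / 2.0, height / 2.0
    to_origin = np.array([[1, 0, -cx], [0, 1, -cy], [0, 0, 1]], dtype=float)
    scale = np.array([[s, 0, 0], [0, s, 0], [0, 0, 1]], dtype=float)
    back = np.array([[1, 0, cx], [0, 1, cy], [0, 0, 1]], dtype=float)
    return back @ scale @ to_origin
```
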
  • In another strategy, the movable platform 100 is provided with a distance detection sensor, which is used to detect the depth information of objects around the movable platform 100.
  • This application does not specifically limit the type of distance detection sensor, and any distance detection sensor that can detect the depth information of objects around the movable platform 100 belongs to the protection scope of this application.
  • The movable platform 100 of this embodiment is also used to send the depth information corresponding to the image collected by the movable platform 100 to the control device 200. Specifically, while collecting the image, the movable platform 100 acquires the depth information of the objects around the movable platform 100 through the distance detection sensor, determines the depth information corresponding to the image according to the depth information of the objects around the movable platform 100, and sends the depth information corresponding to the image to the control device 200.
  • the movable platform 100 may transmit the images collected by the movable platform 100 and the depth information corresponding to the images collected by the movable platform 100 to the control device 200 based on different signal transmission channels.
  • the movable platform 100 can transmit the depth information corresponding to the image collected by the movable platform 100 and the movement information of the movable platform 100 when the image is collected to the control device 200 based on the same signal transmission channel.
  • For example, the movable platform 100 transmits the depth information corresponding to the image collected by the movable platform 100 and the movement information of the movable platform 100 when the image is collected to the control device 200 based on the narrowband wireless transmission channel.
  • The control device 200 is configured to determine, when the relative position relationship between the image received at the current moment and the image received at the previous moment includes a front-and-back position movement, the zoom matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment. Determining the zoom matrix based on the depth information enables objects of different depths in the image to be zoomed to different degrees, which is closer to the actual visual perception and further improves the effect of abnormal recovery of wireless image transmission.
  • In some embodiments, the zoom factor corresponding to the zoom matrix corresponds to the ratio of the depth information corresponding to the image received at the current moment to the depth information corresponding to the image received at the previous moment.
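  • Under a simple pinhole model, the sketch below derives a depth-dependent zoom factor from the two depth maps. Whether the ratio is taken as previous-to-current or its inverse depends on which image is warped towards which viewpoint; this is an illustrative assumption, not a formula quoted from this application.

```python
# Hedged sketch: a depth-dependent zoom factor under a pinhole model. Warping the previous
# frame towards the current viewpoint, an object seen at depth d_prev that is now at depth
# d_curr changes apparent size by roughly d_prev / d_curr.
import numpy as np

def depth_zoom_factor(depth_prev: np.ndarray, depth_curr: np.ndarray) -> np.ndarray:
    """Per-pixel (or per-region) zoom factor from the two depth maps."""
    eps = 1e-6                                    # avoid division by zero
    return np.asarray(depth_prev, float) / (np.asarray(depth_curr, float) + eps)
```
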
  • In some embodiments, the control device 200 is configured to determine the translation matrix between the image received at the current moment and the image received at the previous moment when the relative position relationship between the image received at the current moment and the image received at the previous moment includes a left-and-right position movement and/or an up-and-down position movement. When the camera moves left or right, the objects in the images collected by the camera move to the right or to the left; therefore, when the relative position relationship between the image received at the current moment and the image received at the previous moment includes a left-and-right position movement, there is a translation transformation relationship between the image received at the current moment and the image received at the previous moment.
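  • The sketch below shows one way such a translation matrix could be formed under a pinhole approximation, where the pixel shift is roughly the focal length in pixels times the metric shift divided by the scene depth. The sign conventions and the use of a single representative depth are assumptions for illustration.

```python
# Hedged sketch: a 3x3 translation matrix for a sideways / vertical position change of the
# camera, following the direction convention described above (camera right -> image content
# shifts left; camera up -> image content shifts down, with the image y axis growing downward).
import numpy as np

def translation_matrix(dx_m: float, dy_m: float, focal_px: float, depth_m: float) -> np.ndarray:
    """Translate the previous image so it lines up with the current viewpoint.

    dx_m: camera movement to the right (metres); dy_m: camera movement upward (metres).
    """
    tx = -focal_px * dx_m / depth_m   # camera moves right -> content shifts left
    ty = focal_px * dy_m / depth_m    # camera moves up    -> content shifts down
    return np.array([[1, 0, tx],
                     [0, 1, ty],
                     [0, 0, 1]], dtype=float)
```
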
  • In some embodiments, the control device 200 is configured to determine the rotation matrix between the image received at the current moment and the image received at the previous moment when the relative posture relationship between the image received at the current moment and the image received at the previous moment includes at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change. When the camera has at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change, the objects in the images collected by the camera also rotate around the corresponding axis; therefore, when the relative attitude relationship between the image received at the current moment and the image received at the previous moment includes at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change, there is a rotation transformation relationship between the image received at the current moment and the image received at the previous moment.
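  • For a pure attitude change, one standard way to express the corresponding conversion is the rotation homography H = K * R * K^-1, with K the camera intrinsic matrix. The sketch below builds R from the pitch, yaw, and roll changes; the Euler order, axis conventions, and rotation direction are assumptions for illustration.

```python
# Hedged sketch: rotation homography for a pure attitude change. Depending on the direction
# convention of the attitude change, R or its transpose may be required.
import numpy as np

def rotation_homography(d_pitch: float, d_yaw: float, d_roll: float, K: np.ndarray) -> np.ndarray:
    cx, sx = np.cos(d_pitch), np.sin(d_pitch)   # rotation about the camera x axis (pitch)
    cy, sy = np.cos(d_yaw),   np.sin(d_yaw)     # rotation about the camera y axis (yaw)
    cz, sz = np.cos(d_roll),  np.sin(d_roll)    # rotation about the optical z axis (roll)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx                            # one possible Euler composition order
    return K @ R @ np.linalg.inv(K)
```
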
  • After determining the conversion matrix, the control device 200 may process the image received at the previous moment according to the conversion matrix.
  • In some embodiments, the control device 200 is configured to convert, according to the conversion matrix, the image received at the previous moment to the shooting angle of view corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment.
  • For example, the control device 200 is configured to perform zoom processing on the image received at the previous moment according to the zoom matrix, so as to convert the image received at the previous moment to the shooting angle of view corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment.
  • Taking FIG. 3A and FIG. 3B as an example, image (1) of FIG. 3A is the image received at the previous moment, and image (2) of FIG. 3A is the image received at the current moment; there is an abnormal area 31 in image (2) of FIG. 3A. The relative position relationship (that is, the position change of the camera) corresponding to image (2) of FIG. 3A and image (1) of FIG. 3A is a backward movement; therefore, the objects in image (2) of FIG. 3A become smaller relative to image (1) of FIG. 3A. The zoom matrix between image (2) of FIG. 3A and image (1) of FIG. 3A can be determined according to the distance moved backward, and image (1) of FIG. 3A is then reduced according to the determined zoom matrix and converted to the shooting angle of view corresponding to image (2) of FIG. 3A, obtaining the image shown in FIG. 3B; that is, FIG. 3B is the processed image (1) of FIG. 3A.
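  • Applying the conversion matrix to the previous frame, as in the FIG. 3A/3B example, can be sketched as a single perspective warp. OpenCV is used here only as one convenient way to apply a 3x3 matrix to an image and is not prescribed by this application.

```python
# Hedged sketch: expressing the previous frame under the shooting angle of view of the
# current frame by applying the 3x3 conversion matrix.
import cv2
import numpy as np

def warp_to_current_view(prev_image: np.ndarray, conversion: np.ndarray) -> np.ndarray:
    h, w = prev_image.shape[:2]
    # Pixels that fall outside the previous frame are left black; they simply cannot
    # contribute to the filling step.
    return cv2.warpPerspective(prev_image, conversion, (w, h))
```
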
  • For another example, the control device 200 is configured to perform translation processing on the image received at the previous moment according to the translation matrix, so as to convert the image received at the previous moment to the shooting angle of view corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment.
  • Taking FIG. 4A and FIG. 4B as an example, image (1) of FIG. 4A is the image received at the previous moment, and image (2) of FIG. 4A is the image received at the current moment; there is an abnormal area 41 in image (2) of FIG. 4A. The relative position relationship (that is, the position change of the camera) corresponding to image (2) of FIG. 4A and image (1) of FIG. 4A is a leftward movement; therefore, the objects in image (2) of FIG. 4A move to the right relative to image (1) of FIG. 4A. The translation matrix between image (2) of FIG. 4A and image (1) of FIG. 4A can be determined according to the distance of the leftward movement, and image (1) of FIG. 4A is then translated to the right according to the determined translation matrix and converted to the shooting angle of view corresponding to image (2) of FIG. 4A, obtaining the image shown in FIG. 4B; that is, FIG. 4B is the processed image (1) of FIG. 4A.
  • For another example, the control device 200 is configured to perform rotation processing on the image received at the previous moment according to the rotation matrix, so as to convert the image received at the previous moment to the shooting angle of view corresponding to the motion information of the movable platform 100 corresponding to the image received at the current moment.
  • Taking FIG. 5A and FIG. 5B as an example, image (1) of FIG. 5A is the image received at the previous moment, and image (2) of FIG. 5A is the image received at the current moment; there is an abnormal area 51 in image (2) of FIG. 5A. The relative posture relationship (that is, the posture change of the camera) corresponding to image (2) of FIG. 5A and image (1) of FIG. 5A is a leftward rotation (a roll attitude change); therefore, the objects in image (2) of FIG. 5A rotate to the right relative to image (1) of FIG. 5A. The rotation matrix between image (2) of FIG. 5A and image (1) of FIG. 5A can be determined according to the roll attitude change, and image (1) of FIG. 5A is then rotated to the right according to the determined rotation matrix and converted to the shooting angle of view corresponding to image (2) of FIG. 5A, obtaining the image shown in FIG. 5B; that is, FIG. 5B is the processed image (1) of FIG. 5A.
  • In some embodiments, the control device 200 is configured to determine, according to the correspondence between the pixels in the processed image received at the previous moment and the pixels in the image received at the current moment, the area in the processed image received at the previous moment that corresponds to the abnormal area, and to overlay that area onto the abnormal area.
  • Taking FIG. 3A and FIG. 3B as an example, the area 32 in FIG. 3B is the area corresponding to the abnormal area 31, and covering the area 32 onto the abnormal area 31 completes the operation of filling the abnormal area. Taking FIG. 4A and FIG. 4B as an example, the area 42 in FIG. 4B is the area corresponding to the abnormal area 41, and covering the area 42 onto the abnormal area 41 completes the operation of filling the abnormal area. Taking FIG. 5A and FIG. 5B as an example, the area 52 in FIG. 5B is the area corresponding to the abnormal area 51, and covering the area 52 onto the abnormal area 51 completes the operation of filling the abnormal area.
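  • The covering step itself can be sketched as a masked copy: only the pixels marked as abnormal in the current frame are replaced by the corresponding pixels of the processed previous frame. How the abnormal area is detected is outside the scope of this sketch.

```python
# Hedged sketch: covering the abnormal area with the corresponding pixels of the processed
# previous frame. `abnormal_mask` marks missing or erroneous pixels in the current frame.
import numpy as np

def fill_abnormal_area(current: np.ndarray, warped_prev: np.ndarray,
                       abnormal_mask: np.ndarray) -> np.ndarray:
    """Replace only the abnormal pixels, leaving correctly received pixels untouched."""
    mask = abnormal_mask.astype(bool)
    if current.ndim == 3:                 # broadcast a HxW mask over the colour channels
        mask = mask[..., None]
    return np.where(mask, warped_prev, current)
```
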
  • In some embodiments, the control device 200 may first eliminate the lens distortion of the image received at the previous moment and the image received at the current moment according to the lens parameters, and then use the distortion-corrected image received at the previous moment and the motion information of the movable platform 100 corresponding to the image received at the previous moment to fill in the abnormal area in the distortion-corrected image received at the current moment, so as to improve the image display effect.
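  • A minimal sketch of that pre-processing step is shown below, assuming the lens parameters (intrinsic matrix and distortion coefficients) are known from a prior calibration; the use of OpenCV's undistortion routine is an illustrative choice, not one prescribed by this application.

```python
# Hedged sketch: removing lens distortion from both frames before the matrix-based
# processing, so the zoom / translation / rotation model operates on rectified images.
import cv2
import numpy as np

def undistort_pair(prev_image: np.ndarray, curr_image: np.ndarray,
                   K: np.ndarray, dist_coeffs: np.ndarray):
    # K and dist_coeffs are assumed to come from an earlier camera calibration.
    return cv2.undistort(prev_image, K, dist_coeffs), cv2.undistort(curr_image, K, dist_coeffs)
```
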
  • FIG. 6 is a schematic diagram of the method flow of the image processing method on the control device side in an embodiment of the present application; please refer to FIG. 6, the image processing method of the embodiment of the present application may include the following steps:
  • S601: Receive, from the movable platform, the image collected by the movable platform and the movement information of the movable platform when the image is collected;
  • S602: If there is an abnormal area in the image received at the current moment, fill in the abnormal area based on the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, where the abnormal area includes a pixel information missing area or a pixel information error area.
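  • Putting S601 and S602 together on the control-device side, the sketch below cascades the zoom, rotation, and translation matrices built as above, warps the previous frame to the current shooting angle of view, and covers the abnormal pixels. The cascade order and the function signature are illustrative assumptions; the description above allows a single matrix or a cascade of at least two of them.

```python
# Hedged sketch of the control-device-side pipeline (S601/S602): cascade the zoom (S),
# rotation (R) and translation (T) matrices, warp the previous frame to the current
# viewpoint, then cover the abnormal pixels.
import cv2
import numpy as np

def recover_current_frame(curr: np.ndarray, prev: np.ndarray, abnormal_mask: np.ndarray,
                          S: np.ndarray, R: np.ndarray, T: np.ndarray) -> np.ndarray:
    H = T @ R @ S                                  # one possible combined 3x3 conversion matrix
    h, w = curr.shape[:2]
    warped_prev = cv2.warpPerspective(prev, H, (w, h))
    mask = abnormal_mask.astype(bool)
    if curr.ndim == 3:                             # broadcast HxW mask over colour channels
        mask = mask[..., None]
    return np.where(mask, warped_prev, curr)
```
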
  • the movement information of the movable platform includes at least one of position information and posture information of the camera of the movable platform.
  • filling the abnormal area according to the image received at the last moment and the motion information of the movable platform corresponding to the image received at the last moment includes: according to the motion information of the movable platform corresponding to the image received at the last moment and the current The motion information of the movable platform corresponding to the image received at any time is processed for the image received at the last time; the abnormal area is filled according to the image received at the last time after the processing.
  • the processing includes at least one of zoom processing, translation processing, and rotation processing.
  • In some embodiments, processing the image received at the previous moment includes: determining, according to the movement information of the movable platform corresponding to the image received at the previous moment and the movement information of the movable platform corresponding to the image received at the current moment, at least one of the relative position relationship between the image received at the previous moment and the image received at the current moment and the relative posture relationship between the image received at the previous moment and the image received at the current moment; and processing the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship.
  • In some embodiments, processing the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship includes: determining the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship; and processing the image received at the previous moment according to the conversion matrix.
  • the conversion matrix includes a scaling matrix, or a rotation matrix, or a translation matrix, or a cascade of at least two of the scaling matrix, the rotation matrix and the translation matrix.
  • In some embodiments, determining the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship includes: if the relative position relationship between the image received at the current moment and the image received at the previous moment includes a front-and-back position movement, determining the zoom matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, determining the zoom matrix between the image received at the current moment and the image received at the previous moment includes: determining the zoom matrix between the image received at the current moment and the image received at the previous moment according to the distance of the front-and-back position movement.
  • the zoom factor corresponding to the zoom matrix is positively correlated with the distance moved by the front and rear positions.
  • In some embodiments, the movable platform is provided with a distance detection sensor for detecting depth information of objects around the movable platform; the image processing method further includes: receiving the depth information, sent by the movable platform, corresponding to the image collected by the movable platform; and determining the zoom matrix between the image received at the current moment and the image received at the previous moment includes: determining the zoom matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment.
  • the zoom factor corresponding to the zoom matrix corresponds to the ratio of the depth information corresponding to the image received at the current moment to the depth information corresponding to the image received at the previous moment.
  • In some embodiments, determining the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship includes: if the relative position relationship between the image received at the current moment and the image received at the previous moment includes a left-and-right position movement and/or an up-and-down position movement, determining the translation matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, determining the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship includes: if the relative posture relationship between the image received at the current moment and the image received at the previous moment includes at least one of a pitch posture change, a roll posture change, and a yaw posture change, determining the rotation matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, processing the image received at the previous moment according to the conversion matrix includes: converting, according to the conversion matrix, the image received at the previous moment to the shooting angle of view corresponding to the movement information of the movable platform corresponding to the image received at the current moment.
  • In some embodiments, filling the abnormal area according to the processed image received at the previous moment includes: determining, according to the pixel correspondence between the processed image received at the previous moment and the image received at the current moment, the area in the processed image received at the previous moment that corresponds to the abnormal area; and overlaying that area onto the abnormal area.
  • the movement information of the movable platform is obtained based on detection by the posture sensor of the camera of the movable platform and/or the position sensor of the camera of the movable platform.
  • the movable platform transmits the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device based on different signal transmission channels.
  • the movable platform transmits the image collected by the movable platform to the control device based on the broadband wireless transmission channel, and transmits the movement information of the movable platform when the image is collected to the control device based on the narrowband wireless transmission channel.
  • FIG. 7 is a structural block diagram of the control device in an embodiment of the present application. Referring to FIG. 7, the control device may include a storage device 210 and one or more first processors 220.
  • The storage device 210 is used to store program instructions; the one or more first processors 220 call the program instructions stored in the storage device 210, and when the program instructions are executed, the one or more first processors 220 are individually or collectively configured to perform the following operations: receiving, from the movable platform, the image collected by the movable platform and the movement information of the movable platform when the image is collected; and, if there is an abnormal area in the image received at the current moment, filling in the abnormal area based on the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment;
  • where the abnormal area includes a pixel information missing area or a pixel information error area.
  • the movement information of the movable platform includes at least one of position information and posture information of the camera of the movable platform.
  • In some embodiments, when filling the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operations: processing the image received at the previous moment according to the movement information of the movable platform corresponding to the image received at the previous moment and the movement information of the movable platform corresponding to the image received at the current moment; and filling in the abnormal area according to the processed image received at the previous moment.
  • the processing includes at least one of zoom processing, translation processing, and rotation processing.
  • In some embodiments, when processing the image received at the previous moment according to the movement information of the movable platform corresponding to the image received at the previous moment and the movement information of the movable platform corresponding to the image received at the current moment, the one or more first processors 220 are individually or collectively further configured to perform the following operations: determining, according to the movement information of the movable platform corresponding to the image received at the previous moment and the movement information of the movable platform corresponding to the image received at the current moment, at least one of the relative position relationship between the image received at the previous moment and the image received at the current moment and the relative posture relationship between the image received at the previous moment and the image received at the current moment; and processing the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship.
  • In some embodiments, when processing the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship, the one or more first processors 220 are individually or collectively further configured to perform the following operations: determining the conversion matrix between the image received at the current moment and the image received at the previous moment according to at least one of the relative position relationship and the relative posture relationship; and processing the image received at the previous moment according to the conversion matrix.
  • the conversion matrix includes a scaling matrix, or a rotation matrix, or a translation matrix, or a cascade of at least two of the scaling matrix, the rotation matrix and the translation matrix.
  • In some embodiments, when determining the conversion matrix between the image received at the current moment and the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operation: if the relative position relationship between the image received at the current moment and the image received at the previous moment includes a front-and-back position movement, determining the zoom matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, when determining the zoom matrix between the image received at the current moment and the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operation: determining the zoom matrix between the image received at the current moment and the image received at the previous moment according to the distance of the front-and-back position movement.
  • the zoom factor corresponding to the zoom matrix is positively correlated with the distance moved by the front and rear positions.
  • In some embodiments, the movable platform is provided with a distance detection sensor for detecting the depth information of objects around the movable platform; the one or more first processors 220 are individually or collectively further configured to perform the following operation: receiving the depth information, sent by the movable platform, corresponding to the image collected by the movable platform. When determining the zoom matrix between the image received at the current moment and the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operation: determining the zoom matrix between the image received at the current moment and the image received at the previous moment according to the depth information corresponding to the image received at the current moment and the depth information corresponding to the image received at the previous moment.
  • the zoom factor corresponding to the zoom matrix corresponds to the ratio of the depth information corresponding to the image received at the current moment to the depth information corresponding to the image received at the previous moment.
  • In some embodiments, when determining the conversion matrix between the image received at the current moment and the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operation: if the relative position relationship between the image received at the current moment and the image received at the previous moment includes a left-and-right position movement and/or an up-and-down position movement, determining the translation matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, when determining the conversion matrix between the image received at the current moment and the image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operation: if the relative attitude relationship between the image received at the current moment and the image received at the previous moment includes at least one of a pitch attitude change, a roll attitude change, and a yaw attitude change, determining the rotation matrix between the image received at the current moment and the image received at the previous moment.
  • In some embodiments, when processing the image received at the previous moment according to the conversion matrix, the one or more first processors 220 are individually or collectively further configured to perform the following operation: converting, according to the conversion matrix, the image received at the previous moment to the shooting angle of view corresponding to the movement information of the movable platform corresponding to the image received at the current moment.
  • In some embodiments, when filling in the abnormal area according to the processed image received at the previous moment, the one or more first processors 220 are individually or collectively further configured to perform the following operations: determining, according to the pixel correspondence between the processed image received at the previous moment and the image received at the current moment, the area in the processed image received at the previous moment that corresponds to the abnormal area; and overlaying that area onto the abnormal area.
  • the movement information of the movable platform is obtained based on detection by the posture sensor of the camera of the movable platform and/or the position sensor of the camera of the movable platform.
  • the movable platform transmits the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device based on different signal transmission channels.
  • the movable platform transmits the image collected by the movable platform to the control device based on the broadband wireless transmission channel, and transmits the movement information of the movable platform when the image is collected to the control device based on the narrowband wireless transmission channel.
  • The above-mentioned storage device may include a volatile memory, such as a random-access memory (RAM); the storage device may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage device 210 may also include a combination of the foregoing types of memories.
  • the aforementioned first processor 220 may be a central processing unit (CPU).
  • The first processor 220 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • FIG. 8 is a schematic diagram of the method flow of the image transmission method on the side of the movable platform in an embodiment of the present application; please refer to FIG. 8, the image transmission method of the embodiment of the present application may include the following steps:
  • S801: Send the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device communicatively connected with the movable platform, so that, when there is an abnormal area in the image received at the current moment, the control device fills in the abnormal area according to the image received at the previous moment and the motion information of the movable platform corresponding to the image received at the previous moment, where the abnormal area includes a pixel information missing area or a pixel information error area.
  • the movement information of the movable platform includes at least one of position information and posture information of the camera of the movable platform.
  • the movement information of the movable platform is obtained based on detection by the posture sensor of the camera of the movable platform and/or the position sensor of the camera of the movable platform.
  • the movable platform transmits the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device based on different signal transmission channels.
  • the movable platform transmits the image collected by the movable platform to the control device based on the broadband wireless transmission channel, and transmits the movement information of the movable platform when the image is collected to the control device based on the narrowband wireless transmission channel.
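A minimal sketch of the platform-side loop behind S801: for every captured frame, the movement information is sampled at approximately the same moment and both are handed to the transmitter (for example, the two-channel sender sketched earlier). The `camera`, sensor and `link` objects are hypothetical interfaces introduced only for this sketch, not APIs defined by the patent.

```python
import time

def platform_transmit_loop(camera, posture_sensor, position_sensor, link, period_s=1 / 30):
    frame_id = 0
    while True:
        frame = camera.capture()                 # image collected by the movable platform
        movement = {
            "frame_id": frame_id,
            "attitude": posture_sensor.read(),   # e.g. IMU yaw/pitch/roll
            "position": position_sensor.read(),  # e.g. GPS/RTK fix
        }
        link.send_image(frame_id, frame)         # broadband wireless transmission channel
        link.send_motion(movement)               # narrowband wireless transmission channel
        frame_id += 1
        time.sleep(period_s)                     # pace the loop at the capture rate
```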
  • FIG. 9 is a structural block diagram of the movable platform in an embodiment of the present application.
  • the movable platform may include a camera 110, a sensor module 120 and a second processor 130.
  • the camera 110 is used to collect images
  • the sensor module 120 is used to collect movement information of the movable platform when collecting images
  • the second processor 130 is electrically connected to the camera 110 and the sensor module 120.
  • the second processor 130 is configured to perform the following operations: send the image captured by the camera 110 and the movement information of the movable platform when the image is captured to the control device, so that, when there is an abnormal area in the image received at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the movement information of the movable platform corresponding to the image received at the previous moment;
  • the abnormal area includes a pixel information missing area or a pixel information error area.
  • the movement information of the movable platform includes at least one of position information and posture information of the camera 110 of the movable platform.
  • the sensor module 120 includes a posture sensor and/or a position sensor, and the movement information of the movable platform is obtained based on the detection of the posture sensor and/or the position sensor.
  • the movable platform transmits the image collected by the movable platform and the movement information of the movable platform when the image is collected to the control device based on different signal transmission channels.
  • the movable platform transmits the image collected by the movable platform to the control device based on the broadband wireless transmission channel, and transmits the movement information of the movable platform when the image is collected to the control device based on the narrowband wireless transmission channel.
  • the aforementioned second processor 130 may be a central processing unit (CPU).
  • the second processor 130 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor, or any other conventional processor, or the like.
  • an embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the steps of the above-mentioned image processing method or the above-mentioned image transmission method are realized.
  • the computer-readable storage medium may be an internal storage unit of the control device or the movable platform described in any of the foregoing embodiments, such as a hard disk or a memory.
  • the computer-readable storage medium may also be an external storage device of the control device or the movable platform, such as a plug-in hard disk, a smart media card (SMC), an SD card, or a flash card equipped on the device.
  • the computer-readable storage medium may also include both the internal storage unit of the control device or the movable platform and the external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the control device or the movable platform, and can also be used to temporarily store data that has been output or will be output.
  • the program can be stored in a computer-readable storage medium and, when executed, may include the procedures of the above-mentioned method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An image transmission system and method, a control device, and a movable platform. The image transmission system comprises a movable platform (100) and a control device (200) communicatively connected with the movable platform. The movable platform is configured to send an image collected by the movable platform, together with the movement information of the movable platform at the time the image was collected, to the control device. If there is an abnormal area in the image received by the control device at the current moment, the control device fills the abnormal area according to the image received at the previous moment and the movement information of the movable platform corresponding to that image, the abnormal area comprising a pixel information missing area or a pixel information error area. By using the image received at the previous moment and its corresponding movement information of the movable platform to fill the abnormal area in the currently received image, the present application improves the effectiveness and reliability of recovery from wireless image transmission abnormalities.

Description

图像传输系统及方法、控制装置、可移动平台 技术领域
本申请涉及图像传输领域,尤其涉及一种图像传输系统及方法、控制装置、可移动平台。
背景技术
在对可移动平台进行远程操控时,用户需要获取可移动平台采集的图像,以根据可移动平台返回的图像操作控制装置,实现对可移动平台的安全操控。通常,可移动平台会通过无线传输方式将其采集的图像传输给控制装置。然而,在无线图像传输过程中,由于经常受到无线信号干扰、传输路径上的遮挡、传输距离过远、天线指向偏差等不利条件影响,可能导致中断、花屏、马赛克等问题,极大地影响了远程操控体验,还可能造成安全隐患。
传统无线图像传输异常恢复方法通常是基于图像信息分析,具体是基于光流来估计图像中物体的运动信息,再根据物体的运动信息基于图像分割方式从历史图像中匹配当前图像中的出错或缺失区域。然而对于一些表面光滑、弱纹理、重复纹理或相似纹理的物体,物体的运动信息估计不准确,导致无线图像传输异常恢复的效果较差、可靠性低。
发明内容
本申请提供一种图像传输系统及方法、控制装置、可移动平台,以提高无线图像传输异常恢复的效果和可靠性。
根据本申请实施例的第一方面,提供一种图像传输系统,所述图像传输系统包括可移动平台和与所述可移动平台通信连接的控制装置;
所述可移动平台用于将所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给所述控制装置;
若所述控制装置当前时刻接收的图像中存在异常区域,所述控制装置用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,所述异常区域包括像素信息缺失区域或像素信息错误区域。
根据本申请实施例的第二方面,提供一种图像处理方法,所述方法包括:
接收可移动平台发送的所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息;
若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及所述上一 时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
根据本申请实施例的第三方面,提供一种可移动平台的控制装置,所述装置包括:
存储装置,用于存储程序指令;以及
一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
接收可移动平台发送的所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息;
若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
根据本申请实施例的第四方面,提供一种图像传输方法,所述方法包括:
将可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给与所述可移动平台通信连接的控制装置,以使所述控制装置在当前时刻接收的图像中存在异常区域时,用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
根据本申请实施例的第五方面,提供一种可移动平台,与控制装置通信连接,所述可移动平台包括:
拍摄装置,用于采集图像;
传感器模块,用于采集所述可移动平台在采集所述图像时的运动信息;
处理器,与所述拍摄装置、所述传感器模块分别电连接,所述处理器被配置成用于实施如下操作:
将所述拍摄装置采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给所述控制装置,以使所述控制装置在当前时刻接收的图像中存在异常区域时,用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
根据本申请实施例提供的技术方案,本申请通过上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,来填补当前时刻接收的图像中的异常区域,可以提高无线图像传输异常恢复的效果和可靠性,使得在无线图像传输出错的情 况下,最小化对远程操控和用户体验上的影响,为远程操控提供更多的信息支撑以及更准确的视觉信息,提高了操控安全性、有效性以及操控体验,并且提高了无线图像传输的整体视觉体验。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一实施例中的图像传输系统的结构框图;
图2是本申请一实施例中的图像传输系统的工作流程示意图;
图3A是本申请一实施例中的控制装置在上一时刻接收的图像以及当前时刻接收的图像的示意图;
图3B是本申请一实施例中的控制装置将图3A的图(1)转换至图3A的图(2)对应的拍摄视角下的图像示意图;
图4A是本申请另一实施例中的控制装置在上一时刻接收的图像以及当前时刻接收的图像的示意图;
图4B是本申请另一实施例中的控制装置将图4A的图(1)转换至图4A的图(2)对应的拍摄视角下的图像示意图;
图5A是本申请另一实施例中的控制装置在上一时刻接收的图像以及当前时刻接收的图像的示意图;
图5B是本申请另一实施例中的控制装置将图5A的图(1)转换至图5A的图(2)对应的拍摄视角下的图像示意图;
图6是本申请一实施例中的图像处理方法在控制装置侧的方法流程示意图;
图7是本申请一实施例中的控制装置的结构框图;
图8是本申请一实施例中的图像传输方法在可移动平台侧的方法流程示意图;
图9是本申请一实施例中的可移动平台的结构框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
需要说明的是,在不冲突的情况下,下述的实施例及实施方式中的特征可以相互组合。
传统无线图像传输异常恢复方法通常是基于图像信息分析,具体是基于光流来估计图像中物体的运动信息,再根据物体的运动信息基于图像分割方式从历史图像中匹配当前图像中的出错或缺失区域。然而对于一些表面光滑、弱纹理、重复纹理或相似纹理的物体,物体的运动信息估计不准确,导致无线图像传输异常恢复的效果较差、可靠性低。
对于此,本申请通过上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,来填补当前时刻接收的图像中存在异常区域,可以提高无线图像传输异常恢复的效果和可靠性,使得在无线图像传输出错的情况下,最小化对远程操控和用户体验上的影响,为远程操控提供更多的信息支撑以及更准确的视觉信息,提高了操控安全性、有效性以及操控体验,并且提高了无线图像传输的整体视觉体验。
图1是本申请一实施例中的图像传输系统的结构框图;请参见图1,本申请实施例提供一种图像传输系统,该图像传输系统可以包括可移动平台100和与可移动平台100通信连接的控制装置200。
本申请实施例中,可移动平台100可以为无人机,无人车,移动机器人,手持设备或其他类型的可移动平台。
控制装置200可以为遥控器和移动终端(如手机、平板电脑等)的组合,也可以为带屏遥控器或PC地面站,控制装置200还可以为遥控器和VR眼镜的组合或者PC地面站和VR眼镜的组合等。
其中,当控制装置200为遥控器和移动终端的组合时,遥控器可以用于遥控可移动平台100,以改变可移动平台100的位置和/或姿态,和/或改变可移动平台100的拍摄装置的位置和/或姿态;移动终端用于显示可移动平台100采集的图像,用户可以根据显示的图像来操作遥控器,以实现对可移动平台100的控制。
当控制装置200为带屏遥控器或PC地面站时,遥控器或PC地面站可以用于遥控可移动平台100,以改变可移动平台100的位置和/或姿态,和/或改变可移动平台100的拍摄装置的位置和/或姿态;遥控器或PC地面站还可以显示可移动平台100采集的图像,用户可以根据显示的图像来操作遥控器或PC地面站,以实现对可移动平台100的控制。
当控制装置200为遥控器和VR眼镜的组合时,遥控器可以用于遥控可移动平台100,以改变可移动平台100的位置和/或姿态,和/或改变可移动平台100的拍摄装置的位置和/或姿态;VR眼镜可以显示可移动平台100采集的图像,用户可以根据显示的图像来操作遥控器,以实现对可移动平台100的控制。
当控制装置200为PC地面站和VR眼镜的组合时,PC地面站可以用于遥控可移动平台100,以改变可移动平台100的位置和/或姿态,和/或改变可移动平台100的拍摄装置的位置和/或姿态;VR眼镜可以显示可移动平台100采集的图像,用户可以根据显示的图像来操作PC地面站,以实现对可移动平台100的控制。
本申请实施例中,可移动平台100与控制装置200基于无线通信方式进行通信连接,该无线通信方式可以选择为wifi、Lightbridge、OcuSync或其他无线通信方式,本申请对可移动平台100与控制装置200之间的无线通信方式不作具体限定。
图1是本申请一实施例中的图像传输系统的结构框图,图2是本申请一实施例中的图像传输系统的工作流程示意图;请结合图1和图2,可移动平台100用于将可移动平台100采集的图像以及可移动平台100在采集图像时的运动信息发送给控制装置200。若控制装置200当前时刻接收的图像中存在异常区域,控制装置200用于根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,填补异常区域,异常区域包括像素信息缺失区域或像素信息错误区域。本申请的控制装置200通过上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,来填补当前时刻接收的图像中存在异常,可以提高无线图像传输异常恢复的效果和可靠性。其中,可移动平台100可以包括拍摄装置,该拍摄装置用于采集图像。拍摄装置可以为图像传感器,也可以为相机、摄像机等集成的拍摄装置。
可移动平台100的运动信息可以包括可移动平台100的拍摄装置的位置信息和姿态信息中的至少一种。以可移动平台为无人机,拍摄装置安装于无人机为例,拍摄装置可以包括传感器模块,该传感器模块用于获取拍摄装置的位置信息和姿态信息中的至少一种。可选地,传感器模块获取拍摄装置的位置信息和姿态信息中的至少一种的频率与拍摄装置采集图像的频率相同,即拍摄装置在采集一帧图像同时,传感器模块也获取拍摄装置的位置信息和姿态信息中的至少一种;当然,传感器模块获取拍摄装置的位置信息和姿态信息中的至少一种的频率与拍摄装置采集图像的频率也可以不相同,比如传感器模块获取拍摄装置的位置信息和姿态信息中的至少一种的频率大于拍摄装置采集图像的频率。
示例性的,传感器模块可以包括姿态传感器和/或位置传感器,该姿态传感器可以为惯性测量单元(IMU,Inertial measurement unit),惯性测量单元可用于检测拍摄装置的姿态信息。位置传感器可以为全球定位系统(Global positioning system,GPS)定位装置、实时动态(Real-time kinematic,RTK)载波相位差分定位装置(简称RTK定位装置)等,可用于检测拍摄装置的位置信息。
在一种实施例中,当拍摄装置与无人机的相对位置关系固定时,拍摄装置的位置信息可基于无人机的位置信息确定,示例性的,无人机机身中设置有位置传感器,基于该位置传感器所获取的无人机的位置信息和无人机与拍摄装置的相对位置关系可以 确定拍摄装置的位置信息。
为减少图像传输以及运动信息传输过程中出现的错误,本实施例中,可移动平台100基于不同的信号传输通道传输可移动平台100采集的图像以及可移动平台100在采集图像时的运动信息至控制装置200。可选地,可移动平台100基于宽带无线传输通道传输可移动平台100采集的图像至控制装置200,并基于窄带无线传输通道传输可移动平台100在采集图像时的运动信息至控制装置200。由于图像的数据量较大,宽带无线传输通道的带宽高,基于宽带无线传输通道传输图像的方式能够确保图像数据完整地传输至控制装置200;运动信息的数据量相对较小,窄带无线传输通道的带宽小,但可靠性高,保证运动信息被精确传输至控制装置200。
需要说明的是,本申请实施例中,控制装置200在当前时刻接收的图像中存在异常区域时,根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,填补异常区域这一过程可以是实时的,也即在控制装置200控制可移动平台100移动的过程中,控制装置200实时执行图像中异常区域的填补操作,从而实现在无线图像传输出错的情况下,最小化对远程操控和用户体验上的影响,为远程操控提供跟多的信息支撑以及更准确的视觉信息,提高操控安全性、有效性以及操控体验,并且提高无线图像传输的整体视觉体验的效果。当然,在其他实施例中,控制装置200在当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,填补异常区域这一过程可以离线执行。
本申请实施例的控制装置200会对接收到的图像以及运动信息进行对应存储,并实时显示接收到的图像或填补后的图像,方便用户根据实时显示图像控制可移动平台100移动,提高可移动平台100移动的安全性。
应当理解的是,异常区域还可以为其他指示图像中像素信息异常的区域,不限于上述列举的像素信息缺失区域、像素信息错误区域。
下面将详细介绍控制装置200根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,填补异常区域的具体实现过程。
控制装置200根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,填补异常区域的具体实现过程可以包括以下步骤:
(1)、根据上一时刻接收的图像对应的可移动平台100的运动信息以及当前时刻接收的图像对应的可移动平台100的运动信息,对上一时刻接收的图像进行处理;
上述步骤中的处理可以包括缩放处理、平移处理、旋转处理中的至少一种,处理方式需要根据上一时刻接收的图像对应的可移动平台100的运动信息以及当前时刻接收的图像对应的可移动平台100的运动信息确定。
当可移动平台100的拍摄装置的位置和姿态中的至少一个存在变化时,可移动平台100的拍摄装置采集的图像中的物体对应产生变化,其中,对于拍摄装置的位置变化,图像中的物体变化情况如下:若可移动平台100的拍摄装置向前移动,图像中的物体变大;若可移动平台100的拍摄装置向后移动,图像中的物体变小。若可移动平台100的拍摄装置向左移动,图像中的物体向右移动;若可移动平台100的拍摄装置向右移动,图像中的物体向左移动。若可移动平台100的拍摄装置向上移动,图像中的物体向下移动;若可移动平台100的拍摄装置向下移动,图像中的物体向上移动。
对于拍摄装置的姿态变化,图像中的物体大小变化情况如下:若可移动平台100的拍摄装置沿俯仰方向向上旋转,图像中的物体向下旋转;若可移动平台100的拍摄装置沿俯仰方向向下旋转,图像中的物体向上旋转。若可移动平台100的拍摄装置沿偏航方向向左旋转,图像中的物体向右旋转;若可移动平台100的拍摄装置沿偏航方向向右旋转,图像中的物体向左旋转。若可移动平台100的拍摄装置沿横滚方向向左旋转,图像中的物体向右旋转;若可移动平台100的拍摄装置沿横滚方向向右旋转,图像中的物体向左旋转。
本实施例中,镜头的光轴平行于拍摄装置的前后方向,镜头的光轴垂直于拍摄装置的左右方向,拍摄装置的重力方向平行于拍摄装置的上下方向。
本实施例中,可移动平台100的拍摄装置在可移动平台100的拍摄装置采集两幅图像之间可能产生位置变化,也可能产生姿态变化,该步骤(1)可以根据上一时刻接收的图像对应的可移动平台100的运动信息以及当前时刻接收的图像对应的可移动平台100的运动信息,确定可移动平台100在采集这两幅图像之间是产生了位置变化,还是产生了姿态变化,还是产生了位置变化和姿态变化,再进一步根据确定出的位置变化和/或姿态变化来对上一时刻接收的图像进行处理。可选地,在实现上述步骤(1)时,控制装置200用于根据上一时刻接收的图像对应的可移动平台100的运动信息以及当前时刻接收的图像对应的可移动平台100的运动信息,确定上一时刻接收的图像与当前时刻接收的图像对应的相对位置关系(即拍摄装置的位置变化),和上一时刻接收的图像与当前时刻接收的图像对应的相对姿态关系(即拍摄装置的姿态变化)中的至少一个;根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理。
控制装置200根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理的实现过程可以包括:根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵;根据转换矩阵,对上一时刻接收的图像进行处理。其中,转换矩阵可以包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
下面详细介绍控制装置200如何根据相对位置关系和相对姿态关系中的至少一 个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵。
在一些实施例中,控制装置200用于在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。当拍摄装置存在前后位置移动时,拍摄装置采集的图像中的物体变大或变小,因此,在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,当前时刻接收的图像和上一时刻接收的图像之间存在缩放变换关系。
控制装置200可以采用不同的策略来确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵,例如,在其中一种确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵的实施例中,控制装置200用于在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,根据前后位置移动的距离大小,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。本实施例中,缩放矩阵对应的缩放倍数与前后位置移动的距离大小正相关,也即,前后位置移动的距离越大,缩放矩阵对应的缩放倍数也越大。
在另一种确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵的实施例中,可移动平台100设有距离检测传感器,该距离检测传感器用于检测可移动平台100周围物体的深度信息,本申请对距离检测传感器的类型不作具体限定,任何能够检测可移动平台100周围物体的深度信息的距离检测传感器均属于本申请的保护范围。本实施例的可移动平台100还用于将可移动平台100采集的图像对应的深度信息发送给控制装置200,具体地,可移动平台100在采集图像的同时,通过距离检测传感器检测获得可移动平台100周围物体的深度信息;根据可移动平台100周围物体的深度信息,确定图像对应的深度信息;再将图像对应的深度信息发送给控制装置200。为减少图像传输过程中出现的错误,可移动平台100可基于不同的信号传输通道传输可移动平台100采集的图像以及可移动平台100采集的图像对应的深度信息至控制装置200。可移动平台100可基于同一信号传输通道传输可移动平台100采集的图像对应的深度信息以及可移动平台100在采集图像时的运动信息至控制装置200,示例性的,可移动平台100基于窄带无线传输通道传输可移动平台100采集的图像对应的深度信息以及可移动平台100在采集图像时的运动信息至控制装置200。
本实施例中,控制装置200用于在当前时刻接收的图像时的位置与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,根据当前时刻接收的图像对应的深度信息以及上一时刻接收的图像对应的深度信息,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。基于深度信息确定缩放矩阵,可以使得图像中各不同深度的物体得到不同程度的缩放,更加接近实际视觉观感,进一步提高无线图像传输异常恢复的效果。
进一步地,在本实施例中,缩放矩阵对应的缩放倍数与当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值相对应,可选地,缩放矩阵对应的缩放倍数与当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值相等,即缩放矩阵对应的缩放倍数=(当前时刻接收的图像对应的深度信息/上一时刻接收的图像对应的深度信息);当然,缩放矩阵对应的缩放倍数与当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值也可以不相等,根据当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值确定缩放矩阵对应的缩放倍数即可。
应当理解的是,上述基于前后位置移动的距离大小确定缩放矩阵的实施例和基于深度信息确定缩放矩阵的实施例可以进行组合,以获得更加精确的缩放矩阵。
在另一些实施例中,控制装置200用于在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的平移矩阵。当拍摄装置存在左右位置移动时,拍摄装置采集的图像中的物体向右或向左移动,因此,在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在左右位置移动时,当前时刻接收的图像和上一时刻接收的图像之间存在平移变换关系。当拍摄装置存在上下位置移动时,拍摄装置采集的图像中的物体向下或向上移动,因此,在当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在上下位置移动时,当前时刻接收的图像和上一时刻接收的图像之间存在平移变换关系。
在另一些实施例中,控制装置200用于在当前时刻接收的图像与上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定当前时刻接收的图像和上一时刻接收的图像之间的旋转矩阵。当拍摄装置存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,拍摄装置采集的图像中的物体也是绕对应的轴旋转的,因此,在当前时刻接收的图像与上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,当前时刻接收的图像和上一时刻接收的图像之间存在旋转变换关系。
此外,控制装置200根据转换矩阵,对上一时刻接收的图像进行处理的实现过程可以包括:控制装置200用于根据转换矩阵,将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台100的运动信息对应的拍摄视角下。
当转换矩阵为缩放矩阵时,控制装置200用于根据缩放矩阵,对上一时刻接收的图像进行缩放处理,以将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台100的运动信息对应的拍摄视角下。示例性的,请参见图3A,图3A的图(1)为上一时刻接收的图像,图3A的图(2)为当前时刻接收的图像,图3A的图(2)中存在异常区域31,图3A的图(2)与图3A的图(1)对应的相对位置关系(即拍摄装置的位置 变化)存在向后移动,因此,图3A的图(2)中的物体相对图3A的图(1)中的物体变小,可以根据向后移动的距离大小确定图3A的图(2)和图3A的图(1)之间的缩放矩阵,再根据确定的图3A的图(2)和图3A的图(1)之间的缩放矩阵对图3A的图(1)进行缩小处理,将图3A的图(1)转换至图3A的图(2)对应的拍摄视角下,得到图3B所示的图像,图3B即为处理后的图3A的图(1)的图像。
当转换矩阵为平移矩阵时,控制装置200用于根据平移矩阵,对上一时刻接收的图像进行平移处理,以将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台100的运动信息对应的拍摄视角下。示例性的,请参见图4A,图4A的图(1)为上一时刻接收的图像,图4A的图(2)为当前时刻接收的图像,图4A的图(2)中存在异常区域41,图4A的图(2)与图4A的图(1)对应的相对位置关系(即拍摄装置的位置变化)存在向左平移,因此,图4A的图(2)中的物体相对图4A的图(1)中的向右平移,可以根据向左平移的距离大小确定图4A的图(2)和图4A的图(1)之间的平移矩阵,再根据确定的图4A的图(2)和图4A的图(1)之间的平移矩阵对图4A的图(1)进行向右平移处理,将图4A的图(1)转换至图4A的图(2)对应的拍摄视角下,得到图4B所示的图像,图4B即为处理后的图4A的图(1)的图像。
当转换矩阵为旋转矩阵时,控制装置200用于根据旋转矩阵,对上一时刻接收的图像进行旋转处理,以将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台100的运动信息对应的拍摄视角下。示例性的,请参见图5A,图5A的图(1)为上一时刻接收的图像,图5A的图(2)为当前时刻接收的图像,图5A的图(2)中存在异常区域51,图5A的图(2)与图5A的图(1)对应的相对姿态关系(即拍摄装置的姿态变化)存在向左旋转(横滚姿态改变),因此,图5A的图(2)中的物体相对图5A的图(1)中的向右旋转,可以根据偏航姿态变化大小确定图5A的图(2)和图5A的图(1)之间的旋转矩阵,再根据确定的图5A的图(2)和图5A的图(1)之间的旋转矩阵对图5A的图(1)进行向右旋转处理,将图5A的图(1)转换至图5A的图(2)对应的拍摄视角下,得到图5B所示的图像,图5B即为处理后的图5A的图(1)的图像。
(2)、根据处理后的上一时刻接收的图像,填补异常区域。
在实现步骤(2)时,具体地,控制装置200用于根据处理后的上一时刻接收的图像与当前时刻接收的图像中的像素对应关系,确定处理后的上一时刻接收的图像中与异常区域相对应的区域;将处理后的上一时刻接收的图像中与异常区域相对应的区域覆盖至异常区域上。示例性的,请参见图3A和图3B,图3B中的区域32即为与异常区域31相对应的区域,将区域32覆盖到异常区域31上,即完成了异常区域填补的操作。请参见图4A和图4B,图4B中的区域42即为与异常区域41相对应的区域,将区域42覆盖到异常区域41上,即完成了异常区域填补的操作。请参见图5A和图5B,图5B中的区域52即为与异常区域51相对应的区域,将区域52覆盖到异常区域51上,即完成了异常区域填补的操作。
另外,考虑到拍摄装置的镜头通常会引入镜头畸变,为消除镜头畸变的影响,控制装置200可以先根据镜头参数对上一时刻接收的图像与当前时刻接收的图像进行分别处理,再根据镜头畸变消除后的上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台100的运动信息,对镜头畸变消除后的当前时刻接收的图像中的异常区域进行填补,提高图像显示效果。
图6是本申请一实施例中的图像处理方法在控制装置侧的方法流程示意图;请参见图6,本申请实施例的图像处理方法可以包括如下步骤:
S601:接收可移动平台发送的可移动平台采集的图像以及可移动平台在采集图像时的运动信息;
S602:若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补镜头畸变异常区域,其中,异常区域包括像素信息缺失区域或像素信息错误区域。
可选地,可移动平台的运动信息包括可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
可选地,根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补异常区域,包括:根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,对上一时刻接收的图像进行处理;根据处理后的上一时刻接收的图像,填补异常区域。
可选地,处理包括缩放处理、平移处理、旋转处理中的至少一种。
可选地,根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,对上一时刻接收的图像进行处理,包括:根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,确定上一时刻接收的图像与当前时刻接收的图像对应的相对位置关系,和上一时刻接收的图像与当前时刻接收的图像对应的相对姿态关系中的至少一个;根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理。
可选地,根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理,包括:根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵;根据转换矩阵,对上一时刻接收的图像进行处理。
可选地,转换矩阵包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
可选地,根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的 图像和上一时刻接收的图像之间的转换矩阵,包括:若当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵,包括:根据前后位置移动的距离大小,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,缩放矩阵对应的缩放倍数与前后位置移动的距离大小正相关。
可选地,可移动平台设有距离检测传感器,用于检测可移动平台周围物体的深度信息;所述图像处理方法还包括:接收可移动平台发送的可移动平台采集的图像对应的深度信息;确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵,包括:根据当前时刻接收的图像对应的深度信息以及上一时刻接收的图像对应的深度信息,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,缩放矩阵对应的缩放倍数与当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值相对应。
可选地,根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵,包括:若当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的平移矩阵。
可选地,根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵,包括:若当前时刻接收的图像与上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定当前时刻接收的图像和上一时刻接收的图像之间的旋转矩阵。
可选地,根据转换矩阵,对上一时刻接收的图像进行处理,包括:根据转换矩阵,将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台的运动信息对应的拍摄视角下。
可选地,根据处理后的上一时刻接收的图像,填补异常区域,包括:根据处理后的上一时刻接收的图像与当前时刻接收的图像中的像素对应关系,确定处理后的上一时刻接收的图像中与异常区域相对应的区域;将处理后的上一时刻接收的图像中与异常区域相对应的区域覆盖至异常区域上。
可选地,可移动平台的运动信息基于可移动平台的拍摄装置的姿态传感器和/或可移动平台的拍摄装置的位置传感器检测获得。
可选地,可移动平台基于不同的信号传输通道传输可移动平台采集的图像以及可移动平台在采集图像时的运动信息至控制装置。
可选地,可移动平台基于宽带无线传输通道传输可移动平台采集的图像至控制装置,并基于窄带无线传输通道传输可移动平台在采集图像时的运动信息至控制装置。
对于上述实施例的图像处理方法,本申请实施例还提供一种可移动平台的控制装置,图7是本申请一实施例中的控制装置的结构框图;请参见图7,该控制装置可以包括存储装置210和一个或多个第一处理器220。
其中,存储装置210,用于存储程序指令;一个或多个第一处理器220,调用存储装置210中存储的程序指令,当程序指令被执行时,一个或多个第一处理器220单独地或共同地被配置成用于实施如下操作:
接收可移动平台发送的可移动平台采集的图像以及可移动平台在采集图像时的运动信息;
若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补异常区域;
其中,异常区域包括像素信息缺失区域或像素信息错误区域。
可选地,可移动平台的运动信息包括可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
可选地,一个或多个第一处理器220在根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补异常区域,单独地或共同地被进一步配置成用于实施如下操作:根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,对上一时刻接收的图像进行处理;根据处理后的上一时刻接收的图像,填补异常区域。
可选地,处理包括缩放处理、平移处理、旋转处理中的至少一种。
可选地,一个或多个第一处理器220在根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,对上一时刻接收的图像进行处理时,单独地或共同地被进一步配置成用于实施如下操作:根据上一时刻接收的图像对应的可移动平台的运动信息以及当前时刻接收的图像对应的可移动平台的运动信息,确定上一时刻接收的图像与当前时刻接收的图像对应的相对位置关系,和上一时刻接收的图像与当前时刻接收的图像对应的相对姿态关系中的至少一个;根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理。
可选地,一个或多个第一处理器220在根据相对位置关系和相对姿态关系中的至少一个,对上一时刻接收的图像进行处理时,单独地或共同地被进一步配置成用于实施如下操作:根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵;根据转换矩阵,对上一时刻接收的图像进行处理。
可选地,转换矩阵包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
可选地,一个或多个第一处理器220在根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:若当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,一个或多个第一处理器220在确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵时,单独地或共同地被进一步配置成用于实施如下操作:根据前后位置移动的距离大小,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,缩放矩阵对应的缩放倍数与前后位置移动的距离大小正相关。
可选地,可移动平台设有距离检测传感器,用于检测可移动平台周围物体的深度信息;一个或多个第一处理器220单独地或共同地还被配置成用于实施如下操作:接收可移动平台发送的可移动平台采集的图像对应的深度信息;一个或多个第一处理器220在确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵时,单独地或共同地被进一步配置成用于实施如下操作:根据当前时刻接收的图像对应的深度信息以及上一时刻接收的图像对应的深度信息,确定当前时刻接收的图像和上一时刻接收的图像之间的缩放矩阵。
可选地,缩放矩阵对应的缩放倍数与当前时刻接收的图像对应的深度信息与上一时刻接收的图像对应的深度信息的比值相对应。
可选地,一个或多个第一处理器220在根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:若当前时刻接收的图像与上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定当前时刻接收的图像和上一时刻接收的图像之间的平移矩阵。
可选地,一个或多个第一处理器220在根据相对位置关系和相对姿态关系中的至少一个,确定当前时刻接收的图像和上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:若当前时刻接收的图像与上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定当前时刻接收的图像和上一时刻接收的图像之间的旋转矩阵。
可选地,一个或多个第一处理器220在根据转换矩阵,对上一时刻接收的图像进行处理时,单独地或共同地被进一步配置成用于实施如下操作:根据转换矩阵,将上一时刻接收的图像转换至当前时刻接收的图像对应的可移动平台的运动信息对应的拍 摄视角下。
可选地,一个或多个第一处理器220在根据处理后的上一时刻接收的图像,填补异常区域,单独地或共同地被进一步配置成用于实施如下操作:根据处理后的上一时刻接收的图像与当前时刻接收的图像中的像素对应关系,确定处理后的上一时刻接收的图像中与异常区域相对应的区域;将处理后的上一时刻接收的图像中与异常区域相对应的区域覆盖至异常区域上。
可选地,可移动平台的运动信息基于可移动平台的拍摄装置的姿态传感器和/或可移动平台的拍摄装置的位置传感器检测获得。
可选地,可移动平台基于不同的信号传输通道传输可移动平台采集的图像以及可移动平台在采集图像时的运动信息至控制装置。
可选地,可移动平台基于宽带无线传输通道传输可移动平台采集的图像至控制装置,并基于窄带无线传输通道传输可移动平台在采集图像时的运动信息至控制装置。
上述存储装置可以包括易失性存储器(volatile memory),例如随机存取存储器(random-access memory,RAM);存储装置也可以包括非易失性存储器(non-volatile memory),例如快闪存储器(flash memory),硬盘(hard disk drive,HDD)或固态硬盘(solid-state drive,SSD);存储装置110还可以包括上述种类的存储器的组合。
上述第一处理器220可以是中央处理器(central processing unit,CPU)。该第一处理器220还可以是其它通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(application-specific integrated circuit,ASIC)、现场可编程逻辑门阵列(field-programmable gate array,FPGA)或者其它可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
图8是本申请一实施例中的图像传输方法在可移动平台侧的方法流程示意图;请参见图8,本申请实施例的图像传输方法可以包括如下步骤:
S801:将可移动平台采集的图像以及可移动平台在采集图像时的运动信息发送给与可移动平台通信连接的控制装置,以使控制装置在当前时刻接收的图像中存在异常区域时,用于根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补异常区域,其中,异常区域包括像素信息缺失区域或像素信息错误区域。
可选地,可移动平台的运动信息包括可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
可选地,可移动平台的运动信息基于可移动平台的拍摄装置的姿态传感器和/或可移动平台的拍摄装置的位置传感器检测获得。
可选地,可移动平台基于不同的信号传输通道传输可移动平台采集的图像以及可移动平台在采集图像时的运动信息至控制装置。
可选地,可移动平台基于宽带无线传输通道传输可移动平台采集的图像至控制装置,并基于窄带无线传输通道传输可移动平台在采集图像时的运动信息至控制装置。
对于上述实施例的图像传输方法,本申请实施例还提供一种可移动平台,图9是本申请一实施例中的可移动平台的结构框图。请参见图9,该可移动平台可以包括拍摄装置110、传感器模块120和第二处理器130。
其中,拍摄装置110,用于采集图像;
传感器模块120,用于采集可移动平台在采集图像时的运动信息;
第二处理器130,与拍摄装置110、传感器模块120分别电连接,第二处理器130被配置成用于实施如下操作:将拍摄装置110采集的图像以及可移动平台在采集图像时的运动信息发送给控制装置,以使控制装置在当前时刻接收的图像中存在异常区域,用于根据上一时刻接收的图像以及上一时刻接收的图像对应的可移动平台的运动信息,填补异常区域;其中,异常区域包括像素信息缺失区域或像素信息错误区域。
可选地,可移动平台的运动信息包括可移动平台的拍摄装置110的位置信息和姿态信息中的至少一种。
可选地,传感器模块120包括姿态传感器和/或位置传感器,可移动平台的运动信息基于姿态传感器和/或位置传感器检测获得。
可选地,可移动平台基于不同的信号传输通道传输可移动平台采集的图像以及可移动平台在采集图像时的运动信息至控制装置。
可选地,可移动平台基于宽带无线传输通道传输可移动平台采集的图像至控制装置,并基于窄带无线传输通道传输可移动平台在采集图像时的运动信息至控制装置。
上述第二处理器130可以是中央处理器(central processing unit,CPU)。该第二处理器130还可以是其它通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(application-specific integrated circuit,ASIC)、现场可编程逻辑门阵列(field-programmable gate array,FPGA)或者其它可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
此外,本申请实施例还提供一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现上述图像处理方法或上述图像传输方法的步骤。
所述计算机可读存储介质可以是前述任一实施例所述的控制装置或可移动平台的内部存储单元,例如硬盘或内存。所述计算机可读存储介质也可以是控制装置或可移 动平台的外部存储设备,例如所述设备上配备的插接式硬盘、智能存储卡(Smart Media Card,SMC)、SD卡、闪存卡(Flash Card)等。进一步的,所述计算机可读存储介质还可以既包括控制装置或可移动平台的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述控制装置或可移动平台所需的其他程序和数据,还可以用于暂时地存储已经输出或者将要输出的数据。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
以上所揭露的仅为本申请部分实施例而已,当然不能以此来限定本申请之权利范围,因此依本申请权利要求所作的等同变化,仍属本申请所涵盖的范围。

Claims (67)

  1. 一种图像传输系统,其特征在于,所述图像传输系统包括可移动平台和与所述可移动平台通信连接的控制装置;
    所述可移动平台用于将所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给所述控制装置;
    若所述控制装置当前时刻接收的图像中存在异常区域,所述控制装置用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,所述异常区域包括像素信息缺失区域或像素信息错误区域。
  2. 根据权利要求1所述的图像传输系统,其特征在于,所述可移动平台的运动信息包括所述可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
  3. 根据权利要求1所述的图像传输系统,其特征在于,所述控制装置用于根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,对所述上一时刻接收的图像进行处理;根据处理后的上一时刻接收的图像,填补所述异常区域。
  4. 根据权利要求3所述的图像传输系统,其特征在于,所述处理包括缩放处理、平移处理、旋转处理中的至少一种。
  5. 根据权利要求3所述的图像传输系统,其特征在于,所述控制装置用于根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,确定所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对位置关系,和所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对姿态关系中的至少一个;根据所述相对位置关系和所述相对姿态关系中的至少一个,对所述上一时刻接收的图像进行处理。
  6. 根据权利要求5所述的图像传输系统,其特征在于,所述控制装置用于根据所述相对位置关系和所述相对姿态关中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵;根据所述转换矩阵,对所述上一时刻接收的图像进行处理。
  7. 根据权利要求6所述的图像传输系统,其特征在于,所述转换矩阵包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
  8. 根据权利要求7所述的图像传输系统,其特征在于,所述控制装置用于在所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  9. 根据权利要求8所述的图像传输系统,其特征在于,所述控制装置用于在所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在前后位置移动时,根据前后位置移动的距离大小,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  10. 根据权利要求9所述的图像传输系统,其特征在于,所述缩放矩阵对应的缩 放倍数与所述前后位置移动的距离大小正相关。
  11. 根据权利要求或8至10任一项所述的图像传输系统,其特征在于,所述可移动平台设有距离检测传感器,用于检测所述可移动平台周围物体的深度信息;
    所述可移动平台还用于将所述可移动平台采集的图像对应的深度信息发送给所述控制装置;
    所述控制装置用于在所述当前时刻接收的图像时的位置与所述上一时刻接收的图像对应的相对位置关系存在前后位置移动时,根据所述当前时刻接收的图像对应的深度信息以及所述上一时刻接收的图像对应的深度信息,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  12. 根据权利要求11所述的图像传输系统,其特征在于,所述缩放矩阵对应的缩放倍数与所述当前时刻接收的图像对应的深度信息与所述上一时刻接收的图像对应的深度信息的比值相对应。
  13. 根据权利要求7所述的图像传输系统,其特征在于,所述控制装置用于在所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的平移矩阵。
  14. 根据权利要求7所述的图像传输系统,其特征在于,所述控制装置用于在所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的旋转矩阵。
  15. 根据权利要求6所述的图像传输系统,其特征在于,所述控制装置用于根据所述转换矩阵,将所述上一时刻接收的图像转换至所述当前时刻接收的图像对应的可移动平台的运动信息对应的拍摄视角下。
  16. 根据权利要求3所述的图像传输系统,其特征在于,所述控制装置用于根据所述处理后的上一时刻接收的图像与所述当前时刻接收的图像中的像素对应关系,确定所述处理后的上一时刻接收的图像中与所述异常区域相对应的区域;将处理后的上一时刻接收的图像中与所述异常区域相对应的区域覆盖至所述异常区域上。
  17. 根据权利要求1所述的图像传输系统,其特征在于,所述可移动平台的运动信息基于所述可移动平台的拍摄装置的姿态传感器和/或所述可移动平台的拍摄装置的位置传感器检测获得。
  18. 根据权利要求1所述的图像传输系统,其特征在于,所述可移动平台基于不同的信号传输通道传输所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  19. 根据权利要求18所述的图像传输系统,其特征在于,所述可移动平台基于宽带无线传输通道传输所述可移动平台采集的图像至所述控制装置,并基于窄带无线传输通道传输所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  20. 一种图像处理方法,其特征在于,所述方法包括:
    接收可移动平台发送的所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息;
    若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
    其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
  21. 根据权利要求20所述的方法,其特征在于,所述可移动平台的运动信息包括所述可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
  22. 根据权利要求20所述的方法,其特征在于,所述根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,包括:
    根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,对所述上一时刻接收的图像进行处理;
    根据处理后的上一时刻接收的图像,填补所述异常区域。
  23. 根据权利要求22所述的方法,其特征在于,所述处理包括缩放处理、平移处理、旋转处理中的至少一种。
  24. 根据权利要求22所述的方法,其特征在于,所述根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,对所述上一时刻接收的图像进行处理,包括:
    根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,确定所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对位置关系,和所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对姿态关系中的至少一个;
    根据所述相对位置关系和所述相对姿态关系中的至少一个,对所述上一时刻接收的图像进行处理。
  25. 根据权利要求24所述的方法,其特征在于,所述根据所述相对位置关系和所述相对姿态关系中的至少一个,对所述上一时刻接收的图像进行处理,包括:
    根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵;
    根据所述转换矩阵,对所述上一时刻接收的图像进行处理。
  26. 根据权利要求25所述的方法,其特征在于,所述转换矩阵包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
  27. 根据权利要求26所述的方法,其特征在于,所述根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵,包括:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩 放矩阵。
  28. 根据权利要求27所述的方法,其特征在于,所述确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵,包括:
    根据所述前后位置移动的距离大小,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  29. 根据权利要求28所述的方法,其特征在于,所述缩放矩阵对应的缩放倍数与所述前后位置移动的距离大小正相关。
  30. 根据权利要求27至29任一项所述的方法,其特征在于,所述可移动平台设有距离检测传感器,用于检测所述可移动平台周围物体的深度信息;
    所述方法还包括:
    接收所述可移动平台发送的所述可移动平台采集的图像对应的深度信息;
    所述确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵,包括:
    根据所述当前时刻接收的图像对应的深度信息以及所述上一时刻接收的图像对应的深度信息,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  31. 根据权利要求30所述的方法,其特征在于,所述缩放矩阵对应的缩放倍数与所述当前时刻接收的图像对应的深度信息与所述上一时刻接收的图像对应的深度信息的比值相对应。
  32. 根据权利要求26所述的方法,其特征在于,所述根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵,包括:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的平移矩阵。
  33. 根据权利要求26所述的方法,其特征在于,所述根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵,包括:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的旋转矩阵。
  34. 根据权利要求25所述的方法,其特征在于,所述根据所述转换矩阵,对所述上一时刻接收的图像进行处理,包括:
    根据所述转换矩阵,将所述上一时刻接收的图像转换至所述当前时刻接收的图像对应的可移动平台的运动信息对应的拍摄视角下。
  35. 根据权利要求22所述的方法,其特征在于,所述根据处理后的上一时刻接收 的图像,填补所述异常区域,包括:
    根据所述处理后的上一时刻接收的图像与所述当前时刻接收的图像中的像素对应关系,确定所述处理后的上一时刻接收的图像中与所述异常区域相对应的区域;
    将处理后的上一时刻接收的图像中与所述异常区域相对应的区域覆盖至所述异常区域上。
  36. 根据权利要求20所述的方法,其特征在于,所述可移动平台的运动信息基于所述可移动平台的拍摄装置的姿态传感器和/或所述可移动平台的拍摄装置的位置传感器检测获得。
  37. 根据权利要求20所述的方法,其特征在于,所述可移动平台基于不同的信号传输通道传输所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息至与所述可移动平台通信连接的控制装置。
  38. 根据权利要求37所述的方法,其特征在于,所述可移动平台基于宽带无线传输通道传输所述可移动平台采集的图像至所述控制装置,并基于窄带无线传输通道传输所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  39. 一种可移动平台的控制装置,其特征在于,所述装置包括:
    存储装置,用于存储程序指令;以及
    一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
    接收所述可移动平台发送的所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息;
    若当前时刻接收的图像中存在异常区域,根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
    其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
  40. 根据权利要求39所述的控制装置,其特征在于,所述可移动平台的运动信息包括所述可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
  41. 根据权利要求39所述的控制装置,其特征在于,所述一个或多个处理器在根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,对所述上一时刻接收的图像进行处理;
    根据处理后的上一时刻接收的图像,填补所述异常区域。
  42. 根据权利要求41所述的控制装置,其特征在于,所述处理包括缩放处理、平移处理、旋转处理中的至少一种。
  43. 根据权利要求41所述的控制装置,其特征在于,所述一个或多个处理器在根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,对所述上一时刻接收的图像进行处理时,单独地或 共同地被进一步配置成用于实施如下操作:
    根据所述上一时刻接收的图像对应的可移动平台的运动信息以及所述当前时刻接收的图像对应的可移动平台的运动信息,确定所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对位置关系,和所述上一时刻接收的图像与所述当前时刻接收的图像对应的相对姿态关系中的至少一个;
    根据所述相对位置关系和所述相对姿态关系中的至少一个,对所述上一时刻接收的图像进行处理。
  44. 根据权利要求43所述的控制装置,其特征在于,所述一个或多个处理器在根据所述相对位置关系和所述相对姿态关系中的至少一个,对所述上一时刻接收的图像进行处理时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵;
    根据所述转换矩阵,对所述上一时刻接收的图像进行处理。
  45. 根据权利要求44所述的控制装置,其特征在于,所述转换矩阵包括缩放矩阵,或旋转矩阵,或平移矩阵,或缩放矩阵、旋转矩阵和平移矩阵中至少两个的级联。
  46. 根据权利要求45所述的控制装置,其特征在于,所述一个或多个处理器在根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在前后位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  47. 根据权利要求46所述的控制装置,其特征在于,所述一个或多个处理器在确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述前后位置移动的距离大小,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  48. 根据权利要求47所述的控制装置,其特征在于,所述缩放矩阵对应的缩放倍数与所述前后位置移动的距离大小正相关。
  49. 根据权利要求46至48任一项所述的控制装置,其特征在于,所述可移动平台设有距离检测传感器,用于检测所述可移动平台周围物体的深度信息;
    所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    接收所述可移动平台发送的所述可移动平台采集的图像对应的深度信息;
    所述一个或多个处理器在确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述当前时刻接收的图像对应的深度信息以及所述上一时刻接收的图像对应 的深度信息,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的缩放矩阵。
  50. 根据权利要求49所述的控制装置,其特征在于,所述缩放矩阵对应的缩放倍数与所述当前时刻接收的图像对应的深度信息与所述上一时刻接收的图像对应的深度信息的比值相对应。
  51. 根据权利要求45所述的控制装置,其特征在于,所述一个或多个处理器在根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对位置关系存在左右位置移动和/或上下位置移动时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的平移矩阵。
  52. 根据权利要求45所述的控制装置,其特征在于,所述一个或多个处理器在根据所述相对位置关系和所述相对姿态关系中的至少一个,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的转换矩阵时,单独地或共同地被进一步配置成用于实施如下操作:
    当所述当前时刻接收的图像与所述上一时刻接收的图像对应的相对姿态关系存在俯仰姿态变化、横滚姿态变化和偏航姿态变化中的至少一个时,确定所述当前时刻接收的图像和所述上一时刻接收的图像之间的旋转矩阵。
  53. 根据权利要求44所述的控制装置,其特征在于,所述一个或多个处理器在根据所述转换矩阵,对所述上一时刻接收的图像进行处理时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述转换矩阵,将所述上一时刻接收的图像转换至所述当前时刻接收的图像对应的可移动平台的运动信息对应的拍摄视角下。
  54. 根据权利要求41所述的控制装置,其特征在于,所述一个或多个处理器在根据处理后的上一时刻接收的图像,填补所述异常区域,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述处理后的上一时刻接收的图像与所述当前时刻接收的图像中的像素对应关系,确定所述处理后的上一时刻接收的图像中与所述异常区域相对应的区域;
    将处理后的上一时刻接收的图像中与所述异常区域相对应的区域覆盖至所述异常区域上。
  55. 根据权利要求39所述的控制装置,其特征在于,所述可移动平台的运动信息基于所述可移动平台的拍摄装置的姿态传感器和/或所述可移动平台的拍摄装置的位置传感器检测获得。
  56. 根据权利要求39所述的控制装置,其特征在于,所述可移动平台基于不同的信号传输通道传输所述可移动平台采集的图像以及所述可移动平台在采集所述图像时 的运动信息至所述控制装置。
  57. 根据权利要求56所述的控制装置,其特征在于,所述可移动平台基于宽带无线传输通道传输所述可移动平台采集的图像至所述控制装置,并基于窄带无线传输通道传输所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  58. 一种图像传输方法,其特征在于,所述方法包括:
    将可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给与所述可移动平台通信连接的控制装置,以使所述控制装置在当前时刻接收的图像中存在异常区域时,用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域,其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
  59. 根据权利要求58所述的方法,其特征在于,所述可移动平台的运动信息包括所述可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
  60. 根据权利要求58所述的方法,其特征在于,所述可移动平台的运动信息基于所述可移动平台的拍摄装置的姿态传感器和/或所述可移动平台的拍摄装置的位置传感器检测获得。
  61. 根据权利要求58所述的方法,其特征在于,所述可移动平台基于不同的信号传输通道传输所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  62. 根据权利要求61所述的方法,其特征在于,所述可移动平台基于宽带无线传输通道传输所述可移动平台采集的图像至所述控制装置,并基于窄带无线传输通道传输所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  63. 一种可移动平台,与控制装置通信连接,其特征在于,所述可移动平台包括:
    拍摄装置,用于采集图像;
    传感器模块,用于采集所述可移动平台在采集所述图像时的运动信息;
    处理器,与所述拍摄装置、所述传感器模块分别电连接,所述处理器被配置成用于实施如下操作:
    将所述拍摄装置采集的图像以及所述可移动平台在采集所述图像时的运动信息发送给所述控制装置,以使所述控制装置在当前时刻接收的图像中存在异常区域时,用于根据上一时刻接收的图像以及所述上一时刻接收的图像对应的可移动平台的运动信息,填补所述异常区域;
    其中,所述异常区域包括像素信息缺失区域或像素信息错误区域。
  64. 根据权利要求63所述的可移动平台,其特征在于,所述可移动平台的运动信息包括所述可移动平台的拍摄装置的位置信息和姿态信息中的至少一种。
  65. 根据权利要求63所述的可移动平台,其特征在于,所述传感器模块包括姿态传感器和/或位置传感器,所述可移动平台的运动信息基于所述姿态传感器和/或所述位置传感器检测获得。
  66. 根据权利要求63所述的可移动平台,其特征在于,所述可移动平台基于不同的信号传输通道传输所述可移动平台采集的图像以及所述可移动平台在采集所述图像时的运动信息至所述控制装置。
  67. 根据权利要求66所述的可移动平台,其特征在于,所述可移动平台基于宽带无线传输通道传输所述可移动平台采集的图像至所述控制装置,并基于窄带无线传输通道传输所述可移动平台在采集所述图像时的运动信息至所述控制装置。
PCT/CN2019/125870 2019-12-17 2019-12-17 图像传输系统及方法、控制装置、可移动平台 WO2021119982A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980049933.0A CN112514363A (zh) 2019-12-17 2019-12-17 图像传输系统及方法、控制装置、可移动平台
PCT/CN2019/125870 WO2021119982A1 (zh) 2019-12-17 2019-12-17 图像传输系统及方法、控制装置、可移动平台

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/125870 WO2021119982A1 (zh) 2019-12-17 2019-12-17 图像传输系统及方法、控制装置、可移动平台

Publications (1)

Publication Number Publication Date
WO2021119982A1 true WO2021119982A1 (zh) 2021-06-24

Family

ID=74924084

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/125870 WO2021119982A1 (zh) 2019-12-17 2019-12-17 图像传输系统及方法、控制装置、可移动平台

Country Status (2)

Country Link
CN (1) CN112514363A (zh)
WO (1) WO2021119982A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113124835A (zh) * 2021-04-22 2021-07-16 广州南方卫星导航仪器有限公司 一种无人机多镜头摄影测量数据处理装置
CN115205952B (zh) * 2022-09-16 2022-11-25 深圳市企鹅网络科技有限公司 一种基于深度学习的线上学习图像采集方法及系统

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060285724A1 (en) * 2005-06-20 2006-12-21 Ying-Li Tian Salient motion detection system, method and program product therefor
CN101102511A (zh) * 2007-07-26 2008-01-09 上海交通大学 基于宏块级和像素级运动估计的视频差错掩盖方法
CN101188772A (zh) * 2006-11-17 2008-05-28 中兴通讯股份有限公司 一种视频解码的时域错误隐蔽方法
CN101193313A (zh) * 2006-11-20 2008-06-04 中兴通讯股份有限公司 一种视频解码的时域错误隐蔽方法
CN106023192A (zh) * 2016-05-17 2016-10-12 成都通甲优博科技有限责任公司 一种图像采集平台的时间基准实时标定方法及系统
CN106257911A (zh) * 2016-05-20 2016-12-28 上海九鹰电子科技有限公司 用于视频图像的图像稳定方法和装置
CN106534692A (zh) * 2016-11-24 2017-03-22 腾讯科技(深圳)有限公司 一种视频稳像方法及装置
CN107231526A (zh) * 2017-06-09 2017-10-03 联想(北京)有限公司 图像处理方法以及电子设备
CN107241544A (zh) * 2016-03-28 2017-10-10 展讯通信(天津)有限公司 视频稳像方法、装置及摄像终端
CN107682705A (zh) * 2017-09-26 2018-02-09 杭州电子科技大学 基于mv‑hevc框架的立体视频b帧错误隐藏方法
CN110337668A (zh) * 2018-04-27 2019-10-15 深圳市大疆创新科技有限公司 图像增稳方法和装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107274342A (zh) * 2017-05-22 2017-10-20 纵目科技(上海)股份有限公司 一种车底盲区填充方法及系统、存储介质、终端设备
CN108198248A (zh) * 2018-01-18 2018-06-22 维森软件技术(上海)有限公司 一种车辆底部图像3d显示方法
CN109656260A (zh) * 2018-12-03 2019-04-19 北京采立播科技有限公司 一种无人机地理信息数据采集系统


Also Published As

Publication number Publication date
CN112514363A (zh) 2021-03-16

Similar Documents

Publication Publication Date Title
US9600859B2 (en) Image processing device, image processing method, and information processing device
US10334151B2 (en) Phase detection autofocus using subaperture images
US9197866B2 (en) Method for monitoring a traffic stream and a traffic monitoring device
WO2022036980A1 (zh) 位姿确定方法、装置、电子设备、存储介质及程序
US20130271579A1 (en) Mobile Stereo Device: Stereo Imaging, Measurement and 3D Scene Reconstruction with Mobile Devices such as Tablet Computers and Smart Phones
US20150042800A1 (en) Apparatus and method for providing avm image
US20110249117A1 (en) Imaging device, distance measuring method, and non-transitory computer-readable recording medium storing a program
CN110022444B (zh) 无人飞行机的全景拍照方法与使用其的无人飞行机
WO2017020150A1 (zh) 一种图像处理方法、装置及摄像机
TWI400940B (zh) 遠端控制軌道式攝影裝置的手持式裝置及方法
US20170374354A1 (en) Apparatus and method for focal length adjustment and depth map determination
KR20210104684A (ko) 측량 및 매핑 시스템, 측량 및 매핑 방법, 장치 및 기기
EP3529978B1 (en) An image synthesis system
US20160249038A1 (en) Method and apparatus for capturing 360 degree viewing images using spherical camera and mobile phone
WO2019061064A1 (zh) 图像处理方法和设备
WO2019019819A1 (zh) 一种用于处理任务区域的任务的移动电子设备以及方法
WO2021119982A1 (zh) 图像传输系统及方法、控制装置、可移动平台
KR101096157B1 (ko) 듀얼 카메라를 이용한 실시간 감시장치
CN111247389B (zh) 关于拍摄设备的数据处理方法、装置及图像处理设备
WO2021168804A1 (zh) 图像处理方法、图像处理装置和图像处理系统
JP2014063411A (ja) 遠隔制御システム、制御方法、及び、プログラム
US20200007794A1 (en) Image transmission method, apparatus, and device
CN112204946A (zh) 数据处理方法、装置、可移动平台及计算机可读存储介质
WO2020024182A1 (zh) 一种参数处理方法、装置及摄像设备、飞行器
CN114199235B (zh) 一种基于扇面深度相机的定位系统及定位方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19956316

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19956316

Country of ref document: EP

Kind code of ref document: A1