WO2020057609A1 - Image transmission method and apparatus, image transmitting end, and aircraft image transmission system - Google Patents

Image transmission method and apparatus, image transmitting end, and aircraft image transmission system

Info

Publication number
WO2020057609A1
WO2020057609A1 (PCT/CN2019/106726; CN2019106726W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
preset
module
region
threshold
Prior art date
Application number
PCT/CN2019/106726
Other languages
English (en)
French (fr)
Inventor
李昭早
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Publication of WO2020057609A1
Priority to US17/206,466 (US11523085B2)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • H04N7/0122: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal the input and the output signals having different aspect ratios
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440218: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0125: Conversion of standards, one of the standards being a high definition standard
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127: Conversion of standards by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661: Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • Embodiments of the present invention relate to the technical field of aircraft, and in particular, to an image transmission method, an image transmission device, an image transmitting end, and an aircraft image transmission system.
  • drones and other aircraft have received more and more attention.
  • As a popular shooting platform, a drone's high mobility, fast response, and flexible positioning allow it to obtain many shooting angles that ordinary photography cannot achieve, so drones are increasingly applied to aerial surveying and mapping.
  • typically, the aerial surveying and mapping parameters are set before the drone takes off; after takeoff, images are obtained through the drone's shooting device and stored in the drone's storage space, where they form files such as photos or videos.
  • after the flight, the user views the images obtained by the shooting device from the drone's storage space. In this way, it is impossible to determine in real time whether an image acquired during flight is overexposed, too dark, or has an unsuitable field of view. If, upon viewing the images from the drone's storage space, the effect does not meet expectations, the drone's aerial surveying and mapping parameters must be reset and the drone flown again, which is neither convenient nor economical.
  • an embodiment of the present invention provides an image transmission method, an image transmission device, an image transmitting end, and an aircraft image transmission system capable of transmitting images in real time.
  • an embodiment of the present invention provides an image transmission method, where the method includes:
  • before the encoding the second image to obtain an encoded image, the method further includes:
  • the encoding the second image to obtain an encoded image includes:
  • the preset condition is that an image gray value of a pixel is less than a preset gray threshold.
  • the method further includes:
  • the discarding the second images whose number is not within a preset range includes:
  • the first preset number threshold is determined by the size of the second image and the first preset dead point tolerance number.
  • the discarding the second images whose number is not within a preset range further includes:
  • discarding the second image as a symmetric black border image when the preset area is a partial area of the second image and the number is greater than a second preset number threshold;
  • the partial region includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the sizes of the first region and the second region and by a second preset dead point tolerance number.
  • before the converting the original image into a first image, the method further includes:
  • the converting the original image into a first image includes:
  • the method further includes: storing the original image.
  • the audio and video interface includes one or more of the following interfaces: HDMI, VGA, DVI, and DP.
  • the image receiving end interface includes one or more of the following interfaces: BT1120, BT656, and BT709.
  • an embodiment of the present invention provides an image transmission apparatus, where the apparatus includes:
  • a first conversion module configured to convert the original image into a first image, where the first image is an image conforming to an audio-video interface format
  • a second conversion module configured to convert the first image into a second image, where the second image is an image conforming to an interface format of an image receiving end;
  • An encoding module configured to encode the second image to obtain an encoded image
  • a transmission module configured to transmit the encoded image to the image receiving end.
  • the device further includes:
  • a statistics module configured to count the number of pixels in a preset area of the second image that meets a preset condition
  • a detection module for detecting whether the number is within a preset range
  • the encoding module is specifically configured to:
  • the preset condition is that an image gray value of a pixel is less than a preset gray threshold.
  • the device further includes:
  • the discarding module is configured to discard the second images whose number is not within a preset range.
  • the discarding module is specifically configured to:
  • the first preset number threshold is determined by the size of the second image and the first preset dead point tolerance number.
  • the discarding module is specifically configured to:
  • discarding the second image as a symmetric black border image when the preset area is a partial area of the second image and the number is greater than the second preset number threshold.
  • the partial region includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the sizes of the first region and the second region and by the second preset dead point tolerance number.
  • the device further includes:
  • a compression module configured to compress the original image according to a preset ratio to obtain a compressed image
  • the first conversion module is specifically configured to:
  • the apparatus further includes: a storage module, configured to store the original image in a storage space.
  • an image sending end including:
  • at least one processor; and
  • a memory connected in communication with the at least one processor; wherein,
  • the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the image transmission method as described above.
  • an embodiment of the present invention provides a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • the computer program includes program instructions. When the instructions are executed by the computer, the computer is caused to execute the image transmission method as described above.
  • an embodiment of the present invention further provides a non-volatile computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to cause a computer to execute the image transmission method as described above.
  • an embodiment of the present invention further provides an aircraft image transmission system, including: an image transmitting end and an image receiving end;
  • the image sending end and the image receiving end are communicatively connected;
  • the image transmitting end is an image transmitting end as described above.
  • the image transmission method provided in the embodiment of the present invention first converts the obtained original image into a first image conforming to the audio-video interface format, then converts the first image into a second image conforming to the image-receiving-end interface format, encodes the second image, and finally transmits the encoded image to the image receiving end, thereby achieving real-time transmission of images, so that the user can see the condition of the acquired images in real time.
  • the image transmission method provided in the embodiment of the present invention detects whether the number of pixels in a preset area of the second image that meets the preset condition is within a preset range, so as to discard a second image that is a full black screen image or a symmetric black border image, thereby optimizing the visual effect of the images received by the image receiving end.
  • FIG. 1 is a schematic diagram of an application environment of an image transmission method according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of an image transmission method according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of another image transmission method according to an embodiment of the present invention.
  • FIG. 6 (a) is a schematic diagram of a full black screen image provided by an embodiment of the present invention.
  • FIG. 6 (b) is a schematic diagram of a symmetric black border image provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of an image transmission device according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a hardware structure of an image transmitting end according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of an aircraft image transmission system according to an embodiment of the present invention.
  • FIG. 1 is an application environment of an image transmission method according to an embodiment of the present invention.
  • the application environment includes an unmanned aerial vehicle 100, an image receiving end 200, and a user (not shown).
  • the unmanned aerial vehicle 100 and the image receiving terminal 200 are communicatively connected to perform information interaction.
  • the user can know the situation of the images collected by the unmanned aerial vehicle 100 through the image receiving end 200.
  • the unmanned aerial vehicle 100 may be any type of power-driven flying vehicle or other movable equipment, including but not limited to a multi-axis rotary-wing unmanned aerial vehicle (such as a four-axis rotary-wing unmanned aerial vehicle), a fixed-wing aircraft, or a helicopter.
  • a four-axis rotary wing unmanned aerial vehicle is taken as an example for description.
  • the unmanned aerial vehicle 100 may have a corresponding volume or power according to the needs of the actual situation, so as to provide sufficient load capacity, flight speed, flight range, and the like.
  • the unmanned aerial vehicle 100 is provided with at least one power system for providing flight power and a flight control system for controlling the flight of the unmanned aerial vehicle 100.
  • the flight control system is communicatively connected with the power system.
  • the power system may include an electronic governor (referred to as an ESC for short), one or more propellers, and one or more electric motors corresponding to the one or more propellers.
  • the motor is connected between the electronic governor and the propeller, and the motor and the propeller are arranged on the arm of the corresponding unmanned aerial vehicle 100.
  • the electronic governor is used to receive the driving signal generated by the flight control system, and provides a driving current to the motor according to the driving signal to control the speed of the motor.
  • the motor is used to drive the propeller to rotate, so as to provide power for the flight of the unmanned aerial vehicle 100, and the power enables the unmanned aerial vehicle 100 to realize one or more degrees of freedom of movement.
  • the unmanned aerial vehicle 100 may rotate about one or more rotation axes.
  • the rotation axis may include a roll axis, a pan axis, and a pitch axis.
  • the motor can be a DC motor or an AC motor.
  • the motor may be a brushless motor or a brushed motor.
  • the flight control system may include a flight controller and a sensing system.
  • the sensing system is used to measure the attitude information of the unmanned aerial vehicle 100, that is, the position information and state information of the unmanned aerial vehicle 100 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system may include, for example, at least one of a gyroscope, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be a Global Positioning System (Global Positioning System, GPS).
  • the flight controller is used to control the flight of the unmanned aerial vehicle 100.
  • the flight controller may control the flight of the unmanned aerial vehicle 100 according to the attitude information measured by the sensing system. It can be understood that the flight controller can control the flight of the unmanned aerial vehicle 100 according to a pre-programmed program instruction, and can also control the flight of the unmanned aerial vehicle 100 by responding to one or more control instructions from other devices.
  • one or more functional modules may be added to the unmanned aerial vehicle 100, so that the unmanned aerial vehicle 100 can implement more functions, such as aerial photography.
  • the unmanned aerial vehicle 100 is provided with at least one image acquisition device for acquiring images, so as to acquire original images through the image acquisition device.
  • the unmanned aerial vehicle 100 may further provide a fixing bracket for fixedly installing the image acquisition device, so that the user can replace the image acquisition device installed on the unmanned aerial vehicle 100 according to his own needs.
  • the unmanned aerial vehicle 100 may further include a storage space for storing the original image, so that the original image can be recalled later when needed.
  • the storage space may be built into or external to the UAV 100.
  • the UAV 100 is provided with an external SD card interface, and a memory device such as an SD card can be inserted into the interface to store the acquired original image.
  • several consecutive frames of original images form a video.
  • a video formed of several frames of original images may also be stored in the internal or external storage space of the UAV 100.
  • the UAV 100 also has at least one image processing module and image transmission module.
  • the image processing module is used to process the original images collected by the image acquisition device and send the processed images to the image receiving end 200 through the image transmission module, so as to achieve real-time transmission of images.
  • the image acquisition device, the image processing module, and the image transmission module may be integrated into a component of the unmanned aerial vehicle 100, for example, integrated into the camera component of the unmanned aerial vehicle 100 as the image transmitting end, in order to transmit the collected images to the image receiving end 200 in real time.
  • the image receiving end 200 may be any type of user interaction device.
  • the image receiving end 200 may be equipped with one or more different user interaction devices to collect user instructions or display or feedback information to the user.
  • the image receiving terminal 200 may be equipped with a touch display screen, through which the touch instruction of the user is received, and information is displayed to the user through the touch display screen, such as displaying images.
  • the image receiving end 200 may be a smart terminal device, such as a mobile phone, a tablet, a personal computer, a wearable device, and the like.
  • a software application (APP) matching the unmanned aerial vehicle 100 may be installed on the image receiving terminal 200. The user can use the software application to display the received image sent by the drone 100 on the touch display screen.
  • the image receiving end 200 may also be a dedicated control device supporting the unmanned aerial vehicle 100, for example, a remote control of the unmanned aerial vehicle 100, which may receive images from the unmanned aerial vehicle 100 and display them on an internal or externally connected display.
  • the names of the components of the unmanned aerial vehicle 100 are for identification purposes only, and should not be construed as limiting the embodiments of the present invention.
  • the image transmission method provided by the embodiment of the present invention can be further extended to other suitable application environments, and is not limited to the application environment shown in FIG. 1. In the actual application process, the application environment may also include more image receiving ends.
  • FIG. 3 is a schematic flowchart of an image transmission method according to an embodiment of the present invention.
  • the image transmission method according to the embodiment of the present invention may be implemented in cooperation with various components in the unmanned aerial vehicle, which is not limited herein.
  • the image transmission method includes:
  • an original image may be collected by an image acquisition device.
  • the image acquisition device may be an image acquisition device such as a camera or a video camera.
  • a camera with a larger image sensor, such as a full-frame or medium-format camera, may be used to obtain a high-quality original image.
  • the original image is converted into a first image to facilitate the output of the original image.
  • the obtained original image is converted into an image conforming to the audio-video interface format for output to an image processing module for subsequent processing.
  • the audio and video interface includes, but is not limited to, one or more of the following interfaces: High-Definition Multimedia Interface (HDMI), Video Graphics Array (VGA), Digital Visual Interface (DVI), and DisplayPort (DP).
  • the first image is an image conforming to the audio-video interface format
  • after the original image is converted into the first image, it is output through audio-video interfaces such as HDMI, VGA, DVI, or DP.
  • similarly, a video can also be converted into a video conforming to the audio-video interface format and output through audio-video interfaces such as HDMI, VGA, DVI, or DP, so as to achieve real-time video transmission.
  • the image receiving end usually cannot directly receive the image output by audio-video interfaces such as HDMI, VGA, DVI, or DP; therefore, to achieve real-time transmission of the image, the first image needs to be converted into the second image so that the image receiving end can receive the processed image.
  • the image processing module of the unmanned aerial vehicle is used to process the first image output through audio and video interfaces such as HDMI, VGA, DVI, and DP.
  • the image processing module may include a conversion chip for converting the first image into an image that conforms to the interface format of the image receiving end, that is, the second image.
  • the conversion chip may be HiSilicon Hi3519 chip and the like.
  • the image receiving end interface includes, but is not limited to, one or more of the following interfaces: BT1120, BT656, and BT709.
  • the image processing module may further include an encoder for encoding the second image.
  • the functions of converting the first image into the second image and encoding the second image to obtain the encoded image may also be implemented on the same chip; that is, the conversion chip and the encoder may be integrated on the same chip, such as the Hi3519 chip.
  • the encoding standards can be DivX, Xvid, AVC, H.263, H.264, H.265, Windows Media Video, and so on.
  • the encoded image can be transmitted to the image receiving end through the image transmission module, so as to realize the real-time transmission of the image.
  • the image receiving end can decode the received image through a decoder and display the decoded image on its display screen, so that the user can know the condition of the collected images in real time. In this way, during the flight of the UAV, if the acquired original image is overexposed, too dark, or has an unsuitable field of view, the settings can be adjusted in time so that the acquired original images achieve the expected effect.
  • the acquired original image is first converted into a first image conforming to the audio-video interface format, then the first image is converted into a second image conforming to the image-receiving-end interface format, the second image is encoded, and finally the encoded image is transmitted to the image receiving end, thereby achieving real-time transmission of images, so that the user can see the condition of the acquired images in real time.
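  • The four-step flow summarized above can be sketched in code. This is only an illustrative outline: the function names and the dictionary-based "frames" are placeholders invented for the sketch (the patent names no APIs), and real conversion and encoding would happen in hardware such as the conversion chip and encoder discussed below.

```python
# Illustrative sketch of the flow: original image -> first image (audio-video
# interface format) -> second image (receiver interface format) -> encode -> transmit.
# All names here are hypothetical placeholders, not APIs from the patent.

def to_av_format(original):
    """Convert the original image into the first image (e.g. HDMI format)."""
    return {"format": "HDMI", "pixels": original}

def to_receiver_format(first_image):
    """Convert the first image into the second image (e.g. BT1120 format)."""
    return {"format": "BT1120", "pixels": first_image["pixels"]}

def encode(second_image):
    """Encode the second image (e.g. H.264); here we only tag the payload."""
    return {"codec": "H.264", "payload": second_image["pixels"]}

def transmit(encoded, link):
    """Hand the encoded image to the image transmission module (a list here)."""
    link.append(encoded)

link = []  # stands in for the wireless link to the image receiving end
original = [[128] * 4 for _ in range(3)]  # tiny stand-in grayscale frame
transmit(encode(to_receiver_format(to_av_format(original))), link)
```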
  • FIG. 4 is a schematic flowchart of another image transmission method according to an embodiment of the present invention.
  • the image transmission method according to the embodiment of the present invention may be executed in cooperation with various components in the unmanned aerial vehicle, which is not limited herein.
  • the image transmission method includes:
  • the obtained original image can be stored in the built-in or external storage space of the UAV.
  • an external SD card interface is provided on the UAV, and a memory device such as an SD card can be inserted into the interface to store the acquired original image.
  • several consecutive frames of original images form a video.
  • a video formed of several frames of original images can also be stored in the built-in or external storage space of the UAV.
  • the resolution of the original image is generally high.
  • for example, the original image can have a resolution of 40 megapixels.
  • Such a high-resolution image is not convenient for real-time transmission. Therefore, after the original image is obtained, it can be compressed according to a preset ratio to facilitate subsequent image transmission.
  • for example, a 40-megapixel original image can be reduced to 1080p (a resolution of 1920×1080) or 720p (a resolution of 1280×720).
  • the preset ratio may be configured in the image acquisition device in advance, or the preset ratio may be adjusted according to needs.
  • converting the original image into a first image may include: converting the compressed image into a first image.
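  • The preset-ratio compression described above can be illustrated with a minimal sketch. The nearest-neighbour downsampling and the `downscale` helper are assumptions made for demonstration only; the actual pipeline would use the camera's ISP or a dedicated chip, and the ratio would be chosen to reach a target such as 1080p or 720p.

```python
def downscale(image, ratio):
    """Nearest-neighbour downscale of a row-major grayscale image by an
    integer preset ratio: keep every ratio-th row and column.
    A software stand-in for the hardware scaler in the real pipeline."""
    return [row[::ratio] for row in image[::ratio]]

# A roughly 40-megapixel frame shrunk toward 1280x720 would need a ratio of
# about 6 in each dimension; here we demonstrate on a tiny 8x8 frame.
frame = [[x + 8 * y for x in range(8)] for y in range(8)]
small = downscale(frame, 2)  # 8x8 -> 4x4
```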
  • the preset area of the second image may be the entire area of the second image, or may be a partial area of the second image.
  • the preset condition is that the image gray value of the pixel is less than a preset gray level threshold.
  • the image acquisition device is provided with a mechanical shutter; each time an image is captured, the mechanical shutter is closed once.
  • during continuous shooting, the mechanical shutter is closed repeatedly, which causes black screens in the acquired original images.
  • the second image is obtained by converting the first image
  • the first image is obtained from the original image.
  • a black screen may also appear in the second image, resulting in black screen images among the images transmitted to the image receiving end, which affects the user's viewing experience.
  • the black screen needs to be processed before the second image is encoded to obtain the encoded image.
  • the black screen mainly includes the following four cases:
  • Case 1: a frame of full black screen image appears in the second images of several consecutive frames;
  • Case 2: a frame of symmetric black border image appears in the second images of several consecutive frames, and the frame following the symmetric black border image is a full black screen image;
  • Case 3: a frame of symmetric black border image appears in the second images of several consecutive frames, and the frame preceding the symmetric black border image is a full black screen image;
  • Case 4: a frame of symmetric black border image appears in the second images of several consecutive frames, the frame following it is a full black screen image, and the frame following that full black screen image is another symmetric black border image.
  • the closing speed of the mechanical shutter is very fast: the entire shutter action is generally completed within about 50 ms, and the closing and opening of the mechanical shutter itself takes less than 5 ms. Therefore, when the frame rate is 60 fps (frames per second), the first case in FIG. 5, in which a full black screen image appears, occurs most often when the mechanical shutter is closed; the second and third cases in FIG. 5 occur with very low probability, and the fourth case hardly ever occurs. In addition, the probability that two consecutive frames are both symmetrical black border images is essentially zero in practice. When the frame rate is 30 fps, the second and third cases rarely occur.
  • the second image of each frame can be detected to determine whether there is a full black screen image or a symmetrical black border image to prevent the full black screen image or the symmetrical black border image from being transmitted to the image receiving end, which affects the viewing angle effect.
  • whether the second image is a full black screen image or a symmetrical black border image may be determined from the detected image gray values of the pixels of the second image; when any of the above four cases occurs, the full black screen image or symmetrical black border image is discarded to avoid the black screen.
  • whether the second image is a full black screen image or a symmetrical black border image may be determined by counting the number of pixels in a preset area of the second image that meets a preset condition.
  • the preset condition is that the image gray value of the pixel is less than a preset gray threshold.
  • full black screen images and symmetrical black border images can be judged using different judgment strategies. Because a full black screen image concerns the entire area of the second image, statistics and judgment can be made based on the image gray values of the pixels of the entire area of the second image; because a symmetrical black border image concerns a partial region of the second image, statistics and judgment can be made based on the image gray values of the pixels of that partial region.
  • the full black screen image is a second image in which the number is greater than a first preset number threshold when the preset area is an entire area of the second image.
  • the first preset number threshold is determined by the size of the second image and the first preset dead pixel tolerance number.
  • the symmetric black border image is a second image in which the number is greater than a second preset number threshold when the preset area is a partial area of the second image.
  • the partial region includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the size of the first region and the second region and the second preset dead pixel tolerance number.
  • T is a preset gray-scale threshold.
  • the preset gray threshold is used to define the gray level of the pixel.
  • the preset grayscale threshold can be pre-configured in the image processing module, and the preset grayscale threshold can be adjusted as needed.
  • the preset gray-scale threshold T may be 25.
  • M1 is a first preset number threshold, and the first preset number threshold is used to determine whether the second image is a full black screen image. That is, when m > M1, it indicates that the second image is a full black screen image; when m ≤ M1, it indicates that the second image is not a full black screen image.
  • B1 is the first preset dead pixel tolerance number.
  • the first preset dead pixel tolerance number is used to define the maximum tolerable number of "dead pixels".
  • the first preset dead pixel tolerance number can be pre-configured in the image processing module and adjusted as needed. For example, the first preset dead pixel tolerance number may be eight.
  • the image gray value generally ranges from 0 to 255, where white is 255 and black is 0.
  • when the counted number m of pixels whose image gray value is less than the preset gray threshold satisfies m > M1, it indicates that the second image is a full black screen image.
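The full black screen test described above can be sketched as follows. This is a hedged illustration, not the patented implementation: the function name and the list-of-rows frame representation are assumptions, the default values T = 25 and B1 = 8 are taken from the description, and the threshold M1 is computed as the whole-image pixel count minus the tolerated dead pixels, consistent with the definitions of M1 and B1 above.

```python
def is_full_black_screen(gray, T=25, B1=8):
    """Return True if the frame counts as a full black screen image.

    gray: the second image as a list of rows of 0-255 gray values.
    T:  preset gray threshold (pixels below it are treated as black).
    B1: first preset dead-pixel tolerance number.
    """
    h, w = len(gray), len(gray[0])
    # m: number of pixels over the entire area whose gray value is below T.
    m = sum(1 for row in gray for y in row if y < T)
    # First preset number threshold: all pixels except the tolerated ones.
    M1 = w * h - B1
    return m > M1
```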
  • T is a preset gray-scale threshold.
  • M2 is a second preset number threshold, and the second preset number threshold is used to determine whether the second image is a symmetrical black border image. That is, when m > M2, it indicates that the second image is a symmetrical black border image; when m ≤ M2, it indicates that the second image is not a symmetrical black border image.
  • B2 is the second preset dead pixel tolerance number.
  • the second preset dead pixel tolerance number is similar to the first preset dead pixel tolerance number, and therefore, details are not described herein.
  • the second preset dead pixel tolerance number may also be eight.
  • h1 is the height of the first region or the second region. This height can be pre-configured in the image processing module and adjusted as needed. In practice, h1 should not be too small: if the first and second regions are very small, then even when the gray values of all their pixels are below the preset gray threshold (that is, the image color in both regions is close to black), the overall impact on the second image is minor and does not affect the user's visual experience, so the image should not be discarded in that case. Therefore h1 should not be too small; for example, h1 may be greater than 10.
  • the image gray value generally ranges from 0 to 255, where white is 255 and black is 0.
  • when the counted number m of pixels in the partial region whose image gray value is less than the preset gray threshold satisfies m > M2, it indicates that the second image is a symmetrical black border image.
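Analogously, the symmetric black border test over the two border strips can be sketched as below. The strip height h1 (greater than 10, per the text) and tolerance B2 = 8 follow the description, while the function name and list-of-rows frame layout are assumptions of this sketch; the threshold M2 is computed as the combined strip area minus the tolerated dead pixels, consistent with the definitions above.

```python
def is_symmetric_black_border(gray, h1=12, T=25, B2=8):
    """Return True if the frame counts as a symmetrical black border image.

    gray: the second image as a list of rows of 0-255 gray values.
    h1:   height of each border region (the text suggests h1 > 10).
    T:    preset gray threshold.
    B2:   second preset dead-pixel tolerance number.
    """
    h, w = len(gray), len(gray[0])
    # The partial region: rows 0..h1-1 (top strip) and h-h1..h-1 (bottom
    # strip), symmetrically distributed on the upper and lower sides.
    rows = list(range(h1)) + list(range(h - h1, h))
    m = sum(1 for i in rows for y in gray[i] if y < T)
    # Second preset number threshold: combined strip area minus tolerance.
    M2 = 2 * w * h1 - B2
    return m > M2
```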
  • for the case where the second image is not a full black screen image, the preset range is less than or equal to the first preset number threshold; that is, when m ≤ M1, the second image is not a full black screen image. For the case where the second image is not a symmetrical black border image, the preset range is less than or equal to the second preset number threshold; that is, when m ≤ M2, the second image is not a symmetrical black border image.
  • the encoding the second image to obtain an encoded image includes: encoding the second images within a preset range to obtain an encoded image.
  • discarding the second images whose number is not within a preset range includes: discarding a full black screen image (the n-th frame image in the first, second, third, and fourth cases in FIG. 5); and, when an adjacent frame of a symmetrical black border image is a full black screen image, discarding the symmetrical black border image (the (n-1)-th frame image in the second case, the (n+1)-th frame image in the third case, and the (n-1)-th and (n+1)-th frame images in the fourth case in FIG. 5).
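The discard rule for the four cases can be condensed into a small helper: full black screen frames are always dropped, while a symmetric black border frame is dropped only when one of its adjacent frames is a full black screen. This sketch assumes each frame has already been classified; the labels 'black', 'border', and 'ok' are hypothetical names introduced here.

```python
def frames_to_discard(flags):
    """Return the indices of frames to drop.

    flags: per-frame classification in display order, each one of
    'black' (full black screen), 'border' (symmetric black border),
    or 'ok'.
    """
    drop = set()
    for n, f in enumerate(flags):
        if f == 'black':
            # Cases 1-4: the full black screen frame itself is dropped.
            drop.add(n)
        elif f == 'border':
            # Cases 2-4: drop the border frame only if an adjacent
            # frame is a full black screen.
            neighbors = [flags[k] for k in (n - 1, n + 1)
                         if 0 <= k < len(flags)]
            if 'black' in neighbors:
                drop.add(n)
    return sorted(drop)
```

For example, in the fourth case (border, black, border), all three frames are dropped, while an isolated border frame with no adjacent black frame is kept.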
  • the image receiving end can be prevented from receiving images that affect visual effects, so as to improve the user experience.
  • step 402 and/or step 403 may not be mandatory steps in different embodiments.
  • steps 401-410 may have different execution orders. For example, step 410 is performed first, and then step 409 is performed.
  • the acquired original image is first converted into a first image conforming to the audio-video interface format, the first image is then converted into a second image conforming to the image receiving end interface format, the second image is encoded, and finally the encoded image is transmitted to the image receiving end, thereby realizing real-time transmission of the image so that the user can see the acquired images in real time.
  • in addition, it is also detected whether the number of pixels in a preset area of the second image that meet the preset condition is within a preset range, so as to discard second images that are full black screen images or symmetrical black border images, thereby optimizing the visual effect of the images received by the image receiving end.
  • FIG. 7 is a schematic diagram of an image transmission device according to an embodiment of the present invention.
  • the image transmission device 70 may be configured in the unmanned aerial vehicle.
  • the image transmission device 70 includes: an acquisition module 701, a storage module 702, a compression module 703, a first conversion module 704, a second conversion module 705, a statistics module 706, a detection module 707, an encoding module 708, a transmission module 709, and a discarding module 710.
  • the obtaining module 701 is configured to obtain an original image.
  • the acquisition module 701 may be connected to an image acquisition device. During the flight of the unmanned aerial vehicle, the acquisition module 701 may acquire the original image through the image acquisition device.
  • the image acquisition device may be, for example, a camera or a video camera.
  • a camera with a larger image sensor, such as a full-frame camera or a medium-format camera, may be used to obtain a high-quality original image.
  • the storage module 702 is configured to store the original image in a storage space.
  • the storage module 702 may store the original image obtained by the acquisition module 701 in a built-in or external storage space of the UAV.
  • an external SD card interface is provided on the UAV, and a memory device such as an SD card can be inserted into the interface to store the acquired original image.
  • several frames of consecutive original images form a video or recording.
  • the storage module 702 may also store videos or recordings formed from several frames of consecutive original images in the built-in or external storage space of the UAV.
  • the compression module 703 is configured to perform compression processing on the original image according to a preset ratio to obtain a compressed image.
  • the resolution of the original image is generally high.
  • the original image may have a resolution of 40 megapixels.
  • after the acquisition module 701 obtains the original image, the compression module 703 may compress the original image according to a preset ratio to facilitate subsequent image transmission.
  • the first conversion module 704 is configured to convert the original image into a first image, where the first image is an image conforming to an audio-video interface format.
  • the first conversion module 704 may convert the compressed image into a first image, so as to facilitate the output of the compressed image. For example, the first conversion module 704 converts the compressed image into an image conforming to the audio-video interface format for output to the second conversion module 705 for subsequent processing.
  • the audio and video interface includes, but is not limited to, one or more of the following interfaces: HDMI, VGA, DVI, DP.
  • the compressed image is converted into the first image by the first conversion module 704 and then output through the audio-video interfaces such as HDMI, VGA, DVI, and DP.
  • the first conversion module 704 can also convert the compressed video into a video conforming to the audio-video interface format and output it through an audio-video interface such as HDMI, VGA, DVI, or DP for real-time video transmission.
  • the second conversion module 705 is configured to convert the first image into a second image, where the second image is an image conforming to an interface format of an image receiving end.
  • because the image receiving end usually cannot directly receive images output by audio-video interfaces such as HDMI, VGA, DVI, and DP, in order to realize real-time transmission of images, the first image needs to be converted into the second image through the second conversion module 705 so that the image receiving end can receive the processed image.
  • the image receiving end interface includes, but is not limited to, one or more of the following interfaces: BT1120, BT656, and BT709.
  • an image conforming to the HDMI format may be converted into an image conforming to the BT1120 format by the second conversion module 705.
  • the statistics module 706 is configured to count the number of pixels in a preset area of the second image that meets a preset condition.
  • the preset area of the second image may be the entire area of the second image, or may be a partial area of the second image.
  • the preset condition is that the image gray value of the pixel is less than a preset gray level threshold.
  • the image acquisition device is provided with a mechanical shutter.
  • the mechanical shutter is closed once.
  • the mechanical shutter will be constantly closed, which will cause a black screen in the acquired original image.
  • the second image is obtained by converting the first image
  • the first image is obtained from the original image.
  • a black screen may also appear in the second image, resulting in black screen images being transmitted to the image receiving end, which affects the user's viewing experience.
  • the black screen needs to be processed before the encoding module 708 encodes the second image to obtain the encoded image.
  • the statistics module 706 can be used to count the number of pixels in the preset area of the second image that meet the preset condition, so as to determine, based on that number, whether there is a full black screen image or a symmetrical black border image, and thereby prevent such images from being transmitted to the image receiving end and affecting the viewing experience.
  • the preset condition is that the image gray value of the pixel is less than a preset gray threshold.
  • the statistics module 706 can use different statistical strategies for full black screen images and symmetrical black border images. Since a full black screen image concerns the entire area of the second image, the statistics module 706 may perform statistics based on the image gray values of the pixels of the entire area of the second image; since a symmetrical black border image concerns a partial region of the second image, the statistics module 706 may perform statistics based on the image gray values of the pixels of that partial region.
  • T is a preset gray-scale threshold.
  • the preset gray threshold is used to define the gray level of the pixel.
  • the preset gray level threshold can be pre-configured in the image processing module, and the preset gray level threshold can be adjusted as needed.
  • the preset gray-scale threshold T may be 25.
  • the image gray value generally ranges from 0 to 255, where white is 255 and black is 0.
  • the smaller the image gray value, the darker the color, that is, the closer it is to black. Therefore, when the image gray value of a pixel is less than the preset gray threshold, the pixel can be considered close to black, and the count of such pixels is increased by one; that is, when Y(i, j) < T, m = m + 1.
  • the size of the first region and the second region is both w×h1, where the width of the first region and the second region is w, the height of the first region and the second region is h1, and the first region and the second region are symmetrically distributed on the upper and lower sides of the second image;
  • the image gray value of the pixel at position (i, j) of the second image is Y(i, j), where i ranges from 0 to (w-1), and j ranges from 0 to (h1-1) and from (h-h1) to (h-1).
  • Counting the number of pixels whose image gray value is less than a preset gray threshold in a partial region of the second image is as follows:
  • T is a preset gray-scale threshold.
  • h1 is the height of the first region or the second region. This height can be pre-configured in the image processing module and can be adjusted as needed.
  • h1 should not be too small: if the first and second regions are very small, then even when the gray values of all their pixels are below the preset gray threshold (that is, the image color in both regions is close to black), the overall impact on the second image is minor and does not affect the user's visual experience, so the image should not be discarded in that case. Therefore h1 should not be too small; for example, h1 may be greater than 10.
  • the image gray value generally ranges from 0 to 255, where white is 255 and black is 0.
  • the smaller the image gray value, the darker the color, that is, the closer it is to black. Therefore, when the image gray value of a pixel is less than the preset gray threshold, the pixel can be considered close to black, and the count of such pixels is increased by one; that is, when Y(i, j) < T, m = m + 1.
  • the detection module 707 is configured to detect whether the number is within a preset range.
  • the detection module 707 can detect whether the number is within a preset range, so as to determine whether the second image is a full black screen image or a symmetrical black border image, thereby improving the visual effect of the user.
  • the preset range is less than or equal to the first preset number threshold, that is, when m ⁇ M1, the second image is not a full black screen image, where M1 is the first Preset number threshold; for the case where the second image is not a symmetrical black border image, the preset range is less than or equal to the second preset number threshold, that is, when m ⁇ M2, the second image is not a symmetrical black border image , Where M2 is the second preset number threshold.
  • M1 is a first preset number threshold, and the first preset number threshold is used to determine whether the second image is a full black screen image. That is, when m > M1, it indicates that the second image is a full black screen image; when m ≤ M1, it indicates that the second image is not a full black screen image.
  • B1 is the first preset dead pixel tolerance number.
  • M2 is a second preset number threshold, and the second preset number threshold is used to determine whether the second image is a symmetrical black border image. That is, when m > M2, it indicates that the second image is a symmetrical black border image; when m ≤ M2, it indicates that the second image is not a symmetrical black border image.
  • B2 is the second preset dead pixel tolerance number.
  • the encoding module 708 is configured to encode the second image to obtain an encoded image.
  • the image encoded by the encoding module 708 should be the second image whose number is within the preset range; that is, the encoding module 708 is specifically configured to encode the second images whose number is within the preset range to obtain the encoded image.
  • the encoding standards can be DIVX, XVID, AVC, H.263, H.264, H.265, Windows Media Video, and so on.
  • the transmission module 709 is configured to transmit the encoded image to the image receiving end.
  • the image encoded by the encoding module 708 can be transmitted to the image receiving end through the transmission module 709 to implement real-time transmission of the image.
  • the image receiving end can decode the received image through a decoder and display the decoded image on the display screen of the image receiving end, so that the user can know the condition of the collected images in real time. In this way, during the flight of the unmanned aerial vehicle, if the collected original image is overexposed or too dark, or the field of view is unsuitable, adjustments can be made in time so that the effect of the collected original image meets the expected effect.
  • the discarding module 710 is configured to discard the second images whose number is not within a preset range.
  • the discarding module 710 is specifically configured to discard full black screen images (the n-th frame image in the first, second, third, and fourth cases in FIG. 5); and, when an adjacent frame of a symmetrical black border image is a full black screen image, discard the symmetrical black border image (the (n-1)-th frame image in the second case, the (n+1)-th frame image in the third case, and the (n-1)-th and (n+1)-th frame images in the fourth case in FIG. 5).
  • the full black screen image is a second image in which the number is greater than a first preset number threshold when the preset area is the entire area of the second image, and the first preset number threshold is determined by the size of the second image and the first preset dead pixel tolerance number. That is, for a second image of width w and height h, M1 = w × h − B1, where:
  • M1 is a first preset number threshold, and the first preset number threshold is used to define whether the second image is a completely black screen image.
  • B1 is the first preset dead pixel tolerance number.
  • the symmetric black border image is a second image in which the number is greater than a second preset number threshold when the preset area is a partial area of the second image.
  • the partial region includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the size of the first region and the second region and the second preset dead pixel tolerance number. That is, for regions each of size w × h1, M2 = 2 × w × h1 − B2, where:
  • M2 is a second preset number threshold, and the second preset number threshold is used to define whether the second image is a symmetric black-edge image.
  • B2 is the second preset dead pixel tolerance number.
  • by discarding the second images whose number is not within the preset range through the discarding module 710, it is possible to prevent the image receiving end from receiving images that affect visual effects, so as to improve the user experience.
  • the storage module 702 and/or the compression module 703 may not be necessary modules of the image transmission device 70 in different embodiments; that is, in some embodiments, the storage module 702 and the compression module 703 may be omitted.
  • the image transmission device 70 may execute the image transmission method provided by any method embodiment, and has corresponding function modules and beneficial effects for executing the method.
  • for technical details not described in detail in this embodiment, reference may be made to the image transmission method provided in the method embodiment.
  • FIG. 8 is a schematic diagram of a hardware structure of an image sending end according to an embodiment of the present invention.
  • the image sending end may be the above-mentioned unmanned aerial vehicle 100, an unmanned ship, or a camera component of the unmanned aerial vehicle 100, and the like.
  • the image sending end 80 includes:
  • One processor 801 is taken as an example in FIG. 8.
  • the processor 801 and the memory 802 may be connected through a bus or other manners. In FIG. 8, the connection through the bus is taken as an example.
  • the memory 802 is a non-volatile computer-readable storage medium and can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the image transmission method provided by the embodiment of the present invention (for example, the acquisition module 701, storage module 702, compression module 703, first conversion module 704, second conversion module 705, statistics module 706, detection module 707, encoding module 708, transmission module 709, and discarding module 710 shown in FIG. 7).
  • the processor 801 executes various functional applications and data processing on the image sending end by running non-volatile software programs, instructions, and modules stored in the memory 802, that is, implementing the image transmission method provided by the method embodiment.
  • the memory 802 may include a storage program area and a storage data area, where the storage program area may store an operating system and application programs required for at least one function; the storage data area may store data created according to the use of the image sending end, and the like.
  • the memory 802 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory 802 may optionally include a memory remotely set relative to the processor 801, and these remote memories may be connected to the image sending end through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the one or more modules are stored in the memory 802, and when executed by the one or more processors 801, execute the image transmission method provided by the embodiment of the present invention; for example, steps 401 to 410 of the method in FIG. 4 described above are performed, or the functions of the modules 701-710 in FIG. 7 are implemented.
  • the image sending end may further include a communication interface, which is used to implement communication with other devices, such as a server and the like.
  • Other devices included in the image sending end are not limited herein.
  • the image sending end can execute the image transmission method provided by the embodiment of the present invention, and has corresponding function modules and beneficial effects of executing the method.
  • for technical details not described in detail in this embodiment, reference may be made to the image transmission method provided in the embodiment of the present invention.
  • An embodiment of the present invention provides a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • the computer program includes program instructions.
  • when the program instructions are executed by a computer, the computer is caused to execute the image transmission method provided by the embodiment of the present invention; for example, the method steps 401 to 410 in FIG. 4 described above are performed, or the functions of the modules 701-710 in FIG. 7 are implemented.
  • An embodiment of the present invention provides a non-volatile computer-readable storage medium.
  • the computer-readable storage medium stores computer-executable instructions, where the computer-executable instructions are used to cause a computer to execute the image transmission method provided by an embodiment of the present invention; for example, the method steps 401 to 410 in FIG. 4 described above are performed, or the functions of the modules 701-710 in FIG. 7 are implemented.
  • FIG. 9 is a schematic diagram of an aircraft image transmission system according to an embodiment of the present invention.
  • the aircraft image transmission system 90 includes the image sending end 80 and the image receiving end 200 described above.
  • the image sending end 80 and the image receiving end 200 are communicatively connected.
  • the image sending end 80 may process the acquired original image and transmit it to the image receiving end 200 in real time, so as to realize real-time transmission of the image.
  • the image sending end 80 detects whether the number of pixels in the preset area of the second image that meet the preset condition is within the preset range, so as to discard second images that are full black screen images or symmetrical black border images, thereby optimizing the visual effect of the images received by the image receiving end 200.
  • the device embodiments described above are only schematic. The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they can be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • the embodiments can be implemented by means of software plus a general hardware platform, and of course, also by hardware.
  • the program can be stored in a computer-readable storage medium, and when executed, may include the processes of the foregoing method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).

Abstract

Embodiments of the present invention relate to the technical field of aircraft, and disclose an image transmission method, an image transmission device, an image sending end, and an aircraft image transmission system. The method includes: obtaining an original image; converting the original image into a first image, the first image being an image conforming to an audio-video interface format; converting the first image into a second image, the second image being an image conforming to an interface format of an image receiving end; encoding the second image to obtain an encoded image; and transmitting the encoded image to the image receiving end. The image transmission method provided by the embodiments of the present invention realizes real-time transmission of images.

Description

Image transmission method, device, image sending end and aircraft image transmission system
Technical Field
Embodiments of the present invention relate to the technical field of aircraft, and in particular, to an image transmission method, an image transmission device, an image sending end, and an aircraft image transmission system.
Background
With the development of flight technology, aircraft such as unmanned aerial vehicles (UAVs) have received increasing attention. As a popular shooting platform, a UAV, owing to its high maneuverability, fast response, and flexible positioning, can obtain many shooting angles that are impossible for conventional photography, so UAVs are increasingly applied to aerial survey and mapping.
In current applications of UAVs in aerial survey and mapping, the aerial survey parameters are usually set before the UAV takes off. After takeoff, images are acquired by the shooting device of the UAV and stored in the storage space of the UAV to form files such as photos, videos, or recordings. Only after the UAV completes its aerial survey flight mission does the user view the images acquired by the shooting device from the storage space of the UAV. In this way, it is impossible to determine in real time whether the images acquired during flight are overexposed or too dark, or whether the field of view is appropriate. If the images viewed from the storage space of the UAV do not meet the expected effect, the aerial survey parameters of the UAV need to be reset and the flight repeated, which is both inconvenient and wasteful.
Therefore, how to transmit the images acquired by an aircraft such as a UAV to an image receiving end such as a user terminal in real time, so that the user can see the acquired images in real time, is a problem that urgently needs to be solved.
Summary
To solve the above technical problem, embodiments of the present invention provide an image transmission method, an image transmission device, an image sending end, and an aircraft image transmission system capable of transmitting images in real time.
The embodiments of the present invention disclose the following technical solutions:
In a first aspect, an embodiment of the present invention provides an image transmission method, the method including:
obtaining an original image;
converting the original image into a first image, the first image being an image conforming to an audio-video interface format;
converting the first image into a second image, the second image being an image conforming to an interface format of an image receiving end;
encoding the second image to obtain an encoded image; and
transmitting the encoded image to the image receiving end.
Optionally, before encoding the second image to obtain the encoded image, the method further includes:
counting the number of pixels in a preset area of the second image that meet a preset condition; and
detecting whether the number is within a preset range;
where encoding the second image to obtain the encoded image includes:
encoding the second images whose number is within the preset range to obtain the encoded image.
Optionally, the preset condition is that an image gray value of a pixel is less than a preset gray threshold.
Optionally, the method further includes:
discarding the second images whose number is not within the preset range.
Optionally, discarding the second images whose number is not within the preset range includes:
discarding a full black screen image, the full black screen image being a second image in which the number is greater than a first preset number threshold when the preset area is the entire area of the second image;
where the first preset number threshold is determined by the size of the second image and a first preset dead pixel tolerance number.
Optionally, discarding the second images whose number is not within the preset range further includes:
when an adjacent frame of a symmetrical black border image is a full black screen image, discarding the symmetrical black border image, the symmetrical black border image being a second image in which the number is greater than a second preset number threshold when the preset area is a partial area of the second image;
where the partial area includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the size of the first region and the second region and a second preset dead pixel tolerance number.
Optionally, before converting the original image into the first image, the method further includes:
compressing the original image according to a preset ratio to obtain a compressed image;
where converting the original image into the first image includes:
converting the compressed image into the first image.
Optionally, the method further includes: storing the original image.
Optionally, the audio-video interface includes one or more of the following interfaces: HDMI, VGA, DVI, DP.
Optionally, the image receiving end interface includes one or more of the following interfaces: BT1120, BT656, BT709.
In a second aspect, an embodiment of the present invention provides an image transmission device, the device including:
an acquisition module, configured to obtain an original image;
a first conversion module, configured to convert the original image into a first image, the first image being an image conforming to an audio-video interface format;
a second conversion module, configured to convert the first image into a second image, the second image being an image conforming to an interface format of an image receiving end;
an encoding module, configured to encode the second image to obtain an encoded image; and
a transmission module, configured to transmit the encoded image to the image receiving end.
Optionally, the device further includes:
a statistics module, configured to count the number of pixels in a preset area of the second image that meet a preset condition; and
a detection module, configured to detect whether the number is within a preset range;
where the encoding module is specifically configured to:
encode the second images whose number is within the preset range to obtain the encoded image.
Optionally, the preset condition is that an image gray value of a pixel is less than a preset gray threshold.
Optionally, the device further includes:
a discarding module, configured to discard the second images whose number is not within the preset range.
Optionally, the discarding module is specifically configured to:
discard a full black screen image, the full black screen image being a second image in which the number is greater than a first preset number threshold when the preset area is the entire area of the second image;
where the first preset number threshold is determined by the size of the second image and a first preset dead pixel tolerance number.
Optionally, the discarding module is specifically configured to:
when an adjacent frame of a symmetrical black border image is a full black screen image, discard the symmetrical black border image, the symmetrical black border image being a second image in which the number is greater than a second preset number threshold when the preset area is a partial area of the second image;
where the partial area includes a first region and a second region, the first region and the second region are symmetrically distributed, and the second preset number threshold is determined by the size of the first region and the second region and a second preset dead pixel tolerance number.
Optionally, the device further includes:
a compression module, configured to compress the original image according to a preset ratio to obtain a compressed image;
where the first conversion module is specifically configured to:
convert the compressed image into the first image.
Optionally, the device further includes: a storage module, configured to store the original image in a storage space.
第三方面,本发明实施例提供了一种图像发送端,包括:
至少一个处理器;以及,
与所述至少一个处理器通信连接的存储器;其中,
所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行如上所述的图像传输方法。
第四方面,本发明实施例提供了一种计算机程序产品,所述计算机程序产品包括存储在非易失性计算机可读存储介质上的计算机程序,所述计算机程序包括程序指令,当所述程序指令被计算机执行时,使计算机执行如上所述的图像传输方法。
第五方面,本发明实施例还提供了一种非易失性计算机可读存储介质,所述计算机可读存储介质存储有计算机可执行指令,所述计算机可执行指令用于使计算机执行如上所述的图像传输方法。
第六方面,本发明实施例还提供了一种飞行器图传系统,包括:图像发送端以及图像接收端;
其中,所述图像发送端和所述图像接收端通信连接;
所述图像发送端为如上所述的图像发送端。
本发明实施例提供的图像传输方法,首先将所获取的原始图像转换为符合音视频接口格式的第一图像,再将该第一图像转换为符合图像接收端接口格式的第二图像,并对第二图像进行编码,最后将编码后的图像传输给图像接收端,从而实现图像的实时传输,以便用户可以实时看到所获取的图像的情况。
此外,本发明实施例提供的图像传输方法通过检测第二图像的预设区域中的像素点满足预设条件的数量是否在预设范围内,以丢弃第二图像为全黑屏图像或对称黑边图像的图像,从而优化图像接收端接收到的图像的视觉效果。
附图说明
一个或多个实施例通过与之对应的附图中的图片进行示例性说明,这些示例性说明并不构成对实施例的限定,附图中具有相同参考数字标号的元件表示为类似的元件,除非有特别申明,附图中的图不构成比例限制。
图1是本发明实施例提供的图像传输方法的应用环境示意图;
图2是本发明实施例提供的无人飞行器的架构示意图;
图3是本发明实施例提供的一种图像传输方法的流程示意图;
图4是本发明实施例提供的另一种图像传输方法的流程示意图;
图5是本发明实施例提供的4种黑屏情况的示意图;
图6(a)是本发明实施例提供的全黑屏图像的示意图;
图6(b)是本发明实施例提供的对称黑边图像的示意图;
图7是本发明实施例提供的一种图像传输装置示意图;
图8是本发明实施例提供的图像发送端硬件结构示意图;
图9是本发明实施例提供的飞行器图传系统的示意图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
此外,下面所描述的本发明各个实施方式中所涉及到的技术特征只要彼此之间未构成冲突就可以相互组合。
图1为本发明实施例提供的图像传输方法其中一种应用环境。如图1所示,所述应用环境包括无人飞行器100、图像接收端200及用户(图未示)。其中,无人飞行器100与图像接收端200通信连接,以进行信息的交互。用户可以通过该图像接收端200了解无人飞行器100所采集的图像的情况。
无人飞行器100可以是以任何类型的动力驱动的飞行载具或其他可移动设备,包括但不限于多轴旋翼无人飞行器(如四轴旋翼无人飞行器)、固定翼飞行器以及直升机等。在本实施例中以四轴旋翼无人飞行器为例进行陈述。
该无人飞行器100可以根据实际情况的需要,具备相应的体积或者动力,从而提供足够的载重能力、飞行速度、飞行续航里程等。
例如,如图2所示,无人飞行器100至少具备一个用于提供飞行动力的动力系统以及用于对无人飞行器100的飞行进行控制的飞行控制系统。该飞行控制系统与动力系统通信连接。
该动力系统可以包括电子调速器(简称为电调)、一个或多个螺旋桨以及与一个或多个螺旋桨相对应的一个或多个电机。其中,电机连接在电子调速器与螺旋桨之间,电机和螺旋桨设置在对应的无人飞行器100机臂上。
电子调速器用于接收飞行控制系统产生的驱动信号,并根据驱动信号提供驱动电流给电机,以控制电机的转速。电机用于驱动螺旋桨旋转,从而为无人飞行器100的飞行提供动力,该动力使得无人飞行器100能够实现一个或多个自由度的运动。
在一些实施例中,无人飞行器100可以围绕一个或多个旋转轴旋转。例如,上述旋转轴可以包括横滚轴、平移轴和俯仰轴。可以理解的是,电机可以是直流电机,也可以是交流电机。另外,电机可以是无刷电机,也可以是有刷电机。
飞行控制系统可以包括飞行控制器和传感系统。传感系统用于测量无人飞行器100的姿态信息,即无人飞行器100在空间的位置信息和状态信息,例如,三维位置、三维角度、三维速度、三维加速度和三维角速度等。传感系统例如可以包括陀螺仪、电子罗盘、惯性测量单元(Inertial Measurement Unit,IMU)、视觉传感器、全球导航卫星系统和气压计等传感器中的至少一种。例如,全球导航卫星系统可以是全球定位系统(Global Positioning System,GPS)。
飞行控制器用于控制无人飞行器100的飞行,例如,可以根据传感系统测量的姿态信息控制无人飞行器100的飞行。可以理解的是,飞行控制器可以按照预先编好的程序指令对无人飞行器100的飞行进行控制,也可以通过响应来自其它设备的一个或多个控制指令对无人飞行器100的飞行进行控制。
此外,无人飞行器100上还可以添加有一种或者多种功能模块,令无人飞行器100能够实现更多的功能,如进行航拍测绘等。
例如,在一些实施例中,该无人飞行器100至少具备一个用于采集图像的图像采集装置,以通过该图像采集装置获取原始图像。在另一些实施例中,该无人飞行器100还可以提供用于固定安装图像采集设备的固定支架,从而可以使用户根据自身的需要,更换安装在无人飞行器100上的图像采集设备。
在一些实施例中,该无人飞行器100还可以包括存储空间,该存储空间用于存储该原始图像,以便于后续在需要时调用该原始图像。其中,该存储空间可以为无人飞行器100内置或者外部的存储空间。例如,无人飞行器100上设置有外接SD卡接口,SD卡等记忆设备可插入该接口中,以存储所获取的原始图像。并且,若干帧连续的原始图像形成视频或录像,在一些实例中,也可以将由若干帧连续的原始图像所形成的视频或录像存储于无人飞行器100的内置或者外部的存储空间。
此外,无人飞行器100还至少具备一个图像处理模块及图传模块,该图像处理模块用于对图像采集装置所采集的原始图像进行处理,并通过图传模块将处理后的图像发送给图像接收端200,以实现图像的实时传输。
在一些实施例中,图像采集装置、图像处理模块及图传模块均可集成于无人飞行器100的一个组件中,例如,集成于无人飞行器100摄像组件中,该摄像组件可以作为图像发送端,以通过图像发送端实现将所采集的图像实时传输给图像接收端200的功能。
图像接收端200可以是任何类型的用户交互设备。图像接收端200可以装配有一种或者多种不同的用户交互设备,用以采集用户指令或者向用户展示或者反馈信息。
这些交互设备包括但不限于:按键、显示屏、触摸屏、扬声器以及遥控操作杆。例如,图像接收端200可以装配有触控显示屏,通过该触控显示屏接收用户的触摸指令并通过触控显示屏向用户展示信息,如展示图像。
在一些实施例中,图像接收端200可以为智能终端设备,例如,手机、平板、个人计算机、可穿戴设备等等。该图像接收端200上可以安装有与无人飞行器100相匹配的软件应用程序(APP)。用户可以通过该软件应用程序,将接收到的无人飞行器100发送的图像显示于触控显示屏上。
在另一些实施例中,图像接收端200还可以是与无人飞行器100配套的专用控制设备,例如,无人飞行器100的遥控器等,其可以接收来自无人飞行器100的图像并通过内置或者外部连接的显示屏显示。
可以理解的是,上述对于无人飞行器100的各组成部分的命名仅是出于标识的目的,并不应理解为对本发明的实施例的限制。此外,本发明实施例提供的图像传输方法还可以进一步的拓展到其他合适的应用环境中,而不限于图1中所示的应用环境。在实际应用过程中,该应用环境还可以包括更多的图像接收端。
实施例1:
图3为本发明实施例提供的一种图像传输方法的流程示意图。本发明实施例的图像传输方法可由上述无人飞行器中的各组成部分配合执行,在此不予限定。
请参阅图3,所述图像传输方法包括:
301:获取原始图像。
在无人飞行器飞行的过程中,可以通过图像采集装置采集原始图像以获取该原始图像。其中,所述图像采集装置可以为相机、摄像机等图像采集设备。并且,为了提高采集到的原始图像的质量,可以采用图像传感器尺寸较大的相机,如全画幅相机或中画幅相机等,以获取全画幅或者中画幅等图像质量较高的原始图像。
302:将所述原始图像转换为第一图像,所述第一图像为符合音视频接口格式的图像。
在获取得到原始图像后,将该原始图像转换为第一图像,以便于原始图像的输出。例如,将所获取的原始图像转换为符合音视频接口格式的图像以便输出至图像处理模块进行后续的处理。
该音视频接口包括但不限于以下接口中的一种或多种:高清晰度多媒体接口(High Definition Multimedia Interface,HDMI)、视频图形阵列(Video Graphics Array,VGA)、数字视频接口(Digital Visual Interface,DVI)、显示接口(DisplayPort,DP)。
由于第一图像为符合音视频接口格式的图像,因此,在原始图像转换为第一图像后便通过HDMI、VGA、DVI、DP等音视频接口输出。
在一些实施例中,由于若干帧连续的原始图像可以形成视频,因此也可将该视频转换为符合音视频接口格式的视频,并通过HDMI、VGA、DVI、DP等音视频接口输出,以便进行视频的实时传输。
303:将所述第一图像转换为第二图像,所述第二图像为符合图像接收端接口格式的图像。
由于通常图像接收端无法直接接收由HDMI、VGA、DVI、DP等音视频接口输出的图像,因此,为了实现图像的实时传输,需要将第一图像转换为第二图像,以便图像接收端可接收到经过处理后的图像。
其中,无人飞行器的图像处理模块用于对通过HDMI、VGA、DVI、DP等音视频接口输出的第一图像进行处理。例如,图像处理模块中可包括有转换芯片,用于将所述第一图像转换为符合图像接收端接口格式的图像,也即第二图像。
其中,该转换芯片可以为海思的Hi3519芯片等。该图像接收端接口包括但不限于以下接口中的一种或多种:BT1120、BT656、BT709。例如,可以通过Hi3519芯片将符合HDMI格式的图像转换为符合BT1120格式的图像。
304:对所述第二图像进行编码得到编码后的图像。
图像处理模块还可以包括编码器,用于对第二图像进行编码。在一些实施例中,还可以在同一芯片上实现将所述第一图像转换为第二图像及对所述第二图像进行编码得到编码后的图像的功能,也即将转换芯片和编码器的功能集成于同一芯片上,如集成于Hi3519芯片上。
其中,编码标准可以是DIVX、XVID、AVC、H.263、H.264、H.265、Windows Media Video等等。
305:将所述编码后的图像传输给所述图像接收端。
其中,编码后的图像可以通过图传模块传输给图像接收端,以实现图像的实时传输。并且,在图像接收端接收到编码后的图像后,图像接收端可通过解码器对接收到的图像进行解码,并将经解码的图像显示于图像接收端的显示屏上,以便用户可以实时了解采集的图像的情况,从而使得在无人飞行器飞行过程中,若遇到所获取的原始图像出现过曝、过暗、视野不佳等情况时,可以及时进行调整,以使所获取的原始图像的效果满足预期的效果。
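上述步骤301至305所述的"获取—转换—再转换—编码—传输"流程,可以用如下Python草图加以示意(仅为说明性的假设示意,以带"format"标记的字典代表一帧图像,各函数名、格式标记均为本文为说明而假设,并非实际实现):

```python
def convert_to_av_format(raw):
    # 步骤302:原始图像 -> 符合音视频接口格式(此处假设为HDMI)的第一图像
    return dict(raw, format="HDMI")

def convert_to_receiver_format(first):
    # 步骤303:第一图像 -> 符合图像接收端接口格式(此处假设为BT1120)的第二图像
    return dict(first, format="BT1120")

def encode(second):
    # 步骤304:对第二图像进行编码(此处仅以标记示意,实际可采用H.264等标准)
    return dict(second, encoded=True)

def transmit(frame, receiver):
    # 步骤305:通过图传模块将编码后的图像交给图像接收端
    receiver.append(frame)

receiver = []
raw_image = {"format": "RAW"}  # 步骤301:获取原始图像
transmit(encode(convert_to_receiver_format(convert_to_av_format(raw_image))),
         receiver)
```

该草图仅用于体现各步骤之间的数据流向,实际实现中的格式转换与编码由转换芯片、编码器等硬件完成。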
在本发明实施例中,首先将所获取的原始图像转换为符合音视频接口格式的第一图像,再将该第一图像转换为符合图像接收端接口格式的第二图像,并对第二图像进行编码,最后将编码后的图像传输给图像接收端,从而实现图像的实时传输,以便用户可以实时看到所获取的图像的情况。
实施例2:
图4为本发明实施例提供的另一种图像传输方法的流程示意图。本发明实施例的图像传输方法可由上述无人飞行器中的各组成部分配合执行,在此不予限定。
请参阅图4,所述图像传输方法包括:
401:获取原始图像。
402:存储所述原始图像。
为了后续在需要的时候调用原始图像,可以将获取的原始图像存储于无人飞行器的内置或者外部的存储空间。例如,无人飞行器上设置有外接SD卡接口,SD卡等记忆设备可插入该接口中,以存储所获取的原始图像。并且,若干帧连续的原始图像形成视频或录像,在一些实例中,也可以将由若干帧连续的原始图像所形成视频或录像存储于无人飞行器的内置或者外部的存储空间。
403:对所述原始图像按照预设比例进行压缩处理得到压缩后的图像。
通常为了保证图像的质量,原始图像的分辨率一般较高,例如,原始图像的分辨率可为4000万像素,如此高分辨率的图像并不便于实现图像的实时传输,因此,在获取到原始图像后,可按照预设比例对原始图像进行压缩处理,以便于后续图像的传输。例如,可以将4000万像素分辨率的原始图像缩小到1080P(1920×1080的分辨率)或者720P(1280×720的分辨率)等。
其中,预设比例可以预先配置于图像采集装置,或者该预设比例也可以根据需要进行调整。
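按预设比例缩小图像的处理可以用如下Python草图示意(采用最近邻抽取,仅为本文为说明而假设的简化实现,实际产品中通常由专用图像处理芯片完成):

```python
def compress(image, scale):
    # image为h行w列的灰度值二维列表,scale为预设比例(0<scale<=1)
    h, w = len(image), len(image[0])
    nh, nw = max(1, int(h * scale)), max(1, int(w * scale))
    # 最近邻抽取:输出位置(i,j)取原图中对应位置的像素
    return [[image[int(i / scale)][int(j / scale)] for j in range(nw)]
            for i in range(nh)]

# 示例:将8×8的图像按预设比例0.5缩小为4×4
small = compress([[j for j in range(8)] for _ in range(8)], 0.5)
```

例如,按该思路将4000万像素的原始图像缩小到1080P时,scale由目标分辨率与原始分辨率之比确定。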
404:将所述原始图像转换为第一图像,所述第一图像为符合音视频接口格式的图像。
基于步骤403,将所述原始图像转换为第一图像可以包括:将所述压缩后的图像转换为第一图像。
405:将所述第一图像转换为第二图像,所述第二图像为符合图像接收端接口格式的图像。
406:统计所述第二图像的预设区域中的像素点满足预设条件的数量。
其中,第二图像的预设区域可以是第二图像的全部区域,也可以是第二图像的部分区域。该预设条件为:像素点的图像灰度值小于预设灰度阈值。
通常图像采集装置设置有机械快门,每采集一次原始图像,机械快门会闭合一次。在图像采集装置采集原始图像的过程中,由于机械快门会不停地闭合,会导致所采集的原始图像出现黑屏的情况,由于第二图像由第一图像转换得到,第一图像由原始图像转换得到,因此,第二图像中也会出现黑屏的情况,从而导致传输到图像接收端的图像中会存在黑屏图像,影响用户的视觉效果。
因此,为了提高用户的视觉效果,避免黑屏的情况,在对第二图像进行编码得到编码后的图像之前,需要对黑屏情况进行处理。
如图5所示,由于机械快门会不停地闭合,导致出现黑屏的情况主要包括以下4种情况:
第1种情况:若干连续帧的第二图像中出现一帧全黑屏图像;
第2种情况:若干连续帧第二图像中出现一帧对称黑边图像,且该对称黑边图像的后一帧图像为全黑屏图像;
第3种情况:若干连续帧第二图像中出现一帧对称黑边图像,且该对称黑边图像的前一帧图像为全黑屏图像;
第4种情况:若干连续帧第二图像中出现一帧对称黑边图像,且该对称黑边图像的后一帧图像为全黑屏图像,该全黑屏图像的后一帧图像为对称黑边图像。
由于机械快门闭合速度非常快,一般在大约50ms以内完成整个快门动作,而机械快门闭合和开启本身的时间更短,在5ms以内即可完成。因此,在帧率为60fps(每秒显示帧数,Frames per Second)时,机械快门闭合时以图5中的第1种情况(全黑屏图像)出现的最多,图5中的第2和第3种情况出现概率非常低,第4种情况几乎不会出现,此外,连续2帧图像均为对称黑边图像的情况在实际中出现的概率基本为零。在帧率为30fps时,第2、第3种情况更是很少出现。
因此,为了避免黑屏的情况,可以针对每帧第二图像进行检测,判断是否存在全黑屏图像或对称黑边图像,以防止全黑屏图像或对称黑边图像传输到图像接收端,而影响视觉效果。
在一些实现方式中,可以通过被检测的图像也即第二图像的像素点的图像灰度值的情况来确定第二图像是否为全黑屏图像或对称黑边图像,并在出现上述4种情况时,丢弃全黑屏图像或对称黑边图像,以避免黑屏的情况。
具体的,可以通过统计所述第二图像的预设区域中的像素点满足预设条件的数量来判断第二图像是否为全黑屏图像或对称黑边图像。其中,该预设条件为:像素点的图像灰度值小于预设灰度阈值。
针对全黑屏图像与对称黑边图像可以采用不同的判断策略进行判断。由于全黑屏图像是针对第二图像的整个区域而言的,因此,可以基于第二图像的整个区域的像素点的图像灰度值来进行统计及判断;而对称黑边图像是针对第二图像的部分区域而言的,因此,可以基于第二图像的部分区域的像素点的图像灰度值来进行统计及判断。
具体的,所述全黑屏图像为所述预设区域为所述第二图像的整个区域时所述数量大于第一预设数量阈值的第二图像。所述第一预设数量阈值由所述第二图像的尺寸及第一预设坏点容忍数所确定。
所述对称黑边图像为所述预设区域为所述第二图像的部分区域时所述数量大于第二预设数量阈值的第二图像。其中,该部分区域包括第一区域和第二区域,所述第一区域与所述第二区域为对称分布,所述第二预设数量阈值由所述第一区域与所述第二区域的尺寸、第二预设坏点容忍数所确定。
下面结合图6(a)和图6(b),分别对全黑屏图像及对称黑边图像的判断进行具体描述。
对于全黑屏图像所对应的数量统计及全黑屏图像的判断,如图6(a)所示,假设第二图像的尺寸为w×h,其中,第二图像的宽为w,第二图像的高为h;像素点的图像灰度值小于预设灰度阈值的数量为m,初始时m=0;处于第二图像的第i列、第j行的像素点的图像灰度值为Y(i,j),其中,i取0~(w-1),j取0~(h-1)。
统计第二图像的全部区域中的像素点的图像灰度值小于预设灰度阈值的像素点的数量,以判断第二图像是否为全黑屏图像,具体如下:
i=0~(w-1),j=0~(h-1)
当Y(i,j)<T时,m=m+1
如果m>M1=w×h-B1,则表明第二图像为全黑屏图像
其中,T为预设灰度阈值。该预设灰度阈值用于界定像素点的灰色等级。该预设灰度阈值可以预先配置于图像处理模块中,并且该预设灰度阈值可以根据需要进行调整。例如,根据历史经验数据,预设灰度阈值T可以为25。
M1为第一预设数量阈值,该第一预设数量阈值用于界定第二图像是否为全黑屏图像。也即,当m>M1时,表明第二图像为全黑屏图像;当m≤M1时,表明第二图像不为全黑屏图像。其中,第一预设数量阈值M1由第二图像的尺寸及第一预设坏点容忍数所确定,也即M1=w×h-B1。
B1为第一预设坏点容忍数。对于图像来说,在要求不是特别高的情况下,按照业内默认的标准,图像中出现少量"坏点"是一种"正常"现象,只要数量不多,在一定情况下是可以容忍的。因此,为保守起见,在满足图像预期效果的前提下,可以容忍一定的"坏点"数。第一预设坏点容忍数用于界定最大能容忍的"坏点"数。该第一预设坏点容忍数可以预先配置于图像处理模块中,并且该第一预设坏点容忍数可以根据需要进行调整。例如,该第一预设坏点容忍数可以为8。
由于通常把白色与黑色之间按对数关系分成若干级,范围一般从0到255,其中,白色为255,黑色为0,图像灰度值越小,表明颜色越深,也即越接近黑色。因此,当某个像素点的图像灰度值小于预设灰度阈值时,可以认为该像素点接近黑色,此时,像素点的图像灰度值小于预设灰度阈值的数量加1,也即:当Y(i,j)<T时,m=m+1。并且,当统计到像素点的图像灰度值小于预设灰度阈值的数量m>M1时,表明第二图像为全黑屏图像。
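全黑屏图像的判断可以用如下Python草图示意(其中预设灰度阈值T=25、第一预设坏点容忍数B1=8均为正文提到的示例取值,函数本身为本文为说明而假设的实现):

```python
T = 25   # 预设灰度阈值(示例取值)
B1 = 8   # 第一预设坏点容忍数(示例取值)

def is_full_black(image):
    # image为h行w列的灰度值二维列表,预设区域为整个图像
    h, w = len(image), len(image[0])
    # 统计整个区域中图像灰度值小于T的像素点数量m
    m = sum(1 for row in image for y in row if y < T)
    # 第一预设数量阈值M1=w*h-B1;m>M1时判定为全黑屏图像
    return m > w * h - B1

black_frame = [[0] * 10 for _ in range(10)]     # 全黑的10×10图像
normal_frame = [[128] * 10 for _ in range(10)]  # 灰度为128的正常图像
```

按此判定,一帧图像中非黑("坏点")像素少于B1个时仍会被判定为全黑屏图像,体现了坏点容忍数的作用。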
对于对称黑边图像所对应的数量统计及对称黑边图像的判断,如图6(b)所示,假设第二图像的尺寸为w×h,其中,第二图像的宽为w,第二图像的高为h;像素点的图像灰度值小于预设灰度阈值的数量为m,初始时m=0;第一区域与第二区域的尺寸均为w×h1,其中,第一区域与第二区域的宽为w,第一区域与第二区域的高为h1,第一区域与第二区域对称分布在第二图像的上下两边;处于第二图像的第i列、第j行的像素点的图像灰度值为Y(i,j),其中,i取0~(w-1),j取0~(h1-1)和(h-h1)~(h-1)。
统计第二图像的部分区域中的像素点的图像灰度值小于预设灰度阈值的像素点的数量,以判断第二图像是否为对称黑边图像,具体如下:
i=0~(w-1),j=0~(h1-1)和j=(h-h1)~(h-1)
当Y(i,j)<T时,m=m+1
如果m>M2=w×h1×2-B2,则表明第二图像为对称黑边图像
其中,T为预设灰度阈值。M2为第二预设数量阈值,该第二预设数量阈值用于界定第二图像是否为对称黑边图像。也即,当m>M2时,表明第二图像为对称黑边图像;当m≤M2时,表明第二图像不为对称黑边图像。其中,第二预设数量阈值M2由第一区域与所述第二区域的尺寸及第二预设坏点容忍数所确定,也即M2=w×h1×2-B2。
B2为第二预设坏点容忍数。其中,第二预设坏点容忍数与第一预设坏点容忍数类似,因此,在此处不赘述。该第二预设坏点容忍数也可为8。
h1为第一区域或第二区域的高。该高可以预先配置于图像处理模块中,并且可以根据需要进行调整。此外,根据实际情况,h1不能太小,因为太小的话,即使第一区域和第二区域的像素点的灰度值均小于预设灰度阈值,也即第一区域和第二区域中图像颜色均接近黑色,由于第一区域和第二区域较小,对第二图像整体的影响并不大,也并不会影响用户的视觉效果,在该种情况下,并不适合丢弃该图像。因此,h1不能太小,例如,h1可以大于10。
由于通常把白色与黑色之间按对数关系分成若干级,范围一般从0到255,其中,白色为255,黑色为0,图像灰度值越小,表明颜色越深,也即越接近黑色。因此,当某个像素点的图像灰度值小于预设灰度阈值时,可以认为该像素点接近黑色,此时,像素点的图像灰度值小于预设灰度阈值的数量加1,也即:当Y(i,j)<T时,m=m+1。并且,当统计到像素点的图像灰度值小于预设灰度阈值的数量m>M2时,表明第二图像为对称黑边图像。
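对称黑边图像的判断可以用如下Python草图示意(其中T=25、B2=8为正文提到的示例取值,H1=16为满足"大于10"要求而假设的区域高度,函数本身为本文为说明而假设的实现):

```python
T = 25   # 预设灰度阈值(示例取值)
B2 = 8   # 第二预设坏点容忍数(示例取值)
H1 = 16  # 第一/第二区域的高h1(假设取值,正文建议大于10)

def is_symmetric_black_edge(image):
    # image为h行w列的灰度值二维列表;第一区域为最上方H1行,第二区域为最下方H1行
    h, w = len(image), len(image[0])
    rows = list(range(H1)) + list(range(h - H1, h))
    # 统计上下两个对称区域中图像灰度值小于T的像素点数量m
    m = sum(1 for j in rows for y in image[j] if y < T)
    # 第二预设数量阈值M2=w*H1*2-B2;m>M2时判定为对称黑边图像
    return m > w * H1 * 2 - B2

# 示例:40行10列的图像,上下各16行全黑、中间8行正常
edge_frame = [[0] * 10 if (j < 16 or j >= 24) else [128] * 10
              for j in range(40)]
normal_frame = [[128] * 10 for _ in range(40)]
```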
407:检测所述数量是否在预设范围内。
检测所述数量是否在预设范围内,以便确定第二图像是否为全黑屏图像或对称黑边图像,从而提高用户的视觉效果。
对于第二图像不为全黑屏图像的情况,该预设范围为小于或等于第一预设数量阈值,也即当m≤M1时,第二图像不为全黑屏图像;对于第二图像不为对称黑边图像的情况,该预设范围为小于或等于第二预设数量阈值,也即当m≤M2时,第二图像不为对称黑边图像。
408:对所述第二图像进行编码得到编码后的图像。
基于步骤407,所述对所述第二图像进行编码得到编码后的图像包括:将所述数量在预设范围内的第二图像进行编码得到编码后的图像。
409:将所述编码后的图像传输给所述图像接收端。
410:丢弃所述数量不在预设范围内的第二图像。
其中,丢弃所述数量不在预设范围内的第二图像包括:丢弃全黑屏图像(图5中的第1种情况、第2种情况、第3种情况、第4种情况中的第n帧图像);当对称黑边图像的相邻帧图像为全黑屏图像时,丢弃对称黑边图像(图5中的第2种情况中的第n-1帧图像、第3种情况中的第n+1帧图像、及第4种情况中的第n-1帧图像和第n+1帧图像)。
通过丢弃所述数量不在预设范围内的第二图像,如全黑屏图像或对称黑边图像,可以避免图像接收端接收到影响视觉效果的图像,以便提高用户的体验。
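上述丢弃策略(丢弃全黑屏图像,以及相邻帧为全黑屏图像的对称黑边图像)可以用如下Python草图示意,其中以每帧的分类结果作为输入:'black'表示全黑屏图像,'edge'表示对称黑边图像,'ok'表示正常图像(分类标记为本文假设),返回应保留并送去编码的帧下标:

```python
def filter_frames(kinds):
    # kinds为按时间顺序排列的每帧分类结果列表
    keep = []
    for n, kind in enumerate(kinds):
        if kind == "black":
            # 丢弃全黑屏图像(对应图5中4种情况的第n帧)
            continue
        if kind == "edge" and (
            (n > 0 and kinds[n - 1] == "black")
            or (n + 1 < len(kinds) and kinds[n + 1] == "black")
        ):
            # 相邻帧为全黑屏图像时,丢弃对称黑边图像
            continue
        keep.append(n)
    return keep
```

例如,对于图5的第2种情况(对称黑边图像后一帧为全黑屏图像),序列['ok','edge','black','ok']中仅第0帧和第3帧被保留;而孤立出现的对称黑边帧不会被丢弃。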
可以理解的是,在不同的实施例中,所述步骤402和/或步骤403可以不是必选步骤。另外,根据本发明实施例的描述可以理解,在不同实施例中,在不矛盾的情况下,所述步骤401-410可以有不同的执行顺序。例如,先执行步骤410再执行步骤409等。
还需要说明的是,本发明实施例中所述步骤401-410中未详尽描述的技术细节,可参考上述实施例的具体描述。
在本发明实施例中,首先将所获取的原始图像转换为符合音视频接口格式的第一图像,再将该第一图像转换为符合图像接收端接口格式的第二图像,并对第二图像进行编码,最后将编码后的图像传输给图像接收端,从而实现图像的实时传输,以便用户可以实时看到所获取的图像的情况。
并且,在本发明实施例中还通过检测第二图像的预设区域中的像素点满足预设条件的数量是否在预设范围内,以丢弃第二图像为全黑屏图像或对称黑边图像的图像,从而优化图像接收端接收到的图像的视觉效果。
实施例3:
图7为本发明实施例提供的一种图像传输装置的示意图。其中,所述图像传输装置70可配置于上述无人飞行器中。
参照图7,所述图像传输装置70包括:获取模块701、存储模块702、压缩模块703、第一转换模块704、第二转换模块705、统计模块706、检测模块707、编码模块708、传输模块709以及丢弃模块710。
具体的,获取模块701用于获取原始图像。
获取模块701可以与图像采集装置连接,在无人飞行器飞行的过程中,获取模块701可以通过图像采集装置采集原始图像以获取该原始图像。其中,所述图像采集装置可以为相机、摄像机等图像采集设备。并且,为了提高采集到的原始图像的质量,可以采用图像传感器尺寸较大的相机,如全画幅相机或中画幅相机等,以获取全画幅或者中画幅等图像质量较高的原始图像。
具体的,存储模块702用于将所述原始图像存储至存储空间。
为了后续在需要的时候调用原始图像,存储模块702可以将获取模块701获取的原始图像存储于无人飞行器的内置或者外部的存储空间。例如,无人飞行器上设置有外接SD卡接口,SD卡等记忆设备可插入该接口中,以存储所获取的原始图像。并且,若干帧连续的原始图像形成视频或录像,在一些实例中,存储模块702也可以将由若干帧连续的原始图像所形成视频或录像存储于无人飞行器的内置或者外部的存储空间。
压缩模块703用于对所述原始图像按照预设比例进行压缩处理得到压缩后的图像。
通常为了保证图像的质量,原始图像的分辨率一般较高,例如,原始图像的分辨率可为4000万像素,如此高分辨率的图像并不便于实现图像的实时传输,因此,在获取模块701获取到原始图像后,压缩模块703可按照预设比例对原始图像进行压缩处理,以便于后续图像的传输。
第一转换模块704用于将所述原始图像转换为第一图像,所述第一图像为符合音视频接口格式的图像。
在经过压缩模块703对原始图像进行压缩后,第一转换模块704可以将该压缩后的图像转换为第一图像,以便于压缩后的图像的输出。例如,第一转换模块704将压缩后的图像转换为符合音视频接口格式的图像以便输出至第二转换模块705进行后续的处理。
该音视频接口包括但不限于以下接口中的一种或多种:HDMI、VGA、DVI、DP。
由于第一图像为符合音视频接口格式的图像,因此,通过第一转换模块704将压缩后的图像转换为第一图像后便通过HDMI、VGA、DVI、DP等音视频接口输出。
在一些实施例中,由于若干帧连续的图像可以形成视频,因此第一转换模块704也可将该压缩后的视频转换为符合音视频接口格式的视频,并通过HDMI、VGA、DVI、DP等音视频接口输出,以便进行视频的实时传输。
第二转换模块705用于将所述第一图像转换为第二图像,所述第二图像为符合图像接收端接口格式的图像。
由于通常图像接收端无法直接接收由HDMI、VGA、DVI、DP等音视频接口输出的图像,因此,为了实现图像的实时传输,需要通过第二转换模块705将第一图像转换为第二图像,以便图像接收端可接收到经过处理后的图像。
该图像接收端接口包括但不限于以下接口中的一种或多种:BT1120、BT656、BT709。例如,可以通过第二转换模块705将符合HDMI格式的图像转换为符合BT1120格式的图像。
统计模块706用于统计所述第二图像的预设区域中的像素点满足预设条件的数量。
其中,第二图像的预设区域可以是第二图像的全部区域,也可以是第二图像的部分区域。该预设条件为:像素点的图像灰度值小于预设灰度阈值。
通常图像采集装置设置有机械快门,每采集一次原始图像,机械快门会闭合一次。在图像采集装置采集原始图像的过程中,由于机械快门会不停地闭合,会导致所采集的原始图像出现黑屏的情况,由于第二图像由第一图像转换得到,第一图像由原始图像转换得到,因此,第二图像中也会出现黑屏的情况,从而导致传输到图像接收端的图像中会存在黑屏图像,影响用户的视觉效果。
因此,为了提高用户的视觉效果,避免黑屏的情况,在编码模块708对第二图像进行编码得到编码后的图像之前,需要对黑屏情况进行处理。例如,可以通过统计模块706对第二图像的预设区域中的像素点满足预设条件的数量进行统计,以便基于该数量判断是否存在全黑屏图像或对称黑边图像,若存在时,丢弃全黑屏图像或对称黑边图像,以防止全黑屏图像或对称黑边图像传输到图像接收端,而影响视觉效果。
其中,该预设条件为:像素点的图像灰度值小于预设灰度阈值。
针对全黑屏图像与对称黑边图像,统计模块706可以采用不同的统计策略进行统计。由于全黑屏图像是针对第二图像的整个区域而言的,因此,统计模块706可以基于第二图像的整个区域的像素点的图像灰度值来进行统计;而对称黑边图像是针对第二图像的部分区域而言的,因此,统计模块706可以基于第二图像的部分区域的像素点的图像灰度值来进行统计。
对于全黑屏图像所对应的数量统计,如图6(a)所示,假设第二图像的尺寸为w×h,其中,第二图像的宽为w,第二图像的高为h;像素点的图像灰度值小于预设灰度阈值的数量为m,初始时m=0;处于第二图像的第i列、第j行的像素点的图像灰度值为Y(i,j),其中,i取0~(w-1),j取0~(h-1)。
统计第二图像的全部区域中的像素点的图像灰度值小于预设灰度阈值的像素点的数量,具体如下:
i=0~(w-1),j=0~(h-1)
当Y(i,j)<T时,m=m+1
其中,T为预设灰度阈值。该预设灰度阈值用于界定像素点的灰色等级。该预设灰度阈值可以预先配置于图像处理模块中,并且该预设灰度阈值可以根据需要进行调整。例如,根据历史经验数据,预设灰度阈值T可以为25。
由于通常把白色与黑色之间按对数关系分成若干级,范围一般从0到255,其中,白色为255,黑色为0,图像灰度值越小,表明颜色越深,也即越接近黑色。因此,当某个像素点的图像灰度值小于预设灰度阈值时,可以认为该像素点接近黑色,此时,像素点的图像灰度值小于预设灰度阈值的数量加1,也即:当Y(i,j)<T时,m=m+1。
对于对称黑边图像所对应的数量统计,如图6(b)所示,假设第二图像的尺寸为w×h,其中,第二图像的宽为w,第二图像的高为h;像素点的图像灰度值小于预设灰度阈值的数量为m,初始时m=0;第一区域与第二区域的尺寸均为w×h1,其中,第一区域与第二区域的宽为w,第一区域与第二区域的高为h1,第一区域与第二区域对称分布在第二图像的上下两边;处于第二图像的第i列、第j行的像素点的图像灰度值为Y(i,j),其中,i取0~(w-1),j取0~(h1-1)和(h-h1)~(h-1)。
统计第二图像的部分区域中的像素点的图像灰度值小于预设灰度阈值的像素点的数量,具体如下:
i=0~(w-1),j=0~(h1-1)和j=(h-h1)~(h-1)
当Y(i,j)<T时,m=m+1
其中,T为预设灰度阈值。h1为第一区域或第二区域的高。该高可以预先配置于图像处理模块中,并且可以根据需要进行调整。此外,根据实际情况,h1不能太小,因为太小的话,即使第一区域和第二区域的像素点的灰度值均小于预设灰度阈值,也即第一区域和第二区域中图像颜色均接近黑色,由于第一区域和第二区域较小,对第二图像整体的影响并不大,也并不会影响用户的视觉效果,在该种情况下,并不适合丢弃该图像。因此,h1不能太小,例如,h1可以大于10。
由于通常把白色与黑色之间按对数关系分成若干级,范围一般从0到255,其中,白色为255,黑色为0,图像灰度值越小,表明颜色越深,也即越接近黑色。因此,当某个像素点的图像灰度值小于预设灰度阈值时,可以认为该像素点接近黑色,此时,像素点的图像灰度值小于预设灰度阈值的数量加1,也即:当Y(i,j)<T时,m=m+1。
检测模块707用于检测所述数量是否在预设范围内。
通过检测模块707可以检测所述数量是否在预设范围内,以便确定第二图像是否为全黑屏图像或对称黑边图像,从而提高用户的视觉效果。
对于第二图像不为全黑屏图像的情况,该预设范围为小于或等于第一预设数量阈值,也即当m≤M1时,第二图像不为全黑屏图像,其中,M1为第一预设数量阈值;对于第二图像不为对称黑边图像的情况,该预设范围为小于或等于第二预设数量阈值,也即当m≤M2时,第二图像不为对称黑边图像,其中,M2为第二预设数量阈值。
其中,M1为第一预设数量阈值,该第一预设数量阈值用于界定第二图像是否为全黑屏图像。也即,当m>M1时,表明第二图像为全黑屏图像;当m≤M1时,表明第二图像不为全黑屏图像。其中,第一预设数量阈值M1由第二图像的尺寸及第一预设坏点容忍数所确定,也即M1=w×h-B1。B1为第一预设坏点容忍数。
M2为第二预设数量阈值,该第二预设数量阈值用于界定第二图像是否为对称黑边图像。也即,当m>M2时,表明第二图像为对称黑边图像;当m≤M2时,表明第二图像不为对称黑边图像。其中,第二预设数量阈值M2由第一区域与所述第二区域的尺寸及第二预设坏点容忍数所确定,也即M2=w×h1×2-B2。B2为第二预设坏点容忍数。
编码模块708用于对所述第二图像进行编码得到编码后的图像。
为了提高用户的视觉效果,避免黑屏的情况,防止全黑屏图像或对称黑边图像传输到图像接收端,编码模块708进行编码的图像应为数量在预设范围内的第二图像,也即编码模块708具体用于:将所述数量在预设范围内的第二图像进行编码得到编码后的图像。
其中,编码标准可以是DIVX、XVID、AVC、H.263、H.264、H.265、Windows Media Video等等。
传输模块709用于将所述编码后的图像传输给所述图像接收端。
其中,经过编码模块708编码后的图像可以通过传输模块709传输给图像接收端,以实现图像的实时传输。并且,在图像接收端接收到编码后的图像后,图像接收端可通过解码器对接收到的图像进行解码,并将经解码的图像显示于图像接收端的显示屏上,以便用户可以实时了解采集的图像的情况,从而使得在无人飞行器飞行过程中,若遇到所采集的原始图像出现过曝、过暗、视野不佳等情况时,可以及时进行调整,以使所采集的原始图像的效果满足预期的效果。
丢弃模块710用于丢弃所述数量不在预设范围内的第二图像。
其中,丢弃模块710具体用于:丢弃全黑屏图像(图5中的第1种情况、第2种情况、第3种情况、第4种情况中的第n帧图像);当对称黑边图像的相邻帧图像为全黑屏图像时,丢弃对称黑边图像(图5中的第2种情况中的第n-1帧图像、第3种情况中的第n+1帧图像、及第4种情况中的第n-1帧图像和第n+1帧图像)。
其中,所述全黑屏图像为所述预设区域为所述第二图像的整个区域时所述数量大于第一预设数量阈值的第二图像,所述第一预设数量阈值由所述第二图像的尺寸及第一预设坏点容忍数所确定。也即:
如果m>M1=w×h-B1,则表明第二图像为全黑屏图像
其中,M1为第一预设数量阈值,该第一预设数量阈值用于界定第二图像是否为全黑屏图像。B1为第一预设坏点容忍数。
所述对称黑边图像为所述预设区域为所述第二图像的部分区域时所述数量大于第二预设数量阈值的第二图像。该部分区域包括第一区域和第二区域,所述第一区域与所述第二区域为对称分布,所述第二预设数量阈值由所述第一区域与所述第二区域的尺寸、第二预设坏点容忍数所确定。也即:
如果m>M2=w×h1×2-B2,则表明第二图像为对称黑边图像
其中,M2为第二预设数量阈值,该第二预设数量阈值用于界定第二图像是否为对称黑边图像。B2为第二预设坏点容忍数。
通过丢弃模块710丢弃所述数量不在预设范围内的第二图像,可以避免图像接收端接收到影响视觉效果的图像,以便提高用户的体验。
需要说明的是,在一些实施例中,存储模块702和/或压缩模块703在不同的实施例中,可以不是图像传输装置70的必要模块,也即,在一些实施例中,存储模块702和/或压缩模块703可以省略。
还需要说明的是,在本发明实施例中,所述图像传输装置70可执行任意方法实施例所提供的图像传输方法,具备执行方法相应的功能模块和有益效果。未在图像传输装置70的实施例中详尽描述的技术细节,可参见本方法实施例所提供的图像传输方法。
实施例4:
图8为本发明实施例提供的图像发送端硬件结构示意图。所述图像发送端可以为上述无人飞行器100、无人船或无人飞行器100的摄像组件等。如图8所示,所述图像发送端80包括:
一个或多个处理器801以及存储器802,图8中以一个处理器801为例。
处理器801和存储器802可以通过总线或者其他方式连接,图8中以通过总线连接为例。
存储器802作为一种非易失性计算机可读存储介质,可用于存储非易失性软件程序、非易失性计算机可执行程序以及模块,如本发明实施例提供的图像传输方法对应的程序指令/模块(例如,附图7所示的获取模块701、存储模块702、压缩模块703、第一转换模块704、第二转换模块705、统计模块706、检测模块707、编码模块708、传输模块709以及丢弃模块710)。处理器801通过运行存储在存储器802中的非易失性软件程序、指令以及模块,从而执行图像发送端的各种功能应用以及数据处理,即实现所述方法实施例提供的图像传输方法。
存储器802可以包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需要的应用程序;存储数据区可存储根据图像发送端使用所创建的数据等。此外,存储器802可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实施例中,存储器802可选包括相对于处理器801远程设置的存储器,这些远程存储器可以通过网络连接至图像发送端。所述网络的实施例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
所述一个或者多个模块存储在所述存储器802中,当被所述一个或者多个处理器801执行时,执行本发明实施例提供的图像传输方法,例如,执行以上描述的图4中的方法步骤401至步骤410,或实现图7中的701-710模块的功能。
示例性地,该图像发送端还可以包括通信接口,该通信接口用以实现与其他设备,如服务器等,进行通信。图像发送端包括的其他装置在此不予限定。
所述图像发送端可执行本发明实施例提供的图像传输方法,具备执行方法相应的功能模块和有益效果。未在图像发送端实施例中详尽描述的技术细节,可参见本发明实施例提供的图像传输方法。
本发明实施例提供了一种计算机程序产品,所述计算机程序产品包括存储在非易失性计算机可读存储介质上的计算机程序,所述计算机程序包括程序指令,当所述程序指令被计算机执行时,使所述计算机执行本发明实施例提供的图像传输方法。例如,执行以上描述的图4中的方法步骤401至步骤410,或实现图7中的701-710模块的功能。
本发明实施例提供了一种非易失性计算机可读存储介质,所述计算机可读存储介质存储有计算机可执行指令,所述计算机可执行指令用于使计算机执行本发明实施例提供的图像传输方法。例如,执行以上描述的图4中的方法步骤401至步骤410,或实现图7中的701-710模块的功能。
实施例5:
图9为本发明实施例提供的飞行器图传系统的示意图。其中,该飞行器图传系统90包括如上所述的图像发送端80以及图像接收端200。其中,所述图像发送端80和所述图像接收端200通信连接。图像发送端80可以将获取的原始图像经过处理后,实时传输给图像接收端200,以实现图像的实时传输。并且,图像发送端80通过检测第二图像的预设区域中的像素点满足预设条件的数量是否在预设范围内,以丢弃第二图像为全黑屏图像或对称黑边图像的图像,从而优化图像接收端200接收到的图像的视觉效果。
需要说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。
通过以上的实施例的描述,本领域普通技术人员可以清楚地了解到各实施例可借助软件加通用硬件平台的方式来实现,当然也可以通过硬件。本领域普通技术人员可以理解实现所述实施例方法中的全部或部分流程是可以通过计算机程序指令相关的硬件来完成,所述的程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如所述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
最后应说明的是:以上实施例仅用以说明本发明的技术方案,而非对其限制;在本发明的思路下,以上实施例或者不同实施例中的技术特征之间也可以进行组合,步骤可以以任意顺序实现,并存在如上所述的本发明的不同方面的许多其它变化,为了简明,它们没有在细节中提供;尽管参照前述实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (20)

  1. 一种图像传输方法,其特征在于,所述方法包括:
    获取原始图像;
    将所述原始图像转换为第一图像,所述第一图像为符合音视频接口格式的图像;
    将所述第一图像转换为第二图像,所述第二图像为符合图像接收端接口格式的图像;
    对所述第二图像进行编码得到编码后的图像;
    将所述编码后的图像传输给所述图像接收端。
  2. 根据权利要求1所述的方法,其特征在于,在对所述第二图像进行编码得到编码后的图像之前,所述方法还包括:
    统计所述第二图像的预设区域中的像素点满足预设条件的数量;
    检测所述数量是否在预设范围内;
    所述对所述第二图像进行编码得到编码后的图像,包括:
    将所述数量在预设范围内的第二图像进行编码得到编码后的图像。
  3. 根据权利要求2所述的方法,其特征在于,所述预设条件为:像素点的图像灰度值小于预设灰度阈值。
  4. 根据权利要求2或3所述的方法,其特征在于,所述方法还包括:
    丢弃所述数量不在预设范围内的第二图像。
  5. 根据权利要求4所述的方法,其特征在于,所述丢弃所述数量不在预设范围内的第二图像,包括:
    丢弃全黑屏图像,所述全黑屏图像为所述预设区域为所述第二图像的整个区域时所述数量大于第一预设数量阈值的第二图像;
    其中,所述第一预设数量阈值由所述第二图像的尺寸及第一预设坏点容忍数所确定。
  6. 根据权利要求5所述的方法,其特征在于,所述丢弃所述数量不在预设范围内的第二图像,还包括:
    当对称黑边图像的相邻帧图像为全黑屏图像时,丢弃对称黑边图像,所述对称黑边图像为所述预设区域为所述第二图像的部分区域时所述数量大于第二预设数量阈值的第二图像;
    其中,该部分区域包括第一区域和第二区域,所述第一区域与所述第二区域为对称分布,所述第二预设数量阈值由所述第一区域与所述第二区域的尺寸、第二预设坏点容忍数所确定。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,在将所述原始图像转换为第一图像之前,所述方法还包括:
    对所述原始图像按照预设比例进行压缩处理得到压缩后的图像;
    所述将所述原始图像转换为第一图像,包括:
    将所述压缩后的图像转换为第一图像。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,所述方法还包括:
    存储所述原始图像。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述音视频接口包括以下接口中的一种或多种:HDMI、VGA、DVI、DP。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述图像接收端接口包括以下接口中的一种或多种:BT1120、BT656、BT709。
  11. 一种图像传输装置,其特征在于,所述装置包括:
    获取模块,用于获取原始图像;
    第一转换模块,用于将所述原始图像转换为第一图像,所述第一图像为符合音视频接口格式的图像;
    第二转换模块,用于将所述第一图像转换为第二图像,所述第二图像为符合图像接收端接口格式的图像;
    编码模块,用于对所述第二图像进行编码得到编码后的图像;
    传输模块,用于将所述编码后的图像传输给所述图像接收端。
  12. 根据权利要求11所述的装置,其特征在于,所述装置还包括:
    统计模块,用于统计所述第二图像的预设区域中的像素点满足预设条件的数量;
    检测模块,用于检测所述数量是否在预设范围内;
    所述编码模块具体用于:
    将所述数量在预设范围内的第二图像进行编码得到编码后的图像。
  13. 根据权利要求12所述的装置,其特征在于,所述预设条件为:像素点的图像灰度值小于预设灰度阈值。
  14. 根据权利要求12或13所述的装置,其特征在于,所述装置还包括:
    丢弃模块,用于丢弃所述数量不在预设范围内的第二图像。
  15. 根据权利要求14所述的装置,其特征在于,所述丢弃模块具体用于:
    丢弃全黑屏图像,所述全黑屏图像为所述预设区域为所述第二图像的整个区域时所述数量大于第一预设数量阈值的第二图像;
    其中,所述第一预设数量阈值由所述第二图像的尺寸及第一预设坏点容忍数所确定。
  16. 根据权利要求15所述的装置,其特征在于,所述丢弃模块具体用于:
    当对称黑边图像的相邻帧图像为全黑屏图像时,丢弃对称黑边图像,所述对称黑边图像为所述预设区域为所述第二图像的部分区域时所述数量大于第二预设数量阈值的第二图像;
    其中,该部分区域包括第一区域和第二区域,所述第一区域与所述第二区域为对称分布,所述第二预设数量阈值由所述第一区域与所述第二区域的尺寸、第二预设坏点容忍数所确定。
  17. 根据权利要求11-16任一项所述的装置,其特征在于,所述装置还包括:
    压缩模块,用于对所述原始图像按照预设比例进行压缩处理得到压缩后的图像;
    所述第一转换模块具体用于:
    将所述压缩后的图像转换为第一图像。
  18. 根据权利要求11-17任一项所述的装置,其特征在于,所述装置还包括:
    存储模块,用于将所述原始图像存储至存储空间。
  19. 一种图像发送端,其特征在于,包括:
    至少一个处理器;以及,
    与所述至少一个处理器通信连接的存储器;其中,
    所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行权利要求1-10的任一项所述的方法。
  20. 一种飞行器图传系统,其特征在于,包括:图像发送端以及图像接收端;
    其中,所述图像发送端和所述图像接收端通信连接;
    所述图像发送端为如权利要求19所述的图像发送端。
PCT/CN2019/106726 2018-09-20 2019-09-19 图像传输方法、装置、图像发送端及飞行器图传系统 WO2020057609A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/206,466 US11523085B2 (en) 2018-09-20 2021-03-19 Method, system, device for video data transmission and photographing apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811101486.2A CN109040840B (zh) 2018-09-20 2018-09-20 图像传输方法、装置、图像发送端及飞行器图传系统
CN201811101486.2 2018-09-20

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/206,466 Continuation US11523085B2 (en) 2018-09-20 2021-03-19 Method, system, device for video data transmission and photographing apparatus

Publications (1)

Publication Number Publication Date
WO2020057609A1 true WO2020057609A1 (zh) 2020-03-26

Family

ID=64617714

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/106726 WO2020057609A1 (zh) 2018-09-20 2019-09-19 图像传输方法、装置、图像发送端及飞行器图传系统

Country Status (3)

Country Link
US (1) US11523085B2 (zh)
CN (2) CN109040840B (zh)
WO (1) WO2020057609A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117579896A (zh) * 2024-01-15 2024-02-20 深圳市光明顶技术有限公司 无人机视频传输处理方法、装置、设备及存储介质

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN109040840B (zh) * 2018-09-20 2021-09-21 深圳市道通智能航空技术股份有限公司 图像传输方法、装置、图像发送端及飞行器图传系统
CN113411492B (zh) * 2018-12-26 2023-04-07 深圳市道通智能航空技术股份有限公司 一种图像处理方法、装置和无人机
CN110324669A (zh) * 2019-07-01 2019-10-11 广州视源电子科技股份有限公司 缩略图的传输方法、装置及系统
CN112272745A (zh) * 2019-11-27 2021-01-26 深圳市大疆创新科技有限公司 集成相机机身的云台组件和拍摄设备
CN110958431A (zh) * 2019-12-11 2020-04-03 武汉迅检科技有限公司 多路视频压缩后传系统及方法
CN113572999A (zh) * 2021-07-20 2021-10-29 康佳集团股份有限公司 一种数据转换方法、装置、终端设备及存储介质
CN116112712B (zh) * 2023-01-12 2023-09-26 联通沃音乐文化有限公司 一种自适应影音视频传输播放方法

Citations (7)

Publication number Priority date Publication date Assignee Title
US20120200703A1 (en) * 2009-10-22 2012-08-09 Bluebird Aero Systems Ltd. Imaging system for uav
CN104219492A (zh) * 2013-11-14 2014-12-17 成都时代星光科技有限公司 无人机图像传输系统
CN105160636A (zh) * 2015-07-09 2015-12-16 北京控制工程研究所 一种面向星上光学成像敏感器的自适应图像预处理方法
CN105657486A (zh) * 2015-08-21 2016-06-08 乐视致新电子科技(天津)有限公司 音视频播放设备
CN106375720A (zh) * 2016-09-12 2017-02-01 中国科学院自动化研究所 智能视觉云台系统及其实现方法
CN107005673A (zh) * 2016-10-26 2017-08-01 深圳市大疆灵眸科技有限公司 云台
CN109040840A (zh) * 2018-09-20 2018-12-18 深圳市道通智能航空技术有限公司 图像传输方法、装置、图像发送端及飞行器图传系统

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
CN102131101A (zh) * 2011-04-21 2011-07-20 江苏东怡软件技术有限公司 智能视频图像质量自动分析系统及其分析方法
JP2012248984A (ja) * 2011-05-26 2012-12-13 Sony Corp 信号送信装置、信号送信方法、信号受信装置、信号受信方法及び信号伝送システム
KR20150059722A (ko) * 2012-08-10 2015-06-02 엘지전자 주식회사 신호 송수신 장치 및 신호 송수신 방법
CN103269425A (zh) * 2013-04-18 2013-08-28 中国科学院长春光学精密机械与物理研究所 多功能智能图像转换系统
CN105450980B (zh) * 2014-08-25 2018-08-31 北京计算机技术及应用研究所 一种高清航拍控制与视频回传方法及系统
CN104683773B (zh) * 2015-03-25 2017-08-25 北京真德科技发展有限公司 无人机视频高速传输方法
CN106483971A (zh) * 2015-08-27 2017-03-08 成都鼎桥通信技术有限公司 无人机数据传输方法和装置
CN105187723B (zh) * 2015-09-17 2018-07-10 深圳市十方联智科技有限公司 一种无人飞行器的摄像处理方法
CN111182268B (zh) * 2016-01-29 2021-08-17 深圳市大疆创新科技有限公司 视频数据传输方法、系统、设备和拍摄装置
CN106060582B (zh) * 2016-05-24 2019-06-11 广州华多网络科技有限公司 视频传输系统、方法及装置
CN106127698A (zh) * 2016-06-15 2016-11-16 深圳市万普拉斯科技有限公司 图像降噪处理方法和装置
WO2018086099A1 (zh) * 2016-11-14 2018-05-17 深圳市大疆创新科技有限公司 图像处理方法、装置、设备及视频图传系统
CN106780598A (zh) * 2016-12-05 2017-05-31 歌尔科技有限公司 一种基于无人机的水面漂浮物检测方法和无人机
CN207638795U (zh) * 2017-12-29 2018-07-20 长江勘测规划设计研究有限责任公司 基于无人机的岩石高边坡远程信息采集设备

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
US20120200703A1 (en) * 2009-10-22 2012-08-09 Bluebird Aero Systems Ltd. Imaging system for uav
CN104219492A (zh) * 2013-11-14 2014-12-17 成都时代星光科技有限公司 无人机图像传输系统
CN105160636A (zh) * 2015-07-09 2015-12-16 北京控制工程研究所 一种面向星上光学成像敏感器的自适应图像预处理方法
CN105657486A (zh) * 2015-08-21 2016-06-08 乐视致新电子科技(天津)有限公司 音视频播放设备
CN106375720A (zh) * 2016-09-12 2017-02-01 中国科学院自动化研究所 智能视觉云台系统及其实现方法
CN107005673A (zh) * 2016-10-26 2017-08-01 深圳市大疆灵眸科技有限公司 云台
CN109040840A (zh) * 2018-09-20 2018-12-18 深圳市道通智能航空技术有限公司 图像传输方法、装置、图像发送端及飞行器图传系统

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN117579896A (zh) * 2024-01-15 2024-02-20 深圳市光明顶技术有限公司 无人机视频传输处理方法、装置、设备及存储介质
CN117579896B (zh) * 2024-01-15 2024-04-02 深圳市光明顶技术有限公司 无人机视频传输处理方法、装置、设备及存储介质

Also Published As

Publication number Publication date
US20210211607A1 (en) 2021-07-08
CN109040840A (zh) 2018-12-18
CN113645501A (zh) 2021-11-12
US11523085B2 (en) 2022-12-06
CN109040840B (zh) 2021-09-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19862164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19862164

Country of ref document: EP

Kind code of ref document: A1