WO2021195967A1 - 一种图像处理方法、设备、控制终端及可移动平台 (Image processing method and device, control terminal, and movable platform) - Google Patents

一种图像处理方法、设备、控制终端及可移动平台

Info

Publication number
WO2021195967A1
WO2021195967A1 (application PCT/CN2020/082447)
Authority
WO
WIPO (PCT)
Prior art keywords
image
grayscale image
grayscale
resampled
gray
Prior art date
Application number
PCT/CN2020/082447
Other languages
English (en)
French (fr)
Inventor
张青涛
曹子晟
夏斌强
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080004283.0A priority Critical patent/CN112544068A/zh
Priority to PCT/CN2020/082447 priority patent/WO2021195967A1/zh
Publication of WO2021195967A1 publication Critical patent/WO2021195967A1/zh

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N19/42 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation

Definitions

  • the present disclosure relates to the field of image processing technology, and in particular to an image processing method, device, control terminal, and movable platform.
  • Steps such as image compression will cause the color image displayed on the display device to be blurred, with low resolution, and correspondingly reduce user experience and display accuracy.
  • the present disclosure provides an image processing method, which includes:
  • Pseudo-color mapping is performed on the resampled grayscale image to obtain a pseudo-color image, and the pseudo-color image is used for display by a display device.
  • the present disclosure also provides an image processing method, which includes:
  • the present disclosure also provides an image processing method, which includes:
  • Pseudo-color mapping is performed on the grayscale image to obtain a pseudo-color image, and the pseudo-color image is used for display by a display device.
  • the present disclosure also provides an image processing device, including:
  • Memory used to store executable instructions
  • the processor is configured to execute executable instructions stored in the memory to perform the following operations:
  • Pseudo-color mapping is performed on the resampled grayscale image to obtain a pseudo-color image, and the pseudo-color image is used for display by a display device.
  • the present disclosure also provides an image processing device, including:
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations:
  • the present disclosure also provides an image processing device, including:
  • Memory used to store executable instructions
  • the processor is configured to execute executable instructions stored in the memory to perform the following operations:
  • Pseudo-color mapping is performed on the grayscale image to obtain a pseudo-color image, and the pseudo-color image is used for display by a display device.
  • the present disclosure also provides a display device, including a display device and an image processing device, and the image processing device includes:
  • Memory used to store executable instructions
  • the processor is configured to execute executable instructions stored in the memory to perform the following operations:
  • Pseudo-color mapping is performed on the resampled grayscale image to obtain a pseudo-color image, and the pseudo-color image is used for display by a display device.
  • It can be understood that, depending on system resource constraints, the grayscale images in all of the above embodiments may, after pseudo-color mapping, undergo processing other than resampling and data transmission, such as correction processing and noise reduction processing, before being displayed on the display device.
  • the present disclosure also provides an imaging device, including an image sensor and an image processing device, and the image processing device includes:
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations:
  • Further, the imaging device may also include a display device; that is, after the grayscale image collected by the image sensor is transmitted in a wired or wireless manner, the imaging device receives the grayscale image, performs pseudo-color mapping and other processing on it, and finally displays it on the display device built into the imaging device.
  • the present disclosure also provides a movable platform including a body and the imaging device as described above.
  • the present disclosure also provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed, the steps of the above-mentioned image processing method are realized.
  • Because pseudo-color mapping is performed only after the grayscale image has been resampled, the processes of image compression, transmission, decompression, and resampling act only on the grayscale image. This suppresses the resolution loss of the pseudo-color image, so a high-resolution pseudo-color image can be output, improving the ability to detect and find targets.
  • FIG. 1 is a flowchart of an image processing method according to an embodiment of the disclosure.
  • FIG. 2 is a flowchart of an image processing method according to another embodiment of the present disclosure.
  • FIG. 3 is a flowchart of an image processing method according to another embodiment of the present disclosure.
  • FIG. 4 is a flowchart of an image processing method according to another embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of an image processing device according to another embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of the image processing flow of the image processing device in FIG. 5.
  • FIG. 7 is a schematic structural diagram of an image processing system according to another embodiment of the disclosure.
  • FIG. 8 is a schematic diagram of the image processing flow of the image processing system in FIG. 7.
  • FIG. 9 is a schematic diagram of another image processing flow of the image processing device in FIG. 5.
  • FIG. 10 is a schematic diagram of another image processing flow of the image processing system in FIG. 7.
  • In an embodiment of the present disclosure, an image processing method is provided.
  • An imaging device senses a target to obtain a grayscale image that reflects frequency bands such as infrared and ultraviolet. After removal of noise and defects and image enhancement, the grayscale image is usually converted into a pseudo-color image, with different colors used to represent the band information, giving the user a more intuitive display experience.
  • The grayscale image is usually compressed before being output; the grayscale image is then decompressed and resampled, and displayed on the display device at the image receiving end.
  • Throughout this process, the resolution of the pseudo-color image is reduced.
  • When the pseudo-color image is a YUV-format image and the image sending end compresses it with H.264 or a similar compression algorithm, only the compression of YUV422 or YUV420 images is supported; compression of YUV444 is not supported, which causes a loss of UV (chroma) resolution and impairs the resolution of the pseudo-color image.
  • In addition, in order to fit the display screen, the image to be displayed often has to be resampled, i.e. enlarged or reduced, and resampling the UV channels of a YUV422 or YUV420 pseudo-color image brings a further large loss of pseudo-color resolution. The displayed pseudo-color image therefore ends up blurred and of poor resolution, which reduces the detection and discovery capability of the infrared system.
  • The present disclosure therefore provides an image processing method, and an image processing device that performs the method, suitable for any system that needs to process and transmit images, including pictures and video.
  • The following embodiments describe the present disclosure using an unmanned aerial vehicle (UAV) system as an example, but those skilled in the art should understand that the methods and devices provided in the present disclosure are also applicable to other systems, such as various movable platforms carrying imaging devices, for example drones, unmanned vehicles, unmanned boats, or self-driving cars.
  • the unmanned aerial vehicle serves as the image sending end
  • the control device of the unmanned aerial vehicle serves as the image receiving end.
  • the control device is, for example, a remote controller.
  • the image processing device of the present disclosure may be located on the control device of the unmanned aerial vehicle. The images obtained by the drone can be transmitted to the control device in real time.
  • an image processing method is provided, as shown in FIG. 1, which includes the following steps:
  • S11: Receive the resampled grayscale image. The image receiving end receives the resampled grayscale image.
  • In this embodiment, the resampling of the grayscale image is performed at the image sending end.
  • S12: Perform pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image.
  • Pseudo-color mapping transforms the grayscale image into a color image format, for example RGB or YUV, where the RGB and YUV formats can be converted into each other through a matrix transformation.
  • RGB, YUV, and other formats are encoding methods for color images. Taking YUV as an example, "Y" represents luminance, i.e. the gray level, while "U" and "V" represent chrominance, which describes the color and saturation of the image and specifies the color of each pixel.
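  • As a concrete illustration of the matrix transformation mentioned above, the following minimal Python/NumPy sketch converts between RGB and YUV using the BT.601 full-range coefficients; these particular coefficients are an assumption for the example, since the disclosure does not fix a specific conversion matrix.

```python
import numpy as np

# BT.601 full-range RGB -> YUV matrix (assumed; the disclosure does not fix one).
RGB2YUV = np.array([
    [ 0.299,  0.587,  0.114],   # Y
    [-0.147, -0.289,  0.436],   # U
    [ 0.615, -0.515, -0.100],   # V
])
YUV2RGB = np.linalg.inv(RGB2YUV)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """rgb: H x W x 3 float array in [0, 1]; returns the H x W x 3 YUV image."""
    return rgb @ RGB2YUV.T

def yuv_to_rgb(yuv: np.ndarray) -> np.ndarray:
    """Inverse transform back to RGB."""
    return yuv @ YUV2RGB.T
```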
  • To save bandwidth, most YUV formats use fewer than 24 bits per pixel on average.
  • The main sampling formats are YUV420, YUV422, YUV411, and YUV444.
  • 4:4:4 means full chroma sampling; 4:2:2 means 2:1 horizontal chroma subsampling with full vertical sampling; 4:2:0 means 2:1 horizontal and 2:1 vertical chroma subsampling; 4:1:1 means 4:1 horizontal chroma subsampling with full vertical sampling.
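  • The bandwidth saving from chroma subsampling can be seen with a short calculation. The helper below is a sketch that assumes 8-bit samples and the usual J:a:b block convention; it reproduces the familiar figures of 24 bits per pixel for 4:4:4, 16 for 4:2:2, and 12 for 4:2:0 and 4:1:1.

```python
def avg_bits_per_pixel(j: int, a: int, b: int, bit_depth: int = 8) -> float:
    """Average bits per pixel of a Y'UV image with J:a:b chroma subsampling.

    The reference block is J pixels wide and 2 rows high: it holds 2*J luma
    samples plus (a + b) chroma sample pairs (one U and one V per pair).
    """
    luma_bits = 2 * j * bit_depth
    chroma_bits = (a + b) * 2 * bit_depth
    return (luma_bits + chroma_bits) / (2 * j)

for fmt, (j, a, b) in {"4:4:4": (4, 4, 4), "4:2:2": (4, 2, 2),
                       "4:2:0": (4, 2, 0), "4:1:1": (4, 1, 1)}.items():
    print(fmt, avg_bits_per_pixel(j, a, b))   # 24.0, 16.0, 12.0, 12.0
```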
  • the methods of pseudo-color mapping include density segmentation, spatial domain gray-level-color transformation, and frequency domain pseudo-color enhancement.
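  • As an illustration of the simplest of these approaches, density segmentation (density slicing) can be implemented as a 256-entry lookup table applied to an 8-bit grayscale image. The sketch below uses a hypothetical four-band table chosen only for brevity; a real system would typically use a smoother palette, for example an "ironbow"-style colormap for thermal imagery.

```python
import numpy as np

def build_density_slice_lut() -> np.ndarray:
    """256 x 3 uint8 RGB lookup table with four gray-level bands (illustrative only)."""
    lut = np.zeros((256, 3), dtype=np.uint8)
    lut[0:64]    = (0, 0, 255)     # lowest band  -> blue
    lut[64:128]  = (0, 255, 0)     # -> green
    lut[128:192] = (255, 255, 0)   # -> yellow
    lut[192:256] = (255, 0, 0)     # highest band -> red
    return lut

def pseudo_color(gray_u8: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map an H x W uint8 grayscale image to an H x W x 3 RGB pseudo-color image."""
    return lut[gray_u8]

gray = (np.random.rand(240, 320) * 255).astype(np.uint8)   # stand-in for a sensor frame
rgb = pseudo_color(gray, build_density_slice_lut())
```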
  • S13: Display the pseudo-color image. The image receiving end includes a display device for displaying the pseudo-color image. It should be noted that the spatial resolution of the received resampled grayscale image matches the spatial resolution of the display device, so that the image to be displayed is adapted to the display device.
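  • One minimal way to choose the resampling target so that the image matches the display, assuming the aspect ratio is to be preserved (a choice the disclosure does not specify), is to scale by the limiting dimension:

```python
def fit_to_display(img_w: int, img_h: int, disp_w: int, disp_h: int) -> tuple[int, int]:
    """Largest (width, height) that fits the display while keeping the aspect ratio."""
    scale = min(disp_w / img_w, disp_h / img_h)
    return max(1, round(img_w * scale)), max(1, round(img_h * scale))

print(fit_to_display(640, 512, 1920, 1080))   # e.g. a 640x512 thermal frame -> (1350, 1080)
```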
  • The source image of the grayscale image can be an infrared image, an ultraviolet image, or a near-infrared image.
  • Taking an infrared image as an example, the image output directly by the infrared sensor is handled as a grayscale image through every stage of image processing; only just before display is it pseudo-color transformed, converting the grayscale image into a clear pseudo-color image in which different colors represent temperature, which improves the user experience.
  • To save bandwidth and improve the image transmission experience, the image sending end can compress the resampled grayscale image before transmitting it to the image receiving end, in which case the resampled grayscale image received by the image receiving end is a compressed grayscale image. Therefore, this embodiment further includes, between steps S11 and S12: decompressing the grayscale image.
  • The image receiving end decodes, i.e. decompresses, the grayscale image, which is the inverse operation of compression. After the grayscale image is decompressed, pseudo-color mapping is performed on the decompressed grayscale image.
  • an image processing method is also provided, as shown in FIG. 2, which includes the following steps:
  • S21: Resample the grayscale image to obtain the resampled grayscale image, and send the resampled grayscale image. This step is performed at the image sending end.
  • Image resampling is used to adjust the resolution of grayscale images, including up-sampling and down-sampling. Upsampling can increase the resolution of grayscale images, and downsampling can reduce the resolution of grayscale images.
  • The resampling algorithm is not limited; for example, the nearest-neighbor method, bilinear interpolation, or cubic convolution interpolation may be used. The nearest-neighbor method takes the value of the source pixel closest to a given pixel position as the new value of that pixel; it is simple, computationally cheap, and does not alter the gray levels of the original grayscale image.
  • Bilinear interpolation generates the new pixel value by distance-weighting the four neighboring source pixels around the sampled position, and the resulting grayscale image is continuous.
  • Cubic convolution interpolation is a higher-precision but more computationally expensive method; it achieves the best resampling result by increasing the number of neighboring pixels involved in the interpolation.
  • the image to be resampled is a grayscale image
  • the information of the grayscale image is comprehensive, and the resolution loss caused by the resampling process is not large.
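  • A minimal sketch of the first of these algorithms, nearest-neighbor resampling of a single-channel grayscale image, is given below; in practice a library routine such as OpenCV's cv2.resize (with INTER_NEAREST, INTER_LINEAR, or INTER_CUBIC) would usually be used instead.

```python
import numpy as np

def resample_nearest(gray: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbor up- or down-sampling of an H x W grayscale image."""
    in_h, in_w = gray.shape
    # For each output pixel, pick the nearest source pixel (floor of the back-projected index).
    rows = np.minimum(np.arange(out_h) * in_h // out_h, in_h - 1)
    cols = np.minimum(np.arange(out_w) * in_w // out_w, in_w - 1)
    return gray[rows[:, None], cols[None, :]]

gray = (np.random.rand(512, 640) * 255).astype(np.uint8)
small = resample_nearest(gray, 256, 320)    # downsample
large = resample_nearest(gray, 1024, 1280)  # upsample
```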
  • S22: The image receiving end receives the resampled grayscale image and performs pseudo-color mapping on it to obtain a pseudo-color image.
  • S23: Display the pseudo-color image on the display device. It should be noted that, after the resampling of the grayscale image in step S21, the spatial resolution of the received resampled grayscale image already matches the spatial resolution of the display device, so that the image to be displayed is adapted to the display screen.
  • To save bandwidth and improve the image transmission experience, the image sending end may compress the resampled grayscale image before transmitting it to the image receiving end. Therefore, in this embodiment, after the resampling and before the resampled grayscale image is sent, the method further includes compressing the resampled grayscale image; the compressed grayscale image is then transmitted to the image receiving end.
  • The resampled grayscale image received by the image receiving end is then a compressed grayscale image. Therefore, in this embodiment, after the resampled grayscale image is received and before pseudo-color mapping is performed on it, the method further includes decompressing the resampled grayscale image, which is the inverse operation of compression. After the grayscale image is decompressed, pseudo-color mapping is performed on the decompressed grayscale image.
  • The image compression algorithm is not limited; it can be any standard or proprietary coding protocol, such as, but not limited to, H.263, H.264, H.265, or MPEG-4. Regardless of whether a standard or a proprietary coding protocol is used, the image being encoded is a grayscale image.
  • The information of the grayscale image is complete, and the resolution loss caused by encoding is small and can even be zero. According to the compression effect, the coding of grayscale images can be divided into lossy coding and lossless coding. Lossy coding discards irrelevant information and can only approximately reconstruct the grayscale image, while lossless coding removes only redundant information from the grayscale image data and can exactly restore the grayscale image on decompression, at the cost of a larger amount of data.
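  • The difference between lossless and lossy coding of a grayscale image can be illustrated with still-image codecs standing in for the H.26x/MPEG video codecs named above (an assumption made purely for brevity); OpenCV is assumed to be available.

```python
import cv2
import numpy as np

# Synthetic horizontal ramp as a stand-in for a sensor frame.
gray = (np.tile(np.arange(640) % 256, (512, 1))).astype(np.uint8)

# Lossless route: PNG keeps every gray level, at the cost of a larger bitstream.
ok, png_buf = cv2.imencode(".png", gray)
png_back = cv2.imdecode(png_buf, cv2.IMREAD_GRAYSCALE)
assert np.array_equal(png_back, gray)          # exact reconstruction

# Lossy route: JPEG discards information and only approximates the original.
ok, jpg_buf = cv2.imencode(".jpg", gray, [cv2.IMWRITE_JPEG_QUALITY, 85])
jpg_back = cv2.imdecode(jpg_buf, cv2.IMREAD_GRAYSCALE)
print(len(png_buf), len(jpg_buf),
      np.abs(jpg_back.astype(int) - gray.astype(int)).max())  # sizes and max error
```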
  • In addition, before the grayscale image is resampled, this embodiment further includes: obtaining the original grayscale image with the image sensor, and preprocessing the original grayscale image obtained by the image sensor.
  • the preprocessing may include correction processing, noise reduction processing, enhancement processing, and so on.
  • UAVs generally carry image sensors.
  • When the UAV performs a flight mission, the image sensor senses the area within its field of view and outputs images.
  • The output images need to go through a series of preprocessing steps, such as removal of dead pixels and noise, and image enhancement.
  • After this preprocessing, the output is a grayscale image. Only after preprocessing can a grayscale image with rich detail and high resolution be obtained, which provides high-quality image data for the subsequent image processing.
  • Image enhancement includes image contrast enhancement and image detail enhancement.
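  • A rough sketch of such a preprocessing stage is given below, assuming SciPy is available. The median filter stands in for dead-pixel and noise removal, and a percentile stretch stands in for contrast enhancement; both are illustrative choices rather than the specific algorithms of the disclosure.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(raw: np.ndarray) -> np.ndarray:
    """Dead-pixel/noise suppression followed by a simple contrast stretch."""
    # A 3x3 median filter removes isolated dead pixels and impulse noise.
    denoised = median_filter(raw.astype(np.float32), size=3)
    # Stretch the 1st..99th percentile range to the full 8-bit range.
    lo, hi = np.percentile(denoised, (1, 99))
    stretched = np.clip((denoised - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)

raw = (np.random.rand(512, 640) * 4096).astype(np.uint16)   # e.g. a 12-bit raw frame
gray = preprocess(raw)
```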
  • an image processing method is also provided, as shown in FIG. 3, which includes the following steps:
  • S31: The image receiving end receives the grayscale image.
  • S32: Resample the grayscale image. In this embodiment, the resampling of the grayscale image is performed at the image receiving end.
  • Image resampling is used to adjust the resolution of grayscale images, including up-sampling and down-sampling. Upsampling can increase the resolution of grayscale images, and downsampling can reduce the resolution of grayscale images.
  • The resampling algorithm is not limited; for example, the nearest-neighbor method, bilinear interpolation, or cubic convolution interpolation may be used. The nearest-neighbor method takes the value of the source pixel closest to a given pixel position as the new value of that pixel; it is simple, computationally cheap, and does not alter the gray levels of the original grayscale image.
  • Bilinear interpolation generates the new pixel value by distance-weighting the four neighboring source pixels around the sampled position, and the resulting grayscale image is continuous.
  • Cubic convolution interpolation is a higher-precision but more computationally expensive method; it achieves the best resampling result by increasing the number of neighboring pixels involved in the interpolation.
  • the image to be resampled is a grayscale image
  • the information of the grayscale image is comprehensive, and the resolution loss caused by the resampling process is not large.
  • S33: Perform pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image.
  • The function of pseudo-color mapping is to transform the grayscale image into a color image format, for example RGB or YUV, where the RGB and YUV formats can be converted into each other through a matrix transformation.
  • RGB, YUV, and other formats are encoding methods for color images. Taking YUV as an example, "Y" represents luminance, i.e. the gray level, while "U" and "V" represent chrominance, which describes the color and saturation of the image and specifies the color of each pixel.
  • To save bandwidth, most YUV formats use fewer than 24 bits per pixel on average. The main sampling formats are YUV420, YUV422, YUV411, and YUV444: 4:4:4 means full chroma sampling; 4:2:2 means 2:1 horizontal chroma subsampling with full vertical sampling; 4:2:0 means 2:1 horizontal and 2:1 vertical chroma subsampling; 4:1:1 means 4:1 horizontal chroma subsampling with full vertical sampling.
  • the methods of pseudo-color mapping include density segmentation, spatial domain gray-level-color transformation, and frequency domain pseudo-color enhancement.
  • S34: Display the pseudo-color image. The image receiving end includes a display device for displaying the pseudo-color image. It should be noted that the spatial resolution of the received grayscale image matches the spatial resolution of the display device, so that the image to be displayed is adapted to the display device.
  • The source image of the grayscale image can be an infrared image, an ultraviolet image, or a near-infrared image.
  • Taking an infrared image as an example, the image output directly by the infrared sensor is handled as a grayscale image through every stage of image processing; only just before display is it pseudo-color transformed, converting the grayscale image into a clear pseudo-color image in which different colors represent temperature, which improves the user experience.
  • To save bandwidth and improve the image transmission experience, the image sending end can compress the grayscale image before transmitting it to the image receiving end.
  • The grayscale image received by the image receiving end is then a compressed grayscale image, so after step S31 and before step S32 the method further includes: decompressing the grayscale image. After the grayscale image is decompressed, the decompressed grayscale image is resampled.
  • an image processing method is also provided, as shown in FIG. 4, which includes the following steps:
  • S41: The image sending end sends a grayscale image.
  • S42: The image receiving end receives the grayscale image and resamples the grayscale image.
  • S43: Perform pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image.
  • S44: Display the pseudo-color image.
  • To save bandwidth and improve the image transmission experience, the image sending end may compress the grayscale image before transmitting it to the image receiving end. Therefore, this embodiment also includes compressing the grayscale image before step S41; the compressed grayscale image is then transmitted to the image receiving end.
  • After receiving the grayscale image and before resampling it, the image receiving end further performs: decompressing the received grayscale image, which is the inverse operation of compression. After the grayscale image is decompressed, the decompressed grayscale image is resampled.
  • In addition, before the grayscale image is compressed, this embodiment further includes: obtaining the original grayscale image with the image sensor, and preprocessing the original grayscale image obtained by the image sensor.
  • UAVs generally carry image sensors.
  • When the UAV performs a flight mission, the image sensor senses the area within its field of view and outputs images.
  • The output images need to go through a series of preprocessing steps, such as removal of dead pixels and noise, and image enhancement.
  • After this preprocessing, the output is a grayscale image. Only after preprocessing can a grayscale image with rich detail and high resolution be obtained, which provides high-quality image data for the subsequent image processing.
  • Image enhancement includes image contrast enhancement and image detail enhancement.
  • In another embodiment of the present disclosure, an image processing device is provided. The image processing device is located at the image receiving end, for example the control device of a drone, and the control device may be a remote controller. Referring to FIG. 5 and FIG. 6, the image processing device includes a memory, a processor, and a display device.
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations: receiving the resampled grayscale image; performing pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image; and displaying the pseudo-color image on the display device.
  • Pseudo-color mapping transforms the grayscale image into a color image format, for example RGB or YUV, where the RGB and YUV formats can be converted into each other through a matrix transformation.
  • RGB, YUV, and other formats are encoding methods for color images. Taking YUV as an example, "Y" represents luminance, i.e. the gray level, while "U" and "V" represent chrominance, which describes the color and saturation of the image and specifies the color of each pixel.
  • To save bandwidth, most YUV formats use fewer than 24 bits per pixel on average. The main sampling formats are YUV420, YUV422, YUV411, and YUV444: 4:4:4 means full chroma sampling; 4:2:2 means 2:1 horizontal chroma subsampling with full vertical sampling; 4:2:0 means 2:1 horizontal and 2:1 vertical chroma subsampling; 4:1:1 means 4:1 horizontal chroma subsampling with full vertical sampling.
  • The resampled grayscale image is a compressed grayscale image; therefore, the processor also performs the following operation: after receiving the resampled grayscale image and before performing pseudo-color mapping on the resampled grayscale image, decompressing the compressed grayscale image.
  • an image processing system is also provided, as shown in FIGS. 7 and 8, which includes: a first image processing device and a second image processing device;
  • the first image processing device includes:
  • the first memory is used to store executable instructions
  • the first processor is configured to execute the executable instructions stored in the first memory to perform the following operations: resampling the grayscale image to obtain a resampled grayscale image; and sending the resampled grayscale image.
  • the second image processing device includes:
  • the second memory is used to store executable instructions
  • the second processor is configured to execute the executable instructions stored in the second memory to perform the following operations: receiving the resampled grayscale image; performing pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image; and displaying the pseudo-color image on the display device.
  • the first image processing device is located at the image sending end, such as a drone
  • the second image processing device is located at the image receiving end, such as the remote control of the drone.
  • the drone sends the compressed grayscale image to the remote controller over a wireless network.
  • the type of wireless network is not limited, and the wireless network can be a standard wireless network or a private wireless network.
  • the standard wireless network is, for example, WI-FI
  • the private wireless network is, for example, OcuSync, Light Bridge, and so on.
  • The above applies to the unmanned aerial vehicle case; when the image processing system of this embodiment is applied to other systems, the image can also be transmitted over a wired network.
  • image resampling is used to adjust the resolution of grayscale images, including up-sampling and down-sampling.
  • Upsampling can increase the resolution of grayscale images
  • downsampling can reduce the resolution of grayscale images.
  • The resampling algorithm is not limited; for example, the nearest-neighbor method, bilinear interpolation, or cubic convolution interpolation may be used. The nearest-neighbor method takes the value of the source pixel closest to a given pixel position as the new value of that pixel; it is simple, computationally cheap, and does not alter the gray levels of the original grayscale image.
  • Bilinear interpolation generates the new pixel value by distance-weighting the four neighboring source pixels around the sampled position, and the resulting grayscale image is continuous.
  • Cubic convolution interpolation is a higher-precision but more computationally expensive method; it achieves the best resampling result by increasing the number of neighboring pixels involved in the interpolation.
  • the image to be resampled is a grayscale image
  • the information of the grayscale image is comprehensive, and the resolution loss caused by the resampling process is not large.
  • Image compression can be performed before the resampled grayscale image is sent. Therefore, the first processor also performs the following operation: after resampling the grayscale image and before sending the resampled grayscale image, compressing the resampled grayscale image.
  • The received resampled grayscale image is then a compressed image. Therefore, the second processor also performs the following operation: after receiving the resampled grayscale image and before performing pseudo-color mapping on it, decompressing the resampled grayscale image.
  • the image compression algorithm is not limited, and it can be various standards or proprietary coding protocols, such as but not limited to H.263, H.264, H.265, MPEG-4, etc.
  • Regardless of whether a standard or a proprietary coding protocol is used, the image being encoded is a grayscale image.
  • The information of the grayscale image is complete, and the resolution loss caused by encoding is small and can even be zero. According to the compression effect, the coding of grayscale images can be divided into lossy coding and lossless coding.
  • Lossy coding discards irrelevant information during encoding and can only approximately reconstruct the original image, while a lossless coding algorithm removes only redundant information from the image data and can exactly restore the original image on decompression, which inevitably increases the amount of data produced by lossless coding.
  • the first processor also performs the following operations: before resampling the grayscale image, obtaining the original grayscale image and preprocessing the original grayscale image.
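  • Putting the pieces together, the sketch below mirrors the division of work described for this system: the sending end preprocesses, resamples, and compresses the grayscale image, and the receiving end decompresses it and only then applies pseudo-color mapping. It reuses the illustrative helpers sketched earlier (preprocess, resample_nearest, build_density_slice_lut, pseudo_color) and again uses PNG via OpenCV as a stand-in codec; all of these are assumptions for illustration rather than the exact components of the disclosure.

```python
import cv2
import numpy as np

def sender_side(raw: np.ndarray, disp_h: int, disp_w: int) -> bytes:
    """Image sending end: preprocess, resample to the display resolution, compress."""
    gray = preprocess(raw)                                   # defined in an earlier sketch
    resampled = resample_nearest(gray, disp_h, disp_w)       # defined in an earlier sketch
    ok, buf = cv2.imencode(".png", resampled)                # lossless stand-in codec
    return buf.tobytes()

def receiver_side(payload: bytes) -> np.ndarray:
    """Image receiving end: decompress, then pseudo-color map for display."""
    gray = cv2.imdecode(np.frombuffer(payload, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)
    return pseudo_color(gray, build_density_slice_lut())     # defined in an earlier sketch

raw = (np.random.rand(512, 640) * 4096).astype(np.uint16)
rgb_for_display = receiver_side(sender_side(raw, disp_h=1080, disp_w=1350))
```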
  • an image processing device is also provided, as shown in FIG. 5 and FIG. 9, which includes a memory, a processor, and a display device.
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations: receiving the grayscale image; resampling the grayscale image; performing pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image; and displaying the pseudo-color image on the display device.
  • the received grayscale image is a compressed grayscale image. Therefore, the processor also performs the following operations: after receiving the grayscale image and before resampling the grayscale image, decompress the compressed grayscale image.
  • an image processing system is also provided, as shown in FIG. 7 and FIG. 10, which includes: a first image processing device and a second image processing device;
  • the first image processing device includes:
  • the first memory is used to store executable instructions
  • the first processor is configured to execute the executable instructions stored in the first memory to perform the following operation: sending a grayscale image.
  • the second image processing device includes:
  • the second memory is used to store executable instructions
  • the second processor is configured to execute the executable instructions stored in the second memory to perform the following operations: receiving the grayscale image; resampling the grayscale image; performing pseudo-color mapping on the resampled grayscale image to obtain a pseudo-color image; and displaying the pseudo-color image on the display device.
  • the first image processing device is located at the image sending end, such as a drone
  • the second image processing device is located at the image receiving end, such as the remote control device of the drone.
  • the drone sends the compressed grayscale image to the remote controller over a wireless network.
  • the wireless network can be a standard wireless network or a private wireless network.
  • the standard wireless network is, for example, WI-FI
  • the private wireless network is, for example, OcuSync, Light Bridge, and so on.
  • The above applies to the unmanned aerial vehicle case; when the image processing system of this embodiment is applied to other systems, the image can also be transmitted over a wired network.
  • Image compression can be performed before the grayscale image is sent. Therefore, the first processor also performs the following operation: before sending the grayscale image, compressing the grayscale image.
  • the received grayscale image is a compressed image. Therefore, the second processor further performs the following operations: after receiving the grayscale image and before resampling the grayscale image, decompress the grayscale image.
  • the first processor also performs the following operations: before compressing the grayscale image, obtaining the original grayscale image and preprocessing the original grayscale image.
  • An embodiment of the present disclosure also provides a control terminal, including: a housing and the image processing device described in the foregoing embodiments, and the image processing device is located in the housing.
  • the embodiment of the present disclosure also provides a movable platform including the control terminal as described above.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed, the steps of the image processing method described in the above-mentioned embodiments are realized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Image Processing (AREA)

Abstract

An image processing method and device, a display apparatus, an imaging apparatus, and a movable platform. By adjusting the order of the image processing steps so that pseudo-color mapping is performed only just before the image is displayed, the other image processing steps act only on the grayscale image. Since the grayscale image carries the full information, the resolution loss over the whole image processing chain is small and can even be zero; the resulting pseudo-color image is sharp and of high resolution, improving the detection and discovery capability of the image sensor.

Description

一种图像处理方法、设备、控制终端及可移动平台 技术领域
本公开涉及图像处理技术领域,尤其涉及一种图像处理方法、设备、控制终端及可移动平台。
背景技术
在将信息呈现给用户时,一般需要将图像传感器获得的灰度图像转换为彩色图像进行显示。对图像进行的压缩等步骤会导致显示设备上所显示的彩色图像模糊,分辨率低,相应降低了用户体验和显示精度。
公开内容
本公开提供了一种图像处理方法,其中,包括:
获取重采样后的灰度图像;
对所述重采样后的灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
本公开还提供了一种图像处理方法,其中,包括:
获取通过图像传感器采集的灰度图像;
发送所述通过图像传感器采集的灰度图像,所述通过图像传感器采集的灰度图像未经伪彩映射。
本公开还提供了一种图像处理方法,其中,包括:
获取通过图像传感器采集的灰度图像;
发送所述通过图像传感器采集的灰度图像;
接收所述通过图像传感器采集的灰度图像;
对所述灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
本公开还提供了一种图像处理设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行存储器中存储的可执行指令,以执行如下操作:
获取重采样后的灰度图像;
对所述重采样后的灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
本公开还提供了一种图像处理设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行所述存储器中存储的所述可执行指令,以执行如下操作:
获取通过图像传感器采集的灰度图像;
发送所述通过图像传感器采集的灰度图像,所述通过图像传感器采集的灰度图像未经伪彩映射。
本公开还提供了一种图像处理设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行存储器中存储的可执行指令,以执行如下操作:
获取通过图像传感器采集的灰度图像;
发送所述通过图像传感器采集的灰度图像;
接收所述通过图像传感器采集的灰度图像;
对所述灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
本公开还提供了一种显示装置,包括显示设备和图像处理设备,图像处理设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行存储器中存储的可执行指令,以执行如下操作:
获取重采样后的灰度图像;
对所述重采样后的灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
可以理解的是,根据系统资源的限制,包括上述所有实施例在内的灰度图像,经伪彩映射后均可进行除重采样及数据传输外的其他处理,例如矫正处理、降噪处理等,再通过显示设备进行显示。
本公开还提供了一种成像装置,包括图像传感器和图像处理设备,图像处理设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行所述存储器中存储的所述可执行指令,以执行如下操作:
获取通过图像传感器采集的灰度图像;
发送所述通过图像传感器采集的灰度图像,所述通过图像传感器采集的灰度图像未经伪彩映射。
进一步地,该成像装置也可包含显示设备,即通过有线或无线方式发送所述通过图像传感器采集的灰度图像后,该成像装置接收灰度图像并对其进行伪彩映射等处理,最后通过该成像装置自带的显示设备显示。
本公开还提供了一种可移动平台,包括机体和如上所述的成像装置。
本公开还提供了一种计算机可读存储介质,其上存储有计算机程序,计算机程序被执行时实现如上所述的图像处理方法的步骤。
从上述技术方案可以看出,本公开至少具有以下有益效果:
通过在对灰度图像重采样之后进行伪彩映射,图像压缩、传输、解压缩、重采样等过程只对灰度图像产生作用,从而抑制了伪彩图像的分辨率损失,能够输出高分辨率的伪彩图像,提高探测和发现目标的能力。
附图说明
附图是用来提供对本公开的进一步理解,并且构成说明书的一部分,与下面的具体实施方式一起用于解释本公开,但并不构成对本公开的限制。在附图中:
图1为本公开一实施例图像处理方法的流程图。
图2为本公开另一实施例图像处理方法的流程图。
图3为本公开再一实施例图像处理方法的流程图。
图4为本公开再一实施例图像处理方法的流程图。
图5为本公开再一实施例图像处理设备的结构示意图。
图6为图5中图像处理设备的图像处理流程示意图。
图7为本公开再一实施例图像处理系统的结构示意图。
图8为图7中图像处理系统的图像处理流程示意图。
图9为图5中图像处理设备的另一图像处理流程示意图。
图10为图7中图像处理系统的另一图像处理流程示意图。
具体实施方式
在本公开一实施例中,提供了一种图像处理方法,成像装置对目标进行感应,获得反映例如红外、紫外等频段的灰度图像,所获得的灰度图像 要经过去除噪声和瑕疵、图像增强后,往往会转换为伪彩图像,用不同的色彩来表征频段信息,为用户提供更直观的显示体验。灰度图像往往在压缩后进行输出,进而对灰度图像进行解压缩、重采样后,显示到图像接收端的显示设备上。
在上述整个过程中,会导致伪彩图像的分辨率降低。当伪彩图像为YUV格式的图像,图像发送端采用H.264等压缩算法对伪彩图像进行压缩时,只支持对YUV422或YUV420的图像的压缩,不支持YUV444的压缩,导致UV分辨率的损失,使伪彩图像的分辨率受损;另外为了适配显示屏,往往需要对待显示的图像进行重采样,包括放大或缩小,在YUV422或420的伪彩图像上进行UV的重采样,也会带来伪彩分辨率的极大损失,最终致使显示的伪彩图像模糊,分辨率差,降低了红外系统的探测和发现能力。
进而,本公开提供了一种图像处理方法以及执行上述方法的图像处理设备,其适用于任何需要对包括图片、视频在内的图像进行处理及传输的系统。为描述方便,以下实施例以无人机系统为例对本公开进行说明,但本领域技术人员应当知道,本公开提供的上述方法和设备同样适用于其他系统,例如各种附带成像装置的可移动平台,例如无人机、无人车、无人船或自动驾驶汽车。对于无人机系统,无人机作为图像发送端,无人机的控制设备作为图像接收端,该控制设备例如是遥控器,本公开的图像处理设备可以位于无人机的控制设备上。无人机获得的图像可实时传输至控制设备。
下面将结合实施例和实施例中的附图,对本公开技术方案进行清楚、完整的描述。显然,所描述的实施例仅仅是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
在本公开一实施例中,提供了一种图像处理方法,参见图1所示,其包括以下步骤:
S11:接收重采样后的灰度图像。
图像接收端接收重采样后的灰度图像,在本实施例中,灰度图像的重采样步骤在图像发送端执行。
S12:对重采样后的灰度图像进行伪彩映射,得到伪彩图像。
伪彩映射是将灰度图像变换为彩色图像的格式,例如,RGB格式或者YUV格式,其中RGB格式与YUV格式之间可以通过矩阵变换进行相互变换。RGB、YUV或者其他格式均为彩色图像的编码方式,以YUV为例,“Y”表示明亮度,也就是灰阶值,“U”和“V”表示的则是色度,作用是描述影像色彩及饱和度,用于指定像素的颜色。
为了节省带宽,大多数YUV格式平均使用的每像素位数都少于24位,主要的采样格式有YUV420、YUV422、YUV411、YUV444,其中,4∶2∶0表示完全取样;4∶2∶2表示2∶1的水平采样,垂直完全采样;4∶2∶0表示2∶1的水平采样,垂直2∶1采样;4∶1∶1表示4∶1的水平采样,垂直完全采样。
以RGB格式为例,伪彩映射的方法包括密度分割、空间域灰度级-彩色变换和频率域伪彩色增强。
S13:显示伪彩图像。
在图像接收端,包括一显示设备,用以显示伪彩图像。需要说明的是,接收到的重采样后的灰度图像的空间分辨率与显示设备的空间分辨率相匹配,使得要显示的图像适配于显示设备。
由上可以看出,通过调整图像处理的步骤顺序,在显示图像之前才进行伪彩映射,这样在图像处理的其他步骤中只对灰度图像产生作用,而灰度图像的信息是全面的,在图像处理的整个过程中带来的分辨率损失并不大,甚至可以做到无损失,这样得到的伪彩图像清晰,分辨率高,提高了图像传感器的探测和发现能力。
其中,上述灰度图像的源图像可以是红外图像、紫外图像或者近红外图像,比如红外图像,红外传感器直出的图像在进行图像处理的各个过程时均是以灰度图执行的,在显示图像之前才对其进行伪彩变换,将灰度图转化为清晰的伪彩图,用不同的色彩来表征温度高低,提升了用户体验。
为了节省带宽,提升图像传输体验,图像发送端将重采样后的灰度图像传输给图像接收端之前,可先对重采样后的灰度图像进行压缩,图像接收端接收到的重采样后的灰度图像为压缩后的灰度图像,因此,本实施例在步骤S11和S12之间,还包括步骤:
对灰度图像进行解压缩。
图像接收端对灰度图像进行解码,也就是解压缩,是对应于压缩的反操作。对灰度图像进行解压缩后,再对解压缩后的灰度图像进行伪彩映射。
在本公开另一实施例中,还提供了一种图像处理方法,参见图2所示,其包括以下步骤:
S21:对灰度图像进行重采样,得到重采样后的灰度图像,并发送重采样后的灰度图像。
该步骤在图像发送端执行。图像重采样用于调节灰度图像的分辨率,包括升采样和降采样。升采样可提高灰度图像的分辨率,降采样可减小灰度图像的分辨率。
至于重采样的算法不作限制,比如最近邻法、双线性内插法或者三次卷积内插法。最邻近法是将与灰度图像中距离某像元位置最近的像元值作为该像元的新值,此方法计算简单,运算量小,且不破坏原始灰度图像的灰度信息;双线性内插法是通过取采样像元到周围4邻域像元的距离加权来生成像元新值,获得的灰度图像是连续的;三次卷积内插法是一种精度较高的方法,同时运算量较大,它是通过增加参与内插计算的邻近像元的数目来达到最佳的重采样效果。
在本实施例中,进行重采样的图像为灰度图像,灰度图像的信息是全面的,在重采样过程中带来的分辨率损失并不大。
S22:图像接收端接收重采样后的灰度图像,并对重采样后的灰度图像进行伪彩映射,得到伪彩图像。
S23:通过显示设备显示伪彩图像。需要说明的是,经过步骤S21对灰度图像进行重采样后,接收到的重采样后的灰度图像的空间分辨率已经与显示设备的空间分辨率相匹配,使得要显示的图像适配于显示设备。
为了节省带宽,提升图像传输体验,图像发送端将重采样后的灰度图像传输给图像接收端之前,可先对重采样后的灰度图像进行压缩,因此,本实施例在对灰度图像进行重采样之后,发送重采样后的灰度图像之前,还包括对重采样后的灰度图像进行压缩。之后,再将压缩后的灰度图像传输给图像接收端。
图像接收端接收到的重采样后的灰度图像为压缩后的灰度图像,因此,本实施例在接收重采样后的灰度图像之后,对重采样后的灰度图像进行伪 彩映射之前,还包括:对重采样后的灰度图像进行解压缩,是对应于压缩的反操作。对灰度图像进行解压缩后,再对解压缩后的灰度图像进行伪彩映射。
图像压缩算法不加以限制,其可以是各种标准或私有的编码协议,标准编码协议例如但不限于H.263、H.264、H.265、MPEG-4等。无论采用标准编码协议还是私有编码协议,所进行编码的图像为灰图图像,灰度图像的信息是全面的,在编码过程中带来的分辨率损失并不大,甚至可以做到无损失,因此,根据压缩效果,对灰度图像的编码可分为有损编码和无损编码。有损编码可删除不相干的信息,只能对灰度图像进行近似的重建;而无损编码仅仅删除了灰度图像数据中的冗余信息,解压缩时能够精确恢复灰度图像,但无损编码的数据量较大。
另外,对灰图图像进行重采样之前,本实施例还包括:利用图像传感器获取原始灰度图像,并对图像传感器获得的原始灰度图像进行预处理。该预处理可以包括矫正处理、降噪处理、增强处理等。
无人机一般携带有图像传感器,当无人机执行飞行任务时,图像传感器会对视线内的区域进行感应,输出图像,所输出的图像需要经过一系列的预处理,比如去除坏点、噪声等瑕疵,又比如图强增强,经过这些预处理后输出的是灰度图像。经过预处理后才能得到细节多、分辨率高的灰度图像,为后续的图像处理过程提供高质量的图像数据。其中,图像增强又包括图像对比度增强和图像细节增强。
由上可以看出,通过调整图像处理的步骤顺序,在显示图像之前才进行伪彩映射,这样在图像处理的其他步骤中只对灰度图像产生作用,而灰度图像的信息是全面的,在图像处理的整个过程中带来的分辨率损失并不大,甚至可以做到无损失,这样得到的伪彩图像清晰,分辨率高,提高了图像传感器的探测和发现能力。
在本公开再一实施例中,还提供了一种图像处理方法,参见图3所示,其包括以下步骤:
S31:图像接收端接收灰度图像。
S32:对灰图图像进行重采样。
在本实施例中,灰度图像的重采样步骤在图像接收端执行。图像重采 样用于调节灰度图像的分辨率,包括升采样和降采样。升采样可提高灰度图像的分辨率,降采样可减小灰度图像的分辨率。
至于重采样的算法不作限制,比如最近邻法、双线性内插法或者三次卷积内插法。最邻近法是将与灰度图像中距离某像元位置最近的像元值作为该像元的新值,此方法计算简单,运算量小,且不破坏原始灰度图像的灰度信息;双线性内插法是通过取采样像元到周围4邻域像元的距离加权来生成像元新值,获得的灰度图像是连续的;三次卷积内插法是一种精度较高的方法,同时运算量较大,它是通过增加参与内插计算的邻近像元的数目来达到最佳的重采样效果。
在本实施例中,进行重采样的图像为灰度图像,灰度图像的信息是全面的,在重采样过程中带来的分辨率损失并不大。
S33:对重采样后的灰度图像进行伪彩映射,得到伪彩图像。
伪彩映射的功能是将灰度图像变换为彩色图像的格式,例如,RGB格式或者YUV格式,其中RGB格式与YUV格式之间可以通过矩阵变换进行相互变换。RGB、YUV或者其他格式均为彩色图像的编码方式,以YUV为例,“Y”表示明亮度,也就是灰阶值,“U”和“V”表示的则是色度,作用是描述影像色彩及饱和度,用于指定像素的颜色。
为了节省带宽,大多数YUV格式平均使用的每像素位数都少于24位,主要的抽样格式有YUV420、YUV422、YUV411、YUV444,其中,4∶2∶0表示完全取样;4∶2∶2表示2∶1的水平采样,垂直完全采样;4∶2∶0表示2∶1的水平采样,垂直2∶1采样;4∶1∶1表示4∶1的水平采样,垂直完全采样。
以RGB格式为例,伪彩映射的方法包括密度分割、空间域灰度级-彩色变换和频率域伪彩色增强。
S34:显示伪彩图像。
在图像接收端,包括一显示设备,用以显示伪彩图像。需要说明的是,接收到的灰度图像的空间分辨率与显示设备的空间分辨率相匹配,使得要显示的图像适配于显示设备。
由上可以看出,通过调整图像处理的步骤顺序,在显示图像之前才进行伪彩映射,这样在图像处理的其他步骤中只对灰度图像产生作用,而灰度图像的信息是全面的,在图像处理的整个过程中带来的分辨率损失并不 大,甚至可以做到无损失,这样得到的伪彩图像清晰,分辨率高,提高了图像传感器的探测和发现能力。
其中,上述灰度图像的源图像可以是红外图像、紫外图像或者近红外图像,比如红外图像,红外传感器直出的图像在进行图像处理的各个过程时均是以灰度图执行的,在显示图像之前才对其进行伪彩变换,将灰度图转化为清晰的伪彩图,用不同的色彩来表征温度高低,提升了用户体验。
为了节省带宽,提升图像传输体验,图像发送端将灰度图像传输给图像接收端之前,可先对灰度图像进行压缩,图像接收端接收到的灰度图像为压缩后的灰度图像,因此,本实施例在步骤S31之后、步骤S32之前,还包括:对灰度图像进行解压缩。对灰度图像进行解压缩后,再对解压缩后的灰度图像进行重采样。
在本公开再一实施例中,还提供了一种图像处理方法,参见图4所示,其包括以下步骤:
S41:图像发送端发送灰度图像。
S42:图像接收端接收灰度图像,并对灰度图像进行重采样。
S43:对重采样后的灰度图像进行伪彩映射,得到伪彩图像。
S44:显示伪彩图像。
为了节省带宽,提升图像传输体验,图像发送端将灰度图像传输给图像接收端之前,可先对灰度图像进行压缩,因此,本实施例在步骤S41之前还包括对灰度图像进行压缩,之后,再将压缩后的灰度图像传输给图像接收端。
图像接收端在接收灰度图像之后、对灰度图像进行重采样之前,还包括:对接收的灰度图像进行解压缩,是对应于压缩的反操作。对灰度图像进行解压缩后,再对解压缩后的灰度图像进行重采样。
另外,对灰度图像进行压缩之前,本实施例还包括:利用图像传感器获取原始灰度图像,并对图像传感器获得的原始灰度图像进行预处理。
无人机一般携带有图像传感器,当无人机执行飞行任务时,图像传感器会对视线内的区域进行感应,输出图像,所输出的图像需要经过一系列的预处理,比如去除坏点、噪声等瑕疵,又比如图强增强,经过这些预处理后输出的是灰度图像。经过预处理后才能得到细节多、分辨率高的灰度 图像,为后续的图像处理过程提供高质量的图像数据。其中,图像增强又包括图像对比度增强和图像细节增强。
由上可以看出,通过调整图像处理的步骤顺序,在显示图像之前才进行伪彩映射,这样在图像处理的其他步骤中只对灰度图像产生作用,而灰度图像的信息是全面的,在图像处理的整个过程中带来的分辨率损失并不大,甚至可以做到无损失,这样得到的伪彩图像清晰,分辨率高,提高了图像传感器的探测和发现能力。
在本公开再一实施例中,还提供了一种图像处理设备,该图像处理设备位于图像接收端,例如无人机的控制设备,控制设备可以是遥控器。参见图5、6所示,其包括:存储器、处理器和显示设备。
存储器,用于存储可执行指令;
处理器,用于执行存储器中存储的可执行指令,以执行如下操作:
接收重采样后的灰度图像;
对重采样后的灰度图像进行伪彩映射,得到伪彩图像;
通过显示设备显示伪彩图像。
其中,伪彩映射是将灰度图像变换为彩色图像的格式,例如,RGB格式或者YUV格式,其中RGB格式与YUV格式之间可以通过矩阵变换进行相互变换。RGB、YUV或者其他格式均为彩色图像的编码方式,以YUV为例,“Y”表示明亮度,也就是灰阶值,“U”和“V”表示的则是色度,作用是描述影像色彩及饱和度,用于指定像素的颜色。
为了节省带宽,大多数YUV格式平均使用的每像素位数都少于24位,主要的采样格式有YUV420、YUV422、YUV411、YUV444,其中,4∶2∶0表示完全取样;4∶2∶2表示2∶1的水平采样,垂直完全采样;4∶2∶0表示2∶1的水平采样,垂直2∶1采样;4∶1∶1表示4∶1的水平采样,垂直完全采样。
重采样后的灰度图像为压缩后的灰度图像;因此,所述处理器还执行以下操作:
在接收重采样后的灰度图像之后、对重采样后的灰度图像进行伪彩映射之前,对压缩后的灰度图像进行解压缩。
在本公开实施例中,还提供了一种图像处理系统,参见图7、8所示,其包括:第一图像处理设备和第二图像处理设备;
第一图像处理设备包括:
第一存储器,用于存储可执行指令;
第一处理器,用于执行所述第一存储器中存储的所述可执行指令,以执行如下操作:
对灰度图像进行重采样,得到重采样后的灰度图像;
发送重采样后的灰图图像。
第二图像处理设备包括:
第二存储器,用于存储可执行指令;
第二处理器,用于执行所述第二存储器中存储的所述可执行指令,以执行如下操作:
接收重采样后的灰度图像;
对重采样后的灰度图像进行伪彩映射,得到伪彩图像;
通过显示设备显示伪彩图像。
在图像处理系统中,第一图像处理设备位于图像发送端,例如无人机,第二图像处理设备位于图像接收端,例如无人机的遥控器,无人机利用无线网络将压缩后的灰度图像发送给遥控器,本实施例对无线网络的种类不加以限制,无线网络可以采用标准无线网络或私有无线网络。标准无线网络例如是WI-FI,私有无线网络例如是OcuSync、Light Bridge等。
以上是针对无人机的情况,当本实施例的图像处理系统应用于其他系统时,也可以采用有线网络的方式传输图像。
其中,图像重采样用于调节灰度图像的分辨率,包括升采样和降采样。升采样可提高灰度图像的分辨率,降采样可减小灰度图像的分辨率。
至于重采样的算法不作限制,比如最近邻法、双线性内插法或者三次卷积内插法。最邻近法是将与灰度图像中距离某像元位置最近的像元值作为该像元的新值,此方法计算简单,运算量小,且不破坏原始灰度图像的灰度信息;双线性内插法是通过取采样像元到周围4邻域像元的距离加权来生成像元新值,获得的灰度图像是连续的;三次卷积内插法是一种精度较高的方法,同时运算量较大,它是通过增加参与内插计算的邻近像元的数目来达到最佳的重采样效果。
在本实施例中,进行重采样的图像为灰度图像,灰度图像的信息是全 面的,在重采样过程中带来的分辨率损失并不大。
发送重采样后的灰度图像之前可进行图像压缩,因此,第一处理器还执行以下操作:对灰度图像进行重采样之后、发送重采样后的灰度图像之前,对重采样后的灰度图像进行压缩。
接收到的重采样后的灰度图像为压缩图像,因此,第二处理器还执行以下操作:接收重采样后的灰度图像之后、对重采样后的灰度图像进行伪彩映射之前,对重采样后的灰度图像进行解压缩。
其中,图像压缩算法不加以限制,其可以是各种标准或私有的编码协议,标准编码协议例如但不限于H.263、H.264、H.265、MPEG-4等。无论采用标准编码协议还是私有编码协议,所进行编码的图像为灰图图像,灰度图像的信息是全面的,在编码过程中带来的分辨率损失并不大,甚至可以做到无损失,因此,根据压缩效果,对灰度图像的编码可分为有损编码和无损编码。有损编码在编码的过程中把不相干的信息都删除了,只能对原图像进行近似的重建;而无损编码的压缩算法中仅仅删除了图像数据中的冗余信息,解压缩时能够精确恢复原图像,这必然使得无损编码的数据量增大。
另外,第一处理器还执行以下操作:
对灰度图像进行重采样之前,获取原始灰度图像,并对原始灰度图像进行预处理。
在本公开再一实施例中,还提供了一种图像处理设备,参见图5和图9所示,其包括:存储器、处理器和显示设备。
存储器,用于存储可执行指令;
处理器,用于执行存储器中存储的可执行指令,以执行如下操作:
接收灰度图像;
对灰度图像进行重采样;
对重采样后的灰度图像进行伪彩映射,得到伪彩图像;
通过显示设备显示伪彩图像。
接收到的灰度图像为压缩后的灰度图像,因此,处理器还执行以下操作:在接收灰度图像之后、对灰度图像进行重采样之前,对压缩后的灰度图像进行解压缩。
在本公开再一实施例中,还提供了一种图像处理系统,参见图7和图10所示,其包括:第一图像处理设备和第二图像处理设备;
第一图像处理设备包括:
第一存储器,用于存储可执行指令;
第一处理器,用于执行所述第一存储器中存储的所述可执行指令,以执行如下操作:
发送灰度图像;
第二图像处理设备包括:
第二存储器,用于存储可执行指令;
第二处理器,用于执行所述第二存储器中存储的所述可执行指令,以执行如下操作:
接收灰度图像;
对灰度图像进行重采样;
对重采样后的灰度图像进行伪彩映射,得到伪彩图像;
通过显示设备显示伪彩图像。
在图像处理系统中,第一图像处理设备位于图像发送端,例如无人机,第二图像处理设备位于图像接收端,例如无人机的控制设备遥控器,无人机利用无线网络将压缩后的灰度图像发送给遥控器,本实施例对无线网络的种类不加以限制,无线网络可以采用标准无线网络或私有无线网络。标准无线网络例如是WI-FI,私有无线网络例如是OcuSync、Light Bridge等。
以上是针对无人机的情况,当本实施例的图像处理系统应用于其他系统时,也可以采用有线网络的方式传输图像。
发送灰度图像之前可进行图像压缩,因此,第一处理器还执行以下操作:发送灰度图像之前,对灰度图像进行压缩。
接收到的灰度图像为压缩图像,因此,第二处理器还执行以下操作:接收灰度图像之后、对灰度图像进行重采样之前,对灰度图像进行解压缩。
另外,第一处理器还执行以下操作:
对灰度图像进行压缩之前,获取原始灰度图像,并对原始灰度图像进行预处理。
为了达到简要说明的目的,上述实施例中任何可作相同应用的技术特 征叙述皆并于此,无需再重复相同叙述。
本公开实施例还提供了一种控制终端,包括:壳体和上述实施例中所述的图像处理设备,图像处理设备位于壳体中。
本公开实施例还提供了一种可移动平台,包括如上所述的控制终端。
本公开实施例还提供了一种计算机可读存储介质,其上存储有计算机程序,计算机程序被执行时实现上述实施例中所述的图像处理方法的步骤。
本领域技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的装置的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
最后应说明的是:以上各实施例仅用以说明本公开的技术方案,而非对其限制;尽管参照前述各实施例对本公开进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;在不冲突的情况下,本公开实施例中的特征可以任意组合;而这些修改或者替换,并不使相应技术方案的本质脱离本公开各实施例技术方案的范围。

Claims (70)

  1. 一种图像处理方法,其特征在于,包括:
    获取重采样后的灰度图像;
    对所述重采样后的灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
  2. 根据权利要求1所述的图像处理方法,其特征在于,所述重采样后的灰度图像的空间分辨率与所述显示设备的空间分辨率相匹配。
  3. 根据权利要求1所述的图像处理方法,其特征在于,在所述获取重采样后的灰度图像之前,还包括:
    接收压缩后的灰度图像;
    对所述压缩后的灰度图像解压缩,得到解压缩后的灰度图像;
    所述获取重采样后的灰度图像,包括:
    对所述解压缩后的灰度图像进行重采样,获取所述重采样后的灰度图像。
  4. 根据权利要求3所述的图像处理方法,其特征在于,所述图像处理方法还包括:
    获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述压缩后的灰度图像。
  5. 根据权利要求4所述的图像处理方法,其特征在于,所述获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述压缩后的灰度图像,还包括:
    对所述原始灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行压缩,得到所述压缩后的灰度图像。
  6. 根据权利要求1所述的图像处理方法,其特征在于,所述获取重采样后的灰度图像,包括:
    接收对所述重采样后的灰度图像进行压缩后的灰度图像;
    所述对所述重采样后的灰度图像进行伪彩映射之前,还包括:
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像;
    所述对所述重采样后的灰度图像进行伪彩映射,包括:
    对所述解压缩后的灰度图像进行伪彩映射。
  7. 根据权利要求6所述的图像处理方法,其特征在于,所述图像处理方法还包括:
    获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述对所述重采样后的灰度图像进行压缩后的灰度图像。
  8. 根据权利要求7所述的图像处理方法,其特征在于,所述获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述对所述重采样后的灰度图像进行压缩后的灰度图像,还包括:
    对所述原始灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行重采样,得到所述重采样后的灰度图像;
    对所述重采样后的灰度图像进行压缩,得到所述重采样后的灰度图像进行压缩后的灰度图像。
  9. 根据权利要求3-8中任一项所述的图像处理方法,其特征在于,所述原始灰度图像由图像传感器采集。
  10. 根据权利要求1-9中任一项所述的图像处理方法,其特征在于,所述伪彩图像包括RGB图像、YUV图像中的至少一种。
  11. 根据权利要求10所述的图像处理方法,其特征在于,所述YUV图像为YUV444图像。
  12. 根据权利要求1-11中任一项所述的图像处理方法,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  13. 根据权利要求1-12中任一项所述的图像处理方法,其特征在于,采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  14. 一种图像处理方法,其特征在于,包括:
    获取通过图像传感器采集的灰度图像;
    发送所述通过图像传感器采集的灰度图像,所述通过图像传感器采集的灰度图像未经伪彩映射。
  15. 根据权利要求14所述的图像处理方法,其特征在于,所述发送所述通过图像传感器采集的灰度图像,还包括:
    所述通过图像传感器采集的灰度图像未经重采样。
  16. 根据权利要求14所述的图像处理方法,其特征在于,所述发送所述通过图像传感器采集的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  17. 根据权利要求14所述的图像处理方法,其特征在于,所述发送所述通过图像传感器采集的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行重采样,获取经过重采样的灰度图像;
    发送所述经过重采样的灰度图像。
  18. 根据权利要求17所述的图像处理方法,其特征在于,所述发送所述经过重采样的灰度图像,包括:
    对所述经过重采样的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  19. 根据权利要求14所述的图像处理方法,其特征在于,所述发送所述通过图像传感器采集的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行处理,得到处理后的灰度图像;
    发送所述处理后的灰度图像。
  20. 根据权利要求14-19中任一项所述的图像处理方法,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  21. 根据权利要求14-20中任一项所述的图像处理方法,其特征在于,
    采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  22. 一种图像处理方法,其特征在于,包括:
    获取通过图像传感器采集的灰度图像;
    发送所述通过图像传感器采集的灰度图像;
    接收所述通过图像传感器采集的灰度图像;
    对所述灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
  23. 根据权利要求22所述的图像处理方法,其特征在于,
    所述发送所述通过图像传感器采集的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行重采样,获取经过重采样的灰度图像;
    发送所述经过重采样的灰度图像;
    所述接收所述通过图像传感器采集的灰度图像,包括:
    接收重采样后的灰度图像;
    所述对所述灰度图像进行伪彩映射,包括:
    对所述重采样后的灰度图像进行伪彩映射。
  24. 根据权利要求23所述的图像处理方法,其特征在于,
    所述发送所述经过重采样的灰度图像,包括:
    对所述经过重采样的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像;
    在所述接收重采样后的灰度图像之后,还包括:
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像;
    所述对所述重采样后的灰度图像进行伪彩映射,包括:
    对所述解压缩后的灰度图像进行伪彩映射。
  25. 根据权利要求24所述的图像处理方法,其特征在于,所述对所述通过图像传感器采集的灰度图像进行重采样,获取经过重采样的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行重采样,得到经过重采样的灰度图像。
  26. 根据权利要求22所述的图像处理方法,其特征在于,
    在所述接收所述通过图像传感器采集的灰度图像之后,还包括:
    对所述灰度图像进行重采样,获取所述重采样后的灰度图像;
    所述对所述灰度图像进行伪彩映射,包括:
    对所述重采样后的灰度图像进行伪彩映射。
  27. 根据权利要求23或26所述的图像处理方法,其特征在于,所述重采样后的灰度图像的空间分辨率与所述显示设备的空间分辨率相匹配。
  28. 根据权利要求26所述的图像处理方法,其特征在于,
    所述发送所述通过图像传感器采集的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像;
    在所述对所述灰度图像进行重采样之前,还包括:
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像。
  29. 根据权利要求28所述的图像处理方法,其特征在于,
    发送所述压缩后的灰度图像,包括:
    对所述通过图像传感器采集的灰度图像进行处理,获取处理后的灰度图像;
    对所述处理后的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  30. 根据权利要求22-29中任一项所述的图像处理方法,其特征在于,
    所述伪彩图像包括RGB图像、YUV图像中的至少一种。
  31. 根据权利要求30所述的图像处理方法,其特征在于,所述YUV图像为YUV444图像。
  32. 根据权利要求22-31中任一项所述的图像处理方法,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  33. 根据权利要求24、25、28、29中任一项所述的图像处理方法,其特征在于,
    采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  34. 一种图像处理设备,其特征在于,包括:
    存储器,用于存储可执行指令;
    处理器,用于执行所述存储器中存储的所述可执行指令,以执行如下操作:
    获取重采样后的灰度图像;
    对所述重采样后的灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
  35. 根据权利要求34所述的图像处理设备,其特征在于,所述重采样后的灰度图像的空间分辨率与所述显示设备的空间分辨率相匹配。
  36. 根据权利要求34所述的图像处理设备,其特征在于,所述处理器还执行以下操作:在所述获取重采样后的灰度图像之前,
    接收压缩后的灰度图像;
    对所述压缩后的灰度图像解压缩,得到解压缩后的灰度图像;
    所述获取重采样后的灰度图像,包括:
    对所述解压缩后的灰度图像进行重采样,获取所述重采样后的灰度图像。
  37. 根据权利要求36所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述压缩后的灰度图像。
  38. 根据权利要求37所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述原始灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行压缩,得到所述压缩后的灰度图像。
  39. 根据权利要求34所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    接收对所述重采样后的灰度图像进行压缩后的灰度图像;
    所述对所述重采样后的灰度图像进行伪彩映射之前,
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像;
    对所述解压缩后的灰度图像进行伪彩映射。
  40. 根据权利要求39所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    获取原始灰度图像,并对所述原始灰度图像进行压缩,得到所述对所述重采样后的灰度图像进行压缩后的灰度图像。
  41. 根据权利要求40所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述原始灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行重采样,得到所述重采样后的灰度图像;
    对所述重采样后的灰度图像进行压缩,得到所述重采样后的灰度图像进行压缩后的灰度图像。
  42. 根据权利要求36-41中任一项所述的图像处理设备,其特征在于,所述原始灰度图像由图像传感器采集。
  43. 根据权利要求34-42中任一项所述的图像处理设备,其特征在于,
    所述伪彩图像包括RGB图像、YUV图像中的至少一种。
  44. 根据权利要求43所述的图像处理设备,其特征在于,所述YUV图像为YUV444图像。
  45. 根据权利要求34-44中任一项所述的图像处理设备,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  46. 根据权利要求34-45中任一项所述的图像处理设备,其特征在于,
    采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  47. 一种图像处理设备,其特征在于,包括:
    存储器,用于存储可执行指令;
    处理器,用于执行所述存储器中存储的所述可执行指令,以执行如下操作:
    获取通过图像传感器采集的灰度图像;
    发送所述通过图像传感器采集的灰度图像,所述通过图像传感器采集的灰度图像未经伪彩映射。
  48. 根据权利要求47所述的图像处理设备,其特征在于,
    发送的所述通过图像传感器采集的灰度图像未经重采样。
  49. 根据权利要求47所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  50. 根据权利要求47所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行重采样,获取经过重采样的灰度图像;
    发送所述经过重采样的灰度图像。
  51. 根据权利要求50所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述经过重采样的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  52. 根据权利要求47所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行处理,得到处理后的灰度图像;
    发送所述处理后的灰度图像。
  53. 根据权利要求47-52中任一项所述的图像处理设备,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  54. 根据权利要求47-53中任一项所述的图像处理设备,其特征在于,
    采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  55. 一种图像处理设备,其特征在于,包括:
    存储器,用于存储可执行指令;
    处理器,用于执行所述存储器中存储的所述可执行指令,以执行如下操作:
    获取通过图像传感器采集的灰度图像;
    发送所述通过图像传感器采集的灰度图像;
    接收所述通过图像传感器采集的灰度图像;
    对所述灰度图像进行伪彩映射,得到伪彩图像,所述伪彩图像用于显示设备显示。
  56. 根据权利要求55所述的图像处理设备,其特征在于,
    所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行重采样,获取经过重采样的灰度图像;
    发送所述经过重采样的灰度图像;
    接收重采样后的灰度图像;
    对所述重采样后的灰度图像进行伪彩映射。
  57. 根据权利要求56所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述经过重采样的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像;
    在所述接收重采样后的灰度图像之后,
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像;
    对所述解压缩后的灰度图像进行伪彩映射。
  58. 根据权利要求57所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行处理,得到处理后的灰度图像;
    对所述处理后的灰度图像进行重采样,得到经过重采样的灰度图像。
  59. 根据权利要求55所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    在所述接收所述通过图像传感器采集的灰度图像之后,
    对所述灰度图像进行重采样,获取所述重采样后的灰度图像;
    对所述重采样后的灰度图像进行伪彩映射。
  60. 根据权利要求56或59所述的图像处理设备,其特征在于,所述重采样后的灰度图像的空间分辨率与所述显示设备的空间分辨率相匹配。
  61. 根据权利要求59所述的图像处理设备,其特征在于,所述处理器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像;
    在所述对所述灰度图像进行重采样之前,
    对所述压缩后的灰度图像进行解压缩,得到解压缩后的灰度图像。
  62. 根据权利要求61所述的图像处理设备,其特征在于,所述处理 器还执行以下操作:
    对所述通过图像传感器采集的灰度图像进行处理,获取处理后的灰度图像;
    对所述处理后的灰度图像进行压缩,获取压缩后的灰度图像;
    发送所述压缩后的灰度图像。
  63. 根据权利要求55-62中任一项所述的图像处理设备,其特征在于,
    所述伪彩图像包括RGB图像、YUV图像中的至少一种。
  64. 根据权利要求63所述的图像处理设备,其特征在于,所述YUV图像为YUV444图像。
  65. 根据权利要求55-64中任一项所述的图像处理设备,其特征在于,所述灰度图像包括红外灰度图像、紫外灰度图像的至少一种。
  66. 根据权利要求57、58、61、62中任一项所述的图像处理设备,其特征在于,
    采用以下的至少一种方法对所述灰度图像进行压缩:H263、H264、H265、MPEG-4。
  67. 一种显示装置,其特征在于,包括显示设备和根据权利要求34-46任一项所述的图像处理设备。
  68. 一种成像装置,其特征在于,包括图像传感器和根据权利要求47-54任一项所述的图像处理设备。
  69. 一种可移动平台,其特征在于,包括机体和根据权利要求68所述的成像装置。
  70. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被执行时实现根据权利要求1-33任一项所述的图像处理方法的步骤。
PCT/CN2020/082447 2020-03-31 2020-03-31 一种图像处理方法、设备、控制终端及可移动平台 WO2021195967A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080004283.0A CN112544068A (zh) 2020-03-31 2020-03-31 一种图像处理方法、设备、控制终端及可移动平台
PCT/CN2020/082447 WO2021195967A1 (zh) 2020-03-31 2020-03-31 一种图像处理方法、设备、控制终端及可移动平台

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/082447 WO2021195967A1 (zh) 2020-03-31 2020-03-31 一种图像处理方法、设备、控制终端及可移动平台

Publications (1)

Publication Number Publication Date
WO2021195967A1 true WO2021195967A1 (zh) 2021-10-07

Family

ID=75017384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/082447 WO2021195967A1 (zh) 2020-03-31 2020-03-31 一种图像处理方法、设备、控制终端及可移动平台

Country Status (2)

Country Link
CN (1) CN112544068A (zh)
WO (1) WO2021195967A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102034229A (zh) * 2010-11-03 2011-04-27 中国科学院长春光学精密机械与物理研究所 高分辨多光谱空间光学遥感器的实时图像融合方法
CN107871302A (zh) * 2016-09-23 2018-04-03 电子科技大学 一种基于yuv颜色空间的红外图像伪彩色处理方法
US20180115770A1 (en) * 2016-10-25 2018-04-26 Intel Corporation Light field perception enhancement for integral display applications
CN108491869A (zh) * 2018-03-14 2018-09-04 北京师范大学 一种全色波段灰度值自适应反转的主成分变换遥感图像融合方法
CN109886214A (zh) * 2019-02-26 2019-06-14 中南民族大学 一种基于图像处理的鸟鸣声特征强化方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103428434B (zh) * 2012-05-23 2024-04-16 杭州美盛红外光电技术有限公司 红外摄影装置和红外摄影方法
CN103279938B (zh) * 2013-04-03 2016-12-28 昆明物理研究所 红外/微光图像融合夜视系统
CN106124061A (zh) * 2016-08-18 2016-11-16 湖南文理学院 在线式红外热像仪及红外热图数据处理方法
CN106979822A (zh) * 2017-03-25 2017-07-25 聊城大学 一种红外成像过耗故障探测仪


Also Published As

Publication number Publication date
CN112544068A (zh) 2021-03-23

Similar Documents

Publication Publication Date Title
JP4527720B2 (ja) 画像圧縮法、画像圧縮装置、画像伝送システム、データ圧縮前処理装置及びコンピュータプログラム
CN108141505B (zh) 用于高位深医学灰度图像的压缩和解压缩方法
JP6141295B2 (ja) 知覚的に無損失のおよび知覚的に強調された画像圧縮システムならびに方法
US10038841B1 (en) System for streaming multiple regions deriving from a wide-angle camera
KR20200044653A (ko) 딥 뉴럴 네트워크를 이용한 영상의 ai 부호화 및 ai 복호화 방법, 및 장치
CN110300301B (zh) 图像编解码方法和装置
TW201143353A (en) Resolution based formatting of compressed image data
US20220188976A1 (en) Image processing method and apparatus
US7526134B2 (en) Image processing apparatus, program, recording medium, and data decompression method
WO2022068682A1 (zh) 图像处理方法及装置
US5684544A (en) Apparatus and method for upsampling chroma pixels
WO2020047756A1 (zh) 图像编码方法和装置
US20220272384A1 (en) Image compression
TW201540045A (zh) 具有減少色彩解析度的視訊流之自適應處理
US20040161156A1 (en) Image processing apparatus, method, program and medium storing image processing program
US20140010445A1 (en) System And Method For Image Compression
CN104780383A (zh) 一种3d-hevc多分辨率视频编码方法
Guleryuz et al. Sandwiched Image Compression: Increasing the resolution and dynamic range of standard codecs
WO2021195967A1 (zh) 一种图像处理方法、设备、控制终端及可移动平台
US7676105B2 (en) Method, apparatus, article and system for use in association with images
CN108156461A (zh) 一种Bayer图像压缩方法及装置
CN116847087A (zh) 视频处理方法、装置、存储介质及电子设备
US7146055B2 (en) Image processing decompression apparatus and method of using same different scaling algorithms simultaneously
JP5087724B2 (ja) 画像圧縮装置、画像伸張装置、画像圧縮プログラムおよび画像伸張プログラム
CN114827625A (zh) 一种基于灰度图压缩算法的高速图像云传输方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928107

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20928107

Country of ref document: EP

Kind code of ref document: A1