CN112073722A - Image processing method, device, equipment and storage medium

Image processing method, device, equipment and storage medium

Info

Publication number
CN112073722A
CN112073722A (application CN202010719810.8A)
Authority
CN
China
Prior art keywords
image
scanning area
frame image
sector
current frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010719810.8A
Other languages
Chinese (zh)
Other versions
CN112073722B (en)
Inventor
刘诣荣
卢涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Wanxiang Electronics Technology Co Ltd
Original Assignee
Xian Wanxiang Electronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Wanxiang Electronics Technology Co Ltd filed Critical Xian Wanxiang Electronics Technology Co Ltd
Priority to CN202010719810.8A priority Critical patent/CN112073722B/en
Priority to PCT/CN2020/130305 priority patent/WO2022016756A1/en
Publication of CN112073722A publication Critical patent/CN112073722A/en
Application granted granted Critical
Publication of CN112073722B publication Critical patent/CN112073722B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146Data rate or code amount at the encoder output

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The disclosure provides an image processing method, an image processing device, a storage medium and equipment, relates to the technical field of image processing, and can solve the problem that when a radar situation map is encoded according to a traditional compression protocol, the data volume of the code stream of the encoded radar situation map is large. The specific technical scheme is as follows: acquiring a current frame image, wherein the current frame image is a dynamically scanned radar situation map; determining a first sector scanning area of the current frame image and first image data of the first sector scanning area; determining a second sector scanning area of a target frame image according to the first sector scanning area; acquiring the target frame image; acquiring second image data from the second sector scanning area of the target frame image; and encoding the first image data and the second image data and then sending them to a decoding end. The method and the device are used for reducing the data volume of the code stream of the encoded radar situation map.

Description

Image processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method, apparatus, device, and storage medium.
Background
In video coding, decoding and transmission, a radar situation map is a special type of image: it uses few colors, its background is essentially static, and only the radar scanning area changes. Therefore, in order to reduce the data volume of the encoded code stream of the radar situation map, a dedicated encoding mode can be adopted for it, reducing code stream redundancy and thus the data volume of the encoded code stream.
Because no encoding compression method designed specifically for radar situation maps exists at present, radar situation maps are still encoded according to traditional compression protocols, which do not exploit the shape and change characteristics of a radar situation map. As a result, when a radar situation map is encoded according to a traditional compression protocol, the data volume of the code stream of the encoded map is large.
Disclosure of Invention
The embodiment of the disclosure provides an image processing method, an image processing device, an image processing apparatus and a storage medium, which can solve the problem that when a radar situation map is encoded according to a conventional compression protocol, the data volume of a code stream of the encoded radar situation map is large. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
acquiring a current frame image, wherein the current frame image is a dynamically scanned radar situation map;
determining a first sector scanning area of the current frame image and first image data of the first sector scanning area;
determining a second sector scanning area of the target frame image according to the first sector scanning area;
acquiring the target frame image;
acquiring second image data from a second sector scanning area of the target frame image;
and encoding the first image data and the second image data and then sending the encoded first image data and second image data to a decoding end.
The image processing method provided by the embodiment of the disclosure can acquire the current frame image, wherein the current frame image is a dynamically scanned radar situation map; determine a first sector scanning area of the current frame image and first image data of the first sector scanning area; determine a second sector scanning area of the target frame image according to the first sector scanning area; acquire the target frame image; acquire second image data from the second sector scanning area of the target frame image; and encode the first image data and the second image data and then send them to the decoding end. The current frame image and the target frame image themselves do not need to be encoded and sent to the decoding end; only the first image data obtained from the first sector scanning area of the current frame image and the second image data obtained from the second sector scanning area of the target frame image need to be encoded and sent, which greatly reduces the data volume of the code stream of the encoded radar situation map.
In one embodiment, before the acquiring the current frame image, the method further includes:
acquiring at least one frame of image;
determining a third sector scanning area of the at least one frame of image;
determining a dynamic scanning rule according to the third sector scanning area;
the determining a second sector scanning area of the target frame image according to the first sector scanning area comprises:
and determining a second sector scanning area of the target frame image according to the first sector scanning area and the dynamic scanning rule.
By determining the dynamic scanning rule, the second sector scanning area of the target frame image can be further determined according to the dynamic scanning rule.
In one embodiment, the determining the first sector scanning area of the current frame image comprises:
acquiring an initial image, wherein the initial image is a radar situation map which is not subjected to dynamic scanning;
determining a circular scanning area of the initial image;
determining the circular scanning area of the current frame image according to the circular scanning area of the initial image;
and determining a first sector scanning area of the current frame image from the circular scanning area of the current frame image.
By determining the circular scanning area, the sector scanning area can be further determined in the circular scanning area of the current frame image.
In one embodiment, the determining the first sector scanning area of the current frame image from the circular scanning area of the current frame image comprises:
performing a difference operation on the circular scanning area of the current frame image and the circular scanning area of the initial image to obtain a change area of the current frame image;
and performing straight-line fitting on the change area of the current frame image to generate the first sector scanning area.
The first sector scanning area can be accurately generated by performing straight-line fitting on the change area of the current frame image.
In one embodiment, the determining the circular scanning area of the initial image comprises:
and performing circle detection on the initial image, and determining a circular scanning area in the initial image.
By performing circle detection on the initial image, the circular scanning area of the initial image can be accurately determined.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
the current frame image acquisition module is used for acquiring a current frame image, and the current frame image is a dynamically scanned radar situation map;
a first sector scanning area determining module, configured to determine a first sector scanning area of the current frame image and first image data of the first sector scanning area;
a second sector scanning area determining module, configured to determine a second sector scanning area of the target frame image according to the first sector scanning area;
the target frame image acquisition module is used for acquiring the target frame image;
a second image data acquisition module, configured to acquire second image data from a second sector scanning area of the target frame image;
and the image data sending module is used for coding the first image data and the second image data and then sending the coded first image data and the second image data to the decoding end.
In one embodiment, the apparatus further comprises:
a dynamic scan rule determination module to:
acquiring at least one frame of image;
determining a third sector scanning area of the at least one frame of image;
determining a dynamic scanning rule according to the third sector scanning area;
the second sector scan area determination module is specifically configured to:
and determining a second sector scanning area of the target frame image according to the first sector scanning area and the dynamic scanning rule.
In one embodiment, the first sector scan area determination module is specifically configured to:
acquiring an initial image, wherein the initial image is a radar situation map which is not subjected to dynamic scanning;
determining a circular scanning area of the initial image;
determining the circular scanning area of the current frame image according to the circular scanning area of the initial image;
and determining a first sector scanning area of the current frame image from the circular scanning area of the current frame image.
In one embodiment, the first sector scan area determination module is specifically configured to:
performing a difference operation on the circular scanning area of the current frame image and the circular scanning area of the initial image to obtain a change area of the current frame image;
and performing straight-line fitting on the change area of the current frame image to generate the first sector scanning area.
In one embodiment, the first sector scanning area determining module is further specifically configured to:
perform circle detection on the initial image and determine a circular scanning area in the initial image.
According to a third aspect of embodiments of the present disclosure, there is provided an image processing apparatus comprising a processor and a memory, the memory having stored therein at least one computer instruction, the instruction being loaded and executed by the processor to implement the steps performed in the image processing method of any one of the first aspects.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored therein at least one computer instruction, which is loaded and executed by a processor to implement the steps performed in the image processing method according to any one of the first aspect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a block diagram of an image processing system provided by an embodiment of the present disclosure;
fig. 2 is a flowchart of an image processing method provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an initial image provided by an embodiment of the present disclosure;
FIG. 4 is a schematic illustration of a region separation provided by embodiments of the present disclosure;
FIG. 5 is a schematic diagram of a circular scanning area and a background area of an initial image provided by an embodiment of the present disclosure;
fig. 6 is a first schematic diagram illustrating a first sector scanning area of the current frame image according to an embodiment of the present disclosure;
fig. 7 is a second schematic diagram of a first sector scanning area of the current frame image according to the embodiment of the disclosure;
FIG. 8 is a schematic diagram of a second sector scanning area of a frame image according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of an initial image encoded by a color code table according to an embodiment of the present disclosure;
fig. 10 is a first structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 11 is a second structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 12 is a block diagram of an image processing apparatus provided in an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a block diagram of an image processing system according to an embodiment of the present disclosure. As shown in fig. 1, the system includes:
an encoding end 101 and a decoding end 102. The encoding end 101 and the decoding end 102 are in communication connection. Illustratively, the encoding end 101 may be a computer, a mobile phone, a tablet, or the like; this embodiment is not limited in this respect. Similarly, the decoding end 102 may be a computer, a mobile phone, a tablet, or the like; this embodiment is not limited here either.
In this embodiment, the encoding end 101 is configured to obtain a radar situation map, encode the obtained radar situation map, generate an encoded code stream, and send the encoded code stream to the decoding end 102.
Further, after receiving the encoded code stream sent by the encoding end 101, the decoding end 102 decodes the encoded code stream to generate a decoded code stream, and then generates the radar situation map according to the decoded code stream.
However, at present there is no encoding compression method designed specifically for radar situation maps, and they are still encoded according to traditional compression protocols. Consequently, the data volume of the code stream of the radar situation map encoded by the encoding end 101 is large, and the decoded code stream generated by the decoding end 102 is correspondingly large.
The inventor noticed this problem and proposed an image processing method, described in the following steps:
fig. 2 is a flowchart of an image processing method provided by an embodiment of the present disclosure, and the method is applied to an encoding end. As shown in fig. 2, the method includes:
s201, acquiring a current frame image, wherein the current frame image is a dynamically scanned radar situation map;
S202, determining a first sector scanning area of the current frame image and first image data of the first sector scanning area.
Illustratively, before the dynamically scanned radar situation map is acquired, an initial image is acquired, wherein the initial image is a radar situation map that has not undergone dynamic scanning, so no sector scanning area exists on it. Fig. 3 is a schematic diagram of an initial image provided by an embodiment of the present disclosure. As shown in fig. 3, the left diagram (a) is an initial image with 256-level (8-bit) gray scale.
Because the radar situation map uses few colors, directly compressing and encoding the original 256-level (8-bit) grayscale image would generate a large amount of unnecessary code stream. Therefore, to reduce the coding amount as much as possible without affecting the radar situation map to be encoded, the scheme adopts an error diffusion method to convert the 256-level grayscale image into a 32-level grayscale image.
The principle of error diffusion is that, when the color depth of an image is reduced, the quantization error of each pixel's color is diffused to its neighbors, so that to the naked eye the overall error over a set of adjacent pixels is reduced. In the present application, the 256-level (8-bit) grayscale image is error-diffused according to the proportion 3:2:3 to degrade its gray scale, converting it into a 32-level grayscale image. The 32-level initial image generated by degrading the gray scale of the 256-level (8-bit) image is shown in diagram (b) on the right side of fig. 3.
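A minimal Python sketch of this degradation step, assuming a single-channel image and assuming the 3:2:3 weights diffuse each pixel's quantization error to its right, lower, and lower-right neighbours (the filing does not specify the neighbour layout):

    import numpy as np

    def degrade_gray(img8, levels=32, weights=(3, 2, 3)):
        # Quantize an 8-bit grayscale image to `levels` gray levels with error
        # diffusion. Assumption: 3:2:3 spreads the error right/down/down-right.
        step = 256 // levels                      # 8 gray values per level for 32 levels
        out = img8.astype(np.float32).copy()
        h, w = out.shape
        wr, wd, wdr = (v / sum(weights) for v in weights)
        for y in range(h):
            for x in range(w):
                old = out[y, x]
                q = max(0, min(int(round(old / step)), levels - 1))
                out[y, x] = q * step              # nearest of the 32 levels
                err = old - q * step
                if x + 1 < w:
                    out[y, x + 1] += err * wr
                if y + 1 < h:
                    out[y + 1, x] += err * wd
                    if x + 1 < w:
                        out[y + 1, x + 1] += err * wdr
        return out.astype(np.uint8)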
Furthermore, after the gray scale of the 256-level (8-bit) image is degraded to generate the 32-level initial image, the 32-level initial image is separated into two parts: a circular scanning area and a background area. Due to the particularity of the radar situation map, once dynamic scanning starts the main changes occur inside the circular scanning area, while the background area shows only a few simple numeric changes and otherwise stays essentially unchanged. Therefore, in the present application the radar situation map is divided into a circular scanning area and a background area. Illustratively, a schematic diagram of the region separation is shown in fig. 4.
How the region separation is performed on the 32-level grayscale initial image (hereinafter referred to simply as the initial image) is explained as follows:
according to the scheme, the outer frame circle of the circular scanning area in the initial image is detected through the Hough circular detection algorithm, so that the purpose of separating the images is achieved. Specifically, the outer frame circle and the area within the outer frame circle are circular scanning areas, and the area outside the outer frame circle is a background area.
Like the Hough transform for straight-line detection, the Hough circle detection algorithm applies a coordinate transformation to every pixel point of the initial image in the Y-X plane, mapping it into an a-b (circle-center) parameter space.
In the present embodiment, a variation range and step length are set for the angle theta and for the radius r. For example, one may set theta ∈ [0, 2π] with angle step theta_step = 0.1, and r ∈ [450, 550] with a radius step r_step (the filing does not state its value). The coordinate transformation uses the following formula (rendered as an image in the original filing; the standard Hough-circle form is shown):

a = y - r·sin(theta), b = x - r·cos(theta) (1)

where (x, y) are the column and row coordinates of a pixel point. If 0 < a ≤ IMG_height and 0 < b ≤ IMG_width, a vote is superposed at position (a, b). Here IMG_height is the height of the initial image and IMG_width is its width.
After the coordinate transformation, if a circle boundary in the Y-X plane contains many pixel points, each of them corresponds to a circle in the a-b parameter space. Since these pixel points all lie on the same circle in the original image, the transformed a and b must satisfy the equations of all of those circles in the a-b coordinate system. Intuitively, the circles corresponding to the many pixel points intersect at one point, and that intersection point is most likely the circle center (a, b).
The number of circles passing through each local intersection point is counted, and the maximum is taken; the intersection point corresponding to the maximum is the circle center (a, b) of the outer frame circle of the circular scanning area in the initial image. Circles are then drawn around the center (a, b) at the different radius steps, giving a set of circles of different radii. The number of pixel points lying on each of these circles is counted, and the circle with the maximum count is the outer frame circle of the initial image. The initial image is then segmented along the outer frame circle into the circular scanning area and the background area of the initial image, shown respectively in fig. 5 (a) on the left and fig. 5 (b) on the right.
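A compact sketch of this voting procedure, assuming a precomputed boolean edge map of the initial image (the filing speaks of pixel points; an edge map is the usual input and is an assumption here) and an illustrative radius step of 5:

    import numpy as np

    def hough_outer_circle(edges, r_min=450, r_max=550, theta_step=0.1, r_step=5):
        # Every edge pixel casts one vote per sampled (r, theta) at the
        # candidate centre a = y - r*sin(theta), b = x - r*cos(theta).
        h, w = edges.shape
        acc = np.zeros((h, w), dtype=np.int32)
        ys, xs = np.nonzero(edges)
        thetas = np.arange(0.0, 2 * np.pi, theta_step)
        radii = np.arange(r_min, r_max + 1, r_step)
        for r in radii:
            for t in thetas:
                a = np.rint(ys - r * np.sin(t)).astype(int)
                b = np.rint(xs - r * np.cos(t)).astype(int)
                ok = (a >= 0) & (a < h) & (b >= 0) & (b < w)
                np.add.at(acc, (a[ok], b[ok]), 1)
        cy, cx = np.unravel_index(np.argmax(acc), acc.shape)  # accumulator peak = centre

        def hits(r):
            # Count edge pixels lying on the circle of radius r around (cy, cx).
            a = np.rint(cy + r * np.sin(thetas)).astype(int)
            b = np.rint(cx + r * np.cos(thetas)).astype(int)
            ok = (a >= 0) & (a < h) & (b >= 0) & (b < w)
            return int(edges[a[ok], b[ok]].sum())

        best_r = max(radii, key=hits)  # outer frame circle = most-covered radius
        return (cy, cx), int(best_r)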
Further, after the circular scanning area and the background area of the initial image are generated, the initial image is encoded, and the position information of the outer frame circle in the initial image is sent to the decoding end, so that the decoding end can decode the encoded initial image and determine its circular scanning area and background area.
Illustratively, after the initial image is encoded and sent to a decoding end, dynamic scanning of the radar situation map is started. A sector scanning area exists on a circular scanning area of the dynamically scanned radar situation map, and the sector scanning area is a radar scanning area.
In this embodiment, a current frame image is obtained, where the current frame image is a dynamically scanned radar situation map, and a first sector scanning area exists on the current frame image.
Illustratively, after the current frame image is acquired, the circular scanning area of the current frame image is determined according to the circular scanning area of the initial image. The initial image is a radar situation map that has not undergone dynamic scanning, while the current frame image is one that has; the position of the circular scanning area within the image is the same in both, so the circular scanning area of the current frame image can be determined directly from that of the initial image.
Further, after the circular scanning area of the current frame image is determined, a difference operation is performed between the image data in the circular scanning area of the current frame image and the image data in the circular scanning area of the initial image, yielding the change area of the current frame image, i.e. the area whose image data differs from that of the initial image. Since dynamic scanning of the radar situation map only ever takes place inside the circular scanning area, this difference operation is sufficient to obtain the change area of the current frame image.
After the change area of the current frame image is determined, the change area of the current frame image is subjected to linear fitting to generate the first sector scanning area.
Illustratively, a straight-line fitting equation gives the upper and lower edges l1 and l2 of the first sector scanning area (the fitted-line formulas are rendered as images in the original filing; the standard form is shown):

l1: y = k1·x + b1, l2: y = k2·x + b2

The sector included angle theta of the first sector scanning area is then obtained according to formula (2), the angle between the two fitted edges:

theta = arctan |(k1 - k2) / (1 + k1·k2)| (2)
the first sector scan region of the current frame image is shown in fig. 6 and 7. Further, after determining a first sector scan area of the current frame image, first image data is acquired from the first sector scan area. The first image data is image data of the current frame image which has change compared with the initial image.
And S203, determining a second sector scanning area of the target frame image according to the first sector scanning area.
In an embodiment, when the radar situation map is dynamically scanned, the dynamic scanning rule is uniform rotation, so the second sector scanning area of the target frame image can be calculated from the first sector scanning area of the current frame image and its sector included angle theta. The target frame image may be the next frame image or any frame image acquired at a time after the current frame image.
For example, suppose the current frame image is the radar situation map of the first frame of the dynamic scan, the dynamic scanning rule is clockwise scanning, and the sector included angle theta of the first sector scanning area is 10°. If the first sector scanning area of the current frame image spans 1° to 10°, then the sector scanning area of the second frame image spans 11° to 20° and that of the third frame image spans 21° to 30°, each with a 10° included angle. By analogy, the second sector scanning area of a target frame image acquired at any time after the current frame image can be determined, as the sketch after this paragraph also illustrates. The second sector scanning area of the target frame image is shown in fig. 8: when the current frame image is the first frame image, the leftmost diagram (a) of fig. 8 is the sector scanning area of the second frame image, the middle diagram (b) is that of the third frame image, and the rightmost diagram (c) shows the sector scanning areas of the first three frames together.
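The prediction itself is simple modular arithmetic; a minimal sketch, with an assumed (start_deg, end_deg) representation of a sector scanning area:

    def predict_sector(start_deg, end_deg, frames_ahead, step_deg, clockwise=True):
        # Rotate the current sector by step_deg per frame, wrapping at 360 degrees.
        shift = step_deg * frames_ahead * (1 if clockwise else -1)
        return (start_deg + shift) % 360, (end_deg + shift) % 360

    print(predict_sector(1, 10, 2, 10))  # sector of the third frame: (21, 30)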
In another embodiment, if it is not yet known whether the dynamic scanning rule is clockwise or counterclockwise, at least one frame image (a dynamically scanned radar situation map) is acquired before the current frame image, its third sector scanning area is determined, and the dynamic scanning rule is determined from that third sector scanning area. Once the dynamic scanning rule is determined, the second sector scanning area of the target frame image is determined from the first sector scanning area of the current frame image and the dynamic scanning rule. The method for determining the third sector scanning area of the at least one frame image is similar to that for determining the first sector scanning area of the current frame image and is not repeated here.
For example, one frame image may be acquired at a time before the current frame image, and the dynamic scanning rule determined from the third sector scanning area of that frame image and the first sector scanning area of the current frame image. If the third sector scanning area of that frame image spans 1° to 5° and the first sector scanning area of the current frame image spans 6° to 10°, the dynamic scanning rule is clockwise scanning at 5° per frame. After the dynamic scanning rule is determined, the second sector scanning area of the target frame image can be determined from the first sector scanning area of the current frame image.
Illustratively, two consecutive frame images may instead be acquired before the current frame image, and the dynamic scanning rule determined from their third sector scanning areas. For example, if the third sector scanning areas of the two consecutive frames span 1° to 5° and 6° to 10° respectively, the dynamic scanning rule is clockwise scanning at 5° per frame. After the dynamic scanning rule is determined, the second sector scanning area of the target frame image can be determined from the first sector scanning area of the current frame image, e.g. as sketched below.
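A minimal sketch of this rule determination, under the same assumed sector representation as above:

    def infer_scan_rule(prev_sector, cur_sector):
        # Infer direction and per-frame step from two consecutive sector scanning
        # areas given as (start_deg, end_deg); the 0/360 wrap is ignored for brevity.
        step = cur_sector[0] - prev_sector[0]
        return ("clockwise" if step > 0 else "counterclockwise"), abs(step)

    print(infer_scan_rule((1, 5), (6, 10)))  # ('clockwise', 5)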
S204, acquiring the target frame image;
S205, acquiring second image data from the second sector scanning area of the target frame image.
Further, after the second sector scanning area of the target frame image is determined, the second image data is acquired from that area of the acquired target frame image. The second image data is the image data of the target frame image that has changed compared with the initial image.
S206, encoding the first image data and the second image data and then sending them to a decoding end.
In this embodiment, after the first image data is acquired from the first sector scanning area of the current frame image, it is encoded, and the encoded first image data together with the first sector scanning area is sent to the decoding end. The decoding end decodes the first image data and superimposes it on the first sector scanning area of the initial image, thereby generating the current frame image.
Similarly, after the second image data is acquired from the second sector scanning area of the target frame image, it is encoded, and the encoded second image data together with the second sector scanning area is sent to the decoding end. The decoding end decodes the second image data and superimposes it on the second sector scanning area of the initial image, generating the target frame image.
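On the decoding side the superposition is a masked copy; a sketch assuming the sector scanning area is signalled as a boolean mask (the filing does not fix the signalling format):

    import numpy as np

    def reconstruct_frame(initial_img, sector_mask, decoded_sector_pixels):
        # Superimpose the decoded sector data onto the initial image inside the
        # signalled sector scanning area to rebuild the frame.
        frame = initial_img.copy()
        frame[sector_mask] = decoded_sector_pixels
        return frame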
How to encode the initial image, the first image data, and the second image data will be described below.
Because the initial image is a radar situation map that has not undergone dynamic scanning, its colors are few, and its gray scale has already been reduced from the original 256 levels to 32 levels by the error diffusion method. Therefore, for the final encoding, the scheme encodes the initial image by establishing a color code table.
Exemplarily, in the present embodiment the radar situation map is an image in RGB pixel format with 32 gray levels per R, G, B channel, so the size of the created code table is 32 × 32 × 32 = 32768. The color code table of the initial image is shown in Table 1:
TABLE 1 (color code table assigning each 32-level (R, G, B) color combination a code number; the table itself is rendered as images in the original filing)
The color code table shown in Table 1 is built by performing Huffman-style probability statistics on the initial image, so that color combinations with high occurrence probability are assigned the small numbers, yielding as small a code stream as possible during entropy coding.
Further, each color pixel point in the initial image is encoded using the number corresponding to its color in the established color code table.
The initial image encoded with the color code table is shown in fig. 9.
Further, after the initial image is encoded with the color code table, it is entropy-coded in the Joint Photographic Experts Group (JPEG) manner, yielding the finally output code stream of the initial image.
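A sketch of the code-table stage under two assumptions: only colors that actually occur are numbered (the full Table 1 would enumerate all 32768 combinations), and the JPEG-style entropy coding of the resulting index plane is left abstract:

    import numpy as np
    from collections import Counter

    def build_code_table(img_rgb32):
        # Number the 32-level (R, G, B) triples in descending order of occurrence,
        # so the most frequent colors receive the smallest code numbers.
        counts = Counter(map(tuple, img_rgb32.reshape(-1, 3)))
        return {color: code for code, (color, _) in enumerate(counts.most_common())}

    def encode_pixels(img_rgb32, table):
        # Replace every pixel by its code-table number; entropy coding of the
        # resulting index plane is a separate step.
        flat = img_rgb32.reshape(-1, 3)
        codes = np.array([table[tuple(p)] for p in flat], dtype=np.int32)
        return codes.reshape(img_rgb32.shape[:2])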
In this embodiment, the first image data and the second image data are encoded in the same manner, which is not described herein again.
With this scheme, the encoding end only needs to send the code stream of the encoded initial image to the decoding end once; the decoding end receives it, decodes it, and generates the initial image. Afterwards, for each acquired frame of the dynamically scanned radar situation map, only the image data of that frame's change area needs to be encoded, and the resulting code stream is sent to the decoding end. The decoding end decodes it and superimposes the image data of the change area on the initial image, thereby generating that dynamically scanned frame. The encoding end therefore does not encode and send every dynamically scanned frame in full, and the decoding end does not receive and decode a full code stream per frame, which greatly reduces the data volume of the code stream of the encoded radar situation map.
The image processing method provided by the embodiment of the disclosure can acquire the current frame image, wherein the current frame image is a dynamically scanned radar situation map; determine a first sector scanning area of the current frame image and first image data of the first sector scanning area; determine a second sector scanning area of the target frame image according to the first sector scanning area; acquire the target frame image; acquire second image data from the second sector scanning area of the target frame image; and encode the first image data and the second image data and then send them to the decoding end. The current frame image and the target frame image themselves do not need to be encoded and sent to the decoding end; only the first image data obtained from the first sector scanning area of the current frame image and the second image data obtained from the second sector scanning area of the target frame image need to be encoded and sent, which greatly reduces the data volume of the code stream of the encoded radar situation map.
Fig. 10 is a block diagram of an image processing apparatus according to an embodiment of the present disclosure. The device is applied to the encoding end. As shown in fig. 10, the apparatus 100 includes:
a current frame image obtaining module 1001, configured to obtain a current frame image, where the current frame image is a dynamically scanned radar situation map;
a first sector scanning area determining module 1002, configured to determine a first sector scanning area of the current frame image and first image data of the first sector scanning area;
a second sector scanning area determining module 1003, configured to determine a second sector scanning area of the target frame image according to the first sector scanning area;
a target frame image obtaining module 1004, configured to obtain the target frame image;
a second image data acquiring module 1005, configured to acquire second image data from a second sector scanning area of the target frame image;
the image data sending module 1006 is configured to send the encoded first image data and second image data to a decoding end.
In one embodiment, as shown in fig. 11, the apparatus 100 further comprises:
a dynamic scan rule determination module 1007 configured to:
acquiring at least one frame of image;
determining a third sector scanning area of the at least one frame of image;
determining a dynamic scanning rule according to the third sector scanning area;
the second sector scanning area determining module 1003 is specifically configured to:
and determining a second sector scanning area of the target frame image according to the first sector scanning area and the dynamic scanning rule.
In one embodiment, the first sector scan area determination module 1002 is specifically configured to:
acquiring an initial image, wherein the initial image is a radar situation map which is not subjected to dynamic scanning;
determining a circular scanning area of the initial image;
determining the circular scanning area of the current frame image according to the circular scanning area of the initial image;
and determining a first sector scanning area of the current frame image from the circular scanning area of the current frame image.
In one embodiment, the first sector scan area determination module 1002 is specifically configured to:
performing difference operation on the circular scanning area of the current frame image and the circular scanning area of the initial frame image to obtain a change area of the current frame image;
and performing linear fitting on the change area of the current frame image to generate the first sector area.
In one embodiment, the first sector scanning area determining module 1002 is further specifically configured to:
perform circle detection on the initial image and determine a circular scanning area in the initial image.
The image processing apparatus provided in the embodiment of the present disclosure may refer to the embodiments of fig. 2 to 9, and details thereof are not repeated herein.
Fig. 12 is a block diagram of an image processing apparatus provided in an embodiment of the present disclosure. As shown in fig. 12, the image processing apparatus 120 includes a processor 1201 and a memory 1202, where the memory 1202 stores at least one computer instruction, and the instruction is loaded and executed by the processor 1201 to implement the image processing method described in the above method embodiment.
Based on the image processing methods described in the embodiments corresponding to fig. 2 to 10, embodiments of the present disclosure further provide a computer-readable storage medium, for example, the non-transitory computer-readable storage medium may be a Read Only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The storage medium stores computer instructions for executing the image processing method, which is not described herein again.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. An image processing method, comprising:
acquiring a current frame image, wherein the current frame image is a dynamically scanned radar situation map;
determining a first sector scanning area of the current frame image and first image data of the first sector scanning area;
determining a second sector scanning area of the target frame image according to the first sector scanning area;
acquiring the target frame image;
acquiring second image data from a second sector scanning area of the target frame image;
and encoding the first image data and the second image data and then sending the encoded first image data and second image data to a decoding end.
2. The method of claim 1, wherein before the obtaining the current frame image, the method further comprises:
acquiring at least one frame of image;
determining a third sector scanning area of the at least one frame of image;
determining a dynamic scanning rule according to the third sector scanning area;
the determining a second sector scanning area of the target frame image according to the first sector scanning area comprises:
and determining a second sector scanning area of the target frame image according to the first sector scanning area and the dynamic scanning rule.
3. The method of claim 1, wherein determining the first sector scanning area of the current frame image comprises:
acquiring an initial image, wherein the initial image is a radar situation map which is not subjected to dynamic scanning;
determining a circular scanning area of the initial image;
determining the circular scanning area of the current frame image according to the circular scanning area of the initial image;
and determining a first sector scanning area of the current frame image from the circular scanning area of the current frame image.
4. The method of claim 3, wherein the determining the first sector scanning area of the current frame image from the circular scanning area of the current frame image comprises:
performing a difference operation on the circular scanning area of the current frame image and the circular scanning area of the initial image to obtain a change area of the current frame image;
and performing straight-line fitting on the change area of the current frame image to generate the first sector scanning area.
5. The method of claim 3, wherein the determining the circular scan area of the initial image comprises:
and performing circle detection on the initial image, and determining a circular scanning area in the initial image.
6. An image processing apparatus characterized by comprising:
the current frame image acquisition module is used for acquiring a current frame image, and the current frame image is a dynamically scanned radar situation map;
a first sector scanning area determining module, configured to determine a first sector scanning area of the current frame image and first image data of the first sector scanning area;
a second sector scanning area determining module, configured to determine a second sector scanning area of the target frame image according to the first sector scanning area;
the target frame image acquisition module is used for acquiring the target frame image;
a second image data acquisition module, configured to acquire second image data from a second sector scanning area of the target frame image;
and the image data sending module is used for coding the first image data and the second image data and then sending the coded first image data and the second image data to the decoding end.
7. The apparatus of claim 6, further comprising:
a dynamic scan rule determination module to:
acquiring at least one frame of image;
determining a third sector scanning area of the at least one frame of image;
determining a dynamic scanning rule according to the third sector scanning area;
the second sector scan area determination module is specifically configured to:
and determining a second sector scanning area of the target frame image according to the first sector scanning area and the dynamic scanning rule.
8. The apparatus of claim 6, wherein the first sector scan area determination module is configured to:
acquiring an initial image, wherein the initial image is a radar situation map which is not subjected to dynamic scanning;
determining a circular scanning area of the initial image;
determining the circular scanning area of the current frame image according to the circular scanning area of the initial image;
and determining a first sector scanning area of the current frame image from the circular scanning area of the current frame image.
9. An image processing apparatus, characterized in that the image processing apparatus comprises a processor and a memory, in which at least one computer instruction is stored, which is loaded and executed by the processor to implement the steps performed in the image processing method according to any one of claims 1 to 5.
10. A computer-readable storage medium having stored thereon at least one computer instruction, which is loaded and executed by a processor to implement the steps performed in the image processing method according to any one of claims 1 to 5.
CN202010719810.8A 2020-07-23 2020-07-23 Image processing method, device, equipment and storage medium Active CN112073722B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010719810.8A CN112073722B (en) 2020-07-23 2020-07-23 Image processing method, device, equipment and storage medium
PCT/CN2020/130305 WO2022016756A1 (en) 2020-07-23 2020-11-20 Image processing method and apparatus, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010719810.8A CN112073722B (en) 2020-07-23 2020-07-23 Image processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112073722A true CN112073722A (en) 2020-12-11
CN112073722B CN112073722B (en) 2024-05-17

Family

ID=73657415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010719810.8A Active CN112073722B (en) 2020-07-23 2020-07-23 Image processing method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112073722B (en)
WO (1) WO2022016756A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070071337A1 (en) * 2005-09-28 2007-03-29 Novell, Inc. Adaptive method and system for encoding digital images for the Internet
CN102033224A (en) * 2010-11-30 2011-04-27 电子科技大学 Radar video signal encoding method for ship
CN104407339A (en) * 2014-11-21 2015-03-11 中国人民解放军海军工程大学 Generative system for environment situation map of laser decoy departure jamming
CN105357494A (en) * 2015-12-04 2016-02-24 广东中星电子有限公司 Video encoding and decoding method and apparatus, and computer program product
WO2016161675A1 (en) * 2015-04-08 2016-10-13 杭州海康威视数字技术股份有限公司 Method and system for target-based video encoding
US20170206434A1 (en) * 2016-01-14 2017-07-20 Ford Global Technologies, Llc Low- and high-fidelity classifiers applied to road-scene images
CN108734171A (en) * 2017-04-14 2018-11-02 国家海洋环境监测中心 A kind of SAR remote sensing image ocean floating raft recognition methods of depth collaboration sparse coding network
CN110324617A (en) * 2019-05-16 2019-10-11 西安万像电子科技有限公司 Image processing method and device
CN111372080A (en) * 2020-04-13 2020-07-03 西安万像电子科技有限公司 Processing method and device of radar situation map, storage medium and processor

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4194888A1 (en) * 2016-09-20 2023-06-14 Innoviz Technologies Ltd. Lidar systems and methods
CN108732584B (en) * 2017-04-17 2020-06-30 百度在线网络技术(北京)有限公司 Method and device for updating map
CN110648396A (en) * 2019-09-17 2020-01-03 西安万像电子科技有限公司 Image processing method, device and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
袁招洪 et al.: "A data compression method for raw digitized weather radar echo data", Journal of Nanjing Institute of Meteorology, vol. 16, no. 04, 30 December 1993 (1993-12-30), pages 432-438 *

Also Published As

Publication number Publication date
WO2022016756A1 (en) 2022-01-27
CN112073722B (en) 2024-05-17

Similar Documents

Publication Publication Date Title
US20240169597A1 (en) Method and device for encoding/reconstructing attributes of points of a point cloud
US20240169593A1 (en) Processing a point cloud
US20200059653A1 (en) Video encoding and decoding method
CN109640089B (en) Image coding and decoding method and device
CN111372080B (en) Processing method and device of radar situation map, storage medium and processor
US20220172433A1 (en) Processing a point cloud
EP0613290B1 (en) Method and apparatus for binary image data compression
US20170041623A1 (en) Method and Apparatus for Intra Coding for a Block in a Coding System
CN113469869B (en) Image management method and device
CN112073722B (en) Image processing method, device, equipment and storage medium
EP3942827A1 (en) Processing missing points of a point cloud
US11582464B2 (en) Using morphological operations to process frame masks in video content
KR20220062556A (en) Encoding and decoding of point clouds using patches to intermediate samples
EP3713241A1 (en) Processing a point cloud
EP3713238A1 (en) Processing a point cloud
RU2767775C1 (en) Point cloud processing
CN114786037B (en) VR projection-oriented adaptive coding compression method
US11601665B2 (en) Embedding frame masks in a video stream
US20230319287A1 (en) Systems and methods for video encoding
EP3713240A1 (en) Processing a point cloud
EP3713239A1 (en) Processing a point cloud
CN111932643A (en) Image processing method, device, equipment and storage medium
JP2012109880A (en) Image encoding method, image encoding device, image decoding method, and image decoding device
EP3709272A1 (en) Processing a point cloud
WO2022232547A1 (en) Learning-based point cloud compression via tearing transform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant