CN117368879B - Radar diagram generation method and device, terminal equipment and readable storage medium - Google Patents

Radar diagram generation method and device, terminal equipment and readable storage medium

Info

Publication number
CN117368879B
CN117368879B (granted publication of application CN202311640764.2A)
Authority
CN
China
Prior art keywords
sensing data
data
radar
processing
radar map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311640764.2A
Other languages
Chinese (zh)
Other versions
CN117368879A
Inventor
郝迎港
程辉
陆培建
高昆
刘兆雨
裔嗣胄
Current Assignee
Beijing Highlandr Digital Technology Co ltd
Original Assignee
Beijing Highlandr Digital Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Highlandr Digital Technology Co ltd
Priority to CN202311640764.2A
Publication of CN117368879A
Application granted
Publication of CN117368879B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to group G01S13/00
    • G01S7/41: using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414: Discriminating targets with respect to background clutter
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66: Radar-tracking systems; Analogous systems
    • G01S13/72: Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723: two-dimensional tracking by using numerical data
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/89: specially adapted for mapping or imaging
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/20: using local operators
    • G06T5/30: Erosion or dilatation, e.g. thinning
    • G06T5/50: using two or more images, e.g. averaging or subtraction
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing
    • G06T2207/10044: Radar image
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An embodiment of the invention provides a radar map generation method and apparatus, a terminal device and a readable storage medium, belonging to the technical field of data processing. The method comprises: acquiring first sensing data corresponding to a target object detected by a radar sensor, and preprocessing the first sensing data to obtain second sensing data; determining an erosion kernel, and eroding the second sensing data with the erosion kernel to obtain third sensing data; determining a dilation kernel, and dilating the third sensing data with the dilation kernel to obtain fourth sensing data; determining a first radar map corresponding to the target object from the fourth sensing data, and applying image enhancement to the first radar map to obtain a second radar map; and fusing the first radar map with the second radar map to obtain a target radar map corresponding to the target object. The method addresses the low quality and usability of radar maps generated in the related art, which makes such maps ineffective for detecting and tracking targets, and improves the quality of the generated radar map.

Description

Radar diagram generation method and device, terminal equipment and readable storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular to a radar map generation method and apparatus, a terminal device, and a readable storage medium.
Background
Radar detection is a key sensing technology for detecting and tracking a target object. By transmitting radio waves and receiving their reflections, a radar system can measure sensing data such as the distance, speed and bearing of the target object, which is then analysed to detect and track it. In the related art, the sensing data are displayed as a radar map so that the relationships and trends among the data can be read, supporting subsequent detection and tracking of the target object.
However, the radar system introduces noise and discontinuities into the sensing data it produces, so the acquired data contain noise and outliers. As a result the generated radar map is of low quality and usability, and performs poorly when used to detect and track a target.
A radar map generation method is therefore needed so that the resulting radar map reflects the trends and relationships in the sensing data detected by the radar system, enabling more reliable detection and tracking of the target.
Disclosure of Invention
The main purpose of the embodiments of the invention is to provide a radar map generation method and apparatus, a terminal device and a readable storage medium, aiming to solve the problem in the related art that noise and outliers in the acquired sensing data lower the quality and usability of the generated radar map, which in turn performs poorly when detecting and tracking targets.
In a first aspect, an embodiment of the present invention provides a method for generating a radar chart, including:
acquiring first sensing data corresponding to a target object detected by a radar sensor, and preprocessing the first sensing data to obtain second sensing data;
determining an erosion kernel, and eroding the second sensing data with the erosion kernel to obtain third sensing data;
determining a dilation kernel, and dilating the third sensing data with the dilation kernel to obtain fourth sensing data;
determining a first radar map corresponding to the target object from the fourth sensing data, and applying image enhancement to the first radar map to obtain a second radar map;
and fusing the first radar map with the second radar map to obtain a target radar map corresponding to the target object.
In a second aspect, an embodiment of the present invention provides a device for generating a radar chart, including:
a data acquisition module for acquiring first sensing data corresponding to a target object detected by the radar sensor, and preprocessing the first sensing data to obtain second sensing data;
an erosion module for determining an erosion kernel, and eroding the second sensing data with the erosion kernel to obtain third sensing data;
a dilation module for determining a dilation kernel, and dilating the third sensing data with the dilation kernel to obtain fourth sensing data;
an image generation module for determining a first radar map corresponding to the target object from the fourth sensing data, and applying image enhancement to the first radar map to obtain a second radar map;
and an image fusion module for fusing the first radar map with the second radar map to obtain a target radar map corresponding to the target object.
In a third aspect, an embodiment of the present invention further provides a terminal device, which includes a processor, a memory, a computer program stored on the memory and executable by the processor, and a data bus for communication between the processor and the memory, where the computer program, when executed by the processor, implements the steps of any radar map generation method provided in this specification.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing one or more programs executable by one or more processors to implement the steps of any radar map generation method provided in this specification.
The embodiments of the invention provide a radar map generation method and apparatus, a terminal device and a readable storage medium. The method acquires first sensing data corresponding to a target object detected by a radar sensor and preprocesses it to obtain second sensing data; determines an erosion kernel and erodes the second sensing data to obtain third sensing data; determines a dilation kernel and dilates the third sensing data to obtain fourth sensing data; determines a first radar map corresponding to the target object from the fourth sensing data and applies image enhancement to it to obtain a second radar map; and fuses the first and second radar maps to obtain the target radar map corresponding to the target object. By removing the noise and abnormal data in the sensing data, generating the radar map from the cleaned data, and fusing the enhanced map with the original, the method overcomes the related-art problem that noise and outliers in the sensing data yield a radar map of low quality and usability that detects and tracks targets poorly, and improves the quality of the generated radar map.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application; other drawings can be derived from them by those skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a radar map generation method provided by an embodiment of the present invention;
Fig. 2 is a schematic flow chart of the sub-steps of step S101 of the radar map generation method of Fig. 1;
Fig. 3 is a schematic block diagram of a radar map generation apparatus provided by an embodiment of the present invention;
Fig. 4 is a schematic block diagram of the structure of a terminal device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
The flow diagrams depicted in the figures are merely illustrative; they need not include every element and operation/step, nor follow the order described. For example, some operations/steps may be further divided, combined or partially combined, so the actual order of execution may change according to the situation.
It is to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The embodiments of the invention provide a radar map generation method and apparatus, a terminal device and a readable storage medium. The method can be applied to a terminal device, which may be an electronic device such as a tablet computer, a notebook computer, a desktop computer, a personal digital assistant or a wearable device, or may be a server or a server cluster.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Referring to Fig. 1, Fig. 1 is a schematic flow chart of a radar map generation method provided by an embodiment of the present invention.
As shown in Fig. 1, the radar map generation method includes steps S101 to S105.
Step S101, acquiring first sensing data corresponding to a target object detected by a radar sensor, and preprocessing the first sensing data to obtain second sensing data.
Illustratively, the first sensing data may be position data of the target object, including but not limited to distance, azimuth and the like. The first sensing data are obtained by the radar sensor detecting the target object; however, because clutter and interference affect the radar sensor during signal transmission, the first sensing data must be preprocessed to remove noise and outliers, yielding the second sensing data.
For example, the preprocessing includes data cleaning, data conversion and the like. First, data cleaning identifies and removes duplicate data, null data and the like from the first sensing data. The cleaned data are then converted into another form, for example by smoothing or aggregation, and the converted data are further screened; the data that remain serve as the second sensing data.
Optionally, the preprocessing operations are not specifically limited in this application and may be chosen according to actual requirements.
In some embodiments, the data preprocessing is performed on the first sensing data to obtain second sensing data, specifically referring to fig. 2, step S101 includes: substep S1011 to substep S1013.
Sub-step S1011, performing outlier detection on the first sensing data to identify outliers, and obtaining first processed data from the outliers and the first sensing data.
Illustratively, outliers in the first sensing data are detected by a distance-based method: points far from the other sensing data are deemed outliers and deleted from the first sensing data, yielding the first processed data.
In some embodiments, detecting outliers in the first sensing data includes: calculating the local density of each sample in the first sensing data and deriving from it an outlier factor for each sample; and determining the outliers of the first sensing data from the outlier factors.
The local density of each sample is computed from the density of the data surrounding it in the first sensing data, and an outlier factor is then computed from the local density. The outlier factor characterises how much each sample in the first sensing data deviates from the rest: a larger value indicates a higher degree of outlierness, a smaller value a lower one. The outlier factors are sorted in descending order, and the samples with the first n factors are treated as outliers.
Optionally, n can be set from manual experience; its size and the way it is set are not specifically limited in this application and may be chosen according to actual requirements.
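The local density and outlier factor described above can be sketched as follows. This is an illustrative simplification in the spirit of the local outlier factor (LOF) idea; the exact density and factor formulas, and the choice k=2, are assumptions rather than the patent's own definitions.

```python
import math

def lof_scores(points, k=2):
    """Simplified local-outlier-factor-style scores: the ratio of the mean
    local density of a point's k nearest neighbours to its own local density.
    Illustrative only; not the full LOF definition."""
    n = len(points)

    def neighbours(i):
        # indices of the k nearest neighbours of point i (excluding itself)
        return sorted((j for j in range(n) if j != i),
                      key=lambda j: math.dist(points[i], points[j]))[:k]

    # local density: inverse of the mean distance to the k nearest neighbours
    dens = [k / sum(math.dist(points[i], points[j]) for j in neighbours(i))
            for i in range(n)]
    return [sum(dens[j] for j in neighbours(i)) / k / dens[i] for i in range(n)]

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]  # last point is isolated
scores = lof_scores(pts)
# the isolated point receives the largest outlier factor, so sorting the
# factors in descending order flags it first
assert max(range(len(pts)), key=lambda i: scores[i]) == 4
```

Sorting `scores` descending and taking the first n indices then gives the outliers to delete.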
Sub-step S1012, applying a filtering algorithm to the first processed data for noise reduction, obtaining second processed data.
Illustratively, the first processed data are weighted-averaged with a Gaussian filter to remove Gaussian noise, yielding the second processed data.
In some embodiments, the noise reduction includes: obtaining, within a preset range, the neighbourhood data of each sample in the first processed data; and taking the median of the neighbourhood data and updating the first processed data with it to obtain the second processed data.
Specifically, the neighbourhood data of each sample within the preset range are sorted, the median of the sorted data is taken, and the sample is replaced by this median; once every sample in the first processed data has been updated, the second processed data are obtained.
Replacing each sample with the median of the values in its neighbourhood pulls the surrounding samples closer to the true value, eliminates isolated noise points, reduces noise interference, and effectively suppresses noise caused by incidental factors.
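A minimal sketch of the neighbourhood-median update on a 1-D sequence, with the window radius standing in for the "preset range" (an assumed parameterisation):

```python
def median_filter(data, radius=1):
    """Replace each sample with the median of its neighbourhood
    (window of 2*radius + 1 samples, truncated at the edges)."""
    out = []
    for i in range(len(data)):
        window = sorted(data[max(0, i - radius):i + radius + 1])
        out.append(window[len(window) // 2])
    return out

# an isolated spike (a likely noise point) is removed, smooth data survive
assert median_filter([1, 1, 9, 1, 1, 1]) == [1, 1, 1, 1, 1, 1]
```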
Sub-step S1013, performing data calibration on the second processed data to obtain the second sensing data.
For example, after the radar sensor detects the target object it converts the detection signal into sensing data of a corresponding format. Because the conversion parameters drift as the radar sensor is used, the latest parameters must be retrieved so that the deviation between sensing data under the latest parameters and under the previous parameters can be computed; the second processed data are then calibrated according to this deviation to obtain the second sensing data.
In some embodiments, the calibration includes: sorting the second processed data by acquisition time to obtain sorted sensing data; and comparing adjacent samples in the sorted data to obtain a comparison result, from which the second sensing data are determined.
Specifically, the second processed data are sorted by acquisition time and adjacent samples in the sorted data are compared. If the sorted data describe a target object at the same position, two adjacent samples should be identical: identical samples are retained rather than removed as outliers or noise, while differing samples are removed. If the sorted data describe a target object at different positions, samples that change in a regular fashion are left untouched. The calibrated second processed data then serve as the second sensing data.
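The time-ordered adjacent comparison can be sketched as follows. The tolerance `tol`, used to decide whether adjacent samples change in a "regular" fashion, is an assumed parameter not given in the text.

```python
def calibrate(samples, tol=1.0):
    """samples: (timestamp, value) pairs in arbitrary order. Sort by time and
    keep a sample only if it stays within `tol` of the previously kept one;
    larger jumps are treated as noise or outliers and removed."""
    ordered = sorted(samples, key=lambda s: s[0])
    kept = [ordered[0]]
    for t, v in ordered[1:]:
        if abs(v - kept[-1][1]) <= tol:
            kept.append((t, v))
    return kept

samples = [(2, 10.1), (1, 10.0), (3, 99.0), (4, 10.2)]  # 99.0 is a glitch
assert [v for _, v in calibrate(samples)] == [10.0, 10.1, 10.2]
```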
Step S102, determining an erosion kernel, and eroding the second sensing data with the erosion kernel to obtain the third sensing data.
For example, an erosion kernel is first defined as required; it may be an m x n matrix. The kernel is slid across the second sensing data from left to right, and at each position the data covered by the kernel are compared with the kernel element by element: if every element of the kernel matches the sensing value at the corresponding position, the output value at that position is set to 1; otherwise it is set to 0.
Illustratively, processing the whole of the second sensing data with the erosion kernel yields the third sensing data, from which small objects and noise have been removed.
Alternatively, the erosion kernel may be square or circular. Its size and shape affect the result of the erosion; they are not limited in this application and may be chosen according to actual requirements.
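Binary erosion as described (output 1 only where every kernel element matches) can be sketched on a small grid; the 3x3 all-ones kernel and the centre anchor are conventional assumptions:

```python
def erode(grid, kernel):
    """Binary erosion: the output cell is 1 only if every 1 in the kernel
    lands on a 1 in the grid (out-of-bounds positions count as 0)."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = kh // 2, kw // 2          # kernel anchored at its centre
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ok = all(
                0 <= y + ky - oh < h and 0 <= x + kx - ow < w
                and grid[y + ky - oh][x + kx - ow]
                for ky in range(kh) for kx in range(kw) if kernel[ky][kx]
            )
            out[y][x] = 1 if ok else 0
    return out

k3 = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
grid = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
eroded = erode(grid, k3)
# only the centre of the 3x3 block survives a full 3x3 erosion, which is how
# small isolated objects and noise are removed
assert sum(map(sum, eroded)) == 1 and eroded[2][2] == 1
```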
Step S103, determining a dilation kernel, and dilating the third sensing data with the dilation kernel to obtain the fourth sensing data.
For example, a dilation kernel is first defined as required; it may be a b x c matrix. The kernel is slid across the third sensing data from left to right, and at each position the data covered by the kernel are compared with the kernel: if at least one element of the kernel matches the sensing value at the corresponding position, the output value at that position is set to 1; otherwise it is set to 0.
Illustratively, processing the whole of the third sensing data with the dilation kernel yields the fourth sensing data, in which gaps in disconnected sensing data have been filled.
Alternatively, the dilation kernel may be square or circular. Its size and shape affect the result of the dilation; they are not limited in this application and may be chosen according to actual requirements.
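The matching dilation step (output 1 where at least one kernel element matches) can be sketched the same way; the 1x3 kernel below is an assumed example chosen to show gap filling:

```python
def dilate(grid, kernel):
    """Binary dilation: the output cell is 1 if at least one 1 in the kernel
    lands on a 1 in the grid."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = kh // 2, kw // 2          # kernel anchored at its centre
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hit = any(
                kernel[ky][kx]
                and 0 <= y + ky - oh < h and 0 <= x + kx - ow < w
                and grid[y + ky - oh][x + kx - ow]
                for ky in range(kh) for kx in range(kw)
            )
            out[y][x] = 1 if hit else 0
    return out

# a 1x3 kernel bridges the one-cell gap between two detections
row = [[0, 1, 0, 1, 0]]
assert dilate(row, [[1, 1, 1]]) == [[1, 1, 1, 1, 1]]
```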
Step S104, determining a first radar map corresponding to the target object from the fourth sensing data, and applying image enhancement to the first radar map to obtain a second radar map.
Specifically, information such as the axis labels and scale marks of the radar map to be drawn is determined, and the fourth sensing data are plotted on it to obtain the first radar map corresponding to the target object.
Illustratively, second-derivative sharpening is applied to the first radar map using the Laplace operator to obtain the second radar map.
In some embodiments, the image enhancement includes: determining a first template of the Laplace operator in a first direction and a second template in a second direction; computing a first gradient image of the first radar map from the first template and a second gradient image from the second template; fusing the pixels of the first and second gradient images into a target gradient image of the first radar map; clustering the target gradient image with a clustering algorithm to obtain its threshold segmentation point; and enhancing the first radar map according to the threshold segmentation point to obtain the second radar map. The target gradient image is obtained according to the following formula:
g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a
where g1(x, y) is the pixel value of the first gradient image at coordinate (x, y), g2(x, y) is the pixel value of the second gradient image at (x, y), a is an attenuation factor, and g(x, y) is the pixel value of the target gradient image at (x, y).
Illustratively, the first direction is horizontal and the second vertical: the first template is the Laplace operator's horizontal template and the second its vertical template.
Illustratively, the first template is convolved with the first radar map to obtain the first gradient image, and the second template is convolved with the first radar map to obtain the second gradient image. The absolute pixel values at corresponding positions of the two gradient images are summed and divided by an attenuation factor to obtain the pixel values of the target gradient image, according to the formula:
g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a
where g1(x, y) is the pixel value of the first gradient image at coordinate (x, y), g2(x, y) the pixel value of the second gradient image at (x, y), a the attenuation factor, and g(x, y) the pixel value of the target gradient image at (x, y). Dividing the summed absolute values by the attenuation factor prevents the pixel values from overflowing.
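The two-template gradient computation and the absolute-sum-and-attenuate fusion can be sketched as follows. The directional second-derivative templates [1, -2, 1] and its transpose, and the attenuation factor a = 2, are assumed values, since the patent does not list the coefficients.

```python
def conv2(img, kern):
    """'Same'-size 2-D correlation with zero padding (the kernels used here
    are symmetric, so this equals convolution)."""
    kh, kw = len(kern), len(kern[0])
    oh, ow = kh // 2, kw // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    gy, gx = y + ky - oh, x + kx - ow
                    if 0 <= gy < h and 0 <= gx < w:
                        s += kern[ky][kx] * img[gy][gx]
            out[y][x] = s
    return out

def fuse_gradients(img, a=2.0):
    """g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a with assumed directional
    second-derivative templates of the Laplace operator."""
    hx = [[1, -2, 1]]              # assumed horizontal template
    hy = [[1], [-2], [1]]          # assumed vertical template
    g1, g2 = conv2(img, hx), conv2(img, hy)
    return [[(abs(g1[y][x]) + abs(g2[y][x])) / a
             for x in range(len(img[0]))] for y in range(len(img))]

flat = [[5.0] * 3 for _ in range(3)]
# second derivatives vanish on a constant interior pixel
assert fuse_gradients(flat)[1][1] == 0.0
```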
Specifically, a k-means clustering algorithm is applied to the target gradient image to divide it into two clusters, yielding the threshold segmentation point that separates feature points from non-feature points; the first radar map is then binarized according to this threshold to obtain the second radar map.
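The two-cluster k-means thresholding can be sketched in one dimension over the pixel values; the min/max initialisation, the fixed iteration count, and taking the centroid midpoint as the segmentation threshold are assumptions:

```python
def two_means_threshold(values, iters=20):
    """1-D k-means with k=2; returns the midpoint between the two centroids,
    used as the threshold separating feature from non-feature pixels."""
    c0, c1 = float(min(values)), float(max(values))
    for _ in range(iters):
        a = [v for v in values if abs(v - c0) <= abs(v - c1)]
        b = [v for v in values if abs(v - c0) > abs(v - c1)]
        if a:
            c0 = sum(a) / len(a)
        if b:
            c1 = sum(b) / len(b)
    return (c0 + c1) / 2

vals = [1, 2, 1, 2, 10, 11, 12]    # low-gradient background vs. strong edges
t = two_means_threshold(vals)
# the threshold cleanly separates the two clusters; binarization keeps
# values above t as feature points
assert all(v > t for v in (10, 11, 12)) and all(v < t for v in (1, 2))
```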
Step S105, fusing the first radar map with the second radar map to obtain the target radar map corresponding to the target object.
Illustratively, the pixel values at the same position in the first and second radar maps are summed and averaged, and the average is used as the pixel value at the corresponding position of the target radar map.
In some embodiments, the image fusion of the first radar map and the second radar map to obtain a target radar map corresponding to the target object includes: calculating a first information entropy corresponding to the first radar chart and a second information entropy corresponding to the second radar chart; determining a first weight corresponding to the first radar chart and a second weight corresponding to the second radar chart according to the first information entropy and the second information entropy; and performing image fusion by using the first weight, the first radar map, the second weight and the second radar map to obtain a target radar map corresponding to the target object.
Illustratively, the first radar map is converted into a first gray-scale image, and the gray-scale range from 0 to 255 is divided into several intervals of equal width. The number of pixel points of the first gray-scale image falling in each interval is counted, and the proportion of the number of pixel points in each interval to the total number of pixel points is then calculated. The information entropy of each interval is calculated from this proportion, a weighted average of the entropies of all intervals is taken, and the first information entropy corresponding to the first radar map is finally obtained. The specific calculation formula is as follows:

S1 = -Σ (i = 1 to N) ui · pi · log(pi)

wherein S1 represents the first information entropy, N represents the number of intervals into which the gray-scale range from 0 to 255 is divided, pi represents the proportion of the number of pixel points in the i-th interval to the total number of pixel points in the first gray-scale image, and ui represents the weight value of the i-th interval.
The weight value can be determined according to practical use experience, is not particularly limited in the application, and can be set according to practical requirements.
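A sketch of the interval-entropy computation described above. Uniform interval weights by default, eight intervals, and a base-2 logarithm are all assumptions; the patent leaves the number of intervals, the weights, and the logarithm base open.

```python
import numpy as np

def interval_entropy(gray_image, n_intervals=8, weights=None):
    """Weighted average of per-interval entropies over the 0-255 gray range:
    S = -sum_i u_i * p_i * log2(p_i), with p_i the fraction of pixels
    falling in interval i. Uniform weights are assumed when none are given."""
    counts, _ = np.histogram(gray_image, bins=n_intervals, range=(0, 256))
    p = counts / counts.sum()
    u = (np.full(n_intervals, 1.0 / n_intervals)
         if weights is None else np.asarray(weights, float))
    nz = p > 0                                 # treat 0 * log(0) as 0
    return float(-(u[nz] * p[nz] * np.log2(p[nz])).sum())
```

The same routine applies unchanged to the second gray-scale image, with its own weight vector, to obtain the second information entropy.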
Illustratively, the second radar map is converted into a second gray-scale image, and the gray-scale range from 0 to 255 is divided into several intervals of equal width. The number of pixel points of the second gray-scale image falling in each interval is counted, and the proportion of the number of pixel points in each interval to the total number of pixel points is then calculated. The information entropy of each interval is calculated from this proportion, a weighted average of the entropies of all intervals is taken, and the second information entropy corresponding to the second radar map is finally obtained. The specific calculation formula is as follows:

S2 = -Σ (i = 1 to N) vi · mi · log(mi)

wherein S2 represents the second information entropy, N represents the number of intervals into which the gray-scale range from 0 to 255 is divided, mi represents the proportion of the number of pixel points in the i-th interval to the total number of pixel points in the second gray-scale image, and vi represents the weight value of the i-th interval.
The weight value can be determined according to practical use experience, is not particularly limited in the application, and can be set according to practical requirements.
The smaller of the first information entropy and the second information entropy is given the larger weight; that is, the smaller the information entropy, the higher the corresponding weight. Through normalization, the first weight corresponding to the first information entropy and the second weight corresponding to the second information entropy are determined to lie between 0 and 1, and a weighted sum of the first radar map (with the first weight) and the second radar map (with the second weight) is then computed to obtain the target radar map corresponding to the target object.
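One way to realize the "smaller entropy, larger weight" rule is to weight each map by the other map's entropy share; this particular normalization is an assumption, since the patent only requires weights between 0 and 1 that decrease with entropy.

```python
import numpy as np

def entropy_weights(s1, s2):
    """Smaller entropy -> larger weight; the two weights sum to 1.
    This inverse-proportional scheme is one possible normalization."""
    total = s1 + s2
    return s2 / total, s1 / total

def fuse(first_map, second_map, w1, w2):
    """Weighted sum of the two radar maps, pixel by pixel."""
    return w1 * first_map.astype(float) + w2 * second_map.astype(float)
```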
Referring to fig. 3, fig. 3 is a schematic diagram of a radar chart generating device 200 provided in an embodiment of the present application. The radar chart generating device 200 includes a data acquisition module 201, a corrosion processing module 202, an expansion processing module 203, an image generating module 204, and an image fusion module 205. The data acquisition module 201 is configured to acquire first sensing data corresponding to a target object detected by the radar sensor, and perform data preprocessing on the first sensing data to obtain second sensing data; the corrosion processing module 202 is configured to determine a corrosion core, and perform corrosion processing on the second sensing data by using the corrosion core to obtain third sensing data; the expansion processing module 203 is configured to determine an expansion core, and perform expansion processing on the third sensing data by using the expansion core to obtain fourth sensing data; the image generating module 204 is configured to determine a first radar chart corresponding to the target object according to the fourth sensing data, and perform image enhancement processing on the first radar chart to obtain a second radar chart; and the image fusion module 205 is configured to perform image fusion on the first radar chart and the second radar chart to obtain a target radar chart corresponding to the target object.
In some embodiments, the data acquisition module 201 performs, in the process of performing data preprocessing on the first sensing data to obtain the second sensing data:
abnormal point detection is carried out on the first sensing data to obtain an abnormal value in the first sensing data, and first processing data are obtained according to the abnormal value and the first sensing data;
carrying out noise reduction processing on the first processing data by using a filtering algorithm to obtain second processing data;
and carrying out data calibration on the second processing data to obtain the second sensing data.
In some embodiments, the data acquisition module 201 performs, in the process of performing abnormal point detection on the first sensing data to obtain an abnormal value in the first sensing data:
calculating local density corresponding to each sensing data in the first sensing data, and obtaining outlier factors corresponding to each sensing data in the first sensing data according to the local density;
and determining an abnormal value corresponding to the first sensing data according to the outlier factor.
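A simplified local-density outlier sketch in the spirit of the two steps above; a full local outlier factor (LOF) implementation additionally uses reachability distances, which are omitted here. The neighbourhood size k and the use of 1-D sensing samples are assumptions.

```python
import numpy as np

def outlier_factors(data, k=2):
    """Local-density-based outlier factors for 1-D sensing data.
    Local density is the inverse mean distance to the k nearest
    neighbours; the factor compares each point's density with the
    average density of its neighbours (factors >> 1 suggest outliers)."""
    data = np.asarray(data, float)
    d = np.abs(data[:, None] - data[None, :])        # pairwise distances
    nn = np.argsort(d, axis=1)[:, 1:k + 1]           # k nearest, skipping self
    mean_dist = np.take_along_axis(d, nn, axis=1).mean(axis=1)
    density = 1.0 / (mean_dist + 1e-12)              # local density
    return density[nn].mean(axis=1) / density        # outlier factor
```

Points whose factor exceeds a chosen cutoff would be treated as abnormal values and handled when forming the first processing data.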
In some embodiments, the data acquisition module 201 performs, in the process of performing noise reduction processing on the first processing data using the filtering algorithm to obtain the second processing data:
Obtaining neighborhood data corresponding to each processing data in the first processing data in a preset range;
and obtaining median data corresponding to the neighborhood data, and updating the first processing data according to the median data to obtain second processing data.
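The neighbourhood-median update described above can be sketched as follows; the preset range of plus or minus one sample around each point is an assumption, since the patent does not fix the window size.

```python
import numpy as np

def median_denoise(data, half_window=1):
    """Replace each sample with the median of its neighbourhood data
    within the preset range (+/- half_window samples, clipped at the ends)."""
    data = np.asarray(data, float)
    out = np.empty_like(data)
    n = len(data)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        out[i] = np.median(data[lo:hi])
    return out
```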
In some embodiments, the data acquisition module 201 performs, in the process of performing data calibration on the second processing data to obtain the second sensing data:
sequencing the second processing data according to the acquisition time to obtain sequencing sensing data;
and comparing adjacent sensing data in the ordered sensing data to obtain a comparison result, and determining second sensing data according to the comparison result.
In some embodiments, the image generating module 204 performs, in the process of performing the image enhancement processing on the first radar map to obtain the second radar map:
determining a first template corresponding to the Laplace operator in the first direction and determining a second template corresponding to the Laplace operator in the second direction;
calculating according to the first template to obtain a first gradient image corresponding to the first radar map and calculating according to the second template to obtain a second gradient image corresponding to the first radar map;
Performing pixel fusion according to the first gradient image and the second gradient image to obtain a target gradient image corresponding to the first radar image;
performing cluster analysis on the target gradient image by using a cluster algorithm to obtain a threshold segmentation point corresponding to the target gradient image;
performing enhancement processing on the first radar map according to the threshold segmentation points to obtain a second radar map;
wherein the target gradient image is obtained according to the following formula:

g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a

wherein g1(x, y) represents the pixel value of the first gradient image at coordinates (x, y), g2(x, y) represents the pixel value of the second gradient image at coordinates (x, y), a represents the attenuation factor, and g(x, y) represents the pixel value of the target gradient image at coordinates (x, y).
In some embodiments, the image fusion module 205 performs, in the process of performing image fusion on the first radar map and the second radar map to obtain a target radar map corresponding to the target object:
calculating a first information entropy corresponding to the first radar chart and a second information entropy corresponding to the second radar chart;
determining a first weight corresponding to the first radar chart and a second weight corresponding to the second radar chart according to the first information entropy and the second information entropy;
And performing image fusion by using the first weight, the first radar map, the second weight and the second radar map to obtain a target radar map corresponding to the target object.
In some embodiments, the apparatus 200 for generating a radar map may be applied to a terminal device.
It should be noted that, for convenience and brevity of description, specific working procedures of the above-described radar chart generating apparatus 200 may refer to corresponding procedures in the foregoing radar chart generating method embodiment, and are not described herein again.
Referring to fig. 4, fig. 4 is a schematic block diagram of a structure of a terminal device according to an embodiment of the present invention.
As shown in fig. 4, the terminal device 300 includes a processor 301 and a memory 302, the processor 301 and the memory 302 being connected by a bus 303, such as an I2C (Inter-integrated Circuit) bus.
In particular, the processor 301 is used to provide computing and control capabilities, supporting the operation of the entire terminal device. The processor 301 may be a central processing unit (Central Processing Unit, CPU), the processor 301 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. Wherein the general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Specifically, the memory 302 may be a Flash chip, a read-only memory (ROM) disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 4 is merely a block diagram of a portion of the structure related to the embodiment of the present invention and does not constitute a limitation of the terminal device to which the embodiment of the present invention is applied; a specific terminal device may include more or fewer components than those shown in the drawings, combine some components, or have a different arrangement of components.
The processor is used for running a computer program stored in the memory, and implementing any one of the radar chart generation methods provided by the embodiment of the invention when the computer program is executed.
In an embodiment, the processor is configured to run a computer program stored in a memory and to implement the following steps when executing the computer program:
acquiring first sensing data corresponding to a target object detected by a radar sensor, and performing data preprocessing on the first sensing data to acquire second sensing data;
determining a corrosion core, and performing corrosion treatment on the second sensing data by using the corrosion core to obtain third sensing data;
Determining an expansion core, and performing expansion processing on the third sensing data by using the expansion core to obtain fourth sensing data;
determining a first radar map corresponding to the target object according to the fourth sensing data, and performing image enhancement processing on the first radar map to obtain a second radar map;
and carrying out image fusion on the first radar map and the second radar map to obtain a target radar map corresponding to the target object.
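A minimal sketch of the corrosion (erosion) and expansion (dilation) steps in the pipeline above. Treating the sensing data as a boolean occupancy grid and using 3x3 all-ones cores are assumptions, since the patent leaves the data layout and the cores open. Erosion followed by dilation is a morphological opening: it removes isolated noise cells while restoring the extent of larger targets.

```python
import numpy as np

def erode(grid, core=np.ones((3, 3), bool)):
    """Corrosion: keep a cell only if every cell under the core is set."""
    h, w = grid.shape
    kh, kw = core.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(grid, ((ph, ph), (pw, pw)), constant_values=False)
    out = np.zeros_like(grid)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.all(padded[y:y + kh, x:x + kw][core])
    return out

def dilate(grid, core=np.ones((3, 3), bool)):
    """Expansion: set a cell if any cell under the core is set."""
    h, w = grid.shape
    kh, kw = core.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(grid, ((ph, ph), (pw, pw)), constant_values=False)
    out = np.zeros_like(grid)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.any(padded[y:y + kh, x:x + kw][core])
    return out
```

In the pipeline's terms: `erode(second_sensing_data)` yields the third sensing data, and `dilate(third_sensing_data)` yields the fourth.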
In some embodiments, the processor 301 performs, in performing data preprocessing on the first sensing data to obtain second sensing data:
abnormal point detection is carried out on the first sensing data to obtain an abnormal value in the first sensing data, and first processing data are obtained according to the abnormal value and the first sensing data;
carrying out noise reduction processing on the first processing data by using a filtering algorithm to obtain second processing data;
and carrying out data calibration on the second processing data to obtain the second sensing data.
In some embodiments, the processor 301 performs, in the process of performing outlier detection on the first sensing data to obtain an outlier in the first sensing data:
Calculating local density corresponding to each sensing data in the first sensing data, and obtaining outlier factors corresponding to each sensing data in the first sensing data according to the local density;
and determining an abnormal value corresponding to the first sensing data according to the outlier factor.
In some embodiments, the processor 301 performs, in the process of performing noise reduction processing on the first processing data using the filtering algorithm to obtain the second processing data:
obtaining neighborhood data corresponding to each processing data in the first processing data in a preset range;
and obtaining median data corresponding to the neighborhood data, and updating the first processing data according to the median data to obtain second processing data.
In some embodiments, the processor 301 performs, in performing data calibration on the second processing data to obtain the second sensing data:
sequencing the second processing data according to the acquisition time to obtain sequencing sensing data;
and comparing adjacent sensing data in the ordered sensing data to obtain a comparison result, and determining second sensing data according to the comparison result.
In some embodiments, the processor 301 performs, in performing image enhancement processing on the first radar map to obtain a second radar map:
Determining a first template corresponding to the Laplace operator in the first direction and determining a second template corresponding to the Laplace operator in the second direction;
calculating according to the first template to obtain a first gradient image corresponding to the first radar map and calculating according to the second template to obtain a second gradient image corresponding to the first radar map;
performing pixel fusion according to the first gradient image and the second gradient image to obtain a target gradient image corresponding to the first radar image;
performing cluster analysis on the target gradient image by using a cluster algorithm to obtain a threshold segmentation point corresponding to the target gradient image;
performing enhancement processing on the first radar map according to the threshold segmentation points to obtain a second radar map;
wherein the target gradient image is obtained according to the following formula:

g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a

wherein g1(x, y) represents the pixel value of the first gradient image at coordinates (x, y), g2(x, y) represents the pixel value of the second gradient image at coordinates (x, y), a represents the attenuation factor, and g(x, y) represents the pixel value of the target gradient image at coordinates (x, y).
In some embodiments, the processor 301 performs, in a process of performing image fusion on the first radar map and the second radar map to obtain a target radar map corresponding to the target object:
Calculating a first information entropy corresponding to the first radar chart and a second information entropy corresponding to the second radar chart;
determining a first weight corresponding to the first radar chart and a second weight corresponding to the second radar chart according to the first information entropy and the second information entropy;
and performing image fusion by using the first weight, the first radar map, the second weight and the second radar map to obtain a target radar map corresponding to the target object.
It should be noted that, for convenience and brevity of description, specific working processes of the terminal device described above may refer to corresponding processes in the foregoing embodiment of the method for generating a radar chart, which are not described herein again.
The embodiment of the invention also provides a computer-readable storage medium, wherein the storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to implement the steps of any of the methods for generating a radar chart provided by the embodiments of the invention.
The storage medium may be an internal storage unit of the terminal device according to the foregoing embodiment, for example, a hard disk or a memory of the terminal device. The storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device.
Those of ordinary skill in the art will appreciate that all or some of the steps, systems, functional modules/units in the apparatus, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware embodiment, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. 
Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
It should be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments. While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (8)

1. A method for generating a radar map, the method comprising:
acquiring first sensing data corresponding to a target object detected by a radar sensor, and performing data preprocessing on the first sensing data to acquire second sensing data;
performing data preprocessing on the first sensing data to obtain second sensing data, including:
abnormal point detection is carried out on the first sensing data to obtain an abnormal value in the first sensing data, and first processing data are obtained according to the abnormal value and the first sensing data;
carrying out noise reduction processing on the first processing data by using a filtering algorithm to obtain second processing data;
performing data calibration on the second processing data to obtain second sensing data;
the abnormal point detection for the first sensing data to obtain an abnormal value in the first sensing data includes:
calculating local density corresponding to each sensing data in the first sensing data, and obtaining outlier factors corresponding to each sensing data in the first sensing data according to the local density;
determining an abnormal value corresponding to the first sensing data according to the outlier factor;
determining a corrosion core, and performing corrosion treatment on the second sensing data by using the corrosion core to obtain third sensing data;
Determining an expansion core, and performing expansion processing on the third sensing data by using the expansion core to obtain fourth sensing data;
determining a first radar map corresponding to the target object according to the fourth sensing data, and performing image enhancement processing on the first radar map to obtain a second radar map;
and carrying out image fusion on the first radar map and the second radar map to obtain a target radar map corresponding to the target object.
2. The method of claim 1, wherein the denoising the first processed data using a filtering algorithm to obtain second processed data comprises:
obtaining neighborhood data corresponding to each processing data in the first processing data in a preset range;
and obtaining median data corresponding to the neighborhood data, and updating the first processing data according to the median data to obtain second processing data.
3. The method of claim 1, wherein said performing data calibration on said second processed data to obtain said second sensed data comprises:
sequencing the second processing data according to the acquisition time to obtain sequencing sensing data;
and comparing adjacent sensing data in the ordered sensing data to obtain a comparison result, and determining second sensing data according to the comparison result.
4. The method of claim 1, wherein performing image enhancement processing on the first radar map to obtain a second radar map comprises:
determining a first template corresponding to the Laplace operator in the first direction and determining a second template corresponding to the Laplace operator in the second direction;
calculating according to the first template to obtain a first gradient image corresponding to the first radar map and calculating according to the second template to obtain a second gradient image corresponding to the first radar map;
performing pixel fusion according to the first gradient image and the second gradient image to obtain a target gradient image corresponding to the first radar image;
performing cluster analysis on the target gradient image by using a cluster algorithm to obtain a threshold segmentation point corresponding to the target gradient image;
performing enhancement processing on the first radar map according to the threshold segmentation points to obtain a second radar map;
wherein the target gradient image is obtained according to the following formula:

g(x, y) = (|g1(x, y)| + |g2(x, y)|) / a

wherein g1(x, y) represents the pixel value of the first gradient image at coordinates (x, y), g2(x, y) represents the pixel value of the second gradient image at coordinates (x, y), a represents the attenuation factor, and g(x, y) represents the pixel value of the target gradient image at coordinates (x, y).
5. The method according to claim 1, wherein the image fusing the first radar map and the second radar map to obtain a target radar map corresponding to the target object includes:
calculating a first information entropy corresponding to the first radar chart and a second information entropy corresponding to the second radar chart;
determining a first weight corresponding to the first radar chart and a second weight corresponding to the second radar chart according to the first information entropy and the second information entropy;
and performing image fusion by using the first weight, the first radar map, the second weight and the second radar map to obtain a target radar map corresponding to the target object.
6. A radar chart generating apparatus, comprising:
the data acquisition module is used for acquiring first sensing data corresponding to a target object detected by the radar sensor, and carrying out data preprocessing on the first sensing data to acquire second sensing data;
the data acquisition module performs, in the process of performing data preprocessing on the first sensing data to obtain second sensing data, the following steps:
abnormal point detection is carried out on the first sensing data to obtain an abnormal value in the first sensing data, and first processing data are obtained according to the abnormal value and the first sensing data;
Carrying out noise reduction processing on the first processing data by using a filtering algorithm to obtain second processing data;
performing data calibration on the second processing data to obtain second sensing data;
the data acquisition module executes the following steps in the process of detecting abnormal points of the first sensing data to obtain the abnormal value in the first sensing data:
calculating local density corresponding to each sensing data in the first sensing data, and obtaining outlier factors corresponding to each sensing data in the first sensing data according to the local density;
determining an abnormal value corresponding to the first sensing data according to the outlier factor;
the corrosion processing module is used for determining a corrosion core, and performing corrosion processing on the second sensing data by utilizing the corrosion core to obtain third sensing data;
the expansion processing module is used for determining an expansion core, and performing expansion processing on the third sensing data by using the expansion core to obtain fourth sensing data;
the image generation module is used for determining a first radar image corresponding to the target object according to the fourth sensing data, and performing image enhancement processing on the first radar image to obtain a second radar image;
And the image fusion module is used for carrying out image fusion on the first radar image and the second radar image to obtain a target radar image corresponding to the target object.
7. A terminal device, characterized in that the terminal device comprises a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and to implement the method of generating a radar map according to any one of claims 1 to 5 when the computer program is executed.
8. A computer-readable storage medium, wherein the storage medium stores one or more programs which, when executed by one or more processors, cause the one or more processors to perform the steps of the method for generating a radar map according to any one of claims 1 to 5.
CN202311640764.2A 2023-12-04 2023-12-04 Radar diagram generation method and device, terminal equipment and readable storage medium Active CN117368879B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311640764.2A CN117368879B (en) 2023-12-04 2023-12-04 Radar diagram generation method and device, terminal equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN117368879A CN117368879A (en) 2024-01-09
CN117368879B CN117368879B (en) 2024-03-19

Family

ID=89398741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311640764.2A Active CN117368879B (en) 2023-12-04 2023-12-04 Radar diagram generation method and device, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN117368879B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441766A (en) * 2008-11-28 2009-05-27 西安电子科技大学 SAR image fusion method based on multiple-dimension geometric analysis
CN101639934A (en) * 2009-09-04 2010-02-03 西安电子科技大学 SAR image denoising method based on contour wave domain block hidden Markov model
CN102288945A (en) * 2011-05-11 2011-12-21 成都成电电子信息技术工程有限公司 Image enhancing method for ship radar
CN112541872A (en) * 2020-12-08 2021-03-23 陕西长岭电子科技有限责任公司 Weather radar display effect optimization method
WO2021227797A1 (en) * 2020-05-13 2021-11-18 长沙智能驾驶研究院有限公司 Road boundary detection method and apparatus, computer device and storage medium
CN116503716A (en) * 2023-03-21 2023-07-28 大连理工大学 Radar image derivatization and database capacity expansion method
CN116630216A (en) * 2023-06-02 2023-08-22 中国电建集团昆明勘测设计研究院有限公司 Target fusion method, device, equipment and storage medium based on radar and image
CN116679267A (en) * 2023-06-02 2023-09-01 中国电建集团昆明勘测设计研究院有限公司 Combined calibration method, device, equipment and storage medium based on radar and image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11663832B2 (en) * 2021-01-19 2023-05-30 Micromax International Corp. Method and system for detecting and analyzing objects


Also Published As

Publication number Publication date
CN117368879A (en) 2024-01-09

Similar Documents

Publication Publication Date Title
CN108898086B (en) Video image processing method and device, computer readable medium and electronic equipment
CN113111212B (en) Image matching method, device, equipment and storage medium
CN109118456B (en) Image processing method and device
CN107220962B (en) Image detection method and device for tunnel cracks
CN111860060A (en) Target detection method and device, terminal equipment and computer readable storage medium
CN112336342A (en) Hand key point detection method and device and terminal equipment
CN111695429A (en) Video image target association method and device and terminal equipment
CN111915657A (en) Point cloud registration method and device, electronic equipment and storage medium
CN109300139B (en) Lane line detection method and device
CN114022614A (en) Method and system for estimating confidence of three-dimensional reconstruction target position
CN111898408B (en) Quick face recognition method and device
CN117368879B (en) Radar diagram generation method and device, terminal equipment and readable storage medium
EP2875488B1 (en) Biological unit segmentation with ranking based on similarity applying a shape and scale descriptor
CN114742849B (en) Leveling instrument distance measuring method based on image enhancement
CN115049590B (en) Image processing method and device, electronic equipment and storage medium
CN105631869A (en) Tubular object segmentation method, device and equipment
CN115546143A (en) Method and device for positioning center point of wafer, storage medium and electronic equipment
CN115187642A (en) Remote sensing image registration method, device and storage medium
CN114419068A (en) Medical image segmentation method, device, equipment and storage medium
CN110399892B (en) Environmental feature extraction method and device
CN113935896A (en) Image splicing method and device, computer equipment and storage medium
Zhao et al. IR saliency detection via a GCF-SB visual attention framework
CN112508009A (en) Circular feature detection method and device and storage device
CN114881908B (en) Abnormal pixel identification method, device and equipment and computer storage medium
CN114694138B (en) Road surface detection method, device and equipment applied to intelligent driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant