CN111179191A - Laser radar point cloud data 3D display enhancement method and system - Google Patents
Laser radar point cloud data 3D display enhancement method and system
- Publication number
- CN111179191A (application number CN201911311524.1A)
- Authority
- CN
- China
- Prior art keywords
- cloud data
- point cloud
- gamma
- gray
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Image Processing (AREA)
Abstract
The invention provides a laser radar point cloud data 3D display enhancement method and system, comprising the following steps: step M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood; step M2: calculating a gamma value from the smoothed gray value of the point cloud data; step M3: performing a gamma transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast. The method can be applied to 3D display of laser radar point cloud data; by adaptively adjusting local contrast it significantly improves the visual effect of the 3D display and makes the point cloud data easier to observe visually.
Description
Technical Field
The invention relates to the technical field of display, in particular to a laser radar point cloud data 3D display enhancement method.
Background
Point cloud data refers to scan data recorded as a set of points, each of which carries three-dimensional coordinates. Laser radar point cloud data additionally carries intensity information, namely the echo intensity collected by the receiver of the laser scanner, which depends on the surface material, roughness and incidence angle of the target, the emission energy of the instrument, and the laser wavelength.
When laser radar point cloud data is displayed in 3D, the position of each point is determined by its three-dimensional coordinates, and its intensity is converted into a gray value in the range 0-255. Because the intensities of different points in a scene span a wide range, mapping the entire intensity range onto gray values 0-255 leaves local areas with poor contrast: features of interest are indistinct and the overall visual effect suffers. For example, in an autonomous-driving scene the point cloud intensity of a typical traffic sign area exceeds 100, and nearby objects made of highly reflective materials can exceed 200, while the road surface area is below 10 and the lane line area is around 20. If these intensities are mapped onto gray values 0-255, the contrast between the lane line areas and road surface areas of interest is poor, which hinders observation.
To improve this situation, the intensity range can be truncated and only the retained part mapped onto gray values 0-255, but the contrast of the point cloud data whose intensity exceeds the truncation limit is then lost. In the example above, truncating the intensity to the range 0-40 gives excellent contrast between the lane line area and the road surface area, but the contrast within the traffic sign area is lost.
In view of this situation, the enhancement method provided by this patent, a local gamma correction method, adaptively adjusts local contrast, outputs enhanced point cloud gray values in the range 0-255, and significantly improves the visual effect.
Patent publication CN110346808A (application number 201910635164.4) discloses a laser radar point cloud data processing method and system comprising the following steps: receiving point cloud data of a laser radar; preprocessing the point cloud data with a preset deep convolutional neural network model to obtain processed point cloud data; and outputting the processed point cloud data.
Disclosure of Invention
In view of the defects in the prior art, the object of the present invention is to provide a laser radar point cloud data 3D display enhancement method and system.
The invention provides a laser radar point cloud data 3D display enhancement method, which comprises the following steps:
step M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood;
step M2: calculating a gamma value from the smoothed gray value of the point cloud data;
step M3: performing a gamma transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast.
Preferably, the step M1 includes:
step M1.1: traversing all the point cloud data; for each point, generating a sphere of fixed radius centered on the point and taking the point cloud data inside the sphere as the point's 3D neighborhood; and smoothing the gray value of the point together with the gray values of the point cloud data in its 3D neighborhood to obtain the smoothed gray value of the point;
the smoothing methods include Gaussian smoothing and mean smoothing.
Preferably, the step M2 includes: traversing all the point cloud data, taking the smoothed gray value of each point as input, calculating its gamma value, and outputting a gamma value corresponding to each point;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
Preferably, step M3 includes:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data, so as to effectively adjust local contrast.
Preferably, the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value.
The invention provides a laser radar point cloud data 3D display enhancement system, which comprises:
module M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood;
module M2: calculating a gamma value from the smoothed gray value of the point cloud data;
module M3: performing a gamma transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast.
Preferably, said module M1 comprises:
module M1.1: traversing all the point cloud data; for each point, generating a sphere of fixed radius centered on the point and taking the point cloud data inside the sphere as the point's 3D neighborhood; and smoothing the gray value of the point together with the gray values of the point cloud data in its 3D neighborhood to obtain the smoothed gray value of the point;
the smoothing methods include Gaussian smoothing and mean smoothing.
Preferably, said module M2 comprises: traversing all the point cloud data, taking the smoothed gray value of each point as input, calculating its gamma value, and outputting a gamma value corresponding to each point;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
Preferably, the module M3 includes:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data, so as to effectively adjust local contrast.
Preferably, the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value.
Compared with the prior art, the invention has the following beneficial effects: the method can be applied to 3D display of laser radar point cloud data; by adaptively adjusting local contrast it significantly improves the visual effect of the 3D display and makes the point cloud data easier to observe visually.
Detailed Description
The present invention is described in detail below with reference to specific embodiments. The following embodiments will help those skilled in the art to further understand the invention, but they do not limit the invention in any way. It should be noted that those skilled in the art can make various changes and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
In conventional 3D display of laser radar point cloud data, the position of each point is determined by its three-dimensional coordinates, and its intensity is converted into a gray value in the range 0-255; for example, if the intensity values range from Imin to Imax, this range is mapped linearly onto gray values 0 to 255.
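For illustration only (not part of the patent disclosure), a minimal Python sketch of this linear mapping is given below; the function name, the NumPy dependency, and the sample intensity values taken from the background discussion are assumptions.

```python
import numpy as np

def intensity_to_gray(intensity, i_min=None, i_max=None):
    """Linearly map raw lidar intensities onto gray values 0-255.

    If i_min / i_max are not given, they are taken from the data itself.
    """
    intensity = np.asarray(intensity, dtype=float)
    i_min = intensity.min() if i_min is None else i_min
    i_max = intensity.max() if i_max is None else i_max
    scale = max(i_max - i_min, 1e-9)                      # avoid division by zero
    return (np.clip(intensity, i_min, i_max) - i_min) / scale * 255.0

# Assumed example intensities: road surface, lane line, traffic sign, reflective object.
raw = np.array([5.0, 20.0, 110.0, 210.0])
print(intensity_to_gray(raw))              # full-range mapping: road and lane line both end up dark and close together
print(intensity_to_gray(raw, 0.0, 40.0))   # truncated to 0-40: good road/lane contrast, but the sign region saturates at 255
```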
The display enhancement method takes this 3D display result as input, changes the gray value of each point by applying a local gamma correction, and outputs an enhanced 3D display, effectively overcoming the defect of poor local contrast.
The invention provides a laser radar point cloud data 3D display enhancement method, which comprises the following steps:
step M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood; the smoothed gray value reflects the gray level and contrast of the local area formed by the point and its neighborhood points.
Specifically, the step M1 includes:
the 3D data smoothing module, a common smoothing method such as gaussian smoothing, average smoothing, etc., takes average smoothing as an example:
step M1.1: traversing all the point cloud data, taking the point cloud data as a center, generating a sphere area according to a fixed radius, and taking the point cloud data in the area as the point cloud data in the 3D neighborhood of the point cloud data; averaging the gray values of the point cloud data and the neighborhood point cloud data, and updating the gray value of the point cloud data by using the average value;
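For illustration only, a minimal Python sketch of this sphere-neighborhood mean smoothing follows; the use of SciPy's cKDTree, the function name, and the default radius are assumptions not specified by the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_smooth_gray(xyz, gray, radius=0.5):
    """Mean-smooth each point's gray value over its spherical 3D neighborhood.

    xyz    : (N, 3) array of point coordinates
    gray   : (N,) array of gray values in the range 0-255
    radius : sphere radius defining the 3D neighborhood (scene-dependent, assumed here)
    """
    xyz = np.asarray(xyz, dtype=float)
    gray = np.asarray(gray, dtype=float)
    tree = cKDTree(xyz)
    neighborhoods = tree.query_ball_point(xyz, r=radius)   # index list per point
    smoothed = np.empty_like(gray)
    for i, idx in enumerate(neighborhoods):
        smoothed[i] = gray[idx].mean()                      # center point is included
    return smoothed
```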
step M2: calculating a gamma value from the smoothed gray value of the point cloud data; the gamma value of each point is determined by the gray level of the local area formed by the point and its neighborhood points.
Specifically, the step M2 includes: traversing all the point cloud data, taking the smoothed gray value of each point as input, calculating its gamma value, and outputting a gamma value corresponding to each point;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
The gamma value calculation module works as follows. The input gray value of each point, produced by the 3D data smoothing module, reflects the gray level and contrast of the local area formed by the point and its neighborhood points. For example, if the smoothed gray value of a point equals 128, the local area has a moderate gray level and strong contrast, so the gamma value is set to 1.0 and the gray value is kept unchanged; if the smoothed gray value is far below 128, the local area is dark and its contrast is weak, so the gamma value is set to a value below 1.0 and the gamma transformation raises both the gray value and the contrast.
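For illustration only, equation (1) can be evaluated per point as in the following Python sketch; the function name and the vectorized NumPy form are assumptions.

```python
import numpy as np

def gamma_values(smoothed_gray):
    """Equation (1): gamma = 2^((Gray - 128) / 128), evaluated per point.

    A smoothed gray value of 128 yields gamma = 1.0 (gray value unchanged);
    values below 128 yield gamma < 1.0 (brightening), values above 128
    yield gamma > 1.0 (darkening).
    """
    g = np.asarray(smoothed_gray, dtype=float)
    return np.power(2.0, (g - 128.0) / 128.0)
```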
Step M3: performing a gray value transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast.
Specifically, step M3 includes:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data, so as to effectively adjust local contrast. The adaptivity means that the gamma value used in the gamma transformation differs from point to point and is computed automatically from each point's gray value.
Specifically, the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value. When the gamma value is greater than 1.0, the gamma transformation lowers the gray value and the display darkens; when the gamma value is less than 1.0, the gamma transformation raises the gray value and the display brightens. When the whole image is too dark or too bright, gamma transformation with a single manually tuned gamma value achieves a satisfactory enhancement, but a single global gamma value cannot solve the problem of poor local contrast. This patent therefore makes the gamma value adapt to local area information, which is the role of the gamma value calculation module.
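For illustration only, a Python sketch of the per-point gamma transformation of equation (2) follows; the patent does not fix how r and c are normalized, so normalizing gray values to [0, 1] with c = 1.0, like the function names, is an assumption.

```python
import numpy as np

def gamma_transform(gray, gamma, c=1.0):
    """Equation (2): s = c * r^gamma, applied per point.

    gray  : (N,) input gray values in the range 0-255
    gamma : (N,) per-point gamma values from equation (1)
    c     : gray scale factor (assumed 1.0 here)
    """
    r = np.clip(np.asarray(gray, dtype=float), 0.0, 255.0) / 255.0
    s = c * np.power(r, np.asarray(gamma, dtype=float))
    return np.clip(s * 255.0, 0.0, 255.0)

# End-to-end use of the three sketches above (names assumed):
# smoothed = mean_smooth_gray(xyz, gray)
# enhanced = gamma_transform(gray, gamma_values(smoothed))
```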
The invention provides a laser radar point cloud data 3D display enhancement system, which comprises:
module M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood; the smoothed gray value reflects the gray level and contrast of the local area formed by the point and its neighborhood points.
Specifically, the module M1 includes:
the 3D data smoothing module, a common smoothing method such as gaussian smoothing, average smoothing, etc., takes average smoothing as an example:
module M1.1: traversing all the point cloud data, taking the point cloud data as a center, generating a sphere area according to a fixed radius, and taking the point cloud data in the area as the point cloud data in the 3D neighborhood of the point cloud data; and averaging the gray values of the point cloud data and the neighborhood point cloud data, and updating the gray value of the point cloud data by using the average value.
Module M2: calculating a gamma value by utilizing the gray value of the point cloud data after the smoothing treatment; and determining the gamma value of the point cloud data according to the gray condition of a local area formed by the point cloud data and the neighborhood point cloud data.
Specifically, the module M2 includes: traversing all point cloud data, taking the smoothed gray value of the point cloud data as input, performing gamma value calculation on the point cloud data, and outputting a gamma value corresponding to each point cloud data;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
The gamma value calculation module works as follows. The input gray value of each point, produced by the 3D data smoothing module, reflects the gray level and contrast of the local area formed by the point and its neighborhood points. For example, if the smoothed gray value of a point equals 128, the local area has a moderate gray level and strong contrast, so the gamma value is set to 1.0 and the gray value is kept unchanged; if the smoothed gray value is far below 128, the local area is dark and its contrast is weak, so the gamma value is set to a value below 1.0 and the gamma transformation raises both the gray value and the contrast.
Module M3: performing a gray value transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast.
Specifically, the module M3 includes:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data, so as to effectively adjust local contrast. The adaptivity means that the gamma value used in the gamma transformation differs from point to point and is computed automatically from each point's gray value.
Specifically, the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value. When the gamma value is greater than 1.0, the gamma transformation lowers the gray value and the display darkens; when the gamma value is less than 1.0, the gamma transformation raises the gray value and the display brightens. When the whole image is too dark or too bright, gamma transformation with a single manually tuned gamma value achieves a satisfactory enhancement, but a single global gamma value cannot solve the problem of poor local contrast. This patent therefore makes the gamma value adapt to local area information, which is the role of the gamma value calculation module.
Those skilled in the art will appreciate that, in addition to implementing the systems, apparatus, and their modules as purely computer-readable program code, the same procedures can be implemented entirely by logically programming the method steps so that the systems, apparatus, and modules take the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, the system, apparatus, and modules provided by the present invention can be regarded as a hardware component, and the modules included in them for implementing various programs can also be regarded as structures within that hardware component; modules for performing various functions can likewise be regarded both as software programs implementing the method and as structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.
Claims (10)
1. A laser radar point cloud data 3D display enhancement method is characterized by comprising the following steps:
step M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood;
step M2: calculating a gamma value from the smoothed gray value of the point cloud data;
step M3: performing a gamma transformation on the point cloud data using the calculated gamma values to adjust local contrast.
2. The lidar point cloud data 3D display enhancement method according to claim 1, wherein the step M1 comprises: traversing all the point cloud data; for each point, generating a sphere of fixed radius centered on the point and taking the point cloud data inside the sphere as the point's 3D neighborhood; and smoothing the gray value of the point together with the gray values of the point cloud data in its 3D neighborhood to obtain the smoothed gray value of the point;
the smoothing methods include Gaussian smoothing and mean smoothing.
3. The lidar point cloud data 3D display enhancement method according to claim 1, wherein the step M2 comprises: traversing all the point cloud data, taking the smoothed gray value of each point as input, calculating its gamma value, and outputting a gamma value corresponding to each point;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
4. The lidar point cloud data 3D display enhancement method according to claim 1, wherein the step M3 comprises:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data to adjust local contrast.
5. The lidar point cloud data 3D display enhancement method of claim 1, wherein the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value.
6. A laser radar point cloud data 3D display enhancement system is characterized by comprising:
module M1: for each point of the point cloud data, smoothing its gray value together with the gray values of the point cloud data in its 3D neighborhood;
module M2: calculating a gamma value from the smoothed gray value of the point cloud data;
module M3: performing a gamma transformation on the point cloud data using the calculated gamma values, so as to effectively adjust local contrast.
7. The lidar point cloud data 3D display enhancement system of claim 6, wherein the module M1 comprises:
module M1.1: traversing all the point cloud data; for each point, generating a sphere of fixed radius centered on the point and taking the point cloud data inside the sphere as the point's 3D neighborhood; and smoothing the gray value of the point together with the gray values of the point cloud data in its 3D neighborhood to obtain the smoothed gray value of the point;
the smoothing methods include Gaussian smoothing and mean smoothing.
8. The lidar point cloud data 3D display enhancement system of claim 6, wherein the module M2 comprises: traversing all the point cloud data, taking the smoothed gray value of each point as input, calculating its gamma value, and outputting a gamma value corresponding to each point;
the gamma value calculation method comprises the following steps:
γ[i,j,k] = 2^((Gray[i,j,k] - 128) / 128)    (1)
wherein γ represents the gamma value; [i,j,k] represents the three-dimensional coordinates of the point; Gray[i,j,k] represents its smoothed gray value.
9. The lidar point cloud data 3D display enhancement system of claim 6, wherein the module M3 comprises:
traversing all the point cloud data, taking the gamma value corresponding to each point as input, performing a gamma transformation on the point, and outputting the enhanced point cloud data, so as to effectively adjust local contrast.
10. The lidar point cloud data 3D display enhancement system of claim 6, wherein the gamma transformation is:
s = c·r^γ    (2)
wherein s represents the gray value after the gamma transformation; r represents the input gray value; c represents a gray scale factor; and the exponent γ represents the gamma value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911311524.1A | 2019-12-18 | 2019-12-18 | Laser radar point cloud data 3D display enhancement method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111179191A | 2020-05-19 |
Family
ID=70653973
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911311524.1A (Withdrawn) | Laser radar point cloud data 3D display enhancement method and system | 2019-12-18 | 2019-12-18 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111179191A (en) |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20200519 |