CN117289247A - Laser radar performance detection method and device, electronic equipment and storage medium - Google Patents

Laser radar performance detection method and device, electronic equipment and storage medium

Info

Publication number
CN117289247A
CN117289247A (application CN202311158437.3A)
Authority
CN
China
Prior art keywords
gray
laser
photon
gray value
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311158437.3A
Other languages
Chinese (zh)
Inventor
郑治钦
谢锦阳
闫合
张健
唐昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lizhen Precision Intelligent Manufacturing Kunshan Co ltd
Original Assignee
Lizhen Precision Intelligent Manufacturing Kunshan Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lizhen Precision Intelligent Manufacturing Kunshan Co ltd filed Critical Lizhen Precision Intelligent Manufacturing Kunshan Co ltd
Priority to CN202311158437.3A priority Critical patent/CN117289247A/en
Publication of CN117289247A publication Critical patent/CN117289247A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/14Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein a voltage or current pulse is initiated and terminated in accordance with the pulse transmission and echo reception respectively, e.g. using counters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application relates to a laser radar performance detection method and device, an electronic device and a storage medium. The method comprises the following steps: generating a photon result graph from the laser beams received by a laser receiver of the laser radar at the laser emission time and at the reflected light arrival time; converting the photon result graph into a first gray value table; and performing a data processing step on each row of first gray data to determine the detection distance measured by the laser radar. By curve-fitting the photon number histogram obtained in the data processing step, the laser emission time and the reflected light arrival time can be determined accurately from the peaks of the fitted curves, and an accurate detection distance is determined from their time difference, so that the performance of the laser radar can be evaluated from the preset distance and the plurality of detection distances. Because every point on the test surface of the test board is at the same distance from the test position, errors caused by environmental factors are avoided and the accuracy of laser radar performance detection is improved.

Description

Laser radar performance detection method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of lidar technologies, and in particular, to a method and apparatus for detecting performance of lidar, an electronic device, and a storage medium.
Background
At present, most intelligent terminals, such as smart phones, have shooting and photographing functions to meet users' photographing requirements. During shooting, a laser radar is needed to calibrate the distance to the photographed object so as to improve image sharpness. The laser radar installed in a mobile phone comprises a laser emitter and a laser receiver: the laser emitter emits dot-matrix structured light of a certain wavelength, and the laser receiver receives the laser reflected by the mobile phone screen and by the test board; the round-trip travel time is recorded and used to calculate the distance from the photographed object to the mobile phone. In the current performance test process, the highest point of the photon number histogram generated from the samples of the laser receiver is generally used as the reference time, which introduces an excessive error and affects the accuracy of the laser radar performance test.
Disclosure of Invention
In order to solve the technical problem of how to improve the accuracy of laser radar performance detection, the application provides a laser radar performance detection method and device, an electronic device and a storage medium.
In a first aspect, the present application provides a method for detecting laser radar performance, the method comprising:
controlling a laser radar positioned at a test position to emit laser beams and receiving the laser beams reflected by the test board; the distance from each point on the test surface of the test board to the test position is the same;
generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time;
converting the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; n is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values;
performing a data processing step on the first gradation data for each row, the data processing step including: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon quantity histogram to obtain a photon quantity curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon quantity curve; determining a detection distance detected by the laser radar based on the time difference;
Determining the performance of the laser radar according to a preset distance and a plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board;
optionally, the test board comprises a hemispherical calibration board, the test surface is positioned on the inner surface of the hemispherical calibration board, and the test position is positioned at the center of the hemispherical calibration board;
optionally, a plurality of feature patterns are arranged in the test surface; the difference between the color gray values of the characteristic patterns and the color gray values of the background areas except the characteristic patterns in the test surface is larger than a preset gray difference threshold;
optionally, before the step of determining the performance of the lidar according to the preset distance and the plurality of detection distances, the method further comprises:
acquiring a radar imaging picture, wherein the radar imaging picture is a picture of the test board shot by the laser receiver;
determining a target sampling picture corresponding to a sampling area from the radar imaging picture; wherein the sampling region comprises at least a portion of the background region and at least a portion of the feature pattern;
determining the definition of the radar imaging picture based on the target sampling picture;
Accordingly, the determining the performance of the laser radar according to the preset distance and the detection distances comprises:
under the condition that the definition of the radar imaging picture meets a preset condition, determining the performance of the laser radar according to the preset distance and the detection distances;
optionally, the step of determining the sharpness of the radar imaging picture based on the target sampling picture comprises:
converting the target sampling picture into a second gray value table; the second gray value table comprises M rows of second gray data; each row of the second gray scale data comprises a plurality of second gray scale values; m is a positive integer greater than 1;
determining, from the plurality of second gray values, a gray value group corresponding to each row of the second gray data, wherein each gray value group comprises a preset number of positionally consecutive target gray values with the maximum gray value difference;
performing data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group;
performing imaging definition calculation based on the extended gray value and the target gray value to obtain definition of the radar imaging picture;
optionally, the photon number histogram includes an emitted light photon number histogram and a reflected light photon number histogram, and the step of performing curve fitting on the photon number histogram to obtain a photon number curve includes:
Performing curve fitting on the emitted light photon quantity histogram based on a Gaussian function to obtain an emitted light photon quantity curve;
performing curve fitting on the reflected light photon quantity histogram based on a Gaussian function to obtain a reflected light photon quantity curve;
taking the emitted light photon quantity curve and the reflected light photon quantity curve as the photon quantity curves;
optionally, the step of determining the time difference between the laser emission time and the reflected light arrival time based on the photon number curve comprises:
acquiring a first time corresponding to a photon number peak value of the emitted light photon number curve;
acquiring a second time corresponding to a photon number peak value of the reflected light photon number curve;
determining the time difference between the laser emission time and the reflected light arrival time according to the second time and the first time;
Accordingly, the determining the detection distance detected by the laser radar based on the time difference includes:
and calculating the detection distance according to the light speed and half of the time difference.
In a second aspect, the present application provides an imaging quality detection method, the method comprising:
Controlling a laser radar positioned at a test position to emit laser beams and receiving the laser beams reflected by the test board; the test surface of the test board comprises a plurality of characteristic patterns and a background area except the characteristic patterns;
a laser receiver based on the laser radar receives the laser beam reflected by the test board, and a radar imaging picture is obtained;
determining a target sampling picture corresponding to a sampling area from the radar imaging picture; wherein the sampling region comprises at least a portion of the background region and at least a portion of the feature pattern;
converting the target sampling picture into a gray value table; the gray value table comprises R rows of gray data; each row of the gray data comprises a plurality of gray values; r is a positive integer greater than 0;
determining gray value groups corresponding to each row of gray data from the gray values, wherein each gray value group comprises a preset number of target gray values with continuous positions and maximum gray value difference;
performing data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group;
performing imaging definition calculation based on the target gray value and the extended gray value to obtain imaging quality of the laser receiver;
Optionally, the step of performing data expansion on the target gray value to obtain an expanded gray value corresponding to each gray value group includes:
determining curve parameters of a preset fitting curve according to the target gray values to obtain a target fitting curve corresponding to each gray value group;
in the target fitting curve, sampling curves between any two adjacent target gray values in the gray value group corresponding to the target fitting curve to obtain the extended gray value;
optionally, the preset fitting curve is a cubic polynomial curve, and the step of determining the curve parameters of the preset fitting curve according to the target gray values to obtain the target fitting curve corresponding to each gray value group includes:
numbering the target gray values according to the position sequence of each target gray value;
determining curve parameters of the cubic polynomial curve by taking the target gray value as a y-axis coordinate and taking a number corresponding to the target gray value as an x-axis coordinate to obtain the target fitting curve;
optionally, the step of obtaining the imaging quality of the laser receiver by performing imaging sharpness calculation based on the target gray value and the extended gray value includes:
Sorting the target gray values and the extended gray values corresponding to the target gray values based on the position sequences of the target gray values in each gray value group to obtain gray value sequences corresponding to each gray value group;
carrying out average value calculation on gray values in the same sequence in different gray value sequences to obtain a gray value average value sequence;
obtaining the difference value between every two adjacent gray values from the last gray value of the gray value average value sequence to obtain gray value difference value data;
performing positive quantization and normalization on the gray value difference data to obtain normalized data;
performing discrete Fourier transform on the normalized data to obtain a modulation transfer function image;
determining the imaging quality of the target sampling picture based on the value of the target reference point in the modulation transfer function image;
optionally, the step of performing positive quantization and normalization on the gray value difference data to obtain normalized data includes:
determining a minimum value in the gray value difference data;
subtracting the minimum value from the gray value difference data to obtain positive digital data;
Determining a maximum value in the positive digitized data;
dividing the positive digitized data by the maximum value respectively to obtain normalized data;
optionally, the step of determining the imaging quality of the target picture based on the value of the target reference point in the modulation transfer function image comprises:
acquiring the target reference point to be detected;
acquiring a target value of the target reference point in the modulation transfer function image;
determining the imaging quality of the target picture according to the target value, wherein the imaging quality is inversely related to a first difference value, the first difference value being the difference between the target value and one;
optionally, the difference between the color gray value of the feature pattern and the color gray value of the background area is greater than a preset gray difference threshold.
In a third aspect, the present application provides a lidar performance detection apparatus, the apparatus comprising:
the transmitting and receiving module is used for controlling the laser radar at the test position to transmit laser beams and receiving the laser beams reflected by the test board; the distance from each point on the test surface of the test board to the test position is the same;
the photon result graph generating module is used for generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time;
The first gray value conversion module is used for converting the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; n is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values;
a data processing module, configured to perform a data processing step on the first grayscale data for each row, where the data processing step includes: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon quantity histogram to obtain a photon quantity curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon quantity curve; determining a detection distance detected by the laser radar based on the time difference;
and the performance determining module is used for determining the performance of the laser radar according to a preset distance and a plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board.
In a fourth aspect, the present application provides an imaging quality detection apparatus, the apparatus comprising:
the second transmitting and receiving module is used for controlling the laser radar at the test position to transmit laser beams and receiving the laser beams reflected by the test board; the test surface of the test board comprises a plurality of characteristic patterns and a background area except the characteristic patterns;
The image acquisition module is used for receiving the laser beam reflected by the test board based on the laser receiver of the laser radar to acquire a radar imaging image;
the first determining module is used for determining a target sampling picture corresponding to the sampling area from the radar imaging pictures; wherein the sampling region comprises at least a portion of the background region and at least a portion of the feature pattern;
the second gray value conversion module is used for converting the target sampling picture into a gray value table; the gray value table comprises R rows of gray data; each row of the gray data comprises a plurality of gray values; r is a positive integer greater than 0;
a second determining module, configured to determine, from the plurality of gray values, gray value groups corresponding to each row of the gray data, where each gray value group includes a preset number of target gray values with continuous positions and a maximum gray value difference;
the data expansion module is used for carrying out data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group;
and the quality detection module is used for carrying out imaging definition calculation based on the target gray value and the extended gray value to obtain the imaging quality of the laser receiver.
In a fifth aspect, the present application provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and a processor, configured to implement the steps of the laser radar performance detection method according to any one of the embodiments of the first aspect, or implement the steps of the imaging quality detection method according to any one of the embodiments of the second aspect when executing the program stored in the memory.
In a sixth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the laser radar performance detection method according to any one of the embodiments of the first aspect, or implements the steps of the imaging quality detection method according to any one of the embodiments of the second aspect.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
the laser radar performance detection method provided by the embodiment of the application comprises the following steps: controlling a laser radar positioned at a test position to emit laser beams and receiving the laser beams reflected by the test plate; the distance from each point on the test surface of the test board to the test position is the same; generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time; converting the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; n is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values; performing a data processing step on the first gradation data for each row, the data processing step including: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon quantity histogram to obtain a photon quantity curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon quantity curve; determining a detection distance detected by the laser radar based on the time difference; and determining the performance of the laser radar according to a preset distance and a plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board. According to the method, curve fitting is carried out on the obtained photon quantity histogram, the laser occurrence time and the reflected light arrival time can be accurately determined according to the peak value of the curve, and the accurate detection distance is determined based on the time difference of the laser occurrence time and the reflected light arrival time, so that the performance of the laser radar can be detected according to the preset distance and the detection distances, and because the distances between each point of the test surface of the test board and the test position are the same, errors caused by environmental factors are avoided, and the accuracy of the performance detection of the laser radar is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a system architecture diagram of a method for detecting laser radar performance according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a result graph generated during laser radar performance detection;
FIG. 3 is a schematic diagram of a histogram derived based on gray data;
fig. 4 is a schematic flow chart of a laser radar performance detection method according to an embodiment of the present application;
FIG. 5 is a schematic view of a test plate at an angle according to one embodiment of the present application;
FIG. 6 is a schematic view of another angle of a test plate according to one embodiment of the present application;
fig. 7 is a schematic diagram of generating a fitting curve of a laser radar performance detection method according to an embodiment of the present application;
Fig. 8 is a schematic diagram of generating a fitting curve of a laser radar performance detection method according to another embodiment of the present application;
FIG. 9 is a schematic flow chart of an imaging quality detection method according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of image capturing and data conversion alignment according to an embodiment of the present disclosure;
FIG. 11 is a flowchart of an imaging quality detection method according to another embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a laser radar performance detecting apparatus according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of an imaging quality detection apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present application based on the embodiments herein.
The first embodiment of the present application provides a laser radar performance detection method, which can be applied to a system architecture shown in fig. 1, where the system architecture includes at least a device to be tested 101 and a test board 102. The device to be tested 101 may be a smart terminal including a lidar, such as a mobile terminal (smart phone, tablet, etc.), a notebook computer, etc., without limitation.
To make the technical solution of the present application easier to understand, the prior-art approach is briefly described below, taking a smart phone (hereinafter referred to as a mobile phone) as an example of the device to be tested 101.
The laser radar currently installed in a mobile phone comprises a laser transmitter and a laser receiver. The laser transmitter emits dot-matrix structured light with a wavelength of 1064 nm, and the laser receiver receives the laser reflected by an object; the round-trip travel time is recorded and used to calculate the distance from the object to the mobile phone. In the existing test, the mobile phone is placed 45 cm away from the test board and the laser transmitter emits the dot-matrix structured light. As shown in fig. 2, the laser beam emitted by the laser transmitter is reflected a first time by the mobile phone screen or transparent glass and a second time by the test board, and the two reflected beams enter the laser receiver in turn. Since the start time is the time at which the laser reflected by the mobile phone screen or transparent glass reaches the laser receiver, it is the same for every position on the photographed test board, so it forms an approximately vertical line in the result graph, whose horizontal axis represents time; the end time is the time at which the laser reflected by the object (the test board) returns to the laser receiver. A photon result graph containing both the photons received at the start time and the photons received at the end time is finally obtained. Each pixel in the picture is converted into a gray value and stored in a table, and each value in the table can be regarded as a received photon count. Each row can be divided into two parts, and the numbers of photons received in the different intervals around the start time and around the end time are formed into histograms, as shown in fig. 3. Denoting the highest point of the start-time emitted-light photon number histogram as T0 and the highest point of the end-time reflected-light photon number histogram as T1, the distance between the laser radar and the test board can be calculated as
Distance = C × (T1 − T0) / 2,
where C is the speed of light.
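As an illustrative check (not taken from the application): with C ≈ 3 × 10^8 m/s, a time difference T1 − T0 of 3 ns gives Distance = 3 × 10^8 × 3 × 10^−9 / 2 = 0.45 m, which matches the 45 cm test distance mentioned above.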
The current products have at least the following disadvantages:
(1) The pulses are stretched by the propagation of the light through air, imaging through the lens group, reflection of the light, and so on, so that the light received by the laser receiver is widened relative to the emitted laser pulse; selecting the highest point of the histogram as the reference time therefore introduces too large an error;
(2) Only the distance between the laser radar and the test board can be measured during the product test, and it cannot be judged whether the photon-receiving capability of the laser receiver meets the standard, so when the deviation of the tested distance is too large it cannot be accurately determined whether the fault lies with the laser transmitter or with the laser receiver;
(3) Because the test board is a plane, the end-time trace in the captured result graph is a curve that differs in every direction; only the measurement perpendicular to the board equals 45 cm, while the other test values differ, so later data analysis is inefficient and carries larger errors.
Based on this, a first embodiment of the present application provides a method for detecting laser radar performance, as shown in fig. 4, including:
step 401, controlling the laser radar at the test position to emit laser beams and receiving the laser beams reflected by the test board, wherein the distance from each point on the test surface of the test board to the test position is the same.
The laser radar comprises a laser transmitter and a laser receiver. The laser transmitter emits the laser beams, and the laser receiver receives the reflected laser beams, which comprise the beams reflected back by the test board and the beams reflected back by the transparent glass (which may be the mobile phone screen).
Step 402, generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time.
With a planar test board, the distance from the laser radar equals the constant value (such as 45 cm) only at the center position perpendicular to the board, while the remaining measured distances are larger than 45 cm because of the angle, so the test values differ. To overcome this problem, this embodiment uses a test board on which each point of the test surface is at the same distance from the test position. As shown in fig. 5, the test board comprises a hemispherical calibration board; the test surface is located on the inner surface of the hemispherical calibration board, and the test position is located at the sphere center of the hemispherical calibration board. The test surface may comprise a plurality of feature points, and the feature points may be feature patterns. Because the distance from the center of a sphere to any point on the sphere is the same, testing the laser radar at the sphere center of the hemispherical calibration board yields the same distance for every measurement, which improves the efficiency of later data analysis and reduces analysis errors.
In one embodiment, as shown in fig. 6, a schematic view of another angle of the test board, where a plurality of feature patterns may be disposed in the test surface, the feature patterns may be disposed as square, triangle, ellipse, circle, etc., without limitation, and the number of feature patterns is not limited, and when there are a plurality of feature patterns, the feature patterns may be uniformly distributed on the test surface.
In order to obtain obvious numerical value difference between the collected characteristic pattern region and the background region except the characteristic pattern during data analysis, the difference between the color gray value of the characteristic pattern and the color gray value of the background region is larger than a preset gray difference threshold, namely, two colors with obvious color difference can be selected to respectively configure the colors of the characteristic pattern region and the background region. For example, in one embodiment, the feature pattern is set to black and the background area in the test surface is set to white. Of course, the colors of the feature pattern region and the background region can be configured by two colors with obvious color differences, and the difference of the gray values of the colors between the two colors is larger than a preset gray difference threshold value, so that the method is not limited.
Step 403, converting the photon result graph into a first gray value table; the first gray value table includes N rows of first gray data; n is a positive integer greater than 1; each row of first gray data includes a plurality of first gray values.
The conversion of the photon result map into the first gray value table may be as shown in fig. 7, where the brightness in the photon result map is converted into a gray value, to obtain the first gray value table including N rows of first gray data.
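The conversion described above can be sketched as follows (illustrative only; Pillow/NumPy and the helper name are assumptions, not part of the application):

```python
# Illustrative sketch, not from the application: convert a photon result picture
# into a gray value table (the file path and function name are assumptions).
import numpy as np
from PIL import Image

def photon_result_to_gray_table(path: str) -> np.ndarray:
    img = Image.open(path).convert("L")  # "L" = 8-bit grayscale
    return np.asarray(img)               # N rows of first gray data; each value read as a photon count
```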
Step 404, performing a data processing step on the first gray data of each line, the data processing step including: generating a photon quantity histogram based on the first gray values and the receiving time corresponding to each first gray value, performing curve fitting on the photon quantity histogram to obtain a photon quantity curve, determining the time difference between the laser emission time and the arrival time of the reflected light based on the photon quantity curve, and determining the detection distance detected by the laser radar based on the time difference.
By curve fitting the obtained photon quantity histogram, the laser occurrence time and the reflected light arrival time can be accurately determined according to the peak value of the curve, and then the detection distance detected by the laser radar can be accurately obtained.
In one embodiment, as shown in fig. 7, the photon number histogram includes an emitted light photon number histogram and a reflected light photon number histogram, and the step of curve fitting the photon number histogram to obtain a photon number curve includes: and performing curve fitting on the emitted light photon quantity histogram based on a Gaussian function to obtain an emitted light photon quantity curve, performing curve fitting on the reflected light photon quantity histogram based on the Gaussian function to obtain a reflected light photon quantity curve, and taking the emitted light photon quantity curve and the reflected light photon quantity curve as photon quantity curves.
In this embodiment, curve fitting may be performed on the photon number histogram based on a Gaussian function, and the fitting function for both the emitted-light photon number histogram and the reflected-light photon number histogram may take the following form:
y = y0 + (A / (w·√(π/2))) · exp(−2(x − xc)² / w²)
where y represents the photon number, x represents time, w represents the width of the Gaussian distribution, xc represents the center of the curve pulse, y0 represents the lowest photon number of the fitted curve, and A is greater than 0, its specific value being determined from the area under the curve. The width w may be derived from σ, which represents the degree of dispersion of the photon number distribution.
Of course, in the curve fitting process, fitting software may also be used, as shown in fig. 8; the fitted curve is obtained simply by inputting the plurality of first gray values.
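As an illustration of this fitting step, the following sketch (not from the application; SciPy and the helper name fit_photon_peak are assumptions) fits the Gaussian form above to one row's photon number histogram:

```python
# Illustrative sketch: fit the Gaussian form above to one row's photon number
# histogram. bin_times (seconds) and photon_counts are assumed to come from one
# part of a row of the first gray value table.
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, y0, xc, w, a):
    """Area-based Gaussian peak: y0 offset, xc center, w width, a area (> 0)."""
    return y0 + (a / (w * np.sqrt(np.pi / 2.0))) * np.exp(-2.0 * (x - xc) ** 2 / w ** 2)

def fit_photon_peak(bin_times, photon_counts):
    bin_times = np.asarray(bin_times, float)
    photon_counts = np.asarray(photon_counts, float)
    dt = bin_times[1] - bin_times[0]
    # Rough initial guesses: offset = lowest count, center = highest bin,
    # width = a few bins, area = counts above the offset times the bin width.
    p0 = [photon_counts.min(),
          bin_times[np.argmax(photon_counts)],
          3.0 * dt,
          (photon_counts - photon_counts.min()).sum() * dt]
    popt, _ = curve_fit(gauss, bin_times, photon_counts, p0=p0)
    return popt  # y0, xc (peak time), w, a
```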
In one embodiment, the step of determining the time difference between the laser emission time and the reflected light arrival time based on the photon number curve comprises: and acquiring a first time corresponding to the photon number peak value of the emitted light photon number curve, acquiring a second time corresponding to the photon number peak value of the reflected light photon number curve, and determining the time difference between the laser emission time and the arrival time of the reflected light according to the second time and the first time.
Accordingly, determining a detection distance detected by the lidar based on the time difference includes: the detection distance is calculated from the speed of light and half of the time difference.
In this embodiment, the first time corresponding to the photon number peak of the emitted-light photon number curve is a more accurate time value than the time corresponding to the highest point of the start-time emitted-light photon number histogram and better represents the specific start time; likewise, the second time corresponding to the photon number peak of the reflected-light photon number curve is a more accurate time value than the time corresponding to the highest point of the reflected-light photon number histogram and better represents the specific end time.
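A minimal sketch of this step, under the assumption that the fitted peak position xc from the previous sketch is expressed in seconds for both the emitted-light and the reflected-light curves:

```python
# Illustrative sketch: the detection distance follows from half the time
# difference between the two fitted peak times multiplied by the speed of light.
C = 299_792_458.0  # speed of light in m/s

def detection_distance(t_emit: float, t_reflect: float) -> float:
    """Distance from the time difference between the two fitted peak times."""
    return C * (t_reflect - t_emit) / 2.0
```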
Step 405, determining the performance of the laser radar according to the preset distance and the plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board.
In this method, by curve-fitting the photon number histogram obtained, the laser emission time and the reflected light arrival time can be determined accurately from the peaks of the fitted curves, and an accurate detection distance is determined from their time difference, so that the performance of the laser radar can be detected according to the preset distance and the plurality of detection distances. Because every point on the test surface of the test board is at the same distance from the test position, errors caused by environmental factors are avoided and the accuracy of laser radar performance detection is improved.
In one embodiment, the current scheme cannot judge whether the photon-receiving capability of the laser receiver meets the standard, so when the deviation between the detected distance and the preset distance is too large it cannot be accurately determined whether the laser transmitter or the laser receiver is at fault. To address this, whether the laser receiver performs well can be judged from a target sampling picture, and the faulty device can then be identified when the deviation between the detected distance and the preset distance is too large. For example, before the step of determining the performance of the lidar according to the preset distance and the plurality of detection distances, the method further includes: acquiring a radar imaging picture, where the radar imaging picture is a picture of the test board captured by the laser receiver; determining a target sampling picture corresponding to a sampling area from the radar imaging picture, where the sampling area includes at least a part of the background area and at least a part of the feature pattern; and determining the sharpness of the radar imaging picture based on the target sampling picture.
Accordingly, determining the performance of the lidar according to the preset distance and the plurality of detection distances includes: and under the condition that the definition of the radar imaging picture meets the preset condition, determining the performance of the laser radar according to the preset distance and the detection distances.
In this embodiment, the test surface includes a plurality of feature patterns; for example, the feature patterns are black circular patterns and the background area is white, so the target sampling picture contains a black circular feature pattern. The sampling area is taken so that it includes part of the white background area and part of the black circular feature pattern, and the image in the sampling area is converted into gray values. Because the gray values obtained from black and from white differ markedly, if the maximum difference between the converted gray values is greater than a preset difference value, it can be determined that the sharpness of the image formed by the laser receiver is normal; conversely, if the maximum difference is smaller than the preset difference value, it can be determined that the laser receiver may have a problem causing poor performance.
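A minimal sketch of this threshold check (the helper name and the NumPy representation are assumptions):

```python
# Illustrative sketch: check whether the gray contrast inside the sampling area
# reaches the preset difference value described above.
import numpy as np

def receiver_sharpness_ok(sample_gray: np.ndarray, preset_diff: float) -> bool:
    """sample_gray: 2-D array of gray values converted from the target sampling picture."""
    max_diff = float(sample_gray.max()) - float(sample_gray.min())
    return max_diff > preset_diff
```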
In one embodiment, the step of determining the sharpness of the radar imaging picture based on the target sampling picture includes: converting the target sampling picture into a second gray value table, where the second gray value table includes M rows of second gray data, each row of second gray data includes a plurality of second gray values, and M is a positive integer greater than 1; determining, from the plurality of second gray values, a gray value group corresponding to each row of second gray data, where each gray value group includes a preset number of positionally consecutive target gray values with the largest gray value difference; performing data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group; and performing imaging sharpness calculation based on the expanded gray values and the target gray values to obtain the sharpness of the radar imaging picture.
In this embodiment, in the process of determining the sharpness of the radar imaging picture based on the target sampling picture, the target sampling picture may be converted into the second gray value table, and the gray value group corresponding to each row of second gray data is determined from that table; a gray value group may be an array formed by the four target gray values with the largest gray value difference in each row of second gray data. To improve the accuracy of the calculation, the gray value group may be further expanded to obtain expanded gray values, and the imaging sharpness calculation is performed on the expanded gray values together with the target gray values in the gray value group. The expanded gray values simulate the gray change between every two target gray values, so the edge of the imaged feature pattern can be determined more accurately and the accuracy of the imaging sharpness calculation is improved.
The solutions of the above embodiments have at least the following advantages. The photon number curve obtained by fitting the photon number histogram with a Gaussian function yields accurate start and end times and avoids the influence of the stretching of the light pulse width, which greatly improves test accuracy. Drawing black circular patterns on the white test board allows the gray value deviation at the black-white boundary to be used to judge whether the imaging capability of the laser receiver meets the standard, which prevents the performance of the laser receiver from becoming a variable that affects the result of the distance test, keeps the variable in the test process unique, and ensures the accuracy and reliability of the radar distance test. Replacing the planar test board with a hemispherical calibration board of 45 cm radius turns the result graph, whose traces originally differed in every direction, into two regular photon number record stripes, which avoids errors caused by environmental factors, improves the accuracy of the algorithm, strengthens the factory's ability to analyze problems in devices under test before shipment, and prevents unqualified devices from reaching end customers.
It should be noted that, in the imaging definition calculation, the maximum difference between the second gray values in the second gray data of the same row in the second gray value table may be taken, the maximum difference between the second gray values in part of the rows may be taken, or the maximum difference between all the second gray values in the table may be taken, which is not limited.
It should be noted that, the photon result graph is converted into the first gray value table, and the processing manner of converting the target sampling image into the second gray value table may be the same.
The second embodiment of the present application provides an imaging quality detection method for solving the technical problem of how to accurately determine imaging quality. The method may be applied to imaging quality detection of a picture acquired based on the system architecture shown in fig. 1, where the system architecture includes at least a device to be tested 101 and a test board 102. As shown in fig. 9, the method includes:
step 901, controlling a laser radar at a test position to emit laser beams and receiving the laser beams reflected by a test board; the test surface in the test board includes a plurality of feature patterns and a background area other than the feature patterns.
In order to make a significant numerical difference between the acquired characteristic pattern region and the background region during data analysis, the difference between the color gray value of the characteristic pattern and the color gray value of the background region can be set to be larger than a preset gray difference threshold. For example, two colors with distinct color differences may be selected to configure the colors of the feature pattern region and the background region, respectively.
In one embodiment, the feature pattern is set to black, and the background area in the test surface is set to white, however, the colors of the feature pattern area and the background area can be configured by two colors with obvious difference of other colors, without limitation.
In addition, a plurality of characteristic patterns can be arranged in the test surface, the characteristic patterns can be square, triangular, elliptic, circular and the like, and when the characteristic patterns are circular, the sampling areas comprising the characteristic pattern areas and the non-characteristic pattern areas are easy to determine. Also, the number of the feature patterns is not limited, and when there are a plurality of feature patterns, the feature patterns may be uniformly distributed on the test surface. The distance from each point on the test surface of the test board to the test position is the same.
Step 902, a laser receiver based on a laser radar receives a laser beam reflected by a test board, and a radar imaging picture is obtained.
And step 903, determining a target sampling picture corresponding to a sampling area from the radar imaging picture, wherein the sampling area comprises at least a part of background area and at least a part of characteristic pattern.
Step 904, converting the target sampling picture into a gray value table, wherein the gray value table comprises R rows of gray data; each row of gray data includes a plurality of gray values, and R is a positive integer greater than 0.
The value of R is not limited, and may be any positive integer, and different R-line gray data may be obtained according to the size of the sampling area.
In step 905, a gray value group corresponding to each line of gray data is determined from a plurality of gray values, where each gray value group includes a preset number of target gray values with continuous positions and maximum gray value difference.
Because the color difference between the configured feature pattern area and the background area is obvious (for example, the feature pattern is black and the background area is white), the region with the larger gray value difference in each row of gray data corresponds to the boundary area between the feature pattern and the background in the target sampling picture, and the preset number of positionally consecutive target gray values with the largest difference are selected from this boundary area. For example, K consecutive target gray values may be determined, and K may be set to be greater than or equal to 3; for example, K may be four, so that the sampled target gray values better represent the difference.
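A minimal sketch of selecting one row's gray value group (illustrative; the function name is an assumption):

```python
# Illustrative sketch: for one row of gray values, pick the K consecutive values
# whose max-min difference is largest - that row's gray value group at the
# feature-pattern/background boundary.
import numpy as np

def gray_value_group(row, k: int = 4) -> np.ndarray:
    row = np.asarray(row, float)
    best_start, best_diff = 0, -1.0
    for start in range(len(row) - k + 1):
        window = row[start:start + k]
        diff = float(window.max() - window.min())
        if diff > best_diff:
            best_start, best_diff = start, diff
    return row[best_start:best_start + k]
```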
Step 906, performing data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group.
R lines of gray data including K target gray values in the gray value table can be selected, and before the target gray values are expanded, the R lines of gray data can be aligned first, as shown in FIG. 10, which is a schematic diagram of image shooting and data conversion alignment, and the aligned R lines of gray data include R lines and K columns aligned in rows and columns.
In one embodiment, the step of performing data expansion on the target gray value to obtain an expanded gray value corresponding to each gray value group includes: determining curve parameters of a preset fitting curve according to the target gray values to obtain a target fitting curve corresponding to each gray value group, and sampling curves between any two adjacent target gray values in the gray value groups corresponding to the target fitting curve in the target fitting curve to obtain an expanded gray value.
In one embodiment, the preset fitting curve is a cubic polynomial curve, and the step of determining the curve parameters of the preset fitting curve according to the target gray values to obtain the target fitting curve corresponding to each gray value group includes:
numbering the target gray values according to the position sequence of each target gray value;
and determining curve parameters of a cubic polynomial curve by taking the target gray value as a y-axis coordinate and taking a number corresponding to the target gray value as an x-axis coordinate to obtain a target fitting curve.
In this embodiment, a cubic polynomial curve can be fitted to the K target gray values of each row. The cubic polynomial curve can be written as the function Y = A + BX + CX² + DX³, where Y represents a target gray value, X represents the number corresponding to the target gray value, A represents the first coefficient, B the second coefficient, C the third coefficient and D the fourth coefficient. A, B, C and D may also be called the curve parameters of the cubic polynomial curve and can be determined from the (X, Y) pairs of the gray value group (four pairs when K = 4).
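A minimal sketch of this cubic fit together with the expansion of step 906 (illustrative; NumPy and the function name expand_gray_group are assumptions, and L = 332 follows the worked example given later):

```python
# Illustrative sketch: fit the cubic Y = A + BX + CX^2 + DX^3 to one gray value
# group numbered X = 1..K and sample L extra values between every two
# neighbouring points, expanding K = 4 values into 3L + 4 values.
import numpy as np

def expand_gray_group(group, l: int = 332) -> np.ndarray:
    y = np.asarray(group, float)
    x = np.arange(1, len(y) + 1, dtype=float)            # numbers of the target gray values
    d, c, b, a = np.polyfit(x, y, 3)                     # polyfit returns the highest-degree coefficient first
    poly = lambda t: a + b * t + c * t ** 2 + d * t ** 3
    pieces = [np.linspace(x[i], x[i + 1], l + 2)[:-1]    # left point plus L samples per gap
              for i in range(len(x) - 1)]
    xs = np.concatenate(pieces + [x[-1:]])               # 3L + 4 sample positions in total
    return poly(xs)

# e.g. expand_gray_group([129, 135, 173, 211]) has length 3*332 + 4 = 1000
```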
And step 907, performing imaging definition calculation based on the target gray value and the extended gray value to obtain the imaging quality of the laser receiver.
The test surface of the test board used in this method includes a plurality of feature patterns and a background area other than the feature patterns, and the sampling area includes at least part of the background area and at least part of a feature pattern, which guarantees a large gray value difference in the gray value table converted from the target sampling picture. A preset number of positionally consecutive target gray values with the largest gray value difference are determined, data expansion is performed on the target gray values to obtain expanded gray values, and the imaging sharpness calculation is performed on the target gray values and the expanded gray values, which further ensures the accuracy of imaging quality detection, improves the efficiency of imaging quality detection and reduces the detection error.
In one embodiment, the step of obtaining the imaging quality of the laser receiver by performing imaging sharpness calculation based on the target gray value and the extended gray value includes:
sorting the target gray values and the extended gray values corresponding to the target gray values based on the position sequences of the target gray values in each gray value group to obtain gray value sequences corresponding to each gray value group;
Carrying out average value calculation on gray values in the same sequence in different gray value sequences to obtain a gray value average value sequence;
starting from the last gray value of the gray value average value sequence, obtaining the difference value between every two adjacent gray values to obtain gray value difference value data;
carrying out positive quantization and normalization on the gray value difference data to obtain normalized data;
performing discrete Fourier transform on the normalized data to obtain a modulation transfer function image;
and determining the imaging quality of the target sampling picture based on the value of the target reference point in the modulation transfer function image.
In a specific embodiment, the step of normalizing the gray value difference data to obtain normalized data includes: determining the minimum value in the gray value difference data, and respectively subtracting the minimum value from the gray value difference data to obtain positive digital data; and determining the maximum value in the positive digitized data, and dividing the positive digitized data by the maximum value to obtain normalized data.
In this embodiment, the gray value sequence is an expanded and ordered gray value group. Suppose the four target gray values selected from one row are 129, 135, 173 and 211; the fitting may then give a first coefficient A = 185.1, a second coefficient B = −99.6, a third coefficient C = 47.52 and a fourth coefficient D = −5.28, so the function expression of the cubic polynomial curve is Y = 185.1 − 99.6X + 47.52X² − 5.28X³.
Between every two adjacent points of the fitted curve, L values are taken (for example, L = 332), so that the original four values are expanded into 3L + 4 values (with L = 332, this gives 1000 values). These 1000 values form the gray value sequence corresponding to the gray value group of that row. If 50 rows are taken in total and the four target gray values of each row are processed in the same way, 50 gray value sequences corresponding to the 50 gray value groups are obtained. The 50 gray value sequences are arranged into 1000 columns, and the average value of each column is calculated to obtain 1000 gray value average values, which may be referred to as the gray value average value sequence. For every two adjacent items of the gray value average value sequence, the preceding item is subtracted from the following item to obtain 999 difference values; the minimum of the 999 difference values is determined and subtracted from each of them, thereby realizing positive quantization and obtaining 999 positively quantized values; the maximum of the 999 positively quantized values is then determined, and each positively quantized value is divided by this maximum to obtain 999 normalized values. A discrete Fourier transform is performed on the 999 normalized values to obtain a modulation transfer function image (Modulation Transfer Function, MTF image), so that the imaging quality of the target sampling picture can be determined based on the value of the target reference point in the MTF image.
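The expansion, column averaging, differencing, positive quantization, normalization and discrete Fourier transform described above can be summarized in a short numerical sketch. The following Python code is an illustration under stated assumptions only: the random stand-in rows, the helper names, the choice of reference index and the scaling of the MTF by its zero-frequency term are not taken from the patent.

```python
# Sketch of the imaging sharpness calculation: expand each row of four target gray
# values with a cubic polynomial fit, average column-wise, difference, positively
# quantize, normalize, and read a reference point off the DFT magnitude (MTF).
import numpy as np

def row_to_sequence(gray_values, points_per_gap=332):
    """Fit Y = A + BX + CX^2 + DX^3 and resample to 3*L + 4 values (1000 when L = 332)."""
    x = np.arange(1, len(gray_values) + 1, dtype=float)
    coeffs = np.polyfit(x, gray_values, deg=3)
    n_samples = (len(gray_values) - 1) * points_per_gap + len(gray_values)
    return np.polyval(coeffs, np.linspace(x[0], x[-1], n_samples))

def mtf_readout(target_rows, reference_index=10):
    sequences = np.array([row_to_sequence(r) for r in target_rows])  # 50 x 1000
    averages = sequences.mean(axis=0)        # column-wise averages (1000 values)
    diffs = np.diff(averages)                # back minus front (999 values)
    positive = diffs - diffs.min()           # positive quantization
    normalized = positive / positive.max()   # normalization to [0, 1]
    mtf = np.abs(np.fft.fft(normalized))     # discrete Fourier transform magnitude
    mtf = mtf / mtf[0]                       # assumed scaling so the readout is comparable to 1
    return mtf[reference_index]              # value at the (hypothetical) reference point

# Random rows stand in for the 50 rows of sampled target gray values.
rows = np.random.default_rng(0).integers(120, 220, size=(50, 4))
print(mtf_readout(rows))
```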
In one embodiment, determining the imaging quality of the target sampling picture based on the value of the target reference point in the modulation transfer function image includes: acquiring a target reference point to be detected, acquiring a target value of the target reference point in the modulation transfer function image, and determining the imaging quality of the target sampling picture according to the target value, wherein the imaging quality is inversely related to a first difference value, the first difference value being the difference between the target value and one.
In this embodiment, the closer the target value of the target reference point in the MTF image is to one, the better the imaging quality is represented.
In one embodiment, as shown in fig. 11, the imaging quality detection method includes:
step 1101, a laser receiver takes pictures;
step 1102, converting gray values of the sampling area, wherein each pixel corresponds to a gray value and is stored in a table;
step 1103, selecting four values with larger gray value difference for each row, and aligning the four values of each row up and down;
step 1104, fitting the four values of each row with a cubic polynomial, where the cubic polynomial is: Y = A + BX + CX² + DX³;
step 1105, taking 332 values between every two adjacent points of the fitted curve, so that the original four values are expanded into 1000 values;
step 1106, taking 50 rows of 1000 data each, 50000 data in total; part of the data, such as 50 rows, may be selected from the table without limitation;
step 1107, each column averages to obtain 1000 averages;
step 1108, subtracting the front data from the back data to obtain 999 values;
step 1109, taking the minimum value of the 999 values, and then subtracting the minimum value from each value for positive quantization;
step 1110, taking the maximum value of 999 values after positive quantization, and then dividing the maximum value by the maximum value to perform data normalization;
step 1111, performing discrete Fourier transform on the 999 data after normalization to obtain an MTF image;
step 1112, taking a required point in the MTF image as a reference point; the closer the value of the point is to 1, the better the imaging quality.
In this embodiment, gray value conversion is performed on the shot picture, four gray values with obvious gray level differences are selected from each row, the four values of each row are aligned, a curve is fitted by using a cubic polynomial, and data expansion, positive quantization and normalization are performed, followed by a discrete Fourier transform to obtain an MTF image on which imaging quality detection is performed.
Based on the same technical concept, a third embodiment of the present application provides a laser radar performance detecting apparatus, as shown in fig. 12, including:
a first transceiver module 1201 for controlling the laser radar at the test position to transmit the laser beam and receiving the laser beam reflected by the test board; the distance from each point on the test surface of the test board to the test position is the same;
a photon result graph generating module 1202, configured to generate a photon result graph based on laser beams received by a laser receiver of the laser radar at a laser emission time and a reflected light arrival time respectively;
a first gray value conversion module 1203 configured to convert the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; N is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values;
a data processing module 1204, configured to perform a data processing step on the first gray data of each line, where the data processing step includes: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon quantity histogram to obtain a photon quantity curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon quantity curve; determining a detection distance detected by the laser radar based on the time difference;
And a performance determining module 1205, configured to determine performance of the lidar according to a preset distance and a plurality of detection distances, where the preset distance is a vertical distance from the test position to the test board.
The device performs curve fitting on the obtained photon quantity histogram, so that the laser emission time and the reflected light arrival time can be accurately determined from the peak values of the curves, and an accurate detection distance is determined based on the time difference between them; the performance of the laser radar can then be detected according to the preset distance and the plurality of detection distances. Since the distance from each point on the test surface of the test board to the test position is the same, errors caused by environmental factors are avoided and the accuracy of the laser radar performance detection is improved.
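For illustration only, the histogram fitting and distance calculation performed by the data processing module 1204 can be sketched as follows. The Gaussian model and the relation distance = (speed of light) × (time difference) / 2 follow the description and claims; the SciPy dependency, the synthetic histograms and the helper names are assumptions for demonstration, not the patent's implementation.

```python
# Sketch: fit Gaussian curves to the emitted-light and reflected-light photon number
# histograms, take the times of the photon number peaks, and convert the time
# difference into a detection distance (round trip, hence half the time difference).
import numpy as np
from scipy.optimize import curve_fit

C_LIGHT = 299_792_458.0  # speed of light in m/s

def gaussian(t, amp, mu, sigma):
    return amp * np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def peak_time(times, photon_counts):
    """Fit a Gaussian to a photon number histogram and return the time of its peak."""
    p0 = [photon_counts.max(), times[np.argmax(photon_counts)], (times[-1] - times[0]) / 20]
    params, _ = curve_fit(gaussian, times, photon_counts, p0=p0)
    return params[1]  # mu: time corresponding to the photon number peak

def detection_distance(times, emitted_hist, reflected_hist):
    t_emit = peak_time(times, emitted_hist)        # laser emission time
    t_arrive = peak_time(times, reflected_hist)    # reflected light arrival time
    return C_LIGHT * (t_arrive - t_emit) / 2.0

# Synthetic histograms whose peaks are about 6.67 ns apart, i.e. roughly 1 m.
t = np.linspace(0.0, 20e-9, 400)
emitted = gaussian(t, 1000.0, 2.0e-9, 0.3e-9)
reflected = gaussian(t, 300.0, 8.67e-9, 0.4e-9)
print(detection_distance(t, emitted, reflected))   # ≈ 1.0 (meters)
```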
Based on the same technical idea, a fourth embodiment of the present application provides an imaging quality detection apparatus, as shown in fig. 13, including:
the second transmitting and receiving module 1301 is configured to control the laser radar at the test position to transmit a laser beam and receive the laser beam reflected by the test board; the test surface of the test board comprises a plurality of characteristic patterns and a background area except the characteristic patterns;
The image obtaining module 1302 is configured to obtain a radar imaging picture based on the laser receiver of the laser radar receiving the laser beam reflected by the test board;
a first determining module 1303, configured to determine a target sampling picture corresponding to a sampling area from the radar imaging picture; wherein the sampling region comprises at least a portion of the background region and at least a portion of the feature pattern;
a second gray value conversion module 1304, configured to convert the target sampled picture into a gray value table; the gray value table comprises R rows of gray data; each row of the gray data comprises a plurality of gray values; r is a positive integer greater than 0;
a second determining module 1305, configured to determine, from the plurality of gray values, gray value groups corresponding to each row of the gray data, where each gray value group includes a preset number of target gray values with continuous positions and maximum gray value differences;
the data expansion module 1306 is configured to perform data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group;
and the quality detection module 1307 is configured to perform imaging definition calculation based on the target gray level value and the extended gray level value, so as to obtain imaging quality of the laser receiver.
In this device, the test surface of the test board comprises a plurality of characteristic patterns and a background area other than the characteristic patterns, and the sampling area comprises at least a portion of the background area and at least a portion of the characteristic patterns. This ensures a large gray value difference in the gray value table converted from the target sampling picture, so that a preset number of target gray values with continuous positions and the maximum gray value difference can be determined; the target gray values are then subjected to data expansion to obtain extended gray values, and the imaging definition calculation is performed based on the target gray values and the extended gray values, which further ensures the accuracy of imaging quality detection, improves the efficiency of imaging quality detection, and reduces the detection error.
It should be appreciated that the first transceiver module and the second transceiver module may be the same module.
As shown in fig. 14, a fifth embodiment of the present application provides an electronic device including a processor 111, a communication interface 112, a memory 113, and a communication bus 114, wherein the processor 111, the communication interface 112 and the memory 113 communicate with each other through the communication bus 114;
a memory 113 for storing a computer program;
In one embodiment, the processor 111 is configured, when executing the program stored in the memory 113, to implement the laser radar performance detection method provided in any one of the foregoing method embodiments, or to implement the imaging quality detection method provided in any one of the foregoing method embodiments.
The communication bus of the above terminal may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the terminal and other devices.
The memory may include random access memory (Random Access Memory, RAM) or non-volatile memory (non-volatile memory), such as at least one disk memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP for short), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), a field-programmable gate array (Field-Programmable Gate Array, FPGA for short) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The sixth embodiment of the present application further provides a computer readable storage medium, on which a computer program is stored, where the computer program when executed by a processor implements the laser radar performance detection method provided in any one of the foregoing method embodiments, or implements the imaging quality detection method provided in any one of the foregoing method embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., solid state disk (SSD)), or the like.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. In the description, suffixes such as "module", "part" or "unit" used for representing elements are used only for facilitating the description of the present application, and have no specific meaning per se. Thus, "module," "component," or "unit" may be used in combination.
The foregoing is merely a specific embodiment of the application to enable one skilled in the art to understand or practice the application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for detecting laser radar performance, the method comprising:
controlling a laser radar positioned at a test position to emit laser beams and receiving the laser beams reflected by the test plate; the distance from each point on the test surface of the test board to the test position is the same;
generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time;
converting the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; n is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values;
performing a data processing step on the first gray data of each row, the data processing step including: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon number histogram to obtain a photon number curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon number curve; determining a detection distance detected by the laser radar based on the time difference;
and determining the performance of the laser radar according to a preset distance and a plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board.
2. The method of claim 1, wherein the test plate comprises a hemispherical calibration plate, the test surface is located on an inner surface of the hemispherical calibration plate, and the test location is located at a center of sphere of the hemispherical calibration plate.
3. The method of claim 2, wherein a plurality of feature patterns are provided in the test surface; the difference between the color gray scale value of the characteristic pattern and the color gray scale value of the background area except the characteristic pattern in the test surface is larger than a preset gray scale difference threshold value.
4. A method according to claim 3, wherein prior to the step of determining the performance of the lidar based on a preset distance and a plurality of the detected distances, the method further comprises:
acquiring a radar imaging picture, wherein the radar imaging picture is a picture of the test board shot by the laser receiver;
determining a target sampling picture corresponding to a sampling area from the radar imaging picture; wherein the sampling region comprises at least a portion of the background region and at least a portion of the feature pattern;
determining the definition of the radar imaging picture based on the target sampling picture;
accordingly, the determining the performance of the laser radar according to the preset distance and the detection distances comprises:
and under the condition that the definition of the radar imaging picture meets the preset condition, determining the performance of the laser radar according to the preset distance and the detection distances.
5. The method of claim 4, wherein the step of determining the sharpness of the radar imaging picture based on the target sample picture comprises:
converting the target sampling picture into a second gray value table; the second gray value table comprises M rows of second gray data; each row of the second gray scale data comprises a plurality of second gray scale values; m is a positive integer greater than 1;
determining, from the plurality of second gray values, a gray value group corresponding to each row of the second gray data, wherein each gray value group comprises a preset number of target gray values with continuous positions and a maximum gray value difference;
performing data expansion on the target gray values to obtain expanded gray values corresponding to each gray value group;
and carrying out imaging definition calculation based on the extended gray value and the target gray value to obtain the definition of the radar imaging picture.
6. The method of claim 1, wherein the photon number histogram comprises an emitted light photon number histogram and a reflected light photon number histogram, and wherein curve fitting the photon number histogram to obtain a photon number curve comprises:
performing curve fitting on the emitted light photon quantity histogram based on a Gaussian function to obtain an emitted light photon quantity curve;
performing curve fitting on the reflected light photon quantity histogram based on a Gaussian function to obtain a reflected light photon quantity curve;
and taking the emitted light photon quantity curve and the reflected light photon quantity curve as the photon quantity curves.
7. The method of claim 6, wherein the step of determining the time difference between the laser emission time and the reflected light arrival time based on the photon number curve comprises:
acquiring a first time corresponding to a photon number peak value of the emitted light photon number curve;
acquiring a second time corresponding to a photon number peak value of the reflected light photon number curve;
determining a difference between the second time and the first time as the time difference between the laser emission time and the reflected light arrival time;
Accordingly, the determining the detection distance detected by the laser radar based on the time difference includes:
and calculating the detection distance according to the light speed and half of the time difference.
8. A lidar performance detection apparatus, the apparatus comprising:
the first transmitting and receiving module is used for controlling the laser radar at the test position to transmit laser beams and receiving the laser beams reflected by the test plate; the distance from each point on the test surface of the test board to the test position is the same;
the photon result graph generating module is used for generating a photon result graph based on laser beams respectively received by a laser receiver of the laser radar at the laser emission time and the reflected light arrival time;
the first gray value conversion module is used for converting the photon result graph into a first gray value table; the first gray value table comprises N rows of first gray data; n is a positive integer greater than 1; each row of the first gray data comprises a plurality of first gray values;
A data processing module, configured to perform a data processing step on the first grayscale data for each row, where the data processing step includes: generating a photon number histogram based on a plurality of the first gray values and the receiving time corresponding to each of the first gray values; performing curve fitting on the photon quantity histogram to obtain a photon quantity curve; determining a time difference between the laser emission time and the reflected light arrival time based on the photon quantity curve; determining a detection distance detected by the laser radar based on the time difference;
and the performance determining module is used for determining the performance of the laser radar according to a preset distance and a plurality of detection distances, wherein the preset distance is the vertical distance from the test position to the test board.
9. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the steps of the lidar performance detection method of any of claims 1 to 7 when executing a program stored on a memory.
10. A computer readable storage medium having stored thereon a computer program, which when executed by a processor performs the steps of the lidar performance detection method according to any of claims 1 to 7.