CN113393508A - Laser ranging imaging method and device, electronic equipment and storage medium

Laser ranging imaging method and device, electronic equipment and storage medium

Info

Publication number
CN113393508A
CN113393508A (application number CN202110682489.5A)
Authority
CN
China
Prior art keywords
target
ranging
target object
value
values
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110682489.5A
Other languages
Chinese (zh)
Inventor
高乐
马玲玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd (ICBC)
Priority to CN202110682489.5A
Publication of CN113393508A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure provides a laser ranging imaging method, which can be used in the field of laser imaging technology, the field of finance, or other fields. The method comprises the following steps: scanning a target area with a laser beam to obtain a plurality of ranging values of the target area, wherein the target area contains a target object; acquiring position information of the target object through satellite remote sensing; determining a plurality of target ranging values for the target object from the plurality of ranging values according to the position information; and generating a three-dimensional image of the target object based on the plurality of target ranging values. In addition, the present disclosure also provides a laser ranging imaging apparatus, an electronic device, a readable storage medium, and a computer program product.

Description

Laser ranging imaging method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of laser imaging technology and the field of finance, and more particularly, to a laser ranging imaging method, a laser ranging imaging apparatus, an electronic device, a readable storage medium, and a computer program product.
Background
Laser ranging imaging technology has matured steadily and can now image long-distance targets.
In the course of implementing the disclosed concept, the inventors found that the laser ranging imaging method in the related art cannot directly acquire an image of a target object in the presence of an obstacle.
Disclosure of Invention
In view of the above, the present disclosure provides a laser ranging imaging method, a laser ranging imaging apparatus, an electronic device, a readable storage medium, and a computer program product.
One aspect of the present disclosure provides a laser ranging imaging method, including: scanning a target area with a laser beam to acquire a plurality of ranging values of the target area, wherein the target area contains a target object; acquiring position information of the target object through satellite remote sensing; determining a plurality of target ranging values for the target object from the plurality of ranging values according to the position information; and generating a three-dimensional image of the target object based on the plurality of target ranging values.
According to an embodiment of the present disclosure, the determining a plurality of target ranging values for the target object from the plurality of ranging values according to the position information includes: acquiring a first distance value to the target object based on the position information and current position information; and determining a plurality of target ranging values for the target object from the plurality of ranging values according to the first distance value.
According to an embodiment of the present disclosure, the determining a plurality of target ranging values for the target object from the plurality of ranging values according to the first distance value includes: clustering the plurality of ranging values into a plurality of ranging value point sets based on the numerical magnitudes of the plurality of ranging values; for each ranging value point set, taking the ranging value of the center point of the ranging value point set as a second distance value of the ranging value point set; calculating the difference between the first distance value and the second distance value of each ranging value point set to obtain a plurality of third distance values; and determining the plurality of ranging values contained in the ranging value point set corresponding to the third distance value with the smallest absolute value as the plurality of target ranging values.
According to an embodiment of the present disclosure, the determining a plurality of target ranging values for the target object from the plurality of ranging values according to the first distance value includes: for each ranging value, calculating a difference between the ranging value and the first distance value; and determining the ranging value as a target ranging value when the absolute value of the difference is smaller than a preset threshold value.
According to an embodiment of the present disclosure, the generating a three-dimensional image of the target object based on the plurality of target ranging values includes: generating a point cloud image of the target object according to the plurality of target ranging values; and processing the point cloud image to obtain a three-dimensional image of the target object.
According to an embodiment of the present disclosure, the method further includes: acquiring a depth of field parameter and a plane parameter of the target object according to the plurality of target ranging values; correcting imaging parameters based on the depth of field parameter and the plane parameter; and generating a three-dimensional image of the target object based on the plurality of target ranging values and the corrected imaging parameters.
Another aspect of the present disclosure provides a laser ranging imaging apparatus comprising a scanning module, a remote sensing module, a processing module, and a first generating module. The scanning module is used for scanning a target area with a laser beam and acquiring a plurality of ranging values of the target area, wherein the target area contains a target object; the remote sensing module is used for acquiring position information of the target object through satellite remote sensing; the processing module is configured to determine a plurality of target ranging values for the target object from the plurality of ranging values according to the position information; and the first generating module is configured to generate a three-dimensional image of the target object based on the plurality of target ranging values.
Another aspect of the present disclosure provides an electronic device including: one or more processors; memory to store one or more instructions, wherein the one or more instructions, when executed by the one or more processors, cause the one or more processors to implement a method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program product comprising computer executable instructions for implementing the method as described above when executed.
According to the embodiments of the present disclosure, for the acquired ranging values of the target area, the target ranging values of the target object are selected from the ranging values using the position information of the target object, and a three-dimensional image of the target object is generated using the target ranging values, so that the influence of factors such as obstacles on imaging quality is at least partially overcome and the requirements of imaging scenarios such as close-range and long-distance imaging are met.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
fig. 1 schematically shows a schematic diagram of an application scenario of a laser range imaging method according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow diagram of a laser range imaging method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a schematic diagram of a laser range imaging method according to another embodiment of the present disclosure;
FIG. 4 schematically illustrates a block diagram of a laser range imaging apparatus according to an embodiment of the present disclosure;
fig. 5 schematically shows a block diagram of an electronic device suitable for implementing a laser range imaging method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Embodiments of the present disclosure provide a laser range imaging method, a laser range imaging apparatus, an electronic device, a readable storage medium, and a computer program product. The method comprises the following steps: scanning a target area through a laser beam to obtain a plurality of ranging values of the target area, wherein the target area comprises a target object; acquiring position information of a target object through satellite remote sensing; determining a plurality of target ranging values for the target object from the plurality of ranging values according to the position information; and generating a three-dimensional image of the target object based on the plurality of target ranging values.
It should be noted that the laser ranging imaging method and apparatus provided by the embodiment of the present disclosure may be used in the field of laser imaging technology or the field of finance, and may also be used in any other fields except the field of laser imaging technology and the field of finance.
Fig. 1 schematically shows a schematic diagram of an application scenario of a laser range imaging method according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of an application scenario of the embodiment of the present disclosure to help those skilled in the art understand the technical content of the present disclosure, but it does not mean that the embodiment of the present disclosure may not be applied to other scenarios or implemented by other devices.
As shown in fig. 1, the laser ranging imaging method provided by the embodiment of the present disclosure may be implemented by a system in which a laser ranging device 101, a remote sensing device 102, and a terminal device 103 are combined.
The laser ranging device 101 may emit a laser signal toward the target object 104, and after receiving the laser signal returned by the target object 104, may convert the returned laser signal into an electrical signal and send the electrical signal to the terminal device 103 for processing.
Remote sensing device 102 may establish a communication connection with satellite 105 via a telemetry signal. Satellite 105 may obtain position information of target object 104 and transmit the position information to remote sensing device 102, after which remote sensing device 102 may also convert the position information into an electrical signal and transmit it to terminal device 103.
The terminal device 103 may be various electronic devices having a display screen and supporting input and output of information, including but not limited to a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like.
After receiving the electric signals sent by the laser ranging device 101 and the remote sensing device 102, the terminal device 103 may produce a three-dimensional image of the target object 104 and display the three-dimensional image on a display screen of the terminal device 103.
A network connection can be established between the laser ranging device 101 and the terminal device 103, and between the remote sensing device 102 and the terminal device 103, in a wired or wireless manner, so as to realize information interaction.
The laser ranging imaging method provided by the embodiment of the present disclosure may be generally performed by the terminal device 103. Accordingly, the laser ranging imaging apparatus provided by the embodiment of the present disclosure may be generally disposed in the terminal device 103.
In addition, the laser ranging imaging method provided by the embodiment of the present disclosure may also be executed by other terminal devices or a server cluster capable of establishing a communication connection with the terminal device 103. Accordingly, the laser ranging imaging apparatus provided by the embodiment of the present disclosure may also be disposed in other terminal devices or servers, or a server cluster, which can establish a communication connection with the terminal device 103. The terminal device 103 may send the received electrical signal to another terminal device or a server cluster, and the other terminal device or the server cluster may execute the method of the embodiment of the present disclosure to generate a three-dimensional image, and return the three-dimensional image to the terminal device 103 for displaying.
It should be understood that the laser ranging device, the remote sensing device and the terminal device in fig. 1 are only illustrative and can be arbitrarily set according to implementation needs.
Fig. 2 schematically illustrates a flow chart of a laser range imaging method according to an embodiment of the present disclosure.
It should be noted that, in the flowcharts of this disclosure, unless an execution order between different operations is explicitly stated or is required by the technical implementation, the operations need not be executed in the order shown, and multiple operations may be executed simultaneously.
As shown in fig. 2, the method includes operations S201 to S204.
In operation S201, a target area is scanned by a laser beam, and a plurality of ranging values of the target area are acquired, wherein the target area includes a target object therein.
In operation S202, position information of a target object is acquired through satellite remote sensing.
In operation S203, a plurality of target ranging values for the target object are determined from the plurality of ranging values according to the position information.
In operation S204, a three-dimensional image of the target object is generated based on the plurality of target ranging values.
According to embodiments of the present disclosure, the laser beam may be generated by activating a laser rangefinder.
According to the embodiment of the present disclosure, the target region may further include a shielding object and a background object, where the shielding object may be an object located between the target object and the laser beam excitation device, and the background object may be an object located behind the target object, that is, a distance between the background object and the laser beam excitation device is greater than a distance between the target object and the laser beam excitation device.
According to an embodiment of the present disclosure, the target object may be an object of an arbitrary shape.
According to an embodiment of the present disclosure, the ranging values may be acquired by a pulse (time-of-flight) method, a phase method, a triangulation reflection method, or the like.
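For illustration only (this sketch is not part of the claimed method), the pulse variant reduces to halving the measured round-trip time and multiplying by the speed of light; the phase and triangulation variants use different computations:

    # Minimal Python sketch of the pulse (time-of-flight) ranging principle; illustrative only.
    C = 299_792_458.0  # speed of light in m/s

    def pulse_ranging_value(round_trip_time_s: float) -> float:
        """A ranging value is half the round-trip time multiplied by the speed of light."""
        return C * round_trip_time_s / 2.0

    print(pulse_ranging_value(800e-9))  # about 119.9 m for an 800 ns round trip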
According to embodiments of the present disclosure, the position information of the target object may be used to calculate the distance between the target object and the laser beam excitation device.
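As a hedged illustration of this calculation, the sketch below assumes the satellite remote sensing data provides latitude, longitude and altitude for both the target object and the laser ranging device (the patent does not fix a coordinate format) and computes the straight-line distance between the two positions:

    import math

    # Assumed input format (not specified in the patent): latitude/longitude in degrees
    # and altitude in metres for the target object and for the laser ranging device.
    def straight_line_distance(lat1, lon1, alt1, lat2, lon2, alt2):
        """Approximate straight-line distance between two geodetic positions."""
        r_earth = 6_371_000.0  # mean Earth radius in metres

        def to_xyz(lat_deg, lon_deg, alt_m):
            lat, lon = math.radians(lat_deg), math.radians(lon_deg)
            r = r_earth + alt_m
            return (r * math.cos(lat) * math.cos(lon),
                    r * math.cos(lat) * math.sin(lon),
                    r * math.sin(lat))

        return math.dist(to_xyz(lat1, lon1, alt1), to_xyz(lat2, lon2, alt2))

    # Example: target roughly 1.2 km north of the ranging device at a similar altitude.
    print(straight_line_distance(31.2304, 121.4737, 10.0, 31.2412, 121.4737, 15.0))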
According to the embodiment of the disclosure, the category of each of the plurality of ranging values can be determined according to the distance between the target object and the laser beam excitation device, that is, whether the ranging value is derived from a laser signal reflected by the shielding object, the target object, or the background object.
According to the embodiment of the disclosure, a contour image of a target object can be constructed according to a target ranging value; and then, adding colors and textures to the contour image by a computer rendering method to obtain a three-dimensional image of the target object.
According to the embodiments of the present disclosure, for the acquired ranging values of the target area, the target ranging values of the target object are selected from the ranging values using the position information of the target object, and a three-dimensional image of the target object is generated using the target ranging values, so that the influence of factors such as obstacles on imaging quality is at least partially overcome and the requirements of imaging scenarios such as close-range and long-distance imaging are met.
The method of fig. 2 is further described with reference to fig. 3 in conjunction with specific embodiments.
Fig. 3 schematically illustrates a schematic diagram of a laser range imaging method according to another embodiment of the present disclosure.
As shown in fig. 3, the method includes operations S301 to S307.
In operation S301, a plurality of ranging values of a target area are acquired.
In operation S302, position information of a target object in the target area is acquired.
In operation S303, a first distance value from the target object is calculated according to the position information.
In operation S304, a plurality of target ranging values for the target object are determined from the plurality of ranging values according to the first distance value.
In operation S305, it is determined whether an abnormal value exists among the target ranging values. If an abnormal value exists, operation S306 is performed; if no abnormal value exists, operation S307 is performed.
In operation S306, the abnormal value is eliminated, and operation S305 is performed again for re-judgment.
In operation S307, a three-dimensional image of the target object is generated according to the plurality of target ranging values.
According to the embodiment of the present disclosure, the method of operations S301 to S302 may be implemented according to the method of operations S201 to S202, and will not be described herein again.
According to the embodiment of the present disclosure, the first distance value to the target object may be acquired according to the position information of the target object and the current position information of the laser ranging device.
According to an embodiment of the present disclosure, the first distance value may be a straight-line distance between the laser and the target object.
According to embodiments of the present disclosure, a plurality of methods may be employed to determine a plurality of target ranging values for a target object from a plurality of ranging values.
For example, the plurality of ranging values may be clustered, a representative value of each category may be selected, and whether the ranging values in a category are target ranging values may be determined according to the relationship between the representative value and the first distance value. Specifically, the plurality of ranging values may be clustered into a plurality of ranging value point sets based on their numerical magnitudes; for each ranging value point set, the ranging value of the central point of the set is taken as a second distance value of the set; the difference between the first distance value and the second distance value of each set is calculated to obtain a plurality of third distance values; and the plurality of ranging values contained in the set corresponding to the third distance value with the smallest absolute value are determined as the plurality of target ranging values.
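A minimal sketch of this clustering-based selection is shown below; it treats the ranging values as one-dimensional distances in metres and uses a simple gap-based split as a stand-in for the clustering step, which is an assumption rather than the specific algorithm intended by the patent:

    import numpy as np

    def select_target_ranges_by_clustering(ranges, first_distance, gap=2.0):
        """Cluster 1-D ranging values and keep the set closest to the first distance value."""
        sorted_ranges = np.sort(np.asarray(ranges, dtype=float))

        # Split into "ranging value point sets" wherever consecutive values differ by
        # more than `gap` metres (a simple stand-in for a clustering algorithm).
        breaks = np.where(np.diff(sorted_ranges) > gap)[0] + 1
        clusters = np.split(sorted_ranges, breaks)

        # Second distance value of each set: the mean stands in for the centre-point value.
        second_distances = np.array([c.mean() for c in clusters])

        # Third distance values: differences to the first distance value.
        third_distances = np.abs(second_distances - first_distance)
        return clusters[int(np.argmin(third_distances))]

    # Occluder near 40 m, target near 120 m, background near 300 m; remote sensing says
    # the target is about 121 m away, so the 120 m set is selected.
    scan = [39.8, 40.1, 40.3, 119.7, 120.2, 120.5, 121.0, 298.9, 300.4]
    print(select_target_ranges_by_clustering(scan, first_distance=121.0))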
For another example, each ranging value may be compared with the first distance value to determine the target ranging values. Specifically, for each ranging value, the difference between the ranging value and the first distance value may be calculated, and the ranging value is determined as a target ranging value when the absolute value of the difference is smaller than a preset threshold value. The preset threshold may be related to the first distance value, for example, 10% or 5% of the first distance value; when a known target object is imaged, the preset threshold may also be related to shape information of the target object.
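The threshold-based variant is even simpler to sketch; the 5% relative threshold used below is just one of the example values mentioned above:

    def select_target_ranges_by_threshold(ranges, first_distance, rel_threshold=0.05):
        """Keep ranging values whose difference from the first distance value is small."""
        threshold = rel_threshold * first_distance
        return [r for r in ranges if abs(r - first_distance) < threshold]

    scan = [39.8, 40.1, 119.7, 120.2, 121.0, 298.9]
    print(select_target_ranges_by_threshold(scan, first_distance=121.0))  # target returns only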
For another example, when a known target object is imaged, a machine learning model such as an instance segmentation model or an image recognition model may be used to determine the target ranging values from the plurality of ranging values. Specifically, a contour map of the target area may be generated using the plurality of ranging values; the contour map may then be input into an instance segmentation model to obtain a plurality of identified instances; the obtained instances may be input into an image recognition model for recognizing the target object so as to determine a target instance; and the target ranging values may be determined according to the target instance. Alternatively, the target instance may be rendered directly to obtain an image of the target object.
According to the embodiments of the present disclosure, since the number of abnormal values is generally small, a cluster-based judgment may be employed to determine whether an abnormal value exists among the target ranging values.
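The patent leaves the abnormal-value check open beyond noting that a cluster-style judgement can be used; the sketch below substitutes a median/absolute-deviation rule as one illustrative way to implement the eliminate-and-recheck loop of operations S305 and S306:

    import statistics

    def remove_abnormal_values(target_ranges, k=3.0):
        """Iteratively drop target ranging values that sit far from the bulk of the set."""
        values = list(target_ranges)
        while True:
            med = statistics.median(values)
            mad = statistics.median(abs(v - med) for v in values) or 1e-6
            kept = [v for v in values if abs(v - med) <= k * mad]
            if len(kept) == len(values):  # no abnormal value found, stop re-checking
                return kept
            values = kept

    print(remove_abnormal_values([120.1, 120.3, 119.9, 120.2, 135.7]))  # 135.7 is dropped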
According to an embodiment of the present disclosure, generating a three-dimensional image of a target object from a plurality of target ranging values may include the steps of: generating a point cloud image of a target object according to the plurality of target ranging values; and processing the point cloud image to obtain a three-dimensional image of the target object.
According to the embodiment of the disclosure, the target ranging values can be converted into point cloud data in a data space by a point cloud extraction method, and the point cloud extraction process may include, for example, numerical processing and cluster merging.
According to an embodiment of the present disclosure, each point cloud point is connected with other nearby points, so that a point cloud image of the target object may be generated.
According to the embodiment of the disclosure, the point cloud image is denoised, rendered, and otherwise processed using an electronic device such as a computer to obtain a three-dimensional image of the target object.
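As a hedged illustration of the point cloud extraction step, the sketch below assumes each target ranging value is tagged with the azimuth and elevation of the laser beam that produced it (typical for a scanning rangefinder, though not stated in the patent) and converts the values into an (N, 3) array of XYZ points ready for denoising, meshing and rendering:

    import numpy as np

    def ranges_to_point_cloud(azimuth_deg, elevation_deg, target_ranges):
        """Convert target ranging values plus their scan angles into XYZ points."""
        az = np.radians(np.asarray(azimuth_deg, dtype=float))
        el = np.radians(np.asarray(elevation_deg, dtype=float))
        r = np.asarray(target_ranges, dtype=float)

        x = r * np.cos(el) * np.cos(az)   # forward
        y = r * np.cos(el) * np.sin(az)   # left/right
        z = r * np.sin(el)                # up/down
        return np.column_stack((x, y, z))  # (N, 3) point cloud

    cloud = ranges_to_point_cloud([-0.5, 0.0, 0.5], [0.0, 0.1, 0.0], [120.2, 120.1, 120.4])
    print(cloud.shape)  # (3, 3)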
In other embodiments of the present disclosure, one laser ranging imaging process may include two laser excitation passes. In the first pass, the depth of field parameter and the plane parameter of the target object can be determined according to the received ranging values, and the imaging parameters can then be corrected using the depth of field parameter and the plane parameter, for example by adjusting the power and receiving angle of the laser rangefinder. In the second pass, a three-dimensional image may be generated with the corrected laser rangefinder, for example by the method of operations S301 to S307.
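The depth of field parameter and plane parameter are not defined precisely in the patent; the sketch below shows one plausible reading of the first pass, taking the depth of field as the spread of the point cloud along the range axis and the plane parameter as a least-squares plane fitted to the points. How these values are then mapped to laser power or receiving angle is left as an assumption:

    import numpy as np

    def estimate_correction_parameters(point_cloud):
        """Derive illustrative depth-of-field and plane parameters from a first-pass cloud."""
        pts = np.asarray(point_cloud, dtype=float)
        depth_of_field = pts[:, 0].max() - pts[:, 0].min()  # spread along the range (x) axis

        # Least-squares plane fit z = a*x + b*y + c over the first-pass points.
        A = np.column_stack((pts[:, 0], pts[:, 1], np.ones(len(pts))))
        (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
        return depth_of_field, (a, b, c)

    first_pass = np.array([[120.2, -1.0, 0.3], [120.1, 0.0, 0.5], [120.4, 1.1, 0.2], [121.0, 0.5, 1.4]])
    print(estimate_correction_parameters(first_pass))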
Fig. 4 schematically illustrates a block diagram of a laser range imaging apparatus according to an embodiment of the present disclosure.
As shown in FIG. 4, the apparatus includes a scanning module 410, a remote sensing module 420, a processing module 430, and a first generating module 440.
The scanning module 410 is configured to scan a target area with a laser beam to obtain a plurality of ranging values of the target area, where the target area includes a target object therein.
And the remote sensing module 420 is configured to obtain the position information of the target object through satellite remote sensing.
And a processing module 430, configured to determine a plurality of target ranging values for the target object from the plurality of ranging values according to the position information.
A first generating module 440, configured to generate a three-dimensional image of the target object based on the plurality of target ranging values.
According to the embodiments of the present disclosure, for the acquired ranging values of the target area, the target ranging values of the target object are selected from the ranging values using the position information of the target object, and a three-dimensional image of the target object is generated using the target ranging values, so that the influence of factors such as obstacles on imaging quality is at least partially overcome and the requirements of imaging scenarios such as close-range and long-distance imaging are met.
According to an embodiment of the present disclosure, the processing module 430 includes a first processing unit and a second processing unit.
And the first processing unit is used for acquiring a first distance value to the target object based on the position information and the current position information.
And a second processing unit for determining a plurality of target ranging values for the target object from the plurality of ranging values according to the first distance value.
According to an embodiment of the present disclosure, the second processing unit includes a first processing subunit, a second processing subunit, a third processing subunit, and a fourth processing subunit.
The first processing subunit is configured to cluster the plurality of ranging values into a plurality of ranging value point sets based on the magnitude of the plurality of ranging values.
And the second processing subunit is used for taking the ranging value of the central point of the ranging value point set as the second distance value of the ranging value point set for each ranging value point set.
And the third processing subunit is used for respectively calculating the difference value between the first distance value and the second distance value of each distance measurement value point set to obtain a plurality of third distance values.
And the fourth processing subunit is configured to determine that the plurality of ranging values included in the ranging value point set corresponding to the third distance value with the smallest absolute value are the plurality of target ranging values.
According to an embodiment of the present disclosure, the second processing unit further includes a fifth processing subunit and a sixth processing subunit.
And the fifth processing subunit is used for calculating the difference value between the ranging value and the first distance value for each ranging value.
And the sixth processing subunit is used for determining the ranging value as the target ranging value under the condition that the absolute value of the difference value is smaller than the preset threshold value.
According to an embodiment of the present disclosure, the first generating module 440 includes a first generating unit and a second generating unit.
And the first generating unit is used for generating a point cloud image of the target object according to the plurality of target ranging values.
And the second generation unit is used for processing the point cloud image to acquire a three-dimensional image of the target object.
According to an embodiment of the present disclosure, the apparatus further comprises a second generating module. The second generation module comprises a third generation unit, a fourth generation unit and a fifth generation unit.
And the third generating unit is used for acquiring the depth of field parameter and the plane parameter of the target object according to the plurality of target ranging values.
A fourth generation unit for correcting the imaging parameter based on the depth of field parameter and the plane parameter.
A fifth generating unit for generating a three-dimensional image of the target object based on the plurality of target ranging values and the corrected imaging parameters.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any number of the scanning module 410, the remote sensing module 420, the processing module 430, and the first generating module 440 may be combined in one module/unit/sub-unit for implementation, or any one of the modules/units/sub-units may be split into a plurality of modules/units/sub-units. Alternatively, at least part of the functionality of one or more of these modules/units/sub-units may be combined with at least part of the functionality of other modules/units/sub-units and implemented in one module/unit/sub-unit. According to an embodiment of the disclosure, at least one of the scanning module 410, the remote sensing module 420, the processing module 430, and the first generating module 440 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of the three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the scanning module 410, the remote sensing module 420, the processing module 430 and the first generating module 440 may be at least partially implemented as a computer program module, which when executed, may perform corresponding functions.
It should be noted that the laser ranging imaging device portion in the embodiment of the present disclosure corresponds to the laser ranging imaging method portion in the embodiment of the present disclosure, and the description of the laser ranging imaging device portion specifically refers to the laser ranging imaging method portion, which is not described herein again.
Fig. 5 schematically shows a block diagram of an electronic device suitable for implementing a laser range imaging method according to an embodiment of the present disclosure. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, an electronic device 500 according to an embodiment of the present disclosure includes a processor 501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. The processor 501 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 501 may also include onboard memory for caching purposes. Processor 501 may include a single processing unit or multiple processing units for performing different actions of a method flow according to embodiments of the disclosure.
In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are stored. The processor 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. The processor 501 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 502 and/or the RAM 503. Note that the programs may also be stored in one or more memories other than the ROM 502 and the RAM 503. The processor 501 may also perform various operations of method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the electronic device 500 may also include an input/output (I/O) interface 505, which is also connected to the bus 504. The electronic device 500 may also include one or more of the following components connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 510 as necessary, so that a computer program read out therefrom is installed into the storage section 508 as necessary.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program, when executed by the processor 501, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, a computer-readable storage medium may include ROM 502 and/or RAM 503 and/or one or more memories other than ROM 502 and RAM 503 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program that contains program code for performing the method provided by the embodiments of the present disclosure; when the computer program product is run on an electronic device, the program code causes the electronic device to implement the laser ranging imaging method provided by the embodiments of the present disclosure.
The computer program, when executed by the processor 501, performs the above-described functions defined in the system/apparatus of the embodiments of the present disclosure. The systems, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted, distributed in the form of a signal on a network medium, downloaded and installed through the communication section 509, and/or installed from the removable medium 511. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In accordance with embodiments of the present disclosure, program code for executing computer programs provided by embodiments of the present disclosure may be written in any combination of one or more programming languages, and in particular, these computer programs may be implemented using high level procedural and/or object oriented programming languages, and/or assembly/machine languages. The programming language includes, but is not limited to, programming languages such as Java, C++, Python, the "C" language, or the like. The program code may execute entirely on the user computing device, partly on the user device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations are not expressly recited in the present disclosure. In particular, various combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations are within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (10)

1. A laser range imaging method, comprising:
scanning a target area through a laser beam, and acquiring a plurality of ranging values of the target area, wherein the target area comprises a target object;
acquiring the position information of the target object through satellite remote sensing;
determining a plurality of target ranging values for the target object from the plurality of ranging values according to the location information; and
generating a three-dimensional image of the target object based on the plurality of target ranging values.
2. The method of claim 1, wherein the determining a plurality of target ranging values for the target object from the plurality of ranging values according to the location information comprises:
acquiring a first distance value to the target object based on the position information and current position information; and
determining a plurality of target ranging values for the target object from the plurality of ranging values according to the first distance value.
3. The method of claim 2, wherein said determining a plurality of target ranging values for the target object from the plurality of ranging values based on the first distance value comprises:
clustering the plurality of ranging values into a plurality of ranging value point sets based on numerical magnitudes of the plurality of ranging values;
for each ranging value point set, taking a ranging value of a central point of the ranging value point set as a second distance value of the ranging value point set;
calculating the difference between the first distance value and the second distance value of each ranging value point set to obtain a plurality of third distance values; and
determining the plurality of ranging values contained in the ranging value point set corresponding to the third distance value with the smallest absolute value as the plurality of target ranging values.
4. The method of claim 2, wherein said determining a plurality of target ranging values for the target object from the plurality of ranging values based on the first distance value comprises:
for each of the ranging values, calculating a difference between the ranging value and the first distance value; and
determining the ranging value as the target ranging value under the condition that the absolute value of the difference is smaller than a preset threshold value.
5. The method of claim 1, wherein the generating a three-dimensional image of the target object based on the plurality of target ranging values comprises:
generating a point cloud image of the target object according to the plurality of target ranging values; and
processing the point cloud image to acquire a three-dimensional image of the target object.
6. The method of claim 1, further comprising:
acquiring a depth of field parameter and a plane parameter of the target object according to the plurality of target ranging values;
correcting imaging parameters based on the depth of field parameter and the plane parameter; and
generating a three-dimensional image of the target object based on the plurality of target ranging values and the corrected imaging parameters.
7. A laser range imaging apparatus comprising:
the scanning module is used for scanning a target area through a laser beam and acquiring a plurality of ranging values of the target area, wherein the target area comprises a target object;
the remote sensing module is used for acquiring the position information of the target object through satellite remote sensing;
a processing module configured to determine a plurality of target ranging values for the target object from the plurality of ranging values according to the location information; and
a first generating module for generating a three-dimensional image of the target object based on the plurality of target ranging values.
8. An electronic device, comprising:
one or more processors;
a memory to store one or more instructions that,
wherein the one or more instructions, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
9. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 6.
10. A computer program product comprising computer executable instructions for implementing the method of any one of claims 1 to 6 when executed.
CN202110682489.5A 2021-06-18 2021-06-18 Laser ranging imaging method and device, electronic equipment and storage medium Pending CN113393508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110682489.5A CN113393508A (en) 2021-06-18 2021-06-18 Laser ranging imaging method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110682489.5A CN113393508A (en) 2021-06-18 2021-06-18 Laser ranging imaging method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113393508A (en) 2021-09-14

Family

ID=77623054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110682489.5A Pending CN113393508A (en) 2021-06-18 2021-06-18 Laser ranging imaging method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113393508A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6249600B1 (en) * 1997-11-07 2001-06-19 The Trustees Of Columbia University In The City Of New York System and method for generation of a three-dimensional solid model
US20090220145A1 (en) * 2008-02-28 2009-09-03 Kabushiki Kaisha Topcon Target and three-dimensional-shape measurement device using the same
US20140336935A1 (en) * 2013-05-07 2014-11-13 Google Inc. Methods and Systems for Detecting Weather Conditions Using Vehicle Onboard Sensors
US20170345311A1 (en) * 2016-05-24 2017-11-30 Kabushiki Kaisha Toshiba Information processing apparatus and information processing method
US20190391270A1 (en) * 2018-06-25 2019-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mitigating effects of high-reflectivity objects in lidar data
CN110460758A (en) * 2019-08-28 2019-11-15 上海集成电路研发中心有限公司 A kind of imaging device and imaging method based on laser ranging point identification
WO2021058016A1 (en) * 2019-09-29 2021-04-01 睿镞科技(北京)有限责任公司 Laser radar and method for generating laser point cloud data

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115900644A (en) * 2023-01-06 2023-04-04 西安华创马科智能控制系统有限公司 Hydraulic support robot working face bottom plate laser scanning imaging method and device

Similar Documents

Publication Publication Date Title
CN109188457B (en) Object detection frame generation method, device, equipment, storage medium and vehicle
EP3627180A1 (en) Sensor calibration method and device, computer device, medium, and vehicle
CN114550177B (en) Image processing method, text recognition method and device
CN109118456B (en) Image processing method and device
CN115817463B (en) Vehicle obstacle avoidance method, device, electronic equipment and computer readable medium
CN113607185A (en) Lane line information display method, lane line information display device, electronic device, and computer-readable medium
CN114240805B (en) Multi-angle SAR dynamic imaging detection method and device
WO2019188509A1 (en) Radar image processing device, radar image processing method, and storage medium
CN113393508A (en) Laser ranging imaging method and device, electronic equipment and storage medium
CN113177497B (en) Training method of visual model, vehicle identification method and device
US20230029628A1 (en) Data processing method for vehicle, electronic device, and medium
CN109839645B (en) Speed detection method, system, electronic device and computer readable medium
CN114782574A (en) Image generation method, face recognition device, electronic equipment and medium
CN115511870A (en) Object detection method and device, electronic equipment and storage medium
CN111383337B (en) Method and device for identifying objects
CN109859254B (en) Method and device for sending information in automatic driving
KR20180097004A (en) Method of position calculation between radar target lists and vision image ROI
CN109613553B (en) Method, device and system for determining number of objects in scene based on laser radar
CN110389349B (en) Positioning method and device
US20230162383A1 (en) Method of processing image, device, and storage medium
US20230070349A1 (en) Positioning methods and cloud device
CN115829898B (en) Data processing method, device, electronic equipment, medium and automatic driving vehicle
CN114092874B (en) Training method of target detection model, target detection method and related equipment thereof
CN113628208B (en) Ship detection method, device, electronic equipment and computer readable medium
CN115294234B (en) Image generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination