CN115049810A - Coloring method, device and equipment for solid-state laser radar point cloud and storage medium - Google Patents


Info

Publication number: CN115049810A
Application number: CN202210657501.1A
Authority: CN (China)
Prior art keywords: target, point, solid, camera, points
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 王境淇, 童天辰
Assignee (current and original): Shitu Technology Hangzhou Co ltd

Classifications

    • G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts (G Physics › G06 Computing; calculating or counting › G06T Image data processing or generation, in general › G06T19/00 Manipulating 3D models or images for computer graphics)
    • G06T7/90 — Determination of colour characteristics (under G06T7/00 Image analysis)
    • G06T2207/10028 — Range image; depth image; 3D point clouds (under G06T2207/10 Image acquisition modality)
    • G06T2207/30248 — Vehicle exterior or interior (under G06T2207/30 Subject of image; context of image processing)
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention discloses a coloring method, device, equipment and storage medium for a solid-state laser radar point cloud. The method comprises: acquiring a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene; acquiring the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera; acquiring the target projection points falling within the target image, together with the pixel point range corresponding to each target projection point in the target image; and coloring the point cloud point associated with each target projection point using the color feature values of the target pixel points within that pixel point range. The technical solution of the embodiments of the invention provides a new method by which solid-state laser radar point clouds can be colored effectively.

Description

Coloring method, device and equipment for solid-state laser radar point cloud and storage medium
Technical Field
The invention relates to the technical field of point cloud coloring, and in particular to a coloring method, device, equipment and storage medium for a solid-state laser radar point cloud.
Background
With the development of autonomous driving technology, the requirements on mapping and perception are increasingly high. The laser radars currently used for mapping, localization and perception have accurate distance information but no color information, and this lack of color seriously affects perception tasks, so a camera image is needed to color the point cloud. Most existing schemes target 360° mechanical radars and obtain the point cloud colors by back-projection through the external parameter relation between the camera and the radar, so they are not suitable for solid-state radars.
Disclosure of Invention
The invention provides a coloring method, device, equipment and storage medium for a solid-state laser radar point cloud, offering a new method by which solid-state laser radar point clouds can be colored effectively.
According to one aspect of the invention, a coloring method for a solid-state lidar point cloud is provided, the method comprising the following steps:
acquiring a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene;
acquiring the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera;
acquiring the target projection points falling within the target image, and acquiring the pixel point range corresponding to each target projection point in the target image;
and coloring the point cloud point associated with each target projection point using the color feature values of the target pixel points within the corresponding pixel point range.
According to another aspect of the present invention, there is provided an apparatus for coloring a solid-state lidar point cloud, the apparatus comprising:
a target image and target three-dimensional point cloud acquisition module, configured to acquire a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene;
a projection point acquisition module, configured to acquire the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera;
a pixel point range acquisition module, configured to acquire the target projection points falling within the target image, and the pixel point range corresponding to each target projection point in the target image;
and a point cloud coloring module, configured to color the point cloud point associated with each target projection point using the color feature values of the target pixel points within the pixel point range.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of coloring a solid state lidar point cloud of any embodiment of the invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement the coloring method of solid-state lidar point cloud according to any of the embodiments of the present invention when executed.
According to the technical solution of the embodiments of the invention, a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene are obtained; the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image is obtained according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera; the target projection points falling within the target image are obtained, together with the pixel point range corresponding to each target projection point in the target image; and the point cloud point associated with each target projection point is colored using the color feature values of the target pixel points within that pixel point range. This provides a new method for coloring solid-state laser radar point clouds, with which such point clouds can be colored effectively.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the description below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a coloring method for a solid-state lidar point cloud according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a coloring apparatus for solid-state lidar point cloud according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device of a coloring method for a solid-state lidar point cloud according to a third embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a coloring method for a solid-state lidar point cloud according to an embodiment of the present invention. The embodiment is applicable to coloring a point cloud collected by a solid-state lidar sensor. The method may be executed by a coloring apparatus for a solid-state lidar point cloud, which may be implemented in the form of hardware and/or software and may be configured in a processor or a server. As shown in fig. 1, the method includes:
and S110, acquiring a target image and a target three-dimensional point cloud which are respectively acquired by a camera and a solid-state laser radar sensor aiming at a target scene.
The target scene may be a small-scale scene that can be fully photographed and scanned by the camera and the solid-state laser radar sensor. The target image may refer to the image the camera captures of the target scene. The target three-dimensional point cloud may refer to the point cloud obtained by the solid-state laser radar scanning the target scene.
In this embodiment, an image and a three-dimensional point cloud acquired by a camera and a solid-state lidar sensor, respectively, for a target scene may be acquired.
And S120, acquiring the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera.
Where the external parameters between the camera and the solid-state lidar sensor may refer to a matrix composed of a rotation matrix R and a translation vector t, of the form

    T = [ R  t ]
        [ 0  1 ]

The camera's internal parameters may refer to an internal parameter matrix, for example

    K = [ f_x  0    C_x ]
        [ 0    f_y  C_y ]
        [ 0    0    1   ]

wherein f_x and f_y are the focal lengths in the x and y directions, respectively, and C_x and C_y are the vertical and horizontal offsets (unit: pixels) of the image origin relative to the optical-center imaging point. The projection points may be the points obtained by projecting each point cloud point onto the image plane of the target image.
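As an illustration of the matrix forms above, the external parameter matrix [R | t] and the internal parameter matrix K can be assembled as follows. This is a minimal sketch: the numeric calibration values are hypothetical placeholders, and in practice both matrices come from the calibration step described later.

```python
import numpy as np

# Hypothetical calibration values for illustration only; real values are
# produced by calibrating the camera / solid-state lidar pair.
R = np.eye(3)                        # rotation from lidar frame to camera frame
t = np.array([[0.1], [0.0], [0.2]])  # translation (metres), assumed

# External parameter matrix [R | t]: maps lidar coordinates into the camera frame.
T = np.hstack([R, t])                # shape (3, 4)

# Internal parameter matrix with focal lengths f_x, f_y and offsets C_x, C_y.
fx, fy, cx, cy = 800.0, 800.0, 128.0, 256.0   # assumed values, in pixels
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
```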
Optionally, before acquiring the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state lidar sensor and the internal parameters of the camera, the method may further include:
triggering the camera and the solid-state lidar sensor to initialize, so as to achieve internal parameter calibration and distortion removal; and calculating the external parameters between the camera and the solid-state lidar sensor according to the position relation and the attitude relation between the two.
Here, distortion may refer to the radial and tangential distortion of the camera. The position relation and the attitude relation refer, respectively, to the relative position and the relative attitude of the camera and the solid-state lidar sensor.
In this embodiment, the camera and the solid-state lidar sensor may specifically be triggered to initialize, so as to achieve internal parameter calibration and distortion removal; and the external parameters between the camera and the solid-state lidar sensor are then calculated according to their relative position and attitude relations.
In an optional implementation of this embodiment, acquiring the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state lidar sensor and the internal parameters of the camera may include:
acquiring the converted three-dimensional point of each point cloud point in the target three-dimensional point cloud in the camera coordinate system according to the external parameters between the camera and the solid-state lidar sensor; and acquiring the projection point of each converted three-dimensional point on the image plane of the target image according to the internal parameters of the camera.
The camera coordinate system may be the coordinate system whose origin is the optical center of the camera and whose z-axis is the optical axis. A converted three-dimensional point may be the three-dimensional point obtained by multiplying the coordinates of a point cloud point in the solid-state lidar sensor coordinate system by the external parameter matrix between the camera and the sensor. A projection point may be the two-dimensional point obtained by multiplying the coordinates of the converted three-dimensional point by the internal parameter matrix of the camera, followed by the usual division by depth.
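The two-step projection just described (lidar frame to camera frame via the external parameters, then camera frame to image plane via the internal parameters) can be sketched as follows. The function name and the sample values in the usage note are illustrative assumptions, not part of the patent.

```python
import numpy as np

def project_points(points_lidar, K, T):
    """Project N x 3 lidar points to pixel coordinates.

    T is the 3 x 4 external parameter matrix [R | t]; K is the 3 x 3
    internal parameter matrix. Returns N x 2 float pixel coordinates
    and the depths in the camera frame.
    """
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])  # N x 4 homogeneous
    cam = (T @ homo.T).T                               # converted three-dimensional points
    uv = (K @ cam.T).T                                 # un-normalised image coordinates
    depth = uv[:, 2]
    return uv[:, :2] / depth[:, None], depth           # perspective division
```

For example, with K = [[2, 0, 10], [0, 2, 20], [0, 0, 1]] and T = [I | 0], the lidar point (1, 2, 5) projects to the floating-point pixel (10.4, 20.8) at depth 5.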
S130, acquiring target projection points falling into the target image, and acquiring pixel point ranges corresponding to the target projection points in the target image.
The target projection points may be the subset of all projection points whose coordinate values lie within the coordinate range of the target image.
In an optional implementation of this embodiment, acquiring the target projection points falling within the target image may include: matching the coordinate value of each projection point against the coordinate range of the target image; and determining the projection points lying within the coordinate range of the target image as target projection points.
For example, if the abscissa range of the target image is (0, 256) and the ordinate range is (0, 512), a projection point with coordinate value (-1, 300) does not fall within the coordinate range of the target image.
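The bounds check in the example above amounts to a simple mask over the projected coordinates. This is a sketch; the helper name is an assumption.

```python
import numpy as np

def points_in_image(pixels, width, height):
    """Boolean mask of projected points that fall inside the image,
    i.e. 0 <= u < width and 0 <= v < height."""
    u, v = pixels[:, 0], pixels[:, 1]
    return (u >= 0) & (u < width) & (v >= 0) & (v < height)
```

With the 256 x 512 image from the example, the projection point (-1, 300) is rejected while a point such as (10, 20) is kept.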
The coordinate values of a target projection point may be floating-point numbers, whereas each pixel point in the target image has an integer coordinate value, so the two may not correspond exactly. Therefore, in order to color the three-dimensional point cloud point corresponding to a target projection point effectively, a pixel point range matching the target projection point can be determined among the pixel points of the target image for each target projection point whose coordinate values are floating-point numbers.
In this embodiment, the pixel point range corresponding to a target projection point can be found in the target image according to the coordinate value of the target projection point and the coordinate values of the pixel points in the target image.
And S140, coloring the point cloud point associated with the target projection point by using the color characteristic value of each target pixel point in the pixel point range.
The target pixel point may refer to each pixel point within the pixel point range. The color feature value may refer to an RGB value.
In an optional implementation of this embodiment, a weight matching each target pixel point may be calculated according to the coordinate value of each target pixel point within the pixel point range; the color feature values of the target pixel points are weighted and summed according to these weights to obtain a target color feature value; and the point cloud point associated with the target projection point is colored according to the target color feature value.
Optionally, calculating the weight matching each target pixel point according to its coordinate value may include: calculating the distances between each target pixel point and the currently processed target projection point according to the coordinate value of each target pixel point and that of the currently processed target projection point; and calculating the weight matching each target pixel point according to these distances.
It should be noted that the closer a target pixel point is to the currently processed target projection point, the higher its matching weight.
Illustratively, if the pixel point range includes pixel point 1, pixel point 2 and pixel point 3, whose distances to the currently processed target projection point are d1, d2 and d3 respectively, then the weights of pixel point 1, pixel point 2 and pixel point 3 are 1-d1%, 1-d2% and 1-d3% respectively.
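One way to realise the distance-based weighting described above is normalised inverse-distance weighting, which satisfies the stated requirement that nearer pixels receive higher weights. This particular formula is an illustrative choice, not the patent's exact 1-d% scheme, and the function name is assumed.

```python
import numpy as np

def weighted_color(proj_uv, neighbor_uv, neighbor_rgb, eps=1e-6):
    """Blend the RGB values of the pixels in the pixel point range.

    Weights are normalised inverse distances to the projected point,
    so closer pixels contribute more to the target color feature value.
    """
    d = np.linalg.norm(neighbor_uv - proj_uv, axis=1)  # distances to projection
    w = 1.0 / (d + eps)                                # closer -> larger weight
    w /= w.sum()                                       # weights sum to 1
    return (w[:, None] * neighbor_rgb).sum(axis=0)     # weighted RGB sum
```

For instance, two equidistant neighbours with colors (0, 0, 0) and (255, 255, 255) blend to roughly (127.5, 127.5, 127.5).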
According to the technical solution of the embodiments of the invention, a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene are obtained; the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image is obtained according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera; the target projection points falling within the target image are obtained, together with the pixel point range corresponding to each target projection point in the target image; and the point cloud point associated with each target projection point is colored using the color feature values of the target pixel points within that pixel point range. This provides a new method for coloring solid-state laser radar point clouds, with which such point clouds can be colored effectively.
Example two
Fig. 2 is a schematic structural diagram of a coloring device for solid-state lidar point cloud according to a second embodiment of the present invention. As shown in fig. 2, the apparatus includes: an acquisition module 210 for a target image and a target three-dimensional point cloud, a projection point acquisition module 220, a pixel point range acquisition module 230, and a point cloud coloring module 240. Wherein:
a target image and target three-dimensional point cloud obtaining module 210, configured to obtain a target image and a target three-dimensional point cloud, which are respectively collected by a camera and a solid-state lidar sensor for a target scene;
a projection point obtaining module 220, configured to obtain the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image according to the external parameters between the camera and the solid-state lidar sensor and the internal parameters of the camera;
a pixel point range obtaining module 230, configured to obtain target projection points falling in a target image, and obtain pixel point ranges corresponding to the target projection points in the target image;
and the point cloud coloring module 240 is configured to color the point cloud point associated with the target projection point by using the color feature value of each target pixel point in the pixel point range.
According to the technical solution of the embodiments of the invention, a target image and a target three-dimensional point cloud acquired by a camera and a solid-state laser radar sensor, respectively, for a target scene are obtained; the projection point of each point cloud point in the target three-dimensional point cloud on the image plane of the target image is obtained according to the external parameters between the camera and the solid-state laser radar sensor and the internal parameters of the camera; the target projection points falling within the target image are obtained, together with the pixel point range corresponding to each target projection point in the target image; and the point cloud point associated with each target projection point is colored using the color feature values of the target pixel points within that pixel point range. This provides a new method for coloring solid-state laser radar point clouds, with which such point clouds can be colored effectively.
Optionally, the system further comprises an internal reference calibration and external reference calculation module, configured to, before obtaining projection points of cloud points of each point in the target three-dimensional point cloud in an image plane where the target image is located according to external references between the camera and the solid-state lidar sensor and the internal references of the camera:
initializing a trigger camera and a solid-state laser radar sensor to realize internal parameter calibration and distortion removal;
and calculating external parameters between the camera and the solid-state laser radar sensor according to the position relation and the attitude relation between the camera and the solid-state laser radar sensor.
Optionally, the projection point obtaining module 220 may be specifically configured to:
acquiring the converted three-dimensional point of each point cloud point in the target three-dimensional point cloud in the camera coordinate system according to the external parameters between the camera and the solid-state laser radar sensor;
and acquiring projection points of the converted three-dimensional points on an image plane where the target image is located according to internal parameters of the camera.
Optionally, the pixel point range obtaining module 230 may be specifically configured to:
matching the coordinate value of each projection point with the coordinate range of the target image;
and determining the projection point positioned in the coordinate range of the target image as a target projection point.
Optionally, the point cloud coloring module 240 includes:
the weighted value calculating unit is used for calculating the weighted value matched with each pixel point according to the coordinate value of each target pixel point in the pixel point range;
the target color characteristic value obtaining unit is used for weighting and summing the color characteristic values of the target pixel points according to the weight values to obtain target color characteristic values;
and the point cloud point coloring unit is used for coloring the point cloud points associated with the target projection points according to the target color characteristic values.
Optionally, the weight value calculating unit may be specifically configured to:
respectively calculating a plurality of distances between each target pixel point and the currently processed target projection point according to the coordinate value of each target pixel point and the coordinate value of the currently processed target projection point;
and calculating the weight value matched with each target pixel point according to the plurality of distances.
The coloring device for the solid-state laser radar point cloud provided by the embodiment of the invention can execute the coloring method for the solid-state laser radar point cloud provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE III
FIG. 3 illustrates a schematic diagram of an electronic device 300 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 3, the electronic device 300 includes at least one processor 301 and a memory communicatively connected to the at least one processor 301, such as a read-only memory (ROM) 302 and a random access memory (RAM) 303. The memory stores a computer program executable by the at least one processor; the processor 301 may perform various suitable actions and processes according to the computer program stored in the ROM 302 or the computer program loaded from the storage unit 308 into the RAM 303. The RAM 303 may also store various programs and data necessary for the operation of the electronic device 300. The processor 301, the ROM 302 and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
A number of components in the electronic device 300 are connected to the I/O interface 305, including: an input unit 306 such as a keyboard, a mouse, or the like; an output unit 307 such as various types of displays, speakers, and the like; a storage unit 308 such as a magnetic disk, optical disk, or the like; and a communication unit 309 such as a network card, modem, wireless communication transceiver, etc. The communication unit 309 allows the electronic device 300 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The processor 301 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of processor 301 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. Processor 301 performs the various methods and processes described above, such as the coloring method of the solid state lidar point cloud.
In some embodiments, the method of coloring a solid-state lidar point cloud may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 308. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 300 via the ROM 302 and/or the communication unit 309. When the computer program is loaded into RAM 303 and executed by processor 301, one or more steps of the method for coloring a solid state lidar point cloud described above may be performed. Alternatively, in other embodiments, the processor 301 may be configured to perform the coloring method of the solid state lidar point cloud by any other suitable means (e.g., by way of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program may execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer-readable storage medium may be a machine-readable signal medium. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The client-server relationship arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and poor service scalability found in traditional physical hosts and VPS services.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders; no limitation is imposed herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for coloring a solid-state lidar point cloud, characterized by comprising the following steps:
acquiring a target image and a target three-dimensional point cloud collected for a target scene by a camera and a solid-state lidar sensor, respectively;
obtaining, according to extrinsic parameters between the camera and the solid-state lidar sensor and intrinsic parameters of the camera, a projection point of each point cloud point in the target three-dimensional point cloud on the image plane where the target image is located;
determining target projection points that fall within the target image, and obtaining a pixel point range corresponding to each target projection point in the target image;
and coloring the point cloud point associated with each target projection point using the color feature values of the target pixel points within the pixel point range.
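The four steps of claim 1 can be sketched end to end as follows. This is an illustrative NumPy sketch, not the patented implementation: the intrinsic matrix `K`, the extrinsic transform `T`, and the nearest-pixel sampling (which stands in for the pixel-range weighting of the later claims) are all assumptions for demonstration.

```python
import numpy as np

def colorize_point_cloud(points, image, K, T):
    """Sketch of the claimed method: project lidar points into the image
    plane via extrinsics T and intrinsics K, keep the points that land
    inside the image, and color them (nearest-pixel sampling)."""
    n = points.shape[0]
    homo = np.hstack([points, np.ones((n, 1))])        # homogeneous coords
    pts_cam = (T @ homo.T).T[:, :3]                    # lidar -> camera frame
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                        # perspective divide
    h, w = image.shape[:2]
    px = np.round(uv).astype(int)
    inside = (px[:, 0] >= 0) & (px[:, 0] < w) & \
             (px[:, 1] >= 0) & (px[:, 1] < h) & (pts_cam[:, 2] > 0)
    colors = np.zeros((n, 3), dtype=image.dtype)
    colors[inside] = image[px[inside, 1], px[inside, 0]]  # row = v, col = u
    return colors, inside

# Hypothetical intrinsics and a uniform gray test image
K = np.array([[100., 0., 32.], [0., 100., 24.], [0., 0., 1.]])
img = np.full((48, 64, 3), 200, dtype=np.uint8)
pts = np.array([[0., 0., 2.], [10., 0., 2.]])
colors, inside = colorize_point_cloud(pts, img, K, np.eye(4))
# The first point projects to the principal point and is colored;
# the second projects outside the 64x48 image and is left uncolored
```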
2. The method according to claim 1, further comprising, before obtaining the projection points of the point cloud points in the target three-dimensional point cloud on the image plane where the target image is located according to the extrinsic parameters between the camera and the solid-state lidar sensor and the intrinsic parameters of the camera:
initializing and triggering the camera and the solid-state lidar sensor to perform intrinsic parameter calibration and distortion removal;
and calculating the extrinsic parameters between the camera and the solid-state lidar sensor according to the positional relationship and attitude relationship between them.
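The extrinsic calculation of claim 2 amounts to assembling a rigid transform from the two relations it names. A minimal sketch, assuming the attitude relation is given as a 3x3 rotation matrix and the position relation as a translation vector (the mounting offset used below is hypothetical):

```python
import numpy as np

def build_extrinsics(R, t):
    """Assemble the 4x4 lidar-to-camera transform from the attitude
    relation (rotation R) and the position relation (translation t)."""
    T = np.eye(4)
    T[:3, :3] = R   # attitude relation
    T[:3, 3] = t    # position relation
    return T

# Hypothetical mounting: lidar axes aligned with the camera,
# offset 0.1 m along the camera's optical (z) axis
T = build_extrinsics(np.eye(3), np.array([0.0, 0.0, 0.1]))
```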
3. The method according to claim 1, wherein obtaining the projection points of the point cloud points in the target three-dimensional point cloud on the image plane where the target image is located according to the extrinsic parameters between the camera and the solid-state lidar sensor and the intrinsic parameters of the camera comprises:
obtaining, according to the extrinsic parameters between the camera and the solid-state lidar sensor, a converted three-dimensional point in the camera coordinate system for each point cloud point in the target three-dimensional point cloud;
and obtaining the projection points of the converted three-dimensional points on the image plane where the target image is located according to the intrinsic parameters of the camera.
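The two sub-steps of claim 3 can be sketched as follows, assuming a standard pinhole camera model; the intrinsic values below are hypothetical, and lens distortion is assumed to have been removed already (as in claim 2).

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Step 1: convert lidar points to the camera coordinate system using
    the extrinsic transform. Step 2: project the converted 3D points onto
    the image plane using the intrinsic matrix (pinhole model)."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])
    pts_cam = (T_cam_lidar @ homo.T).T[:, :3]   # converted 3D points
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                 # perspective divide
    return uv, pts_cam[:, 2]                    # pixel coords and depth

K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
uv, depth = project_points(np.array([[0., 0., 2.0]]), np.eye(4), K)
# A point on the optical axis projects to the principal point (320, 240)
```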
4. The method of claim 1, wherein determining the target projection points that fall within the target image comprises:
matching the coordinate value of each projection point against the coordinate range of the target image;
and determining the projection points located within the coordinate range of the target image as target projection points.
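The coordinate-range matching of claim 4 is a simple bounds test. A sketch, with the extra depth check (keeping only points in front of the camera) added as an assumption beyond the claim text:

```python
import numpy as np

def target_projection_points(uv, depth, width, height):
    """Match each projection point's coordinates against the image's
    coordinate range; points inside it (and, as an added assumption,
    in front of the camera) are the target projection points."""
    return (uv[:, 0] >= 0) & (uv[:, 0] < width) & \
           (uv[:, 1] >= 0) & (uv[:, 1] < height) & (depth > 0)

uv = np.array([[100., 50.], [-5., 20.], [700., 100.]])
depth = np.array([2.0, 3.0, 1.0])
mask = target_projection_points(uv, depth, 640, 480)
# Only the first point lies inside the 640x480 image
```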
5. The method of claim 1, wherein coloring the point cloud point associated with each target projection point using the color feature values of the target pixel points within the pixel point range comprises:
calculating a weight value matching each target pixel point according to the coordinate value of each target pixel point within the pixel point range;
computing a weighted sum of the color feature values of the target pixel points according to the weight values to obtain a target color feature value;
and coloring the point cloud point associated with the target projection point according to the target color feature value.
6. The method according to claim 5, wherein calculating a weight value matching each target pixel point according to the coordinate value of each target pixel point within the pixel point range comprises:
calculating the distance between each target pixel point and the currently processed target projection point according to their respective coordinate values;
and calculating the weight value matching each target pixel point according to these distances.
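Claims 5 and 6 describe a distance-based weighted color blend but leave the weight function open. The sketch below assumes inverse-distance weighting (one plausible choice, not stated in the patent), with a small epsilon to guard against a zero distance:

```python
import numpy as np

def weighted_color(proj_uv, pixel_coords, pixel_colors, eps=1e-6):
    """Weight each target pixel in the pixel point range by its inverse
    distance to the target projection point, then take the weighted sum
    of color feature values to get the target color feature value."""
    dists = np.linalg.norm(pixel_coords - proj_uv, axis=1)
    weights = 1.0 / (dists + eps)
    weights /= weights.sum()        # normalize so weights sum to 1
    return weights @ pixel_colors

# 2x2 pixel range around a projection point at (10.5, 20.5)
coords = np.array([[10., 20.], [11., 20.], [10., 21.], [11., 21.]])
colors = np.array([[200., 0., 0.], [200., 0., 0.],
                   [100., 0., 0.], [100., 0., 0.]])
c = weighted_color(np.array([10.5, 20.5]), coords, colors)
# Equal distances -> plain average: red channel (200+200+100+100)/4 = 150
```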
7. A coloring device for a solid-state lidar point cloud, characterized by comprising:
an acquisition module, configured to acquire a target image and a target three-dimensional point cloud collected for a target scene by a camera and a solid-state lidar sensor, respectively;
a projection point acquisition module, configured to obtain, according to extrinsic parameters between the camera and the solid-state lidar sensor and intrinsic parameters of the camera, a projection point of each point cloud point in the target three-dimensional point cloud on the image plane where the target image is located;
a pixel point range acquisition module, configured to determine target projection points that fall within the target image and obtain a pixel point range corresponding to each target projection point in the target image;
and a point cloud coloring module, configured to color the point cloud point associated with each target projection point using the color feature values of the target pixel points within the pixel point range.
8. The device of claim 7, further comprising an intrinsic and extrinsic parameter calculation module, configured to, before the projection points are obtained according to the extrinsic parameters between the camera and the solid-state lidar sensor and the intrinsic parameters of the camera:
initialize and trigger the camera and the solid-state lidar sensor to perform intrinsic parameter calibration and distortion removal;
and calculate the extrinsic parameters between the camera and the solid-state lidar sensor according to the positional relationship and attitude relationship between them.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method for coloring a solid-state lidar point cloud of any one of claims 1-6.
10. A computer-readable storage medium storing computer instructions for causing a processor to execute the method for coloring a solid-state lidar point cloud of any one of claims 1-6.
CN202210657501.1A 2022-06-10 2022-06-10 Coloring method, device and equipment for solid-state laser radar point cloud and storage medium Pending CN115049810A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210657501.1A CN115049810A (en) 2022-06-10 2022-06-10 Coloring method, device and equipment for solid-state laser radar point cloud and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210657501.1A CN115049810A (en) 2022-06-10 2022-06-10 Coloring method, device and equipment for solid-state laser radar point cloud and storage medium

Publications (1)

Publication Number Publication Date
CN115049810A true CN115049810A (en) 2022-09-13

Family

ID=83161636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210657501.1A Pending CN115049810A (en) 2022-06-10 2022-06-10 Coloring method, device and equipment for solid-state laser radar point cloud and storage medium

Country Status (1)

Country Link
CN (1) CN115049810A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116912451A (en) * 2022-09-20 2023-10-20 梅卡曼德(北京)机器人科技有限公司 Point cloud image acquisition method, device, equipment and storage medium
CN116912451B (en) * 2022-09-20 2024-05-07 梅卡曼德(北京)机器人科技有限公司 Point cloud image acquisition method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112966587B (en) Training method of target detection model, target detection method and related equipment
CN113095336B (en) Method for training key point detection model and method for detecting key points of target object
CN114186632B (en) Method, device, equipment and storage medium for training key point detection model
CN112597837A (en) Image detection method, apparatus, device, storage medium and computer program product
CN115457152A (en) External parameter calibration method and device, electronic equipment and storage medium
CN115273071A (en) Object identification method and device, electronic equipment and storage medium
CN116661477A (en) Substation unmanned aerial vehicle inspection method, device, equipment and storage medium
CN115049810A (en) Coloring method, device and equipment for solid-state laser radar point cloud and storage medium
CN117078767A (en) Laser radar and camera calibration method and device, electronic equipment and storage medium
CN114734444B (en) Target positioning method and device, electronic equipment and storage medium
CN114596362B (en) High-point camera coordinate calculation method and device, electronic equipment and medium
CN115311624A (en) Slope displacement monitoring method and device, electronic equipment and storage medium
CN112241977A (en) Depth estimation method and device for feature points
CN115100296A (en) Photovoltaic module fault positioning method, device, equipment and storage medium
CN114910892A (en) Laser radar calibration method and device, electronic equipment and storage medium
CN116258714B (en) Defect identification method and device, electronic equipment and storage medium
CN116182807B (en) Gesture information determining method, device, electronic equipment, system and medium
CN117746069B (en) Graph searching model training method and graph searching method
CN117576395A (en) Point cloud semantic segmentation method and device, electronic equipment and storage medium
CN116721308A (en) Training method of centroid labeling model and training method of object segmentation model
CN115953469A (en) Positioning method and device based on single and binocular vision, electronic equipment and storage medium
CN117830197A (en) Pupil determination method, device, equipment and storage medium
CN117968624A (en) Binocular camera ranging method, device, equipment and storage medium
CN114757845A (en) Light ray adjusting method and device based on face recognition, electronic equipment and medium
CN117444970A (en) Mechanical arm movement control method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination