CN117173324A - Point cloud coloring method, system, terminal and storage medium


Info

Publication number: CN117173324A
Application number: CN202310094604.6A
Authority: CN
Original language: Chinese (zh)
Inventors: 杜渊洋, 余崇圣, 许家铭, 陈方圆, 潘一聪, 李浏阳, 杨秀义
Applicant/Assignee: Suzhou Zhizhi Technology Co ltd
Priority and filing date: 2023-02-10
Publication date: 2023-12-05
Family ID: 88941859
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides a point cloud coloring method, system, terminal and storage medium. The method comprises: acquiring camera parameters and captured images of a camera, and determining coloring sample images among the captured images according to the camera parameters; acquiring the camera position corresponding to each coloring sample image, and determining the point cloud to be colored for each coloring sample image according to the camera position; performing pixel projection for each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and obtaining state parameters of each point cloud point; and updating the state parameters of each point cloud point according to the candidate colors, and coloring the corresponding point cloud point according to the updated state parameters. Because the state parameters of each point cloud point are updated from the candidate colors of successive coloring sample images, the color information of the images is continuously assigned to the point cloud to be colored, color discontinuities on the point cloud surface are reduced, and the accuracy of point cloud coloring is improved.

Description

Point cloud coloring method, system, terminal and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a point cloud coloring method, a system, a terminal, and a storage medium.
Background
In a lidar-camera fusion system, the color image information generated by the camera may be used to color the measured three-dimensional point cloud. Unlike fixed-point three-dimensional mapping systems, a mobile lidar-camera fusion system is characterized in that, during operation, the camera continuously captures images at a relatively high frequency and revisits the same regions of space repeatedly, which places higher demands on point cloud coloring.
In existing point cloud coloring processes, each point is generally colored with the color from the first image frame in which it appears. However, due to changes of objects and illumination in the environment, different shooting angles of the same location, and similar factors, the same object or surface may have different colors in different frames. Coloring from a single frame therefore easily causes obvious color discontinuities on the colored point cloud surface and reduces the accuracy of point cloud coloring.
Disclosure of Invention
Embodiments of the application aim to provide a point cloud coloring method, system, terminal and storage medium that address the low coloring accuracy of existing point cloud coloring processes.
An embodiment of the application provides a point cloud coloring method comprising the following steps:
acquiring camera parameters and captured images of a camera, and determining coloring sample images among the captured images according to the camera parameters;
acquiring the camera position corresponding to each coloring sample image, and determining the point cloud to be colored for each coloring sample image according to the camera position;
performing pixel projection for each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and obtaining state parameters of each point cloud point, wherein the state parameters represent the color information of each point cloud point;
and updating the state parameters of each point cloud point according to the candidate colors, and coloring the corresponding point cloud point according to the updated state parameters.
Preferably, the determining of coloring sample images among the captured images according to the camera parameters comprises:
calculating, according to the camera parameters, the camera rotation angular velocity corresponding to each image frame in the captured images, and determining the image blur degree of each image frame according to the camera rotation angular velocity;
and determining the coloring sample images among the captured images according to the image blur degree.
Preferably, the formulas used for updating the state parameters of each point cloud point according to the candidate colors include:
V_{1,n} = V_{1,n-1} + p_n
V_{2,n} = V_{2,n-1} + p_n^2
d_n = d_{n-1} + p_n (c_n - C_{0,n-1})(c_n - C_{0,n})
σ^2 = d / V_1
s^2 = d / (V_1 - V_2 / V_1)
w_n = w_{n-1} + p_n f(c_n; C_0, σ^2)
wherein the state parameters are (C, w, d, V_1, C_0, V_2); p_n is the weight of the n-th candidate color received by the current point cloud point; c_n is that candidate color; σ^2 is the population variance; s^2 is the unbiased sample variance; p_n f(c_n; C_0, σ^2) is the weight of the current point cloud point for the current coloring sample image; C_0 is a three-dimensional vector of the RGB values of the point cloud point's color; C is the color finally selected for the point cloud point; w is the weight used to update C; and d, V_1, V_2 are quantities used to compute the color probability distribution f, containing the mean and variance information of the received candidate colors.
Preferably, the formulas used for updating the state parameters of each point cloud point according to the candidate colors further include:
C_{0,n} = (Σ_{i=1}^{n} p_i c_i) / V_{1,n}
wherein c_i is the i-th candidate color received by the current point cloud point.
Preferably, after coloring the corresponding point cloud point according to the updated state parameters, the method further comprises:
acquiring the uncolored points among the point cloud points, and determining target point areas according to the positions of the uncolored points;
obtaining the number of colored point cloud points in each target point area to obtain a colored count;
if the colored count in any target point area is smaller than a count threshold, stopping the coloring operation for the uncolored point corresponding to that target point area;
if the colored count in any target point area is greater than or equal to the count threshold, screening the colored point cloud points in the target point area to obtain screening points;
and coloring the uncolored point corresponding to the target point area according to the colors of the screening points.
Preferably, after acquiring the camera parameters and captured images of the camera, the method further comprises:
performing gray-scale processing on each image frame in the captured images to obtain gray-scale images, and performing edge detection on each gray-scale image to obtain edge images;
and calculating the edge variance of each edge image, and determining the coloring sample images among the captured images according to the edge variances.
Preferably, the performing of pixel projection for each point cloud point in the point cloud to be colored to obtain the candidate colors of each point cloud point comprises:
determining a camera parameter matrix according to the camera parameters, and projecting each point cloud point onto the pixel plane according to the camera parameter matrix to obtain the candidate colors of each point cloud point.
Another object of an embodiment of the present application is to provide a point cloud coloring system, the system comprising:
a sample determination module, configured to acquire camera parameters and captured images of a camera, and determine coloring sample images among the captured images according to the camera parameters;
a point cloud determination module, configured to acquire the camera position corresponding to each coloring sample image, and determine the point cloud to be colored for each coloring sample image according to the camera position;
a pixel projection module, configured to perform pixel projection for each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and to obtain state parameters of each point cloud point, wherein the state parameters represent the color information of each point cloud point;
and a point cloud coloring module, configured to update the state parameters of each point cloud point according to the candidate colors, and color the corresponding point cloud point according to the updated state parameters.
A further object of an embodiment of the present application is to provide a terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method described above when executing the computer program.
A further object of an embodiment of the present application is to provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method described above.
According to the embodiments of the application, the camera position corresponding to each coloring sample image is acquired, so that the point cloud to be colored can be effectively determined for each coloring sample image from the camera position; performing pixel projection for each point cloud point in the point cloud to be colored yields the candidate colors of each point; the state parameters of each point can then be effectively updated from the candidate colors, and the corresponding points colored according to the updated state parameters, thereby achieving the point cloud coloring effect.
Drawings
FIG. 1 is a flow chart of a point cloud coloring method provided by a first embodiment of the present application;
FIG. 2 is an uncolored point cloud image provided by a first embodiment of the present application;
FIG. 3 is a point cloud image rendered using only a single frame image provided by a first embodiment of the present application;
FIG. 4 is a point cloud image colored according to the first embodiment of the present application;
FIG. 5 is a flow chart of a method for point cloud coloring provided by a second embodiment of the present application;
FIG. 6 is a schematic diagram of a third embodiment of a point cloud coloring system according to the present application;
FIG. 7 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In order to illustrate the technical solutions of the application, specific examples are described below.
Example 1
Referring to FIG. 1, a flowchart of a point cloud coloring method according to a first embodiment of the present application is shown. The method may be applied to any terminal device or system and comprises the following steps:
Step S10, acquiring camera parameters and captured images of a camera, and determining coloring sample images among the captured images according to the camera parameters;
The image sequence obtained by the camera during measurement is I(t), where t ∈ {t_0, t_1, ..., t_n}. The angle and position of the camera in the global coordinate system at each image frame time is [R|T](t); the camera's coordinate information has been acquired and time-synchronized with the image sequence.
The camera's intrinsic parameter matrix is K. The spacing of the image times {t_0, t_1, ..., t_n} is about 0.1 seconds if, for example, the camera captures at 10 frames per second.
In this step, coloring sample images used for coloring are extracted from the captured images. Optionally, depending on indices such as camera movement speed and scene characteristics, images may be extracted at fixed time intervals or fixed frame counts; this ensures that the colored point cloud is covered as completely as possible and saves processing time while still providing enough color samples.
For example, for a typical hand-held or vehicle-mounted scenario, one frame may be extracted every 0.5 seconds as a coloring sample image, and images may also be selected by image blur degree. Images produced by a moving device often show varying degrees of motion blur, which affects coloring quality. Therefore, within each time window (e.g., 1 to 3 seconds), the one or several images with the lowest blur (highest sharpness) may be selected for coloring.
Specifically, in this step, the determining of coloring sample images among the captured images according to the camera parameters includes:
calculating, according to the camera parameters, the camera rotation angular velocity corresponding to each image frame in the captured images, and determining the image blur degree of each image frame according to the camera rotation angular velocity;
determining the coloring sample images among the captured images according to the image blur degree.
If images are extracted by image blur degree, the blur can be estimated from the angular velocity of the camera (or the device) or from image features. When the distance between the camera and the object is not too small, image motion comes mainly from changes in the camera angle, so the greater the camera's angular velocity, the more blurred the image. For time t_i, the camera's rotational angular velocity can be calculated as
ω_i = arccos((Tr(R(t_i) R(t_{i-1})^T) - 1) / 2) / (t_i - t_{i-1})
using the times t and rotation (angle) matrices R of the preceding and following frames, where Tr denotes the matrix trace. The angular velocity calculated by this formula is the angular velocity about the Euler rotation axis. During extraction, the one or several frames with the lowest angular velocity within each time window are selected as coloring sample images.
Further, in this step, after acquiring the camera parameters and captured images of the camera, the method further includes:
performing gray-scale processing on each image frame in the captured images to obtain gray-scale images, and performing edge detection on each gray-scale image to obtain edge images;
calculating the edge variance of each edge image, and determining the coloring sample images among the captured images according to the edge variances.
That is, each image is converted to a gray-scale image, edges are detected with a two-dimensional discrete Laplacian operator, and the variance of the intensities over all positions of the Laplacian image is computed as the edge variance. For the same scene, the sharper the image, the more pronounced its edges and the larger the edge variance; conversely, the blurrier the image, the smaller the edge variance. Coloring sample images can therefore be selected by edge variance. However, the edge variance is easily affected by scene changes, so the selection time window should not be too large.
Step S20, acquiring the camera position corresponding to each coloring sample image, and determining the point cloud to be colored for each coloring sample image according to the camera position;
Each coloring sample image is processed in turn: for coloring sample image I(t_i), the point cloud to be colored is determined according to its corresponding camera position [R|T](t_i).
step S30, respectively carrying out pixel projection on each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and respectively obtaining state parameters of each point cloud point;
wherein, the state parameter is used for representing the color information of each cloud point, and the state parameter can be a listThe state parameters include a set of 5 or 6 state variables, C, w, d, V 1 ,C 0 ,V 2 ,C 0 The three-dimensional vectors (r, g, b) representing the RGB values of the color are continuously updated dynamic averages, C is the final color selected, w is the weight used to update C, d, V 1 ,V 2 Is used for calculating the probability distribution of colorA set of continuously updated parameters (preset as gaussian distribution) containing information such as mean, variance, etc. of the received candidate colors, V 2 Is an option for computing the variance with weights and the probability distribution is used for computing w.
In this embodiment, d, V 1 ,V 2 ,C 0 Independent of other variables, all images it accepts may be weighted with a weight p for a particular point i For example, according to the above image blur degree and the distance between the point and the camera at the time of image shooting, the image can be given a corresponding weight, and the weight can be used for optimizing the probability distributionIt is also possible to select +.f as a factor of w only, for example, using the shooting distance to weight the image>Wherein r is i For the point-to-camera distance, a is a constant, the larger the shooting distance, the lower the image (color) weight, and this weight term can be multiplied by other factors, such as the above-mentioned image blur degree, without using the blur degree weight again in the case of having picked out a clear image.
The quantities d, V_1, V_2, C_0 are given by:
V_{1,n} = Σ_{i=1}^{n} p_i
V_{2,n} = Σ_{i=1}^{n} p_i^2
C_{0,n} = (Σ_{i=1}^{n} p_i c_i) / V_{1,n}
d_n = Σ_{i=1}^{n} p_i (c_i - C_{0,n})^2
wherein c_i, p_i are the i-th color received by the current point and its weight. The addition of colors (r, g, b) is vector addition, the square is the squared vector norm (i.e., r^2 + g^2 + b^2), and the multiplication is the dot product; if vector operations are not used, the three color channels can be computed separately.
Optionally, in this step, the performing of pixel projection for each point cloud point in the point cloud to be colored to obtain the candidate colors of each point cloud point includes:
determining a camera parameter matrix according to the camera parameters, and projecting each point cloud point onto the pixel plane according to the camera parameter matrix to obtain the candidate colors of each point cloud point;
That is, each point cloud point is projected onto the pixel plane through the camera parameter matrix K, and the projected pixel's color is a candidate color of that point. Since the same object may appear in several image frames, a single point in the point cloud may accumulate several candidate colors over time; their number depends on factors such as the camera movement speed, the image resolution, and the image extraction rate.
Step S40, updating the state parameters of each point cloud point according to the candidate colors, and coloring the corresponding point cloud point according to the updated state parameters;
In order to update this set of parameters continuously with low memory usage, and to avoid the floating-point stability problems that large amounts of data may cause, this step uses a weighted Welford online update method to iteratively update the state parameters of each point cloud point:
V_{1,n} = V_{1,n-1} + p_n
V_{2,n} = V_{2,n-1} + p_n^2
d_n = d_{n-1} + p_n (c_n - C_{0,n-1})(c_n - C_{0,n})
As the initial condition, at n = 0 the quantities d, V_1, V_2, C_0 are all 0, and the first candidate color has index n = 1.
from the state parameters, a distribution is derivedMean and variance of (2), mean is C 0 The variance is:
σ 2 =d/V 1
s 2 =d/(V 1 -V 2 /V 1 )
wherein sigma 2 Is the overall variance, s 2 Is an unbiased sample variance, and can be selected as appropriate in practical application;
if not using weight to calculate distributionIt is considered that p in the above formula i All being 1, then V 1 =V 2 The above equation degenerates to an unweighted form, V 1 Becomes an integer of the number of recording colors and no longer requires V 2
At this point, the Gaussian distribution under which the current frame's candidate color c_n is evaluated has been determined; the corresponding Gaussian distribution function is written f(c; C_0, σ^2). Next, C and w may be updated:
w_n = w_{n-1} + p_n f(c_n; C_0, σ^2)
wherein p_n f(c_n; C_0, σ^2) is the current weight, combining the weight contributed by the camera with the weight from the probability distribution. Adding the Gaussian distribution here gives C_n a certain robustness to outliers and noise, because occasional outliers generally receive low weights. After all coloring sample images have been processed, the final color of each point is the C in its state parameters.
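A compact sketch of this weighted Welford streaming update for a single point (our own naming; the update of C is written as the f-weighted running mean, which is our reading of "w is the weight used to update C" — the application does not spell that formula out):

```python
import numpy as np

class PointColorState:
    """Streaming state (C, w, d, V1, C0, V2) for one point cloud point."""

    def __init__(self):
        self.V1 = 0.0                 # sum of weights p_i
        self.V2 = 0.0                 # sum of squared weights p_i^2
        self.C0 = np.zeros(3)         # weighted running mean of candidate colors
        self.d = 0.0                  # weighted sum of squared deviations
        self.w = 0.0                  # accumulated f-weight for C
        self.C = np.zeros(3)          # finally selected color

    def update(self, c, p=1.0):
        """Fold in candidate color c (RGB vector) with camera weight p."""
        c = np.asarray(c, dtype=float)
        self.V1 += p
        self.V2 += p * p
        C0_prev = self.C0.copy()
        self.C0 = C0_prev + (p / self.V1) * (c - C0_prev)     # running weighted mean
        self.d += p * np.dot(c - C0_prev, c - self.C0)        # Welford-style update
        var = self.d / self.V1                                # population variance sigma^2
        # Unnormalized Gaussian density of c under N(C0, var); floor avoids division by 0.
        f = np.exp(-np.dot(c - self.C0, c - self.C0) / (2.0 * max(var, 1e-6)))
        pw = p * f                                            # current combined weight
        self.w += pw
        if self.w > 0:
            self.C += (pw / self.w) * (c - self.C)            # f-weighted running mean
```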
Referring to FIGS. 2 to 4, the image colored by this embodiment is clearly sharper than the uncolored point cloud image and than the point cloud colored from only a single frame. This embodiment can process continuous image frames and, through the frame selection method above, balances computation time against coloring quality and reduces the influence of image blur. The average color of each point is computed by a streaming update method that handles noise and outliers well, has low computational cost and memory usage, and produces a clear, smooth coloring; weight information such as shooting distance, angular velocity, and blur indices can optionally be added to the calculation. The method uses a dynamic streaming average weighted by object distance or blur degree to continuously assign the image's color information to the point cloud; it is robust to errors and outliers, gives a clearer and smoother coloring than other methods (such as the plain mean or the median), processes quickly, and occupies little memory.
In this embodiment, by acquiring the camera position corresponding to each coloring sample image, the point cloud to be colored can be effectively determined for each coloring sample image from the camera position; performing pixel projection for each point cloud point in the point cloud to be colored yields the candidate colors of each point; the state parameters of each point can then be effectively updated from the candidate colors, and the corresponding points colored according to the updated state parameters, achieving the point cloud coloring effect.
Example 2
Referring to FIG. 5, a flowchart of a point cloud coloring method according to a second embodiment of the present application is shown; it refines the processing after step S40 of the first embodiment and comprises the steps of:
step S50, obtaining uncolored points in the point cloud points, and determining a target point area according to the positions of the uncolored points;
after the coloring process of the point cloud, there may still be some uncolored points (candidate colors are not received) due to the view angle of the image, point occlusion, etc., and for the uncolored points, the color thereof may be determined by the neighboring points thereof.
Step S60, obtaining the number of colored point cloud points in each target point area to obtain a colored count;
Step S70, if the colored count in any target point area is smaller than a count threshold, stopping the coloring operation for the uncolored point corresponding to that target point area;
Step S80, if the colored count in any target point area is greater than or equal to the count threshold, screening the colored point cloud points in the target point area to obtain screening points;
That is, colored points are searched within each target point area to obtain the colored count. If the colored count of a target point area is smaller than the count threshold n, the corresponding point is discarded; if it is greater than or equal to n, the colored points closest to the uncolored point within the target point area are searched, taking at most k of them, thereby screening the colored points.
step S90, coloring the uncolored points corresponding to the target point areas according to the colors of the screening points;
the final color of the uncolored dot is taken as (not more than) the average color of k colored dots (weighted according to distance, high weight of the nearest person), for example r=0.2 m, n=2, k=8, preferably, the r, n, k parameters can be adjusted to balance the dot coloring rate and the coloring accuracy.
In this embodiment, by acquiring the uncolored points among the point cloud points, the corresponding target point areas can be determined from the positions of the uncolored points; obtaining the number of colored point cloud points in each target point area yields the colored count, from which it can be decided whether to stop the coloring operation for an uncolored point; and screening the colored point cloud points in a target point area effectively improves the accuracy of coloring the uncolored points.
Example 3
Referring to FIG. 6, a schematic structural diagram of a point cloud coloring system 100 according to a third embodiment of the present application is shown. The system includes: a sample determination module 10, a point cloud determination module 11, a pixel projection module 12, and a point cloud coloring module 13, wherein:
the sample determining module 10 is configured to acquire camera parameters and a captured image of a camera, and determine a coloring sample image in the captured image according to the camera parameters.
Optionally, the sample determination module 10 is further configured to: according to the camera parameters, respectively calculating a camera rotation angular velocity corresponding to each image frame in the shot image, and respectively determining the image ambiguity of each image frame according to the camera rotation angular velocity;
and determining the coloring sample image in the shooting image according to the image ambiguity.
Further, the sample determination module 10 is further configured to: carrying out gray processing on each image frame in the shot image to obtain gray images, and carrying out edge detection on each gray image to obtain edge images;
and calculating edge variances of the edge images respectively, and determining the coloring sample image in the photographed image according to the edge variances.
The point cloud determining module 11 is configured to respectively obtain camera positions corresponding to the respective coloring sample images, and respectively determine a point cloud to be colored in each coloring sample image according to the camera positions.
And the pixel projection module 12 is used for respectively carrying out pixel projection on each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and respectively obtaining state parameters of each point cloud point, wherein the state parameters are used for representing color information of each point cloud point.
Optionally, the pixel projection module 12 is further configured to: and determining a camera parameter matrix according to the camera parameters, and projecting each point cloud point to a pixel plane according to the camera parameter matrix to obtain candidate colors of each point cloud point.
And the point cloud coloring module 13 is used for carrying out state update on the state parameters of the cloud points of each point according to the candidate color and coloring the corresponding cloud points according to the state parameters after the state update.
Optionally, the formulas used for updating the state parameters of each point cloud point according to the candidate colors include:
V_{1,n} = V_{1,n-1} + p_n
V_{2,n} = V_{2,n-1} + p_n^2
d_n = d_{n-1} + p_n (c_n - C_{0,n-1})(c_n - C_{0,n})
σ^2 = d / V_1
s^2 = d / (V_1 - V_2 / V_1)
w_n = w_{n-1} + p_n f(c_n; C_0, σ^2)
wherein the state parameters are (C, w, d, V_1, C_0, V_2); p_n is the weight of the n-th candidate color received by the current point cloud point; c_n is that candidate color; σ^2 is the population variance; s^2 is the unbiased sample variance; p_n f(c_n; C_0, σ^2) is the weight of the current point cloud point for the current coloring sample image; C_0 is a three-dimensional vector of the RGB values of the point cloud point's color; C is the color finally selected for the point cloud point; w is the weight used to update C; and d, V_1, V_2 are quantities used to compute the color probability distribution f, containing the mean and variance information of the received candidate colors.
The formulas used for updating the state parameters of each point cloud point according to the candidate colors further include:
C_{0,n} = (Σ_{i=1}^{n} p_i c_i) / V_{1,n}
wherein c_i is the i-th candidate color received by the current point cloud point.
Further, the point cloud coloring module 13 is further configured to: acquire the uncolored points among the point cloud points, and determine target point areas according to the positions of the uncolored points;
obtain the number of colored point cloud points in each target point area to obtain a colored count;
if the colored count in any target point area is smaller than a count threshold, stop the coloring operation for the uncolored point corresponding to that target point area;
if the colored count in any target point area is greater than or equal to the count threshold, screen the colored point cloud points in the target point area to obtain screening points;
and color the uncolored point corresponding to the target point area according to the colors of the screening points.
According to this embodiment, by acquiring the camera position corresponding to each coloring sample image, the point cloud to be colored can be effectively determined for each coloring sample image from the camera position; performing pixel projection for each point cloud point in the point cloud to be colored yields the candidate colors of each point; the state parameters of each point can then be effectively updated from the candidate colors, and the corresponding points colored according to the updated state parameters, achieving the point cloud coloring effect.
Example 4
FIG. 7 is a block diagram of a terminal device 2 according to a fourth embodiment of the present application. As shown in FIG. 7, the terminal device 2 of this embodiment includes: a processor 20, a memory 21, and a computer program 22 stored in the memory 21 and executable on the processor 20, for example a program of a point cloud coloring method. The steps of the various embodiments of the point cloud coloring method described above are implemented when the processor 20 executes the computer program 22.
Illustratively, the computer program 22 may be partitioned into one or more modules that are stored in the memory 21 and executed by the processor 20 to carry out the present application. The one or more modules may be a series of computer program instruction segments capable of performing specific functions and describing the execution of the computer program 22 in the terminal device 2. The terminal device may include, but is not limited to, the processor 20 and the memory 21.
The processor 20 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 21 may be an internal storage unit of the terminal device 2, such as a hard disk or memory of the terminal device 2. The memory 21 may also be an external storage device of the terminal device 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 2. Further, the memory 21 may include both an internal storage unit and an external storage device of the terminal device 2. The memory 21 is used for storing the computer program as well as other programs and data required by the terminal device, and may also be used for temporarily storing data that has been output or is to be output.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium, which may be nonvolatile or volatile. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content encompassed by the computer readable storage medium may be adjusted appropriately according to the requirements of legislation and patent practice in each jurisdiction; for example, in some jurisdictions, computer readable storage media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A point cloud coloring method, the method comprising:
acquiring camera parameters and captured images of a camera, and determining coloring sample images among the captured images according to the camera parameters;
acquiring the camera position corresponding to each coloring sample image, and determining the point cloud to be colored for each coloring sample image according to the camera position;
performing pixel projection for each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and obtaining state parameters of each point cloud point, wherein the state parameters represent the color information of each point cloud point;
and updating the state parameters of each point cloud point according to the candidate colors, and coloring the corresponding point cloud point according to the updated state parameters.
2. The point cloud coloring method of claim 1, wherein the determining of coloring sample images among the captured images according to the camera parameters comprises:
calculating, according to the camera parameters, the camera rotation angular velocity corresponding to each image frame in the captured images, and determining the image blur degree of each image frame according to the camera rotation angular velocity;
and determining the coloring sample images among the captured images according to the image blur degree.
3. The method of claim 1, wherein the formulas used for updating the state parameters of each point cloud point according to the candidate colors include:
V_{1,n} = V_{1,n-1} + p_n
V_{2,n} = V_{2,n-1} + p_n^2
d_n = d_{n-1} + p_n (c_n - C_{0,n-1})(c_n - C_{0,n})
σ^2 = d / V_1
s^2 = d / (V_1 - V_2 / V_1)
w_n = w_{n-1} + p_n f(c_n; C_0, σ^2)
wherein the state parameters are (C, w, d, V_1, C_0, V_2); p_n is the weight of the n-th candidate color received by the current point cloud point; c_n is that candidate color; σ^2 is the population variance; s^2 is the unbiased sample variance; p_n f(c_n; C_0, σ^2) is the weight of the current point cloud point for the current coloring sample image; C_0 is a three-dimensional vector of the RGB values of the point cloud point's color; C is the color finally selected for the point cloud point; w is the weight used to update C; and d, V_1, V_2 are quantities used to compute the color probability distribution f, containing the mean and variance information of the received candidate colors.
4. The method of claim 3, wherein the formulas used for updating the state parameters of each point cloud point according to the candidate colors further include:
C_{0,n} = (Σ_{i=1}^{n} p_i c_i) / V_{1,n}
wherein c_i is the i-th candidate color received by the current point cloud point.
5. The method of claim 1, wherein after coloring the corresponding point cloud point according to the updated state parameters, the method further comprises:
acquiring the uncolored points among the point cloud points, and determining target point areas according to the positions of the uncolored points;
obtaining the number of colored point cloud points in each target point area to obtain a colored count;
if the colored count in any target point area is smaller than a count threshold, stopping the coloring operation for the uncolored point corresponding to that target point area;
if the colored count in any target point area is greater than or equal to the count threshold, screening the colored point cloud points in the target point area to obtain screening points;
and coloring the uncolored point corresponding to the target point area according to the colors of the screening points.
6. The point cloud coloring method of claim 1, wherein after acquiring the camera parameters and captured images of the camera, the method further comprises:
performing gray-scale processing on each image frame in the captured images to obtain gray-scale images, and performing edge detection on each gray-scale image to obtain edge images;
and calculating the edge variance of each edge image, and determining the coloring sample images among the captured images according to the edge variances.
7. The method of claim 1, wherein the performing of pixel projection for each point cloud point in the point cloud to be colored to obtain the candidate colors of each point cloud point comprises:
determining a camera parameter matrix according to the camera parameters, and projecting each point cloud point onto the pixel plane according to the camera parameter matrix to obtain the candidate colors of each point cloud point.
8. A point cloud coloring system, the system comprising:
a sample determination module, configured to acquire camera parameters and captured images of a camera, and determine coloring sample images among the captured images according to the camera parameters;
a point cloud determination module, configured to acquire the camera position corresponding to each coloring sample image, and determine the point cloud to be colored for each coloring sample image according to the camera position;
a pixel projection module, configured to perform pixel projection for each point cloud point in the point cloud to be colored to obtain candidate colors of each point cloud point, and to obtain state parameters of each point cloud point, wherein the state parameters represent the color information of each point cloud point;
and a point cloud coloring module, configured to update the state parameters of each point cloud point according to the candidate colors, and color the corresponding point cloud point according to the updated state parameters.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination