CN111724464A - Mobile measurement point cloud coloring method and device - Google Patents


Info

Publication number
CN111724464A
Authority
CN
China
Prior art keywords
image
planar
panoramic
point cloud
information
Prior art date
Legal status
Pending
Application number
CN202010571647.5A
Other languages
Chinese (zh)
Inventor
刘梦庚
汪开理
杨晶
陈海佳
罗胜
何平
Current Assignee
Wuhan Hi Cloud Technology Co ltd
Original Assignee
Wuhan Hi Cloud Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Hi Cloud Technology Co ltd
Priority to CN202010571647.5A
Publication of CN111724464A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a mobile measurement point cloud coloring method and device, relating to the technical field of optoelectronic surveying and mapping. An acquisition module acquires panoramic information and point cloud information of a target object. A projection module then projects the panoramic information to form a planar panoramic image and projects the point cloud information to form a planar intensity image. A correction module corrects the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image. Finally, a coloring module colors the point cloud information according to the corrected planar panoramic image.

Description

Mobile measurement point cloud coloring method and device
Technical Field
The invention relates to the technical field of optoelectronic surveying and mapping, and in particular to a mobile measurement point cloud coloring method and device.
Background
With the development of science and technology, social demands have changed greatly, and higher requirements have been placed on surveying and mapping information services in particular. A vehicle-mounted mobile measurement system comprises a global positioning system, an inertial navigation system, a laser scanner, and a digital camera. The mounted laser scanner obtains the position information of ground objects, ultimately forming a discrete point cloud, while the digital camera obtains the color information of ground features, ultimately forming a panoramic image. Combining the information acquired by the digital camera with that acquired by the laser scanner integrates the advantages of both types of data, so that the target can be described better. To combine the two, the color information of the panoramic image must be assigned to the point cloud, generating a colored point cloud; this process is called point cloud coloring.
In the prior art, when coloring the point cloud, the orientation elements of the camera are corrected to reduce the error when matching the panoramic image with the point cloud, but the error introduced when stitching the panoramic image is ignored. As a result, the panoramic image cannot accurately color the point cloud even when the orientation elements are error-free.
Disclosure of Invention
The object of the invention is to provide a mobile measurement point cloud coloring method and device addressing the above defects in the prior art, so as to solve the problem that an existing panoramic image cannot accurately color the point cloud.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical solutions:
In one aspect of the embodiments of the present invention, a mobile measurement point cloud coloring method is provided, including: acquiring panoramic information and point cloud information of a target object; projecting the panoramic information to form a planar panoramic image, and projecting the point cloud information to form a planar intensity image; correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image; and coloring the point cloud information according to the corrected planar panoramic image.
Optionally, projecting the panoramic information to form a planar panoramic image, and projecting the point cloud information to form a planar intensity image, includes:
establishing a panorama sphere according to the panoramic information; projecting the panorama sphere onto a projection plane of a first projection body to form the planar panoramic image; establishing a point cloud intensity image according to the point cloud information; and projecting the point cloud intensity image onto a projection plane of a second projection body to form the planar intensity image.
Optionally, projecting the panorama sphere onto the projection plane of the first projection body includes:
selecting a pixel on the first projection body and forming a straight line through the pixel and the center of the panorama sphere; determining the intersection point of the straight line with the panorama sphere; and assigning the color at the intersection position on the panorama sphere to the corresponding pixel.
Optionally, the first projection body is a cube; the projection planes of the first projection body include the six faces of the cube.
Optionally, projecting the point cloud intensity image onto the projection plane of the second projection body to form the planar intensity image includes:
after projecting the point cloud intensity image onto the projection plane of the second projection body, rasterizing that projection plane to form the planar intensity image, where the point intensity information in the point cloud intensity image corresponds to the gray-scale information of the pixels in the projection plane of the second projection body.
Optionally, when a single pixel in the projection plane of the second projection body corresponds to multiple pieces of point intensity information in the point cloud intensity image, at least one piece of relevant point intensity information is determined from them according to a preset range, and the gray-scale information of the single pixel corresponds to the average value of the at least one piece of relevant point intensity information.
Optionally, correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image includes:
extracting image features of the planar intensity image and of the planar panoramic image respectively; and correcting the planar panoramic image according to the image features of the planar intensity image and of the planar panoramic image, so that the corrected planar panoramic image is formed.
Optionally, the image features are line features; correcting the planar panoramic image according to the image features of the planar intensity image and of the planar panoramic image includes:
determining corresponding line feature groups from the line features of the planar intensity image and of the planar panoramic image respectively; determining a first correction value of the planar panoramic image according to the corresponding line feature groups; and correcting the planar panoramic image according to the first correction value.
Optionally, the image features further include contour features; correcting the planar panoramic image according to the image features of the planar intensity image and of the planar panoramic image further includes:
determining corresponding contour feature groups of the planar intensity image and the planar panoramic image according to the corresponding line feature groups; determining a second correction value of the planar panoramic image according to the corresponding contour feature groups; and correcting the planar panoramic image according to the second correction value.
In another aspect of the embodiments of the present invention, a mobile measurement point cloud coloring apparatus is provided, including:
an acquisition module for acquiring panoramic information and point cloud information of a target object; a projection module for projecting the panoramic information to form a planar panoramic image and projecting the point cloud information to form a planar intensity image; a correction module for correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image; and a coloring module for coloring the point cloud information according to the corrected planar panoramic image.
The beneficial effects of the invention include:
the invention provides a mobile measurement point cloud coloring method, which comprises the steps of obtaining panoramic information and point cloud information of a target object, respectively projecting the panoramic information and the point cloud information to form a planar panoramic image of a visible light image and a planar intensity image of a laser point, correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image, and coloring the point cloud information according to the corrected planar panoramic image. The matching precision of the panorama and the point cloud is improved from the angle of nonlinear distortion generated during splicing of the repaired panoramic image, and therefore the panoramic image can be accurately colored to the point cloud.
The invention also provides a mobile measurement point cloud coloring apparatus. An acquisition module acquires the panoramic information and point cloud information of the target object; a projection module projects the panoramic information to form a planar panoramic image and projects the point cloud information to form a planar intensity image; a correction module corrects the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image; and finally a coloring module colors the point cloud information according to the corrected planar panoramic image. The matching accuracy of the panorama and the point cloud is improved, so that the apparatus can use the panoramic image to color the point cloud accurately.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a mobile measurement point cloud coloring method according to an embodiment of the present invention;
fig. 2 is a second schematic flow chart of a mobile measurement point cloud coloring method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a mobile measurement point cloud coloring apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an instruction execution processing apparatus according to an embodiment of the present invention.
Icon: 500 - mobile measurement point cloud coloring device; 501 - acquisition module; 502 - projection module; 503 - correction module; 504 - coloring module; 31 - processor; 32 - storage medium.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. It should be noted that, in the case of no conflict, various features in the embodiments of the present invention may be combined with each other, and the combined embodiments are still within the scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that the terms "first", "second", "third", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
The laser scanner in a vehicle-mounted mobile measurement system acquires the position information of a target object and represents it as a discrete point cloud, that is, a collection of laser points. Meanwhile, the digital camera in the system obtains the color information of ground features and ultimately forms a panoramic image in the form of a visible-light image. As society places higher demands on surveying and mapping information services, more and more scenarios require the color information of each point in the point cloud data; however, the laser point cloud carries no color information. A relationship between the laser scanner and the digital camera must therefore be established and the data fused, assigning the color information of the panoramic image to the point cloud to generate a colored point cloud.
Existing point cloud coloring methods mainly fall into three categories:
First, the multi-sensor semi-automatic calibration method, which renders directly with panoramic data. When the absolute position and attitude of the panorama sphere are known, the point cloud can be projected onto the panorama sphere to obtain its colors; the position of the panorama sphere is the same as the position and attitude of the camera. The multi-sensor semi-automatic calibration method solves the problem of acquiring the absolute position and attitude of the panorama sphere: the panoramic camera and the POS system are fixed on the equipment frame so that the relative position and attitude between the POS system and the camera system remain unchanged, and the orientation elements output by the POS system, compensated by this relative position and attitude, can be used directly as the orientation elements of the panorama sphere. The method obtains the relative position relationship between the camera and the POS system through calibration. After calibration, the coloring step can proceed: first the azimuth of each point in the panorama sphere coordinate system is computed; since the panorama sphere is stored as a raster whose rows and columns represent vertical and horizontal angles, the corresponding color can be looked up from the azimuth. However, this method is not suitable for every panoramic image, because the position and attitude of each panoramic photo carry random errors at measurement time, mainly system time-synchronization errors and errors of the POS system itself, so the calibration parameters are not universal across panoramic images. In practice, since there are hundreds or even thousands of panoramic photos, adjusting the position and attitude of each one individually is impractical.
Second, the geometric-primitive-matching automatic registration method, which uses an error model similar to that of the point-selection matching method and registers through homonymous geometric features of the point cloud and the image. Its advantage is that every panorama can be registered, overcoming the non-universality of the semi-automatic calibration parameters; however, it places certain requirements on the scene, and the feature algorithm fails in regions where features are not obvious and therefore hard to extract.
Third, the 3D-3D registration method based on multi-view stereo point clouds and laser point clouds. The panoramic camera comprises multiple lenses whose images overlap, and a dense point cloud can be generated from the images by multi-view geometry. Point cloud coloring is then achieved by model-matching the image point cloud with the laser point cloud. In practice its accuracy is higher than that of the former two methods, but recovering the point cloud from the images and the ICP iteration between the point clouds both take a long time.
The three methods above can correct the orientation elements of the camera to a certain extent, but they do not account for the distortion generated at the seams when frame images are stitched into a panoramic image, which is a local nonlinear distortion. The panoramic image and the point cloud are therefore difficult to match accurately even if the corrected orientation elements are error-free. The present application provides a mobile measurement point cloud coloring method that effectively overcomes these defects.
In one aspect of the embodiments of the present invention, referring to fig. 1, a mobile measurement point cloud coloring method is provided, including: acquiring panoramic information and point cloud information of a target object; projecting the panoramic information to form a planar panoramic image, and projecting the point cloud information to form a planar intensity image; correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image; and coloring the point cloud information according to the corrected planar panoramic image.
For example, as shown in fig. 1, the point cloud is colored mainly through re-projection, image correction, and related steps; the specific method is as follows:
s100: and acquiring panoramic information and point cloud information of the target object.
The panoramic information and point cloud information of the target object are acquired as the information sources for the subsequent re-projection, correction, and coloring.
S200: and projecting the panoramic information to form a planar panoramic image, and projecting the point cloud information to form a planar intensity image.
Because the data formats of the original point cloud information and panoramic information are inconvenient to process, the panoramic image formed from the panoramic information can be projected onto planes to form a planar panoramic image, and the point cloud information can likewise be projected onto planes to form a planar intensity image. That is, the panoramic information and the point cloud information are converted into two-dimensional planar images that are convenient to process, so that the two projections can be compared and corrected in subsequent steps. Note that the present application does not limit the order in which the panoramic information and the point cloud information are projected. Likewise, this embodiment does not limit the form of the panoramic image formed from the panoramic information; it may be, for example, the panorama sphere of the subsequent embodiments, or an original image produced by the camera.
S300: and correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image.
Because the point cloud does not suffer stitching distortion, the planar intensity image projected from it can serve as a reference for the planar panoramic image formed by projecting the panoramic information. By comparison, the parts of the planar panoramic image that need correction are determined, and the planar panoramic image is corrected using the intensity information contained in the corresponding planar intensity image. After all corrections are completed, a corrected planar panoramic image is formed; this effectively repairs the distortion produced at the seams when frame images are stitched into a panoramic image, so that the corrected planar panoramic image can be matched accurately with the planar intensity image.
S400: color the point cloud information according to the corrected planar panoramic image.
The point cloud information is mapped to the corrected planar panoramic image through the panoramic exterior-orientation parameters to obtain the position of each point on the corrected planar panoramic image, so that the point cloud information is colored and a colored point cloud is obtained.
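As an illustrative sketch only (not the claimed implementation), the coloring step amounts to sampling the corrected planar image at each point's projected position. The `project` callable below is a hypothetical stand-in for the point-to-pixel mapping built from the panoramic exterior-orientation parameters:

```python
def colorize_points(points, image, project):
    """Assign each laser point the color of the corrected planar
    panoramic image pixel it maps to. `project` is a hypothetical
    stand-in for the mapping derived from the panoramic
    exterior-orientation parameters."""
    colored = []
    for p in points:
        row, col = project(p)                 # position on the corrected image
        colored.append((p, image[row][col]))  # attach that pixel's color
    return colored
```

For instance, with a one-pixel image and a mapping that sends every point to it, each point simply receives that pixel's color.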
Because the panorama and the point cloud are registered by automatic correction, fully automatic coloring can be achieved with only the design values of the relative position and attitude of the panoramic camera, little manual intervention, and no calibration.
Optionally, projecting the panoramic information to form a planar panoramic image and projecting the point cloud information to form a planar intensity image includes: establishing a panorama sphere according to the panoramic information; projecting the panorama sphere onto a projection plane of a first projection body to form the planar panoramic image; establishing a point cloud intensity image according to the point cloud information; and projecting the point cloud intensity image onto a projection plane of a second projection body to form the planar intensity image.
For example, as shown in fig. 1, when the panoramic information in S200 is projected to form the planar panoramic image, the specific projection process may be: establish a panorama sphere according to the panoramic information. The acquired panoramic information may already be in the form of a panorama sphere, or it may be used to form multiple frame panoramic images that are stitched to build the panorama sphere; this embodiment does not limit how the panorama sphere is created from the panoramic information. After the panorama sphere is formed, the planar panoramic image is formed by projecting the panorama sphere onto the projection planes of the first projection body. When the first projection body is a cube, the projection planes may include the six faces of the cube, i.e., the corresponding planar panoramic image includes all the images projected onto the six faces.
When the point cloud information in S200 is projected to form the planar intensity image, a point cloud intensity image is first established. The acquired point cloud information may already be a point cloud intensity image, or the intensity information may be extracted from the content of the point cloud information and the point cloud intensity image established from it; this embodiment does not limit how the point cloud intensity image is created. The point cloud intensity image is then projected onto the projection planes of the second projection body to form the planar intensity image. When the second projection body is a cube, its projection planes, like those of the first projection body, include the six faces of the cube, i.e., the corresponding planar intensity image includes all the images of the point cloud intensity image projected onto the six faces. The first and second projection bodies may have the same shape and size or different shapes and sizes; this embodiment does not limit this. Projecting the panoramic information and the point cloud information into a planar panoramic image and a planar intensity image respectively solves the problem that the original point cloud and panoramic data formats are inconsistent and hard to process: both are converted into a two-dimensional planar image format convenient for subsequent correction. This effectively reduces the difficulty of comparing and correcting the two and improves processing speed and efficiency.
Optionally, projecting the panorama sphere onto the projection plane of the first projection body includes: selecting a pixel on the first projection body and forming a straight line through the pixel and the center of the panorama sphere; determining the intersection point of the straight line with the panorama sphere; and assigning the color at the intersection position on the panorama sphere to the corresponding pixel.
A common storage format for a panorama sphere is a raster image, in which the position of a point represents its azimuth: the column and row of the point represent its horizontal and vertical angles. A panorama sphere in this format appears distorted when viewed and used directly, and re-projecting it onto planes eliminates this distortion.
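As a minimal sketch of this raster convention (the exact row/column-to-angle layout is an assumption, not taken from the patent), a pixel index can be converted to a direction on the panorama sphere as follows:

```python
import math

def pixel_to_direction(row, col, height, width):
    """Convert a panorama-raster pixel to a unit direction vector.
    Assumed layout: columns span the horizontal angle (0..2*pi) and
    rows span the vertical angle (0..pi, measured from the zenith)."""
    lon = (col + 0.5) / width * 2.0 * math.pi   # horizontal angle
    lat = (row + 0.5) / height * math.pi        # vertical angle
    return (math.sin(lat) * math.cos(lon),
            math.sin(lat) * math.sin(lon),
            math.cos(lat))
```

The returned vector is always unit length, so it can be intersected directly with the unit panorama sphere.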
For example, as shown in fig. 2, when the first projection body is a cube, the panorama sphere is re-projected onto the cube:
s201: one pixel on the projection cube is selected to form a straight line with the center of the panorama sphere.
A pixel on the cube is selected in advance (each face of the cube is composed of many pixels; the selected pixel is one of them), and the straight line connecting the pixel with the center of the panorama sphere acts as a virtual ray, whose intersection with the panorama sphere is determined in S202.
S202: calculate the intersection point of the straight line with the panorama sphere.
S203: assign the color at the intersection position on the panorama sphere to the cube pixel, according to the index position of the intersection on the panorama sphere.
The intersection position serves as the index by which the color is assigned to the corresponding pixel on the cube, completing the determination of that pixel's color.
S204: judge whether all cube pixels have been traversed; if so, end; if not, return to S201.
It is judged whether all pixels on the six faces of the cube have been assigned colors. If not, return to S201 and repeat the steps through S204 (each repetition selects a cube pixel that has not been selected before). If so, the projection is complete and the procedure ends. In this way, the color information of the panorama sphere is accurately converted into two-dimensional planar images that can be compared and corrected against the projected planar intensity image.
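The S201-S204 loop can be sketched for a single cube face as follows. This is an illustrative reading only: the face layout, the raster convention, and the helper names are assumptions, not the patent's implementation:

```python
import math

def direction_to_pano_index(d, height, width):
    """Map a unit direction to the (row, col) of an equirectangular
    panorama raster (assumed layout: rows = vertical angle from the
    zenith, cols = horizontal angle)."""
    x, y, z = d
    lat = math.acos(max(-1.0, min(1.0, z)))       # 0..pi
    lon = math.atan2(y, x) % (2.0 * math.pi)      # 0..2*pi
    row = min(height - 1, int(lat / math.pi * height))
    col = min(width - 1, int(lon / (2.0 * math.pi) * width))
    return row, col

def project_face(pano, face_size):
    """Steps S201-S204 for one face (+X) of a unit cube centred on the
    panorama sphere: each face pixel takes the color of the sphere
    point that its centre ray intersects."""
    height, width = len(pano), len(pano[0])
    face = [[None] * face_size for _ in range(face_size)]
    for r in range(face_size):
        for c in range(face_size):
            u = (c + 0.5) / face_size * 2.0 - 1.0   # pixel centre on the face
            v = (r + 0.5) / face_size * 2.0 - 1.0
            x, y, z = 1.0, u, -v                    # ray through sphere centre
            n = math.sqrt(x * x + y * y + z * z)
            pr, pc = direction_to_pano_index((x / n, y / n, z / n),
                                             height, width)
            face[r][c] = pano[pr][pc]               # assign intersection color
    return face
```

Every face pixel is visited exactly once, which is the complete traversal that S204 checks for.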
Optionally, the first projection body is a cube; the projection planes of the first projection body include the six faces of the cube.
In the above description, the first projection body is a cube. It should be noted here that the first projection body may be a cube, or a quadrangular prism of infinite height, i.e., a cube with its bottom and top faces removed and the remaining four faces extended infinitely in height (a special form of the cube), onto whose four faces the panorama sphere may be projected to form the planar panoramic image. Correspondingly, the second projection body may also be such a special-form prism; its projection is similar to that of the panorama sphere and is not repeated here.
Optionally, projecting the point cloud intensity image onto the projection plane of the second projection body to form the planar intensity image includes: after projecting the point cloud intensity image onto the projection plane of the second projection body, rasterizing that projection plane to form the planar intensity image, where the point intensity information in the point cloud intensity image corresponds to the gray-scale information of the pixels in the projection plane of the second projection body.
Illustratively, the point cloud intensity image is stored discretely as positions plus attribute information. The point cloud intensity image is projected onto the second projection body; when the second projection body is a cube, it is projected onto the six faces of the cube, and after projection each face is rasterized so that the two-dimensional image projected from the point cloud intensity image can be compared with the two-dimensional image projected from the panorama sphere. The point intensity information in the point cloud intensity image corresponds to the gray-scale information of the pixels in each face of the cube, linking the two.
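A minimal sketch of this rasterization, assuming points already projected to face coordinates `(u, v, intensity)` in a hypothetical `[0, extent)` range; occlusion removal, which the text discusses separately, is omitted here:

```python
def rasterize_intensities(points, grid_size, extent):
    """Rasterize projected laser points (u, v, intensity) on one face
    into a grid of gray values: each cell's gray level is the mean
    intensity of the points falling in it, and empty cells stay 0."""
    sums = [[0.0] * grid_size for _ in range(grid_size)]
    counts = [[0] * grid_size for _ in range(grid_size)]
    for u, v, intensity in points:
        col = min(grid_size - 1, int(u / extent * grid_size))
        row = min(grid_size - 1, int(v / extent * grid_size))
        sums[row][col] += intensity
        counts[row][col] += 1
    return [[sums[r][c] / counts[r][c] if counts[r][c] else 0.0
             for c in range(grid_size)] for r in range(grid_size)]
```

Averaging per cell is the simplest way to realize the point-intensity-to-gray correspondence the text describes when several points land in one pixel.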
Optionally, when a single pixel in the projection plane of the second projection body corresponds to multiple pieces of point intensity information in the point cloud intensity image, at least one piece of relevant point intensity information is determined from them according to a preset range (i.e., after occluded points are removed), and the gray-scale information of the single pixel corresponds to the average value of the at least one piece of relevant point intensity information.
For example, when the intensity information of the point cloud is compared with the gray-scale information of the corresponding pixels, several points may correspond to one pixel. When determining a pixel's gray-scale value, occluded points are removed first; that is, the point intensity information within a preset range (the relevant point intensity information) is determined, for example by selecting the point closest to the pixel along with the points within a certain distance of it. The average of the intensities of these relevant points then gives the gray-scale value of the pixel. Occlusion removal works as follows: among all points corresponding to the pixel, the point closest to the projection center is taken as the seed point; another point is kept if the difference between its distance and the seed point's distance is smaller than a preset threshold, and is considered occluded otherwise. The test is:
D(p_i) - D(p_s) < T

where D is the distance function outputting the distance from a point to the projection center, p_i is the point to be judged, p_s is the seed point, and T is the threshold.
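A minimal sketch of this occlusion test for a single pixel. The helper name `related_intensities` and its inputs (precomputed distances to the projection center) are assumptions of this illustration.

```python
import numpy as np

def related_intensities(distances, intensities, threshold):
    """Keep only points whose distance to the projection center is
    within `threshold` of the closest (seed) point, i.e. points
    satisfying D(p_i) - D(p_s) < T, then return the mean intensity
    as the pixel's gray value.
    """
    d = np.asarray(distances, dtype=float)
    inten = np.asarray(intensities, dtype=float)
    seed = d.min()                  # seed point: closest to the center
    keep = (d - seed) < threshold   # occluded points fail the test
    return inten[keep].mean()
```

A far-away point behind the seed (e.g. seen through a gap) is rejected, so its intensity does not contaminate the pixel's gray value.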
Optionally, correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image includes: extracting image features of the planar intensity image and of the planar panoramic image respectively; and correcting the planar panoramic image according to these image features, so that the corrected result forms the corrected planar panoramic image.
For example, when the planar panoramic image is corrected according to the planar intensity image in S300, the image features of the planar intensity image and of the planar panoramic image are extracted respectively, and the planar panoramic image is then corrected according to these features, thereby forming the corrected planar panoramic image.
Optionally, the image feature is a line feature; correcting the planar panorama image according to the planar intensity image and the image characteristics of the planar panorama image includes: determining corresponding line feature groups according to the line features of the plane intensity image and the plane panoramic image respectively; determining a first correction value of the planar panoramic image according to the corresponding line feature group; and correcting the planar panoramic image according to the first correction value.
For example, when the image features are line features, the line features of the planar intensity image and of the planar panoramic image are extracted first. The two-dimensional image of the panoramic information and the two-dimensional image formed from the point cloud information come from different sources, so point features are difficult to extract and match, while line features are easier. The line features of the two images can therefore be matched by a position-and-normal threshold method, yielding several corresponding line feature groups; that is, the line features of the planar intensity image are associated with those of the planar panoramic image.
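The position-and-normal threshold matching can be sketched as a brute-force pairing. Representing a line feature as a (midpoint, unit normal) pair and the default tolerances are assumptions of this sketch, not values from the patent.

```python
import numpy as np

def match_lines(lines_a, lines_b, pos_tol=5.0, ang_tol=0.1):
    """Pair line features from two images when their midpoints lie
    within pos_tol pixels and their normal directions differ by less
    than ang_tol radians.

    Each line feature: (midpoint_xy, unit_normal_xy).
    Returns a list of (index_a, index_b) pairs.
    """
    pairs = []
    for i, (mid_a, n_a) in enumerate(lines_a):
        for j, (mid_b, n_b) in enumerate(lines_b):
            d = np.linalg.norm(np.asarray(mid_a) - np.asarray(mid_b))
            # abs() makes the test insensitive to normal orientation.
            ang = np.arccos(np.clip(abs(np.dot(n_a, n_b)), -1.0, 1.0))
            if d < pos_tol and ang < ang_tol:
                pairs.append((i, j))
    return pairs
```

This relies on the rough registration described below: homonymous lines are already close in position and direction, so a simple threshold suffices.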
The design values are used as initial values for the panoramic image position and orientation, after which the point cloud and the panoramic image are roughly registered. In this roughly registered state, homonymous line features have nearly the same position and direction, so the threshold method can match them. After matching, the region around each line feature is corrected locally: the line feature on the color image is corrected toward the point cloud intensity map. For this, a first correction value is computed using the following formula:
v = -((Ax + By + C) / √(A² + B²)) · n

where A, B, and C are the parameters of the line equation Ax + By + C = 0 of the line feature in the planar intensity image, and x and y are the coordinates of any point on the corresponding line feature of the planar panoramic image (the midpoint is generally taken). n is the unit normal of the line feature and v is the correction vector, a two-dimensional vector; adding the correction value v to the feature of the planar panoramic image makes it coincide with the line feature of the planar intensity image.
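A sketch of the first correction value under the reading above, assuming the intensity-image line is given as Ax + By + C = 0 and the correction moves the panorama point along the line's unit normal; the function name is hypothetical.

```python
import numpy as np

def first_correction(A, B, C, x, y):
    """Correction vector v moving the point (x, y) of the panorama
    line feature onto the intensity-image line Ax + By + C = 0,
    along the line's unit normal n = (A, B) / sqrt(A^2 + B^2).
    """
    norm = np.hypot(A, B)
    n = np.array([A, B]) / norm            # unit normal of the line
    signed_dist = (A * x + B * y + C) / norm
    return -signed_dist * n                # (x, y) + v lies on the line
```

For the vertical line x = 0 (A=1, B=0, C=0) and the point (3, 5), the correction is (-3, 0): adding it lands the point exactly on the line.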
After the first correction value is calculated, the planar panoramic image is corrected toward the planar intensity image. Since the planar intensity image corresponds to the point cloud intensity image, if the planar panoramic image can be fully registered with the planar intensity image, the panoramic information can be correctly matched to the point cloud information.
Optionally, the image features further include contour features; correcting the planar panoramic image according to the planar intensity image and the image characteristics of the planar panoramic image further comprises: determining a corresponding contour feature group of the plane intensity image and the plane panoramic image according to the corresponding line feature group; determining a second correction value of the planar panoramic image according to the corresponding profile feature set; and correcting the planar panoramic image according to the second correction value.
Illustratively, when the image features further include contour features, the contour features of the planar intensity image and of the planar panoramic image are respectively extracted, and the corresponding contour feature groups are determined through the corresponding line feature groups for which an association has already been established. It should be noted here that a line feature is extracted from the straight portion of a contour and thus belongs to a subset of that contour, so an index from line features to contours can be established according to their coincidence relationship.
When determining the second correction values of the planar panoramic image, since the first correction values of the line features have already been determined according to the foregoing embodiment, the second correction value of each contour feature may be taken as the average of the first correction values of the line features the contour contains. The correction of the planar panoramic image computes a correction amount for each contour, and each contour is corrected independently, which resolves the local nonlinear distortion of the panoramic image.
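The contour-level correction can then be sketched as a simple average; the index list mapping each contour to the line features it contains is an assumed input of this illustration.

```python
import numpy as np

def contour_correction(line_corrections, contour_line_indices):
    """Second correction value of a contour: the mean of the first
    correction vectors of the line features that the contour contains
    (the line-to-contour index is built from their coincidence).
    """
    vs = np.asarray([line_corrections[i] for i in contour_line_indices],
                    dtype=float)
    return vs.mean(axis=0)
```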
The contours with correction values are then moved on the planar panoramic image, and the blank regions produced by the movement are filled by interpolation. Aligning only the contours cannot correct registration errors in contour-free regions; however, since contours mark abrupt color changes in the two-dimensional image, pixels in contour-free regions differ little from their neighbors, the color error caused by the position error is small, and no correction is needed there. The correction moves the contour points on the color image so that the color-image contours coincide with the gray-scale (intensity) image contours, the movement amount being computed from the line features. After a contour point moves, the pixels along its movement path must be re-assigned; they are obtained by distance interpolation from the non-contour pixels lying in the direction opposite to the movement, using the following formula:
color(p) = ( Σᵢ₌₁ⁿ C(pᵢ) / D(p, pᵢ) ) / ( Σᵢ₌₁ⁿ 1 / D(p, pᵢ) )

where color(p) is the color of a point p on the movement path, pᵢ is the position of the i-th non-contour point in the reverse (opposite-to-movement) direction, n is the total number of such points, D is the distance function outputting the distance between two points, and C outputs the color at an input position. This distance interpolation is smooth: the closer a pixel is to the point being interpolated, the larger its weight in the result.
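The inverse-distance fill can be sketched directly from the formula; the helper name `fill_color` and the flat lists of sample positions and colors are assumptions of this sketch.

```python
import numpy as np

def fill_color(p, sample_points, sample_colors):
    """Inverse-distance interpolation for a pixel p on the contour's
    movement path, from non-contour pixels in the reverse direction:

        color(p) = sum_i C(p_i)/D(p, p_i) / sum_i 1/D(p, p_i)

    Nearer samples get larger weights, giving a smooth fill.
    """
    p = np.asarray(p, dtype=float)
    pts = np.asarray(sample_points, dtype=float)
    cols = np.asarray(sample_colors, dtype=float)
    d = np.linalg.norm(pts - p, axis=1)
    if np.any(d == 0):              # exact hit: return that color as-is
        return cols[d == 0][0]
    w = 1.0 / d                     # inverse-distance weights
    return (w[:, None] * cols).sum(axis=0) / w.sum()
```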
The panorama and the point cloud are projected directly using the design values as initial extrinsic parameters, and are then registered by local correction. This avoids the cumbersome panoramic camera calibration steps and solves a problem traditional calibration cannot: different panoramic photos having different errors.
In another aspect of the embodiments of the present invention, there is provided a mobile measurement point cloud coloring apparatus 500, including: an obtaining module 501, configured to obtain panoramic information and point cloud information of a target object; a projection module 502, configured to project the panoramic information to form a planar panoramic image, and project the point cloud information to form a planar intensity image; a correction module 503, configured to correct the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image; and a color-giving module 504, configured to give color to the point cloud information according to the corrected planar panoramic image.
Illustratively, as shown in fig. 3, panoramic information and point cloud information of a target object are acquired by the acquisition module 501; the panoramic information is then projected by the projection module 502 to form a planar panoramic image, while the point cloud information is projected to form a planar intensity image. The correction module 503 then corrects the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image. Finally, the coloring module 504 colors the point cloud information according to the corrected planar panoramic image. In this way, the errors in the panoramic information caused by distortion can be effectively reduced and the coloring accuracy improved; meanwhile, the panorama and the point cloud are registered by automatic local correction, requiring only the design values of the panoramic camera's relative position and attitude, so that little manual intervention is needed and fully automatic coloring can be achieved without calibration.
Optionally, the projection module 502 may specifically be configured to establish a panoramic ball according to the panoramic information; projecting the panorama sphere to a projection plane of a first projection volume to form a planar panorama image; and establishing a point cloud intensity image according to the point cloud information.
Optionally, the projection module 502 may be further specifically configured to establish a panoramic ball according to the panoramic information; selecting a pixel on the first projection body, and enabling the pixel and the center of the panorama sphere to form a straight line; determining the intersection point of the straight line and the panoramic ball, and determining the corresponding azimuth angle according to the intersection point; determining an index position corresponding to the pixel on the panoramic ball according to the azimuth angle; assigning the color of the index position on the panorama sphere to the corresponding pixel to form a planar panorama image; and establishing a point cloud intensity image according to the point cloud information.
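The pixel-to-sphere sampling described above can be sketched as follows, assuming the panorama sphere is stored as an equirectangular image indexed by azimuth and elevation; the function name and the index rounding are choices of this sketch.

```python
import numpy as np

def sample_panorama(pixel_dir, pano, width, height):
    """Sample an equirectangular panorama for a cube-face pixel.

    pixel_dir: 3D direction from the sphere center through the pixel.
    The ray's intersection with the unit sphere gives azimuth and
    elevation, which index into the (height, width, 3) panorama image;
    the color found there is assigned to the cube-face pixel.
    """
    d = np.asarray(pixel_dir, dtype=float)
    d = d / np.linalg.norm(d)                    # point on the unit sphere
    azimuth = np.arctan2(d[1], d[0])             # [-pi, pi]
    elevation = np.arcsin(np.clip(d[2], -1, 1))  # [-pi/2, pi/2]
    col = int((azimuth + np.pi) / (2 * np.pi) * (width - 1))
    row = int((np.pi / 2 - elevation) / np.pi * (height - 1))
    return pano[row, col]
```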
optionally, the projection module 502 may be further specifically configured to establish a panoramic ball according to the panoramic information; projecting the panorama sphere to a projection plane of a first projection volume to form a planar panorama image; after the point cloud intensity image is projected to the projection plane of the second projection body, rasterizing the projection plane of the second projection body to form a plane intensity image; and the point intensity information in the point cloud intensity image corresponds to the gray scale information of the pixels in the projection plane of the second projection body.
Optionally, the projection module 502 may be further specifically configured to establish a panoramic ball according to the panoramic information; projecting the panorama sphere to a projection plane of a first projection volume to form a planar panorama image; after the point cloud intensity image is projected to the projection plane of the second projection body, rasterizing the projection plane of the second projection body to form a plane intensity image; and when a single pixel in the projection plane of the second projective body corresponds to a plurality of point intensity information in the point cloud intensity image, the gray scale information of the single pixel in the projection plane of the second projective body corresponds to an average value of the plurality of point intensity information corresponding to the gray scale information.
Optionally, the correction module 503 may be further specifically configured to extract image features of the planar intensity image and the planar panoramic image respectively; and correcting the planar panoramic image according to the planar intensity image and the image characteristics of the planar panoramic image, so that the corrected planar panoramic image forms a corrected planar panoramic image.
Optionally, the correction module 503 may be further specifically configured to determine corresponding line feature groups according to line features of the planar intensity image and the planar panoramic image, respectively; determining a first correction value of the planar panoramic image according to the corresponding line feature group; and correcting the planar panoramic image according to the first correction value, so that the corrected planar panoramic image forms a corrected planar panoramic image.
Optionally, the correction module 503 may be further specifically configured to determine corresponding line feature groups according to line features of the planar intensity image and the planar panoramic image, respectively; determining a first correction value of the planar panoramic image according to the corresponding line feature group; correcting the planar panoramic image according to the first correction value, and determining a corresponding contour feature group of the planar intensity image and the planar panoramic image according to the corresponding line feature group; determining a second correction value of the planar panoramic image according to the corresponding profile feature set; and correcting the planar panoramic image according to the second correction value of the planar panoramic image, so that the corrected planar panoramic image forms a corrected planar panoramic image.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus described above may refer to the corresponding process of the method in the foregoing method embodiment, and is not described in detail herein.
The embodiment of the invention also provides a processing device for executing instructions, which may be a server, a computer, or the like capable of executing the instruction-processing method described above.
As shown in fig. 4, the processing device may include a processor 31, a storage medium 32, and a bus (not shown in the figure). The storage medium 32 stores machine-readable instructions executable by the processor 31; when the device runs, the processor 31 communicates with the storage medium 32 through the bus and executes the machine-readable instructions to perform the method described above. The specific implementation and technical effects are similar and are not repeated here.
For ease of illustration, only one processor is depicted in the processing device described above. However, the device in the present invention may also include a plurality of processors, so the steps described as executed by one processor may also be executed jointly or separately by several processors. For example, if the device's processor executes steps A and B, the two steps may instead be executed by two different processors together or separately: a first processor performs step A and a second processor performs step B, or the first and second processors perform steps A and B together, and so on.
In some embodiments, a processor may include one or more processing cores (e.g., a single-core or multi-core processor). Merely by way of example, a processor may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction Set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a Microcontroller Unit (MCU), a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The embodiment of the present invention further provides a storage medium storing a computer program which, when executed by a processor, performs the instruction-processing method described above. The specific implementation and technical effects are similar and are not repeated here.
Alternatively, the storage medium may be a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or the like.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A mobile measurement point cloud coloring method is characterized by comprising the following steps:
acquiring panoramic information and point cloud information of a target object;
projecting the panoramic information to form a planar panoramic image, and projecting the point cloud information to form a planar intensity image;
correcting the planar panoramic image according to the planar intensity image to form a corrected planar panoramic image;
and coloring the point cloud information according to the corrected planar panoramic image.
2. The method of mobile measurement point cloud coloration as claimed in claim 1, wherein said projecting the panoramic information to form a planar panoramic image and projecting the point cloud information to form a planar intensity image comprises:
establishing a panoramic ball according to the panoramic information; projecting the panoramic ball toward a projection plane of a first projection volume to form the planar panoramic image;
establishing a point cloud intensity image according to the point cloud information; projecting the point cloud intensity image onto a projection plane of a second projection volume to form the planar intensity image.
3. The method of mobile measurement point cloud coloration as claimed in claim 2, wherein said projecting the panorama sphere to a projection plane of a first projection volume comprises:
selecting a pixel on the first projection body, and enabling the pixel and the center of the panoramic ball to form a straight line; determining the intersection point of the straight line and the panoramic ball; and assigning the color of the intersection point position on the panoramic ball to the corresponding pixel.
4. The method of mobile measurement point cloud coloration according to claim 2 or 3, wherein the first projection volume is a cube; the projection plane of the first projection volume includes six planes on the cube.
5. The method of mobile measurement point cloud coloration as claimed in claim 2, wherein said projecting the point cloud intensity image to a projection plane of the second projection volume to form the planar intensity image comprises:
after the point cloud intensity image is projected to the projection plane of the second projection, rasterizing the projection plane of the second projection to form the plane intensity image; wherein the point intensity information in the point cloud intensity image corresponds to the gray scale information of the pixels in the projection plane of the second projection volume.
6. The method of moving measurement point cloud coloring of claim 5, wherein when a single pixel in a projection plane of the second projection corresponds to a plurality of point intensity information in the point cloud intensity image, at least one piece of relevant point intensity information is determined from the plurality of point intensity information according to a preset range, and gray scale information of the single pixel corresponds to an average value of the at least one piece of relevant point intensity information.
7. The method of mobile measurement point cloud coloration of claim 1, wherein said correcting the planar panoramic image from the planar intensity image to form a corrected planar panoramic image comprises:
respectively extracting image characteristics of the plane intensity image and the plane panoramic image;
and correcting the planar panoramic image according to the planar intensity image and the image characteristics of the planar panoramic image, so that the corrected planar panoramic image forms the corrected planar panoramic image.
8. The mobile measurement point cloud colorization method of claim 7 wherein the image features are line features; the correcting the planar panorama image according to the planar intensity image and the image features of the planar panorama image includes:
determining corresponding line feature groups according to the line features of the plane intensity image and the plane panoramic image respectively;
determining a first correction value of the planar panoramic image according to the corresponding line feature group; and correcting the planar panoramic image according to the first correction value.
9. The mobile measurement point cloud colorization method of claim 8 wherein the image features further comprise contour features; the correcting the planar panoramic image according to the planar intensity image and the image features of the planar panoramic image further comprises:
determining a corresponding contour feature set of the plane intensity image and the plane panoramic image according to the corresponding line feature set;
determining a second correction value of the planar panoramic image according to the corresponding contour feature group; and correcting the planar panoramic image according to the second correction value.
10. A mobile measurement point cloud colorizing apparatus, comprising:
the acquisition module is used for acquiring panoramic information and point cloud information of the target object;
the projection module is used for projecting the panoramic information to form a planar panoramic image and projecting the point cloud information to form a planar intensity image;
the correction module is used for correcting the plane panoramic image according to the plane intensity image to form a corrected plane panoramic image;
and the coloring module is used for coloring the point cloud information according to the corrected planar panoramic image.
CN202010571647.5A 2020-06-19 2020-06-19 Mobile measurement point cloud coloring method and device Pending CN111724464A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010571647.5A CN111724464A (en) 2020-06-19 2020-06-19 Mobile measurement point cloud coloring method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010571647.5A CN111724464A (en) 2020-06-19 2020-06-19 Mobile measurement point cloud coloring method and device

Publications (1)

Publication Number Publication Date
CN111724464A true CN111724464A (en) 2020-09-29

Family

ID=72569837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010571647.5A Pending CN111724464A (en) 2020-06-19 2020-06-19 Mobile measurement point cloud coloring method and device

Country Status (1)

Country Link
CN (1) CN111724464A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115761045A (en) * 2022-11-21 2023-03-07 北京城市网邻信息技术有限公司 Household type graph generation method, device, equipment and storage medium
CN115861476A (en) * 2022-11-21 2023-03-28 北京城市网邻信息技术有限公司 Method, device and equipment for generating house type graph and storage medium
CN115904188A (en) * 2022-11-21 2023-04-04 北京城市网邻信息技术有限公司 Method and device for editing house-type graph, electronic equipment and storage medium
CN115761045B (en) * 2022-11-21 2023-08-18 北京城市网邻信息技术有限公司 House pattern generation method, device, equipment and storage medium
CN115861476B (en) * 2022-11-21 2023-10-13 北京城市网邻信息技术有限公司 House pattern generation method, device, equipment and storage medium
CN115904188B (en) * 2022-11-21 2024-05-31 北京城市网邻信息技术有限公司 Editing method and device for house type diagram, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109872397B (en) Three-dimensional reconstruction method of airplane parts based on multi-view stereo vision
CN110276808B (en) Method for measuring unevenness of glass plate by combining single camera with two-dimensional code
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN111724464A (en) Mobile measurement point cloud coloring method and device
US7079679B2 (en) Image processing apparatus
CN112053432B (en) Binocular vision three-dimensional reconstruction method based on structured light and polarization
CN107993263B (en) Automatic calibration method for panoramic system, automobile, calibration device and storage medium
CN111563921B (en) Underwater point cloud acquisition method based on binocular camera
CN109727290B (en) Zoom camera dynamic calibration method based on monocular vision triangulation distance measurement method
Douxchamps et al. High-accuracy and robust localization of large control markers for geometric camera calibration
KR100681320B1 (en) Method for modelling three dimensional shape of objects using level set solutions on partial difference equation derived from helmholtz reciprocity condition
WO2000000926A1 (en) Method and apparatus for capturing stereoscopic images using image sensors
CN113205592B (en) Light field three-dimensional reconstruction method and system based on phase similarity
CN110349257B (en) Phase pseudo mapping-based binocular measurement missing point cloud interpolation method
CN111649694B (en) Implicit phase-parallax mapping binocular measurement missing point cloud interpolation method
CN108917640A (en) A kind of laser blind hole depth detection method and its system
CN112489193B (en) Three-dimensional reconstruction method based on structured light
CN208254424U (en) A kind of laser blind hole depth detection system
CN116363226A (en) Real-time multi-camera multi-projector 3D imaging processing method and device
CN111968182B (en) Calibration method for nonlinear model parameters of binocular camera
GB2569609A (en) Method and device for digital 3D reconstruction
CN110458951B (en) Modeling data acquisition method and related device for power grid pole tower
CN116894907A (en) RGBD camera texture mapping optimization method and system
CN115631317B (en) Tunnel lining ortho-image generation method and device, storage medium and terminal
CN114596355B (en) High-precision pose measurement method and system based on cooperative targets

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination