CN111007485B - Image processing method and device and computer storage medium

Image processing method and device and computer storage medium

Info

Publication number
CN111007485B
Authority
CN
China
Prior art keywords
point cloud
main
plane
auxiliary
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010155209.0A
Other languages
Chinese (zh)
Other versions
CN111007485A (en)
Inventor
陈旭
曾元一
王劲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ciic Technology Co ltd
Original Assignee
Ciic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ciic Technology Co ltd filed Critical Ciic Technology Co ltd
Priority to CN202010155209.0A
Publication of CN111007485A
Application granted
Publication of CN111007485B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the application discloses an image processing method, an image processing device and a computer storage medium, wherein the method comprises the following steps: determining a reference measurement area corresponding to a laser radar carrier; measuring the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image and an auxiliary point cloud image; determining a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area; respectively performing plane fitting on point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image; and aligning the main fitting plane and the auxiliary fitting plane so as to realize calibration between the main laser radar and the auxiliary laser radar. The scheme can complete the calibration of the laser radars by making use of the surrounding environment of the laser radar carrier, thereby improving the calibration efficiency among multiple laser radars.

Description

Image processing method and device and computer storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, and a computer storage medium.
Background
A laser radar is a radar system that detects characteristic quantities of a target, such as its position and speed, by emitting laser beams. Its working principle is to emit a detection signal toward the target and then compare the received signal reflected back from the target with the emitted signal to obtain parameter information about the target, thereby realizing detection, tracking and identification of the target.
In view of its advantage in spatial information acquisition, the laser radar is widely used in the field of unmanned driving. However, when an unmanned carrier carries more than one laser radar, calibration among the multiple laser radars is required in order to improve the accuracy of target detection. Existing methods for calibrating multiple laser radars usually require special calibration equipment such as spherical calibration objects, so the calibration preparation period is long and the calibration efficiency is low.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device and a computer storage medium, which can improve the efficiency of calibration among multiple laser radars.
An embodiment of the present application provides an image processing method, including:
determining a reference measurement area corresponding to a laser radar carrier, wherein the reference measurement area comprises a plurality of reference measurement planes;
measuring the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar;
determining a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area;
respectively carrying out plane fitting on point cloud data corresponding to target reference measurement planes in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image;
aligning the main fitting plane and the auxiliary fitting plane so as to achieve calibration between the main laser radar and the auxiliary laser radar.
Correspondingly, an embodiment of the present application further provides an image processing apparatus, including:
the region determining module is used for determining a reference measurement area corresponding to a laser radar carrier, wherein the reference measurement area comprises a plurality of reference measurement planes;
the measuring module is used for measuring the reference measuring area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar;
the plane determining module is used for determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area;
the fitting module is used for respectively performing plane fitting on point cloud data corresponding to target reference measurement planes in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image;
and the alignment module is used for aligning the main fitting plane and the auxiliary fitting plane so as to realize the calibration between the main laser radar and the auxiliary laser radar.
Optionally, in some embodiments, the fitting module may include a determination submodule, a first fitting submodule, and a second fitting submodule, as follows:
the determining submodule is used for determining point cloud data corresponding to a target reference measuring plane in the main point cloud image;
the first fitting submodule is used for performing plane fitting on the point cloud data based on the distribution condition of the point cloud data to obtain a main fitting plane corresponding to the main point cloud image;
and the second fitting submodule is used for performing plane fitting on the point cloud data corresponding to the target reference measurement plane in the auxiliary point cloud image to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image.
At this time, the first fitting sub-module may be specifically configured to select a plurality of selected points from the point cloud data when the point cloud data is sparsely distributed, and determine a main fitting plane corresponding to the main point cloud image based on the selected points.
At this time, the first fitting sub-module may be specifically configured to determine a plane area for framing the point cloud data in the main point cloud image when the point cloud data are densely distributed and a distribution range of the point cloud data meets a first distribution condition, and determine a main fitting plane corresponding to the main point cloud image based on the plane area.
At this time, the first fitting sub-module may be specifically configured to determine a target reference point from a plurality of points of the point cloud data when the point cloud data is densely distributed and a distribution range of the point cloud data satisfies a second distribution condition, determine a search area corresponding to the target reference point in the main point cloud image, update a point satisfying the fitting condition in the search area to the target reference point, and return to the step of determining the search area corresponding to the target reference point in the main point cloud image until a main fitting plane corresponding to the main point cloud image is determined.
Optionally, in some embodiments, the image processing apparatus may further include a returning module, where the returning module is specifically configured to return to perform the step of determining the target reference measurement plane from among the plurality of reference measurement planes corresponding to the reference measurement area until all of the plurality of reference measurement planes in the reference measurement area are fitted, so as to obtain main fitting planes corresponding to the plurality of reference measurement planes in the main point cloud image and auxiliary fitting planes corresponding to the plurality of reference measurement planes in the auxiliary point cloud image.
Optionally, in some embodiments, the image processing apparatus may further include a scanning module, where the scanning module is specifically configured to acquire first scanning data and second scanning data of the main lidar and the sub lidar after scanning the same object, and when a difference between the first scanning data and the second scanning data is not smaller than a preset threshold, return to performing the step of determining the target reference measurement plane from among the plurality of reference measurement planes corresponding to the reference measurement area.
At this time, the alignment module may be specifically configured to obtain a transformation relationship between the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar based on the main fitting plane and the auxiliary fitting plane, determine a reference coordinate system serving as a transformation reference from the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar, and align the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar based on the transformation relationship and the reference coordinate system, so as to implement calibration between the main laser radar and the auxiliary laser radar.
In addition, a storage medium is provided, where a plurality of instructions are stored, and the instructions are suitable for being loaded by a processor to perform the steps in any one of the image processing methods provided in the embodiments of the present application.
The method and the device can determine a reference measurement area corresponding to a laser radar carrier, where the reference measurement area comprises a plurality of reference measurement planes; measure the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar; determine a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area; respectively perform plane fitting on the point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image; and align the main fitting plane and the auxiliary fitting plane so as to realize calibration between the main laser radar and the auxiliary laser radar. The scheme can complete the calibration of the laser radars by making use of the surrounding environment of the laser radar carrier, thereby improving the calibration efficiency among multiple laser radars.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic view of a scene of an image processing system provided in an embodiment of the present application;
FIG. 2 is a first flowchart of an image processing method provided in an embodiment of the present application;
FIG. 3 is a second flowchart of an image processing method provided by an embodiment of the present application;
FIG. 4 is a third flowchart of an image processing method provided in an embodiment of the present application;
fig. 5 is a fourth flowchart of an image processing method provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a reference measurement area provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a plane fitting method provided by an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a network device according to an embodiment of the present application.
Detailed description of the invention
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
In the description that follows, specific embodiments of the present application will be described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. Accordingly, these steps and operations will at times be referred to as being performed by a computer, meaning that a processing unit of the computer operates on electronic signals representing data in a structured form. This operation transforms the data or maintains it at locations in the computer's memory system, which reconfigures or otherwise alters the operation of the computer in a manner well known to those skilled in the art. The data is maintained in data structures, that is, physical locations of the memory that have particular characteristics defined by the data format. However, although the principles of the application are described in the specific language above, this is not intended as a limitation to the specific forms set forth herein, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The term "module" as used herein may be considered a software object executing on the computing system. The different components, modules, engines, and services described herein may be considered as implementation objects on the computing system. The apparatus and method described herein may be implemented in software, or may be implemented in hardware, and are within the scope of the present application.
The terms "first", "second", and "third", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules listed, but rather, some embodiments may include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiment of the present application provides an image processing method, and an execution subject of the image processing method may be the image processing apparatus provided in the embodiment of the present application, or a network device integrated with the image processing apparatus, where the image processing apparatus may be implemented by a hardware or software method. The network device may be a smart phone, a tablet computer, a palm computer, a notebook computer, or a desktop computer. Network devices include, but are not limited to, computers, network hosts, a single network server, multiple sets of network servers, or a cloud of multiple servers.
The image processing method provided by the embodiment of the application can be applied to the field of unmanned driving. Unmanned driving means that driving tasks can be guided and decided without the physical driving operations of a human driver, and the driver's control behavior is replaced, so that the vehicle can travel safely. A vehicle in an unmanned system runs entirely under a communication-based control system, so that operations which originally required manual participation can be carried out automatically in an unmanned state.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of an image processing method provided in an embodiment of the present application, taking an example that an image processing apparatus is integrated in a network device, the network device may determine a reference measurement area corresponding to a laser radar carrier, where the reference measurement area includes a plurality of reference measurement planes, and based on a primary laser radar and a secondary laser radar corresponding to the laser radar carrier, measure the reference measurement area to obtain a primary point cloud image corresponding to the primary laser radar and a secondary point cloud image corresponding to the secondary laser radar, determine a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area, perform plane fitting on point cloud data corresponding to the target reference measurement plane in the primary point cloud image and the secondary point cloud image respectively to obtain a primary fitting plane corresponding to the primary point cloud image and a secondary fitting plane corresponding to the secondary point cloud image, and aligning the main fitting plane and the auxiliary fitting plane so as to realize the calibration between the main laser radar and the auxiliary laser radar.
Referring to fig. 2, fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application, which is specifically described by the following embodiments:
201. Determining a reference measurement area corresponding to the laser radar carrier.
The laser radar carrier may be any object capable of carrying laser radars; it may carry not only a single laser radar but also a laser radar system composed of a plurality of laser radars. For example, various vehicles equipped with a laser radar may be referred to as laser radar carriers.
The measuring area can be an area which can be measured by the laser radar, and the laser radar can measure the measuring area to obtain a point cloud image corresponding to the measuring area. When the point cloud images have different purposes, different types of measurement areas can be selected correspondingly for measurement, for example, when calibration between different laser radars is required, the measurement areas can be determined as three mutually perpendicular planes in space.
In practical application, when calibration between different laser radars is required, in order to facilitate subsequent steps such as plane alignment, a measurement area including a plurality of reference measurement planes can be used as a reference measurement area corresponding to a laser radar carrier. The reference measuring plane may be a rigid plane located in the reference measuring area, and the size, type, and the like of the reference measuring plane may not be limited, for example, in order to improve the accuracy of radar calibration, the reference measuring plane may be determined to be a plane that is not easily bent, such as a wall surface and a ground surface.
In an embodiment, the image processing method can be applied to various vehicle-mounted multi-laser radar systems, and because such a vehicle-mounted multi-laser radar system comprises more than one laser radar, calibration needs to be carried out among the multiple laser radars. In order to improve the flexibility of the image processing method, the reference measurement area can be determined as an area which is easily obtained in practical application and meets the requirement of accuracy; for example, three mutually perpendicular planes in space can be used as the reference measurement planes, and the area formed by these reference measurement planes is used as the reference measurement area. As shown in fig. 6, the wall surface A, the wall surface B, and the ground surface C may be used as reference measurement planes, and the area including the wall surface A, the wall surface B, and the ground surface C may be used as the reference measurement area.
The selected reference measurement area is an area which is easy to obtain in practical application, such as a wall surface and a ground surface, so that the image processing method can calibrate the multiple laser radars at any time by combining the surrounding environment of the laser radar carrier without specially preparing the reference measurement area.
In an embodiment, the number of the reference measurement planes in the reference measurement area is not limited, and the reference measurement area including different numbers of reference measurement planes can be selected according to the difference of the actual application conditions, for example, the number of the reference measurement planes in the reference measurement area can be reduced under the condition that the calculation amount needs to be reduced and the requirement on accuracy is not high; for another example, when the accuracy needs to be improved, the number of the reference measurement planes in the reference measurement area can be increased; for another example, when there are only two reference measurement planes suitable for lidar calibration at the location of the lidar carrier, an area including the two reference measurement planes may also be used as the reference measurement area, and so on.
In an embodiment, in order to improve the flexibility of the image processing method, the reference measurement planes included in the reference measurement area may also be planes that are not perpendicular to each other, for example, when there are no planes that are perpendicular to each other in the location of the lidar carrier, the planes that are not perpendicular to each other and not coplanar may be used as the reference measurement planes to perform calibration of the lidar.
202. Measuring the reference measurement area based on the main laser radar and the auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar.
The point cloud image is a massive point set which expresses target space distribution and target surface characteristics under the same space reference system, and can be obtained by measuring through a measuring instrument. For example, the point cloud image may be an image obtained by laser radar measurement, and the point cloud image includes a plurality of points obtained by scanning, and the points describe information of a target object measured by the laser radar.
Since the laser radar carrier includes a plurality of laser radars, they can be divided into a main laser radar and auxiliary laser radars according to the laser radar type, the distribution position of the laser radars on the laser radar carrier, and the like. For example, an n-line laser radar in the vehicle-mounted laser radar system may be determined as the main laser radar, and an m-line laser radar located on the left side of the laser radar carrier may be determined as the auxiliary laser radar.
In an embodiment, the number of the primary lidar and the secondary lidar on the lidar carrier may not be limited, for example, n-line lidar in the vehicle-mounted lidar system may be determined as the primary lidar, and a plurality of m-line lidar located at two sides of the lidar carrier may be determined as the secondary lidar.
In practical applications, for example, when the laser radar carrier includes an n-line laser radar as the main laser radar and an m-line laser radar as the sub laser radar located on the left side of the laser radar carrier, the area shown in fig. 6 may be used as a reference measurement area, and the reference measurement area includes a wall surface a, a wall surface B, and a ground surface C. And then measuring the reference measurement area through the main laser radar and the auxiliary laser radar respectively to obtain a main point cloud image obtained after the main laser radar is measured and an auxiliary point cloud image obtained after the auxiliary laser radar is measured.
In an embodiment, since the lidar on the lidar carrier is not necessarily located on the same side of the lidar carrier, part of the lidar may not be able to measure the reference measurement area, e.g. the lidar on the right side of the lidar carrier may not be able to measure the reference measurement area on the left side of the lidar carrier, or even if the measurement is possible, the effect is not good. At this time, the area that can be measured by the laser radar may be determined as the reference measurement area, that is, the number of the reference measurement areas may be more than one.
For example, when the laser radar carrier includes an n-line laser radar as a main laser radar, an m-line laser radar as a first sub laser radar located on the left side of the laser radar carrier, and an m-line laser radar as a second sub laser radar located on the right side of the laser radar carrier, the area shown in fig. 6 may be used as a first reference measurement area located on the left side of the laser radar carrier, including wall surface a, wall surface B, and ground surface C, and a second reference measurement area located on the right side of the laser radar carrier, including wall surface a ', wall surface B ', and ground surface C ', is determined.
And then, measuring the first reference measurement area through the main laser radar and the first auxiliary laser radar respectively to obtain a first main point cloud image obtained after the main laser radar is measured and a first auxiliary point cloud image obtained after the first auxiliary laser radar is measured. And measuring the second reference measurement area through the main laser radar and the second auxiliary laser radar to obtain a second main point cloud image obtained after the main laser radar is measured and a second auxiliary point cloud image obtained after the second auxiliary laser radar is measured.
203. Determining a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area.
In practical applications, for example, as shown in fig. 6, when the reference measurement area includes three reference measurement planes, i.e., a wall surface a, a wall surface B, and a floor surface C, one reference measurement plane can be selected as a target reference measurement plane for the subsequent steps.
In an embodiment, since the ground is easily obtained and the features are obvious in practical application, a reference measurement area including the ground is usually selected, and at this time, a reference measurement plane corresponding to the ground can be first used as a target reference measurement plane, so that the accuracy of a subsequent plane fitting step can be improved.
204. Respectively performing plane fitting on the point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image.
In practical applications, for example, when the reference measurement area includes three reference measurement planes, i.e., a wall surface a, a wall surface B, and a ground surface C, and the ground surface C is determined as a target reference measurement plane, a main point cloud image obtained after measurement by the main lidar and an auxiliary point cloud image obtained after measurement by the auxiliary lidar are obtained. The main point cloud image comprises a plurality of points obtained after measuring the ground C, and the auxiliary point cloud image also comprises a plurality of points obtained after measuring the ground C. At this time, the points corresponding to the ground C in the main point cloud image can be fitted through a plane fitting method to obtain a main fitting plane corresponding to the ground C in the main point cloud image, the points corresponding to the ground C in the auxiliary point cloud image are fitted, and an auxiliary fitting plane corresponding to the ground C in the auxiliary point cloud image is obtained through fitting.
In an embodiment, since the distribution of the point cloud data in the point cloud image is different, it may be determined to perform plane fitting by using an appropriate method according to the distribution of the point cloud data in the point cloud image. Specifically, the step of performing plane fitting on the point cloud data corresponding to the target reference measurement plane in the primary point cloud image and the secondary point cloud image respectively to obtain a primary fitting plane corresponding to the primary point cloud image and a secondary fitting plane corresponding to the secondary point cloud image may include:
determining point cloud data corresponding to a target reference measuring plane in the main point cloud image;
performing plane fitting on the point cloud data based on the distribution condition of the point cloud data to obtain a main fitting plane corresponding to the main point cloud image;
and performing plane fitting on the point cloud data corresponding to the target reference measurement plane in the auxiliary point cloud image to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image.
In practical application, for example, when the reference measurement area includes three reference measurement planes, namely a wall surface a, a wall surface B and a ground surface C, and the ground surface C is determined as the target reference measurement plane, the main point cloud image obtained after the main laser radar measurement is obtained at this time, and the point cloud data corresponding to the ground surface C can be determined from the main point cloud image. And then, selecting a proper method to perform plane fitting on the point cloud data through judging the distribution condition of the point cloud data corresponding to the ground C to obtain a main fitting plane corresponding to the main point cloud image. And determining point cloud data corresponding to the ground C from the obtained auxiliary point cloud image obtained after the auxiliary laser radar is measured, and then selecting a proper method to perform plane fitting on the point cloud data through judging the distribution condition of the point cloud data corresponding to the ground C to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image.
In one embodiment, when the point cloud data is sparsely distributed, a plane can be fitted by a point selection method. Specifically, the step "performing plane fitting on the point cloud data based on the distribution of the point cloud data to obtain a main fitting plane corresponding to the main point cloud image" may include:
when the point cloud data is sparsely distributed, selecting a plurality of selected points from the point cloud data;
and determining a main fitting plane corresponding to the main point cloud image based on the selected point.
In practical application, for example, when the reference measurement area includes three reference measurement planes, namely a wall surface a, a wall surface B and a ground surface C, and the ground surface C is determined as the target reference measurement plane, the main point cloud image obtained after the main laser radar measurement is obtained at this time, and the point cloud data corresponding to the ground surface C can be determined from the main point cloud image. As shown in fig. 7, through analysis of the point cloud data corresponding to the ground C, when the point cloud data is found to be sparsely distributed, three non-collinear points can be directly selected from the points corresponding to the ground C in the main point cloud image as selected points, and plane fitting is performed according to the selected points to obtain a main fitting plane.
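For illustration only, the point-selection fitting described above may be sketched as follows in Python (the function name and tolerance are assumptions, not part of the claimed method); the plane through three non-collinear selected points is obtained from the cross product of two edge vectors:

```python
import numpy as np

def plane_from_three_points(p1, p2, p3):
    """Fit a plane through three non-collinear selected points.

    Returns (normal, d) such that normal . x + d = 0 for points x on the plane."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)   # normal of the plane spanned by the two edges
    norm = np.linalg.norm(normal)
    if norm < 1e-9:
        raise ValueError("selected points are (nearly) collinear")
    normal /= norm                        # unit normal
    d = -normal.dot(p1)                   # plane offset
    return normal, d

# Example: three points picked from the ground give a plane close to z = 0.
n, d = plane_from_three_points([0, 0, 0], [1, 0, 0.01], [0, 1, -0.02])
```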
In an embodiment, the number of the selected points in the point cloud image may not be limited, for example, to improve the accuracy of plane fitting in the point cloud image, the number of the selected points may also be increased.
In one embodiment, the sparse or dense point cloud distribution may be determined by calculating the number of points in the region per unit area in the point cloud image. When the number of points in each unit area region in the point cloud image exceeds a preset value, the point cloud can be considered to be densely distributed; when the number of points in each unit area region in the point cloud image does not exceed a preset value, the point cloud distribution can be considered sparse. In the embodiment of the present application, a method for determining whether the point cloud distribution is dense or sparse is not limited.
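As a rough sketch of such a density test (the cell size and per-cell threshold below are illustrative assumptions, not values prescribed by the embodiment):

```python
import numpy as np

def is_densely_distributed(points, cell_size=0.2, min_points_per_cell=10):
    """Classify point cloud data as densely or sparsely distributed by counting
    points per unit-area cell after projecting onto the x-y plane."""
    points = np.asarray(points, dtype=float)
    cells = np.floor(points[:, :2] / cell_size).astype(int)   # 2-D grid cell of each point
    _, counts = np.unique(cells, axis=0, return_counts=True)
    # Dense if the occupied cells hold, on average, at least the preset number of points.
    return counts.mean() >= min_points_per_cell
```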
In an embodiment, in order to improve the accuracy of plane fitting in the point cloud image, when the distribution of the point cloud data corresponding to the target reference measurement plane in the point cloud image is dense and the distribution range is wide, plane fitting may be performed by a method such as frame selection. Specifically, the step "performing plane fitting on the point cloud data based on the distribution of the point cloud data to obtain a main fitting plane corresponding to the main point cloud image" may include:
when the point cloud data are densely distributed and the distribution range of the point cloud data meets a first distribution condition, determining a plane area for framing the point cloud data in the main point cloud image;
and determining a main fitting plane corresponding to the main point cloud image based on the plane area.
The first distribution condition may be a condition for determining a distribution range of the point cloud data, for example, the first distribution condition may be that the distribution range of the point cloud data is wide, and therefore, when the distribution range of the point cloud data satisfies the distribution condition, it may be stated that the distribution range of the point cloud data is wide at this time. For example, since an area with too low point cloud distribution density can be ignored, the distribution range of the point cloud data can be determined according to the distribution density of the point cloud data. Identifying a region of which the distribution density of the point cloud data exceeds a preset density as a distribution region of the point cloud data, and when the area of the distribution region exceeds the preset area, considering that the distribution range of the point cloud data is wider and meets a first distribution condition; accordingly, when the area of the distribution region does not exceed the preset area, the distribution range of the point cloud data may be considered to be narrow, and the first distribution condition may not be satisfied. The method for judging whether the point cloud data distribution range meets the first distribution condition is not limited in the embodiment of the application.
In practical application, for example, when the reference measurement area includes three reference measurement planes, namely the wall surface A, the wall surface B and the ground surface C, and the ground surface C is determined as the target reference measurement plane, the main point cloud image obtained after measurement by the main laser radar is acquired, and the point cloud data corresponding to the ground surface C can be determined from the main point cloud image. Through analysis of the point cloud data corresponding to the ground surface C, when the point cloud data are found to be densely distributed and the distribution range meets the first distribution condition, a plane area can be framed in the main point cloud image so that as many points of the point cloud data as possible fall within it, and the main fitting plane corresponding to the main point cloud image is determined according to the plane area.
In an embodiment, the shape of the plane area obtained by framing may vary; for example, the points in the point cloud image may be framed by a regular rectangular frame or a circular frame, or an irregular plane area may be framed manually so that as many points as possible fall within it.
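The frame-selection approach may be sketched as follows, assuming for illustration that the framed plane area is an axis-aligned box and that the plane is then fitted to the framed points by least squares (the embodiment does not prescribe a specific fitting routine):

```python
import numpy as np

def fit_plane_in_box(points, box_min, box_max):
    """Fit a plane to the points that fall inside the framed (boxed) plane area.

    Returns (normal, d) of the least-squares plane normal . x + d = 0."""
    points = np.asarray(points, dtype=float)
    inside = np.all((points >= box_min) & (points <= box_max), axis=1)
    framed = points[inside]
    centroid = framed.mean(axis=0)
    # The right singular vector of the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(framed - centroid)
    normal = vt[-1]
    return normal, -normal.dot(centroid)
```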
In an embodiment, since the fitting of the plane by using the framing method is not accurate enough when the point cloud data is distributed narrowly, in order to improve the accuracy of the plane fitting, the plane fitting may be performed by using a filtering method for filtering the point cloud data in the search area. Specifically, the step "performing plane fitting on the point cloud data based on the distribution of the point cloud data to obtain a main fitting plane corresponding to the main point cloud image" may include:
when the point cloud data are densely distributed and the distribution range of the point cloud data meets a second distribution condition, determining a target reference point from a plurality of points of the point cloud data;
determining a search area corresponding to the target reference point in the main point cloud image;
updating the points meeting the fitting condition in the search area as target reference points;
and returning to the step of determining the search area corresponding to the target reference point in the main point cloud image until a main fitting plane corresponding to the main point cloud image is determined.
The second distribution condition may be a condition for determining a distribution range of the point cloud data, for example, the second distribution condition may include a narrow distribution range of the point cloud data, or a situation that the point cloud data is blocked by other point cloud data in an operation interface view of the point cloud image.
In practical application, for example, when the reference measurement area includes three reference measurement planes, namely a wall surface a, a wall surface B and a ground surface C, and the ground surface C is determined as the target reference measurement plane, the main point cloud image obtained after the main laser radar measurement is obtained at this time, and the point cloud data corresponding to the ground surface C can be determined from the main point cloud image. Through analysis of the point cloud data corresponding to the ground C, when the point cloud data are found to be densely distributed and the distribution range meets a second distribution condition, one point meeting the plane fitting requirement can be selected from the point cloud data corresponding to the ground C to serve as a target reference point, then a search area is determined according to the position of the target reference point in the point cloud image, a plurality of points meeting the plane fitting requirement are determined in the point cloud data located in the search area, and the points are updated to be the target reference points. And then, continuously determining a search area and target reference points according to the target reference points, and finally performing plane fitting based on the obtained multiple target reference points to obtain a main fitting plane corresponding to the main point cloud image.
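A compact sketch of this search-area procedure, assuming for illustration that the fitting condition is a distance threshold to the plane fitted through the reference points collected so far (the radius, tolerance and function names are assumptions, not part of the patented method):

```python
import numpy as np

def grow_plane(points, seed_index, search_radius=0.5, fit_tol=0.05, max_iters=50):
    """Iteratively expand a set of target reference points: search around the
    current references, keep neighbours consistent with the current plane
    estimate, and refit until the reference set stops growing."""
    points = np.asarray(points, dtype=float)
    refs = {int(seed_index)}
    for _ in range(max_iters):
        ref_pts = points[list(refs)]
        centroid = ref_pts.mean(axis=0)
        normal = None
        if len(refs) >= 3:
            _, _, vt = np.linalg.svd(ref_pts - centroid)
            normal = vt[-1]                      # current plane estimate
        # Search area: every point within `search_radius` of a current reference.
        dists = np.linalg.norm(points[:, None, :] - ref_pts[None, :, :], axis=2)
        candidates = np.where(dists.min(axis=1) <= search_radius)[0]
        new_refs = set(refs)
        for i in candidates:
            # Fitting condition: the candidate lies close to the current plane.
            if normal is None or abs(normal.dot(points[i] - centroid)) <= fit_tol:
                new_refs.add(int(i))
        if new_refs == refs:                     # no new reference points found
            break
        refs = new_refs
    ref_pts = points[list(refs)]
    centroid = ref_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(ref_pts - centroid)
    normal = vt[-1]
    return normal, -normal.dot(centroid)         # plane: normal . x + d = 0
```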
In an embodiment, the shape of the search area may be various, for example, the search area may be a circular area with a preset size as a radius and a target reference point as a center, wherein the direction of the search area may be determined by manual observation, for example, a coordinate system may be defined in the point cloud image, and when the point cloud data is mostly distributed in the xoy direction, the direction of the search area may be defined as the xoy direction. The search area can also be an area which is marked manually and has an irregular shape.
In an embodiment, the case that the distribution of the point cloud data satisfies the second distribution condition may not only be limited to the long and narrow region, but also represent a case that the distribution range of the point cloud data is narrow due to occlusion by other point cloud data in the point cloud image, and the like.
In an embodiment, each time the search area is determined, the direction of the search area may be adjusted according to the distribution of the point cloud data in the point cloud image. For example, when the laser radar carrier is a vehicle, the direction of the head of the vehicle can be set as the positive direction of the x-axis, the vertically upward normal direction of the ground where the vehicle is located as the positive direction of the z-axis, and the direction on the left side of the vehicle perpendicular to both the x-axis and the z-axis as the positive direction of the y-axis. When the distribution of the point cloud data in the point cloud image is close to a plane perpendicular to the ground, the search range of the search area in the y-axis direction can be reduced, and the search ranges in the x-axis and z-axis directions can be increased. For another example, when the distribution of the point cloud data in the point cloud image is close to a plane parallel to the ground, the search range of the search area in the z-axis direction can be reduced, and the search ranges in the x-axis and y-axis directions can be increased.
In an embodiment, the fitting of the main fitting plane in the main point cloud image is not limited to a single method. For example, when the main fitting plane corresponding to the main point cloud image needs to be obtained, plane fitting by the point selection method, the frame selection method, and the filtering method can be selected flexibly according to actual conditions; that is, in the process of determining one fitting plane, multiple methods can be used flexibly. For example, the filtering method can be used to determine multiple target reference points and then the point selection method is used to fit the plane, or the target reference points are determined by the frame selection method and then the plane is fitted by the point selection method, and the like.
For example, when the reference measurement area includes three reference measurement planes, i.e., a wall surface a, a wall surface B, and a ground surface C, and the ground surface C is determined as the target reference measurement plane, the main point cloud image obtained after the main lidar measurement is obtained, and the point cloud data corresponding to the ground surface C can be determined from the main point cloud image. Through analysis of the point cloud data corresponding to the ground C, when the point cloud data are found to be densely distributed and the distribution range meets a second distribution condition, one point meeting the plane fitting requirement can be selected from the point cloud data corresponding to the ground C as a target reference point, then a search area is determined according to the position of the target reference point in the point cloud image, then a plurality of points meeting the plane fitting requirement can be determined in the point cloud data in the search area through a point selection method, and plane fitting is carried out based on the points meeting the plane fitting requirement.
In an embodiment, since each point cloud image includes point cloud data corresponding to a plurality of reference measurement planes, a plurality of plane fitting operations may be performed. Specifically, after the step of performing plane fitting on the point cloud data corresponding to the target reference measurement plane in the primary point cloud image and the secondary point cloud image respectively to obtain a primary fitting plane corresponding to the primary point cloud image and a secondary fitting plane corresponding to the secondary point cloud image, the method may further include:
and returning to the step of determining a target reference measuring plane from the plurality of reference measuring planes corresponding to the reference measuring area until the plurality of reference measuring planes in the reference measuring area are fitted, so as to obtain a main fitting plane corresponding to the plurality of reference measuring planes in the main point cloud image and a secondary fitting plane corresponding to the plurality of reference measuring planes in the secondary point cloud image.
In practical applications, for example, after the reference measurement area includes three reference measurement planes, i.e., a wall surface a, a wall surface B, and a ground surface C, and the ground surface C is determined as a target reference measurement plane, a main fitting plane corresponding to the ground surface C in the main point cloud image and an auxiliary fitting plane corresponding to the ground surface C in the auxiliary point cloud image are obtained. Then, the target reference measurement plane may be determined again, for example, the wall surface a is determined as the target reference measurement plane, and a main fitting plane corresponding to the wall surface a in the main point cloud image and a sub fitting plane corresponding to the wall surface a in the sub point cloud image are obtained. And then, the target reference measurement plane can be continuously determined again, for example, the wall surface B is determined as the target reference measurement plane, and a main fitting plane corresponding to the wall surface B in the main point cloud image and an auxiliary fitting plane corresponding to the wall surface B in the auxiliary point cloud image are obtained.
In an embodiment, in order to improve the flexibility of the image processing method in the embodiment of the present application, the method of performing plane fitting according to the point cloud data in the point cloud image is not limited to the three methods described above, and plane fitting may be performed by using a plane fitting method such as Universal-RANSAC or least square method.
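For completeness, a small numpy sketch of a basic RANSAC-style plane fit of the kind mentioned above (the iteration count and inlier tolerance are illustrative assumptions):

```python
import numpy as np

def ransac_plane(points, n_iters=200, inlier_tol=0.03, rng=None):
    """Fit a plane with a basic RANSAC loop: repeatedly sample three points,
    build a candidate plane, and keep the one with the most inliers."""
    rng = np.random.default_rng(rng)
    points = np.asarray(points, dtype=float)
    best_normal, best_d, best_count = None, None, -1
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        normal /= norm
        d = -normal.dot(sample[0])
        count = np.sum(np.abs(points.dot(normal) + d) <= inlier_tol)
        if count > best_count:
            best_normal, best_d, best_count = normal, d, count
    return best_normal, best_d
```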
205. Aligning the main fitting plane and the auxiliary fitting plane so as to realize the calibration between the main laser radar and the auxiliary laser radar.
In practical applications, for example, when the reference measurement area includes three reference measurement planes, i.e., a wall surface a, a wall surface B, and a ground surface C, a main fitting plane corresponding to the ground surface C in the main point cloud image, an auxiliary fitting plane corresponding to the ground surface C in the auxiliary point cloud image, a main fitting plane corresponding to the wall surface a in the main point cloud image, an auxiliary fitting plane corresponding to the wall surface a in the auxiliary point cloud image, a main fitting plane corresponding to the wall surface B in the main point cloud image, and an auxiliary fitting plane corresponding to the wall surface B in the auxiliary point cloud image can be obtained.
Through calculation, the main fitting plane corresponding to the ground C in the main point cloud image and the auxiliary fitting plane corresponding to the ground C in the auxiliary point cloud image can be made parallel to each other; the main fitting plane corresponding to the wall surface A in the main point cloud image and the auxiliary fitting plane corresponding to the wall surface A in the auxiliary point cloud image can be made parallel to each other; and the main fitting plane corresponding to the wall surface B in the main point cloud image and the auxiliary fitting plane corresponding to the wall surface B in the auxiliary point cloud image can be made parallel to each other. When the fitting planes corresponding to all the reference measurement planes in the point cloud images are parallel to each other and the translation calculation has been completed according to the plane equations, the calibration between the main laser radar and the auxiliary laser radar is completed.
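As an illustrative check only (not a step required by the embodiment), whether a pair of fitting planes has become parallel can be verified by comparing their unit normal vectors:

```python
import numpy as np

def planes_parallel(normal_main, normal_sub, angle_tol_deg=0.5):
    """Return True if two unit plane normals are parallel within a tolerance."""
    cos_angle = abs(np.dot(normal_main, normal_sub))
    return cos_angle >= np.cos(np.radians(angle_tol_deg))
```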
In one embodiment, the calibration process of the multi-lidar system is as follows: the coordinate system of one laser radar is selected as a reference coordinate system, and the measurement data of other laser radars are unified under the reference coordinate system, so that the calibration of the multi-laser radar system can be regarded as a transformation process among a plurality of coordinate systems. Specifically, the step of "aligning the main fitting plane and the auxiliary fitting plane so as to achieve calibration between the main lidar and the auxiliary lidar" may include:
based on the main fitting plane and the auxiliary fitting plane, acquiring a transformation relation between a measurement coordinate system of the main laser radar and a measurement coordinate system of the auxiliary laser radar;
determining a reference coordinate system serving as a transformation reference from the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar;
and aligning the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar based on the transformation relation and the reference coordinate system so as to realize the calibration between the main laser radar and the auxiliary laser radar.
In practical applications, for example, the measurement coordinate system of the main laser radar may be denoted as $O_{m}$ and the measurement coordinate system of the auxiliary laser radar as $O_{s}$, and the measurement coordinate system of the main laser radar is determined as the reference coordinate system. The transformation relationship between the reference coordinate system $O_{m}$ and the measurement coordinate system $O_{s}$ may be as follows:

$$P_{m} = R \, P_{s} + t$$

where $P_{s}$ is a point measured in the coordinate system of the auxiliary laser radar and $P_{m}$ is the same point expressed in the reference coordinate system.

Wherein $R$, composed of the angles $(\mathrm{yaw}, \mathrm{pitch}, \mathrm{roll})$, represents the relative rotation between the reference coordinate system $O_{m}$ and the measurement coordinate system $O_{s}$, and includes three variables: the yaw angle (yaw) rotating about the y-axis, the pitch angle (pitch) rotating about the x-axis, and the roll angle (roll) rotating about the z-axis.

Wherein $t$, composed of $(t_{x}, t_{y}, t_{z})$, represents the relative translation between the reference coordinate system $O_{m}$ and the measurement coordinate system $O_{s}$, and includes translation values in the three directions of the x-axis, the y-axis and the z-axis.

Therefore, it is only necessary to calculate $R$ and $t$ from the plurality of fitting planes that have already been acquired to obtain the transformation between the reference coordinate system $O_{m}$ and the measurement coordinate system $O_{s}$; the measurement data of the auxiliary laser radar are then unified into the reference coordinate system, and the calibration between the main laser radar and the auxiliary laser radar is completed.
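As an illustrative sketch only (in Python with numpy; the function and variable names are assumptions, not part of the patented method), unifying the auxiliary measurements into the reference coordinate system applies the transformation above to every auxiliary point:

```python
import numpy as np

def to_reference_frame(points_sub, R, t):
    """Express auxiliary-lidar points in the main (reference) coordinate
    system via P_m = R @ P_s + t, for an N x 3 array of points."""
    points_sub = np.asarray(points_sub, dtype=float)
    R = np.asarray(R, dtype=float)
    t = np.asarray(t, dtype=float).reshape(1, 3)
    return points_sub @ R.T + t
```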
In one embodiment, for example, suppose that the main fitting plane M_C corresponding to the ground C in the main point cloud image and the secondary fitting plane S_C corresponding to the ground C in the secondary point cloud image, the main fitting plane M_A corresponding to the wall surface A in the main point cloud image and the secondary fitting plane S_A corresponding to the wall surface A in the secondary point cloud image, and the main fitting plane M_B corresponding to the wall surface B in the main point cloud image and the secondary fitting plane S_B corresponding to the wall surface B in the secondary point cloud image have all been acquired. A rotation R_1 can first be calculated from the main fitting plane M_C and the secondary fitting plane S_C. At this time, the normal vector n_p of the main fitting plane M_C and the normal vector n_s of the secondary fitting plane S_C can be acquired separately. By Rodrigues' rotation formula, the calculation formula of R_1 can be as follows:

R_1 = I + sin(θ) · K + (1 − cos θ) · K²

where:

θ = arccos(n_p · n_s) is the angle between the two normal vectors;

k = (n_s × n_p) / ‖n_s × n_p‖ is the unit rotation axis;

K is the skew-symmetric cross-product matrix of k;

I is the 3 × 3 identity matrix.

After R_1 is obtained by the above formula, a rotation R_2 can be calculated in the same way from the main fitting plane M_A and the secondary fitting plane S_A, and a rotation R_3 from the main fitting plane M_B and the secondary fitting plane S_B. The final rotation transformation matrix R can then be solved from R_1, R_2 and R_3.
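The sketch below illustrates the Rodrigues-formula step for one pair of plane normals; it assumes both normals point to the same side of their planes, and the function name is an assumption rather than part of the application:

```python
import numpy as np

def rotation_between_normals(n_s: np.ndarray, n_p: np.ndarray) -> np.ndarray:
    """Rotation matrix that maps the secondary-plane normal n_s onto the
    main-plane normal n_p, built with Rodrigues' rotation formula."""
    n_s = n_s / np.linalg.norm(n_s)
    n_p = n_p / np.linalg.norm(n_p)
    axis = np.cross(n_s, n_p)
    norm = np.linalg.norm(axis)
    if norm < 1e-9:                       # normals already parallel
        return np.eye(3)
    k = axis / norm                       # unit rotation axis
    theta = np.arccos(np.clip(np.dot(n_s, n_p), -1.0, 1.0))
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])    # skew-symmetric matrix of k
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```

For example, rotation_between_normals(n_s, n_p) applied to the two ground-plane normals would give R_1 in the notation above.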
The translation T can then be calculated from the main fitting plane M_C, the main fitting plane M_A, the main fitting plane M_B, the secondary fitting plane S_C, the secondary fitting plane S_A and the secondary fitting plane S_B. At this time, the plane equation a_1·x + b_1·y + c_1·z + d_1 = 0 corresponding to the main fitting plane M_C, the plane equation a_2·x + b_2·y + c_2·z + d_2 = 0 corresponding to the main fitting plane M_A, and the plane equation a_3·x + b_3·y + c_3·z + d_3 = 0 corresponding to the main fitting plane M_B can be acquired separately. The calculation formula of T = (T_x, T_y, T_z) may then be as follows:

a_1·T_x + b_1·T_y + c_1·T_z = d_1' − d_1
a_2·T_x + b_2·T_y + c_2·T_z = d_2' − d_2
a_3·T_x + b_3·T_y + c_3·T_z = d_3' − d_3

where d_1', d_2' and d_3' are the constant terms d in the plane equations of the corresponding secondary fitting planes of the secondary laser radar, taken after the rotation R has been applied.
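A minimal numerical sketch of this step is given below; it assumes the plane convention a·x + b·y + c·z + d = 0, that the three main planes are not parallel to one another, and that the secondary constant terms d' are taken after the rotation R has been applied (all names are illustrative):

```python
import numpy as np

def solve_translation(main_planes: np.ndarray, d_secondary: np.ndarray) -> np.ndarray:
    """main_planes: (3, 4) rows [a, b, c, d] of the three main fitting planes.
    d_secondary: (3,) constant terms d' of the corresponding secondary planes.
    Returns T solving N · T = d' - d, with N the matrix of plane normals."""
    N = main_planes[:, :3]          # rows are the plane normal vectors
    d_main = main_planes[:, 3]
    return np.linalg.solve(N, d_secondary - d_main)
```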
In an embodiment, in order to improve the calibration accuracy of the laser radar, after the main laser radar and the auxiliary laser radar are calibrated, quality evaluation can be performed on the calibration. Specifically, after the step of "aligning the main fitting plane and the auxiliary fitting plane so as to achieve calibration between the main lidar and the auxiliary lidar", the method may further include:
respectively acquiring first scanning data and second scanning data of the main laser radar and the auxiliary laser radar after scanning the same object;
and when the difference between the first scanning data and the second scanning data is not less than a preset threshold value, returning to the step of determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area.
In practical application, for example, when the laser radar carrier is a vehicle, after the calibration between the main laser radar and the auxiliary laser radar has been completed and the values of R and T have been obtained, the vehicle can be parked in an open area and an object can be placed within the detection range of the laser radars on the vehicle. The object is then scanned by the main laser radar and the auxiliary laser radar respectively, and the point cloud information of each laser radar for the object is acquired according to the distance information between the radar and the object. The data acquired by the main laser radar is recorded as the point set P, and the data acquired by the auxiliary laser radar is recorded as the point set Q. At this time, it can be defined that the following formula holds:

Q' = R · Q + T

where Q' represents the set of points, formed by the object scanned by the auxiliary laser radar, expressed in the measurement coordinate system of the main laser radar.
The quality of the calibration between the two laser radars can then be evaluated according to the distance between each point in Q' and the closest point in the point set P. The formula of the quality evaluation function can be as follows:

E = (1/N) · Σ_i ‖p_i − q_i'‖²

where q_i' represents a point in Q' and p_i represents the point in P that is closest to q_i' in Euclidean distance; these pairs form a target point set of N corresponding point pairs, and E represents the mean of the squared Euclidean distances between all corresponding points in the target point set.
After the value of E is calculated by the quality evaluation function formula, when E is less than the preset threshold value, the calibration between the two laser radars can be considered to meet the calibration requirement at this time; when E is not less than the preset threshold value, it can be considered that the calibration between the two laser radars does not meet the calibration requirement at this time and the step of plane alignment needs to be repeated, so the process can return to the step of determining a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area.
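As a hedged illustration of this evaluation (not the patent's reference implementation), the sketch below computes E with a brute-force nearest-neighbour search; for large point clouds a KD-tree would normally replace the pairwise distance matrix:

```python
import numpy as np

def calibration_error(P: np.ndarray, Q: np.ndarray, R: np.ndarray, T: np.ndarray) -> float:
    """Mean squared Euclidean distance between each transformed secondary point
    Q' = R · Q + T and its nearest neighbour in the main-lidar point set P."""
    Q_ref = Q @ R.T + T
    dists = np.linalg.norm(Q_ref[:, None, :] - P[None, :, :], axis=2)
    nearest = dists.min(axis=1)           # closest main point for each secondary point
    return float(np.mean(nearest ** 2))
```

The calibration would then be accepted when calibration_error(P, Q, R, T) falls below the preset threshold, and the plane-alignment steps repeated otherwise.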
According to the method and the device, the reference measurement area corresponding to the laser radar carrier can be determined, the reference measurement area comprises a plurality of reference measurement planes, the reference measurement area is measured based on the main laser radar and the auxiliary laser radar corresponding to the laser radar carrier to obtain the main point cloud image corresponding to the main laser radar and the auxiliary point cloud image corresponding to the auxiliary laser radar, the target reference measurement plane is determined from the plurality of reference measurement planes corresponding to the reference measurement area, the point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image are subjected to plane fitting respectively to obtain the main fitting plane corresponding to the main point cloud image and the auxiliary fitting plane corresponding to the auxiliary point cloud image, and the main fitting plane and the auxiliary fitting plane are aligned to achieve calibration between the main laser radar and the auxiliary laser radar. The scheme can finish the calibration between the laser radars at any time by utilizing the surrounding environment of the laser radar carrier, has lower requirement on the common area range between the multiple laser radars, does not need special calibration equipment, has low requirement on calibration personnel, and has clear steps, high efficiency and accurate result. When the relative position of the laser radar needs to be changed, the efficiency can be greatly improved.
The method described in the foregoing embodiment will be described in further detail below by way of example in which the image processing apparatus is specifically integrated in a network device.
Referring to fig. 3, a specific flow of the image processing method according to the embodiment of the present application may be as follows:
301. the network device determines a first reference measurement region and a second reference measurement region.
In practical applications, for example, when a vehicle is equipped with an n-line lidar as a primary lidar, an m-line lidar as a first secondary lidar located on the left side of the vehicle, and an m-line lidar as a second secondary lidar located on the right side of the vehicle, since the lidars are mounted on both the left and right sides of the vehicle, a first reference measurement area can be determined on the left side of the vehicle, and a second reference measurement area can be determined on the right side of the vehicle, respectively, so that the first secondary lidar and the primary lidar located on the left side of the vehicle can measure the first reference measurement area located on the left side of the vehicle, and the second secondary lidar and the primary lidar located on the right side of the vehicle can measure the second reference measurement area located on the right side of the vehicle.
The first reference measurement area may include a reference measurement plane a, a reference measurement plane B, and a reference measurement plane C, and the second reference measurement area may include a reference measurement plane a ', a reference measurement plane B ', and a reference measurement plane C ', where the reference measurement plane may be selected as a ground or a wall that is easily available in practical applications, and so on.
302. The network equipment acquires a first main point cloud image, a second main point cloud image, a first auxiliary point cloud image and a second auxiliary point cloud image.
In practical application, for example, a first main point cloud image and a second main point cloud image obtained by measuring the first reference measurement area and the second reference measurement area by the main laser radar, a first auxiliary point cloud image obtained by measuring the first reference measurement area by the first auxiliary laser radar, and a second auxiliary point cloud image obtained by measuring the second reference measurement area by the second auxiliary laser radar can be obtained.
303. The network device acquires the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image and the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image, and aligns the two planes.
In practical applications, for example, as shown in fig. 5, the method used for plane fitting may be determined according to the distribution of the point cloud data corresponding to the reference measurement plane A in the first main point cloud image. As shown in fig. 7, when the point cloud data corresponding to the reference measurement plane A in the first main point cloud image is sparsely distributed, three points that are not on the same straight line may be selected from that point cloud data, and the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image is obtained by fitting a plane to the selected points. When the point cloud data corresponding to the reference measurement plane A in the first main point cloud image is densely distributed and its distribution range is narrow and long, or the data is occluded by other point clouds in the operation-interface view, a target reference point can be selected from the point cloud data, and a filtering rule is manually input according to the spatial distribution region of the selected target reference point; the target reference point is used as a starting point, a search range is set, the points within the search range that meet the search requirement are added to the point set and merged into new target reference points, and the new target reference points are then used as starting points in turn, with the search range set again and the qualifying points determined, until a plurality of target reference points are finally obtained; a plane is fitted to these target reference points to obtain the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image. When the point cloud data corresponding to the reference measurement plane A in the first main point cloud image is densely distributed and its distribution range is wide, the point cloud data can be framed directly in the first main point cloud image so that the point cloud data lies as much as possible on the plane determined by the framing, and plane fitting is then performed on the framed region to obtain the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image.
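For the dense, framed case, a least-squares plane fit such as the following SVD-based sketch could be used; the patent does not prescribe a specific fitting algorithm, so this is an illustrative assumption:

```python
import numpy as np

def fit_plane(points: np.ndarray):
    """Least-squares plane through an (N, 3) array of framed points.
    Returns (normal, d) with the plane written as normal · x + d = 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                       # direction of smallest variance
    d = -float(normal @ centroid)
    return normal, d
```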
After the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image has been acquired, the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image can be acquired by a method similar to that used to acquire the main fitting plane M_A. As shown in fig. 4, after the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image and the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image have been acquired, the main fitting plane M_A and the secondary fitting plane S_A can be aligned. When the alignment between the main fitting plane M_A and the secondary fitting plane S_A is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
304. The network device acquires the main fitting plane M_B corresponding to the reference measurement plane B in the first main point cloud image and the secondary fitting plane S_B corresponding to the reference measurement plane B in the first secondary point cloud image, and aligns the two planes.

In practical application, the plane fitting method can be determined according to the distribution of the point cloud data corresponding to the reference measurement plane B in the point cloud images, and the main fitting plane M_B corresponding to the reference measurement plane B in the first main point cloud image and the secondary fitting plane S_B corresponding to the reference measurement plane B in the first secondary point cloud image are obtained. After the main fitting plane M_B and the secondary fitting plane S_B have been acquired, the main fitting plane M_B and the secondary fitting plane S_B can be aligned. When the alignment between the main fitting plane M_B and the secondary fitting plane S_B is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
305. The network device acquires the main fitting plane M_C corresponding to the reference measurement plane C in the first main point cloud image and the secondary fitting plane S_C corresponding to the reference measurement plane C in the first secondary point cloud image, and aligns the two planes.

In practical application, the plane fitting method can be determined according to the distribution of the point cloud data corresponding to the reference measurement plane C in the point cloud images, and the main fitting plane M_C corresponding to the reference measurement plane C in the first main point cloud image and the secondary fitting plane S_C corresponding to the reference measurement plane C in the first secondary point cloud image are obtained. After the main fitting plane M_C and the secondary fitting plane S_C have been acquired, the main fitting plane M_C and the secondary fitting plane S_C can be aligned. When the alignment between the main fitting plane M_C and the secondary fitting plane S_C is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
Here, the measurement coordinate system of the main laser radar, denoted as O_p, and the measurement coordinate system of the secondary laser radar, denoted as O_s, can be determined separately, and the measurement coordinate system of the main laser radar can be determined as the reference coordinate system. The transformation relationship between the reference coordinate system O_p and the measurement coordinate system O_s can be as follows:

X_p = R · X_s + T

where X_s is a point measured in the coordinate system of the secondary laser radar and X_p is the same point expressed in the reference coordinate system; R is the rotation transformation matrix representing the relative rotation angle between the reference coordinate system O_p and the measurement coordinate system O_s, and includes three variables: the yaw angle (yaw) rotating about the y-axis, the pitch angle (pitch) rotating about the x-axis, and the roll angle (roll) rotating about the z-axis; T is the translation transformation vector representing the relative translation between the reference coordinate system O_p and the measurement coordinate system O_s, and includes the translation values along the three directions of the x-axis, the y-axis and the z-axis.

Therefore, it is only necessary to calculate R and T from the plurality of fitting planes that have already been acquired to obtain the transformation relationship between the reference coordinate system O_p and the measurement coordinate system O_s, unify the measurement data of the secondary laser radar into the reference coordinate system, and complete the calibration between the main laser radar and the secondary laser radar.
After the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image and the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image, the main fitting plane M_B corresponding to the reference measurement plane B in the first main point cloud image and the secondary fitting plane S_B corresponding to the reference measurement plane B in the first secondary point cloud image, and the main fitting plane M_C corresponding to the reference measurement plane C in the first main point cloud image and the secondary fitting plane S_C corresponding to the reference measurement plane C in the first secondary point cloud image have been acquired, a rotation R_1 can first be calculated from the main fitting plane M_A and the secondary fitting plane S_A. At this time, the normal vector n_p of the main fitting plane M_A and the normal vector n_s of the secondary fitting plane S_A can be acquired separately. By Rodrigues' rotation formula, the calculation formula of R_1 can be as follows:

R_1 = I + sin(θ) · K + (1 − cos θ) · K²

where θ = arccos(n_p · n_s) is the angle between the two normal vectors, k = (n_s × n_p) / ‖n_s × n_p‖ is the unit rotation axis, K is the skew-symmetric cross-product matrix of k, and I is the 3 × 3 identity matrix.

After R_1 is obtained by the above formula, a rotation R_2 can be calculated in the same way from the main fitting plane M_B and the secondary fitting plane S_B, and a rotation R_3 from the main fitting plane M_C and the secondary fitting plane S_C. The final rotation transformation matrix R can then be solved from R_1, R_2 and R_3.
The translation T can then be calculated from the main fitting plane M_A, the main fitting plane M_B, the main fitting plane M_C, the secondary fitting plane S_A, the secondary fitting plane S_B and the secondary fitting plane S_C. At this time, the plane equation a_1·x + b_1·y + c_1·z + d_1 = 0 corresponding to the main fitting plane M_A, the plane equation a_2·x + b_2·y + c_2·z + d_2 = 0 corresponding to the main fitting plane M_B, and the plane equation a_3·x + b_3·y + c_3·z + d_3 = 0 corresponding to the main fitting plane M_C can be acquired separately. The calculation formula of T = (T_x, T_y, T_z) may then be as follows:

a_1·T_x + b_1·T_y + c_1·T_z = d_1' − d_1
a_2·T_x + b_2·T_y + c_2·T_z = d_2' − d_2
a_3·T_x + b_3·T_y + c_3·T_z = d_3' − d_3

where d_1', d_2' and d_3' are the constant terms d in the plane equations of the corresponding secondary fitting planes of the first secondary laser radar, taken after the rotation R has been applied.
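For completeness, one common way to package the result (an illustrative convention rather than something mandated by the application) is a single 4 × 4 homogeneous transformation that can then be applied to every secondary measurement:

```python
import numpy as np

def homogeneous_transform(R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Assemble the rotation matrix R and translation vector T into a 4 x 4
    homogeneous matrix mapping secondary-lidar coordinates to the reference frame."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M
```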
306. The network device realizes the calibration between the main laser radar and the first secondary laser radar.
In practical application, after the rotation translation transformation matrix between the two measurement coordinate systems is solved, the measurement data of the first secondary laser radar can be unified to the reference coordinate system of the main laser radar, and calibration between the main laser radar and the first secondary laser radar is completed.
After the calibration between the main laser radar and the first secondary laser radar has been completed and the values of R and T have been obtained, the vehicle can be parked in an open area and an object can be placed within the detection range of the laser radars on the vehicle. The object is then scanned by the main laser radar and the first secondary laser radar respectively, and the point cloud information of each laser radar for the object is acquired according to the distance information between the radar and the object. The data acquired by the main laser radar is recorded as the point set P, and the data acquired by the first secondary laser radar is recorded as the point set Q. At this time, it can be defined that the following formula holds:

Q' = R · Q + T

where Q' represents the set of points, formed by the object scanned by the first secondary laser radar, expressed in the measurement coordinate system of the main laser radar.
The quality of the calibration between the two laser radars can then be evaluated according to the distance between each point in Q' and the closest point in the point set P. The formula of the quality evaluation function can be as follows:

E = (1/N) · Σ_i ‖p_i − q_i'‖²

where q_i' represents a point in Q' and p_i represents the point in P that is closest to q_i' in Euclidean distance; these pairs form a target point set of N corresponding point pairs, and E represents the mean of the squared Euclidean distances between all corresponding points in the target point set.
After the value of E is calculated by the quality evaluation function formula, when E is less than the preset threshold value, the calibration between the two laser radars can be considered to meet the calibration requirement at this time; when E is not less than the preset threshold value, it can be considered that the calibration between the two laser radars does not meet the calibration requirement at this time and the step of plane alignment needs to be repeated, so the network device can return to the step of acquiring the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image and the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image and aligning the two planes.
307. The network device acquires the main fitting plane M_A' corresponding to the reference measurement plane A' in the second main point cloud image and the secondary fitting plane S_A' corresponding to the reference measurement plane A' in the second secondary point cloud image, and aligns the two planes.

In practical application, the plane fitting method can be determined according to the distribution of the point cloud data corresponding to the reference measurement plane A' in the point cloud images, and the main fitting plane M_A' corresponding to the reference measurement plane A' in the second main point cloud image and the secondary fitting plane S_A' corresponding to the reference measurement plane A' in the second secondary point cloud image are obtained. After the main fitting plane M_A' and the secondary fitting plane S_A' have been acquired, the main fitting plane M_A' and the secondary fitting plane S_A' can be aligned. When the alignment between the main fitting plane M_A' and the secondary fitting plane S_A' is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
308. The network device acquires the main fitting plane M_B' corresponding to the reference measurement plane B' in the second main point cloud image and the secondary fitting plane S_B' corresponding to the reference measurement plane B' in the second secondary point cloud image, and aligns the two planes.

In practical application, the plane fitting method can be determined according to the distribution of the point cloud data corresponding to the reference measurement plane B' in the point cloud images, and the main fitting plane M_B' corresponding to the reference measurement plane B' in the second main point cloud image and the secondary fitting plane S_B' corresponding to the reference measurement plane B' in the second secondary point cloud image are obtained. After the main fitting plane M_B' and the secondary fitting plane S_B' have been acquired, the main fitting plane M_B' and the secondary fitting plane S_B' can be aligned. When the alignment between the main fitting plane M_B' and the secondary fitting plane S_B' is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
309. The network device acquires the main fitting plane M_C' corresponding to the reference measurement plane C' in the second main point cloud image and the secondary fitting plane S_C' corresponding to the reference measurement plane C' in the second secondary point cloud image, and aligns the two planes.

In practical application, the plane fitting method can be determined according to the distribution of the point cloud data corresponding to the reference measurement plane C' in the point cloud images, and the main fitting plane M_C' corresponding to the reference measurement plane C' in the second main point cloud image and the secondary fitting plane S_C' corresponding to the reference measurement plane C' in the second secondary point cloud image are obtained. After the main fitting plane M_C' and the secondary fitting plane S_C' have been acquired, the main fitting plane M_C' and the secondary fitting plane S_C' can be aligned. When the alignment between the main fitting plane M_C' and the secondary fitting plane S_C' is completed, the subsequent steps can be carried out; when the alignment between the two planes is not yet completed, the two planes can be aligned again until the alignment between them is completed.
310. The network device realizes the calibration between the main laser radar and the second secondary laser radar.

In practical application, after the rotation and translation transformation matrices between the two measurement coordinate systems have been solved, the measurement data of the second secondary laser radar can be unified to the reference coordinate system of the main laser radar, and the calibration between the main laser radar and the second secondary laser radar is completed.
After the calibration between the main laser radar and the second secondary laser radar has been completed, the value of E can be calculated by the quality evaluation function formula. When E is less than the preset threshold value, the calibration between the two laser radars can be considered to meet the calibration requirement at this time; when E is not less than the preset threshold value, it can be considered that the calibration between the two laser radars does not meet the calibration requirement at this time and the step of plane alignment needs to be repeated, so the network device can return to the step of acquiring the main fitting plane M_A' corresponding to the reference measurement plane A' in the second main point cloud image and the secondary fitting plane S_A' corresponding to the reference measurement plane A' in the second secondary point cloud image and aligning the two planes.
As can be seen from the above, in the embodiment of the present application, the network device may determine the first reference measurement area and the second reference measurement area and acquire the first main point cloud image, the second main point cloud image, the first secondary point cloud image and the second secondary point cloud image. It may then acquire the main fitting plane M_A corresponding to the reference measurement plane A in the first main point cloud image and the secondary fitting plane S_A corresponding to the reference measurement plane A in the first secondary point cloud image and align the two planes; acquire the main fitting plane M_B corresponding to the reference measurement plane B in the first main point cloud image and the secondary fitting plane S_B corresponding to the reference measurement plane B in the first secondary point cloud image and align the two planes; and acquire the main fitting plane M_C corresponding to the reference measurement plane C in the first main point cloud image and the secondary fitting plane S_C corresponding to the reference measurement plane C in the first secondary point cloud image and align the two planes, so as to realize the calibration between the main laser radar and the first secondary laser radar. Likewise, it may acquire the main fitting plane M_A' corresponding to the reference measurement plane A' in the second main point cloud image and the secondary fitting plane S_A' corresponding to the reference measurement plane A' in the second secondary point cloud image and align the two planes; acquire the main fitting plane M_B' corresponding to the reference measurement plane B' in the second main point cloud image and the secondary fitting plane S_B' corresponding to the reference measurement plane B' in the second secondary point cloud image and align the two planes; and acquire the main fitting plane M_C' corresponding to the reference measurement plane C' in the second main point cloud image and the secondary fitting plane S_C' corresponding to the reference measurement plane C' in the second secondary point cloud image and align the two planes, so as to realize the calibration between the main laser radar and the second secondary laser radar. The scheme can complete the calibration between the laser radars at any time by utilizing the surrounding environment of the laser radar carrier, has a low requirement on the common area range between the multiple laser radars, does not need special calibration equipment, places low demands on calibration personnel, and has clear steps, high efficiency and accurate results. When the relative position of a laser radar needs to be changed, the efficiency can be greatly improved.
In order to better implement the above method, an embodiment of the present application may further provide an image processing apparatus, where the image processing apparatus may be specifically integrated in a network device, and the network device may include a server, a terminal, and the like, where the terminal may include: a mobile phone, a tablet computer, a notebook computer, or a Personal Computer (PC).
For example, as shown in fig. 8, the image processing apparatus may include a region determining module 81, a measuring module 82, a plane determining module 83, a fitting module 84, and an aligning module 85, as follows:
the area determining module 81 is configured to determine a reference measurement area corresponding to the laser radar carrier, where the reference measurement area includes a plurality of reference measurement planes;
a measuring module 82, configured to measure the reference measurement area based on a primary lidar and a secondary lidar corresponding to the lidar carrier, so as to obtain a primary point cloud image corresponding to the primary lidar and a secondary point cloud image corresponding to the secondary lidar;
a plane determining module 83, configured to determine a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area;
a fitting module 84, configured to perform plane fitting on point cloud data corresponding to target reference measurement planes in the main point cloud image and the auxiliary point cloud image, respectively, to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image;
and an alignment module 85, configured to align the primary fitting plane and the secondary fitting plane, so as to achieve calibration between the primary lidar and the secondary lidar.
In an embodiment, the fitting module 84 may include a determining submodule 841, a first fitting submodule 842, and a second fitting submodule 843, as follows:
the determining submodule 841 is configured to determine point cloud data corresponding to a target reference measurement plane in the main point cloud image;
a first fitting submodule 842, configured to perform plane fitting on the point cloud data based on a distribution condition of the point cloud data, to obtain a main fitting plane corresponding to the main point cloud image;
and the second fitting submodule 843 is configured to perform plane fitting on the point cloud data corresponding to the target reference measurement plane in the auxiliary point cloud image to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image.
In an embodiment, the first fitting submodule 842 may be specifically configured to:
when the point cloud data is sparsely distributed, selecting a plurality of selected points from the point cloud data;
and determining a main fitting plane corresponding to the main point cloud image based on the selected point.
In an embodiment, the first fitting submodule 842 may be specifically configured to:
when the point cloud data are densely distributed and the distribution range of the point cloud data meets a first distribution condition, determining a plane area for framing the point cloud data in the main point cloud image;
and determining a main fitting plane corresponding to the main point cloud image based on the plane area.
In an embodiment, the first fitting submodule 842 may be specifically configured to:
when the point cloud data are densely distributed and the distribution range of the point cloud data meets a second distribution condition, determining a target reference point from a plurality of points of the point cloud data;
determining a search area corresponding to the target reference point in the main point cloud image;
updating the points meeting the fitting condition in the search area as target reference points;
and returning to the step of determining the search area corresponding to the target reference point in the main point cloud image until a main fitting plane corresponding to the main point cloud image is determined.
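To make the iterative search concrete, here is a simplified region-growing sketch; the fixed search radius, the stopping rule and all names are illustrative assumptions rather than details from the application, which leaves the search range and fitting condition configurable:

```python
import numpy as np

def grow_plane_points(cloud: np.ndarray, seed_idx: int,
                      radius: float = 0.3, max_iters: int = 50) -> np.ndarray:
    """Start from one target reference point and repeatedly add points lying
    within `radius` of any current reference point; the returned points can
    then be plane-fitted."""
    selected = {seed_idx}
    for _ in range(max_iters):
        ref = cloud[sorted(selected)]
        # distance of every cloud point to its nearest current reference point
        d = np.linalg.norm(cloud[:, None, :] - ref[None, :, :], axis=2).min(axis=1)
        new = set(np.where(d < radius)[0].tolist())
        if new <= selected:               # no new points satisfy the condition
            break
        selected |= new
    return cloud[sorted(selected)]
```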
In an embodiment, the image processing apparatus may further include a return module 88, and the return module 88 may be specifically configured to:
and returning to the step of determining a target reference measuring plane from the plurality of reference measuring planes corresponding to the reference measuring area until the plurality of reference measuring planes in the reference measuring area are fitted, so as to obtain a main fitting plane corresponding to the plurality of reference measuring planes in the main point cloud image and a secondary fitting plane corresponding to the plurality of reference measuring planes in the secondary point cloud image.
In an embodiment, the image processing apparatus may further include a scanning module 87, and the scanning module 87 may be specifically configured to:
respectively acquiring first scanning data and second scanning data of the main laser radar and the auxiliary laser radar after scanning the same object;
and when the difference between the first scanning data and the second scanning data is not less than a preset threshold value, returning to the step of determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area.
In an embodiment, the alignment module 85 may be specifically configured to:
based on the main fitting plane and the auxiliary fitting plane, acquiring a transformation relation between a measurement coordinate system of the main laser radar and a measurement coordinate system of the auxiliary laser radar;
determining a reference coordinate system serving as a transformation reference from the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar;
and aligning the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar based on the transformation relation and the reference coordinate system so as to realize the calibration between the main laser radar and the auxiliary laser radar.
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
From the above, in the embodiment of the present application, the reference measurement area corresponding to the laser radar carrier may be determined by the area determination module 81, the reference measurement area includes a plurality of reference measurement planes, the measurement module 82 measures the reference measurement area based on the primary laser radar and the secondary laser radar corresponding to the laser radar carrier to obtain the primary point cloud image corresponding to the primary laser radar and the secondary point cloud image corresponding to the secondary laser radar, the plane determination module 83 determines the target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area, the fitting module 84 performs plane fitting on the point cloud data corresponding to the target reference measurement plane in the primary point cloud image and the secondary point cloud image respectively to obtain the primary fitting plane corresponding to the primary point cloud image and the secondary fitting plane corresponding to the secondary point cloud image, the alignment module 85 aligns the primary fitting plane and the secondary fitting plane, so as to realize the calibration between the main laser radar and the auxiliary laser radar. The scheme can finish the calibration between the laser radars at any time by utilizing the surrounding environment of the laser radar carrier, has lower requirement on the common area range between the multiple laser radars, does not need special calibration equipment, has low requirement on calibration personnel, and has clear steps, high efficiency and accurate result. When the relative position of the laser radar needs to be changed, the efficiency can be greatly improved.
The embodiment of the present application further provides a network device, and the network device may integrate any one of the image processing apparatuses provided in the embodiments of the present application.
For example, as shown in fig. 9, it shows a schematic structural diagram of a network device according to an embodiment of the present application, specifically:
the network device may include components such as a processor 91 of one or more processing cores, memory 92 of one or more computer-readable storage media, a power supply 93, and an input unit 94. Those skilled in the art will appreciate that the network device architecture shown in fig. 9 does not constitute a limitation of network devices and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. Wherein:
the processor 91 is a control center of the network device, connects various parts of the entire network device using various interfaces and lines, and performs various functions of the network device and processes data by running or executing software programs and/or modules stored in the memory 92 and calling data stored in the memory 92, thereby performing overall monitoring of the network device. Optionally, processor 91 may include one or more processing cores; preferably, the processor 91 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 91.
The memory 92 may be used to store software programs and modules, and the processor 91 executes various functional applications and data processing by operating the software programs and modules stored in the memory 92. The memory 92 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to use of the network device, and the like. Further, memory 92 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 92 may also include a memory controller to provide the processor 91 access to the memory 92.
The network device further comprises a power supply 93 for supplying power to each component, and preferably, the power supply 93 is logically connected to the processor 91 through a power management system, so that functions of managing charging, discharging, power consumption, and the like are realized through the power management system. The power supply 93 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
The network device may also include an input unit 94, the input unit 94 being operable to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the network device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 91 in the network device loads the executable file corresponding to the process of one or more application programs into the memory 92 according to the following instructions, and the processor 91 runs the application programs stored in the memory 92, thereby implementing various functions as follows:
determining a reference measurement area corresponding to a laser radar carrier, wherein the reference measurement area comprises a plurality of reference measurement planes, measuring the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar, determining a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area, respectively performing plane fitting on point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image, and aligning the main fitting plane and the auxiliary fitting plane so as to realize calibration between the main laser radar and the auxiliary laser radar.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
According to the method and the device, the reference measurement area corresponding to the laser radar carrier can be determined, the reference measurement area comprises a plurality of reference measurement planes, the reference measurement area is measured based on the main laser radar and the auxiliary laser radar corresponding to the laser radar carrier to obtain the main point cloud image corresponding to the main laser radar and the auxiliary point cloud image corresponding to the auxiliary laser radar, the target reference measurement plane is determined from the plurality of reference measurement planes corresponding to the reference measurement area, the point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image are subjected to plane fitting respectively to obtain the main fitting plane corresponding to the main point cloud image and the auxiliary fitting plane corresponding to the auxiliary point cloud image, and the main fitting plane and the auxiliary fitting plane are aligned to achieve calibration between the main laser radar and the auxiliary laser radar. The scheme can finish the calibration between the laser radars at any time by utilizing the surrounding environment of the laser radar carrier, has lower requirement on the common area range between the multiple laser radars, does not need special calibration equipment, has low requirement on calibration personnel, and has clear steps, high efficiency and accurate result. When the relative position of the laser radar needs to be changed, the efficiency can be greatly improved.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the image processing methods provided by the embodiments of the present application. For example, the instructions may perform the steps of:
determining a reference measurement area corresponding to a laser radar carrier, wherein the reference measurement area comprises a plurality of reference measurement planes, measuring the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the laser radar carrier to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar, determining a target reference measurement plane from the plurality of reference measurement planes corresponding to the reference measurement area, respectively performing plane fitting on point cloud data corresponding to the target reference measurement plane in the main point cloud image and the auxiliary point cloud image to obtain a main fitting plane corresponding to the main point cloud image and an auxiliary fitting plane corresponding to the auxiliary point cloud image, and aligning the main fitting plane and the auxiliary fitting plane so as to realize calibration between the main laser radar and the auxiliary laser radar.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any image processing method provided in the embodiments of the present application, beneficial effects that can be achieved by any image processing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The foregoing detailed description has provided an image processing method, an image processing apparatus, and a computer storage medium according to embodiments of the present application, and specific embodiments have been applied to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific implementation method and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (5)

1. An image processing method, comprising:
determining a reference measurement area corresponding to the unmanned vehicle, wherein the reference measurement area comprises a plurality of reference measurement planes, and the position relation between the reference measurement planes comprises a vertical relation and a non-vertical relation;
measuring the reference measurement area based on a main laser radar and an auxiliary laser radar corresponding to the unmanned vehicle to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar;
determining a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area;
determining point cloud data corresponding to a target reference measuring plane in the main point cloud image;
when point cloud data corresponding to a target reference measuring plane in the main point cloud image are distributed sparsely, selecting a plurality of selected points from the point cloud data; determining a main fitting plane corresponding to the main point cloud image based on the selected point;
when point cloud data corresponding to a target reference measuring plane in the main point cloud image are densely distributed and the distribution range of the point cloud data meets a first distribution condition, determining a plane area for framing the point cloud data in the main point cloud image, wherein the first distribution condition is that the area of a distribution area in the point cloud data exceeds a preset area, and the distribution area is an area where the distribution density of the point cloud data exceeds a preset density; determining a main fitting plane corresponding to the main point cloud image based on the plane area;
when point cloud data corresponding to a target reference measuring plane in the main point cloud image are densely distributed and the distribution range of the point cloud data meets a second distribution condition, determining a target reference point from a plurality of points of the point cloud data, wherein the second distribution condition comprises that the area of a distribution area in the point cloud data does not exceed a preset area or the point cloud data is shielded by other point cloud data in an operation interface view angle of the point cloud image; determining a search area corresponding to the target reference point in the main point cloud image; updating the points meeting the fitting condition in the search area as target reference points; returning to the step of determining a search area corresponding to the target reference point in the main point cloud image until a main fitting plane corresponding to the main point cloud image is determined;
performing plane fitting on point cloud data corresponding to a target reference measurement plane in the auxiliary point cloud image to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image;
returning to the step of determining a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area until all the plurality of reference measurement planes in the reference measurement area are fitted, and obtaining a main fitting plane corresponding to the plurality of reference measurement planes in the main point cloud image and an auxiliary fitting plane corresponding to the plurality of reference measurement planes in the auxiliary point cloud image, wherein the main fitting plane comprises a first main fitting plane, a second main fitting plane and a third main fitting plane, and the auxiliary fitting plane comprises a first auxiliary fitting plane, a second auxiliary fitting plane and a third auxiliary fitting plane;
aligning the first main fitting plane and the first auxiliary fitting plane, and determining a first rotation transformation matrix and a first translation transformation parameter; aligning the second main fitting plane and the second auxiliary fitting plane, and determining a second rotation transformation matrix and a second translation transformation parameter; aligning the third main fitting plane and the third auxiliary fitting plane, and determining a third rotation transformation matrix and a third translation transformation parameter;
determining a transformation relation between a reference coordinate system of the main laser radar and a measurement coordinate system of the auxiliary laser radar based on the first rotational transformation matrix, the second rotational transformation matrix, the third rotational transformation matrix, the first translational transformation parameter, the second translational transformation parameter, and the third translational transformation parameter so as to achieve calibration between the main laser radar and the auxiliary laser radar;
respectively acquiring first scanning data and second scanning data of the main laser radar and the auxiliary laser radar after scanning the same object;
and when the difference between the first scanning data and the second scanning data is not less than a preset threshold value, returning to the step of determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area.
2. The image processing method of claim 1, wherein aligning the main fitting plane and the auxiliary fitting plane to achieve calibration between the main laser radar and the auxiliary laser radar comprises:
based on the main fitting plane and the auxiliary fitting plane, acquiring a transformation relation between a measurement coordinate system of the main laser radar and a measurement coordinate system of the auxiliary laser radar;
determining a reference coordinate system serving as a transformation reference from the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar;
and aligning the measurement coordinate system of the main laser radar and the measurement coordinate system of the auxiliary laser radar based on the transformation relation and the reference coordinate system so as to realize the calibration between the main laser radar and the auxiliary laser radar.
3. A laser radar calibration device, characterized by comprising:
the area determination module is used for determining a reference measurement area corresponding to the unmanned vehicle, the reference measurement area comprises a plurality of reference measurement planes, and the position relation between the reference measurement planes comprises a vertical relation and a non-vertical relation;
the measuring module is used for measuring the reference measuring area based on a main laser radar and an auxiliary laser radar corresponding to the unmanned vehicle to obtain a main point cloud image corresponding to the main laser radar and an auxiliary point cloud image corresponding to the auxiliary laser radar;
the plane determining module is used for determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area;
the fitting module is used for determining point cloud data corresponding to a target reference measuring plane in the main point cloud image; when point cloud data corresponding to a target reference measuring plane in the main point cloud image are distributed sparsely, selecting a plurality of selected points from the point cloud data; determining a main fitting plane corresponding to the main point cloud image based on the selected point; when point cloud data corresponding to a target reference measuring plane in the main point cloud image are densely distributed and the distribution range of the point cloud data meets a first distribution condition, determining a plane area for framing the point cloud data in the main point cloud image, wherein the first distribution condition is that the area of a distribution area in the point cloud data exceeds a preset area, and the distribution area is an area where the distribution density of the point cloud data exceeds a preset density; determining a main fitting plane corresponding to the main point cloud image based on the plane area; when point cloud data corresponding to a target reference measuring plane in the main point cloud image are densely distributed and the distribution range of the point cloud data meets a second distribution condition, determining a target reference point from a plurality of points of the point cloud data, wherein the second distribution condition comprises that the area of a distribution area in the point cloud data does not exceed a preset area or the point cloud data is shielded by other point cloud data in an operation interface view angle of the point cloud image; determining a search area corresponding to the target reference point in the main point cloud image; updating the points meeting the fitting condition in the search area as target reference points; returning to the step of determining a search area corresponding to the target reference point in the main point cloud image until a main fitting plane corresponding to the main point cloud image is determined; performing plane fitting on point cloud data corresponding to a target reference measurement plane in the auxiliary point cloud image to obtain an auxiliary fitting plane corresponding to the auxiliary point cloud image;
the alignment module is used for returning to execute the step of determining a target reference measurement plane from a plurality of reference measurement planes corresponding to the reference measurement area until the plurality of reference measurement planes in the reference measurement area are fitted, and obtaining a main fitting plane corresponding to the plurality of reference measurement planes in the main point cloud image and an auxiliary fitting plane corresponding to the plurality of reference measurement planes in the auxiliary point cloud image, wherein the main fitting plane comprises a first main fitting plane, a second main fitting plane and a third main fitting plane, and the auxiliary fitting plane comprises a first auxiliary fitting plane, a second auxiliary fitting plane and a third auxiliary fitting plane; aligning the first main fitting plane and the first auxiliary fitting plane, and determining a first rotation transformation matrix and a first translation transformation parameter; aligning the second main fitting plane and the second auxiliary fitting plane, and determining a second rotation transformation matrix and a second translation transformation parameter; aligning the third main fitting plane and the third auxiliary fitting plane, and determining a third rotation transformation matrix and a third translation transformation parameter; determining a transformation relation between a reference coordinate system of the master lidar and a measurement coordinate system of the slave lidar based on the first rotational transformation matrix, the second rotational transformation matrix, the third rotational transformation matrix, the first translational transformation parameter, the second translational transformation parameter, and the third translational transformation parameter so as to achieve calibration between the master lidar and the slave lidar; respectively acquiring first scanning data and second scanning data of the main laser radar and the auxiliary laser radar after scanning the same object; and when the difference between the first scanning data and the second scanning data is not less than a preset threshold value, returning to the step of determining a target reference measuring plane from a plurality of reference measuring planes corresponding to the reference measuring area.
4. A computer storage medium having a computer program stored thereon, which, when run on a computer, causes the computer to perform the image processing method according to any one of claims 1-2.
5. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method according to any of claims 1 to 2 are implemented when the program is executed by the processor.
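For readers of claim 1, the seeded search over a "search area" around a target reference point can be read as a region-growing procedure. The sketch below is one possible, non-limiting interpretation: it assumes a fixed-radius neighbourhood as the search area and a point-to-plane distance threshold as the fitting condition; the names grow_plane_region, radius and dist_thresh are illustrative assumptions rather than terms of the claims.

```python
import numpy as np

def grow_plane_region(cloud, seed_index, radius=0.5, dist_thresh=0.05):
    """Grow a roughly planar patch outward from a seed point.

    cloud is an (N, 3) array; seed_index plays the role of the target
    reference point, the fixed-radius neighbourhood plays the role of
    the search area, and the point-to-plane distance threshold plays
    the role of the fitting condition.
    """
    selected = {int(seed_index)}
    frontier = [int(seed_index)]
    while frontier:
        # Refit the local plane from everything selected so far.
        pts = cloud[sorted(selected)]
        centroid = pts.mean(axis=0)
        normal = None
        if len(pts) >= 3:
            _, _, vt = np.linalg.svd(pts - centroid)
            normal = vt[-1]
        new_frontier = []
        for idx in frontier:
            # Search area: every point within `radius` of the reference point.
            dists = np.linalg.norm(cloud - cloud[idx], axis=1)
            for j in np.flatnonzero(dists < radius):
                j = int(j)
                if j in selected:
                    continue
                if normal is not None:
                    if abs(normal.dot(cloud[j] - centroid)) > dist_thresh:
                        continue  # fails the fitting condition
                selected.add(j)
                new_frontier.append(j)
        frontier = new_frontier
    return np.array(sorted(selected))

# Toy scene: a ground plane plus a wall; grow the ground patch from a seed.
rng = np.random.default_rng(1)
ground = np.column_stack([rng.uniform(0, 10, (400, 2)), rng.normal(0, 0.01, 400)])
wall = np.column_stack([rng.uniform(0, 10, 400), np.full(400, 10.0), rng.uniform(0.5, 3.0, 400)])
scene = np.vstack([ground, wall])
patch = grow_plane_region(scene, seed_index=0, radius=1.0, dist_thresh=0.05)
print(len(patch), "points grown around the seed")
```

A patch grown this way could then be handed to a plane fit such as the one sketched earlier to obtain the main fitting plane for that reference measurement plane.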
CN202010155209.0A 2020-03-09 2020-03-09 Image processing method and device and computer storage medium Active CN111007485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010155209.0A CN111007485B (en) 2020-03-09 2020-03-09 Image processing method and device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010155209.0A CN111007485B (en) 2020-03-09 2020-03-09 Image processing method and device and computer storage medium

Publications (2)

Publication Number Publication Date
CN111007485A (en) 2020-04-14
CN111007485B (en) 2020-10-27

Family

ID=70120967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010155209.0A Active CN111007485B (en) 2020-03-09 2020-03-09 Image processing method and device and computer storage medium

Country Status (1)

Country Link
CN (1) CN111007485B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111965627B (en) * 2020-08-18 2021-06-25 湖北亿咖通科技有限公司 Multi-laser radar calibration method for vehicle
CN114440922A (en) * 2020-10-30 2022-05-06 阿里巴巴集团控股有限公司 Method and device for evaluating laser calibration, related equipment and storage medium
CN112180348B (en) * 2020-11-27 2021-03-02 深兰人工智能(深圳)有限公司 Attitude calibration method and device for vehicle-mounted multi-line laser radar
CN112965047B (en) * 2021-02-01 2023-03-14 中国重汽集团济南动力有限公司 Vehicle multi-laser radar calibration method, system, terminal and storage medium
CN113513988B (en) * 2021-07-12 2023-03-31 广州小鹏自动驾驶科技有限公司 Laser radar target detection method and device, vehicle and storage medium
CN115690219A (en) * 2023-01-03 2023-02-03 山东矩阵软件工程股份有限公司 Method and system for detecting three-dimensional information of running train in complex environment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839624A (en) * 2017-11-27 2019-06-04 北京万集科技股份有限公司 A kind of multilasered optical radar position calibration method and device
WO2020000137A1 (en) * 2018-06-25 2020-01-02 Beijing DIDI Infinity Technology and Development Co., Ltd Integrated sensor calibration in natural scenes
CN109375195B (en) * 2018-11-22 2019-07-16 中国人民解放军军事科学院国防科技创新研究院 Parameter quick calibrating method outside a kind of multi-line laser radar based on orthogonal normal vector
CN109696663B (en) * 2019-02-21 2021-02-09 北京大学 Vehicle-mounted three-dimensional laser radar calibration method and system
CN110031824B (en) * 2019-04-12 2020-10-30 杭州飞步科技有限公司 Laser radar combined calibration method and device
CN110333503B (en) * 2019-05-29 2023-06-09 菜鸟智能物流控股有限公司 Laser radar calibration method and device and electronic equipment

Also Published As

Publication number Publication date
CN111007485A (en) 2020-04-14

Similar Documents

Publication Publication Date Title
CN111007485B (en) Image processing method and device and computer storage medium
EP3506212B1 (en) Method and apparatus for generating raster map
CN109188457B (en) Object detection frame generation method, device, equipment, storage medium and vehicle
US20110205340A1 (en) 3d time-of-flight camera system and position/orientation calibration method therefor
CN109872366B (en) Method and device for detecting three-dimensional position of object
CN107920246B (en) The gradient test method and device of camera module
CN111383279A (en) External parameter calibration method and device and electronic equipment
CN113156407B (en) Vehicle-mounted laser radar external parameter joint calibration method, system, medium and device
CN115880252B (en) Container sling detection method, device, computer equipment and storage medium
CN110068826B (en) Distance measurement method and device
CN114241057A (en) External reference calibration method and system for camera and laser radar and readable storage medium
WO2024001804A1 (en) Three-dimensional object detection method, computer device, storage medium, and vehicle
CN113759348A (en) Radar calibration method, device, equipment and storage medium
CN114910892A (en) Laser radar calibration method and device, electronic equipment and storage medium
CN110796707B (en) Calibration parameter calculation method, calibration parameter calculation device and storage medium
WO2023009508A1 (en) Determining minimum region for finding planar surfaces
CN112229396B (en) Unmanned vehicle repositioning method, device, equipment and storage medium
CN116401326A (en) Road identification updating method and device
CN116363192A (en) Volume measurement method and device for warehouse goods, computer equipment and storage medium
CN113654538A (en) Room square finding method, laser radar and measuring system for actual measurement
CN112712062A (en) Monocular three-dimensional object detection method and device based on decoupling truncated object
CN111366911A (en) Method and device for calibrating positioning consistency of multiple AGV (automatic guided vehicle) and electronic terminal
CN110595363A (en) Three-dimensional virtual modeling method, system, device and storage medium
CN112733817B (en) Method for measuring precision of point cloud layer in high-precision map and electronic equipment
CN114998426B (en) Robot ranging method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200414

Assignee: Tianyi Transportation Technology Co.,Ltd.

Assignor: CIIC Technology Co.,Ltd.|Zhongzhixing (Shanghai) Transportation Technology Co.,Ltd.

Contract record no.: X2022980001515

Denomination of invention: The invention relates to an image processing method, a device and a computer storage medium

Granted publication date: 20201027

License type: Common License

Record date: 20220214