CN111638500A - Calibration method for a measuring device and measuring device - Google Patents

Calibration method for a measuring device and measuring device

Info

Publication number
CN111638500A
Authority
CN
China
Prior art keywords
camera
calibration
rotation matrix
calibration object
translation vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010484365.1A
Other languages
Chinese (zh)
Inventor
关卡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seashell Housing Beijing Technology Co Ltd
Original Assignee
Beike Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beike Technology Co Ltd filed Critical Beike Technology Co Ltd
Priority to CN202010484365.1A
Publication of CN111638500A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A calibration method for a measuring device comprising a lidar and a main camera which are arranged together on a carrier member and do not have a common field of view. The calibration method comprises the following steps: setting a first calibration object in the field of view of the lidar; setting a second calibration object in the field of view of the main camera; setting an auxiliary camera so that its field of view covers both the first calibration object and the second calibration object; calculating a first rotation matrix and a first translation vector from the lidar to the auxiliary camera by means of the first calibration object; calculating a second rotation matrix and a second translation vector from the auxiliary camera to the main camera by means of the second calibration object; and calculating a third rotation matrix and a third translation vector from the lidar to the main camera by means of the first and second rotation matrices and the first and second translation vectors.

Description

Calibration method for a measuring device and measuring device
Technical Field
The present disclosure relates to calibration, and more particularly to a calibration method for a measuring device and to such a measuring device.
Background
A single sensor has unavoidable limitations, so multi-sensor fusion schemes are adopted to improve system robustness. In the prior art, the lidar and the camera are usually fixed together in a measuring device whose relative pose is nominally known. However, because of assembly errors and the like, the actual relative pose often differs from the theoretical one in an unpredictable way, so each measuring device must be calibrated before actual use.
Because the lidar and the camera have different viewing angles, during calibration a calibration object (e.g., a checkerboard calibration board or another three-dimensional calibration object) must be placed in their overlapping field of view to ensure that it can be sensed by the lidar and the camera at the same time. The lidar and the camera then each capture the calibration object at different angles, forming a set of constraint equations. Solving these constraint equations yields the actual relative pose between the lidar and the camera, which completes the calibration.
Due to structural and industrial-design considerations, it may happen that there is no common field of view between the lidar and the camera. In this case, calibration cannot be performed with the existing calibration methods.
Disclosure of Invention
In order to solve the above problem, the present application discloses a calibration method for a measuring device, wherein the measuring device comprises a lidar and a main camera which are arranged together on a carrier member and do not have a common field of view. The calibration method comprises the following steps: setting a first calibration object in the field of view of the lidar; setting a second calibration object in the field of view of the main camera; setting an auxiliary camera so that its field of view covers the first calibration object and the second calibration object; calculating a first rotation matrix and a first translation vector from the lidar to the auxiliary camera by means of the first calibration object; calculating a second rotation matrix and a second translation vector from the auxiliary camera to the main camera by means of the second calibration object; and calculating a third rotation matrix and a third translation vector from the lidar to the main camera by means of the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector to complete the calibration.
According to an alternative embodiment, the auxiliary camera is arranged on the carrier member.
According to an alternative embodiment, the auxiliary camera is provided separately from the carrier member.
According to an alternative embodiment, the main camera and the auxiliary camera are monocular cameras; and the lidar is a single-line lidar.
According to an alternative embodiment, in the step of calculating a first rotation matrix and a first translation vector from the lidar to the auxiliary camera by means of the first calibration object, the first rotation matrix and the first translation vector are calculated by means of a first coordinate of the first calibration object in the coordinate system of the lidar and a second coordinate of the first calibration object in the coordinate system of the auxiliary camera. The first coordinates and the second coordinates are calculated using a computer vision algorithm.
According to an alternative embodiment, in the step of calculating a second rotation matrix and a second translation vector from the auxiliary camera to the main camera by means of the second calibration object, the second rotation matrix and the second translation vector are calculated by means of a third coordinate of the second calibration object in the coordinate system of the auxiliary camera and a fourth coordinate of the second calibration object in the coordinate system of the main camera.
According to an alternative embodiment, the third and fourth coordinates are calculated using a computer vision algorithm.
In order to solve the above problem, the present application also discloses a measuring device including a lidar and a main camera that are provided together on a carrier member and do not have a common field of view. The measuring device is configured to calibrate the lidar and the main camera by: setting a first calibration object in the field of view of the lidar; setting a second calibration object in the field of view of the main camera; setting an auxiliary camera so that its field of view covers the first calibration object and the second calibration object; calculating a first rotation matrix and a first translation vector from the lidar to the auxiliary camera by means of the first calibration object; calculating a second rotation matrix and a second translation vector from the auxiliary camera to the main camera by means of the second calibration object; and calculating a third rotation matrix and a third translation vector from the lidar to the main camera by means of the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector to complete the calibration.
In order to solve the above problem, the present application also discloses a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores instructions that, when executed by a processor, cause the processor to perform the steps of the calibration method for the measurement device as described above.
In order to solve the above problem, the present application also discloses a terminal device, including a processor. The processor is adapted to perform the various steps of the calibration method for a measurement device as described above.
According to the method and the device, an auxiliary camera is introduced as a field-of-view bridge between the lidar and the main camera, and two intermediate calibrations indirectly complete the calibration of the measuring device comprising the lidar and the camera. Calibration can thus be completed even when there is no common field of view between the lidar and the camera, which extends the possible arrangements of the lidar and the camera.
Drawings
In order to explain the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
FIG. 1 is a flow chart of a calibration method for a measurement device according to the present application;
FIG. 2 is a schematic view of a measurement device according to the present application;
FIG. 3 is a partial schematic view of the measurement device of FIG. 2;
FIG. 4 is a schematic view of a measurement device according to another embodiment of the present application; and
fig. 5 is a schematic diagram of a terminal device according to yet another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application are within the scope of protection of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such process, method, article, or apparatus.
The technical means of the present application will be described in detail below with reference to specific embodiments. The following specific embodiments may be combined with one another, and the same or similar concepts or processes may not be repeated in some embodiments.
FIG. 1 is a flow chart of a calibration method for a measurement device according to the present application. The measuring device 1 to which the calibration method applies comprises a laser radar L1 and a main camera C1 which are arranged jointly on the carrier member 3 and do not have a common field of view. In the present embodiment, the main camera C1 and the auxiliary camera A1 are monocular cameras, and the laser radar L1 is a single-line lidar.
The calibration method for the measuring device comprises the following steps:
a first calibration object 4 is placed in the field of view of the laser radar L1;
a second calibration object 5 is placed in the field of view of the main camera C1;
the auxiliary camera A1 is disposed such that the field of view of the auxiliary camera A1 covers the first and second calibration objects 4, 5.
When the laser radar L1 and the auxiliary camera A1 are calibrated, it is necessary to ensure that the first calibration object 4 is within the field of view of both the laser radar L1 and the auxiliary camera A1. When the main camera C1 and the auxiliary camera A1 are calibrated, it is necessary to ensure that the second calibration object 5 is within the field of view of both the main camera C1 and the auxiliary camera A1. The main camera C1 and the auxiliary camera A1 can recognize from their images whether the first calibration object 4 and the second calibration object 5 are visible, but it is difficult for the laser radar L1 to determine whether its detection signal actually hits the first calibration object 4. To address this problem, manual intervention is usually required in the acquisition phase and the data extraction phase to ensure that the data acquired by the laser radar L1 contains information about the position of the first calibration object 4. Specifically, the intervention includes moving the first calibration object 4 while observing the output of the laser radar L1 to confirm the presence of a reflected signal from the first calibration object 4.
In the present embodiment, the first and second calibration objects 4 and 5 may be checkerboard-type calibration plates, whose size can be freely chosen. Of course, the first and second calibration objects 4 and 5 should have sufficient detail so that the data obtained by the lidar L1, the main camera C1, and the auxiliary camera A1 suffice for positioning. Furthermore, the first and second calibration objects 4, 5 should be sufficiently distinguishable from the background that they can be separated in the point cloud captured by the lidar L1 and in the images captured by the main and auxiliary cameras C1, A1.
The calibration method further comprises the following steps:
calculating a first rotation matrix and a first translation vector from the laser radar L1 to the auxiliary camera A1 by means of the first calibration object 4;
calculating a second rotation matrix and a second translation vector from the auxiliary camera A1 to the main camera C1 by means of the second calibration object 5; and
a third rotation matrix and a third translation vector from the laser radar L1 to the main camera C1 are calculated by means of the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector to complete the calibration.
It should be noted that the order of the steps of calculating the first rotation matrix and the first translation vector and the steps of calculating the second rotation matrix and the second translation vector may be reversed.
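The third transformation follows from the first two by plain composition. Below is a minimal numpy sketch of this step (not taken from the patent text; variable names are illustrative), assuming (R1, t1) maps lidar coordinates to auxiliary-camera coordinates and (R2, t2) maps auxiliary-camera coordinates to main-camera coordinates:

```python
import numpy as np

def compose_extrinsics(R1, t1, R2, t2):
    """Chain lidar->auxiliary (R1, t1) with auxiliary->main (R2, t2)
    into lidar->main (R3, t3).

    A point p in the lidar frame maps as:
        p_aux  = R1 @ p + t1
        p_main = R2 @ p_aux + t2 = (R2 @ R1) @ p + (R2 @ t1 + t2)
    so R3 = R2 @ R1 and t3 = R2 @ t1 + t2.
    """
    R3 = R2 @ R1
    t3 = R2 @ t1 + t2
    return R3, t3
```

This is the same operation as multiplying the two homogeneous transformation matrices described later in connection with the transformation matrix (R, t).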
Fig. 2 is a schematic view of a measurement device according to the present application. The measuring device 1 includes a laser radar L1 and a main camera C1 that are provided together on the carrier member 3 and do not have a common field of view. The auxiliary camera A1 is provided on the carrier member 3.
Fig. 3 is a partial schematic view of the measurement device of fig. 2.
Referring to fig. 1, in the step of calculating a first rotation matrix and a first translation vector from the laser radar L1 to the auxiliary camera A1 by means of the first calibration object 4, the first rotation matrix and the first translation vector are calculated from first coordinates of the first calibration object 4 in the coordinate system of the laser radar L1 and second coordinates of the first calibration object 4 in the coordinate system of the auxiliary camera A1. In the step of calculating the second rotation matrix and the second translation vector from the auxiliary camera A1 to the main camera C1 by means of the second calibration object 5, the second rotation matrix and the second translation vector are calculated from third coordinates of the second calibration object 5 in the coordinate system of the auxiliary camera A1 and fourth coordinates of the second calibration object 5 in the coordinate system of the main camera C1.
The first, second, third and fourth coordinates are calculated using computer vision and a PnP (Perspective-n-Point) algorithm. In the step of calculating a first rotation matrix and a first translation vector from the lidar L1 to the auxiliary camera A1 by means of the first calibration object 4, the lidar L1 and the auxiliary camera A1 are adjusted for point cloud acquisition and image acquisition, respectively. At least three corresponding point pairs, i.e., pairs consisting of a three-dimensional point in the point cloud and a two-dimensional point in the image, are found in the acquired point cloud and image. The PnP problem is then solved with these point pairs to calculate the transformation between the coordinate system of the lidar L1 and the coordinate system of the auxiliary camera A1. In the step of calculating a second rotation matrix and a second translation vector from the auxiliary camera A1 to the main camera C1 by means of the second calibration object 5, the auxiliary camera A1 and the main camera C1 are adjusted to obtain their coordinates and orientations in the calibration-object coordinate system, i.e., the rotation matrix R and the translation vector t (collectively referred to as the transformation matrix). This process is the PnP algorithm: taking the calibration object as the coordinate origin, the PnP algorithm calculates the transformation matrix of the camera in the coordinate system of the calibration object from the real sizes of feature points on the calibration object and the pixel positions of the corresponding feature points in the captured image, and from this derives the representation of the calibration object in the coordinate system of the camera.
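As one possible concrete realization of the camera-to-calibration-object PnP step, the sketch below uses OpenCV's solvePnP on a checkerboard. The board geometry, image path, and intrinsics K and dist are illustrative assumptions; the patent does not name a specific library or board size:

```python
import cv2
import numpy as np

# Assumed checkerboard geometry: 9x6 inner corners, 5 cm squares.
cols, rows, square = 9, 6, 0.05
object_points = np.zeros((cols * rows, 3), np.float32)
object_points[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square

# Assumed camera intrinsics (normally obtained beforehand by an
# intrinsic calibration, e.g. cv2.calibrateCamera).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

img = cv2.imread("aux_camera_view.png")  # hypothetical capture of the board
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, (cols, rows))
assert found, "calibration object not visible in the image"

# Pose of the calibration object in the camera coordinate system.
ok, rvec, tvec = cv2.solvePnP(object_points, corners, K, dist)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix from the rotation vector
# (R, tvec) places the board in the camera frame; its inverse gives the
# camera pose in the calibration-object coordinate system.
```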
The lidar itself can obtain three-dimensional coordinates, but these coordinates take the lidar's own position as the origin (the lidar coordinate system). In other words, the lidar cannot directly determine the position of the calibration object. In this case, a three-dimensional calibration object, a clustering algorithm, or another auxiliary method may be used to obtain the coordinate range of the calibration object in the lidar coordinate system. Since a single-line lidar senses the calibration object only as one line within its scanning plane, and according to geometric principles at least two lines are required to determine a plane equation, determining the final position of the calibration object (e.g., a checkerboard calibration plate) requires additional data acquisition to obtain more lines.
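Concretely, once several scan lines across the board have been collected, a plane can be fitted to the stacked points. A minimal SVD-based sketch, assuming the board points have already been segmented out of the point cloud:

```python
import numpy as np

def fit_board_plane(points):
    """Least-squares plane through an Nx3 array of lidar points collected
    from at least two scan lines on the calibration board (one line alone
    does not constrain the plane, as noted above).

    Returns (n, d) with unit normal n such that n . x + d = 0.
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d
```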
By means of the above steps, the plane equation C of the calibration object in the camera coordinate system and the coordinates p of a scan line lying in the plane of the calibration object in the lidar coordinate system can be obtained. From these two quantities (C, p), acquired multiple times, the distance between the laser points and the calibration object can be minimized by the least-squares method to obtain the transformation matrix (R, t) between the camera and the lidar, i.e.:
(R, t) = argmin_(R, t) Σ norm(C, R * p + t)
Here, norm denotes the distance between a laser point and the calibration object; it is in fact a second-order norm, also known as the Euclidean distance. Minimizing this distance yields R and t.
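One way to carry out this minimization, sketched below on the assumption that each board pose yields a plane equation C = (n, d) in the camera frame (from PnP) and a set of lidar line points p, is a nonlinear least-squares fit over an axis-angle parameterization of R. The patent does not prescribe a particular solver; scipy is used here for illustration:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def calibrate_lidar_camera(planes, scans):
    """planes: list of (n, d) board-plane equations C in the camera frame.
    scans:  list of Nx3 arrays of lidar points p on the board, same poses.
    Returns (R, t) mapping lidar coordinates into the camera frame by
    minimizing the point-to-plane distances norm(C, R * p + t).
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        res = []
        for (n, d), pts in zip(planes, scans):
            # Signed Euclidean distance of each transformed lidar point
            # to the board plane in the camera frame.
            res.append((pts @ R.T + t) @ n + d)
        return np.concatenate(res)

    sol = least_squares(residuals, np.zeros(6))
    R = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    t = sol.x[3:]
    return R, t
```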
The transformation matrix (R, t) is typically expressed as:
T = | R  t |
    | 0  1 |
where R is the rotation matrix and t is the translation vector.
Using R and t, a coordinate p1 is mapped to p2 in another coordinate system as follows:
p2 = R * p1 + t, or
p2_homo = T * p1_homo, where the subscript _homo denotes homogeneous coordinates, e.g. [x, y, z, 1].
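A short numeric sketch of these two equivalent forms (all values below are arbitrary examples):

```python
import numpy as np

R = np.eye(3)                      # example rotation matrix
t = np.array([0.1, 0.0, 0.2])      # example translation vector (meters)

T = np.eye(4)                      # T = [[R, t], [0, 1]]
T[:3, :3] = R
T[:3, 3] = t

p1 = np.array([1.0, 2.0, 3.0])
p2 = R @ p1 + t                    # p2 = R * p1 + t
p2_homo = T @ np.append(p1, 1.0)   # p2_homo = T * p1_homo, homo = [x, y, z, 1]
assert np.allclose(p2, p2_homo[:3])
```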
Fig. 4 is a schematic view of a measurement device according to another embodiment of the present application. The difference from the embodiment in fig. 2 is that the auxiliary camera A1 is provided separately from the carrier member 3, for example fixed in the field.
Fig. 5 is a schematic diagram of a terminal device according to yet another embodiment of the present application. The terminal device 700 comprises a processor 70, which processor 70 is adapted to perform the steps of a calibration method for a measuring apparatus as described above. As can also be seen from fig. 4, the terminal device 700 further comprises a non-transitory computer-readable storage medium 71, the non-transitory computer-readable storage medium 71 having stored thereon a computer program which, when executed by the processor 70, performs one of the above-described calibration methods for a measurement apparatus. In practice, the terminal device may be one or more computers, as long as the computer-readable medium and the processor are included.
In addition, besides being implemented as a calibration program for the measurement device, the method steps described in the present application may be implemented in hardware, for example logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, embedded microcontrollers, and the like. Such hardware capable of implementing the methods described herein may also constitute part of the present application.
Specifically, the non-transitory computer-readable storage medium 71 may be a general-purpose storage medium such as a removable disk, a hard disk, flash memory, and the like; when the computer program on the non-transitory computer-readable storage medium 71 is executed, the above-described calibration method for the measurement apparatus is performed. In practical applications, the non-transitory computer-readable storage medium 71 may be included in the device/apparatus/system described in the above embodiments, or may exist separately without being assembled into the device/apparatus/system. The non-transitory computer-readable storage medium 71 carries one or more programs which, when executed, are capable of performing the above-described calibration method for a measurement apparatus.
According to embodiments disclosed herein, the non-transitory computer readable storage medium 71 may include, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing, without limiting the scope of the present disclosure. In the embodiments disclosed herein, the non-transitory computer readable storage medium 71 may be any tangible medium that contains or stores a program, which can be used by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures of the present application illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments disclosed herein. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be appreciated by a person skilled in the art that various combinations and/or couplings of the features recited in the various embodiments and/or claims of the present application are possible, even if such combinations or couplings are not explicitly recited in the present application. In particular, the features recited in the various embodiments and/or claims of the present application may be combined and/or coupled in various ways without departing from the spirit and teachings of the present application, and all such combinations fall within the scope of the present disclosure.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present application, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents, within the technical scope disclosed in the present application; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present application and should be construed as being included therein. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. Calibration method for a measuring device (1), wherein the measuring device (1) comprises a lidar (L1) and a main camera (C1) which are jointly arranged on a carrier member (3) and do not have a common field of view,
characterized in that
the calibration method comprises the following steps:
-providing a first calibration object (4) in the field of view of the laser radar (L1);
-providing a second calibration object (5) in the field of view of the main camera (C1);
-arranging an auxiliary camera (A1) such that the field of view of the auxiliary camera (A1) covers the first and second calibration objects (4, 5);
calculating a first rotation matrix and a first translation vector from the lidar (L1) to the auxiliary camera (A1) by means of the first calibration object (4);
calculating a second rotation matrix and a second translation vector from the auxiliary camera (A1) to the main camera (C1) by means of the second calibration object (5); and
calculating a third rotation matrix and a third translation vector from the lidar (L1) to the primary camera (C1) by means of the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector to complete calibration.
2. The calibration method according to claim 1, wherein,
the auxiliary camera (A1) is disposed on the carrier member (3).
3. The calibration method according to claim 1, wherein,
the auxiliary camera (A1) is provided separately from the carrier member (3).
4. Calibration method according to any one of claims 1 to 3,
the main camera (C1) and the auxiliary camera (A1) are monocular cameras; and
the lidar (L1) is a single line lidar.
5. The calibration method according to claim 1, wherein,
in the step of calculating a first rotation matrix and a first translation vector from the lidar (L1) to the auxiliary camera (A1) by means of the first calibration object (4), the first rotation matrix and the first translation vector are calculated from first coordinates of the first calibration object (4) in the coordinate system of the lidar (L1) and second coordinates of the first calibration object (4) in the coordinate system of the auxiliary camera (A1); and
the first and second coordinates are calculated using a computer vision algorithm.
6. The calibration method according to claim 1, wherein,
in the step of calculating a second rotation matrix and a second translation vector from the auxiliary camera (A1) to the main camera (C1) by means of the second calibration object (5), the second rotation matrix and the second translation vector are calculated from a third coordinate of the second calibration object (5) in the coordinate system of the auxiliary camera (A1) and a fourth coordinate of the second calibration object (5) in the coordinate system of the main camera (C1).
7. The calibration method according to claim 6, wherein,
the third coordinate and the fourth coordinate are calculated by using a computer vision algorithm.
8. A measuring device (1) comprising a lidar (L1) and a main camera (C1) which are arranged jointly on a carrier member (3) and do not have a common field of view,
characterized in that the measuring device (1) is configured to calibrate the lidar (L1) and the primary camera (C1) by:
-providing a first calibration object (4) in the field of view of the laser radar (L1);
-providing a second calibration object (5) in the field of view of the main camera (C1);
-arranging an auxiliary camera (A1) such that the field of view of the auxiliary camera (A1) covers the first and second calibration objects (4, 5);
calculating a first rotation matrix and a first translation vector from the lidar (L1) to the auxiliary camera (A1) by means of the first calibration object (4);
calculating a second rotation matrix and a second translation vector from the auxiliary camera (A1) to the main camera (C1) by means of the second calibration object (5); and
calculating a third rotation matrix and a third translation vector from the lidar (L1) to the primary camera (C1) by means of the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector to complete calibration.
9. A non-transitory computer readable storage medium (71), characterized in that the non-transitory computer readable storage medium (71) stores instructions that, when executed by a processor (70), cause the processor (70) to perform the steps in the calibration method for a measurement apparatus (1) according to any one of claims 1-7.
10. A terminal device (700), characterized in that it comprises a processor (70), said processor (70) being adapted to perform the steps of a calibration method for a measuring apparatus (1) according to any of claims 1-7.
CN202010484365.1A 2020-06-01 2020-06-01 Calibration method for a measuring device and measuring device Pending CN111638500A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010484365.1A CN111638500A (en) 2020-06-01 2020-06-01 Calibration method for a measuring device and measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010484365.1A CN111638500A (en) 2020-06-01 2020-06-01 Calibration method for a measuring device and measuring device

Publications (1)

Publication Number Publication Date
CN111638500A (en) 2020-09-08

Family

ID=72328605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010484365.1A Pending CN111638500A (en) 2020-06-01 2020-06-01 Calibration method for a measuring device and measuring device

Country Status (1)

Country Link
CN (1) CN111638500A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114509762A (en) * 2022-02-15 2022-05-17 南京慧尔视智能科技有限公司 Data processing method, device, equipment and medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016018411A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Measuring and correcting optical misalignment
CN107976669A (en) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 A kind of device of outer parameter between definite camera and laser radar
US20190014310A1 (en) * 2017-07-06 2019-01-10 Arraiy, Inc. Hardware system for inverse graphics capture
CN109767474A (en) * 2018-12-31 2019-05-17 深圳积木易搭科技技术有限公司 A kind of more mesh camera calibration method, apparatus and storage medium
CN110244282A (en) * 2019-06-10 2019-09-17 于兴虎 A kind of multicamera system and laser radar association system and its combined calibrating method
CN110766759A (en) * 2019-10-09 2020-02-07 北京航空航天大学 Multi-camera calibration method and device without overlapped view fields
CN111145269A (en) * 2019-12-27 2020-05-12 武汉大学 Calibration method for external orientation elements of fisheye camera and single-line laser radar


Similar Documents

Publication Publication Date Title
US10529076B2 (en) Image processing apparatus and image processing method
Li et al. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern
EP3032818B1 (en) Image processing device
US7764284B2 (en) Method and system for detecting and evaluating 3D changes from images and a 3D reference model
WO2017022033A1 (en) Image processing device, image processing method, and image processing program
WO2015110847A1 (en) Data-processing system and method for calibration of a vehicle surround view system
KR20170139548A (en) Camera extrinsic parameters estimation from image lines
JP2012253758A (en) Method of calibrating vehicle vision system and vehicle vision system
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
Dawood et al. Harris, SIFT and SURF features comparison for vehicle localization based on virtual 3D model and camera
Perez-Yus et al. Extrinsic calibration of multiple RGB-D cameras from line observations
JP2019032218A (en) Location information recording method and device
Ding et al. A robust detection method of control points for calibration and measurement with defocused images
Martins et al. Monocular camera calibration for autonomous driving—a comparative study
US10252417B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
CN111638500A (en) Calibration method for a measuring device and measuring device
US12002237B2 (en) Position coordinate derivation device, position coordinate derivation method, position coordinate derivation program, and system
US20230070281A1 (en) Methods and systems of generating camera models for camera calibration
Natroshvili et al. Automatic extrinsic calibration methods for surround view systems
US11039114B2 (en) Method for determining distance information from images of a spatial region
Jóźków et al. Combined matching of 2d and 3d kinect™ data to support indoor mapping and navigation
Mazzei et al. A lasers and cameras calibration procedure for VIAC multi-sensorized vehicles
CN110444102B (en) Map construction method and device and unmanned equipment
JP2018125706A (en) Imaging apparatus
Kumar et al. Generalized radial alignment constraint for camera calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201104

Address after: No. 102-1, West Haidian Road, Beijing 100085

Applicant after: Seashell Housing (Beijing) Technology Co.,Ltd.

Address before: Unit 5, Room 112, Office Building C, Nangang Industrial Zone, Binhai New Area Economic and Technological Development Zone, Tianjin 300457

Applicant before: BEIKE TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200908