CN112272757A - External parameter calibration method and device for detection device and movable platform - Google Patents

Info

Publication number
CN112272757A
Authority
CN
China
Prior art keywords
detection device
environment
target
detection
relative
Prior art date
Legal status
Pending
Application number
CN201980038511.3A
Other languages
Chinese (zh)
Inventor
刘天博
李威
Current Assignee
SZ DJI Technology Co Ltd
SZ DJI Innovations Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN112272757A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00 Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

A method, a device and a movable platform (10) for calibrating external parameters of a detection device, wherein at least a first detection device and a second detection device are arranged at different positions of the movable platform (10). The method comprises the following steps: acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device (S301); respectively determining pose data of the first detection device and the second detection device relative to a target environment area according to the first environment detection information and the second environment detection information (S302); and determining a target external parameter between the first detection device and the second detection device according to the pose data (S303). The method does not depend on special calibration equipment or a specific calibration environment, and can improve the flexibility and efficiency of external parameter calibration for the detection devices.

Description

External parameter calibration method and device for detection device and movable platform
Technical Field
The embodiments of the invention relate to the technical field of movable platforms, in particular to a method and a device for calibrating external parameters of a detection device, and a movable platform.
Background
External parameter calibration, or extrinsic calibration for short, is used to calculate the transformation relationship between the positions and orientations of a plurality of different detection devices. Common detection devices include, but are not limited to, an inertial measurement unit, a camera, and a laser radar. After the detection devices are calibrated, the displacement and rotation parameters between any two of them, namely the external parameters, are known. Using these external parameters, positions and attitudes can be converted between the coordinate frames of any two different detection devices.
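To make the notion of an external parameter concrete, the following is a minimal sketch (not taken from the patent; all names and values are illustrative) of how a displacement and rotation between two detection devices is used to convert a point observed by one device into the coordinate frame of the other.

```python
# Minimal sketch: using an external parameter (rotation R_ab, translation t_ab)
# to express a point observed by detection device A in the frame of device B.
# The numerical values are hypothetical.
import numpy as np

def transform_point(point_a, R_ab, t_ab):
    """Map a 3D point from device A's frame into device B's frame."""
    return R_ab @ point_a + t_ab

# Hypothetical extrinsic: device B rotated 90 degrees about z and shifted 0.5 m.
R_ab = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
t_ab = np.array([0.5, 0.0, 0.0])

p_in_b = transform_point(np.array([1.0, 2.0, 0.0]), R_ab, t_ab)
print(p_in_b)  # the same physical point, expressed in device B's coordinates
```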
At present, calibration methods for detection devices on a movable platform generally depend on special environment information: a calibration object needs to be arranged in the movement environment of the movable platform in advance, so external parameter calibration can only be carried out in a specifically constructed area and cannot be carried out elsewhere, which greatly limits the flexibility of external parameter calibration.
Disclosure of Invention
The embodiments of the invention provide an external parameter calibration method and device for a detection device, a movable platform and a storage medium, which allow external parameter calibration to be completed conveniently.
In one aspect, an embodiment of the present invention provides an external parameter calibration method for a detection device, where the method is applied to a movable platform, and at least a first detection device and a second detection device are configured at different positions of the movable platform, and the first detection device and the second detection device are respectively configured to acquire environment detection information, and the method includes:
acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device, wherein the first environment detection information and the second environment detection information both comprise information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
determining pose data of the first detection device relative to the target environment area according to the first environment detection information, and determining pose data of the second detection device relative to the target environment area according to the second environment detection information;
determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
In another aspect, an embodiment of the present invention provides an external parameter calibration device for a detection device, where the device is configured on a movable platform, and at least a first detection device and a second detection device are configured at different positions of the movable platform, where the first detection device and the second detection device are respectively configured to acquire environment detection information, and the device includes:
an obtaining module, configured to obtain first environment detection information detected by the first detection device, and obtain second environment detection information detected by the second detection device, where the first environment detection information and the second environment detection information both include information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
a processing module, configured to determine pose data of the first detection device with respect to the target environment area according to the first environment detection information, and determine pose data of the second detection device with respect to the target environment area according to the second environment detection information;
the processing module is further configured to determine a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
In another aspect, an embodiment of the present invention provides a movable platform, where at least a first detection apparatus and a second detection apparatus are configured at different positions of the movable platform, the first detection apparatus and the second detection apparatus are respectively configured to acquire environment detection information, and the movable platform includes a processor and a communication interface connected to each other, where the communication interface is controlled by the processor to send and receive instructions, and the processor is configured to:
acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device, wherein the first environment detection information and the second environment detection information both comprise information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
determining pose data of the first detection device relative to the target environment area according to the first environment detection information, and determining pose data of the second detection device relative to the target environment area according to the second environment detection information;
determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
In another aspect, an embodiment of the present invention provides another external parameter calibration method for a detection device, where the method is applied to a movable platform, and detection devices are configured at different positions of the movable platform, where the detection devices include a first detection device and a third detection device, and the third detection device includes an inertial sensor, and the method includes:
determining relative translation and relative rotation of the first detection device between different positions at which a target environment area is detected;
determining an acceleration and an angular velocity of the first detecting means based on the relative translation and the relative rotation of the first detecting means between the different positions;
acquiring the acceleration and the angular velocity of the third detection device;
and obtaining the target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device and the angular velocity of the first detection device.
In another aspect, an embodiment of the present invention provides another external parameter calibration apparatus for a detection device, where the apparatus is configured on a movable platform, and detection devices are disposed at different positions of the movable platform, the detection devices including a first detection device and a third detection device, the third detection device including an inertial sensor, and the apparatus includes:
a processing module for determining relative translation and relative rotation between different positions of the first detection device in detecting the target environmental zone;
the processing module is further configured to determine an acceleration and an angular velocity of the first detecting device based on the relative translation and the relative rotation of the first detecting device between the different positions;
the acquisition module is used for acquiring the acceleration and the angular velocity of the third detection device;
the processing module is further configured to obtain a target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device, and the angular velocity of the first detection device.
In another aspect, an embodiment of the present invention provides another movable platform of a detection device, where the detection device is configured at different positions of the movable platform, the detection device includes a first detection device and a third detection device, the third detection device includes an inertial sensor, and the movable platform includes: a processor and a communication interface, the processor to:
determining relative translation and relative rotation of the first detection device between different positions at which a target environment area is detected;
determining an acceleration and an angular velocity of the first detecting means based on the relative translation and the relative rotation of the first detecting means between the different positions;
acquiring the acceleration and the angular velocity of the third detection device;
and obtaining the target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device and the angular velocity of the first detection device.
In another aspect, an embodiment of the present invention provides a computer storage medium, where computer program instructions are stored, and when the computer program instructions are executed, the computer storage medium is used to implement the external parameter calibration method for a detection apparatus described above.
In an embodiment of the present invention, the movable platform may determine, according to the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device, pose data of the first detection device and the second detection device with respect to the target environment region, respectively, and determine, according to the pose data, a target external parameter between the first detection device and the second detection device. With this external parameter calibration approach, the flexibility and efficiency of external parameter calibration for the detection devices can be improved without depending on special calibration equipment or a specific calibration environment.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic view of a scenario of external parameter calibration of a detection apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a calibration scenario of external parameters of another detection apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of an external parameter calibration method for a detection apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating an external parameter calibration method for a detection apparatus according to another embodiment of the present invention;
FIG. 5 is a schematic flow chart illustrating a method for calibrating an external parameter of a detecting device according to another embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an external parameter calibration apparatus of a detection apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a movable platform according to an embodiment of the present invention.
Detailed Description
Calibration of external parameters can be applied in various fields. In the field of automatic driving, for example, calibration of the external parameters of the detection devices mounted on an autonomous vehicle is a necessary and key link in its research, development and production. The embodiments of the invention provide an external parameter calibration method for calibrating the external parameters of detection devices configured at different positions of a movable platform; the external parameters can be calibrated in real time during the movement of the movable platform, or in advance before the movable platform moves.
The movable platform may be any of various mobile devices capable of travelling on public roads, such as autonomous vehicles, intelligent electric vehicles, scooters and self-balancing vehicles. At least a first detection device and a second detection device are arranged at different positions of the movable platform. The first detection device may be any sensor of a target type, including an image sensor (such as, but not limited to, a camera device) or a perception sensor (such as, but not limited to, a laser radar), and the second detection device may be one or more sensors of the same type or of different types.
In one embodiment, when N (N is an integer greater than 1) detection devices of the target type are arranged at different positions of the movable platform, any one of the N detection devices may be determined as the first detection device and the other N-1 detection devices as second detection devices. Exemplarily, assuming that 4 detection devices are disposed at different positions of the movable platform, namely a binocular camera A, a binocular camera B, a lidar C and a lidar D, one sensor at a certain position may be pre-selected as the main sensor (i.e., the first detection device). For example, binocular camera A, binocular camera B, lidar C or lidar D may each be determined as the main sensor, which is not specifically limited in this embodiment of the present application.
Illustratively, referring to fig. 1, one detection device is disposed at each of the front, rear, left and right sides outside the movable platform 10. The detection device disposed at the front side of the movable platform 10 may be selected as the first detection device, and the detection devices disposed at the rear, left and right sides of the movable platform 10 may be selected as second detection devices. During the movement of the movable platform 10, the detection data of all the detection devices (including but not limited to the inertial measurement unit, all the cameras, all the laser radars, etc.) can be continuously collected and buffered in memory. The first detection device is invoked to observe the same environment region (for example, environment region 1 and environment region 2 in fig. 1) within a short time at different positions and orientations; by comparing the detection data of the first detection device when it detects the same environment region, the relative translation and relative rotation of the first detection device between two different positions can be determined, and by accumulating these relative translations and rotations, the motion track of the first detection device can be determined. The first detection device and each second detection device are required to detect the same environment region at least once during the movement.
After the movable platform 10 has finished moving and all the detection data have been collected, the movable platform 10 may automatically check all the collected detection data and, when the same environment region (i.e., the target environment region) is detected, select first detection data (i.e., first environment detection information) collected by the first detection device and second detection data (i.e., second environment detection information) collected by the second detection device. The first detection data and the second detection data may each include point cloud data (e.g., a frame of point cloud collected by a laser radar) and/or image data (e.g., a frame of picture collected by a camera). In this embodiment of the application, for all the detection data of a second detection device that detects a certain environment region, the detection data of the first detection device detecting that environment region at different times can be found. The target environment region is a partial region of the environment detection region corresponding to the movable platform, such as environment region 1 or environment region 2 in fig. 1.
Further, pose data of the first detection device relative to the target environment area is determined according to the first detection data, pose data of the second detection device relative to the target environment area is determined according to the second detection data, and the target external parameter between the first detection device and the second detection device is determined based on the pose data of the first detection device relative to the target environment area and the pose data of the second detection device relative to the target environment area. It can be seen that with this external parameter calibration approach, external parameter calibration of the detection devices can be achieved more flexibly and efficiently, without depending on special calibration equipment or a specific calibration environment, and without requiring a shared field of view between sensors.
Illustratively, referring again to fig. 2, the movable platform 10 is configured with 4 different detection devices around its periphery: the front side of the movable platform 10 is configured with one first detection device, and each of the other sides is configured with one second detection device. At different times, the first detection device and the second detection device located at the right side of the movable platform both detect the same target environment region: environment region 1. In this case, first detection data from the first detection device detecting environment region 1 and second detection data from the second detection device on the right side of the movable platform detecting environment region 1 may be obtained, and then pose data of the first detection device relative to the target environment region and pose data of that second detection device relative to the target environment region may be determined based on the first detection data and the second detection data, respectively. Further, according to the pose data of the first detection device relative to the target environment area and the pose data of the second detection device on the right side of the movable platform relative to the target environment area, the target external parameter between the first detection device and that second detection device is calculated.
The pose data of the first detection device relative to the target environment area comprises first position data and first attitude data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area comprises second position data and second attitude data of the second detection device relative to the target environment area. In one embodiment, the difference between the first position data and the second position data, and the difference between the first attitude data and the second attitude data, may be determined as the target external parameter between the first detection device and the second detection device. The target external parameter includes the displacement and rotation parameters between the first detection device and the second detection device.
The movable platform 10 in fig. 1 and 2 is only an example; in other examples, the movable platform shown in fig. 1 and 2 may also be another mobile device, for example a competition robot, an unmanned aerial vehicle, an unmanned vehicle, or the like, which is not limited herein.
Referring to fig. 3, fig. 3 is a schematic flow chart of an external parameter calibration method for a detection device according to an embodiment of the present invention, where the method according to an embodiment of the present invention may be executed by a movable platform, and at least a first detection device and a second detection device are configured at different positions of the movable platform, and the first detection device and the second detection device are respectively used to acquire environment detection information.
In the external parameter calibration method of the detecting device shown in fig. 3, the movable platform may acquire the first environment detection information detected by the first detecting device and acquire the second environment detection information detected by the second detecting device in S301. The first environment detection information and the second environment detection information both include information about a target environment area, and the target environment area is a partial environment area in the environment detection area corresponding to the movable platform. In one embodiment, the first detection device and the second detection device detect the target environmental zone at different times.
The movable platform can detect a plurality of environment areas through the first detection device and the second detection device in the moving process, the environment detection areas corresponding to the moving process of the movable platform are formed by the environment areas, and the target environment area is any one of the environment areas. For example, referring to fig. 1, an environment detection area corresponding to the current moving process of the movable platform includes an environment area 1 and an environment area 2, and the target environment area is a partial environment area in the environment detection area, and may be, for example, the environment area 1 or the environment area 2.
In one embodiment, the movable platform can use the first detection device and the second detection device to respectively collect environment detection information during the moving process, and store the collected environment detection information in a preset storage area. Further, after the movable platform has finished moving and all the environment detection information has been collected, the movable platform can automatically check all the collected environment detection information, and when the same target environment area is detected in it, obtain the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device.
After the movable platform acquires the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device, in step S302, the position and orientation data of the first detection device relative to the target environment area is determined according to the first environment detection information, and the position and orientation data of the second detection device relative to the target environment area is determined according to the second environment detection information.
Both the first detection device and the second detection device may be an image sensor (e.g., a camera device) or a perception sensor. Accordingly, the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device may each include point cloud data about the target environment region or image data about the target environment region.
The perception sensor may be, for example, a laser radar, which can obtain three-dimensional information of a scene. The laser radar actively emits a laser pulse signal towards a detected object and receives the laser pulse signal reflected by that object; the depth information of the detected object is calculated from the time difference between emitting the laser pulse signal and receiving the reflected laser pulse signal, and the propagation speed of the laser pulse signal. According to the emission direction of the laser radar, the angle information of the detected object relative to the laser radar is obtained. Combining the depth information and the angle information yields a large number of detection points; the data set of these detection points is called a point cloud, and the spatial three-dimensional information of the detected object relative to the laser radar can be reconstructed based on the point cloud.
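As an illustration of the time-of-flight relation described above, the sketch below converts a round-trip time and emission angles into one detection point; the function and variable names are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch: depth from the laser pulse round-trip time, combined with
# the emission angles, gives one detection point in the laser radar frame.
import numpy as np

C = 299_792_458.0  # propagation speed of the laser pulse signal (m/s)

def lidar_point(round_trip_time, azimuth, elevation):
    depth = C * round_trip_time / 2.0             # half of the round-trip distance
    x = depth * np.cos(elevation) * np.cos(azimuth)
    y = depth * np.cos(elevation) * np.sin(azimuth)
    z = depth * np.sin(elevation)
    return np.array([x, y, z])

# A return received ~66.7 ns after emission at azimuth 30 deg lies about 10 m away.
p = lidar_point(66.7e-9, np.deg2rad(30.0), 0.0)
```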
In one embodiment, assuming that the second environment detection information includes image data about the target environment area, determining the pose data of the second detection device relative to the target environment area according to the second environment detection information may be: and processing the image data of the target environment area based on an image algorithm to obtain the pose data of the second detection device relative to the target environment area. Wherein the pose data of the second detection device relative to the target environment area comprises second position data and second pose data of the second detection device relative to the target environment area. The second position data may be world coordinates of the second detecting device relative to the target environmental region, and the second posture data may be a rotation angle of the second detecting device relative to the target environmental region.
Illustratively, the second detection device is a camera, and the second environment detection information includes image data about the target environment area; the image data may be a picture J1 of the target environment area, and the image algorithm may be Perspective-n-Point (PnP). In this case, the movable platform may use PnP to solve the world coordinates and rotation angle, relative to the target environment area, of the camera at the moment it captured picture J1, by combining the world coordinates of feature points in the world coordinate system with their imaging (i.e., pixel coordinates) in picture J1; these may be represented by a translation matrix (t1) and a rotation matrix (R1), respectively.
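A minimal sketch of this PnP step follows, using OpenCV's solvePnP as one possible implementation (the patent does not prescribe a library); the correspondences and camera intrinsics below are hypothetical values.

```python
# Hedged illustration: solving the camera pose relative to the target environment
# area from 3D feature points and their pixel coordinates in picture J1.
import cv2
import numpy as np

object_points = np.array([[0.0, 0.0, 0.0],      # world coordinates of feature
                          [1.0, 0.0, 0.0],      # points in the target environment
                          [1.0, 1.0, 0.0],      # area (hypothetical, metres)
                          [0.0, 1.0, 0.0]], dtype=np.float64)
image_points = np.array([[320.0, 240.0],
                         [420.0, 238.0],
                         [418.0, 340.0],
                         [322.0, 342.0]], dtype=np.float64)   # pixel coordinates in J1
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])                         # camera intrinsic matrix

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R_wc, _ = cv2.Rodrigues(rvec)   # rotation mapping world points into the camera frame
# The camera's own pose (R1, t1) relative to the area is the inverse transform:
R1 = R_wc.T
t1 = -R_wc.T @ tvec
```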
Similarly, assuming that the first environment detection information includes image data about the target environment region, the determining the pose data of the first detection apparatus with respect to the target environment region according to the first environment detection information may be: and processing the image data of the target environment area based on an image algorithm to obtain the pose data of the first detection device relative to the target environment area.
In one embodiment, assuming that the second environment detection information includes point cloud data about the target environment area, determining the pose data of the second detection device relative to the target environment area according to the second environment detection information may be: and processing the Point cloud data of the target environment area based on an Iterative Closest Point algorithm (ICP) to obtain pose data of the second detection device relative to the target environment area.
Illustratively, the second detection device is a laser radar, and the second environment detection information includes point cloud data about the target environment area; the point cloud data may be a frame of point cloud about the target environment area. The movable platform may process the point cloud using ICP to solve for the world coordinates and rotation angle, relative to the target environment area, of the laser radar at the moment it acquired that frame of point cloud; these may be represented by a translation matrix (t2) and a rotation matrix (R2), respectively.
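For the point-cloud case, the sketch below shows one way such an ICP solve could look, using Open3D's point-to-point ICP as an illustration; the library choice, function names and threshold are assumptions, not the patent's prescribed method.

```python
# Hedged illustration: aligning a frame of point cloud about the target environment
# area to a reference cloud of that area with ICP, yielding a 4x4 transform that
# contains the rotation matrix (R2) and the translation (t2).
import numpy as np
import open3d as o3d

def estimate_pose_icp(scan: o3d.geometry.PointCloud,
                      reference: o3d.geometry.PointCloud,
                      max_corr_dist: float = 0.5) -> np.ndarray:
    init = np.eye(4)  # initial guess for the transformation
    result = o3d.pipelines.registration.registration_icp(
        scan, reference, max_corr_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # R2 in the top-left 3x3 block, t2 in the last column
```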
Similarly, assuming that the first environment detection information includes point cloud data about the target environment region, the specific embodiment of determining the pose data of the first detection device relative to the target environment region according to the first environment detection information may be: processing the point cloud data about the target environment area based on ICP to obtain the pose data of the first detection device relative to the target environment area.
Further, after the movable platform determines the pose data of the first detection device and the pose data of the second detection device relative to the target environment area, the target external parameter between the first detection device and the second detection device may be determined in step S303 based on the pose data of the first detection device relative to the target environment area and the pose data of the second detection device relative to the target environment area.
Wherein the pose data of the first detecting device relative to the target environment area comprises first position data and first attitude data of the first detecting device relative to the target environment area, and the pose data of the second detecting device relative to the target environment area comprises second position data and second attitude data of the second detecting device relative to the target environment area.
In one embodiment, the target external parameter may be the displacement and rotation parameters between the first detection device and the second detection device. The first position data and the first attitude data may be the position coordinates S1 and the rotation angle α1 of the first detection device when it detects the target environment area, and the second position data and the second attitude data may be the position coordinates S2 and the rotation angle α2 of the second detection device when it detects the target environment area. In this case, the movable platform may determine the difference between the position coordinates S1 and the position coordinates S2 as the displacement between the first detection device and the second detection device, and determine the difference between the rotation angle α1 and the rotation angle α2 as the rotation parameter between the first detection device and the second detection device.
In one embodiment, the target external parameter may be a translation matrix and a rotation matrix between the first detection device and the second detection device. The first position data and the first attitude data may be a first translation matrix and a first rotation matrix relative to the target environment region when the first detection device detects the target environment region, and the second position data and the second attitude data may be a second translation matrix and a second rotation matrix relative to the target environment region when the second detection device detects the target environment region. The movable platform may then calculate the translation matrix between the first detection device and the second detection device based on the first translation matrix and the second translation matrix, and calculate the rotation matrix between the first detection device and the second detection device based on the first rotation matrix and the second rotation matrix.
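Under the assumption that both poses are expressed as homogeneous transforms from the target-environment-region frame into each device's frame, one way to compose the translation and rotation matrices into the target external parameter is sketched below (illustrative, not the patent's prescribed formula).

```python
# Sketch: combining the two device poses into the extrinsic that maps coordinates
# in the first device's frame into the second device's frame.
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).reshape(3)
    return T

def extrinsic_between(R1, t1, R2, t2):
    """Given T1: region -> first device and T2: region -> second device,
    return (R, t) of the transform first device -> second device."""
    T1 = make_T(R1, t1)
    T2 = make_T(R2, t2)
    T_12 = T2 @ np.linalg.inv(T1)
    return T_12[:3, :3], T_12[:3, 3]   # rotation matrix and translation vector
```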
In one embodiment, the movable platform may determine a first external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region. Further, a second external parameter between the first detection device and the second detection device may be determined based on the pose data of the first detection device relative to a reference environment area and the pose data of the second detection device relative to the reference environment area, and the first external parameter and the second external parameter are subjected to data processing to obtain the target external parameter between the first detection device and the second detection device.
The specific implementation of performing data processing on the first external parameter and the second external parameter to obtain the target external parameter between the first detection device and the second detection device may be: the first and second external parameters are averaged, and the averaged value is determined as the target external parameter between the first detection device and the second detection device.
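One reasonable reading of this averaging step is sketched below: the translation estimates are averaged directly and the rotation estimates are averaged with SciPy's rotation mean. This is an assumption about the "data processing", not the patent's prescribed formula.

```python
# Sketch: averaging two (or more) extrinsic estimates (R, t), e.g. one obtained
# from the target environment area and one from the reference environment area.
import numpy as np
from scipy.spatial.transform import Rotation

def average_extrinsics(extrinsics):
    """extrinsics: iterable of (R, t) pairs, each a 3x3 rotation and a 3-vector."""
    Rs = np.stack([R for R, _ in extrinsics])
    ts = np.stack([t for _, t in extrinsics])
    R_avg = Rotation.from_matrix(Rs).mean().as_matrix()   # rotation average
    t_avg = ts.mean(axis=0)                               # translation average
    return R_avg, t_avg
```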
In an embodiment, the target environmental area and the reference environmental area may be different, the target environmental area is a partial environmental area in the environmental detection area corresponding to the movable platform, and the reference environmental area is another partial environmental area in the environmental detection area. For example, referring to fig. 1, the environment detection region corresponding to the current moving process of the movable platform includes an environment region 1 and an environment region 2, the target environment region is the environment region 1, and then the reference environment region may be the environment region 2, and in the moving process of the current movable platform, the first detection device and the second detection device both detect the environment region 1 and the environment region 2 at least once at different times.
In one embodiment, the target environment region and the reference environment region may be the same. In this case, during the movement of the movable platform, the first detection device and the second detection device each detect the target environment region at different times at least twice. For example, during the movement of the movable platform, the first detection device and the second detection device each detect the target environment region twice at different times; the specific detection times are shown in Table 1. In this case, the movable platform may determine a first external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region at the 5th minute and the pose data of the second detection device relative to the target environment region at the 10th minute, and determine a second external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region at the 20th minute and the pose data of the second detection device relative to the target environment region at the 25th minute. Further, the first external parameter and the second external parameter are subjected to data processing to obtain the target external parameter between the first detection device and the second detection device.
TABLE 1
Detection time (minutes)    Detection device
5th minute                  First detection device
10th minute                 Second detection device
20th minute                 First detection device
25th minute                 Second detection device
In an embodiment of the present invention, the movable platform may determine, according to the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device, pose data of the first detection device and the second detection device with respect to the target environment region, respectively, and determine, according to the pose data, a target external parameter between the first detection device and the second detection device. With this external parameter calibration approach, the flexibility and efficiency of external parameter calibration for the detection devices can be improved without depending on special calibration equipment or a specific calibration environment.
Referring to fig. 4, fig. 4 is a schematic flow chart of another external parameter calibration method for a detection device according to an embodiment of the present invention, where the method according to an embodiment of the present invention may be executed by a movable platform, and at least a first detection device and a second detection device are configured at different positions of the movable platform, and the first detection device and the second detection device are respectively used to acquire environment detection information.
In the external parameter calibration method of the detecting device shown in fig. 4, the movable platform may acquire the first environment detection information detected by the first detecting device and acquire the second environment detection information detected by the second detecting device in S401. Further, in step S402, the pose data of the first detecting device relative to the target environment area is determined according to the first environment detection information, and the pose data of the second detecting device relative to the target environment area is determined according to the second environment detection information. For specific implementation of steps S401 to S402, reference may be made to the related description of steps S301 to S302 in the foregoing embodiment, and details are not repeated here.
Further, after the movable platform determines the pose data of the first detection device and the pose data of the second detection device relative to the target environment area, the motion trajectory of the first detection device may be obtained in step S403, and the target external parameter between the first detection device and the second detection device may be calculated according to the motion trajectory, the pose data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area.
In one embodiment, before the movable platform acquires the motion track of the first detection device, each second environment detection information detected by the first detection device at different positions may be acquired, each second environment detection information includes information about the target environment region, and further, based on each second environment detection information, a relative translation and a relative rotation of the first detection device between different positions are determined, and based on the relative translation and the relative rotation, the motion track of the first detection device is determined.
For example, during the movement of the movable platform, the first detection device may be invoked to observe the same environment region (e.g., the target environment region) within a short time at different positions and rotation angles. The relative translation and relative rotation of the first detection device between the different positions are determined by comparing the corresponding second environment detection information when the first detection device detects the target environment region at the different positions, and the motion trajectory of the first detection device can be determined by accumulating these relative translations and relative rotations, as sketched below.
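A minimal sketch of this accumulation (names illustrative, not from the patent): each relative translation and relative rotation is chained onto the previous pose to build the motion trajectory.

```python
# Sketch: accumulating relative (R, t) steps of the first detection device
# into a motion trajectory of absolute poses.
import numpy as np

def accumulate_trajectory(relative_motions):
    """relative_motions: list of (R, t) pairs, each the motion between
    two successive positions of the first detection device."""
    T = np.eye(4)
    trajectory = [T.copy()]
    for R, t in relative_motions:
        step = np.eye(4)
        step[:3, :3] = R
        step[:3, 3] = t
        T = T @ step                 # chain the relative pose onto the trajectory
        trajectory.append(T.copy())
    return trajectory
```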
In one embodiment, the movable platform is further provided with a third detection device comprising an inertial sensor. After determining the relative translation and relative rotation of the first detection device between the different positions, the movable platform can determine the acceleration and angular velocity of the first detection device based on that relative translation and relative rotation, acquire the acceleration and angular velocity of the inertial sensor, and obtain the target external parameter between the first detection device and the inertial sensor by comparing the acceleration of the inertial sensor, the angular velocity of the inertial sensor, the acceleration of the first detection device and the angular velocity of the first detection device.
For example, the inertial sensor may output measured acceleration and angular velocity in real time during the movement of the movable platform. Similarly, when calculating the motion trajectory of the first detection device, the acceleration and angular velocity of the first detection device can be obtained by differentiating its position to second order (i.e., from the relative translations of the first detection device between different positions) and differentiating its attitude to first order (i.e., from the relative rotations of the first detection device between different positions). Further, by comparing the acceleration and angular velocity of the inertial sensor with the acceleration and angular velocity of the first detection device, the relative position and attitude between the inertial sensor and the first detection device, i.e., the target external parameter, can be obtained.
In an embodiment of the present invention, the movable platform may determine, according to the first environment detection information detected by the first detection device and the second environment detection information detected by the second detection device, pose data of the first detection device and the second detection device with respect to the target environment region, respectively, and determine, according to the pose data, a target external parameter between the first detection device and the second detection device. With this external parameter calibration approach, the flexibility and efficiency of external parameter calibration for the detection devices can be improved without depending on special calibration equipment or a specific calibration environment.
Referring to fig. 5, fig. 5 is a schematic flow chart of a method for calibrating an external parameter of a detection apparatus according to an embodiment of the present invention, where the method according to an embodiment of the present invention may be performed by a movable platform, and the detection apparatus is disposed at different positions of the movable platform, and includes a first detection apparatus and a third detection apparatus, and the third detection apparatus includes an inertial sensor.
In the external parameter calibration method of the detection device shown in fig. 5, the relative translation and relative rotation of the first detection device between the different positions at which the target environment region is detected may be determined in step S501, and the acceleration and angular velocity of the first detection device may be determined based on this relative translation and relative rotation.
In one embodiment, determining the relative translation and relative rotation between different positions of the first detection device at which the target environmental zone is detected comprises: when the movable platform moves, the first detection device is called to detect the target environment regions at different positions and postures, and the second environment detection information detected by the first detection device at different positions is obtained, wherein the second environment detection information comprises information about the target environment regions. Further, based on the respective second environment detection information, a relative translation and a relative rotation of the first detection device between the different positions are determined.
Exemplarily, assume a target environment area such as environment region 1 shown in fig. 1, and assume that the movable platform detects the same environment region 1 at 3 positions, A, B and C, during its movement; the respective second environment detection information detected by the first detection device at these different positions indicates the detection time, position coordinates and rotation angle corresponding to each position, as shown in Table 2. In this case, the relative translation S2 - S1 and the relative rotation α21 of the first detection device between detection position A and detection position B, and the relative translation S3 - S2 and the relative rotation α32 between detection position B and detection position C, may be calculated based on the above second environment detection information. Further, the acceleration of the first detection device is determined by second-order differentiation of its position, and accordingly the angular velocity of the first detection device can be determined by first-order differentiation of its attitude.
TABLE 2
Detection position    Detection time    Position coordinates    Rotation angle
A                     T1                S1                      α1
B                     T2                S2                      α2
C                     T3                S3                      α3
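Using the symbols of Table 2, the differencing described above can be sketched as follows; the central-difference form is an assumption, since the patent only states that the position is differentiated twice and the attitude once.

```python
# Sketch: finite-difference estimates of the first detection device's acceleration
# and angular velocity from its positions/attitudes at detection positions A, B, C.
import numpy as np

def linear_acceleration(S1, S2, S3, T1, T2, T3):
    v12 = (S2 - S1) / (T2 - T1)          # mean velocity between positions A and B
    v23 = (S3 - S2) / (T3 - T2)          # mean velocity between positions B and C
    return (v23 - v12) / ((T3 - T1) / 2.0)

def angular_velocity(alpha1, alpha2, T1, T2):
    return (alpha2 - alpha1) / (T2 - T1)  # first-order difference of the attitude

# The resulting values can then be compared with the acceleration and angular
# velocity output by the inertial sensor (third detection device) to estimate
# the target external parameter between the two devices.
```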
The acceleration and angular velocity of the third detection device are acquired in step S502, and the target external parameter between the first detection device and the third detection device is obtained in step S503 by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device, and the angular velocity of the first detection device.
For example, during the movement of the movable platform, the inertial sensor (i.e., the third detecting device) may output the measured acceleration and angular velocity to the movable platform in real time, and after receiving the acceleration and angular velocity, the movable platform may store the measured acceleration and angular velocity in the preset region. When step S502 is executed subsequently, the acceleration and the angular velocity measured by the inertial sensor may be acquired from the preset region.
In the embodiment of the invention, the movable platform can realize the external parameter calibration between the first detection device and the inertial sensor more efficiently and flexibly without depending on special calibration equipment and a specific calibration environment.
Based on the above description of the method embodiments, in an embodiment, an external parameter calibration apparatus of the detection device shown in fig. 6 is further provided in an embodiment of the present invention. The apparatus may be configured on, but is not limited to, a movable platform, at least a first detection device and a second detection device are configured at different positions of the movable platform, the first detection device and the second detection device are respectively configured to acquire environment detection information, and the external parameter calibration apparatus includes:
an obtaining module 60, configured to obtain first environment detection information detected by the first detection device, and obtain second environment detection information detected by the second detection device, where the first environment detection information and the second environment detection information each include information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
a processing module 61, configured to determine pose data of the first detecting device relative to the target environment area according to the first environment detection information, and determine pose data of the second detecting device relative to the target environment area according to the second environment detection information;
the processing module 61 is further configured to determine a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment area and the pose data of the second detection device relative to the target environment area.
In one embodiment, the first detecting means is any one of target type sensors including an image sensor and a perception sensor.
In an embodiment, the second environment detection information includes image data about the target environment area, and the processing module 61 is specifically configured to process the image data about the target environment area based on an image algorithm to obtain pose data of the second detection device relative to the target environment area.
In an embodiment, the second environment detection information includes point cloud data about the target environment area, and the processing module 61 is specifically configured to process the point cloud data about the target environment area based on an iterative closest point algorithm, so as to obtain pose data of the second detection device relative to the target environment area.
In one embodiment, the processing module 61 is specifically configured to determine a first external parameter between the first detecting device and the second detecting device based on the pose data of the first detecting device relative to the target environment area and the pose data of the second detecting device relative to the target environment area;
determining a second extrinsic parameter between the first detecting means and the second detecting means based on the pose data of the first detecting means with respect to a reference environmental area and the pose data of the second detecting means with respect to the reference environmental area; and carrying out data processing on the first external parameter and the second external parameter to obtain a target external parameter between the first detection device and the second detection device.
In one embodiment, the target external parameter is a translation matrix and a rotation matrix between the first detection device and the second detection device.
In one embodiment, the first detection device and the second detection device detect the target environmental zone at different times.
In an embodiment, the processing module 61 is specifically configured to obtain a motion trajectory of the first detection device, and to calculate a target external parameter between the first detection device and the second detection device according to the motion trajectory, the pose data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area.
In one embodiment, the obtaining module 60 is further configured to obtain respective pieces of second environment detection information detected by the first detection device at different positions, where each piece of second environment detection information includes information about the target environment region; the processing module 61 is further configured to determine, based on the respective pieces of second environment detection information, a relative translation and a relative rotation of the first detection device between the different positions, and determine the motion trajectory of the first detection device based on the relative translation and the relative rotation.
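The motion trajectory itself can be accumulated by chaining the per-step relative translation and rotation; the convention below, where each relative transform is the pose of position k+1 expressed in the frame of position k, is an assumption made only for illustration.

```python
# Hedged sketch: chain relative poses into a trajectory expressed in the
# frame of the first observation position.
import numpy as np

def accumulate_trajectory(relative_poses):
    """relative_poses: iterable of 4x4 transforms, each the pose of position
    k+1 expressed in the frame of position k. Returns absolute poses."""
    poses = [np.eye(4)]                     # position 0 defines the origin
    for T_rel in relative_poses:
        poses.append(poses[-1] @ T_rel)     # pose of position k+1 in frame 0
    return poses
```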
In one embodiment, the movable platform is further provided with a third detection device comprising an inertial sensor, and the processing module 61 is further configured to determine an acceleration and an angular velocity of the first detection device based on the relative translation and the relative rotation of the first detection device between the different positions; acquire the acceleration and the angular velocity of the inertial sensor; and obtain the target external parameter between the first detection device and the inertial sensor by comparing the acceleration of the inertial sensor, the angular velocity of the inertial sensor, the acceleration of the first detection device, and the angular velocity of the first detection device.
In one embodiment, the detection device comprises a first detection device and a third detection device, the third detection device comprising an inertial sensor. The processing module 61 is further configured to determine a relative translation and a relative rotation of the first detection device between different positions at which the target environment region is detected, and determine an acceleration and an angular velocity of the first detection device based on the relative translation and the relative rotation of the first detection device between the different positions; the obtaining module 60 is further configured to obtain an acceleration and an angular velocity of the third detection device; and the processing module 61 is further configured to obtain a target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device, and the angular velocity of the first detection device.
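The comparison between the first detection device's motion and the inertial measurements is not spelled out in the disclosure. One plausible reading is to differentiate the estimated relative rotations into angular velocities and align them with the gyroscope readings in a least-squares sense, which yields the rotational part of the external parameter; acceleration would be treated analogously (second differences of position, with gravity compensated). The sketch below covers only the angular-velocity part and uses illustrative names throughout.

```python
# Hedged sketch: angular velocity from the device's estimated orientations,
# then a Kabsch-style alignment against the inertial sensor's gyro readings.
import numpy as np

def angular_velocity_from_rotations(rotations, dt):
    """rotations: list of 3x3 orientation matrices sampled every dt seconds.
    Returns body-frame angular velocities as an (N-1) x 3 array."""
    omegas = []
    for R0, R1 in zip(rotations[:-1], rotations[1:]):
        dR = R0.T @ R1                                     # incremental rotation
        angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
        if angle < 1e-9:
            omegas.append(np.zeros(3))
            continue
        axis = np.array([dR[2, 1] - dR[1, 2],
                         dR[0, 2] - dR[2, 0],
                         dR[1, 0] - dR[0, 1]]) / (2.0 * np.sin(angle))
        omegas.append(axis * angle / dt)
    return np.array(omegas)

def rotation_extrinsic(omega_device, omega_imu):
    """Least-squares rotation R with omega_imu ~= R @ omega_device (Kabsch)."""
    H = omega_device.T @ omega_imu
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                               # keep a proper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R
```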
In an embodiment, the processing module 61 is specifically configured to invoke the first detecting device to detect the target environment area at different positions and postures when the movable platform moves; acquiring each piece of second environment detection information detected by the first detection device at the different positions, wherein each piece of second environment detection information comprises information about the target environment area; determining a relative translation and a relative rotation of the first detection device between the different positions based on the respective second environment detection information.
In the embodiment of the present invention, for the specific implementation of the above modules, reference may be made to the descriptions of relevant contents in the embodiments corresponding to Fig. 3, Fig. 4, or Fig. 5.
Fig. 7 is a schematic block diagram of a structure of a movable platform according to an embodiment of the present invention. The movable platform comprises a processor 70, a communication interface 71, and a memory 72, which are connected through a bus; the memory 72 is used for storing program instructions and environment detection information.
The memory 72 may include a volatile memory, such as a random-access memory (RAM); the memory 72 may also include a non-volatile memory, such as a flash memory or a solid-state drive (SSD); the memory 72 may also be a double data rate synchronous dynamic random access memory (DDR SDRAM); the memory 72 may also comprise a combination of the above types of memories.
In an embodiment of the present invention, the memory 72 is configured to store a computer program, the computer program includes program instructions, and the processor 70 is configured to perform the following when the program instructions are invoked: acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device, wherein the first environment detection information and the second environment detection information both comprise information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform; determining pose data of the first detection device relative to the target environment area according to the first environment detection information, and determining pose data of the second detection device relative to the target environment area according to the second environment detection information; and determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
In one embodiment, the first detection device is any one of a target type of sensors, the target type of sensors including an image sensor and a perception sensor.
In an embodiment, the second environment detection information includes image data about the target environment area, and the processor 70 is specifically configured to process the image data about the target environment area based on an image algorithm to obtain pose data of the second detection device relative to the target environment area.
In one embodiment, the second environment detection information includes point cloud data about the target environment area, and the processor 70 is further specifically configured to process the point cloud data about the target environment area based on an iterative closest point algorithm to obtain pose data of the second detection device with respect to the target environment area.
In one embodiment, the processor 70 is further specifically configured to determine a first external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region; determining a second extrinsic parameter between the first detecting means and the second detecting means based on the pose data of the first detecting means with respect to a reference environmental area and the pose data of the second detecting means with respect to the reference environmental area; and carrying out data processing on the first external parameter and the second external parameter to obtain a target external parameter between the first detection device and the second detection device.
In one embodiment, the target external parameter comprises a translation matrix and a rotation matrix between the first detection device and the second detection device.
In one embodiment, the first detection device and the second detection device detect the target environmental zone at different times.
In an embodiment, the processor 70 is further specifically configured to obtain a motion trajectory of the first detection device, and calculate a target external parameter between the first detection device and the second detection device according to the motion trajectory, the pose data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area.
In one embodiment, the processor 70 is further configured to obtain respective pieces of second environment detection information detected by the first detection device at different positions, where each piece of second environment detection information includes information about the target environment area; determine relative translation and relative rotation of the first detection device between the different positions based on the respective pieces of second environment detection information; and determine the motion trajectory of the first detection device based on the relative translation and the relative rotation.
In one embodiment, the movable platform is further provided with a third detection device comprising an inertial sensor, and the processor 70 is further configured to determine an acceleration and an angular velocity of the first detection device based on the relative translation and the relative rotation of the first detection device between the different positions; acquire the acceleration and the angular velocity of the inertial sensor; and obtain the target external parameter between the first detection device and the inertial sensor by comparing the acceleration of the inertial sensor, the angular velocity of the inertial sensor, the acceleration of the first detection device, and the angular velocity of the first detection device.
In one embodiment, the detection device further comprises a third detection device, the third detection device comprises an inertial sensor, and the processor 70 is further configured to execute the following when the program instructions are invoked: determining relative translation and relative rotation of the first detection device between different positions at which the target environment area is detected; determining an acceleration and an angular velocity of the first detection device based on the relative translation and the relative rotation of the first detection device between the different positions; acquiring the acceleration and the angular velocity of the third detection device; and obtaining the target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device, and the angular velocity of the first detection device.
In one embodiment, the processor 70 is further specifically configured to: when the movable platform moves, calling the first detection device to detect the target environment area in different positions and postures; acquiring each piece of second environment detection information detected by the first detection device at the different positions, wherein each piece of second environment detection information comprises information about the target environment area; determining a relative translation and a relative rotation of the first detection device between the different positions based on the respective second environment detection information.
In the embodiment of the present invention, for the specific implementation of the processor 70, reference may be made to the descriptions of relevant contents in the embodiments corresponding to Fig. 3, Fig. 4, or Fig. 5.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described with reference to a number of embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (28)

1. An external parameter calibration method for a detection device is characterized in that the method is applied to a movable platform, at least a first detection device and a second detection device are arranged at different positions of the movable platform, and the first detection device and the second detection device are respectively used for acquiring environment detection information, and the method comprises the following steps:
acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device, wherein the first environment detection information and the second environment detection information both comprise information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
determining pose data of the first detection device relative to the target environment area according to the first environment detection information, and determining pose data of the second detection device relative to the target environment area according to the second environment detection information;
determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
2. The method of claim 1, wherein the first detection device is any one of a target type of sensors, the target type of sensors including an image sensor and a perception sensor.
3. The method according to claim 1 or 2, wherein the second environment detection information comprises image data about the target environment area, and wherein the determining pose data of the second detection device relative to the target environment area from the second environment detection information comprises:
and processing the image data about the target environment area based on an image algorithm to obtain the pose data of the second detection device relative to the target environment area.
4. The method of claim 1 or 2, wherein the second environmental detection information comprises point cloud data regarding the target environmental area, and wherein determining pose data of the second detection device relative to the target environmental area from the second environmental detection information comprises:
and processing the point cloud data about the target environment area based on an iterative closest point algorithm to obtain pose data of the second detection device relative to the target environment area.
5. The method of claim 1, wherein the determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region comprises:
determining a first external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment area and the pose data of the second detection device relative to the target environment area;
determining a second extrinsic parameter between the first detecting means and the second detecting means based on the pose data of the first detecting means with respect to a reference environmental area and the pose data of the second detecting means with respect to the reference environmental area;
and carrying out data processing on the first external parameter and the second external parameter to obtain a target external parameter between the first detection device and the second detection device.
6. The method of claim 1, wherein the target external parameter comprises a translation matrix and a rotation matrix between the first detection device and the second detection device.
7. The method of claim 1, wherein the first detection device and the second detection device detect the target environmental zone at different times.
8. The method of claim 1, wherein the determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region comprises:
acquiring a motion trajectory of the first detection device;
and calculating a target external parameter between the first detection device and the second detection device according to the motion trajectory, the pose data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area.
9. The method of claim 8, wherein prior to the acquiring of the motion trajectory of the first detection device, the method further comprises:
acquiring each piece of second environment detection information detected by the first detection device at different positions, wherein each piece of second environment detection information comprises information about the target environment area;
determining relative translation and relative rotation of the first detection device between the different positions based on the respective second environmental detection information;
and determining the motion trajectory of the first detection device based on the relative translation and the relative rotation.
10. The method of claim 1, wherein the movable platform is further provided with a third detection device, the third detection device comprising an inertial sensor, the method further comprising:
determining acceleration and angular velocity of the first detecting means based on relative translation and relative rotation of the first detecting means between different positions;
acquiring the acceleration and the angular velocity of the inertial sensor;
and obtaining the target external parameter between the first detection device and the inertial sensor by comparing the acceleration of the inertial sensor, the angular velocity of the inertial sensor, the acceleration of the first detection device and the angular velocity of the first detection device.
11. A method for calibrating an external parameter of a probe device, the method being applied to a movable platform, the probe device being disposed at different positions of the movable platform, the probe device including a first probe device and a third probe device, the third probe device including an inertial sensor, the method comprising:
determining relative translation and relative rotation of the first detection device between different positions at which the target environment area is detected;
determining an acceleration and an angular velocity of the first detecting means based on the relative translation and the relative rotation of the first detecting means between the different positions;
acquiring the acceleration and the angular velocity of the third detection device;
and obtaining the target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device and the angular velocity of the first detection device.
12. The method of claim 11, wherein the determining of the relative translation and relative rotation of the first detection device between different positions at which the target environment area is detected comprises:
when the movable platform moves, calling the first detection device to detect the target environment area in different positions and postures;
acquiring each piece of second environment detection information detected by the first detection device at the different positions, wherein each piece of second environment detection information comprises information about the target environment area;
determining a relative translation and a relative rotation of the first detection device between the different positions based on the respective second environment detection information.
13. An external parameter calibration device for a detection device, wherein the device is configured on a movable platform, and at least a first detection device and a second detection device are configured at different positions of the movable platform, and the first detection device and the second detection device are respectively used for acquiring environment detection information, and the device comprises:
an obtaining module, configured to obtain first environment detection information detected by the first detection device, and obtain second environment detection information detected by the second detection device, where the first environment detection information and the second environment detection information both include information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
a processing module, configured to determine pose data of the first detection device with respect to the target environment area according to the first environment detection information, and determine pose data of the second detection device with respect to the target environment area according to the second environment detection information;
the processing module is further configured to determine a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
14. A movable platform, wherein at least a first detection device and a second detection device are disposed at different positions of the movable platform, and the first detection device and the second detection device are respectively configured to acquire environment detection information, the movable platform comprises a processor and a communication interface, and the processor is configured to:
acquiring first environment detection information detected by the first detection device and second environment detection information detected by the second detection device, wherein the first environment detection information and the second environment detection information both comprise information about a target environment region, and the target environment region is a partial environment region in an environment detection region corresponding to the movable platform;
determining pose data of the first detection device relative to the target environment area according to the first environment detection information, and determining pose data of the second detection device relative to the target environment area according to the second environment detection information;
determining a target external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment region and the pose data of the second detection device relative to the target environment region.
15. The movable platform of claim 14, wherein the first detection device is any one of a target type of sensors, the target type of sensors including an image sensor and a perception sensor.
16. The movable platform of claim 14 or 15, wherein the second environment detection information comprises image data about the target environment area, and wherein the processor is configured to process the image data about the target environment area based on an image algorithm to obtain pose data of the second detection device with respect to the target environment area.
17. The movable platform of claim 14 or 15, wherein the second environment detection information comprises point cloud data regarding the target environment area, and wherein the processor is further configured to process the point cloud data regarding the target environment area based on an iterative closest point algorithm to obtain pose data of the second detection device with respect to the target environment area.
18. The movable platform of claim 14, wherein the processor is further specifically configured to determine a first external parameter between the first detection device and the second detection device based on the pose data of the first detection device relative to the target environment area and the pose data of the second detection device relative to the target environment area; determine a second external parameter between the first detection device and the second detection device based on the pose data of the first detection device with respect to a reference environment area and the pose data of the second detection device with respect to the reference environment area; and carry out data processing on the first external parameter and the second external parameter to obtain a target external parameter between the first detection device and the second detection device.
19. The movable platform of claim 14, wherein the target external parameter comprises a translation matrix and a rotation matrix between the first detection device and the second detection device.
20. The movable platform of claim 14, wherein the first and second detection devices detect the target environmental zone at different times.
21. The movable platform of claim 14, wherein the processor is further configured to obtain a motion trajectory of the first detection device, and calculate a target external parameter between the first detection device and the second detection device according to the motion trajectory, the pose data of the first detection device relative to the target environment area, and the pose data of the second detection device relative to the target environment area.
22. The movable platform of claim 21, wherein the processor is further configured to obtain respective pieces of second environment detection information detected by the first detection device at different positions, each piece of second environment detection information including information about the target environment zone; determine relative translation and relative rotation of the first detection device between the different positions based on the respective pieces of second environment detection information; and determine the motion trajectory of the first detection device based on the relative translation and the relative rotation.
23. The movable platform of claim 14, wherein the movable platform is further provided with a third detection device comprising an inertial sensor, and the processor is further configured to determine an acceleration and an angular velocity of the first detection device based on a relative translation and a relative rotation of the first detection device between different positions; acquire the acceleration and the angular velocity of the inertial sensor; and obtain the target external parameter between the first detection device and the inertial sensor by comparing the acceleration of the inertial sensor, the angular velocity of the inertial sensor, the acceleration of the first detection device, and the angular velocity of the first detection device.
24. An external parameter calibration apparatus for a detection device, wherein the apparatus is applied to a movable platform, detection devices are disposed at different positions of the movable platform, the detection devices include a first detection device and a third detection device, the third detection device includes an inertial sensor, and the apparatus includes:
a processing module, configured to determine relative translation and relative rotation of the first detection device between different positions at which the target environment area is detected;
the processing module is further configured to determine an acceleration and an angular velocity of the first detecting device based on the relative translation and the relative rotation of the first detecting device between the different positions;
an acquisition module, configured to acquire the acceleration and the angular velocity of the third detection device;
the processing module is further configured to obtain a target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device, and the angular velocity of the first detection device.
25. The apparatus according to claim 24, wherein the processing module is specifically configured to invoke the first detection device to detect the target environment area at different positions and attitudes when the movable platform moves; acquire, through the acquisition module, each piece of second environment detection information detected by the first detection device at the different positions, where each piece of second environment detection information includes information about the target environment area; and determine a relative translation and a relative rotation of the first detection device between the different positions based on the respective pieces of second environment detection information.
26. A movable platform, characterized in that detection means are provided at different positions of the movable platform, the detection means comprising first detection means and third detection means, the third detection means comprising inertial sensors, the movable platform comprising: a processor and a communication interface, the processor to:
determining relative translation and relative rotation of the first detection device between different positions at which the target environment area is detected;
determining an acceleration and an angular velocity of the first detecting means based on the relative translation and the relative rotation of the first detecting means between the different positions;
acquiring the acceleration and the angular velocity of the third detection device;
and obtaining the target external parameter between the first detection device and the third detection device by comparing the acceleration of the third detection device, the angular velocity of the third detection device, the acceleration of the first detection device and the angular velocity of the first detection device.
27. The movable platform of claim 26, wherein the processor is specifically configured to: when the movable platform moves, calling the first detection device to detect the target environment area in different positions and postures; acquiring each piece of second environment detection information detected by the first detection device at the different positions, wherein each piece of second environment detection information comprises information about the target environment area; determining a relative translation and a relative rotation of the first detection device between the different positions based on the respective second environment detection information.
28. A computer storage medium having stored thereon program instructions for implementing a method according to any one of claims 1 to 12 when executed.
CN201980038511.3A 2019-11-22 2019-11-22 External parameter calibration method and device for detection device and movable platform Pending CN112272757A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/120278 WO2021097807A1 (en) 2019-11-22 2019-11-22 Method and device for calibrating external parameters of detection device, and mobile platform

Publications (1)

Publication Number Publication Date
CN112272757A true CN112272757A (en) 2021-01-26

Family

ID=74349512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980038511.3A Pending CN112272757A (en) 2019-11-22 2019-11-22 External parameter calibration method and device for detection device and movable platform

Country Status (2)

Country Link
CN (1) CN112272757A (en)
WO (1) WO2021097807A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130335562A1 (en) * 2012-06-14 2013-12-19 Qualcomm Incorporated Adaptive switching between vision aided ins and vision only pose
CN107747941A (en) * 2017-09-29 2018-03-02 歌尔股份有限公司 A kind of binocular visual positioning method, apparatus and system
CN108375775A (en) * 2018-01-17 2018-08-07 上海禾赛光电科技有限公司 The method of adjustment of vehicle-mounted detection equipment and its parameter, medium, detection system
CN207923150U (en) * 2017-08-04 2018-09-28 广东工业大学 A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude
CN109143205A (en) * 2018-08-27 2019-01-04 深圳清创新科技有限公司 Integrated transducer external parameters calibration method, apparatus
CN109767475A (en) * 2018-12-28 2019-05-17 广州小鹏汽车科技有限公司 A kind of method for calibrating external parameters and system of sensor
CN109946680A (en) * 2019-02-28 2019-06-28 北京旷视科技有限公司 External parameters calibration method, apparatus, storage medium and the calibration system of detection system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016187760A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
EP3159828B1 (en) * 2015-10-19 2022-03-02 Continental Automotive GmbH Adaptive calibration using visible car details
CN109100741B (en) * 2018-06-11 2020-11-20 长安大学 Target detection method based on 3D laser radar and image data

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113655453A (en) * 2021-08-27 2021-11-16 阿波罗智能技术(北京)有限公司 Data processing method and device for sensor calibration and automatic driving vehicle
CN113655453B (en) * 2021-08-27 2023-11-21 阿波罗智能技术(北京)有限公司 Data processing method and device for sensor calibration and automatic driving vehicle

Also Published As

Publication number Publication date
WO2021097807A1 (en) 2021-05-27

Similar Documents

Publication Publication Date Title
CN112785702B (en) SLAM method based on tight coupling of 2D laser radar and binocular camera
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
JP7082545B2 (en) Information processing methods, information processing equipment and programs
US20220036574A1 (en) System and method for obstacle avoidance
JP5992184B2 (en) Image data processing apparatus, image data processing method, and image data processing program
CN110573901A (en) calibration of laser sensor and vision sensor
JP2019528501A (en) Camera alignment in a multi-camera system
US20100164807A1 (en) System and method for estimating state of carrier
KR101672732B1 (en) Apparatus and method for tracking object
Nienaber et al. A comparison of low-cost monocular vision techniques for pothole distance estimation
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
EP3113147B1 (en) Self-location calculating device and self-location calculating method
JPWO2015145543A1 (en) Object detection apparatus, object detection method, and mobile robot
CN111308415B (en) Online pose estimation method and equipment based on time delay
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
JP6543935B2 (en) PARALLEL VALUE DERIVING DEVICE, DEVICE CONTROL SYSTEM, MOBILE OBJECT, ROBOT, PARALLEL VALUE DERIVING METHOD, AND PROGRAM
JP7348414B2 (en) Method and device for recognizing blooming in lidar measurement
CN111142514A (en) Robot and obstacle avoidance method and device thereof
KR101735325B1 (en) Apparatus for registration of cloud points
US20210156710A1 (en) Map processing method, device, and computer-readable storage medium
CN112272757A (en) External parameter calibration method and device for detection device and movable platform
US20230109473A1 (en) Vehicle, electronic apparatus, and control method thereof
WO2020018140A1 (en) Ballistic estimnation of vehicle data
CN115147495A (en) Calibration method, device and system for vehicle-mounted system
US20210149412A1 (en) Position estimating apparatus, method for determining position of movable apparatus, and non-transitory computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210126