CN116165615A - Lightweight calibration method, system and medium of radar fusion sensing system

Info

Publication number
CN116165615A
CN116165615A (application CN202310064815.5A)
Authority
CN
China
Prior art keywords
camera
point cloud
size
radar
laser radar
Prior art date
Legal status
Pending
Application number
CN202310064815.5A
Other languages
Chinese (zh)
Inventor
徐阳
肖罡
赵斯杰
张蔚
万可谦
杨钦文
刘小兰
Current Assignee
Jiangxi Kejun Industrial Co ltd
Original Assignee
Jiangxi Kejun Industrial Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangxi Kejun Industrial Co ltd
Priority to CN202310064815.5A
Publication of CN116165615A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - of systems according to group G01S13/00
    • G01S7/40 - Means for monitoring or calibrating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a lightweight calibration method, system, and medium for a radar fusion sensing system. The method comprises: simultaneously imaging a calibration board with a laser radar and a camera whose relative positions are fixed; extracting a marker element on the calibration board from the camera's image data and from the laser radar's point cloud data; vectorizing the two marker elements respectively; and registering the vectorized marker elements to obtain the extrinsic parameters between the laser radar and the camera. Because registration is performed on vectorized marker elements, the method can support portable real-time 3D reconstruction with color information: it overcomes the slow speed and poor 3D accuracy of visual SLAM, compensates for the lack of color information in laser SLAM, balances reconstruction speed, accuracy, and information richness, and, owing to the low computational cost of vectorized representations, enables acquisition on lightweight portable devices.

Description

Lightweight calibration method, system and medium of radar fusion sensing system
Technical Field
The invention relates to the technical field of computer vision, and in particular to a lightweight calibration method, system, and medium for a radar fusion sensing system.
Background
SLAM (Simultaneous Localization and Mapping) was first proposed in the robotics field. It refers to a robot that starts from an unknown place in an unknown environment, localizes its own position and pose through environmental features observed repeatedly during motion, and then incrementally builds a map of the surrounding environment based on its position, thereby achieving simultaneous localization and mapping. SLAM technology is divided into five parts: sensor data acquisition, front-end odometry, back-end nonlinear optimization, loop closure detection, and mapping. Depending on the type of sensor used for data acquisition, SLAM schemes include visual SLAM schemes and laser SLAM schemes.
The visual SLAM scheme, represented by RTAB-Map, is divided functionally into five parts: image data acquisition, visual odometry, back-end nonlinear optimization, loop closure detection, and mapping. The image data acquisition module collects 2D visual data of the environment through a camera; the visual odometry predicts and computes 3D stereo information from 2D images taken at different times and positions (image changes produced by motion); and self pose estimation is then refined through back-end nonlinear optimization and loop closure detection. The input is an image or video sequence, and the outputs are the camera's motion trajectory and a local map. During mapping, the currently computed camera trajectory and local map are matched and stitched into the existing map; map fusion stitches the new data of the current round into the existing map, finally completing the map update.

Unlike the visual SLAM scheme, the laser SLAM scheme, represented by LIO-SAM, takes a 3D point cloud as direct input. It is likewise divided functionally into five parts: point cloud data acquisition, laser odometry, back-end nonlinear optimization, loop closure detection, and mapping. Point cloud data acquisition collects environmental information about the current position through a laser radar or other sensors, then optimizes the lidar's raw data, eliminating problematic data or applying filtering. The laser odometry no longer predicts 3D stereo information but directly searches the established map for the position corresponding to the point cloud of the current local environment; the quality of this matching directly affects the accuracy of SLAM mapping. During SLAM, the point cloud currently acquired by the laser radar must be matched and stitched into the existing map. The back-end nonlinear optimization, loop closure detection, and mapping modules are consistent with those of the visual SLAM scheme.
However, because the visual SLAM scheme directly acquires 2D images, the 3D stereo information computed from them has low accuracy, high computational cost, and low speed; the laser SLAM scheme lacks visual data as input, so the final map lacks color information, which greatly limits its application in real outdoor environments. In addition, both schemes require complicated calibration procedures in actual use to obtain the extrinsic parameters between sensors.
Disclosure of Invention
The invention aims to solve the following technical problems: in view of the problems in the prior art, the invention provides a lightweight calibration method, system, and medium for a radar fusion sensing system, which aim to achieve portable real-time 3D reconstruction with color information, overcome the slow speed and poor 3D accuracy of visual SLAM, compensate for the lack of color information in laser SLAM, balance reconstruction speed, accuracy, and information richness, and, by exploiting the low computational cost of vectorized representations, enable acquisition on lightweight portable devices.
In order to solve the technical problems, the invention adopts the following technical scheme:
a lightweight calibration method of a radar fusion sensing system comprises the following steps:
s101, shooting a calibration plate by using a radar fusion sensing system comprising a laser radar and a camera with relatively fixed positions;
s102, extracting a marker element on a calibration plate from image data of a camera and point cloud data of a laser radar;
s103, respectively vectorizing and representing the two marker elements;
and S104, registering the marker elements after vectorization representation to obtain external parameters between the laser radar and the camera.
Optionally, in the radar fusion sensing system of step S101, the laser radar and the camera are both fixed on a handheld bracket, with the lens direction the same as the orientation of the laser radar, so that the laser radar and the camera remain relatively fixed in position.
Optionally, in step S101, the calibration board is provided with one or more internally hollowed-out marker elements, the marker elements being circular or rectangular.
Optionally, step S102 includes:
S201, for image data of the camera with size H×W and point cloud data of the laser radar with size N×3, where H and W are respectively the image height and width and N is the number of point cloud data points, selecting a marker element k;
S202, for marker element k, extracting from the camera image data of size H×W the image block I_k of size H_k×W_k corresponding to marker element k, and extracting from the laser radar point cloud data of size N×3 the point cloud block P_k of size N_k×3 corresponding to marker element k.
Optionally, step S103 includes:
S301, for the point cloud block P_k of size N_k×3, performing feature extraction to obtain a point cloud block feature vector of size m_p×C, where m_p = N_k; for the image block I_k of size H_k×W_k, performing feature extraction to obtain an image block feature vector of size m_i×C, where m_i = H_k×W_k and C is a specified feature dimension;
S302, vectorizing the feature vector of size m_p×C according to the position of the point cloud block P_k in the laser radar point cloud data of size N×3 to obtain the vectorized representation of the point cloud features (x, y, z, m_p, C), where (x, y, z) is the position of the point cloud block P_k in the laser radar point cloud data of size N×3; and vectorizing the feature vector of size m_i×C to obtain the vectorized representation of the visual features (x, y, m_i, C), where (x, y) is the position of the image block I_k in the camera image data of size H×W.
Optionally, step S104 includes:
S401, respectively computing the Euclidean distances in three-dimensional space between the vectorized representations of the point cloud features (x, y, z, m_p, C) and the vectorized representations of the visual features (x, y, m_i, C);
S402, respectively calculating, in the sensor coordinate systems of the laser radar and the camera, the transformation relation between these Euclidean distances and the actual three-dimensional spatial distances, and obtaining a rotation matrix R and a translation vector t as the extrinsic parameters between the laser radar and the camera.
Optionally, after step S104, the method further includes: imaging a target object using the laser radar and the camera whose relative positions remain fixed, and converting one of the camera's image data and the laser radar's point cloud data into the coordinate system of the other using the extrinsic parameters.
Optionally, after one of the camera's image data and the laser radar's point cloud data is converted into the coordinate system of the other using the extrinsic parameters, the method further comprises superposing the camera image data and the laser radar point cloud data projected into the same coordinate system to obtain the radar fusion data.
In addition, the invention also provides a lightweight calibration system of the radar fusion sensing system, which comprises a microprocessor and a memory which are connected with each other, wherein the microprocessor is programmed or configured to execute the lightweight calibration method of the radar fusion sensing system.
Furthermore, the invention also provides a computer-readable storage medium storing a computer program, the computer program being programmed or configured to be executed by a microprocessor to perform the lightweight calibration method of the radar fusion sensing system.
Compared with the prior art, the invention has the following advantages. The method simultaneously images a calibration board with a laser radar and a camera whose relative positions are fixed; extracts a marker element on the calibration board from the camera's image data and from the laser radar's point cloud data; vectorizes the two marker elements respectively; and registers the vectorized marker elements to obtain the extrinsic parameters between the laser radar and the camera. Because registration is performed on vectorized marker elements, the method can support portable real-time 3D reconstruction with color information: it overcomes the slow speed and poor 3D accuracy of visual SLAM, compensates for the lack of color information in laser SLAM, balances reconstruction speed, accuracy, and information richness, and, owing to the low computational cost of vectorized representations, enables acquisition on lightweight portable devices.
Drawings
FIG. 1 is a schematic diagram of a basic flow of a method according to an embodiment of the present invention.
Detailed Description
As shown in FIG. 1, the lightweight calibration method of the radar fusion sensing system of this embodiment includes:
S101, simultaneously imaging a calibration board using a radar fusion sensing system comprising a laser radar and a camera whose relative positions are fixed;
S102, extracting a marker element on the calibration board from the image data of the camera and from the point cloud data of the laser radar;
S103, vectorizing the two marker elements respectively;
S104, registering the vectorized marker elements to obtain the extrinsic parameters between the laser radar and the camera. The four steps are sketched as a minimal pipeline below.
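The sketch is illustrative only: the helper names extract_marker_blocks, vectorize, and register are hypothetical placeholders for the operations of S102-S104, not functions defined by the invention, and Python is used purely for exposition.

```python
def calibrate(image, cloud):
    """Minimal sketch of the S101-S104 calibration flow.

    `image` (H x W) and `cloud` (N x 3) are assumed to be captured
    simultaneously by a camera and a lidar with fixed relative pose.
    All helper functions are hypothetical placeholders.
    """
    I_k, P_k = extract_marker_blocks(image, cloud)   # S102: marker element blocks
    v_img = vectorize(I_k)                           # S103: vectorized image features
    v_pc = vectorize(P_k)                            # S103: vectorized point cloud features
    R, t = register(v_pc, v_img)                     # S104: extrinsics (rotation, translation)
    return R, t
```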
In the radar fusion sensing system of step S101 of this embodiment, the laser radar and the camera are both fixed on a handheld bracket, with the camera lens oriented in the same direction as the laser radar, so that the laser radar and the camera remain relatively fixed in position.
In step S101 of this embodiment, the calibration board is provided with one or more internally hollowed-out marker elements, each circular or rectangular. The calibration board can be made from any rectangular rigid board, and its hollowed-out marker elements can be configured as circular or rectangular structures in whatever number and size are required.
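As an illustration of how circular marker elements might be located in the camera image, the following sketch uses OpenCV's Hough circle transform; the patent does not prescribe a detection algorithm, so the choice of cv2.HoughCircles and all parameter values here are assumptions that would need tuning per setup.

```python
import cv2
import numpy as np

def detect_circular_markers(image):
    """Find candidate circular marker elements in a BGR image.

    Returns an array of (x, y, radius) triples, or an empty list.
    """
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress noise before the Hough transform
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=30, minRadius=10, maxRadius=200)
    return [] if circles is None else np.round(circles[0]).astype(int)
```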
In this embodiment, step S102 includes:
S201, for image data of the camera with size H×W and point cloud data of the laser radar with size N×3, where H and W are respectively the image height and width and N is the number of point cloud data points, selecting a marker element k;
S202, for marker element k, extracting from the camera image data of size H×W the image block I_k of size H_k×W_k corresponding to marker element k, and extracting from the laser radar point cloud data of size N×3 the point cloud block P_k of size N_k×3 corresponding to marker element k. An illustrative sketch of this block extraction is given below.
In this embodiment, step S103 includes:
S301, for the point cloud block P_k of size N_k×3, performing feature extraction to obtain a point cloud block feature vector of size m_p×C, where m_p = N_k; for the image block I_k of size H_k×W_k, performing feature extraction to obtain an image block feature vector of size m_i×C, where m_i = H_k×W_k and C is a specified feature dimension (which can be specified arbitrarily);
S302, vectorizing the feature vector of size m_p×C according to the position of the point cloud block P_k in the laser radar point cloud data of size N×3 to obtain the vectorized representation of the point cloud features (x, y, z, m_p, C), where (x, y, z) is the position of the point cloud block P_k in the laser radar point cloud data of size N×3; and vectorizing the feature vector of size m_i×C to obtain the vectorized representation of the visual features (x, y, m_i, C), where (x, y) is the position of the image block I_k in the camera image data of size H×W. An illustrative sketch of this vectorization is given below.
In step S301, the feature extraction for the point cloud block P_k of size N_k×3 can use an existing point cloud feature extraction algorithm as required. For example, as an optional implementation, this embodiment uses the method of: Qi C R, Su H, Mo K, et al. PointNet: Deep learning on point sets for 3D classification and segmentation [C] // Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2017: 652-660. Likewise, the feature extraction for the image block I_k of size H_k×W_k can use an existing image feature extraction algorithm as required. For example, as an optional implementation, this embodiment uses the method of: He K, Zhang X, Ren S, et al. Deep residual learning for image recognition [C] // Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016: 770-778.
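For concreteness, a hedged PyTorch sketch of both extractors follows: a PointNet-style per-point encoder (after Qi et al.) and a ResNet-18 trunk used as an image feature extractor (after He et al.). The layer sizes and pooling choices are illustrative assumptions, not the patent's prescription.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class PointEncoder(nn.Module):
    """PointNet-style encoder: per-point MLP producing an (m_p, C) feature map."""
    def __init__(self, C=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, C, 1),
        )

    def forward(self, P_k):               # P_k: (N_k, 3) float tensor
        x = P_k.T.unsqueeze(0)            # -> (1, 3, N_k)
        return self.mlp(x).squeeze(0).T   # -> (N_k, C), i.e. m_p = N_k

def image_features(I_k):
    """ResNet-18 trunk as an image block feature extractor (illustrative)."""
    backbone = models.resnet18(weights=None)   # pretrained=False on older torchvision
    trunk = nn.Sequential(*list(backbone.children())[:-2])  # drop avgpool + fc
    x = I_k.unsqueeze(0)                  # I_k: (3, H_k, W_k) float tensor
    fmap = trunk(x)                       # (1, 512, h, w) spatial feature map
    return fmap.flatten(2).squeeze(0).T   # (h*w, 512) per-location features
```

Note that a strided backbone produces features at reduced spatial resolution, whereas the embodiment's m_i = H_k×W_k implies one feature per pixel; an actual implementation would upsample the feature map or choose a stride-free extractor accordingly.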
In this embodiment, step S104 includes:
S401, respectively computing the Euclidean distances in three-dimensional space between the vectorized representations of the point cloud features (x, y, z, m_p, C) and the vectorized representations of the visual features (x, y, m_i, C);
S402, respectively calculating, in the sensor coordinate systems of the laser radar and the camera, the transformation relation between these Euclidean distances and the actual three-dimensional spatial distances, and obtaining a rotation matrix R and a translation vector t as the extrinsic parameters between the laser radar and the camera. A sketch of one possible solver is given below.
In this embodiment, step S104 is further followed by: imaging a target object using the laser radar and the camera whose relative positions remain fixed, and converting one of the camera's image data and the laser radar's point cloud data into the coordinate system of the other using the extrinsic parameters. After this conversion, the method further includes superposing the camera image data and the laser radar point cloud data projected into the same coordinate system to obtain the radar fusion data. For example, using the rotation matrix R and the translation vector t, while point cloud data and image data are acquired in real time, the pixel points in the camera plane are projected into the world coordinate system to obtain (x_c, y_c, z_c), which are superposed with the matching point cloud points (x_l, y_l, z_l) to obtain the radar fusion data (fused radar and vision data).
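The embodiment projects camera pixels into the world frame and superposes them with matching cloud points. A common, closely related formulation, sketched here as an assumption, goes the other way: transform lidar points into the camera frame with (R, t), project them through an intrinsic matrix K (assumed known from intrinsic calibration), and read off the pixel colors.

```python
import numpy as np

def fuse_colors(points_l, image, K, R, t):
    """Attach image colors to lidar points using extrinsics (R, t) and intrinsics K."""
    pc = points_l @ R.T + t               # lidar frame -> camera frame
    pc = pc[pc[:, 2] > 0]                 # keep points in front of the camera
    uv = (K @ pc.T).T                     # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    h, w = image.shape[:2]
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = image[v[ok], u[ok]]          # per-point color samples (BGR)
    return np.hstack([pc[ok], colors])    # (x, y, z, b, g, r) fused points
```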
In summary, the visual SLAM scheme directly acquires data as 2D images, so the computed 3D stereo information has low accuracy, high computational cost, and low speed; the laser SLAM scheme lacks visual data as input, so the resulting map lacks color information, greatly limiting its application in real outdoor environments. The method of this embodiment achieves lightweight calibration of the radar fusion sensing system based on vectorized representation and computation. It can be used to register and fuse the 3D information of the laser point cloud with the 2D image information of the camera, achieving portable real-time 3D reconstruction with color information, overcoming the slow speed and poor 3D accuracy of visual SLAM, compensating for the lack of color information in laser SLAM, and balancing reconstruction speed, accuracy, and information richness.
In addition, this embodiment also provides a lightweight calibration system of the radar fusion sensing system, comprising a microprocessor and a memory connected with each other, wherein the microprocessor is programmed or configured to execute the lightweight calibration method of the radar fusion sensing system. This embodiment further provides a computer-readable storage medium storing a computer program, the computer program being programmed or configured to be executed by a microprocessor to perform the lightweight calibration method of the radar fusion sensing system.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-readable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above examples, and all technical solutions belonging to the concept of the present invention belong to the protection scope of the present invention. It should be noted that modifications and adaptations to the present invention may occur to one skilled in the art without departing from the principles of the present invention and are intended to be within the scope of the present invention.

Claims (10)

1. A lightweight calibration method of a radar fusion sensing system, characterized by comprising the following steps:
S101, simultaneously imaging a calibration board using a radar fusion sensing system comprising a laser radar and a camera whose relative positions are fixed;
S102, extracting a marker element on the calibration board from the image data of the camera and from the point cloud data of the laser radar;
S103, vectorizing the two marker elements respectively;
S104, registering the vectorized marker elements to obtain the extrinsic parameters between the laser radar and the camera.
2. The method according to claim 1, wherein in the radar fusion sensing system of step S101, the laser radar and the camera are both fixed on a handheld bracket, with the lens direction the same as the orientation of the laser radar, so that the laser radar and the camera remain relatively fixed in position.
3. The method for lightweight calibration of a radar fusion sensing system according to claim 1, wherein in step S101, the calibration board is provided with one or more internally hollowed-out marker elements, and the marker elements are circular or rectangular.
4. The method for lightweight calibration of a radar fusion sensing system according to claim 1, wherein step S102 includes:
S201, for image data of the camera with size H×W and point cloud data of the laser radar with size N×3, where H and W are respectively the image height and width and N is the number of point cloud data points, selecting a marker element k;
S202, for marker element k, extracting from the camera image data of size H×W the image block I_k of size H_k×W_k corresponding to marker element k, and extracting from the laser radar point cloud data of size N×3 the point cloud block P_k of size N_k×3 corresponding to marker element k.
5. The method for lightweight calibration of a radar fusion sensing system according to claim 4, wherein step S103 includes:
S301, for the point cloud block P_k of size N_k×3, performing feature extraction to obtain a point cloud block feature vector of size m_p×C, where m_p = N_k; for the image block I_k of size H_k×W_k, performing feature extraction to obtain an image block feature vector of size m_i×C, where m_i = H_k×W_k and C is a specified feature dimension;
S302, vectorizing the feature vector of size m_p×C according to the position of the point cloud block P_k in the laser radar point cloud data of size N×3 to obtain the vectorized representation of the point cloud features (x, y, z, m_p, C), where (x, y, z) is the position of the point cloud block P_k in the laser radar point cloud data of size N×3; and vectorizing the feature vector of size m_i×C to obtain the vectorized representation of the visual features (x, y, m_i, C), where (x, y) is the position of the image block I_k in the camera image data of size H×W.
6. The method for lightweight calibration of a radar fusion sensing system according to claim 5, wherein step S104 includes:
S401, respectively computing the Euclidean distances in three-dimensional space between the vectorized representations of the point cloud features (x, y, z, m_p, C) and the vectorized representations of the visual features (x, y, m_i, C);
S402, respectively calculating, in the sensor coordinate systems of the laser radar and the camera, the transformation relation between these Euclidean distances and the actual three-dimensional spatial distances, and obtaining a rotation matrix R and a translation vector t as the extrinsic parameters between the laser radar and the camera.
7. The method for lightweight calibration of a radar fusion sensing system according to claim 1, further comprising, after step S104: imaging a target object using the laser radar and the camera whose relative positions remain fixed, and converting one of the camera's image data and the laser radar's point cloud data into the coordinate system of the other using the extrinsic parameters.
8. The method for lightweight calibration of a radar fusion sensing system according to claim 7, wherein after one of the camera's image data and the laser radar's point cloud data is converted into the coordinate system of the other using the extrinsic parameters, the method further comprises superposing the camera image data and the laser radar point cloud data projected into the same coordinate system to obtain the radar fusion data.
9. A lightweight calibration system for a radar fusion sensing system, comprising a microprocessor and a memory connected with each other, wherein the microprocessor is programmed or configured to perform the lightweight calibration method of a radar fusion sensing system according to any one of claims 1 to 8.
10. A computer-readable storage medium having a computer program stored therein, wherein the computer program is programmed or configured to be executed by a microprocessor to perform the lightweight calibration method of a radar fusion sensing system according to any one of claims 1 to 8.
CN202310064815.5A 2023-02-03 2023-02-03 Lightweight calibration method, system and medium of radar fusion sensing system Pending CN116165615A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310064815.5A CN116165615A (en) 2023-02-03 2023-02-03 Lightweight calibration method, system and medium of radar fusion sensing system


Publications (1)

Publication Number Publication Date
CN116165615A 2023-05-26

Family

ID=86410732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310064815.5A Pending CN116165615A (en) 2023-02-03 2023-02-03 Lightweight calibration method, system and medium of radar fusion sensing system

Country Status (1)

Country Link
CN (1) CN116165615A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination