CN115235525A - Sensor detection method and device, electronic equipment and readable storage medium - Google Patents

Sensor detection method and device, electronic equipment and readable storage medium

Info

Publication number
CN115235525A
Authority
CN
China
Prior art keywords
point cloud
cloud data
sensor
calibration
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111486395.7A
Other languages
Chinese (zh)
Other versions
CN115235525B (en)
Inventor
黄超
张�浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xiantu Intelligent Technology Co Ltd
Original Assignee
Shanghai Xiantu Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xiantu Intelligent Technology Co Ltd filed Critical Shanghai Xiantu Intelligent Technology Co Ltd
Priority to CN202111486395.7A priority Critical patent/CN115235525B/en
Priority to PCT/CN2022/071109 priority patent/WO2023103143A1/en
Publication of CN115235525A publication Critical patent/CN115235525A/en
Application granted granted Critical
Publication of CN115235525B publication Critical patent/CN115235525B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The present disclosure provides a sensor detection method and device, an electronic device, and a readable storage medium. The method comprises the following steps: respectively acquiring first point cloud data, in an initial coordinate system, of a plurality of sensors arranged on a vehicle; respectively converting the first point cloud data of the sensors into a vehicle coordinate system based on a first conversion relation to obtain second point cloud data of the sensors, wherein the first conversion relation is obtained according to an initial setting pose of the sensors on the vehicle; merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data; and determining the calibration state of the sensors according to the second point cloud data of the sensors and the merged point cloud data. Through the technical scheme provided by the disclosure, when a sensor calibration abnormality occurs, it can be detected in time, operation and maintenance personnel can conveniently handle the abnormality, and the safety of the vehicle during operation is improved.

Description

Sensor detection method and device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to the field of intelligent driving technologies, and in particular, to a sensor detection method and apparatus, an electronic device, and a readable storage medium.
Background
Environmental perception is a key part of an intelligent driving system: it is the front-end input of planning decisions and provides an important basis for them. The environment sensing module comprises various sensors, such as a laser radar, a camera and a millimeter wave radar; the vehicle senses the surrounding environment through the sensors and the automatic driving vehicle is controlled to run according to the sensed environment information. Because the detection range of a single sensor is limited, a plurality of sensors are arranged on the vehicle to reduce detection blind areas, and the point cloud coordinate systems respectively generated by the sensors are converted into a unified coordinate system through calibration. When the relative position between a sensor and the vehicle changes, the point cloud merging result becomes abnormal. In the related art, operation and maintenance personnel mainly perform calibration abnormality detection and troubleshooting on the sensors in an off-line state after the vehicle finishes running, so the calibration state of the sensors cannot be acquired in real time. Because the calibration state of the sensors cannot be known in time while the vehicle is running, the vehicle continues to run with an abnormally calibrated sensor, and a safety risk exists.
Disclosure of Invention
In view of this, the present disclosure provides a sensor detection method, an apparatus, an electronic device and a readable storage medium, so as to achieve automatic detection of a calibration state of a sensor by a vehicle, shorten detection feedback time, and improve safety during vehicle operation.
According to a first aspect of the present disclosure, there is provided a sensor detection method, the method comprising:
respectively acquiring first point cloud data of a plurality of sensors arranged on a vehicle in an initial coordinate system;
respectively converting first point cloud data of the sensor into a vehicle coordinate system based on a first conversion relation to obtain second point cloud data of the sensor, wherein the first conversion relation is obtained according to an initial setting pose of the sensor on the vehicle;
merging the second point cloud data of at least two sensors of the plurality of sensors to obtain merged point cloud data;
and determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
In combination with any one of the embodiments provided by the present disclosure, the converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship respectively includes:
respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a six-degree-of-freedom conversion matrix converted from an initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom conversion matrix comprises a rotation matrix and a translation matrix.
In combination with any embodiment provided by the present disclosure, the determining the calibration state of the sensor according to the first point cloud data, the second point cloud data, and the merged point cloud data of the sensor includes:
registering first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation, wherein the first sensor is any one of the plurality of sensors;
based on the second conversion relation, converting the first point cloud data of the first sensor into a coordinate system of combined point cloud to obtain third point cloud data of the first sensor, wherein the coordinate system of the combined point cloud is obtained by fitting the combined point cloud data;
and comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the first sensor according to the comparison result.
In combination with any one of the embodiments provided by the present disclosure, the comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the sensor according to the comparison result includes:
determining that the first sensor is abnormal in calibration under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold;
and under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed a first set threshold value, determining that the calibration of the first sensor is normal.
In combination with any embodiment provided by the present disclosure, the determining a calibration state of the sensor according to the second point cloud data and the merged point cloud data includes:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation;
determining that the first sensor calibration is abnormal under the condition that the parameter deviation in the first conversion relation and the second conversion relation exceeds a second set threshold value;
and determining that the first sensor is calibrated normally under the condition that the parameter deviation in the first conversion relation and the second conversion relation does not exceed a second set threshold value.
In combination with any one of the embodiments provided by the present disclosure, the merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data includes:
and merging the second point cloud data of other sensors except the first sensor in the plurality of sensors to obtain merged point cloud data.
In combination with any embodiment provided by the present disclosure, the method further comprises:
performing multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data;
based on the multi-frame merged point cloud data, obtaining vehicle relative displacement to obtain a vehicle moving direction;
and comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result.
In combination with any embodiment provided by the present disclosure, the comparing the positive direction orientation of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result includes:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle of the included angle between the two directions exceeds a first angle threshold value;
and under the condition that the angle of the included angle between the two directions does not exceed the first angle threshold value, determining that the calibration of the plurality of sensors is normal.
In combination with any embodiment provided by the present disclosure, the method further comprises:
performing ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data;
and comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result.
In combination with any one of the embodiments provided by the present disclosure, the comparing the merged point cloud data with the ground segmented point cloud data, and determining the calibration state of the sensor according to the comparison result includes:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained by the ground segmentation point cloud exceeds a second angle threshold;
and under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground included angle obtained by the ground segmentation point cloud does not exceed a second angle threshold value, determining that the plurality of sensors are normally calibrated.
In combination with any embodiment provided by the present disclosure, the method further comprises:
and under the condition that the sensor calibration is determined to be abnormal, generating a sensor calibration abnormal report, and sending the abnormal report to an operation and maintenance terminal.
In combination with any embodiment provided by the present disclosure, the method further comprises:
and generating a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and sending the calibration detection report to an operation and maintenance terminal.
According to a second aspect of the present disclosure, there is provided a sensor detection apparatus, the apparatus comprising:
the system comprises a first point cloud acquisition module, a second point cloud acquisition module and a control module, wherein the first point cloud acquisition module is used for respectively acquiring first point cloud data of a plurality of sensors arranged on a vehicle in an initial coordinate system;
the second point cloud generating module is used for respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a first conversion relation to obtain second point cloud data of the sensor, wherein the first conversion relation is obtained according to an initial setting pose of the sensor on the vehicle;
the merged point cloud generating module is used for merging the second point cloud data of at least two sensors in the plurality of sensors to obtain merged point cloud data;
and the calibration detection module is used for determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
In combination with any embodiment provided by the present disclosure, the second point cloud generating module converts the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship, and is specifically configured to:
respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a six-degree-of-freedom conversion matrix converted from an initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom conversion matrix comprises a rotation matrix and a translation matrix.
In combination with any embodiment provided by the present disclosure, the calibration detection module determines a calibration state of the sensor according to the first point cloud data, the second point cloud data, and the merged point cloud data of the sensor, and is specifically configured to:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation, wherein the first sensor is any one of the plurality of sensors;
based on the second conversion relation, converting the first point cloud data of the first sensor into a coordinate system of combined point cloud to obtain third point cloud data of the first sensor, wherein the coordinate system of the combined point cloud is obtained by fitting the combined point cloud data;
and comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the first sensor according to the comparison result.
In combination with any one of the embodiments provided by the present disclosure, the comparing, by the calibration detection module, the second point cloud data and the third point cloud data of the first sensor, and determining the calibration state of the sensor according to the comparison result includes:
determining that the first sensor is abnormal in calibration under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold;
and under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed a first set threshold value, determining that the calibration of the first sensor is normal.
In combination with any embodiment provided by the present disclosure, the determining, by the calibration detection module, a calibration state of the sensor according to the second point cloud data and the merged point cloud data includes:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation;
determining that the first sensor calibration is abnormal under the condition that the parameter deviation in the first conversion relation and the second conversion relation exceeds a second set threshold value;
and determining that the first sensor is calibrated normally under the condition that the parameter deviation in the first conversion relation and the second conversion relation does not exceed a second set threshold value.
In combination with any embodiment provided by the present disclosure, the merged point cloud generating module merges the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data, which is specifically configured to:
and merging the second point cloud data of other sensors except the first sensor in the plurality of sensors to obtain merged point cloud data.
In connection with any embodiment provided by the present disclosure, the apparatus further includes a first angle comparison module configured to:
performing multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data;
obtaining vehicle relative displacement based on the multi-frame merged point cloud data to obtain the vehicle moving direction;
and comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result.
In combination with any one of the embodiments provided by the present disclosure, the first angle comparison module compares the positive direction orientation of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determines the calibration state of the sensor according to the comparison result, specifically configured to:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle of the included angle between the two directions exceeds a first angle threshold value;
and under the condition that the angle of the included angle between the two directions does not exceed the first angle threshold value, determining that the calibration of the plurality of sensors is normal.
In combination with any one of the embodiments provided by the present disclosure, the apparatus further includes a second angle comparison module configured to:
performing ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data;
and comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result.
In combination with any embodiment provided by the present disclosure, the second angle comparison module compares the merged point cloud data with the ground segmented point cloud data, and determines a calibration state of the sensor according to a comparison result, specifically configured to:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained by the ground segmentation point cloud exceeds a second angle threshold;
and under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground included angle obtained by the ground segmentation point cloud does not exceed a second angle threshold value, determining that the calibration of the plurality of sensors is normal.
In combination with any one of the embodiments provided by the present disclosure, the apparatus further includes an exception reporting module configured to:
and under the condition that the sensor calibration abnormity exists, generating a sensor calibration abnormity report, and sending the abnormity report to an operation and maintenance terminal.
In connection with any embodiment provided by the present disclosure, the apparatus further includes a detection reporting module configured to:
and generating a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and sending the detection report to an operation and maintenance terminal.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
a memory for storing processor-executable instructions;
a processor configured to execute the executable instructions in the memory to implement the steps of the method according to any of the embodiments of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium, on which a computer program is stored, which when executed by a processor, performs the steps of the method according to any of the embodiments of the first aspect described above.
According to a fifth aspect of the present disclosure, there is provided a smart vehicle including the above electronic device.
The technical scheme provided by the disclosure can comprise the following beneficial effects:
the automatic detection of the calibration state of the sensor by the vehicle is realized through the first point cloud data, the second point cloud data and the combined point cloud data of the sensor, the abnormal detection of the calibration of the sensor can be carried out in real time in the running process of the vehicle, the abnormal detection time of the calibration of the sensor is shortened, the feedback process from the abnormal calibration to the detection of the abnormal calibration is simplified, the operation and maintenance personnel can process the abnormal calibration at any time, and the safety of the vehicle during running is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments consistent with the disclosure and together with the disclosure, serve to explain the principles of the disclosure.
Fig. 1A is a flow chart illustrating a sensor detection method of the present disclosure according to an exemplary embodiment.
FIG. 1B is a flow chart illustrating another sensor detection method of the present disclosure according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating another sensor detection method according to an exemplary embodiment of the present disclosure.
FIG. 3 is a flow chart illustrating another sensor detection method of the present disclosure according to an exemplary embodiment.
FIG. 4 is a schematic diagram of a sensor detection device shown in the present disclosure according to an exemplary embodiment.
FIG. 5 is a block diagram of an electronic device shown in accordance with an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the appended claims.
The terminology used in the disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination," depending on the context.
FIG. 1A illustrates a flow chart of a sensor detection method illustrated by the present disclosure according to an exemplary embodiment.
In step S101, first point cloud data of a plurality of sensors provided on a vehicle in an initial coordinate system is acquired, respectively.
In order to reduce the detection blind area, a plurality of sensors may be provided on the vehicle, and the sensors may include at least one of commonly used sensors such as a laser radar, a multi-view camera, and a millimeter wave radar.
First point cloud data of a plurality of sensors on the vehicle in an initial coordinate system is acquired through an electronic device arranged in the vehicle. The initial coordinate system may be set according to the position of the sensor on the vehicle, for example, the position of the sensor on the vehicle is set as an origin of the initial coordinate system. The initial coordinate system may also be obtained from configuration parameters of the vehicle.
In step S102, based on a first conversion relationship, the first point cloud data of the sensor is converted into a vehicle coordinate system, so as to obtain second point cloud data of the sensor.
The first conversion relation is obtained according to the initial setting pose of the sensor on the vehicle and the setting information of the vehicle coordinate system. The vehicle coordinate system may be set as required; for example, the midpoint of the front axle or the rear axle of the vehicle may be used as the origin of the vehicle coordinate system, with the forward direction along the vehicle body as the positive X-axis direction, the lateral direction of the vehicle body as the positive Y-axis direction, and the upward direction as the positive Z-axis direction.
The first point cloud data is converted into the second point cloud data based on the first conversion relation. When the sensor calibration is normal (for example, the parameters in the first conversion relation are correct and the position and orientation of the sensor have not deviated from the initial set pose), the coordinate system of the converted second point cloud data is consistent with the vehicle coordinate system. When the sensor calibration is abnormal (for example, the parameters in the first conversion relation are wrong, or the position and orientation of the sensor deviate from the initial set pose), the coordinate system of the converted second point cloud data is inconsistent with the vehicle coordinate system.
In step S103, the second point cloud data of at least two of the plurality of sensors are merged to obtain merged point cloud data.
In step S104, a calibration state of the sensor is determined according to the second point cloud data and the merged point cloud data.
And fitting a coordinate system of the second point cloud data obtained by the plurality of sensors to obtain the combined point cloud coordinate system. Although the coordinate system of the second point cloud data acquired by the individual sensor is deviated from the vehicle coordinate system in the case of abnormal sensor calibration, the coordinate system of the merged point cloud data approaches the vehicle coordinate system after point cloud data merging and coordinate system fitting, and the coordinate system of the merged point cloud data can be taken as a standard coordinate system approximating the vehicle coordinate system.
By comparing the second point cloud data with the merged point cloud data approximate to the vehicle coordinate system or other data processed by the merged point cloud data, it can be determined whether the coordinate system of the second point cloud data is the vehicle coordinate system, i.e. the calibration state of the sensor.
According to the method and the device, the vehicle automatically detects the calibration state of the sensors through the first point cloud data, the second point cloud data and the merged point cloud data of the sensors. Calibration abnormality detection can thus be carried out in real time while the vehicle is running, which shortens the detection time of a sensor calibration abnormality, simplifies the feedback process from the occurrence of the calibration abnormality to its detection, enables operation and maintenance personnel to handle the abnormal calibration at any time, and improves the safety of the vehicle during operation.
In an optional embodiment, the converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relation respectively includes: respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a six-degree-of-freedom conversion matrix converted from an initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom conversion matrix comprises a rotation matrix and a translation matrix.
And converting the first point cloud data into a vehicle coordinate system through a first conversion relation to obtain second point cloud data, wherein the first conversion relation is the conversion relation between the sensor coordinate system and the vehicle coordinate system and can be an affine matrix or a six-degree-of-freedom conversion matrix containing the position and orientation parameters of the sensor and the vehicle. In one example shown in the present disclosure, the first transformation relationship comprises a six degree of freedom transformation matrix consisting of a rotation matrix and a translation matrix transformed from the initial coordinate system of the sensor to the vehicle coordinate system.
In the running process of a vehicle, the initial coordinate system of the first point cloud and the vehicle coordinate system are both a three-dimensional rectangular coordinate system, the first point cloud data are calibrated through a six-degree-of-freedom conversion matrix, and the first point cloud data of the sensor are converted into the vehicle coordinate system to obtain second point cloud data of the sensor.
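A minimal sketch, not part of the patent text, of how such a six-degree-of-freedom conversion could be applied, assuming the first point cloud data is an N x 3 NumPy array and that the rotation matrix and translation vector come from the initial setting pose (all names below are illustrative):

```python
import numpy as np

def to_vehicle_frame(first_points, R, t):
    """Convert first point cloud data (N x 3, sensor/initial coordinate system)
    into the vehicle coordinate system using a six-degree-of-freedom transform
    made of a rotation matrix R (3 x 3) and a translation vector t (3,)."""
    first_points = np.asarray(first_points, dtype=float)
    # Second point cloud data: p_vehicle = R @ p_sensor + t for every point.
    return first_points @ R.T + t

def make_transform(R, t):
    """Equivalent homogeneous 4 x 4 form of the first conversion relation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```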
FIG. 1B illustrates a flow chart of another sensor detection method illustrated by the present disclosure according to an exemplary embodiment.
In step S104-1, registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship, where the first sensor is any one of the plurality of sensors.
In the present disclosure, the registration may be a process of obtaining a conversion relationship from an original point cloud to a target point cloud. In one example, a second transformation relationship that can transform the first point cloud data coordinate system to the merged point cloud data coordinate system is obtained by registering the first point cloud data of the first sensor and the merged point cloud data.
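The disclosure does not prescribe a particular registration algorithm. The sketch below assumes a point-to-point ICP registration via the Open3D library (the function and parameter names are Open3D's, not the disclosure's) to estimate the second conversion relation from the first point cloud data to the merged point cloud data:

```python
import numpy as np
import open3d as o3d

def register_to_merged(first_points, merged_points,
                       init=np.eye(4), max_corr_dist=0.5):
    """Estimate the second conversion relation (a 4 x 4 rigid transform) that
    aligns the first sensor's first point cloud data with the merged point
    cloud data, here via point-to-point ICP."""
    src = o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(np.asarray(first_points))
    dst = o3d.geometry.PointCloud()
    dst.points = o3d.utility.Vector3dVector(np.asarray(merged_points))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # second conversion relation (4 x 4)
```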
In one example, the electronic device in the vehicle may take each sensor on the vehicle as the first sensor in turn to detect the calibration status of each sensor respectively; it is also possible to individually control a specific sensor on the vehicle as the first sensor to perform calibration status detection on the sensor in a targeted manner.
In step S104-2, based on the second conversion relationship, the first point cloud data of the first sensor is converted into a coordinate system of a merged point cloud, so as to obtain third point cloud data of the first sensor, where the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data.
Since, under normal circumstances, the plurality of sensors do not all become abnormally calibrated at the same time, the deviation between the coordinate system obtained by fitting the merged point cloud data and the vehicle coordinate system is generally small, and the coordinate system of the merged point cloud can be approximated as a standard coordinate system close to the vehicle coordinate system. The coordinate system of the third point cloud data is consistent with the coordinate system of the merged point cloud, which is obtained by fitting the merged point cloud data, so the coordinate system of the third point cloud data can also be used as a standard coordinate system approximating the vehicle coordinate system.
And converting the first point cloud data of the first sensor to a coordinate system of the combined point cloud based on the second conversion relation obtained in the previous step to obtain third point cloud data of the first sensor.
In step S104-3, the second point cloud data and the third point cloud data of the first sensor are compared, and the calibration state of the sensor is determined according to the comparison result.
Since the coordinate system of the third point cloud data is consistent with the coordinate system of the merged point cloud data, the third point cloud data can also be used as a standard coordinate system approximate to the coordinate system of the vehicle. Therefore, by comparing the second point cloud data with the third point cloud data, it can be determined whether the coordinate system of the second point cloud data is the vehicle coordinate system, that is, the calibration state of the first sensor is determined.
Since the merged point cloud data and the second point cloud data cannot be compared directly, the second point cloud data and the third point cloud data of the first sensor are compared, and the calibration state of the first sensor is determined according to the comparison result. Both the second and the third point cloud data are obtained by converting the same first point cloud data of the first sensor, so they contain the same number of points and are convenient to compare and process. In this way, a calibration abnormality caused by a change in the relative position between the sensor and the vehicle is detected in real time, operation and maintenance personnel can handle the abnormality at any time, and the safety of the vehicle during operation is improved.
In an optional embodiment, the comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the sensor according to the comparison result includes: determining that the first sensor is abnormal in calibration under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold; and under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed a first set threshold value, determining that the calibration of the first sensor is normal.
The abnormal calibration of the sensor may be caused by two conditions. In one condition, the calibration abnormality is caused by a change in the relative position between the sensor and the vehicle. In this case, the average position of the corresponding point coordinates in the second point cloud data and the third point cloud data of the first sensor is offset. When the offset distance exceeds the first set threshold, it is determined that the first sensor is abnormally calibrated; when the offset distance does not exceed the first set threshold, it is determined that the first sensor is normally calibrated. The first set threshold can be set according to actual requirements.
According to the method, a sensor calibration abnormality caused by a change in the relative position between the sensor and the vehicle can be detected in real time, operation and maintenance personnel can conveniently handle the abnormality at any time, and the safety of the vehicle during operation is improved.
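A sketch of the comparison, assuming the second and third point cloud data are N x 3 arrays whose rows correspond point for point (both are derived from the same first point cloud data); the first set threshold used here is an illustrative value only:

```python
import numpy as np

def check_offset(second_points, third_points, first_threshold=0.1):
    """Compare corresponding point coordinates in the second and third point
    cloud data; the threshold (in metres here) is illustrative only."""
    offsets = np.linalg.norm(np.asarray(second_points) -
                             np.asarray(third_points), axis=1)
    mean_offset = offsets.mean()  # average position offset distance
    calibration_ok = mean_offset <= first_threshold
    return calibration_ok, mean_offset
```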
In an optional embodiment, the determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data includes: registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation; determining that the first sensor calibration is abnormal under the condition that the parameter deviation in the first conversion relation and the second conversion relation exceeds a second set threshold value;
and determining that the first sensor is calibrated normally under the condition that the parameter deviation in the first conversion relation and the second conversion relation does not exceed a second set threshold value.
The abnormal calibration of the sensor may also be caused by a parameter error in the first conversion relation. In one case, the parameters include the parameters of the six-degree-of-freedom conversion matrix, consisting of a rotation matrix and a translation matrix, that converts from the initial coordinate system of the sensor to the vehicle coordinate system. When the sensor calibration is abnormal due to a parameter error in the first conversion relation, the parameter deviation between the first conversion relation and the second conversion relation exceeds the second set threshold; when the parameter deviation does not exceed the second set threshold, the first sensor is normally calibrated. The second set threshold can be set according to actual requirements.
According to the method, sensor calibration abnormity caused by parameter errors in the first conversion relation can be detected in real time, operation and maintenance personnel can conveniently process the abnormity at any time, and the safety of the vehicle during operation is improved.
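The disclosure does not fix how the parameter deviation is measured; one plausible sketch compares the relative rotation angle and the translation difference between the two 4 x 4 conversion matrices, with illustrative second set thresholds:

```python
import numpy as np

def transform_deviation(T_first, T_second):
    """Rotation-angle (rad) and translation (m) differences between the first
    conversion relation and the registered second conversion relation."""
    dR = T_first[:3, :3].T @ T_second[:3, :3]
    # Angle of the relative rotation: trace(R) = 1 + 2*cos(theta).
    angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
    shift = np.linalg.norm(T_second[:3, 3] - T_first[:3, 3])
    return angle, shift

def calibration_ok(T_first, T_second, max_angle=0.02, max_shift=0.05):
    # The second set thresholds here are illustrative values only.
    angle, shift = transform_deviation(T_first, T_second)
    return angle <= max_angle and shift <= max_shift
```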
In an optional embodiment, the merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data includes: and merging the second point cloud data of other sensors except the first sensor in the plurality of sensors to obtain merged point cloud data.
In one example, the vehicle comprises N sensors, and after the first sensor to be detected is determined, the second point cloud data of other (N-1) sensors except the first sensor are combined to obtain combined point cloud data.
According to the method, when the first sensor may be abnormally calibrated, only the point clouds of the sensors other than the first sensor are merged, which prevents an abnormal calibration of the first sensor from degrading the reference quality of the merged point cloud data and improves the calibration state detection effect.
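A minimal sketch of this merge, assuming each sensor's second point cloud data is kept as an N_i x 3 array keyed by sensor name (the dictionary layout is an assumption for illustration):

```python
import numpy as np

def merge_excluding(second_clouds, first_sensor):
    """Merge the second point cloud data of all sensors except the first
    sensor under test; `second_clouds` maps sensor name -> (N_i x 3) array."""
    others = [pts for name, pts in second_clouds.items()
              if name != first_sensor]
    return np.vstack(others)  # merged point cloud data
```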
FIG. 2 illustrates a flow chart of another sensor detection method illustrated by the present disclosure according to an exemplary embodiment.
In step S201, multi-frame point cloud data merging is performed on the merged point cloud data to obtain multi-frame merged point cloud data.
The merged point cloud data obtained from the sensors is acquired at set time intervals within a set time period to obtain multiple frames of merged point cloud data, and point cloud fusion is performed on these frames to obtain the multi-frame merged point cloud data. Point cloud fusion of multi-frame point cloud data is a known technique in the related art and is not described further in this disclosure.
In step S202, based on the multiple frames of merged point cloud data, a relative displacement of the vehicle is obtained, and a moving direction of the vehicle is obtained.
Through the multi-frame merged point cloud data, the displacement condition of the merged point cloud in continuous time, namely the displacement condition of the vehicle in the set time period can be determined, and the moving direction of the vehicle can be obtained from the displacement condition.
In step S203, the positive direction of the coordinate system of the merged point cloud is compared with the moving direction of the vehicle, and the calibration state of the sensor is determined according to the comparison result.
When the sensor calibration is normal, the positive direction of the merged point cloud coordinate system is consistent with the moving direction of the vehicle, whereas an abnormal calibration causes an included angle between the two. Therefore, by comparing the positive direction of the merged point cloud coordinate system with the moving direction of the vehicle, it can be determined whether any of the plurality of sensors used to acquire the merged point cloud is abnormally calibrated, that is, the calibration state of the sensors is determined.
According to the method, the calibration state of the sensor is determined based on the direction in the driving process of the vehicle, and the detection effect of the calibration state is improved. By detecting the calibration state of the vehicle in running, the operation and maintenance personnel can know whether the vehicle needs to be subjected to sensor calibration optimization or maintenance in time, the time from the occurrence of the calibration abnormity to the detection of the calibration abnormity is shortened, the operation and maintenance personnel can process the calibration abnormity in time, and the safety of the vehicle in running is improved.
In an optional embodiment, the comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result comprises: determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle of the included angle between the two directions exceeds a first angle threshold; and under the condition that the angle of the included angle between the two directions does not exceed the first angle threshold value, determining that the calibration of the plurality of sensors is normal.
And under the condition that at least one sensor in the sensors is abnormal in calibration, the positive direction of the coordinate system of the merged point cloud and the moving direction of the vehicle form an included angle. Determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle of the included angle between the two directions exceeds a first angle threshold; and under the condition that the angle of the included angle between the two directions does not exceed a first angle threshold value, determining that the calibration of the plurality of sensors is normal, wherein the first angle threshold value can be set according to actual requirements.
According to the method, the included angle between the positive direction of the combined point cloud coordinate system and the moving direction of the vehicle is compared, the sensor calibration abnormity is detected in real time in the driving process of the vehicle, the sensor calibration abnormity detection time is shortened, the feedback process from the occurrence of the calibration abnormity to the detection of the calibration abnormity is simplified, operation and maintenance personnel can conveniently process the abnormal calibration at any time, and the safety of the vehicle during operation is improved.
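The disclosure does not fix how the relative displacement is estimated. The sketch below assumes the vehicle position in the merged point cloud coordinate system has already been recovered for each merged frame (for example by frame-to-frame registration), and that the positive direction of that coordinate system is its +X axis; the first angle threshold and variable names are illustrative:

```python
import numpy as np

def direction_check(frame_positions, first_angle_threshold_deg=3.0):
    """`frame_positions` is an (M x 3) array of the vehicle position in
    consecutive merged point cloud frames; the positive direction of the
    merged point cloud coordinate system is taken as +X here."""
    displacement = frame_positions[-1] - frame_positions[0]
    # Assumes the vehicle actually moved over the window (non-zero norm).
    moving_dir = displacement / np.linalg.norm(displacement)
    positive_dir = np.array([1.0, 0.0, 0.0])
    cos_angle = np.clip(np.dot(moving_dir, positive_dir), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    return angle_deg <= first_angle_threshold_deg, angle_deg
```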
FIG. 3 illustrates a flow chart of another sensor detection method illustrated by the present disclosure according to an exemplary embodiment.
In step S301, ground segmentation is performed on the merged point cloud data to obtain ground segmented point cloud data.
The ground segmentation processing is a method for obtaining ground segmentation point cloud, for example, the method includes a plane grid method, a point cloud normal vector, a model fitting method, a bin grid method, and the like. In the embodiment of the disclosure, the ground segmentation point cloud can be obtained by the ground segmentation method.
In step S302, the merged point cloud data is compared with the ground segmented point cloud data, and the calibration state of the sensor is determined according to the comparison result.
When the sensor calibration is normal, the positive direction of the merged point cloud coordinate system is parallel to the ground obtained from the ground-segmented point cloud; an abnormal calibration causes an included angle between the positive direction of the merged point cloud coordinate system and the ground. Therefore, by comparing the positive direction of the merged point cloud coordinate system with the ground, that is, measuring the included angle between them, it can be determined whether any of the plurality of sensors used to acquire the merged point cloud is abnormally calibrated, that is, the calibration state of the sensors is determined.
By comparing the merged point cloud data with the ground segmentation point cloud data, whether calibration abnormality exists in the sensors for acquiring the merged point cloud or not can be determined in real time, and the calibration state of the sensors is also determined. The method has the advantages that the detection time of the calibration abnormity of the sensor is shortened, the feedback flow from the occurrence of the calibration abnormity to the detection of the calibration abnormity is simplified, operation and maintenance personnel can conveniently process the abnormal calibration at any time, and the safety of the vehicle during operation is improved.
According to the method, the calibration state of the sensor is determined based on the ground state, and the detection effect of the calibration state is improved. Through the ground-based calibration state detection, the operation and maintenance personnel can know whether the vehicle needs to be subjected to sensor calibration optimization or maintenance in time, the feedback time from the occurrence of the calibration abnormity to the detection of the calibration abnormity is shortened, the operation and maintenance personnel can process the calibration abnormity in time, and the safety of the vehicle in operation is improved.
In an optional embodiment, the comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result includes: determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained by the ground segmentation point cloud exceeds a second angle threshold; and under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground included angle obtained by the ground segmentation point cloud does not exceed a second angle threshold value, determining that the calibration of the plurality of sensors is normal.
Determining that at least one sensor in the plurality of sensors is abnormal in calibration under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground exceeds a second angle threshold; and under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the included angle formed by the ground does not exceed a second angle threshold, determining that the calibration of the plurality of sensors is normal, wherein the second angle threshold can be set according to actual requirements.
According to the method, the included angle between the positive direction of the combined point cloud coordinate system and the ground acquired by the ground segmentation point cloud is compared, sensor calibration abnormity detection is carried out in real time in the vehicle running process, the sensor calibration abnormity detection time is shortened, the feedback process from the occurrence of calibration abnormity to the detection of the calibration abnormity is simplified, operation and maintenance personnel can conveniently process the abnormal operation at any time, and the safety of the vehicle in operation is improved.
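A sketch of this check, assuming the ground points of the merged cloud have already been segmented out: a plane is fitted to them by SVD and the angle between the positive X direction of the merged coordinate system and that plane is compared with an assumed second angle threshold (all values illustrative):

```python
import numpy as np

def ground_angle_check(ground_points, second_angle_threshold_deg=2.0):
    """Fit a plane to the ground-segmented point cloud data and measure the
    angle between the merged coordinate system's positive X direction and
    the fitted ground plane."""
    pts = np.asarray(ground_points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The plane normal is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1] / np.linalg.norm(vt[-1])
    positive_dir = np.array([1.0, 0.0, 0.0])
    # Angle between a direction and a plane = arcsin(|direction . normal|).
    sin_angle = abs(np.dot(positive_dir, normal))
    angle_deg = np.degrees(np.arcsin(np.clip(sin_angle, 0.0, 1.0)))
    return angle_deg <= second_angle_threshold_deg, angle_deg
```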
In an alternative embodiment, the calibration status of the sensor may also be determined directly through ground segmentation processing.
When the ground is horizontal and flat, ground segmentation processing is respectively performed on the first point cloud data acquired by any two sensors arranged on the vehicle. When the deviation between the height coordinates of the ground-segmented point cloud data respectively acquired by the two sensors exceeds a third set threshold, it is determined that at least one of the sensors is abnormally calibrated; when the deviation does not exceed the third set threshold, it is determined that the sensors are normally calibrated.
According to the method, the height coordinates of the ground segmentation point cloud data respectively acquired by any two sensors are compared, sensor calibration abnormity detection is carried out in real time in the vehicle running process, the sensor calibration abnormity detection time is shortened, the feedback process from the occurrence of the calibration abnormity to the detection of the calibration abnormity is simplified, operation and maintenance personnel can conveniently process the data at any time, and the safety of the vehicle in running is improved.
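A sketch of this optional check, assuming the ground-segmented point cloud data of each sensor has been converted into the vehicle coordinate system with Z pointing up; the third set threshold is an illustrative value:

```python
import numpy as np

def ground_height_check(ground_a, ground_b, third_threshold=0.05):
    """Compare the mean height (Z) coordinates of ground-segmented point
    cloud data acquired by two sensors on flat, level ground."""
    height_a = np.asarray(ground_a)[:, 2].mean()
    height_b = np.asarray(ground_b)[:, 2].mean()
    return abs(height_a - height_b) <= third_threshold
```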
In an optional embodiment, the method further comprises: under the condition that the sensor calibration is determined to be abnormal, generating a sensor calibration abnormality report and sending the abnormality report to an operation and maintenance terminal. The abnormality report may include the point cloud data acquired by the sensors via the above method and other vehicle operation related data.
When it can be determined which specific sensor in the vehicle has a calibration abnormality, operation and maintenance personnel can maintain that sensor; when it is only determined that at least one of the plurality of sensors in the vehicle has a calibration abnormality, operation and maintenance personnel can optimize or maintain the sensors as a whole. In addition, when the sensor calibration abnormality creates a potential safety hazard for the vehicle, operation and maintenance personnel can inform the vehicle user of the current risk in time.
According to the method, when a sensor calibration abnormality occurs, it can be detected in time, the feedback time from the occurrence of the calibration abnormality to its detection is shortened, operation and maintenance personnel can handle the abnormality in time, and the safety of the vehicle during operation is improved.
In an optional embodiment, the method further comprises: and generating a sensor calibration detection report according to the calibration state of the sensor at set time intervals, and sending the calibration detection report to an operation and maintenance terminal.
The set time interval can be set daily or weekly, a sensor calibration detection report is generated according to the calibration state of the sensor, and the calibration detection report is sent to the operation and maintenance terminal. Under the condition that the calibration abnormity occurs in the sensor within set time, the abnormity report is included in the calibration detection report; when the calibration abnormality does not occur in the sensor within the set time, all the point cloud data acquired by the sensor and the current relevant operation data of the vehicle acquired by the method can be included in the calibration detection report.
By the method, operation and maintenance personnel can acquire the calibration state of the sensor and the running state of the vehicle within the set time, so that comprehensive evaluation on the vehicle is facilitated, and the safety of the vehicle in running is improved.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the order of acts, as some steps may, in accordance with the present disclosure, occur in other orders and concurrently.
Further, those skilled in the art will appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules are not necessarily required for the disclosure.
Corresponding to the embodiment of the application function implementation method, the disclosure also provides an embodiment of an application function implementation device and a corresponding terminal.
FIG. 4 shows a schematic diagram of a sensor detection device shown in the present disclosure according to an exemplary embodiment, which may include:
a first point cloud obtaining module 401, configured to obtain first point cloud data of multiple sensors arranged on a vehicle in an initial coordinate system, respectively;
a second point cloud generating module 402, configured to convert the first point cloud data of the sensor into a vehicle coordinate system respectively based on a first conversion relationship, so as to obtain second point cloud data of the sensor, where the first conversion relationship is obtained according to an initial setting pose of the sensor on the vehicle;
a merged point cloud generating module 403, configured to merge second point cloud data of at least two sensors in the multiple sensors to obtain merged point cloud data;
and a calibration detection module 404, configured to determine a calibration state of the sensor according to the second point cloud data and the merged point cloud data.
In combination with any embodiment provided by the present disclosure, the second point cloud generating module converts the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship, and is specifically configured to:
respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a six-degree-of-freedom conversion matrix converted from an initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom conversion matrix comprises a rotation matrix and a translation matrix.
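As an illustrative aside (not part of the claimed method), the six-degree-of-freedom conversion described above can be sketched in Python with a 4x4 homogeneous matrix; the rotation matrix R, the translation vector t, and the point array are hypothetical placeholders standing in for the sensor's nominal mounting pose:

```python
import numpy as np

def to_vehicle_frame(points_sensor: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map an (N, 3) point cloud from the sensor's initial frame into the vehicle frame.

    R (3x3 rotation) and t (3-vector translation) together form the six-degree-of-freedom
    extrinsic -- the "first conversion relation" derived from the sensor's initial pose.
    """
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    # Homogeneous coordinates: p_vehicle = T @ p_sensor
    homogeneous = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (homogeneous @ T.T)[:, :3]
```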
In combination with any embodiment provided by the present disclosure, the calibration detection module determines a calibration state of the sensor according to the first point cloud data, the second point cloud data, and the merged point cloud data of the sensor, and is specifically configured to:
registering first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation, wherein the first sensor is any one of the plurality of sensors;
based on the second conversion relation, converting the first point cloud data of the first sensor into the coordinate system of the merged point cloud to obtain third point cloud data of the first sensor, wherein the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data;
and comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the first sensor according to the comparison result.
In combination with any one of the embodiments provided by the present disclosure, the comparing, by the calibration detection module, the second point cloud data and the third point cloud data of the first sensor, and determining the calibration state of the sensor according to the comparison result includes:
determining that the first sensor is abnormal in calibration under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold;
and under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed a first set threshold value, determining that the calibration of the first sensor is normal.
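A minimal sketch of this check is given below; it assumes that the registration is performed with ICP from Open3D, and the 0.1 m offset threshold and all function names are illustrative rather than taken from the disclosure:

```python
import numpy as np
import open3d as o3d

def first_sensor_calibration_ok(first_pts, second_pts, merged_pts, offset_threshold=0.1):
    """Return True when the first sensor's calibration is considered normal.

    first_pts:  (N, 3) points of the first sensor in its initial frame (first point cloud data)
    second_pts: the same points mapped to the vehicle frame via the nominal extrinsic
    merged_pts: merged point cloud of the other sensors, already in the vehicle frame
    """
    src = o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(first_pts)
    tgt = o3d.geometry.PointCloud()
    tgt.points = o3d.utility.Vector3dVector(merged_pts)

    # Register the raw sensor cloud against the merged cloud -> "second conversion relation"
    reg = o3d.pipelines.registration.registration_icp(
        src, tgt, max_correspondence_distance=0.5,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

    # Third point cloud data: the raw points mapped with the registered transform
    src.transform(reg.transformation)
    third_pts = np.asarray(src.points)

    # Average positional offset between corresponding points of the second and third clouds
    mean_offset = np.linalg.norm(second_pts - third_pts, axis=1).mean()
    return mean_offset <= offset_threshold
```

In practice, the nominal extrinsic could also be passed to the registration as an initial guess, which typically speeds up convergence; that choice is likewise an assumption here.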
In combination with any embodiment provided by the present disclosure, the determining, by the calibration detection module, a calibration state of the sensor according to the second point cloud data and the merged point cloud data includes:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation;
determining that the first sensor calibration is abnormal under the condition that the parameter deviation in the first conversion relation and the second conversion relation exceeds a second set threshold value;
and determining that the first sensor is calibrated normally under the condition that the parameter deviation in the first conversion relation and the second conversion relation does not exceed a second set threshold value.
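A sketch of this alternative check, comparing the two conversion relations directly, follows; the decomposition into a rotation-angle deviation and a translation deviation, as well as the threshold values, are assumptions made for illustration:

```python
import numpy as np

def conversion_relations_deviate(T_first, T_second, rot_thresh_deg=2.0, trans_thresh_m=0.05):
    """Compare the first (nominal) and second (registered) 4x4 conversion relations.

    Returns True when the parameter deviation exceeds the set thresholds, i.e. the
    first sensor's calibration would be flagged as abnormal.
    """
    # Relative rotation between the two extrinsics, expressed as a single angle
    dR = T_first[:3, :3].T @ T_second[:3, :3]
    angle_deg = np.degrees(np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)))
    # Translation deviation as a Euclidean distance
    dt = np.linalg.norm(T_first[:3, 3] - T_second[:3, 3])
    return angle_deg > rot_thresh_deg or dt > trans_thresh_m
```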
In combination with any embodiment provided by the present disclosure, when merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data, the merged point cloud generating module is specifically configured to:
and merging the second point cloud data of other sensors except the first sensor in the plurality of sensors to obtain merged point cloud data.
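Merging the remaining sensors' vehicle-frame clouds is essentially a concatenation; a tiny sketch with hypothetical identifiers:

```python
import numpy as np

def merge_other_sensors(second_clouds: dict, first_sensor_id: str) -> np.ndarray:
    """Merge the second point cloud data of every sensor except the first sensor.

    second_clouds maps a sensor id to its (N_i, 3) point array in the vehicle frame.
    """
    return np.vstack([pts for sid, pts in second_clouds.items() if sid != first_sensor_id])
```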
In combination with any one of the embodiments provided by the present disclosure, the apparatus further includes a first angle comparing module, configured to:
performing multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data;
based on the multi-frame merged point cloud data, obtaining vehicle relative displacement to obtain a vehicle moving direction;
and comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result.
In combination with any embodiment provided by the present disclosure, the first angle comparison module compares the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle and determines the calibration state of the sensor according to the comparison result, and is specifically configured to:
determining that at least one sensor of the plurality of sensors is calibrated abnormally under the condition that the included angle between the two directions exceeds a first angle threshold;
and determining that the plurality of sensors are calibrated normally under the condition that the included angle between the two directions does not exceed the first angle threshold.
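A hedged sketch of this consistency check is shown below; how the per-frame vehicle displacement is recovered from the multi-frame merged clouds (for example, by frame-to-frame registration) is outside this snippet, and the 5-degree threshold is an assumed value:

```python
import numpy as np

def heading_consistent(frame_translations, first_angle_threshold_deg=5.0):
    """Compare the merged-cloud frame's positive direction with the vehicle moving direction.

    frame_translations: sequence of 3-vectors, the vehicle displacement between consecutive
    merged point cloud frames. Assumes the vehicle actually moved over the window.
    Returns False when at least one sensor calibration should be flagged as abnormal.
    """
    motion = np.sum(np.asarray(frame_translations), axis=0)
    motion_dir = motion / np.linalg.norm(motion)
    forward = np.array([1.0, 0.0, 0.0])  # positive (forward) axis of the merged-cloud frame
    angle_deg = np.degrees(np.arccos(np.clip(motion_dir @ forward, -1.0, 1.0)))
    return angle_deg <= first_angle_threshold_deg
```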
In combination with any one of the embodiments provided by the present disclosure, the apparatus further includes a second angle comparison module, configured to:
performing ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data;
and comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result.
In combination with any embodiment provided by the present disclosure, the second angle comparison module compares the merged point cloud data with the ground segmentation point cloud data and determines the calibration state of the sensor according to the comparison result, and is specifically configured to:
determining that at least one sensor of the plurality of sensors is calibrated abnormally under the condition that the included angle between the positive direction of the coordinate system of the merged point cloud data and the ground plane obtained from the ground segmentation point cloud exceeds a second angle threshold;
and determining that the plurality of sensors are calibrated normally under the condition that the included angle between the positive direction of the coordinate system of the merged point cloud data and the ground plane obtained from the ground segmentation point cloud does not exceed the second angle threshold.
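A minimal sketch of the ground check follows, assuming Open3D's RANSAC plane segmentation supplies the ground plane; the tilt is measured here between the fitted ground normal and the frame's nominal up axis, which is equivalent to checking that the frame's positive direction stays parallel to the ground, and the threshold is illustrative:

```python
import numpy as np
import open3d as o3d

def ground_angle_ok(merged_pts, second_angle_threshold_deg=3.0):
    """Segment the ground from the merged cloud and check its tilt against the vehicle frame.

    Returns False when the tilt exceeds the threshold, i.e. at least one sensor
    calibration would be flagged as abnormal.
    """
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(merged_pts)
    # RANSAC plane fit; the dominant plane under the vehicle is taken as the ground
    plane, _ = pcd.segment_plane(distance_threshold=0.05, ransac_n=3, num_iterations=1000)
    normal = np.asarray(plane[:3], dtype=float)
    normal /= np.linalg.norm(normal)
    up = np.array([0.0, 0.0, 1.0])  # nominal up axis of the merged-cloud coordinate system
    tilt_deg = np.degrees(np.arccos(np.clip(abs(normal @ up), 0.0, 1.0)))
    return tilt_deg <= second_angle_threshold_deg
```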
In combination with any one of the embodiments provided by the present disclosure, the apparatus further includes an exception reporting module configured to:
and under the condition that a sensor calibration abnormality exists, generating a sensor calibration abnormality report, and sending the abnormality report to an operation and maintenance terminal.
In combination with any one of the embodiments provided in this disclosure, the apparatus further includes a detection reporting module configured to:
and generating a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and sending the detection report to an operation and maintenance terminal.

For the device embodiment, since it basically corresponds to the method embodiment, reference may be made to the relevant parts of the description of the method embodiment. The above-described embodiments of the apparatus are merely illustrative; the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the scheme of the disclosure. One of ordinary skill in the art can understand and implement it without inventive effort.
FIG. 5 illustrates a block diagram of an electronic device in accordance with an exemplary embodiment of the present disclosure.
As shown in fig. 5, the apparatus may include: a processor, memory, a network interface, and an internal bus. The processor, the memory and the network interface are in communication connection with each other inside the device through the bus.
The processor may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute relevant programs so as to implement the technical solutions provided in the present application.
The memory may be implemented in the form of a ROM (Read-Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory may store an operating system and other application programs; when the technical solutions provided in the present application are implemented by software or firmware, the relevant program code is stored in the memory and is called and executed by the processor.
The network interface is used for connecting a communication module (not shown in the figure) to implement communication interaction between this device and other devices. The communication module may communicate in a wired manner (for example, USB or network cable) or in a wireless manner (for example, mobile network, Wi-Fi, or Bluetooth).
A bus includes a path that transfers information between the various components of the device (e.g., processor, memory, network interface).
It should be noted that although the above-described device shows only a processor, a memory, a network interface and a bus, in a specific implementation, the device may also include other components necessary for proper operation. Furthermore, it will be understood by those skilled in the art that the apparatus described above may also include only the components necessary to implement the solution of the present application, and not necessarily all of the components shown in the figures.
In an exemplary embodiment, the present disclosure also provides a non-transitory computer readable storage medium comprising instructions, such as a memory comprising instructions, executable by a processor of an electronic device to perform the steps of the sensor detection method described above. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, the present disclosure also provides an intelligent vehicle including the above-described electronic device.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A sensor detection method, the method comprising:
respectively acquiring first point cloud data of a plurality of sensors arranged on a vehicle in an initial coordinate system;
respectively converting first point cloud data of the sensor to a vehicle coordinate system based on a first conversion relation to obtain second point cloud data of the sensor, wherein the first conversion relation is obtained according to an initial set pose of the sensor on the vehicle;
merging the second point cloud data of at least two sensors of the plurality of sensors to obtain merged point cloud data;
and determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
2. The method of claim 1, wherein the converting the first point cloud data of the sensor to a vehicle coordinate system based on the first conversion relationship, respectively, comprises:
respectively converting the first point cloud data of the sensor to a vehicle coordinate system based on a six-degree-of-freedom conversion matrix converted from an initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom conversion matrix comprises a rotation matrix and a translation matrix.
3. The method of claim 1, wherein determining the calibration status of the sensor from the first point cloud data, the second point cloud data, and the merged point cloud data of the sensor comprises:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation, wherein the first sensor is any one of the plurality of sensors;
based on the second conversion relation, converting the first point cloud data of the first sensor into a coordinate system of combined point cloud to obtain third point cloud data of the first sensor, wherein the coordinate system of the combined point cloud is obtained by fitting the combined point cloud data;
and comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the first sensor according to the comparison result.
4. The method of claim 3, wherein comparing the second point cloud data of the first sensor with the third point cloud data and determining the calibration status of the sensor based on the comparison comprises:
determining that the calibration of the first sensor is abnormal under the condition that the average position offset distance of corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold;
and under the condition that the average position offset distance of the corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed a first set threshold value, determining that the calibration of the first sensor is normal.
5. The method of claim 1, wherein determining the calibration state of the sensor from the second point cloud data and the merged point cloud data comprises:
registering the first point cloud data of the first sensor and the merged point cloud data to obtain a second conversion relation;
determining that the first sensor calibration is abnormal under the condition that the parameter deviation in the first conversion relation and the second conversion relation exceeds a second set threshold value;
and determining that the first sensor is calibrated normally under the condition that the parameter deviation in the first conversion relation and the second conversion relation does not exceed a second set threshold value.
6. The method of claim 3, wherein the merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data comprises:
and merging the second point cloud data of other sensors except the first sensor in the plurality of sensors to obtain merged point cloud data.
7. The method of claim 1, further comprising:
carrying out multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data;
obtaining vehicle relative displacement based on the multi-frame merged point cloud data to obtain the vehicle moving direction;
and comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result.
8. The method of claim 7, wherein comparing the positive coordinate system orientation of the merged point cloud with the direction of vehicle movement and determining the calibration status of the sensor based on the comparison comprises:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the included angle between the two directions exceeds a first angle threshold value;
and under the condition that the included angle between the two directions does not exceed the first angle threshold value, determining that the calibration of the plurality of sensors is normal.
9. The method of claim 1, further comprising:
performing ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data;
and comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result.
10. The method of claim 9, wherein comparing the merged point cloud data with the ground segmented point cloud data and determining a calibration status of the sensor based on the comparison comprises:
determining that at least one sensor in the plurality of sensors is calibrated abnormally under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained by the ground segmentation point cloud exceeds a second angle threshold;
and under the condition that the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained by the ground segmentation point cloud does not exceed a second angle threshold value, determining that the plurality of sensors are normally calibrated.
11. The method according to any one of claims 1-10, further comprising:
and under the condition that the sensor calibration is determined to be abnormal, generating a sensor calibration abnormal report, and sending the abnormal report to an operation and maintenance terminal.
12. The method according to any one of claims 1-10, further comprising:
and generating a sensor calibration detection report according to the calibration state of the sensor at set time intervals, and sending the calibration detection report to an operation and maintenance terminal.
13. A sensor detection apparatus, the apparatus comprising:
the system comprises a first point cloud acquisition module, a second point cloud acquisition module and a third point cloud acquisition module, wherein the first point cloud acquisition module is used for respectively acquiring first point cloud data of a plurality of sensors arranged on a vehicle in an initial coordinate system;
the second point cloud generating module is used for respectively converting the first point cloud data of the sensor into a vehicle coordinate system based on a first conversion relation to obtain second point cloud data of the sensor, wherein the first conversion relation is obtained according to an initial setting pose of the sensor on the vehicle;
the merged point cloud generating module is used for merging the second point cloud data of at least two sensors in the plurality of sensors to obtain merged point cloud data;
and the calibration detection module is used for determining the calibration state of the sensor according to the first point cloud data, the second point cloud data and the combined point cloud data of the sensor.
14. An electronic device, characterized in that the electronic device comprises:
a memory for storing processor-executable instructions;
a processor configured to execute the executable instructions in the memory to implement the steps of the method of any one of claims 1 to 12.
15. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 12.
16. An intelligent vehicle, characterized in that it comprises an electronic device according to claim 14.
CN202111486395.7A 2021-12-07 2021-12-07 Sensor detection method, sensor detection device, electronic equipment and readable storage medium Active CN115235525B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111486395.7A CN115235525B (en) 2021-12-07 2021-12-07 Sensor detection method, sensor detection device, electronic equipment and readable storage medium
PCT/CN2022/071109 WO2023103143A1 (en) 2021-12-07 2022-01-10 Sensor inspection method and apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111486395.7A CN115235525B (en) 2021-12-07 2021-12-07 Sensor detection method, sensor detection device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN115235525A true CN115235525A (en) 2022-10-25
CN115235525B CN115235525B (en) 2023-05-23

Family

ID=83666049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111486395.7A Active CN115235525B (en) 2021-12-07 2021-12-07 Sensor detection method, sensor detection device, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN115235525B (en)
WO (1) WO2023103143A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019117821A1 (en) * 2019-07-02 2021-01-07 Valeo Schalter Und Sensoren Gmbh Calibration of an active optical sensor system using a calibration target
CN112819896B (en) * 2019-11-18 2024-03-08 商汤集团有限公司 Sensor calibration method and device, storage medium and calibration system
CN112241007A (en) * 2020-07-01 2021-01-19 北京新能源汽车技术创新中心有限公司 Calibration method and arrangement structure of automatic driving environment perception sensor and vehicle
US11703577B2 (en) * 2020-08-14 2023-07-18 Baidu Usa Llc Recalibration determination system for autonomous driving vehicles with multiple LiDAR sensors
CN112146682B (en) * 2020-09-22 2022-07-19 福建牧月科技有限公司 Sensor calibration method and device for intelligent automobile, electronic equipment and medium
CN112577517A (en) * 2020-11-13 2021-03-30 上汽大众汽车有限公司 Multi-element positioning sensor combined calibration method and system
CN113610745A (en) * 2021-01-20 2021-11-05 腾讯科技(深圳)有限公司 Calibration evaluation parameter acquisition method and device, storage medium and electronic equipment

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190291721A1 (en) * 2016-06-29 2019-09-26 Clarion Co., Ltd. In-Vehicle Processing Apparatus
CN107167090A (en) * 2017-03-13 2017-09-15 深圳市速腾聚创科技有限公司 Vehicle overall dimension measuring method and system
US20200158869A1 (en) * 2018-11-19 2020-05-21 Elmira Amirloo Abolfathi System, device and method of generating a high resolution and high accuracy point cloud
CN110741282A (en) * 2019-08-21 2020-01-31 深圳市速腾聚创科技有限公司 External parameter calibration method and device, computing equipment and computer storage medium
CN111007530A (en) * 2019-12-16 2020-04-14 武汉汉宁轨道交通技术有限公司 Laser point cloud data processing method, device and system
WO2021189479A1 (en) * 2020-03-27 2021-09-30 深圳市速腾聚创科技有限公司 Pose correction method and device for roadbed sensor, and roadbed sensor
CN113487479A (en) * 2021-06-30 2021-10-08 北京易控智驾科技有限公司 Method and system for detecting and identifying high-precision map boundary in real time at vehicle end
CN113658256A (en) * 2021-08-16 2021-11-16 智道网联科技(北京)有限公司 Target detection method and device based on laser radar and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
W. LIU 等: "A novel multifeature based on-site calibration method for Lidar-IMU system", IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS *
张恬洁; 康志忠: "Indoor 3D modeling fusing depth camera point clouds and optical images", Science of Surveying and Mapping (测绘科学)
钟棉卿: "Research on road surface condition detection methods based on mobile LiDAR data", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II (中国博士学位论文全文数据库 工程科技Ⅱ辑)

Also Published As

Publication number Publication date
WO2023103143A1 (en) 2023-06-15
CN115235525B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
US10423164B2 (en) Object position measurement with automotive camera using vehicle motion data
CN107710094B (en) Online calibration check during autonomous vehicle operation
US9911041B2 (en) Monitoring device, monitoring system and monitoring method
CN109094669A (en) Method and apparatus for assessing hinge angle
CN109591009B (en) Robot system
CN110850859B (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
CN110827361B (en) Camera group calibration method and device based on global calibration frame
CN112802092B (en) Obstacle sensing method and device and electronic equipment
WO2022121460A1 (en) Agv intelligent forklift, and method and apparatus for detecting platform state of floor stack inventory areas
KR20200121756A (en) Initialization Diagnosis Method and System of a Mobile Robot
CN104006740A (en) Object detecting method and object detecting device
KR20210061875A (en) Method for detecting defects in the 3d lidar sensor using point cloud data
CN111684382A (en) Movable platform state estimation method, system, movable platform and storage medium
CN110426714B (en) Obstacle identification method
CN113741388A (en) Safety diagnosis system and method based on automatic driving perception failure
US11836945B2 (en) Image processing apparatus, image processing method, and program
CN115235525A (en) Sensor detection method and device, electronic equipment and readable storage medium
CN117067261A (en) Robot monitoring method, device, equipment and storage medium
CN113739819B (en) Verification method, verification device, electronic equipment, storage medium and chip
CN116047499A (en) High-precision real-time protection system and method for power transmission line of target construction vehicle
CN112344966B (en) Positioning failure detection method and device, storage medium and electronic equipment
CN111251301A (en) Motion planning method for operation arm of power transmission line maintenance robot
EP3396594A1 (en) Tracking system and method thereof
WO2022239355A1 (en) Position measurement system
KR20230076008A (en) A apparatus and method for multi-sensor calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant