WO2020147498A1 - Detection method and apparatus for autonomous driving sensors, and electronic device - Google Patents

Detection method and apparatus for autonomous driving sensors, and electronic device

Info

Publication number
WO2020147498A1
WO2020147498A1 · PCT/CN2019/126304 · CN2019126304W
Authority
WO
WIPO (PCT)
Prior art keywords
detected
sensor
standard
result
shooting
Prior art date
Application number
PCT/CN2019/126304
Other languages
English (en)
French (fr)
Inventor
黄轩
吴楠
周珣
薛晶晶
刘颖楠
Original Assignee
北京百度网讯科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司 filed Critical 北京百度网讯科技有限公司
Publication of WO2020147498A1 publication Critical patent/WO2020147498A1/zh
Priority to US17/125,207 priority Critical patent/US11933604B2/en

Classifications

    • G01B 11/26 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01D 18/00 Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4808 Evaluating distance, position or velocity data
    • G01S 7/497 Means for monitoring or calibrating
    • G01S 7/4972 Alignment of sensor
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • G05D 1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D 1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G06V 10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television cameras

Definitions

  • the embodiments of the present invention relate to the technical field of intelligent driving, and in particular to a detection method, apparatus, and electronic device for an autonomous driving sensor.
  • environmental perception is one of the core capabilities of an autonomous driving system, and it is mainly achieved by various sensors, such as cameras and lidars.
  • the installation position and angle of a sensor have a significant impact on the environment perception algorithm; it is therefore particularly important to ensure the accuracy of the installation position when installing the sensor.
  • at present, the detection of the sensor installation position is mainly performed manually, resulting in low detection efficiency.
  • the embodiments of the present invention provide a detection method, apparatus, and electronic device for an autonomous driving sensor, so as to improve detection efficiency.
  • an embodiment of the present invention provides a detection method for an automatic driving sensor, including:
  • establishing a standard association relationship in advance, where the standard association relationship is the correspondence between the standard shooting result of the sensor to be detected in a fixed scene when it is correctly installed and the sensor's ID information;
  • using the sensor to be detected to shoot in the fixed scene, and displaying the corresponding shooting result;
  • outputting a first detection result according to the shooting result and the standard association relationship, where the first detection result is used to indicate whether the installation sequence of the sensor to be detected is correct.
  • the sensor to be detected includes: a camera or a lidar;
  • the shooting result of the camera is a shooting image
  • the shooting result of the lidar is a shooting point cloud
  • the method further includes:
  • when the installation sequence of the sensor to be detected is wrong, adjusting the installation sequence of the sensor to be detected according to the standard association relationship.
  • the method further includes:
  • the installation angle of the sensor to be detected is detected according to the position of the target in the shooting result and the standard reference mark, and a second detection result is output, where the second detection result is used to indicate whether the installation angle of the sensor to be detected is accurate.
  • the method further includes:
  • when the installation angle of the sensor to be detected is not accurate, one or more of the horizontal angle, the pitch angle, and the rotation angle of the sensor to be detected is adjusted according to the position of the target in the shooting result and the standard reference mark.
  • the standard reference mark is determined in the following manner:
  • identifying one target object in the standard shooting result and setting a standard reference mark at the position of the target object, where the standard reference mark is a horizontal line, a vertical line, or a contour line of the target object; or
  • identifying a plurality of target objects in the standard shooting result and setting a standard reference mark at the positions of the plurality of target objects, where the standard reference mark is a line connecting the center points of the plurality of target objects.
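  • The multi-target case above can be sketched as follows. This is a minimal sketch, not the patent's implementation: the function name and the representation of the mark as a list of line segments are illustrative assumptions.

```python
def reference_mark_for_targets(centers):
    """Derive a standard reference mark from recognized target centers.

    With two or more targets, the mark is the polyline connecting their
    center points, represented here as a list of line segments. A single
    target would instead use a horizontal line, vertical line, or its
    contour line, which is not modeled in this sketch.
    """
    if len(centers) < 2:
        raise ValueError("need at least two target centers for a connecting line")
    # Pair each center with the next one to form the connecting segments.
    return list(zip(centers, centers[1:]))

# Example: three targets recognized in the standard shooting result.
segments = reference_mark_for_targets([(0, 0), (2, 2), (4, 0)])
```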
  • Optionally, after one or more of the horizontal angle, pitch angle, and rotation angle of the sensor to be detected is adjusted according to the position of the target in the shooting result and the standard reference mark, the method further includes: fixing the sensor to be detected.
  • an embodiment of the present invention further provides a detection apparatus for an autonomous driving sensor, which includes:
  • the standard establishment module, which is used to establish a standard association relationship in advance, where the standard association relationship is the correspondence between the standard shooting result of the sensor to be detected in a fixed scene when it is correctly installed and the sensor's ID information;
  • the display module is used to display the corresponding shooting result when the sensor to be detected is used for shooting in a fixed scene;
  • the first detection module is configured to output a first detection result according to the shooting result and the standard association relationship, and the first detection result is used to indicate whether the installation sequence of the sensor to be detected is correct.
  • the sensor to be detected includes: a camera or a lidar;
  • the shooting result of the camera is a shooting image
  • the shooting result of the lidar is a shooting point cloud
  • the above-mentioned device further includes: an adjustment module;
  • the adjustment module is configured to adjust the installation sequence of the sensors to be detected according to the standard association relationship when the installation sequence of the sensors to be detected is wrong.
  • the device further includes:
  • the second detection module is configured to detect the installation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark, and to output a second detection result, where the second detection result is used to indicate whether the installation angle of the sensor to be detected is accurate.
  • the adjustment module is further configured to, when the installation angle of the sensor to be detected is not accurate, adjust one or more of the horizontal angle, the pitch angle, and the rotation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark.
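  • The angle check performed by the second detection module can be sketched as follows. This is a minimal sketch under stated assumptions: the function, the use of a single target center against a single mark center, and the pixel tolerance are illustrative, not taken from the patent.

```python
def detect_angle_deviation(target_center, mark_center, tolerance=2.0):
    """Compare the target's position in the shooting result against the
    standard reference mark.

    A horizontal offset beyond the tolerance suggests a horizontal-angle
    error; a vertical offset suggests a pitch-angle error. An empty
    return list means the installation angle is accurate.
    """
    dx = target_center[0] - mark_center[0]
    dy = target_center[1] - mark_center[1]
    deviations = []
    if abs(dx) > tolerance:
        deviations.append("horizontal angle")
    if abs(dy) > tolerance:
        deviations.append("pitch angle")
    return deviations

# Second detection result for a sensor whose target sits 10 px to the right
# of where the standard reference mark says it should be.
result = detect_angle_deviation((110, 50), (100, 50))
```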
  • the device further includes: a determining module
  • the determining module is used to determine the standard reference mark in the following manner:
  • identifying one target object in the standard shooting result and setting a standard reference mark at the position of the target object, where the standard reference mark is a horizontal line, a vertical line, or a contour line of the target object; or
  • identifying a plurality of target objects in the standard shooting result and setting a standard reference mark at the positions of the plurality of target objects, where the standard reference mark is a line connecting the center points of the plurality of target objects.
  • the device further includes: a fixing module
  • the fixing module is configured to fix the sensor to be detected after one or more of the horizontal angle, pitch angle, and rotation angle of the sensor to be detected has been adjusted according to the position of the target in the shooting result and the standard reference mark.
  • the present invention provides an electronic device including: a memory, a processor, and a display;
  • the memory is used to store computer programs
  • the display is used to display the shooting result of the sensor to be detected in a fixed scene
  • the processor is configured to execute the computer program to execute the method described in any one of the first aspect.
  • the present invention provides a computer-readable storage medium in which a computer program is stored, and when the computer program is executed by a processor, the method according to any one of the first aspects is executed.
  • Embodiments of the present invention provide a detection method, device, and electronic equipment for an automatic driving sensor.
  • the sensor to be detected is used to shoot in a fixed scene, and the corresponding shooting result is displayed.
  • a first detection result is output according to the shooting result and the standard association relationship, where the first detection result is used to indicate whether the installation sequence of the sensor to be detected is correct. That is, by automatically identifying whether the installation sequence of the sensors is correct, the method of the embodiment of the present invention effectively ensures the accuracy of the detection result and effectively improves the detection efficiency.
  • the first detection result and the standard association relationship can also provide a reliable basis for adjusting the installation sequence of the sensors to be detected.
  • FIG. 1 is a schematic flowchart of Embodiment 1 of the detection method for an autonomous driving sensor provided by the present invention;
  • FIG. 2 is a schematic flowchart of Embodiment 2 of the detection method for an autonomous driving sensor provided by the present invention;
  • FIG. 3 is a schematic flowchart of Embodiment 3 of the detection method for an autonomous driving sensor provided by the present invention;
  • FIG. 4 is a schematic structural diagram of Embodiment 1 of the detection apparatus for an autonomous driving sensor provided by the present invention;
  • FIG. 5 is a schematic structural diagram of Embodiment 2 of the detection apparatus for an autonomous driving sensor provided by the present invention;
  • FIG. 6 is a schematic structural diagram of Embodiment 3 of the detection apparatus for an autonomous driving sensor provided by the present invention;
  • FIG. 7 is a schematic structural diagram of Embodiment 1 of an electronic device provided by the present invention.
  • the environmental perception capability is one of the core capabilities of an autonomous driving system, and it is mainly realized by a variety of autonomous driving sensors, such as cameras and lidars.
  • the installation position and installation angle of a sensor have a great influence on the environment perception algorithm. Therefore, in the process of installing a sensor, it is particularly important to ensure the accuracy of the installation position and installation angle.
  • embodiments of the present invention provide a detection method for an automatic driving sensor to improve detection efficiency.
  • the method of the embodiment of the present invention can be applied to the scenario where the autonomous driving sensors installed on an unmanned vehicle are detected and calibrated before the vehicle is put into operation, so as to ensure that the autonomous driving sensors installed on the vehicle can accurately perceive road environment information.
  • the method of the embodiment of the present invention can also be applied to scenarios where a manually driven vehicle detects and calibrates multiple automatic driving sensors installed on the vehicle to ensure that the automatic driving sensors installed on the vehicle can accurately perceive road environment information
  • FIG. 1 is a schematic flowchart of Embodiment 1 of a detection method for an automatic driving sensor provided by the present invention.
  • the detection method of the automatic driving sensor provided by the embodiment of the present invention is executed by the detection device or electronic device of the automatic driving sensor provided by the embodiment of the present invention.
  • the electronic device may be, but is not limited to, a computer, a smartphone, a tablet computer, etc.
  • in the following, an electronic device is taken as the execution subject as an example for detailed description.
  • the method of this embodiment includes:
  • the sensor to be detected includes a camera or a lidar, and accordingly, the shooting result of the camera is a shot image, and the shooting result of the lidar is a shooting point cloud.
  • if the sensor to be detected is a camera, the pre-established standard association relationship is the correspondence between the standard captured image and the camera's ID information when it is correctly installed; if the sensor to be detected is a lidar, the pre-established standard association relationship is the correspondence between the standard shooting point cloud and the lidar's ID information.
  • the identification (Identification, ID for short) information may be an identification number assigned by the user to the sensor.
  • the ID information may also be Internet Protocol Address (IP address) information corresponding to the lidar.
  • the standard shooting image may be a shooting image of a standard camera in a fixed scene
  • the standard shooting point cloud may be a shooting point cloud of a standard lidar in a fixed scene.
  • the fixed scene in this embodiment may be a pre-established standard test room, where the vehicle can be parked at a preset position in the standard test room, and the sensors installed on the vehicle can shoot the scene in the standard test room.
  • a possible way to establish the standard association relationship in advance is: the electronic device displays a corresponding input interface according to the user's instruction, and the user inputs a set of corresponding standard shooting results and sensor ID information on the input interface; the electronic device then generates a set of standard association relationships based on the standard shooting results and the sensor ID information input by the user. If the vehicle to be detected includes multiple sensors to be detected, the above steps are repeated to input the standard association relationships corresponding to the multiple sensors to be detected into the electronic device.
  • in another possible way, the standard association relationship is stored in a storage medium in the form of a document; the electronic device is connected to the storage medium, and the user imports the standard association relationship into the electronic device through a corresponding control instruction.
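  • Both ways of establishing the relationship can be sketched as building a mapping from sensor ID information to its standard shooting result. This is an illustrative sketch only: the function names, the use of file paths to stand in for shooting results, and the assumption that the stored document is JSON are not from the patent.

```python
import json

def build_standard_association(entries):
    """Build the standard association relationship: a mapping from sensor
    ID information to the standard shooting result captured when the
    sensor is correctly installed. `entries` plays the role of the pairs
    the user enters on the input interface."""
    return {sensor_id: standard_result for sensor_id, standard_result in entries}

def load_standard_association(path):
    """Alternative: import an association previously stored as a document
    on a storage medium (assumed here to be a JSON file)."""
    with open(path) as f:
        return json.load(f)

# Example: two cameras identified by user-assigned IDs.
assoc = build_standard_association([
    ("CAM_FRONT_LEFT", "standard_image_front_left.png"),
    ("CAM_FRONT_RIGHT", "standard_image_front_right.png"),
])
```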
  • S102 Use the sensor to be detected to shoot in a fixed scene, and display a corresponding shooting result.
  • the electronic device displays the shooting result of the sensor to be detected in a fixed scene on the display unit of the electronic device, where the display unit may be a display screen or a display component.
  • the electronic device displays the image taken by the camera in a fixed scene on the display unit.
  • if the camera is installed at the front left position of the vehicle to be detected, the camera is used to photograph the left front area of the vehicle in the fixed scene, and the electronic device displays the captured image on the display unit.
  • if the camera is installed at the front right position of the vehicle to be detected, the camera is used to photograph the right front area of the vehicle in the fixed scene, and the electronic device displays the captured image on the display unit.
  • the electronic device displays the shooting point cloud of the lidar in a fixed scene on the display unit.
  • if the lidar is installed at the front left position of the vehicle to be detected, the lidar is used to photograph the left front area of the vehicle in the fixed scene, and the electronic device displays the shooting point cloud on the display unit.
  • if the lidar is installed at the front right position of the vehicle to be detected, the lidar is used to photograph the right front area of the vehicle in the fixed scene, and the electronic device displays the shooting point cloud on the display unit.
  • the first detection result output by the electronic device is used to indicate whether the installation sequence of the sensors to be detected is correct.
  • the electronic device determines whether the current installation sequence of the sensors to be detected is correct according to the shooting result and the standard shooting result in the standard association relationship.
  • the sensor to be detected is a camera
  • a possible implementation is that the electronic device uses image recognition technology to identify the target in the captured image and the target in the standard captured image, respectively. If the features of the target in the captured image are consistent with the features of the target in the standard captured image, the electronic device outputs a prompt message of "installation sequence correct" on the display unit; if they are inconsistent, the electronic device outputs a prompt message of "installation sequence error" on the display unit.
  • the electronic equipment automatically recognizes whether the camera installation sequence is correct, and outputs the detection results, which not only ensures the accuracy of the detection results, but also improves the detection efficiency.
  • in another possible implementation, the electronic device displays the captured image in a first preset area of the display unit and displays the standard captured image in a second preset area of the display unit. The electronic device then uses image recognition technology to identify the target in the captured image and the target in the standard captured image, respectively. If the features of the target in the captured image are consistent with the features of the target in the standard captured image, the electronic device outputs a prompt message of "installation sequence correct" on the display unit; if they are inconsistent, it outputs a prompt message of "installation sequence error".
  • because the captured image and the standard captured image are displayed in different areas of the display unit, the relevant staff can also observe whether the target in the captured image is consistent with the target in the standard captured image. Therefore, the electronic device can also display a prompt box that includes a prompt message of "manually confirm whether they are consistent" together with two buttons, "Yes" and "No". The relevant staff can click the "Yes" or "No" button to input the manually confirmed detection result. If the manually confirmed detection result is consistent with the automatic confirmation result of the electronic device, it can be determined accordingly whether the installation sequence of the camera is correct or wrong.
  • the electronic equipment automatically recognizes whether the camera installation sequence is correct, and outputs the detection results, which not only ensures the accuracy of the detection results, but also improves the detection efficiency.
  • the manual confirmation of the detection result is compared with the identification result of the electronic device, which further improves the accuracy of the detection result.
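  • The comparison and cross-check described above can be sketched as follows. This is a hedged sketch: feature extraction by image recognition is assumed to happen elsewhere, and the feature values, function names, and result strings are illustrative, not the patent's implementation.

```python
def check_installation_sequence(captured_features, standard_features):
    """First detection result: compare target features recognized in the
    shooting result with those in the standard shooting result. The
    features are plain comparable values standing in for whatever the
    image recognition step produces."""
    if captured_features == standard_features:
        return "installation sequence correct"
    return "installation sequence error"

def cross_check(automatic_result, manual_result):
    """The detection is settled only when the manually confirmed result
    agrees with the electronic device's automatic result."""
    return automatic_result == manual_result

# The target recognized in this camera's image does not match the target
# in that camera's standard image, so the sequence is reported as wrong.
first_result = check_installation_sequence("pattern_B", "pattern_A")
```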
  • the sensor to be detected is Lidar
  • a possible implementation is that the electronic device uses image recognition technology to identify the target in the shooting point cloud and the target in the standard shooting point cloud, respectively. If the features of the target in the shooting point cloud are consistent with the features of the target in the standard shooting point cloud, the electronic device outputs a prompt message of "installation sequence correct" on the display unit; if they are inconsistent, the electronic device outputs a prompt message of "installation sequence error" on the display unit.
  • the electronic equipment automatically recognizes whether the lidar installation sequence is correct, and outputs the detection results, which not only ensures the accuracy of the detection results, but also improves the detection efficiency.
  • in another possible implementation, the electronic device displays the shooting point cloud in a first preset area of the display unit and displays the standard shooting point cloud in a second preset area of the display unit. The electronic device then uses image recognition technology to identify the target in the shooting point cloud and the target in the standard shooting point cloud, respectively. If the features of the target in the shooting point cloud are consistent with the features of the target in the standard shooting point cloud, the electronic device outputs a prompt message of "installation sequence correct" on the display unit; if they are inconsistent, it outputs a prompt message of "installation sequence error".
  • because the shooting point cloud and the standard shooting point cloud are displayed in different areas of the display unit, the relevant staff can also observe whether the target in the shooting point cloud is consistent with the target in the standard shooting point cloud. Therefore, the electronic device can also display a prompt box that includes a prompt message of "manually confirm whether they are consistent" together with two buttons, "Yes" and "No". The relevant staff can click the "Yes" or "No" button to input the manually confirmed detection result. If the manually confirmed detection result is consistent with the automatic confirmation result of the electronic device, it can be determined accordingly whether the installation sequence of the lidar is correct or wrong.
  • the electronic equipment automatically recognizes whether the lidar installation sequence is correct, and outputs the detection results, which not only ensures the accuracy of the detection results, but also improves the detection efficiency.
  • the manual confirmation of the detection result is compared with the identification result of the electronic device, which further improves the accuracy of the detection result.
  • the captured image or the captured point cloud and the above-mentioned prompt message may also be displayed on the display unit in other forms of layout, and are not limited to the form described in this embodiment.
  • the sensor to be detected is used to shoot in a fixed scene, and the corresponding shooting result is displayed.
  • after that, a first detection result is output according to the shooting result and the above standard association relationship, where the first detection result is used to indicate whether the installation sequence of the sensor to be detected is correct. That is, the method of this embodiment automatically recognizes whether the installation sequence of the sensors is correct, effectively ensuring the accuracy of the detection result and effectively improving the detection efficiency.
  • the first detection result and the standard association relationship can also provide a reliable basis for adjusting the installation sequence of the sensors to be detected.
  • FIG. 2 is a schematic flowchart of Embodiment 2 of the detection method for an autonomous driving sensor provided by the present invention. As shown in FIG. 2, the method of this embodiment includes:
  • S201 Pre-establish a standard association relationship.
  • S202 Use the sensor to be detected to shoot in a fixed scene, and display a corresponding shooting result.
  • S203 Output a first detection result according to the shooting result and the standard association relationship.
  • Steps S201-S203 in this embodiment are similar to steps S101-S103 in the embodiment shown in FIG. 1, and will not be repeated here.
  • if the first detection result indicates that the installation sequence of the sensor to be detected is wrong, step S204 may be performed: adjusting the installation sequence of the sensor to be detected according to the standard association relationship.
  • the sensor to be detected is a camera
  • One possible implementation is: determine, according to the standard association relationship, the ID information of the camera that should be installed at the current installation position, and determine the parameter information of the camera corresponding to that ID information; the electronic device then updates the parameter information of the camera at the current position, thereby adjusting the installation sequence of the camera.
  • Another possible implementation manner is: the electronic device determines the correct installation position of the camera according to the standard association relationship, removes the camera from the current position, and installs it to the correct installation position.
  • Case 2: the sensor to be detected is a lidar
  • One possible implementation: according to the standard association relationship, determine the ID information of the lidar that should be installed at the current installation position when correctly installed; determine, from that ID information, the parameter information of the corresponding lidar; the electronic device then updates the parameter information of the lidar at the current position, thereby adjusting the installation sequence of the lidars.
  • Another possible implementation: the electronic device determines the correct installation position of the lidar according to the standard association relationship, and the lidar is removed from its current position and installed at the correct installation position.
  • In this embodiment, the sensor to be detected shoots in a fixed scene and the corresponding shooting result is displayed; after that, the first detection result is output according to the shooting result and the above standard association relationship.
  • The first detection result indicates whether the installation sequence of the sensors to be detected is correct. If the first detection result indicates that the installation sequence is wrong, the installation sequence of the sensors to be detected is further adjusted according to the standard association relationship. That is, by automatically recognizing whether the installation sequence of the sensors is correct, the method of this embodiment effectively ensures the accuracy of the detection result and effectively improves detection efficiency.
  • In addition, adjusting the installation sequence of the sensors to be detected according to the first detection result and the standard association relationship has high reliability.
  • By repeating the method of the embodiment shown in FIG. 1 or FIG. 2, the installation sequence of the multiple sensors installed on the vehicle to be detected can be checked, and sensors whose installation sequence is wrong can be adjusted according to the detection result.
  • FIG. 3 is a schematic flowchart of Embodiment 3 of the detection method for automatic driving sensors provided by the present invention.
  • the method in this embodiment may be executed after step S103 in the embodiment shown in FIG. 1, or may be executed after step S204 in the embodiment shown in FIG. 2.
  • the method of this embodiment includes:
  • S301 Detect the installation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark, and output a second detection result.
  • the second detection result is used to indicate whether the installation angle of the sensor to be detected is accurate.
  • the standard reference mark is a mark corresponding to the position of the target object in the standard shooting result.
  • the standard reference mark may be a physical template applied to the outside of the display unit, or may be a reference mark set at the pixel area corresponding to the target object in the display unit when the standard shooting result is displayed in a preset manner.
  • the electronic device can determine whether the installation angle of the sensor is accurate by judging whether the position of the target in the shooting result coincides with the standard reference mark.
  • the standard reference mark is determined in the following way:
  • the electronic device recognizes a target in the standard shooting result, and sets a standard reference mark at the position of the target, and the standard reference mark is a horizontal line, a vertical line, or a contour line of the target.
  • the electronic device recognizes multiple targets in the standard shooting result, and sets a standard reference mark at the positions of the multiple targets, and the standard reference mark is a line connecting the center points of the multiple targets.
  • The target object may be a wall seam of the standard test room, a marker placed in the standard test room, one or more two-dimensional code patterns set on a wall of the standard test room, etc.
  • For the camera installed at the left front of the vehicle to be detected, the target can be set as the wall seam at the left front of the standard test room; the standard reference mark is then the vertical line corresponding to that wall seam.
  • For the camera installed at the right front of the vehicle, the target can be set as the wall seam at the right front of the standard test room, with the corresponding vertical line as the standard reference mark.
  • For the camera installed at the rear of the vehicle, the target can be set as one or more two-dimensional codes on the rear wall of the standard test room: when there is one two-dimensional code, the standard reference mark can be its contour line; when there are multiple two-dimensional codes, the standard reference mark can be the line connecting their center points.
  • the target can be set as a landmark that is placed in a preset position in the standard test room and can be recognized by the lidar.
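With multiple targets (e.g. two-dimensional codes), the standard reference mark described above is simply the line through their center points; with a single target it degenerates to a horizontal line, vertical line, or contour. A minimal sketch computing the connecting line from two assumed (x, y) pixel centers (the point format and function name are illustrative, not part of the patent):

```python
def reference_mark_line(centers):
    """Standard reference mark for multiple targets: the line connecting
    their center points, returned as (slope, intercept) in pixel
    coordinates. `centers` is a list of (x, y) points; a vertical line is
    reported as (None, x). Only the two-endpoint case is sketched here."""
    (x1, y1), (x2, y2) = centers[0], centers[-1]
    if x1 == x2:
        return (None, x1)  # vertical line x = x1
    slope = (y2 - y1) / (x2 - x1)
    return (slope, y1 - slope * x1)


# Two QR-code centers identified in the standard shooting result:
print(reference_mark_line([(100, 200), (300, 200)]))  # (0.0, 200.0)
```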
  • S302 Adjust one or more of the horizontal angle, the pitch angle and the rotation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark.
  • Case 1: the sensor to be detected is a camera
  • Illustratively, according to the angular difference between the position of the target in the captured image and the standard reference mark, one or more of the horizontal angle, the pitch angle, and the rotation angle of the camera are adjusted, so as to calibrate the camera installation angle.
  • For example, when the display unit shows that the standard reference mark lies to the right of the target in the captured image, the horizontal angle of the camera is adjusted to the left by a corresponding angle to make the standard reference mark coincide with the target.
  • Case 2: the sensor to be detected is a lidar
  • Illustratively, according to the angular difference between the position of the target in the captured point cloud and the standard reference mark, one or more of the horizontal angle, the pitch angle, and the rotation angle of the lidar are adjusted, so as to calibrate the lidar installation angle.
  • the lidar can be fixed on a rotatable base during installation, and the installation angle of the lidar can be adjusted by adjusting one or more of the horizontal angle, pitch angle, and rotation angle of the base.
  • the adjustment method is similar to that of the camera.
  • When the installation angle of the sensor to be detected has been adjusted to the standard state, that is, when the standard reference mark coincides with the target in the shooting result, the sensor to be detected is fixed.
  • In this embodiment, the installation angle of the sensor to be detected is checked according to the position of the target in the shooting result and the standard reference mark, and the second detection result is output. Further, one or more of the horizontal angle, pitch angle, and rotation angle of the sensor to be detected are adjusted according to the position of the target in the shooting result and the standard reference mark, after which the sensor is fixed.
  • By automatically identifying whether the installation angle of the sensor is correct, the method of this embodiment effectively guarantees the accuracy of the detection result and effectively improves detection efficiency.
  • In addition, adjusting the installation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark has high reliability.
  • The method provided by the embodiments of the present invention effectively guarantees the reliability of the vehicle to be detected by detecting and calibrating the installation position and installation angle of the sensor to be detected. Moreover, because the installation position and angle of the sensors are more accurate, the accuracy of the sensor detection data is improved, and the complexity of the environment perception algorithm can be effectively reduced.
  • FIG. 4 is a schematic structural diagram of Embodiment 1 of the detection device for automatic driving sensors provided by the present invention. As shown in FIG. 4, the device 40 of this embodiment includes: a standard establishing module 41, a display module 42 and a first detection module 43.
  • the standard establishing module 41 is used to establish a standard association relationship in advance, where the standard association relationship is the corresponding relationship between the standard shooting result of the sensor to be detected in a fixed scene and the ID information when it is correctly installed.
  • the display module 42 is used for displaying the corresponding shooting result when the sensor to be detected is used for shooting in a fixed scene.
  • the first detection module 43 is configured to output a first detection result according to the shooting result and the standard association relationship, and the first detection result is used to indicate whether the installation sequence of the sensors to be detected is correct.
  • Optionally, the sensor to be detected includes a camera or a lidar. Accordingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud.
  • the device in this embodiment can be used to implement the technical solution of the method embodiment shown in FIG. 1, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 5 is a schematic structural diagram of Embodiment 2 of the detection device for an automatic driving sensor provided by the present invention. As shown in FIG. 5, on the basis of the embodiment shown in FIG. 4, the apparatus 50 of this embodiment further includes: an adjustment module 44.
  • the adjustment module 44 is used to adjust the installation sequence of the sensors to be detected according to the standard association relationship when the installation sequence of the sensors to be detected is wrong.
  • the device in this embodiment can be used to implement the technical solution of the method embodiment shown in FIG. 2, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 6 is a schematic structural diagram of Embodiment 3 of the detection device for an automatic driving sensor provided by the present invention. As shown in FIG. 6, on the basis of the embodiment shown in FIG. 5, the device 60 of this embodiment further includes: a second detection module 45, a fixing module 46, and a determination module 47.
  • the second detection module 45 is used to detect the installation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark, and output a second detection result, where the second detection result is used to indicate whether the installation angle of the sensor to be detected is accurate.
  • the adjustment module 44 is also used to adjust one or more of the horizontal angle, pitch angle, and rotation angle of the sensor to be detected according to the position of the target in the shooting result and the standard reference mark when the installation angle of the sensor to be detected is not accurate.
  • the fixing module 46 is configured to fix the sensor to be detected after the adjustment module 44 adjusts one or more of the horizontal angle, pitch angle, and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark.
  • the determining module 47 is used to determine the standard reference mark in the following manner:
  • identify one target object in the standard shooting result and set a standard reference mark at its position, the standard reference mark being a horizontal line, a vertical line, or the contour line of the target; or, identify multiple target objects in the standard shooting result and set standard reference marks at their positions, the standard reference mark being the line connecting the center points of the multiple targets.
  • the device in this embodiment can be used to implement the technical solution of the method embodiment shown in FIG. 3, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 7 is a schematic structural diagram of Embodiment 1 of an electronic device provided by the present invention.
  • the device 70 in this embodiment includes: a memory 71, a processor 72, and a display 73.
  • the memory 71 may be an independent physical unit, and may be connected to the processor 72 through a bus 74.
  • the memory 71 and the processor 72 may also be integrated together and implemented by hardware.
  • the memory 71 is used to store a computer program that implements the above method embodiment, and the processor 72 calls the computer program to execute the operation of the above method embodiment.
  • the display 73 is connected to the processor 72 via the bus 74, and is used to display the shooting result of the sensor to be detected in a fixed scene.
  • If the sensor to be detected is a camera, correspondingly, the shooting result is a captured image, and the display 73 displays the captured image; if the sensor to be detected is a lidar, correspondingly, the shooting result is a captured point cloud, and the display 73 displays the captured point cloud.
  • Optionally, when some or all of the methods in the above embodiments are implemented in software, the foregoing electronic device 70 may also include only a processor and a display 73. The memory for storing the program is located outside the electronic device 70, and the processor is connected to the memory through circuits/wires for reading and executing the computer program stored in the memory.
  • the processor 72 may be a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), or a combination of CPU and NP.
  • the processor 72 may further include a hardware chip.
  • the aforementioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (Programmable Logic Device, PLD), or a combination thereof.
  • the above-mentioned PLD may be a complex programmable logic device (Complex Programmable Logic Device, CPLD), a field programmable logic gate array (Field-Programmable Gate Array, FPGA), a general array logic (Generic Array Logic, GAL) or any combination thereof.
  • The memory 71 may include a volatile memory, such as a random-access memory (RAM); the memory may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory may also include a combination of the above types of memory.
  • the present invention also provides a program product, for example, a computer-readable storage medium, including: a computer program, which is used to execute the above method when executed by a processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A detection method, apparatus, electronic device and storage medium for automatic driving sensors, wherein the method includes: establishing a standard association relationship in advance; shooting in a fixed scene with the sensor to be detected and displaying the corresponding shooting result; and then outputting a first detection result according to the shooting result and the above standard association relationship, where the first detection result indicates whether the installation sequence of the sensors to be detected is correct. That is, by automatically recognizing whether the installation sequence of the sensors is correct, the method effectively ensures the accuracy of the detection result and effectively improves detection efficiency. In addition, the first detection result and the standard association relationship provide a reliable basis for adjusting the installation sequence of the sensors to be detected.

Description

Detection method, apparatus and electronic device for automatic driving sensors
This application claims priority to Chinese patent application No. 201910039103.1, filed with the Chinese Patent Office on January 16, 2019 and entitled "Detection method, apparatus, electronic device and storage medium for automatic driving sensors", the entire contents of which are incorporated herein by reference.
Technical field
Embodiments of the present invention relate to the field of intelligent driving technologies, and in particular, to a detection method, apparatus and electronic device for automatic driving sensors.
Background
Environment perception is one of the core capabilities of an automatic driving system, and it mainly relies on a variety of sensors, such as cameras and lidars. The installation position and installation angle of a sensor have a considerable influence on the environment perception algorithm; therefore, ensuring the accuracy of the installation position during sensor installation is particularly important.
In the prior art, the detection of sensor installation positions is mainly performed manually, resulting in low detection efficiency.
Summary
Embodiments of the present invention provide a detection method, apparatus and electronic device for automatic driving sensors, so as to improve detection efficiency.
In a first aspect, an embodiment of the present invention provides a detection method for automatic driving sensors, including:
establishing a standard association relationship in advance, wherein the standard association relationship is the correspondence between the standard shooting result of a sensor to be detected in a fixed scene when correctly installed and its ID information;
shooting in the fixed scene with the sensor to be detected, and displaying the corresponding shooting result;
outputting a first detection result according to the shooting result and the standard association relationship, the first detection result indicating whether the installation sequence of the sensors to be detected is correct.
With reference to the first aspect, in a first implementation of the first aspect, the sensor to be detected includes: a camera or a lidar;
correspondingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud.
With reference to the first aspect, in a second implementation of the first aspect, if the installation sequence of the sensors to be detected is wrong, the method further includes:
adjusting the installation sequence of the sensors to be detected according to the standard association relationship.
With reference to the first aspect, in a third implementation of the first aspect, the method further includes:
detecting the installation angle of the sensor to be detected according to the position of the target object in the shooting result and a standard reference mark, and outputting a second detection result, wherein the second detection result indicates whether the installation angle of the sensor to be detected is accurate.
With reference to the third implementation of the first aspect, in a fourth implementation of the first aspect, if the installation angle of the sensor to be detected is not accurate, the method further includes:
adjusting one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark.
With reference to the third implementation of the first aspect, in a fifth implementation of the first aspect, the standard reference mark is determined in the following manner:
identifying one target object in the standard shooting result, and setting a standard reference mark at the position of the target object, the standard reference mark being a horizontal line, a vertical line or the contour line of the target object;
or,
identifying multiple target objects in the standard shooting result, and setting standard reference marks at the positions of the multiple target objects, the standard reference mark being the line connecting the center points of the multiple target objects.
With reference to the third implementation of the first aspect, in a sixth implementation of the first aspect, after adjusting one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark, the method further includes:
fixing the sensor to be detected.
In a second aspect, the present invention provides a detection apparatus for automatic driving sensors, including:
a standard establishing module, configured to establish a standard association relationship in advance, wherein the standard association relationship is the correspondence between the standard shooting result of a sensor to be detected in a fixed scene when correctly installed and its ID information;
a display module, configured to display the corresponding shooting result when the sensor to be detected shoots in the fixed scene;
a first detection module, configured to output a first detection result according to the shooting result and the standard association relationship, the first detection result indicating whether the installation sequence of the sensors to be detected is correct.
With reference to the second aspect, in a first implementation of the second aspect, the sensor to be detected includes: a camera or a lidar;
correspondingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud.
With reference to the second aspect, in a second implementation of the second aspect, the apparatus further includes an adjustment module;
the adjustment module is configured to adjust the installation sequence of the sensors to be detected according to the standard association relationship when the installation sequence of the sensors to be detected is wrong.
With reference to the second aspect, in a third implementation of the second aspect, the apparatus further includes:
a second detection module, configured to detect the installation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark, and to output a second detection result, wherein the second detection result indicates whether the installation angle of the sensor to be detected is accurate.
With reference to the third implementation of the second aspect, in a fourth implementation of the second aspect, the adjustment module is further configured to adjust one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark when the installation angle of the sensor to be detected is not accurate.
With reference to the third implementation of the second aspect, in a fifth implementation of the second aspect, the apparatus further includes a determining module;
the determining module is configured to determine the standard reference mark in the following manner:
identifying one target object in the standard shooting result, and setting a standard reference mark at the position of the target object, the standard reference mark being a horizontal line, a vertical line or the contour line of the target object;
or,
identifying multiple target objects in the standard shooting result, and setting standard reference marks at the positions of the multiple target objects, the standard reference mark being the line connecting the center points of the multiple target objects.
With reference to the third implementation of the second aspect, in a sixth implementation of the second aspect, the apparatus further includes a fixing module;
the fixing module is configured to fix the sensor to be detected after one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected are adjusted according to the position of the target object in the shooting result and the standard reference mark.
In a third aspect, the present invention provides an electronic device, including: a memory, a processor and a display;
the memory is configured to store a computer program;
the display is configured to display the shooting result captured by the sensor to be detected in the fixed scene;
the processor is configured to execute the computer program to perform the method according to any implementation of the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the method according to any implementation of the first aspect.
Embodiments of the present invention provide a detection method, apparatus and electronic device for automatic driving sensors. A standard association relationship is established in advance; the sensor to be detected shoots in a fixed scene and the corresponding shooting result is displayed; then, a first detection result is output according to the shooting result and the above standard association relationship, where the first detection result indicates whether the installation sequence of the sensors to be detected is correct. That is, by automatically recognizing whether the installation sequence of the sensors is correct, the method of the embodiments of the present invention effectively ensures the accuracy of the detection result and effectively improves detection efficiency. In addition, the first detection result and the standard association relationship provide a reliable basis for adjusting the installation sequence of the sensors to be detected.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic flowchart of Embodiment 1 of the detection method for automatic driving sensors provided by the present invention;
FIG. 2 is a schematic flowchart of Embodiment 2 of the detection method for automatic driving sensors provided by the present invention;
FIG. 3 is a schematic flowchart of Embodiment 3 of the detection method for automatic driving sensors provided by the present invention;
FIG. 4 is a schematic structural diagram of Embodiment 1 of the detection apparatus for automatic driving sensors provided by the present invention;
FIG. 5 is a schematic structural diagram of Embodiment 2 of the detection apparatus for automatic driving sensors provided by the present invention;
FIG. 6 is a schematic structural diagram of Embodiment 3 of the detection apparatus for automatic driving sensors provided by the present invention;
FIG. 7 is a schematic structural diagram of Embodiment 1 of the electronic device provided by the present invention.
Detailed description
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
In the embodiments of the present invention, "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "At least one of the following" or a similar expression refers to any combination of the listed items, including a single item or any combination of multiple items. For example, at least one of a, b or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b and c may each be single or multiple.
At present, environment perception is one of the core capabilities of an automatic driving system, and it mainly relies on a variety of automatic driving sensors, such as cameras and lidars. The installation position and installation angle of a sensor have a considerable influence on the environment perception algorithm; therefore, ensuring the accuracy of the installation position and installation angle during sensor installation is particularly important.
However, during sensor installation, operator error or other causes may leave the installation sequence or installation angles of multiple sensors inaccurate and unable to meet design requirements. Therefore, the installation of the sensors on a vehicle is checked before the vehicle is put into operation or use.
In the prior art, checking whether multiple sensors are installed in the wrong sequence is mainly performed manually, resulting in low detection efficiency.
In view of the problems in the prior art, embodiments of the present invention provide a detection method for automatic driving sensors to improve detection efficiency.
It should be noted that the method of the embodiments of the present invention can be applied to a scenario in which the automatic driving sensors installed on a driverless vehicle are detected and calibrated before the vehicle is put into operation, so as to ensure that the sensors installed on the vehicle can accurately perceive road environment information.
Of course, the method of the embodiments of the present invention can also be applied to a scenario in which the multiple automatic driving sensors installed on a manually driven vehicle are detected and calibrated, so as to ensure that the sensors installed on the vehicle can accurately perceive road environment information.
The technical solutions of the present application are described in detail below with specific embodiments. The following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
FIG. 1 is a schematic flowchart of Embodiment 1 of the detection method for automatic driving sensors provided by the present invention. The detection method provided by the embodiments of the present invention is executed by the detection apparatus for automatic driving sensors or the electronic device provided by the embodiments of the present invention; illustratively, the electronic device may be, but is not limited to, a computer, a smartphone, a tablet, or the like. In this embodiment, the electronic device is taken as the executing entity for detailed description.
As shown in FIG. 1, the method of this embodiment includes:
S101: Establish a standard association relationship in advance, where the standard association relationship is the correspondence between the standard shooting result of the sensor to be detected in a fixed scene when correctly installed and its ID information.
Optionally, the sensor to be detected includes a camera or a lidar; correspondingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud. It can thus be understood that, if the sensor to be detected is a camera, the pre-established standard association relationship is the correspondence between the standard captured image when correctly installed and the camera ID information; if the sensor to be detected is a lidar, it is the correspondence between the standard captured point cloud and the lidar ID information. The identification (ID) information may be an identity number assigned to the sensor by the user. When the sensor to be detected is a lidar, the ID information may also be the Internet Protocol (IP) address information corresponding to the lidar.
Optionally, the standard captured image may be an image captured by a standard camera in the fixed scene, and the standard captured point cloud may be a point cloud captured by a standard lidar in the fixed scene.
In addition, it should be noted that the fixed scene in this embodiment may be a pre-built standard test room, in which the vehicle can be parked at a preset position and the sensors installed on the vehicle can shoot the fixed scene in the corresponding directions. Therefore, the sensors installed on a standard vehicle can shoot the fixed scene in the corresponding directions of the standard test room to obtain the standard shooting results.
Illustratively, one possible way to establish the standard association relationship in advance is: the electronic device displays a corresponding input interface according to the user's instruction; the user enters a standard shooting result and the corresponding sensor ID information on that interface, and the electronic device generates a standard association relationship from them. If the vehicle to be detected carries multiple sensors to be detected, the above steps are repeated to enter the standard association relationships corresponding to all of the sensors into the electronic device.
Illustratively, another possible way is: the standard association relationships are stored in a storage medium in the form of a document; the electronic device is connected to the storage medium, and the user imports the standard association relationships into the electronic device through corresponding control instructions.
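The two ways of establishing the standard association relationship described above (manual entry via an input interface, or import from a document) both amount to building a lookup table from installation position to the expected sensor ID and standard shooting result. A minimal Python sketch; the class name, dict layout, and string values are illustrative assumptions, not part of the patent:

```python
class StandardAssociation:
    """The pre-established standard association relationship as a lookup
    table: installation position -> (sensor ID, standard shooting result)
    when correctly installed. Names and layout are illustrative only."""

    def __init__(self):
        self._table = {}

    def add(self, position, sensor_id, standard_result):
        """Record one entry, e.g. typed on the input interface or
        imported from a document on a storage medium."""
        self._table[position] = (sensor_id, standard_result)

    def expected_id(self, position):
        """ID of the sensor that should be installed at `position`."""
        return self._table[position][0]

    def standard_result(self, position):
        """Standard shooting result associated with `position`."""
        return self._table[position][1]


assoc = StandardAssociation()
assoc.add("front_left", "CAM-01", "standard_front_left.png")
assoc.add("front_right", "CAM-02", "standard_front_right.png")
print(assoc.expected_id("front_left"))  # CAM-01
```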
S102: Shoot in the fixed scene with the sensor to be detected, and display the corresponding shooting result.
In this step, the electronic device displays the shooting result of the sensor to be detected in the fixed scene on its display unit, where the display unit may be a display screen or a display assembly.
Specifically, if the sensor to be detected is a camera, the electronic device displays the image captured by the camera in the fixed scene on the display unit. Illustratively, if the camera is installed at the left front of the vehicle to be detected, the camera shoots the area at the left front of the vehicle in the fixed scene, and the electronic device displays the captured image on the display unit; if the camera is installed at the right front, the camera shoots the area at the right front of the vehicle, and the captured image is displayed likewise.
If the sensor to be detected is a lidar, the electronic device displays the point cloud captured by the lidar in the fixed scene on the display unit. Illustratively, if the lidar is installed at the left front of the vehicle to be detected, the lidar shoots the area at the left front of the vehicle in the fixed scene, and the electronic device displays the captured point cloud on the display unit; if the lidar is installed at the right front, the lidar shoots the area at the right front of the vehicle, and the captured point cloud is displayed likewise.
S103: Output a first detection result according to the shooting result and the standard association relationship.
The first detection result output by the electronic device indicates whether the installation sequence of the sensors to be detected is correct.
Illustratively, the electronic device judges whether the installation sequence of the sensor currently being detected is correct according to the shooting result and the standard shooting result in the standard association relationship.
The two cases, a camera and a lidar as the sensor to be detected, are described in detail below:
1. The sensor to be detected is a camera
In one possible implementation, the electronic device uses image recognition to identify the target object in the captured image and in the standard captured image respectively. If the features of the target object in the two images are consistent, the electronic device outputs the prompt message "installation sequence correct" on the display unit; if they are inconsistent, it outputs the prompt message "installation sequence wrong".
By having the electronic device automatically recognize whether the camera installation sequence is correct and output the detection result, both the accuracy of the detection result and the detection efficiency are ensured.
In another possible implementation, the electronic device displays the captured image in a first preset area of the display unit and the standard captured image in a second preset area, and then uses image recognition to identify the target object in each image. If the features of the target object are consistent, the electronic device outputs the prompt message "installation sequence correct"; if they are inconsistent, it outputs "installation sequence wrong". Since the captured image and the standard captured image are displayed in different areas of the display unit, staff can also confirm by observation whether the target objects in the two images are consistent. The electronic device may therefore display a prompt box containing the message "manually confirm whether they are consistent" together with "Yes" and "No" buttons; staff enter the manually confirmed detection result by clicking "Yes" or "No". If the manually confirmed result agrees with the result automatically determined by the electronic device, the camera installation sequence can be confirmed as correct or wrong accordingly.
By having the electronic device automatically recognize whether the camera installation sequence is correct and output the detection result, both the accuracy of the detection result and the detection efficiency are ensured. In addition, comparing the manually confirmed result with the recognition result of the electronic device further improves the accuracy of the detection result.
2. The sensor to be detected is a lidar
In one possible implementation, the electronic device uses image recognition to identify the target object in the captured point cloud and in the standard captured point cloud respectively. If the features of the target object in the two point clouds are consistent, the electronic device outputs the prompt message "installation sequence correct" on the display unit; if they are inconsistent, it outputs the prompt message "installation sequence wrong".
By having the electronic device automatically recognize whether the lidar installation sequence is correct and output the detection result, both the accuracy of the detection result and the detection efficiency are ensured.
In another possible implementation, the electronic device displays the captured point cloud in a first preset area of the display unit and the standard captured point cloud in a second preset area, and then uses image recognition to identify the target object in each point cloud. If the features of the target object are consistent, the electronic device outputs "installation sequence correct"; if they are inconsistent, it outputs "installation sequence wrong". Since the two point clouds are displayed in different areas of the display unit, staff can also confirm by observation whether the target objects are consistent; the electronic device may display the same "manually confirm whether they are consistent" prompt box with "Yes" and "No" buttons, and if the manually confirmed result agrees with the automatically determined result, the lidar installation sequence can be confirmed as correct or wrong accordingly.
By having the electronic device automatically recognize whether the lidar installation sequence is correct and output the detection result, both the accuracy of the detection result and the detection efficiency are ensured. In addition, comparing the manually confirmed result with the recognition result of the electronic device further improves the accuracy of the detection result.
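Either branch of S103 reduces to the same comparison: extract the target-object features from the live shooting result and from the standard shooting result, and report the installation sequence as correct only if they match. A hedged Python sketch; the feature values are plain strings standing in for what an image-recognition step would actually extract, which is an assumption for illustration:

```python
def first_detection(shot_features, standard_features):
    """First detection result (S103): whether the installation sequence
    of the sensor at this position appears correct. The two arguments
    stand in for the target-object features that an image-recognition
    step would extract from the live and standard shooting results."""
    if shot_features == standard_features:
        return "installation sequence correct"
    return "installation sequence wrong"


# The camera at the left front actually shows the scene expected at the
# right front, so its installation sequence is flagged as wrong:
print(first_detection("front_right_scene", "front_left_scene"))  # installation sequence wrong
```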
It should be noted that the captured image or captured point cloud and the above prompt messages may also be displayed on the display unit in other layouts, and are not limited to the forms described in this embodiment.
In this embodiment, a standard association relationship is established in advance; the sensor to be detected shoots in a fixed scene and the corresponding shooting result is displayed; then, a first detection result is output according to the shooting result and the above standard association relationship, where the first detection result indicates whether the installation sequence of the sensors to be detected is correct. That is, by automatically recognizing whether the installation sequence of the sensors is correct, the method of this embodiment effectively ensures the accuracy of the detection result and effectively improves detection efficiency. In addition, the first detection result and the standard association relationship provide a reliable basis for adjusting the installation sequence of the sensors to be detected.
FIG. 2 is a schematic flowchart of Embodiment 2 of the detection method for automatic driving sensors provided by the present invention. As shown in FIG. 2, the method of this embodiment includes:
S201: Establish a standard association relationship in advance, where the standard association relationship is the correspondence between the standard shooting result of the sensor to be detected in a fixed scene when correctly installed and its ID information.
S202: Shoot in the fixed scene with the sensor to be detected, and display the corresponding shooting result.
S203: Output a first detection result according to the shooting result and the standard association relationship.
Steps S201-S203 in this embodiment are similar to steps S101-S103 in the embodiment shown in FIG. 1, and are not repeated here.
Further, if the first detection result output by the electronic device indicates that the installation sequence of the sensor to be detected is wrong, step S204 may be performed.
S204: Adjust the installation sequence of the sensors to be detected according to the standard association relationship.
The two cases, a camera and a lidar as the sensor to be detected, are described separately:
1. The sensor to be detected is a camera
One possible implementation: according to the standard association relationship, determine the ID information of the camera that should be installed at the current installation position when correctly installed; determine, from that ID information, the parameter information of the corresponding camera; the electronic device then updates the parameter information of the camera at the current position, thereby adjusting the installation sequence of the cameras.
Another possible implementation: the electronic device determines the correct installation position of the camera according to the standard association relationship, and the camera is removed from its current position and installed at the correct installation position.
2. The sensor to be detected is a lidar
One possible implementation: according to the standard association relationship, determine the ID information of the lidar that should be installed at the current installation position when correctly installed; determine, from that ID information, the parameter information of the corresponding lidar; the electronic device then updates the parameter information of the lidar at the current position, thereby adjusting the installation sequence of the lidars.
Another possible implementation: the electronic device determines the correct installation position of the lidar according to the standard association relationship, and the lidar is removed from its current position and installed at the correct installation position.
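The first implementation in both cases above is the same software-only fix: look up the sensor ID that the standard association relationship expects at the current position, and overwrite the parameters of the sensor installed there with that ID's parameter information. A sketch under an assumed dict-based data layout (the patent does not prescribe one; IDs and IP values below are invented):

```python
def adjust_installation_order(position, expected_id_by_position,
                              params_by_id, current_params):
    """Software-only adjustment of the installation sequence (S204):
    overwrite the parameter information at `position` with that of the
    sensor ID the standard association relationship expects there.
    All three tables are assumed dicts:
      expected_id_by_position: position -> expected sensor ID
      params_by_id:            sensor ID -> parameter information
      current_params:          position -> parameter info (mutated)"""
    expected_id = expected_id_by_position[position]
    current_params[position] = params_by_id[expected_id]
    return current_params[position]


expected_id_by_position = {"front_left": "LIDAR-A", "front_right": "LIDAR-B"}
params_by_id = {"LIDAR-A": {"ip": "192.168.0.10"},
                "LIDAR-B": {"ip": "192.168.0.11"}}
current_params = {"front_left": {"ip": "192.168.0.11"}}  # wrong lidar's parameters
adjust_installation_order("front_left", expected_id_by_position,
                          params_by_id, current_params)
print(current_params["front_left"]["ip"])  # 192.168.0.10
```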
In this embodiment, a standard association relationship is established in advance; the sensor to be detected shoots in a fixed scene and the corresponding shooting result is displayed; then, a first detection result is output according to the shooting result and the above standard association relationship, where the first detection result indicates whether the installation sequence of the sensors to be detected is correct. If the first detection result indicates that the installation sequence is wrong, the installation sequence of the sensors to be detected is further adjusted according to the standard association relationship. That is, by automatically recognizing whether the installation sequence of the sensors is correct, the method of this embodiment effectively ensures the accuracy of the detection result and effectively improves detection efficiency. In addition, adjusting the installation sequence of the sensors to be detected according to the first detection result and the standard association relationship has high reliability.
It should be noted that, by repeating the method of the embodiment shown in FIG. 1 or FIG. 2, the installation sequence of the multiple sensors installed on the vehicle to be detected can be checked, and sensors whose installation sequence is wrong can be adjusted according to the detection result.
FIG. 3 is a schematic flowchart of Embodiment 3 of the detection method for automatic driving sensors provided by the present invention. The method of this embodiment may be executed after step S103 of the embodiment shown in FIG. 1, or after step S204 of the embodiment shown in FIG. 2.
As shown in FIG. 3, the method of this embodiment includes:
S301: Detect the installation angle of the sensor to be detected according to the position of the target object in the shooting result and a standard reference mark, and output a second detection result.
The second detection result indicates whether the installation angle of the sensor to be detected is accurate.
In this step, the standard reference mark is a mark corresponding to the position of the target object in the standard shooting result. The standard reference mark may be a physical template applied to the outside of the display unit, or a reference mark set at the pixel area corresponding to the target object in the display unit when the standard captured image is displayed in a preset manner.
Specifically, the electronic device can determine whether the installation angle of the sensor is accurate by judging whether the position of the target object in the shooting result coincides with the standard reference mark.
Illustratively, the standard reference mark is determined in the following manner:
The electronic device identifies one target object in the standard shooting result and sets a standard reference mark at the position of the target object; the standard reference mark is a horizontal line, a vertical line or the contour line of the target object.
Alternatively, the electronic device identifies multiple target objects in the standard shooting result and sets standard reference marks at the positions of the multiple target objects; the standard reference mark is the line connecting the center points of the multiple target objects.
Optionally, the target object may be a wall seam of the standard test room, a marker placed in the standard test room, one or more two-dimensional code patterns set on a wall of the standard test room, or the like.
Preferably, for the camera installed at the left front of the vehicle to be detected, the target object can be set as the wall seam at the left front of the standard test room, and the standard reference mark is then the vertical line corresponding to that wall seam; for the camera installed at the right front of the vehicle, the target object can be set as the wall seam at the right front of the standard test room, with the corresponding vertical line as the standard reference mark; for the camera installed at the rear of the vehicle, the target object can be set as one or more two-dimensional codes on the rear wall of the standard test room, where with one two-dimensional code the standard reference mark can be its contour line, and with multiple two-dimensional codes the standard reference mark can be the line connecting their center points.
Preferably, for a lidar installed on the vehicle to be detected, the target object can be set as a marker placed at a preset position in the standard test room that the lidar can recognize.
S302: Adjust one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark.
The two cases, a camera and a lidar as the sensor to be detected, are described separately:
1. The sensor to be detected is a camera
Illustratively, according to the angular difference between the position of the target object in the captured image and the standard reference mark, one or more of the horizontal angle, pitch angle and rotation angle of the camera are adjusted, thereby calibrating the camera installation angle.
For example, when the display unit of the electronic device shows that the standard reference mark lies to the right of the target object in the captured image, the horizontal angle of the camera is adjusted to the left by a corresponding angle so that the standard reference mark coincides with the target object.
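This adjustment can be read as a simple feedback rule: measure where the standard reference mark sits relative to the target in the image and turn the sensor until they coincide. A minimal sketch for the horizontal angle only; the pixels-to-degrees factor is an invented calibration constant, and pitch and rotation would follow the same pattern:

```python
def horizontal_correction(target_x, mark_x, degrees_per_pixel=0.05):
    """Signed horizontal-angle correction for S302 (negative = turn left).
    target_x / mark_x: horizontal pixel positions of the target object
    and the standard reference mark in the displayed captured image.
    When the mark lies to the right of the target (mark_x > target_x),
    turning the camera left shifts the target rightward in the image
    until the two coincide. degrees_per_pixel is an invented factor."""
    return (target_x - mark_x) * degrees_per_pixel


# Mark 40 px to the right of the target -> turn the camera 2.0 degrees left:
print(horizontal_correction(300, 340))  # -2.0
```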
2. The sensor to be detected is a lidar
Illustratively, according to the angular difference between the position of the target object in the captured point cloud and the standard reference mark, one or more of the horizontal angle, pitch angle and rotation angle of the lidar are adjusted, thereby calibrating the lidar installation angle. Optionally, the lidar can be fixed on a rotatable base during installation, and its installation angle can be adjusted by adjusting one or more of the horizontal angle, pitch angle and rotation angle of the base.
The adjustment is similar to that of the camera.
S303: Fix the sensor to be detected.
When the installation angle of the sensor to be detected has been adjusted to the standard state, that is, when the standard reference mark coincides with the target object in the shooting result, the sensor to be detected is fixed.
In this embodiment, the installation angle of the sensor to be detected is checked according to the position of the target object in the shooting result and the standard reference mark, and a second detection result is output; further, one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected are adjusted according to the position of the target object in the shooting result and the standard reference mark, after which the sensor is fixed. By automatically identifying whether the installation angle of the sensor is correct, the method of this embodiment effectively ensures the accuracy of the detection result and effectively improves detection efficiency. In addition, adjusting the installation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark has high reliability.
The method provided by the embodiments of the present invention effectively guarantees the reliability of the vehicle to be detected by detecting and calibrating the installation position and installation angle of the sensor to be detected. Moreover, because the installation position and angle of the sensors are more accurate, the accuracy of the sensor detection data is improved, and the complexity of the environment perception algorithm can be effectively reduced.
FIG. 4 is a schematic structural diagram of Embodiment 1 of the detection apparatus for automatic driving sensors provided by the present invention. As shown in FIG. 4, the apparatus 40 of this embodiment includes: a standard establishing module 41, a display module 42 and a first detection module 43.
The standard establishing module 41 is configured to establish a standard association relationship in advance, where the standard association relationship is the correspondence between the standard shooting result of the sensor to be detected in a fixed scene when correctly installed and its ID information.
The display module 42 is configured to display the corresponding shooting result when the sensor to be detected shoots in the fixed scene.
The first detection module 43 is configured to output a first detection result according to the shooting result and the standard association relationship, where the first detection result indicates whether the installation sequence of the sensors to be detected is correct.
Optionally, the sensor to be detected includes a camera or a lidar; correspondingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud.
The apparatus of this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 1; its implementation principles and technical effects are similar and are not repeated here.
FIG. 5 is a schematic structural diagram of Embodiment 2 of the detection apparatus for automatic driving sensors provided by the present invention. As shown in FIG. 5, on the basis of the embodiment shown in FIG. 4, the apparatus 50 of this embodiment further includes: an adjustment module 44.
The adjustment module 44 is configured to adjust the installation sequence of the sensors to be detected according to the standard association relationship when the installation sequence of the sensors to be detected is wrong.
The apparatus of this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 2; its implementation principles and technical effects are similar and are not repeated here.
FIG. 6 is a schematic structural diagram of Embodiment 3 of the detection apparatus for automatic driving sensors provided by the present invention. As shown in FIG. 6, on the basis of the embodiment shown in FIG. 5, the apparatus 60 of this embodiment further includes: a second detection module 45, a fixing module 46 and a determining module 47.
The second detection module 45 is configured to detect the installation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark, and to output a second detection result, where the second detection result indicates whether the installation angle of the sensor to be detected is accurate.
Correspondingly, the adjustment module 44 is further configured to adjust one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark when the installation angle of the sensor to be detected is not accurate.
Further, the fixing module 46 is configured to fix the sensor to be detected after the adjustment module 44 adjusts one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark.
The determining module 47 is configured to determine the standard reference mark in the following manner:
identify one target object in the standard shooting result and set a standard reference mark at its position, the standard reference mark being a horizontal line, a vertical line or the contour line of the target object; or, identify multiple target objects in the standard shooting result and set standard reference marks at their positions, the standard reference mark being the line connecting the center points of the multiple target objects.
The apparatus of this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 3; its implementation principles and technical effects are similar and are not repeated here.
FIG. 7 is a schematic structural diagram of Embodiment 1 of the electronic device provided by the present invention. As shown in FIG. 7, the device 70 of this embodiment includes: a memory 71, a processor 72 and a display 73.
The memory 71 may be an independent physical unit connected to the processor 72 through a bus 74; the memory 71 and the processor 72 may also be integrated together and implemented in hardware, and so on.
The memory 71 is used to store a computer program implementing the above method embodiments, and the processor 72 calls the computer program to perform the operations of the above method embodiments.
Further, the display 73 is connected to the processor 72 through the bus 74 and is used to display the shooting result of the sensor to be detected in the fixed scene.
If the sensor to be detected is a camera, correspondingly, the shooting result is a captured image and the display 73 displays the captured image; if the sensor to be detected is a lidar, correspondingly, the shooting result is a captured point cloud and the display 73 displays the captured point cloud.
Optionally, when some or all of the methods in the above embodiments are implemented in software, the electronic device 70 may also include only a processor and a display 73. The memory for storing the program is located outside the electronic device 70, and the processor is connected to the memory through circuits/wires for reading and executing the computer program stored in the memory.
The processor 72 may be a central processing unit (CPU), a network processor (NP), or a combination of a CPU and an NP.
The processor 72 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
The memory 71 may include a volatile memory, such as a random-access memory (RAM); the memory may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory may also include a combination of the above types of memory.
In addition, the present invention further provides a program product, for example, a computer-readable storage medium, including a computer program which, when executed by a processor, is used to perform the above method.
A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be completed by hardware related to program instructions. The aforementioned program can be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments, and the aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk or an optical disc.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some or all of the technical features therein; these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

  1. A detection method for automatic driving sensors, comprising:
    establishing a standard association relationship in advance, wherein the standard association relationship is the correspondence between the standard shooting result of a sensor to be detected in a fixed scene when correctly installed and its ID information;
    shooting in the fixed scene with the sensor to be detected, and displaying the corresponding shooting result;
    outputting a first detection result according to the shooting result and the standard association relationship, the first detection result indicating whether the installation sequence of the sensors to be detected is correct.
  2. The method according to claim 1, wherein the sensor to be detected comprises: a camera or a lidar;
    correspondingly, the shooting result of the camera is a captured image, and the shooting result of the lidar is a captured point cloud.
  3. The method according to claim 1, wherein, if the installation sequence of the sensors to be detected is wrong, the method further comprises:
    adjusting the installation sequence of the sensors to be detected according to the standard association relationship.
  4. The method according to claim 1, further comprising:
    detecting the installation angle of the sensor to be detected according to the position of the target object in the shooting result and a standard reference mark, and outputting a second detection result, wherein the second detection result indicates whether the installation angle of the sensor to be detected is accurate.
  5. The method according to claim 4, wherein, if the installation angle of the sensor to be detected is not accurate, the method further comprises:
    adjusting one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark.
  6. The method according to claim 4, wherein the standard reference mark is determined in the following manner:
    identifying one target object in the standard shooting result, and setting a standard reference mark at the position of the target object, the standard reference mark being a horizontal line, a vertical line or the contour line of the target object;
    or,
    identifying multiple target objects in the standard shooting result, and setting standard reference marks at the positions of the multiple target objects, the standard reference mark being the line connecting the center points of the multiple target objects.
  7. The method according to claim 4, wherein, after adjusting one or more of the horizontal angle, pitch angle and rotation angle of the sensor to be detected according to the position of the target object in the shooting result and the standard reference mark, the method further comprises:
    fixing the sensor to be detected.
  8. A detection apparatus for automatic driving sensors, comprising:
    a standard establishing module, configured to establish a standard association relationship in advance, wherein the standard association relationship is the correspondence between the standard shooting result of a sensor to be detected in a fixed scene when correctly installed and its ID information;
    a display module, configured to display the corresponding shooting result when the sensor to be detected shoots in the fixed scene;
    a first detection module, configured to output a first detection result according to the shooting result and the standard association relationship, the first detection result indicating whether the installation sequence of the sensors to be detected is correct.
  9. An electronic device, comprising: a memory, a processor and a display;
    the memory being configured to store a computer program;
    the display being configured to display the shooting result captured by a sensor to be detected in a fixed scene;
    the processor being configured to execute the computer program to perform the method according to any one of claims 1-7.
  10. A computer-readable storage medium, storing a computer program which, when executed by a processor, performs the method according to any one of claims 1-7.
PCT/CN2019/126304 2019-01-16 2019-12-18 自动驾驶传感器的检测方法、装置和电子设备 WO2020147498A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/125,207 US11933604B2 (en) 2019-01-16 2020-12-17 Detection method and apparatus for automatic driving sensor, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910039103.1 2019-01-16
CN201910039103.1A CN109855568B (zh) 2019-01-16 2019-01-16 自动驾驶传感器的检测方法、装置、电子设备及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/125,207 Continuation US11933604B2 (en) 2019-01-16 2020-12-17 Detection method and apparatus for automatic driving sensor, and electronic device

Publications (1)

Publication Number Publication Date
WO2020147498A1 true WO2020147498A1 (zh) 2020-07-23

Family

ID=66894827

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/126304 WO2020147498A1 (zh) 2019-01-16 2019-12-18 自动驾驶传感器的检测方法、装置和电子设备

Country Status (3)

Country Link
US (1) US11933604B2 (zh)
CN (2) CN113418543B (zh)
WO (1) WO2020147498A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113418543B (zh) * 2019-01-16 2023-06-20 北京百度网讯科技有限公司 自动驾驶传感器的检测方法、装置、电子设备及存储介质
CN112307825B (zh) * 2019-07-30 2024-05-31 阿波罗智能技术(北京)有限公司 自动驾驶的视觉检测方法、装置、电子设备和存储介质
CN112368592A (zh) * 2019-11-05 2021-02-12 深圳市大疆创新科技有限公司 雷达安装状态的检测方法、可移动平台、设备和存储介质
CN111147844A (zh) * 2019-12-31 2020-05-12 麦格纳电子(张家港)有限公司 一种组装摄像头的光轴检验校准方法
CN113706964A (zh) * 2021-07-30 2021-11-26 山东星科智能科技股份有限公司 一种智能驾驶教学实训系统及自动驾驶车辆控制方法
CN113821873B (zh) * 2021-08-31 2023-08-04 重庆长安汽车股份有限公司 用于自动驾驶的目标关联的校验方法及存储介质
CN114755642B (zh) * 2022-03-23 2023-03-10 广州文远知行科技有限公司 一种雷达线路检测方法、装置、交通工具及存储介质
CN114781567B (zh) * 2022-06-21 2022-09-16 深圳市信润富联数字科技有限公司 检测设备安装方法、装置、电子设备及可读存储介质
CN116481830A (zh) * 2023-04-07 2023-07-25 襄阳达安汽车检测中心有限公司 一种车载自动驾驶感知设备结冰模拟测试系统

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101023322A (zh) * 2004-05-26 2007-08-22 沃思测量技术股份有限公司 Coordinate measuring machine and method for measuring an object
CN102501252A (zh) * 2011-09-28 2012-06-20 三一重工股份有限公司 Method and control system for controlling the motion of an actuating arm tip
CN103945123A (zh) * 2014-04-03 2014-07-23 北京大恒图像视觉有限公司 Method for adjusting the horizontal angle of an industrial camera
CN106303206A (zh) * 2015-06-12 2017-01-04 西安蒜泥电子科技有限责任公司 Camera system positioning method and apparatus for a human body scanner
JP6280674B1 (ja) * 2016-09-30 2018-02-14 株式会社オプティム Imaging angle adjustment system, imaging angle adjustment method and program
CN107966495A (zh) * 2017-12-29 2018-04-27 潮峰钢构集团有限公司 Big-data-based comprehensive self-inspection analysis system and analysis method for steel structure welding spots
WO2018100717A1 (ja) * 2016-12-01 2018-06-07 株式会社Fuji Production management system for a component mounting line
CN108229410A (zh) * 2018-01-12 2018-06-29 深圳市商汤科技有限公司 Image processing method and device
CN108406297A (zh) * 2018-03-30 2018-08-17 深圳市鹰眼在线电子科技有限公司 Positioning system and positioning method for multi-camera assembly
CN109855568A (zh) * 2019-01-16 2019-06-07 北京百度网讯科技有限公司 Detection method and apparatus for automatic driving sensor, electronic device and storage medium

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
JP2007064684A (ja) * 2005-08-29 2007-03-15 Canon Inc Marker placement assisting method and apparatus
US8918302B2 (en) * 2008-09-19 2014-12-23 Caterpillar Inc. Machine sensor calibration system
JP5573266B2 (ja) * 2010-03-19 2014-08-20 アイシン・エィ・ダブリュ株式会社 Object image recognition apparatus for vehicles, object image recognition method for vehicles, and computer program
CN102759953A (zh) * 2011-04-28 2012-10-31 周丽明 Automatic camera
DE102013222291A1 (de) * 2013-11-04 2015-05-07 Conti Temic Microelectronic Gmbh Method and device for estimating the installation angles of an imaging sensor mounted in a vehicle
CN103863205A (zh) 2014-04-03 2014-06-18 深圳市德赛微电子技术有限公司 Method for assisting installation of vehicle-mounted panoramic system cameras and assisting system thereof
US9933515B2 (en) * 2014-12-09 2018-04-03 Toyota Motor Engineering & Manufacturing North America, Inc. Sensor calibration for autonomous vehicles
CN206594292U (zh) * 2016-12-05 2017-10-27 武汉万集信息技术有限公司 Automatic detection system for LiDAR ranging accuracy
CN107247268B (zh) 2017-05-16 2020-01-21 深圳市速腾聚创科技有限公司 Multi-line LiDAR system and method for correcting its horizontal installation angle
US10401484B2 (en) * 2017-06-30 2019-09-03 Aptiv Technologies Limited LiDAR sensor alignment system
US10176596B1 (en) * 2017-07-06 2019-01-08 GM Global Technology Operations LLC Calibration verification methods for autonomous vehicle operations
CN107479061A (zh) * 2017-08-25 2017-12-15 重庆交通大学 Safety early-warning method and system based on image recognition and radar ranging
US11415683B2 (en) * 2017-12-28 2022-08-16 Lyft, Inc. Mobile sensor calibration
CN108280828B (zh) * 2018-01-25 2020-11-10 上海闻泰电子科技有限公司 Camera assembly position detection method and apparatus
CN207937339U (zh) * 2018-03-05 2018-10-02 江苏东森智能科技有限公司 Integrated fitting mechanism for camera and light source
CN110264520B (zh) * 2019-06-14 2021-06-08 北京百度网讯科技有限公司 Method, apparatus, device and medium for calibrating pose relationship between vehicle-mounted sensor and vehicle

Also Published As

Publication number Publication date
US20210103741A1 (en) 2021-04-08
CN109855568B (zh) 2021-07-09
CN109855568A (zh) 2019-06-07
CN113418543B (zh) 2023-06-20
US11933604B2 (en) 2024-03-19
CN113418543A (zh) 2021-09-21

Similar Documents

Publication Publication Date Title
WO2020147498A1 (zh) Detection method and apparatus for automatic driving sensor, and electronic device
US10504244B2 (en) Systems and methods to improve camera intrinsic parameter calibration
EP3248374B1 (en) Method and apparatus for multiple technology depth map acquisition and fusion
CN112272292B (zh) Projection correction method, apparatus and storage medium
CN108280828B (zh) Camera assembly position detection method and apparatus
WO2019037088A1 (zh) Exposure control method and apparatus, and unmanned aerial vehicle
CN108111828B (zh) Projection device correction method and apparatus, and projection device
CN111008985B (zh) Panorama stitching seam detection method and apparatus, readable storage medium and electronic device
US10692262B2 (en) Apparatus and method for processing information of multiple cameras
US20170118451A1 (en) Information processing apparatus, image projection system, and computer program product
JP2018207373A (ja) Calibration device and calibration method for projection display device, program, projection display device, and projection display system
US9654749B2 (en) Projection methods and projection devices
CN111798374A (zh) Image stitching method, apparatus, device and medium
CN113542709B (zh) Projection image brightness adjustment method, apparatus, storage medium and projection device
US11166005B2 (en) Three-dimensional information acquisition system using pitching practice, and method for calculating camera parameters
US9847011B2 (en) Warning system for sub-optimal sensor settings
CN113114975B (zh) Image stitching method and apparatus, electronic device and storage medium
WO2021022989A1 (zh) Method and apparatus for acquiring calibration parameters, processor and electronic device
CN111145674B (zh) Display panel detection method, electronic device and storage medium
CN114339179A (zh) Projection correction method and apparatus, storage medium and projection device
CN113790874A (zh) Lens testing system
GB2555643A (en) Determining an intersection location of an optical axis of a lens with a camera sensor
CN112907540A (zh) Stitching anomaly detection method, apparatus, device and medium
KR102430726B1 (ko) Apparatus and method for processing information of multiple cameras
WO2019165611A1 (zh) Image water ripple detection method and apparatus, unmanned aerial vehicle and storage device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19910835

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19910835

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/02/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 19910835

Country of ref document: EP

Kind code of ref document: A1