CN115439558A - Combined calibration method and device, electronic equipment and computer readable storage medium - Google Patents

Combined calibration method and device, electronic equipment and computer readable storage medium

Info

Publication number
CN115439558A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211144675.4A
Other languages
Chinese (zh)
Inventor
张泫舜
陈熙
Current Assignee
Ecoflow Technology Ltd
Original Assignee
Ecoflow Technology Ltd
Priority date
Filing date
Publication date
Application filed by Ecoflow Technology Ltd filed Critical Ecoflow Technology Ltd
Priority claimed from CN202211144675.4A
Publication of CN115439558A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds


Abstract

The application provides a joint calibration method and apparatus, an electronic device, and a computer-readable storage medium. In the method, a checkerboard image of a calibration board is captured by a shooting device, and a skeleton point cloud image corresponding to the checkerboard image is acquired by a laser radar. The calibration board carries checkerboards arranged in an array; each checkerboard includes a central area and a boundary area surrounding the periphery of the central area, and the two areas have different reflectivities. A plurality of initial corner points, together with a spatial coordinate value for each initial corner point, are determined from the point cloud brightness of each item of point cloud data in the skeleton point cloud image, the point cloud brightness being positively correlated with reflectivity. A pixel coordinate value corresponding to each initial corner point is then obtained from the checkerboard image, and a transformation matrix between the camera coordinate system of the shooting device and the laser radar coordinate system of the laser radar is calculated from the spatial coordinate value and the corresponding pixel coordinate value of each initial corner point, which improves the calibration effect.

Description

Combined calibration method and device, electronic equipment and computer readable storage medium
Technical Field
The present invention relates to the field of sensor calibration technologies, and in particular, to a joint calibration method and apparatus, an electronic device, and a computer-readable storage medium.
Background
When a camera and a laser radar are mounted on an electronic device, they must be jointly calibrated so that they can work cooperatively; that is, the transformation matrix between the camera coordinate system and the laser radar coordinate system must be determined.
A common joint calibration method uses a checkerboard calibration board. Because the checkerboard calibration board is planar and has no prominent shape features, after the laser radar collects its point cloud data it is difficult to find feature points in that data for calibration. The user therefore has to select feature points manually in the point cloud data, which is cumbersome, introduces large point-selection errors, and yields a transformation matrix of low accuracy.
Disclosure of Invention
In view of the foregoing, it is necessary to provide a joint calibration method and apparatus, an electronic device, and a computer-readable storage medium that address the problems that camera extrinsic parameters cannot be obtained by joint calibration of a laser radar and an image, and that the resulting camera calibration is not accurate enough.
In one aspect, the application provides a joint calibration method, which includes the following steps: obtaining a checkerboard image of a calibration board through a shooting device, and obtaining a skeleton point cloud image corresponding to the checkerboard image through a laser radar, where checkerboards are arranged in an array on the calibration board, each checkerboard includes a central area and a boundary area arranged around the periphery of the central area, and the central area and the boundary area have different reflectivities; determining a plurality of initial corner points and a spatial coordinate value of each initial corner point according to the point cloud brightness of each item of point cloud data in the skeleton point cloud image, the point cloud brightness being positively correlated with reflectivity; obtaining, from the checkerboard image, a pixel coordinate value corresponding to each initial corner point; and calculating a transformation matrix between a camera coordinate system corresponding to the shooting device and a laser radar coordinate system corresponding to the laser radar according to the spatial coordinate value of each initial corner point and the pixel coordinate value corresponding to that corner point.
According to an optional embodiment of the present application, determining a plurality of initial corner points according to the point cloud brightness of each item of point cloud data in the skeleton point cloud image includes: determining a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness, and determining the plurality of initial corner points from the intersection points of the transverse lines and the longitudinal lines.
In this embodiment, because the intersections of the clearly displayed transverse and longitudinal lines are directly taken as the initial corner points, the corner positions can be located quickly.
According to an optional embodiment of the application, when the reflectivity of the boundary area of the checkerboard is greater than that of the central area, determining the plurality of transverse lines and the plurality of longitudinal lines according to the point cloud brightness of each item of point cloud data in the skeleton point cloud image includes: determining the point clouds whose point cloud brightness is greater than a first threshold in the skeleton point cloud image as a first target point cloud, and determining the plurality of transverse lines and the plurality of longitudinal lines according to the first target point cloud.
In this embodiment, because the reflectivity of the boundary area of the checkerboard is greater than that of the central area, the point cloud brightness of the boundary area of each checkerboard in the skeleton point cloud image is brighter than that of the central area; therefore, by comparing the point cloud brightness with the first threshold, the transverse and longitudinal lines can be determined quickly.
According to another optional embodiment, when the reflectivity of the boundary area of the checkerboard is smaller than that of the central area, determining the plurality of transverse lines and the plurality of longitudinal lines includes: determining the point clouds whose point cloud brightness is smaller than a second threshold in the skeleton point cloud image as a second target point cloud, and determining the plurality of transverse lines and the plurality of longitudinal lines according to the second target point cloud.
In this embodiment, because the reflectivity of the boundary area is smaller than that of the central area, the point cloud brightness of the boundary area of each checkerboard is darker than that of the central area; therefore, by comparing the point cloud brightness with the second threshold, the transverse and longitudinal lines can be determined quickly.
According to an optional embodiment of the present application, calculating the transformation matrix between the camera coordinate system corresponding to the shooting device and the laser radar coordinate system corresponding to the laser radar according to the spatial coordinate value of each initial corner point and the corresponding pixel coordinate value includes: acquiring an internal reference matrix of the shooting device; determining a first pixel coordinate value for each initial corner point according to its spatial coordinate value and the internal reference matrix; determining a second pixel coordinate value for each initial corner point according to its pixel coordinate value and first pixel coordinate value; and calculating the transformation matrix according to the second pixel coordinate values, the spatial coordinate values, and the internal reference matrix.
In this embodiment, the transformation matrix is calculated from the spatial coordinate values of the plurality of initial corner points and their corresponding pixel coordinate values. Because the intersections of the clearly displayed transverse and longitudinal lines are directly determined as the initial corner points, there is no need to select the spatial and pixel coordinate values manually, which greatly reduces the point-selection errors of manual calibration.
In another aspect, the application further provides a joint calibration device, which includes an acquisition unit, a determination unit, and a calculation unit. The acquisition unit is configured to acquire a checkerboard image of a calibration board through a shooting device and to acquire a skeleton point cloud image corresponding to the checkerboard image through a laser radar, the calibration board carrying checkerboards arranged in an array, each checkerboard including a central area and a boundary area arranged around the periphery of the central area, the central area and the boundary area having different reflectivities. The determination unit is configured to determine a plurality of initial corner points and a spatial coordinate value of each initial corner point according to the point cloud brightness of each item of point cloud data in the skeleton point cloud image, the point cloud brightness being positively correlated with reflectivity. The acquisition unit is further configured to acquire a pixel coordinate value corresponding to each initial corner point from the checkerboard image. The calculation unit is configured to calculate a transformation matrix between a camera coordinate system corresponding to the shooting device and a laser radar coordinate system corresponding to the laser radar according to the spatial coordinate value of each initial corner point and the corresponding pixel coordinate value.
On the other hand, the present application also proposes an electronic device, which includes:
a memory storing computer readable instructions; and
a processor that executes computer readable instructions stored in the memory to implement the joint calibration method.
In another aspect, the present application further provides a computer-readable storage medium, in which computer-readable instructions are stored, and the computer-readable instructions are executed by a processor in an electronic device to implement the joint calibration method.
According to the technical solution above, a checkerboard image of the calibration board can be obtained through the shooting device, and a skeleton point cloud image corresponding to the checkerboard image can be obtained through the laser radar. The calibration board carries checkerboards arranged in an array, and the central area and the boundary area in each checkerboard have different reflectivities. Because point cloud brightness is positively correlated with reflectivity, in the skeleton point cloud image the point cloud brightness of the point cloud data corresponding to the boundary area of a checkerboard differs from that corresponding to its central area. Therefore, a plurality of initial corner points can be determined according to the point cloud brightness of each item of point cloud data in the skeleton point cloud image, and the spatial coordinate value of each initial corner point can be accurately acquired from its corresponding point cloud data. In addition, the pixel coordinate value corresponding to each initial corner point can be obtained from the checkerboard image. A transformation matrix between the camera coordinate system corresponding to the shooting device and the laser radar coordinate system corresponding to the laser radar is then calculated according to the spatial coordinate value and the corresponding pixel coordinate value of each initial corner point.
In summary, in the method provided by the embodiment of the present application, the electronic device may automatically select the initial corner point according to the point cloud brightness of the cloud data of each point in the skeleton point cloud image, and a user does not need to manually select a feature point, so that the operation is simple, the point selection error can be reduced, and the accuracy of the transformation matrix is improved.
Drawings
Fig. 1 is an application environment diagram of a joint calibration method provided in an embodiment of the present application.
Fig. 2 is a flowchart of a joint calibration method provided in an embodiment of the present application.
Fig. 3 is a schematic diagram of a calibration plate provided by an embodiment of the present application.
Fig. 4 is a schematic diagram of a skeleton point cloud provided by an embodiment of the present application.
Fig. 5 is a functional block diagram of a joint calibration method provided in an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device of a joint calibration method provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a diagram of an application environment of the joint calibration method according to the embodiment of the present application. The joint calibration method can be applied to one or more electronic devices 1; the electronic device 1 communicates with the shooting device 2 and the laser radar 3, and the shooting device 2 may be a monocular camera or another device with a shooting function.
The electronic device 1 is a device capable of automatically performing parameter value calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to: a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The electronic device 1 may be any electronic product capable of human-computer interaction with a user, for example, a personal computer, a tablet computer, a smartphone, a Personal Digital Assistant (PDA), a game machine, an Internet Protocol Television (IPTV) device, an intelligent wearable device, and the like.
The electronic device 1 may also comprise a network device and/or a user device. The network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a cloud-computing-based cloud consisting of a large number of hosts or network servers.
The network in which the electronic device 1 is located includes, but is not limited to: the Internet, a wide area network, a metropolitan area network, a local area network, a Virtual Private Network (VPN), and the like.
The electronic device 1 may also comprise a self-moving device, that is, a device with a self-moving assistance function. This function can be implemented by a vehicle-mounted terminal, in which case the corresponding self-moving device is a vehicle equipped with that terminal. The self-moving device may also be a semi-autonomous or fully autonomous mobile device, such as a lawn mower, a floor sweeper, or a robot with a navigation function.
Fig. 2 is a flowchart of a joint calibration method provided in an embodiment of the present application. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
S101, obtaining a checkerboard image of the calibration plate through shooting equipment, and obtaining a skeleton point cloud picture corresponding to the checkerboard image through a laser radar.
In the embodiment of the present application, the photographing apparatus and the laser radar may be installed at a designated position on the self-moving apparatus. Each checkerboard includes a central region and a border region disposed around the periphery of the central region, the central region and the border region having different reflectivities.
In at least one embodiment of the present application, the calibration board is a checkerboard calibration board carrying checkerboards arranged in an array, and the boundary area and the central area of each checkerboard are made of materials with different reflectivities. For example, the material of the boundary area of each checkerboard may be an aluminum-plated reflective mylar, while the material of the central area differs from that of the boundary area, i.e., the two materials have different reflectivities; for example, the central area may be made of glass, ceramic, plastic, or the like, which is not limited herein. Alternatively, the central area and the boundary area may be made of the same material but with different thicknesses, so that the different thicknesses yield different reflectivities. Fig. 3 is a schematic diagram of a calibration board provided in an embodiment of the present application; the calibration board in fig. 3 carries checkerboards arranged in an array, with the color of the checkerboards alternating between black and white. The calibration board shown in fig. 3 is merely an example.
In at least one embodiment of the present application, the electronic device obtaining the checkerboard image of the calibration board by the shooting device includes:
and the electronic equipment controls the shooting equipment to shoot the calibration board to obtain a checkerboard image.
The shooting device may be a monocular camera or other devices that can perform shooting, and the present application is not limited herein.
In at least one embodiment of the present application, the obtaining, by an electronic device, a skeleton point cloud image corresponding to a checkerboard image through a laser radar includes:
and the electronic equipment controls the laser radar to scan the calibration plate to obtain a skeleton point cloud picture.
In this embodiment, since the boundary area of each checkerboard and the central area of each checkerboard are made of materials with different reflectivities, the point cloud brightness of the boundary area of each checkerboard and the central area of each checkerboard in the skeleton point cloud image is different, so that the skeleton point cloud image can obviously display a plurality of transverse lines and a plurality of longitudinal lines. The skeleton point cloud picture is a point cloud image comprising horizontal lines and vertical lines.
S102, determining a plurality of initial corner points and a space coordinate value of each initial corner point according to the point cloud brightness of each point cloud data in the skeleton point cloud image.
The point cloud brightness and the reflectivity are in a positive correlation relationship, namely the larger the reflectivity is, the stronger the corresponding point cloud brightness is.
Fig. 4 is a schematic diagram of a skeleton point cloud provided by an embodiment of the present application; fig. 4 includes a plurality of intersecting horizontal and vertical lines. The skeleton point cloud shown in fig. 4 is only an example.
In at least one embodiment of the present application, the electronic device determines a plurality of initial corner points according to the point cloud brightness of each point cloud data in the skeleton point cloud image, including:
the electronic equipment determines a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness of each point cloud data in the skeleton point cloud picture, and further determines a plurality of initial angular points according to the intersection points of each transverse line and each longitudinal line.
In this embodiment, the electronic device determines the intersections of the horizontal lines and the vertical lines as the plurality of initial corner points.
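As a sketch of this intersection step (the function names and the homogeneous line representation are illustrative assumptions, not taken from the patent), each extracted transverse or longitudinal line can be written in homogeneous form (a, b, c), meaning a*x + b*y + c = 0 on the board plane, and pairwise intersections computed with a cross product:

```python
import numpy as np

def intersect_lines(l1, l2):
    """Intersection of two 2D lines given in homogeneous form (a, b, c),
    i.e. a*x + b*y + c = 0, via the cross product of the line vectors."""
    p = np.cross(l1, l2)
    if abs(p[2]) < 1e-12:          # parallel lines: no finite intersection
        return None
    return p[:2] / p[2]

def initial_corners(transverse, longitudinal):
    """All pairwise intersections of transverse and longitudinal lines,
    taken as the initial corner points."""
    corners = []
    for lt in transverse:
        for ll in longitudinal:
            p = intersect_lines(np.asarray(lt, float), np.asarray(ll, float))
            if p is not None:
                corners.append(p)
    return np.array(corners)
```

For example, the transverse lines y = 0 and y = 1 are (0, 1, 0) and (0, 1, -1), and the longitudinal line x = 2 is (1, 0, -2); their intersections are (2, 0) and (2, 1).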
In this embodiment, since the boundary area of each checkerboard and the central area of each checkerboard are made of materials with different reflectivities, the magnitude relationship between the reflectivity of the boundary area of the checkerboard and the reflectivity of the central area of the checkerboard includes the following conditions: the reflectivity of the boundary area of the checkerboard is larger than that of the central area of the checkerboard, or the reflectivity of the boundary area of the checkerboard is smaller than that of the central area of the checkerboard.
In this embodiment, if the reflectivity of the boundary area of the checkerboard is greater than the reflectivity of the central area of the checkerboard, the electronic device determines a plurality of horizontal lines and a plurality of vertical lines according to the point cloud brightness of the cloud data of each point in the skeleton point cloud image, including:
and determining the point cloud corresponding to the point cloud brightness which is greater than a first threshold value in the skeleton point cloud image as a first target point cloud, and determining a plurality of transverse lines and a plurality of longitudinal lines according to the first target point cloud.
In this embodiment, if the reflectivity of the boundary area of the checkerboard is smaller than the reflectivity of the central area of the checkerboard, the electronic device determines a plurality of horizontal lines and a plurality of vertical lines according to the point cloud brightness of each point cloud data in the skeleton point cloud image, including:
and the electronic equipment determines the point cloud corresponding to the point cloud brightness smaller than the second threshold value in the skeleton point cloud image as a second target point cloud, and determines a plurality of transverse lines and a plurality of longitudinal lines according to the second target point cloud.
The first threshold and the second threshold may be set as required; the application is not limited herein. The point cloud brightness refers to the brightness fed back by materials with different reflectivities and can represent the reflection intensity of the point cloud.
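A minimal sketch of this threshold-based selection, assuming the point cloud is held as NumPy arrays of coordinates and intensities (the function name and array layout are illustrative):

```python
import numpy as np

def skeleton_points(points, intensity, border_brighter, threshold):
    """Select the target point cloud belonging to the checkerboard border regions.
    points: (N, 3) spatial coordinates; intensity: (N,) point cloud brightness.
    border_brighter=True corresponds to the first-threshold case (border
    reflectivity greater than the center's); False to the second-threshold case."""
    intensity = np.asarray(intensity, float)
    mask = intensity > threshold if border_brighter else intensity < threshold
    return np.asarray(points)[mask]
```

The returned points are the first or second target point cloud, from which the transverse and longitudinal lines are fitted.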
In this embodiment, the plurality of transverse lines and the plurality of longitudinal lines are formed by point clouds in the skeleton point cloud image, and the point clouds are a plurality of spatial points on a spatial coordinate system, so that the point cloud data corresponding to each point cloud includes a spatial coordinate value of the point cloud, and the electronic device can acquire the spatial coordinate value of the initial angular point from the point cloud data corresponding to each initial angular point.
S103, acquiring a pixel coordinate value corresponding to each initial corner point from the checkerboard image.
In at least one embodiment of the present application, the electronic device constructs a pixel coordinate system uOv by taking the pixel O in the first row and first column of the checkerboard image as the origin, the horizontal line through O as the u-axis, and the vertical line through O as the v-axis.
In at least one embodiment of the present application, the electronic device obtaining the pixel coordinate value corresponding to each initial corner point from the checkerboard image includes:
and acquiring the corresponding position of each initial corner point in a pixel coordinate system as a pixel coordinate value.
And S104, calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the space coordinate value of each initial angular point and the pixel coordinate value corresponding to the initial angular point.
In at least one embodiment of the present application, the transformation matrix refers to a transformation relationship between a camera coordinate system and a lidar coordinate system, and points on the radar coordinate system are transformed into the camera coordinate system through the transformation matrix, or the points on the camera coordinate system are transformed into the radar coordinate system through the transformation matrix.
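The two directions of transformation described above can be sketched as follows, assuming the rotation matrix R and translation vector t of the transformation are already known (the function names are illustrative):

```python
import numpy as np

def lidar_to_camera(p_lidar, R, t):
    """Map a point from the laser radar coordinate system into the camera
    coordinate system: p_camera = R @ p_lidar + t."""
    return R @ np.asarray(p_lidar, float) + t

def camera_to_lidar(p_camera, R, t):
    """Inverse mapping; R is orthonormal, so its inverse is its transpose."""
    return R.T @ (np.asarray(p_camera, float) - t)
```

Applying one mapping after the other returns the original point, which is a quick sanity check on any calibrated (R, t).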
In at least one embodiment of the present application, the calculating, by the electronic device, a transformation matrix between a camera coordinate system corresponding to the shooting device and a lidar coordinate system corresponding to the lidar according to the spatial coordinate value and the corresponding pixel coordinate value of each initial corner point includes:
and acquiring an internal reference matrix of the shooting equipment, and determining a first pixel coordinate value corresponding to each initial corner point according to the space coordinate value and the internal reference matrix of each initial corner point. And determining a second pixel coordinate value corresponding to each initial angle point according to the pixel coordinate value and the first pixel coordinate value corresponding to each initial angle point. And calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the second pixel coordinate value, the space coordinate value of the initial angular point and the internal reference matrix.
Specifically, the spatial coordinate values include a horizontal-axis (e.g., x-axis) coordinate value, a longitudinal-axis (e.g., y-axis) coordinate value, and a vertical-axis (e.g., z-axis) coordinate value of each initial corner point, and the first pixel coordinate values include a first abscissa value and a first ordinate value of each initial corner point. The calculation formula of the first pixel coordinate values is:
U0 = fx * (p_lidar.x / p_lidar.z) + cx
V0 = fy * (p_lidar.y / p_lidar.z) + cy

K = | fx  0   cx |
    | 0   fy  cy |
    | 0   0   1  |

where U0 denotes the first abscissa value of each initial corner point, p_lidar.x the horizontal-axis coordinate value of the initial corner point, p_lidar.z its vertical-axis coordinate value, V0 the first ordinate value of the initial corner point, p_lidar.y its longitudinal-axis coordinate value, K the internal reference matrix, fx the focal length of the shooting device in the u-axis direction, fy the focal length of the shooting device in the v-axis direction, cx the abscissa value of the principal point in the pixel coordinate system, and cy the ordinate value of the principal point in the pixel coordinate system; the principal point is the intersection between the optical axis of the shooting device and the checkerboard image.
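A small sketch of this first-pixel-coordinate computation, directly transcribing the pinhole relations U0 = fx*x/z + cx and V0 = fy*y/z + cy (the function name is an illustrative assumption):

```python
def first_pixel_coordinate(p_lidar, K):
    """First pixel coordinate (U0, V0) of a point expressed in the laser
    radar frame, via the pinhole projection with internal reference matrix
    K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]."""
    x, y, z = p_lidar
    fx, fy = K[0][0], K[1][1]
    cx, cy = K[0][2], K[1][2]
    return fx * x / z + cx, fy * y / z + cy
```

For example, with fx = fy = 500, cx = 320, cy = 240, the point (1, 0.5, 2) projects to (570, 365).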
Specifically, the pixel coordinate value corresponding to each initial corner point includes an initial abscissa value and an initial ordinate value, and the second pixel coordinate values include a second abscissa value and a second ordinate value of each initial corner point. The calculation formulas of the second pixel coordinate values are:
U1 = x / dx;

V1 = y / dy;

wherein U1 represents the second abscissa value, V1 represents the second ordinate value, x represents the initial abscissa value, y represents the initial ordinate value, dx represents the length of each pixel in the checkerboard image along the u-axis, and dy represents the length of each pixel in the checkerboard image along the v-axis.
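A minimal sketch of this unit conversion, assuming the second pixel coordinate is the initial image coordinate divided by the physical pixel size (the patent's exact formula is only partly legible here, so the values and the formula's simple form are assumptions):

```python
def to_pixel(x, y, dx, dy):
    """Convert image coordinates (x, y) to pixel units: U1 = x/dx, V1 = y/dy.
    dx, dy are the physical size of one pixel along u and v (assumed values)."""
    return x / dx, y / dy

u1, v1 = to_pixel(1.2, 0.9, 0.01, 0.01)  # approximately (120.0, 90.0)
```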
In this embodiment, the transformation matrix includes a rotation matrix and a translation vector, and the calculation formula of the transformation matrix is:
P_camera = K * (R * P_lidar + t);

R = [r11, r12, r13; r21, r22, r23; r31, r32, r33];

t = [t1, t2, t3]^T;

wherein R represents the rotation matrix, t represents the translation vector, P_lidar represents the spatial coordinate value of an initial corner point, and P_camera represents the corresponding second pixel coordinate value.
In this embodiment, a plurality of equations in the rotation matrix and the translation vector are obtained according to the spatial coordinate values of the plurality of initial corner points and the pixel coordinate values corresponding to them, wherein each equation corresponds one-to-one to an initial corner point. The equations are solved to obtain the rotation matrix and the translation vector, which are then combined to obtain the transformation matrix.
The rotation matrix comprises a plurality of unknown Euler angles, the translation vector comprises a plurality of unknown parameters, and a plurality of initial corner points are selected according to the sum of the number of the unknown Euler angles and the number of the unknown parameters.
For example, if the rotation matrix includes 3 unknown Euler angles and the translation vector includes 3 unknown parameters, the total number of unknowns is 6, so only 6 initial corner points need to be selected. Six equations are obtained according to the spatial coordinate values of the 6 initial corner points and the pixel coordinate values corresponding to them, and solving these 6 equations yields the transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar.
In this embodiment, the total number of unknowns among the Euler angles and the translation parameters is calculated first. Since the transformation matrix can be obtained from a number of initial corner points equal to the number of unknowns, the spatial and pixel coordinate values of all initial corner points do not need to participate in the calculation. This reduces the amount of computation, so the transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar can be calculated quickly.
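As an illustrative sketch (not the patent's implementation), the forward model P_camera = K*(R*P_lidar + t) can be written in Python with NumPy. Each corner correspondence contributes two scalar equations in the six unknowns (three Euler angles, three translation components), which a nonlinear solver would invert; only the forward model is shown here. The Z-Y-X Euler convention and all numeric values are assumptions:

```python
import numpy as np

def euler_to_R(yaw, pitch, roll):
    """Rotation matrix from three Euler angles (Z-Y-X convention assumed)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy_, sy = np.cos(pitch), np.sin(pitch)
    cx_, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy_, 0, sy], [0, 1, 0], [-sy, 0, cy_]])
    Rx = np.array([[1, 0, 0], [0, cx_, -sx], [0, sx, cx_]])
    return Rz @ Ry @ Rx

def reproject(p_lidar, K, R, t):
    """Forward model of the patent's equation P_camera = K*(R*P_lidar + t),
    normalized by depth to obtain pixel coordinates."""
    p_cam = R @ p_lidar + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
R = euler_to_R(0.0, 0.0, 0.0)          # identity rotation for this check
t = np.array([0.1, 0.0, 0.0])
uv = reproject(np.array([0.4, -0.25, 2.0]), K, R, t)
print(uv)  # approximately [520. 140.]
```

With six corner correspondences, stacking one such equation pair per corner gives six equations in the six unknowns, as described above.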
In other embodiments of the present application, the step in which the electronic device calculates the transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar, according to the spatial coordinate value and the corresponding pixel coordinate value of each initial corner point, further includes:

the electronic device calculates a total error value according to the first pixel coordinate values and the second pixel coordinate values, and optimizes the total error value based on a least-squares algorithm until the total error value is minimized, thereby obtaining the transformation matrix.
Specifically, the total error value is calculated by the formula:
E_error = Σ(i=1 to n) (E_iu + E_iv);

E_u = |U0 - U1|;

E_v = |V0 - V1|;

wherein E_error represents the total error value, n represents the number of initial corner points, E_u represents the abscissa difference between the first abscissa value of each initial corner point and the corresponding second abscissa value, E_v represents the ordinate difference between the first ordinate value of each initial corner point and the corresponding second ordinate value, E_iu represents the abscissa difference of the i-th initial corner point, and E_iv represents the ordinate difference of the i-th initial corner point.
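A small sketch of the total-error computation, assuming the aggregation over corner points is a plain sum of absolute pixel differences (the exact aggregation in the patent's formula is not fully legible); the coordinate values are placeholders:

```python
import numpy as np

def total_error(first_px, second_px):
    """Total error E_error = sum_i (|U0_i - U1_i| + |V0_i - V1_i|) over the
    first and second pixel coordinates of all corner points.
    The sum (rather than, e.g., a mean) is an assumption."""
    first_px = np.asarray(first_px, dtype=float)
    second_px = np.asarray(second_px, dtype=float)
    return np.abs(first_px - second_px).sum()

E = total_error([[520., 160.], [100., 50.]],
                [[518., 161.], [103., 50.]])
print(E)  # 6.0
```

Minimizing this scalar over the rotation and translation parameters is what the least-squares refinement described next would do.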
In this embodiment, the total error value is optimized according to the least-squares algorithm. Since the spatial coordinate values and corresponding pixel coordinate values of more initial corner points are used, the precision of the transformation matrix can be improved.
According to the technical scheme, a checkerboard image of the calibration board can be obtained through the shooting device, and a skeleton point cloud image corresponding to the checkerboard image can be obtained through the lidar. The calibration board is formed with checkerboards arranged in an array, and the boundary area and the central area of each checkerboard are made of materials with different reflectivities. Because of this difference in reflectivity, in the skeleton point cloud image, the point cloud brightness of the point cloud data corresponding to the boundary area of a checkerboard differs from that of the point cloud data corresponding to its central area. Therefore, a plurality of initial corner points can be determined according to the point cloud brightness of each point cloud data in the skeleton point cloud image. Since materials with different reflectivities do not affect the lidar's acquisition of spatial coordinate values, the spatial coordinate value of each initial corner point can be accurately obtained from the corresponding point cloud data. In addition, the pixel coordinate value corresponding to each initial corner point can be obtained from the checkerboard image. Subsequently, the transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar is calculated according to the spatial coordinate value and corresponding pixel coordinate value of each initial corner point.
In summary, in the method provided by the embodiments of the present application, the electronic device can automatically select the initial corner points according to the point cloud brightness of each point cloud data in the skeleton point cloud image, without a user manually selecting feature points. The operation is therefore simple, point-selection errors are reduced, and the accuracy of the transformation matrix is improved.
Fig. 5 is a functional block diagram of a joint calibration apparatus provided in an embodiment of the present application. The joint calibration apparatus 11 includes an acquisition unit 110, a determination unit 111, and a calculation unit 112. A module/unit referred to herein is a series of computer-readable instruction segments stored in the memory 12 that can be accessed by the processor 13 to perform a fixed function. In this embodiment, the functions of these modules/units are described in detail in the following embodiments.
The obtaining unit 110 is configured to obtain a checkerboard image of the calibration board through the shooting device, and obtain a skeleton point cloud pattern corresponding to the checkerboard image through the laser radar, where the calibration board has checkerboards arranged in an array, and a boundary area of each checkerboard and a central area of each checkerboard are made of materials with different reflectivities.
The determining unit 111 is configured to determine a plurality of initial corner points and a spatial coordinate value of each initial corner point according to the point cloud brightness of each point cloud data in the skeleton point cloud image.
In at least one embodiment of the present application, the determining unit 111 comprises:
the first determining subunit is used for determining a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness of each point cloud data in the skeleton point cloud picture;
and the second determining subunit is used for determining a plurality of initial corner points according to the intersection points of each transverse line and each longitudinal line.
In this embodiment, if the reflectivity of the boundary area of the checkerboard is greater than the reflectivity of the central area of the checkerboard, the determining unit 111 is configured to:
and determining the point cloud corresponding to the point cloud brightness which is greater than a first threshold value in the skeleton point cloud image as a first target point cloud, and determining a plurality of transverse lines and a plurality of longitudinal lines according to the first target point cloud.
In this embodiment, if the reflectivity of the boundary area of the checkerboard is smaller than the reflectivity of the central area of the checkerboard, the determining unit 111 is further configured to:
and determining the point cloud corresponding to the point cloud brightness smaller than the second threshold value in the skeleton point cloud image as a second target point cloud, and determining a plurality of transverse lines and a plurality of longitudinal lines according to the second target point cloud.
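The two brightness-threshold branches above can be sketched as a single helper; the point cloud layout (columns x, y, z, intensity), the intensity values, and the threshold are all placeholder assumptions:

```python
import numpy as np

# Hypothetical point cloud: one row per point, columns x, y, z, intensity.
cloud = np.array([[0.1, 0.2, 1.0, 200.],
                  [0.2, 0.2, 1.0,  40.],
                  [0.3, 0.2, 1.0, 210.],
                  [0.4, 0.2, 1.0,  35.]])

def select_by_brightness(cloud, threshold, boundary_brighter=True):
    """Keep points whose intensity exceeds the threshold when the boundary
    material is the more reflective one, or falls below it otherwise."""
    intensity = cloud[:, 3]
    mask = intensity > threshold if boundary_brighter else intensity < threshold
    return cloud[mask]

target = select_by_brightness(cloud, 100.0, boundary_brighter=True)
print(len(target))  # 2
```

The selected target points would then feed the line fitting that produces the transverse and longitudinal lines.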
The obtaining unit 110 is further configured to obtain a pixel coordinate value corresponding to each initial corner point from the checkerboard image.
And a calculating unit 112, configured to calculate a transformation matrix between a camera coordinate system corresponding to the shooting device and a lidar coordinate system corresponding to the lidar according to the spatial coordinate value of each initial corner point and the pixel coordinate value corresponding to the initial corner point.
In at least one embodiment of the present application, the computing unit 112 includes:
the internal reference acquisition subunit is used for acquiring an internal reference matrix of the shooting equipment;
the coordinate determination subunit is used for determining a first pixel coordinate value corresponding to each initial corner point according to the spatial coordinate value of each initial corner point and the internal reference matrix, and determining a second pixel coordinate value corresponding to each initial corner point according to the pixel coordinate value corresponding to each initial corner point and the first pixel coordinate value;
and the matrix transformation subunit is used for calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the second pixel coordinate value, the space coordinate value of the initial corner point and the internal reference matrix.
According to the technical scheme, a checkerboard image of the calibration board can be obtained through the shooting device, and a skeleton point cloud image corresponding to the checkerboard image can be obtained through the lidar. The calibration board is formed with checkerboards arranged in an array, and the central area and the boundary area of each checkerboard have different reflectivities. Because point cloud brightness is positively correlated with reflectivity, in the skeleton point cloud image, the point cloud brightness of the point cloud data corresponding to the boundary area of a checkerboard differs from that of the point cloud data corresponding to its central area. Therefore, a plurality of initial corner points can be determined according to the point cloud brightness of each point cloud data in the skeleton point cloud image, and the spatial coordinate value of each initial corner point can be accurately acquired from the corresponding point cloud data. In addition, the pixel coordinate value corresponding to each initial corner point can be obtained from the checkerboard image. Subsequently, the transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar is calculated according to the spatial coordinate value and corresponding pixel coordinate value of each initial corner point.
In summary, in the method provided by the embodiments of the present application, the electronic device can automatically select the initial corner points according to the point cloud brightness of each point cloud data in the skeleton point cloud image, without a user manually selecting feature points. The operation is therefore simple, point-selection errors are reduced, and the accuracy of the transformation matrix is improved.
Fig. 6 is a schematic structural diagram of an electronic device of a joint calibration method provided in an embodiment of the present application.
In one embodiment of the present application, the electronic device 1 includes, but is not limited to, a memory 12, a processor 13, and computer readable instructions stored in the memory 12 and executable on the processor 13, such as a joint calibration program.
It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the electronic device 1 and does not constitute a limitation of it; the electronic device 1 may include more or fewer components than shown, combine certain components, or use different components. For example, the electronic device 1 may further include input/output devices, a network access device, a bus, and the like.
The Processor 13 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. The general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc., and the processor 13 is an operation core and a control center of the electronic device 1, and is connected to each part of the whole electronic device 1 by various interfaces and lines, and executes an operating system of the electronic device 1 and various installed application programs, program codes, etc.
Illustratively, the computer readable instructions may be divided into one or more modules/units, which are stored in the memory 12 and executed by the processor 13 to accomplish the present application. One or more modules/units may be a series of computer readable instruction segments capable of performing certain functions, the computer readable instruction segments being used for describing the execution of computer readable instructions in the electronic device 1. For example, the computer readable instructions may be partitioned into an acquisition unit 110, a determination unit 111, and a calculation unit 112.
The memory 12 may be used to store computer-readable instructions and/or modules, and the processor 13 implements various functions of the electronic device 1 by running or executing the computer-readable instructions and/or modules stored in the memory 12 and invoking data stored in the memory 12. The memory 12 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the electronic device. The memory 12 may include non-volatile and volatile memories, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), at least one magnetic disk storage device, a flash memory device, or other storage devices.
The memory 12 may be an external memory and/or an internal memory of the electronic device 1. Further, the memory 12 may be a memory having a physical form, such as a memory stick, a TF Card (Trans-flash Card), or the like.
The integrated modules/units of the electronic device 1 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by the present application, and can also be realized by hardware related to computer readable instructions, which can be stored in a computer readable storage medium, and when the computer readable instructions are executed by a processor, the steps of the above described method embodiments can be realized.
Where the computer readable instructions comprise computer readable instruction code, the computer readable instruction code may be in source code form, object code form, an executable file or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer readable instruction code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a Random Access Memory (RAM).
Referring to fig. 2, the memory 12 in the electronic device 1 stores computer-readable instructions to implement a joint calibration method, and the processor 13 can execute the computer-readable instructions to implement:
obtaining a checkerboard image of a calibration plate through shooting equipment, and obtaining a skeleton point cloud picture corresponding to the checkerboard image through a laser radar, wherein the calibration plate is provided with checkerboards which are arranged in an array mode, each checkerboard comprises a central area and a boundary area which is arranged around the periphery of the central area, and the central area and the boundary area have different reflectivities; determining a plurality of initial angular points and a space coordinate value of each initial angular point according to the point cloud brightness of each point cloud data in the skeleton point cloud image, wherein the point cloud brightness and the reflectivity are in a positive correlation relationship; acquiring a pixel coordinate value corresponding to each initial angle point from the checkerboard image; and calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the space coordinate value of each initial angular point and the pixel coordinate value corresponding to the initial angular point.
Specifically, the processor 13 may refer to the description of the relevant steps in the embodiment corresponding to fig. 2 for a specific implementation method of the computer readable instructions, which is not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; the division into modules is only a logical functional division, and other division manners may be adopted in actual implementation.
The computer readable storage medium has computer readable instructions stored thereon, wherein the computer readable instructions when executed by the processor 13 are for performing the steps of:
obtaining a checkerboard image of a calibration plate through shooting equipment, and obtaining a skeleton point cloud picture corresponding to the checkerboard image through a laser radar, wherein the calibration plate is provided with checkerboards which are arranged in an array mode, each checkerboard comprises a central area and a boundary area which is arranged around the periphery of the central area, and the central area and the boundary area have different reflectivities; determining a plurality of initial angular points and a space coordinate value of each initial angular point according to the point cloud brightness of each point cloud data in the skeleton point cloud image, wherein the point cloud brightness and the reflectivity are in a positive correlation relationship; acquiring a pixel coordinate value corresponding to each initial angle point from the checkerboard image; and calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the space coordinate value of each initial angular point and the pixel coordinate value corresponding to the initial angular point.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. The plurality of units or devices may also be implemented by one unit or device through software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application and not for limiting, and although the present application is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present application without departing from the spirit and scope of the technical solutions of the present application.

Claims (10)

1. A joint calibration method is characterized by comprising the following steps:
obtaining a checkerboard image of a calibration plate through shooting equipment, and obtaining a skeleton point cloud picture corresponding to the checkerboard image through a laser radar, wherein the calibration plate is provided with checkerboards which are arranged in an array mode, each checkerboard comprises a central area and a boundary area which is arranged around the periphery of the central area, and the central area and the boundary area have different reflectivities;
determining a plurality of initial angular points and a space coordinate value of each initial angular point according to the point cloud brightness of each point cloud data in the skeleton point cloud image; wherein the point cloud brightness and the reflectivity are in a positive correlation relationship;
acquiring pixel coordinate values corresponding to each initial angle point from the checkerboard image;
and calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the space coordinate value of each initial angular point and the pixel coordinate value corresponding to the initial angular point.
2. The joint calibration method according to claim 1, wherein the determining a plurality of initial corner points according to the point cloud brightness of each point cloud data in the skeleton point cloud image comprises:
determining a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness of the cloud data of each point in the skeleton point cloud image;
and determining a plurality of initial corner points according to the intersection points of each transverse line and each longitudinal line.
3. The joint calibration method according to claim 2, wherein the reflectivity of the boundary area of the checkerboard is greater than the reflectivity of the central area of the checkerboard; the determining of the plurality of transverse lines and the plurality of longitudinal lines according to the point cloud brightness of the cloud data of each point in the skeleton point cloud picture comprises the following steps:
determining a point cloud corresponding to the point cloud brightness which is greater than a first threshold value in the skeleton point cloud image as a first target point cloud;
and determining a plurality of transverse lines and a plurality of longitudinal lines according to the first target point cloud.
4. The joint calibration method according to claim 2, wherein the reflectivity of the boundary area of the checkerboard is smaller than the reflectivity of the central area of the checkerboard; the method for determining a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness of each point cloud data in the skeleton point cloud picture comprises the following steps:
determining the point cloud corresponding to the point cloud brightness smaller than a second threshold value in the skeleton point cloud image as a second target point cloud;
and determining a plurality of transverse lines and a plurality of longitudinal lines according to the second target point cloud.
5. The joint calibration method according to claim 1, wherein the calculating a transformation matrix between the camera coordinate system corresponding to the shooting device and the lidar coordinate system corresponding to the lidar, according to the spatial coordinate value of each initial corner point and the pixel coordinate value corresponding to the initial corner point, comprises:
acquiring an internal reference matrix of the shooting equipment;
determining a first pixel coordinate value corresponding to each initial angular point according to the space coordinate value of each initial angular point and the internal reference matrix;
determining a second pixel coordinate value corresponding to each initial angle point according to the pixel coordinate value corresponding to each initial angle point and the first pixel coordinate value;
and calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the second pixel coordinate value, the space coordinate value of the initial angular point and the internal reference matrix.
6. A joint calibration apparatus, comprising:
the system comprises an acquisition unit, a calibration unit and a control unit, wherein the acquisition unit is used for acquiring a checkerboard image of a calibration plate through shooting equipment and acquiring a skeleton point cloud picture corresponding to the checkerboard image through a laser radar, the calibration plate is provided with checkerboards which are arranged in an array mode, each checkerboard comprises a central area and a boundary area which is arranged around the periphery of the central area, and the central area and the boundary area have different reflectivities;
the determining unit is used for determining a plurality of initial angular points and a space coordinate value of each initial angular point according to the point cloud brightness of each point cloud data in the skeleton point cloud image; wherein the point cloud brightness and the reflectivity are in a positive correlation relationship;
the obtaining unit is further configured to obtain a pixel coordinate value corresponding to each initial angle point from the checkerboard image;
and the calculation unit is used for calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the space coordinate value of each initial angular point and the pixel coordinate value corresponding to the initial angular point.
7. The joint calibration apparatus according to claim 6, wherein the determination unit includes:
the first determining subunit is used for determining a plurality of transverse lines and a plurality of longitudinal lines according to the point cloud brightness of each point cloud data in the skeleton point cloud picture;
and the second determining subunit is used for determining a plurality of initial corner points according to the intersection points of each transverse line and each longitudinal line.
8. The joint calibration apparatus according to claim 6, wherein the calculation unit includes:
the internal reference acquisition subunit is used for acquiring an internal reference matrix of the shooting equipment;
the coordinate determination subunit is used for determining a first pixel coordinate value corresponding to each initial corner point according to the spatial coordinate value and the internal reference matrix of each initial corner point; determining a second pixel coordinate value corresponding to each initial angle point according to the pixel coordinate value corresponding to each initial angle point and the first pixel coordinate value;
and the matrix transformation subunit is used for calculating a transformation matrix between a camera coordinate system corresponding to the shooting equipment and a laser radar coordinate system corresponding to the laser radar according to the second pixel coordinate value, the space coordinate value of the initial corner point and the internal reference matrix.
9. An electronic device, characterized in that the electronic device comprises:
a memory storing at least one instruction; and
a processor executing the at least one instruction to implement the joint calibration method of any one of claims 1 to 5.
10. A computer-readable storage medium, characterized in that: the computer-readable storage medium has stored therein at least one instruction, which is executed by a processor in an electronic device to implement the joint calibration method according to any one of claims 1 to 5.
CN202211144675.4A 2022-09-20 2022-09-20 Combined calibration method and device, electronic equipment and computer readable storage medium Pending CN115439558A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211144675.4A CN115439558A (en) 2022-09-20 2022-09-20 Combined calibration method and device, electronic equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN115439558A true CN115439558A (en) 2022-12-06

Family

ID=84249024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211144675.4A Pending CN115439558A (en) 2022-09-20 2022-09-20 Combined calibration method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN115439558A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116563297A (en) * 2023-07-12 2023-08-08 中国科学院自动化研究所 Craniocerebral target positioning method, device and storage medium
CN116563297B (en) * 2023-07-12 2023-10-31 中国科学院自动化研究所 Craniocerebral target positioning method, device and storage medium

Similar Documents

Publication Publication Date Title
US10922844B2 (en) Image positioning method and system thereof
CN111815719B (en) External parameter calibration method, device and equipment of image acquisition equipment and storage medium
CN112462350B (en) Radar calibration method and device, electronic equipment and storage medium
CN111640180B (en) Three-dimensional reconstruction method and device and terminal equipment
CN113436238B (en) Point cloud registration accuracy evaluation method and device and electronic equipment
CN106815869B (en) Optical center determining method and device of fisheye camera
CN114387347B (en) Method, device, electronic equipment and medium for determining external parameter calibration
CN108062788A (en) A kind of three-dimensional rebuilding method, device, equipment and medium
CN115439558A (en) Combined calibration method and device, electronic equipment and computer readable storage medium
Jiang et al. An accurate and flexible technique for camera calibration
CN113112553B (en) Parameter calibration method and device for binocular camera, electronic equipment and storage medium
CN111383264A (en) Positioning method, positioning device, terminal and computer storage medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN116912391A (en) Reverse rendering method and device combining nerve radiation field and steerable path tracking
CN115953478A (en) Camera parameter calibration method and device, electronic equipment and readable storage medium
CN114322751B (en) Target measuring method, device, computer equipment and storage medium
CN116152347A (en) Vehicle-mounted camera mounting attitude angle calibration method and system
CN113034615B (en) Equipment calibration method and related device for multi-source data fusion
CN111189413B (en) Double-camera line structured light measurement system optimization method and terminal equipment
CN115131273A (en) Information processing method, ranging method and device
CN112734857A (en) Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN114693532A (en) Image correction method and related equipment
CN111145268A (en) Video registration method and device
CN112233185A (en) Camera calibration method, image registration method, camera device and storage device
US20230345135A1 (en) Method, apparatus, and device for processing images, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination