CN110148185B - Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment - Google Patents

Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment

Info

Publication number
CN110148185B
CN110148185B (application CN201910430855.0A)
Authority
CN
China
Prior art keywords
coordinate system
imaging device
reflection
imaging
system conversion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910430855.0A
Other languages
Chinese (zh)
Other versions
CN110148185A (en)
Inventor
时一峰 (Shi Yifeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910430855.0A priority Critical patent/CN110148185B/en
Publication of CN110148185A publication Critical patent/CN110148185A/en
Application granted granted Critical
Publication of CN110148185B publication Critical patent/CN110148185B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/60Rotation of a whole image or part thereof
    • G06T3/604Rotation of a whole image or part thereof using a CORDIC [COordinate Rotation DIgital Computer] device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

Embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a computer-readable storage medium for determining coordinate system conversion parameters of an imaging device. In the method, initial values of the coordinate system conversion parameters of the imaging device are obtained; these parameters are used to convert a world coordinate system into the device coordinate system of the imaging device. A reflection value map of the imaging area of the imaging device is also obtained. The reflection value map has abscissa and ordinate axes that coincide with the world coordinate system, and its coordinate points record the reflection intensities associated with at least one reflection point in the imaging area; a reflection point is formed by an object in the imaging area reflecting probe light, and reflection points recorded at the same coordinate point share the same abscissa and ordinate in the world coordinate system. The initial values of the coordinate system conversion parameters are then updated based on the reflection value map to obtain target values of the coordinate system conversion parameters. Embodiments of the disclosure improve the flexibility and universality of imaging device parameter calibration.

Description

Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
Technical Field
Embodiments of the present disclosure relate generally to the technical field of imaging devices and autopilot, and more particularly, to a method, apparatus, electronic device, and computer-readable storage medium for determining imaging device coordinate system conversion parameters.
Background
In recent years, techniques such as automated driving and autonomous parking have developed rapidly. These techniques rely on perception of the environment around the vehicle, that is, on recognizing the specific conditions of the environment in the vehicle's vicinity. It has been proposed that, in addition to sensor devices (e.g., a vehicle-mounted lidar, an imaging device, etc.) mounted on the vehicle (also referred to as the "vehicle side"), data relating to the vehicle environment may be acquired by sensor devices outside the vehicle (also referred to as the "road side"), such as imaging devices mounted along a road or in a parking lot, in order to better support autonomous driving or parking. Since an autonomously driving or parking vehicle usually locates itself with reference to a world coordinate system (e.g., the Universal Transverse Mercator (UTM) coordinate system), an imaging device outside the vehicle must first be calibrated for external parameters in order to support autonomous driving or parking, that is, the conversion parameters between the world coordinate system and the camera coordinate system of the imaging device must be determined.
At present, external parameter calibration of a vehicle-mounted imaging device is usually realized by calibrating the relationship between the vehicle-mounted lidar and the imaging device, and can also be completed by measurement based on Global Positioning System (GPS) signals where GPS coverage is available. However, in some scenarios, such as underground parking lots and tunnels, there may be neither GPS signal nor a lidar sensor, so external parameter calibration of imaging devices in such scenarios is difficult to achieve.
Disclosure of Invention
The embodiment of the disclosure relates to a technical scheme for determining coordinate system conversion parameters of an imaging device.
In a first aspect of the present disclosure, a method of determining coordinate system conversion parameters of an imaging device is provided. The method comprises the following steps: initial values of coordinate system conversion parameters of the imaging device are obtained, the coordinate system conversion parameters being used for converting a world coordinate system into a device coordinate system of the imaging device. The method further comprises the following steps: a reflection value map of an imaging area of the imaging apparatus is obtained, the reflection value map having an abscissa axis and an ordinate axis that coincide with a world coordinate system, coordinate points of the reflection value map recording reflection intensities associated with at least one reflection point in the imaging area, the at least one reflection point being formed by an object in the imaging area reflecting probe light and having the same abscissa and ordinate in the world coordinate system. The method further comprises the following steps: the initial values of the coordinate system conversion parameters are updated based on the reflection value map to obtain target values of the coordinate system conversion parameters.
In a second aspect of the present disclosure, an apparatus for determining coordinate system conversion parameters of an imaging device is provided. The device includes: an initial value obtaining module configured to obtain an initial value of a coordinate system conversion parameter of the imaging device, the coordinate system conversion parameter being used to convert a world coordinate system into a device coordinate system of the imaging device. The device also includes: a reflection value map obtaining module configured to obtain a reflection value map of an imaging area of the imaging apparatus, the reflection value map having an abscissa axis and an ordinate axis that coincide with a world coordinate system, coordinate points of the reflection value map recording reflection intensities associated with at least one reflection point in the imaging area, the at least one reflection point being formed by an object in the imaging area reflecting the probe light and having the same abscissa and ordinate in the world coordinate system. The apparatus further comprises: an initial value updating module configured to update an initial value of the coordinate system conversion parameter based on the reflection value map to obtain a target value of the coordinate system conversion parameter.
In a third aspect of the disclosure, an electronic device is provided. The electronic device includes one or more processors; and a storage device for storing one or more programs. The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of the first aspect.
In a fourth aspect of the disclosure, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when executed by a processor, implements the method of the first aspect.
It should be understood that the statements herein reciting aspects are not intended to limit the critical or essential features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other objects, features and advantages of the embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 illustrates a schematic diagram of an example environment in which some embodiments of the present disclosure can be implemented;
FIG. 2 shows a schematic flow diagram of an example method of determining coordinate system conversion parameters of an imaging device in accordance with an embodiment of the present disclosure;
FIG. 3 shows a schematic flow diagram of an example method of obtaining initial values of coordinate system conversion parameters, in accordance with an embodiment of the present disclosure;
FIG. 4 shows a schematic flow diagram of an example method of updating initial values of coordinate system conversion parameters, in accordance with an embodiment of the present disclosure;
FIG. 5 shows a schematic flow diagram of an example method of determining a difference of a given point between a projected image and a captured image in accordance with an embodiment of the present disclosure;
FIG. 6 shows a schematic block diagram of an apparatus for determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure; and
FIG. 7 shows a schematic block diagram of a device that may be used to implement embodiments of the present disclosure.
Throughout the drawings, the same or similar reference numerals are used to designate the same or similar components.
Detailed Description
The principles and spirit of the present disclosure will be described with reference to a number of exemplary embodiments shown in the drawings. It is understood that these specific embodiments are described merely to enable those skilled in the art to better understand and implement the present disclosure, and are not intended to limit the scope of the present disclosure in any way.
As used herein, the term "coordinate system conversion parameters" may refer to, for example, the parameters required to convert between a camera coordinate system, an image coordinate system, a pixel coordinate system, and a world coordinate system, such as a translation matrix, a rotation matrix, and the like. In the context of the present disclosure, a world coordinate system may refer to a reference coordinate system covering a global scope, which may be used, for example, to assist autonomous driving or parking of a vehicle; examples include the UTM coordinate system, a latitude-and-longitude coordinate system, and so on. The origin of the camera coordinate system may be located at the optical center of the imaging device, its vertical axis (z-axis) may coincide with the optical axis of the imaging device, and its horizontal axis (x-axis) and vertical axis (y-axis) may be parallel to the imaging plane. In the context of the present disclosure, the camera coordinate system may also be referred to as the imaging device coordinate system or simply the device coordinate system. The origin of the pixel coordinate system may be at the upper left corner of the image, its horizontal and vertical axes may run along the pixel rows and columns of the image, and its unit may be the pixel. The origin of the image coordinate system may be at the center of the image (i.e., the midpoint of the pixel coordinate system), its horizontal and vertical axes may be parallel to those of the pixel coordinate system, and its unit may be the millimeter. However, it will be appreciated that in other embodiments these coordinate systems may be defined in other reasonable ways accepted in the art.
In embodiments of the present disclosure, "coordinate system conversion parameters" may include or refer to so-called "external parameters", "external parameter matrix", and the like in the field of camera calibration. In general, an "extrinsic parameter" may refer to a transformation parameter between a camera coordinate system associated with a particular imaging device and a world coordinate system (e.g., the UTM coordinate system). "extrinsic parameter calibration" may refer to the determination of conversion parameters between the camera coordinate system and the world coordinate system. Therefore, in the description of the embodiments of the present disclosure, the term "extrinsic parameter" may be used instead of the term "coordinate system conversion parameter" for convenience.
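To make the relationship among these coordinate systems concrete, the following sketch projects a world point into pixel coordinates using the extrinsic parameters (a rotation R and translation t). This is an illustration only, not the patent's implementation; the intrinsic matrix K and all numeric values are assumptions.

```python
import numpy as np

def world_to_pixel(p_world, R, t, K):
    """Project a 3-D world point into pixel coordinates.

    R (3x3 rotation) and t (length-3 translation) are the extrinsic
    parameters, so that p_cam = R @ p_world + t. K is an assumed 3x3
    intrinsic matrix (focal lengths and principal point).
    """
    p_cam = R @ np.asarray(p_world, dtype=float) + t   # world -> camera
    uvw = K @ p_cam                                    # camera -> image plane
    return uvw[:2] / uvw[2]                            # perspective divide

# With identity extrinsics, a point on the optical axis projects to the
# principal point (here assumed to be (320, 240)).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
print(world_to_pixel([0.0, 0.0, 5.0], R, t, K))  # -> [320. 240.]
```

Extrinsic parameter calibration, as discussed below, is precisely the problem of determining R and t for a given imaging device.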
As noted above, in some scenarios, such as underground parking lots and tunnels, there may be neither GPS signals nor lidar sensors, making external parameter calibration of imaging devices in such scenarios difficult. However, only after the external parameters of the imaging device are obtained can the imaging device be used effectively to assist automatic driving or autonomous parking of a vehicle, for example by running algorithms that recover three-dimensional (3D) information from monocular vision. Therefore, an extrinsic parameter calibration method with a wider application range is needed to obtain the transformation relationship between the camera coordinate system of the imaging device and the world coordinate system.
In view of the above problems and potentially other problems with conventional approaches, embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a computer-readable storage medium for determining coordinate system transformation parameters of an imaging device, so as to provide an extrinsic parameter calibration method with a wider application range. It should be noted that although embodiments of the present disclosure are applicable to scenarios without GPS signals and lidar sensors, they are not limited to such scenarios, but are equally applicable to scenarios in which GPS signals and lidar sensors are present.
Compared with traditional methods for calibrating the external parameters of an imaging device, embodiments of the present disclosure can determine the conversion external parameters between the camera coordinate system of the imaging device and the world coordinate system even without GPS signals or a field-end lidar sensor. Embodiments of the disclosure are simple to operate and efficient to calibrate, with an average pixel error of no more than 2 pixels, thereby providing an accuracy guarantee for roadside perception. In addition, because embodiments of the disclosure do not depend on a positioning signal or a radar sensor, the flexibility and universality of imaging device parameter calibration are improved. Several embodiments of the present disclosure are described below in conjunction with the figures.
Fig. 1 illustrates a schematic diagram of an example environment 100 in which some embodiments of the present disclosure can be implemented. As shown in fig. 1, an example environment 100 depicts a scene of a certain parking lot in a schematic manner. Specifically, the depicted parking lot has a plurality of parking spaces disposed therein, such as parking space "CW 185" identified by space number 108. In addition, a lane line 101, a guide symbol 104, a parking space line 106, and the like are drawn on the ground of the parking lot. It should be understood that these facilities and identifications depicted in fig. 1 are merely examples, and that different or additional facilities or identifications would be present in other parking lots, and embodiments of the present disclosure are not limited in this respect. Further, it should also be understood that embodiments of the present disclosure are not limited to the scenario of a parking lot depicted in fig. 1, but are generally applicable to any scenario associated with autonomous driving or parking. More generally, embodiments of the present disclosure are also applicable to an imaging apparatus of any use, and are not limited to an imaging apparatus that assists automatic driving or autonomous parking.
In the example of FIG. 1, a plurality of vehicles 110-1 through 110-5 (hereinafter collectively referred to as vehicles 110) are parked in corresponding parking spaces. Vehicle 110 may be any type of vehicle that can carry people and/or things and be moved by a power system such as an engine, including but not limited to a car, truck, bus, electric vehicle, motorcycle, recreational vehicle, train, and the like. One or more vehicles 110 in the example environment 100 may be vehicles with autonomous driving or parking capabilities; such vehicles are also referred to as unmanned vehicles. Of course, one or some of the vehicles 110 in the example environment 100 may also be vehicles without autonomous driving or autonomous parking capabilities.
Also disposed in the example environment 100 is an imaging device (also referred to as an image sensor) 105 for capturing images in the example environment. In the context of the present disclosure, an imaging apparatus generally refers to any apparatus having an imaging function, whether on board or off board a vehicle, and used for any purpose. Such imaging devices include, but are not limited to, cameras, camcorders, video cameras, surveillance probes, dashboard cameras, mobile devices with camera or photographic functions, and the like. In some embodiments, the imaging device 105 may be independent of the vehicle 110 and used to monitor conditions of the example environment 100 to obtain sensory information related to the example environment 100, so as to assist autonomous driving or parking of the vehicle 110. To reduce occlusion, the imaging device 105 may be disposed at a higher position in the example environment 100, for example high on a fixed pole or wall, to better monitor the example environment 100.
Although only one imaging device 105 is shown in fig. 1, it will be understood that multiple imaging devices may be disposed in each region of the example environment 100. In some embodiments, in addition to fixed imaging devices 105 in a particular location, a movable or rotatable imaging device may be provided in the example environment 100, and so forth. Furthermore, although imaging device 105 in fig. 1 is depicted as being disposed outside of vehicle 110, it will be understood that embodiments of the present disclosure are equally applicable to imaging devices disposed on vehicle 110, i.e., vehicle-mounted imaging devices. As shown, the imaging device 105 is communicatively (e.g., wired or wirelessly) connected to the computing device 120. In performing extrinsic parameter calibration on imaging device 105, position and/or orientation information of imaging device 105 and the captured image data may be provided to computing device 120 for use in determining coordinate system conversion parameters of imaging device 105. In addition, computing device 120 may also send various control signals to imaging device 105 to control various operations of imaging device 105, such as controlling imaging device 105 to capture images, to move or rotate, and so forth.
It will be appreciated that the computing device 120 may be any type of mobile terminal, fixed terminal, or portable terminal including a mobile phone, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination thereof, including accessories and peripherals of these devices, or any combination thereof. It is also contemplated that computing device 120 can support any type of interface to the user (such as "wearable" circuitry, etc.). More generally, the computing device 120 may be any server or client device that can be used to determine coordinate system conversion parameters for an imaging device. An example method of determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure is described below in conjunction with fig. 2.
Fig. 2 shows a schematic flow diagram of an example method 200 of determining coordinate system conversion parameters of the imaging device 105 according to an embodiment of the present disclosure. In some embodiments, the example method 200 may be implemented by the computing device 120 in fig. 1, e.g., may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the example method 200 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. For ease of discussion, the example method 200 will be described in conjunction with FIG. 1.
As mentioned above, the imaging device 105 in the example environment 100 may be used to assist the vehicle 110 in autonomous parking or autonomous driving within a parking lot. More generally, in an autonomous driving scenario on a traffic road, an imaging device external to the vehicle may similarly assist the vehicle in autonomous driving. Because vehicle 110 is typically autonomously parked or autonomous driven with reference to the world coordinate system, to assist autonomous parking or autonomous driving of vehicle 110, imaging device 105 may need to calibrate external parameters, i.e., determine coordinate system transformation parameters between the camera coordinate system of imaging device 105 and the world coordinate system. With the determined coordinate system conversion parameters, the computing device 120 may use the image information captured by the imaging device 105 to assist the vehicle 110 in autonomous driving or parking.
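Although the patent defers the details of the update step to the methods of FIGS. 4 and 5, its general shape — project reflection-map points into the image with the current extrinsics, measure the difference against the captured image, and adjust the parameters to reduce that difference — can be sketched as follows. The `project` and `difference` callables and the numeric-gradient descent scheme are illustrative assumptions, not the disclosed algorithm.

```python
import numpy as np

def update_extrinsics(params0, map_points, captured_image,
                      project, difference, lr=1e-3, iters=100):
    """Hedged sketch of the claimed update step.

    Starting from initial extrinsic values params0, repeatedly project
    reflection-map points into the image and nudge the parameters to
    reduce the projection difference, via a simple numeric-gradient
    descent (an assumption; the patent's scheme is given in Figs. 4-5).
    """
    params = np.asarray(params0, dtype=float)
    eps = 1e-6
    for _ in range(iters):
        base = difference(project(map_points, params), captured_image)
        grad = np.zeros_like(params)
        for k in range(params.size):          # forward-difference gradient
            p = params.copy()
            p[k] += eps
            grad[k] = (difference(project(map_points, p),
                                  captured_image) - base) / eps
        params -= lr * grad                   # descend on the difference
    return params

# Toy usage: 2-D points, parameters are a pure offset, and the "captured
# image" is the target point set; the optimum offset is (0.5, -0.2).
pts = np.array([[0.0, 0.0], [1.0, 1.0]])
target = pts + np.array([0.5, -0.2])
project = lambda P, th: P + th
diff = lambda Q, img: float(((Q - img) ** 2).sum())
fitted = update_extrinsics(np.zeros(2), pts, target, project, diff,
                           lr=0.1, iters=200)
```

The fitted parameters converge toward the true offset, mirroring how the initial extrinsic values are refined into target values in the method.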
Accordingly, at 210, the computing device 120 obtains initial values for the coordinate system conversion parameters of the imaging device 105. As noted above, the extrinsic parameters of the imaging device 105 are used for conversion between the camera coordinate system of the imaging device 105 and the world coordinate system, so the extrinsic parameters of the imaging device 105 may be calculated from the position and orientation of the imaging device 105 in the world coordinate system. In other words, to obtain initial values for the extrinsic parameters of imaging device 105, computing device 120 may first determine the position and orientation of imaging device 105 in a world coordinate system. The determination of position and orientation is described below, respectively.
Computing device 120 may determine the location of imaging device 105 in the world coordinate system, i.e., the coordinates in the world coordinate system, using any suitable approach. For example, computing device 120 may obtain location information of imaging device 105 determined by handheld GPS device measurements. As another example, computing device 120 may acquire position information of imaging device 105 determined by measurements of the total station. In some embodiments, the computing device 120 may obtain user input or estimated location information of the imaging device 105. It should be understood that the position information of the imaging device 105 may be obtained by any other suitable device or method, and embodiments of the present disclosure are not limited in this respect. Further, it should be appreciated that the position information of the imaging device 105 may be in any suitable form suitable for the coordinate system transformation parameters. One example process of obtaining location information for the imaging device 105 is described below in conjunction with fig. 3.
Fig. 3 shows a schematic flow diagram of an example method 300 of obtaining initial values of coordinate system conversion parameters, in accordance with an embodiment of the present disclosure. In some embodiments, the example method 300 may be implemented by the computing device 120 in fig. 1, e.g., may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the example method 300 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. With the example method 300, the initial value of the translation vector of the external parameter of the imaging device 105 may be efficiently calculated regardless of whether a GPS signal is present.
At 310, the computing device 120 may obtain the reference coordinates of a reference point in the world coordinate system, which may be any point in the world coordinate system whose coordinates are convenient to determine. Since the imaging device 105 may be installed in an area without GPS signal, the coordinates of the reference point may be determined in an area where a GPS signal is available, while the positional relationship between the imaging device 105 and the reference point may be measured using, for example, a total station; it will be appreciated that other measuring tools are also possible. In some embodiments, the origin of the total station may be set at the reference point; the coordinates of this origin in the world coordinate system can then be measured and denoted as a.
At 320, the computing device 120 may obtain the positional relationship of the optical center of the imaging device 105 to the reference point. As previously noted, the origin of the camera coordinate system of the imaging device 105 is set at the optical center of the imaging device 105, so the coordinates of the optical center in the world coordinate system determine the translation vector in the extrinsic parameters of the imaging device 105. For example, in an example using a total station, the total station may be used to take segmented measurements from the origin until the optical center of the imaging device 105 is measured. The measured transformation of each segment can be represented as T_i, and the coordinates of the optical center of the imaging device 105 can finally be calculated as A_center = T_i * ... * T_1 * a.
At 330, the computing device 120 may determine an initial value of the translation vector in the coordinate system conversion parameters based on the reference coordinates of the reference point and its positional relationship with the optical center of the imaging device 105. The optical center of the imaging device 105 is the origin of the camera coordinate system, and thus the translation vector of the extrinsic parameter can be determined by the translation of the optical center of the imaging device 105 between the coordinates in the world coordinate system and the origin of the world coordinate system. Therefore, based on the positional relationship of the optical center of the imaging device 105 and the reference point, and the coordinates of the reference point in the world coordinate system, the computing device 120 may derive the coordinates of the optical center of the imaging device 105 in the world coordinate system, and thus derive the initial values of the translation vectors of the external parameters of the imaging device 105.
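The chained computation A_center = T_i * ... * T_1 * a can be sketched as follows, treating each measured segment as a 4×4 homogeneous transform. The pure-translation segments and the sample coordinates are hypothetical values for illustration only.

```python
import numpy as np

def optical_center_from_segments(a, transforms):
    """Chain segmented total-station measurements.

    a: coordinates of the reference point (total-station origin) in the
    world coordinate system. transforms: [T1, ..., Ti], each a 4x4
    homogeneous matrix for one measured segment, applied in order so
    that A_center = Ti @ ... @ T1 @ a.
    """
    p = np.append(np.asarray(a, dtype=float), 1.0)  # homogeneous coords
    for T in transforms:
        p = T @ p
    return p[:3]

def translation(dx, dy, dz):
    """Helper building a pure-translation homogeneous transform."""
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

# Hypothetical example: reference point a, then two measured segments
# (3 m along x, then 1.5 m up) leading to the camera's optical center.
a = [100.0, 200.0, 20.0]
center = optical_center_from_segments(a, [translation(3.0, 0.0, 0.0),
                                          translation(0.0, 0.0, 1.5)])
print(center)
```

In practice each T_i could also carry a rotation block, which the homogeneous form handles without any change to the chaining code.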
In addition to the translation vector, the external parameters of the imaging device 105 include a rotation matrix, which characterizes the rotational relationship between the camera coordinate system of the imaging device 105 and the world coordinate system. Therefore, in order to obtain the initial values of the external parameters of the imaging device 105, the computing device 120 also needs to determine the initial value of the rotation matrix. In some embodiments, computing device 120 may obtain the angles that the imaging device 105 makes with the true east, true north, and upward (sky) directions to obtain initial values for the rotation matrix in the extrinsic parameters. For example, since high accuracy is not required for these angles, they can be measured with a protractor. In this way, the initial value of the rotation matrix of the external parameters of the imaging device 105 can be measured simply, without using other complex sensors.
It should be understood that the direction information may also be obtained by any other suitable device or method, and embodiments of the present disclosure are not limited in this respect. Furthermore, the orientation information may be in any suitable form suitable for use in coordinate system transformation parameters. After determining the initial values of the translation vector and the rotation matrix, the computing device 120 obtains the initial values of the coordinate system conversion parameters of the imaging device 105.
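A rotation matrix can be assembled from such coarse angle readings, for example by composing rotations about the individual axes. The Z-Y-X composition order used below is an assumption for illustration; the patent does not fix a convention, and the order must match how the angles were actually measured.

```python
import numpy as np

def initial_rotation(yaw, pitch, roll):
    """Compose an initial rotation matrix from coarse angle readings.

    yaw, pitch, roll (radians) are hypothetical angles of the camera
    relative to the east, north, and up directions, composed here in
    Z-Y-X order (an assumed convention).
    """
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz,  cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0,  sy], [0.0, 1.0, 0.0], [-sy, 0.0,  cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0,  sx,  cx]])
    return Rz @ Ry @ Rx

print(initial_rotation(0.0, 0.0, 0.0))  # identity when all angles are zero
```

Because the angles are only coarse protractor readings, the resulting matrix serves purely as an initial value, to be refined by the reflection-value-map-based update described below.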
Referring back to FIG. 2, at 220, computing device 120 obtains a reflection value map of the imaging area of imaging device 105. In the context of the present disclosure, a reflection value map is a map recording the reflection values of reflection points formed when probe light is reflected by objects, and it may be produced from a lidar point cloud. Specifically, laser light emitted by the lidar travels through the air to the ground or the surface of an object and is reflected there, and the energy value of the reflected laser light can be recorded. Thus, each point in the lidar point cloud has, in addition to position information (e.g., x, y, and z), reflection intensity information (which may be denoted as i). The reflection value map can be produced from these four variables x, y, z, and i of the point cloud.
Accordingly, a reflection value map of the imaging area of the imaging device 105 can be produced from a point cloud of laser reflections from objects in that area, acquired by a lidar. For example, an acquisition vehicle equipped with a lidar may collect a point cloud of the imaging area of the imaging device 105. A point cloud map can then be built from the raw laser point cloud using a simultaneous localization and mapping (SLAM) method. Finally, the vertical axis (z-axis) information is removed from the point cloud map to form a two-dimensional image whose abscissa is the point cloud abscissa x, whose ordinate is the point cloud ordinate y, and whose pixel content is the point cloud reflection intensity i; this two-dimensional image is the reflection value map.
As can be seen, the reflection-value map has an abscissa axis and an ordinate axis that coincide with the world coordinate system, and the coordinate points of the reflection-value map record the reflection intensities associated with one or more reflection points in the imaging area of the imaging device 105, which are formed by the reflection of the probe light by the object in the imaging area and have the same abscissa and ordinate in the world coordinate system. In other words, a plurality of reflection points having the same abscissa and ordinate in the laser point cloud map may correspond to the same coordinate point in the reflection value map, and the reflection value of the coordinate point may correspond to the sum of the reflection values of the plurality of reflection points, for example.
Several example methods of producing a reflection value map from a laser point cloud map will be described below, where it will be assumed that the reflection value map is produced by the computing device 120 executing the example method 200. However, it should be understood that in other embodiments, the reflection value map may be produced by another computing device, and the computing device 120 may perform the example method 200 using a reflection value map that has already been produced.
As a first example manner of making, the computing device 120 may first select, from the laser point clouds collected in the area corresponding to the reflection value map to be constructed, the laser point clouds for constructing the reflection value map, and select sample frame laser point clouds from the laser point clouds for constructing the reflection value map. Next, the computing device 120 may select key frame laser point clouds from the sample frame laser point clouds and determine an optimal key frame laser point cloud based on the adjustment amount corresponding to each key frame laser point cloud. The adjustment amount is determined based on the amount of movement of the stitched position of the center point of the lidar corresponding to the key frame laser point cloud, stitched to the other key frame laser point clouds, relative to the position of the center point of the lidar corresponding to that key frame laser point cloud.
Then, the computing device 120 may perform global pose optimization on the laser point clouds, except for the optimal key frame laser point cloud, in the laser point clouds for constructing the reflection value map, to obtain a position and a pose angle for constructing the reflection value map of a center point of the lidar corresponding to the laser point cloud for constructing the reflection value map in each frame. Finally, the computing device 120 may construct the reflection value map based on the position and attitude angle used to construct the reflection value map of the center point of the lidar corresponding to the laser point cloud used to construct the reflection value map for each frame.
As another example manufacturing manner, the computing device 120 may first select a laser point cloud for constructing the reflection value map from the laser point clouds collected in each collection area in the area corresponding to the reflection value map to be constructed, and select a sample frame laser point cloud from the laser point clouds for constructing the reflection value map collected in each collection area. Then, the computing device 120 may select a key frame laser point cloud from the sample frame laser point clouds collected in each collection area, and determine an optimal key frame laser point cloud collected in each collection area based on the adjustment amount corresponding to the key frame laser point cloud collected in each collection area. The adjustment amount is determined based on the amount of movement of the stitched position of the center point of the lidar corresponding to the key frame laser point cloud stitched to the other key frame laser point cloud relative to the position of the center point of the lidar corresponding to the key frame laser point cloud.
Then, the computing device 120 may perform global pose optimization on the laser point clouds, except the key frame laser point cloud, in the laser point clouds collected in each collection area and used for constructing the reflection value map, to obtain a position and a pose angle, which are used for constructing the reflection value map, of a central point of the laser radar corresponding to the laser point cloud collected in each collection area and used for constructing the reflection value map. Finally, the computing device 120 may construct the reflection value map based on the position and attitude angle used for constructing the reflection value map of the center point of the lidar corresponding to the laser point cloud used for constructing the reflection value map of each frame acquired in each acquisition area. It will be appreciated that the computing device 120 may fabricate the reflection value map from the laser point cloud map in any suitable manner other than the exemplary manners of fabrication provided herein.
At 230, after obtaining the initial values of the external parameters of the imaging device 105 and the reflection value map, the computing device 120 updates the initial values of the coordinate system conversion parameters based on the reflection value map to obtain target values of the coordinate system conversion parameters. For example, the initial values of the external parameters obtained by the measurement means may not be sufficiently accurate and thus need to be optimized (or improved in accuracy) so that the imaging device 105 can better assist automatic driving, autonomous parking, and the like. As described above, the coordinates in the world coordinate system of the reflection points in the imaging area of the imaging device 105 are recorded in the reflection value map. Accordingly, the coordinate information in the reflection value map may be used to optimize the initial values of the external parameters of the imaging device 105, thereby deriving the target values of the external parameters. It will be appreciated that the computing device 120 may use the coordinate information in the reflection value map to optimize the initial values of the extrinsic parameters in any suitable manner, for example, by associating coordinates in the reflection value map with pixel points in an image captured by the imaging device 105. An example of this optimization process is described below in conjunction with fig. 4.
Fig. 4 shows a schematic flow diagram of an example method 400 of updating initial values of coordinate system conversion parameters according to an embodiment of the present disclosure. In some embodiments, the example method 400 may be implemented by the computing device 120 of fig. 1, e.g., may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the example method 400 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. By the example method 400, the external parameters of the imaging device 105 may be adjusted more precisely so that the imaging device 105 may be used to assist in autonomous driving or parking.
At 410, computing device 120 projects the reflection value map into a pixel coordinate system of imaging device 105 based on initial values of the external parameters of imaging device 105 to obtain a first image of an imaging area of imaging device 105. As previously described, the external parameters of the imaging device 105 may enable a conversion from the three-dimensional coordinates of the world coordinate system to the three-dimensional coordinates of the camera coordinate system of the imaging device 105. Further, the three-dimensional coordinates of the camera coordinate system may be converted to two-dimensional pixel coordinates in the pixel coordinate system of the imaging device 105 using intrinsic parameters and/or other parameters of the imaging device 105. It should be noted that in the context of the present disclosure, as the determination of the external parameters of the imaging device 105 is of primary interest, the internal parameters or other parameters of the imaging device 105 may be assumed to be known.
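The two-stage conversion described above (extrinsic parameters from world to camera coordinates, then intrinsic parameters to pixel coordinates) can be sketched as follows, assuming a simple pinhole model with a known intrinsic matrix K; the function name is illustrative, not part of the disclosure:

```python
import numpy as np

def world_to_pixel(X_world, R, t, K):
    """Project 3-D world points into the pixel coordinate system.

    The extrinsic parameters (R, t) take world coordinates to camera
    coordinates; the intrinsic matrix K (assumed known, as in the text)
    takes camera coordinates to homogeneous pixel coordinates.
    """
    X_cam = (R @ X_world.T).T + t          # world frame -> camera frame
    uvw = (K @ X_cam.T).T                  # camera frame -> homogeneous pixel coords
    return uvw[:, :2] / uvw[:, 2:3]        # perspective division -> (u, v)
```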
Thus, with the initial values of the extrinsic parameters obtained previously, the computing device 120 may project the reflection value map into the pixel coordinate system of the imaging device 105 for comparison with the image captured by the imaging device 105, to determine whether the extrinsic parameters are sufficiently accurate. As described above, the coordinate points of the reflection value map carry the abscissa and ordinate in the world coordinate system. During projection, the missing third-dimension (z-axis) coordinate of a coordinate point can be looked up from its abscissa and ordinate. However, in the reflection value map, the same abscissa and ordinate may correspond to a plurality of vertical coordinates, so the vertical coordinate may not be uniquely determined from the abscissa and ordinate. In this regard, it is noted that the imaging device 105 is typically mounted at a higher position in the example environment 100, and thus the image captured by the imaging device 105 may be considered a top view. In this case, where a plurality of vertical coordinates correspond to a specific abscissa and ordinate, the smallest of the plurality of vertical coordinates may be selected as the vertical coordinate value corresponding to that abscissa and ordinate.
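The smallest-vertical-coordinate rule can be illustrated with a short grouping helper; the function name and the dict-based representation are hypothetical:

```python
def min_z_per_cell(points):
    """For coordinate points sharing the same (x, y), keep the smallest z.

    points: iterable of (x, y, z) tuples.  Returns a dict mapping each
    (x, y) pair to the z value chosen for back-projection, following the
    rule above of selecting the smallest vertical coordinate.
    """
    chosen = {}
    for x, y, z in points:
        key = (x, y)
        if key not in chosen or z < chosen[key]:
            chosen[key] = z          # a smaller vertical coordinate wins
    return chosen
```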
At 420, computing device 120 captures a second image of its imaging area through imaging device 105 for comparison or matching with the first image projected by the reflectance map to determine if the initial values of the extrinsic parameters are sufficiently accurate. In some embodiments, the matching may be performed by corresponding pixel points in the first image and the second image. Thus, at 430, computing device 120 determines a difference between a first pixel coordinate in the first image and a second pixel coordinate in the second image for a given point in the imaging region of imaging device 105. As described above, since the initial value of the external parameter of the imaging device 105 may not be accurate, there may be a deviation between the first image obtained by projecting the reflection value map according to the initial value and the second image directly captured by the imaging device 105. This deviation can be measured by the difference between the pixel coordinates of a given point in the two images.
For example, the given point may be an edge point or a boundary point of an object in the imaging area of the imaging device 105, that is, a point having a significant difference from the surrounding environment, such as an edge point of a post in a parking lot, an edge point of a road sign on a road, or the like. More generally, the given point may also be any point in the imaged area that has a salient feature. In some embodiments, the computing device 120 may extract sets of feature points in the first image and the second image, respectively, to compute the differences, whereby the accuracy of the computation may be improved. This is described in detail below in conjunction with fig. 5.
Fig. 5 shows a schematic flow diagram of an example method 500 of determining a difference of a given point between a projected image and a captured image in accordance with an embodiment of the present disclosure. In some embodiments, the example method 500 may be implemented by the computing device 120 in fig. 1, e.g., may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the example method 500 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. By the example method 500, the difference between the projected image and the captured image may be more accurately determined, thereby improving the accuracy of the value of the finally determined extrinsic parameter.
At 510, computing device 120 may extract a first set of feature points from the first image. Computing device 120 may extract the first set of feature points from the first image using any suitable approach. In some embodiments, the computing device 120 may identify the first set of feature points in the first image by a predetermined feature point identification condition (e.g., a difference from surrounding pixel points exceeds a threshold). In other embodiments, the computing device 120 may extract the first set of feature points through an existing image feature point extraction and matching algorithm, such as an ORB algorithm or SIFT algorithm, or the like.
At 520, the computing device 120 may extract, from the second image, a second set of feature points corresponding to the first set of feature points, using any suitable approach. For example, the computing device 120 may directly compare the first image with the second image to determine the second feature point set corresponding to the first feature point set in the second image. For another example, the computing device 120 may identify the second feature point set in the second image using the same feature point identification condition, in which case the identified feature point set may be considered to correspond to the first feature point set. As another example, the computing device 120 may use an existing image feature point extraction and matching algorithm, such as the ORB algorithm or the SIFT algorithm, to determine the second set of feature points corresponding to the first set.
At 530, computing device 120 may calculate a sum of pixel distances between corresponding points in the first set of feature points and the second set of feature points. The sum may reflect the magnitude of the deviation between the first image and the second image, and further may reflect the accuracy of the initial value of the external parameter of the imaging device 105. In some embodiments, to derive the sum of the pixel distances described above, for each given feature point in the first set of feature points, the computing device 120 may determine, in the second set of feature points, the corresponding feature point that is closest in distance to the given feature point, i.e., determine the feature point corresponding to the given feature point. Next, the computing device 120 may calculate a pixel distance between the given feature point and the corresponding feature point, which may be represented by pixel coordinates. The computing device 120 may then sum the pixel distances between all pairs of matching feature points to arrive at a sum of pixel distances. In this way, the corresponding feature points in the second image can be determined efficiently and accurately, thereby reducing the amount of calculation of the sum of the pixel distances.
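The nearest-neighbour matching and pixel-distance summation described above might be sketched as follows. This brute-force version is for illustration only (a k-d tree would typically replace the exhaustive search for large feature sets), and the function name is an assumption:

```python
import numpy as np

def sum_of_pixel_distances(first_pts, second_pts):
    """Sum of pixel distances between each feature point in the first set
    and its nearest neighbour in the second set (brute-force search)."""
    total = 0.0
    for p in first_pts:
        d = np.linalg.norm(second_pts - p, axis=1)   # distance to every candidate
        total += d.min()                             # closest corresponding point
    return total
```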
Referring back to fig. 4, at 440, computing device 120 adjusts the initial values of the coordinate system conversion parameters based on the difference between the first pixel coordinates and the second pixel coordinates of the given point. The coordinates of the coordinate points in the reflection value map may be considered accurate, so if the initial values of the external parameters of the imaging device 105 are sufficiently accurate, the difference in pixel coordinates of the same given point between the first image projected based on the initial values and the second image directly captured by the imaging device 105 should be sufficiently small. Thus, in some embodiments, if the difference is greater than a preset configurable threshold, the initial values of the extrinsic parameters are not accurate enough. In this case, the computing device 120 may adjust the initial values of the external parameters of the imaging device 105 and re-project the reflection value map to obtain a new first image, so that the difference becomes smaller. In other words, the computing device 120 may adjust the values of the external parameters in an iterative manner until the difference is smaller than the preset configurable threshold.
For example, in the feature point set example described above, if the sum of the pixel distances between corresponding points in the first and second feature point sets calculated by the computing device 120 is smaller than a preset configurable threshold, this means that the value of the external parameter of the imaging device 105 (which may have been iteratively adjusted multiple times from the initial value) is already sufficiently accurate. The computing device 120 may then determine the adjusted value of the coordinate system conversion parameter of the imaging device 105 as the target value of the coordinate system conversion parameter; that is, the value of the external parameter of the imaging device 105 is finally obtained.
To better illustrate the matching process of the feature point set, it will be described below by way of mathematical operations. It should be understood that the following mathematical description is exemplary only, and is not intended to limit the scope of the present disclosure in any way. In other embodiments, the above process may be described using another mathematical description.
Specifically, the computing device 120 may first perform feature point extraction on the first image obtained by projection of the reflection value map and on the second image captured by the imaging device 105, extracting a first feature point set and a second feature point set from the two images, respectively. For example, the feature point sets may correspond to corner points and feature-significant contour information of the objects imaged in the first and second images, and so on. Denoting the first (projected) feature point set as P, the points in the second image closest to the projected points P can be found to constitute a second feature point set p. Further, in the following derivation, the rotation matrix in the external parameters of the imaging device 105 is denoted as R, and the translation vector is denoted as t.
Next, the computing device 120 may define an error term e_i between corresponding points of the two feature point sets (which may also be referred to as point clouds), i.e., between the first feature point set P and the second feature point set p, which may be represented by the following equation (1):

e_i = p_i - (R P_i + t)    (1)
Then, the computing device 120 may construct a least squares problem and solve for R and t such that the sum of squared errors over the feature point pairs reaches a minimum, which may be represented by the following equation (2), where i denotes the i-th feature point and n denotes the number of feature points:

\min_{R, t} J = \frac{1}{2} \sum_{i=1}^{n} \left\| p_i - (R P_i + t) \right\|^2    (2)
it will be appreciated that the least squares method is merely an example, and that in other embodiments, any other mathematical method, existing or developed in the future, having similar functionality may be used for the calculations herein. To solve the above optimization problem, the computing device 120 may first define the centroids of the two sets of feature point sets, which is represented by equation (3) below.
Figure GDA0003283176280000162
Therefore, the objective function expressed by equation (2) can be converted into the following equation (4):

J = \frac{1}{2} \sum_{i=1}^{n} \left( \left\| p_i - \bar{p} - R (P_i - \bar{P}) \right\|^2 + \left\| \bar{p} - R \bar{P} - t \right\|^2 \right)    (4)
Let q_i = p_i - \bar{p} and Q_i = P_i - \bar{P}. Since the second term in the above equation (4) depends only on the centroids and t and can be made 0 (by choosing t = \bar{p} - R \bar{P}), the objective function becomes the following equation (5):

\min_{R} \frac{1}{2} \sum_{i=1}^{n} \left\| q_i - R Q_i \right\|^2    (5)
Expanding equation (5) gives \frac{1}{2} \sum_{i=1}^{n} \left( q_i^T q_i + Q_i^T R^T R Q_i - 2 q_i^T R Q_i \right). The first term is independent of R, and because R^T R = I, the second term is also independent of R. Therefore, the optimization function becomes the following equation (6):

\min_{R} \sum_{i=1}^{n} \left( - q_i^T R Q_i \right)    (6)
Next, the computing device 120 may define the matrix

W = \sum_{i=1}^{n} Q_i q_i^T
Then, W can be decomposed by singular value decomposition (SVD) to obtain W = U \Sigma V^T, from which R = V U^T; the translation vector then follows as t = \bar{p} - R \bar{P}. The matrix T of external parameters of the imaging device 105 is thus obtained, which can be expressed as the following equation (7):

T = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix}    (7)
The error term e is then recalculated using the solved T. If e is smaller than a preset configurable threshold, or the number of iterations reaches a preset limit, the iteration stops; otherwise, the first image is re-projected using the newly solved extrinsic parameter T, the closest point set in the second image (i.e., the second feature point set p) is recalculated, and the next iteration begins.
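Under the conventions of equations (1) through (7), one closed-form solve for R and t can be sketched in NumPy as below. The determinant check guarding against an improper rotation (a reflection) is a standard step of the SVD/Kabsch method that the text does not spell out, and the function name is illustrative:

```python
import numpy as np

def solve_rt(P, p):
    """One least-squares alignment step following equations (1)-(7).

    P: (n, d) first feature point set (from the projected image),
    p: (n, d) matched second feature point set (from the captured image).
    Returns (R, t) minimizing sum_i ||p_i - (R P_i + t)||^2.
    """
    P_bar, p_bar = P.mean(axis=0), p.mean(axis=0)   # centroids, equation (3)
    Q, q = P - P_bar, p - p_bar                     # de-meaned point sets
    W = Q.T @ q                                     # W = sum_i Q_i q_i^T
    U, _, Vt = np.linalg.svd(W)                     # W = U Sigma V^T
    R = Vt.T @ U.T                                  # R = V U^T
    if np.linalg.det(R) < 0:                        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = p_bar - R @ P_bar                           # t = p_bar - R P_bar
    return R, t
```

Each iteration would re-match the second feature point set p against the re-projected first image and call this solver again until the error term falls below the threshold.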
Fig. 6 shows a schematic block diagram of an apparatus 600 for determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure. In some embodiments, apparatus 600 may be included in computing device 120 of fig. 1 or implemented as computing device 120.
As shown in fig. 6, the apparatus 600 includes an initial value obtaining module 610, a reflection value map obtaining module 620, and an initial value updating module 630. The initial value obtaining module 610 is configured to obtain initial values of coordinate system conversion parameters of the imaging device, the coordinate system conversion parameters being used to convert the world coordinate system into a device coordinate system of the imaging device. The reflection-value map obtaining module 620 is configured to obtain a reflection-value map of an imaging area of the imaging device, the reflection-value map having an abscissa axis and an ordinate axis that coincide with a world coordinate system, coordinate points of the reflection-value map recording reflection intensities associated with at least one reflection point in the imaging area, the at least one reflection point being formed by an object in the imaging area reflecting probe light and having the same abscissa and ordinate in the world coordinate system. The initial value update module 630 is configured to update the initial values of the coordinate system conversion parameters based on the reflection value map to obtain target values of the coordinate system conversion parameters.
In some embodiments, the initial value update module 630 may include a projection module, a capture module, a difference determination module, and an initial value adjustment module. The projection module is configured to project a reflection value map into a pixel coordinate system of the imaging device based on the initial value to obtain a first image of the imaging area. The capture module is configured to capture a second image of the imaging area by the imaging device. The difference determination module is configured to determine a difference between a first pixel coordinate in the first image and a second pixel coordinate in the second image for a given point in the imaged region. The initial value adjustment module is configured to adjust an initial value of the coordinate system conversion parameter based on the difference.
In some embodiments, the difference determination module may further include a first extraction module, a second extraction module, and a sum calculation module. The first extraction module is configured to extract a first set of feature points from the first image. The second extraction module is configured to extract a second set of feature points corresponding to the first set of feature points from the second image. The sum calculation module is configured to calculate a sum of pixel distances between corresponding points in the first set of feature points and the second set of feature points.
In some embodiments, the sum computation module may further include, for each given feature point in the first set of feature points, a corresponding feature point determination module and a pixel distance computation module. The corresponding feature point determination module is configured to determine a corresponding feature point in the second set of feature points that is closest in distance to the given feature point. The pixel distance calculation module is configured to calculate a pixel distance between a given feature point and a corresponding feature point.
In some embodiments, the apparatus 600 may further include a target value determination module. The target value determination module is configured to determine a value of the adjusted coordinate system conversion parameter as the target value in response to the sum of the pixel distances being less than the threshold.
In some embodiments, the initial value obtaining module 610 may include a reference coordinate obtaining module, a positional relationship obtaining module, and a translation vector initial value determining module. The reference coordinate obtaining module is configured to obtain reference coordinates of a reference point in a world coordinate system. The positional relationship obtaining module is configured to obtain a positional relationship of an optical center of the imaging device and the reference point. The translation vector initial value determination module is configured to determine an initial value of a translation vector in the coordinate system conversion parameter based on the reference coordinate and the positional relationship.
In some embodiments, the initial value obtaining module 610 may include a rotation matrix initial value determining module. The rotation matrix initial value determination module is configured to obtain angles of the imaging device with a true east direction, a true north direction, and a sky direction to obtain an initial value of a rotation matrix in the coordinate system conversion parameter.
In some embodiments, the reflection value map obtaining module 620 may include a reflection value map making module. The reflection value map making module is configured to make the reflection value map using a reflection point cloud collected by a lidar in the imaging area.
Fig. 7 schematically illustrates a block diagram of a device 700 that may be used to implement embodiments of the present disclosure. As shown in fig. 7, device 700 includes a Central Processing Unit (CPU)701 that may perform various appropriate actions and processes in accordance with computer program instructions stored in a read-only memory device (ROM)702 or computer program instructions loaded from a storage unit 708 into a random access memory device (RAM) 703. In the RAM 703, various programs and data required for the operation of the device 700 can also be stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, a mouse, or the like; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, optical disk, or the like; and a communication unit 709 such as a network card, modem, wireless communication transceiver, etc. The communication unit 709 allows the device 700 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The various procedures and processing described above, such as the example methods 200, 300, 400, and 500, may be performed by the processing unit 701. For example, in some embodiments, the example methods 200, 300, 400, and 500 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the CPU 701, one or more steps of the example methods 200, 300, 400, and 500 described above may be performed.
As used herein, the terms "comprises," comprising, "and the like are to be construed as open-ended inclusions, i.e.," including, but not limited to. The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions may also be included herein.
As used herein, the term "determining" encompasses a wide variety of actions. For example, "determining" can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Further, "determining" can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like. Further, "determining" may include resolving, selecting, choosing, establishing, and the like.
It should be noted that the embodiments of the present disclosure can be realized by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the apparatus and methods described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, in programmable memory or on a data carrier such as an optical or electronic signal carrier.
Further, while the operations of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions. It should also be noted that the features and functions of two or more devices according to the present disclosure may be embodied in one device. Conversely, the features and functions of one apparatus described above may be further divided into embodiments by a plurality of apparatuses.
While the present disclosure has been described with reference to several particular embodiments, it is to be understood that the disclosure is not limited to the particular embodiments disclosed. The disclosure is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (16)

1. A method of determining coordinate system conversion parameters of an imaging device, comprising:
obtaining initial values of the coordinate system conversion parameters of the imaging device, the coordinate system conversion parameters being used for converting a world coordinate system into a device coordinate system of the imaging device;
obtaining a reflection value map of an imaging area of the imaging device, the reflection value map having an abscissa axis and an ordinate axis that coincide with the world coordinate system, coordinate points of the reflection value map recording reflection intensities associated with at least one reflection point in the imaging area, the at least one reflection point being formed by an object in the imaging area reflecting probe light and having the same abscissa and ordinate in the world coordinate system; and
updating the initial values of the coordinate system conversion parameters based on the reflection value map to obtain target values of the coordinate system conversion parameters;
Wherein updating the initial value comprises:
projecting the reflection value map into a pixel coordinate system of the imaging device based on the initial value to obtain a first image of the imaging area;
capturing, by the imaging device, a second image of the imaging area;
determining a difference between a first pixel coordinate in the first image and a second pixel coordinate in the second image for a given point in the imaging area; and
adjusting the initial values of the coordinate system conversion parameters based on the difference.
2. The method of claim 1, wherein determining the difference comprises:
extracting a first feature point set from the first image;
extracting a second feature point set corresponding to the first feature point set from the second image; and
calculating a sum of pixel distances between corresponding points in the first set of feature points and the second set of feature points.
3. The method of claim 2, wherein calculating the sum comprises:
for each given feature point in the first set of feature points,
determining a corresponding feature point closest to the given feature point in the second feature point set; and
calculating a pixel distance between the given feature point and the corresponding feature point.
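The closest-point matching of claim 3 amounts to a nearest-neighbour search in pixel space. A brute-force sketch (the function name is an assumption; a k-d tree would be the usual optimisation for large feature sets):

```python
import numpy as np

def nearest_neighbour_distance_sum(first_pts, second_pts):
    """For each feature point in the first set, find the closest feature
    point in the second set and accumulate the pixel distances."""
    total = 0.0
    for p in first_pts:
        d = np.linalg.norm(second_pts - p, axis=1)  # distance to every candidate
        total += d.min()                            # closest corresponding point
    return total
```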
4. The method of claim 2, further comprising:
in response to the sum of the pixel distances being smaller than a threshold value, determining the adjusted value of the coordinate system conversion parameter as the target value.
5. The method of claim 1, wherein obtaining the initial values for the coordinate system conversion parameters comprises:
obtaining reference coordinates of a reference point in the world coordinate system;
obtaining the position relation between the optical center of the imaging device and the reference point; and
determining an initial value of a translation vector in the coordinate system conversion parameter based on the reference coordinate and the positional relationship.
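Claim 5 can be read as: the optical centre in world coordinates is the reference point plus the measured offset, and the translation of a world-to-device transform x_dev = R·x_world + t is then t = −R·C. A minimal sketch under that assumption (the helper name is hypothetical):

```python
import numpy as np

def initial_translation(ref_world, offset_ref_to_centre, R):
    """Initial translation vector for x_dev = R @ x_world + t.

    ref_world: reference point coordinates in the world coordinate system.
    offset_ref_to_centre: measured positional relationship between the
        reference point and the optical centre, in world axes.
    R: (initial) rotation matrix of the conversion parameters.
    """
    C = np.asarray(ref_world, float) + np.asarray(offset_ref_to_centre, float)
    return -R @ C  # places the optical centre at the device-frame origin
```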
6. The method of claim 1, wherein obtaining the initial values for the coordinate system conversion parameters comprises:
obtaining angles of the imaging device relative to a true east direction, a true north direction and a sky direction to obtain an initial value of a rotation matrix in the coordinate system conversion parameter.
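One way to turn the three measured angles of claim 6 into a rotation matrix is to treat them as yaw/pitch/roll about an East-North-Up (ENU) world frame. The ZYX Euler convention used here is an assumption for illustration; the claim only requires that the three angles yield an initial rotation matrix:

```python
import numpy as np

def rotation_from_enu_angles(yaw, pitch, roll):
    """Initial rotation matrix from angles (radians) of the imaging device
    relative to the true-east / true-north / sky axes, ZYX convention."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # about sky (up)
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # about north
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # about east
    return Rz @ Ry @ Rx
```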
7. The method of claim 1, wherein obtaining the reflection value map comprises:
generating the reflection value map using a reflection point cloud collected by a laser radar in the imaging area.
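Claim 7, combined with the map definition in claim 1 (coordinate points recording intensities of reflection points sharing the same abscissa and ordinate), suggests gridding the lidar returns on the world x-y plane. A minimal sketch; the cell resolution and mean aggregation are assumptions:

```python
import numpy as np

def make_reflection_value_map(points, intensities, resolution=0.1):
    """Grid a lidar reflection point cloud on the world x-y plane.

    points: (N, 3) world coordinates of reflection points.
    intensities: (N,) reflection intensities.
    Each cell stores the mean intensity of points whose abscissa and
    ordinate fall in the same cell."""
    xy = np.floor(points[:, :2] / resolution).astype(int)
    xy -= xy.min(axis=0)                        # shift indices to start at 0
    h, w = xy.max(axis=0) + 1
    acc = np.zeros((h, w))
    cnt = np.zeros((h, w))
    np.add.at(acc, (xy[:, 0], xy[:, 1]), intensities)  # sum per cell
    np.add.at(cnt, (xy[:, 0], xy[:, 1]), 1)            # count per cell
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)
```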
8. An apparatus for determining coordinate system conversion parameters of an imaging device, comprising:
an initial value obtaining module configured to obtain an initial value of the coordinate system conversion parameter of the imaging device, the coordinate system conversion parameter being used to convert a world coordinate system into a device coordinate system of the imaging device;
a reflection value map obtaining module configured to obtain a reflection value map of an imaging area of the imaging apparatus, the reflection value map having an abscissa axis and an ordinate axis that coincide with the world coordinate system, coordinate points of the reflection value map recording reflection intensities associated with at least one reflection point in the imaging area, the at least one reflection point being formed by an object in the imaging area reflecting probe light and having the same abscissa and ordinate in the world coordinate system; and
an initial value update module configured to update the initial value of the coordinate system conversion parameter based on the reflection value map to obtain a target value of the coordinate system conversion parameter;
Wherein the initial value updating module comprises:
a projection module configured to project the reflection value map into a pixel coordinate system of the imaging device based on the initial value to obtain a first image of the imaging area;
a capture module configured to capture a second image of the imaging area by the imaging device;
a difference determination module configured to determine a difference between a first pixel coordinate in the first image and a second pixel coordinate in the second image for a given point in the imaging area; and
an initial value adjustment module configured to adjust the initial value of the coordinate system conversion parameter based on the difference.
9. The apparatus of claim 8, wherein the difference determination module comprises:
a first extraction module configured to extract a first set of feature points from the first image;
a second extraction module configured to extract a second feature point set corresponding to the first feature point set from the second image; and
a sum calculation module configured to calculate a sum of pixel distances between corresponding points in the first set of feature points and the second set of feature points.
10. The apparatus of claim 9, wherein the sum calculation module comprises, for each given feature point in the first set of feature points:
a corresponding feature point determination module configured to determine a corresponding feature point closest to the given feature point in the second set of feature points; and
a pixel distance calculation module configured to calculate a pixel distance between the given feature point and the corresponding feature point.
11. The apparatus of claim 9, further comprising:
a target value determination module configured to determine the adjusted value of the coordinate system conversion parameter as the target value in response to a sum of the pixel distances being less than a threshold.
12. The apparatus of claim 8, wherein the initial value obtaining module comprises:
a reference coordinate obtaining module configured to obtain reference coordinates of a reference point in the world coordinate system;
a positional relationship obtaining module configured to obtain a positional relationship of an optical center of the imaging device and the reference point; and
a translation vector initial value determination module configured to determine an initial value of a translation vector in the coordinate system conversion parameter based on the reference coordinate and the positional relationship.
13. The apparatus of claim 8, wherein the initial value obtaining module comprises:
a rotation matrix initial value determination module configured to obtain angles of the imaging device with a true east direction, a true north direction, and a sky direction to obtain an initial value of a rotation matrix in the coordinate system conversion parameter.
14. The apparatus of claim 8, wherein the reflection value map obtaining module comprises:
a reflection value map making module configured to generate the reflection value map using a reflection point cloud collected by a laser radar in the imaging area.
15. An electronic device, comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of claims 1-7.
16. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the method according to any one of claims 1-7.
CN201910430855.0A 2019-05-22 2019-05-22 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment Active CN110148185B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910430855.0A CN110148185B (en) 2019-05-22 2019-05-22 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment


Publications (2)

Publication Number Publication Date
CN110148185A CN110148185A (en) 2019-08-20
CN110148185B true CN110148185B (en) 2022-04-15

Family

ID=67592801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910430855.0A Active CN110148185B (en) 2019-05-22 2019-05-22 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment

Country Status (1)

Country Link
CN (1) CN110148185B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110561428B (en) * 2019-08-23 2023-01-24 大族激光科技产业集团股份有限公司 Method, device and equipment for determining pose of robot base coordinate system and readable medium
CN110673115B (en) * 2019-09-25 2021-11-23 杭州飞步科技有限公司 Combined calibration method, device, equipment and medium for radar and integrated navigation system
CN110751693B (en) * 2019-10-21 2023-10-13 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for camera calibration
CN110766761B (en) * 2019-10-21 2023-09-26 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for camera calibration
CN110728720B (en) * 2019-10-21 2023-10-13 阿波罗智能技术(北京)有限公司 Method, apparatus, device and storage medium for camera calibration
CN110926453A (en) * 2019-11-05 2020-03-27 杭州博信智联科技有限公司 Obstacle positioning method and system
CN113311422A (en) * 2020-02-27 2021-08-27 富士通株式会社 Coordinate conversion method and device and data processing equipment
CN111388092B (en) * 2020-03-17 2023-04-07 京东方科技集团股份有限公司 Positioning tracking piece, registration method, storage medium and electronic equipment
CN111680685B (en) * 2020-04-14 2023-06-06 上海高仙自动化科技发展有限公司 Positioning method and device based on image, electronic equipment and storage medium
CN111667545B (en) * 2020-05-07 2024-02-27 东软睿驰汽车技术(沈阳)有限公司 High-precision map generation method and device, electronic equipment and storage medium
CN111553956A (en) * 2020-05-20 2020-08-18 北京百度网讯科技有限公司 Calibration method and device of shooting device, electronic equipment and storage medium
CN111680596B (en) * 2020-05-29 2023-10-13 北京百度网讯科技有限公司 Positioning true value verification method, device, equipment and medium based on deep learning
CN113091889A (en) * 2021-02-20 2021-07-09 周春伟 Method and device for measuring road brightness
CN112973121B (en) * 2021-04-30 2021-07-20 成都完美时空网络技术有限公司 Reflection effect generation method and device, storage medium and computer equipment
CN113284194A (en) * 2021-06-22 2021-08-20 智道网联科技(北京)有限公司 Calibration method, device and equipment for multiple RS (remote sensing) equipment
CN113722796B (en) * 2021-08-29 2023-07-18 中国长江电力股份有限公司 Vision-laser radar coupling-based lean texture tunnel modeling method
CN114266876B (en) * 2021-11-30 2023-03-28 北京百度网讯科技有限公司 Positioning method, visual map generation method and device
CN117593385A (en) * 2023-11-28 2024-02-23 广州赋安数字科技有限公司 Method for generating camera calibration data in auxiliary mode through image spots

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982548A (en) * 2012-12-11 2013-03-20 清华大学 Multi-view stereoscopic video acquisition system and camera parameter calibrating method thereof
CN103727930A (en) * 2013-12-30 2014-04-16 浙江大学 Edge-matching-based relative pose calibration method of laser range finder and camera
CN103871071A (en) * 2014-04-08 2014-06-18 北京经纬恒润科技有限公司 Method for camera external reference calibration for panoramic parking system
CN107025670A (en) * 2017-03-23 2017-08-08 华中科技大学 A kind of telecentricity camera calibration method
CN107492123A (en) * 2017-07-07 2017-12-19 长安大学 A kind of road monitoring camera self-calibrating method using information of road surface
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
CN108694882A (en) * 2017-04-11 2018-10-23 百度在线网络技术(北京)有限公司 Method, apparatus and equipment for marking map
CN108732582A (en) * 2017-04-20 2018-11-02 百度在线网络技术(北京)有限公司 Vehicle positioning method and device
CN109410735A (en) * 2017-08-15 2019-03-01 百度在线网络技术(北京)有限公司 Reflected value map constructing method and device
CN109523597A (en) * 2017-09-18 2019-03-26 百度在线网络技术(北京)有限公司 The scaling method and device of Camera extrinsic
CN109754432A (en) * 2018-12-27 2019-05-14 深圳市瑞立视多媒体科技有限公司 A kind of automatic camera calibration method and optics motion capture system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6507730B2 (en) * 2015-03-10 2019-05-08 富士通株式会社 Coordinate transformation parameter determination device, coordinate transformation parameter determination method, and computer program for coordinate transformation parameter determination
US20180356831A1 (en) * 2017-06-13 2018-12-13 TuSimple Sparse image point correspondences generation and correspondences refinement method for ground truth static scene sparse flow generation
CN107368790B (en) * 2017-06-27 2020-07-28 汇纳科技股份有限公司 Pedestrian detection method, system, computer-readable storage medium and electronic device
CN109215083B (en) * 2017-07-06 2021-08-31 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted sensor


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Research on multi-camera calibration and point cloud correction method based on three-dimensional calibration object"; Huang Lin et al.; Optics and Lasers in Engineering; April 2019; vol. 115; pp. 32-41 *
"Registration of laser radar and camera based on maximum mutual information"; Xia Pengfei et al.; Chinese Journal of Scientific Instrument; January 2018; vol. 39, no. 1; pp. 277-285 *
"Automatic calibration method of camera extrinsic parameters based on an annular mirror"; Fu Shengpeng et al.; Robot; May 2015; vol. 37, no. 3; pp. 34-41 *


Similar Documents

Publication Publication Date Title
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN110146869B (en) Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
CN110378965B (en) Method, device and equipment for determining coordinate system conversion parameters of road side imaging equipment
US10659677B2 (en) Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN113657224B (en) Method, device and equipment for determining object state in vehicle-road coordination
US20220371602A1 (en) Vehicle positioning method, apparatus, and controller, intelligent vehicle, and system
US10909395B2 (en) Object detection apparatus
CN110766760B (en) Method, device, equipment and storage medium for camera calibration
WO2018120040A1 (en) Obstacle detection method and device
KR102006291B1 (en) Method for estimating pose of moving object of electronic apparatus
US10996337B2 (en) Systems and methods for constructing a high-definition map based on landmarks
KR101880185B1 (en) Electronic apparatus for estimating pose of moving object and method thereof
CN110766761B (en) Method, apparatus, device and storage medium for camera calibration
CN110751693B (en) Method, apparatus, device and storage medium for camera calibration
EP3845927B1 (en) Merging multiple lidar point cloud data using an iterative closest point (icp) algorithm with weighting factor
CN110728720B (en) Method, apparatus, device and storage medium for camera calibration
CN112946609A (en) Calibration method, device and equipment for laser radar and camera and readable storage medium
WO2021056283A1 (en) Systems and methods for adjusting a vehicle pose
WO2022133986A1 (en) Accuracy estimation method and system
US20240112363A1 (en) Position estimation system, position estimation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant