CN110146869B - Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium

Publication number
CN110146869B
Authority
CN
China
Prior art keywords
coordinate system
imaging device
offset
image
world coordinate
Legal status
Active
Application number
CN201910423326.8A
Other languages
Chinese (zh)
Other versions
CN110146869A (en)
Inventor
时一峰
Current Assignee
Apollo Intelligent Technology Beijing Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910423326.8A
Publication of CN110146869A
Application granted
Publication of CN110146869B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a computer-readable storage medium for determining coordinate system conversion parameters of an imaging device. In the method, reference values of the coordinate system conversion parameters and a reference image captured by the imaging device are obtained. The reference values are determined when the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image is captured by the imaging device at that reference position and reference orientation. In response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, an offset of a target image captured by the imaging device after the change relative to the reference image is determined. The offset is corrected based on the coordinates in the world coordinate system corresponding to a reference pixel point in the reference image. Target values of the coordinate system conversion parameters are then obtained based on the reference values and the corrected offset. Embodiments of the disclosure are simple to operate and efficient to calibrate, and provide an accuracy guarantee for roadside perception in autonomous driving and autonomous parking.

Description

Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
Technical Field
Embodiments of the present disclosure relate generally to the technical field of imaging devices and autonomous driving, and more particularly, to a method, apparatus, electronic device, and computer-readable storage medium for determining coordinate system conversion parameters.
Background
In recent years, techniques such as autonomous driving and autonomous parking have developed rapidly. These techniques are based on perception of the environment around the vehicle, that is, recognition of the specific conditions of the environment in the vicinity of the vehicle. It has been proposed that, in addition to sensor devices (e.g., a vehicle-mounted lidar, an imaging device, etc.) mounted on the vehicle (also referred to as the "vehicle side"), data relating to the vehicle environment may be acquired by sensor devices outside the vehicle (also referred to as the "road side"), such as imaging devices mounted on both sides of a road or in a parking lot, in order to better support autonomous driving or parking of the vehicle. Since an autonomously driving or parking vehicle is usually located with reference to a world coordinate system (e.g., the Universal Transverse Mercator (UTM) coordinate system), an imaging device outside the vehicle must first have its external parameters calibrated in order to support autonomous driving or parking, that is, the conversion parameters between the world coordinate system and the camera coordinate system of the imaging device must be determined.
At present, external parameter calibration of a vehicle-mounted imaging device is usually realized by calibrating the relationship between the vehicle-mounted lidar and the imaging device; when the vehicle-mounted imaging device is under Global Positioning System (GPS) signal coverage, the calibration can also be completed by GPS-based measurement. However, when the position or orientation of an imaging device outside the vehicle changes, the external parameters of the imaging device also change. Therefore, for techniques that assist autonomous driving or parking, there is a need for an effective solution to calibrate the external parameters of an imaging device online (i.e., in real time).
Disclosure of Invention
The embodiment of the disclosure relates to a technical scheme for determining coordinate system conversion parameters of an imaging device.
In a first aspect of the present disclosure, a method of determining coordinate system conversion parameters of an imaging device is provided. The method comprises: obtaining reference values of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference values being determined when the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device when at the reference position and the reference orientation. The method further comprises: in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determining an offset of a target image captured by the imaging device after the change relative to the reference image. The method further comprises: correcting the offset based on the coordinates in the world coordinate system corresponding to a reference pixel point in the reference image. The method further comprises: obtaining target values of the coordinate system conversion parameters based on the reference values and the corrected offset.
In a second aspect of the present disclosure, an apparatus for determining coordinate system conversion parameters of an imaging device is provided. The apparatus includes: a first obtaining module configured to obtain reference values of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference values being determined when the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device when at the reference position and the reference orientation. The apparatus further includes: an offset determination module configured to determine, in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, an offset of a target image captured by the imaging device after the change relative to the reference image. The apparatus further includes: an offset correction module configured to correct the offset based on the coordinates in the world coordinate system corresponding to a reference pixel point in the reference image. The apparatus further includes: a second obtaining module configured to obtain target values of the coordinate system conversion parameters based on the reference values and the corrected offset.
In a third aspect of the disclosure, an electronic device is provided. The electronic device includes one or more processors; and a storage device for storing one or more programs. The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of the first aspect.
In a fourth aspect of the disclosure, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when executed by a processor, implements the method of the first aspect.
It should be understood that the content described in this section is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other objects, features and advantages of the embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 illustrates a schematic diagram of an example environment in which some embodiments of the present disclosure can be implemented;
FIG. 2 shows a schematic flow diagram of an example method of determining coordinate system conversion parameters of an imaging device in accordance with an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a reference image and a target image of an imaging device according to an embodiment of the present disclosure;
FIG. 4 shows a schematic flow diagram of an example method for determining coordinates of a pixel point in a target image in a world coordinate system, in accordance with an embodiment of the present disclosure;
FIG. 5 shows a schematic block diagram of an apparatus for determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure; and
FIG. 6 shows a schematic block diagram of a device that may be used to implement embodiments of the present disclosure.
Throughout the drawings, the same or similar reference numerals are used to designate the same or similar components.
Detailed Description
The principles and spirit of the present disclosure will be described with reference to a number of exemplary embodiments shown in the drawings. It is understood that these specific embodiments are described merely to enable those skilled in the art to better understand and implement the present disclosure, and are not intended to limit the scope of the present disclosure in any way.
As used herein, the term "coordinate system conversion parameters" may refer, for example, to the parameters required to convert between a camera coordinate system, an image coordinate system, a pixel coordinate system, and a world coordinate system, such as a translation matrix, a rotation matrix, and the like. In the context of the present disclosure, a world coordinate system may refer to a reference coordinate system covering a global scope, which may be used, for example, to assist autonomous driving or parking of a vehicle; examples include the UTM coordinate system, a latitude-longitude coordinate system, and so on. The origin of the camera coordinate system may be located at the optical center of the imaging device, its z-axis may coincide with the optical axis of the imaging device, and its x-axis and y-axis may be parallel to the imaging plane. The origin of the pixel coordinate system may be at the upper left corner of the image, its horizontal and vertical axes may run along the pixel rows and pixel columns of the image, respectively, and its unit may be the pixel. The origin of the image coordinate system may be at the center of the image (i.e., the midpoint of the pixel coordinate system), its horizontal and vertical axes may be parallel to those of the pixel coordinate system, and its unit may be the millimeter. However, it will be appreciated that in other embodiments, these coordinate systems may be defined in other reasonable ways accepted in the art.
In embodiments of the present disclosure, "coordinate system conversion parameters" may include or refer to so-called "external parameters", "external parameter matrix", and the like in the field of camera calibration. In general, an "extrinsic parameter" may refer to a transformation parameter between a camera coordinate system associated with a particular imaging device and a world coordinate system (e.g., the UTM coordinate system). "extrinsic parameter calibration" may refer to the determination of conversion parameters between the camera coordinate system and the world coordinate system. Therefore, in the description of the embodiments of the present disclosure, the term "extrinsic parameter" may be used instead of the term "coordinate system conversion parameter" for convenience.
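For concreteness, a standard pinhole-camera formulation consistent with the coordinate systems described above (restated here for clarity, not quoted from the disclosure) relates the world and pixel coordinate systems through the extrinsic parameters:

$$
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \, [R \mid t] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix},
$$

where $(X_w, Y_w, Z_w)$ are coordinates in the world coordinate system, $(u, v)$ are pixel coordinates, $K$ is the intrinsic matrix, $s$ is the projective depth, and the rotation matrix $R$ and translation vector $t$ together form the extrinsic parameters, i.e., the coordinate system conversion parameters discussed herein.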
As noted above, as the position or orientation of an imaging device outside the vehicle changes, the external parameters of the imaging device also change. In particular, an imaging device outside a vehicle that is driving or parking autonomously may be shaken in bad weather (such as wind), resulting in external parameter variations. Without GPS signals or lidar sensors in the area of the imaging device, it is difficult to directly obtain the transformation from the camera coordinate system of the imaging device to the world coordinate system (e.g., the UTM coordinate system). However, the imaging device can be put to good use in assisting autonomous driving or parking of the vehicle, such as running monocular-vision 2D-to-three-dimensional (3D) recovery algorithms, only after its external parameters are obtained. Therefore, a calibration method for external parameters is needed to obtain the conversion relationship between the camera coordinate system of the imaging device and the world coordinate system in real time.
In view of the above-mentioned problems and potentially other problems in conventional solutions, embodiments of the present disclosure propose a method, apparatus, electronic device, and computer-readable storage medium for determining coordinate system conversion parameters of an imaging device, enabling real-time calibration of the extrinsic parameters that convert between the camera coordinate system of the imaging device and a world coordinate system (e.g., the UTM coordinate system). Embodiments of the present disclosure can effectively obtain the external parameters of an imaging device relative to the world coordinate system in real time without GPS and field-side lidar sensors. They particularly address external parameter changes caused by shaking of the imaging device under changing weather conditions. The scheme of the embodiments of the disclosure is simple to operate and efficient to calibrate, the average pixel error can be kept at or below two pixels, and an accuracy guarantee is provided for roadside perception in autonomous driving and autonomous parking. Several embodiments of the present disclosure are described below in conjunction with the figures.
Fig. 1 illustrates a schematic diagram of an example environment 100 in which some embodiments of the present disclosure can be implemented. As shown in fig. 1, an example environment 100 depicts a scene of a certain parking lot in a schematic manner. Specifically, the depicted parking lot has a plurality of parking spaces disposed therein, such as parking space "CW 185" identified by space number 108. In addition, a lane line 101, a guide symbol 104, a parking space line 106, and the like are drawn on the ground of the parking lot. It should be understood that these facilities and identifications depicted in fig. 1 are merely examples, and that different or additional facilities or identifications would be present in other parking lots, and embodiments of the present disclosure are not limited in this respect. Further, it should also be understood that embodiments of the present disclosure are not limited to the scenario of a parking lot depicted in fig. 1, but are generally applicable to any scenario associated with autonomous driving or parking. More generally, embodiments of the present disclosure are also applicable to an imaging apparatus of any use, and are not limited to an imaging apparatus that assists automatic driving or autonomous parking.
In the example of FIG. 1, a plurality of vehicles 110-1 through 110-5 (hereinafter collectively referred to as vehicles 110) are parked in corresponding parking spaces. Vehicle 110 may be any type of vehicle that may carry people and/or things and be moved by a powered system such as an engine, including but not limited to a car, truck, bus, electric vehicle, motorcycle, recreational vehicle, train, and the like. One or more vehicles 110 in the example environment 100 may be vehicles with autonomous driving or autonomous parking capabilities; such vehicles are also referred to as unmanned vehicles. Of course, one or some of the vehicles 110 in the example environment 100 may also be vehicles without autonomous driving or autonomous parking capabilities.
Also disposed in the example environment 100 is an imaging device (also referred to as an image sensor) 105 for capturing images in the example environment. In the context of the present disclosure, an imaging apparatus generally refers to any apparatus having an imaging function, whether on board or off board the vehicle and whatever its purpose. Such imaging devices include, but are not limited to, cameras, camcorders, video cameras, surveillance probes, dashboard cameras, mobile devices with camera or photographic functions, and the like. In some embodiments, the imaging device 105 may be independent of the vehicle 110 and used to monitor conditions in the example environment 100 to obtain perception information related to the example environment 100 and thereby assist autonomous driving or parking of the vehicle 110. To reduce occlusion, the imaging device 105 may be disposed at a higher position in the example environment 100, for example high up on a fixed pole or wall, to better monitor the example environment 100.
Although only one imaging device 105 is shown in fig. 1, it will be understood that multiple imaging devices may be disposed in each region of the example environment 100. In some embodiments, in addition to fixed imaging devices 105 in a particular location, a movable or rotatable imaging device may be provided in the example environment 100, and so forth. Furthermore, although imaging device 105 in fig. 1 is depicted as being disposed outside of vehicle 110, it will be understood that embodiments of the present disclosure are equally applicable to imaging devices disposed on vehicle 110, i.e., vehicle-mounted imaging devices. As shown, the imaging device 105 is communicatively (e.g., wired or wirelessly) connected to the computing device 120. In performing extrinsic parameter calibration on imaging device 105, position and/or orientation information of imaging device 105 and the captured image data may be provided to computing device 120 for use in determining coordinate system conversion parameters of imaging device 105. In addition, computing device 120 may also send various control signals to imaging device 105 to control various operations of imaging device 105, such as controlling imaging device 105 to capture images, to move or rotate, and so forth.
It will be appreciated that the computing device 120 may be any type of mobile terminal, fixed terminal, or portable terminal including a mobile phone, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination thereof, including accessories and peripherals of these devices, or any combination thereof. It is also contemplated that computing device 120 can support any type of interface to the user (such as "wearable" circuitry, etc.). More generally, the computing device 120 may be any server or client device that can be used to determine coordinate system conversion parameters for an imaging device. An example method of determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure is described below in conjunction with fig. 2.
Fig. 2 shows a schematic flow diagram of an example method 200 of determining coordinate system conversion parameters of the imaging device 105 according to an embodiment of the present disclosure. In some embodiments, the method 200 may be implemented by the computing device 120 in fig. 1, for example, may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the method 200 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. For ease of discussion, the method 200 will be described in conjunction with FIG. 1.
As mentioned above, the imaging device 105 in the example environment 100 may be used to assist the vehicle 110 in autonomous parking or autonomous driving within a parking lot. More generally, in an autonomous driving scenario on a traffic road, an imaging device external to the vehicle may similarly assist the vehicle in autonomous driving. Because vehicle 110 is typically parked or driven autonomously with reference to the world coordinate system, to assist autonomous parking or autonomous driving of vehicle 110, the external parameters of imaging device 105 may need to be calibrated, i.e., the coordinate system conversion parameters between the camera coordinate system of imaging device 105 and the world coordinate system determined. With the determined coordinate system conversion parameters, the computing device 120 may use the image information captured by the imaging device 105 to assist the vehicle 110 in autonomous driving or parking.
Thus, at 210, the computing device 120 obtains reference values for the coordinate system conversion parameters of the imaging device 105 and a reference image captured by the imaging device 105. The reference value is determined when the imaging device 105 is at a reference position and a reference orientation in the world coordinate system. For example, the reference position and reference orientation may be the position and orientation at which the imaging device 105 was initially installed. In other embodiments, the reference position and reference orientation may also be any position and orientation used by the imaging device 105 to perform image capture. Correspondingly, the reference image captured by the imaging device 105 is captured by the imaging device 105 when in the reference position and the reference orientation. Therefore, the reference image of the imaging device 105 and the reference value of the coordinate system conversion parameter are associated. In other words, when the imaging device 105 is set or installed at a reference position and reference orientation in the example environment 100, the computing device 120 may perform calibration of the extrinsic parameter of the imaging device 105, thereby determining a reference value of the extrinsic parameter of the imaging device 105, and may control the imaging device 105 to capture an image as a reference image associated with the reference value of the extrinsic parameter.
The computing device 120 may determine the reference values of the external parameters of the imaging device 105 in various ways. For example, where GPS signals or lidar sensors are available in the example environment 100, the computing device 120 may perform calibration of the external parameters of the imaging device 105, i.e., determine reference values for the coordinate system conversion parameters of the imaging device 105, by detecting GPS signals at the location of the imaging device 105 or by radar detection with lidar sensors. However, this method depends on the presence of GPS signals or lidar sensors; it is therefore not universal and is unsuitable for scenarios lacking them.
To this end, embodiments of the present disclosure also present a possible way of obtaining a reference value of an external parameter without GPS signals at the location of the imaging device 105 and without a lidar sensor. For example, since the vehicle 110 needs to be driven automatically or parked autonomously, at the imaging area of the imaging device 105, a high-precision map in which coordinates of various objects within the imaging area in the world coordinate system are recorded may have been drawn in advance. In this case, the computing device 120 may utilize the high-precision map to calculate the reference value of the external parameter of the imaging device 105. More specifically, the position of the imaging device 105 in the world coordinate system may first be roughly measured using a measurement tool to obtain initial extrinsic parameters of the imaging device 105. The computing device 120 may then optimize the resulting initial extrinsic parameters using the three-dimensional coordinate information in the high-precision map to obtain final extrinsic parameters.
Additionally or alternatively, at the imaging area of the imaging device 105, a map having coordinate information of an object in a world coordinate system, such as a point cloud map (e.g., a laser point cloud map) or a reflection value map, may have been drawn in advance. In this case, the computing device 120 may similarly utilize a point cloud map or a reflection value map at the imaging area of the imaging device 105 to compute the reference value of the external parameter of the imaging device 105. More specifically, the position of the imaging device 105 in the world coordinate system may first be roughly measured using a measurement tool to obtain initial extrinsic parameters of the imaging device 105. The computing device 120 may then optimize the resulting initial extrinsic parameters using the three-dimensional coordinate information in the point cloud map or reflectance value map to obtain final extrinsic parameters. In this way, the reference value of the external parameter of the imaging device 105 can be determined without depending on the GPS positioning signal and/or the lidar sensor, thereby improving the flexibility and universality of external parameter calibration.
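As an illustration of this optimization step, the following sketch refines a rough initial extrinsic estimate against 2D-3D correspondences taken from such a map using OpenCV's PnP solver. It is a minimal sketch under stated assumptions: the correspondences between image pixels and world-coordinate map points are assumed to have been established already, and all names (`refine_extrinsics`, `map_points_3d`, etc.) are illustrative rather than taken from the disclosure.

```python
import cv2
import numpy as np

def refine_extrinsics(map_points_3d, image_points_2d, K, rvec_init, tvec_init):
    """Refine a roughly measured extrinsic estimate using 2D-3D
    correspondences between image pixels and map points expressed in
    the world coordinate system."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(map_points_3d, dtype=np.float64),    # N x 3 world coordinates
        np.asarray(image_points_2d, dtype=np.float64),  # N x 2 pixel coordinates
        K, distCoeffs=None,
        rvec=rvec_init, tvec=tvec_init,
        useExtrinsicGuess=True,          # start from the rough measurement
        flags=cv2.SOLVEPNP_ITERATIVE)    # iterative reprojection-error minimization
    if not ok:
        raise RuntimeError("extrinsic refinement failed")
    R, _ = cv2.Rodrigues(rvec)           # rotation vector -> 3x3 rotation matrix
    return R, tvec                       # world -> camera extrinsics
```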
At 220, computing device 120 determines whether the position and/or orientation of imaging device 105 in the world coordinate system has changed. In general, the camera coordinate system of the imaging device 105 is established with the optical center of the imaging device 105 as the origin and the optical axis of the imaging device 105 as the z-axis. Thus, the camera coordinate system of the imaging device 105 changes as the position and/or orientation of the imaging device 105 changes. That is, as the position and/or orientation of imaging device 105 in the world coordinate system changes, the position and/or orientation of the camera coordinate system relative to the world coordinate system also changes. This means that the external parameters of the imaging device 105 also change and therefore need to be re-determined.
Variations in the position and/or orientation of the imaging device 105 in the world coordinate system may be due to various reasons. In some example scenarios, the imaging device 105 is fixedly mounted and is intended to have its position and orientation fixed, but the position and orientation of the imaging device 105 may change due to inclement weather (e.g., wind causing shaking of the imaging device), maintenance of the mounting infrastructure (e.g., mounting poles or walls), and the like. Further, in other example scenarios, the imaging device 105 may also be configured to be movable or rotatable in the world coordinate system to perform image capture of the example environment 100 from different positions or angles.
As described above, changes in the position and/or orientation of the imaging device 105 in the world coordinate system will result in an offset of its camera coordinate system relative to the world coordinate system. In this case, the imaging device 105 needs to be recalibrated, i.e. the value of the external parameter is re-determined. In other words, if the computing device 120 detects that the position and/or orientation of the imaging device 105 in the world coordinate system has changed, i.e., the imaging device 105 is no longer at the reference position and reference orientation associated with the reference value of the extrinsic parameter, the current value of the coordinate system transformation parameter of the imaging device 105 will no longer be the reference value. Accordingly, the computing device 120 may re-determine the values of the coordinate system conversion parameters of the imaging device 105. In the context of the present disclosure, this value may also be referred to as a current value or a target value.
At 230, after determining that the position and/or orientation of the imaging device 105 in the world coordinate system has changed, the computing device 120 determines an offset of the target image captured by the imaging device 105 after the change relative to the reference image. It will be appreciated that once the position and/or orientation of the imaging device 105 changes in the world coordinate system, the target image currently captured by the imaging device 105 is captured at a position and/or orientation different from the reference position and/or orientation. Because the current camera coordinate system is shifted relative to the reference camera coordinate system associated with the reference position and orientation, an offset arises between the target image captured by the imaging device 105 and the reference image. In other words, by determining the offset between the target image and the reference image, the computing device 120 may indirectly determine the change of the current position and/or orientation of the imaging device 105 relative to the reference position and/or orientation associated with the reference values of the extrinsic parameters, i.e., the offset of the current camera coordinate system relative to the reference camera coordinate system. This is further described below in conjunction with fig. 3.
Fig. 3 shows a schematic diagram of a reference image 310 and a target image 320 of the imaging device 105 according to an embodiment of the disclosure. As shown in fig. 3, the reference image 310 captured by imaging device 105 at the reference position and reference orientation schematically includes a pole image 312 of a pole (not shown in fig. 1) in example environment 100, a space number image 317 of space number 108, and a stop line image 319 of stop line 106. Correspondingly, the target image 320 captured by imaging device 105 at the current position and current orientation schematically includes a pole image 322 of the same pole, a space number image 327 of space number 108, and a stop line image 329 of stop line 106.
As shown in fig. 3, there is an offset between target image 320 and reference image 310, because target image 320 was captured by imaging device 105 at a current position and orientation different from the reference position and/or reference orientation at which imaging device 105 captured reference image 310. This offset is reflected in the offset between the imaging of the same object in the target image 320 and in the reference image 310. For example, there are offsets between the pole image 322 and the pole image 312, between the space number image 327 and the space number image 317, and between the stop line image 329 and the stop line image 319.
Physically, the offset may be characterized by an amount of rotation, which represents the amount of rotation that occurs with respect to the reference image 310 for the target image 320, and an amount of translation, which represents the amount of translation that occurs with respect to the reference image 310 for the target image 320. Note that since the magnitude of the undesired movement of the imaging device 105 is typically small, it is assumed here that the reference image 310 and the target image 320 still capture images of substantially the same object. However, it will be appreciated that embodiments of the present disclosure are also possible in scenes where the magnitude of movement of the imaging device 105 is large, as long as the same captured object or region still exists between the reference image 310 and the target image 320.
The computing device 120 may determine the offset of the target image 320 relative to the reference image 310 in various ways. For example, the computing device 120 may simply measure, with a measurement tool, the offset of the imaging of the same object in the target image 320 relative to its imaging in the reference image 310. In other embodiments, to derive the offset of the target image 320 relative to the reference image 310, the computing device 120 may determine a matching first feature point 315 and second feature point 325 in the reference image 310 and the target image 320, respectively. In the example of fig. 3, the first feature point 315 may be a corner point of the pole image 312 and the second feature point 325 may be the corresponding corner point of the pole image 322. According to the geometric relationship between the imaged points of the same world point in different images, the matching first feature point 315 and second feature point 325 will have an epipolar constraint relationship between them. In some examples, the above-mentioned searching and matching of feature points may be performed by an existing image feature matching algorithm, such as the ORB algorithm or the SIFT algorithm.
Next, the computing device 120 may calculate the amount of rotation and the amount of translation included in the offset of the target image 320 with respect to the reference image 310 using the epipolar constraint relationship between the first feature point 315 and the second feature point 325. For example, the computing device 120 may solve the conversion relationship between the pixel coordinates of the first feature point 315 and the pixel coordinates of the second feature point 325 according to the epipolar constraint relationship, thereby deriving the above-described amounts of rotation and translation. Compared with measuring or comparing the reference image 310 and the target image 320 with a measurement tool, this use of matched feature points may improve the accuracy of the determined offset. Further, although the determination of the offset is described here with a single pair of feature points as an example, in other embodiments multiple sets of matching feature points may be determined in the reference image 310 and the target image 320, respectively, to calculate the offset, further improving the accuracy of the calculated offset.
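One possible realization of this step is sketched below with OpenCV, using the ORB algorithm mentioned above to match feature points and the essential matrix (which encodes the epipolar constraint) to recover the rotation and translation. This is a hedged sketch rather than the disclosure's prescribed implementation; SIFT or another feature matcher could be substituted.

```python
import cv2
import numpy as np

def estimate_image_offset(reference_img, target_img, K):
    """Estimate the rotation and (unit-scale) translation of the target
    image relative to the reference image from matched feature points."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(reference_img, None)
    kp2, des2 = orb.detectAndCompute(target_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # The essential matrix E encodes the epipolar constraint x2^T E x1 = 0.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    # Decompose E into rotation R and translation t; t is only known up
    # to scale, which is why the offset must be corrected afterwards.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```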
As can be seen from the imaging geometry and coordinate transformations of the imaging device 105, the amount of rotation in the offset of the target image 320 with respect to the reference image 310, as determined by the computing device 120, is exactly the rotation matrix of the current camera coordinate system of the imaging device 105 with respect to the reference camera coordinate system. The amount of translation included in the offset, however, is not exactly the translation vector of the current camera coordinate system relative to the reference camera coordinate system; the two differ by a scaling factor. For example, the translation amount obtained from the epipolar constraint relationship between the first feature point 315 and the second feature point 325 still satisfies that constraint after being multiplied by an arbitrary scaling factor. This is because the above-mentioned offset is determined from the two-dimensional reference image 310 and target image 320, whereas the transformation of the current camera coordinate system with respect to the reference camera coordinate system is a three-dimensional transformation, so information about the third dimension is absent from an offset determined from two-dimensional images. Therefore, in order to finally determine the current values of the coordinate system conversion parameters of the imaging device 105, the computing device 120 needs to correct the offset determined based on the reference image 310 and the target image 320.
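The scale ambiguity follows directly from the epipolar constraint itself (a standard identity, restated here for clarity): for matched normalized image points $x_1$ and $x_2$,

$$
x_2^\top [t]_\times R \, x_1 = 0 \;\Longrightarrow\; x_2^\top [\lambda t]_\times R \, x_1 = \lambda \, x_2^\top [t]_\times R \, x_1 = 0 \quad \text{for any } \lambda \neq 0,
$$

so the translation recovered from image correspondences alone is determined only up to the scaling factor $\lambda$.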
Referring back to fig. 2, at 240, the computing device 120 corrects the offset based on the coordinates in the world coordinate system corresponding to the reference pixel point 315 in the reference image 310. As described above, the offset determined from the reference image 310 and the target image 320 lacks one dimension of information and therefore needs to be corrected, so the computing device 120 can correct the offset using the coordinates in the world coordinate system of any pixel point in the target image 320, for example by determining the scaling factor of the translation amount included in the offset. In some embodiments, the coordinates of a pixel point in the target image 320 in the world coordinate system can be determined from the coordinates in the world coordinate system of the corresponding pixel point in the reference image 310, since the two pixel points image the same world point and thus share the same world coordinates. It will be appreciated that the computing device 120 may use any suitable method to determine the coordinates of the corresponding pixel point in the reference image 310 in the world coordinate system. For example, those coordinates may be derived with the aid of a map bearing world-coordinate information. This is described in detail below in connection with fig. 4.
Fig. 4 shows a schematic flow diagram of an example method 400 for determining coordinates of a pixel point in a target image 320 in a world coordinate system, in accordance with an embodiment of the present disclosure. In some embodiments, the method 400 may be implemented by the computing device 120 in fig. 1, e.g., may be implemented by a processor or processing unit of the computing device 120. In other embodiments, all or part of the method 400 may also be implemented by a computing device separate from the example environment 100, or may be implemented by other elements in the example environment 100. By way of example method 400, computing device 120 may efficiently determine coordinates of pixel points in target image 320 in the world coordinate system, thereby enabling correction of the offset of target image 320 relative to reference image 310.
At 410, the computing device 120 may obtain a map of the imaging area of the imaging device 105, the map having coordinates of a world coordinate system. As one example, the map may be a point cloud map made with laser radar collection points on an autonomous vehicle. The point cloud map records coordinates and corresponding reflection intensity of a reflection point formed by the object in the imaging area reflecting the detection laser in a world coordinate system. Thus, the point cloud map has coordinates of a world coordinate system, which may be used to determine coordinates of pixel points in the target image 320 in the world coordinate system. Alternatively or additionally, similar maps having coordinates of the world coordinate system also include, but are not limited to, high precision maps and reflectance value maps, among others.
At 420, computing device 120 may project the map, which has coordinates in the world coordinate system, to the pixel coordinate system in which reference image 310 is located, using the known reference values of the coordinate system conversion parameters of imaging device 105, to obtain a projection image. It will be appreciated that this projection process may involve not only the external parameters of the imaging device 105 but also other parameters, such as its internal parameters. Since the present disclosure focuses mainly on determining the external parameters of the imaging device 105, the internal parameters and other parameters may be assumed to be known. In this case, during the projection, the computing device 120 may obtain, via the external and internal parameters of the imaging device 105, a one-to-one mapping or correspondence between the pixel coordinates in the projection image and the three-dimensional coordinates in the world coordinate system.
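A minimal sketch of this projection step is given below, assuming the intrinsic matrix and the reference extrinsics are known and ignoring lens distortion; the function and variable names are illustrative.

```python
import numpy as np

def project_map_to_pixels(map_points_w, K, R_ref, t_ref):
    """Project world-coordinate map points into the pixel coordinate
    system of the reference image, keeping the pixel -> world mapping."""
    pts_cam = R_ref @ map_points_w.T + t_ref.reshape(3, 1)  # world -> camera
    in_front = pts_cam[2] > 0               # keep points in front of the camera
    pts_cam = pts_cam[:, in_front]
    pix_h = K @ pts_cam                     # camera -> homogeneous pixel coordinates
    pix = (pix_h[:2] / pix_h[2]).T          # perspective divide -> (u, v) per point
    return pix, map_points_w[in_front]      # each pixel with its world coordinates
```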
At 430, computing device 120 may determine the projected pixel point in the projection image that corresponds to reference pixel point 315. For example, computing device 120 may compare the projection image with the reference image to determine the projected pixel point corresponding to reference pixel point 315. As another example, the computing device 120 may determine the corresponding projected pixel point by a pixel point matching algorithm (e.g., the ORB algorithm or the SIFT algorithm). The corresponding projected pixel point can be considered as the pixel point formed in the projection image by the one or more object points corresponding to the reference pixel point 315, and thus the two correspond to the same coordinates in the world coordinate system. In view of this, at 440, computing device 120 may determine the coordinates of the projected pixel point in the world coordinate system as the coordinates of reference pixel point 315.
After obtaining the coordinates of the reference pixel 315 in the reference image 310 in the world coordinate system, the computing device 120 may use it to correct the offset between the target image 320 and the reference image 310. For example, as mentioned above, computing device 120 may determine target pixel 325 in target image 320 that corresponds to reference pixel 315, and the coordinates of target pixel 325 in the world coordinate system are considered to be the same as the coordinates of reference pixel 315 in the world coordinate system. Computing device 120 may then determine a scaling factor to be applied to the amount of translation in the offset based on the pixel coordinates of target pixel point 325 in target image 320 and the determined coordinates of target pixel point 325 in the world coordinate system, and based on the coordinate transformation relationship between the pixel coordinates and the coordinates in the world coordinate system. That is, the current value or the target value of the translation vector in the external parameter of the imaging device 105 is finally determined. In this manner, the computing device 120 may more accurately determine the translation vector in the external parameter of the imaging device 105, thereby improving the accuracy of the online calibration of the imaging device 105.
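One concrete way to carry out this scale determination, under the pinhole model restated earlier, is sketched below: with the world coordinates of the target pixel point known, the projection equation is linear in the unknown depth and translation scale and can be solved by least squares. The derivation and names are illustrative, not quoted from the disclosure.

```python
import numpy as np

def solve_translation_scale(pixel_uv, X_w, K, R_ref, t_ref, R_rel, t_unit):
    """Solve for the scaling factor s of the unit-scale translation t_unit.

    The current extrinsics are R_rel @ R_ref and R_rel @ t_ref + s * t_unit,
    so for depth d the projection equation reads
        d * K^-1 [u, v, 1]^T = R_rel @ (R_ref @ X_w + t_ref) + s * t_unit,
    which is linear in the unknowns d and s."""
    ray = np.linalg.inv(K) @ np.array([pixel_uv[0], pixel_uv[1], 1.0])
    b = R_rel @ (R_ref @ X_w + np.ravel(t_ref))
    A = np.column_stack([ray, -np.ravel(t_unit)])  # unknowns: [d, s]
    (d, s), *_ = np.linalg.lstsq(A, b, rcond=None)
    return s                                       # corrected translation: s * t_unit
```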
Referring back to fig. 2, at 250, the computing device 120 obtains the target values of the coordinate system conversion parameters based on the reference values of the coordinate system conversion parameters and the corrected offset. As noted above, the amount of rotation in the offset between the target image 320 and the reference image 310 represents the rotation between the two images, and the amount of translation in the offset represents the translation between the two images. Thus, the corrected offset characterizes the transformation between the current position and orientation of the imaging device 105 and the reference position and orientation, i.e., the transformation between the current camera coordinate system of the imaging device 105 and the reference camera coordinate system. Further, the conversion relationship between the reference camera coordinate system and the world coordinate system is known (i.e., the reference values of the extrinsic parameters), so the current or target values of the extrinsic parameters can be derived from the corrected offset and the reference values.
For example, the computing device 120 may determine a transformation matrix that represents the corrected offset, for example by composing the transformation matrix from the amount of rotation and the amount of translation in the corrected offset. The transformation matrix characterizes the transformation of the current position and current orientation of the imaging device 105 with respect to the reference position and reference orientation, i.e., the transformation between the current camera coordinate system and the reference camera coordinate system of the imaging device 105. The computing device 120 may then apply the transformation matrix to the reference values of the coordinate system conversion parameters to obtain the target values of the coordinate system conversion parameters. For example, the transformation matrix and the reference values of the coordinate system conversion parameters (which may likewise be represented as a matrix) may be multiplied directly to derive the target values. In this way, the computing device 120 may derive the current target values from the reference values of the external parameters of the imaging device 105 by matrix operations, as sketched below.
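A minimal sketch of this final composition follows, with illustrative names; the disclosure does not prescribe a specific matrix layout. The corrected offset and the reference extrinsics are each packed into a 4x4 homogeneous transform and multiplied.

```python
import numpy as np

def compose_target_extrinsics(R_ref, t_ref, R_rel, t_rel):
    """Apply the corrected offset (R_rel, t_rel) to the reference
    extrinsics (R_ref, t_ref) to obtain the target extrinsics."""
    def to_homogeneous(R, t):
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = np.ravel(t)
        return T

    T_ref = to_homogeneous(R_ref, t_ref)    # world -> reference camera
    T_rel = to_homogeneous(R_rel, t_rel)    # reference camera -> current camera
    T_target = T_rel @ T_ref                # world -> current camera
    return T_target[:3, :3], T_target[:3, 3]
```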
Fig. 5 shows a schematic block diagram of an apparatus 500 for determining coordinate system conversion parameters of an imaging device according to an embodiment of the present disclosure. In some embodiments, the apparatus 500 may be included in the computing device 120 of fig. 1 or implemented as the computing device 120.
As shown in fig. 5, the apparatus 500 includes a first obtaining module 510, an offset determining module 520, an offset correcting module 530, and a second obtaining module 540. The first obtaining module 510 is configured to obtain reference values of the coordinate system conversion parameters and a reference image captured by the imaging device. The reference values are determined when the imaging device is at a reference position and a reference orientation in a world coordinate system. The reference image is captured by the imaging device at a reference position and a reference orientation.
The offset determination module 520 is configured to determine an offset of the target image captured by the imaging device relative to the reference image after the change occurs in response to a change in at least one of a position and an orientation of the imaging device in the world coordinate system. The offset correction module 530 is configured to correct the offset based on the coordinates of the reference pixel points in the reference image in the world coordinate system. The second obtaining module 540 is configured to obtain target values of the coordinate system conversion parameters based on the reference values and the corrected offsets.
In some embodiments, the first obtaining module 510 may include a reference value calculation module. The reference value calculation module is configured to calculate a reference value based on at least one of a high-precision map, a point cloud map, and a reflection value map of an imaging area of the imaging device.
In some embodiments, the offset determination module 520 may include a feature point determination module and a rotation amount and translation amount calculation module. The feature point determination module is configured to determine a first feature point and a second feature point which are matched in the reference image and the target image respectively, wherein the first feature point and the second feature point have an epipolar constraint relationship. The rotation and translation amount calculation module is configured to calculate a rotation amount and a translation amount of the offset using the epipolar constraint relationship.
In some embodiments, the apparatus 500 may further include a map obtaining module, a projection module, a projected pixel point determination module, and a coordinate determination module. The map obtaining module is configured to obtain a map of an imaging area of the imaging device, the map having coordinates of a world coordinate system. The projection module is configured to project the map to the pixel coordinate system in which the reference image is located, using the reference values, to obtain a projection image. The projected pixel point determination module is configured to determine projected pixel points in the projection image that correspond to the reference pixel points. The coordinate determination module is configured to determine the coordinates of the projected pixel points in the world coordinate system as the coordinates of the reference pixel points.
In some embodiments, the offset correction module 530 may include a target pixel point determination module and a scaling factor determination module. The target pixel point determination module is configured to determine a target pixel point corresponding to the reference pixel point in the target image. The scaling factor determination module is configured to determine a scaling factor to be applied to the amount of translation in the offset based on a conversion relationship between pixel coordinates of the target pixel point and coordinates in a world coordinate system.
In some embodiments, the second obtaining module 540 may include a transformation matrix determining module and a transformation matrix applying module. The transformation matrix determination module is configured to determine a transformation matrix representing the offset. The transformation matrix application module is configured to apply a transformation matrix to the reference value to obtain the target value.
Fig. 6 schematically illustrates a block diagram of a device 600 that may be used to implement embodiments of the present disclosure. As shown in fig. 6, device 600 includes a Central Processing Unit (CPU) 601 that may perform various appropriate actions and processes in accordance with computer program instructions stored in a read-only memory device (ROM) 602 or loaded from a storage unit 608 into a random access memory device (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 can also be stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The various processes and processing described above, such as the example methods 200 and 400, may be performed by the processing unit 601. For example, in some embodiments, the example methods 200 and 400 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into RAM 603 and executed by CPU 601, one or more steps of the example methods 200 and 400 described above may be performed.
As used herein, the terms "comprises," "comprising," and the like are to be construed as open-ended inclusions, i.e., "including, but not limited to." The term "based on" should be understood as "based at least in part on." The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment." The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions may also be included herein.
As used herein, the term "determining" encompasses a wide variety of actions. For example, "determining" can include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Further, "determining" can include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like. Further, "determining" may include resolving, selecting, choosing, establishing, and the like.
It should be noted that the embodiments of the present disclosure can be realized by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor, or by specially designed hardware. Those skilled in the art will appreciate that the apparatus and methods described above may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided, for example, in a programmable memory or on a data carrier such as an optical or electronic signal carrier.
Further, while the operations of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one, and/or one step may be broken down into multiple steps. It should also be noted that the features and functions of two or more devices according to the present disclosure may be embodied in one device. Conversely, the features and functions of one device described above may be further divided so as to be embodied by a plurality of devices.
While the present disclosure has been described with reference to several particular embodiments, it is to be understood that the disclosure is not limited to the particular embodiments disclosed. The disclosure is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (14)

1. A method of determining coordinate system conversion parameters of an imaging device, comprising:
obtaining a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference value being determined when the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device when at the reference position and the reference orientation;
in response to a change in at least one of a position and an orientation of the imaging device in the world coordinate system, determining an offset, relative to the reference image, of a target image captured by the imaging device after the change occurs;
correcting the offset based on coordinates, in the world coordinate system, corresponding to a reference pixel point in the reference image; and
obtaining a target value of the coordinate system conversion parameters based on the reference value and the corrected offset.
2. The method of claim 1, wherein obtaining the reference value comprises:
calculating the reference value based on at least one of a high-precision map, a point cloud map, and a reflectance value map of an imaging area of the imaging device.
3. The method of claim 1, wherein determining the offset comprises:
determining a first feature point in the reference image and a matching second feature point in the target image, the first feature point and the second feature point having an epipolar constraint relationship therebetween; and
calculating a rotation amount and a translation amount of the offset by using the epipolar constraint relationship.
4. The method of claim 1, further comprising:
obtaining a map of an imaging area of the imaging device, the map having coordinates in the world coordinate system;
projecting, by using the reference value, the map to a pixel coordinate system in which the reference image is located to obtain a projection image;
determining a projection pixel point corresponding to the reference pixel point in the projection image; and
determining coordinates of the projection pixel point in the world coordinate system as the coordinates of the reference pixel point.
5. The method of claim 1, wherein correcting the offset comprises:
determining a target pixel point corresponding to the reference pixel point in the target image; and
determining a scaling factor to be applied to the amount of translation in the offset based on a conversion relationship between pixel coordinates of the target pixel point and coordinates in the world coordinate system.
6. The method of claim 1, wherein obtaining the target value comprises:
determining a transformation matrix representing the offset; and
applying the transformation matrix to the reference value to obtain the target value.
7. An apparatus for determining coordinate system conversion parameters of an imaging device, comprising:
a first obtaining module configured to obtain a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference value being determined when the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device when at the reference position and the reference orientation;
an offset determination module configured to determine, in response to a change in at least one of a position and an orientation of the imaging device in the world coordinate system, an offset, relative to the reference image, of a target image captured by the imaging device after the change occurs;
an offset correction module configured to correct the offset based on coordinates, in the world coordinate system, corresponding to a reference pixel point in the reference image; and
a second obtaining module configured to obtain a target value of the coordinate system conversion parameters based on the reference value and the corrected offset.
8. The apparatus of claim 7, wherein the first obtaining module comprises:
a reference value calculation module configured to calculate the reference value based on at least one of a high-precision map, a point cloud map, and a reflectance value map of an imaging area of the imaging device.
9. The apparatus of claim 7, wherein the offset determination module comprises:
a feature point determination module configured to determine a first feature point in the reference image and a matching second feature point in the target image, the first feature point and the second feature point having an epipolar constraint relationship therebetween; and
a rotation and translation amount calculation module configured to calculate a rotation amount and a translation amount of the offset using the epipolar constraint relationship.
10. The apparatus of claim 7, further comprising:
a map obtaining module configured to obtain a map of an imaging area of the imaging device, the map having coordinates in the world coordinate system;
a projection module configured to project the map to a pixel coordinate system in which the reference image is located by using the reference value to obtain a projection image;
a projection pixel point determination module configured to determine a projection pixel point corresponding to the reference pixel point in the projection image; and
a coordinate determination module configured to determine coordinates of the projection pixel point in the world coordinate system as the coordinates of the reference pixel point.
11. The apparatus of claim 7, wherein the offset correction module comprises:
a target pixel point determining module configured to determine a target pixel point corresponding to the reference pixel point in the target image; and
a scaling factor determination module configured to determine a scaling factor to be applied to the amount of translation in the offset based on a conversion relationship between pixel coordinates of the target pixel point and coordinates in the world coordinate system.
12. The apparatus of claim 7, wherein the second obtaining module comprises:
a transformation matrix determination module configured to determine a transformation matrix representing the offset; and
a transformation matrix application module configured to apply the transformation matrix to the reference value to obtain the target value.
13. An electronic device, comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of claims 1-6.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201910423326.8A 2019-05-21 2019-05-21 Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium Active CN110146869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910423326.8A CN110146869B (en) 2019-05-21 2019-05-21 Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910423326.8A CN110146869B (en) 2019-05-21 2019-05-21 Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110146869A CN110146869A (en) 2019-08-20
CN110146869B (en) 2021-08-10

Family

ID=67592598

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910423326.8A Active CN110146869B (en) 2019-05-21 2019-05-21 Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110146869B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110788858B (en) * 2019-10-23 2023-06-13 武汉库柏特科技有限公司 Object position correction method based on image, intelligent robot and position correction system
WO2021081993A1 (en) * 2019-11-01 2021-05-06 深圳市大疆创新科技有限公司 Data storage and processing methods, related devices, and storage medium
CN112819896B (en) * 2019-11-18 2024-03-08 商汤集团有限公司 Sensor calibration method and device, storage medium and calibration system
US11461929B2 (en) * 2019-11-28 2022-10-04 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for automated calibration
CN111192308B (en) * 2019-12-31 2023-11-03 浙江商汤科技开发有限公司 Image processing method and device, electronic equipment and computer storage medium
CN113129382B (en) * 2019-12-31 2024-06-14 华为云计算技术有限公司 Method and device for determining coordinate conversion parameters
CN113311422A (en) * 2020-02-27 2021-08-27 富士通株式会社 Coordinate conversion method and device and data processing equipment
CN111323751B (en) * 2020-03-25 2022-08-02 苏州科达科技股份有限公司 Sound source positioning method, device and storage medium
CN111311743B (en) * 2020-03-27 2023-04-07 北京百度网讯科技有限公司 Three-dimensional reconstruction precision testing method and device and electronic equipment
CN111460071B (en) * 2020-03-31 2023-09-26 北京百度网讯科技有限公司 Deflection method, deflection device, deflection equipment and readable storage medium for high-precision map
CN111477013B (en) * 2020-04-01 2021-06-25 清华大学苏州汽车研究院(吴江) Vehicle measuring method based on map image
CN111681281B (en) * 2020-04-16 2023-05-09 北京诺亦腾科技有限公司 Calibration method and device for limb motion capture, electronic equipment and storage medium
CN111741214A (en) * 2020-05-13 2020-10-02 北京迈格威科技有限公司 Image processing method and device and electronic equipment
CN111612852B (en) * 2020-05-20 2023-06-09 阿波罗智联(北京)科技有限公司 Method and apparatus for verifying camera parameters
CN111578839B (en) * 2020-05-25 2022-09-20 阿波罗智联(北京)科技有限公司 Obstacle coordinate processing method and device, electronic equipment and readable storage medium
CN111832642A (en) * 2020-07-07 2020-10-27 杭州电子科技大学 Image identification method based on VGG16 in insect taxonomy
CN111914048B (en) * 2020-07-29 2024-01-05 北京天睿空间科技股份有限公司 Automatic generation method for corresponding points of longitude and latitude coordinates and image coordinates
CN112150542B (en) * 2020-09-24 2023-02-24 上海联影医疗科技股份有限公司 Method and device for measuring radiation field, electronic equipment and storage medium
CN113420581A (en) * 2020-10-19 2021-09-21 杨宏伟 Correction method and device for written document image, electronic equipment and readable medium
CN112509058B (en) * 2020-11-30 2023-08-22 北京百度网讯科技有限公司 External parameter calculating method, device, electronic equipment and storage medium
CN112560769B (en) * 2020-12-25 2023-08-29 阿波罗智联(北京)科技有限公司 Method for detecting obstacle, electronic device, road side device and cloud control platform
CN112924955B (en) * 2021-01-29 2022-12-16 同济大学 Roadside laser radar point cloud coordinate dynamic correction method
CN113329181B (en) * 2021-06-08 2022-06-14 厦门四信通信科技有限公司 Angle switching method, device, equipment and storage medium of camera
CN113777589B (en) * 2021-08-18 2024-04-02 北京踏歌智行科技有限公司 LIDAR and GPS/IMU combined calibration method based on point characteristics
CN113822943B (en) * 2021-09-17 2024-06-11 中汽创智科技有限公司 External parameter calibration method, device and system of camera and storage medium
CN114347917B (en) * 2021-12-28 2023-11-10 华人运通(江苏)技术有限公司 Calibration method and device for vehicle and vehicle-mounted camera system
TWI811954B (en) * 2022-01-13 2023-08-11 緯創資通股份有限公司 Positioning system and calibration method of object location

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004236199A (en) * 2003-01-31 2004-08-19 Canon Inc Image processor and image processing method
CN101699313B (en) * 2009-09-30 2012-08-22 北京理工大学 Method and system for calibrating external parameters based on camera and three-dimensional laser radar
CN101839692B (en) * 2010-05-27 2012-09-05 西安交通大学 Method for measuring three-dimensional position and stance of object with single camera
CN103871071B (en) * 2014-04-08 2018-04-24 北京经纬恒润科技有限公司 Join scaling method outside a kind of camera for panoramic parking system
CN103997637A (en) * 2014-05-30 2014-08-20 天津大学 Correcting method of multi-view-point images of parallel camera array
CN108876826B (en) * 2017-05-10 2021-09-21 深圳先进技术研究院 Image matching method and system
CN107170010A (en) * 2017-05-11 2017-09-15 四川大学 System calibration method, device and three-dimensional reconstruction system
CN107481292B (en) * 2017-09-05 2020-07-28 百度在线网络技术(北京)有限公司 Attitude error estimation method and device for vehicle-mounted camera
CN108198219B (en) * 2017-11-21 2022-05-13 合肥工业大学 Error compensation method for camera calibration parameters for photogrammetry
CN108052910A (en) * 2017-12-19 2018-05-18 深圳市保千里电子有限公司 A kind of automatic adjusting method, device and the storage medium of vehicle panoramic imaging system
CN108535753A (en) * 2018-03-30 2018-09-14 北京百度网讯科技有限公司 Vehicle positioning method, device and equipment
CN108765498B (en) * 2018-05-30 2019-08-23 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
CN109087382A (en) * 2018-08-01 2018-12-25 宁波发睿泰科智能科技有限公司 A kind of three-dimensional reconstruction method and 3-D imaging system
CN109186616B (en) * 2018-09-20 2020-04-07 禾多科技(北京)有限公司 Lane line auxiliary positioning method based on high-precision map and scene retrieval
CN109685855B (en) * 2018-12-05 2022-10-14 长安大学 Camera calibration optimization method under road cloud monitoring platform

Also Published As

Publication number Publication date
CN110146869A (en) 2019-08-20

Similar Documents

Publication Publication Date Title
CN110146869B (en) Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN110378965B (en) Method, device and equipment for determining coordinate system conversion parameters of road side imaging equipment
EP3751519B1 (en) Method, apparatus, device and medium for calibrating pose relationship between vehicle sensor and vehicle
US10659677B2 (en) Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN113657224B (en) Method, device and equipment for determining object state in vehicle-road coordination
US20220371602A1 (en) Vehicle positioning method, apparatus, and controller, intelligent vehicle, and system
US10909395B2 (en) Object detection apparatus
CN112419385B (en) 3D depth information estimation method and device and computer equipment
CN110766760B (en) Method, device, equipment and storage medium for camera calibration
KR101880185B1 (en) Electronic apparatus for estimating pose of moving object and method thereof
KR102006291B1 (en) Method for estimating pose of moving object of electronic apparatus
CN110766761B (en) Method, apparatus, device and storage medium for camera calibration
WO2022183685A1 (en) Target detection method, electronic medium and computer storage medium
CN110751693B (en) Method, apparatus, device and storage medium for camera calibration
JP4132068B2 (en) Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus
CN112946609B (en) Calibration method, device and equipment for laser radar and camera and readable storage medium
US20230027622A1 (en) Automated real-time calibration
CN110728720B (en) Method, apparatus, device and storage medium for camera calibration
CN114494466A (en) External parameter calibration method, device and equipment and storage medium
CN113312403B (en) Map acquisition method and device, electronic equipment and storage medium
CN115523929B (en) SLAM-based vehicle-mounted integrated navigation method, device, equipment and medium
US20240112363A1 (en) Position estimation system, position estimation method, and program
US12033400B2 (en) Overhead-view image generation device, overhead-view image generation system, and automatic parking device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20211011
Address after: 105 / F, building 1, No. 10, Shangdi 10th Street, Haidian District, Beijing 100085
Patentee after: Apollo Intelligent Technology (Beijing) Co.,Ltd.
Address before: 100094 2 / F, baidu building, No.10 Shangdi 10th Street, Haidian District, Beijing
Patentee before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.