CN110728720B - Method, apparatus, device and storage medium for camera calibration - Google Patents


Info

Publication number
CN110728720B
CN110728720B
Authority
CN
China
Prior art keywords
point
points
camera
dimensional image
module configured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911001905.XA
Other languages
Chinese (zh)
Other versions
CN110728720A (en)
Inventor
时一峰
贾金让
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Intelligent Technology Beijing Co Ltd
Original Assignee
Apollo Intelligent Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Intelligent Technology Beijing Co Ltd filed Critical Apollo Intelligent Technology Beijing Co Ltd
Priority to CN201911001905.XA
Publication of CN110728720A
Application granted
Publication of CN110728720B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Abstract

Embodiments of the present disclosure provide methods, apparatuses, devices, and computer-readable storage media for calibrating the coordinate-system conversion parameters of an imaging device, which may be used for autonomous driving. The method comprises the following steps: determining a first set of points corresponding to a predetermined reference line from a two-dimensional image captured by a camera; determining a second set of points from the two-dimensional image based on the position information of the reference line in a three-dimensional map; determining a first point from the first set of points, the first point corresponding to a second point in the second set of points, and the first point being less than a predetermined distance threshold from the second point; and determining an external parameter of the camera based at least on the first point and the second point, the external parameter indicating a conversion relationship between the camera coordinate system and the world coordinate system. In this way, interference from inaccurately matched points can be reduced, and the external parameters of the camera can be calibrated more accurately.

Description

Method, apparatus, device and storage medium for camera calibration
Technical Field
Embodiments of the present disclosure relate generally to the field of computer technology, may be used for autonomous driving, and relate more particularly to methods, apparatuses, devices, and computer-readable storage media for camera calibration.
Background
In recent years, autonomous driving technology has developed increasingly rapidly. The basis of autonomous driving is perception of the vehicle's surroundings, i.e., recognition of the specific conditions of the environment around the vehicle. It has been proposed that, in addition to environmental perception with on-board sensor devices (e.g., an on-board lidar or an on-board camera), environmental information about a vehicle can be acquired by off-board sensor devices (e.g., a roadside-mounted camera) to better support autonomous driving. However, for various reasons, the mounting position of a roadside-mounted camera may jitter or drift relative to its initial mounting position, thereby affecting the accuracy of the vehicle or obstacle positions determined, for example, from the image data captured by the roadside camera. Such position errors may be unacceptable for autonomous driving.
Disclosure of Invention
According to an embodiment of the present disclosure, a scheme for camera calibration is provided.
In a first aspect of the present disclosure, a method for camera calibration is provided. The method comprises the following steps: determining a first set of points corresponding to a predetermined reference line from a two-dimensional image captured by a camera; determining a second point set from the two-dimensional image based on the position information of the reference line in the three-dimensional map; determining a first point from the first set of points, the first point corresponding to a second point in the second set of points, and the first point being less than a predetermined distance threshold from the second point; and determining an external parameter of the camera based at least on the first point and the second point, the external parameter indicating a conversion relationship of the camera coordinate system and the world coordinate system.
In a second aspect of the present disclosure, an apparatus for camera calibration is provided. The apparatus comprises: a first point set determination module configured to determine a first point set corresponding to a predetermined reference line from a two-dimensional image captured by a camera; a second point set determination module configured to determine a second point set from the two-dimensional image based on the position information of the reference line in the three-dimensional map; a first point determination module configured to determine a first point from the first set of points, the first point corresponding to a second point in the second set of points, and the distance of the first point from the second point being less than a predetermined distance threshold; and an external parameter determination module configured to determine an external parameter of the camera based at least on the first point and the second point, the external parameter indicating a conversion relationship of the camera coordinate system and the world coordinate system.
In a third aspect of the present disclosure, an electronic device is provided that includes one or more processors; and storage means for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement a method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor implements a method according to the first aspect of the present disclosure.
It should be understood that this Summary is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which like or similar reference numerals designate like or similar elements:
FIG. 1 illustrates a schematic diagram of an example environment in which various embodiments of the present disclosure may be implemented;
FIG. 2 illustrates a flow chart of a method for camera calibration according to some embodiments of the present disclosure;
FIG. 3 illustrates a flowchart of an example method for determining a first set of points, according to some embodiments of the present disclosure;
FIG. 4 shows a schematic diagram of projecting three-dimensional coordinate points onto a two-dimensional image;
FIG. 5 illustrates a flowchart of an example method for determining external parameters of a camera, according to some embodiments of the present disclosure;
FIG. 6 illustrates a flowchart of an example method for determining candidate extrinsic parameters according to some embodiments of the present disclosure;
FIG. 7 illustrates a schematic block diagram of an apparatus for determining external parameters of a camera according to some embodiments of the present disclosure; and
FIG. 8 illustrates a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the understanding of the present disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
In describing embodiments of the present disclosure, the term "comprising" and its like should be taken to be open-ended, i.e., including, but not limited to. The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first," "second," and the like, may refer to different or the same object. Other explicit and implicit definitions are also possible below.
As used herein, the term "external parameters of the camera" may be, for example, parameters required to convert between a camera coordinate system and a world coordinate system, such as a translation matrix, a rotation matrix, and so forth. The term "internal parameters of the camera" may for example be parameters required for conversion between the image coordinate system and/or the pixel coordinate system and the camera coordinate system, such as a translation matrix, a rotation matrix, etc. "calibrating the external parameters of the camera" may refer to the determination of conversion parameters between the camera coordinate system and the world coordinate system.
In the context of the present disclosure, the world coordinate system may refer to a reference coordinate system covering a global scope, which may be used, for example, to assist in automatic driving or autonomous parking of a vehicle; examples include the UTM coordinate system and the latitude-longitude coordinate system. The origin of the camera coordinate system may be located at the optical center of the imaging device, its z-axis may coincide with the optical axis of the imaging device, and its x-axis and y-axis may be parallel to the imaging plane. The origin of the pixel coordinate system may be at the upper-left corner of the image, its horizontal and vertical axes may run along the pixel rows and columns of the image, and its unit may be the pixel. The origin of the image coordinate system may be at the center of the image (i.e., the midpoint of the pixel coordinate system), its horizontal and vertical axes may be parallel to those of the pixel coordinate system, and its unit may be the millimeter. However, it will be appreciated that in other embodiments, these coordinate systems may be defined in other reasonable ways as accepted in the art.
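As a minimal illustration of the coordinate-system chain described above, the following Python sketch projects a world point into the pixel coordinate system; the intrinsic matrix K and the extrinsics R and t are illustrative placeholder values, not parameters taken from this disclosure.

```python
import numpy as np

# Illustrative values only: a 1080p camera with ~1000 px focal length.
K = np.array([[1000.0,    0.0, 960.0],   # [fx,  0, cx]
              [   0.0, 1000.0, 540.0],   # [ 0, fy, cy]
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                   # rotation: world frame -> camera frame
t = np.array([0.0, 0.0, 5.0])   # translation: world frame -> camera frame

def world_to_pixel(p_world: np.ndarray) -> np.ndarray:
    """World coordinates -> camera coordinates -> pixel coordinates."""
    p_cam = R @ p_world + t          # apply the external parameters
    u, v, w = K @ p_cam              # apply the internal parameters
    return np.array([u / w, v / w])  # perspective division

print(world_to_pixel(np.array([1.0, 0.5, 10.0])))  # pixel (u, v)
```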
As mentioned previously, for various reasons, the mounting position of a roadside-mounted camera may jitter or drift relative to its initial mounting position, thereby affecting the accuracy of the vehicle or obstacle positions determined, for example, from the image data captured by the roadside camera.
According to various embodiments of the present disclosure, a scheme for camera calibration is provided. In an embodiment of the present disclosure, a first set of points corresponding to a predetermined reference line is determined from a two-dimensional image captured by a camera; a second set of points is determined from the two-dimensional image based on the position information of the reference line in a three-dimensional map; a first point is determined from the first set of points, the first point corresponding to a second point in the second set of points, and the first point being less than a predetermined distance threshold from the second point; and an external parameter of the camera is determined based at least on the first point and the second point, the external parameter indicating a conversion relationship between the camera coordinate system and the world coordinate system. By means of the distance-threshold constraint, interference caused by inaccurately matched points can be reduced, so that the external parameters of the camera can be calibrated more accurately.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which various embodiments of the present disclosure may be implemented. In this example environment 100, a number of typical objects are schematically illustrated, including a road 102 and a vehicle 110 traveling on the road 102. As shown in fig. 1, the road 102 includes, for example, a stop marker line 115-1 and a lane marker line 115-2 (individually or collectively referred to as marker lines 115); in addition, a camera 105 for sensing environmental information of the road 102 is included in the environment 100. It should be understood that these illustrated facilities and objects are examples only; the objects present in different traffic environments will vary with the actual situation. The scope of the present disclosure is not limited in this respect.
Vehicle 110 may be any type of vehicle that may carry a person and/or object and that is moved by a power system such as an engine, including, but not limited to, a car, truck, bus, electric vehicle, motorcycle, caravan, train, and the like. One or more vehicles 110 in environment 100 may be vehicles with certain autopilot capabilities, such vehicles also being referred to as unmanned vehicles. Of course, another vehicle or vehicles 110 in the environment 100 may also be vehicles that do not have autopilot capability.
In some embodiments, the camera 105 may be disposed above the roadway 102. In some embodiments, the cameras 105 may also be arranged on both sides of the road 102, for example. As shown in fig. 1, camera 105 may be communicatively coupled to computing device 120. Although shown as a separate entity, the computing device 120 may be embedded in the camera 105. The computing device 120 may also be an entity external to the camera 105 and may communicate with the camera 105 via a wireless network. Computing device 120 may be implemented as one or more computing devices that include at least processors, memory, and other components typically found in general purpose computers to perform computing, storage, communication, control, etc. functions.
In some embodiments, the camera 105 may acquire environmental information (e.g., lane line information, road boundary information, or obstacle information) related to the road 102 and transmit the environmental information to the vehicle 110 for use in its driving decisions. In some embodiments, the camera 105 may also determine the location of the vehicle 110 based on the camera's external and internal parameters and the captured image of the vehicle 110, and send that location to the vehicle 110 to enable positioning of the vehicle 110. It can be seen that accurate internal and external camera parameters are necessary, whether for obtaining accurate environmental information or for determining accurate location information.
The process of camera calibration according to an embodiment of the present disclosure will be described below in connection with fig. 2 to 5. FIG. 2 illustrates a flow chart of a method 200 for camera calibration according to an embodiment of the present disclosure. The method 200 may be performed, for example, by the computing device 120 shown in fig. 1.
As shown in fig. 2, at block 202, the computing device 120 determines a first set of points corresponding to a predetermined reference line from a two-dimensional image captured by the camera 105. In some embodiments, the reference line may be, for example, two lines orthogonal in the environment, such as the lane marker line 115-2 and the stop marker line 115-1 of the road 102 shown in FIG. 1. In some embodiments, the reference line may also be, for example, a special marking line, such as one or more sets of intersecting lines, painted on the road 102 for calibration purposes. In some embodiments, the reference line may also comprise only one line when the locations of at least two feature points in the world coordinate system and the image coordinate system are known.
In some embodiments, the computing device 120 may determine the first set of points corresponding to the reference line from the two-dimensional image captured by the camera 105 through image recognition techniques. The specific process of block 202 will be described below with reference to fig. 3. FIG. 3 illustrates a flowchart of a process 202 of determining a first set of points, according to an embodiment of the present disclosure.
As shown in fig. 3, at block 302, the computing device 120 may obtain a mask image of the two-dimensional image. According to some embodiments of the present disclosure, the computing device 120 may acquire a two-dimensional image captured by the camera 105 to be calibrated and derive from it a mask image in which points located on the reference line and points located outside the reference line are identified differently. Taking fig. 1 as an example, the computing device 120 may detect the marker lines 115 (the stop marker line 115-1 and the lane marker line 115-2) using a marker-line detection model, and may mark points determined to belong to the stop marker line 115-1 or the lane marker line 115-2 as white and all other points as black, thereby forming the mask image.
For example, fig. 4 shows a schematic diagram 400 of projecting three-dimensional positions onto a two-dimensional image, in which the stop marker line 115-1 and the lane marker line 115-2 are shown as hatched areas.
In some embodiments, the computing device 120 may perform an internal parameter calibration of the camera 105 before acquiring the two-dimensional image. The internal parameters are parameters related to the characteristics of the imaging device itself; taking a camera as an example, they include the focal length, the pixel size, and the like. In some embodiments, the two-dimensional image may be an image that has undergone distortion correction. In some embodiments, the two-dimensional image may be an image obtained after both internal parameter calibration and distortion correction. This improves the accuracy of the external parameter calibration of the camera.
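By way of a hedged sketch, the distortion-correction step could be carried out with OpenCV as below; the camera matrix, distortion coefficients, and file names are hypothetical stand-ins for the results of a prior internal parameter calibration.

```python
import cv2
import numpy as np

# Hypothetical intrinsics and distortion coefficients (k1, k2, p1, p2, k3)
# from an earlier internal calibration of the camera.
camera_matrix = np.array([[1000.0,    0.0, 960.0],
                          [   0.0, 1000.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.30, 0.12, 0.0, 0.0, 0.0])

frame = cv2.imread("roadside_frame.png")  # hypothetical captured image
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("roadside_frame_undistorted.png", undistorted)
```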
At block 304, the computing device 120 may determine, from the mask image, a centerline of the area corresponding to the points located on the reference line. In some embodiments, the computing device 120 may determine the centerline of the region marked as the marker line 115, for example, using a skeleton extraction model.
At block 306, the computing device 120 may determine the first set of points based on the centerline. In some embodiments, the computing device 120 may sample the determined centerline to obtain a plurality of points on the centerline, which form the first set of points. For example, as shown in fig. 4, the computing device 120 may determine a plurality of points 405-1 and 405-2 (individually or collectively referred to as points 405, shown as solid black points in fig. 4) corresponding to the marker lines 115; the identified plurality of points 405 constitute the first set of points.
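One plausible realization of blocks 302-306 is sketched below; `mask` is assumed to be the binary output of a marker-line detection model (nonzero on the marker lines), and the skeletonization and fixed-step sampling are illustrative choices rather than the specific models of this disclosure.

```python
import numpy as np
from skimage.morphology import skeletonize

def first_point_set(mask: np.ndarray, step: int = 5) -> np.ndarray:
    """Thin the marker-line mask to its centerline and sample it into points."""
    skeleton = skeletonize(mask > 0)  # one-pixel-wide centerline (block 304)
    ys, xs = np.nonzero(skeleton)     # all centerline pixels
    points = np.stack([xs, ys], axis=1).astype(np.float64)  # (u, v) pairs
    return points[::step]             # subsample into the first point set (block 306)
```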
With continued reference to fig. 2, at block 204, the computing device 120 determines a second set of points from the two-dimensional image based on the position information of the reference line in the three-dimensional map. In some embodiments, the three-dimensional map may be generated from information related to the environment 100 collected by a map data acquisition vehicle. For example, for a scene without a GPS signal, the acquisition vehicle can be driven into the scene from an outdoor position that has a GPS signal using a simultaneous localization and mapping (SLAM) method; road environment information is acquired with a vehicle-mounted lidar, a camera, and a surround-view image acquisition system, and the acquired data are then recognized, fused, and superimposed to generate the three-dimensional map. It should be appreciated that the three-dimensional map may be generated in any other suitable manner, and the present application does not limit the manner in which the three-dimensional map is generated.
According to some embodiments of the present disclosure, the computing device 120 may determine the position information corresponding to the reference line from the three-dimensional map. For example, the computing device 120 may determine the position information of the stop marker line 115-1 and the lane marker line 115-2 in the three-dimensional map; such position information may be represented, for example, as a set of three-dimensional coordinate points.
In some embodiments, the computing device 120 may obtain initial external parameters of the camera 105. In some embodiments, the initial external parameters may be, for example, those determined at the time of installation of the camera 105, which indicate at least the position and angle of the camera 105 in the world coordinate system. In some embodiments, the initial external parameters may also be, for example, the external parameters determined the last time the camera 105 was calibrated.
In some embodiments, the computing device 120 may determine the second set of points in the two-dimensional image based on the initial extrinsic parameters and the location information of the reference line in the three-dimensional map. In some embodiments, the computing device 120 may project a set of three-dimensional coordinate points corresponding to the location information into an image coordinate system or pixel coordinate system corresponding to the two-dimensional image based on the initial external parameters and the internal parameters known to the camera 105, thereby obtaining the second set of points.
For example, as shown in fig. 4, based on the position information of the reference line (e.g., the marker lines 115) in the three-dimensional map, the computing device 120 may determine projected points 410-1 and 410-2 (individually or collectively referred to as projected points 410, shown as open points in fig. 4) of the set of three-dimensional coordinate points corresponding to the position information. The set of projected points 410 forms the second set of points. It should be appreciated that the second set of points may, for example, include only points that fall within the bounds of the two-dimensional image; because the lane marker line 115-2 extends a long distance in the three-dimensional map, some of its points may project outside the two-dimensional image, and such points are not added to the second set of points, as sketched below.
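A sketch of this projection step follows, assuming OpenCV is available; `rvec` and `tvec` stand for the initial external parameters (a Rodrigues rotation vector plus a translation), and `K` and `dist` for the known internal parameters — all names here are illustrative.

```python
import cv2
import numpy as np

def second_point_set(map_points_3d, rvec, tvec, K, dist, img_w, img_h):
    """Project the reference line's 3-D map points into the image and keep
    only projections that fall within the two-dimensional image."""
    projected, _ = cv2.projectPoints(map_points_3d.astype(np.float64),
                                     rvec, tvec, K, dist)
    projected = projected.reshape(-1, 2)
    inside = ((projected[:, 0] >= 0) & (projected[:, 0] < img_w) &
              (projected[:, 1] >= 0) & (projected[:, 1] < img_h))
    return projected[inside]  # out-of-image projections are discarded
```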
With continued reference to fig. 2, at block 206, the computing device 120 determines a first point from a first set of points, wherein the first point corresponds to a second point in a second set of points and the distance of the first point from the second point is less than a predetermined distance threshold. In particular, the computing device 120 may determine a first point corresponding to each second point in the second set of points from the first set of points, wherein the first point represents a point in the first set of points that is closest to the second point and the distance of the first point from the second point is less than a predetermined distance threshold.
Taking fig. 4 as an example, for point 410-1 in the second set, the point closest to it in the first set is point 405-1; since their distance is, for example, less than the predetermined distance threshold, point 405-1 may be determined to be the first point and point 410-1 to be the second point. Conversely, for point 410-2 in the second set, the closest point in the first set is point 405-2, but because the distance between these two points is, for example, greater than the predetermined distance threshold, the pair formed by point 410-2 and point 405-2 will not be considered in the calculation. In this manner, the computing device 120 may determine all qualifying point pairs between the first set of points and the second set of points, for example as sketched below.
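One way to realize this thresholded nearest-neighbor matching is with a k-d tree; the sketch below assumes both point sets are given as N x 2 pixel-coordinate arrays.

```python
import numpy as np
from scipy.spatial import cKDTree

def match_points(first_set, second_set, dist_threshold):
    """For each projected (second-set) point, find the nearest detected
    (first-set) point and keep the pair only if it is close enough."""
    tree = cKDTree(first_set)
    dists, idx = tree.query(second_set)  # nearest neighbor per projected point
    keep = dists < dist_threshold        # drops pairs like (405-2, 410-2) above
    return first_set[idx[keep]], second_set[keep], dists[keep]
```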
With continued reference to fig. 2, at block 208, the computing device 120 determines an external parameter of the camera based at least on the first point and the second point, wherein the external parameter indicates a conversion relationship of the camera coordinate system to the world coordinate system. The computing device 120 may determine external parameters of the camera 105 based on the matching of the first point and the second point. The specific process of block 208 will be described below with reference to fig. 5. Fig. 5 shows a flowchart of a process of determining an external parameter according to an embodiment of the present disclosure.
As shown in fig. 5, at block 502, computing device 120 may determine candidate extrinsic parameters based on a distance between a first point and a second point. The process of 502 will be described below in connection with fig. 6, which shows a flowchart of a process of determining candidate extrinsic parameters according to an embodiment of the present disclosure.
As shown in fig. 6, at block 602, the computing device 120 may determine a distance between the first set of points and the second set of points based on the distances between the first points and the second points. In some embodiments, the computing device 120 may, for example, take the sum of the distances over all first-point/second-point pairs satisfying the condition of block 206 as the distance between the two point sets. In some embodiments, the computing device 120 may instead take the average of those distances as the distance between the first and second point sets.
At block 604, the computing device 120 may adjust the initial external parameters of the camera 105 based on a determination that the distance between the first set of points and the second set of points is greater than a predetermined threshold. In some embodiments, the computing device 120 may adjust the initial external parameters of the camera 105 based on a minimum re-projection error method. In particular, the computing device 120 may determine a Jacobian matrix of the distance with respect to the external parameters; for example, the Jacobian matrix may be represented as:
$$\frac{\partial e}{\partial \delta\xi} = -\begin{bmatrix} \frac{f_x}{Z'} & 0 & -\frac{f_x X'}{Z'^2} & -\frac{f_x X' Y'}{Z'^2} & f_x + \frac{f_x X'^2}{Z'^2} & -\frac{f_x Y'}{Z'} \\ 0 & \frac{f_y}{Z'} & -\frac{f_y Y'}{Z'^2} & -f_y - \frac{f_y Y'^2}{Z'^2} & \frac{f_y X' Y'}{Z'^2} & \frac{f_y X'}{Z'} \end{bmatrix}$$

where $e$ represents the distance between a projection point in the second point set and its corresponding adjacent point, $\delta\xi$ represents the pose expressed in Lie algebra, $X, Y, Z$ represent the coordinates of the projection point in the world coordinate system, $X', Y', Z'$ represent its position in the camera coordinate system after the pose transformation, and $f_x, f_y$ represent internal parameters of the camera 105. It can be seen that the Jacobian matrix gives the derivative of the distance with respect to the pose. The computing device 120 may further adjust the initial external parameters of the camera 105 based on the determined Jacobian matrix.
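To illustrate how such a Jacobian drives the adjustment, the sketch below performs a single Gauss-Newton step; the 2x6 row blocks follow the standard re-projection-error form given above, and the residual convention (observed minus projected) is an assumption.

```python
import numpy as np

def jacobian_block(p_cam, fx, fy):
    """2x6 Jacobian of one point's re-projection error with respect to the
    Lie-algebra pose increment, for (X', Y', Z') in the camera frame."""
    X, Y, Z = p_cam
    return -np.array([
        [fx / Z, 0.0, -fx * X / Z**2,
         -fx * X * Y / Z**2, fx + fx * X**2 / Z**2, -fx * Y / Z],
        [0.0, fy / Z, -fy * Y / Z**2,
         -fy - fy * Y**2 / Z**2, fy * X * Y / Z**2, fy * X / Z],
    ])

def gauss_newton_step(cam_points, residuals, fx, fy):
    """Solve the normal equations J^T J dxi = -J^T e for the 6-DoF pose update."""
    J = np.vstack([jacobian_block(p, fx, fy) for p in cam_points])  # (2N, 6)
    e = np.asarray(residuals).reshape(-1)                           # (2N,)
    dxi, *_ = np.linalg.lstsq(J, -e, rcond=None)
    return dxi  # applied to the pose via the exponential map
```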
At block 606, computing device 120 may determine an updated second set of points based on the adjusted external parameters. It should be appreciated that the computing device 120 may utilize the adjusted external parameters and the internal parameters of the camera 105 to project a set of three-dimensional coordinate points in the three-dimensional map corresponding to the reference line into the two-dimensional image to obtain the updated second set of points.
At block 608, the computing device 120 may determine whether the distance between the first set of points and the updated second set of points is less than or equal to the predetermined threshold. In response to determining at block 608 that the distance is still greater than the threshold, the method may return to block 604 to continue adjusting the external parameters, i.e., enter the next iteration. In response to determining at block 608 that the distance is less than or equal to the predetermined threshold, the method may proceed to block 510, i.e., the computing device 120 may determine the adjusted external parameters as the candidate external parameters.
In some embodiments, the termination condition of the iteration may instead be that the iteration stops once a predetermined number of iterations is reached. That is, when the distance between the first set of points and the second set of points is greater than the predetermined threshold, the computing device 120 may adjust the initial external parameters, for example based on the Jacobian matrix, until the number of adjustments reaches a predetermined count threshold. The computing device 120 may then determine the external parameters as adjusted at the termination of the iteration to be the candidate external parameters.
In some embodiments, the reference line may include only one line, for example, only the lane marker line 115-2, in which case unique external parameters cannot be obtained from the position information of the lane marker line 115-2 in the three-dimensional map and the corresponding points in the two-dimensional image alone. In this embodiment, the computing device 120 may solve for the optimal external parameters using reference points of known location as an additional constraint. Specifically, when determining the external parameters, the computing device 120 may obtain the optimal external parameters by performing minimum re-projection error optimization or pose search such that the distance between the first point set and the corresponding second point set is less than a predetermined threshold, on the premise that a reference point whose absolute position in the world coordinate system is known is matched to its corresponding point in the two-dimensional image. It should be appreciated that the reference point may be any point of known world coordinates, such as a traffic sign of known location, a painted reference point of known location, or any other reference of known location.
In some embodiments, as described above, the reference line may include two lines that intersect in the world coordinate system. The optimal external parameters may then be determined based on the minimum re-projection error method or the pose search method above.
With continued reference to fig. 5, at block 504, based on the candidate extrinsic parameters and the location information, the computing device 120 may determine a third set of points from the two-dimensional image. In particular, the computing device 120 may project a set of three-dimensional coordinate points corresponding to the location information into the two-dimensional image based on the candidate external parameters and the internal parameters of the camera 105 to determine a third set of points.
At block 506, based on an updated distance threshold, the computing device 120 may determine a third point from the first set of points, the third point corresponding to a fourth point in the third set of points and being less than the updated distance threshold from the fourth point. In some embodiments, the computing device 120 may adjust the distance threshold after each iteration. For example, the computing device 120 may subtract a predetermined step size from the distance threshold. Alternatively, the computing device 120 may divide the distance threshold by a predetermined coefficient. In some other examples, the computing device 120 may also adjust the distance threshold based on the distances between the first points and the second points. For example, the computing device 120 may set the new distance threshold based on the average distance of all matched point pairs that satisfied the previous distance threshold constraint, thereby filtering out point pairs whose distances are significantly larger than the average and reducing the interference of noisy point pairs, as sketched below. It should be appreciated that after the updated distance threshold is determined, block 506 may be performed with reference to the process described above with respect to block 206.
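The three threshold-update strategies just described might be sketched as follows; the step size, the coefficient, and the use of the mean of the surviving match distances are illustrative choices.

```python
import numpy as np

def update_threshold(threshold, matched_dists, mode="adaptive",
                     step=2.0, coeff=1.5):
    """Shrink the distance threshold between iterations (illustrative values)."""
    if mode == "step":   # subtract a predetermined step size
        return threshold - step
    if mode == "coeff":  # divide by a predetermined coefficient
        return threshold / coeff
    # "adaptive": derive the next threshold from the surviving matches so that
    # pairs noticeably farther apart than the average are filtered next round.
    return float(np.mean(matched_dists))
```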
At block 508, the computing device 120 may update the candidate external parameters based on the third point and the fourth point to determine the external parameters of the camera. In particular, the computing device 120 may determine the candidate external parameters of the current iteration based on the candidate external parameters determined in the previous iteration and a Jacobian matrix computed from the distances between the matched point pairs selected under the new distance threshold. It should be appreciated that the candidate external parameters in this iteration may be determined with reference to the process described above with respect to block 502.
In some embodiments, the computing device 120 may set a predetermined lower limit for the distance threshold; that is, the termination condition for the iterative adjustment may be that once the distance threshold falls below the predetermined lower limit, the threshold is no longer adjusted, and the candidate external parameters determined with the previous distance threshold are determined to be the external parameters of the camera.
Based on the methods described above, embodiments of the present disclosure utilize the positions of reference lines in the environment as recorded in a three-dimensional map, project those positions into a two-dimensional image, and determine the external parameters of the camera by matching them against the positions obtained by image recognition. By adding the distance-threshold constraint to the position-matching process, embodiments of the present disclosure not only overcome the difficulty of calibrating a roadside-mounted camera, but also effectively mitigate mismatches caused by false or missed detections of the marker lines and by overly sparse points on the marker lines, so that the external parameters of the camera can be determined more accurately, providing support for accurate environment perception and position determination.
In some embodiments, the two-dimensional image captured by the camera 105 may also be used for obstacle detection. It should be appreciated that the obstacle detection and camera calibration processes described above may be performed in parallel, for example using different threads, thereby improving processing efficiency. In some embodiments, when an obstacle is detected from the two-dimensional image, the computing device 120 may determine an area in the two-dimensional image corresponding to the obstacle. It should be appreciated that the obstacle may be any dynamic obstacle, such as a vehicle, pedestrian, or animal, or any static obstacle. The present disclosure is not intended to be limiting in any way as to the type of obstacle.
Further, the computing device 120 may determine the location of the obstacle in the world coordinate system based on the determined external parameters and the region. In particular, the computing device 120 may utilize the known internal parameters of the camera and the determined external parameters to effect a conversion of the region from the image coordinate system to the world coordinate system.
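A single pixel fixes only a viewing ray, so this conversion needs one further constraint; a common roadside assumption (not stated in this disclosure) is that the obstacle touches the ground plane z = 0 of the world coordinate system, which reduces the conversion to inverting a plane homography:

```python
import numpy as np

def pixel_to_ground(u, v, K, R, t):
    """Map an image point back onto the world ground plane z = 0, assuming
    K (internal) and R, t (calibrated external) map world -> pixel."""
    # Columns r1, r2 of R together with t define the ground-plane homography.
    H = K @ np.column_stack([R[:, 0], R[:, 1], t])
    x, y, w = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])  # (x, y) world coordinates on the ground
```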
In some embodiments, computing device 120 may also provide the location of the obstacle in the world coordinate system. For example, the computing device 120 may broadcast obstacle information about the road 102 to surrounding vehicles (e.g., the vehicle 110) to provide a basis for automated driving decisions of the vehicle. In some embodiments, computing device 120 may also determine a location of vehicle 110 based on the determined external parameters, for example, and send the location to vehicle 110 to enable positioning of vehicle 110.
It should be appreciated that while the methods of the present disclosure are described with reference to roadside camera examples, such environments are merely illustrative; the methods of the present disclosure may also be used to calibrate cameras located elsewhere (e.g., the initial calibration of a camera mounted on a vehicle). The present disclosure is not intended to be limited in any way by the location where the camera is mounted.
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 7 illustrates a schematic block diagram of an apparatus 700 for camera calibration according to some embodiments of the present disclosure. The apparatus 700 may be implemented at, for example, the computing device 120 of fig. 1.
As shown in fig. 7, the apparatus 700 may include a first point set determination module 710 configured to determine a first point set corresponding to a predetermined reference line from a two-dimensional image captured by a camera. The apparatus 700 may further comprise a second point set determination module 720 configured to determine a second point set from the two-dimensional image based on the position information of the reference line in the three-dimensional map. Additionally, the apparatus 700 may further include a first point determination module 730 configured to determine a first point from the first set of points, the first point corresponding to a second point in the second set of points, and the distance of the first point from the second point being less than a predetermined distance threshold. The apparatus 700 further comprises an external parameter determination module 740 configured to determine an external parameter of the camera based at least on the first point and the second point, the external parameter indicating a conversion relationship of the camera coordinate system and the world coordinate system.
In some embodiments, the first point set determination module 710 includes: a mask image acquisition module configured to acquire a mask image of a two-dimensional image, wherein points located on a reference line and points located outside the reference line in the mask image are differently identified; a center line determination module configured to determine a center line of a region corresponding to a point located on a reference line from the mask image; and a first determination module configured to determine a first set of points based on the centerline.
In some embodiments, the second point set determination module 720 includes: an initial external parameter acquisition module configured to acquire an initial external parameter of the camera; and a second determination module configured to determine a second set of points based on the initial extrinsic parameters and the location information.
In some embodiments, the extrinsic parameter determination module 740 includes: a candidate extrinsic parameter determination module configured to determine a candidate extrinsic parameter based on a distance between the first point and the second point; a third point set determination module configured to determine a third point set from the two-dimensional image based on the candidate extrinsic parameters and the position information; a third point determination module configured to determine a third point from the first set of points based on the updated distance threshold, the third point corresponding to a fourth point in the third set of points, and a distance of the third point from the fourth point being less than the updated distance threshold; and a candidate extrinsic parameter updating module configured to update the candidate extrinsic parameters to determine extrinsic parameters of the camera based on the third point and the fourth point.
In some embodiments, wherein the updated distance threshold is greater than a predetermined lower distance limit.
In some embodiments, the external parameter determination module 740 further includes at least one of: a first threshold adjustment module configured to subtract a predetermined step from the distance threshold; a second threshold adjustment module configured to divide the distance threshold by a predetermined coefficient; or a third threshold adjustment module configured to adjust the distance threshold based on the distance between the first point and the second point.
In some embodiments, the apparatus 700 further comprises: a region determination module configured to determine a region in the two-dimensional image corresponding to the obstacle in response to detecting the obstacle from the two-dimensional image; and a position determination module configured to determine a position of the obstacle in a world coordinate system based on the external parameter and the region.
In some embodiments, the apparatus 700 further comprises: a providing module configured to provide a location of an obstacle.
The elements included in apparatus 700 may be implemented in various manners, including software, hardware, firmware, or any combination thereof. In some embodiments, one or more units may be implemented using software and/or firmware, such as machine-executable instructions stored on a storage medium. In addition to or in lieu of machine-executable instructions, some or all of the elements in apparatus 700 may be at least partially implemented by one or more hardware logic components. By way of example and not limitation, exemplary types of hardware logic components that can be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
These elements shown in fig. 7 may be implemented partially or fully as hardware modules, software modules, firmware modules, or any combination thereof. In particular, in certain embodiments, the above-described flows, methods, or processes may be implemented by hardware in a storage system or a host corresponding to the storage system or other computing device independent of the storage system.
Fig. 8 shows a schematic block diagram of an example device 800 that may be used to implement embodiments of the present disclosure. Device 800 may be used to implement computing device 120. As shown, the device 800 includes a Central Processing Unit (CPU) 801 that can perform various suitable actions and processes in accordance with computer program instructions stored in a Read Only Memory (ROM) 802 or loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The CPU 801, ROM 802, and RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The processing unit 801 performs the various methods and processes described above, such as method 200. For example, in some embodiments, the method 200 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 800 via ROM 802 and/or communication unit 809. When a computer program is loaded into RAM 803 and executed by CPU 801, one or more steps of method 200 described above may be performed. Alternatively, in other embodiments, CPU 801 may be configured to perform method 200 by any other suitable means (e.g., by means of firmware).
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (16)

1. A method for camera calibration, comprising:
determining a first set of points corresponding to a predetermined reference line from a two-dimensional image captured by a camera;
determining a second set of points from the two-dimensional image by projecting a set of three-dimensional coordinate points corresponding to position information of the reference line in a three-dimensional map to the two-dimensional image based on initial external parameters and known internal parameters of the camera, wherein the second set of points does not include points projected from the three-dimensional map to outside the two-dimensional image;
determining a first point from the first set of points, the first point corresponding to a second point from the second set of points, and the first point being less than a predetermined distance threshold from the second point; and
an external parameter of the camera is determined based at least on the first point and the second point, the external parameter indicating a conversion relationship of a camera coordinate system and a world coordinate system.
2. The method of claim 1, wherein determining the external parameters of the camera comprises:
determining a candidate extrinsic parameter based on a distance between the first point and the second point;
determining a third set of points from the two-dimensional image based on the candidate extrinsic parameters and the location information;
determining a third point from the first set of points based on the updated distance threshold, the third point corresponding to a fourth point in the third set of points, and the third point being less than the updated distance threshold from the fourth point; and
based on the third point and the fourth point, the candidate extrinsic parameters are updated to determine extrinsic parameters of the camera.
3. The method of claim 2, wherein the updated distance threshold is greater than a predetermined lower distance limit.
4. The method of claim 2, wherein determining the external parameters of the camera further comprises updating the distance threshold by at least one of:
subtracting a predetermined step size from the distance threshold;
dividing the distance threshold by a predetermined coefficient; or
adjusting the distance threshold based on a distance between the first point and the second point.
5. The method of claim 1, wherein determining the first set of points comprises:
acquiring a mask image of the two-dimensional image, wherein points located on the reference line and points located outside the reference line in the mask image are differently identified;
determining a center line of a region corresponding to a point located on the reference line from the mask image; and
the first set of points is determined based on the centerline.
6. The method of claim 1, further comprising:
responsive to detecting an obstacle from the two-dimensional image, determining a region in the two-dimensional image corresponding to the obstacle; and
based on the external parameters and the region, a position of the obstacle in the world coordinate system is determined.
7. The method of claim 6, further comprising:
providing the position of the obstacle.
8. An apparatus for camera calibration, comprising:
a first point set determination module configured to determine a first point set corresponding to a predetermined reference line from a two-dimensional image captured by a camera;
a second point set determination module configured to determine a second point set from the two-dimensional image by projecting a set of three-dimensional coordinate points corresponding to position information of the reference line in a three-dimensional map to the two-dimensional image based on an initial external parameter and a known internal parameter of the camera, wherein the second point set does not include points projected from the three-dimensional map to outside the two-dimensional image;
a first point determination module configured to determine a first point from the first set of points, the first point corresponding to a second point in the second set of points, and a distance of the first point from the second point being less than a predetermined distance threshold; and
an extrinsic parameter determination module configured to determine an extrinsic parameter of the camera based at least on the first point and the second point, the extrinsic parameter indicating a conversion relation of a camera coordinate system and a world coordinate system.
9. The apparatus of claim 8, wherein the external parameter determination module comprises:
a candidate extrinsic parameter determination module configured to determine a candidate extrinsic parameter based on a distance between the first point and the second point;
a third point set determination module configured to determine a third point set from the two-dimensional image based on the candidate extrinsic parameters and the position information;
a third point determination module configured to determine a third point from the first set of points based on the updated distance threshold, the third point corresponding to a fourth point in the third set of points, and a distance between the third point and the fourth point being less than the updated distance threshold; and
a candidate extrinsic parameter updating module configured to update, based on the third point and the fourth point, the candidate extrinsic parameters to determine the extrinsic parameters of the camera.
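The apparatus claims mirror the method steps, so the sketches above compose directly. A hypothetical end-to-end use, in which mask, map_points, K, rvec0/tvec0, and det_box are all assumed inputs:

```python
pts_2d = first_point_set(mask)                                       # claims 1/5
rvec, tvec = refine_extrinsics(pts_2d, map_points, K, rvec0, tvec0)  # claims 2/9
R, _ = cv2.Rodrigues(rvec)                                  # rvec -> 3x3 rotation
pos = obstacle_world_position(det_box, K, R, tvec.ravel())           # claims 6/13
```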
10. The apparatus of claim 9, wherein the updated distance threshold is greater than a predetermined lower distance limit.
11. The apparatus of claim 9, wherein the extrinsic parameter determination module further comprises at least one of:
a first threshold adjustment module configured to subtract a predetermined step size from the distance threshold;
a second threshold adjustment module configured to divide the distance threshold by a predetermined coefficient; or
a third threshold adjustment module configured to adjust the distance threshold based on a distance between the first point and the second point.
12. The apparatus of claim 8, wherein the first point set determination module comprises:
a mask image acquisition module configured to acquire a mask image of the two-dimensional image, wherein, in the mask image, points located on the reference line and points located outside the reference line are labeled differently;
a center line determination module configured to determine, from the mask image, a center line of the region corresponding to the points located on the reference line; and
a first determination module configured to determine the first set of points based on the center line.
13. The apparatus of claim 8, further comprising:
a region determination module configured to determine a region in the two-dimensional image corresponding to an obstacle in response to detecting the obstacle from the two-dimensional image; and
a position determination module configured to determine a position of the obstacle in the world coordinate system based on the extrinsic parameters and the region.
14. The apparatus of claim 13, further comprising:
a providing module configured to provide the location of the obstacle.
15. An electronic device, the device comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
16. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of claims 1-7.
CN201911001905.XA 2019-10-21 2019-10-21 Method, apparatus, device and storage medium for camera calibration Active CN110728720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911001905.XA CN110728720B (en) 2019-10-21 2019-10-21 Method, apparatus, device and storage medium for camera calibration

Publications (2)

Publication Number Publication Date
CN110728720A CN110728720A (en) 2020-01-24
CN110728720B (en) 2023-10-13

Family

ID=69220530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911001905.XA Active CN110728720B (en) 2019-10-21 2019-10-21 Method, apparatus, device and storage medium for camera calibration

Country Status (1)

Country Link
CN (1) CN110728720B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140533A (en) * 2020-09-04 2022-03-04 华为技术有限公司 Method and device for calibrating external parameters of camera
CN112560769B (en) * 2020-12-25 2023-08-29 阿波罗智联(北京)科技有限公司 Method for detecting obstacle, electronic device, road side device and cloud control platform
CN113658268A (en) * 2021-08-04 2021-11-16 智道网联科技(北京)有限公司 Method and device for verifying camera calibration result, electronic equipment and storage medium
CN113963060B (en) * 2021-09-22 2022-03-18 腾讯科技(深圳)有限公司 Vehicle information image processing method and device based on artificial intelligence and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005041579A2 (en) * 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
EP3100234B1 (en) * 2014-01-27 2021-04-07 XYLON d.o.o. Data-processing system and method for calibration of a vehicle surround view system
JP6975929B2 (en) * 2017-04-18 2021-12-01 パナソニックIpマネジメント株式会社 Camera calibration method, camera calibration program and camera calibration device
JP7054803B2 (en) * 2017-07-21 2022-04-15 パナソニックIpマネジメント株式会社 Camera parameter set calculation device, camera parameter set calculation method and program

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2150170A1 (en) * 2007-05-23 2010-02-10 The University of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
CN102123194A (en) * 2010-10-15 2011-07-13 张哲颖 Method for optimizing mobile navigation and man-machine interaction functions by using augmented reality technology
JP2011087308A (en) * 2010-11-25 2011-04-28 Aisin Seiki Co Ltd Device, method, and program for calibration of in-vehicle camera
JP2013083505A (en) * 2011-10-07 2013-05-09 National Institute Of Information & Communication Technology Three-dimensional coordinate position estimating device, method and program thereof, three-dimensional coordinate estimating system, and camera calibration informative generator
EP2597614A1 (en) * 2011-11-28 2013-05-29 Clarion Co., Ltd. Automotive camera system and its calibration method and calibration program
KR20140049361A (en) * 2012-10-17 2014-04-25 한국과학기술원 Multiple sensor system, and apparatus and method for three dimensional world modeling using the same
EP3086284A1 (en) * 2015-04-23 2016-10-26 Application Solutions (Electronics and Vision) Limited Camera extrinsic parameters estimation from image lines
CN105844696A (en) * 2015-12-31 2016-08-10 清华大学 Image positioning method and device based on ray model three-dimensional reconstruction
CN107464264A (en) * 2016-06-02 2017-12-12 南京理工大学 A kind of camera parameter scaling method based on GPS
JP2018044943A (en) * 2016-09-08 2018-03-22 パナソニックIpマネジメント株式会社 Camera parameter set calculation device, camera parameter set calculation method and program
CN106651942A (en) * 2016-09-29 2017-05-10 苏州中科广视文化科技有限公司 Three-dimensional rotation and motion detecting and rotation axis positioning method based on feature points
WO2018235163A1 (en) * 2017-06-20 2018-12-27 株式会社ソニー・インタラクティブエンタテインメント Calibration device, calibration chart, chart pattern generation device, and calibration method
CN109166156A (en) * 2018-10-15 2019-01-08 Oppo广东移动通信有限公司 A kind of generation method, mobile terminal and the storage medium of camera calibration image
CN109658504A (en) * 2018-10-31 2019-04-19 百度在线网络技术(北京)有限公司 Map datum mask method, device, equipment and storage medium
CN109754432A (en) * 2018-12-27 2019-05-14 深圳市瑞立视多媒体科技有限公司 A kind of automatic camera calibration method and optics motion capture system
CN109816704A (en) * 2019-01-28 2019-05-28 北京百度网讯科技有限公司 The 3 D information obtaining method and device of object
CN110135376A (en) * 2019-05-21 2019-08-16 北京百度网讯科技有限公司 Determine method, equipment and the medium of the coordinate system conversion parameter of imaging sensor
CN110148185A (en) * 2019-05-22 2019-08-20 北京百度网讯科技有限公司 Determine method, apparatus, electronic equipment and the storage medium of coordinate system conversion parameter

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Sun Yinghui. Online calibration of camera extrinsic parameters based on dual parallel line groups of a highway. Electronic Science and Technology, No. 7, pp. 40-42, 46. *
Shi Yifeng, Bai Baoping, Chen Xiaodong, Wang Yi, Yu Daoyin. Embedded ultrasonic endoscopic image processing system based on wavelet transform. Opto-Electronic Engineering, No. 5, pp. 88-94. *
Liu Xiaoxing. Research on a robot servo system based on visual tracking. China Masters' Theses Full-text Database, Information Science and Technology, No. 2, pp. 1138-4129. *
Ren Yue, Zhang Yunsheng, Zhang Minglei. Multi-level calibration method for extrinsic parameters of indoor surveillance cameras. Geomatics & Spatial Information Technology, No. 6, pp. 79-82. *
Ouyang Yuanxin, Xiong Zhang. Introduction to the Internet of Things. Beihang University Press, 2016 (1st ed., May 2016), pp. 241-242. *

Similar Documents

Publication Publication Date Title
CN110728720B (en) Method, apparatus, device and storage medium for camera calibration
CN110378965B (en) Method, device and equipment for determining coordinate system conversion parameters of road side imaging equipment
CN110766760B (en) Method, device, equipment and storage medium for camera calibration
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN110146869B (en) Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
WO2022022694A1 (en) Method and system for sensing automated driving environment
CN110163930B (en) Lane line generation method, device, equipment, system and readable storage medium
CN110751693B (en) Method, apparatus, device and storage medium for camera calibration
TWI722355B (en) Systems and methods for correcting a high-definition map based on detection of obstructing objects
CN110766761B (en) Method, apparatus, device and storage medium for camera calibration
CN113657224B (en) Method, device and equipment for determining object state in vehicle-road coordination
CN110969055B (en) Method, apparatus, device and computer readable storage medium for vehicle positioning
US10996337B2 (en) Systems and methods for constructing a high-definition map based on landmarks
CN111652072A (en) Track acquisition method, track acquisition device, storage medium and electronic equipment
CN116997771A (en) Vehicle, positioning method, device, equipment and computer readable storage medium thereof
CN112348752A (en) Lane line vanishing point compensation method and device based on parallel constraint
CN115409965A (en) Mining area map automatic generation method for unstructured roads
WO2022133986A1 (en) Accuracy estimation method and system
CN114648576B (en) Target vehicle positioning method, device and system
CN117523005A (en) Camera calibration method and device
CN117953046A (en) Data processing method, device, controller, vehicle and storage medium
CN116740295A (en) Virtual scene generation method and device
CN117809285A (en) Traffic sign board ranging method and system applied to port external collector card
CN114882727A (en) Parking space detection method based on domain controller, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211015

Address after: 105 / F, building 1, No. 10, Shangdi 10th Street, Haidian District, Beijing 100085

Applicant after: Apollo Intelligent Technology (Beijing) Co.,Ltd.

Address before: 2 / F, baidu building, 10 Shangdi 10th Street, Haidian District, Beijing 100094

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant