US20240019267A1 - Vehicle Camera Calibration System - Google Patents
- Publication number: US20240019267A1 (application US 18/353,352)
- Authority: US (United States)
- Prior art keywords: laser, camera, distance, location, vehicle
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C11/04—Photogrammetry or videogrammetry; interpretation of pictures
- G01C9/06—Measuring inclination; electric or photoelectric indication or reading means
- G01C2009/066—Electric or photoelectric inclination indication or reading means, optical
- G01S11/12—Determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
- G01S17/10—Lidar systems determining position data of a target for measuring distance only, using transmission of interrupted, pulse-modulated waves
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G01S17/46—Indirect determination of position data
- G01S17/48—Active triangulation systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar
- G01S17/89—Lidar systems specially adapted for mapping or imaging
- G01S17/931—Lidar systems for anti-collision purposes of land vehicles
- G01S7/4808—Evaluating distance, position or velocity data
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement
- G01S7/497—Means for monitoring or calibrating
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G07C5/0866—Registering performance data using a digital video recorder in combination with a video camera
- H04N17/002—Diagnosis, testing or measuring for television cameras
Definitions
- This document describes techniques to perform camera calibration in a vehicle.
- A vehicle may include cameras attached to it for several purposes. For example, cameras may be attached to a roof of the vehicle for security purposes, as a driving aid, or to facilitate autonomous driving. Cameras mounted on a vehicle can obtain images of one or more areas surrounding the vehicle. These images can be processed to obtain information about the road or about the objects surrounding the vehicle. For example, images obtained by a camera can be analyzed to determine distances of objects surrounding the autonomous vehicle so that the autonomous vehicle can be safely maneuvered around the objects.
- A method of performing camera calibration includes emitting, by a laser emitter located on a vehicle and pointed towards a road at a first angle, a first laser pulse group towards a first location on the road, and emitting, by the laser emitter pointed towards the road at a second angle, a second laser pulse group towards a second location on the road, where each of the first laser pulse group and the second laser pulse group comprises one or more laser spots.
- The method further includes detecting, by a laser receiver located on the vehicle, the one or more laser spots, calculating a first set of distances from a location of the laser receiver to the one or more laser spots, obtaining, from a camera located on the vehicle, an image comprising the one or more laser spots, and determining, from the image, a second set of distances from a location of the camera to the one or more laser spots.
- The method further includes determining two camera calibration parameters of the camera by solving two equations, where each equation includes two unknown camera calibration parameters, a first value associated with the first set of distances, and a second value associated with the second set of distances for a same laser pulse group.
- Determining the two camera calibration parameters comprises determining a slope value and an intercept value by: determining, for both the first laser pulse group and the second laser pulse group, a first average distance of the first set of distances and a second average distance of the second set of distances, where the first value is the first average distance and the second value is the second average distance, and determining, by solving two linear equations, the slope value and the intercept value with which the camera is calibrated, where a first linear equation includes an unknown slope value, an unknown intercept value, and the first average distance and the second average distance corresponding to the first laser pulse group, and where a second linear equation includes the unknown slope value, the unknown intercept value, and the first average distance and the second average distance corresponding to the second laser pulse group.
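The two linear equations described above, one per pulse group, can be solved in closed form. A minimal sketch, assuming the laser-receiver averages serve as reference distances and the camera averages as measured distances (the function and variable names are illustrative, not from the patent):

```python
def solve_slope_intercept(laser_avg_1, camera_avg_1, laser_avg_2, camera_avg_2):
    """Solve laser_avg = slope * camera_avg + intercept from two pulse groups.

    laser_avg_*: average laser-receiver distance per pulse group (reference).
    camera_avg_*: average camera-derived distance per pulse group (measured).
    """
    # Subtracting the two equations eliminates the unknown intercept.
    slope = (laser_avg_1 - laser_avg_2) / (camera_avg_1 - camera_avg_2)
    # Back-substitute into either equation to recover the intercept.
    intercept = laser_avg_1 - slope * camera_avg_1
    return slope, intercept
```

For example, pulse-group averages of (10.0, 9.5) meters and (20.0, 19.5) meters yield a slope of 1.0 and an intercept of 0.5 meters.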
- The method further comprises determining, while the vehicle is being driven, a location of an object in another image obtained from the camera by using the slope value and the intercept value of the camera.
- The first location is located at a first pre-determined distance along the road from a location of the laser emitter.
- The second location is located at a second pre-determined distance along the road from the location of the laser emitter.
- The first pre-determined distance and the second pre-determined distance are within the range of distances from the location of the camera at which the camera is configured to obtain images.
- The method further comprises: determining the first angle and the second angle formed between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter, and adjusting the laser emitter according to each of the first angle and the second angle to emit the first laser pulse group and the second laser pulse group at the first pre-determined distance and the second pre-determined distance, respectively.
- Each of the first pre-determined distance and the second pre-determined distance is a distance from the location of the laser emitter to at least one laser spot of the one or more laser spots.
- Each of the first laser pulse group and the second laser pulse group is emitted for a pre-determined time duration.
- The pre-determined time duration is greater than the reciprocal of the frame rate of the camera.
- Detecting the one or more laser spots and calculating the first set of distances comprises: receiving, from the laser receiver, signals indicative of times when the one or more laser spots are detected, and calculating, for each laser spot, a distance from the laser receiver to the laser spot by: obtaining a time delay value as the difference between the time when the laser pulse group that included the laser spot was emitted and the time when the laser spot was detected, and multiplying the time delay value by one-half of the speed of light.
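The per-spot time-of-flight calculation can be sketched as follows, using the rounded speed of light quoted later in this document (the function name and example times are illustrative):

```python
SPEED_OF_LIGHT_M_S = 300_000_000  # rounded value for the speed of light

def spot_distance(emit_time_s, detect_time_s):
    """Distance from the laser receiver to a detected laser spot.

    The pulse travels to the spot and back, so the one-way distance is
    half the round-trip delay multiplied by the speed of light.
    """
    time_delay = detect_time_s - emit_time_s
    return time_delay * SPEED_OF_LIGHT_M_S / 2
```

A spot detected 200 nanoseconds after emission corresponds to a distance of 30 meters.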
- The first laser pulse group and the second laser pulse group are emitted when the vehicle is stopped or parked on the road.
- The camera is located adjacent to the laser receiver.
- The above-described methods are embodied in the form of processor-executable code and stored in a computer-readable program medium.
- The computer readable program is stored on a non-transitory computer readable medium, the computer readable program including code that, when executed by a processor, causes the processor to implement the methods described in this patent document.
- The computer readable program includes code that, when executed by a processor, causes the processor to: instruct a laser emitter located on a vehicle to point towards a road at a first angle and emit a first laser pulse group towards a first location on the road, and instruct the laser emitter to point towards the road at a second angle and emit a second laser pulse group towards a second location on the road, where each of the first laser pulse group and the second laser pulse group comprises one or more laser spots.
- The computer readable program, when executed by the processor, further causes the processor to detect, by a laser receiver located on the vehicle, the one or more laser spots, calculate a first set of distances from a location of the laser receiver to the one or more laser spots, obtain, from a camera located on the vehicle, an image comprising the one or more laser spots, and determine, from the image, a second set of distances from a location of the camera to the one or more laser spots.
- The computer readable program, when executed by the processor, further causes the processor to determine two camera calibration parameters of the camera by solving two equations, where each equation includes two unknown camera calibration parameters, a first value associated with the first set of distances, and a second value associated with the second set of distances for a same laser pulse group.
- A device that is configured or operable to perform the above-described methods is disclosed.
- A vehicle fitted with a camera calibration apparatus is disclosed.
- FIG. 1 shows a conventional system to perform a static calibration operation with a camera on a vehicle.
- FIG. 2 A shows an exemplary camera calibration system.
- FIG. 2 B shows an exemplary camera and laser system that includes a camera, a laser emitter, and a laser receiver.
- FIG. 2 C shows two exemplary laser pulses generated and emitted by a laser emitter that may be sensed by a laser receiver.
- FIG. 3 shows an exemplary block diagram of an in-vehicle control computer located in a vehicle to control the camera and laser system.
- FIG. 4 shows an exemplary flow diagram of performing camera calibration.
- An autonomous vehicle includes cameras mounted on the autonomous vehicle to obtain images of one or more areas surrounding the autonomous vehicle. These images can be analyzed by a computer on-board the autonomous vehicle to obtain distance or other information about the road or about the objects surrounding the autonomous vehicle.
- The cameras on the autonomous vehicle need to be calibrated so that the computer on-board the autonomous vehicle can precisely or accurately detect an object and determine its distance.
- A driver in an autonomous vehicle can perform a static calibration technique before the autonomous vehicle is driven to its destination.
- FIG. 1 shows a conventional system 100 to perform a static calibration operation with a camera on a vehicle.
- A vehicle 106 is first driven to a location where a target calibration board 102 is located.
- The target calibration board 102 has a pre-defined pattern of black and white squares.
- The target calibration board 102 is located at a fixed distance from the vehicle 106 so that the distance between the camera 104 and the target calibration board 102 is fixed.
- The fixed distance is set according to the focal length of the camera.
- The camera 104 can take several pictures of the target calibration board 102.
- The computer on-board the vehicle 106 can use spatial geometry relations to calculate the distance between the target calibration board 102 and the camera 104.
- The computer on-board the vehicle 106 can adjust the slope and intercept calibration parameters to compensate for differences between the known fixed distance and the calculated distance.
- The vehicle 106 can then be driven to its destination.
- The conventional system 100 described in FIG. 1 has several technical drawbacks.
- The slope and intercept values obtained from the calibration process can become inaccurate as the vehicle is driven towards its destination. For instance, when a vehicle is driven towards its destination, a camera's position can change in response to vibrations from the vehicle or in response to forces exerted by environmental elements (e.g., wind).
- The slope and intercept values cannot be easily re-adjusted.
- A conventional system cannot easily perform the calibration process, which should be performed periodically (e.g., daily, weekly, or monthly) to re-adjust the camera's slope and intercept values.
- The calibration process performed by a conventional system may not account for the condition of the road on which the vehicle will be driven towards its destination. For example, a conventional system may perform a calibration process on a flat surface, but the vehicle may have a different pitch angle as it is driven uphill or downhill to its destination. A different pitch angle of the vehicle can affect the calibration result and thus affect the distance measurements derived from images obtained by a camera.
- The techniques described in the present document may be used by some embodiments to solve the above-discussed problems, and others.
- A laser emitter on a vehicle emits towards a road a first laser pulse group and a second laser pulse group, where each laser pulse group includes one or more laser spots.
- The first laser pulse group is emitted at a first distance along or parallel to the road from the location of the laser emitter.
- The second laser pulse group is emitted at a second distance along or parallel to the road from the location of the laser emitter.
- A laser receiver on the vehicle can detect the one or more laser spots, and a camera on the vehicle can obtain an image of the one or more laser spots.
- The exemplary calibration technique can determine a first set of distances from the one or more laser spots detected by the laser receiver and a second set of distances from the one or more laser spots in the image obtained by the camera.
- The exemplary calibration technique can solve for two calibration parameters using two equations, where each equation can include two unknown camera calibration parameters, and both a first average of the first set of distances and a second average of the second set of distances for a same laser pulse group.
- The techniques described in this patent document can perform a regression to determine the slope and intercept calibration parameters between measured distance and real distance.
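With more than two pulse groups, the same slope and intercept can be obtained by ordinary least squares rather than by solving exactly two equations. A plain-Python sketch (the function name and the choice of the camera distance as the independent variable are assumptions, not stated in the patent):

```python
def fit_calibration(camera_dists, laser_dists):
    """Least-squares fit of laser_dist = slope * camera_dist + intercept.

    camera_dists: per-pulse-group average camera-derived distances.
    laser_dists: matching average laser-receiver (reference) distances.
    With exactly two pairs this reduces to solving the two linear
    equations described earlier in this document.
    """
    n = len(camera_dists)
    mean_x = sum(camera_dists) / n
    mean_y = sum(laser_dists) / n
    # Centered sums of squares and cross-products for the slope.
    sxx = sum((x - mean_x) ** 2 for x in camera_dists)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(camera_dists, laser_dists))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

For pairs that all lie on one line, e.g. camera averages of 9.5, 19.5, and 69.5 meters against reference distances of 10, 20, and 70 meters, the fit recovers a slope of 1.0 and an intercept of 0.5 meters.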
- FIG. 2 A shows an exemplary camera calibration system 200 that includes a vehicle 202 stopped or parked on a road 208, where the vehicle 202 includes a camera and laser system 204.
- The vehicle 202 can be an autonomous vehicle.
- The camera and laser system 204 includes at least one laser emitter (or laser source), one or more cameras, and one or more laser receivers. Each camera may be adjacent to one of the laser receivers so that the distance measurements obtained from the camera images can be related to the distance measurements obtained from the data provided by the laser receivers, as further explained in this patent document. For example, a camera can be within one foot of a location of a laser receiver. For ease of description, the camera and laser system 204 in FIGS. 2 A and 2 B is shown as a single camera with a single laser emitter and a single laser receiver in an integrated unit coupled to the vehicle.
- The laser emitter, the one or more cameras, and the one or more laser receivers may instead be separate units coupled to the vehicle.
- The laser emitter is pointed towards the road 208 and in the same direction as the one or more cameras and the one or more laser receivers.
- The laser emitter generates and emits towards the road 208 a laser pulse group that includes one or more laser spots 206.
- The direction toward which the laser pulse group is emitted is chosen such that at least one of the one or more laser spots 206 can be sensed by the one or more laser receivers and imaged or recorded by the one or more cameras, as shown in FIG. 2 B.
- The laser emitter has an adjustable pitch angle such that it can emit a laser pulse toward the road 208 at multiple pre-determined distances along the road or parallel to the road from the location of the laser emitter, for calibrating multiple cameras.
- A pre-determined distance 210 can be defined along the road or parallel to the road from the location of the laser emitter to at least one spot of the one or more laser spots 206.
- A pitch angle (shown as "θ" in FIG. 2 A) can be considered an angle formed between the direction in which the laser emitter is pointing and an imaginary horizontal plane or horizontal line at least partly parallel to the road and including at least a portion of the laser emitter.
- A pitch angle can also be considered an angle formed between the direction in which the laser emitter is pointing and an imaginary vertical plane or vertical line at least partly perpendicular to the road and including at least a portion of the laser emitter.
- The adjustable pitch angle can allow the laser emitter to generate and emit a plurality of laser pulse groups at various pre-determined distances from the location of the laser emitter.
- A laser pulse group includes one or more laser spots 206.
- The pre-determined distance 210 can be a distance from the laser emitter to at least one laser spot or point of the one or more laser spots 206.
- A laser emitter can emit laser pulse groups with at least one laser spot or point located at pre-determined distances of 10 meters, 20 meters, 70 meters, 120 meters, 300 meters, and 350 meters from the location of the laser emitter.
- The pre-determined distances may be proportional to the speed at which the vehicle is moving. For example, in some embodiments, smaller distances may be used for calibration when the vehicle is moving at a relatively high speed (e.g., above a threshold) because of the danger of making mistakes at near distances at high speeds. In other embodiments, the calibration range may be increased when the vehicle is moving at a faster speed in order to look farther ahead when traveling at higher velocities.
- The laser emitter can be installed on the vehicle 202 so that the distance between the laser emitter and the road is fixed, previously known, or a function of the velocity of the vehicle.
- An in-vehicle control computer located in the vehicle 202 includes a laser emitter module (315 in FIG. 3) that can use a trigonometric function to instruct the laser emitter to adjust the pitch angle and to emit a laser pulse group at each of two or more pre-determined distances from the location of the laser emitter, where the pre-determined distances are parallel to the road or along the road.
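The trigonometric relation the laser emitter module might use can be sketched as follows, assuming a flat road and a known emitter height above it (the flat-road assumption and the names are illustrative; the patent does not give the exact formula):

```python
import math

def pitch_angle_for_distance(emitter_height_m, target_distance_m):
    """Downward pitch angle that places a laser spot at target_distance_m.

    emitter_height_m: height of the laser emitter above a flat road.
    target_distance_m: desired distance along the road to the laser spot.
    Returns the angle, in radians, below the horizontal plane through
    the emitter.
    """
    return math.atan2(emitter_height_m, target_distance_m)
```

For instance, an emitter 2 meters above the road aiming a spot 20 meters ahead would pitch down by about 5.7 degrees.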
- A laser emitter with an adjustable pitch angle is a beneficial technical feature at least because it can allow the exemplary calibration technique (as explained herein) to calibrate multiple cameras that capture images at different ranges of distances from the location of the vehicle.
- Laser pulse groups generated and emitted at a set of two or more pre-determined distances are used to calibrate the slope and intercept calibration parameters of each camera.
- The slope and intercept calibration parameters can be solved for by setting them up in slope-intercept form, as shown in Equation (2) below, where the slope parameter corresponds to the slope and the intercept parameter corresponds to the intercept (e.g., the distance from the origin where a line, curve, or surface intersects a plane or axis).
- Real distance = Slope × Measured distance + Intercept (2)
- Calibration parameters of a first camera that captures images from 5 meters to 50 meters can be adjusted based on laser pulse groups emitted at distances of 10 meters and 20 meters.
- Calibration parameters of a second camera that captures images from 50 meters to 200 meters can be adjusted based on laser pulse groups emitted at distances of 70 meters and 120 meters.
- Calibration parameters of a third camera that captures images from 200 meters to 500 meters can be adjusted based on laser pulse groups emitted at distances of 300 meters and 350 meters.
- The laser emitter can generate and emit multiple laser pulse groups at multiple pre-determined distances from the location of the laser emitter, which is pointed towards a road. Each laser pulse group is emitted towards an object (e.g., the road) at a pre-determined distance along the road from the location of the laser emitter. As further explained below, the laser emitter generates and emits a laser pulse group to impinge one or more laser spots upon an expected opaque or reflective object, or on a road, approximately at each pre-determined distance.
- The in-vehicle control computer in the vehicle 202 can perform filtering operations to minimize the effects of noise, distortion, or lack of detection by the laser receiver and/or camera as they sense or image the one or more laser spots or points of the one or more laser spots 206.
- The laser emitter generates and emits each laser pulse group for a pre-determined time duration to allow a camera and/or a laser receiver sufficient time to detect or capture the one or more laser spots 206.
- The pre-determined time duration of the laser pulse group emitted by the laser emitter can be greater than the reciprocal of the camera's frame rate (e.g., greater than 1/60 seconds, or approximately 16.67 milliseconds, for a camera running at 60 frames per second).
- The pre-determined time duration can be at least 60 milliseconds so that at least three frames of the camera can capture the laser pulse group.
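The frame-rate rule above reduces to a simple lower bound on the pulse duration. A sketch (the function name and default are illustrative), assuming the pulse must span a desired number of frame periods:

```python
def min_pulse_duration_s(frame_rate_hz, frames_to_capture=1):
    """Shortest pulse-group duration spanning frames_to_capture frame
    periods of a camera running at frame_rate_hz.

    With frames_to_capture=1 this is the reciprocal-of-frame-rate lower
    bound from the text; frames_to_capture=3 mirrors the three-frame
    example (3/60 s = 50 ms, consistent with the 'at least 60 ms' figure,
    which adds margin).
    """
    return frames_to_capture / frame_rate_hz
```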
- Each laser pulse group may include the same one or more laser spots 206.
- The one or more laser spots 206 include several individual laser spots or points.
- The one or more laser spots 206 can include four laser spots located at the four vertices of an imaginary quadrilateral in a spatial laser pattern.
- The one or more laser spots 206 are emitted toward a road. This is a beneficial technical feature that can eliminate the need for the target calibration board described in FIG. 1.
- The exemplary calibration technique described in this patent document can account for variations in road conditions by generating the one or more laser spots 206 on the road.
- FIG. 2 C shows two exemplary laser pulses generated and emitted by a laser emitter that may be sensed by a laser receiver.
- Each laser pulse 252, 254 may comprise one or more laser spots, and laser pulse 254 is emitted at a time tR after the emission of laser pulse 252.
- FIG. 2 C shows that each of the first and second pulses 252, 254 is generated to be shone on an object (e.g., a road) for a pre-determined time duration t0.
- A laser receiver senses the laser pulse emitted onto the object after a time delay tD.
- The laser receiver can sense the one or more laser spots and generate signals indicating the times when each of the laser spots or points is detected or sensed by the laser receiver. Referring to the laser pulses 252, 254 in FIG. 2 C, the laser receiver sends signals indicative of the times when the laser spots associated with laser pulses 252, 254 are sensed to an in-vehicle control computer, as further explained in the calibration-related operations in FIGS. 3 and 4.
- A laser emitter can generate and emit multiple laser pulse groups comprising one or more laser spots at multiple pre-determined distances along a road from the location of the laser emitter, where the one or more laser receivers can sense one or more laser spots from each laser pulse group. For example, if a laser emitter emits a laser pulse group including four laser spots to be shone at a distance of 10 meters along a road from the location of the laser emitter, then one or more cameras that are configured to capture images from 5 meters to 50 meters, and each of the one or more laser receivers adjacent to a camera, can sense and/or obtain up to four laser spots from the laser pulse group.
- FIG. 3 shows an exemplary block diagram of an in-vehicle control computer located in a vehicle to control the camera and laser system as described in FIGS. 2 A and 2 B .
- the in-vehicle control computer 300 includes at least one processor 310 and a memory 305 having instructions stored thereupon. The instructions upon execution by the processor 310 configure the computer 300 to perform the operations described for a laser emitter, one or more laser receivers, one or more cameras, and various modules as described in FIGS. 2 A to 4 , and/or the operations described in the various embodiments or sections in this patent document.
- the laser emitter module 315 controls the operations of the laser emitter as described in this patent document.
- the laser receiver module 320 controls the operations of the one or more laser receivers as described in this patent document.
- the laser emitter generates and emits multiple laser pulse groups, where each laser pulse group includes one or more laser spots.
- the laser receiver module 320 can receive signals from a laser receiver indicative of the time(s) when the one or more laser spots are sensed by the laser receiver as explained in this patent document.
- the laser receiver module 320 can also obtain from the laser emitter module 315 the plurality of times corresponding to the plurality of emitted laser pulse groups. Based on the signals received from the laser receiver for each detected laser spot and based on the time when a laser pulse is transmitted, the laser receiver module 320 can calculate a distance between the laser receiver and each of the one or more detected laser spots based on Equation (1) as shown below:

D_Real = tD × (300,000,000 m/s) / 2  (1)

- where tD for each detected laser spot is the difference in time between when the laser emitter emitted the laser pulse and when the laser receiver detected the laser spot, and where 300,000,000 m/s is a rounded value for the speed of light.
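As a sketch of the time-of-flight computation in Equation (1) — the function name and the use of seconds as the time unit are assumptions, and the emitter and receiver are treated as co-located:

```python
SPEED_OF_LIGHT_M_PER_S = 300_000_000  # rounded value from Equation (1)

def distance_from_time_delay(t_emit_s: float, t_detect_s: float) -> float:
    """Distance from the laser receiver to a detected laser spot.

    The pulse travels out to the spot and back, so the one-way distance
    is half the round-trip delay multiplied by the speed of light."""
    t_delay_s = t_detect_s - t_emit_s  # t_D in the text
    return t_delay_s * SPEED_OF_LIGHT_M_PER_S / 2

# A 200-nanosecond round trip corresponds to a 30-meter distance.
d = distance_from_time_delay(0.0, 200e-9)
```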
- the camera module 325 can process the images received from the one or more cameras to detect the one or more laser spots of each laser pulse group.
- the in-vehicle control computer stores in a hard drive or memory a set of previous images of one or more laser spots at various pre-determined distances from the location of the laser emitter. These previously stored images can include the one or more laser spots previously obtained from a flat road at various pre-determined distances.
- the camera module 325 can compare the previously stored one or more laser spots at a pre-determined distance to the images obtained of the one or more laser spots emitted by the laser emitter at the same pre-determined distance.
- the camera module 325 can determine a distance D_Detect from the camera to one or more laser spots based on the comparison of the stored and obtained images and/or based on the camera matrix.
- For each laser pulse group, the distances obtained from the laser receiver module 320 for each laser spot can be averaged, and the distances obtained from the camera module 325 for each laser spot can be averaged, to minimize the effects of distortion, noise, or lack of detection. For example, if a laser emitter generates and emits a laser pulse group at a pre-determined distance from the laser emitter where the laser pulse group includes six laser spots, the laser receiver module 320 can average the six distance measurements associated with the six detected spots to obtain a single averaged distance measurement. In the above example, the camera module 325 can also average six distance measurements associated with the six spots in a camera image to obtain a single averaged distance measurement. Thus, for each laser pulse group, the laser receiver module 320 and the camera module 325 can separately calculate a single averaged distance measurement, which can be considered a single D_Real value and a single D_Detect value, respectively.
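The per-group averaging described above can be sketched as follows; the helper name and the sample measurements are illustrative, not from the patent:

```python
def averaged_distance(per_spot_distances: list[float]) -> float:
    """Average the per-spot distance measurements of one laser pulse group
    to reduce the effect of distortion, noise, or missed detections."""
    if not per_spot_distances:
        raise ValueError("no laser spots were detected for this pulse group")
    return sum(per_spot_distances) / len(per_spot_distances)

# One pulse group with six detected spots yields one D_Real (receiver)
# value and one D_Detect (camera) value.
d_real = averaged_distance([9.9, 10.1, 10.0, 9.8, 10.2, 10.0])
d_detect = averaged_distance([8.1, 7.9, 8.0, 8.2, 7.8, 8.0])
```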
- the calibration module 330 can obtain from the laser receiver module 320 and camera module 325 the single D_Real and single D_Detect values for at least two laser pulse groups to solve for the Slope and Intercept parameters by using, for example, Equation (2) as shown below:

D_Real = Slope × D_Detect + Intercept  (2)
- the calibration module 330 obtains at least two D_Real values and at least two D_Detect values to calculate the slope and intercept parameters for each camera. For example, if a first camera captures images from 5 meters to 50 meters, and a second camera captures images from 50 meters to 200 meters, then the laser emitter can generate and emit a laser pulse group at each of the following exemplary distances: 10 meters, 20 meters, 70 meters, and 120 meters.
- the laser receiver module 320 provides a first D_Real value for the laser pulse group emitted at 10 meters and a second D_Real value for the laser pulse group emitted at 20 meters,
- the camera module 325 provides a first D_Detect value for the laser pulse group emitted at 10 meters and a second D_Detect value for the laser pulse group emitted at 20 meters, and
- the laser receiver module 320 and the camera module 325 also provide two D_Real values for the laser pulse groups emitted at 70 meters and 120 meters and two D_Detect values for the laser pulse groups emitted at 70 meters and 120 meters.
- the calibration module 330 can use the two D_Real values and two D_Detect values for each set of two or more distances within an imaging range of each camera (e.g., 10 meters and 20 meters for the first camera) to solve for the intercept and slope parameters for each camera (e.g., the first camera). For example, the calibration module 330 can use the two D_Real values and two D_Detect values for the laser pulse groups emitted at 10 meters and 20 meters to solve for the intercept and slope parameters for the first camera using Equation (2).
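Under the linear model of Equation (2), D_Real = Slope × D_Detect + Intercept, two (D_Detect, D_Real) pairs determine the two parameters exactly. A minimal sketch, with an assumed function name and illustrative camera-derived distances:

```python
def solve_slope_intercept(d_detect_1: float, d_real_1: float,
                          d_detect_2: float, d_real_2: float) -> tuple[float, float]:
    """Solve the two linear equations d_real_i = slope * d_detect_i + intercept
    for the two unknown calibration parameters."""
    if d_detect_1 == d_detect_2:
        raise ValueError("the two laser pulse groups must be at different distances")
    slope = (d_real_2 - d_real_1) / (d_detect_2 - d_detect_1)
    intercept = d_real_1 - slope * d_detect_1
    return slope, intercept

# First camera (5 m to 50 m): pulse groups emitted at 10 m and 20 m,
# with illustrative camera-derived distances of 8 m and 19 m.
slope, intercept = solve_slope_intercept(8.0, 10.0, 19.0, 20.0)
```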
- a non-linear equation such as an n-th order polynomial can be used to determine the slope and intercept values based on n-number of D_Real and D_Detect values, where n is greater than or equal to two.
- the n-number of D_Real and D_Detect values can be obtained from n-number of laser pulse groups each comprising one or more laser spots.
- a simplified linear equation can be used (e.g., Equation (2)) to determine the slope and intercept values based on at least two D_Real and D_Detect values.
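The patent does not fix a particular fitting procedure for the non-linear case; one illustrative approach fits an (n−1)-th order polynomial exactly through n (D_Detect, D_Real) pairs by solving the Vandermonde system with Gauss-Jordan elimination:

```python
def fit_polynomial(d_detect: list[float], d_real: list[float]) -> list[float]:
    """Return coefficients [c0, c1, ...] such that
    D_Real = c0 + c1*D_Detect + c2*D_Detect**2 + ... through all n pairs."""
    n = len(d_detect)
    # Augmented Vandermonde matrix: one row per (D_Detect, D_Real) pair.
    a = [[x ** j for j in range(n)] + [y] for x, y in zip(d_detect, d_real)]
    for col in range(n):
        # Partial pivoting for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for row in range(n):
            if row != col:
                factor = a[row][col] / a[col][col]
                a[row] = [v - factor * w for v, w in zip(a[row], a[col])]
    return [a[i][n] / a[i][i] for i in range(n)]

# With n = 2 this reduces to the linear slope-intercept case of Equation (2).
c0, c1 = fit_polynomial([8.0, 19.0], [10.0, 20.0])  # intercept, slope
```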
- the distance module 335 can determine distances from the vehicle to one or more objects located in an environment that surrounds the vehicle as the vehicle is being driven to its destination. The distance module 335 can use the slope and intercept parameters of a camera when determining a distance of an object located in an image obtained from the camera.
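A sketch of how the distance module might apply the calibrated parameters at run time; the function name and sample values are illustrative:

```python
def calibrated_distance(d_detect: float, slope: float, intercept: float) -> float:
    """Correct a camera-derived distance with the camera's calibration
    parameters, following the slope-intercept form of Equation (2)."""
    return slope * d_detect + intercept

# With slope = 10/11 and intercept = 30/11 (from the 10 m / 20 m example),
# a raw camera estimate of 8.0 m maps back to 10.0 m.
corrected = calibrated_distance(8.0, 10 / 11, 30 / 11)
```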
- FIG. 4 shows an exemplary flow diagram of performing camera calibration.
- a laser emitter located on a vehicle and pointed towards a road at a first angle emits a first laser pulse group towards a first location on a road.
- the laser emitter pointed towards the road at a second angle emits a second laser pulse group towards a second location on the road.
- Each of the first laser pulse group and the second laser pulse group comprises one or more laser spots.
- Operations 406 to 412 are performed for each of the first laser pulse group and the second laser pulse group emitted at the first location and the second location, respectively.
- a laser receiver located on the vehicle detects the one or more laser spots.
- a first set of distances from a location of the laser receiver to the one or more laser spots are calculated.
- an image comprising the one or more laser spots is obtained from a camera located on the vehicle.
- a second set of distances from a location of the camera to the one or more laser spots are determined from the image
- two camera calibration parameters of the camera are determined by solving two equations.
- Each equation includes two unknown camera calibration parameters, and a first value associated with the first set of distances and a second value associated with the second set of distances for a same laser pulse group.
- the determining the two camera calibration parameters comprises determining a slope value and an intercept value by determining, for both the first laser pulse group and the second laser pulse group, a first average distance of the first set of distances and a second average distance of the second set of distances, where the first value is the first average distance and the second value is the second average distance, and by determining, by solving two linear equations, the slope value and the intercept value with which the camera is calibrated, where a first linear equation includes an unknown slope value, an unknown intercept value, and the first average distance and the second average distance corresponding to the first laser pulse group, and where a second linear equation includes the unknown slope value, the unknown intercept value, and the first average distance and the second average distance corresponding to the second laser pulse group.
- the method of FIG. 4 further includes determining, while the vehicle is being driven, a location of an object in another image obtained from the camera by using the slope value and intercept value of the camera.
- the first location is located at a first pre-determined distance along the road from a location of the laser emitter
- the second location is located at a second pre-determined distance along the road from the location of the laser emitter.
- the first pre-determined distance and the second pre-determined distance are within a range of distances from a location of the camera at which the camera is configured to obtain the images.
- the method of FIG. 4 further includes determining the first angle and the second angle formed in between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter, and adjusting the laser emitter according to each of the first angle and the second angle to emit the first laser pulse group and the second laser pulse group at the first pre-determined distance and the second pre-determined distance, respectively.
- each of the first pre-determined distance and the second pre-determined distance is a distance from the location of the laser emitter to at least one laser spot of the one or more laser spots.
- each of the first laser pulse group and the second laser pulse group is emitted for a pre-determined time duration.
- the pre-determined time duration is greater than a reciprocal of a frame rate of the camera.
- the detecting the one or more laser spots and the calculating the first set of distances comprises: receiving, from the laser receiver, signals indicative of times when the one or more laser spots are detected, and calculating, for each laser spot, a distance from the laser receiver to a laser spot by: obtaining a time delay value by subtracting a first time when a laser pulse group that included the one or more laser spots was emitted from a second time of detection of the laser spot, and multiplying the time delay value by one-half of the speed of light.
- the first laser pulse group and the second laser pulse group are emitted when the vehicle is stopped or parked on the road.
- the camera is located adjacent to the laser receiver.
- methods described in the various embodiments in this patent document are embodied in a computer readable program stored on a non-transitory computer readable medium.
- the computer readable program includes code that, when executed by a processor, causes the processor to perform the methods described in this patent document, including the method described in FIG. 4 .
- the term “exemplary” is used to mean “an example of” and, unless otherwise stated, does not imply an ideal or a preferred embodiment.
- a computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc. Therefore, the computer-readable media can include a non-transitory storage media.
- program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- Computer- or processor-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
- a hardware circuit implementation can include discrete analog and/or digital components that are, for example, integrated as part of a printed circuit board.
- the disclosed components or modules can be implemented as an Application Specific Integrated Circuit (ASIC) and/or as a Field Programmable Gate Array (FPGA) device.
- the various components or sub-components within each module may be implemented in software, hardware or firmware.
- the connectivity between the modules and/or components within the modules may be provided using any one of the connectivity methods and media that is known in the art, including, but not limited to, communications over the Internet, wired, or wireless networks using the appropriate protocols.
Abstract
Technique for performing camera calibration on a vehicle is disclosed. A method of performing camera calibration includes emitting, by a laser emitter located on a vehicle and pointed towards a road, a first laser pulse group towards a first location on a road and a second laser pulse group towards a second location on the road, where each laser pulse group includes one or more laser spots. For each laser pulse group: a first set of distances are calculated from a location of a laser receiver to the one or more laser spots, and a second set of distances are determined from an image obtained from a camera, where the second set of distances are from a location of the camera to the one or more laser spots. The method also includes determining two camera calibration parameters of the camera by solving two equations.
Description
- This application is a continuation of U.S. patent application Ser. No. 17/648,992, filed Jan. 26, 2022, by Xiaoling Han et al. and entitled “VEHICLE CAMERA CALIBRATION SYSTEM,” which is a continuation of U.S. patent application Ser. No. 17/082,551, filed Oct. 28, 2020, by Xiaoling Han et al. and entitled “VEHICLE CAMERA CALIBRATION SYSTEM,” now U.S. Pat. No. 11,359,932, issued Jun. 14, 2022, which is a continuation of U.S. patent application Ser. No. 16/572,001, filed Sep. 16, 2019, by Xiaoling Han et al. and entitled “VEHICLE CAMERA CALIBRATION SYSTEM,” now U.S. Pat. No. 10,837,795, issued Nov. 17, 2020, each of which is incorporated herein by reference.
- This document describes techniques to perform camera calibration in a vehicle.
- A vehicle may include cameras attached to the vehicle for several purposes. For example, cameras may be attached to a roof of the vehicle for security purposes, for driving aid, or for facilitating autonomous driving. Cameras mounted on a vehicle can obtain images of one or more areas surrounding the vehicle. These images can be processed to obtain information about the road or about the objects surrounding the vehicle. For example, images obtained by a camera can be analyzed to determine distances of objects surrounding the autonomous vehicle so that the autonomous vehicle can be safely maneuvered around the objects.
- Camera calibration techniques are described to determine slope and intercept values for cameras on a vehicle. In an exemplary embodiment, a method of performing camera calibration includes emitting, by a laser emitter located on a vehicle and pointed towards a road at a first angle, a first laser pulse group towards a first location on a road, emitting, by the laser emitter pointed towards the road at a second angle, a second laser pulse group towards a second location on the road, where each of the first laser pulse group and the second laser pulse group comprises one or more laser spots. For each of the first laser pulse group and the second laser pulse group emitted at the first location and the second location, respectively, the method further includes detecting, by a laser receiver located on the vehicle, the one or more laser spots, calculating a first set of distances from a location of the laser receiver to the one or more laser spots, obtaining, from a camera located on the vehicle, an image comprising the one or more laser spots, and determining, from the image, a second set of distances from a location of the camera to the one or more laser spots. The method further includes determining two camera calibration parameters of the camera by solving two equations, where each equation includes two unknown camera calibration parameters, and a first value associated with the first set of distances and a second value associated with the second set of distances for a same laser pulse group.
- In some embodiments, the determining the two camera calibration parameters comprises determining a slope value and an intercept value by: determining, for both the first laser pulse group and the second laser pulse group, a first average distance of the first set of distances and a second average distance of the second set of distances, where the first value is the first average distance and the second value is the second average distance, and determining, by solving two linear equations, the slope value and the intercept value with which the camera is calibrated, where a first linear equation includes an unknown slope value, an unknown intercept value, and the first average distance and the second average distance corresponding to the first laser pulse group, and where a second linear equation includes the unknown slope value, the unknown intercept value, and the first average distance and the second average distance corresponding to the second laser pulse group. In some embodiments, the method further comprises determining, while the vehicle is being driven, a location of an object in another image obtained from the camera by using the slope value and intercept value of the camera.
- In some embodiments, the first location is located at a first pre-determined distance along the road from a location of the laser emitter, and the second location is located at a second pre-determined distance along the road from the location of the laser emitter. In some embodiments, the first pre-determined distance and the second pre-determined distance are within a range of distances from a location of the camera at which the camera is configured to obtain the images. In some embodiments, the method further comprises: determining the first angle and the second angle formed in between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter, and adjusting the laser emitter according to each of the first angle and the second angle to emit the first laser pulse group and the second laser pulse group at the first pre-determined distance and the second pre-determined distance, respectively. In some embodiments, each of the first pre-determined distance and the second pre-determined distance is a distance from the location of the laser emitter to at least one laser spot of the one or more laser spots.
- In some embodiments, each of the first laser pulse group and the second laser pulse group is emitted for a pre-determined time duration. In some embodiments, the pre-determined time duration is greater than a reciprocal of a frame rate of the camera.
- In some embodiments, the detecting the one or more laser spots and the calculating the first set of distances comprises: receiving, from the laser receiver, signals indicative of times when the one or more laser spots are detected, and calculating, for each laser spot, a distance from the laser receiver to a laser spot by: obtaining a time delay value by subtracting a first time when a laser pulse group that included the one or more laser spots was emitted from a second time of detection of the laser spot, and multiplying the time delay value by one-half of the speed of light.
- In some embodiments, the first laser pulse group and the second laser pulse group are emitted when the vehicle is stopped or parked on the road. In some embodiments, the camera is located adjacent to the laser receiver.
- In another exemplary aspect, the above-described methods are embodied in the form of processor-executable code and stored in a computer-readable program medium. The computer readable program is stored on a non-transitory computer readable medium, the computer readable program including code that, when executed by a processor, causes the processor to implement the methods described in this patent document. For example, the computer readable program includes code that, when executed by a processor, causes the processor to: instruct a laser emitter located on a vehicle to point towards a road at a first angle and emit a first laser pulse group towards a first location on the road, and instruct the laser emitter to point towards the road at a second angle and emit a second laser pulse group towards a second location on the road, where each of the first laser pulse group and the second laser pulse group comprises one or more laser spots. For each of the first laser pulse group and the second laser pulse group emitted at the first location and the second location, respectively, the computer readable program when executed by the processor further causes the processor to detect, by a laser receiver located on the vehicle, the one or more laser spots, calculate a first set of distances from a location of the laser receiver to the one or more laser spots, obtain, from a camera located on the vehicle, an image comprising the one or more laser spots, and determine, from the image, a second set of distances from a location of the camera to the one or more laser spots. The computer readable program when executed by the processor further causes the processor to determine two camera calibration parameters of the camera by solving two equations, where each equation includes two unknown camera calibration parameters, and a first value associated with the first set of distances and a second value associated with the second set of distances for a same laser pulse group.
- In yet another exemplary embodiment, a device that is configured or operable to perform the above-described methods is disclosed.
- In another aspect, a vehicle fitted with a camera calibration apparatus is disclosed.
- The above and other aspects and their implementations are described in greater detail in the drawings, the descriptions, and the claims.
-
FIG. 1 shows a conventional system to perform a static calibration operation with a camera on a vehicle. -
FIG. 2A shows an exemplary camera calibration system. -
FIG. 2B shows an exemplary camera and laser system that includes a camera, a laser emitter, and a laser receiver. -
FIG. 2C shows two exemplary laser pulses generated and emitted by a laser emitter that may be sensed by a laser receiver. -
FIG. 3 shows an exemplary block diagram of an in-vehicle control computer located in a vehicle to control the camera and laser system. -
FIG. 4 shows an exemplary flow diagram of performing camera calibration. - An autonomous vehicle includes cameras mounted on the autonomous vehicle to obtain images of one or more areas surrounding the autonomous vehicle. These images can be analyzed by a computer on-board the autonomous vehicle to obtain distance or other information about the road or about the objects surrounding the autonomous vehicle. However, the cameras on the autonomous vehicle need to be calibrated so that the computer on-board the autonomous vehicle can precisely or accurately detect an object and determine its distance. In a conventional system, a driver in an autonomous vehicle can perform a static calibration technique before the autonomous vehicle is driven to its destination.
-
FIG. 1 shows a conventional system 100 to perform a static calibration operation with a camera on a vehicle. In the conventional system 100 , a vehicle 106 is first driven to a location where a target calibration board 102 is located. The target calibration board 102 has a pre-defined pattern of black and white squares. Furthermore, the target calibration board 102 is located at a fixed distance from the vehicle 106 so that the distance between the camera 104 and the target calibration board 102 is fixed. The fixed distance is set according to the focal length of the camera. The camera 104 can take several pictures of the target calibration board 102 . The computer on-board the vehicle 106 can use spatial geometry relations to calculate the distance between the target calibration board 102 and the camera 104 . The computer on-board the vehicle 106 can adjust the slope and intercept calibration parameters to compensate for differences between the known fixed distance and the calculated distance. After the calibration is performed, the vehicle 106 can be driven to its destination. - The conventional system 100 described in FIG. 1 has several technical drawbacks. First, the slope and intercept values obtained from the calibration process can become inaccurate as the vehicle is driven towards its destination. For instance, when a vehicle is driven towards its destination, a camera's position can change in response to vibrations from the vehicle or in response to forces exerted by environmental elements (e.g., wind). However, in a conventional system 100 , the slope and intercept values cannot be easily re-adjusted. - Second, a conventional system cannot easily perform the calibration process that should be performed periodically (e.g., daily, weekly, or monthly) to re-adjust the camera's slope and intercept values. Third, the calibration process performed by a conventional system may not account for a condition of the road on which the vehicle will be driven towards its destination. For example, a conventional system may perform a calibration process on a flat surface, but the vehicle may have a different pitch angle as it is driven uphill or downhill to its destination. A different pitch angle of the truck can affect the calibration result and thus affect the distance measurements derived from images obtained by a camera. The techniques described in the present document may be used by some embodiments to solve the above-discussed problems, and others.
- This patent document describes exemplary camera calibration techniques to determine camera calibration parameters for a camera. In an exemplary embodiment, a laser emitter on a vehicle emits towards a road a first laser pulse group and a second laser pulse group, where each laser pulse group includes one or more laser spots. The first laser pulse group is emitted at a first distance along or parallel to the road from the location of the laser emitter, and the second laser pulse group is emitted at a second distance along or parallel to the road from the location of the laser emitter. For each laser pulse group emitted, a laser receiver on the vehicle can detect the one or more laser spots and a camera on the vehicle can obtain an image of the one or more laser spots. The exemplary calibration technique can determine a first set of distances from the one or more laser spots detected by the laser receiver and a second set of distances from the one or more laser spots in the image obtained by the camera. The exemplary calibration technique can solve for two calibration parameters using two equations, where each equation can include two unknown camera calibration parameters, and both a first average of the first set of distances and a second average of the second set of distances for a same laser pulse group. For example, if the real distances of laser spots determined from a laser receiver are 10 meters, 20 meters, 30 meters, 50 meters, and 100 meters but the measured distances determined based on images obtained from a camera are 8 meters, 19 meters, 31 meters, 52 meters, and 105 meters, respectively, then the techniques described in this patent document can perform a regression to determine the slope and intercept calibration parameters between measured distance and real distance.
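The regression mentioned in the example above can be sketched with an ordinary least-squares fit; the function name and the pairing of the example distances are assumptions:

```python
def linear_regression(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least-squares fit of ys = slope * xs + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Measured (camera) distances regressed against real (receiver) distances
# from the example in the text.
measured = [8.0, 19.0, 31.0, 52.0, 105.0]
real = [10.0, 20.0, 30.0, 50.0, 100.0]
slope, intercept = linear_regression(measured, real)
```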
-
FIG. 2A shows an exemplary camera calibration system 200 that includes a vehicle 202 stopped or parked on a road 208 , where the vehicle 202 includes a camera and laser system 204 . The vehicle 202 can be an autonomous vehicle. The camera and laser system 204 includes at least one laser emitter (or a laser source), one or more cameras, and one or more laser receivers. Each camera may be adjacent to one of the laser receivers so that the distance measurements obtained from the camera images can be related to the distance measurements obtained from the data provided by the laser receivers, as further explained in this patent document. For example, a camera can be within one foot of a location of a laser receiver. For ease of description, the camera and laser system 204 in FIGS. 2A and 2B is shown as a single camera with a single laser emitter and a single laser receiver in an integrated unit coupled to the vehicle. In some embodiments, the laser emitter, the one or more cameras, and the one or more laser receivers may be separate units coupled to the vehicle. - The laser emitter is pointed towards the road 208 and in the same direction as the one or more cameras and the one or more laser receivers. The laser emitter generates and emits towards the road 208 a laser pulse group that includes one or more laser spots 206 . The direction toward which the laser pulse group is emitted is chosen such that at least one of the spots of the one or more laser spots 206 can be sensed by the one or more laser receivers and imaged or recorded by the one or more cameras as shown in FIG. 2B . - In some embodiments, the laser emitter has an adjustable pitch angle such that it can emit a laser pulse toward the road 208 at multiple pre-determined distances along the road or parallel to the road from the location of the laser emitter for calibrating multiple cameras. As shown in FIG. 2A , a pre-determined distance 210 can be defined along the road or parallel to the road from the location of the laser emitter to at least one spot of the one or more laser spots 206 . A pitch angle (shown as “θ” in FIG. 2A ) can be considered an angle formed in between the direction in which the laser emitter is pointing and an imaginary horizontal plane or horizontal line at least partly parallel to the road and including at least a portion of the laser emitter. In some embodiments, a pitch angle can be considered an angle formed between the direction in which the laser emitter is pointing and an imaginary vertical plane or vertical line at least partly perpendicular to the road and including at least a portion of the laser emitter. - The adjustable pitch angle can allow the laser emitter to generate and emit a plurality of laser pulse groups at various pre-determined distances from the location of the laser emitter. As further explained herein, a laser pulse group includes one or more laser spots 206 ; thus the pre-determined distance 210 can be a distance from the laser emitter to at least one laser spot or point of the one or more laser spots 206 . For example, a laser emitter can emit laser pulse groups with at least one laser spot or point located at pre-determined distances of 10 meters, 20 meters, 70 meters, 120 meters, 300 meters, and 350 meters from the location of the laser emitter.
- As shown in
FIG. 2A, the laser emitter can be installed on the vehicle 202 so that the distance between the laser emitter and the road is fixed, previously known, or a function of the velocity of the vehicle. Thus, at each measurement instance, an in-vehicle control computer located in the vehicle 202 includes a laser emitter module (315 in FIG. 3) that can use a trigonometric function to instruct the laser emitter to adjust the pitch angle and to emit a laser pulse group at each of two or more pre-determined distances from the location of the laser emitter, where the pre-determined distances are parallel to a road or along the road. - A laser emitter with an adjustable pitch angle is a beneficial technical feature at least because it can allow the exemplary calibration technique (as explained herein) to calibrate multiple cameras that capture images at different ranges of distances from the location of the vehicle. In some embodiments, laser pulse groups generated and emitted at a set of two or more pre-determined distances are used to calibrate the slope and intercept calibration parameters of each camera. The slope and intercept calibration parameters can be solved by setting them up in slope-intercept form as shown in Equation (2) below, where the slope parameter corresponds to the slope and the intercept parameter corresponds to the intercept (e.g., the distance from the origin where a line, curve, or surface intersects a plane or axis).
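On a flat road, the trigonometric relationship the laser emitter module can rely on is tan(θ) = (emitter height) / (distance along the road), so the pitch angle for each pre-determined distance follows directly. A minimal sketch under that flat-road assumption (the 1.5-meter mounting height is illustrative, not from this patent document):

```python
import math

def pitch_angle_deg(emitter_height_m, target_distance_m):
    """Pitch angle below the horizontal plane through the emitter that
    places a laser spot at the given distance along a flat road."""
    return math.degrees(math.atan2(emitter_height_m, target_distance_m))

# Angles for the exemplary pre-determined distances in the text
for d in (10, 20, 70, 120, 300, 350):
    print(f"{d:>4} m -> {pitch_angle_deg(1.5, d):.3f} deg")
```

Farther targets need a shallower pitch, which is why the angle must be adjustable to serve cameras with different ranges.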
Continuing with the example from above, calibration parameters of a first camera that captures images from 5 meters to 50 meters can be adjusted based on laser pulse groups emitted at distances of 10 meters and 20 meters, calibration parameters of a second camera that captures images from 50 meters to 200 meters can be adjusted based on laser pulse groups emitted at distances of 70 meters and 120 meters, and calibration parameters of a third camera that captures images from 200 meters to 500 meters can be adjusted based on laser pulse groups emitted at distances of 300 meters and 350 meters.
- The laser emitter can generate and emit multiple laser pulse groups at multiple pre-determined distances from the location of the laser emitter that is pointed towards a road. Each laser pulse group is emitted towards an object (e.g., road) at a pre-determined distance along the road from the location of the laser emitter. As further explained below, the laser emitter generates and emits a laser pulse group to impinge one or more laser spots upon an expected opaque or reflective object or on a road approximately at each pre-determined distance. By using a laser pulse group having one or
more laser spots 206, the in-vehicle control computer in the vehicle 202 can perform filtering operations to minimize the effects of noise, distortion, or lack of detection by the laser receiver and/or camera as they sense or image the one or more laser spots or points of the one or more laser spots 206. - For example, in some embodiments, the laser emitter generates and emits each laser pulse group for a pre-determined time duration to allow a camera and/or a laser receiver sufficient time to detect or capture the one or more laser spots 206. For example, if a camera's frame rate is 60 frames per second, the pre-determined time duration of the laser pulse group emitted by the laser emitter can be greater than a reciprocal of the frame rate (e.g., 1/60 seconds, or approximately 16.67 milliseconds). In some implementations, the pre-determined time duration can be at least 60 milliseconds so that at least three frames of the camera can capture the laser pulse group.
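The frame-rate constraint above can be written directly: the pulse-group duration must exceed one frame period per camera frame that should capture it. A brief sketch (one full frame period per required frame; the actual margin, such as the 60-millisecond figure above, is a design choice):

```python
def min_pulse_duration_ms(frame_rate_hz, min_frames=1):
    """Shortest pulse-group duration guaranteeing the camera can expose
    at least `min_frames` frames: one frame period per required frame."""
    return min_frames * 1000.0 / frame_rate_hz

print(min_pulse_duration_ms(60))      # one frame at 60 fps, about 16.67 ms
print(min_pulse_duration_ms(60, 3))   # three frames; the text allows 60 ms for margin
```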
- Each laser pulse group may include the same one or more laser spots 206. The one or
more laser spots 206 includes several individual laser spots or points. For example, as shown in FIG. 2A, the one or more laser spots 206 includes four laser spots that can be located at four vertices of an imaginary quadrilateral in a spatial laser pattern. As shown in FIG. 2A, the one or more laser spots 206 is emitted toward a road. This is a beneficial technical feature that can eliminate the need to use the target calibration board as described in FIG. 1. Furthermore, the exemplary calibration technique described in this patent document can account for variations in road conditions by generating the one or more laser spots 206 on the road. -
FIG. 2C shows two exemplary laser pulses generated and emitted by a laser emitter that may be sensed by a laser receiver. For ease of description, each laser pulse 252, 254 is shown individually, where laser pulse 254 is emitted at a time tR after the emission of laser pulse 252. As shown in FIG. 2C, the laser receiver sends signals indicative of the times when the laser spots associated with laser pulses 252, 254 are sensed, as further described in FIGS. 3 and 4. - A laser emitter can generate and emit multiple laser pulse groups comprising one or more laser spots at multiple pre-determined distances along a road from the location of the laser emitter, where the one or more laser receivers can sense one or more laser spots from each laser pulse group. For example, if a laser emitter emits a laser pulse group including four laser spots to be shown at a distance of 10 meters along a road from the location of the laser emitter, then one or more cameras that are configured to capture images from 5 meters to 50 meters and each of the one or more laser receivers adjacent to a camera can sense and/or obtain up to four laser spots from the laser pulse group.
-
FIG. 3 shows an exemplary block diagram of an in-vehicle control computer located in a vehicle to control the camera and laser system as described in FIGS. 2A and 2B. The in-vehicle control computer 300 includes at least one processor 310 and a memory 305 having instructions stored thereupon. The instructions, upon execution by the processor 310, configure the computer 300 to perform the operations described for a laser emitter, one or more laser receivers, one or more cameras, and various modules as described in FIGS. 2A to 4, and/or the operations described in the various embodiments or sections in this patent document. The laser emitter module 315 controls the operations of the laser emitter as described in this patent document. - The
laser receiver module 320 controls the operations of the one or more laser receivers as described in this patent document. As mentioned above, the laser emitter generates and emits multiple laser pulse groups, where each laser pulse group includes one or more laser spots. The laser receiver module 320 can receive signals from a laser receiver indicative of the time(s) when the one or more laser spots are sensed by the laser receiver as explained in this patent document. The laser receiver module 320 can also obtain from the laser emitter module 315 the plurality of times corresponding to the plurality of emitted laser pulse groups. Based on the signals received from the laser receiver for each detected laser spot and based on the time when a laser pulse is transmitted, the laser receiver module 320 can calculate a distance between the laser receiver and each of one or more detected laser spots based on Equation (1) as shown below: -
D_Real = ½ × 300,000,000 m/s × t_D    Equation (1) - where t_D for each detected laser spot is the difference in time between when the laser emitter emitted the laser pulse and when the laser receiver detected a laser spot, and where 300,000,000 m/s is a rounded value for the speed of light.
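Equation (1) is the standard time-of-flight relation; a minimal sketch using the rounded speed-of-light value from the text:

```python
SPEED_OF_LIGHT_M_S = 300_000_000  # rounded value used in Equation (1)

def d_real(t_emit_s, t_detect_s):
    """Distance to a detected laser spot: half the round trip travelled
    at the speed of light during the delay t_D = t_detect - t_emit."""
    t_d = t_detect_s - t_emit_s
    return 0.5 * SPEED_OF_LIGHT_M_S * t_d

# A spot detected about 66.7 ns after emission lies about 10 m away
print(d_real(0.0, 20 / SPEED_OF_LIGHT_M_S))
```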
- The
camera module 325 can process the images received from the one or more cameras to detect the one or more laser spots of each laser pulse group. The in-vehicle control computer stores in a hard drive or memory a set of previous images of one or more laser spots at various pre-determined distances from the location of the laser emitter. These previously stored images can include the one or more laser spots previously obtained from a flat road at various pre-determined distances. The camera module 325 can compare the previously stored one or more laser spots at a pre-determined distance to the images obtained of the one or more laser spots emitted by the laser emitter at the same pre-determined distance. The camera module 325 can determine a distance D_Detect from the camera to one or more laser spots based on the comparison of the stored and obtained images and/or based on the camera matrix. - For each laser pulse group, distances obtained from the
laser receiver module 320 for each laser spot can be averaged and the distances obtained from the camera module 325 for each laser spot can be averaged to minimize the effects of distortion, noise, or lack of detection. For example, if a laser emitter generates and emits a laser pulse group at a pre-determined distance from the laser emitter where the laser pulse group includes six laser spots, the laser receiver module 320 can average the six distance measurements associated with the six detected spots to obtain a single averaged distance measurement. In the above example, the camera module 325 can also average six distance measurements associated with the six spots in a camera image to obtain a single averaged distance measurement. Thus, for each laser pulse group, the laser receiver module 320 and the camera module 325 can separately calculate a single averaged distance measurement which can be considered a single D_Real value and a single D_Detect value, respectively. - The
calibration module 330 can obtain from the laser receiver module 320 and camera module 325 the single D_Real and single D_Detect values for at least two laser pulse groups to solve for the Slope and Intercept parameters by using, for example, Equation (2) as shown below: -
D_Real = Slope × D_Detect + Intercept    Equation (2) - Since each camera operates within a range of distances, the
calibration module 330 obtains at least two D_Real values and at least two D_Detect values to calculate the slope and intercept parameters for each camera. For example, if a first camera captures images from 5 meters to 50 meters, and a second camera captures images from 50 meters to 200 meters, then the laser emitter can generate and emit a laser pulse group at each of the following exemplary distances: 10 meters, 20 meters, 70 meters, and 120 meters. In this example, the laser receiver module 320 provides a first D_Real value for the laser pulse group emitted at 10 meters and a second D_Real value for the laser pulse group emitted at 20 meters, and the camera module 325 provides a first D_Detect value for the laser pulse group emitted at 10 meters and a second D_Detect value for the laser pulse group emitted at 20 meters. Similarly, the laser receiver module 320 and the camera module 325 also provide two D_Real values for the laser pulse groups emitted at 70 meters and 120 meters and two D_Detect values for the laser pulse groups emitted at 70 meters and 120 meters. The calibration module 330 can use the two D_Real values and two D_Detect values for each set of two or more distances within an imaging range of each camera (e.g., 10 meters and 20 meters for the first camera) to solve for the intercept and slope parameters for that camera. For example, the calibration module 330 can use the two D_Real values and two D_Detect values for the laser pulse groups emitted at 10 meters and 20 meters to solve for the intercept and slope parameters for the first camera using Equation (2). - In some embodiments, a non-linear equation such as an n-th order polynomial can be used to determine the slope and intercept values based on n-number of D_Real and D_Detect values, where n is greater than or equal to two. The n-number of D_Real and D_Detect values can be obtained from n-number of laser pulse groups each comprising one or more laser spots.
In some embodiments, as explained in this patent document, a simplified linear equation can be used (e.g., Equation (2)) to determine the slope and intercept values based on at least two D_Real and D_Detect values.
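The averaging and linear-solving steps described above can be sketched as follows, where the per-spot distances are illustrative numbers rather than values from this patent document:

```python
def average(distances):
    """Collapse the per-spot distances of one laser pulse group into a
    single measurement, suppressing noise and missed detections."""
    return sum(distances) / len(distances)

def solve_slope_intercept(pair_a, pair_b):
    """Solve D_Real = Slope * D_Detect + Intercept (Equation (2)) from
    two (D_Detect, D_Real) pairs taken within one camera's range."""
    (x1, y1), (x2, y2) = pair_a, pair_b
    if x1 == x2:
        raise ValueError("the two D_Detect values must differ")
    slope = (y1 - y2) / (x1 - x2)
    intercept = y1 - slope * x1
    return slope, intercept

# First camera: pulse groups emitted at ~10 m and ~20 m, four spots each
d_real_10 = average([10.01, 9.99, 10.02, 9.98])   # laser receiver module
d_detect_10 = average([10.5, 10.3, 10.4, 10.4])   # camera module
d_real_20 = average([20.02, 19.98, 20.01, 19.99])
d_detect_20 = average([21.0, 20.8, 20.9, 20.9])
slope, intercept = solve_slope_intercept((d_detect_10, d_real_10),
                                         (d_detect_20, d_real_20))
print(slope, intercept)
```

With n pulse groups, the same pairs could instead be fitted with an n-th order polynomial as the text notes; the two-point linear case shown here corresponds to Equation (2).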
- After the
calibration module 330 obtains the intercept and slope parameters for the cameras on the vehicle, the distance module 335 can determine distances from the vehicle to one or more objects located in an environment that surrounds the vehicle as the vehicle is being driven to its destination. The distance module 335 can use the slope and intercept parameters of a camera when determining a distance of an object located in an image obtained from the camera. -
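Applying the calibration is then a single evaluation of Equation (2); a minimal sketch with illustrative (not patent-specified) slope and intercept values:

```python
def calibrated_distance(d_detect, slope, intercept):
    """Apply a camera's slope and intercept calibration parameters to a
    distance estimated from one of its images (Equation (2))."""
    return slope * d_detect + intercept

# Illustrative parameters for one camera; yields roughly 30 m
print(calibrated_distance(31.5, slope=0.95, intercept=0.1))
```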
FIG. 4 shows an exemplary flow diagram of performing camera calibration. At the first emitting operation 402, a laser emitter located on a vehicle and pointed towards a road at a first angle emits a first laser pulse group towards a first location on the road. At the second emitting operation 404, the laser emitter pointed towards the road at a second angle emits a second laser pulse group towards a second location on the road. Each of the first laser pulse group and the second laser pulse group comprises one or more laser spots. -
Operations 406 to 412 are performed for each of the first laser pulse group and the second laser pulse group emitted at the first location and the second location, respectively. At the detecting operation 406, a laser receiver located on the vehicle detects the one or more laser spots. At the calculating operation 408, a first set of distances from a location of the laser receiver to the one or more laser spots is calculated. At the obtaining operation 410, an image comprising the one or more laser spots is obtained from a camera located on the vehicle. At the determining operation 412, a second set of distances from a location of the camera to the one or more laser spots is determined from the image. - At the determining
operation 414, two camera calibration parameters of the camera are determined by solving two equations. Each equation includes two unknown camera calibration parameters, and a first value associated with the first set of distances and a second value associated with the second set of distances for a same laser pulse group. - In some embodiments, the determining the two camera calibration parameters comprises determining a slope value and an intercept value by determining, for both the first laser pulse group and the second laser pulse group, a first average distance of the first set of distances and a second average distance of the second set of distances, where the first value is the first average distance and the second value is the second average distance, and by determining, by solving two linear equations, the slope value and the intercept value with which the camera is calibrated, where a first linear equation includes an unknown slope value, an unknown intercept value, and the first average distance and the second average distance corresponding to the first laser pulse group, and where a second linear equation includes the unknown slope value, the unknown intercept value, and the first average distance and the second average distance corresponding to the second laser pulse group.
- In some embodiments, the method of
FIG. 4 further includes determining, while the vehicle is being driven, a location of an object in another image obtained from the camera by using the slope value and intercept value of the camera. - In some embodiments, the first location is located at a first pre-determined distance along the road from a location of the laser emitter, and the second location is located at a second pre-determined distance along the road from the location of the laser emitter. In some embodiments, the first pre-determined distance and the second pre-determined distance are within a range of distances from a location of the camera at which the camera is configured to obtain the images.
- In some embodiments, the method of
FIG. 4 further includes determining the first angle and the second angle formed between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter, and adjusting the laser emitter according to each of the first angle and the second angle to emit the first laser pulse group and the second laser pulse group at the first pre-determined distance and the second pre-determined distance, respectively. In some embodiments, each of the first pre-determined distance and the second pre-determined distance is a distance from the location of the laser emitter to at least one laser spot of the one or more laser spots. - In some embodiments, each of the first laser pulse group and the second laser pulse group is emitted for a pre-determined time duration. In some embodiments, the pre-determined time duration is greater than a reciprocal of a frame rate of the camera.
- In some embodiments, the detecting the one or more laser spots and the calculating the first set of distances comprises: receiving, from the laser receiver, signals indicative of times when the one or more laser spots are detected, and calculating, for each laser spot, a distance from the laser receiver to a laser spot by: obtaining a time delay value by subtracting a second time when a laser pulse group that included the one or more laser spots was emitted from a first time of detection of the laser spot, and multiplying the time delay value by one-half of a speed of light.
- In some embodiments, the first laser pulse group and the second laser pulse group are emitted when the vehicle is stopped or parked on the road. In some embodiments, the camera is located adjacent to the laser receiver.
- In some implementations, methods described in the various embodiments in this patent document are embodied in a computer readable program stored on a non-transitory computer readable medium. The computer readable program includes code that, when executed by a processor, causes the processor to perform the methods described in this patent document, including the method described in
FIG. 4. In this document, the term “exemplary” is used to mean “an example of” and, unless otherwise stated, does not imply an ideal or a preferred embodiment. - Some of the embodiments described herein are described in the general context of methods or processes, which may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVD), etc. Therefore, the computer-readable media can include a non-transitory storage medium. Generally, program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer- or processor-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
- Some of the disclosed embodiments can be implemented as devices or modules using hardware circuits, software, or combinations thereof. For example, a hardware circuit implementation can include discrete analog and/or digital components that are, for example, integrated as part of a printed circuit board. Alternatively, or additionally, the disclosed components or modules can be implemented as an Application Specific Integrated Circuit (ASIC) and/or as a Field Programmable Gate Array (FPGA) device. Some implementations may additionally or alternatively include a digital signal processor (DSP) that is a specialized microprocessor with an architecture optimized for the operational needs of digital signal processing associated with the disclosed functionalities of this application. Similarly, the various components or sub-components within each module may be implemented in software, hardware or firmware. The connectivity between the modules and/or components within the modules may be provided using any one of the connectivity methods and media that is known in the art, including, but not limited to, communications over the Internet, wired, or wireless networks using the appropriate protocols.
- While this document contains many specifics, these should not be construed as limitations on the scope of an invention that is claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or a variation of a sub-combination. Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.
- Only a few implementations and examples are described and other implementations, enhancements and variations can be made based on what is described and illustrated in this disclosure.
Claims (20)
1. A method comprising:
emitting, by a laser emitter located on a vehicle, a laser pulse toward a first location on a road;
detecting, by a laser receiver located on the vehicle, a first distance between the laser receiver and the first location;
detecting, based at least in part upon an image captured by a camera pointed towards the first location, a second distance between the camera and the first location;
determining two camera calibration parameters based at least in part upon the first distance and the second distance; and
adjusting the two camera calibration parameters based at least on the laser pulse, such that the camera captures images of objects at the first location.
2. The method of claim 1 , wherein:
the laser emitter is pointed toward the first location on the road; and
the laser emitter forms a first angle between a direction in which the laser emitter is pointed and an imaginary horizontal plane that is at least partly parallel to the road.
3. The method of claim 1 , wherein the laser emitter is pointed toward the first location.
4. The method of claim 1 , wherein:
the laser emitter is located on a vehicle;
the laser receiver is located on the vehicle, and
the two camera calibration parameters are adjusted while the vehicle is moving.
5. The method of claim 1 , wherein the determining the two camera calibration parameters comprises solving two predetermined equations, wherein each of the two predetermined equations comprises the two camera calibration parameters, the first distance, and the second distance.
6. The method of claim 1 , further comprising determining, based at least in part upon a speed at which the vehicle is moving, a first angle formed in between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter.
7. The method of claim 6 , further comprising adjusting the laser emitter according to at least the first angle to emit the laser pulse toward the first location.
8. The method of claim 1 , wherein:
the two camera calibration parameters are a slope value and an intercept value;
the slope value is a result of the camera pointing at an angle with respect to an imaginary horizontal plane at least partially parallel to the road; and
the intercept value is a distance from the camera where a line intersects the imaginary horizontal plane when the camera is pointed at the angle.
9. The method of claim 1 , wherein:
a distance between the laser emitter and the first location is a pre-determined distance; and
the pre-determined distance is proportional to a traveling speed of the vehicle.
10. The method of claim 1 , wherein determining the two camera calibration parameters comprises determining a slope value and an intercept value by:
determining, by solving two predetermined linear equations, the slope value and the intercept value with which the camera is calibrated, wherein:
a first predetermined linear equation includes an unknown slope value, an unknown intercept value, and the first distance and the second distance; and
a second predetermined linear equation includes the unknown slope value, the unknown intercept value, and the first distance and the second distance.
11. A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to:
emit, by a laser emitter located on a vehicle, a laser pulse toward a first location on a road;
detect, by a laser receiver located on the vehicle, a first distance between the laser receiver and the first location;
detect, based at least in part upon an image captured by a camera pointed towards the first location, a second distance between the camera and the first location;
determine two camera calibration parameters based at least in part upon the first distance and the second distance; and
adjust the two camera calibration parameters based at least on the laser pulse, such that the camera captures images of objects at the first location.
12. The non-transitory computer-readable medium of claim 11, wherein the instructions further cause the processor to determine, while the vehicle is being driven, a location of an object in another image obtained by using the two camera calibration parameters.
13. The non-transitory computer-readable medium of claim 11 , wherein:
a distance between the laser emitter and the first location is a pre-determined distance; and
the pre-determined distance is proportional to a traveling speed of the vehicle.
14. The non-transitory computer-readable medium of claim 13 , wherein the pre-determined distance is within a range of distances from a location of the camera at which the camera is configured to obtain images.
15. The non-transitory computer-readable medium of claim 13 , wherein the pre-determined distance is a distance from the location of the laser emitter to at least one laser spot.
16. The non-transitory computer-readable medium of claim 11 , wherein the laser pulse is emitted when the vehicle is stopped or parked on the road.
17. The non-transitory computer-readable medium of claim 11 , wherein the determining the two camera calibration parameters comprises solving two predetermined equations, wherein each of the two predetermined equations comprises the two camera calibration parameters, the first distance, and the second distance.
18. The non-transitory computer-readable medium of claim 11 , wherein the instructions further cause the processor to determine, based at least in part upon a speed at which the vehicle is moving, a first angle formed in between a direction in which the laser emitter is pointed towards the road and an imaginary horizontal plane that is at least partly parallel to the road and that includes at least a portion of the laser emitter.
19. The non-transitory computer-readable medium of claim 18 , wherein the instructions further cause the processor to adjust the laser emitter according to at least the first angle to emit the laser pulse toward the first location.
20. The non-transitory computer-readable medium of claim 11 , wherein:
the two camera calibration parameters are a slope value and an intercept value;
the slope value is a result of the camera pointing at an angle with respect to an imaginary horizontal plane at least partially parallel to the road; and
the intercept value is a distance from the camera where a line intersects the imaginary horizontal plane when the camera is pointed at the angle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/353,352 US20240019267A1 (en) | 2019-09-16 | 2023-07-17 | Vehicle Camera Calibration System |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/572,001 US10837795B1 (en) | 2019-09-16 | 2019-09-16 | Vehicle camera calibration system |
US17/082,551 US11359932B2 (en) | 2019-09-16 | 2020-10-28 | Vehicle camera calibration system |
US17/648,992 US11747171B2 (en) | 2019-09-16 | 2022-01-26 | Vehicle camera calibration system |
US18/353,352 US20240019267A1 (en) | 2019-09-16 | 2023-07-17 | Vehicle Camera Calibration System |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/648,992 Continuation US11747171B2 (en) | 2019-09-16 | 2022-01-26 | Vehicle camera calibration system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240019267A1 true US20240019267A1 (en) | 2024-01-18 |
Family
ID=73264045
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/572,001 Active US10837795B1 (en) | 2019-09-16 | 2019-09-16 | Vehicle camera calibration system |
US17/082,551 Active US11359932B2 (en) | 2019-09-16 | 2020-10-28 | Vehicle camera calibration system |
US17/648,992 Active US11747171B2 (en) | 2019-09-16 | 2022-01-26 | Vehicle camera calibration system |
US18/353,352 Pending US20240019267A1 (en) | 2019-09-16 | 2023-07-17 | Vehicle Camera Calibration System |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/572,001 Active US10837795B1 (en) | 2019-09-16 | 2019-09-16 | Vehicle camera calibration system |
US17/082,551 Active US11359932B2 (en) | 2019-09-16 | 2020-10-28 | Vehicle camera calibration system |
US17/648,992 Active US11747171B2 (en) | 2019-09-16 | 2022-01-26 | Vehicle camera calibration system |
Country Status (2)
Country | Link |
---|---|
US (4) | US10837795B1 (en) |
CN (1) | CN112596067A (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10837795B1 (en) | 2019-09-16 | 2020-11-17 | Tusimple, Inc. | Vehicle camera calibration system |
US10852158B1 (en) * | 2019-09-27 | 2020-12-01 | Kitty Hawk Corporation | Distance sensor test system |
CN112964191B (en) * | 2021-03-25 | 2022-11-04 | 四川合众精准科技有限公司 | Micro-deformation laser collimation measurement method |
CN115598624B (en) * | 2021-06-28 | 2023-12-12 | 苏州一径科技有限公司 | Laser radar calibration method, device and equipment |
CN116299403B (en) * | 2023-05-15 | 2023-08-08 | 上海几何伙伴智能驾驶有限公司 | Statistical data-based 4D millimeter wave Lei Dadian cloud space position correction method |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3995846B2 (en) * | 1999-09-24 | 2007-10-24 | 本田技研工業株式会社 | Object recognition device |
US20140293266A1 (en) * | 2011-08-04 | 2014-10-02 | Ying Hsu | Local Alignment and Positioning Device and Method |
US9639941B2 (en) * | 2012-12-28 | 2017-05-02 | Trimble Inc. | Scene documentation |
WO2014152470A2 (en) * | 2013-03-15 | 2014-09-25 | Tk Holdings, Inc. | Path sensing using structured lighting |
CN107438775B (en) * | 2015-01-30 | 2022-01-21 | 特里纳米克斯股份有限公司 | Detector for optical detection of at least one object |
US10250833B2 (en) * | 2015-04-20 | 2019-04-02 | Samsung Electronics Co., Ltd. | Timestamp calibration of the 3D camera with epipolar line laser point scanning |
KR102372088B1 (en) * | 2015-10-29 | 2022-03-08 | 삼성전자주식회사 | Method for generating depth image and image generating apparatus using thereof |
WO2017159382A1 (en) * | 2016-03-16 | 2017-09-21 | Sony Corporation | Signal processing device and signal processing method |
CN109891265B (en) * | 2016-10-25 | 2023-12-01 | 特里纳米克斯股份有限公司 | Detector for optically detecting at least one object |
JP6891375B2 (en) * | 2017-07-20 | 2021-06-18 | SZ DJI Technology Co., Ltd. | Systems and methods for measuring optical distance |
US10791275B2 (en) * | 2017-09-25 | 2020-09-29 | The Boeing Company | Methods for measuring and inspecting structures using cable-suspended platforms |
EP3707572B1 (en) * | 2017-11-10 | 2023-08-23 | Nvidia Corporation | Systems and methods for safe and reliable autonomous vehicles |
WO2019165451A1 (en) * | 2018-02-26 | 2019-08-29 | Nvidia Corporation | Systems and methods for computer-assisted shuttles, buses, robo-taxis, ride-sharing and on-demand vehicles with situational awareness |
CN108764187B (en) * | 2018-06-01 | 2022-03-08 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method, device, equipment, storage medium and acquisition entity for extracting lane line |
US11353557B2 (en) * | 2018-06-04 | 2022-06-07 | Infineon Technologies Ag | Hybrid LIDAR receiver and lidar methods |
JP2020003236A (en) * | 2018-06-25 | 2020-01-09 | 株式会社リコー | Distance measurement device, moving body, distance measurement method, and distance measurement system |
US11487010B2 (en) * | 2018-09-12 | 2022-11-01 | Research Foundation Of The City University Of New York | Laser ranging over-height vehicle detector system (LARA-OHVD) |
US10837795B1 (en) | 2019-09-16 | 2020-11-17 | Tusimple, Inc. | Vehicle camera calibration system |
- 2019
- 2019-09-16 US US16/572,001 patent/US10837795B1/en active Active
- 2020
- 2020-09-15 CN CN202010970111.0A patent/CN112596067A/en active Pending
- 2020-10-28 US US17/082,551 patent/US11359932B2/en active Active
- 2022
- 2022-01-26 US US17/648,992 patent/US11747171B2/en active Active
- 2023
- 2023-07-17 US US18/353,352 patent/US20240019267A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11747171B2 (en) | 2023-09-05 |
US20220146282A1 (en) | 2022-05-12 |
US10837795B1 (en) | 2020-11-17 |
CN112596067A (en) | 2021-04-02 |
US11359932B2 (en) | 2022-06-14 |
US20210080286A1 (en) | 2021-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11359932B2 (en) | Vehicle camera calibration system | |
CN108020825B (en) | Fusion calibration system and method for laser radar, laser camera and video camera | |
EP3111165B1 (en) | Distance measuring device and parallax calculation system | |
US9123247B2 (en) | Surrounding area monitoring apparatus for vehicle | |
JP4788798B2 (en) | Object detection device | |
KR100352423B1 (en) | Vehicle distance measuring device | |
JP4882428B2 (en) | Environment recognition device | |
US20180059219A1 (en) | Multi-beam position sensing devices | |
KR102059244B1 (en) | Apparatus for Light Detection and Ranging | |
JP2007024590A (en) | Object detector | |
KR20120003967A (en) | A laser diode based multiple-beam laser spot imaging system for characterization of vehicle dynamics | |
RU2019131588A (en) | STRUCTURE AND METHOD FOR DETERMINING THE INCLINATION SIGNAL IN THE VEHICLE | |
US11531351B2 (en) | Light receiving element array, light detection apparatus, driving support system, and automated driving system | |
KR20190098242A (en) | System for characterizing the environment around the vehicle | |
US11874379B2 (en) | Time-resolved contrast imaging for lidar | |
US10514447B2 (en) | Method for propagation time calibration of a LIDAR sensor | |
JP2020020612A (en) | Distance measuring device, method for measuring distance, program, and mobile body | |
KR102435313B1 (en) | Method for determining the angular position of an optoelectronic sensor and a test stand | |
US20240125936A1 (en) | Time-resolved contrast imaging for lidar | |
US20220179077A1 (en) | Method for supplementary detection of objects by a lidar system | |
US20230194666A1 (en) | Object Reflectivity Estimation in a LIDAR System | |
US20230384436A1 (en) | Distance measurement correction device, distance measurement correction method, and distance measurement device | |
US11543493B2 (en) | Distance measuring unit | |
US20230036431A1 (en) | BLOOM COMPENSATION IN A LIGHT DETECTION AND RANGING (LiDAR) SYSTEM | |
KR102327342B1 (en) | Calibration apparatus for vehicle camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TUSIMPLE, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAN, XIAOLING;HUANG, ZEHUA;REEL/FRAME:064289/0743
Effective date: 20190911
Owner name: TUSIMPLE, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAN, XIAOLING;HUANG, ZEHUA;REEL/FRAME:064289/0667
Effective date: 20190911
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |