US20220284629A1 - System and method of calibrating an optical sensor mounted on board of a vehicle - Google Patents
System and method of calibrating an optical sensor mounted on board of a vehicle Download PDFInfo
- Publication number
- US20220284629A1 (U.S. application Ser. No. 17/701,362)
- Authority
- US
- United States
- Prior art keywords
- projection surface
- optical sensor
- image
- vehicle
- control unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
- G01B11/27—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
- G01B11/272—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes using photoelectric detection means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B21/00—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
- G01B21/02—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
- G01B21/04—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
- G01S7/4021—Means for monitoring or calibrating of parts of a radar system of receivers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G01S7/4972—Alignment of sensor
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0808—Diagnosing performance data
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/402—Image calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- ADAS systems that are already widespread include adaptive cruise control, automatic full-beam headlamp adjustment, automatic headlamp orientation, automatic parking system, navigation system with traffic information, night vision system, blind spot monitor, frontal collision warning system, automatic emergency braking, etc.
- ADAS systems are based on a plurality of sensors (television cameras, radar, Lidar, etc.) able to detect different information that can possibly be used as the input data for a smart algorithm that oversees the degree of autonomy of the vehicle.
- the sensors are calibrated directly by the manufacturer.
- the initial calibration of a television camera is performed through a simulation environment specifically provided by the manufacturer in which the television camera is placed opposite a monitor onto which settable dynamic scenarios are projected (e.g. a pedestrian crossing the road).
- the sensors are calibrated periodically (e.g. when the vehicle is serviced) or after exceptional events (e.g. replacement of the sensor following a defect, damage or breakdown warning).
- one aspect provides a calibration system that calibrates an optical sensor mounted on board of a vehicle, comprising: a test station consisting of a horizontal or inclined support zone for supporting the vehicle; a projection surface for images or videos, said projection surface being located in front of said test station; at least one memory containing a plurality of images and/or videos archived by type of optical sensor; a calibration unit for calibrating the optical sensor configured to adjust the position of the optical axis of said optical sensor; and a control unit which, in response to a signal representing the type of said optical sensor, is configured to: search in said memory for at least one image or video archived in association with the type of said optical sensor; command the projection onto said projection surface of the image or video found in said memory or a processed version of said image or said video; interface with said calibration unit; and adapt or deform the image or video found in said memory to the size of the projection surface.
- Another aspect provides a method of calibrating an optical sensor mounted on board of a vehicle, comprising the steps of: positioning the vehicle in a test station consisting of a horizontal or inclined support zone for supporting the vehicle; arranging a projection surface for images or videos in front of said test station; identifying the type of said optical sensor; selecting in a memory an image or video associated with the type of said optical sensor; adapting or deforming the image or video selected in said memory to the size of the projection surface; projecting the image or video selected, or the adapted or deformed version thereof, onto said projection surface; and adjusting the position of the optical axis of said optical sensor starting from said projected image or video.
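The claimed select-and-adapt steps can be sketched as a minimal control flow. Every name below (SENSOR_LIBRARY, fit_to_surface, the sensor-type keys, the surface dimensions) is a hypothetical illustration, not taken from the patent:

```python
# Minimal sketch of the claimed calibration workflow: look up the
# archived target by sensor type, adapt it to the projection surface,
# then hand it off for projection. All names are illustrative.

# The "memory": images/videos archived by type of optical sensor.
SENSOR_LIBRARY = {
    "front_camera_cmos": "checkerboard_target.png",
    "front_camera_ccd": "dot_grid_target.png",
}

def fit_to_surface(target, surface_w_mm, surface_h_mm):
    """Adapt the archived target to the size of the projection surface."""
    return {"target": target, "w_mm": surface_w_mm, "h_mm": surface_h_mm}

def prepare_projection(sensor_type, surface_w_mm, surface_h_mm):
    """Select the image associated with the sensor type and adapt it;
    the result would then be projected onto the surface."""
    target = SENSOR_LIBRARY[sensor_type]
    return fit_to_surface(target, surface_w_mm, surface_h_mm)

frame = prepare_projection("front_camera_cmos", 1400, 800)
```

In a real system the lookup would key on the signal S1 delivered by the scan tool rather than a hard-coded string.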
- FIG. 1 schematically illustrates a system of calibrating an optical sensor mounted on board of a vehicle, according to an embodiment
- FIG. 2 and FIG. 3 illustrate the reciprocal arrangement of a vehicle in the test station and a projection surface of the calibration system of FIG. 1 , in a perspective view, in which the projection surface projects a pattern and a video, respectively;
- FIG. 4 schematically illustrates the communication between a scan tool and a calibration unit of the calibration system of FIG. 1 .
- Static calibration is performed in a closed environment (generally the workshop) through a portable device—known in the sector as a “scan tool”—connected to the vehicle's EOBD (European On Board Diagnostic) diagnostic socket and using specific target panels for each type of sensor (e.g. photo camera, radar, Lidar, etc.) usually placed on the front of the stationary vehicle (they can also be positioned on the side or the rear of the vehicle).
- the panel alignment step takes a long time.
- the movement of the panels also requires special care to prevent damage and breakages.
- the panels are made of plastic material, generally Forex, and have a large surface area relative to their thickness, which is small (usually 5 mm at most).
- a first limitation of dynamic calibration is that it must be performed in good weather conditions, which clearly makes scheduling the work difficult.
- a second limit is connected with the need to provide paths with determined characteristics (horizontal signage, vertical signage, etc.) for performing the calibration.
- DE 10 2006 060 553 discloses a method for testing a motor vehicle driver assistance system.
- the technical task underpinning an embodiment is to provide a system and method of calibrating an optical sensor mounted on board of a vehicle, that obviate the above-cited drawbacks.
- an embodiment provides a universal system of calibrating an optical sensor mounted on board of a vehicle, i.e. one that can be used for the sensors of any vehicle, regardless of the manufacturer, the specific model and the ADAS system implemented, while at the same time being more reliable and more compact than known solutions.
- An embodiment provides a method for calibrating an optical sensor mounted on board of a vehicle that can be performed in a shorter time and more easily with respect to the calibration methods known to date.
- a further embodiment provides a system and method of calibrating an optical sensor mounted on board of a vehicle, which are reliably applicable also to vehicles that normally require an on-road test, i.e. dynamic calibration.
- control unit is also configured to determine, in response to the signal representing the type of optical sensor, a spatial position of the projection surface with respect to the optical sensor mounted on board of the vehicle arranged in the test station.
- control unit is also configured to adapt or deform the image or video found in the memory to the dimensions of the projection surface, in response to the signal representing the type of optical sensor.
- the calibration system further comprises a screen or monitor located in front of the test station, the projection surface being the display of said monitor.
- the calibration system further comprises a television set, said monitor being the monitor of said television set.
- the calibration system further comprises a multimedia interactive board, said monitor being the monitor of the multimedia interactive board.
- the calibration system further comprises a computer, said monitor being the monitor of the computer.
- the projection surface is obtained from a sheet made of PVC.
- the calibration system further comprises a projector or a luminous board, said control unit being configured to command the projector or luminous board to project the image or video found in the memory onto the projection surface.
- control unit in response to the signal representing the type of optical sensor, is configured to project a set of parameters or initial calibration conditions onto said projection surface.
- the calibration system also comprises an automatic means for adjusting the spatial position of the projection surface with respect to the test station.
- the calibration method further comprises a step of determining, according to the type of optical sensor, a spatial measurement position that the projection surface must assume with respect to the optical sensor during calibration.
- the calibration method further comprises a step of adapting or deforming the image or video selected based on the size of the projection surface and the distance from the optical sensor.
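One plausible way to implement this size-and-distance adaptation (the formula and function name are assumptions, not the patent's algorithm) is to derive the physical size of the projected target from the angular size it must subtend at the sensor and the surface-to-sensor distance:

```python
import math

def target_size_mm(subtended_angle_deg, distance_mm):
    """Size the projected target must have on the surface so that it
    subtends the given angle at the optical sensor, for a surface
    perpendicular to the optical axis: size = 2 * d * tan(angle / 2).
    Illustrative geometry only; real targets follow each sensor
    manufacturer's specification.
    """
    half_angle = math.radians(subtended_angle_deg) / 2.0
    return 2.0 * distance_mm * math.tan(half_angle)

# At 1 m distance, a target that must subtend 90 degrees is 2 m wide.
width = target_size_mm(90.0, 1000.0)
```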
- the number 1 indicates a system of calibrating an optical sensor 2 mounted on board of a vehicle 100 , in particular a motor vehicle such as an automobile, a bus, a lorry, a road tractor, a tractor trailer, an articulated lorry, a farm machine, a working vehicle, a self-propelled vehicle, etc.
- the optical sensor 2 is a CMOS or CCD type sensor of a television camera installed on the vehicle 100 .
- the calibration system 1 preferably comprises:
- test station 3 consists of a horizontal or inclined support zone for supporting the vehicle 100 .
- the stationary vehicle 100 is arranged in the test station 3 according to techniques and with means of the known type, which are not the subject matter of this disclosure.
- the projection surface 4 is arranged in front of the test station 3 so that the optical sensor 2 can acquire images or videos projected onto such projection surface 4 .
- the projection surface 4 is rectangular shaped.
- the calibration system 1 comprises a screen or monitor, whose display constitutes the projection surface 4 .
- the monitor comprising the projection surface 4 may be a monitor of a television set 40 , as illustrated in FIGS. 2 and 3 .
- the monitor of the television set 40 may be plasma, liquid crystal, OLED.
- a television set 40 can be used with a 65″ or larger anti-glare monitor.
- the monitor comprising the projection surface 4 is the monitor of a multimedia interactive whiteboard (often indicated by the acronym IWB), or the monitor of a computer.
- the calibration system 1 comprises a projector or a video projector or a luminous board that projects images or videos onto the projection surface 4 , preferably made of (polarised or lenticular) high-contrast PVC fabric.
- the projection surface 4 is the surface of a fabric sheet which, when unrolled and taut, must have a planarity of +/−2 millimetres per linear metre.
- the fabric is opaque white so as to have a good contrast.
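The stated flatness tolerance can be checked with a trivial helper (a sketch only; the function and argument names are assumptions):

```python
def within_planarity_spec(deviation_mm, span_m, tol_mm_per_m=2.0):
    """True if the measured out-of-plane deviation of the unrolled,
    taut sheet stays within the +/-2 mm per linear metre tolerance
    over the measured span."""
    return abs(deviation_mm) <= tol_mm_per_m * span_m

# e.g. a 3 mm bulge over a 2 m span is inside the 4 mm budget
ok = within_planarity_spec(3.0, 2.0)
```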
- the calibration system 1 comprises a control unit 5 which receives at least one input signal (indicated as S 1 ) representing the type of optical sensor 2 . In response to such signal S 1 , the control unit 5 is configured for:
- the memory 6 is part of the calibration system 1 and contains a plurality of images and/or videos archived by type of optical sensor.
- each optical sensor requires ad hoc calibration.
- the selection of the image or video by the control unit 5 is performed by searching in the memory 6 for at least one image or video that is archived in association with the type of that particular optical sensor 2 subject to calibration.
- the image projected onto the projection surface 4 can reproduce the shape, size and pattern of a target panel for the calibration of a specific optical sensor.
- control unit 5 is housed in a portable device 20 (generally known in the sector as a scan tool) which can be connected to the vehicle's 100 EOBD diagnostic socket 31 .
- the memory 6 can be housed in the same portable device 20 .
- it may be the computer memory, or an external memory (e.g. USB memory connectible directly to the television set 40 ).
- control unit 5 is configured to command the projector or video projector or luminous board to project the image or video onto such projection surface 4 .
- control unit 5 is configured to project onto the projection surface 4 a set of parameters or initial calibration conditions, in response to the signal S 1 representing the type of optical sensor 2 .
- the calibration of the optical sensor 2 takes place by a calibration unit 30 that interfaces with the control unit 5 .
- the calibration unit 30 is preferably part of the vehicle's 100 electronic control unit and it interfaces with the control unit 5 of the scan tool 20 through the connection to the EOBD diagnostic socket 31 .
- control unit 5 is also configured to determine a spatial position of the projection surface 4 with respect to the optical sensor 2 mounted on board of the vehicle 100 arranged in the test station 3 . Such determination is performed based on the signal S 1 representing the type of optical sensor 2 .
- the calibration system 1 also comprises an automatic means for adjusting (i.e. regulating) the spatial position of the projection surface 4 with respect to the test station 3 .
- Said adjusting means is of the known type and will not be described further.
- This position adjustment of the projection surface 4 is usually used in the event in which the vehicle 100 is placed on a horizontal support surface.
- control unit 5 is also configured to process the images or videos resident in the memory 6 .
- control unit 5 is configured to adapt or deform the selected image or video to the size of the projection surface 4 . Such adaptation is performed in response to the signal S 1 representing the type of optical sensor 2 .
- the image or video is to be deformed rather than adjusting the spatial position of the projection surface 4 with respect to the optical sensor 2 .
- the vehicle's 100 support plane is inclined forwards by a maximum of 1° with respect to the horizontal.
- the vehicle's 100 support plane is inclined backwards by a maximum of 3° with respect to the horizontal.
- control unit 5 is configured to perform both a determination of the spatial position of the projection surface 4 with respect to the optical sensor 2 mounted on board of the vehicle 100 in the test station 3 and a deformation of the selected images or videos.
- the determination of the spatial position can be approximate, since the residual misalignment is compensated for by a deformed projection of the image or video.
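A first-order sketch of why deforming the image can replace moving the screen: a surface viewed off-normal by a tilt angle appears vertically foreshortened by cos(tilt), so pre-stretching the projected image by 1/cos(tilt) compensates. The patent allows tilts of up to 1 degree forward and 3 degrees backward; a full correction would use a projective homography, and this helper is only an assumed illustration:

```python
import math

def keystone_prestretch(tilt_deg):
    """Vertical pre-stretch factor compensating a viewing axis tilted
    by tilt_deg away from the surface normal (first-order model)."""
    return 1.0 / math.cos(math.radians(tilt_deg))

# For the small tilts allowed here the correction is tiny:
factor = keystone_prestretch(3.0)
```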
- the vehicle 100 is parked in the test station 3 , according to known techniques, as already mentioned above.
- the projection surface 4 e.g. the display of a monitor, is arranged in front of the test station 3 , in particular transverse to the longitudinal axis AA of the vehicle 100 .
- the portable device 20 has a screen 21 on which a graphical interface is displayed, configured to allow text or instructions to be entered by an operator.
- the operator can select the vehicle 100 to be calibrated, by choosing from different types of vehicles split into brands (manufacturers) and models.
- the portable device 20 performs such selection automatically or semi-automatically, asking the operator to confirm that the vehicle 100 detected is the correct one.
- the operator also selects the ADAS system to be calibrated, specifically the optical sensor 2 to be calibrated. Also in this case, the selection can take place manually, automatically or semi-automatically.
- the control unit 5 (in the scan tool 20 ) can determine the spatial measurement position that the monitor must assume with respect to the optical sensor 2 during calibration.
- Such determination takes place, for example, in the case of a vehicle 100 placed on a horizontal support surface.
- the spatial measurement position is preferably displayed in the form of instructions on the projection surface 4 .
- the mutual position of the optical sensor and its target (in this case the display or, in general, the projection surface) must be adjusted according to the type of optical sensor and the position that it occupies in the vehicle 100 .
- the operator confirms to the portable device 20 (still through the graphical interface that can be loaded onto its screen 21 ) that the preliminary step has been performed and the actual calibration can take place.
- the operator can choose whether to perform a calibration with a static image or a dynamic video.
- the control unit 5 searches inside the memory 6 for the image (in the former case) or the video (in the latter case) associated with the type of optical sensor 2 to be calibrated.
- the image or video selected can then be displayed on the projection surface 4 .
- the calibration is performed by the calibration unit 30 which communicates with the scan tool 20 .
- the actual calibration meaning the adjustment of the position of the optical axis of the optical sensor 2 , takes place according to an algorithm of the known type.
- the operator can easily repeat the aforesaid method for other optical sensors located on board of the vehicle 100 .
- a similar calibration system may also be applied to the calibration of a radar mounted on board of a vehicle, e.g. a frontal radar.
- the calibration of the frontal radar is achieved by means of a plane reflector arranged at a certain distance D from the radar and perpendicular to the axis of the radar.
- the manufacturers declare a certain tolerance range ΔD for the distance D, i.e. D ± ΔD, in the arrangement of the plane reflector with respect to the radar.
- the user simply places the plane reflector in front of the radar at the distance D recommended by the manufacturer, then enters the relevant distances obtained by means of laser meters, and the system calculates the magnitude of the angle by which the plane reflector needs to be rotated. Furthermore, the system indicates to the user whether, and by what amount, the plane reflector should slide right or left in order to be centred with respect to the radar.
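The rotation computed by the system can be illustrated with simple trigonometry (an assumed geometry, not the patent's actual algorithm): if the laser meters read different distances at the two edges of the reflector, the panel is yawed by the angle whose tangent is the distance difference over the panel width:

```python
import math

def reflector_yaw_deg(d_left_mm, d_right_mm, width_mm):
    """Angle (degrees) by which the plane reflector must be rotated to
    sit perpendicular to the radar axis, from laser distance readings
    taken at its left and right edges (illustrative geometry)."""
    return math.degrees(math.atan((d_left_mm - d_right_mm) / width_mm))

# Edges measured at 1010 mm and 990 mm on a 500 mm wide panel:
yaw = reflector_yaw_deg(1010.0, 990.0, 500.0)   # about 2.29 degrees
```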
- the radar substitutes the optical sensor 2 , while the plane reflector substitutes the projection surface 4 .
- the calibration system proposed herein further allows a contrast to be achieved that is also compatible with use in an open environment.
- the screen also allows videos that reproduce real dynamic or simulated scenarios to be projected. Therefore, even for static calibration (i.e. with the vehicle stationary), performance levels comparable to those of dynamic calibration are obtained, which can therefore be avoided for vehicles that usually require an on-road test. Avoiding the on-road test simplifies planning (connected with weather and road conditions) and avoids risks for the driver.
- the method and system proposed can also be used in the event of inclination of the vehicle (within certain limits) because it is sufficient to suitably deform the image/video to be projected onto the screen instead of performing the spatial adjustment of the screen with respect to the vehicle.
- the user may save up to 20 minutes for each vehicle.
- aspects may be embodied as a system, method or device program product. Accordingly, aspects may take the form of an entirely hardware embodiment or an embodiment including software that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a device program product embodied in one or more device readable medium(s) having device readable program code embodied therewith.
- a storage device may be, for example, a system, apparatus, or device (e.g., an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device) or any suitable combination of the foregoing.
- a storage device/medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- a storage device is not a signal and “non-transitory” includes all media except signal media.
- Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, et cetera, or any suitable combination of the foregoing.
- Program code for carrying out operations may be written in any combination of one or more programming languages.
- the program code may execute entirely on a single device, partly on a single device as a stand-alone software package, partly on a single device and partly on another device, or entirely on the other device.
- the devices may be connected through any type of connection or network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made through other devices (for example, through the Internet using an Internet Service Provider), through wireless connections, e.g., near-field communication, or through a hard wire connection, such as over a USB connection.
- Example embodiments are described herein with reference to the figures, which illustrate example methods, devices and program products according to various example embodiments. It will be understood that the actions and functionality may be implemented at least in part by program instructions. These program instructions may be provided to a processor of a device, a special purpose information handling device, or other programmable data processing device to produce a machine, such that the instructions, which execute via a processor of the device implement the functions/acts specified.
Abstract
Description
- This application is a Continuation of U.S. application Ser. No. 16/447,918, filed on Jun. 20, 2019, which claims priority to European Application No. 18178983.5, which was filed on Jun. 21, 2018, the contents of which are fully incorporated by reference herein.
- The subject matter described herein relates to a system and method of calibrating an optical sensor mounted on board of a vehicle.
- Over recent years, the attention of those developing the safety of motor vehicles has extended from the traditional passive safety systems (airbags, seat belts, impact resistance, etc.) to advanced active safety systems, known to specialists as ADAS (Advanced Driver Assistance Systems).
- ADAS systems are electronic driving assistance systems for vehicles that support the driver for the purpose of increasing safety and/or driving comfort. Such systems have been classified into six levels according to the degree of autonomy, as indicated below:
-
- Level 0 (no automation): the driver is in charge of all the driving aspects, even when he/she is facilitated by the systems installed on board of the vehicle.
- Level 1 (driver assistance): in some situations the vehicle can accelerate, brake or steer autonomously, but the driver must be ready at all times to regain control of the vehicle.
- Level 2 (partial automation): the vehicle has full control of the accelerator, brake and steering, but the driver must still monitor the surrounding environment.
- Level 3 (conditional automation): the vehicle has full control of the accelerator, brake, steering and monitoring of the environment, but the driver must be ready to intervene if required by the system.
- Level 4 (high automation): the automatic system is able to handle any event, but must not be activated in extreme driving conditions such as in bad weather.
- Level 5 (complete automation): the automatic driving system is able to handle all driving situations; there is no longer any need for intervention by a human driver.
- Currently, the most advanced vehicles are equipped with level 3 systems. The objective over the coming years is to reach level 5 in most of the vehicles on the roads.
- By way of example, ADAS systems that are already widespread include adaptive cruise control, automatic full-beam headlamp adjustment, automatic headlamp orientation, automatic parking system, navigation system with traffic information, night vision system, blind spot monitor, frontal collision warning system, automatic emergency braking, etc.
- At the technological level, ADAS systems are based on a plurality of sensors (television cameras, radar, Lidar, etc.) able to detect different information that can be used as the input data for a smart algorithm that oversees the degree of autonomy of the vehicle.
- Before the vehicle is placed on the market, the sensors are calibrated directly by the manufacturer. For example, the initial calibration of a television camera is performed through a simulation environment specifically provided by the manufacturer in which the television camera is placed opposite a monitor onto which settable dynamic scenarios are projected (e.g. a pedestrian crossing the road).
- After the vehicle has been placed on the market, the sensors are calibrated periodically (e.g. when the vehicle is serviced) or after exceptional events (e.g. replacement of the sensor following a defect, damage or breakdown warning).
- In summary, one aspect provides a calibration system that calibrates an optical sensor mounted on board of a vehicle, comprising: a test station consisting of a horizontal or inclined support zone for supporting the vehicle; a projection surface for images or videos, said projection surface being located in front of said test station; at least one memory containing a plurality of images and/or videos archived by type of optical sensor; a calibration unit for calibrating the optical sensor configured to adjust the position of the optical axis of said optical sensor; and a control unit which, in response to a signal representing the type of said optical sensor, is configured to: search in said memory for at least one image or video archived in association with the type of said optical sensor; command the projection onto said projection surface of the image or video found in said memory or a processed version of said image or said video; interface with said calibration unit; and adapt or deform the image or video found in said memory to the size of the projection surface.
- Another aspect provides a method of calibrating an optical sensor mounted on board of a vehicle, comprising the steps of: positioning the vehicle in a test station consisting of a horizontal or inclined support zone for supporting the vehicle; arranging a projection surface for images or videos in front of said test station; identifying the type of said optical sensor; selecting in a memory an image or video associated with the type of said optical sensor; adapting or deforming the image or video selected in said memory to the size of the projection surface; projecting the image or video selected or the adapted or deformed version thereof onto said projection surface; and adjusting the position of the optical axis of said optical sensor starting from said projected image or video.
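The sequence of method steps above can be illustrated with a short sketch. Every function, dictionary and file name below is hypothetical glue added for illustration; none of it is part of the patent text.

```python
# Illustrative sketch of the claimed calibration steps (all names invented).

# Memory containing targets archived by type of optical sensor.
TARGET_MEMORY = {"cmos_front_camera": "front_camera_target.png"}

def select_target(sensor_type):
    # step: select in memory an image associated with the sensor type
    return TARGET_MEMORY[sensor_type]

def adapt(target, surface_size):
    # step: adapt or deform the selected image to the projection surface size
    return f"{target}@{surface_size[0]}x{surface_size[1]}"

def calibrate(sensor_type, surface_size):
    target = select_target(sensor_type)
    projected = adapt(target, surface_size)
    # step: project the adapted target, then the calibration unit adjusts
    # the optical axis starting from the projected image
    return projected

print(calibrate("cmos_front_camera", (1920, 1080)))
```

In this sketch the projection and optical-axis adjustment are represented only as placeholders, since the patent leaves the adjustment algorithm to known techniques.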
- The foregoing is a summary and thus may contain simplifications, generalizations, and omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting.
- For a better understanding of the embodiments, together with other and further features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying drawings. The scope of the invention will be pointed out in the appended claims.
-
- FIG. 1 schematically illustrates a system of calibrating an optical sensor mounted on board of a vehicle, according to an embodiment;
- FIG. 2 and FIG. 3 illustrate the reciprocal arrangement of a vehicle in the test station and a projection surface of the calibration system of FIG. 1, in a perspective view, in which the projection surface projects a pattern and a video, respectively;
- FIG. 4 schematically illustrates the communication between a scan tool and a calibration unit of the calibration system of FIG. 1.
- It will be readily understood that the components of the embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations in addition to the described example embodiments. Thus, the following more detailed description of the example embodiments, as represented in the figures, is not intended to limit the scope of the embodiments, as claimed, but is merely representative of example embodiments.
- Reference throughout this specification to “one embodiment” or “an embodiment” (or the like) means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” or the like in various places throughout this specification are not necessarily all referring to the same embodiment.
- Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the various embodiments can be practiced without one or more of the specific details, or with other methods, components, materials, et cetera. In other instances, well known structures, materials, or operations are not shown or described in detail to avoid obfuscation.
- Two types of calibration are currently performed in the aftermarket: static and dynamic.
- Static calibration is performed in a closed environment (generally the workshop) through a portable device—known in the sector as a “scan tool”—connected to the vehicle's EOBD (European On Board Diagnostic) diagnostic socket and using specific target panels for each type of sensor (e.g. photo camera, radar, Lidar, etc.) usually placed on the front of the stationary vehicle (they can also be positioned on the side or the rear of the vehicle). An example of such a calibration method is proposed in patent US 2013/0325252.
- The main criticality of the static calibration methods known to date is connected with the wide variety of parameters at stake. As every manufacturer requires ad hoc calibration settings for each vehicle model and for each type of sensor, workshops are generally affiliated with only some manufacturers, for which they must be equipped with the related target panels (numerous, as they differ in terms of shape, size and pattern).
- Furthermore, to guarantee reliable calibration, it is essential to guarantee the correct transverse and longitudinal alignment of the target panels with respect to the vehicle. At each calibration, the panel alignment step takes a long time.
- The movement of the panels also requires special care to prevent damage and breakages. In fact, the panels are made of plastic material, generally forex, and have a significant surface area with respect to their thickness, which is small (usually 5 mm at most).
- Furthermore, static calibration cannot take place outdoors, as a well-defined contrast of the panels must be guaranteed.
- For some types of vehicles, static calibration is not sufficient but an on-road test is required.
- In that case, dynamic calibration methods are applied, i.e. performed while driving the vehicle. Two scenarios are possible:
-
- dynamic calibration performed automatically by the vehicle systems while a generic driver is driving,
- dynamic calibration using a scan tool connected to the vehicle's EOBD diagnostic socket for performing specific calibration procedures established by the manufacturer, performed by an authorized repairer.
- A first limit of dynamic calibration is connected with the fact that it must be performed under good weather conditions, with clear difficulties in planning the time scales. A second limit is connected with the need to provide paths with determined characteristics (horizontal signage, vertical signage, etc.) for performing the calibration.
- Furthermore, during dynamic calibration the vehicle could have unexpected reactions (precisely due to calibration errors), which put the driver's safety at risk.
- The aftermarket calibration methods known to date (static and dynamic) require long performance times to guarantee the reliability of the results.
- From US 2018/100783 a calibration system for optical sensors is already known, using a screen or other projection surface disposed within the field of view of an optical sensor system on board a vehicle.
- WO 2018/067354 discloses an ADAS calibration support structure in which it is possible to project indicia on a screen and to correct parallax distortion by means of mechanical rotation of a laser emitter.
- From WO 2014/192347 an inspection system for an optical sensor is also known.
- In U.S. Pat. No. 9,247,222 a projection display for images is disclosed, which may be applied to a vehicle.
- In DE 10 2006 060 553 a method for testing a motor vehicle driver assistance system is disclosed.
- In this context, the technical task underpinning an embodiment is to provide a system and method of calibrating an optical sensor mounted on board of a vehicle, that obviate the above-cited drawbacks.
- In particular, an embodiment provides a universal system of calibrating an optical sensor mounted on board of a vehicle, i.e. one that can be used for the sensors of any vehicle, regardless of the manufacturer, the specific model and the ADAS system being implemented, and which at the same time is more reliable and compact with respect to known solutions.
- An embodiment provides a method for calibrating an optical sensor mounted on board of a vehicle that can be performed in a shorter time and more easily with respect to the calibration methods known to date.
- A further embodiment provides a system and method of calibrating an optical sensor mounted on board of a vehicle, which are reliably applicable also to vehicles that normally require an on-road test, i.e. dynamic calibration.
- The stated technical task is substantially achieved by a system of calibrating an optical sensor mounted on board of a vehicle, comprising:
-
- a test station for the stationary vehicle;
- a projection surface for images or videos, which is located in front of the test station;
- at least one memory containing a plurality of images and/or videos archived by type of optical sensor;
- a calibration unit for calibrating the optical sensor configured to adjust the position of the optical axis of the optical sensor;
- a control unit which, in response to a signal representing the type of the optical sensor, is configured to:
- search in the memory for at least one image or video archived in association with the type of optical sensor;
- command the projection onto the projection surface of the image or video found in the memory or a processed version of the image or video;
- interface with the calibration unit.
- In accordance with one embodiment, the control unit is also configured to determine, in response to the signal representing the type of optical sensor, a spatial position of the projection surface with respect to the optical sensor mounted on board of the vehicle arranged in the test station.
- In accordance with one embodiment, the control unit is also configured to adapt or deform the image or video found in the memory to the dimensions of the projection surface, in response to the signal representing the type of optical sensor.
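One minimal way to read this "adapt or deform" operation is as a pair of independent per-axis scale factors mapping the archived image onto the projection surface. The function and all numbers below are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch of the "adapt or deform" step: independent horizontal and
# vertical scale factors so an archived target fills the projection surface
# exactly. Names and numbers are illustrative only.

def deform_factors(img_w, img_h, surface_w, surface_h):
    """Per-axis scale factors mapping the archived image onto the surface."""
    return surface_w / img_w, surface_h / img_h

# e.g. a 1920x1080 archived target shown on a 1440x900 display
sx, sy = deform_factors(1920, 1080, 1440, 900)
```

Because the two factors may differ, this stretches (deforms) the target rather than merely scaling it, which matches the claim language of adapting *or* deforming.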
- In accordance with one embodiment, the calibration system further comprises a screen or monitor located in front of the test station, the projection surface being the display of said monitor.
- In accordance with one embodiment, the calibration system further comprises a television set, said monitor being the monitor of said television set.
- In accordance with one embodiment, the calibration system further comprises a multimedia interactive board, said monitor being the monitor of the multimedia interactive board.
- In accordance with one embodiment, the calibration system further comprises a computer, said monitor being the monitor of the computer.
- In accordance with one embodiment, the projection surface is obtained from a sheet made of PVC.
- In accordance with one embodiment, the calibration system further comprises a projector or a luminous board, said control unit being configured to command the projector or luminous board to project the image or video found in the memory onto the projection surface.
- Preferably, in response to the signal representing the type of optical sensor, the control unit is configured to project a set of parameters or initial calibration conditions onto said projection surface.
- Preferably, the calibration system also comprises an automatic means for adjusting the spatial position of the projection surface with respect to the test station.
- The stated technical task and specified objects are substantially achieved by a method of calibrating an optical sensor mounted on board of a vehicle, comprising the steps of:
-
- positioning the vehicle in a test station;
- arranging a projection surface for images or videos in front of said test station;
- identifying the type of optical sensor;
- selecting in a memory an image or video associated with the type of said optical sensor;
- projecting the image or video selected or a processed version thereof onto the projection surface;
- adjusting the position of the optical axis of the optical sensor starting from said projected image or video.
- In accordance with one embodiment, the calibration method further comprises a step of determining, according to the type of optical sensor, a spatial measurement position that the projection surface must assume with respect to the optical sensor during calibration.
- In accordance with one embodiment, the calibration method further comprises a step of adapting or deforming the image or video selected based on the size of the projection surface and the distance from the optical sensor.
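A plausible reading of adapting the target to both surface size and sensor distance is to preserve the angular size that a physical panel would subtend at the sensor. The sketch below assumes this interpretation; the function name and values are invented for illustration:

```python
import math

# Hypothetical size-and-distance adaptation: scale the projected pattern so it
# subtends the same angle at the sensor as the physical panel it replaces.

def scaled_height(panel_height, panel_distance, surface_distance):
    """Height the projected pattern needs at surface_distance (metres) to keep
    the angular size of a physical panel at panel_distance (metres)."""
    angle = 2 * math.atan((panel_height / 2) / panel_distance)
    return 2 * surface_distance * math.tan(angle / 2)

# A 0.6 m panel specified at 2.0 m, reproduced on a surface at 1.0 m
h = scaled_height(0.6, 2.0, 1.0)
```

For a flat target this reduces to a simple proportion (height scales with distance), so a 0.6 m panel at 2.0 m becomes a 0.3 m pattern at 1.0 m.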
- Further characteristics and advantages will become more apparent from the indicative and thus non-limiting description of a preferred, but not exclusive, embodiment of a system and method of calibrating an optical sensor mounted on board of a vehicle, as illustrated in the accompanying drawings.
- With reference to the figures, the number 1 indicates a system of calibrating an optical sensor 2 mounted on board of a vehicle 100, in particular a motor vehicle such as an automobile, a bus, a lorry, a road tractor, a tractor trailer, an articulated lorry, farm machinery, a working vehicle, a self-propelled vehicle, etc.
- For example, the optical sensor 2 is a CMOS or CCD type sensor of a television camera installed on the vehicle 100.
- The calibration system 1 preferably comprises:
-
- a test station 3 for the stationary vehicle 100;
- a projection surface 4 for projecting images or videos.
- In particular, the test station 3 consists of a horizontal or inclined support zone for supporting the vehicle 100.
- The stationary vehicle 100 is arranged in the test station 3 according to techniques and with means of the known type, which are not the subject matter of this disclosure.
- The projection surface 4 is arranged in front of the test station 3 so that the optical sensor 2 can acquire images or videos projected onto such projection surface 4.
- Preferably, the projection surface 4 is rectangular shaped.
- Preferably, the calibration system 1 comprises a screen or monitor, whose display constitutes the projection surface 4.
- The monitor comprising the projection surface 4 may be a monitor of a television set 40, as illustrated in FIGS. 2 and 3.
- For example, the monitor of the television set 40 may be plasma, liquid crystal or OLED.
- For example, a television set 40 can be used with a 65″ or greater anti-glare monitor.
- Alternatively, the monitor comprising the projection surface 4 is the monitor of a multimedia interactive whiteboard (often indicated by the acronym IWB), or the monitor of a computer.
- In accordance with another embodiment, the calibration system 1 comprises a projector or a video projector or a luminous board that projects images or videos onto the projection surface 4, preferably made of (polarised or lenticular) high-contrast PVC fabric.
- For example, the projection surface 4 is the surface of a fabric sheet which, when unrolled and taut, must have a planarity of +/−2 millimetres per linear metre. Preferably, the fabric is opaque white so as to have good contrast.
calibration system 1 comprises acontrol unit 5 which receives at least one input signal (indicated as S1) representing the type ofoptical sensor 2. In response to such signal S1, thecontrol unit 5 is configured for: -
- selecting an image or a video in a memory 6;
- commanding the projection onto the projection surface 4 of the image or video selected or a processed version thereof.
- In particular, the memory 6 is part of the
calibration system 1 and contains a plurality of images and/or videos archived by type of optical sensor. - In fact, on board of the
vehicle 100, different television cameras, stereo pairs etc. can be installed. Each of such devices has optical sensors of different types that together form an ADAS system. Above all, according to the manufacturer and the model, thevehicle 100 has its own ADAS system, therefore each optical sensor requires ad hoc calibration. - The selection of the image or video by the
control unit 5 is performed by searching in the memory 6 for at least one image or video that is archived in association with the type of that particularoptical sensor 2 subject to calibration. - For example, the image projected onto the projection surface 4 can reproduce the shape, size and pattern of a target panel for the calibration of a specific optical sensor.
- In the case of a video, it is possible to display a real dynamic scenario or a simulated one, which reproduces an on-road test of the vehicle.
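Since the memory archives both static images and dynamic videos per sensor type, the lookup can be pictured as a keyed archive. The dictionary keys, file names and function below are invented examples, not content from the patent:

```python
# Sketch of a memory archiving targets by optical-sensor type, holding both a
# static image and a dynamic on-road scenario video per type (names invented).

ARCHIVE = {
    ("cmos_front_camera", "image"): "target_panel_pattern.png",
    ("cmos_front_camera", "video"): "urban_road_scenario.mp4",
}

def lookup_target(sensor_type, mode="image"):
    """Search the archive for the target associated with this sensor type."""
    key = (sensor_type, mode)
    if key not in ARCHIVE:
        raise LookupError(f"no target archived for {key}")
    return ARCHIVE[key]

print(lookup_target("cmos_front_camera", "video"))
```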
- For example, the control unit 5 is housed in a portable device 20 (generally known in the sector as a scan tool) which can be connected to the vehicle's 100 EOBD diagnostic socket 31.
- The memory 6 can be housed in the same portable device 20.
- Alternatively, it may be the computer memory, or an external memory (e.g. a USB memory connectible directly to the television set 40).
- If the projection surface 4 is composed of the fabric sheet, then the control unit 5 is configured to command the projector or video projector or luminous board to project the image or video onto such projection surface 4.
- Preferably, in a preliminary step it is necessary to configure the calibration system 1. For this reason, the control unit 5 is configured to project onto the projection surface 4 a set of parameters or initial calibration conditions, in response to the signal S1 representing the type of optical sensor 2.
- The calibration of the optical sensor 2, meaning the adjustment of the position of the optical axis, takes place by a calibration unit 30 that interfaces with the control unit 5. The calibration unit 30 is preferably part of the vehicle's 100 electronic control unit and it interfaces with the control unit 5 of the scan tool 20 through the connection to the EOBD diagnostic socket 31.
- In accordance with one embodiment, the control unit 5 is also configured to determine a spatial position of the projection surface 4 with respect to the optical sensor 2 mounted on board of the vehicle 100 arranged in the test station 3. Such determination is performed based on the signal S1 representing the type of optical sensor 2.
- Preferably, the calibration system 1 also comprises an automatic means for adjusting (i.e. regulating) the spatial position of the projection surface 4 with respect to the test station 3. Said adjusting means is of the known type and will not be described further.
- This position adjustment of the projection surface 4 is usually used in the event in which the vehicle 100 is placed on a horizontal support surface.
- In accordance with another embodiment, the control unit 5 is also configured to process the images or videos resident in the memory 6. In particular, the control unit 5 is configured to adapt or deform the selected image or video to the size of the projection surface 4. Such adaptation is performed in response to the signal S1 representing the type of optical sensor 2.
- For example, if the vehicle 100 in the test station 3 is placed on an inclined plane, the image or video is deformed rather than adjusting the spatial position of the projection surface 4 with respect to the optical sensor 2.
- For example, the vehicle's 100 support plane is inclined forwards by a maximum of 1° with respect to the horizontal.
- Or, the vehicle's 100 support plane is inclined backwards by a maximum of 3° with respect to the horizontal.
- It is also envisaged that the control unit 5 is configured to perform both a determination of the spatial position of the projection surface 4 with respect to the optical sensor 2 mounted on board of the vehicle 100 in the test station 3 and a deformation of the selected images or videos.
- In that case, the determination of the spatial position is rough, and is performed from a deformed projection of the image or video.
- First of all, the
vehicle 100 is parked in the test station 3, according to known techniques, as already mentioned above. - The projection surface 4, e.g. the display of a monitor, is arranged in front of the test station 3, in particular transverse to the longitudinal axis AA of the
vehicle 100. - The operator then connects the portable device 20 (scan tool) to the EOBD
diagnostic socket 31 of thevehicle 100. - The portable device 20 has a
screen 21 on which a graphical interface is displayed, configured to allow text or instructions to be entered by an operator. - In particular, the operator can select the
vehicle 100 to be calibrated, by choosing from different types of vehicles split into brands (manufacturers) and models. - Alternatively, the portable device 20 performs such selection automatically or semi-automatically, asking the operator to confirm that the
vehicle 100 detected is the correct one. - The operator also selects the ADAS system to be calibrated, specifically the
optical sensor 2 to be calibrated. Also in this case, the selection can take place manually, automatically or semi-automatically. - These detection or selection steps of the
vehicle 100 and of the type ofoptical sensor 2 to be calibrated are known in themselves and therefore are not the subject matter of this disclosure. - Once the type of
optical sensor 2 has been identified, the control unit 5 (in the scan tool 20) can determine the spatial measurement position that the monitor must assume with respect to theoptical sensor 2 during calibration. - Such determination takes place, for example, in the case of a
vehicle 100 placed on a horizontal support surface. - The spatial measurement position is preferably displayed in the form of instructions on the projection surface 4.
- It is known that the mutual position of the optical sensor and its target (in this case the display or, in general, the projection surface) must be adjusted according to the type of optical sensor and the position that it occupies in the
vehicle 2. - Preferably, other parameters or initial calibration conditions are also projected onto the projection surface 4.
- The operator then manually adjusts the projection surface 4 until the latter assumes the spatial measurement position. Alternatively, the adjustment of the position of the projection surface 4 takes place automatically.
- Once this adjustment has been performed, the operator confirms to the portable device 20 (still through the graphical interface that can be loaded onto its screen 21) that the preliminary step has been performed and the actual calibration can take place. The operator can choose whether to perform a calibration with a static image or a dynamic video.
- The
control unit 5 searches inside the memory 6 for the image (in the former case) or the video (in the latter case) associated with the type ofoptical sensor 2 to be calibrated. - The image or video selected can then be displayed on the projection surface 4.
- Once the target (which in this case is the projection surface 4) has been adjusted and the image or video has been projected, the calibration is performed by the
calibration unit 30 which communicates with the scan tool 20. The actual calibration, meaning the adjustment of the position of the optical axis of theoptical sensor 2, takes place according to an algorithm of the known type. - Once the
optical sensor 2 has been calibrated, the operator can easily repeat the aforesaid method for other optical sensors located on board of thevehicle 100. - Alternatively to the determination of the spatial position that the projection surface 4 must have and its subsequent adjustment, it is possible to project a deformed image or video onto the projection surface 4. Such solution, used in particular when the
vehicle 100 is on a horizontal support surface, is particularly advantageous because it prevents having to adjust the position of the projection surface 4. - Finally, it is also possible to adopt a combined solution, in which the spatial adjustment is performed on both the position of the projection surface 4 and a projection of the deformed image/video.
- A similar calibration system may also be applied for the calibration of a radar mounted on board of a vehicle, i.e. a frontal radar.
- According to prior art solutions, the calibration of the frontal radar is achieved by means of a plane reflector arranged at a certain distance D from the radar and perpendicular to the axis of the radar.
- Usually, the manufacturers declare a certain tolerance ΔD range for the distance D, i.e. D±ΔD, in the arrangement of the plane reflector with respect to the radar.
- Nevertheless, it is well-known in this field that a deviation, even of a few degrees, in the orthogonal arrangement of the plane reflector with respect to the radar results in failure of the calibration.
- Applying to the frontal radar a system and method similar to the one proposed herewith for the optical sensor, the user simply places the plane reflector in front of the radar at the distance D recommended by the manufacturer, then enters the relevant distances obtained by means of laser meters, and the system calculates the magnitude of the angle by which the plane reflector needs to be rotated. Furthermore, the system indicates to the user whether, and by which amount, the plane reflector shall slide right or left in order to be centred with respect to the radar.
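The rotation computation described above can be reconstructed with elementary trigonometry: if laser meters give slightly different distances to the two edges of the reflector, the difference over the reflector width yields the tilt to correct. This is a hedged reconstruction of the geometry, not the patented algorithm, and all values are invented:

```python
import math

# Hypothetical reconstruction of the reflector-alignment geometry: two
# laser-meter distances to the left and right edges of a plane reflector of
# known width give the angle by which it must be rotated to face the radar.

def reflector_rotation(d_left, d_right, reflector_width):
    """Rotation in degrees needed so the reflector is perpendicular to the
    radar axis (positive: left edge is farther than the right edge)."""
    # the distance difference projected over the width gives the tilt angle
    return math.degrees(math.asin((d_left - d_right) / reflector_width))

# e.g. edges measured at 1.532 m and 1.518 m on a 0.40 m wide reflector
angle = reflector_rotation(1.532, 1.518, 0.40)
```

A 14 mm difference across a 0.40 m reflector corresponds to a tilt of roughly 2°, consistent with the observation that even a few degrees of misalignment makes the calibration fail.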
- In practice, in the calibration of a radar, the radar substitutes the optical sensor 2, while the plane reflector substitutes the projection surface 4.
- The characteristics of the system and method of calibrating an optical sensor mounted on board of a vehicle, according to an embodiment, are clear, as are the advantages.
- In particular, the use of a surface onto which the images are projected prevents the storage or delicate handling in the workshop of numerous target panels having different shapes, sizes and patterns.
- In the solution using a screen, e.g. of a television set, the calibration system proposed herein further allows a contrast to be achieved that is also compatible with use in an open environment.
- Furthermore, the screen also allows videos that reproduce real or simulated dynamic scenarios to be projected. Therefore, even for static calibration (i.e. with the vehicle stationary), performance levels comparable to those of dynamic calibration are obtained, which can therefore be avoided for vehicles that usually require an on-road test. Avoiding the on-road test simplifies planning (connected with weather and road conditions) and prevents risks for the driver.
- The method and system proposed can also be used in the event of inclination of the vehicle (within certain limits) because it is sufficient to suitably deform the image/video to be projected onto the screen instead of performing the spatial adjustment of the screen with respect to the vehicle.
- In addition, in the case of calibration of the radar, the user may save up to 20 minutes for each vehicle.
- As will be appreciated by one skilled in the art, various aspects may be embodied as a system, method or device program product. Accordingly, aspects may take the form of an entirely hardware embodiment or an embodiment including software that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a device program product embodied in one or more device readable medium(s) having device readable program code embodied therewith.
- It should be noted that the various functions described herein may be implemented using instructions stored on a device readable storage medium such as a non-signal storage device that are executed by a processor. A storage device may be, for example, a system, apparatus, or device (e.g., an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device) or any suitable combination of the foregoing. More specific examples of a storage device/medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a storage device is not a signal and “non-transitory” includes all media except signal media.
- Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, et cetera, or any suitable combination of the foregoing.
- Program code for carrying out operations may be written in any combination of one or more programming languages. The program code may execute entirely on a single device, partly on a single device, as a stand-alone software package, partly on single device and partly on another device, or entirely on the other device. In some cases, the devices may be connected through any type of connection or network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made through other devices (for example, through the Internet using an Internet Service Provider), through wireless connections, e.g., near-field communication, or through a hard wire connection, such as over a USB connection.
- Example embodiments are described herein with reference to the figures, which illustrate example methods, devices and program products according to various example embodiments. It will be understood that the actions and functionality may be implemented at least in part by program instructions. These program instructions may be provided to a processor of a device, a special purpose information handling device, or other programmable data processing device to produce a machine, such that the instructions, which execute via a processor of the device, implement the functions/acts specified.
- It is worth noting that while specific blocks are used in the figures, and a particular ordering of blocks has been illustrated, these are non-limiting examples. In certain contexts, two or more blocks may be combined, a block may be split into two or more blocks, or certain blocks may be re-ordered or re-organized as appropriate, as the explicit illustrated examples are used only for descriptive purposes and are not to be construed as limiting.
- As used herein, the singular “a” and “an” may be construed as including the plural “one or more” unless clearly indicated otherwise.
- This disclosure has been presented for purposes of illustration and description but is not intended to be exhaustive or limiting. Many modifications and variations will be apparent to those of ordinary skill in the art. The example embodiments were chosen and described in order to explain principles and practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.
- Thus, although illustrative example embodiments have been described herein with reference to the accompanying figures, it is to be understood that this description is not limiting and that various other changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the disclosure.
Claims (25)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/701,362 US20220284629A1 (en) | 2018-06-21 | 2022-03-22 | System and method of calibrating an optical sensor mounted on board of a vehicle |
US17/943,738 US20230005183A1 (en) | 2018-06-21 | 2022-09-13 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18178983.5A EP3588001B1 (en) | 2018-06-21 | 2018-06-21 | System and method of calibrating an optical sensor mounted on board of a vehicle |
EP18178983.5 | 2018-06-21 | ||
US16/447,918 US20190392610A1 (en) | 2018-06-21 | 2019-06-20 | System and method of calibrating an optical sensor mounted on board of a vehicle |
US17/701,362 US20220284629A1 (en) | 2018-06-21 | 2022-03-22 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/447,918 Continuation US20190392610A1 (en) | 2018-06-21 | 2019-06-20 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/943,738 Continuation-In-Part US20230005183A1 (en) | 2018-06-21 | 2022-09-13 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220284629A1 true US20220284629A1 (en) | 2022-09-08 |
Family
ID=62874554
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/447,918 Abandoned US20190392610A1 (en) | 2018-06-21 | 2019-06-20 | System and method of calibrating an optical sensor mounted on board of a vehicle |
US17/701,362 Pending US20220284629A1 (en) | 2018-06-21 | 2022-03-22 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/447,918 Abandoned US20190392610A1 (en) | 2018-06-21 | 2019-06-20 | System and method of calibrating an optical sensor mounted on board of a vehicle |
Country Status (5)
Country | Link |
---|---|
US (2) | US20190392610A1 (en) |
EP (1) | EP3588001B1 (en) |
ES (1) | ES2941712T3 (en) |
HU (1) | HUE061524T2 (en) |
PL (1) | PL3588001T3 (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3264360A1 (en) * | 2016-06-28 | 2018-01-03 | Dassault Systèmes | Dynamical camera calibration |
US10788400B2 (en) | 2016-10-11 | 2020-09-29 | Hunter Engineering Company | Method and apparatus for vehicle inspection and safety system calibration using projected images |
US10852731B1 (en) * | 2017-12-28 | 2020-12-01 | Waymo Llc | Method and system for calibrating a plurality of detection systems in a vehicle |
US10942045B1 (en) * | 2018-04-03 | 2021-03-09 | Waymo Llc | Portable sensor calibration target for autonomous vehicle |
EP3629053B1 (en) | 2018-09-28 | 2020-10-21 | NEXION S.p.A. | System for calibrating a vehicle camera |
US11681030B2 (en) | 2019-03-05 | 2023-06-20 | Waymo Llc | Range calibration of light detectors |
US11747453B1 (en) | 2019-11-04 | 2023-09-05 | Waymo Llc | Calibration system for light detection and ranging (lidar) devices |
US20230122529A1 (en) * | 2020-03-11 | 2023-04-20 | Moog Inc. | Camera system in situation built-in-test |
US11453348B2 (en) * | 2020-04-14 | 2022-09-27 | Gm Cruise Holdings Llc | Polyhedral sensor calibration target for calibrating multiple types of sensors |
US11872965B2 (en) * | 2020-05-11 | 2024-01-16 | Hunter Engineering Company | System and method for gyroscopic placement of vehicle ADAS targets |
EP3929619A1 (en) | 2020-06-22 | 2021-12-29 | Mahle International GmbH | A method of calibration of a front optical sensor mounted on board of a vehicle |
EP3945300A1 (en) | 2020-07-28 | 2022-02-02 | Mahle International GmbH | Adas calibration system for calibrating at least one headlamp of a vehicle |
CN114415189A (en) * | 2020-10-12 | 2022-04-29 | 北醒(北京)光子科技有限公司 | Laser radar system and calibration method thereof |
CN113766199A (en) * | 2021-09-06 | 2021-12-07 | 湖南联科科技有限公司 | Optical projection leveling device and method |
US20230145082A1 (en) * | 2021-11-08 | 2023-05-11 | Kinetic Automation Inc. | System and method for automated extrinsic calibration of lidars, cameras, radars and ultrasonic sensors on vehicles and robots |
EP4279864A1 (en) | 2022-05-16 | 2023-11-22 | Mahle International GmbH | A method for aligning a vehicle to an adas calibration target and an adas calibration system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090326757A1 (en) * | 2004-07-22 | 2009-12-31 | Keith Andreasen | Scan tool user interface |
US20100082281A1 (en) * | 2008-09-30 | 2010-04-01 | Aisin Seiki Kabushiki Kaisha | Calibration device for on-vehicle camera |
US20190055015A1 (en) * | 2017-08-17 | 2019-02-21 | Here Global B.V. | Method and apparatus for intelligent inspection and interaction between a vehicle and a drone |
US20190204184A1 (en) * | 2016-09-16 | 2019-07-04 | Dürr Assembly Products GmbH | Vehicle test bench for calibrating and/or testing systems of a vehicle, which comprise at least one camera, and method for carrying out the calibrating and/or tests of systems of a vehicle, which comprise at least one camera |
US10365355B1 (en) * | 2016-04-21 | 2019-07-30 | Hunter Engineering Company | Method for collective calibration of multiple vehicle safety system sensors |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102006060553A1 (en) * | 2006-12-21 | 2008-06-26 | Bayerische Motoren Werke Ag | Test of a motor vehicle imaging system, to show the driver the conditions around the vehicle, uses a reference camera on a test drive to give stored images as reference images for projection and comparison |
DE102010062696A1 (en) | 2010-12-09 | 2012-06-14 | Robert Bosch Gmbh | Method and device for calibrating and adjusting a vehicle environment sensor. |
DE102011076083A1 (en) * | 2011-05-18 | 2012-11-22 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Projection display and method for displaying an overall image for projection relief surfaces or tilted projection surfaces |
JP5897213B2 (en) * | 2013-05-31 | 2016-03-30 | 本田技研工業株式会社 | Optical sensor inspection system and optical sensor inspection method |
EP3523604B1 (en) * | 2016-10-04 | 2020-05-13 | Hunter Engineering Company | Vehicle wheel alignment measurement system camera and adas calibration support structure |
US10788400B2 (en) * | 2016-10-11 | 2020-09-29 | Hunter Engineering Company | Method and apparatus for vehicle inspection and safety system calibration using projected images |
- 2018-06-21 HU HUE18178983A patent/HUE061524T2/en unknown
- 2018-06-21 PL PL18178983.5T patent/PL3588001T3/en unknown
- 2018-06-21 EP EP18178983.5A patent/EP3588001B1/en active Active
- 2018-06-21 ES ES18178983T patent/ES2941712T3/en active Active
- 2019-06-20 US US16/447,918 patent/US20190392610A1/en not_active Abandoned
- 2022-03-22 US US17/701,362 patent/US20220284629A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090326757A1 (en) * | 2004-07-22 | 2009-12-31 | Keith Andreasen | Scan tool user interface |
US20100082281A1 (en) * | 2008-09-30 | 2010-04-01 | Aisin Seiki Kabushiki Kaisha | Calibration device for on-vehicle camera |
US10365355B1 (en) * | 2016-04-21 | 2019-07-30 | Hunter Engineering Company | Method for collective calibration of multiple vehicle safety system sensors |
US20190204184A1 (en) * | 2016-09-16 | 2019-07-04 | Dürr Assembly Products GmbH | Vehicle test bench for calibrating and/or testing systems of a vehicle, which comprise at least one camera, and method for carrying out the calibrating and/or tests of systems of a vehicle, which comprise at least one camera |
US20190055015A1 (en) * | 2017-08-17 | 2019-02-21 | Here Global B.V. | Method and apparatus for intelligent inspection and interaction between a vehicle and a drone |
Also Published As
Publication number | Publication date |
---|---|
HUE061524T2 (en) | 2023-07-28 |
ES2941712T3 (en) | 2023-05-25 |
PL3588001T3 (en) | 2023-05-22 |
EP3588001B1 (en) | 2023-03-15 |
US20190392610A1 (en) | 2019-12-26 |
EP3588001A1 (en) | 2020-01-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220284629A1 (en) | System and method of calibrating an optical sensor mounted on board of a vehicle | |
US10510276B1 (en) | Apparatus and method for controlling a display of a vehicle | |
US20160275683A1 (en) | Camera Calibration Device | |
JP5339124B2 (en) | Car camera calibration system | |
US20240080435A1 (en) | Calibration of a surround view camera system | |
CN107848417B (en) | Display device for vehicle | |
EP3621037B1 (en) | System and method for calibrating advanced driver assistance system based on vehicle positioning | |
US9902322B2 (en) | Filling in surround view areas blocked by mirrors or other vehicle parts | |
US9794552B1 (en) | Calibration of advanced driver assistance system | |
US20200027241A1 (en) | Auto-calibration for vehicle cameras | |
US20170070725A1 (en) | Calibration method, calibration device, and computer program product | |
US20080186384A1 (en) | Apparatus and method for camera calibration, and vehicle | |
US20120293659A1 (en) | Parameter determining device, parameter determining system, parameter determining method, and recording medium | |
US20220375281A1 (en) | Adas system calibration guiding method and apparatus, and vehicle diagnosis device | |
CN101786439A (en) | Methods and systems for calibrating vehicle vision systems | |
US11830221B2 (en) | Method for aligning a vehicle service system relative to a vehicle | |
US20170168561A1 (en) | Head up display automatic correction method and correction system | |
CN114286079B (en) | Enhanced pointing angle verification | |
US20230005183A1 (en) | System and method of calibrating an optical sensor mounted on board of a vehicle | |
US20220157088A1 (en) | Automotive sensor calibration system and related methodology | |
US20210278203A1 (en) | Method for aligning a vehicle service system relative to a vehicle | |
CN102165503A (en) | Steering assistance device | |
EP3945300A1 (en) | Adas calibration system for calibrating at least one headlamp of a vehicle | |
US20200282833A1 (en) | Method and system for demonstrating function of vehicle-mounted heads up display, and computer-readable storage medium | |
US20220132049A1 (en) | Systems and method for image normalization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MAHLE AFTERMARKET ITALY S.P.A., ITALY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GANDOLFI, PAOLO;REEL/FRAME:060064/0004 Effective date: 20190226 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: MAHLE AFTERMARKET ITALY S.P.A., ITALY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CANTADORI, ANDREA;REEL/FRAME:060079/0566 Effective date: 20220203 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |