GB2603731A - Method and device for calibrating a vehicle camera of a vehicle - Google Patents


Publication number
GB2603731A
Authority
GB
United Kingdom
Prior art keywords
vehicle
camera
camera image
calibration
image
Prior art date
Legal status
Granted
Application number
GB2206240.0A
Other versions
GB202206240D0 (en
GB2603731B (en
Inventor
Dreuw Philippe
Focke Thomas
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of GB202206240D0 publication Critical patent/GB202206240D0/en
Publication of GB2603731A publication Critical patent/GB2603731A/en
Application granted granted Critical
Publication of GB2603731B publication Critical patent/GB2603731B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/103Side slip angle of vehicle body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/40Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
    • B60R2300/402Image calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking

Abstract

A method for self-calibrating an onboard vehicle camera 105 is described. Two subsequent images (305, 310, Fig. 3) are taken with the camera moving 210 while the automobile 100 is stationary, and said images are used to set at least one calibration parameter (315, Fig. 3). Flow vectors (513, Fig. 5) might be used for calibration, which might be compared with reference vectors. The flow vectors may be determined by using multiple feature points from different images. The travel path 200 of the camera may be reconstructed via the control signal of an actuator motor. A car motion signal may be used to confirm that the vehicle is stationary. The camera trajectory may be related to the displacement of a camera mounted on a car door or a tailgate being opened or closed. Said motion may also derive from an exterior mirror being folded in or out of a door. The calibration procedure may be interrupted if an inclination sensor recognises that the automobile is inclined.

Description

Description Title
Method and device for calibrating a vehicle camera of a vehicle
Prior art
The concept is based on a device or a method of the type defined in the independent claims. Another object of this concept is a computer programme.
Known vehicle video surround view systems, in other words systems for monitoring the surrounding area of a vehicle, use four or more cameras to detect an area immediately surrounding a vehicle. To enable intuitive detection of the situation by a driver in the vehicle, individual images of the cameras are pieced together to form an overall view. Due to the fact that the cameras are fixedly mounted on the vehicle, a static rule relating to the geometric relationships of the individual images to one another is usually stored in the vehicle surround view system or VSV system, this process also being referred to as extrinsic calibration. This rule is determined as part of an end-of-line calibration or solely from the construction data of the vehicle and cameras. In the prior art, any changes which might occur in the calibration are corrected by means of a so-called online calibration. Since every individual surround view camera is to be regarded as a mono-camera and the three-dimensional surround models necessary for a calibration can therefore only be calculated by a camera movement, the correction, in other words the online calibration, is usually effected at sporadic time intervals when the vehicle is in motion during normal driving.
DE 10 2008 259 551 A1 discloses a method for determining a change in position of a camera system by means of a first and a second image of a camera, for example.
Against this background, using the concept presented here, a method for calibrating a vehicle camera of a vehicle, a device which uses this method and finally a corresponding computer programme are proposed, as defined in claims 1, 10 and 11. The features defined in dependent claims 2 to 9 enable other advantageous embodiments and improvements to the method specified in independent claim 1 to be obtained.
The advantages which can be achieved by the concept presented here reside in the fact that a correct calibration of the vehicle cameras is made possible before a vehicle sets off, as opposed to only when the vehicle is driving. This is particularly important because it is precisely when setting off, for example when exiting a parking space, that a correct representation of an area surrounding the vehicle in an overall view of the individual vehicle camera images is necessary for the driver, for example in order to be able to estimate distances correctly.
A method for calibrating a vehicle camera of a vehicle is proposed. The method comprises at least a reading step and a setting step. In the reading step, at least a first camera image and a second camera image representing images taken by at least the vehicle camera during a camera movement of the vehicle camera whilst the vehicle is stationary are read. These camera images may be read in the form of one or more signals read from an interface to the vehicle camera, for example. In the setting step, a calibration parameter for calibrating at least the vehicle camera is set using the first camera image and the second camera image. As a result, the vehicle camera can be calibrated relative to the vehicle and to its own camera position.
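The two core steps above can be expressed as a minimal sketch. The camera interface (`camera.grab`) and the parameter-setting callable are hypothetical stand-ins; neither name comes from the patent, which leaves the concrete implementation open:

```python
def calibration_method(camera, set_parameter):
    """Sketch of the proposed method: read two images taken during a
    camera movement while the vehicle is stationary, then derive a
    calibration parameter from them. `camera` and `set_parameter`
    are illustrative placeholders."""
    # Reading step: two camera images during the camera movement.
    first_image = camera.grab()
    second_image = camera.grab()
    # Setting step: set the calibration parameter from both images.
    return set_parameter(first_image, second_image)
```

In a real system the setting step would run the flow-vector comparison described further below; here it is left as an injected callable to keep the skeleton self-contained.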
The method may comprise a step of determining the camera movement. This may be done by evaluating a signal of an acceleration sensor or by evaluating the camera images provided by the camera, for example.
This method may be implemented in a control device using software or hardware for example, or in a combined form of software and hardware for example.
In order to make the stationary state of the vehicle detectable, the method may comprise an additional reading step in which a vehicle motion signal indicating or representing the stationary state of the vehicle is read. The stationary state may be understood as being a vehicle speed of the vehicle of substantially 0 km/h. The setting step can then be implemented in response to the reading step and the additional reading step.
The method presented here may advantageously comprise a determination step in which a flow vector is determined using the first camera image and the second camera image, and the calibration parameter can be set in the setting step using the flow vector.
If the method further comprises a receiving step in which a predefined reference vector is received and a comparison step in which the flow vector is compared with the reference vector, the calibration parameter can be set in the setting step using a comparison result of the comparison step based on one advantageous embodiment of the method presented here. The reference vector may be a vector which was determined in accordance with the flow vector but on the basis of an empty or unladen vehicle. Since the depth at which the shock absorbers sit and hence also the position of the vehicle cameras relative to the road surface changes when the vehicle is laden due to goods and/or persons for example, it may be that an existing extrinsic calibration of the vehicle camera or the camera system is no longer valid and another calibration is therefore necessary. Whether the existing calibration is valid can advantageously be established by means of the described comparison step. Depending on the resultant comparison result, the calibration parameter can then be set if the flow vector does not match the reference vector, for example, because such a variance indicates that the vehicle camera is not correctly calibrated, in other words is decalibrated.
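A minimal sketch of this comparison step, assuming the flow and reference vectors are given as 2D image-plane displacements: a systematic angular deviation between the measured and stored vectors is taken as evidence of decalibration. The function name and tolerance are illustrative, not taken from the patent:

```python
import numpy as np

def is_decalibrated(flow_vectors, reference_vectors, angle_tol_deg=1.0):
    """Compare currently measured flow vectors against reference
    vectors recorded for the unladen vehicle. Returns True when the
    mean angular deviation exceeds an (illustrative) tolerance."""
    flow = np.asarray(flow_vectors, dtype=float)
    ref = np.asarray(reference_vectors, dtype=float)
    # Direction of each displacement vector in the image plane.
    flow_ang = np.degrees(np.arctan2(flow[:, 1], flow[:, 0]))
    ref_ang = np.degrees(np.arctan2(ref[:, 1], ref[:, 0]))
    deviation = np.abs(flow_ang - ref_ang).mean()
    return bool(deviation > angle_tol_deg)
```

A production system would compare vectors point-by-point over many frames and feature points rather than averaging a single batch, but the decision logic is the same.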
In the determination step, the flow vector may be determined using a point of the road surface imaged in the first camera image and the point of the road surface imaged in the second camera image, for example. The flow vector can be rapidly and easily determined by comparing or superimposing the two camera images respectively imaging the same point of the road surface.
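For a single road-surface point imaged in both camera images, the flow vector is just the displacement of that point between the two images. A sketch, assuming the point correspondences are already available (a real system would track them with an optical-flow estimator):

```python
import numpy as np

def flow_vectors(points_first, points_second):
    """Flow vectors for road-surface points imaged in both camera
    images: each vector is the pixel displacement of one point from
    the first image to the second. Point matching is assumed done."""
    p1 = np.asarray(points_first, dtype=float)
    p2 = np.asarray(points_second, dtype=float)
    return p2 - p1
```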
In the reading step, camera images representing images taken by the vehicle camera disposed in the region of an exterior mirror and/or a vehicle door and/or a tailgate of the vehicle may be read. Dispositions of vehicle cameras described here are typical and of practical use in vehicle surround view systems. Since their camera positions on the vehicle are also known, their possible camera movements are also predictable. When the vehicle door is opened, a vehicle camera mounted on the vehicle door moves along a defined travel path, for example. The same applies to a vehicle camera disposed on an exterior mirror or on a tailgate. This means that defined vehicle camera movements are possible when a vehicle is stationary.
In the reading step, therefore, it is of advantage if the camera movement represents a travel path of at least the vehicle camera caused by folding in and/or folding out an exterior mirror of the vehicle and/or by opening and/or closing a vehicle door and/or by opening and/or closing a tailgate of the vehicle. This is practical because such a travel path is known from construction data of the vehicle and an exact knowledge of the plane estimation for the unladen vehicle is therefore also available.
In order to determine the travel path representing the camera movement, however, the method may also comprise a determination step in which the travel path representing the camera movement is determined using a control signal of an actuator motor for effecting the camera movement.
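Since a door-mounted camera moves on a circular arc about the door hinge, the travel path can be reconstructed from the door angle reported by the actuator motor's control signal. A geometric sketch; the hinge position and radius would come from the vehicle construction data, and all names here are illustrative:

```python
import math

def camera_pose_on_door(door_angle_deg, hinge_xy, radius):
    """Camera position for one actuator angle: a point on the circular
    arc of given radius around the door hinge (top-down view)."""
    a = math.radians(door_angle_deg)
    x = hinge_xy[0] + radius * math.cos(a)
    y = hinge_xy[1] + radius * math.sin(a)
    return (x, y)

def travel_path(control_angles_deg, hinge_xy, radius):
    """One camera pose per sampled actuator angle, reconstructing the
    travel path of the camera from the actuator control signal."""
    return [camera_pose_on_door(a, hinge_xy, radius)
            for a in control_angles_deg]
```

The same construction applies to a folding exterior mirror or a tailgate, with the swivel joint replacing the door hinge.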
Since it is particularly practical to implement a method proposed here when the vehicle is standing level, it is of advantage if the method comprises an additional receiving step in which an inclination signal is received from an interface to an inclination sensor of the vehicle, and the setting step is not implemented if the inclination signal indicates an inclination of the vehicle. In this context, the inclination may represent a non-horizontal field of view of at least the vehicle camera with respect to the vehicle.
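The two gating conditions described above (stationary vehicle, level ground) can be combined into one check before the setting step is run. Signal names and tolerances are assumptions for illustration:

```python
def may_calibrate(vehicle_speed_kmh, inclination_deg,
                  max_inclination_deg=0.5):
    """Gate for the setting step: calibrate only while the vehicle is
    stationary (speed of substantially 0 km/h) and not inclined."""
    stationary = abs(vehicle_speed_kmh) < 0.1
    level = abs(inclination_deg) <= max_inclination_deg
    return stationary and level
```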
The concept presented here also provides a device configured to run, control and implement the steps of a variant of a method as proposed here in corresponding units. This embodiment of the concept in the form of a device also enables the underlying objective of the concept to be rapidly and efficiently achieved.
To this end, the device may comprise at least one computer unit for processing signals or data, at least one memory unit for storing signals or data, at least one interface to a sensor or an actuator for reading sensor signals from the sensor or for outputting data signals or control signals to the actuator and/or at least one communications interface embedded in a communications protocol for reading or outputting data. The computer unit may be a signal processor, a microcontroller or similar, for example, and the memory unit may be a flash memory, an EPROM or a magnetic memory unit. The communications interface may be configured to read or output data wirelessly and/or via cables, and a communications interface which is capable of reading or outputting data via cables can read this data electrically or optically from a corresponding data transmission cable or can output this data to a corresponding data transmission cable, for example.
A device may be understood as meaning an electrical apparatus which processes sensor signals and outputs control signals and/or data signals depending thereon. The device may comprise an interface which may be configured in the form of hardware and/or software. In the case of a hardware configuration, the interfaces may be part of a so-called system ASIC containing the different functions of the device, for example. However, it would also be possible for the interfaces to be separate integrated circuits or for at least some of them to be discrete components. In the case of a software configuration, the interfaces may be software modules which are provided on a microcontroller in addition to other software modules, for example.
Based on one advantageous embodiment, a calibration parameter for calibrating at least one vehicle camera of a vehicle is set by the device. To this end, the device is able to access sensor signals representing the at least one first camera image and a second camera image and optionally also a vehicle motion signal, for example. Processing takes place by means of units such as at least one reading unit for reading at least the first camera image and the second camera image and a setting unit for setting the calibration parameter.
Also of advantage is a computer programme product or computer programme with programme code which can be stored on a machine-readable carrier or memory medium such as a semiconductor memory, a hard disc storage or an optical storage and used for running, implementing and/or controlling the steps of the method based on one of the embodiments described above, in particular when the programme product or programme is run on a computer or a device.
Examples of embodiments of the concept presented here are illustrated in the drawings and will be explained in more detail in the description below. Of the drawings:
Fig. 1 is a schematic plan view of a vehicle having a plurality of decalibrated vehicle cameras;
Fig. 2 is a schematic view of a travel path of a vehicle camera on a side door of a vehicle;
Fig. 3 is a schematic plan view of a vehicle having a device for calibrating a vehicle camera based on one example of an embodiment;
Fig. 4 is a sequence diagram of a method for calibrating a vehicle camera of a vehicle based on one example of an embodiment; and
Fig. 5 is a functional block diagram of a device for calibrating a vehicle camera of a vehicle based on one example of an embodiment.
In the following description of examples of practical embodiments of this concept, the same or similar reference numbers are used to denote similarly acting elements illustrated in the different drawings and a repeated description of these elements will be dispensed with.
Fig. 1 is a schematic plan view of a vehicle 100 having a plurality of decalibrated vehicle cameras 105.
The vehicle 100 illustrated here is stationary and is laden with goods and/or persons, as a result of which the depth at which shock absorbers of the vehicle 100 sit and hence a position of the vehicle cameras 105 from the road surface has changed compared with a position of the vehicle cameras 105 from the road surface in an unladen state of the vehicle 100. An existing extrinsic calibration of the camera system of the individual vehicle cameras 105 is therefore no longer valid and a recalibration is necessary. Since vehicle surround view systems are usually used for parking assistance in particular, an exit from the parked position after the changed load would take place with an incorrect calibration and hence also a surround view representation that does not match the vehicle's surroundings. The incorrect calibration is illustrated in Fig. 1, in particular at points 110, 115 at which individual images of the vehicle cameras 105 have been pieced together or stitched. This leads to a situation in which parking lines and kerbs do not fit together at the transition from one image to an adjacent image.
Markings 117 illustrate camera fields of view of the front camera 105 and the rear camera 105 of the vehicle 100.
Points 110 marked in the region of the tailgate of the vehicle 100 illustrate lines which do not extend parallel with the vehicle 100. Lines from a front camera image do not fit exactly with lines in the side camera image. This deviation has arisen due to vehicle cameras 105 losing calibration on loading the vehicle 100.
Points 115 marked in the region of the front of the vehicle 100 illustrate kinks in the course of straight lines. Lines from the front camera image do not fit exactly with lines in the side camera image. This deviation has arisen as a result of vehicle cameras 105 losing calibration due to misalignment.
Due to the device 300 illustrated in Fig. 3, a correction of the calibration to reflect the changed load can advantageously be run whilst the vehicle 100 is still stationary, as opposed to after significant travel of the vehicle 100 as is the case with known devices. An advantage of this is that a correct calibration is guaranteed before embarking on a journey.
Fig. 2 is a schematic view illustrating a travel path 200 of a vehicle camera 105 on a side door 205 of a vehicle 100. This may be the vehicle 100 with the vehicle cameras 105 described with reference to Figure 1.
As already explained in connection with Figure 1, in the case of known vehicle surround view systems, the vehicle cameras 105 are initially intrinsically and then extrinsically calibrated with one another in a one-off operation in the factory. An online calibration of the system may take place later whilst the vehicle 100 is travelling, in other words a correction of the calibration which typically assumes that a surface underneath the vehicle 100 is flat and is therefore able to compensate for inaccuracies in the mounting tolerances of up to ca. 3°. Until now, this type of online calibration could take place exclusively whilst the vehicle 100 was travelling, a situation in which surround view systems have rarely been used in the past. However, if a deviation from the calibration occurs already whilst stationary due to the vehicle being loaded, as is the case with the illustrated vehicle 100, the images of the vehicle cameras 105 no longer match one another, having been distorted accordingly, as seen in Figure 1, and the user in the vehicle 100 receives an overall image of the surrounding area on his screen that has been incorrectly transformed accordingly. This no longer serves any purpose, especially when moving out of a parked position or when moving off slowly.
An improvement of the situation can be achieved in the stationary state whereby, based on a concept presented here, the calibration is run in the stationary state using moved vehicle cameras 105. The movement of the vehicle cameras 105 comes as a gift, as it were, because both the doors, in this instance the side door 205, and a boot lid of the vehicle 100 are typically moved before the start of a journey anyway. As a result of moving the side door 205 as illustrated here, the vehicle camera 105 mounted on the side door 205 moves along the travel path 200, as a result of which its field of view sweeps a yaw angle 210. The movement of a vehicle camera 105 may be detected, for example, by evaluating images of the camera or at least one sensor signal, for example of an acceleration sensor.
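Detecting the camera movement by evaluating two images can be as coarse as thresholding the overall intensity change between frames; when the door-mounted camera sweeps its yaw angle, the scene content shifts substantially. A deliberately simple sketch (threshold and function name are illustrative, not from the patent):

```python
import numpy as np

def camera_moved(image_a, image_b, threshold=2.0):
    """Coarse movement detection: if the mean absolute intensity
    change between two frames exceeds a threshold, assume the camera
    (e.g. on a moving side door) has moved."""
    a = np.asarray(image_a, dtype=float)
    b = np.asarray(image_b, dtype=float)
    return float(np.abs(a - b).mean()) > threshold
```

An acceleration sensor in the door or mirror, as mentioned above, would provide the same trigger without any image processing.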
Fig. 3 is a schematic plan view of a vehicle 100 having a device 300 for calibrating a vehicle camera 105 based on one example of an embodiment. This may be the laden vehicle 100 in the stationary state described with reference to the preceding drawings, the difference being that the vehicle cameras 105 of the vehicle 100 have been correctly calibrated due to the device 300 presented here.
To this end, the device 300 of at least one of the vehicle cameras 105 has read at least a first camera image 305 of the vehicle camera 105 and a second camera image 310 of the vehicle camera 105, the camera images 305, 310 representing images taken during a camera movement of the vehicle camera 105 when the vehicle 100 is stationary. Using the first camera image 305 and the second camera image 310, the device 300 has set a calibration parameter 315 for calibrating at least the vehicle camera 105.
The following features of the device 300 are optional.
Based on this example of an embodiment, at least the one vehicle camera 105 was calibrated by the set calibration parameter 315.
The camera movement based on this example of an embodiment is the travel path of the vehicle camera 105 illustrated in Fig. 2 which the vehicle camera 105 travelled because the side door on which the vehicle camera 105 is disposed was opened and/or closed.
Based on this example of an embodiment, the device 300 has determined a flow vector using the first camera image 305 and the second camera image 310 and the calibration parameter 315 was set using the flow vector. The flow vector was determined by the device 300 using a point of the road surface imaged in the first camera image 305 and the point of the road surface imaged in the second camera image 310. Furthermore, a predefined reference vector was received by the device 300 and compared with the flow vector, after which the calibration parameter 315 was set using a comparison result of this comparison. Based on this example of an embodiment, the travel path of the vehicle camera 105 representing the camera movement was determined by the device 300 using a control signal of an actuator motor for effecting the camera movement. Furthermore, a vehicle motion signal indicating the stationary state of the vehicle 100 was read by the device 300 before the calibration parameter 315 was set.
In addition or as an alternative, based on an example of an alternative embodiment, other camera images representing images taken by at least the vehicle camera 105 disposed in the region of an exterior mirror and/or another vehicle door and/or a tailgate of the vehicle 100 are read by the device 300, and the calibration parameter 315 is set as described above, additionally or alternatively using the other camera images. In this example of an alternative embodiment, the camera movement represents a travel path of at least the vehicle camera 105 caused by folding in and/or folding out the exterior mirror of the vehicle 100 and/or by opening and/or closing the other vehicle door and/or by opening and/or closing the tailgate of the vehicle 100.
Details of the device 300 will be described again below in a different way.
Fig. 3 illustrates calibrated vehicle cameras 105 after a correction applied by the device 300 presented here. Points 320 illustrate lines which now extend parallel with the vehicle 100. As may be seen, there is now a seamless transition between lines from a tailgate camera image and lines in the side camera image.
The device 300 presented here is configured to use the movement of side mirrors which can be folded in and out for calculating the calibration. A key point in this respect is the fact that the travel path of the mirrors, doors and optionally also the tailgate is known from construction data and is thus available as a means of providing an exact knowledge of a plane estimation for the unladen vehicle 100. As an alternative to the movement of the tailgate, a movement of a vehicle camera which is automatically folded out or lowered when a reverse gear is engaged may also be used. This defined movement may also be used in the same way as folding the side mirrors. Every deviation from this plane estimation is then detected as an error of the calibration and corrected accordingly. Information about the calibration of the front camera can be gleaned from the data of the side and tailgate cameras based on the assumption that the vehicle 100 is to be regarded as a "rigid system".
The device 300 described here is therefore configured to additionally make use of the known movement of side mirrors, in which the side cameras of a vehicle surround view system are typically integrated, as they are folded in. A knowledge of the circular path thus travelled by the vehicle cameras 105 in the mirrors, this circular path being referred to as the travel path above, advantageously enables the online calibration in the stationary state: a comparison between resultant actual circular paths based on corresponding video sequences and desired circular paths based on corresponding construction plans and swivel joints of the mirrors enables the calibration to be derived.
With the calibration implemented with the aid of this device 300 before embarking on a journey, a geometrically correct representation of the surrounding area of the vehicle 100 is obtained, which represents an improvement over known devices.
In other words, a function of the device 300 can be described as follows. A video surround view (VSV) system is calibrated in the empty state. On every individual vehicle camera 105 which moves as doors open or mirrors are folded, an estimation of a plane can be run based on this inherent movement of the vehicle camera 105 using an optical flow on the textured road surface. The travel path of the vehicle camera 105 being known, a direction of the flow vectors on the plane is characteristic and in this case corresponds to the estimation of the plane. A comparison of the calibrated version of the flow vectors stored in a memory based on this embodiment, referred to above as reference vectors, with the currently determined flow vectors will result in a deviation in the event of decalibration. This deviation can be geometrically determined in all three spatial angles and algorithmically corrected. This is implemented by the device 300 for tailgate and side cameras independently. From the three camera corrections, information is gleaned about the position of the vehicle 100 relative to the plane and the front camera is also corrected using this. As an alternative to storing the estimation of the plane in memory, the travel path of the door/mirror/tailgate may also be measured or determined by means of the integrated actuator motor and a desired plane estimation thus calculated for every image. The current plane estimation from the image data can then be respectively compared for each image and the correction calculated. The plurality of images as well as the two travel paths, folding in and folding out, result in a sufficient robustness of the plane estimation.
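As a rough numeric sketch of that correction, the mean angular offset between the currently determined flow vectors and the stored reference vectors yields the decalibration about one spatial axis; the patent does not specify the algorithm, so names and the single-axis simplification are illustrative:

```python
import numpy as np

def angular_correction_deg(flow_vectors, reference_vectors):
    """Mean angular offset (degrees) between current flow vectors and
    the reference vectors stored for the unladen vehicle; the sign
    gives the direction of the correction about one axis."""
    f = np.asarray(flow_vectors, dtype=float)
    r = np.asarray(reference_vectors, dtype=float)
    ang_f = np.arctan2(f[:, 1], f[:, 0])
    ang_r = np.arctan2(r[:, 1], r[:, 0])
    return float(np.degrees(ang_f - ang_r).mean())
```

A full implementation would solve for all three spatial angles jointly, e.g. by fitting a rotation to the vector field, and repeat this per camera as the text describes.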
Since a method proposed by this device 300 can only function correctly if the field of view imaged by the vehicle camera 105 is horizontal with respect to the vehicle 100, the device 300 is also configured to receive an inclination signal from an interface to an inclination sensor of the vehicle 100. The calibration parameter 315 is then not set if the received inclination signal indicates an inclination of the vehicle 100. Inclination sensors that are integrated as standard are therefore able to detect if the vehicle 100 is standing with at least one wheel on an inclined surface. In this case, the calibration is simply aborted. Following a plausibility check by the inclination sensors, the calibration will likewise be aborted if the vehicle 100 is standing directly next to an inclined plane, e.g. a dyke. In this case, the old calibration is used; an incorrect calibration must have been detected for this to happen.
Fig. 4 is a sequence diagram illustrating a method 400 for calibrating a vehicle camera of a vehicle based on one example of an embodiment. This may be a method 400 which can be implemented by the device described with reference to Fig. 3.
The method 400 comprises at least a reading step 405 and a setting step 410. In the reading step 405, at least a first camera image and a second camera image are read, representing images taken by at least the vehicle camera during a camera movement of the vehicle camera whilst the vehicle is stationary. In the setting step 410, a calibration parameter for calibrating at least the vehicle camera is set using the first camera image and the second camera image.
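The two mandatory steps can be expressed as a minimal skeleton. The function name `method_400` and the two callables are hypothetical placeholders; the sketch only shows the data flow from step 405 into step 410.

```python
def method_400(read_camera_images, derive_calibration_parameter):
    """Skeleton of the method: step 405 reads at least two images
    captured during the camera's own movement while the vehicle is
    stationary; step 410 sets the calibration parameter from them."""
    first_image, second_image = read_camera_images()                 # step 405
    return derive_calibration_parameter(first_image, second_image)   # step 410
```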
The embodiments of the method 400 described below are optional.
During the reading step 405 based on this example of an embodiment, a plurality of camera images representing images taken by the vehicle camera disposed in the region of an exterior mirror and/or a vehicle door and/or a tailgate of the vehicle are read.
During the reading step 405 based on this example of an embodiment, the camera movement representing a travel path of at least the vehicle camera caused by the exterior mirror of the vehicle being folded in and/or folded out and/or by the vehicle door being opened and/or closed and/or by the tailgate of the vehicle being opened and/or closed is read by means of the plurality of images.
Based on this example of an embodiment, the setting step 410 is implemented if a vehicle motion signal is read which indicates that the vehicle is stationary.
The method 400 optionally further comprises a determination step 415, a receiving step 420, a determination step 425, a comparison step 430 and an additional receiving step 435.
In the determination step 415, a flow vector is determined using the first camera image and the second camera image, and in the setting step 410, the calibration parameter is set using the flow vector. In the determination step 415 based on this example of an embodiment, the flow vector is determined using a point of the road surface imaged in the first camera image and the point of the road surface imaged in the second camera image.
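For a single tracked road-surface point, the flow vector reduces to the pixel displacement between the two images. This tiny sketch (function name assumed) shows that relationship:

```python
def flow_vector(point_in_first_image, point_in_second_image):
    """Flow vector of one road-surface point: its pixel displacement
    between the first and the second camera image (step 415)."""
    (x1, y1), (x2, y2) = point_in_first_image, point_in_second_image
    return (x2 - x1, y2 - y1)
```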
In the receiving step 420, a predefined reference vector is received. Based on an example of an alternative embodiment, in addition to or as an alternative to the receiving step 420, the method 400 comprises a determination step 425 in which a travel path representing the camera movement is determined using a control signal of an actuator motor for effecting the camera movement.
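Deriving the expected flow direction from the actuator-reported travel path, as in the alternative determination step 425, could look roughly like this. The circular-arc geometry, function name and parameters are assumptions for a door-mounted camera; the patent does not prescribe this particular model.

```python
import math

def expected_flow_direction(angle_start, angle_end, hinge_distance):
    """Approximate the travel direction of a door-mounted camera from
    the actuator-reported door angles: the camera moves on a circular
    arc around the hinge, and the chord between start and end position
    gives the direction the reference flow vectors should point in."""
    x1, y1 = hinge_distance * math.cos(angle_start), hinge_distance * math.sin(angle_start)
    x2, y2 = hinge_distance * math.cos(angle_end), hinge_distance * math.sin(angle_end)
    return math.atan2(y2 - y1, x2 - x1)
```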
In the comparison step 430, the flow vector is compared with the reference vector and in the setting step 410, the calibration parameter is set using a comparison result of the comparison step 430.
In the additional receiving step 435, an inclination signal is received from an interface to an inclination sensor of the vehicle, and the setting step 410 is not implemented if the inclination signal indicates an inclination of the vehicle.
The method steps described here can be implemented repeatedly and in a different sequence from the one described.
Fig. 5 is a functional block diagram illustrating a device 300 for calibrating a vehicle camera 105 of a vehicle based on one example of an embodiment. This may be the device 300 described with reference to Fig. 3, which is configured to implement the method described with reference to Fig. 4. The vehicle camera 105 may be one of the vehicle cameras 105 on the vehicle described with reference to Figures 1 to 3.
Based on this example of an embodiment, the device 300 has -18 -a reading unit 500, a determination unit 505, a comparison unit 510 and a setting unit 515.
The vehicle camera 105 based on this example of an embodiment is disposed on the side door 205 of the vehicle and/or in a region of a mirror motor of a mirror of the vehicle. The reading unit 500 is configured to implement an image acquisition by the vehicle camera 105, in other words to read at least the first camera image 305 and the second camera image 310 representing the images taken during the camera movement of the vehicle camera 105 when the vehicle is stationary. The determination unit 505 is configured to determine the flow vector 523 using at least the first camera image 305 and the second camera image 310 and, based on this example of an embodiment, also to run a plane estimation. The comparison unit 510 is configured to receive the predefined reference vector 525, which represents another flow vector or a flow image of the calibrated system held in a memory. Based on this example of an embodiment, the reference vector 525 was determined using the known travel path of the side door 205 or, based on the example of an alternative embodiment, of the mirror or tailgate. Based on an example of an alternative embodiment, the reference vector 525 may also be stored in the device 300. Furthermore, the comparison unit 510 is configured to compare the flow vector 523 with the reference vector 525. The setting unit 515 is configured to set the calibration parameter 315 for calibrating at least the vehicle camera 105. Based on this example of an embodiment, the setting unit 515 is configured to set the calibration parameter 315 using a comparison result of the comparison by the comparison unit 510. The setting unit 515 based on this example of an embodiment is configured to calibrate at least the vehicle camera 105 and/or at least another vehicle camera 105 of the vehicle by setting the calibration parameter 315; this procedure may also be described as an algorithmic correction. The at least one vehicle camera 105 is then extrinsically calibrated.
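The chain of units in Fig. 5 can be summarized as a small pipeline sketch. The function name and the callable-per-unit structure are illustrative assumptions about how the units 500, 505, 510 and 515 hand their results on to each other.

```python
def device_300(reading_unit, determination_unit, comparison_unit, setting_unit):
    """Chain the four units: read the two images (500), determine the
    flow vector (505), compare it with the reference vector (510), and
    set the calibration parameter from the comparison result (515)."""
    first_image, second_image = reading_unit()
    flow = determination_unit(first_image, second_image)
    comparison_result = comparison_unit(flow)
    return setting_unit(comparison_result)
```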
If, with respect to an example of an embodiment, an "and/or" link is made between a first feature and a second feature, this should be understood as meaning that the exemplary embodiment based on one embodiment has both the first feature and the second feature and based on another embodiment has either only the first feature or only the second feature.

Claims (12)

1. Method (400) for calibrating a vehicle camera (105) of a vehicle (100), the method (400) comprising at least the following steps: reading (405) a first camera image (305) and a second camera image (310) representing images taken by at least the vehicle camera (105) during a camera movement of the vehicle camera (105) whilst the vehicle (100) is stationary; and setting (410) a calibration parameter (315) for calibrating at least the vehicle camera (105) using the first camera image (305) and the second camera image (310).
2. Method (400) as claimed in claim 1, comprising a step (415) of determining a flow vector (523) using the first camera image (305) and the second camera image (310), and the calibration parameter (315) is set in the setting step (410) using the flow vector (523).
3. Method (400) as claimed in claim 2, comprising a step (420) of receiving a predefined reference vector (525) and a step (430) of comparing the flow vector (523) with the reference vector (525), and the calibration parameter (315) is set in the setting step (410) using a comparison result of the comparison step (430).
4. Method (400) as claimed in one of claims 2 to 3, wherein in the determination step (415), the flow vector (523) is determined using a point of the road surface imaged in the first camera image (305) and the point of the road surface imaged in the second camera image (310).
5. Method (400) as claimed in one of the preceding claims, comprising a step (425) of determining a travel path (200) representing the camera movement using a control signal of an actuator motor for effecting the camera movement.
6. Method (400) as claimed in one of the preceding claims, wherein the setting step (410) is implemented if a vehicle motion signal is read which indicates that the vehicle (100) is stationary.
7. Method (400) as claimed in one of the preceding claims, wherein in the reading step (405), camera images (305, 310) representing images taken by the vehicle camera (105) disposed in the region of an exterior mirror and/or a vehicle door (205; 520) and/or a tailgate of the vehicle (100) are read.
8. Method (400) as claimed in one of the preceding claims, wherein in the reading step (405), the camera movement represents a travel path (200) of at least the vehicle camera (105) caused by an exterior mirror of the vehicle (100) being folded in and/or folded out and/or by a vehicle door (205; 520) being opened and/or closed and/or by a tailgate of the vehicle (100) being opened and/or closed.
9. Method (400) as claimed in one of the preceding claims, comprising a step (435) of additionally receiving an inclination signal from an interface to an inclination sensor of the vehicle (100), and the setting step (410) is not implemented if the inclination signal indicates an inclination of the vehicle (100).
10. Device (300) which is configured to implement and/or control steps of the method (400) as claimed in one of the preceding claims in corresponding units (500, 505, 510, 515).
11. Computer programme which is configured to implement the method (400) as claimed in one of claims 1 to 9.
12. Machine-readable memory medium on which the computer programme as claimed in claim 11 is stored.
GB2206240.0A 2017-04-12 2018-04-11 Method and device for calibrating a vehicle camera of a vehicle Active GB2603731B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017206295.9A DE102017206295B3 (en) 2017-04-12 2017-04-12 Method, apparatus and computer program for calibrating a vehicle camera of a vehicle
GB1806014.5A GB2562898A (en) 2017-04-12 2018-04-11 Method and device for calibrating a vehicle camera of a vehicle

Publications (3)

Publication Number Publication Date
GB202206240D0 GB202206240D0 (en) 2022-06-15
GB2603731A true GB2603731A (en) 2022-08-10
GB2603731B GB2603731B (en) 2022-11-23

Family

ID=60890546

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1806014.5A Withdrawn GB2562898A (en) 2017-04-12 2018-04-11 Method and device for calibrating a vehicle camera of a vehicle
GB2206240.0A Active GB2603731B (en) 2017-04-12 2018-04-11 Method and device for calibrating a vehicle camera of a vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB1806014.5A Withdrawn GB2562898A (en) 2017-04-12 2018-04-11 Method and device for calibrating a vehicle camera of a vehicle

Country Status (5)

Country Link
US (1) US10438374B2 (en)
JP (1) JP7096053B2 (en)
CN (1) CN108696719B (en)
DE (1) DE102017206295B3 (en)
GB (2) GB2562898A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3644279A1 (en) * 2018-10-25 2020-04-29 Continental Automotive GmbH Static camera calibration using motion of vehicle portion
DE102018219666B4 (en) * 2018-11-16 2020-12-24 Zf Friedrichshafen Ag Image processing method and camera system for generating a vehicle visibility
KR102177878B1 (en) * 2019-03-11 2020-11-12 현대모비스 주식회사 Apparatus and method for processing image
CN111223150A (en) * 2020-01-15 2020-06-02 电子科技大学 Vehicle-mounted camera external parameter calibration method based on double vanishing points
CN112706755B (en) * 2021-01-27 2022-08-16 广州小鹏自动驾驶科技有限公司 Vehicle-mounted camera adjusting method and device
JP2022123419A (en) * 2021-02-12 2022-08-24 本田技研工業株式会社 Information recording device, information recording method, and program
US11405559B1 (en) 2021-02-19 2022-08-02 Honda Motor Co., Ltd. Systems and methods for live signal adjustment of a movable camera

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0770529A2 (en) * 1995-10-25 1997-05-02 Toyota Jidosha Kabushiki Kaisha Device for estimating side slide velocity of vehicle compatible with rolling and cant
JP2008094375A (en) * 2006-09-14 2008-04-24 Toyota Central R&D Labs Inc Vehicle physical quantity estimating apparatus and program
US20090179773A1 (en) * 2005-10-28 2009-07-16 Hi-Key Limited Method and apparatus for calibrating an image capturing device, and a method and apparatus for outputting image frames from sequentially captured image frames with compensation for image capture device offset
US20100201814A1 (en) * 2009-02-06 2010-08-12 Gm Global Technology Operations, Inc. Camera auto-calibration by horizon estimation
US20100253784A1 (en) * 2009-04-06 2010-10-07 Samsung Electro-Mechanics Co., Ltd. Calibration method and apparatus for automotive camera system, and method and ecu for determining angular misalignments of automotive camera system
WO2012139636A1 (en) * 2011-04-13 2012-10-18 Connaught Electronics Limited Online vehicle camera calibration based on road surface texture tracking and geometric properties
EP2665037A1 (en) * 2012-05-15 2013-11-20 Toshiba Alpine Automotive Technology Corporation Onboard camera automatic calibration apparatus
US20150145965A1 (en) * 2013-11-26 2015-05-28 Mobileye Vision Technologies Ltd. Stereo auto-calibration from structure-from-motion

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008059551B4 (en) * 2008-11-28 2021-08-12 Car.Software Estonia As Method for determining the change in position of a camera system and device for capturing and processing images
KR101557678B1 (en) * 2009-04-22 2015-10-19 삼성전자주식회사 Apparatus and method for calibration of portable terminal
WO2013019707A1 (en) * 2011-08-01 2013-02-07 Magna Electronics Inc. Vehicle camera alignment system
JP5923422B2 (en) * 2012-09-24 2016-05-24 クラリオン株式会社 Camera calibration method and apparatus
JP6177006B2 (en) * 2013-05-24 2017-08-09 京セラ株式会社 Camera calibration apparatus and camera calibration method
CN103824346B (en) * 2014-02-17 2016-04-13 深圳市宇恒互动科技开发有限公司 Driving recording and replay method and system
JP2016001378A (en) * 2014-06-11 2016-01-07 株式会社デンソー Calibration device of on-vehicle camera
DE102014117888A1 (en) 2014-12-04 2016-10-13 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
JP6488749B2 (en) 2015-02-13 2019-03-27 株式会社デンソー Camera calibration device
KR102366402B1 (en) * 2015-05-21 2022-02-22 엘지전자 주식회사 Driver assistance apparatus and control method for the same
US10443287B2 (en) * 2015-07-29 2019-10-15 Ford Global Technologies, Llc Door position sensor and system for a vehicle

Also Published As

Publication number Publication date
JP2019003613A (en) 2019-01-10
GB2562898A (en) 2018-11-28
DE102017206295B3 (en) 2018-01-25
CN108696719A (en) 2018-10-23
US10438374B2 (en) 2019-10-08
US20180300899A1 (en) 2018-10-18
GB202206240D0 (en) 2022-06-15
JP7096053B2 (en) 2022-07-05
CN108696719B (en) 2022-03-01
GB201806014D0 (en) 2018-05-23
GB2603731B (en) 2022-11-23

Similar Documents

Publication Publication Date Title
GB2603731A (en) Method and device for calibrating a vehicle camera of a vehicle
US10919458B2 (en) Method and system for calibrating vehicular cameras
CN108141570B (en) Calibration device, calibration method, and calibration program storage medium
US8018488B2 (en) Vehicle-periphery image generating apparatus and method of switching images
US11377029B2 (en) Vehicular trailering assist system with trailer state estimation
US7277123B1 (en) Driving-operation assist and recording medium
JP2001227982A (en) Calibration method for sensor system
JP2002259995A (en) Position detector
CN109421453A (en) Trailer with prediction mounting angle function falls back auxiliary system
CN105432074A (en) Camera system for a vehicle, and method and device for controlling an image region of an image of a vehicle camera for a vehicle
CN107826168A (en) For distance-finding method, control device and the motor vehicle of the position for asking for motor vehicle
US11145112B2 (en) Method and vehicle control system for producing images of a surroundings model, and corresponding vehicle
CN102436758B (en) Method and apparatus for supporting parking process of vehicle
CN112572460A (en) Method and apparatus for estimating yaw rate with high accuracy, and storage medium
EP3924866A1 (en) Systems and methods for image normalization
WO2021140864A1 (en) System for detecting orientation/position of detector and method for detecting orientation/position of detector
US10846884B2 (en) Camera calibration device
CN110497851B (en) Method for providing a defined image of a vehicle environment of a motor vehicle, camera system and motor vehicle
JP6664491B2 (en) Method, electronic control and system for position determination
JP2019135620A (en) Traveling support device
JP7066378B2 (en) Image processing device and image processing method
US11321871B2 (en) Method for calibrating a detection system
JP2020145581A (en) Display control unit, display system, and display control method