GB2462071A - Method for determining the separation distance between automotive vehicles - Google Patents


Info

Publication number
GB2462071A
GB2462071A (application GB0813246A)
Authority
GB
United Kingdom
Prior art keywords
image
registration plate
character
distance
characters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0813246A
Other versions
GB0813246D0 (en)
Inventor
George Ferrie
Andrzej Ordys
Kin Yip Chan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
INNOVATIVE VEHICLE SYSTEMS Ltd
Original Assignee
INNOVATIVE VEHICLE SYSTEMS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by INNOVATIVE VEHICLE SYSTEMS Ltd filed Critical INNOVATIVE VEHICLE SYSTEMS Ltd
Priority to GB0813246A priority Critical patent/GB2462071A/en
Publication of GB0813246D0 publication Critical patent/GB0813246D0/en
Priority to EP09784743A priority patent/EP2318981A1/en
Priority to PCT/GB2009/001790 priority patent/WO2010007392A1/en
Publication of GB2462071A publication Critical patent/GB2462071A/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/16Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K31/00Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K31/0008Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • G06K9/20
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625License plates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A method is provided for measuring the distance from a host vehicle to a target vehicle using a distance measuring system provided on the host vehicle. The method comprises capturing an image; using image processing to detect a registration plate in the image thereby identifying the target vehicle, to detect a character on the registration plate, and to measure the height of the character in the image; and determining the distance to the target vehicle by using the measured height of the character. The distance to the target vehicle may be determined by means of a look-up table relating character height to distance. The method may utilize optical character recognition to determine the identity of a detected character. The method may be used in collision avoidance systems.

Description

DETERMINING SEPARATION BETWEEN AUTOMOTIVE VEHICLES
FIELD OF THE INVENTION
The present invention relates to measuring the distance from one automotive vehicle to another automotive vehicle.
In particular, the present invention relates to using optical methods to allow a moving vehicle to determine the distance to another vehicle. Such a method may be used in collision avoidance systems.
BACKGROUND TO THE INVENTION
Automotive vehicles such as cars are becoming increasingly sophisticated, and commonly feature a variety of driver aids. For example, anti-lock braking systems, cruise control and parking assist devices are all commonly provided in modern vehicles. More recent developments include systems that maintain a minimum separation to the vehicle in front, and systems that assist in lane changes (either to warn of an inadvertent lane change, or to warn of a vehicle in the blind spot when attempting a lane change). Many of these systems require a determination of the distance from the host vehicle to a nearby vehicle.
Traditionally, radar-based systems have been used on vehicles to determine the distance to another vehicle. Such systems most often use a time-of-flight measurement. As such, they work well at larger separations, but are less accurate when the time-of-flight is short. This creates an inherent problem: the accuracy is at its worst at short separations when vehicles are close to one another, which is exactly when accuracy is most important.
Optical systems are known for measuring the separation of one vehicle from another. Such systems may measure a characteristic dimension of the car and compare this measured dimension to a look-up table or similar to determine the distance to the vehicle. For example, a digital imaging system may be used, with the height of the vehicle in pixels used to deduce the distance to the vehicle: the measured height is compared to the range of pixel heights an average car would occupy at different distances. If the height is 20 pixels, this may correspond to an expected distance of 75 m, while a height of 50 pixels may correspond to 40 m, and so on. Obviously, such a system is limited in its accuracy as it relies on an average value for the height, or for whichever other characteristic dimension is used.
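The look-up approach described here can be sketched as a small calibration table with linear interpolation between entries. The (pixels, metres) pairs below reuse the illustrative figures from the text (20 pixels at 75 m, 50 pixels at 40 m) plus two assumed intermediate points; a real system would calibrate such a table for its own camera and optics.

```python
# Hypothetical calibration table: pixel height of the measured feature
# versus expected distance in metres. Values are illustrative only.
LOOKUP = [(20, 75.0), (30, 60.0), (40, 48.0), (50, 40.0)]

def distance_from_height(px: float) -> float:
    """Interpolate an expected distance from a measured pixel height,
    clamping to the table's end points outside its range."""
    if px <= LOOKUP[0][0]:
        return LOOKUP[0][1]
    if px >= LOOKUP[-1][0]:
        return LOOKUP[-1][1]
    for (p0, d0), (p1, d1) in zip(LOOKUP, LOOKUP[1:]):
        if p0 <= px <= p1:
            t = (px - p0) / (p1 - p0)
            return d0 + t * (d1 - d0)
```

A denser table, or a fitted curve, trades memory against interpolation error in the same way.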
Some systems offer an improvement by determining the size of a registration plate. This is more accurate because the plates tend to be of a standard size, or at least show a limited variation in size. However, such systems are more complex as it is more difficult to identify a registration plate and determine its small size than it is to identify a vehicle and measure its size.
The present invention seeks to improve on such systems that use optical image processing of registration plates to determine the distance from the host vehicle to another vehicle.
SUMMARY OF THE INVENTION
Against this background, and from a first aspect, the present invention resides in a method of measuring the distance from a host vehicle to a target vehicle using a distance measuring system provided on the host vehicle. The method comprises capturing an image and using image processing. Image processing is used to detect a registration plate in the image thereby identifying the target vehicle, to detect a character on the registration plate, and to measure the height in the image of the character. Then the distance to the target vehicle is determined by using the measured height of the character.
This method may be repeated, i.e. frames may be successively captured and the distance to the vehicle computed time and time again.
The vehicles may be of any type commonly provided with registration plates. For instance, the vehicles may be automotive vehicles such as cars, vans, lorries, trucks, buses, coaches, MPVs, SUVs, etc. The host vehicle may even be of another type, such as a tram. A tram is a good example as it shares roads with other traffic provided with registration plates.
The present invention detects one, or possibly more, characters on the registration plate. This may be achieved in a number of ways, e.g. by comparing the height and/or width of the character to the size of the registration plate. This comparison may then be used to determine whether the relative sizes match known registration plate formats. Alternatively, a number of candidate characters may have their heights and/or widths compared: the candidates having approximately the same height may be determined to be characters as the actual characters are likely to have very similar heights. Once a character has been determined, the measured height of that character within the image may be used to determine the distance to the vehicle more accurately. Whilst some variation in registration plate size is possible, the size of the characters is generally well defined and prescribed by national law.
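The "approximately the same height" test for candidate characters described above might be realised by keeping only candidates whose heights cluster around the group median. This is an illustrative sketch rather than the patented method; the 10% tolerance is an assumption.

```python
from statistics import median

def filter_character_candidates(heights, tolerance=0.1):
    """Keep candidate marks whose height lies within `tolerance`
    (as a fraction) of the median candidate height. Genuine plate
    characters share almost the same height, so stray marks such as
    screws or badges tend to be discarded."""
    m = median(heights)
    return [h for h in heights if abs(h - m) <= tolerance * m]
```

With a candidate list of `[30, 31, 29, 30, 55]` pixel heights, the 55-pixel outlier is rejected while the plausible characters survive.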
While all that is needed is to be able to detect that a mark on a registration plate corresponds to a character, the identity of that character may also be determined. For example, optical character recognition may be used to identify a particular character. Using character recognition allows further advantages and other optional and beneficial steps to be used, as will be described below.
Preferably, the method comprises detecting more than one character on the registration plate, measuring the height in the image of each character detected, calculating an average height of the characters detected, and determining the distance to the target vehicle from the average height.
Using more than a single character is beneficial, as it means an average value can be taken, which increases accuracy. For example, the prior art system described above that measures the size of the registration plate has only that one measurement to rely upon: if this value is inaccurately measured, the distance will also be inaccurate. The present invention offers greater robustness. Potentially, an average of many characters is possible to dilute the effect of a single inaccurate measurement. For example, most registration plates have at least three or four characters, and more usually seven or eight characters. Also, it is possible to remove inaccurate measurements, e.g. by checking each measurement's deviation from the average value and discarding any outside a tolerance, or even simply by discarding the highest and/or lowest values.
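The averaging-with-outlier-rejection idea in this paragraph can be sketched as a trimmed mean. The choice to trim only when five or more measurements are available is an assumption for illustration, informed by the text's note that most plates carry seven or eight characters.

```python
def robust_average_height(heights):
    """Average the measured character heights, discarding the single
    highest and lowest values when enough samples are available so
    that one bad measurement cannot skew the result."""
    hs = sorted(heights)
    if len(hs) >= 5:
        hs = hs[1:-1]  # drop the extremes
    return sum(hs) / len(hs)
```

The robust average would then be fed into the character-height-to-distance look-up in place of a single raw measurement.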
Optionally, the method may comprise using optical character recognition to determine the identity of the detected character during first and second iterations, and comparing the determined identities to see if they match.
This may be developed by detecting more than one character on the registration plate, and then using optical character recognition to determine the identity of all the detected characters during first and second iterations, and comparing the determined identities to see if they match. Moreover, the method may comprise identifying the total number of characters on the registration plate, using optical character recognition to determine the identity of at least some of the characters, and comparing the identities of only a subset of the characters present on the registration plate to determine if they match.
These optional steps are beneficial as they allow further information to be obtained through using optical character recognition and identification. For example, a registration plate may be checked between one frame and the next. Confirmation that it is the same registration plate may then be performed, or an indication that another vehicle has been detected can be provided. This has useful application. Consider a host car following a target car. A first reading indicates the target car to be 100 m ahead.
The next reading indicates the car to be only 30 m ahead.
The prior art system cannot distinguish between the more dangerous situation of the target car ahead braking sharply or the less dangerous situation where another car has moved into the lane immediately ahead of the host car. Using the character identification described above allows a distinction to be made between these situations.
While an identification of all the characters in the registration plate may be attempted, it is preferred that only a subset are identified. This provides better resilience against failure to detect any particular character. As an example, the method may comprise identifying two or three characters. The method may scan the registration plate, e.g. left to right, and analyse each character in turn until two or three characters are identified. These characters may be looked for in subsequent frames. The presence of either all or some of the characters may be used to confirm it is the same registration plate. Say three characters are identified initially; the presence of any two characters from these three is then used to confirm it is the same registration plate.
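The "any two of three" confirmation rule above can be sketched as a simple set-overlap check. The function name and the threshold parameter are illustrative, not taken from the patent claims.

```python
def same_plate(stored, observed, required=2):
    """Confirm that the plate in the current frame matches the one
    tracked previously: `stored` holds the characters identified in
    an earlier frame (e.g. three of them), `observed` holds those
    recognised now. At least `required` stored characters must
    reappear for the plates to be treated as the same."""
    matches = sum(1 for c in stored if c in observed)
    return matches >= required
```

A failed match would signal that a different vehicle has moved into view, so the distance history should be restarted rather than interpreted as sharp braking.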
Again this improves robustness against occasional errors leading to a failure to identify any particular character.
Obviously, the number of characters chosen can be varied.
The fewer characters chosen, the more prone the system is to making false matches, e.g. fewer cars have both the letters A and G in their registration plates than cars having a B in their registration plate. Conversely, increasing the number of characters identified increases the computational burden.
The method may include detecting more than one registration plate in the image. The method may then include using all detected registration plates, thereby determining the distance to each registration plate (and the associated target vehicle). Alternatively, only a single or subset of registration plates may be selected for further analysis. Selecting one or more candidate registration plates may be based on any number of criteria. For example, only the closest registration plate may be of interest, in which case a simple selection of the largest registration plate image may be made.
In some situations, the location of the registration plate may be important, such as straight ahead or to the left/right or to the rear. A corresponding area in the field of view of the camera may be selected. For example, the driver activating the left indicator may be used by the vehicle to enter a lane-change mode in which it primarily looks for vehicles to the left, in which case this portion of the captured image may be scanned for registration plates.
Preferably, the method comprises, in a first iteration, determining the location of the registration plate in the image, and defining an area of interest around the location of the registration plate. In a second iteration, image processing of just the region of interest may be used to detect a registration plate in that region. Optionally, if a registration plate is not found in the region of interest, the next iteration may see image processing of the full image to detect a registration plate in that image.
Furthermore, the method may further comprise determining the location of the registration plate in the image in the second iteration, comparing the change in location between the first and second iteration, and determining whether to use image processing of the region of interest or the full image dependent upon that comparison.
Determining the position of an identified registration plate within a captured image has large benefits for handling subsequent frames. It can be anticipated that the registration plate will appear in much the same position in the next image. Thus, image processing may be performed for only a smaller portion of the subsequent image corresponding to the position of the registration plate in the previous image. This allows faster processing and/or more detailed processing without a time penalty. The size and position of the registration plate within the whole image may be recalculated for each frame, such that the registration plate may be tracked as it changes in size and/or moves within the whole image.
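The region-of-interest idea above can be sketched as expanding the plate's previous bounding box by a margin and clipping to the image bounds; only this region is scanned in the next frame. The 50% margin is an assumed value for illustration.

```python
def search_region(prev_bbox, image_size, margin=0.5):
    """Given the plate's bounding box (x, y, w, h) from the previous
    frame and the (width, height) of the full image, return an
    expanded region (x0, y0, x1, y1) within which the plate is
    expected to appear in the next frame."""
    x, y, w, h = prev_bbox
    W, H = image_size
    mx, my = int(w * margin), int(h * margin)
    return (max(0, x - mx), max(0, y - my),
            min(W, x + w + mx), min(H, y + h + my))
```

Recomputing the box each frame lets the tracker follow the plate as it grows, shrinks or drifts within the image, falling back to a full-image scan if the plate is lost.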
Further constraints may be used. For example, if a registration plate moves across the whole image to be outside a predefined zone, this may trigger a new full scan to be performed (i.e. a whole image may be checked for registration plates, rather than the cropped image from the previous image). This feature may be used to protect against a target vehicle changing lanes. Should a vehicle change lanes, focus should be switched to the vehicle ahead in the same lane as the host vehicle. Thus a zone may be created to indicate when a target vehicle is no longer in the same lane as the host vehicle, and hence to prompt a search for the next vehicle now in the same lane as the host vehicle.
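The zone test described here reduces to a point-in-rectangle check on the tracked plate's position; leaving the zone triggers a fresh full-image search. The rectangular zone shape is an assumption for illustration.

```python
def needs_full_scan(plate_centre, lane_zone):
    """Return True when the tracked plate's centre has drifted outside
    the predefined zone (e.g. the host vehicle's own lane), prompting
    a full-image search for the next vehicle now in that lane.
    `lane_zone` is (x0, y0, x1, y1) in image coordinates."""
    (x, y), (x0, y0, x1, y1) = plate_centre, lane_zone
    return not (x0 <= x <= x1 and y0 <= y <= y1)
```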
Storing the history of distances determined is beneficial for many reasons. It allows predictions to be made about the behaviour of vehicles, e.g. to indicate that a vehicle ahead is braking hard, or that a vehicle behind is approaching fast. Where the location of the registration plate in the image is determined during iterations, using historical information about location may provide other useful information. For example, the left/right position of a registration plate in a series of images can provide lane change information, e.g. that a vehicle is moving into the lane ahead of you and that you are rapidly approaching that vehicle.
DESCRIPTION OF THE DRAWINGS
In order to provide a better understanding of the present invention, preferred embodiments will now be described, by way of example only, with reference to the following drawings, in which:
Figure 1 is a plan view of a host vehicle provided with a distance measurement system according to an embodiment of the present invention driving along a three-lane motorway;
Figure 2 is a cartoon representation of a full image that may be captured by a distance measurement system like that shown in Figure 1;
Figures 3a to 3c show detail images of a registration plate formed by the distance measurement system of Figure 1 during image processing;
Figure 4a is a schematic diagram of the functional blocks of the distance measurement system of Figure 1, and Figure 4b shows the associated method steps performed by the distance measurement system;
Figure 5 shows the method steps performed by the distance measurement system, with further detail shown for the create image step;
Figure 6 shows the method steps performed by the distance measurement system, with further detail shown for the process image step;
Figure 7 shows the method steps performed by the distance measurement system, with further detail shown for the find registration plate step;
Figure 8 shows the method steps performed by the distance measurement system, with further detail shown for the optical character recognition step;
Figure 9 shows the method steps performed by the distance measurement system, with further detail shown for the provide output step;
Figure 10 shows the method steps performed by the distance measurement system, with further detail shown for the tracking step; and
Figure 11 shows the method steps performed by the distance measurement system, with further detail shown for a second embodiment of the tracking step.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Figure 1 shows in plan view a car 20 fitted with a distance measurement system according to a first embodiment of the present invention. The distance measurement system is shown schematically at 22. Although a car 20 is shown, it will be appreciated that the distance measurement system 22 may be used on any type of vehicle, such as a van, lorry, bus, coach or motorbike. The distance measurement system 22 may even be used with rail vehicles such as trains and trams.
The distance measurement system 22 uses optical techniques to determine the distance to other traffic 24 sharing the road 26 with the host car 20, in front and/or behind the host car 20.
The distance measurement system 22 includes a camera 28 for capturing images from the host car 20. The camera 28 may be positioned at different locations around the host car 20, and any car 20 may be provided with more than one camera 28.
Figure 1 shows contemplated locations for cameras 28, along with the field of view 30 that may be seen by a camera 28 placed at each location.
A camera 28 may be placed centrally on the front of the car 20, as indicated at 28a. Such a camera 28 would provide good surveillance of the area immediately in front of the car 20, as shown at 30a. Consequently, such a camera 28 would find useful application in monitoring the distance to a car 24 being driven ahead of the host car 20, such as car 24a. For example, the camera 28 may be used to monitor the distance to the car 24a ahead of and in the same lane as the host car 20. Such a function is used in adaptive cruise control systems that maintain a minimum separation from the car in front as well as a set speed. Such a camera location 28a may also provide good visibility of traffic in adjacent lanes, and so may be used with a lane-change assist system.
An alternative location for a camera 28 that is particularly suited to use with lane-change assist systems is on the wing mirror, as shown at 28b. Such a camera location 28b may provide good forwards 30b or backwards 30c visibility of the lane next to the wing mirror. For example, the presence of car 24c can be detected.
A further camera location is shown at 28d, namely a central location at the rear of the car 20. Such a location 28d gives good rearward visibility 30d and may be used to provide warning of traffic close behind or rapidly approaching the host car 20.
Figure 2 shows a typical image 32 that may be taken with a camera 28 provided centrally on the front of a host car 20, like that indicated at 28a in Figure 1. Figure 2 shows that the image 32 taken by the camera 28 includes a car 24a, van 24b, along with other extraneous detail like trees 34, buildings 36, etc. According to this embodiment of the present invention, the distance measurement system 22 obtains the image 32 shown in Figure 2 and uses the registration plate 38 within the image 32 to determine the distance to the car 24a.
As will be described in more detail below, the registration plate 38 must be identified within the full image 32 shown in Figure 2. A detail 40 from the full image 32 may then be taken that corresponds to the registration plate 38, as shown in Figures 3a to 3c. This detail image 40 is then analysed to determine the characters 42 appearing in the registration plate 38. The heights of the characters 42 determined within the detail 40 are then measured and averaged, and the average value is used to determine the distance to the car 24a.
The functional parts of the distance measurement system 22 are shown schematically in Figure 4a. The distance measurement system 22 essentially comprises three units: a scanning unit 52, an image processing unit 54 and an output unit 56.
The scanning unit 52 comprises a camera 28, like those shown in Figure 1 and described above. The camera 28 may be any common type, although digital systems comprising sensors with discrete pixels are preferred, such as CCD or CMOS devices. Preferably, the scanning unit 52 performs progressive scans rather than interlaced scans.
The images 32 captured by the scanning unit 52 are transmitted to the image processing unit 54, for example using a Firewire (RTM), USB (universal serial bus) or similar connection. The image processing unit 54 may be implemented in many different ways, although the following description of how the image processing unit 54 works will make it clear that a suitable implementation is as software installed and run on a microprocessor.
The output unit 56 provides an output of the results of the distance measurement system 22 in a form that is suitable for its application. The output may merely be a signal indicative of the distance to the car in front 24a for use by another component. For example, such a signal may be passed to an adaptive cruise control processor that uses the signal to determine whether the separation has dropped below a set threshold, and so take appropriate action where required. Alternatively, the output unit 56 may present the results of the distance measurement system 22 to a user in a form that is suitable for its application. For example, an alarm may be presented to indicate the separation to the car in front 24a has dropped below a threshold. The alarm may be any of or any combination of an audible alarm, a visual alarm or a tactile alarm (e.g. vibration of the steering wheel).
Figure 4b complements Figure 4a and shows the basic method steps performed by the distance measurement system 22. First, at 200, the scanning unit 52 creates an image 32 by capturing the view with the scanning unit 52.
This image 32 is passed to the image processing unit 54 where, at 400, the image processing unit 54 processes the image 32, as has been described briefly above and as will be described more fully below. Once the image processing unit 54 has determined the distance to the target car 24a, this information is provided by the output unit 56 at 600.
The method then loops back to 200 via path 800 where a further image 32 is captured, and so on. Typically, the scanning unit 52 and the distance measurement system 22 as a whole work at a rate of 10 frames per second. Although Figure 4b suggests that the whole method is performed before the next frame 32 is captured, forms of parallel processing may be used. For example, buffers may be used to store information such that the three steps may be performed concurrently. In such an arrangement, at any instance, the scanning unit 52 may be acquiring the current frame 32, the image processing unit 54 may be processing the previous frame 32, and the output unit 56 may be outputting the result obtained from two frames 32 ago.
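The staggered, buffered operation described above can be modelled in software as a three-stage pipeline in which each tick advances all stages on successive frames: while frame N is captured, frame N-1 is processed and the result for frame N-2 is output. The stage functions here are placeholders, not the patent's actual capture, processing and output routines.

```python
def run_pipeline(frames, capture, process, output):
    """Model of a three-stage pipeline with one buffer between each
    pair of stages. Each loop iteration is one frame period; the two
    trailing None frames flush the results still in the buffers."""
    cap_buf = proc_buf = None
    results = []
    for f in list(frames) + [None, None]:
        if proc_buf is not None:
            results.append(output(proc_buf))      # result from 2 frames ago
        proc_buf = process(cap_buf) if cap_buf is not None else None
        cap_buf = capture(f) if f is not None else None
    return results
```

At 10 frames per second this pipelining triples the usable processing budget per frame at the cost of two frames (0.2 s) of latency in the output.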
In Figure 5, the steps performed by the scanning unit 52 are shown in more detail. At 210, the scanning unit 52 captures a full image 32, for example by accumulating charge on a CCD array. At 220, the full image 32 is read and a signal representative of the full image 32 outputted. For example, the charge values acquired by the pixels of a CCD array may be read and stored, and then provided to the image processing unit 54.
Figure 6 shows the processing step 400 performed by the image processing unit 54 in more detail. As described above, for each frame 32 captured, the image processing unit 54 receives the full image 32 from the scanning unit 52. At a high level, the image processing unit 54 performs two functions. The first is registration plate localisation at 410, and the second is optical character recognition at 460.
Registration plate localisation 410 comprises identifying registration plates 38 within the full image 32 and, as appropriate, identifying one or more of the registration plates 38 as targets. For the sake of simplicity, the following description will be in the context of selecting only a single target registration plate 38, although it will be appreciated that the method may be repeated for as many target registration plates 38 as desired. With the target registration plate 38 identified, a detail 40 containing the plate 38 may be fixed and its location provided for the optical character recognition step at 460.
The optical character recognition step at 460 sees the detail image 40 analysed to locate all characters 42 displayed by the plate 38, as indicated by the boxes 44 in Figure 3c. The characters 42 themselves need not necessarily be identified as it may be enough merely to know that they are a character 42 (as may be done by size relative to the plate 38, for example). The average pixel height of the identified characters 42 may then be found and the distance determination made.
The registration plate localisation 410 and optical character recognition steps 460 will now be described in more detail with reference to Figures 7 and 8.
Registration plate localisation 410 starts at 412 where registration plate tracking functions are performed. These will be described in detail below. For the time being, it can be assumed that the full image 32 passes through this step 412 unchanged, and continues to step 414.
In this embodiment, the colour band filter 414 produces three colour band images: an RGB image, a YCrCb image and an HSV image, although any one of these images, or any combination of them, may be used. These images are then processed separately.
However, the images may be combined at any of the stages described below, with appropriate weighting if desired.
At 416, the RGB and YCrCb colour band images are each converted to greyscale to produce RGB and YCrCb greyscale images. The YCrCb and HSV colour images and the RGB and YCrCb greyscale images are passed through a noise filter to remove noise at 418. A Gaussian or median filter may be used for this step.
Edge detection is performed at 420. This is accomplished using thresholding of the RGB and YCrCb greyscale images, and by using a Canny edge detection algorithm for the YCrCb and HSV colour images. Sobel or Prewitt edge detection may be used as an alternative or complement to Canny edge detection. The results are four contoured images, i.e. each image contains contours indicating where edges are present. If no edges are found in one of the images, that image is discarded. If none of the images contain edges, the method returns to step 200, where a new full image 32 is captured.
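In OpenCV these two branches map onto cv2.threshold and cv2.Canny respectively. A dependency-free sketch of the thresholding branch, in which the threshold value and the list-of-lists image representation are illustrative assumptions:

```python
def binarise(grey, threshold=128):
    """Turn a greyscale image (rows of 0-255 pixel values) into a 0/1 mask.

    Pixels at or above the threshold become 1; edges then appear as the
    boundaries between runs of 0s and 1s.
    """
    return [[1 if px >= threshold else 0 for px in row] for row in grey]
```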
The four contour images (or however many were found to contain edges) are filtered at 422. This filtering sees all contours below a predetermined size discarded. For example, any contour less than 400 pixels long may be discarded. A contour image is itself discarded if no contours remain within it, and the method returns to step 200, where a new full image 32 is captured, if no contour image containing contours remains.
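The size filter at 422 can be expressed directly. Contours are taken here as lists of boundary pixels, and 400 is the example threshold quoted above:

```python
def filter_contours(contours, min_length=400):
    """Discard contours shorter than min_length pixels; return the survivors."""
    return [c for c in contours if len(c) >= min_length]
```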
Assuming all four filtered contour images remain, the contours within each contour image are analysed to determine whether they may correspond to a registration plate 38.
This is done in two steps. At 424, the contours are tested to see if they are rectangular. This may be achieved by testing to see if the contour contains four corners, each of around 90 degrees. All contours not meeting these requirements are discarded. In the second step at 426, the remaining contours are tested to see whether they meet expected aspect ratio requirements. The aspect ratios are preset to correspond to registration plate sizes commonly found within any particular territory. For example, in the UK, two ranges of 0.2-0.4 and 0.6-0.8 may be used. Again, any contours not meeting this requirement are discarded. At each of the two stages 424 and 426, any filtered contour images no longer containing any contours are discarded, and the method may return to step 200 when no filtered contour images remain. If no registration plates 38 are found, a flag that indicates the presence of a registration plate 38 is set to negative at 428 (conversely, the flag is set to positive if a registration plate 38 is found).
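The aspect-ratio test at 426 reduces to checking the height-to-width ratio of a candidate's bounding box against the preset ranges. The UK ranges 0.2-0.4 and 0.6-0.8 are those quoted above (roughly matching single-row and two-row plates); the function name is illustrative:

```python
def plausible_plate_aspect(width, height,
                           ranges=((0.2, 0.4), (0.6, 0.8))):
    """True if height/width falls inside any accepted aspect-ratio range."""
    if width <= 0:
        return False
    ratio = height / width
    return any(lo <= ratio <= hi for lo, hi in ranges)
```

A standard 520 mm x 111 mm UK plate gives a ratio of about 0.21 and passes; a square contour (ratio 1.0) is rejected.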
At 430, the target registration plate 38 is selected.
As mentioned above, more than one plate 38 may be used as a target, so multiple target plates 38 may be selected here.
If all registration plates 38 are to be used, this step 430 would be unnecessary as no selection would be required.
Different requirements may be applied for locating the target registration plate 38. For example, the largest contour may be selected from each filtered contour image.
Alternatively, the contour most central with respect to left and right may be selected. Of course, the contour selected from each filtered contour image should be the same, although this is checked. Where a difference exists, this may be resolved by any number of voting schemes, e.g. majority voting, or by turning to a favoured contour image to decide, such as the contour image derived from the HSV colour band image.
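The resolution step just described can be sketched as a majority vote over the per-image selections, falling back to a favoured image's choice on a tie. The image names and contour identifiers are illustrative; the HSV image is assumed favoured as suggested above:

```python
from collections import Counter

def vote_for_target(selections, favoured_key="hsv"):
    """selections: mapping of image name -> selected contour identifier.

    Returns the majority choice; ties are broken by the favoured image.
    """
    counts = Counter(selections.values())
    top = counts.most_common()
    if len(top) == 1 or top[0][1] > top[1][1]:
        return top[0][0]          # clear majority
    return selections[favoured_key]  # tie: defer to the favoured image
```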
With the target registration plate 38 selected from each remaining filtered contour image, the area in the full image 32 occupied by the registration plate 38 is determined at 432 and a detail 40 of the registration plate is formed at 436. In this embodiment, a single location and size is determined and applied to create each detail 40. To do this, a filtered contour image is chosen, e.g. the HSV-derived contour image, and the co-ordinates of the target contour in this image are used to define the location and size of the detail 40 (e.g. the corner pixel locations may be used). The co-ordinates of the registration plate 38 are stored at 434 for later use during the registration plate tracking function indicated at 412. The four colour and greyscale images are cropped at 436 to leave just the registration plate detail 40 as shown in Figure 3a, and it is these detail images 40 that are passed to the optical character recognition step at 460.
Figure 8 shows the steps performed by the optical character recognition at 460 in more detail. As described above, the optical character recognition 460 receives detail images 40 formed from the colour and greyscale images that contain just the registration plate 38. The filtered contour images are not used during the cropping process as the filtering may remove contours corresponding to the characters 42 on the registration plate 38, because the contours may well fall below the size threshold.
These detail images are again passed through a noise filter at 462, as described with respect to step 418 in Figure 7. However, the smaller size of the detail images means that a finer filter may be used without too severe a time penalty. Each filtered image is then converted to a detail contour image at 464 using thresholding for the greyscale images and Canny edge detection for the colour images, as described previously for step 420. Figure 3b shows such a resulting contour image.
The detail contour images are then analysed at 466 to determine how many characters 42 are present. As the detail 40 contains only the registration plate 38, only features on the registration plate 38 give rise to contours 50. However, some contours 50 may arise from features other than characters, e.g. from screw heads, dirt 46 or other marks 48 on the registration plate 38.
Each contour 50, 50' is compared against stored templates of characters 42 at 468. Where a match is found, that contour 50 is determined to be a character 42. The identified character 42 is also stored at 470 for use in the registration plate tracking at 412.
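The template comparison at 468 can be approximated by a pixel-agreement score between a normalised binary contour image and each stored character template. Everything here is an illustrative assumption: the grid representation, the 0.8 acceptance threshold, and the tiny templates:

```python
def match_character(candidate, templates, threshold=0.8):
    """candidate: binary grid; templates: dict of char -> same-sized binary grid.

    Returns the best-matching character, or None if no template scores
    at least the threshold (the contour is then not a character).
    """
    def agreement(a, b):
        cells = [(x, y) for row_a, row_b in zip(a, b)
                 for x, y in zip(row_a, row_b)]
        return sum(x == y for x, y in cells) / len(cells)

    best_char, best_score = None, 0.0
    for char, tpl in templates.items():
        score = agreement(candidate, tpl)
        if score > best_score:
            best_char, best_score = char, score
    return best_char if best_score >= threshold else None
```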
Alternatively, contours may be determined to be characters without actually identifying the particular character. For example, a contour may have its size compared to the size of the registration plate to see if the ratio is as expected. Another method is to compare the size of candidate characters and then to determine the characters from those candidates that show good correlation.
The height of the character 42 is then determined at 472. This is repeated for all the contours at 474 until all characters 42 have been identified and the height of each determined.
At 476, the average of the heights of the characters 42 is calculated. It will be remembered that the above steps are performed for each contour image derived from each of the colour and greyscale images. The average character height from each image is then used to obtain an overall average: this overall average may be formed as a weighted combination of the averages from each image to introduce a greater reliance on the images that most often produce the best results (the HSV image, for example).

At 478, this average height is used to determine the distance to the target plate 38. This is performed by consulting a look-up table and reading an appropriate distance recorded against an appropriate pixel size. In most cases, the average value will not correspond to a pixel size recorded in the look-up table. Many common methods may be used to overcome this. The nearest available pixel size recorded in the table may be used, and the corresponding distance read off. Alternatively, the distances corresponding to the adjacent pixel sizes recorded in the table may be used to arrive at an interpolated distance.
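The interpolation option at 478 can be sketched with a small calibration table. The pixel-height/distance pairs below are hypothetical (a real table would come from calibrating the camera); linear interpolation between the adjacent entries follows the alternative described above:

```python
from bisect import bisect_left

# hypothetical calibration: average character height (px) -> distance (m)
CALIBRATION = [(8, 25.0), (10, 20.0), (16, 12.5), (20, 10.0), (40, 5.0)]

def distance_from_table(height_px, table=CALIBRATION):
    """Linearly interpolate distance; clamp outside the calibrated range."""
    heights = [h for h, _ in table]
    i = bisect_left(heights, height_px)
    if i == 0:
        return table[0][1]       # taller than calibrated: nearest (closest) entry
    if i == len(table):
        return table[-1][1]      # shorter than calibrated: nearest (farthest) entry
    (h0, d0), (h1, d1) = table[i - 1], table[i]
    return d0 + (d1 - d0) * (height_px - h0) / (h1 - h0)
```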
The distance, however determined, is then provided to the output unit 56.
Figure 9 shows a general representation of the steps performed by the output unit 56. In general, the output unit 56 will process the distance value it receives at 610, and will form a suitable output at 620. The precise form these two steps will take will depend upon the application to which the distance measurement system 22 is being applied.
For example, the distance value processing at 610 may comprise a comparison against a threshold value. Then, the output formed at 620 may take one of two values depending upon the comparison. Suppose that the distance measurement system 22 is used to monitor whether or not the separation to the vehicle in front 24a drops below a safe minimum. The safe minimum may vary so as to be speed dependent. If the measured distance value is found to be above the safe minimum, no action need be taken. If, on the other hand, the separation falls below the safe minimum, the output may take the form of an alarm such as a warning light, an audible alarm or a tactile alarm to alert the driver to the fact that greater separation should be achieved. A signal may also be generated when the separation falls below the safe threshold to control the speed of the host car 20 automatically, by reducing the throttle and/or by applying the brakes.
Of course, more than a single threshold may be used, and information may be provided to the driver and the speed of the car 20 controlled in response to the comparisons made. For example, a green warning light may indicate that the separation is fine, an amber warning light may indicate that the separation has fallen into a middle band where the driver is expected to take remedial action, and a red warning light may indicate that the separation has fallen below a safe minimum and that the host car is being slowed automatically.
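The multi-threshold scheme just described amounts to classifying the measured distance into bands whose limits may scale with speed. In this sketch the safe minimum is assumed to follow a two-second rule and the amber band adds a 50% margin, both purely illustrative choices not taken from the specification:

```python
def separation_band(distance_m, speed_mps):
    """Classify the separation: 'green' (fine), 'amber' (remedial action
    expected from the driver), 'red' (below the safe minimum; may also
    trigger automatic slowing of the host car)."""
    safe_minimum = 2.0 * speed_mps        # assumed two-second rule
    if distance_m < safe_minimum:
        return "red"
    if distance_m < 1.5 * safe_minimum:   # assumed amber margin
        return "amber"
    return "green"
```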
With the output determined at 620, the method returns to step 200 via loop 800 where the next image 32 is captured, and the method repeats the loop described above.
As mentioned above, the distance measurement system 22 operates at a rate of 10 frames per second. Thus, each iteration of the method should take 0.1 s at most to complete. In the alternative described above where parallel processing is employed, each parallel step may take up to 0.1 s.

Registration plate tracking 412 has been mentioned above. This step 412 may influence how the image 32 is processed by the image processing unit 54. A key advantage of using registration plate tracking 412 is a reduction in image processing. This is because the location of the registration plate 38 in the previous full image 32 may be used to guide where image processing will be performed on the next full image 32.
Figure 10 shows a first form of registration plate tracking 412. These steps are performed after capture of a full image 32 at 200. At 510, the flag stored at 428 that indicates the presence of a registration plate 38 in the previous image 32 is read. As shown at 520, if the flag is negative, such that no registration plate 38 was found in the previous image 32, the full image 32 just captured at 200 is left untouched, and image processing proceeds based on this full image 32 as was previously described.
If, on the other hand, the flag is positive to indicate that a registration plate 38 was found in the previous image 32, the co-ordinates stored for the last registration found at 434 are read at 530. At 540, the current full image 32 just captured at 200 is then taken, and a region of interest 60 is formed around the stored co-ordinates. In this embodiment, the region of interest 60 is defined as being a window extending three times the width and five times the height of the area occupied by the registration plate 38 in the previous image 32, the registration plate location being positioned centrally within this window.
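The window construction at 540 can be sketched directly: three plate-widths wide, five plate-heights tall, centred on the previous plate location, and (as a practical assumption not stated above) clamped to the image bounds:

```python
def region_of_interest(plate, image_w, image_h, w_scale=3, h_scale=5):
    """plate: (x, y, w, h) of the registration plate in the previous frame.

    Returns (x0, y0, x1, y1) of the tracking window, clamped to the image.
    """
    x, y, w, h = plate
    cx, cy = x + w / 2, y + h / 2           # plate centre
    half_w, half_h = w_scale * w / 2, h_scale * h / 2
    x0 = max(0, int(cx - half_w))
    y0 = max(0, int(cy - half_h))
    x1 = min(image_w, int(cx + half_w))
    y1 = min(image_h, int(cy + half_h))
    return x0, y0, x1, y1
```

Cropping the next frame to this window is what reduces the image processing load while the target stays in view.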
Other window sizes may be used. Figure 2 shows such a region of interest 60. At 550, the just-captured full image 32 is then cropped around this area of interest 60.
It is then this cropped image that is used in the subsequent image processing, i.e. it is the cropped image that is passed to the colour band filter at 414 for further processing in the remainder of the registration plate localisation step 410'.
Thus, most of the full image 32 may be discarded and image processing is instead performed on a generally much smaller cropped image. The size of the region of interest 60 is set so as to allow for some movement and change in size of the imaged registration plate 38 between frames 32 as the host and target cars move relative to each other. Of course, in exceptional circumstances it is possible that the registration plate 38 is no longer present, or no longer wholly present, within the region of interest 60. In this case, a registration plate 38 will not be identified at steps 424 and 426. This will result in the flag indicating the presence of a registration plate 38 being set to negative at 428 such that step 510 in the following loop will determine that a full image 32 is to be used in the subsequent image processing. Hence, the analysis of the full image 32 next time around will ensure a registration plate 38 is found again.
As the co-ordinates of the target registration plate 38 are updated during each iteration, the registration plate 38 may be tracked through successive images 32, such that processing of full images 32 may be minimised.
Figure 11 shows a second form of registration plate tracking 412. This form is essentially a development of the form shown in Figure 10, and like parts are denoted by like reference numerals.
At 510, the flag is read as before. If the flag is negative to indicate that the previous image 32 did not contain a registration plate 38, the method proceeds to step 520 where the full image 32 is passed to the remainder of registration plate localisation 410' for further image processing.
If the flag indicates that a registration plate 38 was present in the previous image 32, the method proceeds to step 522 where the characters 42 recognised in the previous image and stored at 470 are read. This step 522 is repeated for the recognised characters 42 stored for the image before that, i.e. the recognised characters 42 for the previous two images are read. These characters 42 may be stored in a first-in first-out type memory that stores the two most recent sets of characters 42.
At 524, the two sets of characters 42 are compared to determine whether they match. If the characters 42 do not match, the method proceeds to step 520 where the full image 32 is passed for further image processing. This step 524 is useful as it allows the system to identify when another vehicle 24 moves into the lane ahead of the host car 20, between the host car 20 and the target car 24a, or when the previous target car 24a suddenly moves out of the lane ahead to leave another, more distant car 24 in the camera's field of view 30a. In these instances, processing of a full image 32 is used to ensure that the most suitable registration plate 38 is identified as the new target plate 38.
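A first-in first-out buffer of depth two makes the comparison at 524 direct. String-valued plate readings and the class name are illustrative assumptions:

```python
from collections import deque

class PlateHistory:
    """Store the two most recent recognised character strings and report
    whether the target plate has stayed the same between frames."""

    def __init__(self):
        self.recent = deque(maxlen=2)  # oldest entry drops out automatically

    def record(self, characters):
        self.recent.append(characters)

    def same_target(self):
        return len(self.recent) == 2 and self.recent[0] == self.recent[1]
```

A mismatch (same_target() returning False) is the cue to fall back to full-image processing and re-select the target plate.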
If the characters 42 are found to match, thereby indicating that the target car 24a has not changed, the method proceeds to step 530 where the co-ordinates of the registration plate 38 in the previous image 32 are read, as was described above.
A further step is performed at 532. Here, the current co-ordinates are compared against the co-ordinates stored during the iteration before. This comparison indicates how far the registration plate 38 has moved to the left or right. A threshold is set. This threshold may be a movement of a set amount, e.g. a fixed or fractional distance from the position first recorded for the registration plate 38, or it may be a fixed location (in which case a comparison to the previous iteration is not necessary). For example, the scanning unit 52 may be looking to the lane ahead and expect to find target registration plates 38 appearing in the middle third of the full image 32. If the position of the target registration plate 38 moves outside this middle third, this may indicate that the target car 24a has moved too far, in which case the method proceeds to step 520 where the full image 32 is passed for further image processing.
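The fixed-location variant of this check can be written as a middle-third test on the plate's horizontal centre; the thirds split is the example quoted above, and the function name is illustrative:

```python
def plate_in_middle_third(plate_x, plate_w, image_w):
    """True if the plate's horizontal centre lies in the middle third of the
    full image; False suggests the target car has strayed too far laterally."""
    centre = plate_x + plate_w / 2
    return image_w / 3 <= centre <= 2 * image_w / 3
```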
Movement of the registration plate 38 in this way may indicate that the target car 24a has moved out of the lane ahead, such that the distance measurement system 22 should find the next car 24 in the lane ahead. This feature guards against the distance measurement system 22 tracking a car 24 out of lane. For instance, without this feature, the distance measurement system 22 may target a car 24 that is initially immediately ahead in the same lane as the host car 20, but then continue to track that car 24 as it changes lane and overtakes another slow-moving car 24 that remains in the lane ahead of the host car 20 and is rapidly caught by the host car 20.
If the comparison at 532 indicates that the registration plate 38 has not moved too far, the method proceeds to steps 540 and 550 where, as before, the region of interest 60 is set at 540 and the full image 32 is cropped at 550 to match this region of interest 60.
The person skilled in the art will appreciate that modifications may be made to the above-described embodiments without departing from the scope of the invention defined by the appended claims.
The types of vehicle to which the present invention may be applied have been described elsewhere. How the distance measurement system is implemented is a matter of choice. Implementation primarily as software running on a suitably programmed computer would be a good choice, but other hardware/software/firmware options are possible.
The embodiments described above make use of different image types, namely RGB, YCrCb and HSV. However, this is not necessary and a single image type could be used. This may even be a raw image, either colour or greyscale. Where more than a single image is used, these may be combined in different ways. For example, a weighted combination of images may be formed for later image processing. The combination may be made at different stages, e.g. after noise filtering at 418, edge detection at 420, contour filtering at 422, registration plate identification 424 and 426, or after the target registration plate has been chosen at 430. Similarly, the combination may be made after any of the optical character recognition steps shown in Figure 8. An alternative scheme is to use multiple image types for the registration plate localisation step at 410, followed by selection of only a single image type for processing during the optical character recognition step at 460.
The method may include other steps commonly used in image processing, e.g. image correction and compensation.
For example, skew may be corrected in the image so as to present the registration plate as viewed from directly in front, and with characters aligned to the horizontal and vertical. Other well-known algorithms may be used to optimise image contrast, brightness, etc.

As noted above, the present invention need not rely on optical character recognition. Characters on a registration plate may be identified as such using alternative methods, e.g. using either of the methods mentioned above, namely using the relative size of a candidate character to the registration plate size, or by matching sizes of candidate characters. Other methods of identifying contours as characters will be apparent to those skilled in the art.

Claims (17)

  1. A method of measuring the distance from a host vehicle to a target vehicle using a distance measuring system provided on the host vehicle, the method comprising: capturing an image; using image processing to detect a registration plate in the image thereby identifying the target vehicle, to detect a character on the registration plate, and to measure the height in the image of the character; and determining the distance to the target vehicle by using the measured height of the character.
  2. The method of claim 1, comprising determining the distance to the target vehicle by using a look-up table relating character height to distance.
  3. The method of claim 1 or claim 2, comprising detecting more than one character on the registration plate, measuring the height in the image of each character detected, calculating an average height of the characters detected, and determining the distance to the target vehicle from the average height.
  4. The method of any preceding claim, comprising cropping the image after the registration plate has been detected to an area substantially corresponding to the registration plate, and then performing the image processing on this cropped image.
  5. The method of any preceding claim, comprising further iterations of capturing an image, using image processing to detect the registration plate, to detect a character and to measure the height of the character, and determining the distance to the target vehicle.
  6. The method of claim 5, comprising using optical character recognition to determine the identity of the detected character during first and second iterations, and comparing the determined identities to see if they match.
  7. The method of claim 6, comprising detecting more than one character on the registration plate, using optical character recognition to determine the identity of the detected characters during first and second iterations, and comparing the determined identities to see if they match.
  8. The method of claim 7, comprising identifying the total number of characters on the registration plate, using optical character recognition to determine the identity of at least some of the characters, and comparing the identities of only a subset of the characters present on the registration plate to determine if they match.
  9. The method of any of claims 5 to 8, comprising: in a first iteration, determining the location of the registration plate in the image, defining an area of interest around the location of the registration plate; and, in a second iteration, using image processing of just the region of interest to detect a registration plate in that region.
  10. The method of claim 9 wherein, if a registration plate is not found in the region of interest, using in the next iteration image processing of the full image to detect a registration plate in that image.
  11. The method of claim 9 or claim 10, further comprising determining the location of the registration plate in the image in the second iteration, comparing the change in location between the first and second iteration, and determining whether to use image processing of the region of interest or the full image dependent upon that comparison.
  12. The method of any of claims 5 to 11, further comprising using optical character recognition to determine the identity of the detected character in the first and second iterations and to confirm that the character is the same, thereby indicating that the registration plate is the same.
  13. The method of any of claims 5 to 12, comprising determining the location of the registration plate in the image during iterations, and storing a history of the registration plate locations.
  14. The method of any of claims 5 to 13, comprising storing a history of the distances determined.
  15. A distance measurement system arranged to implement the method of any preceding claim.
  16. A vehicle comprising the distance measurement system of claim 15.
  17. A computer program that, when executed, is operable to measure the distance from a host vehicle to a target vehicle by: receiving a digitised image; using image processing to detect a registration plate in the image thereby identifying the target vehicle, to detect a character on the registration plate, and to measure the height in the image of the character; and determining the distance to the target vehicle by using the measured height of the character.
GB0813246A 2008-07-18 2008-07-18 Method for determining the separation distance between automotive vehicles Withdrawn GB2462071A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB0813246A GB2462071A (en) 2008-07-18 2008-07-18 Method for determining the separation distance between automotive vehicles
EP09784743A EP2318981A1 (en) 2008-07-18 2009-07-20 Determining separation between automotive vehicles
PCT/GB2009/001790 WO2010007392A1 (en) 2008-07-18 2009-07-20 Determining separation between automotive vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0813246A GB2462071A (en) 2008-07-18 2008-07-18 Method for determining the separation distance between automotive vehicles

Publications (2)

Publication Number Publication Date
GB0813246D0 GB0813246D0 (en) 2008-08-27
GB2462071A true GB2462071A (en) 2010-01-27

Family

ID=39737327

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0813246A Withdrawn GB2462071A (en) 2008-07-18 2008-07-18 Method for determining the separation distance between automotive vehicles

Country Status (3)

Country Link
EP (1) EP2318981A1 (en)
GB (1) GB2462071A (en)
WO (1) WO2010007392A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014167676A (en) * 2013-02-28 2014-09-11 Fujifilm Corp Inter-vehicle distance calculation device and motion controlling method for the same

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011111074A2 (en) * 2010-03-11 2011-09-15 Matrix Laboratories Ltd An improved process for the preparation of tenofovir disoproxil fumarate
DE102010033212A1 (en) * 2010-08-03 2012-02-09 Valeo Schalter Und Sensoren Gmbh Method and apparatus for determining a distance of a vehicle to an adjacent vehicle
DE102011005780B4 (en) 2011-03-18 2022-06-02 Robert Bosch Gmbh Method and device for determining a distance between a vehicle and an object
DE102011075674A1 (en) * 2011-05-11 2012-11-15 Continental Teves Ag & Co. Ohg Distance determination by means of a camera sensor
DE102011055441A1 (en) * 2011-11-17 2013-05-23 Continental Teves Ag & Co. Ohg Method for determining spacing between preceding and forthcoming motor cars by using mono camera in e.g. adaptive cruise control system, involves determining spacing between cars based on information about license plate number
KR101326943B1 (en) * 2012-06-29 2013-11-11 엘지이노텍 주식회사 Overtaking vehicle warning system and overtaking vehicle warning method
WO2015147764A1 (en) * 2014-03-28 2015-10-01 Kisa Mustafa A method for vehicle recognition, measurement of relative speed and distance with a single camera
CN104897132A (en) * 2015-04-29 2015-09-09 江苏保千里视像科技集团股份有限公司 System for measuring vehicle distance through single camera, and measurement method thereof
WO2017012743A1 (en) * 2015-07-17 2017-01-26 Robert Bosch Gmbh Method for checking the plausibility of a control decision for safety means
KR101736104B1 (en) * 2015-10-28 2017-05-16 현대자동차주식회사 Vehicle and controlling method for the vehicle
DE102018125450A1 (en) * 2018-10-15 2020-04-16 HELLA GmbH & Co. KGaA Method for determining at least one relative parameter of at least the position or the movement of a first vehicle in relation to a second vehicle
DE102019108595A1 (en) * 2019-04-02 2020-10-08 Bayerische Motoren Werke Aktiengesellschaft Simulation of road users
JP7488010B2 (en) 2020-10-15 2024-05-21 矢崎総業株式会社 Distance calculation device and distance calculation method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11166811A (en) * 1997-12-05 1999-06-22 Nippon Telegr & Teleph Corp <Ntt> Inter-vehicular distance measuring method and device and storage medium storing inter-vehicular distance measuring program
JP2004271250A (en) * 2003-03-06 2004-09-30 Mitsubishi Motors Corp Vehicle following distance detector
JP2006329776A (en) * 2005-05-25 2006-12-07 Sumitomo Electric Ind Ltd Car position detection method, vehicle speed detection method, and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1173514A (en) * 1997-08-29 1999-03-16 Nissan Motor Co Ltd Recognition device for vehicle
JP3904988B2 (en) * 2002-06-27 2007-04-11 株式会社東芝 Image processing apparatus and method
FR2883818B1 (en) * 2005-03-31 2008-09-12 Valeo Vision Sa METHOD OF ANTICIPATED DETECTION OF AN ARRIVAL OF A MOTOR VEHICLE IN A DARK SECTOR

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11166811A (en) * 1997-12-05 1999-06-22 Nippon Telegr & Teleph Corp <Ntt> Inter-vehicular distance measuring method and device and storage medium storing inter-vehicular distance measuring program
JP2004271250A (en) * 2003-03-06 2004-09-30 Mitsubishi Motors Corp Vehicle following distance detector
JP2006329776A (en) * 2005-05-25 2006-12-07 Sumitomo Electric Ind Ltd Car position detection method, vehicle speed detection method, and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Isobe & Nakamura: "Measurement of the distance between cars by an in-vehicle monocular camera" Bulletin of Hiroshima Institute of Technology Research Volume Feb.2008 Hiroshima Institute of Technology Japan ISSN1346-9975 (print) VOL 42 NR 2 PG 255 - 260 Abstract *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014167676A (en) * 2013-02-28 2014-09-11 Fujifilm Corp Inter-vehicle distance calculation device and motion controlling method for the same
US9361528B2 (en) 2013-02-28 2016-06-07 Fujifilm Corporation Vehicle-to-vehicle distance calculation apparatus and method

Also Published As

Publication number Publication date
GB0813246D0 (en) 2008-08-27
WO2010007392A1 (en) 2010-01-21
EP2318981A1 (en) 2011-05-11

Similar Documents

Publication Publication Date Title
GB2462071A (en) Method for determining the separation distance between automotive vehicles
US8194998B2 (en) Preceding vehicle detection system
CN107703505B (en) Trailer size estimation using two-dimensional radar and camera
US9058524B2 (en) Measuring the range to an object, in an image, using size categorization
US9594155B2 (en) Vehicle radar system with trailer detection
JP3596314B2 (en) Object edge position measuring device and moving object traffic judging device
JP3630100B2 (en) Lane detection device
EP3293052A1 (en) Trailer lane departure warning and sway alert
US9047518B2 (en) Method for the detection and tracking of lane markings
US10393862B2 (en) Trailer estimation with elevation enhanced sensing
US11691585B2 (en) Image processing apparatus, imaging device, moving body device control system, image processing method, and program product
US10885351B2 (en) Image processing apparatus to estimate a plurality of road surfaces
US10944958B2 (en) Method and device for detecting height-limiting rod, and automatic driving system
US11151395B2 (en) Roadside object detection device, roadside object detection method, and roadside object detection system
CN107750213B (en) Front vehicle collision warning device and warning method
JPH10283461A (en) Outer-vehicle monitoring device
EP3087532B1 (en) Method for determining a width of a target vehicle by means of a camera system of a motor vehicle, camera system and motor vehicle
JP2008117073A (en) Interruption vehicle detection device
JP2013057992A (en) Inter-vehicle distance calculation device and vehicle control system using the same
CN107886729B (en) Vehicle identification method and device and vehicle
CN110717361A (en) Vehicle parking detection method, preceding vehicle start reminding method and storage medium
CN112141084A (en) Parking assistance device and parking assistance method
KR101721442B1 (en) Avoiding Collision Systemn using Blackbox Rear Camera for vehicle and Method thereof
JPH09272414A (en) Vehicle control device
JP2000259997A (en) Height of preceding vehicle and inter-vehicle distance measuring device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)