EP3144853B1 - Detection of water droplets on a vehicle camera lens - Google Patents

Detection of water droplets on a vehicle camera lens

Info

Publication number
EP3144853B1
Authority
EP
European Patent Office
Prior art keywords
image
droplet
detection portion
synthetic
image frame
Prior art date
Legal status
Active
Application number
EP15185771.1A
Other languages
German (de)
French (fr)
Other versions
EP3144853A1 (en)
Inventor
Clifford Lawson
Lingjun Gao
Peter Gagnon
Dev Yadav
Current Assignee
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date
Filing date
Publication date
Application filed by Continental Automotive GmbH
Priority to EP15185771.1A
Priority to PCT/EP2016/070417
Priority to DE112016003300.1T
Publication of EP3144853A1
Application granted
Publication of EP3144853B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/754 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries involving a deformation of the sample pattern or of the reference pattern; Elastic matching


Description

  • The present application relates to vehicle camera systems.
  • The scientific publication "Video-based raindrop detection for improved image registration" by Martin Roser and Andreas Geiger relates to the topic of video-based raindrop detection for improved image registration, wherein artificial raindrop patterns are generated in regions of interest (ROIs) and raindrops are verified by intensity-based correlation, and wherein prior knowledge of the vehicle dynamics and of the camera setup, in combination with homography constraints, is used.
  • It is an object of the present specification to provide an improved method and device for the detection of droplets on a camera lens. In a broader sense, the method is also used for detecting droplets on a transparent surface in front of the camera lens, such as a windscreen or on a transparent lens cover.
  • The droplet detection procedure according to the present specification makes use of the lens property of droplets. When there are droplets of rain on the lens of a camera each droplet itself acts as a further lens and contains a miniature view of the entire scene that is being seen by the main camera image.
  • Due to the small size of the droplets, the focal plane of the droplets is very close to the lens surface of the camera lens. Behind the focal plane of the drops, the light rays diverge again and the droplets act as wide-angle lenses. As a consequence, the droplets appear as tiny wide-angle lenses at the focal plane of the camera lens, which lies far behind the focal plane of the drops, and produce a minimized image of the scene.
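  • As an illustrative estimate that is not part of the patent text: for an approximately plano-convex water droplet with radius of curvature R and refractive index n of about 1.33, the thin-lens approximation gives a focal length of f ≈ R/(n - 1) ≈ 3R, so a droplet with a radius of the order of one millimetre focuses the incoming light only a few millimetres behind itself, far in front of the image plane at which the main scene appears sharp.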
  • Furthermore, the droplets have a large curvature and act as fisheye lenses, which can cover a large part of the scene or even provide a 180° view that covers the complete scene observed through the camera lens.
  • The image that is produced by the droplet is in general rotated. Usually the image rotation is about 180°, but for some shapes of droplets the rotation angle can be different from 180°. The rotation can be determined by identifying a demarcation line between a bright and a dark area, such as the horizon, and observing its rotation.
  • In general, the droplet image is distorted. If the drops are not spherical but thicker at one end, the wide angle effect is stronger at the thicker end of the drop and the distortion is stronger. The imaging plane where the drop image appears sharp is located behind the imaging plane in which the scene appears sharp. If the objects are sufficiently far away, the imaging planes are close to the respective focal planes.
  • If the depth of focus of the camera lens does not cover both the image plane of the droplet and the image plane of the camera lens, the effect can be compensated according to the present specification by focusing part of the camera light onto a second sensor, which is placed at a distance at which the droplet images appear sharp for a typical droplet size, and by using the memory representation of this image for the detection of water drops.
  • The droplets, which form "little windows", often stick to the surface and do not move from frame to frame. According to the present specification, this property can be used to improve the identification of the droplets by using multiple image frames. If it is detected that an object resembles the scene and the boundaries are not changing or are changing only slightly, the detection of a droplet can be confirmed.
  • Likewise, the droplet detection can be confirmed or improved by detecting whether the content of an object, such as the relative illumination of image portions, changes in a similar way as the complete observed scene. Consecutive image frames can be considered as forming a three-dimensional cube. According to an extended embodiment, miniature versions of the N frame cube are detected in the N frame cube.
  • Furthermore, a droplet detection according to the present specification can also be initialized by detecting boundaries of objects and confirming the detection by matching the content of the detected object with the scene.
  • According to one embodiment, the method is carried out on an image which is not already corrected for lens distortions. The detection can be carried out on simple mono cameras that provide grey scale images. If colour images are available, the colour information can be used to improve the detection quality. If image frames from a stereo camera are available, the difference between the stereo images may be used in addition to speed up the detection of droplets.
  • According to the present specification, rain drops are detected by identifying image areas which appear to be a small, scaled-down view of the entire image.
  • According to one embodiment, the entire image view is scaled to various small sizes within the known range of likely raindrop sizes. Then, these scaled-down versions are matched against areas of the main image by pattern matching. If small, entire views of the complete image are found on top of image patches, it can be concluded that the image patches correspond to raindrops in that part of the image.
  • According to one embodiment, the detected raindrop areas are reported or forwarded to other algorithms as areas that should be ignored or treated with caution. If the proportion of the rain drops is large, it may even be decided that the entire image cannot be trusted for other object detection algorithms because it is known to be severely obscured.
  • The current specification discloses a computer implemented method for detecting droplets on a lens of a vehicle camera. Image data with image frames is received from a vehicle camera. A synthetic droplet image is generated by reducing a first image frame of the image data to a predetermined size of a droplet, which is within a pre-determined range of droplet sizes.
  • The reduction in size can in general involve some sort of averaging or interpolation procedure to map the pixels of the image frame to the size of the synthetic droplet image. Herein, "the image frame" does not have to include every pixel of the image frame. For example, it may also refer to a major portion of the image frame.
  • Thereby a reduced size image is obtained, and the reduced size image is rotated by a pre-determined rotation angle, which is typically 180 degrees. In particular, a 180 degree rotation may be obtained without copying computer memory portions, just by changing an indexing of the lines of the memory representation of the image frame or of the synthetic droplet image. For example, a computer program may be operative to access the computer memory in a reverse order. However, the rotation may also be obtained by copying memory content.
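  • Purely for illustration, a minimal sketch of this generation step in Python/NumPy; the function name and the block-averaging choice are assumptions, not prescribed by the patent, and the 180 degree rotation is realised as a reversed view of the array so that no memory is copied:

    import numpy as np

    def make_synthetic_droplet(frame, droplet_size):
        # Reduce a grey-scale frame (2-D uint8 array) to droplet_size x droplet_size
        # by block averaging, then rotate the result by 180 degrees.
        h, w = frame.shape
        bh, bw = h // droplet_size, w // droplet_size
        # crop so that the frame divides evenly into blocks
        cropped = frame[:bh * droplet_size, :bw * droplet_size].astype(np.float32)
        reduced = cropped.reshape(droplet_size, bh, droplet_size, bw).mean(axis=(1, 3))
        # 180 degree rotation as a reversed view; no memory is copied
        return reduced[::-1, ::-1]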
  • A similarity is determined or computed between the synthetic droplet image and a detection portion of a second image frame at a pre-determined position of the detection portion. Herein, the detection portion has essentially the same shape and size as the synthetic droplet image. In particular, the first and second image frames may refer to the same image frame or to copies of the same image frame. In an alternative embodiment, the first and second image frames are obtained from separate image sensors but cover essentially the same region of the scene.
  • The comparison is repeated for different positions of the detection portion. For example, the comparison may be repeated for different positions of the detection portion until at least 80% of the frame area is covered with a pre-determined resolution of positions.
  • The pre-determined resolution may be chosen as 1 pixel, which means that close-by detection regions are offset by 1 pixel with respect to each other, or it may be coarser. A droplet, or a droplet candidate, is detected if the similarity exceeds a predetermined threshold.
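  • A correspondingly simple sketch of the scan over the detection portions; the similarity function and the threshold are placeholders (one possible similarity measure is sketched further below), and a step of 1 pixel corresponds to the finest resolution mentioned above:

    def scan_for_droplets(frame, synthetic, similarity, threshold, step=1):
        # Slide a detection portion of the same shape and size as the synthetic droplet
        # image over the frame and collect the positions where the similarity exceeds
        # the predetermined threshold.
        dh, dw = synthetic.shape
        candidates = []
        for y in range(0, frame.shape[0] - dh + 1, step):
            for x in range(0, frame.shape[1] - dw + 1, step):
                if similarity(synthetic, frame[y:y + dh, x:x + dw]) > threshold:
                    candidates.append((y, x))
        return candidates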
  • According to a further application, a ratio of the area occupied by the droplets to the area of the image frame is computed, and an alert message is sent if the ratio exceeds a predetermined value. The locations of the detected droplets are forwarded to other image processing applications.
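  • For illustration only, the coverage check could be as simple as the following sketch; the limit of 30% is an arbitrary placeholder, and overlapping candidates are not merged here:

    def droplet_coverage(candidates, droplet_size, frame_shape, max_ratio=0.3):
        # Ratio of the area occupied by the detected droplets to the area of the image frame.
        covered = len(candidates) * droplet_size * droplet_size
        ratio = covered / float(frame_shape[0] * frame_shape[1])
        return ratio, ratio > max_ratio  # second value: whether an alert message should be sent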
  • The comparison between the synthetic droplet and the image portion is performed in parallel for multiple image portions. For example, the comparison can be carried out by multiple threads, which a scheduler program or a scheduler hardware component can assign to different processors or processor kernels depending on the availability of hardware resources.
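  • One possible way to distribute the comparisons, sketched with a standard thread pool and reusing the scan_for_droplets sketch above; the split into overlapping horizontal stripes is an assumption made here for illustration:

    from concurrent.futures import ThreadPoolExecutor

    def scan_in_parallel(frame, synthetic, similarity, threshold, workers=4):
        # Each horizontal stripe of the frame is scanned in its own thread; the stripes
        # overlap by the droplet height so that no detection position is lost at the seams.
        dh = synthetic.shape[0]
        stripe = max(dh, frame.shape[0] // workers)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            jobs = []
            for y0 in range(0, frame.shape[0] - dh + 1, stripe):
                sub = frame[y0:y0 + stripe + dh - 1]
                jobs.append((y0, pool.submit(scan_for_droplets, sub, synthetic, similarity, threshold)))
            return [(y0 + y, x) for y0, job in jobs for (y, x) in job.result()]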
  • According to a further embodiment, which provides further options to carry out the comparisons in parallel, the second image frame is stored in a first memory location and in a second memory location.
  • A second synthetic droplet image is generated by reducing the first image frame of the image data to a second predetermined size of a droplet, thereby obtaining a reduced size image and by rotating the reduced size image by a pre-determined rotation angle, which is typically 180 degrees.
  • A similarity is determined between the second synthetic droplet image and a detection portion of the second image frame at the second memory location at pre-determined positions of the detection portion. Similarly, the detection portion has the same shape and size as the second synthetic droplet image. A droplet, or a droplet candidate, is detected if the similarity exceeds a predetermined threshold.
  • According to a further embodiment, the first image frame is separated into regions with differing brightness, for example regions below and above a horizon. The comparison between a synthetic droplet image and a detection portion comprises separating the detection portion into corresponding regions and comparing the regions of the first image frame with the regions of the detection portion. Thereby, the droplet detection may be improved or it may be accelerated.
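  • A hedged sketch of such a region-wise comparison, assuming for simplicity a single separating row (for example the horizon, which appears mirrored in the rotated synthetic droplet image):

    def regionwise_similarity(synthetic, portion, split_row, similarity):
        # Compare the regions above and below the separating row independently and
        # average the two scores; a poor match in either region lowers the result.
        upper = similarity(synthetic[:split_row], portion[:split_row])
        lower = similarity(synthetic[split_row:], portion[split_row:])
        return 0.5 * (upper + lower)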
  • According to a specific embodiment, the position and shape of the synthetic droplet is adjusted according to a predetermined lens distortion function and according to a position of the detection portion in the second image frame. This embodiment takes into account that droplets may appear differently at different locations, in particular near the border of the image frame.
  • According to a further embodiment, the first image frame is obtained from a first sensor and the second image frame is obtained from a second sensor, wherein an optical distance from a camera lens to the first sensor is different from an optical distance from the camera lens to the second sensor. This embodiment can take into account that the droplets change the effective focal distance of the lens, and the image of the droplet appears focused at a different optical distance than the image plane of the main image.
  • In particular, the sizes of the droplets can be chosen according to a pre-determined geometric progression or according to a pre-determined statistical distribution of droplets.
  • In order to speed up the detection procedure, previously determined positions of droplets can be used as starting positions for comparing the synthetic droplets with underlying portions of the image frame. For example, the determined droplet positions of the last 10 frames may be used to predict the positions of the droplets in the present frame.
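  • As an illustration of the last two points, the size ladder and the seeding from earlier frames might look as follows; the growth factor, the neighbourhood radius and the history of 10 frames are placeholder values:

    def droplet_sizes(min_size, max_size, factor=1.3):
        # Sizes of the synthetic droplet images, chosen as a geometric progression
        # within the pre-determined range of droplet sizes.
        sizes, s = [], float(min_size)
        while s <= max_size:
            sizes.append(int(round(s)))
            s *= factor
        return sizes

    def seed_positions(history, radius=2):
        # Starting positions for the comparison, taken from the droplet positions of the
        # previous frames (e.g. the last 10) plus a small neighbourhood around each of them.
        seeds = set()
        for positions in history:
            for y, x in positions:
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        seeds.add((y + dy, x + dx))
        return sorted(seeds)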
  • The specification also discloses a computer program for executing the steps of the aforementioned method and a computer readable storage medium with the computer program.
  • In a further aspect, the current specification discloses an image processing device for a vehicle camera which is operative to execute the aforementioned detection method.
  • In particular, the present specification discloses an image processing device which comprises an input connection for receiving image data from the vehicle camera, the image data comprising image frames and a computation unit that is connected to the input connection.
  • The computation unit is operative to generate a synthetic droplet image by reducing a first image frame of the image data to a predetermined size of a droplet, thereby obtaining a reduced size image and by rotating the reduced size image by a pre-determined rotation angle. Furthermore, the computation unit is operative to determine a similarity between the synthetic droplet image and a detection portion of a second image frame at a pre-determined position of the detection portion, wherein the detection portion has the same shape and size as the synthetic droplet image, and to detect a droplet if the similarity exceeds a predetermined threshold.
  • In a further embodiment, the computation unit comprises a multi-kernel graphics processor. Furthermore the computation unit provides scheduling means to assign comparisons of a synthetic droplet image with multiple image portions of an image frame to different processor kernels of the multi-kernel graphics processor.
  • Furthermore, the present specification discloses a kit with the image processing device wherein the vehicle camera is connectable to the image processing device. For example, the image processing device can be supplied as part of the vehicle camera or separately for attachment to the car electronics.
  • In a further embodiment, the vehicle camera is adapted to focus the camera image to a first sensor and to a second sensor.
  • The first image frame is obtained from the first sensor and the second image frame is obtained from the second sensor.
  • Furthermore, the present specification discloses a vehicle, such as a passenger car, a utility car, a bus etc., with the kit installed in the vehicle.
  • The vehicle camera is mounted to the vehicle such that the vehicle camera points to a scene outside the vehicle and is connected to the image processing device. The image processing device is provided as an electronic component of the vehicle or of the vehicle camera.
  • The subject matter of the present specification is now explained in further detail with respect to the following Figures in which
  • Fig. 1 shows a vehicle with a camera system,
    Fig. 2 shows a camera image with droplets recorded by the camera system of Fig. 1,
    Fig. 3 shows a droplet recognition procedure for detecting droplets, such as the droplets of Fig. 2,
    Fig. 4 shows a further droplet recognition procedure that involves multiple comparisons in parallel,
    Fig. 5 shows a further droplet recognition procedure that involves multiple comparisons in parallel, and
    Fig. 6 shows a droplet recognition procedure that involves copying of memory areas and performing the comparison in parallel for the multiple memory areas.
  • In the following description, details are provided to describe embodiments of the application. It shall be apparent to one skilled in the art, however, that the embodiments may be practiced without such details.
  • Some parts of the embodiments have similar parts. The similar parts may have the same names or similar part numbers. The description of one similar part also applies by reference to other similar parts, where appropriate, thereby reducing repetition of text without limiting the disclosure.
  • Fig. 1 shows a car with a camera system. Figure 1 shows a car 10 with a surround view system 11. The surround view system 11 comprises a front view camera 12, a right side view camera 13, a left side view camera 14 and a rear view camera 15. The cameras 12 - 15 are connected to a CPU of a controller, which is not shown in Fig. 1. The controller is connected to further sensors and units, such as a velocity sensor, a steering angle sensor, a GPS unit, and acceleration and orientation sensors.
  • Fig. 2 shows a camera image 16 with droplets 17. For simplicity, the droplets 17 are shown with uniform shapes and sizes. In a real picture the shape, the orientation and also the sizes of the droplets vary.
  • Fig. 3 shows a droplet recognition procedure, in which a synthetic droplet image 18 is compared to corresponding underlying image portions at multiple locations. By way of example, Fig. 3 shows a direction 20 in which the synthetic droplet image is moved across the image frame. The image frame may be scanned in various ways, for example line by line, column by column, according to a star pattern, on circular trajectories and so forth.
  • The synthetic droplet image 18 is obtained by reducing the complete image frame 16' to a pre-determined droplet size and by rotating the reduced size image by 180 degrees. The synthetic droplet image 18 will in general also contain reduced size versions of the droplets while the images produced by the droplets do not contain these artefacts.
  • As long as the droplets only cover a minor portion of the screen, this difference does not affect the droplet recognition. The synthetic droplet images are only compared with respect to an approximate similarity. In general, the droplet images also show artefacts which are due to the irregular shape of the droplets.
  • In a simple realization, a similarity of the synthetic droplet image 18 with an underlying image portion is measured with a least squares estimate that compares the pixel brightness. According to other realizations, the comparison also makes use of detected image features of the synthetic droplet 18, such as divisions into dark and bright image areas or regions of high contrast, lines, feature points, vanishing points etc.
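  • A minimal realisation of such a least squares estimate; the mapping of the mean squared brightness difference to a similarity value in (0, 1] is an illustrative choice, not prescribed by the patent:

    import numpy as np

    def brightness_similarity(synthetic, portion):
        # Least-squares comparison of pixel brightness: the smaller the mean squared
        # difference, the closer the similarity is to 1.
        diff = synthetic.astype(np.float32) - portion.astype(np.float32)
        return 1.0 / (1.0 + float(np.mean(diff * diff)))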
  • Often, the camera image contains features that are aligned along street boundaries, which essentially have the same vanishing point. This feature can be used for comparison provided that the distortion of the droplets is not such that it prevents the determination of a vanishing point.
  • On the other hand it can be advantageous to compare only simple features of the synthetic droplet image 18 that do not require extensive image processing. For example, according to one realization, the pixels of the synthetic droplet image are averaged to effective pixels with a bigger size than the original pixels and the effective pixels are compared with effective pixels of the underlying image portion. According to another realization, only a subset of the pixels is compared, such as every second pixel.
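  • The two low-cost variants, sketched under the same assumptions and reusing the brightness_similarity sketch above:

    import numpy as np

    def to_effective_pixels(img, block=4):
        # Average block x block pixels into one bigger "effective" pixel
        # (the image is cropped to a multiple of the block size).
        h, w = img.shape[0] // block, img.shape[1] // block
        cropped = img[:h * block, :w * block].astype(np.float32)
        return cropped.reshape(h, block, w, block).mean(axis=(1, 3))

    def subsampled_similarity(synthetic, portion, step=2):
        # Compare only every step-th pixel in both directions, e.g. every second pixel.
        return brightness_similarity(synthetic[::step, ::step], portion[::step, ::step])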
  • Fig. 4 shows a further droplet recognition procedure in which a synthetic droplet image 18 is compared to multiple different image portions of an image frame 16' in parallel. According to one exemplary embodiment, the comparisons are assigned to different computational threads. A scheduler assigns the threads to different processor kernels of a graphics card according to the availability of computational resources.
  • For easier visualization, the synthetic droplet images 18, 19 in the Figs. 4 - 6 are shown bigger than their typical size would be. For example, a droplet size could be 1/8 or 1/10 of the horizontal or the vertical extension in the respective direction or smaller. Among other factors, this property depends on the size of the lens.
  • The droplet images 18 are assigned to rectangular portions of the image 16' and are compared with corresponding underlying image portions of the rectangular portion of the image 16'.
  • Fig. 5 shows a further droplet recognition procedure in which the image frame 16' is compared with differently sized synthetic droplet images, a synthetic droplet image 18 of a first size and a synthetic droplet image 19 of a second size. The area sizes of the synthetic droplet images 18, 19 are chosen within a pre-determined range of droplet sizes.
  • In the position of the synthetic droplets 18 of Fig. 5, the synthetic droplet images 18 are moved to a position in which one of the synthetic droplet images 18 has a high similarity to an underlying portion 17' of the image frame 16'. In particular, the darker foreground almost coincides with the image of the foreground provided by the droplet on the camera lens and the brighter sky region almost coincides with the image of the sky provided by the droplet on the camera lens.
  • In one realization according to Fig. 5, columns 21, 22, 23 of synthetic droplet images 18 are moved across the image frame 16' in a horizontal direction. In other words, the columns of synthetic droplet images are compared to underlying image portions of the image frame 16' at consecutive locations, and the consecutive locations are moved from left to right. The columns contain droplet images 18' of different sizes and/or of different vertical alignments.
  • The direction of movement 20 from left to right is only provided by way of example. For example, the columns 21, 22, 23 could be moved from right to left or the synthetic droplet images could be grouped into rows which are moved from bottom to top or from top to bottom.
  • Fig. 6 shows a further droplet recognition procedure. According to the procedure of Fig. 6, the image frame 16', which is stored in a memory area 24 of a computer memory, is copied to a first memory area 25 and to a second memory area 26. The dots in Fig. 6 indicate that the memory area 24 can be copied to more than two memory areas.
  • Synthetic droplet images 18 of a first size are compared with corresponding underlying image portions of the image frame 16' of the first memory area 25. Similarly, synthetic droplet images 19 of a second size are compared with corresponding underlying image portions of the image frame 16' of the second memory area 26.
  • In an alternative embodiment, the first memory area 25 is provided by the original memory area 24. A copying of computer memory areas according to the embodiment of Fig. 6 can facilitate a parallel processing. For example, the comparisons that involve the copied memory areas 25, 26 can be carried out in parallel by different processor kernels or they can be assigned to different threads.
  • Although the above description contains much specificity, this should not be construed as limiting the scope of the embodiments but merely as providing illustrations of the foreseeable embodiments. The above stated advantages of the embodiments should in particular not be construed as limiting the scope of the embodiments but merely as explaining possible achievements if the described embodiments are put into practice. Thus, the scope of the embodiments should be determined by the claims and their equivalents, rather than by the examples given.

Claims (15)

  1. A method for detecting droplets on a lens of a vehicle camera, comprising
    - receiving image data from a vehicle camera, the image data comprising image frames,
    - generating a synthetic droplet image by reducing a first image frame of the image data to a predetermined size of a droplet, thereby obtaining a reduced size image and by rotating the reduced size image by a predetermined rotation angle,
    - comparing the synthetic droplet image and a detection portion of a second image frame, the detection portion being a portion of an image frame at a predetermined position, and determining a similarity between the synthetic droplet image and the detection portion, wherein the detection portion has the same shape and size as the synthetic droplet image, and wherein the comparison is repeated for different positions of the detection portion,
    - detecting a droplet if the similarity exceeds a predetermined threshold, wherein the similarity is measured with a least squares estimate that compares the pixel brightness of the synthetic droplet image and the underlying detection portion, wherein the pixels of the synthetic droplet image are averaged to effective pixels with a bigger size than the original pixels, and the effective pixels of the synthetic droplet image are compared with effective pixels of the underlying image portion.
  2. The method of claim 1, comprising
    - performing the comparison between the synthetic droplet and the detection portion in parallel for multiple detection portions.
  3. The method of claim 1 or claim 2, comprising
    - storing the second image frame in a first memory location and a second memory location,
    - generating a second synthetic droplet image by reducing the first image frame of the image data to a second predetermined size of a droplet, thereby obtaining a reduced size image and by rotating the reduced size image by a pre-determined rotation angle,
    - determining a similarity between the second synthetic droplet image and a detection portion of the second image frame at the second memory location at pre-determined positions of the detection portion, wherein the detection portion has the same shape and size as the second synthetic droplet image,
    - detecting a droplet if the similarity exceeds a predetermined threshold.
  4. The method of one of the preceding claims, comprising
    - separating the first image frame into regions with differing brightness,
    wherein the comparison between a synthetic droplet image and a detection portion comprises separating the detection portion into corresponding regions and comparing the regions of the synthetic droplet image, corresponding to the regions of the first image frame, with the regions of the detection portion.
  5. The method according to one of the preceding claims, comprising
    - adjusting the shape of the synthetic droplet according to a pre-determined lens distortion function and according to a position of the detection portion in the second image frame.
  6. The method according to one of the preceding claims, comprising
    - obtaining the first image frame from a first sensor and obtaining the second image frame from a second sensor, wherein a distance from a camera lens to the first sensor is different from a distance from the camera lens to the second sensor.
  7. The method according to one of the preceding claims, wherein the sizes of the synthetic droplet images are chosen according to a pre-determined geometric progression or according to a pre-determined statistical distribution of droplets.
  8. The method according to one of the preceding claims, wherein previously determined positions of droplets are used as starting positions for comparing the synthetic droplets with underlying portions of the image frame.
  9. A computer program for executing the steps of a method according to one of the claims 1 to 7.
  10. A computer readable storage medium with the computer program according to claim 9.
  11. An image processing device for a vehicle camera, the image processing device comprising
    - an input connection for receiving image data from the vehicle camera, the image data comprising image frames,
    - a computation unit, the computation unit being connected to the input connection, and the computation unit being operative
    to generate a synthetic droplet image by reducing a first image frame of the image data to a predetermined size of a droplet, thereby obtaining a reduced size image and by rotating the reduced size image by a predetermined rotation angle,
    to compare the synthetic droplet image and a detection portion of a second image frame, the detection portion being a portion of an image frame at a predetermined position, and to determine a similarity between the synthetic droplet image and the detection portion, wherein the detection portion has the same shape and size as the synthetic droplet image, and wherein the comparison is repeated for different positions of the detection portion, and
     to detect a droplet if the similarity exceeds a predetermined threshold, wherein the similarity is measured with a least squares estimate that compares the pixel brightness of the synthetic droplet image and the underlying detection portion, wherein the pixels of the synthetic droplet image are averaged to effective pixels with a bigger size than the original pixels, and the effective pixels of the synthetic droplet image are compared with effective pixels of the underlying image portion.
  12. The image processing device according to claim 11, wherein the computation unit comprises a multi-kernel graphics processor, and wherein the computation unit provides scheduling means to assign comparisons of a synthetic droplet image with multiple image portions of an image frame to different processor kernels of the multi-kernel graphics processor.
  13. A kit, the kit comprising the image processing device according to claim 11 or claim 12, and a vehicle camera, the vehicle camera being connectable to the image processing device.
  14. The kit of claim 13, wherein the vehicle camera is adapted to focus the camera image to a first sensor and to a second sensor, and wherein the first image frame is obtained from the first sensor and the second image frame is obtained from the second sensor.
  15. Vehicle with the kit according to claim 13 or claim 14, wherein the vehicle camera is mounted to the vehicle such that the vehicle camera points to a scene outside the vehicle, the vehicle camera is connected to the image processing device, and wherein the image processing device is provided as an electronic component of the vehicle or of the vehicle camera.
EP15185771.1A 2015-09-18 2015-09-18 Detection of water droplets on a vehicle camera lens Active EP3144853B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP15185771.1A EP3144853B1 (en) 2015-09-18 2015-09-18 Detection of water droplets on a vehicle camera lens
PCT/EP2016/070417 WO2017045913A1 (en) 2015-09-18 2016-08-30 Detection of water droplets on a vehicle camera lens
DE112016003300.1T DE112016003300T5 (en) 2015-09-18 2016-08-30 Detection of water droplets on a lens of a vehicle camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP15185771.1A EP3144853B1 (en) 2015-09-18 2015-09-18 Detection of water droplets on a vehicle camera lens

Publications (2)

Publication Number Publication Date
EP3144853A1 EP3144853A1 (en) 2017-03-22
EP3144853B1 true EP3144853B1 (en) 2020-03-18

Family

ID=54207302

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15185771.1A Active EP3144853B1 (en) 2015-09-18 2015-09-18 Detection of water droplets on a vehicle camera lens

Country Status (3)

Country Link
EP (1) EP3144853B1 (en)
DE (1) DE112016003300T5 (en)
WO (1) WO2017045913A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111812691B (en) * 2019-04-11 2023-09-12 北京魔门塔科技有限公司 Vehicle-mounted terminal and image frame detection processing method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140028849A1 (en) * 2011-04-13 2014-01-30 Nissan Motor Co., Ltd. Driving assistance system and raindrop detection method thereof
WO2013034166A1 (en) * 2011-09-07 2013-03-14 Valeo Schalter Und Sensoren Gmbh Method and camera assembly for detecting raindrops on a windscreen of a vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
DE112016003300T5 (en) 2018-04-26
EP3144853A1 (en) 2017-03-22
WO2017045913A1 (en) 2017-03-23

Similar Documents

Publication Publication Date Title
US11170466B2 (en) Dense structure from motion
Wu et al. Lane-mark extraction for automobiles under complex conditions
US20190205671A1 (en) Hazard detection from a camera in a scene with moving shadows
Roser et al. Realistic modeling of water droplets for monocular adherent raindrop recognition using bezier curves
Roser et al. Video-based raindrop detection for improved image registration
US9031285B2 (en) Detection of floating objects in maritime video using a mobile camera
CN110659547B (en) Object recognition method, device, vehicle and computer-readable storage medium
US11887336B2 (en) Method for estimating a relative position of an object in the surroundings of a vehicle and electronic control unit for a vehicle and vehicle
JP2017138736A (en) Image processor and water droplet removal system
US9615050B2 (en) Topology preserving intensity binning on reduced resolution grid of adaptive weighted cells
CN106203431A (en) A kind of image-recognizing method and device
Yakimov et al. Traffic signs detection and tracking using modified hough transform
CN107209930B (en) Method and device for stabilizing all-round looking image
JP6065629B2 (en) Object detection device
JP7206909B2 (en) Attached matter detection device and attached matter detection method
Einecke et al. Detection of camera artifacts from camera images
WO2011016257A1 (en) Distance calculation device for vehicle
EP3144853B1 (en) Detection of water droplets on a vehicle camera lens
CN111612812B (en) Target object detection method, detection device and electronic equipment
US20220014674A1 (en) Imaging system and image processing apparatus
US11003944B2 (en) Topology preserving intensity binning on reduced resolution grid of adaptive weighted cells
US11393128B2 (en) Adhered substance detection apparatus
JP7175188B2 (en) Attached matter detection device and attached matter detection method
JP4765363B2 (en) Occupant detection device and occupant detection method
Hautière et al. Free Space Detection for Autonomous Navigation in Daytime Foggy Weather.

Legal Events

Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase
  Free format text: ORIGINAL CODE: 0009012
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED
AK Designated contracting states
  Kind code of ref document: A1
  Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX Request for extension of the european patent
  Extension state: BA ME
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
17P Request for examination filed
  Effective date: 20170922
RBV Designated contracting states (corrected)
  Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: EXAMINATION IS IN PROGRESS
17Q First examination report despatched
  Effective date: 20180724
GRAP Despatch of communication of intention to grant a patent
  Free format text: ORIGINAL CODE: EPIDOSNIGR1
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: GRANT OF PATENT IS INTENDED
INTG Intention to grant announced
  Effective date: 20191114
GRAS Grant fee paid
  Free format text: ORIGINAL CODE: EPIDOSNIGR3
GRAA (expected) grant
  Free format text: ORIGINAL CODE: 0009210
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: THE PATENT HAS BEEN GRANTED
AK Designated contracting states
  Kind code of ref document: B1
  Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG Reference to a national code
  Ref country code: GB; Ref legal event code: FG4D
REG Reference to a national code
  Ref country code: DE; Ref legal event code: R096; Ref document number: 602015048901; Country of ref document: DE
REG Reference to a national code
  Ref country code: AT; Ref legal event code: REF; Ref document number: 1246785; Country of ref document: AT; Kind code of ref document: T; Effective date: 20200415
  Ref country code: IE; Ref legal event code: FG4D
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: FI; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: RS; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: NO; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200618
REG Reference to a national code
  Ref country code: NL; Ref legal event code: MP; Effective date: 20200318
RAP2 Party data changed (patent owner data changed or rights of a patent transferred)
  Owner name: CONTINENTAL AUTOMOTIVE GMBH
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: LV; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: SE; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: BG; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200618
  Ref country code: GR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200619
  Ref country code: HR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
REG Reference to a national code
  Ref country code: LT; Ref legal event code: MG4D
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: NL; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: SK; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: IS; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200718
  Ref country code: LT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: EE; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: SM; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: PT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200812
  Ref country code: CZ; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: RO; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
REG Reference to a national code
  Ref country code: AT; Ref legal event code: MK05; Ref document number: 1246785; Country of ref document: AT; Kind code of ref document: T; Effective date: 20200318
REG Reference to a national code
  Ref country code: DE; Ref legal event code: R097; Ref document number: 602015048901; Country of ref document: DE
PLBE No opposition filed within time limit
  Free format text: ORIGINAL CODE: 0009261
STAA Information on the status of an ep patent application or granted ep patent
  Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: ES; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: DK; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: AT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: IT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
26N No opposition filed
  Effective date: 20201221
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: PL; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: MC; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
REG Reference to a national code
  Ref country code: CH; Ref legal event code: PL
GBPC Gb: european patent ceased through non-payment of renewal fee
  Effective date: 20200918
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: SI; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
REG Reference to a national code
  Ref country code: BE; Ref legal event code: MM; Effective date: 20200930
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: LU; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200918
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: FR; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200930
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: CH; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200930
  Ref country code: BE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200930
  Ref country code: LI; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200930
  Ref country code: GB; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200918
  Ref country code: IE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20200918
REG Reference to a national code
  Ref country code: DE; Ref legal event code: R079; Ref document number: 602015048901; Country of ref document: DE; Free format text: PREVIOUS MAIN CLASS: G06K0009620000; Ipc: G06V0030190000
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: TR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: MT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: CY; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
  Ref country code: MK; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
  Ref country code: AL; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20200318
REG Reference to a national code
  Ref country code: DE; Ref legal event code: R081; Ref document number: 602015048901; Country of ref document: DE; Owner name: CONTINENTAL AUTONOMOUS MOBILITY GERMANY GMBH, DE; Free format text: FORMER OWNER: CONTINENTAL AUTOMOTIVE GMBH, 30165 HANNOVER, DE
PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
  Ref country code: DE; Payment date: 20230930; Year of fee payment: 9