EP4052215A1 - Method for detecting a moving state of a vehicle

Method for detecting a moving state of a vehicle

Info

Publication number
EP4052215A1
EP4052215A1 (application EP20774947.4A)
Authority
EP
European Patent Office
Prior art keywords
frames
vehicle
pixel
subsequent
groups
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20774947.4A
Other languages
German (de)
French (fr)
Inventor
Shuai FAN
Shuai LI
Jiehu HOU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Publication of EP4052215A1 publication Critical patent/EP4052215A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/14 Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20056 Discrete and fast Fourier transform, [DFT, FFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle



Abstract

According to a method for detecting a moving state of a vehicle, a plurality of frames of a scene are captured by a camera of the vehicle. A perspective transformation of at least a portion of each of the plurality of the frames is performed to achieve transformed portions of each of the frames. A pixel shift is determined in the respective transformed portions of a group of at least two subsequent frames of the plurality of the frames. The respective pixel shifts of each of the groups of the at least two subsequent frames are evaluated, and a moving state of the vehicle is determined based on this evaluation.

Description

Method for detecting a moving state of a vehicle
Field of the invention
The embodiments relate to a method for detecting a moving state, in particular a stopped state, of a vehicle, for example an autonomous driving car.
Description of the related art
Visual odometry and driver assistance systems have become more and more popular in recent years and naturally attract increasing attention from researchers in the areas of intelligent robotics and autonomous driving. One frequently occurring task is to detect whether a vehicle is currently stopped. In standard production vehicles this can be achieved by using vehicle movement sensors such as wheel ticks.
Especially in early test vehicles, access to vehicle movement sensors can be limited. The sensor data which are easily available from the vehicle, or with simple additional equipment, are GPS data, IMU data, and data from a camera mounted in the vehicle. However, data received from IMU and GPS do not provide a reliable detection of a stopped state of a vehicle. The detection of a stopped state of a vehicle with GPS fails, for example, in tunnels or urban canyons, but also often in standard situations. For example, even if a vehicle is stopped, the GPS position can move. The detection of a stopped state of the vehicle by using an IMU is very sensitive to noise. Small movements of the vehicle, such as movements induced by the wipers, moving passengers or moving traffic, can falsely indicate a movement of the vehicle. Therefore, the detection of a stopped state of a vehicle has to focus on evaluating camera data and subsequently performing visual processing. Known methods of visual processing that could be applied for optical detection of a stopped state of a vehicle are, for example, feature point tracking in SLAM (Simultaneous Localization And Mapping) or optical flow analysis. However, most of the prior art optical methods such as feature point tracking or optical flow analysis are sensitive to movement in parts of a captured image of a scene. For example, rain splashing on the windscreen, moving traffic, wiper movement or other movements in a captured image can falsely indicate a movement of the vehicle. In conclusion, the known prior art optical methods do not allow a robust and reliable detection of a stopped state of a vehicle.
Summary of the invention
The problem to be solved by the invention is to provide a method for detecting a moving state of a vehicle, in particular a stopped state of a vehicle, by using optical methods that are robust against disturbing influences that might erroneously indicate a movement of the vehicle, even if the vehicle is really stopped.
Solutions of the problem are described in the independent claims. The dependent claims relate to further improvements of the invention.
An embodiment relates to a method for detecting a moving state of a vehicle, in particular a stopped state of the vehicle, which allows a reliable detection of the moving state of the vehicle even if optical data are influenced by disturbing factors.
In a method for detecting a moving state of a vehicle, a plurality of frames of a scene are captured by a camera of the vehicle. The frames may be consecutive frames. A perspective transformation of at least a portion of each of the plurality of consecutive frames may be performed to achieve transformed portions of each of the frames. A pixel shift may be determined in the respective transformed portions of a group of at least two preferably subsequent frames of the plurality of the frames. The respective pixel shifts of each of the groups of the at least two preferably subsequent frames of the plurality of the frames may be evaluated. Finally, the moving state of the vehicle may be determined based on the evaluation of the respective pixel shifts of each of the groups of the at least two preferably subsequent frames of the plurality of the frames. It is preferred to use subsequent frames, but frames may also be skipped, for example, if they are distorted by a foreign object or noise, or if they cannot be used for other reasons.
Such an embodiment for detecting a moving/stopped state of a vehicle requires only one optical sensor, i.e. a camera. In particular, the method works with a single camera, which may be a monocular camera.
The evaluation of the camera data does not depend on the details of a captured image, for example where local movement can happen; rather, the method may evaluate the global movement of the captured images/frames.
The frames/images, which may be consecutive ones, may be captured by the camera in a front view. The method may derive several portions from each front-view image. The perspective transformation may be performed for each portion of a captured image/frame in order to better emphasize the vehicle movement.
For example, each portion of a captured frame/image is transformed from the front view into a side-view or bird's-eye-view picture part in which the movement of the vehicle can be detected better than in the original front-view image.
Several techniques such as Fourier analysis/transformation, structural similarity analysis or joint information entropy analysis for each portion of the frames/images may be used to get the shift between two frames. The shift between two frames may be analyzed for each pair of captured frames to obtain a judgement regarding the moving state of the vehicle, for example to obtain a judgement of whether the vehicle is stopping.
The embodiment may analyze the change of the image overall and can tolerate movement in parts of a captured image. As a result, it may even handle extreme scenes, such as scenes with heavy rain or dense traffic. The method is robust with regard to nearly all kinds of scenes difficult to analyze, such as moving wipers, scenes with busy traffic and bad weather conditions. Moreover, the method can be applied to any vehicle independent of the type of vehicle. Except for the camera, it does not require specific equipment. Moreover, the method is independent of the type of camera used.
Additional features and advantages are set forth in the detailed description that follows. It is to be understood that both the foregoing general description and the following detailed description are merely exemplary, and are intended to provide an overview or framework for understanding the nature and character of the claims.
Description of Drawings
In the following, the invention will be described by way of example, without limitation of the general inventive concept, on examples of embodiment with reference to the drawings.
Figure 1 shows a flowchart illustrating method steps of a method for detecting a moving state of a vehicle; Figure 2 illustrates a frame/image captured by a camera in a front-view with portions of the frame to be evaluated by the method for detecting a moving state of a vehicle; and
Figure 3 illustrates an exemplified result of a Fourier analysis and inverse Fourier transformation to determine a pixel shift between captured frames.
Figure 4 shows an embodiment with two thresholds.
In Figure 1, a first embodiment including steps S1 - S5 for detecting a moving state/stopped state of a vehicle according to the invention is shown.
In a first step S1, a plurality of frames, which may be consecutive frames of a scene, is captured by a camera of the vehicle. According to an embodiment, the camera used may be configured as a monocular camera. The plurality of frames may be captured by the camera in a front-view.
In a step S2, a perspective transformation of at least a portion of each of the plurality of the frames is performed to achieve transformed portions of each of the frames. According to an embodiment, the portion of each of the plurality of the frames is transformed from the captured front-view into a side-view or a bird's-eye view by the perspective transformation. The perspective transformation is used such that the transformed picture part allows for a better detection of the movement of the vehicle than the originally captured image.
In a step S3, a pixel shift is determined in the respective transformed portions of a group of at least two subsequent frames of the plurality of the frames. According to an embodiment, the pixel shift is determined by calculating a phase correlation. The phase correlation may be calculated by performing a Fourier transformation on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames, calculating the cross-power spectrum by taking the complex conjugate of one of the Fourier transformed frames, multiplying the Fourier transforms together elementwise, normalizing this product elementwise, and performing an inverse Fourier transform of the result.
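The phase-correlation computation of step S3 can be sketched as follows with NumPy; the function name, array shapes and the peak-folding convention are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def phase_correlation_shift(portion_a: np.ndarray, portion_b: np.ndarray):
    """Estimate the (row, column) pixel shift between two equally sized
    grayscale image portions by phase correlation: Fourier transform both
    portions, build the elementwise-normalized cross-power spectrum, and
    read the shift off the peak of its inverse Fourier transform."""
    fa = np.fft.fft2(portion_a)
    fb = np.fft.fft2(portion_b)
    # Cross-power spectrum: conjugate one of the transforms, multiply
    # elementwise, and normalize every element to unit magnitude.
    cross = fb * np.conj(fa)
    cross /= np.maximum(np.abs(cross), 1e-12)
    # Ideally the inverse transform is a single pulse; its coordinate
    # encodes the translation between the two portions.
    correlation = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # FFT indices wrap around, so fold large indices to negative shifts.
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, correlation.shape))
```

For identical portions the pulse sits at the origin, i.e. a zero shift, which corresponds to a stopped vehicle.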
According to another embodiment, the pixel shift is determined by performing a structural similarity analysis or a joint information entropy analysis on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames.
In a step S4, the respective pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames are evaluated. According to an embodiment, the respective pixel shifts are evaluated by calculating a mean deviation and/or a standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
According to an embodiment, a respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames is defined. It may further be evaluated whether the mean deviation and/or the standard deviation of the pixel shifts exceeds the respective threshold value.
In a step S5, the moving state of the vehicle is determined based on the evaluation of the respective pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames. According to an embodiment, a stopped state of the vehicle is identified when it is evaluated that the mean deviation and/or the standard deviation of the pixel shifts is below the respective threshold value.
The various method steps S1 to S5 may be performed by a control unit of the vehicle which may be embodied as a processor or a controller.
The above-described method steps are explained in the following by an example of a captured front-view image/frame shown in Figure 2. The frame of a scene shown in Figure 2 is captured by a camera of a vehicle. The camera has captured the illustrated front view through the front window of the vehicle.
A perspective transformation is applied to the left portion P1 and the right portion P2 of the captured image. A perspective transformation is a collineation set up in a plane by projecting the points of another plane onto it from two different centers of projection. For example, in the right portion P2 of the captured image of Figure 2, the trapezoid shown in dashed lines is converted to the rectangle R2. The transformation matrix is calculated from four corresponding points, and then the image in the entire trapezoid P2 is transformed.
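The computation of the transformation matrix from four corresponding points can be sketched as follows. This is a plain-NumPy stand-in for what libraries such as OpenCV provide as `cv2.getPerspectiveTransform`; the corner coordinates in the test are invented for illustration, not taken from Figure 2:

```python
import numpy as np

def perspective_matrix(src, dst):
    """Solve for the 3x3 homography H (with H[2, 2] = 1) that maps four
    source points to four destination points, e.g. the corners of the
    dashed trapezoid onto the corners of rectangle R2. Each point
    correspondence contributes two rows of the standard 8x8 linear system."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, point):
    """Map one (x, y) point through the homography (homogeneous divide)."""
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return (u / w, v / w)
```

Once H is known, every pixel inside the trapezoid is mapped the same way to build the side-view portion that feeds the subsequent shift analysis.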
Finally, the portion P1 on the left side of the front view is transformed into a picture which is just like looking out of the left side window of the vehicle. The portion P2 on the right side of the front view of the captured image is transformed into a picture which is just like looking out of the right side window of the vehicle. The purpose of this transformation is to provide better data for the following image analysis, for example a subsequent Fourier analysis, because panned images are easier to process for a Fourier analysis.
After having performed the above-described transformation for each of the frames, a Fourier analysis may be performed on the left and right transformed portions of two respective frames. After the Fourier analysis, the cross-power spectrum is calculated by taking the complex conjugate of one of the Fourier transformed frames, multiplying the Fourier transforms together elementwise and normalizing this product elementwise, and then an inverse Fourier transform is applied.
As a result of the Fourier analysis and the inverse Fourier transformation, a pulse signal 14, which may extend from the zero plane 12, will be obtained, as illustrated in the diagram 10 of Figure 3. The pixel shift in the transformed portions of the two subsequent frames is determined by the coordinate of this pulse signal.
The respective pixel shifts of each group/pair of two subsequent frames of the plurality of the frames are determined. In order to evaluate the respective pixel shifts of each of the groups of the subsequent frames of the plurality of the frames, the mean deviation and/or standard deviation may be calculated. For each of the mean deviation and/or the standard deviation a respective threshold value can be set. If the determined pixel shift is below the respective threshold value of the mean deviation and/or the standard deviation, a stopped state of the vehicle can be detected. On the other hand, if it is determined that the pixel shift is above the respective threshold value of the mean deviation and/or the standard deviation, a moving state of the vehicle is identified.
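The threshold decision over the per-pair pixel shifts can be sketched as follows; the function name, the use of shift magnitudes, and the default threshold values are illustrative assumptions (the patent sets one threshold for the mean and one for the standard deviation but discloses no numeric values):

```python
import numpy as np

def is_stopped(pixel_shifts, mean_threshold=0.5, std_threshold=0.5):
    """Decide the moving state from the per-pair (row, column) pixel
    shifts of subsequent frames. The vehicle is reported as stopped only
    when both the mean and the standard deviation of the shift magnitudes
    stay below their respective thresholds; otherwise it is moving.
    Threshold values are placeholders, not taken from the patent."""
    shifts = np.asarray(pixel_shifts, dtype=float)  # shape (n_pairs, 2)
    magnitudes = np.hypot(shifts[:, 0], shifts[:, 1])
    return bool(magnitudes.mean() < mean_threshold
                and magnitudes.std() < std_threshold)
```

Requiring both statistics to stay small tolerates isolated outlier pairs (e.g. a wiper sweep) better than reacting to any single large shift.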
When considering a ground vehicle, the optical analysis can be made from front view images. In the case of panning images, such as images captured by a drone carrying a downward directed camera, there might not be any need for making a perspective transformation.
Regarding Figure 2, the method has been explained by performing a perspective transformation of the left and right portions of an image. It has to be noted that the method is not limited to these left and right portions of the image. Other portions of captured images may be used for perspective transformation such as a portion of the image directly ahead of the vehicle, for example to make a bird's-eye view transformation.
The method may be applied to detect a moving state of a vehicle, for example an autonomous driving car. However, the method is not limited to detecting the moving state of a vehicle. The method may be applied to other applications having limited movement sensors, where a stopped state of an object needs to be detected by a camera. For example, the method may be used for control of virtual reality scenes with a smartphone, or for controlling the flight of an aerial vehicle, for example for control of a hovering state of a drone.
Figure 4 shows an embodiment in which two thresholds are set, one for the mean and one for the standard deviation; if both values are less than the corresponding thresholds, the vehicle is stopping, otherwise it is moving. As shown in the figure, this yields a very accurate result.

Claims

1. A method for detecting a moving state of a vehicle, comprising:
- capturing a plurality of frames of a scene by a camera of the vehicle,
- performing a perspective transformation of at least a section of each of the plurality of frames to achieve transformed portions of the frames,
- determining a pixel shift in the respective transformed portions of at least one group of at least two frames of the plurality of frames,
- evaluating the respective pixel shifts of at least one group of the at least two subsequent frames of the plurality of frames,
- determining the moving state of the vehicle based on the evaluation of the respective pixel shifts of the at least one group of the at least two subsequent frames of the plurality of frames.
2. The method of claim 1, wherein the camera is embodied as a monocular camera.
3. The method of claim 1 or 2, wherein the plurality of frames are captured in a front-view of the camera.
4. The method of any of the claims 1 to 3, wherein the frames are consecutive frames.
5. The method of any of the claims 1 to 4, wherein the portion of each of the plurality of the frames is transformed in a side-view or a bird-view by the perspective transformation.
6. The method of any of the claims 1 to 5, comprising: determining the pixel shift by performing a Fourier analysis based on the group of the at least two subsequent frames of the plurality of the frames, calculating the cross-power spectrum by taking the complex conjugate of one of the Fourier transformed frames, multiplying the Fourier transforms together elementwise, normalizing this product elementwise, and performing an inverse Fourier transform of the result.
7. The method of any of the claims 1 to 5, comprising: determining the pixel shift by performing a phase correlation on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames.
8. The method of any of the claims 1 to 5, comprising: determining the pixel shift by performing a structural similarity analysis on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames.
9. The method of any of the claims 1 to 5, comprising: determining the pixel shift by performing a joint information entropy analysis on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames.
10. The method of any of the claims 1 to 9, comprising: evaluating the respective pixel shifts by calculating at least one of a mean deviation or a standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
11. The method of claim 10, comprising:
- defining a respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames,
- evaluating if the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames exceeds the respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
12. The method of claim 11, comprising: determining the moving state of the vehicle such that a stopped state of the vehicle is identified, when it is evaluated that the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames is below the respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
EP20774947.4A 2019-09-20 2020-09-16 Method for detecting a moving state of a vehicle Pending EP4052215A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019214395 2019-09-20
PCT/EP2020/075891 WO2021053031A1 (en) 2019-09-20 2020-09-16 Method for detecting a moving state of a vehicle

Publications (1)

Publication Number Publication Date
EP4052215A1 true EP4052215A1 (en) 2022-09-07

Family

ID=72560585

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20774947.4A Pending EP4052215A1 (en) 2019-09-20 2020-09-16 Method for detecting a moving state of a vehicle

Country Status (3)

Country Link
EP (1) EP4052215A1 (en)
CN (1) CN114730453A (en)
WO (1) WO2021053031A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117576200B (en) * 2024-01-15 2024-05-03 山东大学 Long-period mobile robot positioning method, system, equipment and medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0327339D0 (en) * 2003-11-25 2003-12-31 Fortkey Ltd Inspection apparatus and method
IES20060800A2 (en) * 2005-10-28 2007-09-05 Hi Key Ltd A method and apparatus for calibrating an image capturing device, and a method and apparatus for outputting image frames from sequentially captured image frames with compensation for image capture device offset
CN107590438A (en) * 2017-08-16 2018-01-16 中国地质大学(武汉) A kind of intelligent auxiliary driving method and system
US20190126941A1 (en) * 2017-10-31 2019-05-02 Wipro Limited Method and system of stitching frames to assist driver of a vehicle

Also Published As

Publication number Publication date
WO2021053031A1 (en) 2021-03-25
CN114730453A (en) 2022-07-08

Similar Documents

Publication Publication Date Title
US10691962B2 (en) Systems and methods for rear signal identification using machine learning
KR102098140B1 (en) Method for monotoring blind spot of vehicle and blind spot monitor using the same
JP6668435B2 (en) Blind spot monitoring method of automobile and blind spot monitor using the same {METHOD FOR MONOTORING BLIND SPOT OF VEHICLE AND BLIND SPOT MONITOR USING THE SAME}
US20220157068A1 (en) System and Method of Determining a Curve
EP3007099B1 (en) Image recognition system for a vehicle and corresponding method
CN110737266B (en) Automatic driving control method and device, vehicle and storage medium
US9626599B2 (en) Reconfigurable clear path detection system
US20200143179A1 (en) Infrastructure-free nlos obstacle detection for autonomous cars
US20200097739A1 (en) Object detection device and object detection method
US11774582B2 (en) Imaging and radar fusion for multiple-object tracking
CN106845332B (en) Vision-based wet road condition detection using tire side splash
EP3154835A1 (en) Top-down refinement in lane marking navigation
EP3286056B1 (en) System and method for a full lane change aid system with augmented reality technology
KR20170124299A (en) A method and apparatus of assisting parking by creating virtual parking lines
CN107273788A (en) The imaging system and vehicle imaging systems of lane detection are performed in vehicle
WO2021062596A1 (en) Systems and methods for predicting a vehicle trajectory
KR20170127036A (en) Method and apparatus for detecting and assessing road reflections
JP3562278B2 (en) Environment recognition device
EP4052215A1 (en) Method for detecting a moving state of a vehicle
US20230316539A1 (en) Feature detection device, feature detection method, and computer program for detecting feature
US20220101025A1 (en) Temporary stop detection device, temporary stop detection system, and recording medium
EP3227827B1 (en) Driver assistance system, motor vehicle and method for classifying a flow vector
Kurihata et al. Detection of raindrops on a windshield from an in-vehicle video camera
JP3081660B2 (en) Distance detection method
US11417115B2 (en) Obstacle recognition device

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220712

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
111L Licence recorded

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

Free format text: EXCLUSIVE LICENSE

Name of requester: QUALCOMM TECHNOLOGIES, INC., US

Effective date: 20231103