EP4052215A1 - Method for detecting a moving state of a vehicle - Google Patents
Method for detecting a moving state of a vehicle
- Publication number
- EP4052215A1 (application EP20774947.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- frames
- vehicle
- pixel
- subsequent
- groups
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 41
- 230000009466 transformation Effects 0.000 claims abstract description 21
- 238000011156 evaluation Methods 0.000 claims abstract description 5
- 238000004458 analytical method Methods 0.000 claims description 16
- 230000002311 subsequent effect Effects 0.000 claims description 5
- 238000001228 spectrum Methods 0.000 claims description 3
- 239000003981 vehicle Substances 0.000 description 57
- 230000003287 optical effect Effects 0.000 description 9
- 238000001514 detection method Methods 0.000 description 8
- 230000000007 visual effect Effects 0.000 description 3
- 238000005206 flow analysis Methods 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 230000008859 change Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 230000003455 independent Effects 0.000 description 1
- 230000004807 localization Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000004091 panning Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 230000002226 simultaneous effect Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the embodiments relate to a method for detecting a moving state, in particular a stopped state, of a vehicle, for example an autonomous driving car.
- Visual odometry and driver assistance systems have become increasingly popular in recent years and naturally attract growing attention from researchers in the areas of intelligent robotics and autonomous driving.
- One frequently occurring task is to detect whether a vehicle is currently stopped. In standard production vehicles this can be achieved by using vehicle movement sensors such as wheel ticks.
- the sensor data which are easily available from the vehicle, or with simple additional equipment, are GPS data, IMU data, and data from a camera mounted in the vehicle.
- data received from an IMU and GPS do not provide a reliable detection of a stopped state of a vehicle.
- the detection of a stopped state of a vehicle with GPS fails, for example, in tunnels or urban canyons, but also often in standard situations. For example, even if a vehicle is stopped, the GPS position can move.
- the detection of a stopped state of the vehicle by using an IMU is very sensitive to noise. Small movements of the vehicle, such as movements induced by the wipers, moving passengers or moving traffic, can indicate a movement of the vehicle.
- SLAM (Simultaneous Localization And Mapping)
- most of the prior art optical methods, such as feature point tracking or optical flow analysis, are sensitive to movement in parts of a captured image of a scene. For example, rain splashing on the windscreen, moving traffic, wiper movement or other movements in a captured image can falsely indicate a movement of the vehicle.
- the known prior art optical methods do not allow a robust and reliable detection of a stopped state of a vehicle.
- the problem to be solved by the invention is to provide a method for detecting a moving state of a vehicle, in particular a stopped state of a vehicle, by using optical methods that are robust against disturbing influences that might erroneously indicate a movement of the vehicle, even if the vehicle is really stopped.
- An embodiment relates to a method for detecting a moving state of a vehicle, in particular a stopped state of the vehicle, which allows a reliable detection of the moving state of the vehicle even if optical data are influenced by disturbing factors.
- a plurality of frames of a scene are captured by a camera of the vehicle.
- the frames may be consecutive frames.
- a perspective transformation of at least a portion of each of the plurality of consecutive frames may be performed to achieve transformed portions of each of the frames.
- a pixel shift may be determined in the respective transformed portions of a group of at least two preferably subsequent frames of the plurality of the frames.
- the respective pixel shifts of each of the groups of the at least two preferably subsequent frames of the plurality of the frames may be evaluated.
- the moving state of the vehicle may be determined based on the evaluation of the respective pixel shifts of each of the groups of the at least two preferably subsequent frames of the plurality of the frames. It is preferred to use subsequent frames, but frames may also be skipped, for example, if they are distorted by a foreign object or noise, or if they cannot be used for other reasons.
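Taken together, the steps above amount to a simple loop over frame pairs. The sketch below is a structural illustration only, not the claimed implementation: the shift estimator and the decision threshold are passed in as placeholders (e.g. a phase-correlation routine and a tuned deviation threshold).

```python
import numpy as np

def detect_moving_state(transformed_portions, estimate_shift, threshold=0.5):
    """Evaluate the pixel shift of each group of subsequent frame
    portions and derive the moving state from those shifts overall.
    `estimate_shift(a, b)` stands in for e.g. phase correlation and
    must return a (row_shift, col_shift) pair; `threshold` is an
    assumed tuning value."""
    shifts = [estimate_shift(a, b)
              for a, b in zip(transformed_portions, transformed_portions[1:])]
    # Magnitude of each pairwise shift; near zero when the vehicle stands still.
    magnitudes = [np.hypot(dr, dc) for dr, dc in shifts]
    return "stopped" if np.mean(magnitudes) < threshold else "moving"
```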
- Such an embodiment for detecting a moving/stopped state of a vehicle requires only one optical sensor, i.e. a camera.
- the method works with the use of a single camera which may be a monocular camera.
- the evaluation of the data of the camera does not depend on the details of a captured image, for example where local movement can happen, but the method rather may evaluate the global movement of the captured images/frames.
- the frames/images, which may be consecutive ones, may be captured by the camera in a front view.
- the method may derive several portions from each front view image.
- the perspective transformation may be performed for each portion of a captured image/frame in order to better emphasize the vehicle movement.
- each portion of a captured frame/image is transformed from the front view into a side-view or bird's-eye-view picture part in which the movement of the vehicle can be better detected than in the original front-view image.
- the shift between two frames may be analyzed for each pair of captured frames to obtain a judgement regarding the moving state of the vehicle, for example to obtain a judgement of whether the vehicle is stopping.
- the embodiment may analyze the change of the image overall and can tolerate movement in parts of a captured image. As a result, it may even handle extreme scenes, such as scenes with heavy rain or dense traffic.
- the method is robust with regard to nearly all kinds of scenes difficult to analyze, such as moving wipers, scenes with busy traffic and bad weather conditions.
- the method can be applied to any vehicle independent of the type of vehicle. Except for the camera, it does not require specific equipment.
- the method is independent of the type of camera used.
- Figure 1 shows a flowchart illustrating method steps of a method for detecting a moving state of a vehicle
- Figure 2 illustrates a frame/image captured by a camera in a front view with portions of the frame to be evaluated by the method for detecting a moving state of a vehicle
- Figure 3 illustrates an exemplified result of a Fourier analysis and inverse Fourier transformation to determine a pixel shift between captured frames.
- Figure 4 shows an embodiment with two thresholds.
- In Figure 1, a first embodiment including steps S1 to S5 for detecting a moving state/stopped state of a vehicle according to the invention is shown.
- a plurality of frames which may be consecutive frames of a scene is captured by a camera of the vehicle.
- the camera used may be configured as a monocular camera.
- the plurality of frames may be captured by the camera in a front-view.
- a perspective transformation of at least a portion of each of the plurality of the frames is performed to achieve transformed portions of each of the frames.
- the portion of each of the plurality of the frames is transformed from the captured front view into a side view or a bird's-eye view by the perspective transformation.
- the perspective transformation is used such that the transformed picture part allows for a better detection of the movement of the vehicle than the originally captured image.
- a pixel shift is determined in the respective transformed portions of a group of at least two subsequent frames of the plurality of the frames.
- the pixel shift is determined by calculating a phase correlation.
- the phase correlation may be calculated by performing a Fourier transformation on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames, calculating the cross-power spectrum by taking the complex conjugate of one of the Fourier-transformed frames, multiplying the Fourier transforms together elementwise, normalizing this product elementwise, and performing an inverse Fourier transform of the result.
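The phase-correlation calculation described above can be sketched as follows. This is a minimal NumPy illustration under the assumption that the transformed portions are equally sized grayscale arrays; it is not the patent's implementation.

```python
import numpy as np

def phase_correlation_shift(frame_a, frame_b):
    """Estimate the translational pixel shift between two equally
    sized grayscale frame portions via phase correlation."""
    # Fourier-transform both transformed frame portions.
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    # Cross-power spectrum: complex conjugate of one transform times
    # the other, normalized elementwise to unit magnitude.
    cross = np.conj(fa) * fb
    cross /= np.maximum(np.abs(cross), 1e-12)
    # The inverse transform peaks at the translation offset.
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap offsets beyond half the frame size into negative shifts.
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))
```

A frame that is a copy of its predecessor shifted by a few pixels yields a correlation peak exactly at that offset, while a stopped vehicle produces shifts near (0, 0).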
- the pixel shift is determined by performing a structural similarity analysis or a joint information entropy analysis on the respective transformed portions of the group of the at least two subsequent frames of the plurality of the frames.
- In a step S4, the respective pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames are evaluated.
- the respective pixel shifts are evaluated by calculating at least one mean deviation or a standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
- a respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames is defined. It may further be evaluated if the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames exceeds the respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
- the moving state of the vehicle is determined based on the evaluation of the respective pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
- the moving state of the vehicle is determined such that, in particular, a stopped state of the vehicle is identified when it is evaluated that the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames is below the respective threshold value of the mean deviation and/or the standard deviation of the pixel shifts of each of the groups of the at least two subsequent frames of the plurality of the frames.
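The evaluation of steps S4 and S5 can be sketched as below. The threshold values and the use of the shift magnitude per frame pair are assumptions made for illustration; the patent does not disclose concrete numbers.

```python
import numpy as np

def is_stopped(pixel_shifts, mean_threshold=0.5, std_threshold=0.5):
    """Decide the moving state from the pixel shifts of all evaluated
    frame pairs: a stopped state is identified when both the mean and
    the standard deviation of the shift magnitudes stay below their
    respective (assumed) thresholds."""
    # Magnitude of each (row_shift, col_shift) pair.
    mags = np.hypot(*np.asarray(pixel_shifts, float).T)
    return bool(mags.mean() < mean_threshold and mags.std() < std_threshold)
```

Shifts hovering near zero across the evaluated pairs indicate a stopped vehicle, while large or strongly varying shifts indicate movement.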
- the various method steps SI to S5 may be performed by a control unit of the vehicle which may be embodied as a processor or a controller.
- the above-described method steps are explained in the following by an example of a captured front-view image/frame shown in Figure 2.
- the frame of a scene shown in Figure 2 is captured by a camera of a vehicle.
- the camera has captured the illustrated front view through the front window of the vehicle.
- a perspective transformation is applied to the left portion P1 and the right portion P2 of the captured image.
- a perspective transformation is a collineation set up in a plane by projecting the points of another plane onto it from two different centers of projection.
- the trapezoid shown in dashed lines is converted to the rectangle R2.
- the transformation matrix is calculated from four corresponding points, and then the image in the entire trapezoid P2 is transformed.
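The four-point transformation-matrix computation mentioned above can be illustrated with plain linear algebra. The corner coordinates below are hypothetical, and the solver is a sketch of the standard approach rather than the patent's implementation.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 perspective transformation that maps four source
    points (e.g. trapezoid corners in the front view) onto four
    destination points (e.g. corners of the target rectangle)."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence yields two linear equations in the
        # eight unknown matrix entries (h33 is normalized to 1).
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        rhs.append(v)
    h = np.linalg.solve(np.array(rows, float), np.array(rhs, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Map a single pixel coordinate through the homography."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With a library such as OpenCV, the same matrix would come from `cv2.getPerspectiveTransform`, and `cv2.warpPerspective` would then warp the whole trapezoid into the rectangle.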
- the portion P1 on the left side of the front view is transformed into a picture which is just like looking out of the left side window of the vehicle.
- the portion P2 of the right side of the front view of the captured image is transformed into a picture which is just like looking out of the right side window of the vehicle.
- the purpose of this transformation is to provide better data for the following image analysis, for example a following Fourier analysis, because panned images are easier to process for a Fourier analysis.
- a Fourier analysis may be performed on the left and right transformed portions of respective frames between two frames. After the Fourier analysis, the cross-power spectrum is calculated by taking the complex conjugate of one of the Fourier-transformed frames, multiplying the Fourier transforms together elementwise and normalizing this product elementwise; an inverse Fourier transform is then applied.
- the respective pixel shifts of each group/pair of two subsequent frames of the plurality of the frames are determined.
- the mean deviation and/or standard deviation may be calculated.
- a respective threshold value can be set. If the determined pixel shift is below the respective threshold value of the mean deviation and/or the standard deviation, a stopped state of the vehicle can be detected. On the other hand, if it is determined that the pixel shift is above the respective threshold value of the mean deviation and/or the standard deviation, a moving state of the vehicle is identified.
- the optical analysis can be made from front view images.
- For panning images, such as images captured by a drone carrying a downward-directed camera, there might not be any need for making a perspective transformation.
- the method has been explained by performing a perspective transformation of the left and right portions of an image. It has to be noted that the method is not limited to these left and right portions of the image. Other portions of captured images may be used for perspective transformation such as a portion of the image directly ahead of the vehicle, for example to make a bird's-eye view transformation.
- the method may be applied to detect a moving state of a vehicle, for example an autonomous driving car.
- the method is not limited to be used for detecting the moving state of a vehicle.
- the method may be applied for other applications having limited movement sensors, where a stopped state of an object needs to be detected by a camera.
- the method may be used for control of virtual reality scenes with a smartphone, or for controlling the flight of an aerial vehicle, for example for control of a hovering state of a drone.
- Figure 4 shows an embodiment where two thresholds are set, one for the mean and one for the standard deviation; if both values are below their corresponding thresholds, the vehicle is stopped, otherwise it is moving. As shown in the figure, this gives a very accurate result.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019214395 | 2019-09-20 | ||
PCT/EP2020/075891 WO2021053031A1 (en) | 2019-09-20 | 2020-09-16 | Method for detecting a moving state of a vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4052215A1 true EP4052215A1 (en) | 2022-09-07 |
Family
ID=72560585
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20774947.4A Pending EP4052215A1 (en) | 2019-09-20 | 2020-09-16 | Method for detecting a moving state of a vehicle |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4052215A1 (en) |
CN (1) | CN114730453A (en) |
WO (1) | WO2021053031A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117576200B (en) * | 2024-01-15 | 2024-05-03 | 山东大学 | Long-period mobile robot positioning method, system, equipment and medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0327339D0 (en) * | 2003-11-25 | 2003-12-31 | Fortkey Ltd | Inspection apparatus and method |
IES20060800A2 (en) * | 2005-10-28 | 2007-09-05 | Hi Key Ltd | A method and apparatus for calibrating an image capturing device, and a method and apparatus for outputting image frames from sequentially captured image frames with compensation for image capture device offset |
CN107590438A (en) * | 2017-08-16 | 2018-01-16 | 中国地质大学(武汉) | A kind of intelligent auxiliary driving method and system |
US20190126941A1 (en) * | 2017-10-31 | 2019-05-02 | Wipro Limited | Method and system of stitching frames to assist driver of a vehicle |
-
2020
- 2020-09-16 WO PCT/EP2020/075891 patent/WO2021053031A1/en unknown
- 2020-09-16 EP EP20774947.4A patent/EP4052215A1/en active Pending
- 2020-09-16 CN CN202080080080.XA patent/CN114730453A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2021053031A1 (en) | 2021-03-25 |
CN114730453A (en) | 2022-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10691962B2 (en) | Systems and methods for rear signal identification using machine learning | |
KR102098140B1 (en) | Method for monotoring blind spot of vehicle and blind spot monitor using the same | |
JP6668435B2 (en) | Blind spot monitoring method of automobile and blind spot monitor using the same {METHOD FOR MONOTORING BLIND SPOT OF VEHICLE AND BLIND SPOT MONITOR USING THE SAME} | |
US20220157068A1 (en) | System and Method of Determining a Curve | |
EP3007099B1 (en) | Image recognition system for a vehicle and corresponding method | |
CN110737266B (en) | Automatic driving control method and device, vehicle and storage medium | |
US9626599B2 (en) | Reconfigurable clear path detection system | |
US20200143179A1 (en) | Infrastructure-free nlos obstacle detection for autonomous cars | |
US20200097739A1 (en) | Object detection device and object detection method | |
US11774582B2 (en) | Imaging and radar fusion for multiple-object tracking | |
CN106845332B (en) | Vision-based wet road condition detection using tire side splash | |
EP3154835A1 (en) | Top-down refinement in lane marking navigation | |
EP3286056B1 (en) | System and method for a full lane change aid system with augmented reality technology | |
KR20170124299A (en) | A method and apparatus of assisting parking by creating virtual parking lines | |
CN107273788A (en) | The imaging system and vehicle imaging systems of lane detection are performed in vehicle | |
WO2021062596A1 (en) | Systems and methods for predicting a vehicle trajectory | |
KR20170127036A (en) | Method and apparatus for detecting and assessing road reflections | |
JP3562278B2 (en) | Environment recognition device | |
EP4052215A1 (en) | Method for detecting a moving state of a vehicle | |
US20230316539A1 (en) | Feature detection device, feature detection method, and computer program for detecting feature | |
US20220101025A1 (en) | Temporary stop detection device, temporary stop detection system, and recording medium | |
EP3227827B1 (en) | Driver assistance system, motor vehicle and method for classifying a flow vector | |
Kurihata et al. | Detection of raindrops on a windshield from an in-vehicle video camera | |
JP3081660B2 (en) | Distance detection method | |
US11417115B2 (en) | Obstacle recognition device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20220712 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
111L | Licence recorded |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR Free format text: EXCLUSIVE LICENSE Name of requester: QUALCOMM TECHNOLOGIES, INC., US Effective date: 20231103 |