EP4133403A1 - Method and system for detecting a vehicle having at least one wheel - Google Patents

Method and system for detecting a vehicle having at least one wheel

Info

Publication number
EP4133403A1
EP4133403A1 (application EP20717628.0A)
Authority
EP
European Patent Office
Prior art keywords
wheel
person
vehicle
vehicle wheel
observed region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20717628.0A
Other languages
German (de)
French (fr)
Inventor
Marko Stefanovic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xovis Germany GmbH
Original Assignee
Hella GmbH and Co KGaA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hella GmbH and Co KGaA filed Critical Hella GmbH and Co KGaA
Publication of EP4133403A1 publication Critical patent/EP4133403A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 Detecting or categorising vehicles


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention discloses a method for detecting in an observed region a vehicle having at least one wheel, comprising at least the following steps: - recording a sequence of images of the observed region by at least one sensor, - calculating at least one elevation map for each of the images of the observed region, - detecting at least one candidate wheel in the at least one elevation map according to at least one characteristic feature of a vehicle wheel, - detecting at least one person in the at least one elevation map, - matching in at least one image of the observed region the at least one detected candidate vehicle wheel with at least one detected person, said detected person being located within a predetermined distance from the at least one candidate vehicle wheel, - tracking the matched at least one candidate vehicle wheel and the at least one detected person in a predetermined number of consecutive images of the observed region.

Description

Method and system for detecting a vehicle having at least one wheel
The present invention relates to a method and a system for detecting a vehicle having at least one wheel in an observed region.
State of the art
In the counting methods known from the prior art, systems that comprise a video camera connected to an image analysis unit are used as counting devices. A system of this type is designed to observe a region, for example the entry region of a store or of a means of transport, to recognize and localize objects moving in the region and to trigger a counting event when one of the objects crosses a predefined boundary line, for example a door threshold, which runs in the region and divides it.
The counting methods known from the prior art are usually based on the computation of elevation maps of the observed region (see for example US2011169917). While those methods can detect persons in a reliable way, the detection of vehicles having at least one wheel, especially bicycles, presents more difficulties, as the different components of the vehicle cannot be recognized in a reliable way in the elevation map; components of the same vehicle are therefore often detected as being part of different vehicles or are not detected at all.
Description
The object of the present invention is therefore to develop a method and a system with the aid of which a vehicle having at least one wheel can be detected in a reliable way.
The object is achieved by a method according to claim 1. Further, a system to perform the method of claim 1 is described in claim 8.
Advantageous developments and embodiments of the present invention are disclosed by the features in the dependent claims.
The present invention describes a method for detecting a vehicle having at least one wheel in an observed region, the method comprising the steps of:
- recording a sequence of images of the observed region by at least one sensor,
- calculating at least one elevation map for each of the images of the observed region,
- detecting at least one candidate vehicle wheel in the at least one elevation map according to at least one characteristic feature of a vehicle wheel,
- detecting at least one person in the at least one elevation map,
- matching for each image the at least one detected candidate vehicle wheel with the at least one detected person, said detected person being located within a predetermined distance from the at least one candidate vehicle wheel,
- tracking the matched at least one candidate vehicle wheel and the at least one detected person in a predetermined number of consecutive images.
With the aid of this method it is possible to detect vehicles having at least one wheel in a reliable way. Preferably, the vehicle having at least one wheel is a bicycle. Preferably, the wheel which is detected is a front wheel of the vehicle.
The sequence of images of the observed region can be recorded using at least one sensor. The at least one sensor is preferably an optical sensor. The optical sensor can be a single photo camera, a CCD camera, a video camera, a streak camera, a time of flight camera or a stereo camera. Preferably the optical sensor is a stereo camera. The at least one sensor is preferably characterized by an exposure time and by an image repetition rate. The image repetition rate indicates how many images are recorded in a given period of time.
When recording images using a sensor arrangement, the observed region is the region that is observed by the sensor arrangement. Typically, the observed region is three-dimensional and contiguous. Preferably, a boundary line runs in the observed region. In a preferred embodiment, the boundary line reproduces the course of a door threshold in an entry region of a store or a means of transport. Preferably, the boundary line divides the region into an entry region and an exit region.
Each image recorded by the at least one sensor can be divided into one or more areas of interest (AOIs). Preferably, the one or more AOIs have a rectangular shape. Preferably, at least one AOI includes the boundary line of the observed region. Preferably, at least one AOI includes a door threshold of a means of transport.
Preferably, consecutive images contain the same AOIs: in this way, each AOI can be monitored over time. A consecutive AOI is defined as the same AOI in consecutive images. An AOI may coincide with the whole image.
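Purely as an illustration, such a rectangular AOI could be represented as a simple data structure that is applied unchanged to every image of the sequence. The following minimal sketch (Python, assuming images are NumPy arrays; the class and field names are placeholders and not part of the patent) shows one possibility:

```python
from dataclasses import dataclass
import numpy as np

@dataclass(frozen=True)
class AOI:
    """Rectangular area of interest; the same rectangle is reused in
    consecutive images so that the AOI can be monitored over time."""
    x: int        # left edge (pixels)
    y: int        # top edge (pixels)
    width: int
    height: int

    def crop(self, image: np.ndarray) -> np.ndarray:
        """Return the part of the image covered by this AOI."""
        return image[self.y:self.y + self.height, self.x:self.x + self.width]
```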
An elevation map, also known as a height map, is computed for the one or more AOIs. The elevation map gives the height value for each pixel in the AOI. Preferably, the ground level of the one or more AOIs has a pixel value equal to zero. In a preferred embodiment the values of the pixels are represented by different colors in the elevation map: preferably a grey scale is used, with bright pixels representing higher pixel values and dark pixels representing the ground level with value 0. Objects which have a height above the ground level have a pixel value different from zero. Higher objects are represented by pixels having higher values. Preferably, higher objects are represented by brighter pixels.
In a preferred embodiment, the elevation map of the one or more AOIs is further filtered. Preferably, pixels of the elevation map that are outside a predetermined threshold are filtered out. Preferably the predetermined threshold is a range and comprises a maximum value and a minimum value: pixels that have values below said minimum value and/or above said maximum value are filtered out. Preferably, pixels that have values outside the threshold range are set equal to zero. Preferably the maximum and/or minimum values of the threshold range are set equal to a typical minimum and/or maximum value of a vehicle wheel height. Preferably the maximum and/or minimum values of the threshold range are set equal to a typical minimum and/or maximum value of a bicycle wheel height.
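As a purely illustrative sketch of this filtering step (Python with NumPy; the 25-75 cm default range is taken from the typical bicycle wheel heights mentioned later in the description, everything else is an assumption of the example):

```python
import numpy as np

def filter_elevation_map(elevation_map: np.ndarray,
                         min_height_cm: float = 25.0,
                         max_height_cm: float = 75.0) -> np.ndarray:
    """Set every pixel whose height lies outside the wheel-height range to zero.

    elevation_map: 2-D array of heights above ground level (ground = 0).
    """
    filtered = elevation_map.copy()
    outside = (filtered < min_height_cm) | (filtered > max_height_cm)
    filtered[outside] = 0.0
    return filtered
```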
Preferably the filtered elevation map of the one or more AOIs is searched for segments that could represent a wheel of a vehicle, preferably a bicycle wheel. The operation of searching for segments in one or more AOIs is called segmentation. Preferably, segments are formed by adjacent pixels having similar characteristics. Preferably, the segments are analyzed with respect to one or more characteristic features of a vehicle wheel, preferably of a bicycle wheel. The one or more characteristic features of a vehicle wheel can be one or more of: size, length, eccentricity, inclination angle, ridgeness, curviness and gap. The size is defined as the area, in terms of number of pixels, occupied by a segment in the elevation map. Preferably segments that occupy an area below a predetermined threshold are disregarded as possible wheel candidates.
The number of pixels occupied by a segment can be converted into an area in square centimeters: in a preferred embodiment, segments that occupy an area of less than 225 cm² are not taken into account as wheel candidates.
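A minimal sketch of this segmentation and size filter is given below (Python with NumPy/SciPy). Grouping any adjacent non-zero pixels is a simplification of the "similar characteristics" criterion, and the centimeters-per-pixel factor is a placeholder that depends on the camera geometry:

```python
import numpy as np
from scipy import ndimage

def segment_wheel_candidates(filtered_map: np.ndarray,
                             cm2_per_pixel: float = 1.0,
                             min_area_cm2: float = 225.0):
    """Group adjacent non-zero pixels into segments and drop small ones.

    Returns a list of boolean masks, one per segment whose area
    (pixel count times cm2_per_pixel) reaches the 225 cm2 threshold
    named in the description.
    """
    labels, n = ndimage.label(filtered_map > 0)   # 4-connected components
    segments = []
    for i in range(1, n + 1):
        mask = labels == i
        if mask.sum() * cm2_per_pixel >= min_area_cm2:
            segments.append(mask)
    return segments
```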
The length of a segment is defined as the length of its main axle. Preferably, a threshold is defined for the length of the segments: segments with a length below or above a predetermined threshold are disregarded as possible candidate vehicle wheels.
The inclination angle is defined as the angle between the main axle of a segment and the normal vector to a predefined boundary line running in the image, preferably in the AOI. In a preferred embodiment the boundary line is a door threshold. In a preferred embodiment an angle value of 45° is taken as threshold: segments with an angle above said predetermined threshold are regarded as possible candidate vehicle wheels.
In the elevation map, possible wheel candidates present an eccentricity, i.e. they have an elongated form rather than a circular one. In a preferred embodiment, segments which have a pronounced eccentricity are regarded as possible wheel candidates. Preferably a pronounced eccentricity has a value close to 1.
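The patent does not prescribe formulas for these geometric features; one plausible reading, sketched below in Python, takes the principal axis of the segment's pixel distribution as its main axle and derives length, eccentricity and inclination angle from it (the function and parameter names are illustrative only, and the boundary normal is assumed to be a unit vector):

```python
import numpy as np

def segment_geometry(mask: np.ndarray, boundary_normal: np.ndarray):
    """Estimate length, eccentricity and inclination angle of a segment.

    mask: boolean mask of the segment in the elevation map.
    boundary_normal: 2-D unit vector normal to the boundary line
                     (e.g. the door threshold).
    """
    ys, xs = np.nonzero(mask)
    coords = np.stack([xs, ys], axis=1).astype(float)
    coords -= coords.mean(axis=0)
    cov = np.cov(coords, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    minor, major = eigvals[0], eigvals[1]
    main_axle = eigvecs[:, 1]                  # direction of the main axle

    # Length: extent of the segment along its main axle (in pixels).
    projection = coords @ main_axle
    length = projection.max() - projection.min()

    # Eccentricity of the equivalent ellipse: close to 1 for elongated segments.
    eccentricity = np.sqrt(1.0 - minor / major) if major > 0 else 0.0

    # Inclination angle between the main axle and the normal to the boundary line.
    cos_angle = abs(float(main_axle @ boundary_normal))
    inclination_deg = np.degrees(np.arccos(np.clip(cos_angle, 0.0, 1.0)))

    return length, eccentricity, inclination_deg
```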
In a preferred embodiment, when a vehicle having at least one wheel is pushed by a person, there is usually a free space in front of the wheel. The free space is defined as an area where the pixel value is equal to zero. Preferably, if at least two adjacent pixels have a value equal to zero, a gap is formed.
In a preferred embodiment, the ridgeness of a candidate vehicle wheel is also taken into account as a characteristic feature of a vehicle wheel. Vehicle wheels, especially bicycle wheels, have a long and narrow profile that resembles a ridge. The ridgeness gives a measure of how narrow a wheel profile is. A value around 1 means that the candidate vehicle wheel presents a narrow profile.
Another characteristic feature of a vehicle wheel is the curviness. A vehicle wheel profile usually presents a curviness, which can be measured by calculating the gradient along the main axle of a segment and counting how many times the gradient lies above and below a predetermined threshold. The curviness usually lies between -1 and 1.
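Ridgeness, curviness and gap are described only qualitatively; the sketch below (Python, continuing the previous examples) shows one possible heuristic reading. The gradient threshold, the mapping of threshold crossings to a curviness value and the way the "front" of the segment is located are chosen purely for illustration and are not taken from the patent:

```python
import numpy as np

def profile_features(elevation_map: np.ndarray, mask: np.ndarray,
                     main_axle: np.ndarray, gradient_threshold: float = 2.0):
    """Heuristic ridgeness, curviness and gap indicators for one segment."""
    ys, xs = np.nonzero(mask)
    coords = np.stack([xs, ys], axis=1).astype(float)
    centred = coords - coords.mean(axis=0)

    # Ridgeness: 1 minus the ratio of the spread across the main axle to the
    # spread along it; values near 1 indicate a long, narrow (ridge-like) profile.
    along = centred @ main_axle
    ortho = centred @ np.array([-main_axle[1], main_axle[0]])
    ridgeness = 1.0 - (ortho.std() / along.std()) if along.std() > 0 else 0.0

    # Curviness: heights sampled along the main axle, gradient taken, and the
    # number of crossings of the threshold mapped onto the range [-1, 1].
    order = np.argsort(along)
    heights = elevation_map[ys[order], xs[order]]
    grad = np.gradient(heights)
    above = grad > gradient_threshold
    crossings = np.count_nonzero(above[1:] != above[:-1])
    curviness = float(np.clip(1.0 - 0.25 * crossings, -1.0, 1.0))

    # Gap: at least two adjacent zero-valued pixels directly beyond the end of
    # the segment along its main axle (free space in front of the wheel).
    front = coords[np.argmax(along)].astype(int)
    step = np.sign(main_axle).astype(int)
    x1, y1 = front[0] + step[0], front[1] + step[1]
    x2, y2 = x1 + step[0], y1 + step[1]
    h, w = elevation_map.shape
    gap = bool(0 <= y2 < h and 0 <= x2 < w
               and elevation_map[y1, x1] == 0 and elevation_map[y2, x2] == 0)

    return ridgeness, curviness, gap
```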
When a segment in the elevation map of the one or more AOIs presents one or more of the characteristic features mentioned above (size, length, inclination angle, gap, curviness and ridgeness), then the segment is regarded as a candidate vehicle wheel. In a preferred embodiment, a segment must fulfill all the characteristic features mentioned above in order to be regarded as a candidate vehicle wheel.
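Combining the individual features into the final decision could look like the following sketch (Python). All threshold values except the 45° inclination threshold named above are placeholders invented for the example:

```python
def is_candidate_wheel(length, eccentricity, inclination_deg,
                       ridgeness, curviness, gap,
                       require_all: bool = True) -> bool:
    """Decide whether a segment qualifies as a candidate vehicle wheel.

    In the strict variant of the description, all checks must pass;
    otherwise one fulfilled feature is enough.
    """
    checks = [
        20 <= length <= 120,          # plausible wheel length in pixels (placeholder)
        eccentricity > 0.8,           # pronounced elongation
        inclination_deg > 45.0,       # 45 degree threshold named in the description
        ridgeness > 0.7,              # narrow, ridge-like profile
        -1.0 <= curviness <= 1.0,     # curviness within its usual range
        gap,                          # free space in front of the wheel
    ]
    return all(checks) if require_all else any(checks)
```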
An elevation map of an AOI can have more than one segment that is regarded as a candidate vehicle wheel. A candidate vehicle wheel is regarded as part of a vehicle if at least one person is detected within a predefined distance from the candidate vehicle wheel.
In a preferred embodiment, the detection of one or more persons is done by a person detection module. Preferably the detection module receives as input an elevation map of the observed region. Preferably the elevation map is the elevation map of one or more AOIs of the observed region.
The detection of persons using an elevation map is known from the state of the art.
In an image or AOI, if a detected person is located within a predetermined distance from a candidate vehicle wheel, then the candidate vehicle wheel is matched with said detected person. The matching of one candidate vehicle wheel with one detected person is defined as the matching process.
In each image or AOI, a candidate vehicle wheel can be matched or associated with one person. In each image or AOI, a person can be matched or associated with more than one candidate vehicle wheel, if said detected person can be found within a predetermined distance from more than one candidate vehicle wheel. However, if in one image or AOI a candidate vehicle wheel has already been matched with one detected person, further matchings are disregarded.
Preferably the matching process is done for each image. Preferably, if an image is divided into more than one AOI, then the matching process is done for each AOI separately. If a candidate vehicle wheel cannot be matched in one image or AOI with any detected person, the matching process will try to find a candidate detected person for the matching in the consecutive image or consecutive AOI.
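A minimal per-image matching sketch follows (Python). The use of segment and person centres as positions, and all names, are assumptions of the example rather than part of the patent:

```python
import math

def match_wheels_to_persons(wheel_positions, person_positions, max_distance):
    """Match each candidate wheel to at most one detected person.

    wheel_positions / person_positions: lists of (x, y) centres in one image
    or AOI. A person may appear in several pairs, but a wheel that has
    already been matched gets no further match, as described above.
    Returns a list of (wheel_index, person_index) pairs, i.e. the hypotheses.
    """
    hypotheses = []
    for w_idx, (wx, wy) in enumerate(wheel_positions):
        for p_idx, (px, py) in enumerate(person_positions):
            if math.hypot(wx - px, wy - py) <= max_distance:
                hypotheses.append((w_idx, p_idx))
                break   # this wheel is matched; skip the remaining persons
    return hypotheses
```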
A matching or association is regarded as a hypothesis. Each hypothesis is then tracked in consecutive images.
The tracking process comprises finding the positions of the same candidate vehicle wheel together with the same matched/associated person in consecutive images. If the same candidate wheel together with the same matched/associated person is tracked for more than a predetermined number of consecutive images, then the candidate vehicle wheel is considered to be part of a vehicle having a wheel.
If a person was matched/associated with more than one vehicle wheel, thereby forming more than one hypothesis, the tracking will validate only one hypothesis or no hypothesis at all.
If a hypothesis is not validated, then the tracking process for said hypothesis is ended and the hypothesis is disregarded. A matching process will be started again for the candidate vehicle wheel of said disregarded hypothesis.
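A sketch of such a hypothesis tracker is given below (Python, consuming the (wheel id, person id) pairs produced by a matcher like the one sketched above). How the same wheel and person are re-identified from image to image is not specified in the patent and is assumed to be handled by the caller via stable identifiers; the default of 20 images follows the preferred embodiment described with Fig. 1:

```python
from collections import defaultdict

class HypothesisTracker:
    """Track (wheel, person) hypotheses over consecutive images."""

    def __init__(self, required_images: int = 20):
        self.required_images = required_images
        self.streaks = defaultdict(int)   # (wheel_id, person_id) -> consecutive images seen

    def update(self, hypotheses):
        """hypotheses: iterable of (wheel_id, person_id) pairs seen in the current image."""
        seen = set(hypotheses)
        # Hypotheses not re-observed in this image are disregarded;
        # matching for their wheel starts again from scratch.
        for key in list(self.streaks):
            if key not in seen:
                del self.streaks[key]
        validated = []
        for key in seen:
            self.streaks[key] += 1
            if self.streaks[key] >= self.required_images:
                validated.append(key)     # vehicle with matched person considered detected
        return validated
```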
In a preferred embodiment the vehicle having at least one wheel is a bicycle. Preferably, the candidate vehicle wheel is the front wheel of the bicycle.
Optionally, the method further comprises triggering a counting event if the at least one detected vehicle and/or the associated/matched detected person cross a predefined boundary running in the observed region, for example a door threshold.
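Detecting such a boundary crossing could, for example, amount to a sign change of the tracked position relative to the boundary line between two consecutive images, as in the following hedged sketch (Python; all names are placeholders):

```python
def signed_side(point, line_start, line_end):
    """Positive on one side of the boundary line, negative on the other."""
    (px, py), (ax, ay), (bx, by) = point, line_start, line_end
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def crossed_boundary(previous_position, current_position, line_start, line_end):
    """True if a tracked object moved from one side of the boundary line
    (e.g. a door threshold) to the other between two consecutive images.
    The sign of the current side can be used to distinguish entering from
    exiting, e.g. for the counting event."""
    s_prev = signed_side(previous_position, line_start, line_end)
    s_curr = signed_side(current_position, line_start, line_end)
    return s_prev * s_curr < 0
```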
The present invention further describes a system for detecting a vehicle having at least one wheel.
The system for detecting a vehicle having at least one wheel comprises:
- at least one sensor for recording a sequence of images of an observed region,
- a processor for computing at least one elevation map for each of the images of the observed region,
- a first detection module for detecting at least one candidate vehicle wheel in the at least one elevation map according to at least one characteristic feature of a vehicle wheel,
- a second detection module for detecting at least one person in the at least one elevation map,
- a matching module for matching in at least one image of the observed region the at least one candidate vehicle wheel with at least one detected person, if said detected person is located within a predetermined distance from the at least one candidate vehicle wheel,
- a tracking module for tracking the at least one candidate vehicle wheel with the matched person in a predetermined number of consecutive images of the observed region.
The at least one sensor is preferably an optical sensor. The optical sensor can be a single photo camera, a CCD camera, a video camera, a streak camera, a time of flight camera or a stereo camera. Preferably the optical sensor is a stereo camera. The at least one sensor is preferably characterized by an exposure time and by an image repetition rate. The image repetition rate indicates how many images are recorded in a given period of time.
The system comprises a processor for processing the sequence of images of the observed region recorded by the at least one sensor. Preferably, the processor generates at least one elevation map for each image of the observed region recorded by the at least one sensor. Preferably, the processor divides each image recorded by the at least one sensor into one or more areas of interest (AOIs). The processor further computes at least one elevation map for each area of interest.
Preferably, the first detection module detects the presence of at least one candidate vehicle wheel in each elevation map computed by the processor. Preferably the first detection module further filters the at least one elevation map by filtering out all the pixel values in the at least one elevation map that are outside a predetermined range. Preferably said predetermined range is set equal to the typical height of a vehicle wheel. The first detection module detects the presence of at least one candidate vehicle wheel based on one or more predetermined characteristic features of a wheel. The predetermined features of the wheel could be saved in a database in the detection module. Alternatively, the predetermined characteristic features of the vehicle wheel could be saved on a server: in this case the first detection module exhibits means for connecting with the server and retrieving the information about the characteristic features of the vehicle wheel.
Preferably the vehicle is a bicycle. Preferably the wheel to be detected is a front wheel of a bicycle.
Preferably, a second detection module detects the presence of a person in at least one AOI. Preferably the second detection module receives as input at least one elevation map of at least one AOI of the observed region computed by the aforementioned processor. Alternatively, the second detection module could receive the elevation map of at least one AOI of the observed region from a different processor.
The system further comprises a matching module. The matching module receives as input at least one candidate vehicle wheel and at least one detected person. Preferably the matching module receives as input at least one candidate vehicle wheel and at least one detected person for each AOI or image.
The matching module tries to match each candidate vehicle wheel with at least one detected person. In order to be matched with a candidate vehicle wheel, the person should be within a predetermined distance from said candidate vehicle wheel.
In each image or AOI, the matching module can match one candidate vehicle wheel with one detected person. In each image or AOI, the matching module can match one detected person with more than one candidate vehicle wheel, if said detected person can be found within a predetermined distance from more than one candidate vehicle wheel. However, if in one image or AOI a candidate vehicle wheel has already been matched with one detected person, further matchings are disregarded.
The output of the matching module is then one matching for each candidate vehicle wheel and one or more matchings for each detected person. Those matchings are also called hypotheses. If a candidate vehicle wheel has no matching in an image or AOI, the matching module will look for possible matchings in the consecutive image or consecutive AOI.
The system further comprises a tracking module for tracking the hypotheses of the matching module in consecutive images or consecutive AOIs. If a hypothesis is tracked for a predetermined number of consecutive images or consecutive AOIs, then the hypothesis is confirmed and a vehicle with a matched person is detected.
If a hypothesis is not validated, then the tracking of said hypothesis is ended and the hypothesis is disregarded.
In a preferred embodiment, the matching module is integrated in the tracking module.
In a preferred embodiment, the system further comprises a counting module for counting the detected vehicle and/or the detected person when the detected vehicle and/or the detected person cross a boundary line in the observed region.
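Purely as an illustration of how these modules could interact per recorded image, the following sketch (Python) wires them together in the order described above. Every callable and name is a placeholder standing in for the processor, the two detection modules, the matching module, the tracking module and the optional counting module; the concrete interfaces are not defined by the patent:

```python
def process_image(image, compute_elevation_map, detect_wheel_candidates,
                  detect_persons, match, tracker, count_event=None):
    """One pass over a single recorded image with each module passed in as a callable."""
    elevation_map = compute_elevation_map(image)      # processor
    wheels = detect_wheel_candidates(elevation_map)   # first detection module
    persons = detect_persons(elevation_map)           # second detection module
    hypotheses = match(wheels, persons)               # matching module
    validated = tracker.update(hypotheses)            # tracking module
    if count_event is not None and validated:
        count_event(validated)                        # optional counting module
    return validated
```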
The system and method described in this application can be used for detecting and/or counting vehicles having at least one wheel in the entry region of a store or in the entry region of a means of transport. Preferably, the vehicle having at least one wheel is a bicycle.
Preferably the means of transport is a train, a bus, a tram or a public transport means. The system and method of the present invention could help to provide statistics about the number of persons and vehicles, preferably bicycles, that enter and/or exit a means of transport at a stop and/or station of the means of transport.
Description of the figures
Fig. 1 describes a system for detecting a vehicle having a wheel, according to one embodiment of the present invention.
Fig. 2 is a flow diagram of a method for detecting a vehicle having at least one wheel, according to one embodiment of the present invention.
Fig. 3a is an exemplary image of an AOI, according to one embodiment of the present invention.
Fig. 3b represents candidate vehicle wheels, according to one embodiment of the present invention.
In Fig. 1 an exemplary system 100 of the present invention is represented.
In a preferred embodiment, the system 100 comprises at least one sensor 101 for recording a sequence of images of an observed region. Preferably the observed region is an entry region of a store or means of transport. The sensor 101 is preferably an optical sensor. In this exemplary embodiment the sensor 101 is a stereo camera. The system further comprises a processor 102 for processing the sequence of images of the observed region recorded by the sensor 101. In a preferred embodiment, the processor 102 generates at least one elevation map for each image of the observed region recorded by the at least one sensor. Preferably, the processor 102 divides each recorded image into one or more areas of interest: the processor 102 then computes the elevation map for each area of interest.
The system 100 further comprises a first detection module 103 for detecting the presence of at least one candidate wheel of a vehicle in each elevation map computed by the processor 102. The first detection module 103 further filters the at least one elevation map by filtering out all the pixel values in the at least one elevation map that are outside a predetermined range, said predetermined range being a characteristic height of a vehicle wheel, preferably a bicycle wheel. The first detection module 103 detects the presence of at least one candidate vehicle wheel based on one or more predetermined characteristic features of a wheel. In a preferred embodiment the characteristic features of a wheel are characteristic features of a bicycle wheel, preferably of a front wheel of a bicycle. The predetermined features of the wheel could be saved in a database in the detection module 103. Alternatively, the predetermined characteristic features of the wheel could be saved on a server (not shown): in this case the detection module 103 exhibits means for connecting with the server and retrieving the information about the characteristic features of the vehicle wheel.
The output of the detection module 103 is at least one candidate vehicle wheel.
The system 100 further comprises a second detection module 104 for detecting a person in the observed region. The detection module 104 receives as input the elevation maps computed by the processor unit 102. The output of the detection module 104 is at least one candidate person.
The system further comprises a matching module 105. The matching module 105 receives as input the at least one candidate vehicle wheel from the first detection module 103 and the at least one candidate person from the second detection module 104. The matching module 105 checks, for each candidate vehicle wheel from module 103, whether at least one candidate person from module 104 can be found within a predetermined distance from the candidate vehicle wheel; if this is the case, the candidate vehicle wheel is matched with the candidate person. The matching of a candidate vehicle wheel with at least one person within a predetermined distance defines a hypothesis.
The system further comprises a tracking module 106. The tracking module 106 receives as input at least one hypothesis from the matching module 105.
The tracking module 106 tracks the at least one hypothesis in the consecutive images recorded by the at least one sensor 101.
A vehicle is considered to be detected if the hypothesis is tracked in a predetermined number of consecutive images. Preferably, a vehicle is considered to be detected if a given hypothesis is tracked in 20 consecutive images.
In one preferred embodiment, the matching module 105 is integrated in the tracking module 106.
Optionally, the system can further comprise a counting module (not shown) for counting the detected vehicle and/or the detected person when the detected vehicle and/or the detected person cross a boundary line in the observed region.
Figure 2 shows a flow diagram describing a preferred method for the vehicle wheel detection carried out by the first detection module 103.
The first detection module 103 receives as input at least one elevation map for each AOI. The elevation map represents a top view of the AOI and gives the height of possible objects present in the AOI. The objects could be persons, animals, vehicles or luggage. The height of the objects in the elevation map is given by the values of the pixels in the elevation map: the ground level has value zero. In a preferred embodiment the values of the pixels are represented by different colors in the elevation map: preferably a grey scale is used, with bright pixels representing higher pixel values and dark pixels representing the ground level with value 0. At least one elevation map of at least one AOI computed in this way is used as input for step 201 of Fig. 2. Preferably the at least one elevation map of the at least one AOI represents a door threshold.
In step 201 all the pixel values in the at least one elevation map of the at least one AOI that do not represent a typical vehicle wheel height are filtered out. The height of a typical vehicle wheel usually lies in a range between 25 cm and 75 cm. Preferably typical heights for a front wheel of a bicycle are taken for the filter.
In step 202 a segmentation is performed on the elevation map of the at least one AOI obtained in step 201.
In step 203 the segments computed in step 202 are searched for one or more of the following characteristic features:
Size
Inclination angle
Length
Eccentricity
Ridgeness
Curviness
Gap
If a segment presents one or more of the above characteristic features, then the segment is considered to be a candidate vehicle wheel. In a preferred embodiment, the segment is identified as a candidate vehicle wheel if the segment presents all the characteristics described above. In a preferred embodiment more than one candidate vehicle wheel per AOI can be identified.
The at least one candidate vehicle wheel 204 is then sent as input to the matching module 105.
Fig. 3a represents an exemplary image of an observed region recorded by a sensor, preferably a stereo camera, according to a preferred embodiment of the present invention.
In the preferred embodiment, the observed region is an entry region of a means of transport. An area of interest (AOI) is determined by a rectangular shape 301. The AOI includes a boundary line 302: this boundary line represents a door threshold of the means of transport. The boundary line divides the AOI into two regions: a first region (303) located outside the means of transport and a second region (304) located inside the means of transport.
In the observed region two persons (305 and 306) and two bicycles (307 and 308) can be seen.
Persons and/or vehicles, preferably bicycles, crossing the boundary line 302 are counted as entering the means of transport if they go from region 303 to region 304. Persons and/or vehicles, preferably bicycles, crossing the boundary line 302 are counted as exiting the means of transport if they go from region 304 to region 303.
Fig. 3b represents an elevation map of Fig. 3a, in which segments that are regarded as candidate vehicle wheels are shown.
The elevation map is filtered: all pixels that have a height below or above a predetermined threshold are filtered out. In the preferred embodiment of figure 3b this threshold is set equal to a typical height of a bicycle wheel.
The values of the filtered-out pixels are set to zero. For a better visualization of figure 3b, the filtered-out pixels, which would usually be represented as dark pixels, are not shown in figure 3b.
The three segments (309, 310 and 311) are formed by adjacent pixels that have similar characteristic features.
The segments (309, 310 and 311) present one or more of the characteristic features (size, inclination angle, length, eccentricity, ridgeness, curviness and gap) of a candidate vehicle wheel. In the exemplary embodiment of figure 3b the segments (309, 310 and 311) present all the characteristic features of a vehicle wheel. In the figure, the segments (309, 310 and 311) form ellipses with a pronounced eccentricity.
The matching module 105 looks for matching in the AOI 301. The matching module 105 receives from the first module 103 the segments 309, 310 and 311 as inputs.
For the AOI 301, the matching module 105 also receives as input the detected person 305 from the second detection module 104.
The person 306 is not located in the AOI 301 and for this reason is not regarded as input in this preferred embodiment. Note that if the person 306 enters the AOI 301 in a consecutive image, then the person 306 will be regarded as input for the matching module 105. For each of the segments 309, 310 and 311, the matching module 105 searches for a detected person within a predetermined range of them. The segment 309 is then matched/associated with the person 305.
The segments 310 and 311 cannot be matched with person 305, as person 305 is outside the predetermined range. Segments 310 and 311 are therefore not considered to form a valid hypothesis.
The matching of segment 309 with person 305 forms a hypothesis. The hypothesis serves as input for the tracking module 106. The tracking module 106 tracks the hypothesis separately over a predetermined number of consecutive images.
If the hypothesis can be consistently tracked for the predetermined number of consecutive images, then the hypothesis is validated and the bicycle 307 is confirmed to be detected. Hypotheses that could not be validated are disregarded as candidate bicycles. The tracking of the hypothesis of figure 3b will then confirm the detection of bicycle 307.
This confirmed detection can be further used for triggering a counting event if the detected vehicle (307) and/or the detected person (305) cross the boundary line 302 in the observed region 301.
Note that, if in a consecutive image the matching module 105 receives as input a person within the predetermined distance from segment 310 and/or segment 311, then the matching module will match said person with segment 310 and/or segment 311: this matching will form a hypothesis.
The above explanation of the embodiments describes the present invention in the context of examples. Of course, individual features of the embodiments, if technically meaningful, can be freely combined with one another without departing from the scope of the present invention.

Claims

1. A method for detecting in an observed region a vehicle having at least one wheel comprising at least the following steps:
- recording a sequence of images of the observed region by at least one sensor,
- calculating at least one elevation map for each of the images of the observed region,
- detecting at least one candidate vehicle wheel in the at least one elevation map according to at least one characteristic feature of a vehicle wheel,
- detecting at least one person in the at least one elevation map,
- matching in at least one image of the observed region the at least one detected candidate vehicle wheel with at least one detected person, said detected person being located within a predetermined distance from the at least one candidate vehicle wheel,
- tracking the matched at least one candidate vehicle wheel and the at least one detected person in a predetermined number of consecutive images of the observed region.
2. The method according to claim 1, wherein the at least one characteristic feature of a vehicle wheel is one or more of the following characteristic features: size, inclination angle, length, eccentricity, ridgeness, curviness, gap.
3. The method according to any of the previous claims, wherein the observed region is divided into one or more areas of interest having a rectangular shape.
4. The method according to any of the previous claims, wherein the observed region includes a door threshold.
5. The method according to any of the previous claims, wherein the vehicle having at least one wheel is a bicycle.
6. The method according to any of the previous claims, wherein the candidate wheel is a front wheel of the vehicle.
7. The method according to any of the previous claims, wherein the method further comprises triggering a counting event if the detected vehicle and/or the detected person cross a boundary line in the observed region.
8. A system (100) for detecting a vehicle having at least one wheel, comprising:
- at least one sensor (101) for recording a sequence of images of an observed region,
- a processor (102) for computing at least one elevation map for each of the images of the observed region,
- a first detection module (103) for detecting at least one candidate vehicle wheel in the at least one elevation map according to at least one characteristic feature of a vehicle wheel,
- a second detection module (104) for detecting at least one person in the at least one elevation map,
- a matching module (105) for matching in at least one image of the observed region the at least one candidate vehicle wheel with at least one detected person, if said detected person is located within a predetermined distance from the at least one candidate vehicle wheel,
- a tracking module (106) for tracking the at least one candidate vehicle wheel with the matched person in a predetermined number of consecutive images of the observed region.
9. The system of claim 8, wherein the at least one sensor (101) is a stereo camera.
10. The system of claim 8 or 9, wherein the matching module (105) is integrated in the tracking module (106)
11. The system of any of the claims 8 to 10 wherein the system further comprises a counting module for counting the detected vehicle and/or the detected person when the detected vehicle and/or the detected person cross a boundary line in the observed region
EP20717628.0A 2020-04-06 2020-04-06 Method and system for detecting a vehicle having at least one wheel Pending EP4133403A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2020/059731 WO2021204344A1 (en) 2020-04-06 2020-04-06 Method and system for detecting a vehicle having at least one wheel

Publications (1)

Publication Number Publication Date
EP4133403A1 true EP4133403A1 (en) 2023-02-15

Family

ID=70224374

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20717628.0A Pending EP4133403A1 (en) 2020-04-06 2020-04-06 Method and system for detecting a vehicle having at least one wheel

Country Status (2)

Country Link
EP (1) EP4133403A1 (en)
WO (1) WO2021204344A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009039162A1 (en) * 2009-08-27 2011-03-17 Knorr-Bremse Gmbh Monitoring device and method for monitoring an entry or exit area from an access opening of a vehicle to a building part
US20110169917A1 (en) 2010-01-11 2011-07-14 Shoppertrak Rct Corporation System And Process For Detecting, Tracking And Counting Human Objects of Interest
DE102013200817A1 (en) * 2013-01-18 2014-07-24 Hella Kgaa Hueck & Co. Method for detecting an occupancy of a monitored zone
SE1550006A1 (en) * 2015-01-07 2016-06-14 Viscando Ab Method and system for categorization of a scene
CN108898067B (en) * 2018-06-06 2021-04-30 北京京东尚科信息技术有限公司 Method and device for determining association degree of person and object and computer-readable storage medium
JP2020004252A (en) * 2018-06-29 2020-01-09 株式会社東芝 Residue detection system

Also Published As

Publication number Publication date
WO2021204344A1 (en) 2021-10-14


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20221025

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: XOVIS GERMANY GMBH