DK2920767T3 - ROAD TAX OR TELEMATICS SYSTEM DEVICE - Google Patents


Publication number
DK2920767T3
Authority
DK
Denmark
Prior art keywords
image
zone
image sensor
vehicle
optical system
Prior art date
Application number
DK13792913.9T
Other languages
Danish (da)
Inventor
Carl-Olov Carlsson
Original Assignee
Kapsch Trafficcom Ab
Priority date
Filing date
Publication date
Application filed by Kapsch Trafficcom Ab
Application granted
Publication of DK2920767T3

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07BTICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B15/00Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
    • G07B15/06Arrangements for road pricing or congestion charging of vehicles or vehicle users, e.g. automatic toll systems
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07BTICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B15/00Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
    • G07B15/06Arrangements for road pricing or congestion charging of vehicles or vehicle users, e.g. automatic toll systems
    • G07B15/063Arrangements for road pricing or congestion charging of vehicles or vehicle users, e.g. automatic toll systems using wireless information transmission between the vehicle and a fixed station
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625License plates

Landscapes

  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Character Input (AREA)

Description

DESCRIPTION
Technical Field [0001] The present invention relates to the field of devices for tolling or telematics systems. In particular, the invention relates to a device provided with a pixel-based image sensor, wherein the image sensor is adapted to be arranged above a surveillance zone provided on a road. The surveillance zone has an extension along the direction of the road, and the image sensor is adapted to be arranged in between a first and a second end zone of the surveillance zone.
Background [0002] Road tolling systems are used to detect and register vehicles that are using a road subject to toll fees, sometimes in the form of a tax. The most commonly used road tolling systems comprise a gantry with several devices mounted on it, including a transceiver and sensors. The transceiver is used for automatic registration of passing vehicles with transponders. Vehicles with transponders are automatically tolled each time they pass a road toll station. The sensors, preferably cameras, are used to capture images of the license plates of passing vehicles without transponders. Depending on the system concept, the images can be used to perform tolling based on the license plate number or serve as evidence of non-compliance in case transponders are required.
[0003] Commonly, two cameras are arranged as a pair on the gantry and positioned to capture one lane of the surveillance zone. The first of the two cameras is directed towards a first end zone where vehicles enter the surveillance zone, and the second is directed towards a second end zone where the vehicles exit the surveillance zone. The cameras are focused at a preset height above the road corresponding to the height of the license plates of the passing vehicles. The height is preset based on the most common height of license plates on a standard vehicle. The images captured by the cameras are normally processed in an ANPR (automatic number plate recognition) system.
[0004] As described in US6959869B, the cameras capture the front and rear license plate numbers, and the numbers are automatically matched to each other by a control device such as a computer. With this system, at least two cameras are needed to capture both the front and the rear license plate. A transceiver is also needed for the automatic reading and tracking of transponders. Hence, several separate devices are used for vehicle detection, tracking and registration, which increases the cost of the road tolling systems and the visual impact of a gantry with several devices mounted on it.
[0005] With systems as described above, a tracking system utilized between the capturing zones is needed to ensure that the images captured by the two cameras can be linked to the same vehicle. Documents US2004/222904 A1 and US2009/219387 A1 disclose surveillance devices according to prior art.
[0006] Devices are often also needed for vehicle classification based on physical characteristics such as size or volume of the vehicle.
[0007] There is hence a need for an improved device for use in road toll stations that removes the above-mentioned disadvantages. Summary of Invention [0008] An object, among others, of the present invention is to provide a device for tolling or telematics systems that eliminates the problems of a system needing more than one device, e.g. several cameras or sensors, for vehicle detection, tracking and classification. This object is achieved by a device provided with an image sensor arranged above a surveillance zone and provided with a wide-angle optical system. The invention also describes a method to control a device for tolling or telematics systems.
[0009] The present invention relates to a device for tolling or telematics systems provided with an image sensor. The image sensor is adapted to be arranged above a surveillance zone in a plane at a predetermined distance from the device. The predetermined distance is adapted to be the distance the device is mounted above a road it is provided to monitor. The surveillance zone has an elongated extension, which when the device is in use is adapted to be aligned with the direction of the road. The image sensor is adapted to be arranged in between a first and a second end zone of said surveillance zone in order to capture both a front and a rear view of a vehicle driving through the surveillance zone.
[0010] The image sensor is provided with a wide-angle optical system, which is provided with a refraction and which is arranged relative to said image sensor such that an image captured by said image sensor comprises both said first and said second end zone. The effect of this is that only one image sensor is needed for identification of vehicles passing the surveillance zone, since the image sensor can capture the front of vehicles entering the first end zone as well as the rear of vehicles exiting the second end zone. The images comprise the characteristics of the vehicles that can be used for identification, such as license plates or other individual marks, i.e. a "finger print" of the vehicle.
[0011] The image captured by the image sensor also comprises the zone in between the first and the second end zone, i.e. the middle of the surveillance zone; hence the entire surveillance zone is captured in the image. In the middle of the surveillance zone the image sensor captures the vehicles essentially from above.
[0012] According to one aspect of the present invention, a resolution of the image captured by the image sensor varies, and the optical system is arranged relative to said image sensor such that the resolution is higher in a part of the image capturing a high prioritized area, specifically said first and second end zone, and lower in a part of the image capturing a low prioritized area, specifically a zone between said first and second end zone. The high prioritized area is preferably the first and second end zone. Which area of the surveillance zone is chosen as high priority depends on the intended use of the device. The first and second end zones are areas in which a vehicle may be captured by the image sensor at an angle such that the front and the rear of the vehicle are captured, respectively. Therefore, the first and second end zones are often preferred as the high prioritized areas in tolling systems, because then the front of a vehicle as it enters the first end zone and the rear of a vehicle as it exits the second end zone are captured in the image with the highest resolution.
[0013] In tolling systems the middle zone is often considered the low prioritized area. In the middle zone the vehicles are captured by the image sensor from above at a low resolution, because a high-resolution image of the top of the vehicle is not needed in order to track the vehicle through the low prioritized area or to classify the vehicle. However, because the image sensor also captures the vehicle in the middle zone, the vehicle may be tracked through the entire surveillance zone. Depending on the application of the device, the high and low prioritized areas can be provided differently within the surveillance zone.
[0014] The first and second end zones will be considered equivalent to the high prioritized areas in the description of the invention that follows. The zone in between the first and the second end zone, the middle zone, will be equivalent to the low prioritized area in the description that follows. The high and low prioritized areas can be provided elsewhere in the surveillance zone and/or be of a larger or smaller number without departing from the scope of the invention.
[0015] One advantageous feature of the device is that the image sensor is a pixel-based image sensor and that the variation in resolution is achieved with a varied pixel density of said image sensor. The pixel density is higher in the part of the image sensor that captures the first and second end zone, and lower in the part of the image sensor capturing the middle of the surveillance zone. The image sensor can thereby comprise fewer pixels than an image sensor with a constant pixel density across the entire sensor, while high resolution is maintained in the high prioritized area. A high resolution facilitates the identification of the vehicle by license plates or other identifiers on the front and rear of the vehicle, and it is hence preferred to capture the high prioritized areas in high resolution.
[0016] Another advantageous feature of the device is that the variation in resolution is due to a refraction of said optical system, whereby said optical system projects one area unit of said surveillance zone upon a different number of pixels depending on where in said surveillance zone said area unit is located. The refraction of the optical system may vary with the cut, grind or polish of the lens of the image sensor. The first and second end zones are refracted by the optical system to project on a large number of pixels in the image sensor relative to the area of the end zones. The middle of the surveillance zone is refracted to project on a smaller number of pixels relative to the area of the middle of the surveillance zone. Hence, the resolution in the image of the first and second end zone, i.e. the high prioritized areas, is higher than the resolution of the rest of the surveillance zone in the captured image. This enables the entire surveillance zone to be captured with an image sensor with a lower number of pixels than if both the end zones and the middle zone were projected upon the same number of pixels relative to the area of the respective zone. Accordingly, the resolution of the first and second end zone is higher than it would be using a device which does not enable a variation of the refraction of the optical system. Fewer pixels can thereby be used to achieve a given resolution of the first and second end zone while still capturing the entire surveillance zone in one image. Fewer pixels lead to lower costs for the image sensor and to images with fewer pixels; less data memory and less data processing are thereby needed, so the processing speed of the whole system using the device is increased and its cost is lowered.
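The projection principle described above can be illustrated numerically. The following is a hedged sketch, not part of the patent: the optical mapping is modelled as a piecewise-linear function from road position to sensor coordinate, so the local resolution is simply the slope of that function. The zone fractions (end zones covering the outer 25% of the road each, each projected onto one third of the sensor) are assumptions chosen purely for illustration.

```python
def sensor_position(road_pos: float) -> float:
    """Map a road position (0..1 along the surveillance zone) to a
    normalised sensor coordinate (0..1), piecewise-linearly.

    Assumed split: the first end zone (first 25% of the road) fills the
    first third of the sensor, the middle 50% fills the middle third,
    and the second end zone (last 25%) fills the last third.
    """
    if road_pos < 0.25:                                   # first end zone
        return road_pos / 0.25 * (1 / 3)
    if road_pos < 0.75:                                   # middle zone
        return 1 / 3 + (road_pos - 0.25) / 0.5 * (1 / 3)
    return 2 / 3 + (road_pos - 0.75) / 0.25 * (1 / 3)     # second end zone


def resolution(road_pos: float, eps: float = 1e-6) -> float:
    """Local resolution: sensor length covered per unit of road length."""
    return (sensor_position(road_pos + eps) - sensor_position(road_pos)) / eps
```

With this assumed split, `resolution` evaluates to 4/3 inside the end zones and 2/3 in the middle zone: one area unit of road is projected onto twice as many pixels in the high prioritized areas as in the low prioritized one.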
[0017] A combination of a variation in pixel density of the image sensor and the refraction of the optical system is also possible. A combination will increase the difference in resolution between the part of the image capturing the first and second end zone and the part capturing the rest of the surveillance zone. The first and second end zone may be projected by the optical system onto a larger part of the image sensor relative to their actual area, determined by the refraction of the optical system, and this larger part of the sensor may comprise a high pixel density. Analogously, the middle zone of the surveillance area is projected by the optical system onto a smaller part of the image sensor relative to its actual area, and this smaller part comprises a low pixel density. Thereby the characteristics of the vehicles in the first and second end zones are clear and easily read in an image provided by the image sensor, which facilitates a reliable identification of the vehicles.
[0018] In one embodiment of the present invention the optical system comprises a mirror and/or a lens arranged in front of said image sensor and causing said refraction. Thereby, the optical system may be arranged adjacent but not in front of the image sensor. This allows for flexibility in manufacturing of the device, as well as in the positioning of the image sensor and optical system in the device and onto the gantry.
[0019] The width and length of the surveillance zone captured by the image sensor are determined by the characteristics of the wide-angle optical system. One advantageous feature of the optical system is that it is provided with a shape enabling a rectangular surveillance zone. Hence, an optical system may capture the width of only one single lane or the width of several lanes. The characteristics of the optical system also determine the length of the surveillance zone, hence where the first and second end zones are located along the road. Independent of the shape of the optical system, the refraction of the optical system may vary as described above.
[0020] Alternatively the shape of the optical system may be of fish-eye type, and cover a surveillance area having an extension along the direction of the road as well as the width of several lanes of the road.
[0021] One advantageous feature of the present invention is that the device may be a part of, or can be combined with, sensors providing stereoscopic and/or multidimensional images in which vehicle types and/or heights of vehicles can be detected. The effect of this is that the type of vehicle can be identified. By determining, for example, the height of a vehicle passing through the surveillance zone, the toll fee for that vehicle may automatically be determined according to whether it is, e.g., a truck or a car.
[0022] The device may also comprise a transceiver for sending information to and receiving information from passing vehicles' transponders. By this means, vehicles with transponders are automatically charged a toll fee and do not have to be identified by images captured by the image sensor.
[0023] The scope of the present invention also encompasses a system of devices according to the present invention, such a system having access to images captured by a plurality of devices.
[0024] Preferably, the system of devices can combine information extracted from the images in order to track a vehicle moving between surveillance zones corresponding to different image sensors. One image sensor may correspond to one lane of the road in the surveillance zone. If a vehicle enters the first end zone in one lane and exits the second end zone in a different lane, i.e. changes lanes in the middle of the surveillance zone, then the system can combine the information extracted from the images captured by the two corresponding sensors. The effect of this is that a vehicle cannot pass the surveillance zone without being registered by the system. The vehicle may also be captured by an image sensor in the middle of the surveillance zone. Hence, the vehicle can more easily be tracked through the entire surveillance zone, with the image sensors continuously capturing images of all plausible positions in the surveillance zone.
[0025] According to another aspect of the system, it can combine information from images taken by different image sensors in order to estimate a height, a length and/or a volume of a vehicle. Thereby, the vehicle type may be determined and additional stereoscopic sensors may be rendered unnecessary.
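As a minimal numerical illustration of such an estimate (all values below are assumptions for illustration, not taken from the patent), a vehicle's length can be read off its pixel extent in the overhead middle-zone image once the ground resolution of that part of the image is known:

```python
# Assumed values, for illustration only.
middle_zone_px_per_m = 21.3      # assumed ground resolution in the middle zone
vehicle_extent_px = 96           # assumed pixel length of the vehicle in the image

# Length in metres = pixel extent / pixels per metre.
length_m = vehicle_extent_px / middle_zone_px_per_m
print(f"estimated length: {length_m:.1f} m")   # about 4.5 m, a passenger car
```

A height estimate would additionally require images of the same vehicle from at least two viewpoints, which is what combining sensors across lanes provides.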
[0026] Another aspect of the invention is a method to control a device or system of the above-described kind, wherein the sensor continuously captures images of the surveillance zone. No vehicle can pass the surveillance zone without being captured by the sensor, because images are constantly captured. Thereby, the method is reliable with respect to ensuring that all vehicles are registered and that toll is applied to all passing vehicles that should pay toll.
[0027] According to another aspect of the method, the device or system of the above-described kind is adapted to be controlled to save an output from the sensor, wherein the output comprises at least a first and a second image, the first image comprising characteristics of a front of a vehicle in the first end zone and the second image comprising characteristics of a rear of the vehicle in the second end zone. The first and second images saved as an output from the sensor are paired and can be used to identify the passing vehicle. Pairing is preferably done by identification of the license plate number read in the image. The image pair may also be used to check that the vehicle does not carry false license plates, i.e. a license plate number identified in the first end zone should be identified in an image of the second end zone within reasonable time. If not, the vehicle might be carrying a fake license plate. The image pair may also be saved as proof that the vehicle did indeed pass the tolling zone, in case of a dispute over the toll applied.
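The pairing step described above can be sketched as follows. This is an illustrative sketch only: the data structure, the function names and the 30-second transit window are assumptions, not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class PlateRead:
    plate: str
    timestamp: float   # seconds
    zone: str          # "entry" (first end zone) or "exit" (second end zone)


MAX_TRANSIT_S = 30.0   # assumed upper bound on a normal passage time


def pair_reads(reads):
    """Pair each entry read with the first matching exit read.

    Returns (pairs, unmatched_entries). An unmatched entry may indicate
    a fake plate or a missed exit capture and can be flagged for review.
    """
    exits = [r for r in reads if r.zone == "exit"]
    pairs, unmatched = [], []
    for entry in (r for r in reads if r.zone == "entry"):
        match = next((e for e in exits
                      if e.plate == entry.plate
                      and 0 < e.timestamp - entry.timestamp <= MAX_TRANSIT_S),
                     None)
        if match:
            pairs.append((entry, match))
            exits.remove(match)   # each exit read is used at most once
        else:
            unmatched.append(entry)
    return pairs, unmatched
```

An entry read that stays unmatched inside the window can then be flagged, corresponding to the fake-plate check the method describes.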
[0028] Images comprising characteristics of a front or a rear of a vehicle carrying a transponder need not be saved as an output. These vehicles are preferably tolled automatically, and the images are thereby not needed for identification. However, the images could be saved for other purposes.
[0029] Another advantageous aspect of the method is that the output comprises a third image, wherein the third image comprises the vehicle in between the first and the second end zone. The vehicle is seen from above in between the first and the second end zone. Thereby, the vehicle may be tracked all the way through the surveillance zone, and proof in the form of images is kept of the passage. This is made possible by continuous capturing by the sensor.
[0030] According to the method, the device may also be combined with other technology for vehicle identification, such as radio frequency identification.
Brief Description of Drawings [0031] The present invention will now be described in detail with reference to the figures, wherein:
Figure 1 shows a schematic view of one embodiment of the present invention.
Figure 2 shows a schematic view of another embodiment of the present invention.
Figure 3 shows a schematic view of the present invention seen from above.
Figure 4 shows a schematic view of an image sensor according to the invention.
Detailed Description of Drawings [0032] In the following, embodiments of the present invention are described. The invention is however not limited thereto. All the figures are schematic.
[0033] Figure 1 shows a first embodiment of the device 1 for tolling or telematics systems provided with a pixel based image sensor 2. Figure 1 further shows a surveillance zone 14 provided on a road 4. The device 1 is mounted upon a gantry (not shown) a predetermined distance above the road, such that the plane of the surveillance zone 14 becomes level with the surface of the road 4. The elongated surveillance zone 14 has an extension along the direction of the road 4. The image sensor 2 is provided with a wide-angle optical system 9. The optical system 9 has a refraction which is arranged relative to the image sensor 2 such that an image captured by the image sensor 2 comprises the first and the second end zone 6, 7. The refraction and reflection by the optical system 9 upon the image sensor 2 is shown schematically in figure 1 and figure 2. The cut and grind of the optical system 9 determining the refraction is not shown in the figure.
[0034] In figure 1 a first vehicle 8 is entering the first end zone 6. The front of the first vehicle 8 is captured in an image by the image sensor 2, as the refraction of the wide-angle optical system 9 covers the entire surveillance zone 14. If a second vehicle (not shown) were to exit the surveillance zone 14 in the second end zone 7 at the same time, the rear of the second vehicle would be captured by the image sensor 2 in the same image. The first and second end zones 6, 7 are high prioritized areas in the embodiment shown in figure 1.
[0035] Figure 1 further shows the optical system 9 arranged relative to the image sensor 2 such that the resolution is higher in the part of the image capturing said first 6 and second 7 end zone and lower in the part of the image capturing a zone between said first 6 and second 7 end zone. The resolution is set by the pixel density of the image sensor 2 as well as by the refraction of the optical system 9. In figure 1 the optical system 9 projects an area unit representing 1/4 of the surveillance zone 14, represented by the first and second end zone 6, 7, upon 2/3 of the pixel area of the image sensor 2 in the device 1, i.e. each end zone 6, 7 is projected upon 1/3 of the image sensor. The first end zone 6 is projected upon the pixel area 6s of the image sensor 2 and the second end zone 7 is projected upon the pixel area 7s of the image sensor 2. The low prioritized area 3 is in between the first and the second end zone 6, 7 in the embodiment shown in figure 1. Hence, for the low prioritized area 3, an area unit representing 3/4 of the surveillance zone 14 is projected upon 1/3 of the pixel area, shown as area 3s, of the image sensor 2. Therefore, the high prioritized areas, which are smaller than the low prioritized area, are projected on a comparably larger pixel area, and hence the number of pixels representing the high priority areas is larger compared to the number of pixels representing the low priority areas. The resolution of the high priority areas, the first and second end zones 6, 7, is thus high. The density of the pixels in the pixel area of the image sensor 2 whereon the high priority area is projected may also be higher, increasing the resolution further.
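The projection fractions in the paragraph above imply a fixed resolution ratio between the end zones and the middle zone, which can be checked with a few lines of arithmetic. The total pixel count and zone length below are arbitrary assumptions; only the fractions come from the text.

```python
total_pixels = 1920                    # assumed pixel width of the sensor
zone_length_m = 40.0                   # assumed length of the surveillance zone

end_zones_road = 0.25 * zone_length_m  # both end zones together: 1/4 of the zone
end_zones_px = 2 / 3 * total_pixels    # projected upon 2/3 of the pixel area
middle_road = 0.75 * zone_length_m     # middle zone: 3/4 of the zone
middle_px = 1 / 3 * total_pixels       # projected upon 1/3 of the pixel area

res_end = end_zones_px / end_zones_road   # pixels per metre, end zones
res_middle = middle_px / middle_road      # pixels per metre, middle zone
print(res_end / res_middle)               # end zones get ~6x the resolution
```

The ratio is independent of the assumed sensor size and zone length: (2/3 ÷ 1/4) / (1/3 ÷ 3/4) = 6.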
[0036] The surveillance zone 14 may be divided differently, such that the ratio between the high prioritized area and the low prioritized area is greater or smaller, and the ratio between the projections of the high and low prioritized areas upon the image sensor 2 may differ as well. The high prioritized area does not have to correspond to the end zones of the surveillance zone 14. Just as before, the first and second end zones 6, 7 will be considered equivalent to general high prioritized areas, and the zone in between the first and the second end zone, the middle zone 3, will be equivalent to a general low prioritized area. The high and low prioritized areas can be provided elsewhere in the surveillance zone and/or be of a larger or smaller number without departing from the scope of the invention.
[0037] In another embodiment, shown in figure 2, the optical system 9 comprises a mirror 10 arranged in front of the image sensor 2 causing refraction such that the surveillance zone 14 is captured on the image sensor 2. The embodiment in figure 2 differs only in that the optical system 9 is of a different kind; its function and the device as such are the same as in the embodiment in figure 1. Hence, as in figure 1, the ratio between the areas of the first end zone 6 and second end zone 7 versus the surveillance zone 14 is smaller than the ratio of the two projected on the image sensor 2. Thereby, the first and second end zones 6, 7 are captured by the image sensor 2 with a larger number of pixels and thus at a higher resolution than the middle zone 3 of the surveillance zone 14. The mirror also allows for other ratios than the one exemplified in figure 2.
[0038] Figure 3 shows a system 11 of devices 1 according to the invention, each provided with a pixel based image sensor 2, not shown in figure 3. The system 11 of devices 1 has access to images captured by a plurality of devices 1, 21. Thereby, the system 11 of devices 1, 21 covers the surveillance zones 14 of lanes 12a, 12b provided on a road 4 in the embodiment shown in figure 3. The system 11 combines information extracted from the images captured by the image sensors 2, corresponding to the devices 1, 21, in order to track the first vehicle 8 in case it moves between surveillance zones 14 corresponding to different devices 1, 21 and image sensors (not shown in figure 3). In figure 3 each image sensor 2, not shown in the figure, is provided with an individual optical system 9 capturing the surveillance zone 14 of one individual lane 12a, 12b. The surveillance zones 14 captured by the individual devices 1, 21 positioned next to each other could also overlap, depending on the characteristics of the individual optical systems 9. In the embodiment shown in figure 3, the surveillance zones 14 are of rectangular shape, and may overlap between the lanes 12a, 12b in order to handle vehicles that pass between the lanes.
[0039] The system 11 of devices 1, 21 combines information from images taken by different image sensors in order to estimate a height, a length and/or a volume of the vehicle 8. Thereby, the vehicle 8 is classified and appropriate toll fees are charged based on the vehicle type, in figure 3 a passenger car. The license plate 13 at the front of the vehicle 8 is also captured by the image sensor 2 as the vehicle enters the first end zone 26 of lane 12a. Thereby, the vehicle 8 can be identified by the license plate 13. Alternatively, fingerprint detection could be used to identify the car. In such a method the vehicle is identified based on dents, specific bolts or other characteristics different from the license plate 13.
[0040] Figure 4 shows the projections by the optical systems 9 on the image sensors (not shown in the figure) from the system 11 of devices 1, 21 shown in figure 3. The two images 32, 33 in figure 4 represent the output of the image sensors, wherein the projections of the first end zones 6, 26 are captured in the areas 6s, 26s of the images 32, 33, the low prioritized middle zones 3, 23 are captured in the areas 3s, 23s, and the second end zones 7, 27 are captured in the areas 7s, 27s. As exemplified in figure 4, the middle zones 3s, 23s, even though the largest areas of the surveillance zones, are represented upon 1/3 of the images 32, 33. The first vehicle 8 is captured at high resolution in the pixel area 26s of the image sensor. Thereby, the license plate 13 is also captured at high resolution and can be used to identify the first vehicle 8. An overlap between the two surveillance zones captured by the two image sensors is shown in figure 4. Thereby, the first vehicle 8 can easily be tracked even if it moves between the two lanes 12a, 12b during its passage through the surveillance zone.
[0041] The invention is not limited to the specific embodiments presented, but includes all variations within the scope of the present claims.
[0042] Reference signs mentioned in the claims should not be seen as limiting the extent of the matter protected by the claims, and their sole function is to make claims easier to understand.
[0043] As will be realised, the invention is capable of modification in various obvious respects, all without departing from the scope of the appended claims. Accordingly, the drawings and the description thereto are to be regarded as illustrative in nature, and not restrictive.
REFERENCES CITED IN THE DESCRIPTION
This list of references cited by the applicant is for the reader's convenience only. It does not form part of the European patent document. Even though great care has been taken in compiling the references, errors or omissions cannot be excluded and the EPO disclaims all liability in this regard.
Patent documents cited in the description • US6959869B [0004] • US2004222904A1 [0005] • US2009219387A1 [0005]

Claims (11)

ROAD TAX OR TELEMATICS SYSTEM DEVICE

1. Device (1, 21) for tolling or telematics systems, wherein said device (1, 21) is provided with an image sensor (2), wherein said image sensor (2) is adapted to be arranged above a road (4), wherein said image sensor (2) defines a surveillance zone (14) in a plane at a predetermined distance from said device (1, 21), wherein said surveillance zone (14) has an elongated extension in said plane, wherein said image sensor (2) is provided with an optical system (9, 29), which optical system (9, 29) is provided with a refraction and is arranged in relation to said image sensor (2) such that an image (32, 33) captured by said image sensor (2) comprises a first and a second end zone (6, 26; 7, 27) and a middle zone (3, 23) located between said first and said second end zone (6, 26; 7, 27), characterized in that a resolution of said image (32, 33) captured by said image sensor (2) varies, and the optical system (9, 29) is adapted and arranged in relation to said image sensor (2) such that said resolution is higher in a part of the image which captures at least one of said first and second end zones (6, 26; 7, 27) and lower in a part of the image which captures said middle zone (3, 23), wherein said variation in resolution is due to a refraction of said optical system (9, 29), wherein said optical system (9, 29) projects one area unit of said surveillance zone (14) onto a different number of pixels depending on where in said surveillance zone (14) said area unit is located.

2. Device (1, 21) according to claim 1, wherein said image sensor (2) is pixel-based and the variation in resolution is characterized by a varied pixel density of said image sensor (2).

3. Device (1, 21) according to claim 1 or 2, wherein said optical system (9, 29) comprises at least one mirror and/or one lens which is located in front of said image sensor (2) and which causes said refraction.

4. Device (1, 21) according to any one of claims 1-3, wherein said optical system (9, 29) is provided with a shape which enables a rectangular surveillance zone (14).

5. Device (1, 21) according to any one of the preceding claims, wherein said device (1, 21) is part of, or combined with, sensors which provide stereoscopic and/or multi-dimensional images, whereby vehicle types and/or vehicle heights can be detected.

6. System (11) of devices (1, 21) according to any one of the preceding claims, wherein said system (11) has access to images captured by a plurality of devices (1, 21).

7. System (11) according to claim 6, wherein said system (11) can combine information extracted from said images in order to track a vehicle (8) moving between surveillance zones (14) corresponding to different image sensors (2).

8. System (11) according to claim 6 or 7, wherein said system (11) can combine information from images captured by different image sensors (2) in order to estimate a height, a length and/or a volume of a vehicle (8).

9. Method for controlling a device (1, 21) according to any one of claims 1-5 or a system (11) according to claims 6-8, wherein said image sensor (2) continuously captures images of said surveillance zone (14).

10. Method according to claim 9, wherein said device (1, 21) or system (11) is adapted to be controlled in order to save a readout from said image sensor (2), wherein said readout comprises at least a first and a second image (32, 33), wherein said first image (32, 33) comprises characteristics of a front of a vehicle (8) in said first end zone (6, 26), and wherein a second image (32, 33) comprises characteristics of a rear of said vehicle (8) in said second end zone (7, 27).

11. Method according to claim 10, wherein said readout comprises a third image (32, 33), wherein the third image comprises said vehicle (8) between said first and said second end zone (6, 26; 7, 27).

12. Method according to any one of claims 9-11, wherein said readout consists of one composite image of said surveillance zone (14), wherein said composite image is composed of at least a first and a second image captured at different times, such that said composite image reveals said surveillance zone (14) and comprises at least a front view and a rear view of said vehicle (8) in said first and second respective end zones (6, 26; 7, 27).

13. Method according to claim 12, wherein said composite image further comprises a third image captured at a third moment, such that a view of said vehicle substantially from above and in said middle zone (3, 23) is also included in said composite image.

14. Method according to any one of claims 9-13, wherein said device (1, 21) or system (11) is combined with other vehicle identification technology, such as radio frequency identification.
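The core idea of claim 1 — projecting one area unit of the surveillance zone onto a different number of pixels depending on where that unit lies — can be sketched numerically. The sketch below is a minimal illustration, not taken from the patent: zone dimensions, pixel densities, and zone boundaries are all assumed values.

```python
# Sketch: non-uniform projection of an elongated road-surveillance strip onto
# sensor pixels. All numbers are illustrative assumptions, not from the patent.

def pixels_per_meter(position_m, zone_len_m=30.0, end_zone_m=5.0,
                     end_density=40.0, mid_density=10.0):
    """Pixel density along the elongated surveillance zone:
    high in the two end zones, low in the middle zone."""
    if position_m < end_zone_m or position_m > zone_len_m - end_zone_m:
        return end_density   # end zones: more pixels per area unit
    return mid_density       # middle zone: fewer pixels per area unit

def pixel_budget(zone_len_m=30.0, step_m=0.1):
    """Total sensor pixels needed along the strip for this density profile."""
    n = int(zone_len_m / step_m)
    return sum(pixels_per_meter(i * step_m) * step_m for i in range(n))

total = pixel_budget()
uniform_high = 40.0 * 30.0   # pixels if the whole zone used end-zone density
print(total, uniform_high)   # the non-uniform profile needs fewer pixels
```

The point of the sketch is the trade-off the claim exploits: the sensor spends its pixel budget where vehicle fronts and rears are captured (the end zones) while the middle zone, used mainly for tracking, is sampled more coarsely.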
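The readout scheme of claims 10-12 can likewise be sketched: frames captured at different times are cropped to their respective zones and stitched into one composite image showing the vehicle's front, top, and rear. The zone boundaries and toy frame contents below are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the composite-readout idea in claims 10-12: frames captured at
# different moments are cropped to their zones and stitched into one composite.
# Zone boundaries and frame contents are illustrative assumptions.

FIRST_END, MIDDLE, SECOND_END = slice(0, 2), slice(2, 6), slice(6, 8)

def composite(frame_front, frame_top, frame_rear):
    """Stitch three 8-column frames: front view from the first end zone,
    overhead view from the middle zone, rear view from the second end zone."""
    return frame_front[FIRST_END] + frame_top[MIDDLE] + frame_rear[SECOND_END]

# Each 'frame' is a toy 8-column strip image (one string per column).
t0 = ["front"] * 8        # vehicle front visible when entering the zone
t1 = ["top"] * 8          # vehicle seen from above in the middle zone
t2 = ["rear"] * 8         # vehicle rear visible when leaving the zone

print(composite(t0, t1, t2))
# → ['front', 'front', 'top', 'top', 'top', 'top', 'rear', 'rear']
```

One composite per vehicle passage keeps the saved readout small while still containing the high-resolution front and rear views (e.g. for number-plate reading) together with the overhead view of the middle zone.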
DK13792913.9T 2012-11-19 2013-11-19 ROAD TAX OR TELEMATICS SYSTEM DEVICE DK2920767T3 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP12193179.4A EP2733677A1 (en) 2012-11-19 2012-11-19 Device for tolling or telematics systems
PCT/EP2013/074156 WO2014076300A1 (en) 2012-11-19 2013-11-19 Device for tolling or telematics systems

Publications (1)

Publication Number Publication Date
DK2920767T3 true DK2920767T3 (en) 2017-04-03

Family

ID=47178509

Family Applications (1)

Application Number Title Priority Date Filing Date
DK13792913.9T DK2920767T3 (en) 2012-11-19 2013-11-19 ROAD TAX OR TELEMATICS SYSTEM DEVICE

Country Status (14)

Country Link
US (1) US10777075B2 (en)
EP (3) EP2733677A1 (en)
AU (2) AU2013346697B2 (en)
CA (1) CA2889639C (en)
CL (1) CL2015001325A1 (en)
DK (1) DK2920767T3 (en)
ES (2) ES2618281T3 (en)
NZ (2) NZ738642A (en)
PL (1) PL2920767T3 (en)
PT (1) PT2920767T (en)
RU (1) RU2658204C2 (en)
SI (1) SI2920767T1 (en)
WO (1) WO2014076300A1 (en)
ZA (1) ZA201503192B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2733677A1 (en) 2012-11-19 2014-05-21 Kapsch TrafficCom AB Device for tolling or telematics systems
EP3026653A1 (en) 2014-11-27 2016-06-01 Kapsch TrafficCom AB Method of controlling a traffic surveillance system
EP3026652A1 (en) * 2014-11-27 2016-06-01 Kapsch TrafficCom AB Double stereoscopic sensor
CN105357440B (en) * 2015-11-27 2018-10-12 苏州佳世达电通有限公司 focusing system and focusing method
CN108205895A (en) * 2016-12-16 2018-06-26 杭州海康威视数字技术股份有限公司 A kind of method and device for building vehicle style training sample picture library
CN109448141A (en) * 2018-11-01 2019-03-08 北京悦畅科技有限公司 A kind of parking fee self-help charging method and self-help charger
CN110084183A (en) * 2019-04-25 2019-08-02 杭州鸿雁电器有限公司 Determine that personnel pass in and out the method and system in region
US11172112B2 (en) 2019-09-09 2021-11-09 Embedtek, LLC Imaging system including a non-linear reflector
CN110855883B (en) * 2019-11-05 2021-07-20 浙江大华技术股份有限公司 Image processing system, method, device equipment and storage medium
CN114508662A (en) * 2022-02-15 2022-05-17 特瑞拓软件(辽宁)有限公司 Security check photographing device applied to expressway
CN115439783B (en) * 2022-09-01 2023-10-31 苏州思卡信息系统有限公司 Detection method and equipment for vehicle identification and tracking

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5568406A (en) * 1995-12-01 1996-10-22 Gerber; Eliot S. Stolen car detection system and method
US6466260B1 (en) * 1997-11-13 2002-10-15 Hitachi Denshi Kabushiki Kaisha Traffic surveillance system
US6959869B2 (en) 1999-06-07 2005-11-01 Metrologic Instruments, Inc. Automatic vehicle identification (AVI) system employing planar laser illumination and imaging (PLIIM) based subsystems
JP3601392B2 (en) * 1999-12-27 2004-12-15 住友電気工業株式会社 Image processing apparatus, image processing method, and vehicle monitoring system
JP4108468B2 (en) 2002-12-25 2008-06-25 三菱電機株式会社 License plate reader
US6970102B2 (en) * 2003-05-05 2005-11-29 Transol Pty Ltd Traffic violation detection, recording and evidence processing system
US7495638B2 (en) * 2003-05-13 2009-02-24 Research Triangle Institute Visual display with increased field of view
US7295106B1 (en) * 2003-09-03 2007-11-13 Siemens Schweiz Ag Systems and methods for classifying objects within a monitored zone using multiple surveillance devices
DE102004013807B4 (en) * 2004-03-18 2010-12-09 T-Mobile Deutschland Gmbh Electronic toll system for traffic routes and method of operation thereof
JP2005308961A (en) 2004-04-20 2005-11-04 Canon Inc Imaging optical system
US8289399B2 (en) * 2004-08-09 2012-10-16 Hewlett-Packard Development Company, L.P. System and method for image capture device
JP2010504596A (en) * 2006-09-25 2010-02-12 トニー・メイアー Micro diffraction monitoring illumination system
PT103960B 2008-02-07 2010-05-10 Brisa Auto Estradas De Portugal Automatic registration system integrated in an electronic toll collection system
US9584710B2 (en) * 2008-02-28 2017-02-28 Avigilon Analytics Corporation Intelligent high resolution video system
MD4332C1 (en) * 2010-02-08 2015-09-30 Общество С Ограниченной Ответственностью "Корпорация "Строй Инвест Проект М" Process for determining the movement speed and coordinates of vehicles with their subsequent identification and automatic recording of traffic rules violations and device for its implementation
PL2463682T3 (en) 2010-12-07 2013-08-30 Kapsch Trafficcom Ag Method for determining the distance of a vehicle to a wireless beacon and wireless beacon for same
US9407819B2 (en) * 2011-06-30 2016-08-02 Dvp Technologies Ltd. System and method for multidirectional imaging
SI2574092T1 (en) 2011-09-21 2014-03-31 Kapsch Trafficcom Ag Wireless beacon and method for selective communication according to 5.8 and 5.9-GHz DSRC standards
US8760317B2 (en) * 2011-10-28 2014-06-24 Xerox Corporation High occupancy vehicle lane enforcement system using an information system for reduced false positives
US9530060B2 (en) * 2012-01-17 2016-12-27 Avigilon Fortress Corporation System and method for building automation using video content analysis with depth sensing
US8781172B2 (en) * 2012-03-30 2014-07-15 Xerox Corporation Methods and systems for enhancing the performance of automated license plate recognition applications utilizing multiple results
EP2733677A1 (en) 2012-11-19 2014-05-21 Kapsch TrafficCom AB Device for tolling or telematics systems

Also Published As

Publication number Publication date
RU2658204C2 (en) 2018-06-19
US10777075B2 (en) 2020-09-15
RU2015123662A (en) 2017-01-10
EP3182380B1 (en) 2019-04-03
ZA201503192B (en) 2016-06-29
AU2017279793B2 (en) 2019-08-22
EP3182380A1 (en) 2017-06-21
PT2920767T (en) 2017-02-28
EP2920767B1 (en) 2017-01-04
ES2733060T3 (en) 2019-11-27
AU2013346697B2 (en) 2018-01-18
CA2889639A1 (en) 2014-05-22
AU2013346697A1 (en) 2015-05-14
US20160247398A1 (en) 2016-08-25
CL2015001325A1 (en) 2016-03-04
NZ707316A (en) 2018-07-27
WO2014076300A1 (en) 2014-05-22
EP2733677A1 (en) 2014-05-21
ES2618281T3 (en) 2017-06-21
SI2920767T1 (en) 2017-04-26
EP2920767A1 (en) 2015-09-23
PL2920767T3 (en) 2017-07-31
CA2889639C (en) 2016-06-14
NZ738642A (en) 2019-06-28
AU2017279793A1 (en) 2018-01-25

Similar Documents

Publication Publication Date Title
DK2920767T3 (en) ROAD TAX OR TELEMATICS SYSTEM DEVICE
JP6569138B2 (en) Axle number detection device, vehicle type discrimination system, axle number detection method and program
US20120033074A1 (en) Method and device for determining a valid lane marking
TW201702936A (en) Dual embedded optical character recognition (OCR) engines
KR101880243B1 (en) Multi-lane hi-pass system for improving matching accuracy of vehicle number and OBU unique number
WO2016136660A1 (en) Vehicle type determination device, toll collection facility, vehicle type determination method, and program
KR101894710B1 (en) Apparatus for classifying vehicle type and control method thereof
JP4961305B2 (en) Vehicle monitoring device for toll road automatic toll booth
US20100117864A1 (en) Vehicle identification and speed detection method and system
JPH0714037A (en) Monitor system for checking state of fare payment of road user
WO1993019429A1 (en) Vision apparatus
EP3736777A1 (en) Method and system for determining the digital fingerprint of vehicles in transit
JP2009048225A (en) Vehicle recognition device and vehicle recognition method
KR101363176B1 (en) Electronic toll collecting system and method thereof
CN112133120A (en) Vehicle prompt processing method and device
WO1997050067A1 (en) A multilane traffic registration system, comprising multiple and gantry-mounted detector means
CN219303146U (en) Vehicle type abnormality detection system for expressway exit toll station
AU638929B1 (en)
JP2007265224A (en) Toll collection system and toll collection method
Blythe et al. The technical and institutional issues associated with the enforcement of a multi-lane debiting system
JPH03214292A (en) Illegal passing preventing device
JP2019012332A (en) Toll collection system and lane monitoring method
AU3739893A (en) Vision apparatus
JP2012174121A (en) Toll collection system