
WO2011078845A1 - System and method for monitoring road traffic

System and method for monitoring road traffic

Info

Publication number: WO2011078845A1
Authority: WO
Grant status: Application
Prior art keywords: vehicle, sensor, road, image, camera
Application number: PCT/US2009/069051
Other languages: French (fr)
Inventor: Isaac S. Daniel
Original Assignee: F3M3 Companies, Inc.
Priority date: 2009-12-21
Filing date: 2009-12-21

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/01: Detecting movement of traffic to be counted or controlled
    • G08G1/017: Detecting movement of traffic to be counted or controlled, identifying vehicles
    • G08G1/0175: Detecting movement of traffic to be counted or controlled, identifying vehicles by photographing vehicles, e.g. when violating traffic rules

Abstract

A system comprising at least one sensor, at least one camera, at least one processor electronically connected to the at least one sensor and the at least one camera, and computer executable instructions readable by the at least one processor and operative to determine when the at least one sensor detects the presence of a vehicle and use the at least one camera to capture at least one image of the vehicle. A system comprising at least one sensor positioned along a no pass zone of a road, and at least one camera positioned along the road such that it is operative to capture at least one image of at least one vehicle in the no pass zone. A method comprising using at least one processor to perform any of the following: detecting the presence of a vehicle in a no pass zone, and capturing at least one image of the vehicle. An apparatus comprising a road marker and at least one sensor connected to the road marker.

Description

TITLE

SYSTEM AND METHOD FOR MONITORING TRAFFIC AND ENFORCING TRAFFIC LAWS


FIELD

The present disclosure relates generally to electronic systems and methods, and more particularly, to systems, methods, and various other disclosures related to road safety and traffic monitoring, and in particular, to monitoring and enforcing no pass zones.

BACKGROUND

One of the most dangerous types of accidents drivers face is the head-on collision with oncoming traffic. Such collisions are quite violent due to the high speeds and head-on angles of impact involved. A common cause of head-on collisions is violation of no pass zones on roads and highways. No pass zones are sections of a road in which vehicles are not allowed to pass each other; these zones are usually designated by a double solid line, or a solid line, that separates lanes on the road. Sometimes no pass zones are designed to prevent vehicles from entering into oncoming traffic along particularly dangerous portions of the road, such as winding portions that do not allow drivers to see oncoming traffic from a distance. Other no pass zones are designed to prevent vehicles from passing other cars on a multiple-lane road that has dangerous portions, such as construction zones and lane shifts. No pass zones are particularly dangerous if drivers disregard them, since doing so may cause a head-on collision and/or a traffic accident on a dangerous part of the road, such as a construction zone.

Traditionally, drivers have ignored no pass zones because there has been no practical way to enforce them, besides the presence of law enforcement officers. The disregard of no pass zones has caused countless accident-related deaths, and, oftentimes, determining who caused an accident can be difficult because the magnitude of the impact in a head-on collision can severely damage the vehicles involved and drastically change their positions. Thus, the disregard of no pass zones continues to be a problem for many drivers, law enforcement agencies, and society in general.

SUMMARY

The various embodiments and disclosures described herein result from the realization that no pass zones can be enforced by providing a system and method for monitoring the presence of vehicles in a no pass zone, capturing images of vehicles violating the no pass zone, and analyzing the images to determine the owners and/or drivers of the vehicles, as well as the drivers at fault for accidents caused in a no pass zone.

Accordingly, the various embodiments and disclosures described herein solve the limitations of the prior art in a new and novel manner.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a system in accordance with one embodiment;

FIG. 2 shows a system in accordance with another embodiment;

FIG. 3 shows an image in accordance with one embodiment;

FIG. 4 shows a system in accordance with one embodiment;

FIG. 5 shows an image in accordance with one embodiment;

FIG. 6 shows a block diagram representing a method in accordance with one embodiment;

FIG. 7 shows an article in accordance with one embodiment; and

FIG. 8 shows an apparatus in accordance with one embodiment.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

System Level Overview

FIG. 1 shows system 100, in accordance with one embodiment. System 100 comprises at least one sensor 102, at least one camera 104, at least one processor 106 electronically connected to at least one sensor 102 and at least one camera 104, and computer executable instructions (not shown) readable by at least one processor 106 and operative to determine when at least one sensor 102 detects the presence of a vehicle (as shown by reference numeral 204 in FIG. 2), and use at least one camera 104 to capture at least one image (as shown by reference numeral 302 in FIG. 3) of vehicle 204.
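Purely as an illustrative sketch, and not part of the disclosure, the trigger logic described above (the sensor detects a vehicle, and the processor directs the camera to capture an image) could be organized along the following lines in Python. The functions read_sensor and capture_image are hypothetical stand-ins for hardware-specific drivers, which the disclosure does not name.

    import time

    def read_sensor(sensor_id: int) -> bool:
        """Hypothetical stand-in: returns True while the sensor detects a vehicle."""
        raise NotImplementedError("replace with the actual sensor driver")

    def capture_image(camera_id: int) -> bytes:
        """Hypothetical stand-in: triggers the camera and returns raw image bytes."""
        raise NotImplementedError("replace with the actual camera driver")

    def monitor(sensor_id: int = 0, camera_id: int = 0, poll_interval_s: float = 0.05) -> None:
        """Poll the sensor; whenever a vehicle is detected, capture at least one image."""
        while True:
            if read_sensor(sensor_id):
                image = capture_image(camera_id)
                # The image could then be stored locally or transmitted to a
                # central station, as described later in the disclosure.
                print(f"captured {len(image)} bytes from camera {camera_id}")
            time.sleep(poll_interval_s)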

Referring now to FIG. 2, in some embodiments of system 100, at least one sensor 102 may be positioned on a road 202. In further embodiments, at least one sensor 102 may be positioned anywhere on or near a road, such as, but not limited to, in a no pass zone on the road, on the road line, on a shoulder of the road, on the center of the road, on the lane of a road, over the road, and the like. At least one sensor 102 may be any type of sensor, including, but not limited to, a capacitive sensor, a pressure sensor, a light sensor, an optical sensor, a photoelectric sensor, a metal detector, a range finder, a speed sensor, and the like. In a further embodiment, at least one sensor 102 may be connected to and/or embedded in a road marker, such as, but not limited to, a reflective road marker, a sticker road marker, a line, a stripe, a speed bump, and the like. In yet another embodiment, sensor 102 may be connected to and/or embedded in the road itself, such as under or in the surface of the pavement or other road surface.

In one embodiment, at least one sensor 102 may be powered by an internal power source, such as a battery. In another embodiment, at least one sensor 102 may be powered by an external power source, such as, but not limited to, solar power, a wired external power source, such as a power grid, a piezoelectric power source, a photoelectric power source, a thermal power source, and the like. In yet a further embodiment, at least one sensor 102 may be powered by a combination of internal and external power sources, such as a solar power source that charges a battery.

In yet a further embodiment, at least one sensor 102 may be equipped with a communications means, such as a wired communications means, including a modem, or a wireless communications means, such as a wireless communications module, including, but not limited to, a ZIGBY module, a BLUETOOTH module, a wireless GSM modem, and the like. In some embodiments, at least one sensor 102 may be wirelessly connected with at least one processor 106 and/or at least one camera 104.

In some embodiments, at least one camera 104 may be any kind of camera, including, but not limited to, a photographic camera, a video camera, a digital camera, an analog camera, a closed circuit camera, any combination thereof, and the like. At least one camera 104 may be powered by an external power source, such as, but not limited to, solar power, a wired external power source, such as a power grid, a piezoelectric power source, a photoelectric power source, a thermal power source, and the like. In yet a further embodiment, at least one camera 104 may be powered by a combination of internal and external power sources, such as a solar power source that charges a battery.

In some embodiments, at least one camera 104 may be positioned along road 202. In further embodiments, at least one camera 104 may be positioned anywhere on, near, or along road 202, such as, but not limited to, a no pass zone, a shoulder of the road, on the center of the road, over a lane of a road, over the entire road, and the like. In other embodiments, at least one camera 104 may be positioned in a covert location, such as, but not limited to, in a tree, on a street lamp, or the like. In a preferred embodiment, at least one camera 104 is positioned such that it is operative to capture an image of at least one vehicle 204 on road 202.

In yet a further embodiment, at least one camera 104 may be equipped with a communications means, such as a wired communications means, including a modem, or a wireless communications means, such as a wireless communications module, including, but not limited to, a ZIGBY module, a BLUETOOTH module, a wireless GSM modem, and the like. In some embodiments, at least one camera 104 may be wirelessly connected with at least one processor 106 and/or at least one sensor 102.

In some embodiments, at least one processor 106 may be any kind of processor, such as, but not limited to, a central processing unit (CPU), a microprocessor, a video processor, a front end processor, a coprocessor, a single-core processor, a multi-core processor, and the like.

In a further embodiment, system 100 may comprise at least one communications means electronically connected to the at least one processor. The communications means may be any kind of communications means, such as a wired communications means, including a modem, or a wireless communications means, such as a wireless communications module, including, but not limited to, a ZIGBY module, a BLUETOOTH module, a wireless GSM modem, and the like. In a further embodiment, the wireless communications means may be also electronically connected, such as wirelessly connected, to at least one sensor 102 and/or at least one camera 104. The wireless communications means may be used to transmit at least one image 302 captured by the at least one camera 104 to a central station (not shown), such as a server, wherein the server may be located at a law enforcement agency or other governmental body that may thereafter analyze the image. At least one image 302 may be analyzed to determine whether vehicle 204 has violated any traffic laws, such as passing in a no pass zone, and/or whether vehicle 204 was the cause of an accident. In a further embodiment, at least one image 302 may be analyzed to determine the owner and/or driver of vehicle 204, by analyzing an identifying object in at least one image 302, such as license plate number 304.
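By way of a hedged illustration only (the disclosure does not name a transport protocol or message format), an image could be transmitted to a central station over HTTP as in the following Python sketch. The endpoint URL and form fields are invented for the example, and the third-party requests library is assumed to be available.

    import requests  # third-party HTTP client, assumed available

    CENTRAL_STATION_URL = "https://central-station.example/upload"  # hypothetical endpoint

    def transmit_image(image_bytes: bytes, sensor_id: int) -> bool:
        """Upload one captured image to the central station; True on HTTP 200."""
        response = requests.post(
            CENTRAL_STATION_URL,
            files={"image": ("capture.jpg", image_bytes, "image/jpeg")},
            data={"sensor_id": str(sensor_id)},
            timeout=10,
        )
        return response.status_code == 200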

In some embodiments, the computer executable instructions may be further operative to transmit the at least one image 302 to a central station, such as by using the communications means connected to at least one processor 106, as described above. As mentioned above, at the central station, at least one image 302 may be analyzed to determine whether vehicle 204 has violated any traffic laws, such as passing in a no pass zone, and/or whether vehicle 204 was the cause of an accident. In a further embodiment, at least one image 302 may be analyzed to determine the owner and/or driver of a car, by analyzing an identifying object in at least one image 302, such as license plate number 304.

In a further embodiment, the computer executable instructions may be operative to determine the speed of vehicle 204. The computer executable instructions may be operative to determine the speed of vehicle 204 based on input/feedback from at least one sensor 102 and/or at least one camera 104. In some embodiments, the computer executable instructions may be operative to determine the speed of vehicle 204 based on input/feedback from a plurality of sensors, whereby the speed can be calculated by dividing the distance between at least two sensors by the time elapsed between each of the two sensors detecting the presence of vehicle 204. In further embodiments, the computer executable instructions may be operative to determine the speed of vehicle 204 based on input/feedback from at least one camera 104, such as by analyzing the image(s) 302 captured by at least one camera 104 and calculating the speed of vehicle 204 by comparing the position of vehicle 204 in image(s) 302 with the time taken between the capturing of image(s) 302 and calculating a speed based on those numbers. Such a calculation may again involve dividing the distance vehicle 204 has traveled (which would be determined from image(s) 302) by the time taken between capturing image(s) 302.
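As a worked illustration of the two-sensor calculation just described (the distance and times below are made-up numbers, not values from the disclosure): two sensors placed 30 m apart that detect the same vehicle 1.2 s apart imply a speed of 30 / 1.2 = 25 m/s, i.e. 90 km/h. A minimal Python sketch of that arithmetic:

    def speed_from_two_sensors(sensor_gap_m: float, t_first_s: float, t_second_s: float) -> float:
        """Speed = distance between the two sensors / time between their detections."""
        elapsed = t_second_s - t_first_s
        if elapsed <= 0:
            raise ValueError("the second detection must occur after the first")
        return sensor_gap_m / elapsed

    # Made-up example: sensors 30 m apart, detections at t=0.0 s and t=1.2 s.
    speed_mps = speed_from_two_sensors(30.0, 0.0, 1.2)   # 25.0 m/s
    print(speed_mps * 3.6)                               # 90.0 km/h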

In some embodiments, the computer executable instructions may be operative to use at least one camera 104 to capture at least one image 302 of vehicle 204 when at least one sensor 102 detects the presence of vehicle 204 passing in a no pass zone.

In some embodiments, the computer executable instructions may be composed in any type of programming language, such as, but not limited to, C++, Java, HTML, Javascript, XML, Basic, C, and the like. In a further embodiment, the computer executable instructions may be stored on a storage means connected to at least one processor 106. Such a storage means may be any kind of storage means, such as, but not limited to, a storage module, a hard drive, a solid state drive, a CD-R, a CD-ROM, a DVD, flash memory, random access memory, read only memory, any other type of computer readable medium, and the like. In alternate embodiments, the computer executable instructions may be stored directly on at least one processor 106.

Referring now to FIG. 4, a system 400 is shown in accordance with one embodiment. System 400 comprises at least one sensor 402 positioned along a no pass zone of a road 404, and at least one camera 406 positioned along road 404 such that it is operative to capture at least one image (shown in FIG. 5, with reference to numeral 502) of at least one vehicle 408 in the no pass zone.

In some embodiments, system 400 may further comprise at least one processor (not shown) electronically connected to at least one sensor 402 and/or at least one camera 406. The at least one processor may be any kind of processor, such as, but not limited to, a central processing unit (CPU), a microprocessor, a video processor, a front end processor, a coprocessor, a single-core processor, a multi-core processor, and the like.

In further embodiments, at least one sensor 402 may be positioned anywhere on or near road 404, such as, but not limited to, in a no pass zone on the road, on the road line, on a shoulder of the road, on the center of the road, on the lane of a road, over the road, and the like.

At least one sensor 402 may be any type of sensor, including, but not limited to, a capacitive sensor, a pressure sensor, a light sensor, an optical sensor, a photoelectric sensor, a metal detector, a range finder, a speed sensor, and the like. In a further embodiment, at least one sensor 402 may be connected to and/or embedded in a road marker, such as, but not limited to, a reflective road marker, a sticker road marker, a line, a stripe, a speed bump, and the like. In yet another embodiment, sensor 402 may be connected to and/or embedded in the road itself, such as under or in the surface of the pavement or other road surface.

In one embodiment, at least one sensor 402 may be powered by an internal power source, such as a battery. In another embodiment, at least one sensor 402 may be powered by an external power source, such as, but not limited to, solar power, a wired external power source, such as a power grid, a piezoelectric power source, a photoelectric power source, a thermal power source, and the like. In yet a further embodiment, at least one sensor 402 may be powered by a combination of internal and external power sources, such as a solar power source that charges a battery.

In yet a further embodiment, at least one sensor 402 may be equipped with a communications means, such as a wired communications means, including a modem, or a wireless communications means, such as a wireless communications module, including, but not limited to, a ZIGBY module, a BLUETOOTH module, a wireless GSM modem, and the like. In some embodiments, at least one sensor 402 may be wirelessly connected with the at least one processor and/or at least one camera 406.

In some embodiments, at least one camera 406 may be any kind of camera, including, but not limited to, a photographic camera, a video camera, a digital camera, an analog camera, a closed circuit camera, any combination thereof, and the like. At least one camera 406 may be powered by an external power source, such as, but not limited to, solar power, a wired external power source, such as a power grid, a piezoelectric power source, a photoelectric power source, a thermal power source, and the like. In yet a further embodiment, at least one camera 406 may be powered by a combination of internal and external power sources, such as a solar power source that charges a battery.

In some embodiments, at least one camera 406 may be positioned along road 404. In further embodiments, at least one camera 406 may be positioned anywhere on, near, or along road 404, such as, but not limited to, a no pass zone, a shoulder of the road, on the center of the road, over a lane of a road, over the entire road, and the like. In other embodiments, at least one camera 406 may be positioned in a covert location, such as, but not limited to, in a tree, on a street lamp, or the like. In a preferred embodiment, at least one camera 406 is positioned such that it is operative to capture an image of at least one vehicle 408 on road 404, and preferably to capture an image when at least one vehicle 408 is violating a traffic law, such as passing in a no pass zone on road 404.

In yet a further embodiment, at least one camera 406 may be equipped with a communications means, such as a wired communications means, including a modem, or a wireless communications means, such as a wireless communications module, including, but not limited to, a ZIGBY module, a BLUETOOTH module, a wireless GSM modem, and the like. In some embodiments, at least one camera 406 may be wirelessly connected with at least one processor and/or at least one sensor 402.

In yet another embodiment, system 400 further comprises computer executable instructions readable by the at least one processor and operative to use at least one camera 406 to capture at least one image 502 of vehicle 408. In a preferred embodiment, the computer executable instructions are operative to capture at least one image 502 of vehicle 408 violating a traffic law, such as by passing in a no pass zone. In some embodiments, the computer executable instructions are operative to capture at least one image 502 of vehicle 408 when at least one sensor 402 detects the presence of vehicle 408 passing in a no pass zone.

In some embodiments, the computer executable instructions may be further operative to transmit the at least one image 502 to a central station, such as by using the communications means connected to the at least one processor, as described above with reference to FIGS. 1-3. At the central station, at least one image 502 may be analyzed to determine whether vehicle 408 has violated any traffic laws, such as passing in a no pass zone, and/or whether vehicle 408 was the cause of an accident. In a further embodiment, at least one image 502 may be analyzed to determine the owner and/or driver of a car, by analyzing an identifying object in at least one image 502, such as license plate number 504.

In a further embodiment, the computer executable instructions may be operative to determine the speed of vehicle 408. The computer executable instructions may be operative to determine the speed of vehicle 408 based on input/feedback from at least one sensor 402 and/or at least one camera 406. In some embodiments, the computer executable instructions may be operative to determine the speed of vehicle 408 based on input/feedback from a plurality of sensors, whereby the speed can be calculated by dividing the distance between at least two sensors by the time elapsed between each of the two sensors detecting the presence of vehicle 408. In further embodiments, the computer executable instructions may be operative to determine the speed of vehicle 408 based on input/feedback from at least one camera 406, such as by analyzing the image(s) 502 captured by at least one camera 406 and calculating the speed of vehicle 408 by comparing the position of vehicle 408 in image(s) 502 with the time taken between the capturing of image(s) 502 and calculating a speed based on those numbers. Such a calculation may again involve dividing the distance vehicle 408 has traveled (which would be determined from image(s) 502) by the time taken between capturing image(s) 502.

In some embodiments, the computer executable instructions may be composed in any type of programming language, such as, but not limited to, C++, Java, HTML, Javascript, XML, Basic, C, and the like. In a further embodiment, the computer executable instructions may be stored on a storage means connected to the at least one processor. Such a storage means may be any kind of storage means, such as, but not limited to, a storage module, a hard drive, a solid state drive, a CD-R, a CD-ROM, a DVD, flash memory, random access memory, read only memory, any other type of computer readable medium, and the like. In alternate embodiments, the computer executable instructions may be stored directly on the at least one processor.

Methods

Referring now to FIG. 6, a block diagram of method 600 is shown in accordance with one embodiment. Method 600 comprises using at least one processor to perform any of the following: detecting the presence of a vehicle in a no pass zone (block 602), and capturing at least one image of the vehicle (block 604).

The at least one processor may be any kind of processor, such as, but not limited to, a central processing unit (CPU), a microprocessor, a video processor, a front end processor, a coprocessor, a single-core processor, a multi-core processor, and the like.

In a further embodiment of method 600, detecting the presence of a vehicle in a no pass zone 602 comprises using at least one sensor positioned on a road to detect the presence of a vehicle in a no pass zone. The at least one sensor may be any kind of sensor, including, but not limited to, those embodiments described above with reference to FIGS. 1-5, and elsewhere throughout the present disclosure. In a further embodiment of method 600, detecting the presence of a vehicle in a no pass zone 602 may comprise detecting whether a vehicle is passing in a no pass zone, such as by using at least one sensor to detect the presence of a vehicle passing in a no pass zone (e.g. using at least one sensor positioned between two lanes to detect whether a vehicle has exited its lane and crossed over to the other lane). In further embodiments, detecting the presence of a vehicle in a no pass zone 602 comprises using a plurality of sensors positioned on a road to detect the presence of a vehicle in a no pass zone. In yet another embodiment of method 600, capturing at least one image of the vehicle 604 comprises using at least one camera positioned along the road to capture at least one image of the vehicle. The at least one camera may be any kind of camera, including, but not limited to, those embodiments described above with reference to FIGS. 1-5 and elsewhere throughout the present disclosure. In further embodiments, capturing at least one image of the vehicle 604 comprises using a plurality of cameras to capture at least one image of the vehicle. In yet a further embodiment, capturing at least one image of the vehicle 604 includes capturing at least one image of the vehicle when the at least one sensor has detected the presence of the vehicle in a no pass zone, and in particular, when the at least one sensor has detected that the vehicle is passing in a no pass zone.
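As an illustrative sketch only, and not part of the claimed method, the lane-line variant mentioned above, in which a sensor positioned between two lanes trips when a vehicle crosses out of its lane, could be combined with the capture step of block 604 roughly as follows. The sensor_triggered and capture callables are hypothetical stand-ins for the sensor and camera interfaces.

    from typing import Callable, Optional

    def detect_no_pass_violation(sensor_triggered: Callable[[], bool]) -> bool:
        """Block 602: a sensor on the line between two lanes reports a lane crossing."""
        return sensor_triggered()

    def enforce_no_pass_zone(sensor_triggered: Callable[[], bool],
                             capture: Callable[[], bytes]) -> Optional[bytes]:
        """Blocks 602 and 604 combined: capture an image only when a violation is detected."""
        if detect_no_pass_violation(sensor_triggered):
            return capture()
        return None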

In yet a further embodiment, capturing at least one image of the vehicle 604 includes capturing at least one image of at least one identifying object. In one embodiment, the identifying object may be any identifying object, such as, but not limited to, a license plate, a license plate number, a driver and/or passenger's face, a color and model of the vehicle, and the like.

In a further embodiment, method 600 further comprises using at least one processor to transmit the at least one image of the vehicle to a central station. In one embodiment, transmitting the at least one image of the vehicle to a central station includes using at least one communications means to transmit the at least one image of the vehicle to a central station. The communications means may be any kind of communications means, such as, but not limited to, those embodiments described herein with reference to FIGS. 1-5 and elsewhere throughout the present disclosure. In another embodiment, method 600 further comprises using at least one processor to analyze the at least one captured image of the vehicle to determine if the vehicle was violating a traffic law, such as, but not limited to, whether or not the vehicle has passed in a no pass zone. In some embodiments, such analysis and/or determination may take place at a central station, such as a police station, or other governmental agency, while in other embodiments, such analysis and/or determination may take place at a local station, such as in a law enforcement vehicle.

In yet a further embodiment, method 600 comprises using at least one processor to analyze the at least one captured image of the vehicle to determine if the vehicle was the cause of an accident. Determining whether the vehicle was the cause of an accident may include comparing pre-accident images with post-accident images to determine whether the vehicle was the cause of the accident. In some embodiments, such analysis and/or determination may take place at a central station, such as a police station, or other governmental agency, while in other embodiments, such analysis and/or determination may take place at a local station, such as in a law enforcement vehicle.

In yet another embodiment, method 600 further comprises using at least one processor to analyze the at least one captured image of the vehicle to determine the owner or driver of the vehicle. Determining the owner or driver of the vehicle may include using an image of an identifying object, such as, but not limited to, a face or license plate and/or license plate number, captured within the at least one captured image to determine the owner or driver of the vehicle. In some embodiments, such analysis and/or determination may take place at a central station, such as a police station, or other governmental agency, while in other embodiments, such analysis and/or determination may take place at a local station, such as in a law enforcement vehicle. In yet a further embodiment, method 600 comprises using at least one processor to determine the speed of the vehicle. Determining the speed of the vehicle may include using information and/or data received from the at least one sensor and/or at least one camera to determine the speed of the vehicle. Determining the speed of the vehicle may include using any of those methods and other embodiments described above with reference to FIGS. 1-5, such as by dividing the distance over which the vehicle has traveled by the time during which the vehicle has traveled that distance.
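The disclosure does not specify how a license plate captured in the image is matched to an owner or driver. Purely for illustration, and assuming the plate text has already been read from the captured image, a lookup against a vehicle registry might look like the sketch below; the registry contents and field names are invented for the example.

    from typing import Optional

    # Hypothetical registry keyed by normalized plate number; in practice this
    # would be a query against a motor-vehicle database, which the disclosure
    # does not describe.
    REGISTRY = {
        "ABC1234": {"owner": "Jane Doe", "vehicle": "blue sedan"},
    }

    def owner_from_plate(plate_text: str) -> Optional[str]:
        """Return the registered owner for a plate string, or None if unknown."""
        key = plate_text.upper().replace(" ", "").replace("-", "")
        record = REGISTRY.get(key)
        return record["owner"] if record else None

    print(owner_from_plate("abc 1234"))  # -> "Jane Doe" (invented example data)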

Hardware and Operating Environment

This section provides an overview of example hardware and the operating environments in conjunction with which embodiments of the inventive subject matter can be implemented.

A software program may be launched from a computer readable medium in a computer-based system to execute functions defined in the software program. Various programming languages may be employed to create software programs designed to implement and perform the methods disclosed herein. The programs may be structured in an object-oriented format using an object-oriented language such as Java or C++. Alternatively, the programs may be structured in a procedure-oriented format using a procedural language, such as assembly or C. The software components may communicate using a number of mechanisms, such as application program interfaces, or inter-process communication techniques, including remote procedure calls. The teachings of various embodiments are not limited to any particular programming language or environment. Thus, other embodiments may be realized, as discussed regarding FIG. 7 below.

FIG. 7 is a block diagram representing an article according to various embodiments. Such embodiments may comprise a computer, a memory system, a magnetic or optical disk, some other storage device, or any type of electronic device or system. The article 700 may include one or more processor(s) 702 coupled to a machine-accessible medium such as a memory 704 (e.g., a memory including electrical, optical, or electromagnetic elements). The medium may contain associated information 706 (e.g., computer program instructions, data, or both) which, when accessed, results in a machine (e.g., the processor(s) 702) performing the activities previously described herein.

The principles of the present disclosure may be applied to all types of computers, systems, and the like, including desktop computers, servers, notebook computers, personal digital assistants, microcomputers, and the like. However, the present disclosure is not limited to the personal computer.

Apparatuses

Referring now to FIG. 8, an apparatus 800 is shown in accordance with one embodiment. Apparatus 800 comprises a road marker 802 and at least one sensor 804 connected to road marker 802. In some embodiments, road marker 802 may be any kind of road marker, including, but not limited to, a reflective road marker, a sticker road marker, a line road marker, a center line road marker, a shoulder road marker, a speed bump, and the like.

In some embodiments, at least one sensor 804 may be any kind of sensor, including, but not limited to, a capacitive sensor, a pressure sensor, a light sensor, an optical sensor, a photoelectric sensor, a metal detector, and a range finder.

At least one sensor 804 may be connected to road marker 802 by any means, such as, but not limited to, mechanical means, including screws, glue, nails, magnets, and the like. In other embodiments, at least one sensor 804 may be connected to and/or embedded in road marker 802. In other embodiments, at least one sensor 804 may be detachably connected to road marker 802.

In some embodiments, apparatus 800 may further comprise at least one power module (not shown) electronically connected to at least one sensor 804. The power module may be any kind of power module, including, but not limited to, a battery, a solar electric generator, a piezoelectric generator, a thermal electric generator, a chemical electric generator, a power port connected to an external power grid, and the like.

In yet a further embodiment, apparatus 800 may further comprise at least one communications means electronically connected to the at least one sensor. The at least one communications means may be any kind of communications means, including, but not limited to, any of the embodiments described above with reference to FIGS. 1-7 and elsewhere throughout the present disclosure.

In some embodiments, apparatus 800 may be used in any of the embodiments of systems and methods described herein with reference to FIGS. 1-7 and elsewhere throughout the present disclosure. In particular embodiments, apparatus 800 may be used as the at least one sensor in any of the embodiments of systems and methods described herein with reference to numerals 102 and 402 in FIGS. 1-7 and elsewhere throughout the present disclosure.

It should be noted that the various embodiments disclosed herein may be applied to other forms of traffic monitoring and enforcement of traffic laws, such as, but not limited to, whether parking violations have occurred, whether cars have stopped at a stop sign, whether jaywalking violations have occurred, whether cars have run red lights, and the like. While the principles of the disclosure have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the disclosure. Other embodiments are contemplated within the scope of the present disclosure in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present disclosure.

Claims

CLAIMS

What is claimed is:
1. A system comprising:
a. at least one sensor;
b. at least one camera;
c. at least one processor electronically connected to the at least one sensor and the at least one camera; and
d. computer executable instructions readable by the at least one processor and operative to:
i. determine when the at least one sensor detects the presence of a vehicle; and
ii. use the at least one camera to capture at least one image of the vehicle.
2. The system of claim 1, wherein the at least one sensor is positioned on a road.
3. The system of claim 1, wherein the at least one sensor is positioned in a type of position selected from the group consisting essentially of: a no pass zone on the road, at least one road line, a shoulder of a road, a center of a road, a lane of a road, and over a road.
4. The system of claim 1, wherein the at least one sensor is a type of sensor selected from the group consisting essentially of: a capacitive sensor, a pressure sensor, a light sensor, an optical sensor, a photoelectric sensor, a metal detector, and a range finder.
5. The system of claim 1, wherein the at least one sensor is connected to a road marker.
6. The system of claim 1, wherein the at least one sensor is connected to the road.
7. The system of claim 1, wherein the at least one camera is wirelessly connected to the at least one sensor.
8. The system of claim 1, wherein the at least one processor is wirelessly connected to the at least one sensor and the at least one camera.
9. The system of claim 1, further comprising at least one communications means electronically connected to the at least one processor.
10. The system of claim 1, wherein the computer executable instructions are further operative to transmit the at least one image to a central station.
11. The system of claim 1, wherein the computer executable instructions are further operative to determine the speed of the vehicle.
12. A system comprising:
a. at least one sensor positioned along a no pass zone of a road; and
b. at least one camera positioned along the road such that it is operative to capture at least one image of at least one vehicle in the no pass zone.
13. The system of claim 12, wherein the at least one sensor is connected to a road marker.
14. The system of claim 12, wherein the at least one sensor is connected to the road.
15. The system of claim 12, further comprising at least one processor electronically connected to the at least one sensor and the at least one camera.
16. The system of claim 15, further comprising computer executable instructions readable by the at least one processor and operative to use the at least one camera to capture at least one image of a vehicle.
17. The system of claim 16, wherein the computer executable instructions are further operative to transmit the at least one image to a central station.
18. The system of claim 16, wherein the computer executable instructions are further operative to determine the speed of the vehicle.
19. A method comprising:
a. using at least one processor to perform any of the following:
i. detecting the presence of a vehicle in a no pass zone; and
ii. capturing at least one image of the vehicle.
20. The method of claim 19, wherein detecting the presence of a vehicle in a no pass zone comprises using at least one sensor positioned on a road to detect the presence of a vehicle in a no pass zone.
21. The method of claim 19, wherein capturing at least one image of the vehicle in the no pass zone includes using at least one camera positioned along the road to capture at least one image of the vehicle.
22. The method of claim 19, wherein capturing at least one image of the vehicle includes capturing at least one image of at least one identifying object.
23. The method of claim 22, wherein capturing at least one image of an identifying object includes capturing at least one image of at least one license plate.
24. The method of claim 19, further comprising using at least one processor to transmit the at least one image of the vehicle to a central station.
25. The method of claim 19, wherein transmitting the at least one image of the vehicle to a central station includes using at least one communications means to transmit the at least one image of the vehicle to a central station.
26. The method of claim 19, further comprising using at least one processor to analyze the at least one captured image of the vehicle to determine if the vehicle was violating a traffic law.
27. The method of claim 19, further comprising using at least one processor to analyze the at least one captured image of the vehicle to determine if the vehicle was the cause of an accident.
28. The method of claim 19, further comprising using at least one processor to analyze the at least one captured image of the vehicle to determine the owner or driver of the vehicle.
29. The method of claim 19, further comprising using at least one processor to determine the speed of the vehicle.
30. An apparatus comprising:
a. a road marker; and
b. at least one sensor connected to the road marker.
31. The apparatus of claim 30, wherein the at least one sensor is a type of sensor selected from the group consisting essentially of: a capacitive sensor, a pressure sensor, a light sensor, an optical sensor, a photoelectric sensor, a metal detector, and a range finder.
32. The apparatus of claim 30, further comprising at least one power module electronically connected to the at least one sensor.
33. The apparatus of claim 30, further comprising at least one communications means electronically connected to the at least one sensor.
34. The apparatus of claim 30, wherein the at least one sensor is detachably connected to the road marker.

Priority Applications (1)

Application Number: PCT/US2009/069051
Priority Date: 2009-12-21
Filing Date: 2009-12-21
Title: System and method for monitoring road traffic (WO2011078845A1)

Publications (1)

Publication Number: WO2011078845A1 (en)

Family ID: 42634908

Family Applications (1)

Application Number: PCT/US2009/069051
Priority Date: 2009-12-21
Filing Date: 2009-12-21
Title: System and method for monitoring road traffic

Country Status (1)

Country: WO
Link: WO2011078845A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party

US6160494A * - Sodi, Paolo - priority 1996-07-26, published 2000-12-12 - Machine and method for detecting traffic offenses with dynamic aiming systems
NL1005785C2 * - Traffic Test B.V. - priority 1997-04-10, published 1998-10-14 - A system for performing an overtaking control in vehicles
WO2007058618A1 * - ST Electronics (Info-Comm Systems) Pte. Ltd. - priority 2005-11-18, published 2007-05-24 - System and method for detecting road traffic violations
WO2007107875A2 * - Kria S.R.L. - priority 2006-03-22, published 2007-09-27 - A system for detecting vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party

Title: None

Cited By (2)

* Cited by examiner, † Cited by third party

US20160351052A1 * - Denso Corporation - priority 2015-05-29, published 2016-12-01 - Vehicle driving assistance apparatus and vehicle driving assistance method
US9852626B2 * - Denso Corporation - priority 2015-05-29, published 2017-12-26 - Vehicle driving assistance apparatus and vehicle driving assistance method

Similar Documents

Publication Publication Date Title
US8996234B1 (en) Driver performance determination based on geolocation
US20100332266A1 (en) Traffic information system
US6970102B2 (en) Traffic violation detection, recording and evidence processing system
US6100819A (en) Vehicular traffic signalization method and apparatus for automatically documenting traffic light violations and protecting non-violating drivers
US20110246210A1 (en) Traffic monitoring system
US20120148092A1 (en) Automatic traffic violation detection system and method of the same
US6442474B1 (en) Vision-based method and apparatus for monitoring vehicular traffic events
US7248149B2 (en) Detection and enforcement of failure-to-yield in an emergency vehicle preemption system
US20150211870A1 (en) Method for using street level images to enhance automated driving mode for vehicle
US20100246890A1 (en) Detection of objects in images
JP2007047914A (en) Danger reaction point recording system and operation support system
US20130215273A1 (en) Traffic enforcement system and methods thereof
CN202018743U (en) Express way safety distance early warning system based on GPS (global positioning system) and 3G wireless communication
US20140363051A1 (en) Methods and systems for selecting target vehicles for occupancy detection
US20140169633A1 (en) Emergency rescue vehicle video based violation enforcement method and system
CN101635093A (en) Identification system and method of unlicensed motor vehicles
WO2002082400A2 (en) A system and a method for event detection and storage
CN101714263A (en) Vehicle management method and management system for electronic toll collection lane
US8294595B1 (en) Speed detector for moving vehicles
US20130311641A1 (en) Traffic event data source identification, data collection and data storage
Chen et al. Analysis of risk factors affecting the severity of intersection crashes by logistic regression
CN200968986Y (en) Multifunctional GPS vehicle mounted system
Benekohal et al. Automated speed photo enforcement effects on speeds in work zones
US20120245758A1 (en) Driving behavior detecting method and apparatus
CN103198659A (en) Snapshot device for vehicle passing through pedestrian crosswalk and cutting in with pedestrian and snapshot method thereof

Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 09801628; Country of ref document: EP; Kind code of ref document: A1.

NENP: Non-entry into the national phase. Ref country code: DE.

122 (EP): PCT application not entered into the European phase. Ref document number: 09801628; Country of ref document: EP; Kind code of ref document: A1.