US20190389380A1 - Detecting Method for Front-Parked Vehicles At Night - Google Patents

Detecting Method for Front-Parked Vehicles At Night Download PDF

Info

Publication number
US20190389380A1
US20190389380A1 (application US16/563,914)
Authority
US
United States
Prior art keywords
vehicle
parked
moving
night
vehicle sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/563,914
Other versions
US10737617B2
Inventor
Guobiao Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Haicun Information Technology Co Ltd
Original Assignee
Hangzhou Haicun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/636,323 (external-priority patent US9475429B2)
Priority claimed from US15/628,617 (external-priority patent US10075590B2)
Application filed by Hangzhou Haicun Information Technology Co Ltd
Priority to US16/563,914 (granted as US10737617B2)
Publication of US20190389380A1
Application granted
Publication of US10737617B2
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/586 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q 9/002 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G06K 9/00812
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/60 Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/10 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R 2300/106 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R 2300/806 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)

Abstract

To detect front-parked vehicles at night (i.e. vehicles parked with the head facing the inside of a parking space), a detection device uses the light beam from a passing-by vehicle to extract, from an image captured of a parking space, at least a reflection of at least a tail light or of at least a portion of a back bumper.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation-in-part of application "Night Detection of Side-by-Side Parked Vehicles", application Ser. No. 15/628,617, filed Jun. 20, 2017, which is a continuation-in-part of application "Night Detection of Parked Vehicles", application Ser. No. 15/260,277, filed Sep. 8, 2016, now U.S. Pat. No. 9,688,197, which is a continuation of application "Night Detection of Parked Vehicles", application Ser. No. 14/636,323, filed Mar. 3, 2015, now U.S. Pat. No. 9,475,429.
  • BACKGROUND
  • 1. Technical Field of the Invention
  • The present invention relates to the field of electronics, and more particularly to device and method to detect parked vehicles at night.
  • 2. Prior Art
  • Locating a vacant parking space causes much frustration to motorists. It increases fuel consumption and has a negative impact on the environment. To conserve energy resources and enhance the quality of the environment, it is highly desirable to develop a parking-monitoring system that can transmit substantially real-time parking states (i.e. occupied or vacant) to motorists. Based on the parking states, a motorist can be guided to a vacant parking space at the destination.
  • Parking enforcement is an important aspect of city management. The current parking-enforcement system is patrol-based, i.e. parking-enforcement officers patrol the streets and/or parking lots to enforce the parking regulations. This operation requires a significant amount of manpower and also consumes a lot of fuel. It is highly desirable to take advantage of the above-mentioned parking-monitoring system and automatically measure the parking time for each monitored parking space.
  • Both parking monitoring and parking enforcement are based on parked-vehicle detection. Parked-vehicle detection preferably can be carried out both during the day and at night; this is particularly important for commercial districts during the day and for residential areas at night. Because they rely on natural light to capture the images of a parking area, prior-art devices only work during the day. At night, because street lights generally do not provide adequate lighting coverage (they are often blocked by trees or other obstacles), prior-art devices cannot reliably detect parked vehicles.
  • Objects and Advantages
  • It is a principal object of the present invention to conserve energy resources and enhance the quality of the environment.
  • It is a further object of the present invention to reliably detect parked vehicles at night.
  • It is a further object of the present invention to provide parking monitoring at night.
  • It is a further object of the present invention to provide parking enforcement at night.
  • In accordance with these and other objects of the present invention, the present invention discloses a device and method to detect parked vehicles at night.
  • SUMMARY OF THE INVENTION
  • The present invention discloses a night-detection device for parked vehicles. It uses the light beam from a passing-by vehicle to detect parked vehicles. The night-detection device comprises a parked-vehicle sensor for monitoring a parking area and a moving-vehicle sensor for sensing a moving vehicle around the parking area. The parked-vehicle sensor captures the images of the parking area when the moving-vehicle sensor detects a passing-by vehicle. These images are then processed to determine the state of each parking space in the parking area.
  • Because it has a limited range (an effective range of ˜20 meters), the light beam of the passing-by vehicle can only illuminate a small number of the parked vehicles (typically around three). Considering that the passing-by vehicle can only illuminate the parking area for a few seconds, the parked-vehicle sensor needs to capture at least one image every two seconds. This is more frequent than during the day, when the parked-vehicle sensor only needs to capture an image every five to ten seconds. Accordingly, for a parked-vehicle sensor with a powerful processor, the images can be processed in real time; for a parked-vehicle sensor with a less powerful processor, the images can be recorded first and then processed after the moving vehicle is out of range.
  • Because the parked vehicles are illuminated by the light beam of a passing-by vehicle, not by the natural light, image processing at night is different from that during the day. First of all, the region of interest (ROI) at night is different from that during the day. The ROI's at night have different shapes and locations than those during the day. Secondly, the extracted features at night are different from those during the day. The extracted features at night are reflections (where the pixel intensity is large), whereas the extracted features during the day are edges (where the pixel intensity changes sharply). For inline parked vehicles (i.e. vehicles parked along a line and the parked-vehicle sensor captures the side image of the parked vehicles), typical extracted features at night include the tail-light reflection, the wheel reflection and the body reflection. For side-by-side parked vehicles (i.e. vehicles parked side-by-side and the parked-vehicle sensor captures the tail/head image of the parked vehicles), typical extracted features at night include the rear/front bumper reflection and the tail/head-light reflection (“/” means “or” here).
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a top view of a street with vehicles parked along its side and a moving vehicle passing by these parked vehicles;
  • FIG. 2 is a block diagram of a preferred night-detection device for parked vehicles;
  • FIG. 3 is a block diagram of a preferred parked-vehicle sensor;
  • FIGS. 4A-4C disclose several preferred moving-vehicle sensors and moving-vehicle detection methods;
  • FIGS. 5A-5B are flow charts showing two preferred night-detection methods for parked vehicles;
  • FIG. 6 illustrates the extracted features on inline parked vehicles during the day (prior art);
  • FIG. 7 illustrates the extracted features on inline parked vehicles at night;
  • FIG. 8 illustrates the extracted feature on front-parked vehicles at night.
  • It should be noted that all the drawings are schematic and not drawn to scale. Relative dimensions and proportions of parts of the device structures in the figures have been shown exaggerated or reduced in size for the sake of clarity and convenience in the drawings. The same reference symbols are generally used to refer to corresponding or similar features in the different embodiments.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Those of ordinary skill in the art will realize that the following description of the present invention is illustrative only and is not intended to be in any way limiting. Other embodiments of the invention will readily suggest themselves to such skilled persons from an examination of this disclosure.
  • Referring now to FIG. 1, a street 20 with several parked vehicles and a passing-by vehicle is shown. The street 20 is along the x-axis and has two curbs 20 a, 20 b. Along the curb 20 a, there are a number of parking spaces (e.g. 10 a-10 f . . . ). On the opposite curb 20 b, a parking-monitoring device 30 a is installed to monitor a large parking area 35, which includes the parking spaces 10 a-10 f. Generally, the device 30 a is mounted on a support such as a utility pole or a street-lamp post, which also provides power to the device 30 a. To make it easier to detect a parked vehicle, the device 30 a is preferably mounted at a position higher than the highest roof of the parked vehicles.
  • Within the monitored parking area 35, four parking spaces 10 a, 10 c, 10 d and 10 f are occupied by the vehicles 40 a, 40 c, 40 d and 40 f, respectively, while the other two parking spaces 10 b, 10 e are vacant. During the day (i.e. under the natural lighting), the states of these parking spaces 10 a-10 f can be easily monitored by the parking-monitoring device 30 a. At night, because these parked vehicles may not have enough lighting for the parking-monitoring device 30 a to make reliable detection, the light beam 60 from a moving vehicle 50, which illuminates the parked vehicles while passing by, is used to determine the states of the parking spaces 10 a-10 f.
  • Referring now to FIG. 2, a preferred night-detection device 30 for parked vehicles is disclosed. This night-detection device 30 is actually the parking-monitoring device 30 a. It takes advantage of the light beam 60 from a moving vehicle 50 which illuminates the parked vehicles while passing by. The night-detection device 30 comprises a parked-vehicle sensor 80 for monitoring a parking area and a moving-vehicle sensor 70 for sensing a moving vehicle around this parking area. After it detects a passing-by vehicle 50, the moving-vehicle sensor 70 sends out a trigger signal 78 to the parked-vehicle sensor 80. Once it receives the trigger signal 78, the parked-vehicle sensor 80 captures the images of the parking area 35 and determines the parking state 72 of each parking space (e.g., 10 a-10 f). A passing-by vehicle 50 is a moving vehicle within a pre-determined range from the parking area 35. More details on the parked-vehicle sensor 80 and the moving-vehicle sensor 70 are disclosed in FIG. 3 and FIGS. 4A-4C, respectively.
  • FIG. 3 is a block diagram of a preferred parked-vehicle sensor 80. It comprises an optical detector 82, a processor 84 and a memory 86. The optical detector 82 captures the images of the monitored parking area 35 and is generally a camera. It may also comprise a number of cameras facing different directions. The processor 84 processes the images captured by the optical detector 82 to determine the parking states. It could be any type of central-processing unit (CPU) and/or digital signal processor (DSP). The memory 86 could be any type of non-volatile memory (NVM), e.g. flash memory. It stores at least a portion of the images captured by the optical detector 82. It also stores an operating system for the parked-vehicle sensor 80. Preferably, the operating system is an operating system of a smart-phone, e.g. iOS or Android. It further stores at least a parked-vehicle detection algorithm 87. This algorithm 87 configures the processor 84 to detect parked vehicles.
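  • By way of illustration, the following Python sketch shows one possible arrangement of the parked-vehicle sensor 80, assuming a single OpenCV camera as the optical detector 82 and a directory on flash storage standing in for the non-volatile memory 86; the class name, storage path and method names are hypothetical, not taken from this disclosure.

```python
# Minimal sketch of the parked-vehicle sensor 80 of FIG. 3 (illustrative only).
import os
import time
import cv2


class ParkedVehicleSensor:
    def __init__(self, camera_index=0, storage_dir="/var/parking/images"):
        self.detector = cv2.VideoCapture(camera_index)  # optical detector 82
        self.storage_dir = storage_dir                  # non-volatile memory 86
        os.makedirs(storage_dir, exist_ok=True)

    def capture(self):
        """Capture one image of the monitored parking area 35."""
        ok, frame = self.detector.read()
        return frame if ok else None

    def store(self, frame):
        """Record a captured image to non-volatile storage and return its path."""
        path = os.path.join(self.storage_dir, f"{time.time():.0f}.png")
        cv2.imwrite(path, frame)
        return path
```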
  • FIGS. 4A-4C disclose three preferred moving-vehicle sensors 70 and moving-vehicle detection methods. In the preferred embodiment of FIG. 4A, the moving-vehicle sensor 70 could be an audio sensor, an optical sensor, or an electromagnetic sensor. The audio sensor listens to the ambient sound change caused by a nearby moving vehicle 50; the optical sensor monitors the ambient light change caused by a nearby moving vehicle 50 (more details disclosed in FIG. 4B); the electromagnetic sensor detects the changes in electromagnetic wave caused by a nearby moving vehicle 50.
  • FIG. 4B discloses another preferred moving-vehicle sensor 70. It uses the parked-vehicle sensor 80 of FIG. 3 as the moving-vehicle sensor 70. Note that the memory 86 of the parked-vehicle sensor 80 further stores a moving vehicle detection algorithm 89. This algorithm 89 configures the processor 84 to detect an incoming light beam on the street. Once the intensity of this light beam is above a threshold, the moving vehicle is considered in range.
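  • A minimal sketch of such a light-beam test follows, assuming the detection algorithm 89 simply compares the average brightness of a fixed street region against a threshold; the region coordinates and the threshold value are illustrative assumptions only.

```python
# Illustrative sketch of the moving-vehicle detection algorithm 89 of FIG. 4B:
# a passing-by vehicle is considered in range once the average brightness of a
# street region rises above a threshold.
import cv2
import numpy as np

STREET_ROI = (slice(300, 480), slice(0, 640))  # rows, cols of the street in the frame (assumed)
INTENSITY_THRESHOLD = 120.0                    # 0-255 grayscale, tuned per installation site


def moving_vehicle_in_range(frame) -> bool:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    street = gray[STREET_ROI]
    return float(np.mean(street)) > INTENSITY_THRESHOLD
```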
  • FIG. 4C discloses a third preferred moving-vehicle sensor. For the parked-vehicle sensor 80 a monitoring a parking area in the block 22 a, the moving-vehicle sensor 70 b in an adjacent block 22 b is used to provide an advance notice of a passing-by vehicle 50. The moving-vehicle sensor 70 b can communicate this advance notice to the parked-vehicle sensor 80 a using a wireless means 98, e.g. WiFi or Bluetooth. Note that the parked-vehicle sensor 80 a and the moving-vehicle sensor 70 b could each be a portion of the parking-monitoring device of their respective block. With the advance notice, the parked-vehicle sensor 80 a can monitor the parked vehicles more efficiently and more accurately.
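  • Purely as a hypothetical illustration of this advance notice, the sketch below sends a single datagram over the local network as a stand-in for the wireless means 98; the address, port and message format are assumptions, and no particular protocol is prescribed by this disclosure.

```python
# Hypothetical sketch of the advance notice of FIG. 4C: the moving-vehicle
# sensor 70b in block 22b notifies the parked-vehicle sensor 80a over a local
# (e.g. WiFi) link, represented here by a plain UDP datagram.
import socket

NOTICE_ADDR = ("192.168.1.50", 9999)  # address of the parked-vehicle sensor 80a (assumed)


def send_advance_notice():
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(b"PASSING_VEHICLE", NOTICE_ADDR)
```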
  • Referring now to FIGS. 5A-5B, flow charts showing two preferred night-detection methods for parked vehicles are shown. In the preferred method of FIG. 5A, the captured images are processed in real time as the moving vehicle 50 is passing the parking area 35. On the other hand, in the preferred method of FIG. 5B, the captured images are processed after the moving vehicle 50 has left the monitored parking area 35.
  • As is disclosed in FIG. 5A, the first preferred night-detection method includes the following steps. The moving-vehicle sensor 70 senses a moving vehicle 50 (step 110). If the moving vehicle is in range (step 120), the parked-vehicle sensor 80 captures an image of the parking area 35 (step 130). This image is processed for each parking space, particularly for the parking spaces which are illuminated by the light beam 60 of the passing-by vehicle 50 (step 140). Steps 130, 140 are repeated until the moving vehicle 50 is out of range (step 150). The device then waits for another moving vehicle (step 160).
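  • The loop below is a rough Python sketch of these steps, assuming the hypothetical helpers sketched above (an image-capturing sensor object, an in-range test and a per-space processing routine); step numbers in the comments refer to FIG. 5A.

```python
# Sketch of the real-time night-detection method of FIG. 5A (illustrative only).
import time


def night_detection_realtime(sensor, in_range, process_image, capture_period=2.0):
    while True:
        frame = sensor.capture()                 # step 110: sense a moving vehicle
        if frame is None or not in_range(frame):
            time.sleep(0.5)                      # step 160: wait for another moving vehicle
            continue
        while True:                              # moving vehicle is in range (step 120)
            frame = sensor.capture()             # step 130: capture an image of the parking area
            if frame is None:
                break
            print(process_image(frame))          # step 140: process each illuminated parking space
            if not in_range(frame):              # step 150: moving vehicle out of range
                break
            time.sleep(capture_period)           # roughly one image every two seconds
```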
  • Because it has a limited range (with effective range of ˜20 meters), the light beam 60 of a passing-by vehicle 50 can only illuminate a small number of the parked vehicles (typically around three vehicles). Considering that the passing-by vehicle 50 can only illuminate the parking area for a few seconds, the parked-vehicle sensor 80 needs to capture at least one image of the parking area 35 every two seconds. This is more frequent than during the day when the parked-vehicle sensor 80 only needs to capture an image every five to ten seconds. Accordingly, for a parked-vehicle sensor 80 with a powerful processor 84, the images can be processed in real time; for a parked-vehicle sensor 80 with a less powerful processor 84, the images can be recorded first and then processed after the moving vehicle 50 is out of range. This is further illustrated in FIG. 5B. When the moving vehicle 50 is in range (step 120), the parked-vehicle sensor 80 only captures the images (step 130) and records them to the memory 86 (step 145), but does not process these images. After the moving vehicle 50 is out of range (step 150), the processor 84 processes these images and determines the states of the parking area 35 (step 155).
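  • A corresponding sketch of the deferred variant of FIG. 5B, again assuming the hypothetical helpers introduced above, is given below: images are only recorded to the memory 86 while the vehicle is in range (steps 130, 145) and are processed once it has left (step 155).

```python
# Sketch of the deferred night-detection method of FIG. 5B for a less powerful
# processor 84 (illustrative only).
import time
import cv2


def night_detection_deferred(sensor, in_range, process_image, capture_period=2.0):
    recorded = []
    frame = sensor.capture()
    while frame is not None and in_range(frame):   # step 120: vehicle still in range
        recorded.append(sensor.store(frame))       # steps 130, 145: capture and record only
        time.sleep(capture_period)
        frame = sensor.capture()
    # step 150 reached: the moving vehicle 50 is out of range
    return [process_image(cv2.imread(path)) for path in recorded]  # step 155: process recordings
```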
  • Because the parked vehicles are illuminated by the light beam 60 of a passing-by vehicle 50, not by natural light, image processing at night is different from that during the day. FIGS. 6 and 7 compare these differences, primarily in the areas of region of interest (ROI) and signature features. Here, an ROI is a region in an image that is image-processed to detect if a vehicle is parked in an associated parking space; and a signature feature is a feature on a vehicle indicating that this vehicle is parked in a parking space of interest.
  • FIG. 6 shows the ROI's 200 a, 200 c for the vehicles 40 a, 40 c parked in the parking spaces 10 a, 10 c along the curb 10 during the day. Because they are parked along a line 10, the vehicles 40 a, 40 c are inline parked vehicles. Each ROI (e.g. 200 a) for each parking space (e.g. 10 a) roughly starts from a side line (e.g. "ab") of the parking space (e.g. 10 a) and extends upward to cover at least a side window of the vehicle (e.g. 40 a). The extracted features in the ROI are signature edges of the vehicle. For an inline parked vehicle, its signature edges include the bottom edge of its body 310 a and the bottom edge of its side window 300 a. More details on the day detection of parked vehicles are disclosed in U.S. Provisional Patent Application "Occluded Vehicle Detection", App. Ser. No. 61/883,122, filed Sep. 26, 2013.
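  • As a rough illustration of this prior-art daytime approach, the sketch below counts edge pixels inside an ROI using a standard edge detector and declares the space occupied when the edge density exceeds a threshold; the ROI rectangle and the decision threshold are assumptions, and the actual signature-edge matching may be more elaborate.

```python
# Illustrative sketch of daytime, edge-based detection in FIG. 6 (prior art).
import cv2
import numpy as np


def day_space_occupied(image, roi, edge_fraction=0.05):
    x, y, w, h = roi                                  # e.g. ROI 200a of parking space 10a (assumed rectangle)
    gray = cv2.cvtColor(image[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                  # edges: where pixel intensity changes sharply
    return np.count_nonzero(edges) / edges.size > edge_fraction
```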
  • FIG. 7 shows the ROI's for the inline parked vehicles 40 a, 40 c at night. Each vehicle (e.g. 40 a) has two ROI's (e.g. 210 a, 220 a). The first ROI 220 a covers at least a wheel and a portion of the body of the vehicle 40 a, while the second ROI 210 a covers the tail-light of the vehicle 40 a. The extracted features at night are different from those during the day: the extracted features at night are reflections (where the pixel intensity is large), whereas the extracted features during the day are edges (where the pixel intensity changes sharply). For an inline parked vehicle, its night signature features include the wheel reflections 310, 320, the tail-light reflection 330 and the body reflection 340. Here, a signature reflection can be detected by searching for the pixels whose intensity is larger than a threshold within the ROI.
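  • The following sketch illustrates this thresholding, assuming rectangular ROIs and illustrative threshold values: a signature reflection is deemed present when enough pixels inside the ROI exceed the intensity threshold, and the inline space is declared occupied when either ROI shows a reflection.

```python
# Sketch of night detection for an inline parked vehicle (FIG. 7): reflections
# are found by counting bright pixels inside the wheel/body ROI 220a and the
# tail-light ROI 210a. Threshold values are illustrative assumptions.
import cv2
import numpy as np


def reflection_present(image, roi, intensity_threshold=200, min_bright_pixels=50):
    x, y, w, h = roi
    gray = cv2.cvtColor(image[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    return int(np.count_nonzero(gray > intensity_threshold)) >= min_bright_pixels


def inline_space_occupied_at_night(image, wheel_body_roi, tail_light_roi):
    return (reflection_present(image, wheel_body_roi) or
            reflection_present(image, tail_light_roi))
```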
  • FIG. 8 shows the ROI's for the front-parked vehicles 40 d, 40 f at night. The vehicles 40 d, 40 f are parked in the parking spaces 10 d, 10 f, respectively, while the parking space 10 e is un-occupied. As used herein, a front-parked vehicle (e.g. 40 d) is parked in such a way that its head (i.e. its front side) faces the inside of the parking space (e.g. 10 d), while its tail (i.e. its back side) faces the outside of the parking space (e.g. 10 d). Each vehicle (e.g. 40 d) has an ROI (e.g. 230 d). The ROI 230 d covers two tail lights and at least a portion of the back bumper of the vehicle 40 d. For the front-parked vehicle 40 d, its night signature features include the tail-light reflections 350 d, 360 d, and/or the back-bumper reflection 370 d. Similarly, for the front-parked vehicle 40 f, its night signature features include the tail-light reflections 350 f, 360 f, and/or the back-bumper reflection 370 f.
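  • A sketch of this front-parked test is given below; it reuses the hypothetical reflection_present() helper from the previous sketch and splits the single ROI (e.g. 230 d) into assumed tail-light and bumper sub-regions, declaring the parking space occupied when at least one reflection is extracted.

```python
# Sketch of front-parked detection at night (FIG. 8): the space is occupied if
# at least a tail-light reflection or a back-bumper reflection is extracted
# within the ROI. Sub-region positions are illustrative assumptions.
def front_parked_space_occupied(image, roi):
    x, y, w, h = roi
    left_tail = (x, y, w // 3, h)                # region of reflection 350d (assumed)
    right_tail = (x + 2 * w // 3, y, w // 3, h)  # region of reflection 360d (assumed)
    bumper = (x, y + h // 2, w, h - h // 2)      # region of reflection 370d (assumed)
    return (reflection_present(image, left_tail) or
            reflection_present(image, right_tail) or
            reflection_present(image, bumper))
```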
  • While illustrative embodiments have been shown and described, it will be apparent to those skilled in the art that many more modifications than those mentioned above are possible without departing from the inventive concepts set forth herein. The invention, therefore, is not to be limited except in the spirit of the appended claims.

Claims (10)

What is claimed is:
1. A detecting method for front-parked vehicles at night, comprising the steps of:
A) sensing by a moving-vehicle sensor if a moving vehicle is within a pre-determined range of a parking area;
B) capturing at least an image of said parking area by a parked-vehicle sensor when said moving-vehicle sensor detects said moving vehicle;
C) determining if said parking area is occupied by extracting at least a reflection of at least a tail light or at least a portion of a back bumper from said image.
2. The method according to claim 1, wherein said moving-vehicle sensor is an audio sensor, an optical sensor, or an electromagnetic sensor.
3. The method according to claim 1, wherein said moving-vehicle sensor is located at the same location as said parked-vehicle sensor.
4. The method according to claim 1, wherein said moving-vehicle sensor is located at a different location from said parked-vehicle sensor.
5. The method according to claim 4, wherein said moving-vehicle sensor communicates with said parked-vehicle sensor using a wireless means.
6. The method according to claim 1, wherein said parked-vehicle sensor comprises an optical detector, a processor and a memory.
7. The method according to claim 6, wherein said optical detector comprises at least a camera.
8. The method according to claim 7, wherein said camera operates more frequently at night than during the day.
9. The method according to claim 6, wherein said memory stores a front-parked vehicle detection algorithm.
10. The method according to claim 6, wherein said memory stores a passing-by vehicle detection algorithm.
US16/563,914 2015-03-03 2019-09-08 Detecting method for front-parked vehicles at night Expired - Fee Related US10737617B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/563,914 US10737617B2 (en) 2015-03-03 2019-09-08 Detecting method for front-parked vehicles at night

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US14/636,323 US9475429B2 (en) 2015-03-03 2015-03-03 Night detection of parked vehicles
US15/260,277 US9688197B2 (en) 2015-03-03 2016-09-08 Night detection of parked vehicles
US15/628,617 US10075590B2 (en) 2015-03-03 2017-06-20 Night detection of side-by-side parked vehicles
CN201810830067 2018-07-26
CN201810829696.7 2018-07-26
CN201810830067.6 2018-07-26
CN201810830067 2018-07-26
CN201810829696 2018-07-26
CN201810829696 2018-07-26
US16/563,914 US10737617B2 (en) 2015-03-03 2019-09-08 Detecting method for front-parked vehicles at night

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/628,617 Continuation-In-Part US10075590B2 (en) 2015-03-03 2017-06-20 Night detection of side-by-side parked vehicles

Publications (2)

Publication Number Publication Date
US20190389380A1 2019-12-26
US10737617B2 US10737617B2 (en) 2020-08-11

Family

ID=68980557

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/563,914 Expired - Fee Related US10737617B2 (en) 2015-03-03 2019-09-08 Detecting method for front-parked vehicles at night

Country Status (1)

Country Link
US (1) US10737617B2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4891624A (en) * 1987-06-12 1990-01-02 Stanley Electric Co., Ltd. Rearward vehicle obstruction detector using modulated light from the brake light elements
US20090243889A1 (en) * 2008-03-27 2009-10-01 Mando Corporation Monocular motion stereo-based free parking space detection apparatus and method
US20140022068A1 (en) * 2011-04-13 2014-01-23 Toyota Jidosha Kabushiki Kaisha Vehicle-mounted surrounding object recognizing apparatus and drive support apparatus using the same

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8139115B2 (en) 2006-10-30 2012-03-20 International Business Machines Corporation Method and apparatus for managing parking lots
KR101844289B1 (en) 2011-07-06 2018-04-02 삼성전자 주식회사 Method and apparatus for managing security of mobile terminal based on location information in mobile communication system
US8698652B1 (en) 2013-01-28 2014-04-15 HangZhou HaiCun Information Technology Co., Ltd Large-area parking-monitoring system
US8923565B1 (en) 2013-09-26 2014-12-30 Chengdu Haicun Ip Technology Llc Parked vehicle detection based on edge detection
US9475429B2 (en) 2015-03-03 2016-10-25 Chengdu Haicun Ip Technology Llc Night detection of parked vehicles

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4891624A (en) * 1987-06-12 1990-01-02 Stanley Electric Co., Ltd. Rearward vehicle obstruction detector using modulated light from the brake light elements
US20090243889A1 (en) * 2008-03-27 2009-10-01 Mando Corporation Monocular motion stereo-based free parking space detection apparatus and method
US20140022068A1 (en) * 2011-04-13 2014-01-23 Toyota Jidosha Kabushiki Kaisha Vehicle-mounted surrounding object recognizing apparatus and drive support apparatus using the same

Also Published As

Publication number Publication date
US10737617B2 (en) 2020-08-11

Similar Documents

Publication Publication Date Title
US9688197B2 (en) Night detection of parked vehicles
US8923565B1 (en) Parked vehicle detection based on edge detection
US9566900B2 (en) Driver assistance system and operating procedure for the latter
US9665783B2 (en) Night parking detection
KR101780320B1 (en) Intelligent type of system for monitoring with a plural functions
CN101469985A (en) Single-frame image detection apparatus for vehicle queue length at road junction and its working method
US20150117705A1 (en) Hybrid Parking Detection
JP2009201064A (en) Method and apparatus for specifying related region, and method and apparatus for recognizing image
GB2496278A (en) Vehicle reverse detection method and system via video acquisition and processing
RU2014139644A (en) ROAD MONITORING SYSTEM
CN111027383A (en) Patrol robot
US20160280135A1 (en) Animal Detection System for a Vehicle
CN204856897U (en) It is detection device violating regulations in abscission zone territory that motor vehicle stops promptly
CN108711283A (en) The night monitoring to park cars
US10737617B2 (en) Detecting method for front-parked vehicles at night
WO2015055737A1 (en) Method and system for determining a reflection property of a scene
US10464479B2 (en) Night detection of front-parked vehicles
US10075590B2 (en) Night detection of side-by-side parked vehicles
US20180339655A1 (en) Detecting Method for Front-Parked Vehicles At Night
KR20140096575A (en) Apparatus and method for supervising illegal parking and stopping of vehicle
US20190217774A1 (en) Image recognition-based intelligent alarm lamp
JP4300274B2 (en) Snow detection system
CN107705612A (en) The night detection to park cars
  • Chen et al. Robust rear light status recognition using symmetrical SURFs
CN102542241A (en) Embedded image processing and identifying system and method

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362