CN112633039A - Method and system for filtering congestion area - Google Patents

Method and system for filtering congestion area

Info

Publication number
CN112633039A
CN112633039A (Application CN201910907355.1A)
Authority
CN
China
Prior art keywords
area
congestion
determining
road condition
congestion detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910907355.1A
Other languages
Chinese (zh)
Inventor
石永禄
尹科才
毛河
高枫
朱彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Topplusvision Science & Technology Co ltd
Original Assignee
Chengdu Topplusvision Science & Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Topplusvision Science & Technology Co ltd filed Critical Chengdu Topplusvision Science & Technology Co ltd
Priority to CN201910907355.1A
Publication of CN112633039A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled, identifying vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled, identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the present application disclose a method and a system for filtering a congestion area. The congestion area filtering method comprises the following steps: acquiring an input road condition image; acquiring a congestion detection area through a congestion detection model based on the road condition image; acquiring information on a suspected illegally parked vehicle; determining, based on the information related to the suspected illegally parked vehicle and in combination with the congestion detection area, whether the vehicle is illegally parked; and outputting the determination result.

Description

Method and system for filtering congestion area
Technical Field
The present application relates to the field of image processing, and in particular, to a method and system for filtering a congestion area.
Background
With the rapid development of the economy and the continuous improvement of living standards, vehicles have become a necessity of family life, and the number of vehicles in cities grows day by day. As the number of vehicles increases, traffic congestion has become increasingly common on existing public road systems. In some cases, for example when congestion occurs on a lane monitored for illegal parking, the detection area for illegal-parking events is larger than the area in which congestion is detected, so illegal-parking events occurring on the congested lane need to be shielded. Accordingly, a congestion area filtering method is needed to filter out illegal-parking events reported for vehicles on congested lanes and thereby reduce the false alarm rate of illegal-parking detection.
Disclosure of Invention
One aspect of the present application provides a congestion area filtering method. The method comprises the following steps: acquiring an input road condition image; acquiring a congestion detection area through a congestion detection model based on the road condition image; acquiring information on a suspected illegally parked vehicle; determining, based on the information related to the suspected illegally parked vehicle and in combination with the congestion detection area, whether the vehicle is illegally parked; and outputting the determination result.
In some embodiments, determining whether the vehicle is illegally parked in combination with the congestion detection area, based on the information related to the suspected illegally parked vehicle, comprises: determining a masked area based on the congestion detection area; and determining whether the suspected illegally parked vehicle is located in the masked area.
In some embodiments, determining the masked area based on the congestion detection area comprises: determining a relevant polygon based on the congestion detection area; and enlarging the relevant polygon to obtain the masked area.
In some embodiments, the relevant polygon may be a minimum bounding rectangle.
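The relevant-polygon step can be sketched in code. The patent gives no implementation, so the sketch below computes a plain axis-aligned bounding rectangle of the congestion region's vertices; a production system might instead use a rotated minimum-area rectangle (e.g., OpenCV's cv2.minAreaRect). All coordinate values are hypothetical.

```python
def bounding_rectangle(points):
    """Axis-aligned bounding rectangle (x_min, y_min, x_max, y_max) of a point set.

    Simplified stand-in for the patent's minimum bounding rectangle,
    which may be a rotated minimum-area rectangle in practice.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Congestion detection region as polygon vertices in image coordinates (illustrative).
region = [(120, 80), (400, 90), (390, 260), (110, 240)]
print(bounding_rectangle(region))  # (110, 80, 400, 260)
```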
In some embodiments, enlarging the relevant polygon to obtain the masked area comprises: identifying the long sides and the short sides of the relevant polygon; and determining the masked area based on extension lines of the long sides and the short sides and the road condition image boundary.
In some embodiments, determining the masked area based on the extension lines of the long sides and the short sides and the road condition image boundary comprises: extending the long sides and the short sides to the road condition image boundary; determining the intersections of the extension lines with the road condition image boundary; and determining the area enclosed by the road condition image boundary and the extension lines as the masked area.
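The edge-extension step above reduces to a line-clipping problem: extend the line through two rectangle vertices and find where it meets the image border. A minimal sketch, assuming pixel coordinates with the origin at the top-left (the patent does not specify a coordinate convention):

```python
def line_boundary_intersections(p1, p2, width, height):
    """Intersections of the infinite line through p1 and p2 with the
    borders of a width x height image."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    pts = []
    if dx != 0:  # candidate hits on the left (x=0) and right (x=width) borders
        for x in (0, width):
            t = (x - x1) / dx
            y = y1 + t * dy
            if 0 <= y <= height:
                pts.append((x, y))
    if dy != 0:  # candidate hits on the top (y=0) and bottom (y=height) borders
        for y in (0, height):
            t = (y - y1) / dy
            x = x1 + t * dx
            if 0 <= x <= width:
                pts.append((x, y))
    return sorted(set(pts))  # set() merges duplicate corner hits

# A horizontal rectangle edge extended across a 640x480 image (illustrative).
print(line_boundary_intersections((100, 50), (200, 50), 640, 480))  # [(0, 50.0), (640, 50.0)]
```

The masked area is then the region enclosed by these intersection points and the stretch of image border between them.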
In some embodiments, determining the masked area based on the congestion detection area further comprises: determining the masked area based on the lane lines of the congestion detection area.
Another aspect of the present application provides a congestion area filtering system. The system comprises: an acquisition module for acquiring an input road condition image and acquiring information on a suspected illegally parked vehicle; a congestion detection module for acquiring a congestion detection area through a congestion detection model based on the road condition image; a determination module for determining, based on the information related to the suspected illegally parked vehicle and in combination with the congestion detection area, whether the vehicle is illegally parked; and an output module for outputting the determination result.
In some embodiments, the congestion area filtering system further comprises: a masked area determination module for determining a masked area based on the congestion detection area; the determination module is used to determine whether the suspected illegally parked vehicle is located in the masked area.
Another aspect of the present application provides an apparatus for congestion area filtering, comprising at least one storage medium and at least one processor, the at least one storage medium configured to store computer instructions; the at least one processor is configured to execute the computer instructions to implement the method for congestion area filtering as described above.
Another aspect of the application provides a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method of congestion area filtering as previously described.
Drawings
The present application is further described by way of exemplary embodiments, which are explained in detail with reference to the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals indicate like structures, wherein:
FIG. 1 is a schematic illustration of an application scenario of a congestion zone filtering system according to some embodiments of the present application;
FIG. 2 is a block diagram of a congestion area filtering system according to some embodiments of the present application;
FIG. 3 is an exemplary flow chart of a congestion area filtering method according to some embodiments of the present application;
FIG. 4 is an exemplary flow diagram of a masked region determination method according to some embodiments of the present application;
FIG. 5 is an exemplary flow chart of another masked region determination method according to some embodiments of the present application.
FIGS. 6A-6H are schematic diagrams illustrating the determination of an intersection of an extension line with a road condition image boundary according to some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only examples or embodiments of the present application, based on which a person of ordinary skill in the art can apply the present application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario of a congestion area filtering system 100 according to some embodiments of the present application.
The congestion area filtering system 100 may be an on-line monitoring platform for road management. For example, the congestion area filtering system 100 may filter events occurring in the congestion area to reduce false alarms for violations or other events caused by congestion. In some embodiments, the congestion area filtering system 100 may be applied to roads such as urban traffic roads, tunnels, expressways, airport roads, train station roads, and the like. In some embodiments, the congestion area filtering system 100 may be applied to roads with large traffic flow or complicated road conditions, where congestion events are likely to occur. In some embodiments, the congestion area filtering system 100 may also be applied to the entrance or exit of a road such as a highway, a tunnel, an expressway, etc. As shown in fig. 1, the congestion area filtering system 100 may include a server 110, a network 120, a terminal 130, a storage device 140, and an image acquisition device 150.
In some embodiments, the server 110 may process data and/or information from at least one component of the present system or an external data source (e.g., a cloud data center). In some embodiments, the server 110 may be a single server, may be a computing platform comprising a plurality of servers, may be centralized or distributed in a server farm, may be dedicated, or may be served by other devices or systems. In some embodiments, the server 110 may be local or remote. For example, server 110 may access information and/or data stored in terminals 130, storage devices 140 via network 120. In some embodiments, server 110 may be directly connected to terminal 130, storage device 140, image capture device 150 to access information and/or data stored therein. In some embodiments, the server 110 may be implemented on a cloud platform. For example, the cloud platform may include one or any combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, and the like. In other embodiments, the server 110 may be one of the terminals 130 at the same time.
Server 110, terminal 130, and other components may include a processing device. In some embodiments, the processing apparatus described above may include one or more processing engines (e.g., single core processing engines or multi-core processing engines). The processing device may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an application specific instruction set processor (ASIP), or the like, as is known to those skilled in the art. The processing device may process data and/or information obtained from image-capturing device 150, terminal 130, and/or storage device 140 and process such data and/or information or control other devices in the system based on such data, information, and/or processing results to perform one or more of the functions described herein. For example, the processing device may receive the image captured by the image capturing device 150, process the image to determine whether there is a congestion detection area, determine whether the vehicle is parked illegally in combination with other related information, and finally output the determination result to the terminal 130.
The network 120 connects the various components of the system so that communication can occur between the various components. The network between the various parts in the system may be any one or more of a wired network or a wireless network. For example, network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network (ZigBee), Near Field Communication (NFC), an intra-device bus, an intra-device line, a cable connection, and the like, or any combination thereof. The network connection between each two parts may be in one of the above-mentioned ways, or in a plurality of ways.
Terminal 130 is one or more terminal devices or software used by the user. In some embodiments, the terminal 130 may be used by one or more users, including users who directly use the service as well as other related users. In some embodiments, a user refers to relevant staff or a service requester using the congestion area filtering system 100. In some embodiments, terminal 130 may include various types of devices having information receiving and/or transmitting capabilities, for example, one of or any combination of the mobile device 130-1, the tablet 130-2, the notebook 130-3, the transportation system terminal device 130-4, and the like. The above examples are intended only to illustrate the breadth of possible devices and not to limit their scope.
Storage device 140 may store data and/or instructions. The storage device may include one or more storage components, each of which may be a separate device or part of another device. In some embodiments, storage device 140 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the storage device 140 may be implemented on a cloud platform.
In some embodiments, a storage device 140 may be connected to network 120 to communicate with one or more components (e.g., server 110, terminal 130, image acquisition device 150) in congestion area filtering system 100. One or more components in congestion area filtering system 100 may access data or instructions stored in storage device 140 via network 120. In some embodiments, storage device 140 may be directly connected or in communication with one or more components in congestion area filtering system 100 (e.g., server 110, terminal 130, image acquisition device 150). In some embodiments, the storage device 140 may be part of the server 110.
The image capturing device 150 may capture an image of the monitored area. In some embodiments, image capture device 150 may send the captured monitored-area image data to storage device 140 for storage. In some embodiments, image capture device 150 may send captured image data to server 110. In some embodiments, image capture device 150 may include any combination of one or more of a spherical camera, a dome camera, a surveillance camera, a smart camera, a pinhole camera, and the like. In some embodiments, image capture device 150 may also include any combination of one or more of a driving recorder, smart glasses, a smart helmet, a cell phone, a tablet, and the like. In some embodiments, image capture device 150 may include any combination of one or more of a digital camera, a single-lens reflex camera, a mirrorless camera, and the like. In some embodiments, the image capture device may comprise any apparatus with a camera, and the camera may be any device having image capture capabilities.
Server 110 may communicate with terminals 130, storage device 140, and image capture device 150 via network 120 to provide the congestion area filtering function. In some embodiments, the server 110 may obtain the captured road condition image from the storage device 140 or the image capturing device 150, detect the congestion detection area therein, and obtain information on suspected illegally parked vehicles. In some embodiments, the server 110 may determine whether a vehicle is illegally parked based on the masked area determined from the congestion detection area and the suspected-vehicle information, and transmit the determination result to the terminal 130. Storage device 140 and network 120 serve the processes described above.
Fig. 2 is a block diagram of a congestion area filtering system 200 according to some embodiments of the present application. As shown in fig. 2, the congestion area filtering system 200 may include an acquisition module 210, a congestion detection module 220, a masked area determination module 230, a decision module 240, and an output module 250.
The acquisition module 210 may be used to acquire road condition images and information on suspected illegally parked vehicles.
In some embodiments, the acquisition module 210 may acquire the road condition image and the suspected illegally parked vehicle image captured by the image acquisition device 150. In some embodiments, the image capture device 150 may also capture information related to the road condition image as well as to the suspected illegally parked vehicle. In some embodiments, the road condition image and its related information and the suspected vehicle image and its related information may be acquired by the same or different image acquisition devices 150. In some embodiments, the acquisition mode of image acquisition device 150 may include any combination of one or more of timed acquisition, real-time acquisition, panoramic acquisition, sliced acquisition, and the like. The module implementation process can be seen in steps 310 and 330.
The congestion detection module 220 may be used to determine a congestion detection area.
In some embodiments, the congestion detection module 220 may obtain the congestion detection area in the road condition image through a congestion detection model. In some embodiments, the congestion detection module 220 may also be used to train a congestion detection model and to validate the trained congestion detection model. The module implementation process may refer to step 320.
The masked area determination module 230 may be used to determine a masked area in combination with the congestion detection area.
In some embodiments, the masked area determination module 230 may determine the masked area by determining a relevant polygon of the congestion detection area or an enlarged relevant polygon. For example, the relevant polygon may be a minimum bounding rectangle, and it may be enlarged by extending its long sides and short sides, respectively. In some embodiments, the masked area determination module 230 may also determine the masked area from the lane lines of the congestion detection area. The module implementation process can be seen in step 340, flow 400 and flow 500.
The determination module 240 may be used to determine whether a suspected illegally parked vehicle is actually in violation.
In some embodiments, the determination module 240 may determine, based on information related to the suspected illegally parked vehicle and in combination with the congestion detection area, whether the vehicle is illegally parked. In some embodiments, parking violations in the congestion area may be filtered by determining whether the suspected vehicle is located in a masked area determined based on the congestion detection area. The module implementation process can be seen in step 350.
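The membership check the determination module performs is, geometrically, a point-in-polygon test. The patent does not prescribe an algorithm; the sketch below uses standard ray casting, with a hypothetical rectangular masked area and a vehicle position taken as the center of its detection box:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: True if pt lies inside the polygon (vertex list)."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray through pt
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

masked_area = [(0, 0), (500, 0), (500, 300), (0, 300)]  # hypothetical masked area
print(point_in_polygon((250, 150), masked_area))  # True  -> filter out the report
print(point_in_polygon((600, 150), masked_area))  # False -> keep as a violation
```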
The output module 250 may be used to output the determination result.
In some embodiments, the output module 250 may output the determination result of a violation event to the terminal 130. In some embodiments, the output module 250 may report the related information of vehicles determined to be illegally parked, and delete or otherwise process the information of vehicles determined not to be illegally parked. The module implementation process can be seen in step 360.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the congestion area filtering system 200 and the modules thereof is merely for convenience of description and is not intended to limit the present application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, in some embodiments, the data acquisition module 210, the congestion detection module 220, the masked area determination module 230, the determination module 240, and the output module 250 disclosed in fig. 2 may be different modules in a system, or may be a module that implements the functions of two or more modules described above. Each of the modules in the congestion area filtering system 200 may share one storage module, and each of the modules may have its own storage module. Such variations are within the scope of the present application.
Fig. 3 is an exemplary flow chart of a congestion area filtering method according to some embodiments of the present application. As shown in fig. 3, a congestion area filtering method specifically includes the following steps:
at step 310, an input road condition image is acquired. In some embodiments, step 310 may be implemented by acquisition module 210.
In some embodiments, the road condition image is an image reflecting traffic road condition information, such as vehicle travel conditions, lane congestion conditions, traffic flow conditions, and the like. In some embodiments, the road condition image may be continuously captured by a fixed image capturing device provided in the road in advance. In some embodiments, the fixed image acquisition device may include one or any combination of a road condition monitoring device, a security monitoring device, an intersection violation monitoring device, a speed measurement monitoring device, a traffic flow monitoring device, and the like. In some embodiments, the monitoring/surveillance device may be any device with camera functionality. In some embodiments, the road condition images may be captured continuously by a mobile image capture device. In some embodiments, the mobile image capture device may include one or any combination of a vehicle event recorder, a camera, a cell phone, a tablet, a computer, smart glasses, and the like.
In some embodiments, the acquired road condition image may be various types of image information such as video, photos, and the like. In some embodiments, the acquired road condition image may be real-time video information for reflecting the road condition information in real time. In some embodiments, the acquired road condition image may be a recorded video of the road, reflecting road condition information for a certain period of time in the past. In some embodiments, the acquired road condition image may be one or more photographs reflecting road condition information at a certain time point.
In some embodiments, the road condition information may be obtained in real time all day, at certain time intervals, or within a predetermined specific time period. In some embodiments, the acquisition module 210 may acquire the road condition images at intervals according to a set time threshold. In some embodiments, the time threshold may be set based on the processing power of the system, or determined based on road traffic flow; for example, the larger the traffic flow, the smaller the time interval. In some embodiments, the acquisition module 210 may acquire images only within certain time periods, such as periods prone to congestion. In some embodiments, acquisition may be scheduled for congestion peak periods such as commuting hours and skipped during low-traffic times such as the early morning. In some embodiments, the specific time period can be flexibly set according to the traffic commuting conditions or other relevant information of different cities or regions.
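One way to realize the flow-dependent interval mentioned above is a simple threshold table; the thresholds and interval values below are illustrative assumptions, not values from the patent:

```python
def acquisition_interval(vehicles_per_minute, base_interval=60, min_interval=5):
    """Polling interval in seconds: heavier traffic flow -> shorter interval."""
    if vehicles_per_minute >= 30:   # heavy flow: sample as often as possible
        return min_interval
    if vehicles_per_minute >= 10:   # moderate flow
        return base_interval // 4
    return base_interval            # light flow: default interval

print(acquisition_interval(40))  # 5
print(acquisition_interval(15))  # 15
print(acquisition_interval(2))   # 60
```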
In some embodiments, the acquired road condition image further includes image-related feature information such as the capturing time, the capturing place, the type of the image, and the acquisition mode. In some embodiments, the acquired road condition image may be processed directly in step 320, or may first be filtered to remove road condition images in which congestion is clearly absent. In some embodiments, images in which significant congestion can be ruled out may be filtered out based on the number of vehicles passing in a period of time, the location where the image was captured (e.g., a sufficiently remote location), the time the image was captured (e.g., around two or three o'clock in the morning), and the like.
And 320, acquiring a congestion detection area through a congestion detection model based on the road condition image. In some embodiments, step 320 may be implemented by congestion detection module 220.
In some embodiments, the congestion detection area is an area of the road condition image in which a congestion state is detected, i.e., the road is congested and vehicles travel slowly or even cannot move forward. In some embodiments, congestion detection areas often occur in cities where roads have few lanes but large traffic flows, or where surrounding roads are under construction. In some embodiments, the congestion area also has certain temporal characteristics: congestion easily forms in certain periods, such as working days and commuting rush hours.
In some embodiments, the congestion detection area in the road condition image may be directly obtained by a congestion detection model. In some embodiments, the input of the congestion detection model is a road condition image, the output is whether a congestion detection area is included in the road condition image, and if the congestion detection area is included, the detected congestion detection area needs to be output at the same time.
In some embodiments, the congestion detection model is a machine learning model. It may be a supervised learning model for classification, such as a linear classifier (e.g., logistic regression), naive Bayes (NB), K-nearest neighbors (KNN), decision trees (DT), or ensemble models (e.g., random forest or gradient-boosted decision trees). It may also be a supervised learning model for regression, such as linear regression, a support vector machine (SVM), K-nearest neighbors (KNN), regression trees, or ensemble models (e.g., ExtraTrees, random forest, gradient-boosted decision trees).
In some embodiments, the machine learning model may be trained on labeled sample images. In some embodiments, the training process comprises: acquiring sample road condition images; labeling whether each sample image contains a congestion detection area and, if so, the specific position of that area in the image; and inputting the labeled sample images into the machine learning model for training to obtain a trained congestion detection model. In some embodiments, the sample images may be divided into a training set and a validation set, where the images in the training set are used to train the congestion detection model and the images in the validation set are used to validate the trained model.
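The training-set/validation-set division described above can be sketched as follows; the 80/20 ratio and the sample tuple layout (image path, congestion flag, region polygon) are illustrative assumptions:

```python
import random

def split_samples(samples, train_ratio=0.8, seed=0):
    """Shuffle labeled samples and split them into training and validation sets."""
    rng = random.Random(seed)          # fixed seed for a reproducible split
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical labels: (image path, contains congestion?, region polygon or None)
samples = [("frame_%03d.jpg" % i, i % 2 == 0, None) for i in range(100)]
train_set, val_set = split_samples(samples)
print(len(train_set), len(val_set))  # 80 20
```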
Step 330, acquiring information on a suspected illegally parked vehicle. In some embodiments, step 330 may be implemented by the acquisition module 210.
In some embodiments, a suspected illegal parking vehicle is a vehicle that stops traveling in the parking violation area for some reason; it may be a vehicle that stops temporarily for a short time or one that stops for a long time. In some embodiments, images of suspected illegal parking vehicles may be acquired by different kinds of image acquisition devices, which may be stationary or mobile.
In some embodiments, the parking violation area can be a specific area, such as one or more of an emergency lane, a motorway, a fire lane, a sidewalk, an emergency escape route, the doorway of a convenience store, an entrance/exit for people and vehicles, a gas station or toll-gate queuing area, and the like. In some embodiments, the parking violation area is fixed, while a congestion area is uncertain in both time and location; a suspected illegal parking vehicle may therefore have stopped traveling in the parking violation area because of lane congestion. In such a case, the obtained suspected illegal parking vehicles need to be filtered to exclude vehicles that would otherwise be judged illegally parked merely because of congestion.
In some embodiments, the image capture device automatically captures a photograph of a vehicle when it enters the parking violation area. If the vehicle drives out of the parking violation area within a certain time (for example, within 5 minutes), the captured photograph may be automatically deleted; if it does not, the image capture device automatically captures a second photograph and uploads the vehicle photographs. In some embodiments, a suspected illegal parking vehicle is a vehicle that has not exited the parking violation area within the set time and for which a second photograph has been captured but not yet reported.
In some embodiments, the suspected illegal parking vehicle information may include a license plate number, vehicle model, vehicle color, vehicle owner information, the suspected illegal parking time, the suspected illegal parking location, and the like. In some embodiments, the image acquisition device may identify the vehicle's license plate number, and other information about the vehicle (e.g., vehicle model, owner information) may then be retrieved using the license plate number. In some embodiments, the suspected illegal parking time and location may be obtained directly from the capture time recorded by the image acquisition device and the location at which the device is installed.
Step 340, determining a shielding area based on the congestion detection area. In some embodiments, step 340 may be implemented by the shielded region determination module 230.
In some embodiments, a shielded area is an area within which events may be filtered under certain circumstances. In some embodiments, a shielded area is an area within which suspected illegal parking vehicles that have stopped traveling for some reason may be filtered. In some embodiments, a suspected illegal parking vehicle may stop traveling in the violation area because of lane congestion, a traffic accident, or other reasons. In the embodiments described herein, a shielded area is mainly an area in which suspected illegal parking vehicles that stopped because of congestion are filtered; within this area, such vehicles are not judged to be illegally parked.
In some embodiments, the shielded area may be determined from the congestion detection area by different methods. For example, the congestion detection area output in step 320 may be used directly as the shielded area. In some embodiments, however, the detected congestion detection area may not include all congested vehicles or cover the entire congested lane, so a vehicle that stops outside the congestion detection area because of the same lane congestion would not be filtered and would be reported as a violation. Therefore, the congestion detection area may be enlarged in different ways, and the enlarged area determined as the shielded area before filtering. In some embodiments, enlarging the congestion detection area may include determining a related polygon or extending boundary lines; the detailed steps are described with reference to fig. 4 and fig. 5 and are not repeated here.
Step 350, judging whether the suspected illegal parking vehicle is located in the shielding area. In some embodiments, step 350 may be implemented by decision module 240.
In some embodiments, when a vehicle is identified as a suspected illegal parking vehicle, whether it is actually illegally parked is determined based on the information related to the vehicle in combination with the determined shielded area. In some embodiments, this information may include the license plate number, vehicle model, vehicle color, owner information, suspected illegal parking time, suspected illegal parking location, and the like. In some embodiments, suspected illegal parking vehicles can be filtered flexibly according to this information. For example, if the suspected parking time falls outside the restricted hours of the parking violation area, the suspected illegal parking vehicle may be filtered directly.
In some embodiments, it is further determined whether the suspected illegal parking vehicle is located in the shielded area determined in step 340, where the shielded area may be the congestion detection area itself or an enlarged version of it. In some embodiments, if a suspected illegal parking vehicle is located within the determined shielded area, it may be filtered without being judged an illegally parked vehicle; if it is not located in the shielded area, it may be judged an illegally parked vehicle and reported. In some embodiments, whether the suspected illegal parking vehicle is located in the shielded area may be determined by different methods, such as area comparison, angle summation, the ray-casting method, or other algorithms achieving the same result.
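The ray-casting ("ray method") test mentioned above can be sketched as follows; the rectangle standing in for the shielded area and the sample vehicle positions are illustrative only.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: cast a horizontal ray to the right from `point` and
    count how many polygon edges it crosses; an odd count means inside."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the horizontal ray cross edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

shield = [(0, 0), (10, 0), (10, 5), (0, 5)]    # hypothetical shielded area
print(point_in_polygon((3, 2), shield))    # vehicle inside  -> filtered (True)
print(point_in_polygon((12, 2), shield))   # vehicle outside -> may be reported (False)
```

The same test works for any simple polygon, so it also covers shielded areas built from extended boundary lines in the later steps.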
Step 360, outputting the determination result. In some embodiments, step 360 may be implemented by output module 250.
In some embodiments, output is produced based on the above determination. In some embodiments, if the suspected illegal parking vehicle is located in the shielded area and is not judged to be illegally parked, its information is not reported; the vehicle record and related information may then be processed further or deleted directly. In some embodiments, if the suspected illegal parking vehicle is judged to be illegally parked, its information may be reported and, according to the specific circumstances of the violation, penalties such as license points or fines may be imposed.
In some embodiments, for a suspected illegal parking vehicle that is not judged to be illegally parked, the vehicles immediately in front of and behind it that have also stopped in the parking violation area are, with some probability, located in the shielded area as well. If those vehicles are nevertheless judged to be illegally parked, the shielded area may not have covered their positions; in that case the shielded area may be re-determined and enlarged to some extent before the judgment is repeated.
In some embodiments, the congestion area filtering system 200 may also be used for purposes other than filtering parking violation events, such as planning a vehicle travel route, selecting a vehicle parking location, and the like.
It should be noted that the above description related to the flow 300 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of this disclosure. However, such modifications and variations are intended to be within the scope of the present application. For example, step 330 may be performed simultaneously with steps 310, 320, 340, or may be performed before step 310.
Fig. 4 is an exemplary flow diagram of a masked region determination method 400 shown according to some embodiments of the present application. In some embodiments, the masked zone determination method 400 may be performed by the congestion zone filtering system 200.
At step 410, a relevant polygon is determined based on the congestion detection area. In some embodiments, step 410 may be implemented by the shielded region determination module 230.
In some embodiments, the congestion detection area may be used directly as the shielded area, but it often cannot cover the entire congested region and therefore needs to be expanded to widen its coverage. In some embodiments, a related polygon may be determined based on the congestion detection area; the related polygon circumscribes the boundary of the congestion detection area and has a larger extent. In some embodiments, the related polygon extends the coverage of the congestion detection area to some degree, i.e., enlarges the shielded area for parking violation events, thereby reducing the false alarm rate of such events.
In some embodiments, the related polygon may be a Minimum Bounding Rectangle (MBR), also referred to as a minimum circumscribed rectangle. In some embodiments, the related polygon may also be a circumscribed diamond, a circumscribed circle, etc. of the congestion detection area. In some embodiments, an axis-aligned bounding rectangle represents the maximum extent of one or more two-dimensional shapes (e.g., points, lines, polygons) in two-dimensional coordinates, i.e., the rectangle bounded by the minimum abscissa, maximum abscissa, minimum ordinate, and maximum ordinate among the vertices of the given shapes. In some embodiments, the minimum bounding rectangle is the circumscribing rectangle of minimum area or minimum perimeter of a two-dimensional polygon; it may be oriented in any direction and is not limited to being parallel to the coordinate axes. In some embodiments, the minimum bounding rectangle of the congestion detection area may be calculated by a rotating calipers algorithm, a convex hull algorithm, or the like.
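The axis-aligned variant described above (the rectangle bounded by the minimum/maximum abscissa and ordinate of the vertices) can be sketched directly; an arbitrarily-oriented minimum-area rectangle would instead use rotating calipers on the convex hull (e.g., OpenCV's `cv2.minAreaRect`). The example polygon below is hypothetical.

```python
def bounding_rectangle(points):
    """Axis-aligned bounding rectangle of a 2-D point set: the rectangle
    bounded by the min/max abscissa and ordinate among the vertices."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    # Vertices p1..p4 in order around the rectangle
    return [(x_min, y_min), (x_max, y_min), (x_max, y_max), (x_min, y_max)]

congestion_area = [(2, 1), (6, 0), (7, 4), (3, 5)]   # detected polygon (toy)
print(bounding_rectangle(congestion_area))
# -> [(2, 0), (7, 0), (7, 5), (2, 5)]
```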
Step 420, determining the long side and the short side of the related polygon. In some embodiments, step 420 may be implemented by the shielded region determination module 230.
In some embodiments, the boundary of the calculated minimum bounding rectangle may also be expanded or lengthened to enlarge the shielded area. In some embodiments, the long sides and short sides of the minimum bounding rectangle need to be determined so that the boundary can be extended along them. The long sides are the relatively long pair of the rectangle's four sides, and the short sides are the relatively short pair.
In some embodiments, the long side and the short side can be determined as follows: let the four vertices of the minimum bounding rectangle be p1, p2, p3, and p4 in order; calculate the distance Lp1p2 between p1 and p2 and the distance Lp2p3 between p2 and p3; compare Lp1p2 with Lp2p3. If Lp1p2 is greater than Lp2p3, the long sides are the side formed by p1 and p2 and the side formed by p3 and p4, and the short sides are the side formed by p2 and p3 and the side formed by p1 and p4. If Lp2p3 is greater than Lp1p2, the long sides are the side formed by p2 and p3 and the side formed by p1 and p4, and the short sides are the side formed by p1 and p2 and the side formed by p3 and p4. In some embodiments, the long and short sides may also be determined by comparing differences between the vertex coordinates of the minimum bounding rectangle, or by other methods that achieve the same effect.
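The comparison above can be sketched as a small function; the sample rectangle is illustrative.

```python
from math import dist

def classify_sides(p1, p2, p3, p4):
    """Given rectangle vertices in order, compare |p1p2| with |p2p3| to
    decide which pair of opposite sides is the long pair."""
    if dist(p1, p2) > dist(p2, p3):
        long_sides = [(p1, p2), (p3, p4)]
        short_sides = [(p2, p3), (p4, p1)]
    else:
        long_sides = [(p2, p3), (p4, p1)]
        short_sides = [(p1, p2), (p3, p4)]
    return long_sides, short_sides

rect = [(0, 0), (8, 0), (8, 3), (0, 3)]     # toy minimum bounding rectangle
long_sides, short_sides = classify_sides(*rect)
print(long_sides[0])    # -> ((0, 0), (8, 0))
```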
In some embodiments, once the long sides and short sides of the minimum bounding rectangle are determined, they may be extended respectively, thereby expanding the boundary of the rectangle and enlarging the shielded area.
And 430, extending the long sides and the short sides to the road condition image boundary. In some embodiments, step 430 may be implemented by the shielded region determination module 230.
In some embodiments, since the congestion detection area often cannot completely cover the entire congested region, even after it is enlarged to its minimum bounding rectangle there may still be vehicles affected by the congestion outside the rectangle, and these vehicles would not be filtered. It is therefore desirable to extend the boundary of the minimum bounding rectangle to enlarge the shielded area further, so as to filter as many vehicles in the congested region as possible.
In some embodiments, the long sides determined in step 420 are extended to the road condition image boundary, and the short sides determined in step 420 are likewise extended to the image boundary. In some embodiments, these steps may be performed in any order or alternately. In some embodiments, only the long sides may be extended to the image boundary without extending the short sides, or only the short sides may be extended without extending the long sides.
Step 440, determining an intersection of the extension line and the road condition image boundary. In some embodiments, step 440 may be implemented by shielded region determination module 230.
In some embodiments, the intersections of the extension lines of the long and short sides with the road condition image boundary may be determined by setting up a two-dimensional coordinate system and calculating the intersection coordinates. In some embodiments, the road condition image boundary may be placed on the coordinate axes, with one of the image corners as the origin; it may be any of the upper left, lower left, upper right, or lower right corner. In some embodiments, if the upper left corner of the image is the origin, then horizontally to the right is the positive X direction and vertically downward is the positive Y direction.
In some embodiments, if the slope of the line containing the extension is greater than 0, its intersection with the upper-left boundary of the image falls into two cases, solved as follows: substitute x = 0 into the line equation to obtain y; if y is greater than 0, the intersection lies on the left boundary of the image (as shown in fig. 6A) with coordinates (0, y); if y is less than 0, the intersection lies on the upper boundary (as shown in fig. 6B), and x is obtained by substituting y = 0 into the line equation, giving coordinates (x, 0).
In some embodiments, if the slope of the line containing the extension is greater than 0, its intersection with the lower-right boundary of the image falls into two cases, solved as follows: let the image width (the length of the upper and lower boundaries) be width and the height (the length of the left and right boundaries) be height; substitute x = width into the line equation to obtain y; if y is less than height, the intersection lies on the right boundary (as shown in fig. 6C) with coordinates (width, y); if y is greater than height, the intersection lies on the lower boundary (as shown in fig. 6D), and x is obtained by substituting y = height, giving coordinates (x, height).
In some embodiments, if the slope of the line containing the extension is less than 0, its intersection with the upper-right boundary of the image falls into two cases, solved as follows: substitute x = width into the line equation to obtain y; if y is greater than 0, the intersection lies on the right boundary (as shown in fig. 6E) with coordinates (width, y); if y is less than 0, the intersection lies on the upper boundary (as shown in fig. 6F), and x is obtained by substituting y = 0, giving coordinates (x, 0).
In some embodiments, if the slope of the line containing the extension is less than 0, its intersection with the lower-left boundary of the image falls into two cases, solved as follows: substitute x = 0 into the line equation to obtain y; if y is less than height, the intersection lies on the left boundary (as shown in fig. 6G) with coordinates (0, y); if y is greater than height, the intersection lies on the lower boundary (as shown in fig. 6H), and x is obtained by substituting y = height, giving coordinates (x, height).
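The per-slope case analysis above amounts to clipping the line y = kx + b against the four image boundaries. A unified sketch (rather than the enumerated cases) follows; the sample line and image size are illustrative.

```python
def boundary_intersections(k, b, width, height):
    """Intersections of the line y = k*x + b with the image rectangle
    [0, width] x [0, height] (origin at top-left, Y pointing down),
    mirroring the per-case solving described above."""
    pts = []
    for x in (0, width):              # left and right boundaries
        y = k * x + b
        if 0 <= y <= height:
            pts.append((x, y))
    for y in (0, height):             # upper and lower boundaries
        if k != 0:
            x = (y - b) / k
            if 0 <= x <= width:
                pts.append((x, y))
    return pts

# Slope > 0: the line enters at the left boundary and exits at the lower boundary
print(boundary_intersections(k=2, b=1, width=100, height=50))
# -> [(0, 1), (24.5, 50)]
```

A corner intersection appears once per boundary it lies on, so callers that need exactly two distinct points should deduplicate.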
In some embodiments, the intersection point of the extension line of the long side and the road condition image boundary and the intersection point of the extension line of the short side and the road condition image boundary may be solved separately, without mutual influence, and the respective corresponding intersection points are solved. In some embodiments, if only the long side needs to be extended or only the short side needs to be extended, only the intersection of the extension line that needs to be extended and the road condition image boundary needs to be solved.
And step 450, determining an area surrounded by the road condition image boundary and the extension line as a shielding area. In some embodiments, step 450 may be implemented by the shielded region determination module 230.
In some embodiments, the four intersections of the two extension lines of the long sides (or short sides) with the road condition image boundary may be solved according to step 440. In some embodiments, if only one of the two pairs of sides needs to be extended, connecting the four solved intersections encloses a region bounded by the road condition image boundary and the extension lines, and this region may be determined as the shielded area.
In some embodiments, if the long sides and short sides both need to be extended, there are four intersections of the long-side extensions with the road condition image boundary and four intersections of the short-side extensions. In some embodiments, the four long-side intersections and the four short-side intersections each enclose a region, and the final shielded area can be obtained by combining the two regions.
In some embodiments, based on the determined masked area, it may be determined whether the vehicle belongs to the masked area in step 350, thereby achieving the purpose of filtering the congested area.
It should be noted that the above description related to the flow 400 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 400 may occur to those skilled in the art in light of the teachings herein. However, such modifications and variations are intended to be within the scope of the present application. For example, the relevant polygon determined in step 410 may be directly used as a mask region, and the like.
FIG. 5 is an exemplary flow chart of another masked region determination method according to some embodiments of the present application. In some embodiments, the masked zone determination method 500 may be performed by the congestion zone filtering system 200.
Step 510, determining a lane line of the congestion detection area. In some embodiments, step 510 may be implemented by the shielded region determination module 230.
In some embodiments, lane lines are traffic markings used to separate the flows of traffic traveled by vehicles; they are typically white or yellow, and dashed or solid. In some embodiments, another method of determining the shielded area is to identify the lane lines in the congestion detection area.
In some embodiments, lane lines in the congestion detection area may be identified by color selection, ROI (region of interest) selection, or color selection combined with ROI selection. In some embodiments, lane lines may also be identified by edge detection, the Hough transform, or other effective methods. In some embodiments, the lane lines identified in the congestion detection area assist in determining the shielded area to be filtered.
And step 520, extending the lane line of the congestion detection area. In some embodiments, step 520 may be implemented by the shielded region determination module 230.
In some embodiments, in the same road condition image, if a vehicle in a lane is located in the congestion detection area, the vehicles in front of and behind it in the same lane are also affected by the congestion and located in the congested region. In some embodiments, vehicles in front of and behind the congestion detection area therefore also need to be shielded, so the lane lines identified in the congestion detection area can be extended directly to the road condition image boundary.
In some embodiments, the lane lines of the congestion detection area determined in step 510 may be extended to the boundary of the road condition image, or only by a predetermined distance (for example, 5, 10, 20, or 30 meters) or to some other limit.
Step 530, determining a shielding area based on the extension line of the lane line. In some embodiments, step 530 may be implemented by the shielded region determination module 230.
In some embodiments, the intersections of the extended lane lines with the road condition image boundary may be solved in a manner similar to step 440, or by other methods achieving the same effect, and are not described again here. In some embodiments, the four intersections of the extension lines with the image boundary are connected, and the enclosed region, bounded by the road condition image boundary and the extension lines, may be determined as the shielded area.
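Extending a detected lane-line segment to the image boundary and collecting the four intersections can be sketched as follows; the two sample lane lines and the image size are hypothetical, with the image origin at the top-left and Y pointing down as in step 440.

```python
def extend_to_boundary(p, q, width, height):
    """Extend segment p-q (a detected lane line) to its intersections
    with the image rectangle [0, width] x [0, height]."""
    (x1, y1), (x2, y2) = p, q
    if x1 == x2:                       # vertical lane line
        return [(x1, 0), (x1, height)]
    k = (y2 - y1) / (x2 - x1)          # slope of the lane line
    b = y1 - k * x1
    pts = []
    for x in (0, width):               # left and right boundaries
        y = k * x + b
        if 0 <= y <= height:
            pts.append((x, y))
    for y in (0, height):              # upper and lower boundaries
        if k != 0:
            x = (y - b) / k
            if 0 <= x <= width:
                pts.append((x, y))
    return pts[:2]

# Two lane lines bounding a congested lane (toy coordinates)
left = extend_to_boundary((10, 40), (20, 30), width=100, height=50)
right = extend_to_boundary((40, 40), (50, 30), width=100, height=50)
print(left + right)   # four intersections; connecting them encloses the shielded area
```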
In some embodiments, based on the determined masked area, it may be determined whether the vehicle belongs to the masked area in step 350, thereby achieving the purpose of filtering the congested area.
It should be noted that the above description related to the flow 500 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 500 may occur to those skilled in the art upon review of the present application. However, such modifications and variations are intended to be within the scope of the present application.
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) filtering vehicle violation events in a congested area, thereby shielding violation events in that area; (2) enlarging the shielded area by determining the minimum bounding rectangle of the congestion detection area, reducing the false alarm rate of parking violation events; and (3) extending the long and short sides of the minimum bounding rectangle to further enlarge the shielded area and reduce the false alarm rate of parking violation events. It is to be noted that different embodiments may produce different advantages; in different embodiments, any one or combination of the above advantages, or any other advantage, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like, a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, a dynamic programming language such as Python, Ruby, and Groovy, or other programming languages, and the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any network format, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to require more features than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, etc. are used in some embodiments, it being understood that such numerals used in the description of the embodiments are modified in some instances by the use of the modifier "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ± 20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameter should take into account the specified significant digits and employ a general digit preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the range are approximations, in the specific examples, such numerical values are set forth as precisely as possible within the scope of the application.
The entire contents of each patent, patent application, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, documents, and the like, are hereby incorporated by reference into this application, except for any application history document that is inconsistent with or conflicts with the present disclosure, and any document that would limit the broadest scope of the claims (whether currently appended to, or later added to, this application). It is noted that if the descriptions, definitions, and/or use of terms in the material incorporated into this application are inconsistent with or contrary to those in the present application, the descriptions, definitions, and/or use of terms in the present application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (16)

1. A congestion area filtering method, comprising:
acquiring an input road condition image;
acquiring a congestion detection area through a congestion detection model based on the road condition image;
acquiring suspected illegal parking vehicle information;
determining, based on information related to the suspected illegal parking vehicle information and in combination with the congestion detection area, whether the vehicle is illegally parked;
and outputting a result of the determination.
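Taken together, the steps of claim 1 amount to a filtering pipeline: detect a congestion area, detect suspected illegally parked vehicles, and suppress suspects that fall inside the congestion area (a stopped vehicle in congestion is treated as stuck in traffic, not illegally parked). The sketch below is a hypothetical orchestration only; every function name and signature is an assumption, as the patent does not prescribe an implementation.

```python
def filter_congestion_area(image, detect_congestion, detect_suspects, in_mask):
    """Orchestration sketch of claim 1 (all callables are hypothetical).

    detect_congestion(image) -> mask region derived from the congestion
        detection model's output;
    detect_suspects(image)   -> positions of suspected illegal-parking vehicles;
    in_mask(pos, mask)       -> containment test for the mask region.
    Returns (position, is_illegal) pairs; suspects inside the congestion
    mask are filtered out (marked as not illegally parked).
    """
    mask = detect_congestion(image)
    results = []
    for vehicle in detect_suspects(image):
        # A suspect inside the congestion area is assumed to be queuing
        # in traffic rather than illegally parked.
        illegal = not in_mask(vehicle, mask)
        results.append((vehicle, illegal))
    return results
```

With stub detectors, a vehicle inside the mask is cleared while one outside it remains flagged.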
2. The congestion area filtering method according to claim 1, wherein determining, based on the information related to the suspected illegal parking vehicle information and in combination with the congestion detection area, whether a vehicle is illegally parked comprises:
determining a mask area based on the congestion detection area;
and determining whether the suspected illegal parking vehicle is located in the mask area.
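The containment check of claim 2 could be implemented with a standard ray-casting point-in-polygon test. The routine below is an illustrative assumption; the patent does not specify which geometric test is used.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is `point` inside the closed `polygon`?

    `polygon` is a list of (x, y) vertices in order. Hypothetical helper;
    the patent does not specify how containment in the mask area is tested.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from (x, y) cross edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

For a convex mask area (such as a bounding rectangle), simpler half-plane tests would also suffice.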
3. The congestion area filtering method according to claim 2, wherein determining the mask area based on the congestion detection area comprises:
determining an associated polygon based on the congestion detection area;
and enlarging the associated polygon to obtain the mask area.
4. The congestion area filtering method according to claim 3, wherein the associated polygon may be a minimum bounding rectangle.
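The minimum bounding rectangle of claim 4 would in practice usually be the rotated minimum-area rectangle of the detected region (e.g. OpenCV's cv2.minAreaRect). As a stdlib-only illustration, the sketch below computes the simpler axis-aligned bounding rectangle of a set of detection points; treating that as the associated polygon is a simplifying assumption, not the patent's stated method.

```python
def bounding_rectangle(points):
    """Axis-aligned bounding rectangle of a congestion detection area.

    Simplification: a true minimum bounding rectangle may be rotated
    (cf. cv2.minAreaRect); this axis-aligned box only illustrates the idea.
    `points` is an iterable of (x, y) detection coordinates.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    # Corners in clockwise order, starting at the top-left.
    return [(x_min, y_min), (x_max, y_min), (x_max, y_max), (x_min, y_max)]
```

The resulting quadrilateral can then be enlarged (per claim 3) to form the mask area.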
5. The congestion area filtering method according to claim 3, wherein enlarging the associated polygon to obtain the mask area comprises:
determining the long sides and the short sides of the associated polygon;
and determining the mask area based on extension lines of the long sides and the short sides and the road condition image boundary.
6. The congestion area filtering method according to claim 5, wherein determining the mask area based on the extension lines of the long sides and the short sides and the road condition image boundary comprises:
extending the long sides and the short sides to the road condition image boundary;
determining intersections of the extension lines and the road condition image boundary;
and determining an area enclosed by the road condition image boundary and the extension lines as the mask area.
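Claims 5 and 6 extend the rectangle's sides until they meet the image boundary and take the enclosed region as the mask area. A minimal geometric sketch of the edge-extension step follows, under the assumption that the image is the coordinate rectangle [0, width] x [0, height]; the patent does not give the exact geometry routine.

```python
def extend_edge_to_boundary(p1, p2, width, height):
    """Extend the line through p1 and p2 until it meets the image boundary.

    Returns the two intersection points of the infinite line through p1 and
    p2 with the image rectangle [0, width] x [0, height]. Illustrative
    sketch of the edge-extension step of claim 6.
    """
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    hits = []
    # Intersect with the vertical boundaries x = 0 and x = width.
    if dx != 0:
        for x_b in (0, width):
            t = (x_b - x1) / dx
            y = y1 + t * dy
            if 0 <= y <= height:
                hits.append((x_b, y))
    # Intersect with the horizontal boundaries y = 0 and y = height.
    if dy != 0:
        for y_b in (0, height):
            t = (y_b - y1) / dy
            x = x1 + t * dx
            if 0 <= x <= width:
                hits.append((x, y_b))
    # Deduplicate corner hits and return the two extreme points.
    unique = sorted(set(hits))
    return unique[0], unique[-1]
```

Applying this to each side of the associated polygon yields the extension lines whose intersections with the image boundary enclose the mask area.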
7. The congestion area filtering method according to claim 2, wherein determining the mask area based on the congestion detection area further comprises:
determining the mask area based on lane lines of the congestion detection area.
8. A congestion zone filtering system, comprising:
an acquisition module, configured to acquire an input road condition image and to acquire suspected illegal parking vehicle information;
a congestion detection module, configured to acquire a congestion detection area through a congestion detection model based on the road condition image;
a determination module, configured to determine, based on information related to the suspected illegal parking vehicle information and in combination with the congestion detection area, whether the vehicle is illegally parked;
and an output module, configured to output a result of the determination.
9. The congestion zone filtering system according to claim 8, wherein the system further comprises:
a mask area determination module, configured to determine a mask area based on the congestion detection area;
and the determination module is further configured to determine whether the suspected illegal parking vehicle is located in the mask area.
10. The congestion zone filtering system according to claim 9, wherein the mask area determination module is configured to:
determine an associated polygon based on the congestion detection area;
and enlarge the associated polygon to obtain the mask area.
11. The congestion zone filtering system according to claim 10, wherein the associated polygon may be a minimum bounding rectangle.
12. The congestion zone filtering system according to claim 10, wherein the mask area determination module is configured to:
determine the long sides and the short sides of the associated polygon;
and determine the mask area based on extension lines of the long sides and the short sides and the road condition image boundary.
13. The congestion zone filtering system according to claim 12, wherein the mask area determination module is configured to:
extend the long sides and the short sides to the road condition image boundary;
determine intersections of the extension lines and the road condition image boundary;
and determine an area enclosed by the road condition image boundary and the extension lines as the mask area.
14. The congestion zone filtering system according to claim 9, wherein the mask area determination module is further configured to:
determine the mask area based on lane lines of the congestion detection area.
15. An apparatus for congestion area filtering, comprising a processor, wherein the processor is configured to perform the method for congestion area filtering according to any one of claims 1 to 7.
16. A computer readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of congestion area filtering according to any one of claims 1 to 7.
CN201910907355.1A 2019-09-24 2019-09-24 Method and system for filtering congestion area Pending CN112633039A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910907355.1A CN112633039A (en) 2019-09-24 2019-09-24 Method and system for filtering congestion area


Publications (1)

Publication Number Publication Date
CN112633039A true CN112633039A (en) 2021-04-09

Family

ID=75282851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910907355.1A Pending CN112633039A (en) 2019-09-24 2019-09-24 Method and system for filtering congestion area

Country Status (1)

Country Link
CN (1) CN112633039A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114023071A (en) * 2021-12-02 2022-02-08 东软集团股份有限公司 Traffic violation prompting method and device, storage medium and electronic equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06251283A (en) * 1993-02-22 1994-09-09 Matsushita Electric Ind Co Ltd Traffic control system
CN103824452A (en) * 2013-11-22 2014-05-28 银江股份有限公司 Lightweight peccancy parking detection device based on full view vision
CN105046960A (en) * 2015-07-10 2015-11-11 潘进 Method and apparatus for analyzing road congestion state and detecting illegal parking
CN105809975A (en) * 2016-05-30 2016-07-27 北京精英智通科技股份有限公司 Abnormal parking judgment method and abnormal parking judgment device
CN106128110A (en) * 2016-07-15 2016-11-16 尚艳燕 A kind of method and apparatus utilizing balance car to improve traffic congestion
CN107067734A (en) * 2017-04-11 2017-08-18 山东大学 A kind of urban signal controlling intersection vehicles are detained peccancy detection method
CN108460970A (en) * 2017-09-11 2018-08-28 江苏本能科技有限公司 The recognition methods of road vehicle traffic behavior and system
CN109543647A (en) * 2018-11-30 2019-03-29 国信优易数据有限公司 A kind of road abnormality recognition method, device, equipment and medium
CN109830108A (en) * 2019-02-26 2019-05-31 北京汽车股份有限公司 The processing method of rule-breaking vehicle parking


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WEILING CHEN et al.: "Unauthorized Parking Detection using Deep Networks at Real Time", 2019 IEEE International Conference on Smart Computing (SMARTCOMP), 1 August 2019 (2019-08-01), pages 459-463 *
XUEMEI XIE et al.: "Real-Time Illegal Parking Detection System Based on Deep Learning", Computer Vision and Pattern Recognition, 5 October 2017 (2017-10-05), pages 1-5 *
ZHAO Chengqiang: "Detection and Monitoring Implementation System for Vehicle Traffic Violations on Highways", China Master's Theses Full-text Database, Information Science and Technology, no. 1, 15 January 2019 (2019-01-15), pages 138-4386 *
JIN Longfei: "Research on Vehicle Violation Detection Methods Based on Dome Cameras", China Master's Theses Full-text Database, Information Science and Technology, no. 8, 15 August 2015 (2015-08-15), pages 138-1303 *


Similar Documents

Publication Publication Date Title
US11790699B2 (en) Systems and methods for traffic violation detection
US11380105B2 (en) Identification and classification of traffic conflicts
KR102122859B1 (en) Method for tracking multi target in traffic image-monitoring-system
CN109003455B (en) Method and device for reminding vehicle owner of illegal parking behavior
CN108806272B (en) Method and device for reminding multiple motor vehicle owners of illegal parking behaviors
CN110188807A (en) Tunnel pedestrian target detection method based on cascade super-resolution network and improvement Faster R-CNN
US9177214B1 (en) Method and apparatus for an adaptive threshold based object detection
CN112041908A (en) System and method for monitoring traffic sign violations
CN110738150B (en) Camera linkage snapshot method and device and computer storage medium
KR102122850B1 (en) Solution for analysis road and recognition vehicle license plate employing deep-learning
CN107529659B (en) Seatbelt wearing detection method, device and electronic equipment
US20220139090A1 (en) Systems and methods for object monitoring
WO2021014464A1 (en) System, multi-utility device and method to monitor vehicles for road saftey
JP2022008672A (en) Information processing apparatus, information processing method, and program
CN107534717B (en) Image processing device and traffic violation management system with same
CN116420058A (en) Replacing autonomous vehicle data
CN110782653A (en) Road information acquisition method and system
Kahlon et al. An intelligent framework to detect and generate alert while cattle lying on road in dangerous states using surveillance videos
CN112633039A (en) Method and system for filtering congestion area
CN114141022B (en) Emergency lane occupation behavior detection method and device, electronic equipment and storage medium
KR102434154B1 (en) Method for tracking multi target in traffic image-monitoring-system
CN111985304A (en) Patrol alarm method, system, terminal equipment and storage medium
CN115440071B (en) Automatic driving illegal parking detection method
Arul et al. Modelling and Simulation of Smart Traffic Light System for Emergency Vehicle using Image Processing Techniques
Kiac et al. ADEROS: artificial intelligence-based detection system of critical events for road security

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination