CN117152949A - Traffic event identification method and system based on unmanned aerial vehicle - Google Patents

Traffic event identification method and system based on unmanned aerial vehicle

Info

Publication number
CN117152949A
CN117152949A (application number CN202310990535.7A)
Authority
CN
China
Prior art keywords
vehicle
target
unmanned aerial
aerial vehicle
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310990535.7A
Other languages
Chinese (zh)
Inventor
闫军
张冠洲
宋瑞丹
王鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Love Parking Technology Co ltd
Original Assignee
Love Parking Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Love Parking Technology Co ltd filed Critical Love Parking Technology Co ltd
Priority to CN202310990535.7A priority Critical patent/CN117152949A/en
Publication of CN117152949A publication Critical patent/CN117152949A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125Traffic data processing
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Abstract

The application discloses a traffic event identification method and system based on an unmanned aerial vehicle, relating to the field of intelligent urban traffic management. Vehicle moving targets are identified and extracted from images acquired by the unmanned aerial vehicle, and traffic event information for the corresponding road section is identified and acquired from the motion feature change information of each vehicle target. Because the unmanned aerial vehicle can fly flexibly to the location of a traffic event and capture images at close range, the accuracy of image acquisition is ensured; and because the unmanned aerial vehicle is not fixed to a monitoring pole but patrols flexibly along a preset patrol route, both the identification accuracy and the coverage rate of traffic event identification are improved.

Description

Traffic event identification method and system based on unmanned aerial vehicle
Technical Field
The application relates to the field of urban intelligent traffic management, in particular to a traffic event identification method and system based on unmanned aerial vehicles.
Background
If traffic events such as congestion or accidents can be effectively identified and promptly published to surrounding traffic participants, vehicles and pedestrians can plan their response in advance, which effectively reduces accidents, relieves congestion and improves traffic efficiency. At present, traffic events are generally identified from video, but the video equipment is typically mounted on fixed lamp posts and monitoring poles, so its flexibility is poor; it cannot work effectively at night or in poor light, so event information cannot be published in time; and current monitoring equipment is expensive while its coverage rate and accuracy are low, so it cannot meet the need to monitor traffic events across the whole road network.
Disclosure of Invention
In order to solve the above technical problems, the application provides a traffic event identification method and system based on an unmanned aerial vehicle, which can solve the problems of high hardware cost, low identification accuracy and low coverage rate in existing whole-road-network traffic event identification.
To achieve the above object, in one aspect, the present application provides a traffic event recognition method based on an unmanned aerial vehicle, the method comprising:
corresponding road section inspection and image acquisition are carried out according to a preset unmanned aerial vehicle inspection plan;
identifying and extracting a vehicle moving target from images acquired by the unmanned aerial vehicle;
tracking each vehicle moving target according to a preset tracker model to obtain movement characteristic change information corresponding to each vehicle target;
and identifying and acquiring traffic event information corresponding to the road section according to the motion characteristic change information corresponding to each vehicle target.
Further, the step of identifying and extracting the vehicle moving object from the image acquired by the unmanned aerial vehicle comprises the following steps:
processing an image acquired by the unmanned aerial vehicle through a Gaussian mixture algorithm, and establishing an image background model;
performing differencing on the image according to the image background model and a background difference algorithm to obtain the foreground moving target in the image;
and filtering the foreground moving target image through adaptive threshold segmentation, morphological denoising and shadow removal to obtain the moving vehicle target.
Further, the step of tracking each vehicle moving object according to the preset tracker model to obtain the motion characteristic change information corresponding to each vehicle object respectively includes:
establishing a continuous adaptive mean shift (CamShift) tracker for each vehicle moving target and tracking and acquiring the motion feature information of each vehicle moving target;
acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter;
and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
Further, the step of establishing a continuous adaptive mean shift tracker for each vehicle moving object and tracking and acquiring the motion characteristic information of each vehicle moving object comprises the following steps:
selecting a search window with an initial size containing the tracking target from the color probability distribution diagram corresponding to the image;
acquiring the centroid of the search window as the centroid point of each vehicle moving target;
resetting the size of the search window according to its centroid and re-acquiring the centroid point of each vehicle moving target;
and when the change value of the centroid point of each vehicle moving object corresponding to different search window sizes is smaller than a preset threshold value, acquiring the long axis, the short axis and the direction angle of each vehicle moving object.
Further, the step of identifying and acquiring traffic event information corresponding to the road section according to the motion feature change information corresponding to each vehicle target includes:
performing multi-feature weighted fusion on the motion feature change information corresponding to each vehicle target, so as to obtain the traffic event.
In another aspect, the present application provides a traffic event recognition system based on an unmanned aerial vehicle, the system comprising:
the acquisition unit is used for carrying out corresponding road section inspection and image acquisition according to a preset unmanned aerial vehicle inspection plan;
the identification unit is used for identifying and extracting a vehicle moving object from the image acquired by the unmanned aerial vehicle;
the acquisition unit is used for tracking each vehicle moving target according to a preset tracker model to acquire movement characteristic change information corresponding to each vehicle target respectively;
the acquisition unit is also used for identifying and acquiring traffic event information corresponding to the road section according to the motion characteristic change information corresponding to each vehicle target.
Further, the identification unit is specifically configured to process the image acquired by the unmanned aerial vehicle through a Gaussian mixture algorithm and establish an image background model; perform differencing on the image according to the image background model and a background difference algorithm to obtain the foreground moving target in the image; and filter the foreground moving target image through adaptive threshold segmentation, morphological denoising and shadow removal to obtain the moving vehicle target.
Further, the acquiring unit is specifically configured to establish a continuous adaptive mean shift tracker for each vehicle moving object and track and acquire motion feature information of each vehicle moving object; acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter; and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
Further, the acquiring unit is specifically further configured to select a search window whose initial size contains the tracking target from the color probability distribution map corresponding to the image; reset the size of the search window according to its centroid and re-acquire the centroid point of each vehicle moving target; and, when the change in the centroid point of each vehicle moving target between different search window sizes is smaller than a preset threshold, acquire the long axis, the short axis and the direction angle of each vehicle moving target.
Further, the obtaining unit is specifically further configured to perform multi-feature weighted fusion on motion feature change information corresponding to each vehicle target, so as to obtain a traffic event.
According to the traffic event identification method and system based on the unmanned aerial vehicle, vehicle moving targets are identified and extracted from images acquired by the unmanned aerial vehicle, and traffic event information for the corresponding road section is identified and acquired from the motion feature change information of each vehicle target. Because the unmanned aerial vehicle can fly flexibly to the location of a traffic event and capture images at close range, the accuracy of image acquisition is ensured; and because the unmanned aerial vehicle does not need to be fixed to a monitoring pole but can patrol flexibly along a preset patrol route, both the identification accuracy and the coverage rate of traffic event identification are improved.
Drawings
FIG. 1 is a flow chart of a traffic event identification method based on an unmanned aerial vehicle provided by the application;
fig. 2 is a schematic structural diagram of a traffic event recognition system based on an unmanned aerial vehicle.
Detailed Description
The technical scheme of the application is further described in detail through the drawings and the embodiments.
As shown in fig. 1, the traffic event recognition method based on the unmanned aerial vehicle provided by the embodiment of the application comprises the following steps:
101. and carrying out corresponding road section inspection and image acquisition according to a preset unmanned aerial vehicle inspection plan.
For the embodiment of the present application, before step 101, the method may further include: S1: configuring the unmanned aerial vehicle system, which comprises the unmanned aerial vehicle itself, an identification camera, an edge computing box, communication equipment and power equipment, and embedding the traffic event identification algorithm into the unmanned aerial vehicle; S2: registering the unmanned aerial vehicle, i.e. recording its number in the management system, after which task allocation, state monitoring and the like can be performed on it; S3: calibrating the unmanned aerial vehicle route using a high-precision map in the system, so as to confirm the navigation path of the unmanned aerial vehicle; S4: setting the task plan in the system, including the patrol time, patrol road sections, patrolling unmanned aerial vehicles, patrol duration, patrol frequency and the like. Step 101 may then specifically include: after the unmanned aerial vehicle receives the system plan, automatically performing road section inspection and collecting traffic event data.
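For illustration, the registration and task-plan information of steps S2 and S4 might be represented by a small record like the following (a minimal Python sketch; every field name and value here is hypothetical, not taken from the patent):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PatrolPlan:
    """Hypothetical patrol-task record mirroring steps S2 and S4."""
    drone_id: str              # number registered in the management system (S2)
    road_segments: List[str]   # patrol road sections (S4)
    start_time: str            # patrol time, e.g. "07:30" (S4)
    duration_min: int          # patrol duration per sortie, in minutes (S4)
    sorties_per_day: int       # patrol frequency (S4)

plan = PatrolPlan(
    drone_id="UAV-0001",
    road_segments=["segment-A", "segment-B"],
    start_time="07:30",
    duration_min=45,
    sorties_per_day=4,
)
print(plan.drone_id, len(plan.road_segments))  # UAV-0001 2
```

After registration, the management system would iterate over such records to dispatch sorties; the record itself is only a stand-in for whatever schema the system actually uses.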
102. And identifying and extracting a vehicle moving target from the image acquired by the unmanned aerial vehicle.
For the embodiment of the present application, step 102 may specifically include: processing the image acquired by the unmanned aerial vehicle through a Gaussian mixture algorithm and establishing an image background model; performing differencing on the image according to the image background model and a background difference algorithm to obtain the foreground moving target in the image; and filtering the foreground moving target image through adaptive threshold segmentation, morphological denoising and shadow removal to obtain the moving vehicle target.
For example, a background model can be established from the road video image information using a Gaussian mixture algorithm; a foreground moving target is then obtained with a background difference algorithm, using the difference between the current frame and the background, followed by adaptive threshold segmentation, morphological denoising and shadow removal, so that the foreground image contains only the moving vehicle targets.
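As a simplified illustration of this pipeline, the sketch below substitutes a running-average background model for the full Gaussian mixture model and a fixed threshold for the adaptive segmentation; all parameter values (learning rate, threshold, patch geometry) are assumed, not from the patent:

```python
import numpy as np

# Synthetic stand-in for drone video: a static background plus a moving
# 10x10 "vehicle" patch, with Gaussian sensor noise.
rng = np.random.default_rng(0)
H, W = 60, 80
background = np.full((H, W), 90.0)

bg_model = background.copy()
alpha = 0.05  # background learning rate (assumed value)

for t in range(20):
    frame = background + rng.normal(0, 2, (H, W))   # sensor noise
    frame[20:30, t:t + 10] = 200.0                  # moving "vehicle" patch
    diff = np.abs(frame - bg_model)                 # background differencing
    fg_mask = diff > 40                             # fixed threshold for the sketch
    # update the model only where the pixel is judged to be background
    bg_model = np.where(fg_mask, bg_model, (1 - alpha) * bg_model + alpha * frame)

print(int(fg_mask.sum()))  # area of the detected moving patch: 100 pixels
```

A real implementation would replace the running average with a per-pixel mixture of Gaussians and add the morphological denoising and shadow-removal steps described above.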
103. And tracking each vehicle moving target according to a preset tracker model to acquire movement characteristic change information corresponding to each vehicle target.
For the embodiment of the present application, step 103 may specifically include: establishing a continuous self-adaptive mean shift tracker for each vehicle moving target and tracking and acquiring the moving characteristic information of each vehicle moving target; acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter; and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
Further, the step of establishing a continuous adaptive mean shift tracker for each vehicle moving target and tracking and acquiring the motion feature information of each vehicle moving target includes: selecting a search window whose initial size contains the tracking target from the color probability distribution map corresponding to the image; resetting the size of the search window according to its centroid and re-acquiring the centroid point of each vehicle moving target; and, when the change in the centroid point of each vehicle moving target between different search window sizes is smaller than a preset threshold, acquiring the long axis, the short axis and the direction angle of each vehicle moving target.
For example, in the first step, a search window of initial size S_i containing the tracking target is selected from the color probability distribution map. In the second step, the zero-order moment of the search window is calculated as M_{00} = Σ I(x_i, y_i), and the first-order moments in x and y as M_{10} = Σ x_i I(x_i, y_i) and M_{01} = Σ y_i I(x_i, y_i), where I(x_i, y_i) is the pixel value at image coordinates (x_i, y_i) and (x_i, y_i) ranges over the search window S_i. In the third step, the centroid of the search window, taken as the centroid point of the target, is calculated as x_c = M_{10}/M_{00}, y_c = M_{01}/M_{00}. In the fourth step, the size s of the search window is reset as a function of the color probability distribution in the search window area. In the fifth step, the second to fourth steps are repeated until the centroid change is smaller than a given threshold. By further calculating the second-order moments M_{20} = Σ x_i² I(x_i, y_i), M_{02} = Σ y_i² I(x_i, y_i) and M_{11} = Σ x_i y_i I(x_i, y_i), the long axis, short axis and direction angle of the tracked target can be obtained: with a = M_{20}/M_{00} − x_c², b = 2(M_{11}/M_{00} − x_c y_c) and c = M_{02}/M_{00} − y_c², the direction angle of the long axis of the target is θ = (1/2) arctan(b / (a − c)), and the long and short axes are l = sqrt(((a + c) + sqrt(b² + (a − c)²)) / 2) and w = sqrt(((a + c) − sqrt(b² + (a − c)²)) / 2). In the video image, the target does not change greatly between two adjacent frames; therefore, to reduce the amount of calculation, the color probability distribution need not be computed for all pixels of every frame, but only for pixels in a region slightly larger than the current search window. The CamShift algorithm has a small amount of calculation and achieves a good tracking effect against a simple background; against a complex background, however, the algorithm does not predict the moving target, and so cannot cope with large areas of background similar in color to the target, with occlusion of the target, or with tracking failure caused by the target moving too fast.
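The second- and third-step moment computations can be sketched in NumPy as follows (the helper name `window_moments` and the synthetic probability map are ours, not from the patent):

```python
import numpy as np

def window_moments(prob, x0, y0, w, h):
    """Zero/first/second image moments of a search window in a color
    probability map, as used by the CamShift steps described above."""
    win = prob[y0:y0 + h, x0:x0 + w]
    ys, xs = np.mgrid[y0:y0 + h, x0:x0 + w]
    M00 = win.sum()                                  # zero-order moment
    xc = (xs * win).sum() / M00                      # centroid from first moments
    yc = (ys * win).sum() / M00
    # central second moments give the direction angle of the long axis
    a = (xs ** 2 * win).sum() / M00 - xc ** 2
    b = 2 * ((xs * ys * win).sum() / M00 - xc * yc)
    c = (ys ** 2 * win).sum() / M00 - yc ** 2
    theta = 0.5 * np.arctan2(b, a - c)
    return (xc, yc), theta

# A uniform blob centred at (30, 20), elongated along x, so the long
# axis should lie at roughly 0 rad.
prob = np.zeros((40, 60))
prob[18:23, 20:41] = 1.0
(cx, cy), theta = window_moments(prob, 0, 0, 60, 40)
print(round(cx, 1), round(cy, 1), round(theta, 3))  # 30.0 20.0 0.0
```

In the full algorithm these moments are recomputed each iteration while the window size is reset from M_{00}, until the centroid movement drops below the threshold.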
Further, kalman filtering is divided into two phases, prediction and correction: the first prediction stage mainly includes state prediction and error covariance prediction. And the correction stage mainly comprises calculation of a Kalman filter gain coefficient and correction of an observed value and an error covariance by using the gain coefficient, wherein the obtained correction value is closer to target real information. From the above reasoning, the Kalman filtering algorithm contains two models: signal model: x is x k =A k X k-1 +B k W k And (3) observing a model: z is Z k =H k X k +V k In which x is k And z k A state vector and an observation vector, respectively; a is that k Is a state transition matrix, B k Is an input matrix H k Is an observation matrix; dynamic noise W k (covariance is Q) and observed noise V k (covariance R) is the mean white noise sequence that is uncorrelated. Let the state vector be expressed as X k =[x k ,y k ,v xk ,v yk ] T Wherein x is k ,y k The components of the object centroid coordinates in the x, y axes, v xk ,v yk The velocity components of the target in the x, y axes, respectively. Observation vector Z k =[x k ,y k ] T x k ,y k Representing the components of the centroid of the object currently observed in the x, y direction, respectively. And obtaining a motion equation according to Newton's motion theorem on the x axis:wherein t is a time variable, w k Indicating acceleration. Similar equations are also found on the y-axis.
104. And identifying and acquiring traffic event information corresponding to the road section according to the motion characteristic change information corresponding to each vehicle target.
For the embodiment of the present application, step 104 may specifically include: and carrying out multi-feature weighted fusion on the motion feature change information corresponding to each vehicle target respectively to obtain traffic events.
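A toy version of such a fusion step might look like this (the feature names, weights and decision threshold are all hypothetical, chosen only to show the weighted-sum mechanics, not disclosed by the patent):

```python
# Per-vehicle motion-change features, each normalised to [0, 1], are combined
# with assumed weights into a single event score for the road section.
WEIGHTS = {"speed_drop": 0.5, "stopped_ratio": 0.3, "direction_change": 0.2}

def event_score(features: dict) -> float:
    """Weighted sum of motion-feature-change values (missing features count as 0)."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def classify(features: dict, threshold: float = 0.6) -> str:
    """Flag a traffic event when the fused score crosses the assumed threshold."""
    return "congestion/incident" if event_score(features) >= threshold else "normal"

jammed = {"speed_drop": 0.9, "stopped_ratio": 0.8, "direction_change": 0.1}
free_flow = {"speed_drop": 0.1, "stopped_ratio": 0.0, "direction_change": 0.05}
print(classify(jammed), classify(free_flow))  # congestion/incident normal
```

A production system would learn the weights and threshold from labelled traffic data rather than fixing them by hand.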
Further, the identified traffic event can be sent to a cloud platform. The cloud platform comprises data preprocessing, an identification algorithm, and graded, classified data publication; the data preprocessing comprises data quality grading, frame extraction and fitting, and the identification algorithm comprises an artificial intelligence recognition algorithm and a fusion algorithm. The cloud platform has a data connection to data applications, which include lane-level traffic event reminders.
According to the traffic event identification method based on the unmanned aerial vehicle, vehicle moving targets are identified and extracted from images acquired by the unmanned aerial vehicle, and traffic event information for the corresponding road section is identified and acquired from the motion feature change information of each vehicle target. Because the unmanned aerial vehicle can fly flexibly to the location of a traffic event and capture images at close range, the accuracy of image acquisition is ensured; and because the unmanned aerial vehicle does not need to be fixed to a monitoring pole but patrols flexibly along a preset patrol route, both the identification accuracy and the coverage rate of traffic event identification are improved.
In order to implement the method provided by the embodiment of the present application, the embodiment of the present application provides a traffic event recognition system based on an unmanned aerial vehicle, as shown in fig. 2, the system includes: an acquisition unit 21, an identification unit 22, an acquisition unit 23;
the acquisition unit 21 is used for carrying out corresponding road section inspection and image acquisition according to a preset unmanned aerial vehicle inspection plan;
an identifying unit 22, configured to identify and extract a vehicle moving object from an image acquired by the unmanned aerial vehicle;
an obtaining unit 23, configured to track each vehicle moving object according to a preset tracker model to obtain motion feature change information corresponding to each vehicle object respectively;
the acquiring unit 23 is further configured to identify and acquire traffic event information corresponding to the road segment according to the motion feature change information corresponding to each vehicle target.
Further, the identifying unit 22 is specifically configured to process the image collected by the unmanned aerial vehicle through a Gaussian mixture algorithm and establish an image background model; perform differencing on the image according to the image background model and a background difference algorithm to obtain the foreground moving target in the image; and filter the foreground moving target image through adaptive threshold segmentation, morphological denoising and shadow removal to obtain the moving vehicle target.
Further, the acquiring unit 23 is specifically configured to establish a continuous adaptive mean shift tracker for each vehicle moving object and track and acquire motion feature information of each vehicle moving object; acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter; and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
Further, the obtaining unit 23 is specifically further configured to select a search window whose initial size contains the tracking target from the color probability distribution map corresponding to the image; reset the size of the search window according to its centroid and re-acquire the centroid point of each vehicle moving target; and, when the change in the centroid point of each vehicle moving target between different search window sizes is smaller than a preset threshold, acquire the long axis, the short axis and the direction angle of each vehicle moving target.
Further, the obtaining unit 23 is specifically further configured to perform multi-feature weighted fusion on motion feature change information corresponding to each vehicle target, so as to obtain a traffic event.
According to the traffic event identification method and system based on the unmanned aerial vehicle, vehicle moving targets are identified and extracted from images acquired by the unmanned aerial vehicle, and traffic event information for the corresponding road section is identified and acquired from the motion feature change information of each vehicle target. Because the unmanned aerial vehicle can fly flexibly to the location of a traffic event and capture images at close range, the accuracy of image acquisition is ensured; and because the unmanned aerial vehicle does not need to be fixed to a monitoring pole but can patrol flexibly along a preset patrol route, both the identification accuracy and the coverage rate of traffic event identification are improved.
It should be understood that the specific order or hierarchy of steps in the processes disclosed are examples of exemplary approaches. Based on design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate preferred embodiment of this application.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art will recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, as used in the specification or claims, the term "includes" is intended to be inclusive in a manner similar to the term "comprising" as interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in the specification or claims is intended to mean a non-exclusive "or".
Those of skill in the art will further appreciate that the various illustrative logical blocks (illustrative logical block), units, and steps described in connection with the embodiments of the application may be implemented by electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components (illustrative components), elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Those skilled in the art may implement the described functionality in varying ways for each particular application, but such implementation is not to be understood as beyond the scope of the embodiments of the present application.
The various illustrative logical blocks or units described in the embodiments of the application may be implemented or performed with a general purpose processor, a digital signal processor, an application specific integrated circuit (ASIC), a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the general purpose processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. In an example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, which may reside in a user terminal. In the alternative, the processor and the storage medium may reside as distinct components in a user terminal.
In one or more exemplary designs, the above-described functions of embodiments of the present application may be implemented in hardware, software, firmware, or any combination of the three. If implemented in software, the functions may be stored on a computer-readable medium or transmitted as one or more instructions or code on the computer-readable medium. Computer-readable media include both computer storage media and communication media that facilitate transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose computer. For example, such computer-readable media may include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to carry or store program code in the form of instructions or data structures readable by a general or special purpose computer, or a general or special purpose processor. Further, any connection is properly termed a computer-readable medium; for example, if the software is transmitted from a website, server, or other remote source via a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, these are also included in the definition of computer-readable medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included within the scope of computer-readable media.
The foregoing description of the embodiments is provided to illustrate the general principles of the application and is not intended to limit the scope of the application to the particular embodiments shown; any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the application are intended to be included within the scope of the application.

Claims (10)

1. A traffic event recognition method based on an unmanned aerial vehicle, the method comprising:
performing road section inspection and image acquisition according to a preset unmanned aerial vehicle inspection plan;
identifying and extracting a vehicle moving target from images acquired by the unmanned aerial vehicle;
tracking each vehicle moving target according to a preset tracker model to obtain motion characteristic change information corresponding to each vehicle target respectively;
and identifying and acquiring traffic event information corresponding to the road section according to the motion characteristic change information corresponding to each vehicle target.
2. The unmanned aerial vehicle-based traffic event recognition method of claim 1, wherein the step of recognizing and extracting the vehicle moving object from the image collected by the unmanned aerial vehicle comprises:
processing an image acquired by the unmanned aerial vehicle through a Gaussian mixture algorithm, and establishing an image background model;
performing background difference on the image according to the image background model and a background difference algorithm to obtain a foreground moving object in the image;
and filtering the foreground moving object image through adaptive threshold segmentation, morphological denoising, and shadow removal to obtain a moving vehicle target.
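The detection pipeline of claim 2 (background modeling, background difference, then morphological cleanup) can be sketched as follows. This is an editor's illustrative simplification, not the patent's implementation: it uses a single running Gaussian per pixel in place of a full Gaussian mixture, a fixed k-sigma threshold in place of adaptive segmentation, and a crude 3x3 opening in place of production morphology; all function names are hypothetical.

```python
import numpy as np

def update_background(mean, var, frame, alpha=0.05):
    """Running per-pixel Gaussian background model (a single-Gaussian
    simplification of the mixture model named in the claim)."""
    mean = (1 - alpha) * mean + alpha * frame
    var = (1 - alpha) * var + alpha * (frame - mean) ** 2
    return mean, var

def foreground_mask(mean, var, frame, k=2.5):
    """Background difference: pixels more than k standard deviations
    from the background mean are marked as moving foreground."""
    return np.abs(frame - mean) > k * np.sqrt(var + 1e-6)

def morphological_open(mask, r=1):
    """Crude opening (erosion then dilation over a (2r+1)^2 window)
    to suppress speckle noise in the foreground mask."""
    eroded = mask.copy()
    for dy in (-r, 0, r):
        for dx in (-r, 0, r):
            eroded &= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    dilated = eroded.copy()
    for dy in (-r, 0, r):
        for dx in (-r, 0, r):
            dilated |= np.roll(np.roll(eroded, dy, axis=0), dx, axis=1)
    return dilated
```

A practical system would typically use a library mixture-of-Gaussians subtractor instead; the point here is only the three-stage model/difference/filter structure the claim recites.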
3. The traffic event recognition method based on an unmanned aerial vehicle according to claim 1, wherein the step of tracking each vehicle moving object according to a preset tracker model to obtain the movement characteristic change information corresponding to each vehicle object respectively comprises the following steps:
establishing a continuously adaptive mean shift (CamShift) tracker for each vehicle moving target, and tracking to acquire motion characteristic information of each vehicle moving target;
acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter;
and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
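The next-moment prediction step of claim 3 is commonly realized with a constant-velocity Kalman filter over the target centroid. The sketch below is illustrative only; the state layout, noise magnitudes, and class name are the editor's assumptions, not taken from the patent.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal constant-velocity Kalman filter over (x, y) centroids,
    predicting the vehicle target's motion state at the next moment."""

    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])   # state: x, y, vx, vy
        self.P = np.eye(4) * 10.0                # state covariance
        self.F = np.eye(4)                       # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                # we only measure position
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * q                   # process noise
        self.R = np.eye(2) * r                   # measurement noise

    def predict(self):
        """Propagate the state one frame ahead; returns predicted (x, y)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        """Correct the prediction with the tracker's measured centroid."""
        z = np.array([zx, zy])
        innovation = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

Comparing each frame's measured centroid against the filter's prediction yields the per-target motion change signal the claim describes.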
4. The unmanned aerial vehicle-based traffic event recognition method of claim 3, wherein the step of establishing a continuous adaptive mean shift tracker for each vehicle moving object and tracking motion feature information of each vehicle moving object comprises:
selecting a search window with an initial size containing the tracking target from the color probability distribution diagram corresponding to the image;
acquiring the centroid of the search window as the centroid point of each vehicle moving target;
resetting the size of the search window according to the centroid of the search window, and re-acquiring the centroid point of each vehicle moving target;
and when the change value of the centroid point of each vehicle moving object corresponding to different search window sizes is smaller than a preset threshold value, acquiring the long axis, the short axis and the direction angle of each vehicle moving object.
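The inner loop of claim 4 — shift the search window to the centroid of the color probability distribution until the centroid stops moving, then read the target's axes and direction angle from image moments — can be sketched as below. This is an editor's minimal sketch on a precomputed probability map (window-size re-estimation is omitted, and the function names are hypothetical), not the patent's implementation.

```python
import numpy as np

def mean_shift_window(prob, cx, cy, w, h, max_iter=20, eps=0.5):
    """Shift a w x h search window over the color probability map until
    its centroid moves by less than eps (the claim's preset threshold)."""
    rows, cols = prob.shape
    for _ in range(max_iter):
        x0, x1 = int(max(cx - w / 2, 0)), int(min(cx + w / 2, cols))
        y0, y1 = int(max(cy - h / 2, 0)), int(min(cy + h / 2, rows))
        patch = prob[y0:y1, x0:x1]
        m00 = patch.sum()                      # zeroth moment (total weight)
        if m00 == 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx = (patch * xs).sum() / m00          # centroid inside the window
        ny = (patch * ys).sum() / m00
        converged = abs(nx - cx) < eps and abs(ny - cy) < eps
        cx, cy = nx, ny
        if converged:
            break
    return cx, cy

def orientation_axes(prob, cx, cy):
    """Major axis, minor axis, and direction angle from the second-order
    central moments around the converged centroid."""
    ys, xs = np.mgrid[0:prob.shape[0], 0:prob.shape[1]]
    m00 = prob.sum()
    mu20 = (prob * (xs - cx) ** 2).sum() / m00
    mu02 = (prob * (ys - cy) ** 2).sum() / m00
    mu11 = (prob * (xs - cx) * (ys - cy)).sum() / m00
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    major = np.sqrt(2 * (mu20 + mu02 + common))
    minor = np.sqrt(2 * (mu20 + mu02 - common))
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return major, minor, angle
```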
5. The traffic event recognition method based on an unmanned aerial vehicle according to claim 3, wherein the step of identifying and acquiring the traffic event information corresponding to the road section according to the motion feature change information respectively corresponding to each vehicle target comprises:
and carrying out multi-feature weighted fusion on the motion feature change information corresponding to each vehicle target respectively to obtain traffic events.
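The multi-feature weighted fusion of claim 5 amounts to combining several normalized motion-change features into one event score. The feature names, weights, and threshold below are the editor's illustrative assumptions; the patent does not specify them.

```python
def fuse_motion_features(features, weights=None, threshold=0.6):
    """Weighted fusion of per-vehicle motion-change features into a
    single score; a score at or above the threshold flags a candidate
    traffic event. Feature names and weights are illustrative only."""
    if weights is None:
        weights = {"speed_drop": 0.5, "stop_duration": 0.3, "heading_change": 0.2}
    # Missing features default to 0.0 so partial observations still fuse.
    score = sum(weights[k] * features.get(k, 0.0) for k in weights)
    return score, score >= threshold
```

For example, a target whose speed collapses and which then remains stationary would score high and be flagged, while a free-flowing vehicle would not.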
6. A traffic event recognition system based on an unmanned aerial vehicle, the system comprising:
the acquisition unit is used for carrying out corresponding road section inspection and image acquisition according to a preset unmanned aerial vehicle inspection plan;
the identification unit is used for identifying and extracting a vehicle moving object from the image acquired by the unmanned aerial vehicle;
the acquisition unit is used for tracking each vehicle moving target according to a preset tracker model to acquire movement characteristic change information corresponding to each vehicle target respectively;
the acquisition unit is also used for identifying and acquiring traffic event information corresponding to the road section according to the motion characteristic change information corresponding to each vehicle target.
7. The unmanned aerial vehicle-based traffic event recognition system of claim 6, wherein,
the identification unit is specifically used for processing the image acquired by the unmanned aerial vehicle through a Gaussian mixture algorithm and establishing an image background model; differentiating the image according to the image background model and a background differentiating algorithm to obtain a foreground moving object in the image; and filtering the foreground moving target image through self-adaptive threshold segmentation, morphological denoising and shadow removal to obtain a moving vehicle target.
8. The unmanned aerial vehicle-based traffic event recognition system of claim 6, wherein,
the acquisition unit is specifically used for establishing a continuous self-adaptive mean shift tracker for each vehicle moving target and tracking and acquiring the moving characteristic information of each vehicle moving target; acquiring motion characteristic information of a vehicle target at the next moment through a Kalman filter; and acquiring motion characteristic change information corresponding to each vehicle target respectively according to the motion characteristic information of the current vehicle moving target and the motion characteristic information of the vehicle target at the next moment.
9. The unmanned aerial vehicle-based traffic event recognition system of claim 8, wherein,
the acquisition unit is specifically further configured to select a search window with an initial size including the tracking target from a color probability distribution diagram corresponding to the image; resetting the size of the search window and acquiring the centroid point of each vehicle moving object according to the centroid of the search frame; and when the change value of the centroid point of each vehicle moving object corresponding to different search window sizes is smaller than a preset threshold value, acquiring the long axis, the short axis and the direction angle of each vehicle moving object.
10. The unmanned aerial vehicle-based traffic event recognition system of claim 8, wherein,
the acquisition unit is specifically further used for carrying out multi-feature weighted fusion on the motion feature change information corresponding to each vehicle target respectively to obtain traffic events.
CN202310990535.7A 2023-08-08 2023-08-08 Traffic event identification method and system based on unmanned aerial vehicle Pending CN117152949A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310990535.7A CN117152949A (en) 2023-08-08 2023-08-08 Traffic event identification method and system based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310990535.7A CN117152949A (en) 2023-08-08 2023-08-08 Traffic event identification method and system based on unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN117152949A true CN117152949A (en) 2023-12-01

Family

ID=88901721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310990535.7A Pending CN117152949A (en) 2023-08-08 2023-08-08 Traffic event identification method and system based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN117152949A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117391911A (en) * 2023-12-08 2024-01-12 日照先森网络科技股份有限公司 Smart city comprehensive management method and system
CN117391911B (en) * 2023-12-08 2024-02-27 日照先森网络科技股份有限公司 Smart city comprehensive management method and system

Similar Documents

Publication Publication Date Title
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
CN109087510B (en) Traffic monitoring method and device
CN112581612B (en) Vehicle-mounted grid map generation method and system based on fusion of laser radar and all-round-looking camera
Chang et al. Video analytics in smart transportation for the AIC'18 challenge
CN110348332B (en) Method for extracting multi-target real-time trajectories of non-human machines in traffic video scene
CN110688902B (en) Method and device for detecting vehicle area in parking space
CN111144337B (en) Fire detection method and device and terminal equipment
CN110136174B (en) Target object tracking method and device
CN112347933A (en) Traffic scene understanding method and device based on video stream
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN117152949A (en) Traffic event identification method and system based on unmanned aerial vehicle
CN114841910A (en) Vehicle-mounted lens shielding identification method and device
CN110276318A (en) Nighttime road rains recognition methods, device, computer equipment and storage medium
CN111723805B (en) Method and related device for identifying foreground region of signal lamp
CN117242489A (en) Target tracking method and device, electronic equipment and computer readable medium
CN115761668A (en) Camera stain recognition method and device, vehicle and storage medium
CN113177504B (en) Vehicle queuing information detection method and device, electronic equipment and storage medium
CN115482477B (en) Road identification method, device, unmanned aerial vehicle, equipment and storage medium
CN115994934B (en) Data time alignment method and device and domain controller
CN115482478B (en) Road identification method, device, unmanned aerial vehicle, equipment and storage medium
Subash Automatic road extraction from satellite images using extended Kalman filtering and efficient particle filtering
CN113129331B (en) Target movement track detection method, device, equipment and computer storage medium
CN113205144B (en) Model training method and device
CN116912517B (en) Method and device for detecting camera view field boundary
CN116844115A (en) Method for tracing and monitoring high-altitude parabolic objects and computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination