CN112818954B - Vehicle state identification method, device, equipment and medium - Google Patents


Info

Publication number: CN112818954B (application CN202110285834.1A)
Authority: CN (China)
Prior art keywords: vehicle; monitoring; identified; determining; monitoring equipment
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other languages: Chinese (zh)
Other versions: CN112818954A
Inventor: Wu Tao (吴涛)
Current and original assignee: Chongqing Unisinsight Technology Co., Ltd. (the listed assignees may be inaccurate)
Application filed by Chongqing Unisinsight Technology Co., Ltd.
Events: publication of application CN112818954A; application granted; publication of grant CN112818954B; anticipated expiration

Classifications

    • G06V20/52 — Surveillance or monitoring of activities, e.g. for recognising suspicious objects (G Physics; G06 Computing, calculating or counting; G06V Image or video recognition or understanding; G06V20/00 Scenes, scene-specific elements; G06V20/50 Context or environment of the image)
    • G06V2201/08 — Detecting or categorising vehicles (G06V2201/00 Indexing scheme relating to image or video recognition or understanding)

Abstract

In the vehicle state identification method, device, equipment and medium, the electronic equipment constructs the upstream-downstream relationship between the monitoring devices in a preset area according to the monitoring records of historical vehicles in that area; then, a target vehicle in a running state is determined from the vehicles to be identified based on the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and the corresponding first monitoring device. In this way, vehicles in a running state can be counted in any region.

Description

Vehicle state identification method, device, equipment and medium
Technical Field
The present application relates to the field of data processing, and in particular, to a method, an apparatus, a device, and a medium for recognizing a vehicle state.
Background
At present, traffic supervision departments use internet-of-things technology to supervise road traffic conditions in urban areas, so as to facilitate work such as traffic congestion dispersion, accident handling, and area distribution and control.
However, the inventors have found through research that it is currently difficult to detect vehicles in a travelling state in an urban area, owing to the limited image capture range of the monitoring devices.
Disclosure of Invention
In order to overcome at least one of the deficiencies in the prior art, in a first aspect, an embodiment of the present application provides a vehicle state identification method applied to an electronic device, where the method includes:
acquiring a monitoring record of at least one historical vehicle in a preset area, wherein the monitoring record comprises the historical time at which each historical vehicle was photographed and the corresponding monitoring device;
constructing an upstream-downstream relation between the monitoring devices according to the sequence of the historical time corresponding to each historical vehicle;
determining at least one vehicle to be identified in the preset area;
and determining a target vehicle in a running state from the at least one vehicle to be identified according to the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and a corresponding first monitoring device, wherein the first monitoring device is one of the monitoring devices.
In a second aspect, an embodiment of the present application provides a vehicle state identification device, which is applied to an electronic device, and includes:
the data acquisition module is used for acquiring a monitoring record of at least one historical vehicle in a preset area, wherein the monitoring record comprises the historical time at which each historical vehicle was photographed and the corresponding monitoring device;
the relation determining module is used for constructing an upstream-downstream relation between the monitoring devices according to the sequence of the historical time corresponding to each historical vehicle;
the vehicle determining module is used for determining at least one vehicle to be identified in the preset area;
and the state determining module is used for determining a target vehicle in a running state from the at least one vehicle to be identified according to the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and a corresponding first monitoring device, wherein the first monitoring device is one of the monitoring devices.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a processor and a memory, where the memory stores a computer program, and the computer program, when executed by the processor, implements the vehicle state identification method.
In a fourth aspect, an embodiment of the present application provides a storage medium, where a computer program is stored, and when the computer program is executed by a processor, the method for recognizing a vehicle state is implemented.
Compared with the prior art, the method has the following beneficial effects:
in the vehicle state identification method, device, equipment and medium provided by the embodiment of the application, the electronic equipment constructs the upstream-downstream relationship between the monitoring devices in a preset area according to the monitoring records of historical vehicles in that area; then, a target vehicle in a running state is determined from the vehicles to be identified based on the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and the corresponding first monitoring device; thus, vehicles in a running state can be counted in any region.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating steps of a vehicle state identification method according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a road segment provided by an embodiment of the present application;
FIG. 5 is a schematic illustration of one of the regions provided by the embodiments of the present application;
FIG. 6 is a second schematic diagram of a region provided by an embodiment of the present application;
FIG. 7 is a third schematic diagram of a region provided by an embodiment of the present application;
fig. 8 is a schematic diagram illustrating a principle of constructing an upstream-downstream relationship according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a vehicle state identification device according to an embodiment of the present application.
Icon: 120 - memory; 130 - processor; 140 - communication device; 301 - entry set; 302 - internal set; 303 - exit set; 401 - data acquisition module; 402 - relationship determination module; 403 - vehicle determination module; 404 - state determination module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present application, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings or orientations or positional relationships that the present invention is conventionally placed in use, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present application. Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
The inventors have found through research that counting the vehicles in a driving state in an urban area is currently difficult, owing to the limited image acquisition range of the monitoring devices.
For example, in the application scenario shown in fig. 1, a monitoring device A is installed upstream of a road segment, a monitoring device B is installed downstream of it, and a parking lot is built beside the road segment. When a vehicle drives into the parking lot while travelling along the road segment, the limited shooting ranges of monitoring device A and monitoring device B (the areas within radius r in fig. 1) make it impossible to detect whether the vehicle has driven into the parking lot, so the traffic supervision department cannot truly know the driving condition of vehicles in the urban area.
Of course, the actual situation is not limited to driving into a parking lot; it may also involve an accident or roadside parking. In all these cases, the corresponding vehicle is no longer in a running state.
It should be understood that, in the related art, a preset area may be photographed by a high-altitude camera, satellite imagery, and the like, to obtain high-altitude images of the preset area; these images are then recognized to count the number of vehicles in a running state in the preset area.
However, it has been found through research that the above-mentioned related art has at least the following problems:
problem 1, the sensing visual field of the high-altitude monitoring equipment is difficult to coincide with a preset area.
Problem 2, to great preset region, road network and vehicle are comparatively intensive, have great demand to supervisory equipment's perception field of vision and discernment precision, lead to the discernment degree of difficulty to increase then.
Problem 3, if the image taken by the high-altitude camera or the satellite is a two-dimensional plane, the statistical error will increase correspondingly if the road section in the preset area is blocked, overlapped and covered by the obstacle (for example, tree).
Problem 4, for some vehicles stopped at the roadside, it is difficult to distinguish them from vehicles that are in the form of roads.
In view of this, in order to at least partially solve the above problem, embodiments of the present application provide a vehicle state identification method applied to an electronic device. The electronic device may be a server or a data processing device communicatively connected to the image capture device.
In the vehicle state identification method, the electronic equipment constructs the upstream-downstream relationship between monitoring equipment in a preset area according to the monitoring records of historical vehicles in the preset area; then, a target vehicle in a running state is determined from the vehicles to be recognized based on the upstream-downstream relationship, the reference time at which each vehicle to be recognized was last photographed, and the corresponding first monitoring device.
For convenience of describing the vehicle state identification method, the embodiment of the present application first describes a hardware structure of the electronic device with reference to fig. 2. As shown in fig. 2, the electronic device includes a memory 120, a processor 130, and a communication device 140.
The memory 120, processor 130, and communication device 140 are electrically connected to each other directly or indirectly to enable data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The Memory 120 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 120 is used for storing a computer program, and the processor 130 executes the computer program after receiving an execution instruction, so as to implement the vehicle state identification method.
The processor 130 may be an integrated circuit chip having signal processing capabilities. The Processor may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Based on the hardware structure of the electronic device, the following describes in detail the vehicle state identification method provided in the embodiment of the present application with reference to the step schematic diagram of the vehicle state identification method shown in fig. 3.
As shown in fig. 3, the vehicle state recognition method includes:
step S101, acquiring a monitoring record of at least one historical vehicle in a preset area.
The monitoring records comprise the historical time of shooting of each historical vehicle and corresponding monitoring equipment.
In consideration of convenience and flexibility of use, the preset area can be divided by the user based on requirements. For example, the electronic device may provide a map display interface and determine the corresponding preset area in response to the user's area selection operation in that interface.
It should be understood that, since the preset area is obtained by the user by dividing as needed, when the preset area is just divided, the upstream-downstream relationship between different monitoring devices in the preset area is not known, and therefore, the upstream-downstream relationship between different monitoring devices in the preset area needs to be constructed first; in this way, the subsequent detection of the target vehicle in the running state is facilitated.
And S102, constructing an upstream-downstream relation between monitoring devices according to the sequence of the historical time corresponding to each historical vehicle.
The electronic device acquires, for at least one historical vehicle travelling within the preset area, the passing time at which each monitoring device in the area photographed that vehicle.
Then, for each historical vehicle, the electronic device determines the upstream-downstream relation between the monitoring devices that photographed it, according to the order of the vehicle's passing times; the upstream-downstream relation between the monitoring devices in the preset area is then constructed from the relations corresponding to all historical vehicles.
The electronic device records the relevant information captured by each monitoring device for the historical vehicles in the preset area in the form of Table 1:
TABLE 1
Field       Data type  Remarks
deviceid    varchar    Monitoring device id number
plateno     varchar    License plate number
platecolor  integer    License plate color
laneno      integer    Lane number
passtime    integer    Passing time
Based on the information recorded in Table 1, the manner in which the upstream-downstream relationship is constructed is illustrated below with reference to fig. 4. For convenience of description, the different monitoring devices are denoted A, B, C, and so on. Within the preset area shown in fig. 8, the monitoring devices A, B, C, D, E, F, G, H, J and K are installed.
The electronic equipment takes the license plate number as a reference, and pulls the passing time of the same vehicle and corresponding monitoring equipment from the database corresponding to the table 1; according to the sequence of the passing time, the driving track of the historical vehicle can be determined as follows:
D->B->E->C->A->F->G;
Therefore, B is located downstream of D on the road, E is located downstream of B, and the upstream-downstream relations among the other monitoring devices can be determined in the same way.
Because the running track of a single vehicle can only represent the connectivity between some of the roads in the preset area, the electronic device constructs the upstream-downstream relation between the monitoring devices in the preset area from the relations corresponding to all historical vehicles. The more historical vehicles there are, the more accurate the constructed upstream-downstream relationship is.
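The construction described above can be sketched in Python. This is a minimal illustration, not code from the patent; the record layout follows Table 1, and all function and variable names are assumptions.

```python
from collections import defaultdict

def build_updown_relation(records):
    """Build the upstream-downstream relation between monitoring devices.

    records: iterable of (plateno, deviceid, passtime) tuples, mirroring
    the fields of Table 1. Returns a dict mapping each device id to the
    set of devices observed immediately downstream of it.
    """
    # Group pass records by license plate: one trajectory per vehicle.
    by_plate = defaultdict(list)
    for plateno, deviceid, passtime in records:
        by_plate[plateno].append((passtime, deviceid))

    downstream = defaultdict(set)
    for passes in by_plate.values():
        passes.sort()  # chronological order of this vehicle's sightings
        # Each consecutive pair (a, b) means b lies downstream of a.
        for (_, dev_a), (_, dev_b) in zip(passes, passes[1:]):
            if dev_a != dev_b:
                downstream[dev_a].add(dev_b)
    return downstream
```

For a trajectory such as D->B->E this yields B downstream of D and E downstream of B; merging the trajectories of many historical vehicles fills in the full relation.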
Step S103, determining at least one vehicle to be identified in a preset area.
Specifically, the electronic device obtains the running track of at least one vehicle to be screened in the preset area, where a vehicle to be screened is a vehicle that appeared in the preset area within a second duration before the reference time.
Then, the electronic device determines from the running tracks which vehicles to be screened have driven away from the preset area, and removes those departed vehicles from the at least one vehicle to be screened to determine the at least one vehicle to be identified.
Illustratively, the reference time is adaptively changed according to the change of the statistical scene. For example, in real-time traffic control, the reference time may be the current time. When backtracking the traffic information, the reference time may be any time designated by the user.
Taking a second duration of 15 minutes as an example, the electronic device constructs the running track of each vehicle to be screened that appeared within those 15 minutes, according to the places where the vehicle appeared at different times in the preset area.
Then, the electronic device judges from the running track whether each vehicle to be screened has driven away from the preset area; if the running track indicates that it has, the vehicle to be screened is treated as a departed vehicle.
Therefore, after the departed vehicles are removed from the vehicles to be screened, the remaining vehicles are the vehicles to be identified.
For example, the electronic device performs image recognition on the images captured by all monitoring devices erected in the preset area to obtain the vehicles that appeared in the area within the second duration before the reference time.
According to the urban traffic operation evaluation specification GB/T 33171-2016, the reasonable travel time of a road section should not exceed 15 minutes. That is, a normally travelling vehicle needs no more than 15 minutes to pass through a section of an urban road; therefore, the second duration may be set to 15 minutes (or longer).
Since the reasonable travel time of a road section should not exceed 15 minutes, assume the reference time is 12:00; the time 15 minutes earlier is 11:45. A normally travelling vehicle that was at the entrance of an upstream road section at 11:45 must have travelled to the exit of the downstream road section within those 15 minutes (i.e., by 12:00) and been captured by the downstream monitoring device.
In this way, the electronic device performs image recognition on the vehicles photographed within those 15 minutes, and takes all vehicles that appeared in the preset area in that period as the vehicles to be screened.
Of course, those skilled in the art can appropriately lengthen or shorten this 15-minute baseline, and doing so requires no inventive contribution beyond the embodiments of the present application.
Then, the electronic device determines from the running tracks which vehicles to be screened have driven away from the preset area, and removes those departed vehicles from the at least one vehicle to be screened, thereby screening out the at least one vehicle to be identified.
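The screening step can be sketched as follows. This is a minimal illustration under the assumption that a running track ending at a boundary (exit) device indicates the vehicle has left the area; all names are illustrative, not from the patent.

```python
def vehicles_to_identify(trajectories, exit_devices):
    """Drop vehicles whose running track shows they drove away.

    trajectories: dict plateno -> list of device ids in chronological
    order, for vehicles seen within the second duration before the
    reference time.
    exit_devices: set of device ids at the exits of the preset area.
    """
    remaining = {}
    for plateno, track in trajectories.items():
        # A track ending at an exit device means the vehicle departed.
        if track and track[-1] in exit_devices:
            continue
        remaining[plateno] = track
    return remaining
```

The vehicles left in the returned dict are the vehicles to be identified, whose state is checked in the next step.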
And step S104, determining a target vehicle in a running state from at least one vehicle to be identified according to the upstream and downstream relationship, the reference time of the last shot of each vehicle to be identified and the corresponding first monitoring equipment.
The first monitoring device is one of the monitoring devices. Specifically, for each vehicle to be identified, the electronic device determines, from the monitoring devices and according to the upstream-downstream relationship, the second monitoring devices corresponding to the first monitoring device, where a second monitoring device is located downstream of the first monitoring device on the road.
Then, when, by the reference time, a first duration has elapsed since the vehicle to be identified was last photographed and it has still not been photographed by the second monitoring device, the electronic device determines that the vehicle to be identified is in a static state; the target vehicles in a running state are then determined from the at least one vehicle to be identified according to all vehicles to be identified that are in a static state.
For example, the electronic device may construct the driving track of the vehicle to be identified by the license plate number through the monitoring data of the road monitoring network; and then, according to the end position of the running track, determining the reference time of the last shot vehicle to be identified and the corresponding first monitoring equipment. The license plate number of the vehicle can be obtained by identifying the image shot by the monitoring equipment.
It will be appreciated that in order to determine whether a vehicle to be identified is being filmed by a second monitoring device downstream, it is necessary to first determine how many downstream road sections there are at the erection location of the first monitoring device.
In the embodiment of the application, the electronic device determines the second monitoring devices corresponding to the first monitoring device according to the constructed upstream-downstream relationship. The number of second monitoring devices is determined by the road planning of the actual preset area; it corresponds to the number of downstream road sections.
And then, the electronic equipment acquires the monitoring data acquired by the second monitoring equipment to judge whether the vehicle to be identified is shot.
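The static-state check can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; it assumes timestamps in seconds and a per-segment first duration for each downstream (second) device.

```python
def is_stationary(last_seen_time, reference_time, first_durations, sightings):
    """Decide whether a vehicle to be identified is in a static state.

    last_seen_time: when the upstream (first) device last shot the vehicle.
    reference_time: the current or user-designated time.
    first_durations: dict second-device id -> first duration (seconds) of
    the road segment leading to that downstream device.
    sightings: set of downstream device ids that have shot the vehicle since.
    """
    elapsed = reference_time - last_seen_time
    for device, duration in first_durations.items():
        if device in sightings:
            return False  # captured downstream: still running
        if elapsed < duration:
            return False  # this segment's first duration not yet exceeded
    return True  # no downstream capture after every first duration elapsed
```

A vehicle is only declared static once every downstream segment's first duration has expired without a capture, matching the 11:45/12:00 example below.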
In addition, in the embodiment of the present application, in consideration of complexity of traffic conditions, different manners may be selected for determining the first duration, which is exemplarily described below with reference to fig. 5.
As shown in fig. 5, there are two downstream road sections at the erection position of the first monitoring device: road section A and road section B. The length of road section A is greater than that of road section B, but road section B is frequently congested; as a result, the reasonable passing time of road section B is 10 minutes, while that of road section A is 8 minutes.
In one example, each downstream road segment takes its own reasonable passing time as its first duration; here, the first duration of road section A is 8 minutes and that of road section B is 10 minutes.
In another example, the longest passing time among the downstream road segments is taken as the first duration of all of them; here, the first duration of both road section A and road section B is 10 minutes.
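The two strategies for choosing the first duration can be sketched together; an illustrative helper with assumed names, with times expressed in minutes as in the example above.

```python
def first_durations(segment_times, use_longest=False):
    """Choose the first duration for each downstream road segment.

    segment_times: dict downstream-device id -> reasonable passing time.
    use_longest=False: each segment keeps its own passing time.
    use_longest=True:  every segment shares the longest passing time.
    """
    if not use_longest:
        return dict(segment_times)
    longest = max(segment_times.values())
    return {device: longest for device in segment_times}
```

With the road sections above, the first strategy keeps 8 and 10 minutes, while the second assigns 10 minutes to both.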
Taking 10 minutes as the common first duration of road section A and road section B as an example, assume the current time is 12:00 and the vehicle to be identified was last photographed by the first monitoring device at 11:45. By 12:00 the maximum reasonable passing time of both road section A and road section B has been exceeded, so if the second monitoring devices downstream of road section A and road section B have not photographed the vehicle to be identified, it is determined to be in a static state.
Referring to fig. 5 again, the reasonable passing times of road section A and road section B differ; therefore, after the monitoring devices satisfying the upstream-downstream relationship are determined, the reasonable passing time of the road section corresponding to each group of monitoring devices needs to be obtained.
In the embodiment of the application, the electronic equipment acquires the passing time of the historical vehicle between the first monitoring equipment and the second monitoring equipment; counting the passing time according to a preset counting mode, and determining the passing time meeting the preset counting condition; and taking the passing time length meeting the preset statistical condition as a first time length between the first monitoring equipment and the second monitoring equipment.
The inventors have found that, for the road section corresponding to each group of monitoring devices, the passing time t of historical vehicles through the section follows a Gaussian distribution, with the expression:

f(t) = (1 / (σ·√(2π))) · exp(−(t − μ)² / (2σ²))

where μ is the mean of the passing times of the historical vehicles and σ is their standard deviation.
For the road section corresponding to the n-th group of monitoring devices, a passing time greater than μ_n + λ_n·σ_n may be regarded as congestion, where μ_n denotes the mean passing time of that road section, σ_n its standard deviation, and λ_n a manual intervention parameter that the user adjusts adaptively according to the needs of the actual scene.
Based on the Pauta criterion (the 3σ rule), assuming that the passing times of historical vehicles through the road section corresponding to the n-th group of monitoring devices contain random errors, an interval can be determined with a certain probability from the standard deviation and the mean of the data; any error beyond that interval is treated as an outlier. The Gaussian distribution is summarized in Table 2 below:
TABLE 2
Value interval   Proportion of data
(μ−σ, μ+σ)       0.6827
(μ−2σ, μ+2σ)     0.9545
(μ−3σ, μ+3σ)     0.9973
Then, based on the formula max(10, μ_n − λ_n·σ_n) ≤ P_n ≤ min(900, μ_n + λ_n·σ_n), the reasonable passing time P_n (in seconds) of the road section corresponding to the n-th group of monitoring devices can be obtained, where min(900, μ_n + λ_n·σ_n) represents the maximum passing time of that road section. This maximum is used as the first duration of the road section, to check whether a vehicle to be identified that was last photographed by the upstream first monitoring device of the n-th group is in a static state.
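The bound computation above can be sketched directly; an illustrative Python sketch with assumed names, with time in seconds as suggested by the constants 10 and 900.

```python
def passing_time_bounds(mu, sigma, lam):
    """Bounds on the reasonable passing time P_n of one road section,
    following max(10, mu - lam*sigma) <= P_n <= min(900, mu + lam*sigma)."""
    lower = max(10.0, mu - lam * sigma)
    upper = min(900.0, mu + lam * sigma)
    return lower, upper

def is_congested(passing_time, mu, sigma, lam):
    """A passing time greater than mu + lam*sigma is treated as congestion."""
    return passing_time > mu + lam * sigma
```

For example, with mu = 300 s, sigma = 60 s and lam = 3, the bounds are 120 s and 480 s, and the upper bound of 480 s would serve as the first duration of the road section.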
Therefore, in the vehicle state detection step, the electronic device constructs the upstream-downstream relationship between the monitoring devices in the preset area according to the monitoring records of historical vehicles in that area; then, the target vehicles in a running state are determined from the vehicles to be identified based on the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and the corresponding first monitoring device. In this way, vehicles in a running state can be counted in any region.
Further, for each vehicle to be identified, the electronic device obtains the number of vehicle parking places distributed along a target road section, where the target road section is the road section located between the first monitoring device and the second monitoring device.
In the embodiment of the application, different types of parking places are distributed along the target road section, and interconnecting roads are not laid between some of these parking places and the target road section; therefore, map navigation information related to the parking places can be acquired from relevant map platforms.
Then, when the vehicle to be identified has not been photographed by the second monitoring device within the first duration after the reference time, and the number of parking places is greater than 0, the electronic device determines that the vehicle to be identified is in a stationary state.
That is, in the vehicle-state checking step, the parking places around the target road section are taken into consideration, which further improves the accuracy of vehicle-state detection.
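The stationary-state check described above (the downstream device is silent for the first duration, and at least one parking place lies along the target road section) can be sketched as follows. The function name, parameter names, and data model are illustrative assumptions, not the patent's implementation.

```python
def classify_vehicle_state(reference_time, downstream_capture_times,
                           first_duration, parking_place_count):
    """Sketch of the stationary-state check (illustrative, not the patent's
    implementation).

    A vehicle is judged stationary when the downstream (second) device has
    not captured it within `first_duration` seconds after the reference
    time AND at least one parking place lies along the target road segment;
    otherwise it is treated as being in a driving state.
    """
    deadline = reference_time + first_duration
    seen_downstream = any(reference_time < t <= deadline
                          for t in downstream_capture_times)
    if not seen_downstream and parking_place_count > 0:
        return "stationary"
    return "driving"
```

A vehicle that was re-captured downstream, or one on a segment with no parking places at all, stays classified as driving.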
In the embodiment of the application, the electronic device can also count the number of target vehicles and provide that number to the user. The user may be a supervisor of a traffic supervision department, so that the supervisor can formulate a corresponding traffic management strategy according to the number.
In addition, considering statistical-accuracy requirements, the monitoring devices in the preset area can be screened so that the screened monitoring devices enclose areas of different types. Illustratively, there may be at least three types of areas.
The first type of area is shown in fig. 6: the erection positions of the monitoring devices follow an entrance set 301, an exit set 303 and an internal set 302. The monitoring devices at the entrances and exits do not enclose the area, and incomplete enumeration of the entrances and exits is allowed, so this area is a theoretical area.
The second type of area is shown in fig. 7. For some specific areas (e.g. administrative areas), the entrances and exits cannot be clearly defined when dividing the area, so all monitoring devices in the area are counted as the "internal set". These devices do not enclose the area, and the area is likewise a theoretical area.
The third type of area is shown in fig. 8 and is more rigorous than the first and second types: the monitoring devices at the entrances and exits enclose the area. As shown in fig. 8, the entrance and exit monitoring devices form a closed "circle", which serves as the boundary of the area (dotted line in fig. 8). This area is an actual closed-loop area rather than a theoretical area.
In the third type of area, all entrances and exits, and all road sections formed by monitoring devices satisfying the upstream-downstream relationship, can be enumerated, so the statistical accuracy is higher. Therefore, in areas with higher accuracy requirements, more detailed statistics of entering and exiting vehicles should be carried out according to the third type of area. To leave no omission in road-section statistics, intersections lacking monitoring equipment should be supplemented with devices, so that as far as possible every vehicle passing through the area is photographed within the specified time interval, finally achieving the purpose of "sealing" the area.
In consideration of data errors and other factors, the constructed upstream-downstream relationships may include some unreasonable ones.
In view of this, the electronic device determines, from the upstream-downstream relationships among the monitoring devices in the preset area, those that do not satisfy the preset construction conditions, and eliminates them from the upstream-downstream relationships among the monitoring devices in the preset area.
For example, in an implementation manner, for each group of monitoring devices that satisfy the upstream-downstream relationship, a passing time of each historical vehicle through a road segment corresponding to the group of monitoring devices is obtained, and if the average passing time is greater than a preset reference time, the group of monitoring devices does not satisfy a preset construction condition.
In another implementation manner, for each group of monitoring devices satisfying the upstream and downstream relationship, the passing time of each historical vehicle passing through the corresponding road segment of the group of monitoring devices is obtained, and if the passing time of all the historical vehicles is greater than the preset reference time, the group of monitoring devices does not satisfy the preset construction condition.
The preset reference duration can be determined according to the national standard "Urban traffic operation condition evaluation" GB/T 33171-2016: the reasonable travel time of a road section is not more than 15 minutes, and this value serves as the preset reference duration.
Further, the preset construction conditions may also relate to the number of vehicles passing, within a specified time period, through the road section corresponding to the group of monitoring devices.
The electronic device can count the number of historical vehicles passing, in a day, through the road section corresponding to the group of monitoring devices; if this number is less than a reference number, the group of monitoring devices does not meet the preset construction conditions. The reference number can be adjusted adaptively according to the size of the city's population; in the embodiment of the present application, the reference number is 20.
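The construction-condition checks described above (mean passing time within the 15-minute reference from GB/T 33171-2016, and at least the reference number of vehicles per day) can be sketched as follows. The data model and all names are assumptions for illustration, not the patent's implementation.

```python
def prune_relations(relations, max_mean_seconds=900, min_daily_vehicles=20):
    """Drop upstream-downstream pairs failing the construction conditions.

    Sketch under an assumed data model: `relations` maps an
    (upstream_id, downstream_id) pair to the list of passing times (in
    seconds) recorded for one day. A pair is removed when its mean passing
    time exceeds the 15-minute reference (900 s) or when fewer than
    `min_daily_vehicles` historical vehicles passed that day.
    """
    kept = {}
    for pair, times in relations.items():
        if len(times) < min_daily_vehicles:
            continue  # too few vehicles: relation judged unreasonable
        if sum(times) / len(times) > max_mean_seconds:
            continue  # mean travel time above the preset reference duration
        kept[pair] = times
    return kept
```

Both thresholds are configurable, matching the text's note that conditions may be added, removed, or combined as required.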
In addition, it is worth noting that those skilled in the art may add new construction conditions or remove existing ones based on the above preset construction conditions according to actual needs. As required, the monitoring devices forming an upstream-downstream relationship may need to satisfy all of the preset construction conditions or only some of them.
Referring to fig. 9, this embodiment further provides a vehicle state identification device, which includes at least one functional module that can be stored in a memory in software form. Divided by function, the vehicle state identification device may include:
the data acquisition module 401 is configured to acquire a monitoring record of at least one historical vehicle in a preset area, where the monitoring record includes historical time of each historical vehicle captured and corresponding monitoring equipment;
in this application, when the computer executable instruction corresponding to the data obtaining module 401 is executed by the processor, step S101 in fig. 3 is implemented. For a detailed description of the data acquisition module 401, reference may be made to the detailed description of step S101.
And the relationship determining module 402 is configured to construct an upstream-downstream relationship between the monitoring devices according to the sequence of the historical time corresponding to each historical vehicle.
In this application, when the computer executable instructions corresponding to the relationship determining module 402 are executed by the processor, step S102 in fig. 3 is implemented. The detailed description about the relationship determination module 402 can be found in the detailed description of step S102.
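A minimal sketch of how such an upstream-downstream relationship might be built from monitoring records: order each historical vehicle's captures by time and count consecutive device pairs. The data model (plate number mapped to (time, device) tuples) and the counting approach are assumptions for illustration, not the patent's method.

```python
from collections import Counter

def build_relations(vehicle_records):
    """Sketch of upstream-downstream construction from monitoring records.

    Assumed data model (not the patent's): `vehicle_records` maps a plate
    number to a list of (capture_time, device_id) tuples. Each vehicle's
    captures are sorted by time and every consecutive device pair is
    counted; frequently observed pairs are candidate upstream-downstream
    relations.
    """
    pair_counts = Counter()
    for records in vehicle_records.values():
        ordered = sorted(records)  # chronological order of captures
        for (_, up), (_, down) in zip(ordered, ordered[1:]):
            if up != down:
                pair_counts[(up, down)] += 1
    return pair_counts
```

The resulting counts could then be filtered by the construction conditions described earlier before being accepted as relations.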
A vehicle determining module 403, configured to determine at least one vehicle to be identified in a preset area.
In this application, when executed by a processor, the computer-executable instructions corresponding to the vehicle determination module 403 implement step S103 in fig. 3. For a detailed description of the vehicle determination module 403, reference may be made to the detailed description of step S103.
The state determining module 404 is configured to determine a target vehicle in a driving state from at least one vehicle to be identified according to the upstream and downstream relationship, the reference time of the last shot of each vehicle to be identified, and a corresponding first monitoring device, where the first monitoring device belongs to the monitoring device.
In this application, when the computer executable instructions corresponding to the state determining module 404 are executed by the processor, step S103 in fig. 3 is implemented. For a detailed description of the status determination module 404, refer to the detailed description of step S103.
In one possible implementation, the state determination module 404 is further configured to:
according to the upstream and downstream relationship, determining second monitoring equipment corresponding to the first monitoring equipment from the monitoring equipment for each vehicle to be identified, wherein the second monitoring equipment is positioned on the road downstream of the first monitoring equipment;
when the vehicle to be identified has not been photographed by the second monitoring device within the first duration after the reference time, determining that the vehicle to be identified is in a stationary state;
and determining a target vehicle in a running state from at least one vehicle to be identified according to all vehicles to be identified in a static state.
The data acquisition module 401 is further configured to acquire the number of vehicle parking places distributed along a target road section, where the target road section is the road section located between the first monitoring device and the second monitoring device.
The state determining module 404 is further configured to determine that the vehicle to be identified is in a stationary state when the vehicle to be identified has not been photographed by the second monitoring device within the first duration after the reference time and the number of parking places is greater than 0.
In one possible implementation, the vehicle determination module 403 is further configured to:
acquiring a running track of at least one vehicle to be screened in a preset area, wherein the at least one vehicle to be screened is a vehicle appearing in the preset area within a second time length from a reference moment;
determining, according to the running tracks, the vehicles that have driven away from the preset area from the at least one vehicle to be screened;
and removing the driven-away vehicles from the at least one vehicle to be screened to determine the at least one vehicle to be identified.
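The screening steps above can be sketched as follows, under an assumed data model in which each vehicle's trajectory is an ordered list of device ids and the exit-set devices are known. This is illustrative only, not the patent's implementation.

```python
def select_vehicles_to_identify(trajectories, exit_device_ids):
    """Sketch of the screening steps (illustrative data model).

    `trajectories` maps a plate number to the ordered list of device ids
    that captured the vehicle within the second duration. A vehicle whose
    most recent capture came from an exit-set device is treated as having
    driven out of the preset area and is removed; the remaining vehicles
    are the vehicles to be identified.
    """
    remaining = []
    for plate, devices in trajectories.items():
        if devices and devices[-1] in exit_device_ids:
            continue  # drove away through an exit device
        remaining.append(plate)
    return sorted(remaining)
```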
In one possible implementation, the relationship determining module 402 is further configured to:
determining an upstream-downstream relationship which does not meet a preset construction condition from upstream-downstream relationships among monitoring devices;
and eliminating the upstream and downstream relations which do not meet the preset construction conditions from the upstream and downstream relations between the monitoring devices.
In one possible implementation, the relationship building module is further configured to:
acquiring the passing time of a historical vehicle between first monitoring equipment and second monitoring equipment;
counting the passing time according to a preset statistical mode, and determining the passing time meeting the preset statistical conditions;
and taking the passing time length meeting the preset statistical condition as a first time length between the first monitoring equipment and the second monitoring equipment.
The embodiment of the application also provides a storage medium, wherein the storage medium stores a computer program, and the vehicle state identification method is realized when the computer program is executed by a processor.
In summary, in the vehicle state identification method, apparatus, device and medium provided in the embodiments of the present application, the electronic device constructs the upstream-downstream relationship between the monitoring devices in a preset area according to the monitoring records of historical vehicles in that area; it then determines the target vehicles in a driving state from the vehicles to be identified based on the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and the corresponding first monitoring device. Statistics of the vehicles in driving states in any area are thus achieved.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only for various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and all such changes or substitutions are included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A vehicle state identification method is applied to an electronic device, and comprises the following steps:
acquiring a monitoring record of at least one historical vehicle in a preset area, wherein the monitoring record comprises the history time of shooting each historical vehicle and corresponding monitoring equipment;
constructing an upstream-downstream relation between the monitoring devices according to the sequence of the historical time corresponding to each historical vehicle;
determining at least one vehicle to be identified in the preset area;
according to the upstream-downstream relationship, the reference time of the last shooting of each vehicle to be identified and corresponding first monitoring equipment, determining a target vehicle in a running state from the at least one vehicle to be identified, wherein the first monitoring equipment belongs to the monitoring equipment, and the method comprises the following steps:
for each vehicle to be identified, determining second monitoring equipment corresponding to the first monitoring equipment from the monitoring equipment according to the upstream and downstream relation, wherein the second monitoring equipment is positioned on the road downstream of the first monitoring equipment;
when the vehicle to be recognized is not shot by the second monitoring equipment after being away from the reference time for a first time, determining that the vehicle to be recognized is in a static state;
and determining a target vehicle in a running state from the at least one vehicle to be identified according to all vehicles to be identified in a static state.
2. The vehicle state recognition method according to claim 1, characterized in that the method further comprises:
acquiring the number of places of vehicle parking places distributed along a target road section, wherein the target road section is a road section between the first monitoring device and the second monitoring device;
when the vehicle to be identified is not shot by the second monitoring device after the vehicle to be identified is away from the reference time for the first time, determining that the vehicle to be identified is in a static state, including:
and when the vehicle to be identified is not shot by the second monitoring equipment after the distance from the reference time is a first time, and the number of places is greater than 0, determining that the vehicle to be identified is in a static state.
3. The vehicle state recognition method according to claim 2, characterized in that the method further comprises:
acquiring the passing time of the historical vehicle between the first monitoring device and the second monitoring device;
counting the passing time according to a preset counting mode, and determining the passing time meeting a preset counting condition;
and taking the passing time length meeting the preset statistical condition as a first time length between the first monitoring device and the second monitoring device.
4. The vehicle state identification method according to claim 1, wherein the determination of at least one vehicle to be identified within the preset area further comprises:
acquiring a running track of at least one vehicle to be screened in the preset area, wherein the at least one vehicle to be screened is a vehicle appearing in the preset area within a second time length from the reference time;
determining a vehicle which is driven away from the preset area from the at least one vehicle to be screened according to the driving track;
and removing the running-away vehicle from the at least one vehicle to be screened to determine the at least one vehicle to be identified.
5. The vehicle state recognition method according to claim 1, characterized in that the method further comprises:
determining an upstream-downstream relationship which does not meet preset construction conditions from the upstream-downstream relationship among the monitoring devices;
and eliminating the upstream and downstream relations which do not meet the preset construction conditions from the upstream and downstream relations between the monitoring devices.
6. The vehicle state recognition method according to claim 1, characterized in that the method further comprises:
and counting the number of the target vehicles, and providing the number for a user.
7. A vehicle state recognition apparatus applied to an electronic device, comprising:
the system comprises a data acquisition module, a monitoring module and a monitoring module, wherein the data acquisition module is used for acquiring a monitoring record of at least one historical vehicle in a preset area, and the monitoring record comprises the historical time of each historical vehicle and corresponding monitoring equipment;
the relation determining module is used for constructing an upstream-downstream relation between the monitoring devices according to the sequence of the historical time corresponding to each historical vehicle;
the vehicle determining module is used for determining at least one vehicle to be identified in the preset area;
a state determining module, configured to determine, according to the upstream-downstream relationship, the reference time at which each vehicle to be identified was last photographed, and a corresponding first monitoring device, a target vehicle in a driving state from the at least one vehicle to be identified, wherein the first monitoring device belongs to the monitoring devices, and includes:
for each vehicle to be identified, determining second monitoring equipment corresponding to the first monitoring equipment from the monitoring equipment according to the upstream and downstream relation, wherein the second monitoring equipment is positioned on the road downstream of the first monitoring equipment;
when the vehicle to be identified is not shot by the second monitoring equipment after the vehicle to be identified is away from the reference time for a first time, determining that the vehicle to be identified is in a static state;
and determining a target vehicle in a running state from the at least one vehicle to be identified according to all vehicles to be identified in a static state.
8. An electronic device, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program, when executed by the processor, implements the vehicle state identification method according to any one of claims 1 to 6.
9. A storage medium characterized in that the storage medium stores a computer program that, when executed by a processor, implements the vehicle state identification method according to any one of claims 1 to 6.
CN202110285834.1A 2021-03-17 2021-03-17 Vehicle state identification method, device, equipment and medium Active CN112818954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110285834.1A CN112818954B (en) 2021-03-17 2021-03-17 Vehicle state identification method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110285834.1A CN112818954B (en) 2021-03-17 2021-03-17 Vehicle state identification method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN112818954A CN112818954A (en) 2021-05-18
CN112818954B true CN112818954B (en) 2022-08-26

Family

ID=75863703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110285834.1A Active CN112818954B (en) 2021-03-17 2021-03-17 Vehicle state identification method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112818954B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091899B (en) * 2023-04-12 2023-06-23 中国铁塔股份有限公司 Vehicle tracking method, system, device, electronic equipment and readable storage medium
CN116958914A (en) * 2023-09-21 2023-10-27 广州一链通互联网科技有限公司 Monitoring method, system and storage medium for freight vehicle

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105590087A (en) * 2015-05-19 2016-05-18 中国人民解放军国防科学技术大学 Road recognition method and device
CN107170239A (en) * 2017-06-30 2017-09-15 广东工业大学 A kind of target vehicle follows the trail of grasp shoot method and device
WO2017157119A1 (en) * 2016-03-18 2017-09-21 中兴通讯股份有限公司 Method and device for identifying abnormal behavior of vehicle
CN108074404A (en) * 2016-11-14 2018-05-25 伍煜东 Intelligent traffic management systems and method
CN108597232A (en) * 2018-05-03 2018-09-28 张梦雅 Road traffic safety monitoring system and its monitoring method
CN109446926A (en) * 2018-10-09 2019-03-08 深兰科技(上海)有限公司 A kind of traffic monitoring method and device, electronic equipment and storage medium
CN109714576A (en) * 2019-01-14 2019-05-03 上海钧正网络科技有限公司 Vehicle identification method, device, system and server based on video monitoring
CN110689734A (en) * 2019-09-24 2020-01-14 成都通甲优博科技有限责任公司 Vehicle running condition identification method and device and electronic equipment
CN111372051A (en) * 2020-03-17 2020-07-03 三一重工股份有限公司 Multi-camera linkage blind area detection method and device and electronic equipment
CN112183367A (en) * 2020-09-29 2021-01-05 重庆紫光华山智安科技有限公司 Vehicle data error detection method, device, server and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6921694B2 (en) * 2017-09-21 2021-08-18 株式会社東芝 Monitoring system
CN107977643A (en) * 2017-12-18 2018-05-01 浙江工业大学 A kind of officer's car monitoring method based on road camera
CN110096975B (en) * 2019-04-17 2021-04-09 北京筑梦园科技有限公司 Parking space state identification method, equipment and system
CN112507757A (en) * 2019-08-26 2021-03-16 西门子(中国)有限公司 Vehicle behavior detection method, device and computer readable medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105590087A (en) * 2015-05-19 2016-05-18 中国人民解放军国防科学技术大学 Road recognition method and device
WO2017157119A1 (en) * 2016-03-18 2017-09-21 中兴通讯股份有限公司 Method and device for identifying abnormal behavior of vehicle
CN108074404A (en) * 2016-11-14 2018-05-25 伍煜东 Intelligent traffic management systems and method
CN107170239A (en) * 2017-06-30 2017-09-15 广东工业大学 A kind of target vehicle follows the trail of grasp shoot method and device
CN108597232A (en) * 2018-05-03 2018-09-28 张梦雅 Road traffic safety monitoring system and its monitoring method
CN109446926A (en) * 2018-10-09 2019-03-08 深兰科技(上海)有限公司 A kind of traffic monitoring method and device, electronic equipment and storage medium
CN109714576A (en) * 2019-01-14 2019-05-03 上海钧正网络科技有限公司 Vehicle identification method, device, system and server based on video monitoring
CN110689734A (en) * 2019-09-24 2020-01-14 成都通甲优博科技有限责任公司 Vehicle running condition identification method and device and electronic equipment
CN111372051A (en) * 2020-03-17 2020-07-03 三一重工股份有限公司 Multi-camera linkage blind area detection method and device and electronic equipment
CN112183367A (en) * 2020-09-29 2021-01-05 重庆紫光华山智安科技有限公司 Vehicle data error detection method, device, server and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lane Detection and Vehicle Tracking on Expressways; Liu Jinqing, et al.; Computer Systems & Applications; 2020-02-15; Vol. 29, No. 02, pp. 187-197 *

Also Published As

Publication number Publication date
CN112818954A (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN112818954B (en) Vehicle state identification method, device, equipment and medium
CN108346288B (en) Road section operation state early warning method and device and electronic equipment
CN110751828B (en) Road congestion measuring method and device, computer equipment and storage medium
CN108932852B (en) Method and device for recording behaviors of illegal occupation of emergency lane by motor vehicle
CN113936465B (en) Traffic event detection method and device
CN106128126B (en) The method and system of rush hour chance red light number are reduced using plane cognition technology
CN108629982B (en) Road section vehicle number estimation method based on travel time distribution rule
CN109493606A (en) The recognition methods and system of parking are disobeyed on a kind of highway
CN115935056A (en) Method, device and equipment for identifying false track of vehicle and storage medium
US20220237919A1 (en) Method, Apparatus, and Computing Device for Lane Recognition
CN111613056A (en) Traffic abnormal event detection method and device
US20230349717A1 (en) Electronic map correction method, navigation information setting method, navigation method, and apparatus
CN112906428B (en) Image detection region acquisition method and space use condition judgment method
CN114694370A (en) Method, device, computing equipment and storage medium for displaying intersection traffic flow
Jammula et al. Evaluation of Operational Safety Assessment (OSA) Metrics for Automated Vehicles Using Real-World Data
CN114783181B (en) Traffic flow statistics method and device based on road side perception
CN113989715A (en) Vehicle parking violation detection method and device, electronic equipment and storage medium
JP4030354B2 (en) Sudden event detection device
CN115188187A (en) Roadside perception data quality monitoring system and method based on vehicle-road cooperation
CN109685010B (en) Highway slope protection network vulnerability position positioning method and system
CN114863372A (en) Parking management method, parking management device and computer readable storage medium
CN114360248A (en) Traffic dynamic adjustment method, system, equipment and medium based on big data
CN111047878B (en) Traffic violation determination method and device and traffic access
Patel et al. A framework for proactive safety evaluation of intersection using surrogate safety measures and non-compliance behavior
JP7203277B2 (en) Method and apparatus for monitoring vehicle license plate recognition rate and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant