CN115272932A - Machine vision-based urban non-motor vehicle disorderly parking identification method - Google Patents

Machine vision-based urban non-motor vehicle disorderly parking identification method

Info

Publication number
CN115272932A
CN115272932A (application CN202210902925.XA; granted as CN115272932B)
Authority
CN
China
Prior art keywords
motor vehicle
urban
coordinate system
vehicle target
parking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210902925.XA
Other languages
Chinese (zh)
Other versions
CN115272932B (en)
Inventor
林政涛
刘涛
王芸
孙光宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Tongjian Technology Co ltd
Original Assignee
Zhejiang Tongjian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Tongjian Technology Co ltd filed Critical Zhejiang Tongjian Technology Co ltd
Priority to CN202210902925.XA
Publication of CN115272932A
Application granted
Publication of CN115272932B
Active (legal status)
Anticipated expiration (legal status)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computational Linguistics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a machine vision-based method for identifying the disordered parking of urban non-motor vehicles, aiming at the problems that disorderly parked non-motor vehicles, particularly shared bicycles, affect urban traffic and the urban appearance, and that urban video monitoring is not effectively utilized. The method comprises the following steps: S1, initially pulling a stream from the oblique video of a city camera to obtain a modeling picture, and generating a regional three-dimensional model; S2, obtaining a comparison picture and keeping only the non-motor vehicle targets; S3, positioning the non-motor vehicle targets in the regional three-dimensional model; S4, calculating and storing the parking position and parking time of each non-motor vehicle; S5, judging whether the staying time of a non-motor vehicle exceeds a preset first threshold for long-time staying; S6, calculating the disorder degree of the parked non-motor vehicles; and S7, judging the disorder degree of the non-motor vehicles. The invention is particularly suitable for identifying and warning against the disordered parking of urban non-motor vehicles, and has high social value and application prospects.

Description

Machine vision-based urban non-motor vehicle disorderly parking identification method
Technical Field
The invention relates to the technical field of non-motor vehicle management, in particular to a method for identifying disordered parking of urban non-motor vehicles based on machine vision.
Background
At present, urban video monitoring in China is mature: with the development of the security industry and the continuous advancement of safe-city programs, the urban video monitoring industry has grown considerably, but in most cases it is still used only for video recording and manual inspection in public security control and urban road monitoring.
With the popularization of non-motor vehicles such as shared bicycles and electric bicycles, their disordered parking, particularly of shared bicycles, affects urban traffic and the urban appearance. Therefore, a machine vision-based method for identifying the disordered parking of urban non-motor vehicles is provided.
Disclosure of Invention
It is an object of the present invention to solve or at least alleviate problems in the prior art.
In order to achieve the above purpose, the invention is realized by the following technical scheme: a machine vision-based disordered parking identification method for urban non-motor vehicles, comprising the following steps:
S1, initially pulling a stream from the oblique video of a city camera to obtain a modeling picture, and generating a regional three-dimensional model of the camera's shooting region after removing interference with a machine vision algorithm;
S2, pulling the stream from the oblique video of the city camera again to obtain a comparison picture, performing newly-added target detection on the comparison picture with a machine vision algorithm, and keeping only the non-motor vehicle targets after filtering:
if no non-motor vehicle target exists in the comparison picture, a comparison picture is obtained again to detect non-motor vehicle targets;
if a non-motor vehicle target exists in the comparison picture, entering the next step;
s3, positioning the non-motor vehicle target subjected to impurity removal in the regional three-dimensional model according to a calculation result of a machine vision algorithm;
s4, calculating and storing the stop position and the stop time of the non-motor vehicle in the comparison picture;
s5, judging whether the staying time of the non-motor vehicle is larger than a first threshold of a preset long staying time limit:
if no non-motor vehicle in the comparison picture has a staying time exceeding the first threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
if the staying time of a non-motor vehicle in the comparison picture exceeds the first threshold, entering the next step;
s6, screening out non-motor vehicles parked and gathered for a long time, and calculating the disorder degree of parking of the non-motor vehicles according to the accumulated position, time and form information of the non-motor vehicles in the comparison picture;
s7, judging whether the disorder degree of the non-motor vehicle is larger than a second threshold value of the preset disorder degree:
if the disorder degree of the non-motor vehicles in the comparison picture is lower than the second threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
and if the disorder degree of the non-motor vehicles in the comparison picture is higher than the second threshold, the parking is judged to be disorderly.
Optionally, in step S3, the positioning of the position information of the non-motor vehicle target includes the following steps:
S301, establishing the image space auxiliary coordinate system coordinates of the non-motor vehicle target in the camera's shooting area through simulated distance measurement on the comparison picture frame;
S302, determining the body coordinate system coordinates of the non-motor vehicle target by combining the installation azimuth angle and installation pitch angle of the urban camera with the camera's coordinates;
S303, determining the station-center horizontal coordinate system coordinates of the non-motor vehicle target according to the attitude angle of the urban camera;
S304, determining the geocentric rectangular coordinate system coordinates of the urban camera according to the camera's longitude, latitude and height;
S305, comprehensively calculating the image space auxiliary coordinate system coordinates, the body coordinate system coordinates and the station-center horizontal coordinate system coordinates of the non-motor vehicle target together with the geocentric rectangular coordinate system coordinates of the urban camera to obtain the geocentric rectangular coordinate system coordinates of the non-motor vehicle target, and thereby calculating the earth coordinates of the non-motor vehicle target.
Optionally, the image space auxiliary coordinate system coordinates of the non-motor vehicle target are expressed by formula (1):
[Formula (1) appears only as an image in the original document.]
In formula (1), Dis is the distance simulated and measured on the comparison picture frame, and (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target.
Optionally, the conversion relation between the body auxiliary coordinate system coordinates of the non-motor vehicle target and the image space auxiliary coordinate system is expressed by formula (2):
[Formula (2) appears only as an image in the original document.]
In formula (2), the body coordinate system of the non-motor vehicle target is defined as a left-handed system: the origin is located at the focus of the urban camera, the X axis points in the horizontal direction of the focus, the Y axis points in the vertical direction of the focus, and the Z axis points upward; (X0, Y0, Z0) are the coordinates of the urban camera in the body coordinate system; (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body auxiliary coordinate system coordinates of the non-motor vehicle target; A is a rotation matrix, and K is a rotation matrix correcting the installation angle of the urban camera.
Optionally, the geocentric rectangular coordinate system coordinates of the urban camera are calculated from the camera's latitude, longitude and altitude, obtained by conversion from the geodetic coordinate system.
Optionally, the conversion relation between the body coordinate system coordinates of the non-motor vehicle target and the station-center horizontal coordinate system coordinates is expressed by formula (3):
[Formula (3) appears only as an image in the original document.]
In formula (3), the geocentric rectangular coordinate system of the non-motor vehicle target is defined as a right-handed system: the center of the reference ellipsoid is the origin, the intersection line of the starting meridian plane and the equatorial plane is the X axis, the direction orthogonal to the X axis in the equatorial plane is the Y axis, and the rotation axis of the ellipsoid is the Z axis; (X3, Y3, Z3) are the station-center horizontal coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body coordinate system coordinates of the non-motor vehicle target; θ is the installation azimuth angle of the urban camera, φ is the installation pitch angle of the urban camera, and ψ is the attitude angle of the urban camera.
Optionally, before step S4, it is judged whether there is a previously cached result storing non-motor vehicle parking position information and parking time information:
if there is a previously cached result, the current non-motor vehicle parking position information and parking time information are compared with the cached result and de-duplicated;
if there is no previously cached result, the parking position information and parking time information of the non-motor vehicle are saved.
Optionally, in step S5, the first threshold of the long-stay time limit is preset in order to exclude non-motor vehicles that are moving or staying only briefly in the comparison picture.
Optionally, in step S7, the disorder degree of the non-motor vehicles is calculated as follows:
S701, modeling the non-motor vehicle targets in the regional three-dimensional model with an average length and width, based on the target positions obtained from the comparison picture;
S702, calculating the aggregation of the non-motor vehicles from the three-dimensional coordinates of the modeled targets in the regional three-dimensional model;
and S703, when three or more non-motor vehicles are gathered, calculating the curvature of the line connecting the center points of the gathered vehicles, and obtaining the disorder degree of the non-motor vehicles from the curvature.
Optionally, in step S702, the aggregation of the non-motor vehicles is calculated as follows: the Euclidean distance between each non-motor vehicle target and its adjacent non-motor vehicle targets in the regional three-dimensional model is compared, and when this distance is lower than a preset third threshold, the non-motor vehicles are considered to be aggregated.
The embodiment of the invention provides a machine vision-based method for identifying disordered parking of urban non-motor vehicles, which has the following beneficial effects:
1. The invention reuses existing urban camera monitoring: it pulls the stream to obtain comparison pictures, establishes a regional three-dimensional model, and uses a machine vision algorithm to identify the non-motor vehicle targets in the pictures. By using machine vision to identify the long-time disordered parking of urban non-motor vehicles, the disordered parking phenomenon in the city can be discovered automatically and warned against, playing a positive auxiliary role in managing urban traffic and the urban appearance.
2. The invention refines the machine vision positioning algorithm for non-motor vehicles on the basis of existing urban camera technology, achieving more accurate coordinate positioning, effectively solving the problem of inaccurate target position information, providing more accurate coordinates of disorderly clusters for non-motor vehicle maintenance personnel, and greatly reducing the maintenance cost of urban traffic and the urban appearance.
Drawings
The above features, technical characteristics, advantages and implementation manners of the urban non-motor vehicle disorderly parking identification method based on machine vision will be further described in the following detailed description of preferred embodiments in a clearly understandable manner and in combination with the accompanying drawings.
FIG. 1 is a block diagram of the flow structure of the present invention;
FIG. 2 is an installation schematic diagram of the urban video monitoring in the present invention.
Detailed Description
The invention will be further illustrated with reference to the following figures 1-2 and examples:
example 1
A method for recognizing disordered parking of urban non-motor vehicles based on machine vision refers to the accompanying drawings 1-2, and comprises the following steps:
S1, initially pulling a stream from the oblique video of a city camera to obtain a modeling picture, and generating a regional three-dimensional model of the camera's shooting region after removing interference with a machine vision algorithm;
S2, pulling the stream from the oblique video of the city camera again to obtain a comparison picture, performing newly-added target detection on the comparison picture with a machine vision algorithm, and keeping only the non-motor vehicle targets after filtering:
if no non-motor vehicle target exists in the comparison picture, a comparison picture is obtained again to detect non-motor vehicle targets;
if a non-motor vehicle target exists in the comparison picture, entering the next step;
s3, positioning the non-motor vehicle target subjected to impurity removal in the regional three-dimensional model according to a calculation result of a machine vision algorithm;
s4, calculating and storing the stop position and the stop time of the non-motor vehicle in the comparison picture;
s5, judging whether the staying time of the non-motor vehicle is larger than a first threshold of a preset long staying time limit:
if no non-motor vehicle in the comparison picture has a staying time exceeding the first threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
if the staying time of a non-motor vehicle in the comparison picture exceeds the first threshold, entering the next step;
s6, screening out non-motor vehicles parked and gathered for a long time, and calculating the disorder degree of parking of the non-motor vehicles according to the accumulated position, time and form information of the non-motor vehicles in the comparison picture;
s7, judging whether the disorder degree of the non-motor vehicle is larger than a second threshold value of the preset disorder degree:
if the disorder degree of the non-motor vehicles in the comparison picture is lower than the second threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
and if the disorder degree of the non-motor vehicles in the comparison picture is higher than the second threshold, the parking is judged to be disorderly.
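As a minimal sketch (not the patented implementation), the stay bookkeeping and threshold judgments of steps S4, S5 and S7 above can be expressed as follows; the threshold values, the record layout and the function names are illustrative assumptions, since the patent fixes none of them.

```python
# Hypothetical threshold values; the patent does not give concrete numbers.
FIRST_THRESHOLD_S = 600.0    # long-stay time limit, in seconds (step S5)
SECOND_THRESHOLD = 0.5       # disorder-degree limit (step S7)

def update_stays(stays, detections, now):
    """Steps S3/S4: record each newly seen vehicle's parking position and
    first-seen time; vehicles already recorded keep their earlier timestamp."""
    for vid, pos in detections.items():
        if vid not in stays:
            stays[vid] = {"pos": pos, "first_seen": now}
    return stays

def long_stay_vehicles(stays, now, limit=FIRST_THRESHOLD_S):
    """Step S5: exclude vehicles that are moving or staying only briefly."""
    return [vid for vid, rec in stays.items() if now - rec["first_seen"] > limit]

def is_disorderly(disorder_degree, threshold=SECOND_THRESHOLD):
    """Step S7: parking is judged disorderly when the degree exceeds the threshold."""
    return disorder_degree > threshold
```

A caller would run this per comparison picture, feeding in the detections from step S2 and a disorder degree from step S6.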
In this embodiment, in the step S3, the positioning of the position information of the non-motor vehicle target includes the following steps:
S301, establishing the image space auxiliary coordinate system coordinates of the non-motor vehicle target in the camera's shooting area through simulated distance measurement on the comparison picture frame;
S302, jointly determining the body coordinate system coordinates of the non-motor vehicle target by combining the installation azimuth angle and installation pitch angle of the urban camera with the camera's coordinates;
S303, determining the station-center horizontal coordinate system coordinates of the non-motor vehicle target according to the attitude angle of the urban camera;
S304, determining the geocentric rectangular coordinate system coordinates of the urban camera according to the camera's longitude, latitude and height;
S305, comprehensively calculating the image space auxiliary coordinate system coordinates, the body coordinate system coordinates and the station-center horizontal coordinate system coordinates of the non-motor vehicle target together with the geocentric rectangular coordinate system coordinates of the urban camera to obtain the geocentric rectangular coordinate system coordinates of the non-motor vehicle target, and thereby calculating the earth coordinates of the non-motor vehicle target.
In this embodiment, the image space auxiliary coordinate system coordinates of the non-motor vehicle target are expressed by formula (1):
[Formula (1) appears only as an image in the original document.]
In formula (1), Dis is the distance simulated and measured on the comparison picture frame, and (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target.
In this embodiment, the conversion relationship between the body auxiliary coordinate system coordinates of the non-motor vehicle target and the image space auxiliary coordinate system is expressed by the following formula (2):
[Formula (2) appears only as an image in the original document.]
In formula (2), the body coordinate system of the non-motor vehicle target is defined as a left-handed system: the origin is located at the focus of the urban camera, the X axis points in the horizontal direction of the focus, the Y axis points in the vertical direction of the focus, and the Z axis points upward; (X0, Y0, Z0) are the coordinates of the urban camera in the body coordinate system; (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body auxiliary coordinate system coordinates of the non-motor vehicle target; A is a rotation matrix, and K is a rotation matrix correcting the installation angle of the urban camera.
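Formula (2) itself is reproduced only as an image in the original, so its exact form is not recoverable here. The sketch below assumes the conventional composition X2 = A·K·X1 + X0 suggested by the surrounding definitions (rotate, then translate by the camera's body-system coordinates); it should be read as an illustration of that assumed reading, not as the patented equation.

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def image_space_to_body(x1, a, k, x0):
    """Assumed structure of formula (2): rotate the image-space auxiliary
    coordinates X1 by the rotation matrix A and the installation-correction
    matrix K, then translate by the camera's body-system coordinates X0."""
    rotated = mat_vec(a, mat_vec(k, x1))
    return tuple(r + t for r, t in zip(rotated, x0))
```

With A and K both set to the identity, the conversion reduces to a pure translation by X0, which is a convenient sanity check.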
In this embodiment, the geocentric rectangular coordinate system coordinates of the urban camera are calculated from the camera's latitude, longitude and altitude, obtained by conversion from the geodetic coordinate system.
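The geodetic-to-geocentric (ECEF) conversion referred to here is standard. A minimal implementation is shown below; the choice of the WGS-84 ellipsoid is an assumption, since the patent does not name a reference ellipsoid.

```python
import math

# WGS-84 ellipsoid constants (assumed; the patent does not specify the datum).
WGS84_A = 6378137.0                    # semi-major axis, metres
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic latitude/longitude (degrees) and height (metres)
    to geocentric rectangular (ECEF) coordinates in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)  # prime-vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + h) * math.sin(lat)
    return x, y, z
```

At latitude 0, longitude 0, height 0 the result lies on the equator at the semi-major axis; at the pole, Z equals the semi-minor axis.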
In this embodiment, the conversion relationship between the body coordinate system coordinates and the center horizontal coordinate system coordinates of the non-motor vehicle target is expressed by the following formula (3):
[Formula (3) appears only as an image in the original document.]
In formula (3), the geocentric rectangular coordinate system of the non-motor vehicle target is defined as a right-handed system: the center of the reference ellipsoid is the origin, the intersection line of the starting meridian plane and the equatorial plane is the X axis, the direction orthogonal to the X axis in the equatorial plane is the Y axis, and the rotation axis of the ellipsoid is the Z axis; (X3, Y3, Z3) are the station-center horizontal coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body coordinate system coordinates of the non-motor vehicle target; θ is the installation azimuth angle of the urban camera, φ is the installation pitch angle of the urban camera, and ψ is the attitude angle of the urban camera.
In this embodiment, before step S4, it is judged whether there is a previously cached result storing non-motor vehicle parking position information and parking time information:
if there is a previously cached result, the current non-motor vehicle parking position information and parking time information are compared with the cached result and de-duplicated;
if there is no previously cached result, the parking position information and parking time information of the non-motor vehicle are saved.
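The cache comparison and de-duplication of step S4 can be sketched as follows. The position-tolerance rule and its 0.5 m value are assumptions introduced for illustration; the patent does not specify how duplicate records are recognized.

```python
def merge_into_cache(cache, position, first_seen, tol=0.5):
    """De-duplicate a new stay record against the cache: if the new position
    lies within `tol` metres of a cached stay, treat it as the same parked
    vehicle and keep the earlier first-seen time; otherwise append it."""
    for rec in cache:
        dx, dy, dz = (position[i] - rec["pos"][i] for i in range(3))
        if (dx * dx + dy * dy + dz * dz) ** 0.5 <= tol:
            rec["first_seen"] = min(rec["first_seen"], first_seen)
            return cache
    cache.append({"pos": tuple(position), "first_seen": first_seen})
    return cache
```

Keeping the earliest first-seen time is what lets step S5 accumulate the full staying duration across repeated detections of the same vehicle.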
In this embodiment, in step S5, the first threshold of the long-stay time limit is preset in order to exclude non-motor vehicles that are moving or staying only briefly in the comparison picture.
In this embodiment, in step S7, the disorder degree of the non-motor vehicles is calculated as follows:
S701, modeling the non-motor vehicle targets in the regional three-dimensional model with an average length and width, based on the target positions obtained from the comparison picture;
S702, calculating the aggregation of the non-motor vehicles from the three-dimensional coordinates of the modeled targets in the regional three-dimensional model;
and S703, when three or more non-motor vehicles are gathered, calculating the curvature of the line connecting the center points of the gathered vehicles, and obtaining the disorder degree of the non-motor vehicles from the curvature.
Example 2
This embodiment differs from embodiment 1 in that, in step S702, the aggregation of the non-motor vehicles is calculated as follows: the Euclidean distance between each non-motor vehicle target and its adjacent non-motor vehicle targets in the regional three-dimensional model is compared, and when this distance is lower than a preset third threshold, the non-motor vehicles are considered to be aggregated;
It can be understood that the Euclidean distance is a commonly used distance metric measuring the true straight-line distance between two points in an m-dimensional space; in two and three dimensions it is the ordinary distance between the points. The three-dimensional Euclidean distance d between each non-motor vehicle and an adjacent non-motor vehicle in the picture is expressed as:
d = √((x2 - x1)² + (y2 - y1)² + (z2 - z1)²)
where (x1, y1, z1) is the position of the first non-motor vehicle and (x2, y2, z2) is the position of the adjacent non-motor vehicle.
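The aggregation test of this embodiment translates directly into code; the 2-metre value used for the third threshold below is an illustrative assumption, since the patent does not fix it.

```python
import math

THIRD_THRESHOLD = 2.0  # assumed third threshold for aggregation, in metres

def euclidean_3d(p, q):
    """Three-dimensional Euclidean distance between two points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def aggregated_pairs(positions, threshold=THIRD_THRESHOLD):
    """Return index pairs of vehicles whose mutual distance in the regional
    three-dimensional model is below the third threshold (step S702)."""
    return [(i, j)
            for i in range(len(positions))
            for j in range(i + 1, len(positions))
            if euclidean_3d(positions[i], positions[j]) < threshold]
```

Pairs returned here feed the three-or-more grouping test of step S703.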
Other undescribed structures refer to example 1.
According to the machine vision-based method for identifying the disordered parking of urban non-motor vehicles, existing urban camera monitoring is reused: comparison pictures are obtained by pulling the stream, a regional three-dimensional model is established, and a machine vision algorithm identifies and positions the non-motor vehicles in the pictures. By using machine vision to identify the long-time disordered parking of urban non-motor vehicles, the disordered parking phenomenon in the city can be discovered automatically and warned against, playing a positive auxiliary role in managing urban traffic and the urban appearance.
Finally, it should be noted that, although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features equivalently replaced, within the technical scope of the present disclosure; such modifications or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be construed as falling within it. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A disordered parking identification method for urban non-motor vehicles based on machine vision is characterized by comprising the following steps:
S1, initially pulling a stream from the oblique video of an urban camera to obtain a modeling picture, and generating a regional three-dimensional model of the shooting region after removing interference with a machine vision algorithm;
S2, pulling the stream from the oblique video of the urban camera again to obtain a comparison picture, performing newly-added target detection on the comparison picture with a machine vision algorithm, and keeping only the non-motor vehicle targets after filtering:
if no non-motor vehicle target exists in the comparison picture, a comparison picture is obtained again to detect non-motor vehicle targets;
if a non-motor vehicle target exists in the comparison picture, entering the next step;
s3, positioning the non-motor vehicle target subjected to impurity removal in the regional three-dimensional model according to a calculation result of a machine vision algorithm;
s4, calculating and storing the stop position and the stop time of the non-motor vehicle in the comparison picture;
S5, judging whether the staying time of a non-motor vehicle exceeds a preset first threshold for long-time staying:
if no non-motor vehicle in the comparison picture has a staying time exceeding the first threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
if the staying time of a non-motor vehicle in the comparison picture exceeds the first threshold, entering the next step;
s6, screening out non-motor vehicles parked and gathered for a long time, and calculating the disorder degree of parking of the non-motor vehicles according to the accumulated position, time and form information of the non-motor vehicles in the comparison picture;
s7, judging whether the disorder degree of the non-motor vehicle is larger than a second threshold value of the preset disorder degree:
if the disorder degree of the non-motor vehicles in the comparison picture is lower than the second threshold, returning to step S2 and obtaining a comparison picture again to detect non-motor vehicle targets;
and if the disorder degree of the non-motor vehicles in the comparison picture is higher than the second threshold, the parking is judged to be disorderly.
2. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 1, wherein in step S3, positioning the position information of the non-motor vehicle target comprises the following steps:
S301, establishing the image space auxiliary coordinate system coordinates of the non-motor vehicle target in the camera shooting area through the comparison picture frame simulated distance measurement;
S302, determining the body coordinate system coordinates of the non-motor vehicle target by combining the installation azimuth angle and installation pitch angle of the urban camera with the urban camera's own coordinates;
S303, determining the station-center horizontal coordinate system coordinates of the non-motor vehicle target according to the attitude angle of the urban camera;
S304, determining the geocentric rectangular coordinate system coordinates of the urban camera according to its longitude, latitude and altitude;
S305, comprehensively calculating the image space auxiliary coordinate system coordinates, the body coordinate system coordinates, the station-center horizontal coordinate system coordinates and the urban camera's geocentric rectangular coordinate system coordinates to obtain the geocentric rectangular coordinate system coordinates of the non-motor vehicle target, and thereby solving the geodetic coordinates of the non-motor vehicle target.
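The pipeline of steps S301 through S305 amounts to chaining coordinate-frame transforms. A minimal sketch, with placeholder transform matrices (real values would come from camera calibration and the formulas in claims 3 through 6):

```python
# Sketch of the S301-S305 chain. The matrices are placeholders standing in
# for the patent's formulas (1)-(3); only the composition order is shown.
import numpy as np

def image_space_to_ecef(p_img, body_from_img, enu_from_body, cam_ecef):
    """Chain the frames named in steps S301-S305:
    image-space auxiliary -> body -> station-center horizontal (ENU) -> ECEF."""
    p_body = body_from_img @ np.asarray(p_img)   # S302: into the body frame
    p_enu = enu_from_body @ p_body               # S303: into station-center frame
    p_ecef = np.asarray(cam_ecef) + p_enu        # S304/S305: offset by camera ECEF
    return p_ecef
```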
3. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 2, wherein the image space auxiliary coordinate system coordinates of the non-motor vehicle target are expressed by the following formula (1):
[Formula (1): equation image not reproduced in this text]
in the formula (1), Dis is the comparison picture frame simulated distance measurement, and (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target.
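Since formula (1) survives only as an equation image, its exact form is unknown. A common construction with the same inputs and outputs scales the unit ray through a pixel by the measured range Dis; the pinhole intrinsics below (fx, fy, cx, cy) are assumptions, not taken from the patent.

```python
# Hedged reconstruction of a Dis-based image-space coordinate: scale the
# normalized pixel ray by the simulated range. Intrinsics are assumed.
import math

def image_space_aux(u, v, fx, fy, cx, cy, dis):
    """(X1, Y1, Z1): unit ray through pixel (u, v), scaled by range Dis."""
    x, y, z = (u - cx) / fx, (v - cy) / fy, 1.0
    n = math.sqrt(x * x + y * y + z * z)   # ray length before normalization
    return (dis * x / n, dis * y / n, dis * z / n)
```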
4. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 2, wherein the conversion relation between the body coordinate system coordinates of the non-motor vehicle target and the image space auxiliary coordinate system is expressed by the following formula (2):
[Formula (2): equation image not reproduced in this text]
in the formula (2), the body coordinate system of the non-motor vehicle target is defined as a left-handed system with its origin at the focus of the urban camera, the X axis pointing in the horizontal direction of the focus, the Y axis pointing in the vertical direction of the focus, and the Z axis pointing upward; (X0, Y0, Z0) are the coordinates of the urban camera in the body coordinate system; (X1, Y1, Z1) are the image space auxiliary coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body coordinate system coordinates of the non-motor vehicle target; A is a rotation matrix, and K is a rotation matrix correcting the arrangement angle of the urban camera.
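With the equation image lost, one plausible reading of formula (2), consistent with the quantities named above, is a pair of rotations plus the camera offset. The exact composition order is an assumption:

```python
# Assumed reading of formula (2): (X2,Y2,Z2) = A @ K @ (X1,Y1,Z1) + (X0,Y0,Z0),
# where A is the orientation rotation and K corrects the mounting angle.
import numpy as np

def body_coords(p_img, A, K, p_cam_body):
    """Map image-space auxiliary coordinates into the body frame."""
    return A @ K @ np.asarray(p_img) + np.asarray(p_cam_body)
```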
5. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 2, wherein the geocentric rectangular coordinate system coordinates of the urban camera are calculated from the latitude, longitude and altitude of the urban camera, obtained by conversion from the geodetic coordinate system.
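The geodetic-to-geocentric conversion claim 5 refers to is the standard one; the patent does not name an ellipsoid, so WGS-84 is assumed here:

```python
# Standard geodetic (lat, lon, h) -> ECEF conversion on the WGS-84 ellipsoid.
import math

WGS84_A = 6378137.0            # semi-major axis (m)
WGS84_E2 = 6.69437999014e-3    # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude.
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + h) * math.sin(lat)
    return x, y, z
```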
6. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 2, wherein the conversion relation between the body coordinate system coordinates of the non-motor vehicle target and the station-center horizontal coordinate system coordinates is expressed by the following formula (3):
[Formula (3): equation image not reproduced in this text]
in the formula (3), the geocentric rectangular coordinate system of the non-motor vehicle target is defined as a right-handed system with the center of the reference ellipsoid as origin, the intersection line of the initial meridian plane and the equatorial plane as the X axis, the direction orthogonal to the X axis in the equatorial plane as the Y axis, and the rotation axis of the ellipsoid as the Z axis; (X3, Y3, Z3) are the station-center horizontal coordinate system coordinates of the non-motor vehicle target; (X2, Y2, Z2) are the body coordinate system coordinates of the non-motor vehicle target; θ is the installation azimuth angle of the urban camera, φ is the pitch angle of the urban camera, and ψ is the attitude angle of the urban camera (the symbol images for the pitch and attitude angles do not survive in this text; φ and ψ are used here as placeholders).
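With formula (3) itself lost, one hedged sketch of the body-to-station-center transform is a composition of rotations by the three camera angles. The rotation axes and order below are assumptions, not taken from the patent:

```python
# Assumed sketch of formula (3): rotate body coordinates by azimuth (theta),
# pitch (phi), and attitude (psi). Axis choice and order are illustrative.
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def body_to_station(p_body, theta, phi, psi):
    """(X3,Y3,Z3) from (X2,Y2,Z2) via an assumed Z-X-Z rotation chain."""
    return rot_z(psi) @ rot_x(phi) @ rot_z(theta) @ np.asarray(p_body)
```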
7. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 1, wherein before step S4, it is judged whether there is a previously cached result storing non-motor vehicle staying position information and staying time information:
if there is a previously cached result, comparing the current non-motor vehicle staying position information and staying time information with the previously cached result and removing duplicates;
if there is no previously cached result, saving the staying position information and staying time information of the non-motor vehicle.
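The cache check of claim 7 can be sketched as below. Keying detections on a rounded position grid is an illustrative choice; the patent does not specify the matching rule, and the grid size is an assumption.

```python
# Sketch of the claim-7 deduplication: a position already in the cache keeps
# its earlier first-seen timestamp; new positions are stored.

def merge_with_cache(cache, detections, grid=0.5):
    """cache maps a rounded-position key to the first-seen time;
    detections are (x, y, t) tuples from the current comparison picture."""
    for x, y, t in detections:
        key = (round(x / grid), round(y / grid))
        if key not in cache:      # no prior record for this spot: store it
            cache[key] = t
    return cache
```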
8. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 1, wherein in step S5, the first threshold value of the preset long staying time limit is used to exclude non-motor vehicles that are moving or staying only briefly in the picture.
9. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 1, wherein in step S7, the disorder degree of the non-motor vehicles is calculated as follows:
S701, applying the average length and width to each non-motor vehicle target located in the comparison picture, and modeling the non-motor vehicle targets in the regional three-dimensional model;
S702, respectively calculating the direct aggregation degree of the non-motor vehicles according to the three-dimensional coordinates of the non-motor vehicle target models in the regional three-dimensional model;
S703, when three or more non-motor vehicles are aggregated, calculating the curvature of the line connecting the center points of the aggregated vehicles, and obtaining the disorder degree of the non-motor vehicles according to the curvature.
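Step S703 leaves the curvature definition to the lost equation images; one reasonable reading is the Menger curvature of consecutive center-point triples, sketched below. Taking the maximum triple curvature as the disorder score is likewise an illustrative assumption.

```python
# Hedged sketch of S703: Menger curvature over consecutive center triples.
import math

def menger_curvature(p, q, r):
    """Curvature of the circle through p, q, r: 4*Area / (|pq| |qr| |rp|).
    area2 below is twice the triangle area (cross-product magnitude)."""
    area2 = abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1]))
    d = math.dist(p, q) * math.dist(q, r) * math.dist(r, p)
    return 2.0 * area2 / d if d else 0.0   # zero for collinear points

def disorder_degree(centers):
    """Max local bend along the polyline of aggregated vehicle centers."""
    if len(centers) < 3:
        return 0.0
    return max(menger_curvature(centers[i], centers[i + 1], centers[i + 2])
               for i in range(len(centers) - 2))
```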
10. The machine vision-based urban non-motor vehicle disorderly parking identification method according to claim 9, wherein in step S702, the direct aggregation degree of the non-motor vehicles is calculated as follows: comparing the Euclidean distance between each non-motor vehicle target and its adjacent non-motor vehicle targets in the regional three-dimensional model, and when the obtained Euclidean distance is lower than a third threshold value of a preset Euclidean distance, the non-motor vehicles are considered to be aggregated.
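The claim-10 aggregation test groups vehicles whose center distances fall below the third threshold. A minimal sketch, using a flood fill over the pairwise-distance graph (the grouping strategy is an assumption; the patent only states the distance test):

```python
# Sketch of the claim-10 test: vehicles join a cluster when the Euclidean
# distance between their centers is below the preset third threshold.
import math

def clusters(centers, dist_threshold):
    """Return index groups of mutually reachable (transitively close) centers."""
    groups, seen = [], set()
    for i in range(len(centers)):
        if i in seen:
            continue
        group, stack = [], [i]
        while stack:                      # flood fill from seed i
            j = stack.pop()
            if j in seen:
                continue
            seen.add(j)
            group.append(j)
            stack.extend(k for k in range(len(centers))
                         if k not in seen
                         and math.dist(centers[j], centers[k]) < dist_threshold)
        groups.append(group)
    return groups
```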
CN202210902925.XA 2022-07-29 2022-07-29 Machine vision-based urban non-motor vehicle disordered parking identification method Active CN115272932B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210902925.XA CN115272932B (en) 2022-07-29 2022-07-29 Machine vision-based urban non-motor vehicle disordered parking identification method


Publications (2)

Publication Number Publication Date
CN115272932A true CN115272932A (en) 2022-11-01
CN115272932B CN115272932B (en) 2023-04-28

Family

ID=83770886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210902925.XA Active CN115272932B (en) 2022-07-29 2022-07-29 Machine vision-based urban non-motor vehicle disordered parking identification method

Country Status (1)

Country Link
CN (1) CN115272932B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107230346A (en) * 2017-08-01 2017-10-03 何永安 Confirmation method, device, server and the storage medium of shared bicycle parking specification
CN110345940A (en) * 2019-05-17 2019-10-18 深圳市中智车联科技有限责任公司 The method and its lock in posture and direction are parked for the shared bicycle of specification
CN111062986A (en) * 2018-10-17 2020-04-24 千寻位置网络有限公司 Monocular vision-based auxiliary positioning method and device for shared bicycle
CN111612895A (en) * 2020-05-27 2020-09-01 魏寸新 Leaf-shielding-resistant CIM real-time imaging method for detecting abnormal parking of shared bicycle
CN112185098A (en) * 2020-10-13 2021-01-05 特斯联科技集团有限公司 Shared bicycle monitoring method and system based on city monitoring video
CN213279940U (en) * 2020-11-26 2021-05-25 浙江通见科技有限公司 Video image analysis equipment for storage violation detection
CN113255486A (en) * 2021-05-13 2021-08-13 华设设计集团股份有限公司 Parking space occupation detection method based on high-level video monitoring
US20210327084A1 (en) * 2020-04-21 2021-10-21 Here Global B.V. Visual localization using a three-dimensional model and image segmentation
CN114492955A (en) * 2022-01-07 2022-05-13 重庆市地矿测绘院有限公司 Digital twin space-time big data platform based on three-dimensional live-action modeling and CIM
CN114549956A (en) * 2022-02-11 2022-05-27 上海市测绘院 Deep learning assisted inclined model building facade target recognition method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant