CN112183312A - City management event processing method based on smart city - Google Patents

City management event processing method based on smart city Download PDF

Info

Publication number
CN112183312A
Authority
CN
China
Prior art keywords
city
city management
management
target
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011029391.1A
Other languages
Chinese (zh)
Inventor
刘应森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangyuan Liangzhihui Technology Co ltd
Original Assignee
Guangyuan Liangzhihui Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangyuan Liangzhihui Technology Co ltd filed Critical Guangyuan Liangzhihui Technology Co ltd
Priority to CN202011029391.1A priority Critical patent/CN112183312A/en
Publication of CN112183312A publication Critical patent/CN112183312A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G06V20/49 - Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/26 - Government or public services
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Tourism & Hospitality (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to the field of smart cities and data fusion, and discloses a city management event processing method based on smart cities, which comprises the following steps: the city management equipment periodically carries out monitoring time marking and monitoring range marking on the collected monitoring videos so as to obtain city management videos and sends the city management videos to the smart city cloud platform. The smart city cloud platform performs video segmentation and image processing on the city management video to obtain a city management image sequence, calculates a management target coefficient of each city management image in the city management image sequence, and then selects a plurality of target city management images from the city management image sequence according to the management target coefficients. And the city management event module analyzes whether a city management event occurs according to all target city management images and sends city management prompt information to a corresponding city management terminal when the city management event occurs.

Description

City management event processing method based on smart city
Technical Field
The invention relates to the field of smart cities and big data, in particular to a city management event processing method based on smart cities.
Background
A smart city is an advanced form of urban informatization that applies a new generation of information technology across all industries of a city and builds on the innovation of the next-generation knowledge society. It realizes the deep integration of informatization, industrialization and urbanization, helps relieve the problems of large cities, improves the quality of urbanization, enables fine-grained and dynamic management, raises the effectiveness of city administration, and improves the quality of life of citizens.
Through the application of new-generation information technologies such as Internet of Things infrastructure, cloud computing infrastructure and geospatial infrastructure, together with tools and methods such as wikis, social networks, comprehensive integration methods and network-driven all-media communication terminals, the smart city achieves comprehensive and thorough perception, ubiquitous broadband interconnection, intelligent integration, and sustainable innovation characterized by user innovation, open innovation, mass innovation and collaborative innovation.
A city is a complex system, and facility problems can create potential safety hazards such as a missing manhole (well) cover, a broken water pipe, dumped construction waste (muck), and exposed objects in or around trash cans. Traditional management relies mainly on grid workers who patrol and report such problems, which is inefficient and leaves large hidden risks.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a city management event processing method based on a smart city, which comprises the following steps: the city management equipment periodically carries out monitoring time marking and monitoring range marking on the collected monitoring video to obtain a city management video and sends the city management video to the smart city cloud platform;
a video processing module of the smart city cloud platform divides a city management video into a plurality of city video images according to a preset time step length on a time dimension, and sorts all the city video images according to a time sequence to obtain a city video image sequence;
the image processing module carries out a weighted average on each pixel point of the city video image according to a first preset weight, a second preset weight and a third preset weight to obtain a pixel weighted value for each pixel point of the city video image; for each city video image in the city video image sequence, it divides the image into a plurality of management image sub-regions, each taking a pixel point as its central pixel point, a preset length as its length and a preset width as its width; it then sorts the pixel weighted values of all pixel points in each management image sub-region by size and replaces the pixel weighted value of the central pixel point of the sub-region with the pixel weighted value at the middle position of the sorted result, so as to obtain the city management image sequence;
the city management matrix module obtains a city management point packet according to the pixel weighted values of all pixel points of all city management images of the city management image sequence, and obtains the pixel median and the pixel variance of all the city management images in each pixel point in the city management image sequence according to the city management point set of the city management point packet; then obtaining a city management matrix according to the pixel median and the pixel variance of all city management images in the city management image sequence at each pixel point; the city management point packet comprises a plurality of city management point sets, and the city management point sets comprise pixel weighted values of all city video images of the city video image sequence at the same pixel point;
the target city management image module obtains, according to the city management matrix, the background spacing distance of the city background of the area monitored by the city management video, and the background spacing distance between that city background and each city management image in the city management image sequence; it then calculates a management target coefficient for each city management image in the city management image sequence from these two background spacing distances, and compares the management target coefficient of each city management image with a management target threshold: city management images whose management target coefficient is greater than the management target threshold are taken as target city management images, so that a plurality of target city management images are selected from the city management image sequence, and city management images whose management target coefficient is smaller than the management target threshold are taken as city background images, so that a plurality of city background images are selected from the city management image sequence; a target city management image is a city management image in which a city management event may have occurred, and a city background image is a city management image in which no city management event has occurred;
and the city management event module analyzes whether a city management event occurs according to all target city management images and sends city management prompt information to a corresponding city management terminal when the city management event occurs.
According to a preferred embodiment, the city management event comprises: a lost manhole (well) cover, a broken water pipe, dumped construction waste (muck), and exposed objects in or around a trash can.
According to a preferred embodiment, the city management device is a monitoring device with data transmission function and communication function, which is arranged in a city, and comprises: a gun-type camera, an integral camera, a hemispherical camera, a fisheye camera, and a pinhole camera.
According to a preferred embodiment, the city management terminal is a device with communication function and data transmission function used by a city manager, and comprises: smart phones, smart watches, tablet computers, and notebook computers.
According to a preferred embodiment, the city management prompt message is used for prompting a city manager that a city management event occurs, and comprises an event-related time, an event-related place and an event type.
According to a preferred embodiment, said sequence of urban video images comprises a plurality of urban video images arranged in a temporal sequence.
According to a preferred embodiment, the city management image sequence comprises several city management images arranged in time sequence.
According to a preferred embodiment, the city management event module analyzing whether the city management event occurs according to all target city management images comprises:
the city management event module judges whether two consecutive target city management images exist in the city management image sequence; when two consecutive target city management images exist, it obtains the total number of pixel points of each target city management image and the pixel weighted value of each pixel point, counts the number of pixel points having each pixel weighted value, calculates the occurrence probability of each pixel weighted value from that count and the total number of pixel points, and obtains the similarity of the two adjacent target city management images from the occurrence probabilities of the pixel weighted values in each target city management image and a similarity function;
when the similarity is greater than a similarity threshold value, the two adjacent target city management images have a time sequence relationship, and the two target city management images are subjected to feature fusion in a linear iteration mode according to feature fusion factors to obtain target feature fusion images;
the city management event module constructs a maximum likelihood function of a target object in the target feature fusion image according to the target feature fusion image, acquires the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object according to the maximum likelihood function, performs linear transformation on the position coordinate and the visual feature of the target object according to the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object to obtain a linear transformation relation between the position coordinate and the visual feature of the target object, and acquires the real-time probability of each position coordinate point of the target object in the target feature fusion image according to the linear transformation relation between the position coordinate and the visual feature of the target object.
According to a preferred embodiment, the city management event module analyzing whether the city management event occurs according to all target city management images comprises:
the city management event module divides the target feature fusion image into a plurality of fusion image sub-regions, determines the target fusion image sub-regions according to the real-time probability of each position coordinate point of the target object in the target feature fusion image and extracts target feature vectors of the target fusion image sub-regions;
the city management event module analyzes a visual mean vector of the visual features of the target object and a position mean vector of the position features of the target object by using the target feature vector;
the city management event module extracts a plurality of feature points from the target fusion image subregion, analyzes the feature value and the coordinate position of each feature point according to the visual mean value vector and the position mean value vector, then determines the probability value of each feature point, and calculates the occupancy rate of each feature point according to the probability value of each feature point; determining the center of the target fusion image subregion according to the occupancy rate of each feature point, and acquiring the real-time position of a target object according to the center of the target fusion image subregion;
and the city management event module analyzes whether a city management event occurs according to the real-time position of the target object.
In the invention, city management equipment arranged in a city sends city management videos to the smart city cloud platform in real time. By analysing these videos, the smart city cloud platform judges whether city management events with potential safety hazards, such as a lost manhole cover, a broken water pipe, dumped construction waste or exposed objects around trash cans, have occurred in the city, and when such an event occurs it notifies the relevant city management personnel in time so that the event can be handled promptly, which helps improve the level of city management and the efficiency of city management and response.
Drawings
Fig. 1 is a flowchart illustrating a city management event processing method based on a smart city according to an exemplary embodiment.
Detailed Description
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present invention.
Referring to fig. 1, in one embodiment, the city management event processing method based on smart city of the present invention may include the following steps:
and S1, the city management equipment periodically carries out monitoring time marking and monitoring range marking on the collected monitoring video to obtain the city management video and sends the city management video to the smart city cloud platform.
Optionally, the city management device is a monitoring device disposed in a city and having a data transmission function and a communication function, and includes: a gun-type camera, an integral camera, a hemispherical camera, a fisheye camera, and a pinhole camera.
Specifically, the monitoring time mark is the time period during which the monitoring video was captured, and the monitoring range mark is the range of the monitored area covered by the monitoring video.
Optionally, the city management video is a monitoring video marked with monitoring time and monitoring range, and is used for analyzing whether a city management event occurs in the monitoring range.
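As a rough illustration of S1 (not part of the patent text), the following Python sketch shows one way a device could attach a monitoring-time mark and a monitoring-range mark to a captured clip before sending it to the cloud platform; the field names, the data structure and the JSON packaging are all hypothetical choices, since the patent only states that the marks are added.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CityManagementVideo:
    """A monitoring clip tagged with the marks described in S1 (hypothetical structure)."""
    video_path: str
    monitor_start: str   # start of the monitored time period (monitoring time mark)
    monitor_end: str     # end of the monitored time period (monitoring time mark)
    monitor_range: str   # description of the monitored area (monitoring range mark)
    device_id: str

def mark_and_package(video_path, start, end, area, device_id):
    # Wrap the raw surveillance clip with its monitoring-time and monitoring-range marks;
    # the resulting payload would be what the device uploads to the smart city cloud platform.
    tagged = CityManagementVideo(video_path, start, end, area, device_id)
    return json.dumps(asdict(tagged))

payload = mark_and_package("clip_0001.mp4", "2020-09-27T08:00:00",
                           "2020-09-27T08:05:00", "Intersection of A Road and B Street", "cam-17")
```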
S2, the video processing module of the smart city cloud platform divides the city management video into a plurality of city video images according to a preset time step length on the time dimension, and sorts all the city video images according to the time sequence to obtain a city video image sequence.
Optionally, the sequence of city video images comprises a number of city video images arranged in a time sequence. Optionally, the preset time step is preset according to actual conditions.
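As a minimal sketch of S2 (assuming the preset time step is expressed in seconds and using OpenCV, which the patent does not prescribe), frames can be sampled from the city management video at a fixed interval; because they are read in capture order, the resulting list is already the time-ordered city video image sequence.

```python
import cv2

def split_video(video_path, time_step_s=1.0):
    """Sample one city video image every `time_step_s` seconds, in temporal order."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    frame_step = max(1, int(round(fps * time_step_s)))
    images, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % frame_step == 0:
            images.append(frame)   # frames arrive in time order, so the list stays sorted
        idx += 1
    cap.release()
    return images                   # the city video image sequence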
S3, the image processing module carries out a weighted average on each pixel point of the city video image according to the first preset weight, the second preset weight and the third preset weight to obtain a pixel weighted value for each pixel point of the city video image; for each city video image in the city video image sequence, it divides the image into a plurality of management image sub-regions, each taking a pixel point as its central pixel point, the preset length as its length and the preset width as its width; it then sorts the pixel weighted values of all pixel points in each management image sub-region by size and replaces the pixel weighted value of the central pixel point of the sub-region with the pixel weighted value at the middle position of the sorted result, so as to obtain the city management image sequence.
Optionally, the city management image sequence includes a plurality of city management images arranged in time order. Optionally, the preset length and the preset width are preset according to actual conditions.
Optionally, the first preset weight, the second preset weight and the third preset weight may be preset according to actual requirements. Specifically, the first preset weight is a weighted value of a red component of the pixel point, the second preset weight is a weighted value of a green component of the pixel point, and the third preset weight is a weighted value of a blue component of the pixel point.
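A minimal sketch of S3, assuming the three preset weights are applied to the red, green and blue components and that replacing the central pixel by the middle value of the sorted sub-region amounts to a median filter whose window is the preset length and width. The concrete weight values and window size below are placeholders, not values fixed by the patent.

```python
import numpy as np
from scipy.ndimage import median_filter

W_R, W_G, W_B = 0.30, 0.59, 0.11   # first/second/third preset weights (placeholder values)
WIN_H, WIN_W = 3, 3                 # preset length and width of a management image sub-region

def to_city_management_image(bgr_image):
    b, g, r = bgr_image[..., 0], bgr_image[..., 1], bgr_image[..., 2]
    # Pixel weighted value of every pixel point: weighted average of the colour components.
    weighted = W_R * r + W_G * g + W_B * b
    # For each pixel, sort the weighted values inside its sub-region and keep the middle one,
    # i.e. replace the central pixel point by the median of its neighbourhood.
    return median_filter(weighted, size=(WIN_H, WIN_W))

def to_city_management_sequence(city_video_images):
    return [to_city_management_image(img) for img in city_video_images]
```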
S4, the city management matrix module obtains a city management point packet according to the weighted pixel values of all pixel points of all city management images of the city management image sequence, and obtains the pixel median and the pixel variance of all the city management images in each pixel point in the city management image sequence according to the city management point set of the city management point packet; and then obtaining a city management matrix according to the pixel median and the pixel variance of all the city management images in the city management image sequence at each pixel point.
Optionally, the city management point package includes a plurality of city management point sets, and each city management point set includes pixel weighted values of all city video images of the city video image sequence at the same pixel point.
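The patent does not spell out how the per-pixel median and variance are combined into the city management matrix, so the sketch below simply stacks the city management image sequence, computes both statistics at every pixel point, and (as an assumption for illustration) keeps the median as the matrix entry while returning the variance alongside it.

```python
import numpy as np

def build_city_management_matrix(city_management_images):
    """Per-pixel statistics over the city management image sequence.

    Each 'city management point set' is the column of pixel weighted values that one pixel
    position takes across all images; the stack of all such columns is the 'point packet'.
    """
    stack = np.stack(city_management_images, axis=0)   # shape: (num_images, H, W)
    pixel_median = np.median(stack, axis=0)            # pixel median at every pixel point
    pixel_variance = np.var(stack, axis=0)             # pixel variance at every pixel point
    # Assumption: use the per-pixel median as the city management matrix; the patent only
    # states that both statistics are used, not the exact combination formula.
    city_management_matrix = pixel_median
    return city_management_matrix, pixel_variance
```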
S5, the target city management image module obtains, according to the city management matrix, the background spacing distance of the city background of the area monitored by the city management video, and the background spacing distance between that city background and each city management image in the city management image sequence; it then calculates a management target coefficient for each city management image in the city management image sequence from these two background spacing distances, and compares the management target coefficient of each city management image with a management target threshold: city management images whose management target coefficient is greater than the management target threshold are taken as target city management images, so that a plurality of target city management images are selected from the city management image sequence, and city management images whose management target coefficient is smaller than the management target threshold are taken as city background images, so that a plurality of city background images are selected from the city management image sequence.
Optionally, the pixel feature matrix of each city management image is obtained according to the pixel weighted value of the pixel point of each city management image in the city management image sequence.
Optionally, the target city management image is a city management image in which a city management event may have occurred, and the city background image is a city management image in which no city management event has occurred.
Optionally, the background spacing distance of the city background is given by the first formula of the original disclosure (reproduced in the source record only as the image Figure BDA0002703073080000071), where d_b is the background spacing distance of the city background, m is the number of rows of the city management matrix, n is the number of columns of the city management matrix, u_ij is the element in row i and column j of the city management matrix, i is the row index and j is the column index.
Optionally, the background spacing distance between the city background and the k-th city management image is given by the second formula of the original disclosure (images Figure BDA0002703073080000072 and Figure BDA0002703073080000073), where m is the number of rows of the city management matrix, n is the number of columns of the city management matrix, u_ij is the element in row i and column j of the city management matrix, v_ij is the element in row i and column j of the pixel feature matrix of the k-th city management image, i is the row index and j is the column index.
Optionally, the management target coefficient is given by the third formula of the original disclosure (image Figure BDA0002703073080000074), where s_k is the management target coefficient of the k-th city management image, m is the number of rows of the city management matrix, n is the number of columns of the city management matrix, u_ij is the element in row i and column j of the city management matrix, v_ij is the element in row i and column j of the pixel feature matrix of the k-th city management image, i is the row index and j is the column index.
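The three formulas above survive only as images in the source record, so the sketch below substitutes plausible stand-ins: a Frobenius-style magnitude of the city management matrix for d_b, the same magnitude of the element-wise difference u_ij - v_ij for the per-image background spacing distance, and their ratio for the management target coefficient s_k. These definitions are assumptions made for illustration, not the patent's actual equations.

```python
import numpy as np

def background_distance(city_management_matrix):
    # Assumed stand-in for d_b: Frobenius-style magnitude of the city management matrix.
    return float(np.sqrt(np.sum(city_management_matrix ** 2)))

def image_background_distance(city_management_matrix, pixel_feature_matrix):
    # Assumed stand-in for the spacing between the city background and the k-th image.
    diff = city_management_matrix - pixel_feature_matrix
    return float(np.sqrt(np.sum(diff ** 2)))

def select_target_images(city_management_matrix, city_management_images, target_threshold):
    d_b = background_distance(city_management_matrix)
    targets, backgrounds = [], []
    for img in city_management_images:
        d_bk = image_background_distance(city_management_matrix, img)
        s_k = d_bk / (d_b + 1e-9)          # assumed form of the management target coefficient
        (targets if s_k > target_threshold else backgrounds).append(img)
    return targets, backgrounds             # target city management images / city background images
```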
And S6, the city management event module analyzes whether a city management event occurs according to all target city management images, and sends city management prompt information to the corresponding city management terminal when the city management event occurs.
Optionally, the city management event includes: a lost manhole (well) cover, a broken water pipe, dumped construction waste (muck), and exposed objects in or around a trash can.
Optionally, the city management prompt information is used to notify a city manager that a city management event has occurred, and it includes the time of the event, the place of the event and the event type, where the event type includes manhole cover loss, water pipe breakage, construction waste dumping and exposed objects around a trash can, and the city manager is a person responsible for the day-to-day affairs of the city.
Optionally, the city management terminal is a device used by a city manager and having a communication function and a data transmission function, and the device includes: smart phones, smart watches, tablet computers, and notebook computers.
Specifically, the analyzing, by the city management event module, whether the city management event occurs according to all the target city management images includes:
the city management event module judges whether two consecutive target city management images exist in the city management image sequence; when two consecutive target city management images exist, it obtains the total number of pixel points of each target city management image and the pixel weighted value of each pixel point, counts the number of pixel points having each pixel weighted value, calculates the occurrence probability of each pixel weighted value from that count and the total number of pixel points, and obtains the similarity of the two adjacent target city management images from the occurrence probabilities of the pixel weighted values in each target city management image and a similarity function;
when the similarity is greater than a similarity threshold value, the two adjacent target city management images have a time sequence relationship, and the two target city management images are subjected to feature fusion in a linear iteration mode according to feature fusion factors to obtain target feature fusion images;
the city management event module constructs a maximum likelihood function of a target object in the target feature fusion image according to the target feature fusion image, acquires the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object according to the maximum likelihood function, performs linear transformation on the position coordinate and the visual feature of the target object according to the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object to obtain a linear transformation relation between the position coordinate and the visual feature of the target object, and acquires the real-time probability of each position coordinate point of the target object in the target feature fusion image according to the linear transformation relation between the position coordinate and the visual feature of the target object.
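The patent does not give the similarity function or the feature fusion factor explicitly. The sketch below therefore uses the occurrence probability of each (quantized) pixel weighted value as a histogram, a Bhattacharyya-style overlap as the similarity function, and a single linear blending factor for the feature fusion; all three choices are assumptions made for illustration.

```python
import numpy as np

def value_probabilities(image, bins=256):
    # Occurrence probability of each pixel weighted value (quantized into `bins` levels).
    hist, _ = np.histogram(image, bins=bins, range=(0, 255))
    return hist / image.size

def similarity(img_a, img_b, bins=256):
    # Assumed similarity function: Bhattacharyya-style overlap of the two probability histograms.
    p, q = value_probabilities(img_a, bins), value_probabilities(img_b, bins)
    return float(np.sum(np.sqrt(p * q)))

def fuse(img_a, img_b, fusion_factor=0.5):
    # Assumed linear feature fusion controlled by a single feature fusion factor.
    return fusion_factor * img_a + (1.0 - fusion_factor) * img_b

SIM_THRESHOLD = 0.9  # placeholder similarity threshold

def maybe_fuse(img_a, img_b):
    # Two adjacent target city management images are fused only when they are similar enough.
    return fuse(img_a, img_b) if similarity(img_a, img_b) > SIM_THRESHOLD else None
```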
Further, the analysis by the city management event module of whether a city management event occurs according to all target city management images comprises the following steps:
the city management event module divides the target feature fusion image into a plurality of fusion image sub-regions, determines the target fusion image sub-regions according to the real-time probability of each position coordinate point of the target object in the target feature fusion image and extracts target feature vectors of the target fusion image sub-regions;
the city management event module analyzes a visual mean vector of the visual features of the target object and a position mean vector of the position features of the target object by using the target feature vector;
the city management event module extracts a plurality of feature points from the target fusion image subregion, analyzes the feature value and the coordinate position of each feature point according to the visual mean value vector and the position mean value vector, then determines the probability value of each feature point, and calculates the occupancy rate of each feature point according to the probability value of each feature point; determining the center of the target fusion image subregion according to the occupancy rate of each feature point, and acquiring the real-time position of a target object according to the center of the target fusion image subregion;
and the city management event module analyzes whether a city management event occurs according to the real-time position of the target object.
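How the occupancy rate of a feature point is computed is not detailed in the text. One illustrative reading, sketched below, treats each feature point's probability value as a weight, normalizes the weights into occupancy rates, and takes the occupancy-weighted centroid as the centre of the target fusion image sub-region, which then serves as the real-time position of the target object. This is an assumption, not the patent's exact procedure.

```python
import numpy as np

def target_centre(feature_points, probabilities):
    """feature_points: (N, 2) array of (x, y) coordinates; probabilities: length-N array."""
    probabilities = np.asarray(probabilities, dtype=float)
    occupancy = probabilities / probabilities.sum()       # assumed occupancy rate per feature point
    centre = (np.asarray(feature_points, dtype=float) * occupancy[:, None]).sum(axis=0)
    return centre                                          # real-time position of the target object
```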
In the invention, city management equipment arranged in a city sends city management videos to the smart city cloud platform in real time. By analysing these videos, the smart city cloud platform judges whether city management events with potential safety hazards, such as a lost manhole cover, a broken water pipe, dumped construction waste or exposed objects around trash cans, have occurred in the city, and when such an event occurs it notifies the relevant city management personnel in time so that the event can be handled promptly. This avoids losses to citizens caused by delayed handling and improves both the level and the efficiency of city management.
In one embodiment, a smart city management system for performing the method of the present invention may comprise: city management equipment, city management terminals and a smart city cloud platform, where the smart city cloud platform is communicatively connected to each piece of city management equipment and to each city management terminal.
The city management equipment periodically carries out monitoring time marking and monitoring range marking on the collected monitoring video to obtain the city management video and sends it to the smart city cloud platform. The city management equipment is monitoring equipment deployed in the city with data transmission and communication functions, and includes: a gun-type camera, an integral camera, a hemispherical camera, a fisheye camera, and a pinhole camera.
The smart city cloud platform includes: a video processing module, an image processing module, a city management matrix module, a target city management image module, a city management event module and a database, where the modules are communicatively connected to one another.
The video processing module divides the city management video into a plurality of city video images according to a preset time step length on a time dimension, and sequences all the city video images according to a time sequence to obtain a city video image sequence.
The image processing module carries out a weighted average on each pixel point of the city video image according to the first preset weight, the second preset weight and the third preset weight to obtain a pixel weighted value for each pixel point of the city video image; for each city video image in the city video image sequence, it divides the image into a plurality of management image sub-regions, each taking a pixel point as its central pixel point, the preset length as its length and the preset width as its width; it then sorts the pixel weighted values of all pixel points in each management image sub-region by size and replaces the pixel weighted value of the central pixel point of the sub-region with the pixel weighted value at the middle position of the sorted result, so as to obtain the city management image sequence.
The city management matrix module obtains a city management point packet according to the pixel weighted values of all pixel points of all city management images of the city management image sequence, and obtains the pixel median and the pixel variance of all the city management images in each pixel point in the city management image sequence according to the city management point set of the city management point packet; then obtaining a city management matrix according to the pixel median and the pixel variance of all city management images in the city management image sequence at each pixel point; the city management point packet comprises a plurality of city management point sets, and the city management point sets comprise pixel weighted values of all city video images of the city video image sequence at the same pixel point.
The target city management image module obtains, according to the city management matrix, the background spacing distance of the city background of the area monitored by the city management video, and the background spacing distance between that city background and each city management image in the city management image sequence; it then calculates a management target coefficient for each city management image in the city management image sequence from these two background spacing distances, and compares the management target coefficient of each city management image with a management target threshold: city management images whose management target coefficient is greater than the management target threshold are taken as target city management images, so that a plurality of target city management images are selected from the city management image sequence, and city management images whose management target coefficient is smaller than the management target threshold are taken as city background images, so that a plurality of city background images are selected from the city management image sequence.
The city management event module analyzes whether a city management event occurs according to all target city management images, and sends city management prompt information to a corresponding city management terminal when the city management event occurs, wherein the city management terminal is a device which is used by a city manager and has a communication function and a data transmission function, and the device comprises: smart phones, smart watches, tablet computers, and notebook computers.
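For completeness, a rough Python skeleton of how the cloud-platform modules described above could be chained end to end; the function names reuse the illustrative sketches given earlier in this description and are not defined by the patent itself.

```python
def process_city_management_video(video_path, time_step_s=1.0, target_threshold=1.0):
    # Video processing module -> image processing module -> city management matrix module
    # -> target city management image module -> city management event module.
    video_images = split_video(video_path, time_step_s)
    management_images = to_city_management_sequence(video_images)
    matrix, _variance = build_city_management_matrix(management_images)
    targets, _backgrounds = select_target_images(matrix, management_images, target_threshold)
    fused_images = []
    for prev, curr in zip(targets, targets[1:]):
        fused = maybe_fuse(prev, curr)
        if fused is not None:
            fused_images.append(fused)  # downstream event analysis would run on these fused images
    return fused_images
```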
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (9)

1. A city management event processing method based on a smart city is characterized by comprising the following steps:
the city management equipment periodically carries out monitoring time marking and monitoring range marking on the collected monitoring video to obtain a city management video and sends the city management video to the smart city cloud platform; a video processing module of the smart city cloud platform divides a city management video into a plurality of city video images according to a preset time step length on a time dimension, and sorts all the city video images according to a time sequence to obtain a city video image sequence;
the image processing module carries out weighted average on each pixel point of the city video image according to the first preset weight, the second preset weight and the third preset weight to obtain a pixel weighted value of each pixel point of the city video image, divides each city video image in the city video image sequence into a plurality of management image sub-regions, sorts the pixel weighted values of all the pixel points in each management image sub-region according to the size, and then selects the pixel weighted value in the middle position of a sorting result to replace the pixel weighted value of the central pixel point of the management sub-region to obtain the city management image sequence;
the city management matrix module obtains a city management point packet according to the pixel weighted values of all pixel points of all city management images of the city management image sequence, and obtains the pixel median and the pixel variance of all the city management images in each pixel point in the city management image sequence according to the city management point set of the city management point packet; then obtaining a city management matrix according to the pixel median and the pixel variance of all city management images in the city management image sequence at each pixel point;
the target city management image module acquires, according to the city management matrix, the background spacing distance of the city background of the area monitored by the city management video and the background spacing distance between that city background and each city management image in the city management image sequence, then calculates the management target coefficient of each city management image in the city management image sequence according to these background spacing distances, compares the management target coefficient of each city management image with a management target threshold value, takes the city management images whose management target coefficient is greater than the management target threshold value as target city management images so as to select a plurality of target city management images from the city management image sequence, and takes the city management images whose management target coefficient is smaller than the management target threshold value as city background images so as to select a plurality of city background images from the city management image sequence;
and the city management event module analyzes whether a city management event occurs according to the target city management image and sends city management prompt information to a corresponding city management terminal when the city management event occurs.
2. The method of claim 1, wherein the city management event comprises: a lost manhole (well) cover, a broken water pipe, dumped construction waste (muck), and exposed objects in or around a trash can.
3. The method according to claim 2, wherein the city management device is a monitoring device having a data transmission function and a communication function, which is deployed in a city, and comprises: a gun-type camera, an integral camera, a hemispherical camera, a fisheye camera, and a pinhole camera.
4. The method according to one of claims 1 to 3, wherein the city management point package comprises a plurality of city management point sets, and the city management point sets comprise pixel weighted values of all city video images of the city video image sequence at the same pixel point.
5. The method according to one of claims 1 to 4, wherein the target city management image is a city management image in which there is a probability that a city management event occurs; the city background image is a city management image without probability of occurrence of a city management event.
6. The method of claim 5, wherein the city management event module analyzing whether a city management event occurs according to all target city management images comprises:
the city management event module judges whether two consecutive target city management images exist in the city management image sequence; when two consecutive target city management images exist in the city management image sequence, it acquires the total number of pixel points of each target city management image and the pixel weighted value of each pixel point, counts the number of pixel points having each pixel weighted value, calculates the occurrence probability of each pixel weighted value according to the total number of pixel points and the number of pixel points having each pixel weighted value, and acquires the similarity of the two adjacent target city management images according to the occurrence probabilities of the pixel weighted values in each target city management image and a similarity function;
when the similarity is greater than a similarity threshold value, the two adjacent target city management images have a time sequence relationship, and the two target city management images are subjected to feature fusion in a linear iteration mode according to feature fusion factors to obtain target feature fusion images;
the city management event module constructs a maximum likelihood function of a target object in the target feature fusion image according to the target feature fusion image, acquires the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object according to the maximum likelihood function, performs linear transformation on the position coordinate and the visual feature of the target object according to the conditional probability of the target object at each position coordinate point and the conditional probability of the visual feature of the target object to obtain a linear transformation relation between the position coordinate and the visual feature of the target object, and acquires the real-time probability of each position coordinate point of the target object in the target feature fusion image according to the linear transformation relation between the position coordinate and the visual feature of the target object.
7. The method of claim 6, wherein the city management event module analyzing whether a city management event occurs according to all target city management images comprises:
the city management event module divides the target feature fusion image into a plurality of fusion image sub-regions, determines the target fusion image sub-regions according to the real-time probability of each position coordinate point of the target object in the target feature fusion image, and then extracts the target feature vectors of the target fusion image sub-regions;
the city management event module analyzes the visual mean vector of the visual feature of the target object and the position mean vector of the position feature of the target object according to the target feature vector;
the city management event module extracts a plurality of feature points from the target fusion image subregion, analyzes the feature value and the coordinate position of each feature point according to the visual mean value vector and the position mean value vector, then determines the probability value of each feature point, and calculates the occupancy rate of each feature point according to the probability value of each feature point; determining the center of the target fusion image subregion according to the occupancy rate of each feature point, and acquiring the real-time position of a target object according to the center of the target fusion image subregion;
and the city management event module analyzes whether a city management event occurs according to the real-time position of the target object.
8. The method according to one of claims 1 to 7, wherein the city management prompt message is used for prompting a city manager that a city management event occurs, and comprises an event-related time, an event-related place and an event type.
9. The method according to one of claims 1 to 8, wherein the city management terminal is a device with communication function and data transmission function used by a city manager, and comprises: smart phones, tablet computers, and notebook computers.
CN202011029391.1A 2020-09-27 2020-09-27 City management event processing method based on smart city Withdrawn CN112183312A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011029391.1A CN112183312A (en) 2020-09-27 2020-09-27 City management event processing method based on smart city

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011029391.1A CN112183312A (en) 2020-09-27 2020-09-27 City management event processing method based on smart city

Publications (1)

Publication Number Publication Date
CN112183312A true CN112183312A (en) 2021-01-05

Family

ID=73943519

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011029391.1A Withdrawn CN112183312A (en) 2020-09-27 2020-09-27 City management event processing method based on smart city

Country Status (1)

Country Link
CN (1) CN112183312A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191266A (en) * 2021-04-30 2021-07-30 江苏航运职业技术学院 Remote monitoring management method and system for ship power device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210105)