CN110336873B - Intelligent city management special vehicle monitoring system - Google Patents


Info

Publication number
CN110336873B
CN110336873B (application CN201910595539.9A)
Authority
CN
China
Prior art keywords
vehicle
matrix
information
monitoring
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910595539.9A
Other languages
Chinese (zh)
Other versions
CN110336873A (en)
Inventor
廖兴旺
黄伟鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Ruis Technology Co ltd
Original Assignee
Fujian Ruis Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Ruis Technology Co ltd filed Critical Fujian Ruis Technology Co ltd
Priority to CN201910595539.9A
Publication of CN110336873A
Application granted
Publication of CN110336873B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/16File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/17Details of further file system functions
    • G06F16/172Caching, prefetching or hoarding of files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/008Registering or indicating the working of vehicles communicating information to a remotely located station
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/08Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841Registering performance data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L67/025Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/48Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for in-vehicle communication

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Educational Administration (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Computing Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a smart city management special vehicle monitoring system, which comprises: the vehicle-mounted end comprises a first wireless communication module, a monitoring module, a positioning module and a first instruction transmission module; the network side server comprises a second wireless communication module, a monitoring terminal, a positioning terminal and a second instruction transmission module; the second instruction transmission module is used for inputting a monitoring route instruction by a background worker and transmitting the monitoring route instruction to the vehicle-mounted end; the first instruction transmission module is used for playing the monitoring route instruction, and a driver of the urban management special vehicle drives the vehicle to patrol according to the monitoring route instruction; the positioning module is used for acquiring the position information of the urban management special vehicle; the monitoring module comprises a first monitoring unit and a second monitoring unit; the first monitoring unit is used for acquiring first image information of the surrounding environment of the urban management special vehicle; the second monitoring unit is used for acquiring second image information in the urban management special vehicle; and the vehicle-mounted terminal transmits the position information, the first image information and the second image information to the network side server.

Description

Intelligent city management special vehicle monitoring system
Technical Field
The invention relates to the technical field of monitoring, in particular to a smart city management special vehicle monitoring system.
Background
With the continuous development of science, technology and society, technology has been widely applied in people's lives and has driven the rapid development of human society. City management concerns people's daily life and safety, yet at present it still relies on personnel driving special city-management vehicles to carry out patrol monitoring.
However, the traditional urban management special vehicle has a single function, and patrols rely only on the naked-eye observation of patrol personnel, which consumes great energy and time. In addition, district-based management is currently adopted: patrol personnel carry out random manual patrol monitoring within their jurisdiction, so monitoring of all streets in the jurisdiction cannot be achieved, which greatly reduces monitoring efficiency.
Therefore, it is urgently needed to provide an intelligent city management special vehicle monitoring system.
Disclosure of Invention
In order to solve the technical problems, the invention provides an intelligent urban management special vehicle monitoring system which is used for realizing the monitoring function of urban management special vehicles.
The embodiment of the invention provides a smart city management special vehicle monitoring system which comprises a vehicle-mounted end and a network side server; wherein:
the vehicle-mounted end comprises a first wireless communication module, a monitoring module, a positioning module and a first instruction transmission module;
the network side server comprises a second wireless communication module, a monitoring terminal, a positioning terminal and a second instruction transmission module;
the second instruction transmission module of the network side server is used for a background worker to input a monitoring route instruction and transmit the monitoring route instruction to the vehicle-mounted end through the second wireless communication module;
the first instruction transmission module of the vehicle-mounted end is used for playing the monitoring route instruction received by the first wireless communication module, and a driver of the urban management special vehicle drives the urban management special vehicle to patrol and monitor according to the played monitoring route instruction;
the positioning module of the vehicle-mounted end is used for acquiring the position information of the urban management special vehicle;
the monitoring module of the vehicle-mounted end comprises a first monitoring unit and a second monitoring unit; the first monitoring unit is used for acquiring first image information of the surrounding environment of the urban management special vehicle; the second monitoring unit is used for acquiring second image information in the urban management special vehicle;
the vehicle-mounted terminal is used for transmitting the position information, the first image information and the second image information to the network side server through the first wireless communication module;
the positioning terminal of the network side server is used for receiving the position information transmitted by the vehicle-mounted terminal through the second wireless communication module; and the monitoring terminal of the network side server receives the first image information and the second image information transmitted by the vehicle-mounted terminal through the second wireless communication module.
In one embodiment, the first instruction transmission module is further configured to receive voice information transmitted by a driver, and transmit the voice information to the network-side server through the first wireless communication module;
and the second wireless communication module of the network side server receives the voice information and plays the voice information through the second instruction transmission module.
In one embodiment, the positioning terminal comprises a map database and a display; the map database comprises an electronic map; and the positioning terminal is used for marking the position information transmitted by the vehicle-mounted terminal on the electronic map and displaying the position information through the display.
In one embodiment, the monitoring terminal further comprises a face recognition module;
the face recognition module comprises a template database, an image processing unit, an escape personnel database and an image recognition unit;
the template database is used for storing S face templates with different sizes, and each face template can form a pixel matrix B according to the difference of the pixels of each position point in the templaten×n
Figure BDA0002117509580000031
Wherein B isn×nRepresents the modelThe plate is a template of n x n pixel points, bnnB is the value of the pixel corresponding to the position of the n-th column in the n-th row of the templatennI.e. the value corresponding to the pixel point, further bnnIs a set containing R, G, B three values;
the image processing unit is used for calculating the first image information to form a pixel matrix A;
Figure BDA0002117509580000032
wherein a islmIs the value of the pixel point corresponding to the mth row and the mth column of the pixel point of the first image information, and a is the samelmAlso a set containing R, G, B three values; then, according to the size stored in the template database, S selection frames with the same specification as the template database are determined, data of each selection frame is extracted from the position a11 in the matrix A, and a matrix with the size corresponding to the selection frame is extracted each time to form a matrix Cn×n
Figure BDA0002117509580000033
Wherein c isnnThe value of the pixel point corresponding to the n-th row and n-th column of the matrix selected for the selection frame, and cnnAlso a set containing R, G, B three values; the obtained matrix values and the corresponding template values are subjected to correlation examination, and the detection method is as follows:
$$\rho(B,C)=\frac{1}{3}\sum_{t\in\{R,G,B\}}\frac{\sum_{i=1}^{n}\sum_{j=1}^{n}\left(b_{ij}^{(t)}-\bar{b}^{(t)}\right)\left(c_{ij}^{(t)}-\bar{c}^{(t)}\right)}{\sqrt{\sum_{i=1}^{n}\sum_{j=1}^{n}\left(b_{ij}^{(t)}-\bar{b}^{(t)}\right)^{2}\cdot\sum_{i=1}^{n}\sum_{j=1}^{n}\left(c_{ij}^{(t)}-\bar{c}^{(t)}\right)^{2}}}$$

where b_{ij}^{(t)} is the t-th of the three channel values at row i, column j of matrix B, c_{ij}^{(t)} is the corresponding t-th channel value of matrix C, and \bar{b}^{(t)}, \bar{c}^{(t)} are the respective channel means. If the correlation value ρ(B, C) is greater than or equal to 0.5, the area selected by the selection frame is identified as a human face, and the matrix is stored.
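As a minimal sketch, the correlation test above can be implemented as follows, assuming the mean-over-channels Pearson reconstruction of ρ(B, C); the function names `channel_correlation` and `is_face` are illustrative, not part of the patent:

```python
import numpy as np

def channel_correlation(B, C):
    """Mean Pearson correlation over the R, G, B channels of two
    equally sized patches (a reconstruction of rho(B, C) above)."""
    rhos = []
    for t in range(3):  # the three channel values R, G, B
        b = B[:, :, t].astype(float).ravel()
        c = C[:, :, t].astype(float).ravel()
        b -= b.mean()
        c -= c.mean()
        denom = np.sqrt((b * b).sum() * (c * c).sum())
        rhos.append(float(b @ c) / denom if denom > 0 else 0.0)
    return float(np.mean(rhos))

def is_face(template, patch, threshold=0.5):
    """The frame's patch is identified as a face when rho >= 0.5."""
    return channel_correlation(template, patch) >= threshold
```

An identical patch yields a correlation of 1.0 and therefore passes the 0.5 threshold.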
the escaping personnel database is used for storing the face images of all the escaping personnel with the same specification;
the image identification unit is used for extracting pixels of the face image from the escaper database to form a matrix, the specification pixel quantity of the face image is larger than the matrix corresponding to the largest template in the S face templates determined in the prior art, and then the pixel matrix of the escaper database is grayed firstly, wherein the grayscale formula is as follows:
grayij=0.3*Rij+0.5*Gij+0.2*Bij
wherein grayijR is the result of graying the RGB values with pixel point positions of i rows and j columnsijIs the value of R with pixel point position i rows and j columns, GijIs the value of G with pixel point position i rows and j columns, BijSetting the pixel point position as the value of B in i rows and j columns; then each escaper can obtain a corresponding grayed data matrix, and the matrix is determined to be Wt*tAt the same time, the matrix C passing the face recognitionn×nGraying to form matrix grayCn×nIn order to analyze the identified matrix with the matrix of the person escaping, the matrix is required to be GrayCn×nConverted into a database matrix W of escaped personst*tThe same size matrix, first determining the grayC in the transformation processn×nThe matrix needs the number and the position of the difference values, wherein the determination method is as follows:
p=t-n
jg=floor(n/p)
qz={jg,2*jg,…,p*jg}
where p is the number of interpolated values required, t is the row/column size of the escaped-person matrix, n is the row/column size of the face-recognized matrix grayC_{n×n}, and floor is the round-down function. The set qz gives the positions of the required interpolations: each value in qz indicates a row and column of grayC_{n×n} after which an interpolated value must be inserted. When interpolating, the corresponding value is first inserted after the column, in the following way:
$$CZ_{x}=\frac{grayC_{x}+grayC_{x+1}}{2}$$

where CZ_x is the new column formed by interpolation after the x-th column and grayC_x is the x-th column of the matrix grayC_{n×n}. After all the columns given by qz have been interpolated, the same interpolation operation is carried out on the corresponding rows, forming the interpolated matrix CZgrayC_{t×t}.
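The graying and size-matching interpolation above can be sketched as below; averaging adjacent columns/rows is an assumption about the unspecified difference formula (with the last column/row duplicated at the boundary), and the function names are illustrative:

```python
import numpy as np

def to_gray(rgb):
    """Weighted graying with the patent's coefficients: 0.3 R + 0.5 G + 0.2 B."""
    return 0.3 * rgb[:, :, 0] + 0.5 * rgb[:, :, 1] + 0.2 * rgb[:, :, 2]

def interpolate_to(gray_c, t):
    """Grow an n x n grayed matrix to t x t by inserting p = t - n
    interpolated columns (then rows) after positions jg, 2*jg, ..., p*jg."""
    n = gray_c.shape[0]
    p = t - n
    if p <= 0:                       # already the target size
        return gray_c
    jg = n // p                      # floor(n / p)
    qz = [k * jg for k in range(1, p + 1)]
    def insert(m, positions):        # insert a new column after each qz position
        cols = []
        for x in range(m.shape[1]):
            cols.append(m[:, x])
            if (x + 1) in positions:
                nxt = m[:, x + 1] if x + 1 < m.shape[1] else m[:, x]
                cols.append((m[:, x] + nxt) / 2)   # assumed averaging rule
        return np.column_stack(cols)
    out = insert(gray_c, qz)          # columns first,
    out = insert(out.T, qz).T         # then the corresponding rows
    return out
```

For example, with n = 4 and t = 6 this gives p = 2, jg = 2, qz = {2, 4}, and the result is a 6×6 matrix.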
The interpolated matrix CZgrayC_{t×t} is then correlated with every matrix in the escaped-person database, computed as follows:

$$\rho_{k}=\frac{\sum_{i=1}^{t}\sum_{j=1}^{t}\left(CZgrayC_{ij}-E[CZgrayC]\right)\left(W_{ij}^{(k)}-E[W^{(k)}]\right)}{\sqrt{\sum_{i=1}^{t}\sum_{j=1}^{t}\left(CZgrayC_{ij}-E[CZgrayC]\right)^{2}\cdot\sum_{i=1}^{t}\sum_{j=1}^{t}\left(W_{ij}^{(k)}-E[W^{(k)}]\right)^{2}}}$$

where ρ_k is the correlation between CZgrayC_{t×t} and the data matrix corresponding to the k-th image in the escaped-person database, t is the number of rows and columns of the matrices, CZgrayC_{t×t} is the grayed and interpolated matrix of the recognized face, CZgrayC_{ij} is the value at row i, column j of CZgrayC_{t×t}, W^{(k)} is the grayed matrix corresponding to the k-th image in the escaped-person database, and E denotes the mathematical expectation. Computing ρ_k against all matrices in the escaped-person database yields the correlation vector

P = (p_1, p_2, p_3, …, p_J)

where p_j = ρ_j. The maximum value in P is then found; if the corresponding p_j for the face image in the escaped-person database that attains this maximum is greater than or equal to 0.95, the person is determined to be that escaped person.
After each escaped-person determination is completed, the selection frame moves with a step of 1; that is, if the current starting position c_11 of the selection matrix C is element a_{ij} of matrix A, the starting position c_11 after one move is

$$c_{11}=\begin{cases}a_{i,\,j+1}, & j+n\le m\\ a_{i+1,\,1}, & \text{otherwise}\end{cases}$$

until the frame moves to c_nn = a_{lm}. After each move is completed, the first image information is recognized and compared against the escaped-person database, so as to determine whether an escaped person appears in the first image information.
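The step-1 frame movement described above amounts to a raster scan over the image, which might be sketched as follows (assuming row-major movement; the names are illustrative and `accept` stands in for the recognition-and-comparison step):

```python
import numpy as np

def scan_image(image, n, accept):
    """Move an n x n selection frame over the image with step 1, row by
    row (c11 = a[i][j+1], wrapping to a[i+1][1] at the end of a row),
    and collect every frame position the `accept` predicate flags."""
    h, w = image.shape[:2]
    hits = []
    for i in range(h - n + 1):        # advance to the next row when the
        for j in range(w - n + 1):    # frame reaches the end of this one
            patch = image[i:i + n, j:j + n]
            if accept(patch):
                hits.append((i, j))
    return hits
```

Each `(i, j)` returned is a starting position c_11 at which the frame's patch passed the supplied test.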
When it is determined that the first image information contains an escaped person, the monitoring terminal transmits alarm information to the vehicle-mounted end through the second wireless communication module; the first wireless communication module of the vehicle-mounted end receives the alarm information transmitted by the monitoring terminal and plays it through the first instruction transmission module.
In one embodiment, the first wireless communication module or the second wireless communication module comprises one or more of a GPRS communication module, a 4G communication module, or an NB-IoT communication module;
the positioning module comprises one or more of a GPS positioning module or a Beidou positioning module.
In one embodiment, the first instruction transmission module or the second instruction transmission module comprises a microphone and a sound player;
the microphone comprises a base and a sound collector; the base is provided with a rotating device, and the sound collector is arranged on the rotating device; the sound collector comprises a plurality of sound collecting devices; a camera device is arranged at the central position of the sound collection equipment;
a controller and a driving motor are further arranged on one side of the rotating device; the controller is connected with the camera device and the driving motor, and the driving motor is connected with the rotating device;
the camera device is used for acquiring the face image information of the environment around the microphone and transmitting the face image information to the controller; the controller controls the driving motor to drive the rotating device to rotate according to the face image information, so that the sound collector faces a user.
In one embodiment, the vehicle-mounted terminal further comprises a vehicle monitoring module; the network side server also comprises a vehicle database;
the vehicle monitoring module comprises a vehicle speed monitoring unit and a fuel consumption monitoring unit; the vehicle speed monitoring unit is used for acquiring the speed information of the urban management special vehicle; the fuel consumption monitoring unit is used for acquiring fuel consumption information of the urban management special vehicle; the vehicle monitoring module is used for transmitting the speed information, the oil consumption information and the vehicle identification information to the network side server through the first wireless communication module; the vehicle database of the network side server is used for storing the speed information, the oil consumption information and the vehicle identification information of the urban management special vehicle received by the second wireless communication module;
the vehicle database also comprises an information receiving unit, an information selecting unit and a folder creating unit; wherein: the information receiving unit is used for receiving the speed information, the oil consumption information and the vehicle identification information; the information selection unit is used for comparing the vehicle identification information with the file names of the folders in the vehicle database, and when the file names of the folders in the vehicle database are the same as the vehicle identification information, the speed information and the fuel consumption information are stored in the folder with the file name which is the same as the vehicle identification information; and when the file name of the file folder in the vehicle database is different from the vehicle identification information, the file folder creating unit creates a new file folder, takes the vehicle identification information as the file name of the new file folder, and stores the speed information and the fuel consumption information into the corresponding file folder.
In one embodiment, the first monitoring unit of the monitoring module comprises a rotating camera;
the rotary camera comprises a base, a rotating shaft, a motor, a rotary platform and a camera; the rotating shaft is inserted into the base, and the motor is arranged in the base and connected with the rotating shaft; the rotating platform is arranged on the rotating shaft; the camera is fixedly arranged on the rotating platform;
the camera comprises a shell, a camera and an illuminating lamp, wherein an installation chamber is arranged on the inner side of the shell, the camera shell is installed at an opening of the installation chamber, a light hole is formed in the camera shell, and the camera is arranged in the installation chamber and aligned with the light hole;
the installation chamber is close to the one end of camera casing is provided with the screens groove, be provided with the tempering translucent cover on the screens groove to with camera casing contacts.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a schematic structural diagram of a smart city management special vehicle monitoring system according to the present invention;
FIG. 2 is a schematic structural diagram of a microphone of a smart city management special vehicle monitoring system according to the present invention;
fig. 3 is a schematic structural diagram of a rotary camera of the smart city management special vehicle monitoring system provided by the invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
The embodiment of the invention provides a smart city management special vehicle monitoring system, which comprises a vehicle-mounted end 11 and a network side server 12, as shown in FIG. 1; wherein:
the vehicle-mounted terminal 11 comprises a first wireless communication module 111, a monitoring module 112, a positioning module 113 and a first instruction transmission module 114;
the network side server 12 comprises a second wireless communication module 121, a monitoring terminal 122, a positioning terminal 123 and a second instruction transmission module 124;
the second instruction transmission module 124 of the network-side server 12 is configured to enable a background worker to input a monitoring route instruction and transmit the monitoring route instruction to the vehicle-mounted terminal 11 through the second wireless communication module 121;
the first instruction transmission module 114 of the vehicle-mounted terminal 11 is configured to play the monitoring route instruction received by the first wireless communication module 111, and a driver of the special urban management vehicle drives the special urban management vehicle according to the played monitoring route instruction to perform patrol monitoring;
the positioning module 113 at the vehicle-mounted end is used for acquiring the position information of the urban management special vehicle;
the monitoring module 112 at the vehicle-mounted end comprises a first monitoring unit and a second monitoring unit; the first monitoring unit is used for acquiring first image information of the surrounding environment of the urban management special vehicle; the second monitoring unit is used for acquiring second image information in the urban management special vehicle;
the vehicle-mounted terminal is used for transmitting the position information, the first image information and the second image information to the network side server 12 through the first wireless communication module 111;
the positioning terminal 123 of the network side server 12 is configured to receive the position information transmitted by the vehicle-mounted terminal through the second wireless communication module 121; the monitoring terminal 122 of the network-side server 12 receives the first image information and the second image information transmitted by the vehicle-mounted terminal through the second wireless communication module 121.
The working principle of the system is as follows: a background worker at the network-side server 12 inputs the monitoring route instruction through the second instruction transmission module 124, which transmits it to the vehicle-mounted end 11 via the second wireless communication module 121; the first wireless communication module 111 of the vehicle-mounted end 11 receives the instruction, which is played through the first instruction transmission module 114; the driver of the urban management special vehicle drives the vehicle according to the monitoring route instruction played by the first instruction transmission module 114; the positioning module 113 and the monitoring module 112 of the vehicle-mounted end 11 acquire the position information of the urban management special vehicle, the first image information of the environment around the vehicle and the second image information inside the vehicle, and transmit them to the network-side server 12 through the first wireless communication module 111; the positioning terminal 123 and the monitoring terminal 122 of the network-side server 12 receive the transmitted position information, first image information and second image information through the second wireless communication module 121.
The beneficial effects of the above system are as follows: the first and second instruction transmission modules enable the network-side server to transmit monitoring route instructions to the vehicle-mounted end, improving the patrol efficiency of the urban management special vehicle; the monitoring module acquires image information of the environment around the urban management special vehicle and of the vehicle interior and transmits it to the network-side server, so that background workers can monitor both in real time; the positioning module enables the network-side server to acquire the position of the urban management special vehicle in real time, making it convenient for background workers to judge whether the vehicle is patrolling according to the monitoring route instruction. Because the system patrols according to route instructions transmitted by the network-side server, it overcomes the drawback of random manual patrol in the traditional technology: by planning the monitoring route, background staff can cover every street in the monitored area. The monitoring module also removes the need for patrol personnel to observe by eye, reducing their workload, effectively improving patrol monitoring efficiency, making the urban management special vehicle multi-functional, and enabling remote monitoring of the covered area.
In one embodiment, the first instruction transmission module is further configured to receive voice information transmitted by a driver, and transmit the voice information to the network-side server through the first wireless communication module;
and the second wireless communication module of the network-side server receives the voice information and plays it through the second instruction transmission module. In this technical scheme, the first instruction transmission module, first wireless communication module, second instruction transmission module, and second wireless communication module together carry voice information between the network-side server and the vehicle-mounted terminal, enabling voice intercom between background staff at the network-side server and staff in the urban management special vehicle.
In one embodiment, the positioning terminal includes a map database and a display; the map database contains an electronic map; the positioning terminal marks the position information transmitted by the vehicle-mounted terminal on the electronic map and shows it on the display. In this technical scheme, because the positioning terminal marks the transmitted position information on the electronic map, the network-side server can track the patrol route of the urban management special vehicle in real time and display it, making it easier for background staff to monitor the vehicle's movement.
In one embodiment, the monitoring terminal further comprises a face recognition module;
the face recognition module comprises a face template database, an image processing unit, an escaped-person database, and an image recognition unit;
the template database is used for storing S face templates of different sizes; according to the pixel value at each position point in the template, each face template forms a pixel matrix $B_{n\times n}$:

$$B_{n\times n} = \begin{pmatrix} b_{11} & b_{12} & \cdots & b_{1n} \\ b_{21} & b_{22} & \cdots & b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{pmatrix}$$

where $B_{n\times n}$ denotes a template of $n\times n$ pixels and $b_{nn}$ is the value of the pixel at row n, column n of the template; each entry $b_{ij}$ is a set of three values (R, G, B);
the image processing unit is used for converting the first image information into a pixel matrix $A$:

$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{l1} & a_{l2} & \cdots & a_{lm} \end{pmatrix}$$

where $a_{lm}$ is the value of the pixel at row l, column m of the first image information, and each $a_{lm}$ is likewise a set of three RGB values. Then, according to the sizes stored in the template database, S selection frames with the same specifications as the templates are determined. Each selection frame extracts data starting from position $a_{11}$ of matrix $A$, one frame-sized block per extraction, forming a matrix $C_{n\times n}$:

$$C_{n\times n} = \begin{pmatrix} c_{11} & \cdots & c_{1n} \\ \vdots & \ddots & \vdots \\ c_{n1} & \cdots & c_{nn} \end{pmatrix}$$

where $c_{nn}$ is the value of the pixel at row n, column n of the matrix selected by the frame, and each $c_{nn}$ is also a set of three RGB values. (For example, if one of the S templates is the $B_{8\times 8}$ template, it has a corresponding 8×8 selection frame; the first extraction takes $a_{11}$ through $a_{88}$, producing an 8×8 matrix $C_{8\times 8}$.) The extracted matrix values are then checked for correlation against the corresponding template values:

$$\rho(B,C) = \frac{\sum_{t=1}^{3}\sum_{i=1}^{n}\sum_{j=1}^{n} \bigl(b_{ij}^{t}-\bar{b}\bigr)\bigl(c_{ij}^{t}-\bar{c}\bigr)}{\sqrt{\sum_{t,i,j}\bigl(b_{ij}^{t}-\bar{b}\bigr)^{2}}\;\sqrt{\sum_{t,i,j}\bigl(c_{ij}^{t}-\bar{c}\bigr)^{2}}}$$

where $b_{ij}^{t}$ is the t-th of the three RGB values of the entry at row i, column j of matrix $B$, $c_{ij}^{t}$ is the corresponding value of matrix $C$, and $\bar{b}$, $\bar{c}$ are the respective means;
if the correlation-check value ρ(B, C) ≥ 0.5, the area selected by the selection frame is identified as a human face, and the matrix is stored;
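The template-matching check above can be sketched in a few lines of Python. This is an illustration only: the patent presents ρ(B, C) as a figure, so a Pearson correlation over the flattened RGB values is assumed here, and the function names (`pearson`, `is_face`) are ours, not the patent's.

```python
import math

def pearson(flat_b, flat_c):
    """Pearson correlation between two equal-length flat value lists."""
    n = len(flat_b)
    mb = sum(flat_b) / n
    mc = sum(flat_c) / n
    num = sum((b - mb) * (c - mc) for b, c in zip(flat_b, flat_c))
    den = math.sqrt(sum((b - mb) ** 2 for b in flat_b)) * \
          math.sqrt(sum((c - mc) ** 2 for c in flat_c))
    return num / den if den else 0.0

def is_face(template, window, threshold=0.5):
    """Flatten the n*n RGB matrices and apply the rho(B, C) >= 0.5 test."""
    flat_b = [v for row in template for px in row for v in px]
    flat_c = [v for row in window for px in row for v in px]
    return pearson(flat_b, flat_c) >= threshold

# A window identical to the template correlates perfectly.
tpl = [[(10, 20, 30), (40, 50, 60)], [(70, 80, 90), (100, 110, 120)]]
print(is_face(tpl, tpl))  # True
```

A window whose pixel values vary oppositely to the template's correlates negatively and is rejected by the same threshold.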
the escaped-person database is used for storing the face images of all escaped persons, all at the same specification;
the image recognition unit is used for extracting the pixels of each face image in the escaped-person database to form a matrix; the pixel count of the face image specification is larger than that of the matrix corresponding to the largest of the S face templates determined above. The pixel matrices of the escaped-person database are first grayed, using the graying formula

$$gray_{ij} = 0.3\,R_{ij} + 0.5\,G_{ij} + 0.2\,B_{ij}$$

where $gray_{ij}$ is the grayed result of the RGB values of the pixel at row i, column j, and $R_{ij}$, $G_{ij}$, $B_{ij}$ are the R, G, and B values of that pixel. Each escaped person thus yields a corresponding grayed data matrix, denoted $W_{t\times t}$. At the same time, the matrix $C_{n\times n}$ that passed face recognition is grayed to form the matrix $grayC_{n\times n}$. To compare the recognized matrix with an escaped-person matrix, $grayC_{n\times n}$ must be converted into a matrix of the same size as the database matrix $W_{t\times t}$; the conversion first determines how many interpolated values $grayC_{n\times n}$ needs and where to insert them:
p=t-n
jg=floor(n/p)
qz={jg,2*jg,…,p*jg}
where p is the number of interpolated values required, t is the row/column size of the escaped-person matrix, n is the row/column size of the face-recognized matrix $grayC_{n\times n}$, and floor is the round-down function; qz gives the positions of the required interpolated values: each value in the set qz indicates a row and column of matrix $grayC_{n\times n}$ after which an interpolated value must be inserted. When interpolating, the corresponding value is inserted after the column first; the interpolation is performed as follows:
$$CZ_{X} = \frac{grayC_{X} + grayC_{X+1}}{2}$$

where $CZ_{X}$ denotes the new column formed by interpolation after the X-th column and $grayC_{X}$ is the X-th column of matrix $grayC_{n\times n}$; after all columns given by qz have been interpolated, the same interpolation operation is applied to the corresponding rows, yielding the interpolated matrix $CZGrayC_{t\times t}$.
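The p/jg/qz computation and the column-then-row insertion can be sketched as follows. The averaged-neighbour interpolation is an assumption (the patent's CZ formula appears only as a figure), `upsample` is an illustrative name, and the sketch assumes t > n so that p ≥ 1.

```python
import math

def upsample(gray, t):
    """Grow an n*n grayscale matrix to t*t by inserting interpolated
    columns, then rows, after the 1-indexed positions listed in qz."""
    n = len(gray)
    p = t - n                       # number of values to interpolate
    jg = math.floor(n / p)          # spacing between insertions
    qz = [k * jg for k in range(1, p + 1)]
    out = [list(row) for row in gray]
    # Insert a column after each position x in qz (average of neighbours;
    # at the right edge the last column is duplicated).
    for off, x in enumerate(qz):
        col = x + off               # account for columns already inserted
        for row in out:
            left = row[col - 1]
            right = row[col] if col < len(row) else left
            row.insert(col, (left + right) / 2)
    # Same operation on the rows.
    for off, x in enumerate(qz):
        r = x + off
        above = out[r - 1]
        below = out[r] if r < len(out) else above
        out.insert(r, [(a + b) / 2 for a, b in zip(above, below)])
    return out

m = [[1.0, 2.0], [3.0, 4.0]]
res = upsample(m, 3)          # 2x2 -> 3x3
print(len(res), len(res[0]))  # 3 3
```

For n = 2 and t = 3, p = 1 and qz = {2}, so one column and one row are appended by duplication, matching the qz positions described above.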
The interpolated grayscale matrix $CZGrayC_{t\times t}$ is then compared, by correlation distance, with every matrix in the escaped-person database:

$$\rho_{(C,W)_k} = \frac{E\bigl[(CZGrayC_{ij} - E[CZGrayC])\,(W_{ij}^{(k)} - E[W^{(k)}])\bigr]}{\sqrt{E\bigl[(CZGrayC_{ij} - E[CZGrayC])^{2}\bigr]}\;\sqrt{E\bigl[(W_{ij}^{(k)} - E[W^{(k)}])^{2}\bigr]}}$$

where $\rho_{(C,W)_k}$ is the correlation between $CZGrayC_{t\times t}$ and the grayed matrix $W^{(k)}$ corresponding to the k-th image in the escaped-person database, t is the matrix row/column size, $CZGrayC_{ij}$ is the value at row i, column j of the interpolated, grayed, recognized matrix, $W_{ij}^{(k)}$ is the corresponding value for the k-th database image, and E denotes mathematical expectation. Computing the correlation of $CZGrayC_{t\times t}$ with every matrix in the escaped-person database yields the correlation vector

$$P = (p_1, p_2, p_3, \ldots, p_k)$$
where $p_k = \rho_{(C,W)_k}$; the maximum value in the correlation vector P is found, and if the face image in the escaped-person database corresponding to that maximum is an escaped-person image and the corresponding $p_k$ ≥ 0.95, the person is determined to be an escaped person;
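The graying-plus-matching pipeline can be sketched under the same assumptions (Pearson correlation, 0.95 cut-off). `to_gray`, `pearson`, and `match_fugitive` are illustrative names, and the database entries below are toy matrices, not real face data.

```python
import math

def to_gray(rgb):
    """gray_ij = 0.3*R_ij + 0.5*G_ij + 0.2*B_ij, applied element-wise."""
    return [[0.3 * r + 0.5 * g + 0.2 * b for (r, g, b) in row] for row in rgb]

def pearson(a, b):
    """Pearson correlation of two equal-length flat value lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)) * \
          math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def match_fugitive(gray_candidate, gray_database, threshold=0.95):
    """Correlate the candidate against every database matrix; return the
    index of the best match, or None if the maximum falls below 0.95."""
    flat_c = [v for row in gray_candidate for v in row]
    scores = [pearson(flat_c, [v for row in w for v in row])
              for w in gray_database]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None

# Toy 2x2 database images, already grayed.
db = [[[10.0, 20.0], [30.0, 40.0]], [[40.0, 30.0], [20.0, 10.0]]]
cand = to_gray([[(10, 10, 10), (20, 20, 20)], [(30, 30, 30), (40, 40, 40)]])
print(match_fugitive(cand, db))  # 0
```

A candidate whose best correlation stays below 0.95 yields `None`, i.e. no escaped-person determination is made.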
After each escaped-person determination is completed, the selection frame moves with a step length of 1: if the initial position $c_{11}$ of the current selection matrix C is $a_{ij}$ of matrix A, then after one move the new initial position $c_{11}$ is

$$c_{11} = \begin{cases} a_{i,\,j+1}, & j+1 \le m-n+1 \\ a_{i+1,\,1}, & j+1 > m-n+1 \end{cases}$$

until moving to $c_{nn} = a_{lm}$; after each move, the first image information is recognized and compared against the escaped-person database again, thereby determining whether any escaped person appears in the first image information;
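The step-length-1 sweep of an n×n frame over an l×m image visits (l−n+1)·(m−n+1) positions. A minimal generator for those positions, assuming the frame advances one column at a time and wraps to the next row at the image edge:

```python
def scan_positions(l, m, n):
    """Yield the top-left (i, j) of every n*n window when the selection
    frame moves with step length 1 over an l*m image, row by row."""
    for i in range(l - n + 1):
        for j in range(m - n + 1):
            yield (i, j)

pos = list(scan_positions(4, 4, 3))
print(pos)  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Each of the S selection frames runs this sweep with its own n, so every frame-sized region of the picture is examined exactly once.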
When it is determined that an escaped person appears in the first image information, the monitoring terminal transmits alarm information to the vehicle-mounted terminal through the second wireless communication module; the first wireless communication module of the vehicle-mounted end receives the alarm information and plays it through the first instruction transmission module. In this technical scheme, S face templates of different sizes are used when recognizing the first image information, so faces of different sizes that may appear in the image can all be recognized without enlarging or reducing the first image information, which speeds up processing. When computing the escaped-person correlation, the grayed values are used, which greatly improves the efficiency of the correlation calculation. Because each selection frame advances with a step length of 1 after every selection, the S selection frames together cover every region of the picture at each frame's specification. All selections during image recognition are made autonomously by the computer, without manual intervention, achieving a high degree of intelligence. The face recognition module thus automatically recognizes and compares faces around the urban management special vehicle, and when an escaped person is determined to appear in the first image information, alarm information is transmitted to the vehicle-mounted end, further improving patrol efficiency and effectively safeguarding nearby residents.
In one embodiment, the first wireless communication module or the second wireless communication module comprises one or more of a GPRS communication module, a 4G communication module, or an NB-IoT communication module; in this technical scheme, information transmission between the network-side server and the vehicle-mounted terminal is realized through multiple communication modes.
The positioning module comprises one or more of a GPS positioning module or a Beidou positioning module; in this technical scheme, the positioning function of the vehicle-mounted end is realized through multiple positioning devices.
In one embodiment, the first instruction transmission module or the second instruction transmission module comprises a microphone and a sound player;
a microphone, as shown in fig. 2, including a base 21 and a sound collector 22; the base 21 is provided with a rotating device 23, and the sound collector 22 is arranged on the rotating device 23; a sound collector 22 including a plurality of sound collection devices 24; an image pickup device 25 is provided at a central position of the plurality of sound collection apparatuses 24;
a controller 26 and a driving motor 27 are arranged on one side of the rotating device 23; the controller 26 is connected with the camera device 25 and the driving motor 27, and the driving motor 27 is connected with the rotating device 23;
the camera device 25 is used for acquiring face image information of the surrounding environment of the microphone and transmitting the face image information to the controller 26; the controller 26 controls the driving motor 27 to drive the rotation device 23 to rotate according to the face image information so that the sound collector 22 faces the user. In the above technical solution, the camera device 25 is used to obtain the face image information of the surrounding environment of the microphone, and the controller 26 adjusts the rotating device 23 according to the obtained face image information, so that the sound collector 22 faces the user; by the technical scheme, the microphone can automatically adjust the orientation of the sound collector, the sound receiving efficiency of the microphone is effectively improved, and the quality of the voice information acquired by the system is further improved.
In one embodiment, the vehicle-mounted terminal further comprises a vehicle monitoring module; the network side server also comprises a vehicle database;
the vehicle monitoring module comprises a vehicle speed monitoring unit and a fuel consumption monitoring unit; the vehicle speed monitoring unit is used for acquiring speed information of the urban management special vehicle; the fuel consumption monitoring unit is used for acquiring fuel consumption information of the city management special vehicle; the vehicle monitoring module is used for transmitting the speed information, the oil consumption information and the vehicle identification information to the network side server through the first wireless communication module; the vehicle database of the network side server is used for storing the speed information, the oil consumption information and the vehicle identification information of the urban management special vehicle received by the second wireless communication module;
the vehicle database also comprises an information receiving unit, an information selecting unit and a folder creating unit; wherein: the information receiving unit is used for receiving speed information, oil consumption information and vehicle identification information; the information selection unit is used for comparing the vehicle identification information with the file names of the folders in the vehicle database, and when the file names of the folders in the vehicle database are the same as the vehicle identification information, the speed information and the fuel consumption information are stored in the folder with the file name which is the same as the vehicle identification information; when the file name of the folder in the vehicle database is different from the vehicle identification information, the folder creating unit creates a new folder, takes the vehicle identification information as the file name of the new folder, and stores the speed information and the fuel consumption information into the corresponding folders. 
In this technical scheme, the vehicle monitoring module acquires the speed and fuel-consumption information of the urban management special vehicle and transmits it, together with the vehicle identification information, to the vehicle database of the network-side server for storage. The vehicle database receives the three kinds of information through the information receiving unit; the information selection unit then compares the folder names in the vehicle database with the vehicle identification information. When a folder name matches the vehicle identification information, the speed and fuel-consumption information are stored in that folder; otherwise, the folder creation unit creates a new folder named after the vehicle identification information and stores the speed and fuel-consumption information there. This scheme acquires the speed and fuel-consumption information of the urban management special vehicle and automatically classifies and stores it, making it convenient for background staff to monitor and manage how each vehicle is used.
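The compare-or-create folder logic of the information selection and folder creation units can be sketched as follows; `store_record`, the `records.csv` file name, and the vehicle IDs are illustrative assumptions, not part of the patent.

```python
from pathlib import Path
import tempfile

def store_record(db_root, vehicle_id, speed, fuel):
    """Append a speed/fuel record under a folder named after the vehicle
    identification information, creating the folder the first time that
    ID is seen (a sketch of the selection and folder-creation units)."""
    folder = Path(db_root) / vehicle_id
    folder.mkdir(parents=True, exist_ok=True)   # create only when missing
    with open(folder / "records.csv", "a", encoding="utf-8") as f:
        f.write(f"{speed},{fuel}\n")
    return folder

root = tempfile.mkdtemp()
store_record(root, "CG-0001", 42.5, 8.1)
store_record(root, "CG-0001", 38.0, 7.9)   # same ID reuses the folder
print(sorted(p.name for p in Path(root).iterdir()))  # ['CG-0001']
```

`mkdir(exist_ok=True)` collapses the patent's "compare file names, then create if absent" into one idempotent call while preserving the same one-folder-per-vehicle layout.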
In one embodiment, a first monitoring unit of a monitoring module includes a rotating camera;
the rotary camera, as shown in fig. 3, includes a base 31, a rotating shaft 32, a motor 33, a rotary platform 34 and a camera 35; the rotating shaft 32 is inserted into the base 31, and the base 31 is also internally provided with a motor 33 which is connected with the rotating shaft 32; the rotary platform 34 is arranged on the rotating shaft 32; a camera 35 is fixedly arranged on the rotary platform 34;
the camera 35 comprises a shell 351, a camera shell 352, a camera 353 and an illuminating lamp 354, wherein the inner side of the shell 351 is provided with a mounting chamber 355, the camera shell 352 is mounted at the opening of the mounting chamber 355, a light hole 356 is formed in the camera shell 352, and the camera 353 is arranged in the mounting chamber 355 and aligned with the light hole 356;
one end of the installation chamber 355 close to the camera shell 352 is provided with a clamping groove 357, and the clamping groove 357 is provided with a toughened transparent cover 358 and is in contact with the camera shell 352. The rotating function of the rotating camera is realized through the connection of the motor 33 and the rotating shaft 32 in the technical scheme, the mounting chamber 355 is arranged on the inner side of the shell 351, and the camera shell 352 is provided with the light hole 356, so that the camera 353 mounted in the mounting chamber 355 is communicated with the outside through the light hole 356, and the monitoring and the camera shooting of the surrounding environment of the urban management special vehicle are realized. Through set up first screens groove 357 in installation room 355 for tempering translucent cover 358 can inlay and locate in first screens groove 357, and contact with camera casing 352, thereby realized the sealed to the opening part of installation room 355, and through tempering translucent cover 358's setting, played the guard action to camera 353 in installation room 355.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (7)

1. A smart city management special vehicle monitoring system is characterized by comprising a vehicle-mounted end and a network side server; wherein the content of the first and second substances,
the vehicle-mounted end comprises a first wireless communication module, a monitoring module, a positioning module and a first instruction transmission module;
the network side server comprises a second wireless communication module, a monitoring terminal, a positioning terminal and a second instruction transmission module;
the second instruction transmission module of the network side server is used for a background worker to input a monitoring route instruction and transmit the monitoring route instruction to the vehicle-mounted end through the second wireless communication module;
the first instruction transmission module of the vehicle-mounted end is used for playing the monitoring route instruction received by the first wireless communication module, and a driver of the urban management special vehicle drives the urban management special vehicle to patrol and monitor according to the played monitoring route instruction;
the positioning module of the vehicle-mounted end is used for acquiring the position information of the urban management special vehicle;
the monitoring module of the vehicle-mounted end comprises a first monitoring unit and a second monitoring unit; the first monitoring unit is used for acquiring first image information of the surrounding environment of the urban management special vehicle; the second monitoring unit is used for acquiring second image information in the urban management special vehicle;
the vehicle-mounted terminal is used for transmitting the position information, the first image information and the second image information to the network side server through the first wireless communication module;
the positioning terminal of the network side server is used for receiving the position information transmitted by the vehicle-mounted terminal through the second wireless communication module; the monitoring terminal of the network side server receives the first image information and the second image information transmitted by the vehicle-mounted terminal through the second wireless communication module;
the monitoring terminal also comprises a face recognition module;
the face recognition module comprises a template database, an image processing unit, an escaped-person database, and an image recognition unit;
the template database is used for storing S face templates of different sizes; according to the pixel value at each position point in the template, each face template forms a pixel matrix $B_{n\times n}$:

$$B_{n\times n} = \begin{pmatrix} b_{11} & b_{12} & \cdots & b_{1n} \\ b_{21} & b_{22} & \cdots & b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{pmatrix}$$

where $B_{n\times n}$ denotes a template of n×n pixels and $b_{nn}$ is the value of the pixel at row n, column n of the template; the image processing unit is used for converting the first image information into a pixel matrix A:
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{l1} & a_{l2} & \cdots & a_{lm} \end{pmatrix}$$

where $a_{lm}$ is the value of the pixel at row l, column m of the first image information; according to the sizes stored in the template database, S selection frames with the same specifications as the templates are determined, each selection frame extracting data starting from position $a_{11}$ of matrix A, one frame-sized block per extraction, to form a matrix $C_{n\times n}$:

$$C_{n\times n} = \begin{pmatrix} c_{11} & \cdots & c_{1n} \\ \vdots & \ddots & \vdots \\ c_{n1} & \cdots & c_{nn} \end{pmatrix}$$

where $c_{nn}$ is the value of the pixel at row n, column n of the matrix selected by the selection frame; the correlation between the extracted matrix values and the corresponding template values is then checked as follows:
$$\rho(B,C) = \frac{\sum_{t=1}^{3}\sum_{i=1}^{n}\sum_{j=1}^{n} \bigl(b_{ij}^{t}-\bar{b}\bigr)\bigl(c_{ij}^{t}-\bar{c}\bigr)}{\sqrt{\sum_{t,i,j}\bigl(b_{ij}^{t}-\bar{b}\bigr)^{2}}\;\sqrt{\sum_{t,i,j}\bigl(c_{ij}^{t}-\bar{c}\bigr)^{2}}}$$

where $b_{ij}^{t}$ is the t-th of the three RGB values of the entry at row i, column j of matrix B, $c_{ij}^{t}$ is the corresponding value of matrix C, and $\bar{b}$, $\bar{c}$ are the respective means;
if the correlation-check value ρ(B, C) ≥ 0.5, the area selected by the selection frame is identified as a human face, and the matrix is stored;
the escaped-person database is used for storing the face images of all escaped persons, all at the same specification;
the image recognition unit is used for extracting the pixels of each face image in the escaped-person database to form a matrix, the pixel count of the face image specification being larger than that of the matrix corresponding to the largest of the S face templates; the pixel matrices of the escaped-person database are first grayed using the graying formula

$$gray_{ij} = 0.3\,R_{ij} + 0.5\,G_{ij} + 0.2\,B_{ij}$$

where $gray_{ij}$ is the grayed result of the RGB values of the pixel at row i, column j, and $R_{ij}$, $G_{ij}$, $B_{ij}$ are the R, G, and B values of that pixel; each escaped person thus yields a corresponding grayed data matrix, denoted $W_{t\times t}$; at the same time, the matrix $C_{n\times n}$ that passed face recognition is grayed to form the matrix $grayC_{n\times n}$; to compare the recognized matrix with an escaped-person matrix, $grayC_{n\times n}$ must be converted into a matrix of the same size as the database matrix $W_{t\times t}$, which first requires determining how many interpolated values $grayC_{n\times n}$ needs and where to insert them:
p=t-n
jg=floor(n/p)
qz={jg,2*jg,…,p*jg}
where p is the number of interpolated values required, t is the row/column size of the escaped-person matrix, n is the row/column size of the face-recognized matrix $grayC_{n\times n}$, and floor is the round-down function; qz gives the positions of the required interpolated values: each value in the set qz indicates a row and column of matrix $grayC_{n\times n}$ after which an interpolated value must be inserted; when interpolating, the corresponding value is inserted after the column first, the interpolation being performed as follows:
$$CZ_{X} = \frac{grayC_{X} + grayC_{X+1}}{2}$$

where $CZ_{X}$ denotes the new column formed by interpolation after the X-th column and $grayC_{X}$ is the X-th column of matrix $grayC_{n\times n}$; after all columns given by qz have been interpolated, the same interpolation operation is applied to the corresponding rows, yielding the interpolated matrix $CZGrayC_{t\times t}$;
the interpolated matrix $CZGrayC_{t\times t}$ is compared, by correlation distance, with every matrix in the escaped-person database:

$$\rho_{(C,W)_k} = \frac{E\bigl[(CZGrayC_{ij} - E[CZGrayC])\,(W_{ij}^{(k)} - E[W^{(k)}])\bigr]}{\sqrt{E\bigl[(CZGrayC_{ij} - E[CZGrayC])^{2}\bigr]}\;\sqrt{E\bigl[(W_{ij}^{(k)} - E[W^{(k)}])^{2}\bigr]}}$$

where $\rho_{(C,W)_k}$ is the correlation between $CZGrayC_{t\times t}$ and the grayed matrix $W^{(k)}$ corresponding to the k-th image in the escaped-person database, t is the matrix row/column size, $CZGrayC_{ij}$ is the value at row i, column j of the interpolated matrix, $W_{ij}^{(k)}$ is the corresponding value of the k-th database image, and E denotes mathematical expectation; computing the correlation of $CZGrayC_{t\times t}$ with every matrix in the escaped-person database yields the correlation vector
P=(p1,p2,p3…pk)
where $p_k = \rho_{(C,W)_k}$; the maximum value in the correlation vector P is found, and if the face image in the escaped-person database corresponding to that maximum is an escaped-person image and the corresponding $p_k$ ≥ 0.95, the person is determined to be an escaped person;
after each escaped-person determination is completed, the selection frame moves with a step length of 1: if the initial position $c_{11}$ of the current selection matrix C is $a_{ij}$ of matrix A, then after one move the new initial position $c_{11}$ is

$$c_{11} = \begin{cases} a_{i,\,j+1}, & j+1 \le m-n+1 \\ a_{i+1,\,1}, & j+1 > m-n+1 \end{cases}$$

until moving to $c_{nn} = a_{lm}$; after each move, the first image information is recognized and compared against the escaped-person database again, thereby determining whether any escaped person appears in the first image information;
when it is determined that the first image information contains the escaped person, the monitoring terminal transmits alarm information to the vehicle-mounted terminal through the second wireless communication module; and the first wireless communication module of the vehicle-mounted end receives the alarm information transmitted by the monitoring terminal and plays the alarm information through the first instruction transmission module.
2. The system of claim 1,
the first instruction transmission module is further used for receiving voice information transmitted by a driver and transmitting the voice information to the network side server through the first wireless communication module;
and the second wireless communication module of the network side server receives the voice information and plays the voice information through the second instruction transmission module.
3. The system of claim 1,
the positioning terminal comprises a map database and a display; the map database comprises an electronic map; and the positioning terminal is used for marking the position information transmitted by the vehicle-mounted terminal on the electronic map and displaying the position information through the display.
4. The system of claim 1,
the first wireless communication module or the second wireless communication module comprises one or more of a GPRS communication module, a 4G communication module, or an NB-IoT communication module;
the positioning module comprises one or more of a GPS positioning module or a Beidou positioning module.
5. The system of claim 1,
the first instruction transmission module or the second instruction transmission module comprises a microphone and a sound player;
the microphone comprises a base (21) and a sound collector (22); a rotating device (23) is arranged on the base (21), and the sound collector (22) is arranged on the rotating device (23); the sound collector (22) comprising a plurality of sound collecting devices (24); an image pickup device (25) is arranged at the central position of the sound collection devices (24);
a controller (26) and a driving motor (27) are further arranged on one side of the rotating device (23); the controller (26) is connected with the image pickup device (25) and the driving motor (27), and the driving motor (27) is connected with the rotating device (23);
the camera device (25) is used for acquiring face image information of the environment around the microphone and transmitting the face image information to the controller (26); the controller (26) controls the driving motor (27) to drive the rotating device (23) to rotate according to the face image information, so that the sound collector (22) faces to the user.
6. The system of claim 1,
the vehicle-mounted end also comprises a vehicle monitoring module; the network side server also comprises a vehicle database;
the vehicle monitoring module comprises a vehicle speed monitoring unit and a fuel consumption monitoring unit; the vehicle speed monitoring unit is used for acquiring the speed information of the urban management special vehicle; the fuel consumption monitoring unit is used for acquiring fuel consumption information of the urban management special vehicle; the vehicle monitoring module is used for transmitting the speed information, the oil consumption information and the vehicle identification information to the network side server through the first wireless communication module; the vehicle database of the network side server is used for storing the speed information, the oil consumption information and the vehicle identification information of the urban management special vehicle received by the second wireless communication module;
the vehicle database also comprises an information receiving unit, an information selecting unit and a folder creating unit; wherein: the information receiving unit is used for receiving the speed information, the oil consumption information and the vehicle identification information; the information selection unit is used for comparing the vehicle identification information with the file names of the folders in the vehicle database, and when the file names of the folders in the vehicle database are the same as the vehicle identification information, the speed information and the fuel consumption information are stored in the folder with the file name which is the same as the vehicle identification information; and when the file name of the file folder in the vehicle database is different from the vehicle identification information, the file folder creating unit creates a new file folder, takes the vehicle identification information as the file name of the new file folder, and stores the speed information and the fuel consumption information into the corresponding file folder.
7. The system of claim 1,
the first monitoring unit of the monitoring module comprises a rotary camera;
the rotary camera comprises a base (31), a rotating shaft (32), a motor (33), a rotating platform (34) and a camera assembly (35); the rotating shaft (32) is inserted into the base (31), and the motor (33) is arranged in the base (31) and connected with the rotating shaft (32); the rotating platform (34) is mounted on the rotating shaft (32); the camera assembly (35) is fixedly mounted on the rotating platform (34);
the camera assembly (35) comprises a housing (351), a camera housing (352), a camera (353) and an illuminating lamp (354); an installation chamber (355) is provided on the inner side of the housing (351), the camera housing (352) is mounted at an opening of the installation chamber (355), a light hole (356) is formed in the camera housing (352), and the camera (353) is arranged in the installation chamber (355) and aligned with the light hole (356);
a clamping groove (357) is provided at the end of the installation chamber (355) close to the camera housing (352), and a toughened transparent cover (358) is fitted in the clamping groove (357) in contact with the camera housing (352).
CN201910595539.9A 2019-07-03 2019-07-03 Intelligent city management special vehicle monitoring system Active CN110336873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910595539.9A CN110336873B (en) 2019-07-03 2019-07-03 Intelligent city management special vehicle monitoring system

Publications (2)

Publication Number Publication Date
CN110336873A CN110336873A (en) 2019-10-15
CN110336873B true CN110336873B (en) 2021-10-08

Family

ID=68143860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910595539.9A Active CN110336873B (en) 2019-07-03 2019-07-03 Intelligent city management special vehicle monitoring system

Country Status (1)

Country Link
CN (1) CN110336873B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106235480A (en) * 2016-09-14 2016-12-21 深圳市喜悦智慧数据有限公司 Intelligent police uniform
CN106851225A (en) * 2017-03-29 2017-06-13 济南智安科技发展有限公司 Law-enforcing recorder, method and law enforcement record system with flash appeal function

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190012857A (en) * 2017-07-28 2019-02-11 엘지전자 주식회사 Mobile terminal and method for controlling the same



Similar Documents

Publication Publication Date Title
CN108510750A Neural-network-based method for unmanned aerial vehicle inspection of parking violations
CN107241572A Student practical-training video tracking and evaluation system
CN102053563A (en) Flight training data acquisition and quality evaluation system of analog machine
CN107133611B Classroom student head-up rate recognition and statistics method and device
CN112287827A (en) Complex environment pedestrian mask wearing detection method and system based on intelligent lamp pole
CN112270659A (en) Rapid detection method and system for surface defects of pole piece of power battery
CN111024695A (en) All-in-one AI intelligent water environment-friendly real-time monitoring system
CN110336873B (en) Intelligent city management special vehicle monitoring system
CN114550334A (en) Bridge robot inspection teaching training system, method and storage medium
CN110351268B (en) Digital law enforcement system for smart city
CN114067396A (en) Vision learning-based digital management system and method for live-in project field test
CN113822145A (en) Face recognition operation method based on deep learning
CN112954207A (en) Driving landscape snapshot method and device and automobile central console
CN116059601B (en) Assessment training system based on intelligent sensing technology
CN116168346B (en) Remote accompanying-reading monitoring system based on student behavior capturing
CN116893386A (en) Electric energy meter mounting process detection device and method based on deep learning image recognition
CN110889964A (en) Fake-licensed vehicle accurate alarm system and method based on electronic license plate
CN206948499U Monitoring and evaluation system for student practical-training video tracking
CN116110002A (en) Small animal driving method and system based on image recognition
CN115862172A Attendance system with expression recognition
CN108364245A Anti-cheating management system for examination halls
CN111738884A (en) Student behavior diagnosis and management method based on intelligent campus student position information
CN116363575B Classroom monitoring and management system based on smart campus
CN112990869B (en) Intelligent student attendance management device
CN116844249A (en) Face recognition-based card punching management method and card punching equipment thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant