CN112118427A - Monitoring method, system, server and computer storage medium - Google Patents

Monitoring method, system, server and computer storage medium

Info

Publication number
CN112118427A
CN112118427A
Authority
CN
China
Prior art keywords
target object
positioning
camera
monitoring
preset
Prior art date
Legal status
Granted
Application number
CN202011179823.7A
Other languages
Chinese (zh)
Other versions
CN112118427B (en)
Inventor
王岩
Current Assignee
Shanghai Qinggan Intelligent Technology Co Ltd
Original Assignee
Shanghai Qinggan Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Qinggan Intelligent Technology Co Ltd filed Critical Shanghai Qinggan Intelligent Technology Co Ltd
Priority to CN202011179823.7A
Publication of CN112118427A
Application granted
Publication of CN112118427B
Active (current legal status)
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Abstract

The invention discloses a monitoring method comprising the following steps: acquiring displacement information of a plurality of positioning tags in a carriage through an ultra-wideband positioning base station; when the displacement information meets a preset displacement condition, starting a camera to follow and photograph the target object corresponding to the positioning tag; and generating monitoring data of the target object and executing corresponding monitoring measures according to the monitoring data. By combining ultra-wideband high-precision positioning with camera capture, the target object in the carriage can be accurately located and identified, improving the efficiency of intelligent security early warning.

Description

Monitoring method, system, server and computer storage medium
Technical Field
The present invention relates to the field of monitoring technologies, and in particular, to a monitoring method, a monitoring system, a server, and a computer storage medium.
Background
In the prior art, suspicious persons passing through public transport monitored by cameras cannot be screened and accurately located in time. Meanwhile, indoor positioning faces a complex propagation environment: because room layouts and building materials differ, the propagation loss of signals fluctuates widely as the environment changes. A room typically also contains many obstacles, so signal propagation undergoes reflection, diffraction, refraction and scattering, which produces dense multipath reception and large variations in signal amplitude, phase and arrival time. As a result, monitoring lacks focus and is inefficient.
Disclosure of Invention
The invention aims to provide a monitoring method, a monitoring system, a server and a computer storage medium that can accurately locate and identify a target object in a carriage and improve the efficiency of intelligent security early warning.
In order to solve the above technical problem, the present application provides a monitoring method, including:
acquiring displacement information of a plurality of positioning tags in a carriage through an ultra-wideband positioning base station;
when the displacement information meets a preset displacement condition, starting a camera to follow and photograph the target object corresponding to the positioning tag;
and generating monitoring data of the target object, and executing corresponding monitoring measures according to the monitoring data.
Wherein the starting of the camera to follow and photograph the target object corresponding to the positioning tag comprises the following steps:
controlling the shooting angle of the camera according to the position information of the target object so as to capture the target object;
and following the target object and acquiring image information of the target object, wherein the image information comprises multi-angle images of the target object.
Wherein the following of the target object and the acquisition of its image information comprise the following steps:
when the target object moves beyond the shooting range of the camera but has not left the carriage, updating the displacement information of the target object according to its positioning tag, and starting the camera corresponding to the updated displacement information to follow and photograph the target object; and/or,
switching to a camera adjacent to the current camera according to the moving direction of the target object to continue the follow-shooting.
Wherein the starting of the camera to follow and photograph the target object corresponding to the positioning tag further comprises the following steps:
sending a request instruction of a target object corresponding to the positioning tag to a video acquisition system, wherein the request instruction comprises displacement information of the positioning tag;
receiving return data of the video acquisition system, wherein the return data comprises image information of the target object.
Wherein the displacement information satisfying a preset displacement condition includes at least one of the following:
the moving speed of the positioning tag is greater than or equal to a preset speed;
the track length of the positioning tag is greater than or equal to a preset length;
the positioning tag is located in a preset area;
the ratio of the time of the positioning tag in the moving state to the total time of the positioning tag in the compartment is greater than or equal to a preset threshold value.
The generating of the monitoring data of the target object comprises the following steps:
uploading the image information of the target object to a database, and searching identity information matched with the image information;
and generating monitoring data of the target object, wherein the monitoring data comprises at least one of image information, identity information, displacement information and positioning tag information.
Wherein the executing of corresponding monitoring measures according to the monitoring data comprises at least one of the following steps:
when the target object leaves the carriage, releasing the follow-shooting of the target object; or
when the target object is located in the preset area, broadcasting a reminder for the target object to leave the preset position; or
when the moving speed of the positioning tag is greater than a preset speed, and/or the track length of the positioning tag is greater than a preset length, and/or the ratio of the time the positioning tag is in the moving state to its total time in the carriage is greater than or equal to a preset threshold value, marking the target object corresponding to the positioning tag.
After marking the target object corresponding to the positioning tag, the method further includes:
storing the positioning tag and/or image information of the marked target object;
and starting a camera to follow and photograph the target object when the positioning tag of the marked target object is acquired again in the carriage and/or the image information of the marked target object is captured again.
The application also provides a monitoring system, which comprises an ultra-wideband positioning device, a camera and a controller;
the ultra-wideband positioning device is used for acquiring displacement information of a plurality of positioning tags in the carriage through the ultra-wideband positioning base station;
the camera is used for following and photographing the target object corresponding to the positioning tag;
and the controller is used for generating monitoring data of the target object and executing corresponding monitoring measures according to the monitoring data.
The present application further provides a server, comprising:
at least one processor;
at least one memory coupled to the at least one processor and storing instructions for execution by the at least one processor, the instructions when executed by the at least one processor causing the device to perform a monitoring method as described above.
The present application further provides a computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement the monitoring method as described above.
The monitoring method, the monitoring system, the server and the computer storage medium acquire displacement information of a plurality of positioning tags in a carriage through an ultra-wideband positioning base station; when the displacement information meets a preset displacement condition, start a camera to follow and photograph the target object corresponding to the positioning tag; and generate monitoring data of the target object and execute corresponding monitoring measures according to the monitoring data. By combining ultra-wideband high-precision positioning with camera capture, the target object in the carriage can be accurately located and identified, improving the efficiency of intelligent security early warning.
The foregoing description is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer, so that it can be implemented in accordance with the content of the description, and to make the above and other objects, features and advantages of the present application easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is an application environment diagram of a monitoring method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart diagram illustrating a monitoring method according to an embodiment of the present invention;
FIG. 3 is a block diagram illustrating a monitoring system according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a server according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present application is provided for illustrative purposes, and other advantages and capabilities of the present application will become apparent to those skilled in the art from the present disclosure.
In the following description, reference is made to the accompanying drawings that describe several embodiments of the application. It is to be understood that other embodiments may be utilized and that mechanical, structural, electrical, and operational changes may be made without departing from the spirit and scope of the present application. The following detailed description is not to be taken in a limiting sense, and the scope of embodiments of the present application is defined only by the claims of the issued patent. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
Although the terms first, second, etc. may be used herein to describe various elements in some instances, these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
Fig. 1 is a schematic diagram of the application environment of a monitoring method according to an embodiment of the present invention, including an ultra-wideband positioning base station 11, a camera 12, a controller 13, and a mobile terminal 14 carrying an ultra-wideband positioning tag. It should be understood that the numbers of ultra-wideband positioning base stations 11, cameras 12, controllers 13 and mobile terminals 14 and their mounting locations in fig. 1 are merely illustrative; any number of them may be deployed as the implementation requires. Data communication is possible between the ultra-wideband positioning base station 11, the camera 12, the controller 13 and the mobile terminal 14. After the ultra-wideband positioning base station 11 collects the positions of the mobile terminals 14 carrying positioning tags in the carriage, the controller 13 controls the camera 12 to capture the target object corresponding to each positioning tag, so that the target object is accurately located and identified and the security level in the carriage is improved.
Fig. 2 is a flow chart diagram illustrating a monitoring method according to an embodiment of the present invention. As shown in fig. 2, a monitoring method provided for an embodiment of the present invention includes:
step 201: acquiring displacement information of a plurality of positioning tags in a carriage through an ultra-wideband positioning base station;
step 202: when the displacement information meets a preset displacement condition, starting a camera to follow and photograph the target object corresponding to the positioning tag;
step 203: and generating monitoring data of the target object, and executing corresponding monitoring measures according to the monitoring data.
The embodiment of the invention is based on UWB (Ultra Wide Band) positioning technology: an ultra-wideband base station is arranged in the carriage, and the target object carries a positioning tag. When the tag transmits an ultra-wideband signal, the base station receives it and forwards the information to the server through a network cable or a WIFI network, and the position of each positioning tag is displayed in real time. The positioning tags can move freely through each unit, and the positioning platform software analyses the data and renders the positioning targets as a virtual, dynamic three-dimensional view. Because the UWB signal bandwidth is large, multipath components are easy to separate, the anti-fading performance is good, and high positioning precision can be achieved. The embodiment of the invention can be applied to vehicles with unstable signals, such as subways, high-speed rail and trains. UWB positioning tags are installed on the mobile phones of passengers in the carriage, and the UWB positioning base station acquires displacement information of the UWB positioning tags so as to monitor the passengers in the carriage. Abnormal passenger behaviour is judged from the displacement information, for example whether a passenger walks back and forth between carriages, and corresponding security work is then carried out in time according to the passenger's behaviour, protecting passenger safety while preventing illegal and criminal acts. For example, when a passenger walks back and forth through several carriages and the movement track is suspicious, a camera is started to identify and photograph the target passenger, and the suspicious behaviour track and the passenger's photo are uploaded so that security personnel can take effective monitoring measures against the target passenger in time; the monitoring is released once the target passenger leaves the carriage. In this way illegal and criminal behaviour in the carriage can be effectively prevented.
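For illustration only, and not as part of the claimed method, the monitoring loop described above can be sketched in Python as follows; the data fields and the read_displacements/follow calls are hypothetical placeholders standing in for the UWB base station and camera interfaces.

```python
# Illustrative sketch only; the base-station and camera interfaces
# (read_displacements, follow) are hypothetical placeholders.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TagDisplacement:
    tag_id: str
    position: Tuple[float, float]  # (x, y) inside the carriage, in metres
    speed: float                   # current moving speed, in m/s
    track_length: float            # length of the walked track, in metres
    moving_ratio: float            # time spent moving / total time in the carriage

def monitor_once(base_station, camera_ctrl, condition) -> None:
    """One monitoring pass: read each tag's displacement, test the preset
    displacement condition, and start follow-shooting of the matching passenger."""
    for disp in base_station.read_displacements():          # hypothetical UWB API
        if condition(disp):                                  # preset displacement condition
            camera_ctrl.follow(disp.tag_id, disp.position)   # hypothetical camera API
```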
In step 202, when the camera is started to follow and photograph the target object corresponding to the positioning tag, the shooting angle of the camera is controlled according to the position information of the target object so as to capture it: first the shooting range is determined from the position information of the positioning tag acquired by UWB, and then the shooting angle of the camera covering that range is adjusted to capture and identify the target object. The target object is then followed and its image information is collected; during this period the shooting angle of the camera can be adjusted continuously to collect multi-angle images of the target object, ensuring that the collected image information accurately reflects the appearance characteristics of the target object.
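A minimal sketch of the angle control mentioned above, assuming a flat two-dimensional carriage coordinate system; the patent does not prescribe a particular formula, so this is only one possible way to aim the camera at the tag position.

```python
import math

def pan_angle_to_target(camera_pos, target_pos):
    """Pan angle (in degrees) that points the camera at the tag position,
    using simple 2-D geometry in carriage coordinates (an assumption)."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    return math.degrees(math.atan2(dy, dx))
```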
When the target object moves beyond the shooting range of the current camera but the UWB positioning base station can still acquire its positioning tag in the carriage, indicating that the target object has not left the carriage, the displacement information of the target object can be updated according to its positioning tag, and the camera corresponding to the updated displacement information is then started to follow and photograph the target object. In other embodiments, shooting may be switched to a camera adjacent to the current camera according to the moving direction of the target object. For example, when the target object moves from the fifth carriage to the fourth carriage and the camera in the fifth carriage can no longer see it, shooting automatically switches to the camera in the fourth carriage, guaranteeing continuous coverage of the target object.
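The camera hand-over just described could be sketched as below; the covers and neighbour methods are hypothetical helpers for the coverage test and the adjacency lookup between carriages.

```python
def pick_camera(cameras, target_pos, current=None, moving_direction=None):
    """Choose the camera whose field of view covers the updated tag position;
    if none does, fall back to the neighbour of the current camera that lies
    in the target's moving direction (e.g. when crossing into the next carriage)."""
    for cam in cameras:
        if cam.covers(target_pos):                     # hypothetical coverage test
            return cam
    if current is not None and moving_direction is not None:
        return current.neighbour(moving_direction)     # hypothetical adjacency lookup
    return None
```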
In an embodiment, when the camera is started to follow and photograph the target object corresponding to the positioning tag, a request instruction for the target object corresponding to the positioning tag can be sent to a video acquisition system, such as a subway monitoring system or a railway system; the request instruction includes the displacement information of the positioning tag. The displacement information includes the position, direction and speed of the target object, and the video acquisition system can control the camera to shoot according to this displacement information. Specifically, an API call carrying the displacement information of the target object may be sent to the video acquisition system, and return data of the video acquisition system is then received, the return data including image information of the target object, such as pictures and videos.
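A sketch of such a request to the video acquisition system, assuming a JSON-over-HTTP endpoint; the URL and payload fields are illustrative assumptions, not an interface defined by the patent.

```python
import json
import urllib.request

def request_follow_shot(api_url, tag_id, displacement):
    """Send a follow-shot request carrying the tag's displacement information
    and return the data the video acquisition system sends back (e.g. images)."""
    payload = json.dumps({
        "tag_id": tag_id,
        "position": displacement["position"],    # assumed payload fields
        "direction": displacement["direction"],
        "speed": displacement["speed"],
    }).encode("utf-8")
    req = urllib.request.Request(api_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())           # e.g. {"images": [...], "video": ...}
```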
In step 202, the displacement information satisfies a preset displacement condition, for example: when the moving speed of the positioning tag is greater than or equal to a preset speed, for example 2 m/s, the target object corresponding to the positioning tag is judged to be a suspicious person and needs to be followed and photographed. The track length of the positioning tag may be greater than or equal to a preset length; for example, if the track of the positioning tag spans more than three carriages, the target object corresponding to the positioning tag is judged to be a suspicious person and needs to be followed and photographed. In addition, the positioning tag may be located in a preset area. For example, when the target object corresponding to the positioning tag stands near a vehicle door, accidents such as falling out or being squeezed by the door easily occur when the door opens or closes, and corresponding monitoring measures are needed to remind the target object to keep away from the current area; or, when the target object corresponding to the positioning tag moves from an area of low passenger density to an area of high passenger density and the high-density area is not a special position such as a boarding or alighting point, the target object corresponding to the positioning tag is judged to be a suspicious person, is followed and photographed, and monitoring measures are taken to prevent an unexpected incident. The ratio of the time the positioning tag is in the moving state to its total time in the carriage may also be greater than or equal to a preset threshold; for example, if the target object is in the moving state for more than 90% of its time in the carriage, it is moving about frequently and its behaviour is suspicious, so monitoring needs to be strengthened.
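Under the assumption of the TagDisplacement fields used in the earlier sketch, the four preset displacement conditions could be checked as follows; the thresholds (2 m/s, roughly three 25 m carriages, 90%) mirror the examples above and are configuration values, not fixed by the method.

```python
def displacement_suspicious(disp, speed_limit=2.0, track_length_limit=75.0,
                            restricted_areas=(), moving_ratio_limit=0.9):
    """True when any of the four preset displacement conditions holds.
    Each element of restricted_areas is assumed to expose contains(position)."""
    in_restricted = any(area.contains(disp.position) for area in restricted_areas)
    return (disp.speed >= speed_limit                    # moving too fast
            or disp.track_length >= track_length_limit   # walked track too long
            or in_restricted                              # standing in a preset area
            or disp.moving_ratio >= moving_ratio_limit)   # moving most of the time
```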
In step 203, when the monitoring data of the target object is generated, the image information of the target object is first uploaded to a database and identity information matching the image information is searched for; monitoring data including the image information, identity information, displacement information and positioning tag information of the target object is then generated and stored. Corresponding monitoring measures are executed according to the monitoring data, for example: when the target object leaves the carriage, the follow-shooting of the target object is released; or, when the target object is located in the preset area, a broadcast reminds the target object to leave the preset position; or, when the moving speed of the positioning tag is greater than the preset speed, when the track length of the positioning tag is greater than the preset length, or when the ratio of the time the positioning tag is in the moving state to its total time in the carriage is greater than or equal to the preset threshold, the target object corresponding to the positioning tag is marked. After marking the target object corresponding to the positioning tag, the positioning tag and/or image information of the marked target object is stored. When the marked positioning tag is collected again in the carriage and/or the image information of the marked target object is captured again, the camera is started to follow and photograph the target object, strengthening the monitoring of the marked target object.
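Putting the measures above together, one possible, purely illustrative dispatch routine is sketched below; db, broadcaster and camera_ctrl are hypothetical service objects, and displacement_suspicious is the helper from the previous sketch.

```python
def apply_measures(disp, images, left_carriage, in_preset_area,
                   db, broadcaster, camera_ctrl, marked_tags):
    """Generate monitoring data for one tag and dispatch the monitoring measures.
    left_carriage and in_preset_area are booleans derived from the tag position."""
    identity = db.match_identity(images)                 # match identity from the images
    db.store({"tag": disp.tag_id, "identity": identity,  # keep the monitoring data
              "displacement": disp, "images": images})

    if left_carriage:
        camera_ctrl.release(disp.tag_id)                 # release the follow-shooting
    elif in_preset_area:
        broadcaster.remind(disp.position)                # broadcast a reminder to move away
    elif displacement_suspicious(disp):
        marked_tags.add(disp.tag_id)                     # mark: re-track when seen again
```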
According to the monitoring method, displacement information of a plurality of positioning tags in a carriage is acquired through an ultra-wideband positioning base station; when the displacement information meets a preset displacement condition, a camera is started to follow and photograph the target object corresponding to the positioning tag; and monitoring data of the target object is generated and corresponding monitoring measures are executed according to the monitoring data. By combining ultra-wideband high-precision positioning with camera capture, the target object in the carriage can be accurately located and identified, improving the efficiency of intelligent security early warning.
Fig. 3 is a block diagram illustrating a monitoring system according to an embodiment of the present invention. As shown in fig. 3, the present application further provides a monitoring system 401, comprising an ultra-wideband positioning device 403, a camera 401 and a controller 405. The ultra-wideband positioning device 403 is used for acquiring displacement information of a plurality of positioning tags in the carriage through an ultra-wideband positioning base station; the camera 401 is used for following and photographing the target object corresponding to the positioning tag; and the controller 405 is configured to generate monitoring data of the target object and execute corresponding monitoring measures according to the monitoring data.
The specific implementation process of this embodiment is described in detail in the related description of the embodiment of the monitoring method, and is not described herein again.
Fig. 4 is a schematic structural diagram of a server according to an embodiment of the present invention. The server shown in fig. 4 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure. As shown in fig. 4, the present application also provides a server 600 including a processor 601, which can execute the method of the embodiments of the present disclosure according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The processor 601 may include, for example, a general-purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset, and/or a special-purpose microprocessor (e.g., an application-specific integrated circuit (ASIC)). The processor 601 may also include on-board memory for caching purposes. The processor 601 may include a single processing unit or multiple processing units for performing different actions of a method flow according to embodiments of the disclosure.
In the RAM 603, various programs and data necessary for the operation of the server 600 are stored. The processor 601, the ROM 602 and the RAM 603 are connected to each other via a bus 604. The processor 601 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 602 and/or the RAM 603. Note that the programs may also be stored in one or more memories other than the ROM 602 and the RAM 603. The processor 601 may also perform various operations of the method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the server 600 may also include an input/output (I/O) interface 605, which is likewise connected to the bus 604. The server 600 may also include one or more of the following components connected to the input/output (I/O) interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. Further, a drive and removable media, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, may also be connected to the input/output (I/O) interface 605 as necessary, so that a computer program read therefrom is installed into the storage section 608 as necessary.
Method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable storage medium, the computer program containing program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609 and/or installed from removable media. The computer program, when executed by the processor 601, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units and the like described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present application further provides a computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement the monitoring method as described in the above embodiments.
In practical implementation, the computer storage medium is applied to the server shown in fig. 4, so as to implement intelligent monitoring.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (11)

1. A monitoring method, comprising the steps of:
acquiring displacement information of a plurality of positioning tags in a carriage through an ultra-wideband positioning base station;
when the displacement information meets a preset displacement condition, starting a camera to follow and photograph the target object corresponding to the positioning tag;
and generating monitoring data of the target object, and executing corresponding monitoring measures according to the monitoring data.
2. The monitoring method according to claim 1, wherein the starting of the camera to follow and photograph the target object corresponding to the positioning tag comprises the following steps:
controlling the shooting angle of the camera according to the position information of the target object so as to capture the target object;
and following the target object and acquiring image information of the target object, wherein the image information comprises multi-angle images of the target object.
3. The monitoring method according to claim 2, wherein the following of the target object and the acquisition of its image information comprise the following steps:
when the target object moves beyond the shooting range of the camera but has not left the carriage, updating the displacement information of the target object according to its positioning tag, and starting the camera corresponding to the updated displacement information to follow and photograph the target object; and/or,
switching to a camera adjacent to the current camera according to the moving direction of the target object to continue the follow-shooting.
4. The monitoring method according to claim 1, wherein the starting of the camera to follow and photograph the target object corresponding to the positioning tag further comprises the following steps:
sending a request instruction of a target object corresponding to the positioning tag to a video acquisition system, wherein the request instruction comprises displacement information of the positioning tag;
receiving return data of the video acquisition system, wherein the return data comprises image information of the target object.
5. The monitoring method according to claim 1, wherein the displacement information satisfies a preset displacement condition, including at least one of:
the moving speed of the positioning tag is greater than or equal to a preset speed;
the track length of the positioning tag is greater than or equal to a preset length;
the positioning tag is located in a preset area;
the ratio of the time of the positioning tag in the moving state to the total time of the positioning tag in the compartment is greater than or equal to a preset threshold value.
6. The monitoring method according to claim 5, wherein the generating of the monitoring data of the target object comprises the steps of:
uploading the image information of the target object to a database, and searching identity information matched with the image information;
and generating monitoring data of the target object, wherein the monitoring data comprises at least one of image information, identity information, displacement information and positioning tag information.
7. The monitoring method according to claim 5, wherein said performing a corresponding monitoring measure according to said monitoring data comprises at least one of:
when the target object leaves the carriage, releasing the follow-shooting of the target object; or
when the target object is located in the preset area, broadcasting a reminder for the target object to leave the preset position; or
when the moving speed of the positioning tag is greater than a preset speed, and/or the track length of the positioning tag is greater than a preset length, and/or the ratio of the time the positioning tag is in the moving state to its total time in the carriage is greater than or equal to a preset threshold value, marking the target object corresponding to the positioning tag.
8. The monitoring method according to claim 7, further comprising, after marking the target object corresponding to the positioning tag:
storing the positioning tag and/or image information of the marked target object;
and starting a camera to follow and photograph the target object when the positioning tag of the marked target object is acquired again in the carriage and/or the image information of the marked target object is captured again.
9. A monitoring system is characterized by comprising an ultra-wideband positioning device, a camera and a controller;
the ultra-wideband positioning device is used for acquiring displacement information of a plurality of positioning tags in the carriage through the ultra-wideband positioning base station;
the camera is used for following and photographing the target object corresponding to the positioning tag;
and the controller is used for generating monitoring data of the target object and executing corresponding monitoring measures according to the monitoring data.
10. A server, comprising:
at least one processor;
at least one memory coupled to the at least one processor and storing instructions for execution by the at least one processor, the instructions when executed by the at least one processor causing the device to perform the monitoring method of any of claims 1 to 7.
11. A computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement the monitoring method of any one of claims 1 to 7.
CN202011179823.7A 2020-10-29 2020-10-29 Monitoring method, system, server and computer storage medium Active CN112118427B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011179823.7A CN112118427B (en) 2020-10-29 2020-10-29 Monitoring method, system, server and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011179823.7A CN112118427B (en) 2020-10-29 2020-10-29 Monitoring method, system, server and computer storage medium

Publications (2)

Publication Number Publication Date
CN112118427A true CN112118427A (en) 2020-12-22
CN112118427B CN112118427B (en) 2022-11-04

Family

ID=73794434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011179823.7A Active CN112118427B (en) 2020-10-29 2020-10-29 Monitoring method, system, server and computer storage medium

Country Status (1)

Country Link
CN (1) CN112118427B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018448A (en) * 2001-06-29 2003-01-17 Nippon Telegr & Teleph Corp <Ntt> Method and system for tracing/recording recording object
CN101268383A (en) * 2005-08-04 2008-09-17 埃森技术Enc株式会社 Smart video monitoring system and method communicating with auto-tracking radar system
CN103020983A (en) * 2012-09-12 2013-04-03 深圳先进技术研究院 Human-computer interaction device and method used for target tracking
CN105763847A (en) * 2016-02-26 2016-07-13 努比亚技术有限公司 Monitoring method and monitoring terminal
US20180339387A1 (en) * 2017-05-24 2018-11-29 Trimble Inc. Measurement, layout, marking, firestop stick
CN109309809A (en) * 2017-07-28 2019-02-05 阿里巴巴集团控股有限公司 The method and data processing method, device and system of trans-regional target trajectory tracking
WO2020145882A1 (en) * 2019-01-09 2020-07-16 Hitachi, Ltd. Object tracking systems and methods for tracking a target object
CN109948523A (en) * 2019-03-18 2019-06-28 中国汽车工程研究院股份有限公司 A kind of object recognition methods and its application based on video Yu millimetre-wave radar data fusion
CN111079782A (en) * 2019-11-08 2020-04-28 北京万集科技股份有限公司 Vehicle transaction method and system, storage medium and electronic device
CN111311649A (en) * 2020-01-15 2020-06-19 重庆特斯联智慧科技股份有限公司 Indoor internet-of-things video tracking method and system
CN111402578A (en) * 2020-02-28 2020-07-10 平安国际智慧城市科技股份有限公司 Shared vehicle monitoring method and device based on track monitoring and computer equipment
CN111479090A (en) * 2020-04-15 2020-07-31 Oppo广东移动通信有限公司 Intelligent monitoring method, device, system and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN Runjie et al., "Design of a Medical Robot Based on Image Recognition", Modern Computer (Professional Edition) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113395528A (en) * 2021-06-09 2021-09-14 珠海格力电器股份有限公司 Live broadcast method and device, electronic equipment and storage medium
CN114339038A (en) * 2021-12-24 2022-04-12 珠海格力电器股份有限公司 Target object monitoring method, device and system

Also Published As

Publication number Publication date
CN112118427B (en) 2022-11-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant