CN113744303B - Target tracking method, device, equipment and readable storage medium - Google Patents

Target tracking method, device, equipment and readable storage medium

Info

Publication number
CN113744303B
CN113744303B (application CN202010469756.6A)
Authority
CN
China
Prior art keywords
target terminal
target
information
state mark
tracking
Prior art date
Legal status
Active
Application number
CN202010469756.6A
Other languages
Chinese (zh)
Other versions
CN113744303A (en)
Inventor
王陈
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN202010469756.6A priority Critical patent/CN113744303B/en
Publication of CN113744303A publication Critical patent/CN113744303A/en
Application granted granted Critical
Publication of CN113744303B publication Critical patent/CN113744303B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a target tracking method, a target tracking device, target tracking equipment and a readable storage medium. The method is applied to a network camera and includes the following steps: sending detection information to a target terminal so that the target terminal returns its own target terminal information, the target terminal information including the identification information and the status flag category of the target terminal; and triggering the target terminal to respond correspondingly according to the received status flag category, so as to track the target terminal. During tracking, the target terminal does not need to send information constantly; it only needs to perform the corresponding operation when triggered by the network camera, which enables effective collection of tracking information and improves the tracking success rate. The disclosed device, equipment and readable storage medium have the same technical effects.

Description

Target tracking method, device, equipment and readable storage medium
Technical Field
The present disclosure relates to the field of security monitoring technologies, and in particular, to a target tracking method, device, apparatus, and readable storage medium.
Background
In existing target tracking schemes, the target device is required to continuously broadcast information for surrounding detection devices to capture, but because the number of detection devices is limited, the broadcast information is not easily captured. Meanwhile, the available battery power of the target device is limited, and continuous broadcasting keeps draining it, which shortens the online time of the target device and, in turn, prevents the detection devices from capturing the information, so the target device is difficult to track and the recovery rate is low.
Therefore, how to effectively collect information in the target tracking process and improve the tracking success rate is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide a target tracking method, apparatus, device and readable storage medium, so as to effectively collect information in a target tracking process and improve a tracking success rate. The specific scheme is as follows:
in a first aspect, the present application provides a target tracking method, applied to a network camera, including:
sending detection information to a target terminal so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal;
and triggering the target terminal to respond correspondingly according to the received state mark category so as to track the target terminal.
In one embodiment, the determining of the status flag class includes:
the target terminal acquires a brightness value collected by a light sensor and a distance value collected by a distance sensor; if the sum of the brightness value and the distance value is smaller than a preset threshold, the status flag category is determined to be a first status flag; otherwise, the status flag category is determined to be a second status flag;
wherein the first status flag indicates that the target terminal is occluded, and the second status flag indicates that the target terminal is not occluded.
In one embodiment, the triggering the target terminal to respond correspondingly according to the received status flag category to track the target terminal includes:
and sending the target terminal information to a cloud platform, and if the cloud platform returns a tracking instruction, triggering an infrared emitter in the target terminal to emit light when the state mark type is a first state mark, and sending a video obtained by real-time video recording to the cloud platform.
In one embodiment, the triggering the target terminal to respond correspondingly according to the received status flag category to track the target terminal includes:
and sending the target terminal information to a cloud platform, and if the cloud platform returns a tracking instruction, triggering a front camera and/or a rear camera in the target terminal to shoot an image when the state mark type is a second state mark, and sending the image and a video obtained by real-time video recording to the cloud platform.
In one embodiment, the method further comprises:
and when the state mark type is the first state mark, switching the network camera to a night vision mode so as to acquire infrared light emitted by the infrared emitter.
In one embodiment, before the sending the probe information to the target terminal, the method further includes:
and starting a face recognition function, and executing the step of sending the detection information to the target terminal if the face is recognized.
In one embodiment, after the target terminal information is sent to a cloud platform, the cloud platform extracts the identification information from the target terminal information and queries whether the identification information is marked as to-be-tracked; if yes, returning the tracking instruction.
In a second aspect, the present application provides an object tracking device applied to a network camera, including:
the sending module is used for sending the detection information to the target terminal so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal;
and the tracking module is used for triggering the target terminal to respond correspondingly according to the received state mark category so as to track the target terminal.
In a third aspect, the present application provides a target tracking apparatus comprising:
a memory for storing a computer program;
a processor for executing the computer program to implement the previously disclosed object tracking method.
In a fourth aspect, the present application provides a readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the previously disclosed object tracking method.
According to the above scheme, the application provides a target tracking method applied to a network camera, including: sending detection information to a target terminal so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal; and triggering the target terminal to respond correspondingly according to the received status flag category so as to track the target terminal.
In the application, after the network camera sends the detection information to the target terminal, the target terminal returns its identification information and status flag category; the network camera then triggers the target terminal to respond correspondingly according to the received status flag category, thereby tracking the target terminal. In this process, the target terminal does not need to send information constantly; it only needs to perform the corresponding operation when triggered by the network camera, so the target terminal does not have to work all the time, its online time is prolonged, tracking information is effectively collected, and the tracking success rate is improved.
Correspondingly, the target tracking device, the target tracking equipment and the readable storage medium have the same technical effects.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a first target tracking method disclosed in the present application;
FIG. 2 is a flow chart of a second target tracking method disclosed in the present application;
FIG. 3 is a framework diagram of a target tracking system disclosed in the present application;
FIG. 4 is a flow chart of a third target tracking method disclosed in the present application;
FIG. 5 is a schematic diagram of a target tracking device disclosed in the present application;
FIG. 6 is a schematic diagram of target tracking equipment disclosed in the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Currently, existing target tracking schemes require the target to continuously broadcast information outwards for surrounding detection devices to capture, but because the number of detection devices is limited, the broadcast information is not easily captured. Meanwhile, the available battery power of the target is limited, and continuous broadcasting keeps draining it, which shortens the target's online time and, in turn, prevents the detection devices from capturing the information, so the target is difficult to track and the recovery rate is low. In view of this, the present application provides a target tracking scheme that can effectively collect information during target tracking and improve the tracking success rate.
Referring to fig. 1, an embodiment of the present application discloses a first target tracking method, which is applied to a network camera for real-time video recording, and includes:
s101, sending detection information to a target terminal so that the target terminal returns own target terminal information; the target terminal information comprises: identification information of the target terminal and a status flag class.
In one embodiment, the determining of the status flag class includes: the target terminal acquires a brightness value acquired by a light sensor and a distance value acquired by a distance sensor, and if the sum of the brightness value and the distance value is smaller than a preset threshold value, the state mark type is determined to be a first state mark; otherwise, determining the state mark category as a second state mark; wherein the first status flag indicates that the target terminal is occluded (e.g., the target terminal is held in a pocket); the second status flag indicates that the target terminal is not occluded.
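For illustration, the occlusion check described above can be sketched as follows; the helper names, flag values and threshold are assumptions for the sketch, not values specified in this application:

```python
# Minimal sketch of the status-flag determination (assumed flag values
# and threshold; sensor readings are supplied by the terminal's sensors).

FIRST_STATUS_FLAG = "occluded"       # e.g. the terminal is held in a pocket
SECOND_STATUS_FLAG = "not_occluded"

OCCLUSION_THRESHOLD = 60.0           # assumed preset threshold

def determine_status_flag(brightness: float, distance: float) -> str:
    """Return the status flag category from the light-sensor and
    distance-sensor readings, per the rule described above."""
    if brightness + distance < OCCLUSION_THRESHOLD:
        return FIRST_STATUS_FLAG     # dark and close to an object -> occluded
    return SECOND_STATUS_FLAG
```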
S102, triggering the target terminal to respond correspondingly according to the received state mark category so as to track the target terminal.
In a specific embodiment, triggering the target terminal to respond correspondingly according to the received status flag category so as to track the target terminal includes: sending the target terminal information to the cloud platform and, if the cloud platform returns a tracking instruction and the status flag category is the first status flag, triggering the infrared emitter in the target terminal to emit light and sending the video obtained by real-time recording to the cloud platform. In a specific embodiment, when the status flag category is the first status flag, the network camera is switched to night vision mode so as to capture the infrared light emitted by the infrared emitter.
In a specific embodiment, triggering the target terminal to respond correspondingly according to the received status flag category so as to track the target terminal includes: sending the target terminal information to the cloud platform and, if the cloud platform returns a tracking instruction and the status flag category is the second status flag, triggering the front camera and/or the rear camera in the target terminal to shoot an image and sending the image, together with the video obtained by real-time recording, to the cloud platform.
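The two triggering embodiments above can be combined into one sketch seen from the network camera's side; the cloud, camera and terminal objects and all of their methods are assumed placeholders for this sketch, not a real camera SDK or cloud API:

```python
# Sketch only: how the camera might react to the returned status flag
# once the cloud platform confirms the terminal should be tracked.

def handle_terminal_info(terminal_info, cloud, camera, terminal):
    if not cloud.request_tracking(terminal_info):    # no tracking instruction returned
        return
    if terminal_info["status_flag"] == "occluded":   # first status flag
        terminal.flash_infrared()                    # terminal turns on its infrared emitter
        camera.enable_night_vision()                 # make the IR light visible in the video
        cloud.upload(video=camera.record_realtime())
    else:                                            # second status flag
        images = terminal.capture_images()           # front and/or rear camera
        cloud.upload(video=camera.record_realtime(), images=images)
```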
After the cloud platform obtains the video and the image, it can compare them to determine the specific position of the target terminal. If a face can be extracted from the image, a face recognition algorithm can be used to identify it, so as to determine the person to be tracked and thereby track the target terminal. Of course, after the network camera (IP Camera, IPC) obtains the video and the image, it may also compare them according to the above procedure to determine the person to be tracked and track the target terminal.
In a specific embodiment, before sending the detection information to the target terminal, the method further includes: starting a face recognition function and, if a face is recognized, executing the step of sending the detection information to the target terminal.
In a specific embodiment, after the target terminal information is sent to the cloud platform, the cloud platform extracts the identification information from the target terminal information and queries whether the identification information is marked as to-be-tracked; if yes, it returns the tracking instruction.
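A minimal sketch of this cloud-side check, assuming an in-memory set of identifiers marked as to-be-tracked (a real platform would query its terminal/account database instead):

```python
# Sketch of the cloud platform's "to-be-tracked" lookup (illustrative data).

tracked_identifiers = {"AA:BB:CC:DD:EE:FF", "SN-0001"}   # marked "to be tracked"

def should_track(terminal_info: dict) -> bool:
    """Return True (i.e. send a tracking instruction) when any reported
    identifier of the terminal is marked as to-be-tracked."""
    identifiers = terminal_info.get("identification", {})
    return any(value in tracked_identifiers for value in identifiers.values() if value)
```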
It can be seen that, in this embodiment, after the network camera sends the detection information to the target terminal, the target terminal returns its identification information and status flag category; the network camera then triggers the target terminal to respond correspondingly according to the received status flag category, thereby tracking the target terminal. In this process, the target terminal does not need to send information constantly; it only needs to perform the corresponding operation when triggered by the network camera, so it does not have to work all the time, its online time is prolonged, tracking information is effectively collected, and the tracking success rate is improved.
Referring to fig. 2, the embodiment of the application discloses a second target tracking method, which is applied to a network camera that records video in real time, and includes:
and S201, if an activation instruction sent by the cloud platform is received, sending detection information to the target terminal so that the target terminal returns own terminal information to the network camera.
The activation instruction is generated when the cloud platform detects that a terminal to be tracked exists within the coverage area of the network camera. The target terminal is any terminal within the coverage area of the network camera that runs a monitoring application program; the monitoring application program is registered on the cloud platform and can acquire terminal information.
The terminal information includes: the identification information of the terminal and the status flag category indicating whether the terminal is occluded. The identification information of the terminal includes the factory serial number, the MAC address, the push ID, and the like; if the terminal is a mobile phone, the identification information may also include the mobile phone number. The status flag category indicating whether the terminal is occluded can be represented by a flag bit, and the flag bit values can be user-defined, for example: "0" indicates not occluded and "1" indicates occluded; alternatively, "none" indicates not occluded and "present" indicates occluded.
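For illustration, the terminal information returned to the network camera might be packaged as below; the field names, the placeholder values and the flag encoding are assumptions, since the application only requires the identification items plus an occlusion flag bit:

```python
# Illustrative terminal-information payload (hypothetical field names and
# placeholder values; the flag bit follows the "0"/"1" convention above).

terminal_info = {
    "identification": {
        "phone_number": "138xxxxxxxx",        # placeholder, if the terminal is a phone
        "serial_number": "SN-0001",
        "mac_address": "AA:BB:CC:DD:EE:FF",
        "push_id": "push-token-123",
    },
    "status_flag": "1",   # "1" = occluded (e.g. in a pocket), "0" = not occluded
}
```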
In a specific embodiment, before sending the detection information to the target terminal, the method further includes: judging whether a person appears within the coverage area of the network camera; if yes, executing the step of sending the detection information to the target terminal. Judging whether a person appears within the coverage area includes: starting a face recognition function; if a face is recognized, determining that a person appears within the coverage area of the network camera; otherwise, determining that no person appears. When no person appears within the coverage area, the current tracking flow can be ended. Alternatively, the network camera broadcasts the detection information and collects responses within a preset time period; if a response is collected, the network camera acquires the information of the terminal that sent the response and executes S202-S203; if no response is collected, the current tracking flow ends. The network camera sends or broadcasts the detection information using its own Wi-Fi or Bluetooth function.
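The gating and probe-broadcast flow of S201 can be sketched as follows; the camera methods for face detection, broadcasting and response polling are assumed placeholders standing in for the camera's face-recognition and Wi-Fi/Bluetooth functions:

```python
# Sketch of S201: only probe when a person is seen, then collect responses
# within a preset window (all camera methods are assumed placeholders).

import time

def probe_for_terminals(camera, timeout_s: float = 5.0):
    if not camera.detect_face():          # no person in the coverage area
        return []                         # end the current tracking flow

    camera.broadcast_probe()              # detection info over Wi-Fi or Bluetooth
    deadline = time.time() + timeout_s    # preset collection window
    responses = []
    while time.time() < deadline:
        reply = camera.poll_probe_response()   # returns terminal info or None
        if reply is not None:
            responses.append(reply)
        time.sleep(0.1)
    return responses                      # non-empty -> proceed to S202-S203
```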
S202, sending terminal information to a cloud platform so that the cloud platform can judge whether to track a target terminal according to the terminal information; if so, a tracking instruction is sent to the target terminal so that the monitoring application program in the target terminal controls the target terminal to execute the target operation.
There may be one target terminal or several. Accordingly, one or more pieces of terminal information may be sent to the cloud platform, and the cloud platform judges each piece of terminal information in turn.
In this embodiment, the target operation includes capturing an image or turning on a light source; only one of the two is performed at a time. The tracking information may also be sent to a mailbox or similar address bound to the tracked terminal; the user can bind such a mailbox to the terminal on the cloud platform for tracking purposes.
In one embodiment, the monitoring application program in the target terminal controls the target terminal to execute the target operation as follows: if it is determined from the status flag category in the terminal information that the target terminal is not occluded, the monitoring application program controls the front camera and/or the rear camera of the target terminal to shoot an image and sends the image to the network camera; correspondingly, recording and sending tracking information to the cloud platform includes: sending the image and the video obtained by real-time recording to the cloud platform.
In another embodiment, the monitoring application program in the target terminal controls the target terminal to execute the target operation as follows: if it is determined from the status flag category in the terminal information that the target terminal is occluded, the monitoring application program controls the target terminal to turn on a light source and establish a communication connection with the network camera; correspondingly, recording and sending tracking information to the cloud platform includes: sending the video obtained by real-time recording to the cloud platform.
The light source is a flashlight or an infrared emitter. The infrared emitter may be a stand-alone component or part of another sensor, for example the infrared emitter in an infrared ranging sensor.
When the target terminal turns on the infrared light source, the network camera switches to night vision mode for real-time recording. The infrared light source is visible in night vision mode, so when an investigator reviews the video, the person carrying the target terminal can be quickly found and the relevant person locked onto, which improves the tracking success rate.
S203, recording and sending tracking information to the cloud platform in the process of executing target operation by the target terminal.
In a specific embodiment, the cloud platform determines whether to track the target terminal according to the terminal information, including: the cloud platform extracts the identification information of the target terminal from the terminal information and inquires whether the identification information is marked as to-be-tracked; if yes, tracking the target terminal; if not, the target terminal is not tracked.
The network cameras are preset at various positions of the monitored area, and the setting positions of the network cameras are recorded in the cloud platform. When the user loses the terminal, the position and time of the terminal loss can be recorded in the cloud platform in time, and the terminal is marked as to-be-tracked. The cloud platform can determine which network camera covers the lost position of the terminal, so that an activation instruction can be timely generated, and the network camera can help to track the lost terminal. The network camera always records the real-time video of the coverage range of the network camera.
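One way the cloud platform might pick which cameras to activate is sketched below; the coordinate-and-radius coverage test and the field names are assumptions, since the application only states that camera positions are recorded on the platform and that the camera covering the loss location is activated:

```python
# Sketch of activation-instruction dispatch (illustrative coverage model).

import math

def activate_cameras(loss_location, cameras, send_activation):
    """loss_location: (x, y) reported by the user.
    cameras: iterable of dicts with 'id', 'x', 'y' and 'coverage_radius'."""
    for cam in cameras:
        dx = cam["x"] - loss_location[0]
        dy = cam["y"] - loss_location[1]
        if math.hypot(dx, dy) <= cam["coverage_radius"]:
            send_activation(cam["id"])   # camera enters the "mobile phone tracking" state
```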
Therefore, in this embodiment of the application, the target does not need to send information constantly; it only needs to execute the target operation under the control of the monitoring application program, while the network camera records and sends tracking information to the cloud platform during the target operation, completing the collection and recording of tracking information. The target does not have to work all the time throughout the process, so its online time can be prolonged, tracking information is effectively collected, and the tracking success rate is improved.
The embodiment of the application further discloses a third target tracking method, which involves a cloud platform, a plurality of network cameras, a monitoring APP (i.e. the monitoring application program mentioned above) registered on the cloud platform, and a client for viewing the surveillance video. For the relationship among the cloud platform, the network cameras and the clients, refer to fig. 3: the network camera is responsible for front-end video acquisition, data packaging and uploading; the cloud platform is responsible for storing video resources, the cloud account system, client-side configuration information and the like; the clients generally include mobile clients and PC clients, which can receive alarms and allow users to watch live video, among other functions.
Assuming that the tracked terminal is a mobile phone, installing a monitoring APP on the mobile phone, and performing the following settings:
1) Opening a monitoring APP, registering and logging in a cloud platform;
2) Turn on the "trackable" switch in the monitoring APP, which means allowing the monitoring APP to collect identification information of the current mobile phone, such as the phone number, serial number, MAC address and push ID, as shown in Table 1. Any item that cannot be acquired is set to empty.
TABLE 1
Information encoding Terminal identification information
1 Mobile phone number
2 Serial number
3 MAC address
4 Push ID
3) Bind a tracking mailbox for the current mobile phone in the monitoring APP, so that the monitoring APP reports the terminal identification information, the mailbox information and the registered cloud account to the cloud platform for storage.
4) Set the monitoring APP to keep running while the mobile phone is in standby.
When the user's mobile phone is stolen or lost, the user can promptly log in to the cloud account from another terminal (mobile phone or PC) and set the state of the terminal to "to be tracked", indicating that the mobile phone has been stolen or lost, while reporting the place and time of the theft or loss. After the tracking process is completed but the terminal has not yet been successfully found, the state of the terminal is set to "tracking completed". When the mobile phone has not been stolen or lost, or has been successfully retrieved, the state of the terminal in the cloud platform is "normal".
The mobile phone status recorded in the cloud platform can be seen in table 2.
TABLE 2
Terminal state Meaning
Normal The mobile phone has not been stolen or lost, or has been successfully retrieved
To be tracked The mobile phone has been reported stolen or lost and is awaiting tracking
Tracking completed The tracking process has finished but the mobile phone has not yet been found
When the cloud platform detects that the state of a terminal has been set to "to be tracked", it determines, according to the location reported by the user, which network camera should receive the activation instruction, and sends the activation instruction to that network camera so that it enters the "mobile phone tracking" state. The user may report several locations, so several network cameras may receive activation instructions.
The network camera (IPC) that receives the activation instruction enters the "mobile phone tracking" state; that is, once the anti-theft process is activated, it executes the flow shown in fig. 4, specifically as follows:
1) The IPC starts the face detection function; when a face is recognized, the IPC starts the Wi-Fi detection function, i.e. sends the detection information through wireless broadcasting.
2) When the monitoring APP installed on a mobile phone within the IPC coverage area receives the detection information, it sends the terminal identification information listed in Table 1, together with the flag bit indicating whether the terminal is in a pocket, to the IPC.
Whether the terminal is in a pocket can be judged using the light sensor and the distance sensor in the terminal. For example, if the ambient light brightness around the terminal is below a certain value and, at the same time, the distance between the terminal and surrounding objects is below a certain value, the terminal is determined to be in a pocket; otherwise, it is determined not to be in a pocket.
3) After receiving the information, the IPC reports it to the cloud platform so that the cloud platform can check whether the mobile phone is the stolen one, and the query result is returned to the monitoring APP.
4) The monitoring APP judges the query result: if the mobile phone is the stolen one, the monitoring APP enters the stolen state and performs the following operations according to whether the terminal is in a pocket (a terminal-side sketch of this logic follows the list):
a. When the mobile phone is not in a pocket, the front camera and/or the rear camera capture pictures at fixed intervals and send them to the IPC, which forwards them to the mailbox or the cloud platform; the transmission continues until communication between the mobile phone and the IPC is interrupted.
b. When the mobile phone is in a pocket, the monitoring APP triggers the flashlight or the infrared emitter in the mobile phone to flash so that the IPC can record the flashing light; the flashing stops once communication between the mobile phone and the IPC is interrupted. The flashing light source is preferably an infrared light source.
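A terminal-side sketch of steps a and b above; the APP helper methods for camera capture, light control and IPC transfer are assumed placeholders rather than real mobile APIs, and the capture interval is illustrative:

```python
# Sketch of the monitoring APP's behavior in the stolen state
# (all app.* methods are assumed placeholders).

import time

def run_stolen_state(app, in_pocket: bool, capture_interval_s: float = 10.0):
    while app.connected_to_ipc():                    # stop once communication is interrupted
        if in_pocket:
            app.flash_light_source()                 # flashlight or, preferably, IR emitter (step b)
        else:
            images = app.capture_front_and_rear()    # front and/or rear camera (step a)
            app.send_to_ipc(images)                  # IPC forwards to the mailbox / cloud platform
            time.sleep(capture_interval_s)           # "capture pictures at fixed time"
```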
When the mobile phone is confirmed to be stolen and is in a pocket, the IPC is set to night vision mode in order to better detect the infrared light emitted by the mobile phone. In night vision mode the infrared cut filter of the lens is switched out of the optical path, so objects emitting strong infrared light are seen much more clearly.
When the mobile phone is in a pocket, an investigator reviewing the relevant video can use the flashing light as a clue for the search. When the mobile phone is not in a pocket, the investigator can compare the pictures taken by the mobile phone with the video to locate the suspect.
Therefore, the method and device can help a user recover a stolen mobile phone. Once the stolen mobile phone is detected, the suspect in the surveillance video can be located quickly and accurately from the flashing light source and/or the pictures taken by the mobile phone, and the process is not easily noticed by the suspect. In addition, the mobile phone consumes less power and data traffic while being tracked, which prolongs its standby time and leaves more time to collect information and locate the suspect.
The following describes an object tracking device according to an embodiment of the present application; the object tracking device described below and the object tracking method described above may be referred to in correspondence with each other.
Referring to fig. 5, an embodiment of the present application discloses an object tracking device, which is applied to a network camera, and includes:
a sending module 501, configured to send detection information to a target terminal, so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal;
and the tracking module 502 is configured to trigger the target terminal to perform a corresponding response according to the received status flag class, so as to track the target terminal.
In a specific embodiment, the target terminal acquires a brightness value collected by a light sensor and a distance value collected by a distance sensor; if the sum of the brightness value and the distance value is smaller than a preset threshold, the status flag category is determined to be a first status flag; otherwise, the status flag category is determined to be a second status flag. The first status flag indicates that the target terminal is occluded; the second status flag indicates that the target terminal is not occluded.
In one embodiment, the tracking module is specifically configured to:
and sending the target terminal information to the cloud platform, and if the cloud platform returns a tracking instruction, triggering an infrared emitter in the target terminal to emit light when the state mark type is a first state mark, and sending a video obtained by real-time video recording to the cloud platform.
In one embodiment, the tracking module is specifically configured to:
and sending the target terminal information to the cloud platform, if the cloud platform returns a tracking instruction, triggering a front camera and/or a rear camera in the target terminal to shoot an image when the state mark type is a second state mark, and sending the image and a video obtained by real-time video recording to the cloud platform.
In a specific embodiment, the device further includes:
and the mode switching module is used for switching the network camera into a night vision mode when the state mark type is the first state mark so as to acquire infrared light emitted by the infrared emitter.
In a specific embodiment, the device further includes:
and the identification module is used for starting the face identification function, and executing the step of sending the detection information to the target terminal if the face is identified.
In a specific embodiment, after the target terminal information is sent to the cloud platform, the cloud platform extracts the identification information from the target terminal information and inquires whether the identification information is marked as to-be-tracked; if yes, returning a tracking instruction.
The more specific working process of each module and unit in this embodiment may refer to the corresponding content disclosed in the foregoing embodiment, and will not be described herein.
Therefore, the target terminal does not need to work all the time, the online time of the target terminal is prolonged, the effective collection of tracking information is realized, and the tracking success rate is improved.
The following describes target tracking equipment provided in an embodiment of the present application; the equipment described below and the target tracking method and device described above may be referred to in correspondence with each other.
Referring to fig. 6, an embodiment of the present application discloses a target tracking apparatus, including:
a memory 601 for storing a computer program;
a processor 602 for executing the computer program to implement the method disclosed in any of the embodiments above.
The following describes a readable storage medium provided in the embodiments of the present application; the readable storage medium described below and the target tracking method, device and equipment described above may be referred to in correspondence with each other.
A readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the object tracking method disclosed in the foregoing embodiments. For specific steps of the method, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and no further description is given here.
Reference to "first," "second," "third," "fourth," etc. (if present) herein is used to distinguish similar objects from each other and does not necessarily describe a particular order or sequence. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, or apparatus.
It should be noted that the descriptions "first", "second", etc. herein are for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, but only on the basis that the combination can be realized by those skilled in the art; when a combination of technical solutions is contradictory or cannot be realized, the combination should be regarded as not existing and as falling outside the protection scope of the present application.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, so that the same or similar parts between the embodiments are referred to each other.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of readable storage medium known in the art.
The principles and embodiments of the present application are described herein with specific examples, which are provided only to help understand the method of the present application and its core idea. Meanwhile, since those skilled in the art may make modifications to the specific embodiments and application scope in accordance with the idea of the present application, this description should not be construed as limiting the present application.

Claims (10)

1. A target tracking method, applied to a network camera, comprising:
sending detection information to a target terminal so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal; the state mark type is a first state mark or a second state mark, and the first state mark indicates that the target terminal is shielded; a second status flag indicates that the target terminal is not occluded;
when the identification information is marked as to-be-tracked, tracking the target terminal; and triggering the target terminal to respond correspondingly according to the received state mark category so as to track the target terminal.
2. The object tracking method of claim 1, wherein the determining of the status flag class includes:
the target terminal acquires a brightness value acquired by a light sensor and a distance value acquired by a distance sensor, and if the sum of the brightness value and the distance value is smaller than a preset threshold value, the state mark type is determined to be a first state mark; otherwise, determining the state mark category as a second state mark.
3. The target tracking method according to claim 2, wherein the triggering the target terminal to respond accordingly according to the received status flag class to track the target terminal includes:
and sending the target terminal information to a cloud platform, and if the cloud platform returns a tracking instruction, triggering an infrared emitter in the target terminal to emit light when the state mark type is a first state mark, and sending a video obtained by real-time video recording to the cloud platform.
4. The target tracking method according to claim 2, wherein the triggering the target terminal to respond accordingly according to the received status flag class to track the target terminal includes:
and sending the target terminal information to a cloud platform, and if the cloud platform returns a tracking instruction, triggering a front camera and/or a rear camera in the target terminal to shoot an image when the state mark type is a second state mark, and sending the image and a video obtained by real-time video recording to the cloud platform.
5. The target tracking method according to claim 3, further comprising:
and when the state mark type is the first state mark, switching the network camera to a night vision mode so as to acquire infrared light emitted by the infrared emitter.
6. The target tracking method according to any one of claims 1 to 5, characterized by further comprising, before the transmitting of the probe information to the target terminal:
and starting a face recognition function, and executing the step of sending the detection information to the target terminal if the face is recognized.
7. The target tracking method according to claim 3 or 4, wherein after the target terminal information is sent to a cloud platform, the cloud platform extracts the identification information from the target terminal information and inquires whether the identification information is marked as to-be-tracked; if yes, returning the tracking instruction.
8. An object tracking device, applied to a network camera, comprising:
the sending module is used for sending the detection information to the target terminal so that the target terminal returns its own target terminal information; wherein the target terminal information includes: the identification information and the status flag category of the target terminal; the state mark type is a first state mark or a second state mark, and the first state mark indicates that the target terminal is shielded; a second status flag indicates that the target terminal is not occluded;
the tracking module is used for tracking the target terminal when the identification information is marked as to-be-tracked; and triggering the target terminal to respond correspondingly according to the received state mark category so as to track the target terminal.
9. A target tracking apparatus, comprising:
a memory for storing a computer program;
a processor for executing the computer program to implement the object tracking method as claimed in any one of claims 1 to 7.
10. A readable storage medium for storing a computer program, wherein the computer program when executed by a processor implements the object tracking method according to any one of claims 1 to 7.
CN202010469756.6A 2020-05-28 2020-05-28 Target tracking method, device, equipment and readable storage medium Active CN113744303B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010469756.6A CN113744303B (en) 2020-05-28 2020-05-28 Target tracking method, device, equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN113744303A CN113744303A (en) 2021-12-03
CN113744303B true CN113744303B (en) 2024-04-09

Family

ID=78724261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010469756.6A Active CN113744303B (en) 2020-05-28 2020-05-28 Target tracking method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113744303B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106331300A (en) * 2015-07-01 2017-01-11 中兴通讯股份有限公司 Mobile phone tracking method and device as well as terminal
CN108668236A (en) * 2018-04-24 2018-10-16 Oppo广东移动通信有限公司 Method for tracing, device and the terminal device of terminal device
CN110856117A (en) * 2018-08-03 2020-02-28 浙江宇视科技有限公司 Intelligent terminal tracking method and device based on WIFI and network camera


Also Published As

Publication number Publication date
CN113744303A (en) 2021-12-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant