CN113470069A - Target tracking method, electronic device, and computer-readable storage medium - Google Patents


Info

Publication number
CN113470069A
CN113470069A
Authority
CN
China
Prior art keywords
instruction
camera
priority information
target
execute
Prior art date
Legal status
Pending
Application number
CN202110639175.7A
Other languages
Chinese (zh)
Inventor
蒋茹
隋小波
潘武
梅海波
刘克玮
严俊琦
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110639175.7A
Publication of CN113470069A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a target tracking method, an electronic device, and a computer-readable storage medium. The method includes: in response to a first camera obtaining a first instruction sent by a second camera while the first camera is executing a second instruction, judging whether the time for which the first camera has executed the second instruction exceeds the shortest execution period corresponding to the second instruction; if so, the first camera executes the first instruction to track the corresponding target; otherwise, the first camera executes the instruction associated with the highest priority information, based on the priority information associated with the first instruction and the second instruction. In this way, the first camera simplifies the instruction execution flow and improves the efficiency and orderliness of instruction execution.

Description

Target tracking method, electronic device, and computer-readable storage medium
Technical Field
The present application relates to the field of security technologies, and in particular, to a target tracking method, an electronic device, and a computer-readable storage medium.
Background
With the continuous development of video monitoring technology, video monitoring equipment has been widely applied in the security field. At a monitored site, multiple types of cameras are linked in order to monitor a scene over a large range while also tracking the details of a specific target.
In the prior art, when a second camera (e.g., a panoramic camera) with a large monitoring range detects a target, it sends an instruction to a first camera (e.g., a detail camera). To better control the first camera, a decision layer is usually arranged on the second camera to decide whether to command the first camera to rotate, preventing the first camera from failing to execute actions when several instructions arrive within a short period of time. However, every second camera must carry such a decision layer, and the decision layers of all second cameras must be interconnected, which complicates the flow of instruction execution on the first camera, lowers execution efficiency, and raises the probability of confusion among instructions.
Disclosure of Invention
The technical problem mainly solved by the present application is to provide a target tracking method, an electronic device, and a computer-readable storage medium that simplify the flow of instruction execution on a first camera and improve the efficiency and orderliness of instruction execution.
In order to solve the above technical problem, a first aspect of the present application provides a target tracking method, including: in response to a first camera obtaining a first instruction sent by a second camera while the first camera is executing a second instruction, judging whether the time for which the first camera has executed the second instruction exceeds the shortest execution period corresponding to the second instruction; if so, the first camera executes the first instruction to track a corresponding target; otherwise, the first camera executes the instruction associated with the highest priority information, based on the priority information associated with the first and second instructions.
Wherein the step of the first camera executing the instruction associated with the highest priority information based on the priority information associated with the first instruction and the second instruction comprises: acquiring priority information of the second camera sending the first instruction and priority information of a third camera sending the second instruction; in response to the priority information of the second camera being higher than the priority information of the third camera, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the second camera not being higher than the priority information of the third camera, the first camera continues to execute the second instruction.
Wherein the second camera and the third camera belong to the same or different camera groups, and different camera groups have different priority information, and the step of obtaining the priority information of the second camera sending the first instruction and the priority information of the third camera sending the second instruction includes: and acquiring priority information of a camera group where the second camera sending the first instruction is located and priority information of a camera group where a third camera sending the second instruction is located.
Wherein the step of the first camera executing the instruction associated with the highest priority information based on the priority information associated with the first instruction and the second instruction comprises: obtaining a first target contained in the first instruction and a second target contained in the second instruction; in response to priority information of the first target being higher than priority information of the second target, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the first target not being higher than the priority information of the second target, the first camera continues to execute the second instruction.
Wherein the step of obtaining the first target included in the first instruction and the second target included in the second instruction includes: obtaining the type of the first target and its corresponding priority information from the first instruction, and obtaining the type of the second target and its corresponding priority information from the second instruction.
Wherein the step of the first camera executing the instruction associated with the highest priority information based on the priority information associated with the first instruction and the second instruction comprises: acquiring priority information of the second camera sending the first instruction and priority information of a third camera sending the second instruction; in response to the priority information of the second camera being higher than the priority information of the third camera, the first camera executing the first instruction to track the corresponding target; or, in response to the priority information of the second camera being lower than the priority information of the third camera, the first camera continues to execute the second instruction; or, in response to the priority information of the second camera being equal to the priority information of the third camera, acquiring a first target included in the first instruction and a second target included in the second instruction; in response to priority information of the first target being higher than priority information of the second target, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the first target not being higher than the priority information of the second target, the first camera continues to execute the second instruction.
Wherein the step of the first camera continuing to execute the second instruction comprises: judging whether another instruction is received before the second instruction reaches its shortest execution period; if so, discarding the first instruction, taking the other instruction as a new first instruction, and returning to the step in which the first camera executes the instruction associated with the highest priority information based on the priority information associated with the first instruction and the second instruction; otherwise, the first camera executes the first instruction to track the corresponding target.
Wherein the step of the first camera executing the first instruction to track the corresponding target comprises: obtaining and storing the priority information and the shortest execution period corresponding to the first instruction; executing the first instruction and recording a first timestamp at which execution of the first instruction starts; and marking the first instruction as the second instruction. The step of judging whether the time for the first camera to execute the second instruction exceeds the shortest execution period corresponding to the second instruction includes: obtaining a current timestamp, and judging whether the difference between the current timestamp and the first timestamp is greater than the shortest execution period.
In order to solve the above technical problem, a second aspect of the present application provides an electronic device, which includes a memory and a processor coupled to each other, wherein the memory stores program data, and the processor calls the program data to execute the target tracking method of the first aspect.
To solve the above technical problem, a third aspect of the present application provides a computer-readable storage medium having stored thereon program data, which when executed by a processor, implements the object tracking method of the first aspect.
The beneficial effects of the present application are as follows: a first camera obtains a first instruction from a second camera; if the first camera contains a second instruction being executed, it judges whether the time spent executing the second instruction exceeds the shortest execution period corresponding to the second instruction; if so, it executes the first instruction to track the target; otherwise, it selects the instruction associated with the highest priority information from among the first and second instructions, thereby determining whether to continue executing the second instruction or to execute the first instruction to track the target. In this way, the arbitration between the first instruction sent by the second camera and the second instruction being executed takes place on the first camera itself, which reduces the complexity of a system in which multiple types of cameras are linked, simplifies the flow of instruction execution on the first camera, derives the execution logic from the shortest execution period and the priority information of the instructions, and improves the orderliness and efficiency of instruction execution.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort. Wherein:
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of a target tracking method of the present application;
FIG. 2 is a schematic flow chart diagram illustrating another embodiment of a target tracking method according to the present application;
FIG. 3 is a schematic flow chart diagram illustrating a further embodiment of a target tracking method of the present application;
FIG. 4 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments herein without creative effort shall fall within the protection scope of the present application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association between objects and indicates that three relationships may exist; e.g., A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship. Further, the term "plurality" herein means two or more.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a target tracking method according to the present application, the method including:
Step S101: the first camera obtains a first instruction sent by the second camera while the first camera is executing a second instruction.
Specifically, when the first camera obtains the first instruction sent by the second camera, it is determined whether the first camera contains a second instruction being executed; if not, the first camera executes the first instruction to track the corresponding target, and if so, the process proceeds to step S102.
In one application mode, the first camera is a detail camera (such as a dome camera) that tracks a target and acquires detail images, and the second camera is a fixed-position panoramic camera (such as a bullet camera) that monitors a large range; a plurality of second cameras jointly call one first camera. When a second camera captures a target to be tracked, it generates a first instruction and sends it to the first camera. If the first camera has no second instruction being executed, it executes the first instruction and marks it as the second instruction; if the first camera contains a second instruction being executed, the process proceeds to step S102.
Step S102: and judging whether the time for the first camera to execute the second instruction exceeds the shortest execution period corresponding to the second instruction.
Specifically, the time for which the first camera has executed the second instruction is obtained, and it is judged whether this time exceeds the shortest execution period corresponding to the second instruction; if so, step S103 is executed, otherwise step S104 is executed. The shortest execution period is preset by the user and stored in the first camera.
In one application mode, the shortest execution period of the instruction corresponding to each second camera is set based on the position of the first camera relative to that second camera, where the shortest execution period is counted from the moment the first camera receives the instruction sent by the second camera.
In a specific application scenario, the larger the angle between the first camera and the second camera, the longer the shortest execution period of the first instruction sent by that second camera.
In another application mode, the shortest execution period of the instruction is set based on the type of target captured by the second camera, where the shortest execution period is counted from the moment the first camera receives the instruction sent by the second camera.
In a specific application scenario, the shortest execution period of the instruction is set based on the type of target collected by the second camera: after collecting the target, the second camera analyzes its type and includes the type information in the first instruction when sending it. The shortest execution periods corresponding to pedestrians, non-motor vehicles, and motor vehicles decrease in that order.
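For illustration only, the two ways of setting the shortest execution period described above (by the relative position/angle between the cameras, and by the target type) can be sketched as follows. The patent does not give concrete values or formulas; every constant and function name below is a hypothetical illustration of the stated monotonic relationships, not the claimed implementation.

```python
# Hypothetical sketch; constants are illustrative only, not from the patent.

# Rule 1: the larger the angle between the first and second camera,
# the longer the shortest execution period of the first instruction.
BASE_PERIOD_S = 2.0    # assumed base period, in seconds
PER_DEGREE_S = 0.05    # assumed growth per degree of inter-camera angle

# Rule 2: the periods for pedestrian, non-motor vehicle, and motor vehicle
# decrease in that order, as stated in the scenario above.
TARGET_TYPE_PERIOD_S = {
    "pedestrian": 6.0,
    "non_motor_vehicle": 4.0,
    "motor_vehicle": 2.0,
}

def shortest_period_by_angle(angle_deg):
    """Shortest execution period grows with the inter-camera angle."""
    return BASE_PERIOD_S + PER_DEGREE_S * angle_deg

def shortest_period_by_target(target_type):
    """Shortest execution period shrinks for faster-moving target types."""
    return TARGET_TYPE_PERIOD_S[target_type]
```

Either rule yields a per-instruction period that the first camera stores and counts from the moment it receives the instruction.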
Step S103: the first camera executes the first instruction to track the corresponding target.
Specifically, the first camera executes the first instruction to track the target included in the first instruction: the first camera rotates to the position corresponding to the target, tracks it, and captures the corresponding detail information.
In an application mode, after the first camera acquires the first instruction, the priority information and the shortest execution period corresponding to the first instruction are acquired and stored, the first instruction is executed, a first timestamp for starting execution of the first instruction is recorded, and the first instruction is marked as a second instruction.
Specifically, the priority information and the shortest execution period corresponding to the first instruction are stored, the first timestamp at which the first instruction starts executing is obtained, and the first instruction is marked as the second instruction being executed. When a new first instruction is subsequently obtained during execution of this second instruction, the current timestamp is acquired and it is judged whether the difference between the current timestamp and the first timestamp is greater than the shortest execution period, providing an accurate basis for deciding whether the time spent executing the second instruction has exceeded its shortest execution period.
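The timestamp bookkeeping just described can be sketched with a small record type. The class and field names are hypothetical illustrations of the described logic, not the patent's implementation:

```python
import time

class ExecutingInstruction:
    """Record kept by the first camera for the instruction it is executing
    (the 'second instruction'): its priority information, its shortest
    execution period, and the first timestamp recorded at start."""

    def __init__(self, priority, min_period_s):
        self.priority = priority
        self.min_period_s = min_period_s
        # First timestamp: the moment execution of this instruction starts.
        self.first_timestamp = time.monotonic()

    def exceeded_min_period(self, now=None):
        """True if (current timestamp - first timestamp) is greater than
        the shortest execution period, i.e. a new instruction may run."""
        if now is None:
            now = time.monotonic()
        return (now - self.first_timestamp) > self.min_period_s
```

A monotonic clock is used in the sketch so the elapsed-time comparison is unaffected by wall-clock adjustments.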
Step S104: based on the priority information associated with the first instruction and the second instruction, the first camera executes the instruction associated with the highest priority information.
Specifically, each second camera is provided with corresponding priority information. When a request instruction is generated, the priority information corresponding to the second camera is combined with the request instruction to form the first instruction. When the first camera executes the first instruction, it stores the priority information corresponding to that instruction and marks the instruction as the second instruction; the stored priority information then serves as the priority information of the second instruction.
Further, the first camera acquires the priority information of the first instruction and of the second instruction and compares them; if the priority information of the first instruction is higher than that of the second instruction, it executes the first instruction to track the corresponding target, and otherwise it continues to execute the second instruction to track the corresponding target.
In a specific application scenario, a system linking multiple types of cameras comprises one first camera and a plurality of second cameras, where the first camera is a dome camera and the second cameras are bullet cameras. Each bullet camera is assigned corresponding priority information and a shortest execution period. When any bullet camera detects a target, it generates a request instruction, attaches the corresponding priority information and shortest execution period to form a first instruction, and sends it to the dome camera. If the dome camera has no second instruction being executed, it executes the first instruction. If it does, it judges whether the time spent executing the second instruction exceeds the shortest execution period corresponding to that instruction: if so, the dome camera executes the first instruction; if not, it compares the priority information of the first and second instructions, executes the first instruction only when its priority information is higher than that of the second instruction, and otherwise continues executing the second instruction.
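The arbitration in the scenario above reduces to a small decision function on the dome camera. This is a sketch under assumed data shapes (plain dicts with hypothetical `priority` and `min_period_s` keys), not the patent's actual implementation:

```python
def decide(new_instr, executing, elapsed_s):
    """Pick the instruction the dome camera should execute next.

    new_instr: the incoming first instruction.
    executing: the second instruction currently running, or None.
    elapsed_s: time already spent executing `executing`, in seconds.
    """
    if executing is None:
        # No second instruction is being executed: run the new one.
        return new_instr
    if elapsed_s > executing["min_period_s"]:
        # Shortest execution period exceeded: the new instruction runs.
        return new_instr
    if new_instr["priority"] > executing["priority"]:
        # Only a strictly higher priority preempts the second instruction.
        return new_instr
    return executing
```

Note that on equal priority the second instruction keeps running, matching the condition that the first instruction is executed only when its priority information is higher.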
According to the above scheme, the first camera obtains the first instruction of the second camera; if the first camera contains a second instruction being executed, it judges whether the time spent executing the second instruction exceeds the corresponding shortest execution period; if so, it executes the first instruction to track the target; otherwise, it selects the instruction associated with the highest priority information from the first and second instructions and decides whether to continue executing the second instruction or to execute the first instruction to track the target. The arbitration between the first instruction sent by the second camera and the second instruction being executed thus takes place on the first camera itself, which reduces the complexity of a system linking multiple types of cameras, simplifies the instruction execution flow of the first camera, derives the execution logic from the shortest execution period and the priority information of the instructions, and improves the orderliness and efficiency of instruction execution.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating another embodiment of a target tracking method according to the present application, the method including:
Step S201: the first camera obtains a first instruction sent by the second camera while the first camera is executing a second instruction.
Specifically, when the first camera obtains the first instruction sent by the second camera, the data of the second instruction being executed is extracted from the first camera; if no second instruction is being executed, the data is empty, and if a second instruction is being executed, its priority information, shortest execution period, and the first timestamp at which its execution started are obtained.
Step S202: and judging whether the time for the first camera to execute the second instruction exceeds the shortest execution period corresponding to the second instruction.
Specifically, it is determined whether the difference between the current timestamp and the first timestamp is greater than the shortest execution period, if so, the process proceeds to step S205, and if not, the process proceeds to step S203.
Step S203: a first target included in the first instruction and a second target included in the second instruction are obtained.
Specifically, different types of targets correspond to different priority information. When the second camera detects a target, it generates a request instruction and adds the type corresponding to the target; after the first camera obtains the first instruction or the second instruction, it extracts the priority information corresponding to the type of the first target included in the first instruction and the priority information corresponding to the type of the second target included in the second instruction.
In one application mode, the types of the targets shot by the second camera and the third camera are the same or different, and different types of the targets have different priority information. The step of obtaining a first target included in the first instruction and a second target included in the second instruction includes: the type of the first target and the corresponding priority information contained in the first instruction are obtained, and the type of the second target and the corresponding priority information contained in the second instruction are obtained.
Specifically, different types of targets are provided with different priority information, wherein the hierarchy of the priority information is associated with the moving speed or the degree of importance of the target.
In one application scenario, pedestrians, non-motor vehicles, and motor vehicles are assigned priority information with levels from low to high, respectively. A pedestrian moves slowly and remains in the overall monitoring range for a long time, so the priority level is lowest when the target is a pedestrian; a motor vehicle remains in the overall monitoring range only briefly, so the priority level is highest when the target is a motor vehicle, allowing important information such as the license plate to be captured as soon as possible.
In another application scenario, pedestrians in a preset uniform and pedestrians not in a preset uniform are assigned priority information with levels from low to high, respectively. Pedestrians in the preset uniform are staff whose face or body feature information has been collected in advance, so their priority level is lowest; pedestrians not in the preset uniform are likely to have no face or body feature information on file, so they need to be tracked as soon as possible and their priority level is highest.
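The two scenarios above imply a priority ordering over target types. A hypothetical encoding might look like the following; the numeric levels and names are invented for illustration and are not specified by the patent:

```python
# Hypothetical priority levels; a higher number means a higher priority.
TARGET_PRIORITY = {
    "pedestrian_in_preset_uniform": 0,  # staff, features already on file
    "pedestrian": 1,                    # slow, long dwell time in view
    "non_motor_vehicle": 2,
    "motor_vehicle": 3,                 # short dwell time: capture plate ASAP
}

def first_target_outranks(first_type, second_type):
    """True if the first instruction's target type has strictly higher
    priority than the second instruction's target type (step S204)."""
    return TARGET_PRIORITY[first_type] > TARGET_PRIORITY[second_type]
```

Only a strictly higher target priority switches the camera to the first instruction; on a tie the second instruction keeps running.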
Further, in response to the first camera starting to execute the second instruction, the priority information corresponding to the type of the second target in the second instruction is stored; in response to the first camera obtaining the first instruction sent by the second camera, the first camera stores the priority information corresponding to the type of the first target in the first instruction. The first camera side thus makes a unified decision on the instruction execution order through the priority information, avoiding the need for a decision layer on the second or third camera side and improving the orderliness of instruction execution.
Step S204: it is determined whether the priority information of the first target is higher than the priority information of the second target.
Specifically, in response to the priority information of the first target being higher than the priority information of the second target, the first camera executes the first instruction to track the corresponding target, proceeding to step S205. In response to the priority information of the first target not being higher than the priority information of the second target, the first camera continues to execute the second instruction, and proceeds to step S206.
Step S205: the first camera executes the first instruction to track the corresponding target.
Specifically, the first camera executes the first instruction to track a target included in the first instruction, saves priority information and a shortest execution cycle corresponding to the first instruction, acquires a first timestamp when the first instruction starts to be executed, and marks the first instruction as a second instruction being executed.
Step S206: the first camera continues to execute the second instruction.
Specifically, the first camera continues to execute the second instruction to keep tracking the target included in the second instruction, and continues to wait for any other first instruction to be sent to the first camera.
In one application mode, it is judged whether another instruction is received before the second instruction reaches its shortest execution period; if so, the first instruction is discarded, the other instruction is taken as a new first instruction, and the process returns to the step in which the first camera executes the instruction associated with the highest priority information based on the priority information associated with the first instruction and the second instruction; otherwise, the first camera executes the first instruction to track the corresponding target.
Specifically, if no further instruction is received by the time the difference between the current timestamp and the first timestamp of the second instruction exceeds the shortest execution period, the first camera executes the first instruction to track the corresponding target; this improves the completeness of instruction execution and ensures that a first instruction sent by the second camera is executed by the first camera whenever possible. If another instruction is received before the difference between the current timestamp and the first timestamp reaches the shortest execution period, the earlier first instruction is discarded, the other instruction is taken as the new first instruction, the priority information of the new first instruction and of the second instruction is compared, and it is determined on that basis whether to continue executing the second instruction or to execute the first instruction.
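The waiting behaviour just described, in which a newer incoming instruction displaces the pending first instruction and the camera switches to whatever is pending once the shortest execution period elapses, can be sketched as a one-slot queue. All names here are hypothetical illustrations:

```python
class PendingSlot:
    """Holds at most one pending 'first instruction' while the second
    instruction has not yet reached its shortest execution period."""

    def __init__(self):
        self.pending = None

    def receive(self, instruction):
        # A newer instruction discards the earlier pending first instruction.
        self.pending = instruction

    def take_when_period_elapses(self):
        # Once the second instruction's shortest execution period elapses,
        # the camera switches to the pending instruction, if any.
        instruction, self.pending = self.pending, None
        return instruction
```

A single slot suffices because only the most recent first instruction is ever kept; earlier ones are discarded on arrival of a newer one.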
According to this scheme, the first instruction and the second instruction each include a target type, each target type corresponds to priority information, and the first camera decides how to execute the first instruction or the second instruction based on the level of the priority information and the shortest execution cycle of the second instruction, which reduces system complexity and improves decision efficiency.
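As a concrete illustration of the decision rule above, the preemption check can be written as follows. This is a sketch under assumed field names; the patent specifies no data model.

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    target: str                   # target to track
    priority: int                 # higher value = higher-level priority (assumed encoding)
    min_execution_period: float   # shortest execution cycle, in seconds

def decide(current: Instruction, elapsed: float, incoming: Instruction) -> Instruction:
    """Return the instruction the first camera should execute next.

    Once the current (second) instruction has run for its shortest
    execution cycle, it may be preempted unconditionally; before that,
    the incoming (first) instruction wins only if its priority is
    strictly higher ("not higher" keeps the second instruction)."""
    if elapsed > current.min_execution_period:
        return incoming
    if incoming.priority > current.priority:
        return incoming
    return current
```

For example, `decide(Instruction("car", 2, 5.0), 2.0, Instruction("person", 1, 5.0))` keeps the current instruction, while the same call with `elapsed=6.0` switches to the incoming one.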
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating a target tracking method according to another embodiment of the present application, the method including:
Step S301: in response to the first camera obtaining a first instruction sent by the second camera, where the first camera is executing a second instruction.
Step S302: determine whether the time for which the first camera has executed the second instruction exceeds the shortest execution period corresponding to the second instruction.
Specifically, the steps S301 to S302 are similar to those described above, and the description of the related contents refers to the detailed description of the above embodiments, which is not repeated herein.
Step S303: the first camera executes the first instructions to track the corresponding target.
Specifically, the first camera executes the first instruction to track a target included in the first instruction, saves priority information and a shortest execution cycle corresponding to the first instruction, acquires a first timestamp when the first instruction starts to be executed, and marks the first instruction as a second instruction being executed.
Step S304: priority information of a second camera that transmits the first instruction and priority information of a third camera that transmits the second instruction are obtained.
Specifically, the cameras other than the first camera are divided into a plurality of camera groups, each camera group is assigned corresponding priority information, and when the second camera or the third camera generates a request instruction, the corresponding priority information is added to the request instruction based on the camera group in which that camera is located. After the first camera acquires the first instruction or the second instruction, it extracts the priority information of the second camera contained in the first instruction and the priority information of the third camera contained in the second instruction.
In one application, the second camera and the third camera belong to the same or different camera groups, and the different camera groups have different priority information. The step of obtaining priority information of a second camera sending the first instruction and priority information of a third camera sending the second instruction includes: and acquiring priority information of a camera group where a second camera sending the first instruction is located and priority information of a camera group where a third camera sending the second instruction is located.
Specifically, cameras other than the first camera are divided into a plurality of camera groups, wherein each camera group corresponds to priority information, the hierarchy of which is associated with the position of the camera in the monitored site.
In one application scenario, each camera other than the first camera is assigned a score based on the importance of its position in the monitored site; a plurality of score bands is selected based on these scores, cameras whose scores fall in the same band are placed in the same camera group, and a higher score band corresponds to a higher level of priority information for the camera group.
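The score-band grouping described in this scenario might be sketched as follows; this is illustrative only, since the patent does not fix the number of bands or the scoring scale.

```python
def group_cameras_by_score(scores, band_lower_bounds):
    """Partition cameras into camera groups by importance score.

    `scores` maps camera id -> importance score of its position in the
    monitored site; `band_lower_bounds` is an ascending list of lower
    bounds, one per score band. Cameras in the same band share a camera
    group, and a higher band index means higher-level priority
    information."""
    groups = {}
    for cam, score in scores.items():
        # group level = index of the highest band whose lower bound <= score
        level = sum(1 for b in band_lower_bounds if score >= b) - 1
        groups.setdefault(level, []).append(cam)
    return groups
```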
Further, in response to the first camera starting to execute the second instruction, the priority information of the camera group of the third camera that sent the second instruction is stored; in response to the first camera obtaining the first instruction sent by the second camera, the first camera stores the priority information of the camera group of the second camera that sent the first instruction. The first camera thus makes a unified decision on the order of instruction execution using the priority information, which avoids placing a decision layer on the second-camera or third-camera side and improves the rationality of the instruction execution order.
Step S305: the first camera executes an instruction related to the highest priority information based on a magnitude relationship of the priority information of the second camera and the priority information of the third camera.
Specifically, the priority information of the second camera is compared with that of the third camera, and the first camera preferentially executes the instruction whose priority information has the highest level.
In an application mode, in response to the priority information of the second camera being higher than the priority information of the third camera, the first camera executes a first instruction to track the corresponding target; in response to the priority information of the second camera not being higher than the priority information of the third camera, the first camera continues to execute the second instruction.
Specifically, the second camera sending the first instruction corresponds to priority information, the third camera sending the second instruction corresponds to priority information, and then how to execute the first instruction or the second instruction is determined by the first camera based on the hierarchy of the priority information, so that the complexity of the system is reduced, and the decision efficiency is improved.
In another application mode: in response to the priority information of the second camera being higher than that of the third camera, the first camera executes the first instruction to track the corresponding target; in response to the priority information of the second camera being lower than that of the third camera, the first camera continues to execute the second instruction; and in response to the priority information of the second camera being equal to that of the third camera, a first target included in the first instruction and a second target included in the second instruction are acquired, after which the first camera executes the first instruction to track the corresponding target if the priority information of the first target is higher than that of the second target, and continues to execute the second instruction otherwise.
Specifically, when the priority information of the second camera is higher than that of the third camera, the first camera executes the first instruction to track the corresponding target, and when the priority information of the second camera is lower than that of the third camera, the first camera continues to execute the second instruction. When the priority information of the second camera is equal to that of the third camera, the first target included in the first instruction and the second target included in the second instruction are obtained, and the relative level of the priority information corresponding to the first target and that corresponding to the second target is determined.
Further, the second camera sending the first instruction and the third camera sending the second instruction each correspond to priority information. The first camera first decides based on the priority information corresponding to the cameras; if the priority levels of the second camera and the third camera are the same, the first camera then decides based on the priority information corresponding to the target types. Splitting the comparison of the priority information of the first instruction and the second instruction into these two levels improves the reasonableness with which the first camera executes instructions.
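The two-level decision just described (camera priority first, target-type priority as the tie-break) can be sketched as follows; the parameter names are illustrative.

```python
def choose_instruction(cam_prio_first, cam_prio_second,
                       target_prio_first, target_prio_second):
    """Return 'first' or 'second': which instruction the first camera
    executes. Camera-group priority decides first; on a tie, the
    tracked targets' type priority decides, with an equal target
    priority keeping the second instruction ("not higher")."""
    if cam_prio_first > cam_prio_second:
        return "first"
    if cam_prio_first < cam_prio_second:
        return "second"
    # camera priorities equal: fall back to target-type priority
    return "first" if target_prio_first > target_prio_second else "second"
```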
Referring to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of an electronic device 40 of the present application, where the electronic device includes a memory 401 and a processor 402 coupled to each other, where the memory 401 stores program data (not shown), and the processor 402 calls the program data to implement the target tracking method in any of the embodiments described above, and for a description of relevant contents, reference is made to the detailed description of the embodiment of the method described above, which is not repeated herein.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a computer-readable storage medium 50 of the present application, the computer-readable storage medium 50 stores program data 500, and the program data 500 is executed by a processor to implement the target tracking method in any of the above embodiments, and the related contents are described in detail with reference to the above method embodiments, which are not repeated herein.
It should be noted that units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. A method of target tracking, the method comprising:
in response to a first camera obtaining a first instruction sent by a second camera, wherein the first camera is executing a second instruction;
determining whether the time for which the first camera has executed the second instruction exceeds the shortest execution period corresponding to the second instruction;
if yes, the first camera executes the first instruction to track a corresponding target;
otherwise, the first camera executes instructions associated with highest priority information based on priority information associated with the first and second instructions.
2. The target tracking method of claim 1, wherein the step of the first camera executing an instruction associated with highest priority information based on priority information associated with the first instruction and the second instruction comprises:
acquiring priority information of the second camera sending the first instruction and priority information of a third camera sending the second instruction;
in response to the priority information of the second camera being higher than the priority information of the third camera, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the second camera not being higher than the priority information of the third camera, the first camera continues to execute the second instruction.
3. The target tracking method of claim 2,
the second camera and the third camera belong to the same or different camera groups, and different camera groups have different priority information, and the step of obtaining the priority information of the second camera sending the first instruction and the priority information of the third camera sending the second instruction includes:
and acquiring priority information of a camera group where the second camera sending the first instruction is located and priority information of a camera group where a third camera sending the second instruction is located.
4. The target tracking method of claim 1, wherein the step of the first camera executing an instruction associated with highest priority information based on priority information associated with the first instruction and the second instruction comprises:
obtaining a first target contained in the first instruction and a second target contained in the second instruction;
in response to priority information of the first target being higher than priority information of the second target, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the first target not being higher than the priority information of the second target, the first camera continues to execute the second instruction.
5. The target tracking method of claim 4,
the types of the targets shot by the second camera and the third camera are the same or different, and different types of the targets have different priority information, and the step of obtaining the first target included in the first instruction and the second target included in the second instruction includes:
the type of the first target and the corresponding priority information thereof contained in the first instruction are obtained, and the type of the second target and the corresponding priority information thereof contained in the second instruction are obtained.
6. The target tracking method of claim 1, wherein the step of the first camera executing an instruction associated with highest priority information based on priority information associated with the first instruction and the second instruction comprises:
acquiring priority information of the second camera sending the first instruction and priority information of a third camera sending the second instruction;
in response to the priority information of the second camera being higher than the priority information of the third camera, the first camera executing the first instruction to track the corresponding target; or,
in response to the priority information of the second camera being lower than the priority information of the third camera, the first camera continuing to execute the second instruction; or,
acquiring a first target included in the first instruction and a second target included in the second instruction in response to the priority information of the second camera being equal to the priority information of the third camera;
in response to priority information of the first target being higher than priority information of the second target, the first camera executing the first instruction to track the corresponding target; in response to the priority information of the first target not being higher than the priority information of the second target, the first camera continues to execute the second instruction.
7. The target tracking method of any of claims 2-6, wherein the step of the first camera continuing to execute the second instruction comprises:
judging whether other instructions are received before the second instruction reaches the shortest execution cycle;
if so, discarding the first instruction and taking the other instructions as new first instructions, and then returning to the step that the first camera executes the instructions related to the highest priority information based on the priority information related to the first instruction and the second instruction;
otherwise, the first camera executes the first instruction to track the corresponding target.
8. The target tracking method of any of claims 1-6, wherein the step of the first camera executing the first instruction to track the corresponding target comprises:
obtaining and storing priority information and the shortest execution cycle corresponding to the first instruction;
executing the first instruction and recording a first timestamp for starting execution of the first instruction;
marking the first instruction as the second instruction;
the step of judging whether the time for the first camera to execute the second instruction exceeds the shortest execution period corresponding to the second instruction includes:
and acquiring a current timestamp, and judging whether the difference value between the current timestamp and the first timestamp is greater than the shortest execution period.
9. An electronic device, comprising: a memory and a processor coupled to each other, wherein the memory stores program data that the processor calls to perform the method of any of claims 1-8.
10. A computer-readable storage medium, on which program data are stored, which program data, when being executed by a processor, carry out the method of any one of claims 1-8.
CN202110639175.7A 2021-06-08 2021-06-08 Target tracking method, electronic device, and computer-readable storage medium Pending CN113470069A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110639175.7A CN113470069A (en) 2021-06-08 2021-06-08 Target tracking method, electronic device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN113470069A true CN113470069A (en) 2021-10-01

Family

ID=77869482

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110639175.7A Pending CN113470069A (en) 2021-06-08 2021-06-08 Target tracking method, electronic device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN113470069A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111291585A (en) * 2018-12-06 2020-06-16 杭州海康威视数字技术股份有限公司 Target tracking system, method and device based on GPS and dome camera
CN111405242A (en) * 2020-02-26 2020-07-10 北京大学(天津滨海)新一代信息技术研究院 Ground camera and sky moving unmanned aerial vehicle linkage analysis method and system
CN112135058A (en) * 2020-09-30 2020-12-25 浙江大华技术股份有限公司 Mobile terminal and method for controlling PTZ camera by using mobile terminal
CN112804108A (en) * 2021-01-29 2021-05-14 杭州海康威视数字技术股份有限公司 Signaling execution method and device, electronic equipment and machine-readable storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination