CN110264497A - Method and device for determining tracking duration, storage medium, and electronic device - Google Patents

Method and device for determining tracking duration, storage medium, and electronic device Download PDF

Info

Publication number
CN110264497A
CN110264497A CN201910502736.1A
Authority
CN
China
Prior art keywords
tracking
tracking object
frame image
duration
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910502736.1A
Other languages
Chinese (zh)
Other versions
CN110264497B (en)
Inventor
李中振
潘华东
龚磊
彭志蓉
林桥洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN201910502736.1A priority Critical patent/CN110264497B/en
Publication of CN110264497A publication Critical patent/CN110264497A/en
Application granted granted Critical
Publication of CN110264497B publication Critical patent/CN110264497B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30242Counting objects in image

Abstract

The present invention provides a method and device for determining a tracking duration, a storage medium, and an electronic device. The method includes: determining, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image; determining, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area; determining, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image; and, in a case where the first tracking object matches the second tracking object, determining in the Oth frame image that the tracking duration of the second tracking object is a second duration. The invention solves the problem of inaccurate calculation of the tracking duration of an object that fails to be detected, and achieves the effect of accurately determining the tracking duration.

Description

Method and device for determining tracking duration, storage medium, and electronic device
Technical field
The present invention relates to the field of computers, and in particular to a method and device for determining a tracking duration, a storage medium, and an electronic device.
Background art
In the prior art, when objects in a queue are tracked using images, only the continuity of tracking is relied on to judge the tracking duration of a tracking object.
When a tracking object fails to be detected, whether two detections belong to the same tracking object is judged either from the overlap ratio of the tracking boxes in consecutive images, or by comparing the similarity between a detected candidate object and the tracking object, so as to re-locate the tracking object.
It can be seen from the above that, in the prior art, the calculation of the tracking duration of an object that fails to be detected is inaccurate.
No effective solution to this problem existing in the prior art has yet been proposed in the related art.
Summary of the invention
The embodiments of the present invention provide a method and device for determining a tracking duration, a storage medium, and an electronic device, so as to at least solve the problem in the related art that the calculation of the tracking duration of an object that fails to be detected is inaccurate.
According to one embodiment of the present invention, a method for determining a tracking duration is provided, including: determining, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image, wherein the tracking duration of the first tracking object is a first duration; determining, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area; determining, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image, wherein O > M > N and O, M, N are positive integers; and, in a case where the first tracking object matches the second tracking object, determining in the Oth frame image that the tracking duration of the second tracking object is a second duration, wherein the second duration is the sum of the first duration and a shooting time interval, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image.
According to another embodiment of the present invention, a device for determining a tracking duration is provided, including: a first determining module, configured to determine, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image, wherein the tracking duration of the first tracking object is a first duration; a second determining module, configured to determine, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area; a third determining module, configured to determine, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image, wherein O > M > N and O, M, N are positive integers; and a fourth determining module, configured to determine, in a case where the first tracking object matches the second tracking object, that the tracking duration of the second tracking object in the Oth frame image is a second duration, wherein the second duration is the sum of the first duration and a shooting time interval, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image.
According to still another embodiment of the present invention, a storage medium is further provided, in which a computer program is stored, wherein the computer program is configured to perform, when run, the steps in any of the above method embodiments.
According to still another embodiment of the present invention, an electronic device is further provided, including a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to perform the steps in any of the above method embodiments.
With the present invention, the identifier of the first tracking object located in the target area, the tracking duration for which the first tracking object has been tracked, and the shooting time of the Nth frame image are determined in the Nth frame image obtained by video monitoring of the target area, and it is determined in the Mth frame image obtained by video monitoring of the target area that the first tracking object is not present in the target area, that is, the first tracking object fails to be detected. If a second tracking object appears in the Oth frame image, the identifier of the second tracking object and the shooting time of the Oth frame image are obtained, and in a case where the first tracking object matches the second tracking object, the sum of the first duration and the shooting time interval can be determined as the tracking duration of the second tracking object, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image. Thus, when the first tracking object and the second tracking object are the same object, the tracking duration is determined accurately. Therefore, the problem in the related art that the calculation of the tracking duration of an object that fails to be detected is inaccurate can be solved, and the effect of accurately determining the tracking duration is achieved.
Brief description of the drawings
The accompanying drawings described herein are provided for a further understanding of the present invention and constitute a part of this application. The exemplary embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of the present invention. In the accompanying drawings:
Fig. 1 is a hardware block diagram of a mobile terminal running the method for determining a tracking duration according to an embodiment of the present invention;
Fig. 2 is a flowchart of the method for determining a tracking duration according to an embodiment of the present invention;
Fig. 3 is a flowchart of a preferred embodiment in this embodiment;
Fig. 4 is a schematic diagram of the 491st frame of a video;
Fig. 5 is a schematic diagram of the 498th frame of the video;
Fig. 6 is a schematic diagram of the 533rd frame of the video;
Fig. 7 is a schematic diagram of the 573rd frame of the video;
Fig. 8 is a schematic diagram of the 592nd frame of the video;
Fig. 9 is a structural block diagram of the device for determining a tracking duration according to an embodiment of the present invention.
Detailed description of the embodiments
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings and in combination with embodiments. It should be noted that, where no conflict arises, the embodiments of this application and the features in the embodiments may be combined with each other.
It should be noted that the terms "first", "second", and the like in the specification, claims, and accompanying drawings are used to distinguish similar objects, and are not necessarily used to describe a particular order or sequence.
The method embodiments provided in the embodiments of this application may be executed on a mobile terminal, a computer terminal, or a similar computing device. Taking execution on a mobile terminal as an example, Fig. 1 is a hardware block diagram of a mobile terminal running the method for determining a tracking duration according to an embodiment of the present invention. As shown in Fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in Fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106 for communication functions and an input/output device 108. A person of ordinary skill in the art will understand that the structure shown in Fig. 1 is only illustrative and does not limit the structure of the above mobile terminal. For example, the mobile terminal 10 may include more or fewer components than those shown in Fig. 1, or have a configuration different from that shown in Fig. 1.
The memory 104 may be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the method for determining a tracking duration in the embodiments of the present invention. The processor 102 executes various functional applications and data processing, that is, implements the above method, by running the computer program stored in the memory 104. The memory 104 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 104 may further include memories remotely located relative to the processor 102, and these remote memories may be connected to the mobile terminal 10 through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 106 is used to receive or send data via a network. Specific examples of the above network may include a wireless network provided by a communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In another example, the transmission device 106 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
A method for determining a tracking duration is provided in this embodiment. Fig. 2 is a flowchart of the method for determining a tracking duration according to an embodiment of the present invention. As shown in Fig. 2, the process includes the following steps:
Step S202: determining, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image, wherein the tracking duration of the first tracking object is a first duration;
Optionally, in this embodiment, the method can be applied to queuing scenes, including but not limited to queuing for meals in a canteen, queuing for tickets at a railway station, queuing for registration at a hospital, and queuing to enter a railway station. The target area may be a queuing region, for example, the entrance area of a railway station. The identifier of the first tracking object may be a tracking ID number assigned to the first tracking object, and the first duration, the coordinates of the position of the first tracking object, feature information of the first tracking object, and the like may also be displayed alongside the identifier. In addition, during tracking, the shooting time at which each frame image is captured is displayed in that frame image.
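A minimal sketch of the per-object record implied above (identifier, accumulated tracking duration, coordinates, feature information, and shooting time) is given below; the field names are illustrative assumptions and are not taken from the patent text.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class TrackRecord:
    track_id: int                  # tracking ID number assigned to the object
    duration_s: float              # accumulated tracking duration ("first duration")
    position: Tuple[float, float]  # coordinates of the object's position in the frame
    features: Dict[str, object] = field(default_factory=dict)  # e.g. head-shoulder, color-texture features
    last_shot_time: float = 0.0    # shooting time of the last frame in which it was detected
```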
Step S204: determining, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area;
Optionally, in this embodiment, it may be determined that the first tracking object fails to be detected after the Nth frame image, or that it fails to be detected after the Nth frame image and before the Mth frame image. The shooting time of the Nth frame image and the shooting time of the Mth frame image are separated by a first preset duration, and, in the time period from a second preset duration before the shooting time of the Mth frame image to the shooting time of the Oth frame image, the first tracking object is not present in the target area and no tracking object matching the first tracking object appears.
Optionally, both the first preset duration and the second preset duration may be 1 second.
It should be noted that, after the first tracking object fails to be detected in the target area, the identifier of the first tracking object can be saved, that is, the first duration, the coordinates of the position of the first tracking object, the feature information of the first tracking object, and the like are saved, so as to facilitate subsequent comparison.
Step S206: determining, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image, wherein O > M > N and O, M, N are positive integers;
In this embodiment, the identifier of the second tracking object includes, but is not limited to, a tracking ID number. The shooting time of the Oth frame image may take the form of a date, for example, 2019-06-05.
Step S208: in a case where the first tracking object matches the second tracking object, determining in the Oth frame image that the tracking duration of the second tracking object is a second duration, wherein the second duration is the sum of the first duration and a shooting time interval, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image.
In this embodiment, the second duration of the second tracking object can be displayed in the Oth frame image so as to facilitate observation by the user.
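A hedged sketch of the duration update in step S208 follows: when the re-detected second tracking object matches the lost first tracking object, its duration is the first duration plus the shooting-time interval between the Oth and Nth frame images. The absolute timestamps in the example are illustrative only.

```python
def updated_duration(first_duration_s: float, shot_time_n: float, shot_time_o: float) -> float:
    """second duration = first duration + (t_O - t_N)"""
    return first_duration_s + (shot_time_o - shot_time_n)

# Consistent with Figs. 4 and 6 of the description (20 s before the loss, 23 s after
# re-detection); the timestamps 100.0 and 103.0 are assumed for illustration.
assert updated_duration(20.0, 100.0, 103.0) == 23.0
```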
With the present invention, the identifier of the first tracking object located in the target area, the tracking duration for which the first tracking object has been tracked, and the shooting time of the Nth frame image are determined in the Nth frame image obtained by video monitoring of the target area, and it is determined in the Mth frame image obtained by video monitoring of the target area that the first tracking object is not present in the target area, that is, the first tracking object fails to be detected during video monitoring. If a second tracking object appears in the Oth frame image, the identifier of the second tracking object and the shooting time of the Oth frame image are obtained, and in a case where the first tracking object matches the second tracking object, the sum of the first duration and the shooting time interval can be determined as the tracking duration of the second tracking object, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image. Thus, when the first tracking object and the second tracking object are the same object, the tracking duration is determined accurately. Therefore, the problem in the related art that the calculation of the tracking duration of an object that fails to be detected is inaccurate can be solved, and the effect of accurately determining the tracking duration is achieved.
Optionally, the execution subject of the above steps may be a terminal or the like, but is not limited thereto.
In an optional embodiment, after the identifier of the second tracking object appearing in the target area and the shooting time of the Oth frame image are determined in the Oth frame image obtained by video monitoring of the target area, first information representing the first tracking object is matched with second information representing the second tracking object, wherein the first information includes the coordinate position of the first tracking object, biological feature information of the first tracking object, and color-texture information of the first tracking object, and the second information includes the coordinate position of the second tracking object, biological feature information of the second tracking object, and color-texture information of the second tracking object; the first tracking object and the second tracking object are then matched using the first information and the second information. In this embodiment, the first information may be feature information of the first tracking object, and the second information may be feature information of the second tracking object. The distance between the first tracking object and the second tracking object can be obtained from the coordinate positions. The color-texture information may be, for example, the color of the clothes worn by, or the hair of, the first tracking object and the second tracking object.
In an optional embodiment, the first tracking object and the second tracking object may be matched in the following manner: in a case where the distance between the coordinate position of the first tracking object and the coordinate position of the second tracking object is less than a preset distance, the biological feature information of the first tracking object matches the biological feature information of the second tracking object, and the color-texture information of the first tracking object matches the color-texture information of the second tracking object, it is determined that the first tracking object matches the second tracking object. As an example, if the distance between the first tracking object and the second tracking object is small, for example 1 meter, they can be considered the same object according to normal queuing rules. If the distance between the first tracking object and the second tracking object is large, for example 10 meters, then according to normal queuing rules the first tracking object cannot have moved 10 meters within 1 second, so it can be judged that the first tracking object and the second tracking object are not the same object.
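A hedged sketch of this matching rule is given below, reusing the TrackRecord fields from the earlier sketch: the two objects are taken to be the same person only if their positions are close enough and both the biometric (head-shoulder) and color-texture features are similar. The cosine comparator, feature keys, and thresholds are illustrative assumptions, not the patent's own comparators.

```python
import math
import numpy as np

def feature_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors (illustrative choice)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def objects_match(first, second,
                  max_distance_m: float = 1.0,
                  min_similarity: float = 0.8) -> bool:
    # Coordinate constraint: a queued person cannot jump a large distance between frames.
    close_enough = math.dist(first.position, second.position) < max_distance_m
    # Biometric constraint (e.g. head-shoulder features) and color-texture constraint.
    biometric_ok = feature_similarity(first.features["head_shoulder"],
                                      second.features["head_shoulder"]) >= min_similarity
    texture_ok = feature_similarity(first.features["color_texture"],
                                    second.features["color_texture"]) >= min_similarity
    return close_enough and biometric_ok and texture_ok
```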
Optionally, in this embodiment, the biological feature information includes, but is not limited to, head-shoulder features and face information of the tracking object.
In an optional embodiment, in a case where the first information representing the first tracking object does not match the second information representing the second tracking object, the second tracking object may be determined as a tracking object appearing for the first time in the video monitoring, and the tracking duration of the second tracking object is timed from zero. In this embodiment, the loss information of the first tracking object can be saved for a certain period of time; if a second tracking object appearing in an image frame obtained within that period does not match the first tracking object, the second tracking object is considered not to be a previously undetected object, and it is tracked as a new tracking object.
In an optional embodiment, a displacement distance of the first tracking object before the Nth frame image is determined, and the maximum number of frames for which the first tracking object is saved is determined using the relationship between the displacement distance and a preset displacement threshold in the target area. In this embodiment, the queuing distance in the target area is fixed; the average displacement of the first tracking object can be determined from the first duration and the displacement distance of the first tracking object, and the ratio between the average displacement and the preset displacement threshold is determined as the maximum number of frames for which the first tracking object is saved. If the first tracking object fails to be detected in the Mth frame image, a newly appearing second tracking object can be compared with the first tracking object within the maximum number of frames after the Mth frame image.
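A hedged sketch of this "maximum saved frame count" follows, written according to the explicit formula given later in step S308 (preset displacement threshold d divided by the average per-frame displacement). The parameter names and the guard for near-zero motion are assumptions for illustration.

```python
def max_saved_frames(total_displacement_px: float,
                     frames_tracked: int,
                     displacement_threshold_px: float) -> int:
    avg_displacement = total_displacement_px / max(frames_tracked, 1)
    if avg_displacement < 1e-6:
        # Assumed fallback: a nearly stationary object would never cross the threshold,
        # so keep it only as long as it was already tracked.
        return frames_tracked
    return int(displacement_threshold_px / avg_displacement)
```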
In an optional embodiment, the number of tracked objects is indicated in a marking box that marks the target area; in a case where a second tracking object appearing in the target area is determined in the Oth frame image, the number of tracked objects indicated in the marking box of the target area is updated. In this embodiment, the number of objects being tracked is displayed in the target area of each frame image; when a tracking object fails to be detected, the number of objects being tracked is decreased, and when a new tracking object appears, the number of objects being tracked is increased.
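A minimal sketch of this count bookkeeping, purely illustrative: the count shown in the target area's marking box drops when tracked objects are lost and rises when new objects appear.

```python
def update_displayed_count(current_count: int, objects_lost: int, objects_new: int) -> int:
    """Count of tracked objects shown in the target area's marking box."""
    return current_count - objects_lost + objects_new
```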
The present invention is described in detail below with reference to a specific embodiment:
This embodiment provides a method for counting the queuing duration of pedestrians in a regulated queue. In this embodiment, the second tracking object is described by taking a new target as an example, the target area is described by taking a queuing area as an example, and the Oth frame image is described by taking the current frame as an example.
Tracking object detection and tracking is performed in the delimited queuing area. When a new target appears in the target area in the current frame, it is judged whether, in a preset number of frames before the current frame, a first tracking object failed to be detected. The position tracked before the first tracking object failed to be detected is obtained and compared with the position of the new target in the current frame, or the features of the new target are compared with the features of the first tracking object that failed to be detected. If the positional relationship satisfies a preset constraint relationship or the feature similarity reaches a threshold, the queuing duration of the newly appearing target is updated to the duration of the first tracking object plus the frame-interval duration from the frame in which it was lost to the current frame.
Fig. 3 is a flowchart of a preferred embodiment in this embodiment. As shown in Fig. 3, in a scene where pedestrians queue in a regulated manner, the tracking of tracking objects includes the following steps:
S301: obtaining queuing images of the regulated pedestrian queuing area; in this embodiment, the images may be obtained by cameras such as monocular, binocular, or fisheye cameras.
S302: determining several monitoring areas in the regulated pedestrian queuing area, and obtaining the queuing image in each monitoring area.
S303: detecting head-shoulder information of the tracking objects in the queuing images;
S304: tracking the tracking objects in the monitoring areas using the head-shoulder information; the queuing image in which a tracking object appears is determined as the first frame image, and information such as the tracking ID, coordinate position, and tracking duration of the tracking object is displayed in the first frame image.
S305: comparing the tracking IDs of the tracking objects in the second frame image with the tracking IDs in the first frame image;
S306: judging whether there is an undetected tracking object in the second frame image;
S307: if a tracking object fails to be detected in the second frame, retaining the coordinate position and tracking duration of the undetected tracking object, while saving the ID, coordinate position, tracking duration, and inter-frame pixel displacement of each successfully tracked object; otherwise, outputting the current queuing tracking duration.
The tracking IDs in the third frame image are compared with the tracking IDs in the second frame image, to judge whether any tracking object fails to be detected in the third frame image, and, relative to the tracking object IDs in the second frame image, whether a new tracking object ID appears in the third frame image. If a tracking object fails to be detected in the third frame image, the coordinates and tracking duration of the undetected tracking object are retained, while the ID, coordinate position, and tracking duration of each successfully tracked target are saved. If a new object ID appears in the third frame image and, at the same time, a tracking object failed to be detected in the second frame image, the coordinate position of each newly appearing ID in the third frame image is compared with the coordinate positions of all objects that failed to be detected in the second frame image, to decide whether the tracking duration of the newly appearing object needs to be updated. For example: the coordinates of the newly appearing object ID in the third frame image and the coordinates of all objects that failed to be detected in the second frame image are computed; if the distance between a certain undetected object A in the second frame image and a certain new object B in the third frame image satisfies a given threshold, or the distance and the color-texture features between object A and new object B satisfy given thresholds, the tracking duration of the new object B in the third frame image is updated: the tracking duration of the corresponding undetected object A in the second frame image that satisfies the constraint, plus the duration of the two-frame interval, is taken as the queuing duration of the new object B in the third frame image, and the information of the undetected object A in the second frame image is deleted.
S308: the tracking IDs in the mth frame image are compared with the tracking IDs in the (m-1)th frame image. For an object B that is successfully tracked in consecutive frames, the inter-frame displacement of object B is recorded, the average displacement x of object B over a period of time during which it is tracked and detected is computed, and the maximum number of frames for which object B is saved after it fails to be detected is calculated from a preset target displacement threshold d, using the formula: maximum number of saved frames after object B fails to be detected = preset target displacement threshold d / average displacement of object B over the preceding period of tracking. At the same time, it is judged whether any tracking object fails to be detected in the mth frame image and, relative to the tracking object IDs of the (m-1)th frame image, whether a new object ID appears in the mth frame image. If a tracking object fails to be detected in the mth frame image, the coordinates and tracking duration of the undetected object and the maximum number of frames for which it is saved are retained, while the ID, coordinate position, tracking duration, and pixel displacement of each successfully tracked target are saved. If a new ID appears in the mth frame image and, at the same time, there are saved undetected targets in the preceding m-1 frames, the coordinate position of the newly appearing ID in the mth frame image is compared with the coordinate positions of all saved undetected targets in the preceding m-1 frames (the judging method is the same as above), and it is then decided whether the queuing duration of the new object in the mth frame image is to be updated; the remaining new objects that do not satisfy the update condition are treated as new tracking targets that have normally entered the region. For the saved information of a certain undetected tracking object, if no new target satisfying the threshold corresponds to it for n consecutive frames (where n is the maximum number of frames for which that undetected tracking object is saved), the information of that target is deleted.
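A hedged end-to-end sketch of the per-frame bookkeeping in S308 follows, reusing the objects_match sketch above: lost tracks are kept for a bounded number of frames, a new ID that satisfies the matching rule inherits the lost track's duration plus the elapsed time, and stale lost tracks are dropped. All names, record fields, and the fixed frame period are assumptions for illustration only.

```python
def process_frame(frame_idx, detections, active_tracks, lost_tracks,
                  frame_period_s, displacement_threshold_px):
    # 1. Move tracks that disappeared this frame into the lost pool, with a maximum
    #    keep time of threshold d / average per-frame displacement (formula in S308).
    detected_ids = {d.track_id for d in detections}
    for tid in list(active_tracks):
        if tid not in detected_ids:
            rec = active_tracks.pop(tid)
            rec.lost_at = frame_idx
            rec.max_keep = int(displacement_threshold_px / max(rec.avg_displacement_px, 1e-6))
            lost_tracks[tid] = rec

    # 2. Update surviving tracks and try to re-associate brand-new IDs with the lost pool.
    for det in detections:
        if det.track_id in active_tracks:
            active_tracks[det.track_id].duration_s += frame_period_s
            continue
        match_id = next((tid for tid, rec in lost_tracks.items() if objects_match(rec, det)), None)
        if match_id is not None:
            gap_s = (frame_idx - lost_tracks[match_id].lost_at) * frame_period_s
            det.duration_s = lost_tracks.pop(match_id).duration_s + gap_s  # inherit queuing duration
        else:
            det.duration_s = 0.0  # genuinely new object entering the region
        active_tracks[det.track_id] = det

    # 3. Drop lost tracks whose maximum saved frame count has expired.
    for tid in [t for t, rec in lost_tracks.items() if frame_idx - rec.lost_at > rec.max_keep]:
        del lost_tracks[tid]
```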
S309: in a case where the coordinate position of a newly appearing ID in the mth frame image matches the coordinate position of a saved undetected target in the preceding m-1 frame images, the queuing duration of the new object in the mth frame image is updated.
S310: outputting the corrected queuing duration of the new target.
Fig. 4 is a schematic diagram of the 491st frame of a video, in which the large gray area is the regulated queuing waiting area and the small gray boxes are the tracked targets within the area. In Fig. 4, the target in the box at the lower right corner of the regulated queuing area is the observed target under monitoring; its current ID is 425, its tracking duration is 20 seconds, and the upper right corner of Fig. 4 shows the current time.
Fig. 5 is a schematic diagram of the 498th frame of the video. The observed target is in the box at the lower right corner of the queuing area in Fig. 5; this target fails to be detected because of reasons such as overexposure.
Fig. 6 is a schematic diagram of the 533rd frame of the video. In Fig. 6, the observed target in the box at the lower right corner is tracked again as a new target with a current ID of 477. If, as a new target, its tracking duration were not updated, the tracking duration of this ID would be timed from 0; with the method described herein, however, the duration of the new ID is updated, and after the update the queuing duration of this ID is 23 seconds. Taking the difference between the times at the upper right of Fig. 4 and Fig. 6 shows that the queuing duration before and after the observed target fails to be detected in the video monitoring is consistent with the true queuing duration.
Fig. 7 is a schematic diagram of the 573rd frame of the video; the observed target in the box at the lower right corner of Fig. 7 again fails to be detected.
Fig. 8 is a schematic diagram of the 592nd frame of the video. The observed target in the box at the lower right corner of Fig. 8 is tracked again as a new target with a current ID of 512. If, as a new target, its tracking duration were not updated, the tracking duration of this ID would be timed from 0; with the method described herein, however, the duration of the new ID is updated, and after the update the queuing duration of this ID is 27 seconds. Taking the difference between the times at the upper right of Fig. 6 and Fig. 8 shows that the queuing duration before and after the observed target fails to be detected is consistent with the true queuing duration.
Although the observed target in the box at the lower right corner fails to be detected twice, its queuing duration remains continuous. This method can therefore well solve the problem of inaccurate queuing duration calculation in regulated queuing processes after a target fails to be detected because of reasons such as occlusion.
In this embodiment, compared with the strategy of judging a target's duration solely by relying on tracking continuity, the problem of inaccurate calculation of a target's duration after the target fails to be detected during tracking can be solved to a certain extent.
Through the description of the above embodiments, a person skilled in the art can clearly understand that the method of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the method described in each embodiment of the present invention.
Additionally provide a kind of determining device for tracking duration in the present embodiment, the device for realizing above-described embodiment and Preferred embodiment, the descriptions that have already been made will not be repeated.As used below, predetermined function may be implemented in term " module " The combination of the software and/or hardware of energy.It is hard although device described in following embodiment is preferably realized with software The realization of the combination of part or software and hardware is also that may and be contemplated.
Fig. 9 is the structural block diagram of the determining device of tracking duration according to an embodiment of the present invention, as shown in figure 9, the device It include: the first determining module 92, the second determining module 94, third determining module 96 and the 4th determining module 98, below to this Device is described in detail:
The first determining module 92 is configured to determine, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image, wherein the tracking duration of the first tracking object is a first duration;
Optionally, in this embodiment, the device can be applied to queuing scenes, including but not limited to queuing for meals in a canteen, queuing for tickets at a railway station, queuing for registration at a hospital, and queuing to enter a railway station. The target area may be a queuing region, for example, the entrance area of a railway station. The identifier of the first tracking object may be a tracking ID number assigned to the first tracking object, and the first duration, the coordinates of the position of the first tracking object, feature information of the first tracking object, and the like may also be displayed alongside the identifier. In addition, during tracking, the shooting time at which each frame image is captured is displayed in that frame image.
The second determining module 94 is configured to determine, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area;
Optionally, in this embodiment, it may be determined that the first tracking object fails to be detected after the Nth frame image, or that it fails to be detected after the Nth frame image and before the Mth frame image. The shooting time of the Nth frame image and the shooting time of the Mth frame image are separated by a first preset duration, and, in the time period from a second preset duration before the shooting time of the Mth frame image to the shooting time of the Oth frame image, the first tracking object is not present in the target area and no tracking object matching the first tracking object appears.
Optionally, both the first preset duration and the second preset duration may be 1 second.
It should be noted that, after the first tracking object fails to be detected in the target area, the identifier of the first tracking object can be saved, that is, the first duration, the coordinates of the position of the first tracking object, the feature information of the first tracking object, and the like are saved, so as to facilitate subsequent comparison.
The third determining module 96 is configured to determine, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image, wherein O > M > N and O, M, N are positive integers;
In this embodiment, the identifier of the second tracking object includes, but is not limited to, a tracking ID number. The shooting time of the Oth frame image may take the form of a date, for example, 2019-06-05.
The fourth determining module 98 is configured to determine, in a case where the first tracking object matches the second tracking object, that the tracking duration of the second tracking object in the Oth frame image is a second duration, wherein the second duration is the sum of the first duration and a shooting time interval, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image.
In this embodiment, the second duration of the second tracking object can be displayed in the Oth frame image so as to facilitate observation by the user.
With the present invention, the identifier of the first tracking object located in the target area, the tracking duration for which the first tracking object has been tracked, and the shooting time of the Nth frame image are determined in the Nth frame image obtained by video monitoring of the target area, and it is determined in the Mth frame image obtained by video monitoring of the target area that the first tracking object is not present in the target area, that is, the first tracking object fails to be detected. If a second tracking object appears in the Oth frame image, the identifier of the second tracking object and the shooting time of the Oth frame image are obtained, and in a case where the first tracking object matches the second tracking object, the sum of the first duration and the shooting time interval can be determined as the tracking duration of the second tracking object, the shooting time interval being the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image. Thus, when the first tracking object and the second tracking object are the same object, the tracking duration is determined accurately. Therefore, the problem in the related art that the calculation of the tracking duration of an object that fails to be detected is inaccurate can be solved, and the effect of accurately determining the tracking duration is achieved.
Optionally, the execution subject of the above steps may be a terminal or the like, but is not limited thereto.
In an optional embodiment, after the identifier of the second tracking object appearing in the target area and the shooting time of the Oth frame image are determined in the Oth frame image obtained by video monitoring of the target area, first information representing the first tracking object is matched with second information representing the second tracking object, wherein the first information includes the coordinate position of the first tracking object, biological feature information of the first tracking object, and color-texture information of the first tracking object, and the second information includes the coordinate position of the second tracking object, biological feature information of the second tracking object, and color-texture information of the second tracking object; the first tracking object and the second tracking object are then matched using the first information and the second information. In this embodiment, the first information may be feature information of the first tracking object, and the second information may be feature information of the second tracking object. The distance between the first tracking object and the second tracking object can be obtained from the coordinate positions. The color-texture information may be, for example, the color of the clothes worn by, or the hair of, the first tracking object and the second tracking object.
In an optional embodiment, the first tracking object and the second tracking object may be matched in the following manner: in a case where the distance between the coordinate position of the first tracking object and the coordinate position of the second tracking object is less than a preset distance, the biological feature information of the first tracking object matches the biological feature information of the second tracking object, and the color-texture information of the first tracking object matches the color-texture information of the second tracking object, it is determined that the first tracking object matches the second tracking object. As an example, if the distance between the first tracking object and the second tracking object is small, for example 1 meter, they can be considered the same object according to normal queuing rules. If the distance between the first tracking object and the second tracking object is large, for example 10 meters, then according to normal queuing rules the first tracking object cannot have moved 10 meters within 1 second, so it can be judged that the first tracking object and the second tracking object are not the same object.
Optionally, in this embodiment, the biological feature information includes, but is not limited to, head-shoulder features and face information of the tracking object.
In an optional embodiment, in a case where the first information representing the first tracking object does not match the second information representing the second tracking object, the second tracking object may be determined as a tracking object appearing for the first time in the video monitoring, and the tracking duration of the second tracking object is timed from zero. In this embodiment, the loss information of the first tracking object can be saved for a certain period of time; if a second tracking object appearing in an image frame obtained within that period does not match the first tracking object, the second tracking object is considered not to be a previously undetected object, and it is tracked as a new tracking object.
In an optional embodiment, a displacement distance of the first tracking object before the Nth frame image is determined, and the maximum number of frames for which the first tracking object is saved is determined using the relationship between the displacement distance and a preset displacement threshold in the target area. In this embodiment, the queuing distance in the target area is fixed; the average displacement of the first tracking object can be determined from the first duration and the displacement distance of the first tracking object, and the ratio between the average displacement and the preset displacement threshold is determined as the maximum number of frames for which the first tracking object is saved. If the first tracking object fails to be detected in the Mth frame image, a newly appearing second tracking object can be compared with the first tracking object within the maximum number of frames after the Mth frame image.
In an optional embodiment, the number of tracked objects is indicated in a marking box that marks the target area; in a case where a second tracking object appearing in the target area is determined in the Oth frame image, the number of tracked objects indicated in the marking box of the target area is updated. In this embodiment, the number of objects being tracked is displayed in the target area of each frame image; when a tracking object fails to be detected, the number of objects being tracked is decreased, and when a new tracking object appears, the number of objects being tracked is increased.
It should be noted that the above modules can be implemented by software or hardware. For the latter, they can be implemented in, but are not limited to, the following ways: the above modules are all located in the same processor; or, the above modules are located in different processors in any combination.
The embodiments of the present invention also provide a storage medium in which a computer program is stored, wherein the computer program is configured to perform, when run, the steps in any of the above method embodiments.
Optionally, in this embodiment, the above storage medium may be configured to store a computer program for performing the above steps.
Optionally, in this embodiment, the above storage medium may include, but is not limited to: various media that can store a computer program, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The embodiments of the present invention also provide an electronic device, including a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to perform the steps in any of the above method embodiments.
Optionally, the above electronic device may further include a transmission device and an input/output device, wherein the transmission device is connected to the above processor, and the input/output device is connected to the above processor.
Optionally, in this embodiment, the above processor may be configured to perform the above steps through the computer program.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations, and details are not repeated in this embodiment.
Obviously, a person skilled in the art should understand that each module or each step of the present invention described above can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices; optionally, they can be implemented by program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; and in some cases, the steps shown or described can be performed in an order different from that described herein, or they can be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module for implementation. In this way, the present invention is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. For those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the principle of the present invention shall be included in the protection scope of the present invention.

Claims (11)

1. A method for determining a tracking duration, characterized by comprising:
determining, in an Nth frame image obtained by video monitoring of a target area, an identifier of a first tracking object located in the target area, a tracking duration for which the first tracking object has been tracked, and a shooting time of the Nth frame image, wherein the tracking duration of the first tracking object is a first duration;
determining, in an Mth frame image obtained by video monitoring of the target area, that the first tracking object is not present in the target area;
determining, in an Oth frame image obtained by video monitoring of the target area, an identifier of a second tracking object appearing in the target area and a shooting time of the Oth frame image, wherein O > M > N, and O, M, N are positive integers;
in a case where the first tracking object matches the second tracking object, determining, in the Oth frame image, that the tracking duration of the second tracking object is a second duration, wherein the second duration is the sum of the first duration and a shooting time interval, and the shooting time interval is the difference between the shooting time of the Oth frame image and the shooting time of the Nth frame image.
2. The method according to claim 1, characterized in that, after determining, in the Oth frame image obtained by video monitoring of the target area, the identifier of the second tracking object appearing in the target area and the shooting time of the Oth frame image, the method further comprises:
matching first information representing the first tracking object with second information representing the second tracking object, wherein the first information comprises a coordinate position of the first tracking object, biological feature information of the first tracking object, and color-texture information of an image region where the first tracking object is located, the second information comprises a coordinate position of the second tracking object, biological feature information of the second tracking object, and color-texture information of an image region where the second tracking object is located, and the biological feature information comprises head-shoulder features;
determining, using a matching result of the first information and the second information, whether the first tracking object matches the second tracking object.
3. The method according to claim 2, characterized in that determining that the first tracking object matches the second tracking object using the matching result of the first information and the second information comprises:
in a case where the distance between the coordinate position of the first tracking object and the coordinate position of the second tracking object is less than a preset distance, the biological feature information of the first tracking object matches the biological feature information of the second tracking object, and the color-texture information of the first tracking object matches the color-texture information of the second tracking object, determining that the first tracking object matches the second tracking object.
4. The method according to claim 2, characterized in that, in a case where the first information representing the first tracking object does not match the second information representing the second tracking object, the method further comprises:
determining the second tracking object as a tracking object appearing for the first time in the video monitoring;
timing the tracking duration of the second tracking object from zero.
5. The method according to claim 1, characterized by further comprising:
determining a displacement distance of the first tracking object before the Nth frame image;
determining, using a relationship between the displacement distance and a preset displacement threshold in the target area, a maximum number of frames for which the first tracking object is saved.
6. The method according to claim 5, characterized in that determining, using the relationship between the displacement distance and the preset displacement threshold in the target area, the maximum number of frames for which the first tracking object is saved comprises:
determining an average displacement of the first tracking object using the first duration and the displacement distance of the first tracking object;
determining a ratio between the average displacement and the preset displacement threshold as the maximum number of frames for which the first tracking object is saved.
7. The method according to any one of claims 1 to 6, characterized in that
the shooting time of the Nth frame image and the shooting time of the Mth frame image are separated by a first preset duration, and, in the time period from a second preset duration before the shooting time of the Mth frame image to the shooting time of the Oth frame image, the first tracking object is not present in the target area and no tracking object matching the first tracking object appears.
8. The method according to any one of claims 1 to 6, characterized in that
the number of tracked objects is indicated in a marking box that marks the target area;
in a case where a second tracking object appearing in the target area is determined in the Oth frame image, the number of tracked objects indicated in the marking box of the target area is updated.
9. a kind of determining device for tracking duration characterized by comprising
First determining module, for determine in the obtained nth frame image of video monitoring positioned at described to target area The mark of the first tracking object in target area, the tracking duration and the nth frame that first tracking object is tracked The shooting time of image, wherein a length of first duration when to the tracking of first tracking object tracking;
Second determining module, for carrying out determining institute in the obtained M frame image of video monitoring to the target area It states and the first tracking object is not present in target area;
Third determining module, described in being determined in the target area progress obtained O frame image of video monitoring The mark of the second tracking object occurred in target area and the shooting time of the O frame image, wherein O > M > N, institute Stating O, M, N is positive integer;
4th determining module is used under first tracking object and the matched situation of the second tracking object, described A length of second duration when the determining tracking tracked to second tracking object in O frame image, wherein a length of when described second The sum of first duration and shooting interval, shooting time and institute of the shooting interval for the O frame image State the difference of the shooting time of nth frame image.
10. A storage medium having a computer program stored therein, wherein the computer program is configured to perform the method according to any one of claims 1 to 8 when run.
11. An electronic device comprising a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to perform the method according to any one of claims 1 to 8.
CN201910502736.1A 2019-06-11 2019-06-11 Method and device for determining tracking duration, storage medium and electronic device Active CN110264497B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910502736.1A CN110264497B (en) 2019-06-11 2019-06-11 Method and device for determining tracking duration, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910502736.1A CN110264497B (en) 2019-06-11 2019-06-11 Method and device for determining tracking duration, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN110264497A true CN110264497A (en) 2019-09-20
CN110264497B CN110264497B (en) 2021-09-17

Family

ID=67917612

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910502736.1A Active CN110264497B (en) 2019-06-11 2019-06-11 Method and device for determining tracking duration, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN110264497B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150131859A1 (en) * 2007-02-07 2015-05-14 Samsung Electronics Co., Ltd. Method and apparatus for tracking object, and method and apparatus for calculating object pose information
CN103793921A (en) * 2012-10-29 2014-05-14 浙江大华技术股份有限公司 Moving object extraction method and moving object extraction device
CN103413295A (en) * 2013-07-12 2013-11-27 长沙理工大学 Video multi-target long-range tracking method
CN104867198A (en) * 2015-03-16 2015-08-26 北京首都国际机场股份有限公司 Queuing time acquiring method and queuing time acquiring apparatus
CN108764167A (en) * 2018-05-30 2018-11-06 上海交通大学 A kind of target of space time correlation recognition methods and system again
CN109117721A (en) * 2018-07-06 2019-01-01 江西洪都航空工业集团有限责任公司 A kind of pedestrian hovers detection method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111126807A (en) * 2019-12-12 2020-05-08 浙江大华技术股份有限公司 Stroke segmentation method and device, storage medium and electronic device
CN111126807B (en) * 2019-12-12 2023-10-10 浙江大华技术股份有限公司 Stroke segmentation method and device, storage medium and electronic device
CN111008611A (en) * 2019-12-20 2020-04-14 浙江大华技术股份有限公司 Queuing time determining method and device, storage medium and electronic device
US20210258495A1 (en) * 2020-02-19 2021-08-19 Canon Kabushiki Kaisha Subject tracking device, subject tracking method, and storage medium
US11553136B2 (en) * 2020-02-19 2023-01-10 Canon Kabushiki Kaisha Subject tracking device, subject tracking method, and storage medium

Also Published As

Publication number Publication date
CN110264497B (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN110264497A (en) Track determination method and device, the storage medium, electronic device of duration
CN112102369B (en) Autonomous inspection method, device, equipment and storage medium for water surface floating target
CN104639840B (en) Image processing apparatus and image processing method
CN106651916A (en) Target positioning tracking method and device
CN108985199A (en) Detection method, device and the storage medium of commodity loading or unloading operation
CN112883819A (en) Multi-target tracking method, device, system and computer readable storage medium
CN109035304A (en) Method for tracking target, calculates equipment and device at medium
CN101167086A (en) Human detection and tracking for security applications
CN108337551A (en) A kind of screen recording method, storage medium and terminal device
CN108805900A (en) A kind of determination method and device of tracking target
JP6688975B2 (en) Monitoring device and monitoring system
CN110418170A (en) Detection method and device, storage medium and electronic device
CN109584265A (en) A kind of method for tracking target and device
CN107408119A (en) Image retrieving apparatus, system and method
CN106228218A (en) The intelligent control method of a kind of destination object based on movement and system
CN108550080A (en) Article damage identification method and device
CN110826572A (en) Multi-target detection non-maximum suppression method, device and equipment
Radaelli et al. Using cameras to improve wi-fi based indoor positioning
CN109001484A (en) The detection method and device of rotation speed
CN112528716A (en) Event information acquisition method and device
CN114511611A (en) Image recognition-based goods heap statistical method and device
CN105005505B (en) The method for parallel processing of aerial multi-target track prediction
CN109492584A (en) A kind of recognition and tracking method and electronic equipment
CN109671101A (en) Action trail acquisition device and intelligent terminal
CN109344750A (en) A kind of labyrinth three dimensional object recognition methods based on Structural descriptors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant