US20220044417A1 - Target Object Tracking Method and Apparatus, and Storage Medium


Info

Publication number: US20220044417A1
Authority: US (United States)
Prior art keywords: target object, image, identification, information, analyzed
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: US17/497,648
Inventors: Jing Wang, Guangcheng Zhang, Weilin Li, Bin Zhu
Current assignee: Shenzhen Sensetime Technology Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original assignee: Shenzhen Sensetime Technology Co., Ltd.
Application filed by Shenzhen Sensetime Technology Co., Ltd.
Priority to US17/497,648
Assigned to SHENZHEN SENSETIME TECHNOLOGY CO., LTD. Assignors: LI, WEILIN; WANG, JING; ZHANG, GUANGCHENG; ZHU, BIN
Publication of US20220044417A1

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06F 18/2413: Classification techniques relating to the classification model, based on distances to training or reference patterns
    • G06K 9/627
    • G06T 7/20: Analysis of motion
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06V 10/267: Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V 10/56: Extraction of image or video features relating to colour
    • G06V 20/20: Scene-specific elements in augmented reality scenes
    • G06T 2207/30196: Human being; person
    • G06T 2207/30232: Surveillance
    • G06T 2207/30241: Trajectory

Definitions

  • the present disclosure relates to the field of security technologies, and in particular, to a target object tracking method and apparatus, an electronic device, and a storage medium.
  • the apprehension success rate can be improved by arranging personnel to monitor a suspect using the tracking information.
  • the present disclosure provides a technical solution for target object tracking.
  • a target object tracking method including: obtaining a first reference image of a target object; determining time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information; determining a trajectory of the target object according to the time information and the location information of the target object; and generating tracking information for tracking the target object according to the trajectory of the target object.
  • the method further includes: determining identification information of the target object; and the generating tracking information for tracking the target object according to the trajectory of the target object includes generating tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • the determining identification information of the target object includes: detecting the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and determining the identification information of the target object according to the target object detected in the identification image library.
  • the determining identification information of the target object further includes: when the target object cannot be detected in the identification image library according to the first reference image of the target object, determining a second reference image of the target object in the image to be analyzed, the definition (i.e., sharpness) of the second reference image being greater than that of the first reference image; detecting the target object in the identification image library according to the second reference image of the target object; and determining the identification information of the target object according to the target object detected in the identification image library.
  • the method further includes: determining an association object of the target object in the image to be analyzed, and determining a trajectory of the association object; and the generating tracking information for tracking the target object according to the trajectory of the target object includes generating the tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • the determining an association object of the target object in the image to be analyzed includes: determining in the image to be analyzed a target image including the target object; and determining the association object of the target object in the target image.
  • the determining an association object of the target object in the target image includes: determining an object to be associated of the target object in the target image; detecting the object to be associated in the image to be analyzed; determining time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated; determining a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determining the object to be associated as the association object of the target object.
  • a target object tracking apparatus including: a first reference image obtaining module configured to obtain a first reference image of a target object; an information determining module configured to determine time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information; a trajectory determining module configured to determine a trajectory of the target object according to the time information and the location information of the target object; and a tracking information generating module configured to generate tracking information for tracking the target object according to the trajectory of the target object.
  • the apparatus further includes: a first identification information determining module configured to determine identification information of the target object; and the tracking information generating module includes: a first tracking information generating sub-module configured to generate tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • the first identification information determining module includes: a first detecting sub-module configured to detect the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and a first identification information determining sub-module configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • the first identification information determining module further includes: a second reference image obtaining sub-module configured to, when the target object cannot be detected in the identification image library according to the first reference image of the target object, determine a second reference image of the target object in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image; a second detecting sub-module configured to detect the target object in the identification image library according to the second reference image of the target object; and a second identification information determining sub-module configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • the apparatus further includes: an association object determining module configured to determine an association object of the target object in the image to be analyzed; an association object trajectory determining module configured to determine a trajectory of the association object; and the tracking information generating module includes: a second tracking information generating sub-module configured to generate tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • the association object determining module includes: a target image determining sub-module configured to determine in the image to be analyzed a target image including the target object; and a first association object determining sub-module configured to determine the association object of the target object in the target image.
  • the first association object determining sub-module includes: an object to be associated determining unit configured to determine an object to be associated of the target object in the target image; an object to be associated detecting unit configured to detect the object to be associated in the image to be analyzed; an object to be associated information determining unit configured to determine time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated; an object to be associated trajectory determining unit configured to determine a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and a second association object determining unit configured to, when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determine the object to be associated as the association object of the target object.
  • an electronic device including: a processor; and a memory configured to store processor-executable instructions; where the processor is configured to execute the target object tracking method.
  • a computer-readable storage medium having computer program instructions stored thereon, where when the computer program instructions are executed by a processor, the target object tracking method is implemented.
  • a computer program including a computer-readable code, where when the computer-readable code runs in an electronic device, a processor in the electronic device executes the target object tracking method.
  • the time information and the location information of the target object can be determined in the image to be analyzed using the first reference image of the target object.
  • the tracking information for tracking the target object is generated according to the trajectory of the target object.
  • Highly-accurate tracking information of the target object is obtained according to the trajectory of the target object determined in the image to be analyzed by using the first reference image of the target object, such that the success rate of target object tracking is improved.
  • FIG. 1 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 2 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 3 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment.
  • FIG. 4 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment.
  • FIG. 5 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 6 is a flowchart of step S60 of a target object tracking method according to an exemplary embodiment.
  • FIG. 7 is a block diagram of a target object tracking apparatus according to an exemplary embodiment.
  • FIG. 8 is a block diagram of a target object tracking apparatus according to an exemplary embodiment.
  • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment.
  • FIG. 1 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 1, the target object tracking method includes:
  • a first reference image of a target object is obtained.
  • the target object may include various types of objects such as a human, an animal, a plant, and a building. There may be one or more target objects.
  • the target object may be one type of object and may also be a combination of various types of objects.
  • the first reference image of the target object may include a photo, a portrait, or the like of the target object.
  • the first reference image may be a static image, and may also be an image frame in a video stream.
  • the first reference image may merely include an image of the target object, and may also include images of other objects.
  • the first reference image may include one image of the target object, and may also include a plurality of images of the target object.
  • time information and location information of the target object are determined in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information.
  • the image to be analyzed includes an original captured image.
  • the image to be analyzed is an image captured by a surveillance camera.
  • the image to be analyzed may include a plurality of objects, and may also include a single object. For example, if a surveillance image captured by a surveillance camera in a crowded place is determined as an image to be analyzed, captured surveillance image A includes a plurality of objects.
  • the image to be analyzed may also include an image cropped from the original captured image. For example, after performing face recognition on an original image captured by the surveillance camera, detection results of objects in the original image, for example, detection boxes of the objects, are obtained. After cropping corresponding images in the original image according to the detection results of the objects, images to be analyzed of the objects are obtained.
  • surveillance image B captured by a surveillance camera in an Internet café includes three objects, i.e., person 1, person 2, and person 3. Detection boxes of the three objects are detected in surveillance image B using a face recognition technology.
  • Corresponding images are cropped from surveillance image B according to the three detection boxes to obtain image to be analyzed 1 of person 1, image to be analyzed 2 of person 2, and image to be analyzed 3 of person 3.
  • each image to be analyzed merely includes one object.
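  • By way of illustration only, the sketch below shows one way such per-object cropping might be implemented. The OpenCV Haar-cascade face detector stands in for whatever face recognition technology an implementation actually uses; it and the file-path input are assumptions, not part of the disclosure.

```python
# Illustrative sketch: crop per-object "images to be analyzed" from one
# surveillance frame. Assumes OpenCV (cv2) is installed; the bundled Haar
# cascade is a stand-in for any face detector.
import cv2

def crop_objects(frame_path: str):
    frame = cv2.imread(frame_path)
    if frame is None:
        raise FileNotFoundError(frame_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # One cropped image per detection box, e.g. person 1, person 2, person 3.
    return [frame[y:y + h, x:x + w] for (x, y, w, h) in boxes]
```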
  • the time information of the image to be analyzed includes the time at which the image to be analyzed is captured.
  • the location information of the image to be analyzed includes the location at which the image to be analyzed is captured. For example, if the image to be analyzed is a surveillance image captured by a surveillance camera, the time information of the image to be analyzed is determined according to the time at which the surveillance image is captured, and the location information of the image to be analyzed is determined according to the location at which the camera is mounted.
  • the location information includes longitude and latitude information and postal address information.
  • the detection result of the target object may be obtained by performing target object detection on the first reference image.
  • the target object may be obtained by detecting the first reference image using an image recognition technology.
  • the target object may also be obtained by inputting the first reference image to a corresponding neural network, and detecting the image to be analyzed according to the output result of the neural network.
  • Target object detection is performed in the image to be analyzed according to the target object detected in the first reference image.
  • the time information and the location information of the target object are obtained according to the time information and the location information of the image to be analyzed where the detected target object is located.
  • a trajectory of the target object is determined according to the time information and the location information of the target object.
  • the time information and the location information of the target object are in one-to-one correspondence.
  • the trajectory of the target object may be obtained by associating the location information in a time sequence of the time information of the target object. For example, a list-type trajectory of the target object is obtained.
  • a linear trajectory of the target object may also be obtained by marking the time information and the location information of the target object on a map and sequentially connecting the marks on the map in a time sequence according to the marked location information and time information.
  • the linear trajectory of the target object on the map is more intuitive.
  • each point in the trajectory of the target object is a location corresponding to one time point; a minimal sketch of building such a trajectory follows.
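  • The following minimal sketch illustrates the list-type trajectory described above: sightings of the target object are sorted by their time information, which associates the location information in the time sequence. The Sighting record layout (time, textual location, optional latitude/longitude) is an assumption for illustration.

```python
# Illustrative sketch: build a list-type trajectory from sightings of the
# target object, each carrying the time and location of its image to be
# analyzed. The field layout is an assumption.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Sighting:
    time: datetime      # when the image to be analyzed was captured
    location: str       # postal-address-style location of the camera
    lat: float = 0.0    # optional latitude
    lon: float = 0.0    # optional longitude

def build_trajectory(sightings):
    # Associate the location information in the time sequence of the
    # time information of the target object.
    return sorted(sightings, key=lambda s: s.time)

trajectory = build_trajectory([
    Sighting(datetime(2018, 6, 1, 9, 0), "Internet cafe, Gate 2"),
    Sighting(datetime(2018, 6, 1, 8, 0), "Subway station A"),
])
# trajectory[0] is now the 08:00 sighting; each point of the trajectory is
# one location corresponding to one time point.
```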
  • tracking information for tracking the target object is generated according to the trajectory of the target object.
  • the activity pattern of the target object, or the times and/or locations at which the target object frequently appears, is determined according to the trajectory of the target object; the time and location at which the target object may appear are predicted; and the tracking information for tracking the target object is generated according to the prediction result.
  • a security management department determines, according to a trajectory of a suspect, a time and a location at which the suspect frequently appears, predicts, according to the trajectory of the suspect, a time and a location at which the suspect may appear, and generates tracking information for the suspect according to the prediction result, such that the success rate of tracking the suspect can be improved.
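  • As a hedged illustration of such prediction, the sketch below counts (hour-of-day, location) pairs along a trajectory and reports the most frequent pairs as likely times and locations; a real system might use a far more sophisticated model.

```python
# Illustrative sketch: approximate the activity pattern of the target object
# by frequency counting. `trajectory` reuses the Sighting records from the
# previous sketch.
from collections import Counter

def predict_hotspots(trajectory, top_n=3):
    # The most frequent (hour-of-day, location) pairs approximate the times
    # and locations at which the target object frequently appears.
    counts = Counter((s.time.hour, s.location) for s in trajectory)
    return counts.most_common(top_n)
```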
  • the time information and the location information of the target object can be determined in the image to be analyzed using the first reference image of the target object.
  • the tracking information for tracking the target object is generated according to the trajectory of the target object. Highly-accurate tracking information of the target object is obtained according to the trajectory of the target object determined in the image to be analyzed by using the first reference image of the target object, such that the success rate of target object tracking is improved.
  • FIG. 2 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 2, after step S10, the target object tracking method further includes:
  • identification information of the target object is determined.
  • the identification information of the target object includes information such as the name, attribute, and feature of the target object.
  • the target object is distinguished from other objects using the identification information. More comprehensive information of the target object is obtained using the identification information.
  • the identification information includes identity card information, criminal record information, social relation information and the like of the target object.
  • a plurality of identification information libraries can be created according to requirements.
  • a corresponding identification information library can be found according to requirements.
  • Identification information of a preset target object may be obtained according to requirements.
  • Preset identification information of the target object may also be obtained according to requirements.
  • an identity card information library is created. Identity card information of a suspect that falls within the age range of 20-40 years old may be obtained according to requirements. Address information of the suspect that falls within the age range of 20-40 years old may also be obtained.
  • Step S40 includes:
  • tracking information for tracking the target object is generated according to the trajectory of the target object and the identification information of the target object.
  • the tracking information is obtained according to the combination of the trajectory of the target object and the identification information of the target object. For example, features such as the age, height, and weight of the target object are determined according to the identification information of the target object, and the generated tracking information carries these features to facilitate obtaining more comprehensive information of the target object by a user of the tracking information.
  • the identification information of the target object is determined, and the tracking information for tracking the target object is generated according to the trajectory and the identification information of the target object. More comprehensive and accurate tracking information can be obtained using the identification information.
  • the identification information can improve the target object tracking success rate.
  • FIG. 3 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 3, step S50 of the target object tracking method includes:
  • the target object is detected in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects.
  • the identification image library includes identification images of a plurality of target objects, and the identification images include identification information of the target objects.
  • an identification image library can be created for objects satisfying a set condition.
  • an identification image library may be created for objects having criminal records.
  • An identification image library for objects satisfying a set identification range may also be created.
  • for example, an identification image library can be created for objects satisfying identification information such as a set age range and a set sex.
  • the target object is detected in the identification image in the identification image library according to the target object in the first reference image.
  • the target object may be detected in the identification image library using technologies such as image recognition.
  • the target object may also be obtained by inputting the first reference image of the target object to a neural network, and detecting the target object according to the output result of the neural network.
  • the identification image library includes an identity card information library.
  • Identification images in the identity card information library may include photos on identity cards of persons, and the identification images may also include identity card information such as names, addresses, and ages of the identity cards of the persons.
  • Suspect A may be detected in the photos in the identity card information library according to photo 1 of suspect A.
  • identification information of the target object is determined according to the target object detected in the identification image library.
  • the identification image corresponding to the target object and the identification information corresponding to the target object are determined according to the detection result.
  • identification information on the identity card, such as the name, age, and address of suspect A, can be determined according to the detection result.
  • the identification information of the target object can be determined in the identification image library according to the first reference image of the target object.
  • the target object can be conveniently and accurately detected using the identification image library, and the finally generated tracking information of the target object is more accurate. A hedged sketch of one possible library lookup follows.
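  • One plausible realization of the library lookup, sketched below, matches an embedding of the target object in the first reference image against precomputed embeddings of the identification images. The embedding extractor, the cosine-similarity measure, and the threshold are illustrative assumptions, not the disclosed method.

```python
# Illustrative sketch: detect the target object in an identification image
# library by nearest-embedding search. How the embeddings are produced
# (e.g. by a face-recognition neural network) is outside this sketch.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def lookup_identification(reference_embedding, library, threshold=0.6):
    # `library` is a list of (identification_info, embedding) pairs, e.g.
    # identity card records carrying names, addresses, and ages.
    best_info, best_score = None, threshold
    for info, emb in library:
        score = cosine_similarity(reference_embedding, emb)
        if score > best_score:
            best_info, best_score = info, score
    return best_info  # None when the target object cannot be detected
```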
  • FIG. 4 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 4, step S50 of the target object tracking method includes:
  • a second reference image of the target object is determined in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image.
  • different capturing angles and capturing environments may result in differences in the definition of the target object in the first reference image and in the features the image includes. If the target object in the first reference image has poor definition or incomplete features, the target object may not be detected in the identification image library.
  • An image to be analyzed library includes images of a plurality of candidate objects, and the target object can be determined in the image to be analyzed according to the similarity between the candidate objects in the image to be analyzed and the first reference image. Furthermore, the second reference image of the target object is determined in the image to be analyzed according to the determination result of the target object.
  • for example, a candidate object whose similarity to the first reference image satisfies a set condition is determined as the target object.
  • photo 3 of suspect B has poor definition because it is captured at night, so suspect B cannot be detected in the identification image library according to photo 3.
  • Image 4 of suspect B is determined in the image to be analyzed library according to photo 3 of suspect B.
  • the definition of image 4 is greater than that of photo 3.
  • suspect B in image 4 is clearer, and/or the features of suspect B are more comprehensive.
  • the target object is detected in the identification image library according to the second reference image of the target object.
  • detection of the target object continues in the identification image library according to the determined second reference image of the target object.
  • detection of suspect B continues in the identification image library according to image 4 of suspect B. Because the definition of the second reference image is greater than that of the first reference image, the success rate at which the target object is detected in the identification image library can be improved.
  • identification information of the target object is determined according to the target object detected in the identification image library.
  • the identification information of the target object can be obtained according to the detection result. For example, after the photo of the identity card of suspect B is detected in the identity card information library according to image 4 of suspect B, the identification information on the identity card, such as the name, age, and address of suspect B, can be obtained.
  • when the target object cannot be detected in the identification image library according to the first reference image of the target object, the identification information of the target object can be obtained by determining the second reference image of the target object in the image to be analyzed and detecting the target object in the identification image library according to the second reference image. If the first reference image is not clear, the identification information of the target object is obtained according to the second reference image, thereby improving the success rate of obtaining the identification information of the target object. A sketch of one possible definition measure follows.
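  • The disclosure does not prescribe how definition is measured. The sketch below uses variance of the Laplacian, one common sharpness proxy, to pick a candidate second reference image whose definition is greater than that of the first reference image; the measure and function names are assumptions.

```python
# Illustrative sketch: choose a second reference image sharper than the
# first. Variance of the Laplacian is a common, assumption-level proxy for
# the "definition" the description refers to.
import cv2

def definition(image_path: str) -> float:
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def pick_second_reference(first_path, candidate_paths):
    floor = definition(first_path)
    # Keep only candidates whose definition exceeds the first reference.
    sharper = [p for p in candidate_paths if definition(p) > floor]
    return max(sharper, key=definition) if sharper else None
```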
  • FIG. 5 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 5, the target object tracking method further includes:
  • an association object of the target object is determined in the image to be analyzed.
  • the association object of the target object may include an object that appears at the same location as the target object at a different time, and may also include an object that appears at the same location and the same time as the target object.
  • the association object may include an object that appears at location 1 and location 2 with the target object at different times, and may also include an object that appears at location 3 at the same three times as the target object.
  • the association object of the target object is determined according to requirements.
  • Candidate objects that appear at the same location as the target object can be detected in the image to be analyzed, and the association object is determined from the candidate objects according to a preset association object determination condition.
  • the target object has a plurality of association objects.
  • a trajectory of the association object is determined.
  • time information and location information of the association object are determined in the image to be analyzed according to the image where the association object is located, and the trajectory of the association object is determined according to the time information and the location information of the association object.
  • the determination process of the trajectory of the association object is similar to the generation process of the trajectory of the target object. Reference can be made to the generation process of the trajectory of the target object in the embodiment shown in FIG. 1.
  • Step S40 includes:
  • the tracking information for tracking the target object is generated according to the trajectory of the target object and the trajectory of the association object.
  • a cross trajectory of the target object and the association object is generated according to the trajectory of the target object and the trajectory of the association object, and the tracking information for tracking the target object is generated using the cross trajectory.
  • the trajectory of the target object and the trajectory of the association object are combined to generate a combined trajectory, and the tracking information for tracking the target object is generated using the combined trajectory.
  • the association object of the target object is determined in the image to be analyzed, and the tracking information for tracking the target object is generated according to the trajectory of the association object and the trajectory of the target object.
  • the trajectory of the target object can be supplemented or corrected using the trajectory of the association object, such that more accurate tracking information is generated.
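  • The description leaves the combination step open; as one simple illustration (reusing the Sighting layout from the earlier sketch), a combined trajectory can be produced by merging the two point sequences and re-sorting by time.

```python
# Illustrative sketch: combine the trajectory of the target object with the
# trajectory of an association object into one time-ordered sequence, which
# can fill gaps in the target trajectory.
def combined_trajectory(target_traj, association_traj):
    return sorted(list(target_traj) + list(association_traj),
                  key=lambda s: s.time)
```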
  • FIG. 6 is a flowchart of step S60 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 6, step S60 of the target object tracking method includes:
  • a target image including the target object is determined in the image to be analyzed.
  • in step S62, the association object of the target object is determined in the target image.
  • the target image including the target object is determined in the image to be analyzed.
  • the target image is the image to be analyzed where the target object is located.
  • a plurality of target images of the target object is determined in the image to be analyzed.
  • the target image includes one or more objects other than the target object.
  • the other objects included in each target image may be different.
  • the association object can be determined in the target image on the basis of different association object selection conditions according to requirements. For example, the other objects appearing in the target image may all be determined as association objects.
  • other objects whose number of appearances across the target images is greater than a threshold may also be determined as association objects.
  • target object 1 has three target images, which are respectively target image 1, target image 2, and target image 3.
  • target image 1 further includes object A, object B, and object C.
  • target image 2 further includes object B, object C, object D, and object E.
  • target image 3 further includes object A, object C, object D, and object E.
  • according to the association object selection condition that the number of appearances is greater than a threshold, object C, whose number of appearances is greater than two, may be determined as the association object of the target object.
  • all of objects A to E may also be determined as association objects of the target object.
  • the association object is determined in the target image after the target image of the target object is determined in the image to be analyzed.
  • the association object can be conveniently and accurately determined using the target image; a minimal counting sketch follows.
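  • A minimal sketch of the appearance-count selection condition follows; the data layout (one set of co-appearing object identifiers per target image) and the threshold are assumptions for illustration.

```python
# Illustrative sketch: pick objects to be associated by how often they
# co-appear with the target object across its target images.
from collections import Counter

def objects_to_associate(target_images, min_appearances=2):
    # target_images: one set of co-appearing object IDs per target image,
    # e.g. [{"A", "B", "C"}, {"B", "C", "D", "E"}, {"A", "C", "D", "E"}]
    counts = Counter(obj for image in target_images for obj in image)
    return {obj for obj, n in counts.items() if n > min_appearances}

# With the example sets above, only "C" appears more than twice, matching
# the "number of appearances greater than two" condition in the text.
```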
  • step S62 of the target object tracking method includes:
  • the object to be associated is determined in the target image according to requirements. For example, the other objects appearing in the target image of the target object are determined as objects to be associated.
  • Detection is performed in the image to be analyzed according to the object to be associated in the target image.
  • the object to be associated may be recognized in the image to be analyzed using an image recognition technology.
  • the object to be associated may also be obtained by inputting the object to be associated in the target image to a neural network and detecting the object to be associated in the image to be analyzed using the neural network.
  • the time information and the location information of the object to be associated are determined according to the time information and the location information of the image to be analyzed including the object to be associated. Multiple pieces of time information and location information of the object to be associated may be determined.
  • a trajectory of the object to be associated is obtained according to the time information and the location information of the object to be associated.
  • the trajectory of the object to be associated may be obtained by associating the location information of the object to be associated in a time sequence.
  • a linear trajectory of the object to be associated may also be obtained by marking the time information and the location information of the object to be associated on a map and linearly connecting the locations in a time sequence.
  • a degree-of-coincidence threshold is set according to requirements. If the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than the degree-of-coincidence threshold, the object to be associated is determined as the association object of the target object.
  • the coincidence between the trajectory of the object to be associated and the trajectory of the target object may be the complete coincidence of time information and location information between the two, and may also include coincidence within a set time range between the time information of the object to be associated and the time information of the target object, and/or coincidence within a set geographical range between the location information of the object to be associated and the location information of the target object.
  • the association object of the target object is determined according to the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object, and the degree-of-coincidence threshold.
  • the association object has a close association relation with the target object.
  • the trajectory of the association object is also more valuable for correcting and supplementing the trajectory used to generate the tracking information; one possible degree-of-coincidence computation is sketched below.
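  • In the sketch below, two trajectory points coincide when they fall within a set time range and a set geographical range, and the degree of coincidence is the fraction of the candidate's points that coincide with some point of the target trajectory. The tolerances and threshold are illustrative assumptions; Sighting is the record from the earlier sketch.

```python
# Illustrative sketch: degree of coincidence between the trajectory of an
# object to be associated and the trajectory of the target object.
from datetime import timedelta

def points_coincide(p, q, max_dt=timedelta(minutes=10), max_deg=0.001):
    # Coincidence within a set time range and a set geographical range
    # (here a simple latitude/longitude box, in degrees).
    return (abs(p.time - q.time) <= max_dt
            and abs(p.lat - q.lat) <= max_deg
            and abs(p.lon - q.lon) <= max_deg)

def degree_of_coincidence(candidate_traj, target_traj):
    if not candidate_traj:
        return 0.0
    hits = sum(any(points_coincide(p, q) for q in target_traj)
               for p in candidate_traj)
    return hits / len(candidate_traj)

def is_association_object(candidate_traj, target_traj, threshold=0.5):
    return degree_of_coincidence(candidate_traj, target_traj) > threshold
```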
  • FIG. 7 is a block diagram of a target object tracking apparatus according to an exemplary embodiment. As shown in FIG. 7, the target object tracking apparatus includes:
  • a first reference image obtaining module 10 configured to obtain a first reference image of a target object;
  • an information determining module 20 configured to determine time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed comprising the time information and the location information;
  • a trajectory determining module 30 configured to determine a trajectory of the target object according to the time information and the location information of the target object; and
  • a tracking information generating module 40 configured to generate tracking information for tracking the target object according to the trajectory of the target object.
  • FIG. 8 is a block diagram of a target object tracking apparatus according to an exemplary embodiment. As shown in FIG. 8, in a possible implementation, the apparatus further includes:
  • a first identification information determining module 50 configured to determine identification information of the target object.
  • the tracking information generating module 40 includes:
  • a first tracking information generating sub-module 41 configured to generate tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • the first identification information determining module 50 includes:
  • a first detecting sub-module 51 configured to detect the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and a first identification information determining sub-module 52 configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • the first identification information determining module 50 further includes:
  • a second reference image obtaining sub-module 53 configured to, when the target object cannot be detected in the identification image library according to the first reference image of the target object, determine a second reference image of the target object in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image;
  • a second detecting sub-module 54 configured to detect the target object in the identification image library according to the second reference image of the target object; and
  • a second identification information determining sub-module 55 configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • the apparatus further includes:
  • an association object determining module 60 configured to determine an association object of the target object in the image to be analyzed; and
  • an association object trajectory determining module 70 configured to determine a trajectory of the association object.
  • the tracking information generating module 40 includes:
  • a second tracking information generating sub-module 42 configured to generate tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • the association object determining module 60 includes:
  • a target image determining sub-module 61 configured to determine in the image to be analyzed a target image including the target object; and
  • a first association object determining sub-module 62 configured to determine the association object of the target object in the target image.
  • the first association object determining sub-module 62 includes:
  • an object to be associated determining unit configured to determine an object to be associated of the target object in the target image;
  • an object to be associated detecting unit configured to detect the object to be associated in the image to be analyzed;
  • an object to be associated information determining unit configured to determine time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated;
  • an object to be associated trajectory determining unit configured to determine a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and a second association object determining unit configured to, when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determine the object to be associated as the association object of the target object.
  • the functions provided by, or the modules included in, the apparatuses provided by the embodiments of the present disclosure may be used to implement the methods described in the foregoing method embodiments; for brevity, details are not described herein again.
  • the embodiments of the present disclosure further provide an electronic device, including: a processor; and a memory configured to store processor-executable instructions, wherein the processor executes the target object tracking method by directly or indirectly calling the executable instructions.
  • the embodiments of the present disclosure further provide a computer-readable storage medium, having computer program instructions stored thereon, where when the computer program instructions are executed by a processor, the target object tracking method is implemented.
  • the computer-readable storage medium may be a nonvolatile computer-readable storage medium or a volatile computer-readable storage medium.
  • the embodiments of the present disclosure also provide a computer program, including a computer-readable code, where when the computer-readable code runs in an electronic device, a processor in the electronic device executes the target object tracking method.
  • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment.
  • the electronic device may be provided as a terminal, a server, or other forms of devices.
  • the electronic device includes a target object tracking apparatus 1900 .
  • the device 1900 includes a processing component 1922 which further includes one or more processors, and a memory resource represented by a memory 1932 and configured to store instructions executable by the processing component 1922 , for example, an application program.
  • the application program stored in the memory 1932 may include one or more modules, each of which corresponds to a set of instructions.
  • the processing component 1922 may be configured to execute instructions so as to execute the above methods.
  • the device 1900 may further include a power supply component 1926 configured to execute power management of the device 1900, a wired or wireless network interface 1950 configured to connect the device 1900 to a network, and an input/output (I/O) interface 1958.
  • the device 1900 may be operated based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
  • a computer-readable storage medium is further provided, for example, a memory 1932 including computer program instructions, which can be executed by the processing component 1922 of the device 1900 to implement the method above.
  • the present disclosure may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer-readable storage medium, on which computer-readable program instructions used by the processor to implement various aspects of the present disclosure are stored.
  • the computer-readable storage medium may be a tangible device that can maintain and store instructions used by an instruction execution device.
  • the computer-readable storage medium may be, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any appropriate combination thereof.
  • the computer-readable storage medium includes a portable computer disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable Compact Disc Read-Only Memory (CD-ROM), a Digital Versatile Disk (DVD), a memory stick, a floppy disk, a mechanical coding device such as a punched card storing an instruction or a protrusion structure in a groove, and any appropriate combination thereof.
  • the computer-readable storage medium used here is not interpreted as an instantaneous signal such as a radio wave or other freely propagated electromagnetic wave, an electromagnetic wave propagated by a waveguide or other transmission media (for example, an optical pulse transmitted by an optical fiber cable), or an electrical signal transmitted by a wire.
  • the computer-readable program instruction described here is downloaded from a computer-readable storage medium to each computing/processing device, or downloaded to an external computer or an external storage device via a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include a copper transmission cable, optical fiber transmission, wireless transmission, a router, a firewall, a switch, a gateway computer, and/or an edge server.
  • a network adapter card or a network interface in each computing/processing device receives the computer-readable program instruction from the network, and forwards the computer-readable program instruction, so that the computer-readable program instruction is stored in a computer-readable storage medium in each computing/processing device.
  • Computer program instructions for executing the operations of the present disclosure may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, status setting data, or source code or target code written in any combination of one or more programming languages.
  • the programming languages include an object-oriented programming language such as Smalltalk or C++, and a conventional procedural programming language such as the “C” language or a similar programming language.
  • the computer-readable program instructions can be completely executed on a user computer, partially executed on a user computer, executed as an independent software package, executed partially on a user computer and partially on a remote computer, or completely executed on a remote computer or a server.
  • the remote computer may be connected to a user computer via any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, connected via the Internet with the aid of an Internet service provider).
  • an electronic circuit such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA) is personalized by using status information of the computer-readable program instructions, and the electronic circuit can execute the computer-readable program instructions to implement various aspects of the present disclosure.
  • These computer-readable program instructions may be provided for a general-purpose computer, a dedicated computer, or a processor of another programmable data processing apparatus to generate a machine, so that when the instructions are executed by the computer or the processors of other programmable data processing apparatuses, an apparatus for implementing a specified function/action in one or more blocks in the flowcharts and/or block diagrams is generated.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium. These instructions cause a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having instructions stored thereon includes an article of manufacture including instructions which implement the aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • the computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatuses, or other devices, so that a series of operations and steps are executed on the computer, the other programmable apparatuses, or the other devices, thereby generating computer-implemented processes. Therefore, the instructions executed on the computer, the other programmable apparatuses, or the other devices implement the specified functions/actions in the one or more blocks in the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams may represent a module, a program segment, or a part of an instruction, and the module, the program segment, or the part of the instruction includes one or more executable instructions for implementing a specified logical function.
  • functions marked in the blocks may also occur in an order different from that marked in the accompanying drawings. For example, two consecutive blocks may actually be executed substantially in parallel, or may sometimes be executed in a reverse order, depending on the involved functions.
  • each block in the block diagrams and/or flowcharts and a combination of blocks in the block diagrams and/or flowcharts may be implemented by using a dedicated hardware-based system configured to execute specified functions or actions, or may be implemented by using a combination of dedicated hardware and computer instructions.

Abstract

The present disclosure relates to a target object tracking method and apparatus, an electronic device, and a storage medium. The method includes: obtaining a first reference image of a target object; determining time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information; determining a trajectory of the target object according to the time information and the location information of the target object; and generating tracking information for tracking the target object according to the trajectory of the target object. Embodiments of the present disclosure obtain highly-accurate tracking information of the target object according to the trajectory of the target object determined in the image to be analyzed by using the first reference image of the target object, such that the success rate of target object tracking is improved.

Description

  • The present application claims priority to Chinese Patent Application No. 201810558523.6, filed with the Chinese Patent Office on Jun. 1, 2018 and entitled “TARGET OBJECT TRACKING METHOD AND APPARATUS, ELECTRONIC DEVICE, AND STORAGE MEDIUM”, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The present disclosure relates to the field of security technologies, and in particular, to a target object tracking method and apparatus, an electronic device, and a storage medium.
  • BACKGROUND
  • With the development of information technologies, there is an increasing demand for tracking information for tracking target objects. For example, in security management departments such as a public security department, the apprehension success rate can be improved by arranging personnel to monitor a suspect using the tracking information.
  • SUMMARY
  • In this regard, the present disclosure provides a technical solution for target object tracking.
  • According to one aspect of the present disclosure, provided is a target object tracking method, including: obtaining a first reference image of a target object; determining time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information; determining a trajectory of the target object according to the time information and the location information of the target object; and generating tracking information for tracking the target object according to the trajectory of the target object.
  • In a possible implementation, after obtaining the first reference image of the target object, the method further includes: determining identification information of the target object; and the generating tracking information for tracking the target object according to the trajectory of the target object includes generating tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • In a possible implementation, the determining identification information of the target object includes: detecting the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and determining the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the determining identification information of the target object further includes: when the target object cannot be detected in the identification image library according to the first reference image of the target object, determining a second reference image of the target object in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image; detecting the target object in the identification image library according to the second reference image of the target object; and determining the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the method further includes: determining an association object of the target object in the image to be analyzed, and determining a trajectory of the association object; and the generating tracking information for tracking the target object according to the trajectory of the target object includes generating the tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • In a possible implementation, the determining an association object of the target object in the image to be analyzed includes: determining in the image to be analyzed a target image including the target object; and determining the association object of the target object in the target image.
  • In a possible implementation, the determining an association object of the target object in the target image includes: determining an object to be associated of the target object in the target image; detecting the object to be associated in the image to be analyzed; determining time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated; determining a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determining the object to be associated as the association object of the target object.
  • According to one aspect of the present disclosure, provided is a target object tracking apparatus, including: a first reference image obtaining module configured to obtain a first reference image of a target object; an information determining module configured to determine time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information; a trajectory determining module configured to determine a trajectory of the target object according to the time information and the location information of the target object; and a tracking information generating module configured to generate tracking information for tracking the target object according to the trajectory of the target object.
  • In a possible implementation, the apparatus further includes: a first identification information determining module configured to determine identification information of the target object; and the tracking information generating module includes: a first tracking information generating sub-module configured to generate tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • In a possible implementation, the first identification information determining module includes: a first detecting sub-module configured to detect the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and a first identification information determining sub-module configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the first identification information determining module further includes: a second reference image obtaining sub-module configured to, when the target object cannot be detected in the identification image library according to the first reference image of the target object, determine a second reference image of the target object in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image; a second detecting sub-module configured to detect the target object in the identification image library according to the second reference image of the target object; and a second identification information determining sub-module configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the apparatus further includes: an association object determining module configured to determine an association object of the target object in the image to be analyzed; an association object trajectory determining module configured to determine a trajectory of the association object; and the tracking information generating module includes: a second tracking information generating sub-module configured to generate tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • In a possible implementation, the association object determining module includes: a target image determining sub-module configured to determine in the image to be analyzed a target image including the target object; and a first association object determining sub-module configured to determine the association object of the target object in the target image.
  • In a possible implementation, the first association object determining sub-module includes: an object to be associated determining unit configured to determine an object to be associated of the target object in the target image; an object to be associated detecting unit configured to detect the object to be associated in the image to be analyzed; an object to be associated information determining unit configured to determine time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated; an object to be associated trajectory determining unit configured to determine a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and a second association object determining unit configured to, when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determine the object to be associated as the association object of the target object.
  • According to one aspect of the present disclosure, provided is an electronic device, including: a processor; and a memory configured to store processor-executable instructions; where the processor is configured to execute the target object tracking method.
  • According to one aspect of the present disclosure, provided is a computer-readable storage medium, having computer program instructions stored thereon, where when the computer program instructions are executed by a processor, the target object tracking method is implemented.
  • According to one aspect of the present disclosure, provided is a computer program, including a computer-readable code, where when the computer-readable code runs in an electronic device, a processor in the electronic device executes the target object tracking method.
  • In embodiments of the present disclosure, the time information and the location information of the target object can be determined in the image to be analyzed using the first reference image of the target object. After determining the trajectory of the target object according to the time information and the location information of the target object, the tracking information for tracking the target object is generated according to the trajectory of the target object. Highly accurate tracking information of the target object is obtained according to the trajectory of the target object determined in the image to be analyzed by using the first reference image of the target object, such that the success rate of target object tracking is improved.
  • Exemplary embodiments are described in detail below with reference to the accompanying drawings, and other features and aspects of the present disclosure become clear.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Accompanying drawings included in the specification and constructing a part of the specification jointly show the exemplary embodiments, characteristics, and aspects of the present disclosure, and are intended to explain the principles of the present disclosure.
  • FIG. 1 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 2 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 3 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment.
  • FIG. 4 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment.
  • FIG. 5 is a flowchart of a target object tracking method according to an exemplary embodiment.
  • FIG. 6 is a flowchart of step S60 of a target object tracking method according to an exemplary embodiment.
  • FIG. 7 is a block diagram of a target object tracking apparatus according to an exemplary embodiment.
  • FIG. 8 is a block diagram of a target object tracking apparatus according to an exemplary embodiment.
  • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment.
  • DETAILED DESCRIPTION
  • The following will describe various exemplary embodiments, features, and aspects of the present disclosure in detail with reference to the accompanying drawings. Like reference symbols in the accompanying drawings represent elements with like or similar functions. Although various aspects of the embodiments are illustrated in the accompanying drawings, the accompanying drawings are not necessarily drawn to scale unless otherwise specified.
  • The special term “exemplary” here means “used as an example, an embodiment, or an illustration”. Any embodiment described as “exemplary” here is not necessarily to be interpreted as superior to or better than other embodiments.
  • In addition, for better illustration of the present disclosure, various specific details are given in the following specific implementations. A person skilled in the art should understand that the present disclosure may also be implemented without some specific details. In some examples, methods, means, elements, and circuits well known to a person skilled in the art are not described in detail so as to highlight the subject matter of the present disclosure.
  • FIG. 1 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 1, the target object tracking method includes:
  • at step S10, a first reference image of a target object is obtained.
  • In a possible implementation, the target object may include various types of objects such as a human, an animal, a plant, and a building. There may be one or more target objects. The target object may be one type of object and may also be a combination of various types of objects.
  • The first reference image of the target object may include a photo, a portrait, or the like of the target object. The first reference image may be a static image, and may also be an image frame in a video stream. The first reference image may merely include an image of the target object, and may also include images of other objects. The first reference image may include one image of the target object, and may also include a plurality of images of the target object.
  • At step S20, time information and location information of the target object are determined in an image to be analyzed according to the first reference image, the image to be analyzed including the time information and the location information.
  • In a possible implementation, the image to be analyzed includes an original captured image. For example, the image to be analyzed is an image captured by a surveillance camera. The image to be analyzed may include a plurality of objects, and may also include a single object. For example, if surveillance image A, captured by a surveillance camera in a crowded place, is determined as an image to be analyzed, it includes a plurality of objects.
  • The image to be analyzed may also include an image cropped from the original captured image. For example, after performing face recognition on an original image captured by the surveillance camera, detection results of objects in the original image, for example, detection boxes of the objects, are obtained. After cropping corresponding images in the original image according to the detection results of the objects, images to be analyzed of the objects are obtained. For example, surveillance image B captured by a surveillance camera in an Internet café includes three objects, i.e., person 1, person 2, and person 3. Detection boxes of the three objects are detected in the surveillance image B using a face recognition technology. Corresponding images are cropped in the surveillance image B according to the three detection boxes to obtain image to be analyzed 1 of the person 1, image to be analyzed 2 of the person 2, and image to be analyzed 3 of the person 3. In this case, each image to be analyzed merely includes one object.
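  • For illustration, the following is a minimal Python sketch of this cropping step, assuming the detection boxes have already been obtained from a face recognition step; the Detection type and the (x1, y1, x2, y2) box format are assumptions for the example, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class Detection:
    # Detection box (x1, y1, x2, y2) in pixel coordinates, as returned
    # by a face recognition step such as the one described above.
    box: Tuple[int, int, int, int]


def crop_detections(frame: np.ndarray, detections: List[Detection]) -> List[np.ndarray]:
    """Crop one image to be analyzed per detected object, so that each
    cropped image merely includes one object."""
    crops = []
    for det in detections:
        x1, y1, x2, y2 = det.box
        crops.append(frame[y1:y2, x1:x2].copy())
    return crops
```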
  • The time information of the image to be analyzed includes the time at which the image to be analyzed is captured. The location information of the image to be analyzed includes the location at which the image to be analyzed is captured. For example, if the image to be analyzed is a surveillance image captured by a surveillance camera, the time information of the image to be analyzed is determined according to the time at which the surveillance image is captured, and the location information of the image to be analyzed is determined according to the location at which the camera is mounted. The location information includes longitude and latitude information and postal address information.
  • The detection result of the target object may be obtained by performing target object detection on the first reference image. The target object may be obtained by detecting the first reference image using an image recognition technology. The target object may also be obtained by inputting the first reference image to a corresponding neural network, and detecting the image to be analyzed according to the output result of the neural network.
  • Target object detection is performed in the image to be analyzed according to the target object detected in the first reference image. When the target object is detected in the image to be analyzed, the time information and the location information of the target object are obtained according to the time information and the location information of the image to be analyzed where the detected target object is located.
  • There may be a plurality of images to be analyzed, and therefore, a plurality of pairs of time information and location information of the target object can be obtained.
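  • As a hedged example, step S20 may be sketched in Python as follows, assuming an embedding-based matcher; the embed function, the AnalyzedImage type, and the similarity threshold are hypothetical stand-ins for the image recognition technology or neural network mentioned above:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List, Tuple

import numpy as np


@dataclass
class AnalyzedImage:
    pixels: np.ndarray              # image to be analyzed (or a crop of it)
    captured_at: datetime           # time information of the image
    location: Tuple[float, float]   # location information, e.g. (latitude, longitude)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def collect_sightings(
    reference_embedding: np.ndarray,
    images: List[AnalyzedImage],
    embed: Callable[[np.ndarray], np.ndarray],  # hypothetical feature extractor
    threshold: float = 0.8,                     # hypothetical similarity threshold
) -> List[Tuple[datetime, Tuple[float, float]]]:
    """Detect the target object in each image to be analyzed and collect the
    time information and location information of the matches (step S20)."""
    sightings = []
    for img in images:
        if cosine_similarity(reference_embedding, embed(img.pixels)) > threshold:
            sightings.append((img.captured_at, img.location))
    return sightings
```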
  • At step S30, a trajectory of the target object is determined according to the time information and the location information of the target object.
  • In a possible implementation, the time information and the location information of the target object are in one-to-one correspondence. The trajectory of the target object may be obtained by associating the location information in the time sequence of the corresponding time information of the target object. For example, a list-type trajectory of the target object is obtained.
  • A linear trajectory of the target object may also be obtained by marking the time information and the location information of the target object on a map and sequentially connecting the marks on the map in a time sequence according to the marked location information and time information. The linear trajectory of the target object on the map is more intuitive.
  • When there is merely one pair of time information and location information of the target object, the trajectory of the target object is a location corresponding to one time point.
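  • The list-type trajectory described above may be sketched as follows; the Sighting alias is an assumption for the example:

```python
from datetime import datetime
from typing import List, Tuple

Sighting = Tuple[datetime, Tuple[float, float]]  # (time information, location information)


def build_trajectory(sightings: List[Sighting]) -> List[Sighting]:
    """Associate the location information in the time sequence of the time
    information, yielding the list-type trajectory (step S30). Connecting
    consecutive points on a map would give the linear trajectory."""
    return sorted(sightings, key=lambda s: s[0])
```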
  • At step S40, tracking information for tracking the target object is generated according to the trajectory of the target object.
  • In a possible implementation, the activity pattern of the target object, or the times and/or locations at which the target object frequently appears, is determined according to the trajectory of the target object; the time and location at which the target object may appear are predicted; and the tracking information for tracking the target object is generated according to the prediction result. For example, a security management department determines, according to a trajectory of a suspect, a time and a location at which the suspect frequently appears, predicts according to the trajectory a time and a location at which the suspect may appear, and generates tracking information for the suspect according to the prediction result, such that the suspect tracking success rate can be improved.
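  • As one assumed way to realize such a prediction, the most frequent hour-of-day and location pairs along the trajectory can be counted; this is an illustrative sketch only:

```python
from collections import Counter
from datetime import datetime
from typing import List, Tuple

Sighting = Tuple[datetime, Tuple[float, float]]


def predict_appearances(trajectory: List[Sighting], top_k: int = 3):
    """Count (hour-of-day, location) pairs along the trajectory and return
    the most frequent ones as a simple prediction of when and where the
    target object may appear (one assumed way to generate tracking
    information in step S40)."""
    counts = Counter((t.hour, loc) for t, loc in trajectory)
    return counts.most_common(top_k)
```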
  • In this embodiment, the time information and the location information of the target object can be determined in the image to be analyzed using the first reference image of the target object. After determining the trajectory of the target object according to the time information and the location information of the target object, the tracking information for tracking the target object is generated according to the trajectory of the target object. Highly accurate tracking information of the target object is obtained according to the trajectory of the target object determined in the image to be analyzed by using the first reference image of the target object, such that the success rate of target object tracking is improved.
  • FIG. 2 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 2, after step S10, the target object tracking method further includes:
  • at S50, identification information of the target object is determined.
  • In a possible implementation, the identification information of the target object includes information such as the name, attribute, and feature of the target object. The target object is distinguished from other objects using the identification information. More comprehensive information of the target object is obtained using the identification information.
  • For example, if the target object is a human, the identification information includes identity card information, criminal record information, social relation information and the like of the target object.
  • A plurality of identification information libraries can be created according to requirements. A corresponding identification information library can be found according to requirements. Identification information of a preset target object may be obtained according to requirements. Preset identification information of the target object may also be obtained according to requirements. For example, if the target object is a human, an identity card information library is created. Identity card information of a suspect that falls within the age range of 20-40 years old may be obtained according to requirements. Address information of the suspect that falls within the age range of 20-40 years old may also be obtained.
  • Step S40 includes:
  • at step S41, tracking information for tracking the target object is generated according to the trajectory of the target object and the identification information of the target object.
  • In a possible implementation, the tracking information is obtained according to the combination of the trajectory of the target object and the identification information of the target object. For example, features such as the age, height, and weight of the target object are determined according to the identification information of the target object, and the generated tracking information carries these features, so that a user of the tracking information can obtain more comprehensive information of the target object.
  • In this embodiment, the identification information of the target object is determined, and the tracking information for tracking the target object is generated according to the trajectory and the identification information of the target object. More comprehensive and accurate tracking information can be obtained using the identification information. When the generated tracking information is used for tracking the target object, the identification information can improve the target object tracking success rate.
  • FIG. 3 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 3, the step S50 of the target object tracking method includes:
  • at S51, the target object is detected in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects.
  • In a possible implementation, the identification image library includes identification images of a plurality of target objects, and the identification images include identification information of the target objects. According to requirements, an identification image library can be created for objects satisfying a set condition. For example, an identification image library may be created for objects having criminal records. An identification image library may also be created for objects satisfying a set identification range. For example, an identification image library can be created for objects satisfying identification information such as a set age range and a set sex.
  • The target object is detected in the identification images in the identification image library according to the target object in the first reference image. The target object may be detected in the identification image library using technologies such as image recognition. The target object may also be obtained by inputting the first reference image of the target object to a neural network, and detecting the target object according to the output result of the neural network.
  • For example, the identification image library includes an identity card information library. Identification images in the identity card information library may include photos on the identity cards of persons, and may also include identity card information such as the names, addresses, and ages on the identity cards of the persons. Suspect A may be detected in the photos in the identity card information library according to photo 1 of suspect A.
  • At step S52, identification information of the target object is determined according to the target object detected in the identification image library.
  • In a possible implementation, when the target object is detected in the identification image library, the identification image corresponding to the target object and the identification information corresponding to the target object are determined according to the detection result. For example, when the photo of the identity card of suspect A is detected in the identity card information library, the identification information on the identity card, such as the name, age, and address of suspect A, can be determined according to the detection result.
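  • A minimal sketch of this library lookup follows, assuming precomputed embeddings for the identification images and a hypothetical match threshold:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

import numpy as np


@dataclass
class IdentificationImage:
    embedding: np.ndarray            # precomputed from the identification photo
    identification: Dict[str, str]   # e.g. name, age, address from the identity card


def _sim(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def lookup_identification(
    reference_embedding: np.ndarray,
    library: List[IdentificationImage],
    threshold: float = 0.8,          # hypothetical match threshold
) -> Optional[Dict[str, str]]:
    """Detect the target object in the identification image library and
    return the identification information of the best match, or None when
    the target object cannot be detected."""
    best = max(library, key=lambda rec: _sim(reference_embedding, rec.embedding), default=None)
    if best is not None and _sim(reference_embedding, best.embedding) > threshold:
        return best.identification
    return None
```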
  • In this embodiment, the identification information of the target object can be determined in the identification image library according to the first reference image of the target object. The target object can be conveniently and accurately detected using the identification image library and the finally generated tracking information of the target object is more accurate.
  • FIG. 4 is a flowchart of step S50 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 4, the step S50 of the target object tracking method includes:
  • at step S53, when the target object cannot be detected in the identification image library according to the first reference image of the target object, a second reference image of the target object is determined in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image.
  • In a possible implementation, different capturing angles and capturing environments may result in different definitions and included features of the target object in the first reference image. If the target object in the first reference image has a poor definition or an incomplete feature, the target object may not be detected in the identification image library.
  • When the target object cannot be detected in the identification image library according to the first reference image, a second reference image of the target object is determined in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image. A library of images to be analyzed includes images of a plurality of candidate objects, and the target object can be determined in the image to be analyzed according to the similarity between the candidate objects in the image to be analyzed and the first reference image. Furthermore, the second reference image of the target object is determined in the image to be analyzed according to the determination result of the target object.
  • In a possible implementation, if the similarity between one candidate object in the image to be analyzed and the target object is greater than a similarity threshold, the candidate object is determined as the target object.
  • For example, photo 3 of suspect B has a poor definition because it was captured at night, and suspect B cannot be detected in the identification image library according to photo 3. Image 4 of suspect B is determined in the library of images to be analyzed according to photo 3 of suspect B. The definition of image 4 is greater than that of photo 3: suspect B in image 4 is clearer, and/or the feature of suspect B is more comprehensive.
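  • A sketch of this fallback is given below, assuming a gradient-variance proxy for the definition of a grayscale image crop (the disclosure does not fix a particular definition measure) and the same hypothetical embed function as in the earlier sketches:

```python
from typing import Callable, List, Optional

import numpy as np


def sharpness(image: np.ndarray) -> float:
    """Gradient-variance proxy for the 'definition' of a grayscale crop
    (an assumption; the disclosure does not fix a definition measure)."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.var(gx) + np.var(gy))


def _sim(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def second_reference(
    first_reference: np.ndarray,
    candidate_crops: List[np.ndarray],
    embed: Callable[[np.ndarray], np.ndarray],  # hypothetical feature extractor
    similarity_threshold: float = 0.8,          # hypothetical similarity threshold
) -> Optional[np.ndarray]:
    """Among candidate objects in the images to be analyzed whose similarity
    to the first reference image exceeds the threshold, return the crop with
    a greater definition than the first reference image (step S53)."""
    ref_emb = embed(first_reference)
    matches = [c for c in candidate_crops if _sim(ref_emb, embed(c)) > similarity_threshold]
    clearer = [c for c in matches if sharpness(c) > sharpness(first_reference)]
    return max(clearer, key=sharpness, default=None)
```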
  • At step S54, the target object is detected in the identification image library according to the second reference image of the target object.
  • In a possible implementation, detection of the target object in the identification image library continues according to the determined second reference image of the target object. For example, detection of suspect B in the identification image library continues according to image 4 of suspect B. Because the definition of the second reference image is greater than that of the first reference image, the success rate of detecting the target object in the identification image library can be improved.
  • At step S55, identification information of the target object is determined according to the target object detected in the identification image library.
  • In a possible implementation, when the target object is detected in the identification image library according to the second reference image, the identification information of the target object can be obtained according to the detection result. For example, after the photo of the identity card of suspect B is detected in the identity card information library according to image 4 of suspect B, the identification information on the identity card, such as the name, age, and address of suspect B, can be obtained.
  • In this embodiment, when the target object cannot be detected in the identification image library according to the first reference image of the target object, the identification information of the target object can still be obtained by determining the second reference image of the target object in the image to be analyzed and detecting the target object in the identification image library according to the second reference image. If the first reference image is not clear, the identification information of the target object is obtained according to the second reference image, thereby improving the success rate of obtaining the identification information of the target object.
  • FIG. 5 is a flowchart of a target object tracking method according to an exemplary embodiment. As shown in FIG. 5, the target object tracking method further includes:
  • at step S60, an association object of the target object is determined in the image to be analyzed.
  • In a possible implementation, the association object of the target object may include an object that appears at the same location as the target object at a different time, and may also include an object that appears at the same location and the same time as the target object. For example, the association object may include an object that appears at location 1 and location 2 with the target object at different times, and may also include an object that appears at location 3 with the target object at the same three times. The association object of the target object is determined according to requirements.
  • Candidate objects that appear at the same location as the target object can be detected in the image to be analyzed, and the association object is determined from the candidate objects according to a preset association object determination condition.
  • The target object may have a plurality of association objects.
  • At step S70, a trajectory of the association object is determined.
  • In a possible implementation, time information and location information of the association object are determined in the image to be analyzed according to the image where the association object is located, and the trajectory of the association object is determined according to the time information and the location information of the association object. The determination process of the trajectory of the association object is similar to the generation process of the trajectory of the target object. Reference can be made to the generation process of the trajectory of the target object in the embodiment shown in FIG. 1.
  • Step S40 includes:
  • at step S42, the tracking information for tracking the target object is generated according to the trajectory of the target object and the trajectory of the association object.
  • In a possible implementation, in the case that there is plenty of time information and location information of the target object in the trajectory of the target object, in order to track the target object in a more targeted fashion, a cross trajectory of the target object and the association object is generated according to the trajectory of the target object and the trajectory of the association object, and the tracking information for tracking the target object is generated using the cross trajectory.
  • In the case that there is little time information and location information of the target object in the trajectory of the target object, in order to generate more useful tracking information, the trajectory of the target object and the trajectory of the association object are combined to generate a combined trajectory, and the tracking information for tracking the target object is generated using the combined trajectory.
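  • Both combinations may be sketched as simple set operations over (time, location) points; this is an assumption for illustration, as the disclosure does not prescribe how trajectories are merged:

```python
from datetime import datetime
from typing import List, Tuple

Sighting = Tuple[datetime, Tuple[float, float]]


def combined_trajectory(target: List[Sighting], assoc: List[Sighting]) -> List[Sighting]:
    """Merge both trajectories in time order (useful when the target's own
    trajectory has few points)."""
    return sorted(set(target) | set(assoc), key=lambda s: s[0])


def cross_trajectory(target: List[Sighting], assoc: List[Sighting]) -> List[Sighting]:
    """Keep only points the two trajectories share (useful when the target's
    trajectory already has many points)."""
    return sorted(set(target) & set(assoc), key=lambda s: s[0])
```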
  • In this embodiment, the association object of the target object is determined in the image to be analyzed, and the tracking trajectory for tracking the target object is generated according to the trajectory of the association object and the trajectory of the target object. The trajectory of the target object can be supplemented or corrected using the trajectory of the association object, such that more accurate tracking information is generated.
  • FIG. 6 is a flowchart of step S60 of a target object tracking method according to an exemplary embodiment. As shown in FIG. 6, the step S60 of the target object tracking method includes:
  • at step S61, a target image including the target object is determined in the image to be analyzed; and
  • At step S62, the association object of the target object is determined in the target image.
  • In a possible implementation, the target image including the target object is determined in the image to be analyzed. The target image is the image to be analyzed where the target object is located.
  • A plurality of target images of the target object may be determined in the image to be analyzed. A target image may include one or more objects other than the target object, and the other objects included in each target image may be different. The association object can be determined in the target images on the basis of different association object selection conditions according to requirements. For example, all of the other objects appearing in the target images may be determined as association objects, or only the other objects whose number of appearances across the target images is greater than a threshold may be determined as association objects.
  • For example, target object 1 has three target images, which are respectively target image 1, target image 2, and target image 3. In addition to the target object, target image 1 further includes object A, object B, and object C. In addition to the target object, target image 2 further includes object B, object C, object D, and object E. In addition to the target object, target image 3 further includes object A, object C, object D, and object E. According to the association object selection condition that the number of appearances is greater than a threshold, object C, whose number of appearances is greater than two, may be determined as the association object of the target object. According to an association object selection condition of appearing at the same location, all of object A to object E may also be determined as association objects of the target object.
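  • The appearance-count selection condition from this example may be sketched as follows; the per-image sets of object identifiers are assumed inputs:

```python
from collections import Counter
from typing import List, Set


def associates_by_count(objects_per_target_image: List[Set[str]],
                        threshold: int = 2) -> List[str]:
    """Other objects whose number of appearances across the target images is
    greater than the threshold are determined as association objects."""
    counts = Counter(obj for objs in objects_per_target_image for obj in objs)
    return [obj for obj, n in counts.items() if n > threshold]


# Matches the worked example above:
# associates_by_count([{"A", "B", "C"}, {"B", "C", "D", "E"}, {"A", "C", "D", "E"}])
# returns ["C"], since only object C appears more than twice.
```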
  • In this embodiment, the association object is determined in the target image after the target image of the target object is determined in the image to be analyzed. The association object can be conveniently and accurately determined using the target image.
  • In a possible implementation, the step S62 of the target object tracking method includes:
  • determining an object to be associated of the target object in the target image;
  • detecting the object to be associated in the image to be analyzed;
  • determining time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated;
  • determining a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and
  • when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determining the object to be associated as the association object of the target object.
  • In a possible implementation, the object to be associated is determined in the target image according to requirements. For example, the other objects appearing in the target image of the target object are determined as objects to be associated.
  • Detection is performed in the image to be analyzed according to the object to be associated in the target image. The object to be associated may be recognized in the image to be analyzed using an image recognition technology. The object to be associated may also be obtained by inputting the object to be associated in the target image to a neural network and detecting the object to be associated in the image to be analyzed using the neural network. When the object to be associated is detected in the image to be analyzed, the time information and the location information of the object to be associated are determined according to the time information and the location information of the image to be analyzed including the object to be associated. A plurality of pairs of time information and location information of the object to be associated may be determined.
  • A trajectory of the object to be associated is obtained according to the time information and the location information of the object to be associated. For example, the trajectory of the object to be associated may be obtained by associating the location information of the object to be associated in a time sequence. A linear trajectory of the object to be associated may also be obtained by marking the time information and the location information of the object to be associated on a map and linearly connecting the locations in a time sequence.
  • A degree-of-coincidence threshold is set according to requirements. If the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than the degree-of-coincidence threshold, the object to be associated is determined as the association object of the target object. The coincidence between the trajectory of the object to be associated and the trajectory of the target object may include complete coincidence of the time information and the location information of the object to be associated with those of the target object, and may also include coincidence within a set time range between the time information of the object to be associated and the time information of the target object, and/or coincidence within a set geographical range between the location information of the object to be associated and the location information of the target object.
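  • One assumed way to compute such a degree of coincidence, with hypothetical time and geographical tolerances, is:

```python
from datetime import datetime, timedelta
from typing import List, Tuple

Sighting = Tuple[datetime, Tuple[float, float]]


def coincidence_degree(
    a: List[Sighting],
    b: List[Sighting],
    time_tolerance: timedelta = timedelta(minutes=10),  # assumed set time range
    geo_tolerance: float = 0.001,                       # assumed set geographical range (degrees)
) -> float:
    """Fraction of points of trajectory `a` that coincide with some point of
    trajectory `b` within the set time range and geographical range."""
    if not a:
        return 0.0

    def near(p: Sighting, q: Sighting) -> bool:
        close_in_time = abs(p[0] - q[0]) <= time_tolerance
        close_in_space = (abs(p[1][0] - q[1][0]) <= geo_tolerance
                          and abs(p[1][1] - q[1][1]) <= geo_tolerance)
        return close_in_time and close_in_space

    hits = sum(1 for p in a if any(near(p, q) for q in b))
    return hits / len(a)
```

The result can then be compared against the degree-of-coincidence threshold to determine whether the object to be associated is the association object.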
  • In this embodiment, the association object of the target object is determined according to the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object, compared against the degree-of-coincidence threshold. The association object so determined has a close association relation with the target object, and its trajectory is therefore more valuable for correcting and supplementing the trajectory used to generate the tracking information.
  • It can be understood that the foregoing method embodiments mentioned in the present disclosure may be combined with each other to form combined embodiments without departing from the principle and the logic. Details are not described in the present disclosure due to space limitation.
  • FIG. 7 is a block diagram of a target object tracking apparatus according to an exemplary embodiment. As shown in FIG. 7, the target object tracking apparatus includes:
  • a first reference image obtaining module 10 configured to obtain a first reference image of a target object;
  • an information determining module 20 configured to determine time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed comprising the time information and the location information;
  • a trajectory determining module 30 configured to determine a trajectory of the target object according to the time information and the location information of the target object; and
  • a tracking information generating module 40 configured to generate tracking information for tracking the target object according to the trajectory of the target object.
  • FIG. 8 is a block diagram of a target object tracking apparatus according to an exemplary embodiment. As shown in FIG. 8, in a possible implementation, the apparatus further includes:
  • a first identification information determining module 50 configured to determine identification information of the target object.
  • The tracking information generating module 40 includes:
  • a first tracking information generating sub-module 41 configured to generate tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object.
  • In a possible implementation, the first identification information determining module 50 includes:
  • a first detecting sub-module 51 configured to detect the target object in an identification image library according to the first reference image of the target object, identification images in the identification image library including identification information of objects; and a first identification information determining sub-module 52 configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the first identification information determining module 50 further includes:
  • a second reference image obtaining sub-module 53 configured to, when it is unable to detect the target object in the identification image library according to the first reference image of the target object, determine a second reference image of the target object in the image to be analyzed, the definition of the second reference image being greater than that of the first reference image;
  • a second detecting sub-module 54 configured to detect the target object in the identification image library according to the second reference image of the target object; and
  • a second identification information determining sub-module 55 configured to determine the identification information of the target object according to the target object detected in the identification image library.
  • In a possible implementation, the apparatus further includes:
  • an association object determining module 60 configured to determine an association object of the target object in the image to be analyzed; and
  • an association object trajectory determining module 70 configured to determine a trajectory of the association object.
  • The tracking information generating module 40 includes:
  • a second tracking information generating sub-module 42 configured to generate tracking information for tracking the target object according to the trajectory of the target object and the trajectory of the association object.
  • In a possible implementation, the association object determining module 60 includes:
  • a target image determining sub-module 61 configured to determine in the image to be analyzed a target image including the target object; and a first association object determining sub-module 62 configured to determine the association object of the target object in the target image.
  • In a possible implementation, the first association object determining sub-module 62 includes:
  • an object to be associated determining unit configured to determine an object to be associated of the target object in the target image;
  • an object to be associated detecting unit configured to detect the object to be associated in the image to be analyzed;
  • an object to be associated information determining unit configured to determine time information and location information of the object to be associated in the image to be analyzed according to the detected object to be associated;
  • an object to be associated trajectory determining unit configured to determine a trajectory of the object to be associated according to the time information and the location information of the object to be associated; and a second association object determining unit configured to, when the degree of coincidence between the trajectory of the object to be associated and the trajectory of the target object is greater than a degree-of-coincidence threshold, determine the object to be associated as the association object of the target object.
  • In some embodiments, the functions provided by or the modules included in the apparatuses provided by the embodiments of the present disclosure may be used to implement the methods described in the foregoing method embodiments. For specific implementations, reference may be made to the description in the method embodiments above. For the purpose of brevity, details are not described herein again.
  • The embodiments of the present disclosure further provide an electronic device, including: a processor; and a memory configured to store processor-executable instructions, wherein the processor executes the target object tracking method by directly or indirectly calling the executable instructions.
  • The embodiments of the present disclosure further provide a computer-readable storage medium, having computer program instructions stored thereon, where when the computer program instructions are executed by a processor, the target object tracking method is implemented. The computer-readable storage medium may be a nonvolatile computer-readable storage medium or a volatile computer-readable storage medium.
  • The embodiments of the present disclosure also provide a computer program, including a computer-readable code, where when the computer-readable code runs in an electronic device, a processor in the electronic device executes the target object tracking method.
  • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment. For example, the electronic device may be provided as a terminal, a server, or other forms of devices. For example, the electronic device includes a target object tracking apparatus 1900. Referring to FIG. 9, the device 1900 includes a processing component 1922 which further includes one or more processors, and a memory resource represented by a memory 1932 and configured to store instructions executable by the processing component 1922, for example, an application program. The application program stored in the memory 1932 may include one or more modules, each of which corresponds to a set of instructions. Further, the processing component 1922 may be configured to execute instructions so as to execute the above methods.
  • The device 1900 may further include a power supply component 1926 configured to execute power management of the device 1900, a wired or wireless network interface 1950 configured to connect the device 1900 to the network, and an input/output (I/O) interface 1958. The device 1900 may be operated based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™ or the like. In an exemplary embodiment, a computer-readable storage medium is further provided, for example, a memory 1932 including computer program instructions, which can be executed by the processing component 1922 of the device 1900 to implement the method above.
  • The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium, on which computer-readable program instructions used by the processor to implement various aspects of the present disclosure are stored.
  • The computer-readable storage medium may be a tangible device that can maintain and store instructions used by an instruction execution device. For example, the computer-readable storage medium may be, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any appropriate combination thereof. More specific examples (a non-exhaustive list) of the computer-readable storage medium include a portable computer disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable Compact Disc Read-Only Memory (CD-ROM), a Digital Versatile Disk (DVD), a memory stick, a floppy disk, a mechanical coding device such as a punched card storing an instruction or a protrusion structure in a groove, and any appropriate combination thereof. The computer-readable storage medium used here is not interpreted as an instantaneous signal such as a radio wave or other freely propagated electromagnetic wave, an electromagnetic wave propagated by a waveguide or other transmission media (for example, an optical pulse transmitted by an optical fiber cable), or an electrical signal transmitted by a wire.
  • The computer-readable program instruction described here is downloaded from a computer-readable storage medium to each computing/processing device, or downloaded to an external computer or an external storage device via a network, such as the Internet, a local area network, a wide area network, and/or a wireless network. The network may include a copper transmission cable, optical fiber transmission, wireless transmission, a router, a firewall, a switch, a gateway computer, and/or an edge server. A network adapter card or a network interface in each computing/processing device receives the computer-readable program instruction from the network, and forwards the computer-readable program instruction, so that the computer-readable program instruction is stored in a computer-readable storage medium in each computing/processing device.
  • Computer program instructions for executing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, status setting data, or source code or target code written in any combination of one or more programming languages. The programming languages include an object-oriented programming language such as Smalltalk or C++, and a conventional procedural programming language such as the "C" language or a similar programming language. The computer-readable program instructions can be completely executed on a user computer, partially executed on a user computer, executed as an independent software package, executed partially on a user computer and partially on a remote computer, or completely executed on a remote computer or a server. In the case of a remote computer, the remote computer may be connected to a user computer via any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, connected via the Internet with the aid of an Internet service provider). In some embodiments, an electronic circuit such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA) is personalized by using status information of the computer-readable program instructions, and the electronic circuit can execute the computer-readable program instructions to implement various aspects of the present disclosure.
  • Various aspects of the present disclosure are described here with reference to the flowcharts and/or block diagrams of the methods, apparatuses (systems), and computer program products according to the embodiments of the present disclosure. It should be understood that each block in the flowcharts and/or block diagrams and a combination of the blocks in the flowcharts and/or block diagrams can be implemented with the computer-readable program instructions.
  • These computer-readable program instructions may be provided for a general-purpose computer, a dedicated computer, or a processor of another programmable data processing apparatus to generate a machine, so that when the instructions are executed by the computer or the processors of other programmable data processing apparatuses, an apparatus for implementing a specified function/action in one or more blocks in the flowcharts and/or block diagrams is generated. These computer-readable program instructions may also be stored in a computer-readable storage medium. These instructions cause a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having instructions stored thereon includes an article of manufacture including instructions which implement the aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatuses, or other devices, so that a series of operations and steps are executed on the computer, the other programmable apparatuses, or the other devices, thereby generating computer-implemented processes. Therefore, the instructions executed on the computer, the other programmable apparatuses, or the other devices implement the specified functions/actions in the one or more blocks in the flowcharts and/or block diagrams.
  • The flowcharts and block diagrams in the accompanying drawings show architectures, functions, and operations that may be implemented by the systems, methods, and computer program products in the embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a part of instruction, and the module, the program segment, or the part of instruction includes one or more executable instructions for implementing a specified logical function. In some alternative implementations, functions marked in the block may also occur in an order different from that marked in the accompanying drawings. For example, two consecutive blocks are actually executed substantially in parallel, or are sometimes executed in a reverse order, depending on the involved functions. It should also be noted that each block in the block diagrams and/or flowcharts and a combination of blocks in the block diagrams and/or flowcharts may be implemented by using a dedicated hardware-based system configured to execute specified functions or actions, or may be implemented by using a combination of dedicated hardware and computer instructions.
  • The embodiments of the present disclosure are described above. The foregoing descriptions are exemplary but not exhaustive, and are not limited to the disclosed embodiments. For a person of ordinary skill in the art, many modifications and variations are all obvious without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable other persons of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (20)

1-17. (canceled)
18. A target object tracking method, comprising:
obtaining a first reference image of a target object and determining identification information of the target object;
determining time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed comprising the time information and the location information;
determining a trajectory of the target object according to the time information and the location information of the target object; and
generating tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object,
wherein determining identification information of the target object comprises:
when the target object cannot be detected in an identification image library according to the first reference image of the target object, determining a second reference image of the target object in the image to be analyzed, identification images in the identification image library comprising identification information of objects, and the definition of the second reference image being greater than that of the first reference image;
detecting the target object in the identification image library according to the second reference image of the target object; and
determining the identification information of the target object according to the target object detected in the identification image library.
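For illustration, the following minimal Python sketch walks through the flow recited in claim 18: identify the target with the first reference image, fall back to a higher-definition second reference image taken from the images to be analyzed when that fails, and then order the detections by capture time to form the trajectory. The data model and helper names (Image, similarity, search_library, track_target) and the threshold parameter are assumptions made for this sketch, not terms from the specification; a real system would compare learned visual features rather than the toy distance used here.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Image:
    features: Tuple[float, ...]             # embedding of the pictured object (assumed)
    definition: float                       # image clarity score
    time: float                             # capture time carried by the image
    location: Tuple[float, float]           # capture location carried by the image
    identification: Optional[dict] = None   # present only for identification images

def similarity(a: Image, b: Image) -> float:
    # Toy similarity: negative squared distance between feature vectors.
    return -sum((x - y) ** 2 for x, y in zip(a.features, b.features))

def search_library(library: List[Image], ref: Image, threshold: float) -> Optional[Image]:
    # Detect the target in the identification image library using a reference image.
    best = max(library, key=lambda img: similarity(img, ref), default=None)
    return best if best is not None and similarity(best, ref) > threshold else None

def track_target(first_ref: Image, to_analyze: List[Image],
                 library: List[Image], threshold: float) -> dict:
    # Step 1: try to identify the target with the first reference image.
    match = search_library(library, first_ref, threshold)
    if match is None:
        # Step 2 (the fallback of claim 18): pick a higher-definition second
        # reference image of the target from the images to be analyzed and
        # retry the identification with it.
        candidates = [img for img in to_analyze
                      if similarity(img, first_ref) > threshold]
        second_ref = max(candidates, key=lambda img: img.definition, default=None)
        if second_ref is not None and second_ref.definition > first_ref.definition:
            match = search_library(library, second_ref, threshold)
    # Step 3: order the target's detections by capture time to form the
    # trajectory, then bundle it with the identification information.
    detections = [img for img in to_analyze if similarity(img, first_ref) > threshold]
    trajectory = sorted((img.time, img.location) for img in detections)
    return {"identification": match.identification if match else None,
            "trajectory": trajectory}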
19. The method according to claim 18, wherein determining identification information of the target object further comprises:
when the target object can be detected in the identification image library according to the first reference image of the target object, detecting the target object in the identification image library according to the first reference image.
20. The method according to claim 18, wherein determining a second reference image of the target object in the image to be analyzed comprises:
determining the target object in the image to be analyzed according to the similarity between the candidate objects in the image to be analyzed and the first reference image; and
determining the second reference image of the target object in the image to be analyzed according to the determination result of the target object.
21. The method according to claim 20, wherein determining the target object in the image to be analyzed according to the similarity comprises:
when the similarity between one candidate object in the image to be analyzed and the target object is greater than a similarity threshold, determining the one candidate object as the target object.
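For illustration, a short sketch of the similarity thresholding recited in claims 20 and 21, assuming each candidate object in the image to be analyzed and the first reference image are represented by feature embeddings. Cosine similarity and the 0.8 threshold are assumptions of the sketch; the claims fix neither a similarity measure nor a threshold value.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def determine_target(candidate_embeddings, reference_embedding, threshold=0.8):
    # A candidate object is determined to be the target object when its
    # similarity to the first reference image exceeds the threshold.
    return [i for i, emb in enumerate(candidate_embeddings)
            if cosine_similarity(emb, reference_embedding) > threshold]

# Toy usage: only the first candidate is similar enough to be the target.
ref = np.array([1.0, 0.0, 0.0])
candidates = [np.array([0.9, 0.1, 0.0]), np.array([0.0, 1.0, 0.0])]
print(determine_target(candidates, ref))  # -> [0]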
22. The method according to claim 18, wherein the identification image library includes identification images of a plurality of target objects, and the identification images include identification information of the target objects.
23. The method according to claim 18, wherein the identification information of the target object at least includes a name, an attribute, and a feature of the target object.
24. The method according to claim 23, wherein if the target object is a human, the identification information of the target object at least includes identity card information, criminal record information, and social relation information.
25. The method according to claim 18, wherein if the image to be analyzed is an original captured image, the image to be analyzed includes a plurality of objects or a single object.
26. The method according to claim 18, wherein if the image to be analyzed is an image cropped from an original captured image, each image to be analyzed includes only one object.
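For illustration, a minimal sketch of the relationship between claims 25 and 26, assuming bounding boxes for the detected objects are already available; the box coordinates and image shape below are toy values.

import numpy as np

def crop_objects(original: np.ndarray, boxes):
    # Split an original captured image, which may contain several objects
    # (claim 25), into per-object crops so that each cropped image to be
    # analyzed includes only one object (claim 26). Boxes are given as
    # (x1, y1, x2, y2) in pixel coordinates.
    return [original[y1:y2, x1:x2].copy() for (x1, y1, x2, y2) in boxes]

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # toy captured image
crops = crop_objects(frame, [(10, 20, 110, 220), (300, 50, 400, 250)])
print([c.shape for c in crops])  # -> [(200, 100, 3), (200, 100, 3)]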
27. A target object tracking apparatus, comprising:
a processor; and
a memory having stored thereon instructions that, when executed by the processor, cause the processor to:
obtain a first reference image of a target object and determine identification information of the target object;
determine time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed comprising the time information and the location information;
determine a trajectory of the target object according to the time information and the location information of the target object; and
generate tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object,
wherein determining identification information of the target object comprises:
when the target object cannot be detected in an identification image library according to the first reference image of the target object, determining a second reference image of the target object in the image to be analyzed, identification images in the identification image library comprising identification information of objects, and the definition of the second reference image being greater than that of the first reference image;
detecting the target object in the identification image library according to the second reference image of the target object; and
determining the identification information of the target object according to the target object detected in the identification image library.
28. The apparatus according to claim 27, wherein determining identification information of the target object further comprises:
when the target object can be detected in the identification image library according to the first reference image of the target object, detecting the target object in the identification image library according to the first reference image.
29. The apparatus according to claim 27, wherein determining a second reference image of the target object in the image to be analyzed comprises:
determining the target object in the image to be analyzed according to the similarity between the candidate objects in the image to be analyzed and the first reference image; and
determining the second reference image of the target object in the image to be analyzed according to the determination result of the target object.
30. The apparatus according to claim 29, wherein determining the target object in the image to be analyzed according to the similarity comprises:
when the similarity between one candidate object in the image to be analyzed and the target object is greater than a similarity threshold, determining the one candidate object as the target object.
31. The apparatus according to claim 27, wherein the identification image library includes identification images of a plurality of target objects, and the identification images include identification information of the target objects.
32. The apparatus according to claim 27, wherein the identification information of the target object at least includes a name, an attribute, and a feature of the target object.
33. The apparatus according to claim 32, wherein if the target object is a human, the identification information of the target object at least includes identity card information, criminal record information, and social relation information.
34. The apparatus according to claim 27, wherein if the image to be analyzed is an original captured image, the image to be analyzed includes a plurality of objects or a single object.
35. The apparatus according to claim 27, wherein if the image to be analyzed is an image cropped from an original captured image, each image to be analyzed includes only one object.
36. A non-transitory computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, cause the processor to perform a target object tracking method comprising:
obtaining a first reference image of a target object and determining identification information of the target object;
determining time information and location information of the target object in an image to be analyzed according to the first reference image, the image to be analyzed comprising the time information and the location information;
determining a trajectory of the target object according to the time information and the location information of the target object; and
generating tracking information for tracking the target object according to the trajectory of the target object and the identification information of the target object,
wherein determining identification information of the target object comprises:
when the target object cannot be detected in an identification image library according to the first reference image of the target object, determining a second reference image of the target object in the image to be analyzed, identification images in the identification image library comprising identification information of objects, and the definition of the second reference image being greater than that of the first reference image;
detecting the target object in the identification image library according to the second reference image of the target object; and
determining the identification information of the target object according to the target object detected in the identification image library.
US17/497,648 2018-06-01 2021-10-08 Target Object Tracking Method and Apparatus, and Storage Medium Abandoned US20220044417A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/497,648 US20220044417A1 (en) 2018-06-01 2021-10-08 Target Object Tracking Method and Apparatus, and Storage Medium

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201810558523.6 2018-06-01
CN201810558523.6A CN108897777B (en) 2018-06-01 2018-06-01 Target object tracking method and device, electronic equipment and storage medium
PCT/CN2019/087261 WO2019228194A1 (en) 2018-06-01 2019-05-16 Target object tracking method and apparatus, electronic device, and storage medium
US16/913,768 US11195284B2 (en) 2018-06-01 2020-06-26 Target object tracking method and apparatus, and storage medium
US17/497,648 US20220044417A1 (en) 2018-06-01 2021-10-08 Target Object Tracking Method and Apparatus, and Storage Medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/913,768 Continuation US11195284B2 (en) 2018-06-01 2020-06-26 Target object tracking method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
US20220044417A1 true US20220044417A1 (en) 2022-02-10

Family

ID=64344001

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/913,768 Active US11195284B2 (en) 2018-06-01 2020-06-26 Target object tracking method and apparatus, and storage medium
US17/497,648 Abandoned US20220044417A1 (en) 2018-06-01 2021-10-08 Target Object Tracking Method and Apparatus, and Storage Medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/913,768 Active US11195284B2 (en) 2018-06-01 2020-06-26 Target object tracking method and apparatus, and storage medium

Country Status (5)

Country Link
US (2) US11195284B2 (en)
JP (1) JP7073527B2 (en)
CN (1) CN108897777B (en)
SG (1) SG11202005970QA (en)
WO (1) WO2019228194A1 (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108897777B (en) * 2018-06-01 2022-06-17 深圳市商汤科技有限公司 Target object tracking method and device, electronic equipment and storage medium
CN109547748B (en) * 2018-12-07 2021-06-11 苏州科达科技股份有限公司 Object foot point determining method and device and storage medium
CN112771566A (en) * 2018-12-24 2021-05-07 华为技术有限公司 Image processing apparatus, image processing method, and program
CN111385834B (en) * 2018-12-27 2023-08-18 深圳市大数据研究院 Object recognition method, device, electronic equipment and computer readable storage medium
CN109784220B (en) * 2018-12-28 2022-06-17 上海依图网络科技有限公司 Method and device for determining passerby track
CN109740004B (en) * 2018-12-28 2023-07-11 上海依图网络科技有限公司 Filing method and device
CN111382627B (en) * 2018-12-28 2024-03-26 成都云天励飞技术有限公司 Method for judging peer and related products
CN109800664B (en) * 2018-12-28 2024-01-12 上海依图网络科技有限公司 Method and device for determining passersby track
CN109800329B (en) * 2018-12-28 2021-07-02 上海依图网络科技有限公司 Monitoring method and device
CN111429476B (en) * 2019-01-09 2023-10-20 杭州海康威视系统技术有限公司 Method and device for determining action track of target person
JP7330708B2 (en) * 2019-01-28 2023-08-22 キヤノン株式会社 Image processing device, image processing method, and program
CN111524160A (en) * 2019-02-01 2020-08-11 深圳市商汤科技有限公司 Track information acquisition method and device, electronic equipment and storage medium
CN110110690B (en) * 2019-05-16 2023-04-07 廊坊鑫良基科技有限公司 Target pedestrian tracking method, device, equipment and storage medium
CN110659560B (en) * 2019-08-05 2022-06-28 深圳市优必选科技股份有限公司 Method and system for identifying associated object
CN110502651B (en) * 2019-08-15 2022-08-02 深圳市商汤科技有限公司 Image processing method and device, electronic equipment and storage medium
CN111126807B (en) * 2019-12-12 2023-10-10 浙江大华技术股份有限公司 Stroke segmentation method and device, storage medium and electronic device
CN111010547A (en) * 2019-12-23 2020-04-14 浙江大华技术股份有限公司 Target object tracking method and device, storage medium and electronic device
CN113326410B (en) * 2020-02-28 2024-02-13 拓尔思天行网安信息技术有限责任公司 Method and device for analyzing behavior track of relatives based on space-time association
CN112037927A (en) * 2020-08-24 2020-12-04 北京金山云网络技术有限公司 Method and device for determining co-pedestrian associated with tracked person and electronic equipment
CN112040186B (en) * 2020-08-28 2023-01-31 北京市商汤科技开发有限公司 Method, device and equipment for determining activity area of target object and storage medium
CN113487651B (en) * 2021-06-17 2022-07-05 超节点创新科技(深圳)有限公司 Luggage tracking method, device, equipment and readable storage medium
KR102473801B1 (en) * 2021-07-07 2022-12-05 주식회사 딥핑소스 Method for annonymous tracking objects in places and device using them
US11388330B1 (en) * 2021-07-07 2022-07-12 Deeping Source Inc. Method for tracking target objects in a specific space, and device using the same
CN114185964A (en) * 2021-12-03 2022-03-15 深圳市商汤科技有限公司 Data processing method, device, equipment, storage medium and program product

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130034266A1 (en) * 2010-02-21 2013-02-07 Elbit Systems Ltd. Method and system for detection and tracking employing multi-view multi-spectral imaging
US20130307974A1 (en) * 2012-05-17 2013-11-21 Canon Kabushiki Kaisha Video processing apparatus and method for managing tracking object
US20160140391A1 (en) * 2014-11-14 2016-05-19 Intel Corporation Automatic target selection for multi-target object tracking
US20160373661A1 (en) * 2015-06-16 2016-12-22 Chengdu Ck Technology Co., Ltd. Camera system for generating images with movement trajectories
US20170248971A1 (en) * 2014-11-12 2017-08-31 SZ DJI Technology Co., Ltd. Method for detecting target object, detection apparatus and robot
US20180158197A1 (en) * 2016-12-01 2018-06-07 Skydio, Inc. Object tracking by an unmanned aerial vehicle using visual sensors
US20190066334A1 (en) * 2017-08-25 2019-02-28 Boe Technology Group Co., Ltd. Method, apparatus, terminal and system for measuring trajectory tracking accuracy of target
US20200242780A1 (en) * 2019-01-28 2020-07-30 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US20200327678A1 (en) * 2018-06-01 2020-10-15 Shenzhen Sensetime Technology Co., Ltd. Target Object Tracking Method and Apparatus, and Storage Medium
US20200374492A1 (en) * 2016-08-26 2020-11-26 Zhejiang Dahua Technology Co., Ltd. Methods and systems for object monitoring
US20200401857A1 (en) * 2018-12-21 2020-12-24 Shanghai Sensetime Intelligent Technology Co., Ltd. Image processing method and apparatus, electronic device, and storage medium

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5631972A (en) * 1995-05-04 1997-05-20 Ferris; Stephen Hyperladder fingerprint matcher
JP2006236255A (en) 2005-02-28 2006-09-07 Mitsubishi Electric Corp Person-tracking device and person-tracking system
JP2006236225A (en) 2005-02-28 2006-09-07 Toshiba Corp Traffic congestion detecting device, traffic congestion information providing device
JP5349367B2 (en) * 2010-02-26 2013-11-20 セコム株式会社 Moving object tracking device
TWI425454B (en) * 2010-12-28 2014-02-01 Ind Tech Res Inst Method, system and computer program product for reconstructing moving path of vehicle
EP2693404B1 (en) * 2011-03-28 2019-04-24 Nec Corporation Person tracking device, person tracking method, and non-temporary computer-readable medium storing person tracking program
CN102843547B (en) * 2012-08-01 2014-01-08 安科智慧城市技术(中国)有限公司 Intelligent tracking method and system for suspected target
CN106203458B (en) * 2015-04-29 2020-03-06 杭州海康威视数字技术股份有限公司 Crowd video analysis method and system
CN105138954B (en) * 2015-07-12 2019-06-04 上海微桥电子科技有限公司 A kind of image automatic screening inquiry identifying system
CN105574393A (en) * 2015-07-31 2016-05-11 宇龙计算机通信科技(深圳)有限公司 App access method and terminal
CN105205155B (en) * 2015-09-25 2019-05-03 珠海世纪鼎利科技股份有限公司 A kind of screening system and method for big data crime partner
CN105404890B (en) * 2015-10-13 2018-10-16 广西师范学院 A kind of criminal gang's method of discrimination for taking track space and time order into account
CN105354548B (en) * 2015-10-30 2018-10-26 武汉大学 A kind of monitor video pedestrian recognition methods again based on ImageNet retrievals
CN105976399A (en) * 2016-04-29 2016-09-28 北京航空航天大学 Moving object detection method based on SIFT (Scale Invariant Feature Transform) feature matching
CN106127142A (en) * 2016-06-21 2016-11-16 北京小米移动软件有限公司 Object tracking method and device
CN107665495B (en) * 2016-07-26 2021-03-16 佳能株式会社 Object tracking method and object tracking device
CN106056110A (en) * 2016-08-09 2016-10-26 成都联众智科技有限公司 Face detection tracking method in traffic monitoring video
CN106650652A (en) * 2016-12-14 2017-05-10 黄先开 Trajectory tracking system and method based on face recognition technology
CN108052882A (en) * 2017-11-30 2018-05-18 广东云储物联视界科技有限公司 A kind of operating method of intelligent safety defense monitoring system

Also Published As

Publication number Publication date
SG11202005970QA (en) 2020-07-29
CN108897777B (en) 2022-06-17
US20200327678A1 (en) 2020-10-15
US11195284B2 (en) 2021-12-07
JP2021508900A (en) 2021-03-11
JP7073527B2 (en) 2022-05-23
CN108897777A (en) 2018-11-27
WO2019228194A1 (en) 2019-12-05

Similar Documents

Publication Publication Date Title
US20220044417A1 (en) Target Object Tracking Method and Apparatus, and Storage Medium
US10999075B2 (en) Blockchain-based patrol inspection proof storage method, apparatus, and electronic device
JP6446971B2 (en) Data processing apparatus, data processing method, and computer program
EP3382643B1 (en) Automated object tracking in a video feed using machine learning
US9292939B2 (en) Information processing system, information processing method and program
WO2019179295A1 (en) Facial recognition method and device
US9734411B2 (en) Locating objects using images from portable devices
US9285868B2 (en) Camera device, communication system, and camera system
US9947105B2 (en) Information processing apparatus, recording medium, and information processing method
US20240037610A1 (en) Computer Vision Systems and Methods for Automatically Detecting, Classifying, and Pricing Objects Captured in Images or Videos
JP2017168029A (en) Device, program, and method for predicting position of examination object by action value
US10659680B2 (en) Method of processing object in image and apparatus for same
US11526999B2 (en) Object tracking using multi-camera system
CN111563398A (en) Method and device for determining information of target object
CN108876817B (en) Cross track analysis method and device, electronic equipment and storage medium
CN110505438B (en) Queuing data acquisition method and camera
CN114663871A (en) Image recognition method, training method, device, system and storage medium
US10902249B2 (en) Video monitoring
CN109033264B (en) Video analysis method and device, electronic equipment and storage medium
US10049456B2 (en) Verification of business processes using spatio-temporal data
WO2022125353A1 (en) Reducing false negatives and finding new classes in object detectors
US11410443B2 (en) Labelling training method and system for implementing the same
US20210312323A1 (en) Generating performance predictions with uncertainty intervals
CN104268445A (en) Method and device for preventing secondary picture spreading
CN116630879A (en) Device loss detection method, device and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHENZHEN SENSETIME TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, JING;ZHANG, GUANGCHENG;LI, WEILIN;AND OTHERS;REEL/FRAME:057744/0402

Effective date: 20200618

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION