CN111382603A - Track calculation device and method - Google Patents


Info

Publication number
CN111382603A
CN111382603A (application CN201811620137.1A)
Authority
CN
China
Prior art keywords
image
optical flow
tracking targets
frame
tracking target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811620137.1A
Other languages
Chinese (zh)
Other versions
CN111382603B (en)
Inventor
李�杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SF Technology Co Ltd
Original Assignee
SF Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SF Technology Co Ltd filed Critical SF Technology Co Ltd
Priority to CN201811620137.1A priority Critical patent/CN111382603B/en
Publication of CN111382603A publication Critical patent/CN111382603A/en
Application granted granted Critical
Publication of CN111382603B publication Critical patent/CN111382603B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/215 Motion-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a track calculation device and a track calculation method. Image features corresponding to different tracking targets in a multi-frame sequential image are generated from the targets' respective motion features in the sequence; the corresponding tracking targets are then identified in the multi-frame sequential images from those image features, and the motion trajectories of the corresponding targets are generated. The calculation is simple and fast, occupies few resources, and meets real-time processing requirements.

Description

Track calculation device and method
Technical Field
The invention relates to video structural analysis, and in particular to a trajectory calculation device and method.
Background
The CGAN generates a mask area containing the trajectories of both the throwing person and the thrown object, so computing the distance directly from a mask that includes the person's influence yields an overestimate. The problem is how to eliminate the person's influence from the mask area so that only the length of the thrown object's trajectory, i.e. the throw distance, is calculated.
In an alternative scheme based on OpenPose, the person's skeleton and hand position are found; the hand position is taken as the starting point, and the end of the mask area on the side far from the skeleton as the end point. The drawbacks of this scheme are that OpenPose is computationally expensive and does not meet real-time processing requirements, and that for scenes without a person the skeleton information cannot be computed or is inaccurate.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a trajectory calculation device and method.
According to an aspect of the present invention, there is provided a trajectory calculation method including the steps of:
generating the image features corresponding to different tracking targets in a multi-frame sequential image according to the targets' respective motion features in the sequence;
and identifying the corresponding tracking targets in the multi-frame sequential images according to those image features, and generating the motion trajectories of the corresponding targets.
Further,
the multi-frame sequential images are multi-frame optical-flow sequential images; the image features corresponding to the different tracking targets are their respective optical-flow features in the multi-frame optical-flow sequential images; and an optical-flow feature is a preset number of feature points of a tracking target whose optical-flow values deviate from a corresponding preset threshold.
Identifying the corresponding tracking targets in the multi-frame sequential images according to those image features and generating the motion trajectories of the corresponding targets includes:
segmenting the different tracking targets in the corresponding regions of interest of the multi-frame optical-flow sequential images according to their respective optical-flow features, identifying the corresponding targets, and generating the motion trajectories of the corresponding targets from the targets identified across the multi-frame sequential images.
Segmenting the different tracking targets in the corresponding regions of interest of the multi-frame optical-flow sequential images according to their respective optical-flow features and identifying the corresponding targets includes the following steps:
acquiring the optical-flow values of the multi-frame optical-flow sequential images, and marking the image positions whose optical-flow values deviate from a preset threshold with a first identification mark to generate a reference image;
acquiring the region of interest corresponding to the multi-frame optical-flow sequential images, counting the tracking-target feature points in the region of interest whose optical-flow values deviate from the preset threshold and, where the count exceeds a preset number, giving those feature points the first identification mark and the remaining feature points a second identification mark, to generate a comparison feature image;
and identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number in the comparison feature image.
The different tracking targets include an executing body that performs the throwing action and a thrown object.
Identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number in the comparison feature image, and generating the motion trajectory of the corresponding target from the targets identified in the multi-frame sequential images, includes:
comparing the number of first identification marks in the reference image with the number in the comparison feature image, and identifying the executing body and the thrown object from the change in that number: the tracking target whose first-mark count changes less is the area where the executing body performing the throwing action is located, while the tracking target whose count changes more is the thrown object; the motion trajectory of the corresponding target is then generated from the targets identified across the multi-frame sequential images.
According to another aspect of the present invention, there is provided a trajectory calculation device including:
the image feature generation unit is configured to generate respective corresponding image features of different tracking targets in a multi-frame sequential image according to respective motion features of the different tracking targets in the multi-frame sequential image;
and the motion track generation unit of the tracking target is configured to identify a corresponding tracking target in the multi-frame sequential image according to the image features corresponding to the different tracking targets respectively, and generate a motion track of the corresponding tracking target.
Further, the multi-frame sequential images are multi-frame optical-flow sequential images; the image features corresponding to the different tracking targets are their respective optical-flow features in the multi-frame optical-flow sequential images; and an optical-flow feature is a preset number of feature points of a tracking target whose optical-flow values deviate from a corresponding preset threshold.
The motion trajectory generation unit of the tracking target is further configured to:
segmenting the different tracking targets in the corresponding regions of interest of the multi-frame optical-flow sequential images according to their respective optical-flow features, identifying the corresponding targets, and generating the motion trajectories of the corresponding targets from the targets identified across the multi-frame sequential images.
The motion trajectory generation unit of the tracking target is further configured to:
acquiring the optical-flow values of the multi-frame optical-flow sequential images, and marking the image positions whose optical-flow values deviate from a preset threshold with a first identification mark to generate a reference image;
acquiring the region of interest corresponding to the multi-frame optical-flow sequential images, counting the tracking-target feature points in the region of interest whose optical-flow values deviate from the preset threshold and, where the count exceeds a preset number, giving those feature points the first identification mark and the remaining feature points a second identification mark, to generate a comparison feature image;
and identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number in the comparison feature image.
According to another aspect of the present invention, there is provided an apparatus comprising:
one or more processors;
a memory for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of the above.
According to another aspect of the invention, there is provided a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as defined in any one of the above.
Compared with the prior art, the invention has the following beneficial effects:
1. According to the trajectory calculation method disclosed by the invention, the image features corresponding to different tracking targets in a multi-frame sequential image are generated from the targets' respective motion features; the corresponding targets are identified from those image features and their motion trajectories are generated. The calculation is simple and fast, occupies few resources, and meets real-time processing requirements, solving the problems of the conventional method such as the large overhead of OpenPose, failure to meet real-time processing requirements, and inaccurate calculation.
2. Through the cooperation of its units, the trajectory calculation device of the invention identifies different tracking targets from the image features generated by their respective motion features and generates the corresponding motion trajectories; the calculation is simple and fast and occupies few resources.
3. The apparatus and the computer-readable storage medium storing the computer program of the embodiments of the invention likewise identify different tracking targets from such image features and generate the corresponding motion trajectories; they are simple and fast, occupy few resources, meet real-time processing requirements, and are easy to popularize.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of a computer system according to an embodiment,
in the figure: 100 computer system, 101 CPU, 102 ROM, 103 RAM, 104 bus, 105 I/O interface, 106 input section, 107 output section, 108 storage section, 109 communication section, 110 drive, 111 removable medium.
Detailed Description
To better understand the technical scheme of the invention, it is further explained below with reference to a specific embodiment and the accompanying drawings.
The first embodiment is as follows:
a trajectory calculation method of this embodiment, as shown in fig. 1, includes the following steps:
s1, according to the respective motion characteristics of different tracking targets in the multi-frame sequence image, generating the respective corresponding image characteristics of the different tracking targets in the multi-frame sequence image.
The multi-frame sequential images are multi-frame optical-flow sequential images, and the different tracking targets include an executing body that performs the throwing action and a thrown object; the executing body may be a person or a robot performing the throwing action.
When the multi-frame sequential images are multi-frame optical-flow sequential images, the image features corresponding to the different tracking targets are their respective optical-flow features, i.e. a preset number of feature points whose optical-flow values deviate from a corresponding preset threshold. Since the optical-flow value generated by the optical-flow algorithm is 128 in areas without motion information and deviates from 128 in areas with motion information, the preset threshold is set to 128 or a value close to it, so that different tracking targets can be distinguished by the optical-flow values of their feature points.
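The 128-centered encoding can be reproduced by quantizing a dense optical-flow field into 8-bit images. The sketch below is a minimal illustration assuming a scale of one pixel of displacement per gray level, which the patent does not specify; the flow field itself could come from any dense method, for example OpenCV's Farneback algorithm.

```python
import numpy as np

def flow_to_uint8(flow):
    """Quantize a dense optical-flow field (H x W x 2 float pixel
    displacements) into two 8-bit channels centered at 128, so that
    motionless pixels encode exactly as gray value 128.
    The 1-pixel-per-gray-level scale is an assumption; the patent only
    states that areas without motion information map to 128."""
    flow_x = np.clip(np.rint(flow[..., 0]) + 128, 0, 255).astype(np.uint8)
    flow_y = np.clip(np.rint(flow[..., 1]) + 128, 0, 255).astype(np.uint8)
    return flow_x, flow_y
```

With this encoding, checking whether a pixel "deviates from 128" is a simple threshold on the quantized channels, which is what the later steps of the method rely on.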
S2, identifying corresponding tracking targets in the multi-frame sequential images according to the image features corresponding to the different tracking targets respectively, and generating motion tracks of the corresponding tracking targets.
When the multi-frame sequential image is a multi-frame optical flow sequential image, S2 includes:
segmenting the different tracking targets in the corresponding regions of interest of the multi-frame optical-flow sequential images according to their respective optical-flow features, identifying the corresponding targets, and generating the motion trajectories of the corresponding targets from the targets identified across the multi-frame sequential images.
More specifically, S2 includes:
S21, acquiring the optical-flow values of the multi-frame optical-flow sequential images, and marking the image positions whose optical-flow values deviate from the preset threshold with a first identification mark to generate a reference image. The first mark is set to gray value 255, although any gray value in 0-255 other than 128 and its neighborhood may be used; it must be distinguishable from the second identification mark for the convenience of later counting;
S22, acquiring the region of interest corresponding to the multi-frame optical-flow sequential images, counting the tracking-target feature points in the region of interest whose optical-flow values deviate from the preset threshold and, where the count exceeds the preset number, giving those feature points the first identification mark and the remaining feature points the second identification mark, to generate a comparison feature image. The second mark is set to gray value 0, although it may likewise be any gray value in 0-255 other than 128 and its neighborhood; it must be distinguishable from the first mark for the convenience of later counting.
S23, identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number in the comparison feature image.
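Steps S21 and S22 can be sketched as follows. This is a minimal sketch assuming mark values 255 and 0, a deviation tolerance `tol`, and an any-frame rule for the reference image, none of which the patent fixes beyond "deviating from a preset threshold":

```python
import numpy as np

def build_reference_image(flow_frames, center=128, tol=5):
    """S21 sketch: positions whose optical-flow value deviates from the
    preset threshold (center +/- tol) in ANY frame are marked with the
    first identification mark (gray 255); the rest stay 0."""
    deviates = np.abs(np.stack(flow_frames).astype(np.int16) - center) > tol
    return np.where(deviates.any(axis=0), 255, 0).astype(np.uint8)

def build_comparison_image(flow_frames, roi_mask, min_count, center=128, tol=5):
    """S22 sketch: inside the region of interest, count per pixel how many
    frames deviate from the threshold; pixels whose count exceeds the
    preset number get the first mark (255), all others the second (0)."""
    deviates = np.abs(np.stack(flow_frames).astype(np.int16) - center) > tol
    counts = deviates.sum(axis=0)
    marked = (counts > min_count) & roi_mask.astype(bool)
    return np.where(marked, 255, 0).astype(np.uint8)
```

S23 then reduces to counting the 255-valued pixels in each image and comparing the totals per candidate region.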
Since the different tracking targets include an executing body performing the throwing action and a thrown object, and the executing body may be a person or a robot, S23 specifically includes:
comparing the number of first identification marks in the reference image with the number in the comparison feature image, and identifying the person or robot performing the throwing action and the thrown object from the change in that number: the tracking target whose first-mark count changes less is the area where the person or robot performing the throwing action is located, while the target whose count changes more is the thrown object. The motion trajectory of the corresponding target, chiefly that of the thrown object, is then generated from the targets identified across the multi-frame sequential images.
The trajectory calculation method distinguishes the executing body from the thrown object mainly based on the fact that the thrown object appears only a few times at any given position in the optical-flow diagrams. The specific steps of S2 are as follows:
s21, marking the positions of the excessive optical flow values deviating from the preset range of 128 in the multi-frame optical flow sequence as 255 and taking the positions as reference images;
s22, counting the times of the appearance of points deviating from 128 preset ranges in the frame sequence (flowx, flowy) of the optical flow in the mask area;
s23, the mark of the step S22 with the times within a certain range is 255, and the rest marks are 0;
s24, the area that is the most decreased from the reference image in step S21 by 255 is the area of the execution body that executes the throwing motion, and this area is excluded from the mask area, and the remaining part is the mask of the throwing area.
Based on the fact that a deviating optical-flow value of the thrown object appears only a few times at any one position in the optical-flow frame sequence, whereas in the region of the executing body it appears many times, the executing body and the thrown-object region are distinguished in the multi-frame optical-flow sequential images, and the thrown object's trajectory is formed from the thrown object identified across those images.
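From the thrown object identified per frame, the trajectory and its length (the throw distance sought in the background section) can be assembled. Centroid-per-frame linking is an assumption for this sketch, as the patent only states that the trajectory is generated from the identified target:

```python
import numpy as np

def throw_trajectory_length(thrown_masks):
    """Assemble the thrown object's trajectory from its per-frame masks
    (one boolean H x W array per frame) by taking each frame's centroid,
    then sum the segment lengths to get the throw distance in pixels.
    Frames where the object is not visible are skipped."""
    pts = []
    for m in thrown_masks:
        ys, xs = np.nonzero(m)
        if xs.size:
            pts.append((xs.mean(), ys.mean()))
    pts = np.array(pts)
    if len(pts) < 2:
        return 0.0
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
```

Converting the pixel length to a physical distance would additionally require camera calibration, which is outside the scope of this sketch.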
The trajectory calculation device of the present embodiment includes:
the image feature generation unit is configured to generate the image features corresponding to different tracking targets in a multi-frame sequential image according to the targets' respective motion features. The multi-frame sequential images are multi-frame optical-flow sequential images, and the different tracking targets include an executing body that performs the throwing action and a thrown object. The image features corresponding to the different tracking targets are their respective optical-flow features, i.e. a preset number of feature points whose optical-flow values deviate from a corresponding preset threshold.
The motion trajectory generation unit of the tracking target is configured to identify the corresponding tracking targets in the multi-frame sequential images according to those image features and to generate their motion trajectories. When the multi-frame sequential images are multi-frame optical-flow sequential images, the unit is further configured to segment the different tracking targets in the corresponding regions of interest according to their respective optical-flow features, identify the corresponding targets, and generate the motion trajectories of the corresponding targets from the targets identified across the sequence.
It is further configured to:
acquire the optical-flow values of the multi-frame optical-flow sequential images, and mark the image positions whose optical-flow values deviate from a preset threshold with a first identification mark to generate a reference image;
acquire the region of interest corresponding to the multi-frame optical-flow sequential images, count the tracking-target feature points in the region of interest whose optical-flow values deviate from the preset threshold and, where the count exceeds a preset number, give those feature points the first identification mark and the remaining feature points a second identification mark, to generate a comparison feature image;
and identify the corresponding tracking target by comparing the number of first identification marks in the reference image with the number in the comparison feature image.
Since the different tracking targets include an executing body that performs the throwing action and a thrown object, the motion trajectory generation unit of the tracking target is further configured to:
compare the number of first identification marks in the reference image with the number in the comparison feature image, and identify the executing body and the thrown object from the change in that number: the tracking target whose first-mark count changes less is the area where the executing body performing the throwing action is located, while the tracking target whose count changes more is the thrown object; the motion trajectory of the corresponding target is then generated from the targets identified across the multi-frame sequential images.
It should be understood that the steps of the trajectory calculation method described above correspond to the units described in the trajectory calculation device; the operations and features described for the device and its units therefore apply equally to the method and are not repeated here.
The present embodiment also provides an apparatus, which is suitable for implementing the embodiments of the present application.
The apparatus includes a computer system 100. As shown in fig. 2, the computer system 100 includes a central processing unit (CPU) 101 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 102 or a program loaded from a storage section 108 into a random access memory (RAM) 103. The RAM 103 also stores various programs and data necessary for system operation. The CPU 101, ROM 102, and RAM 103 are connected to each other via a bus 104. An input/output (I/O) interface 105 is also connected to the bus 104.
The following components are connected to the I/O interface 105: an input section 106 including a keyboard, a mouse, and the like; an output section 107 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage section 108 including a hard disk and the like; and a communication section 109 including a network interface card such as a LAN card or a modem. The communication section 109 performs communication processing via a network such as the Internet. A drive 110 is also connected to the I/O interface 105 as needed. A removable medium 111, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 110 as necessary, so that a computer program read from it can be installed into the storage section 108 as needed.
In particular, according to an embodiment of the invention, the process described above with reference to the flowchart of fig. 1 may be implemented as a computer software program. For example, an embodiment of the invention includes a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The above-described functions defined in the system of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 101.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to one embodiment of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software or by hardware, and the described units may also be provided in a processor, which may then be described as: a processor comprising an image feature generation unit and a motion trajectory generation unit for the tracking target. The names of these units or modules do not in some cases limit the units themselves; for example, the image feature generation unit may also be described as a unit configured to generate the image features corresponding to different tracking targets in a multi-frame sequential image according to the respective motion features of those targets in the multi-frame sequential image.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the trajectory calculation method as described in the above embodiments.
For example, the electronic device may implement the method shown in fig. 1: generating the image features corresponding to different tracking targets in a multi-frame sequential image according to the respective motion features of the different tracking targets in the multi-frame sequential image; and identifying the corresponding tracking targets in the multi-frame sequential image according to the image features corresponding to the different tracking targets, and generating the motion trajectories of the corresponding tracking targets.
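The two steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes the per-frame optical-flow magnitudes have already been computed elsewhere (e.g. with OpenCV's `cv2.calcOpticalFlowFarneback`), and the function names are ours.

```python
import numpy as np

def deviating_mask(flow_mag, threshold):
    """Positions whose optical-flow magnitude deviates from (here: exceeds)
    the preset threshold -- the motion feature used to build image features."""
    return flow_mag > threshold

def trajectory(flow_mags, threshold):
    """One trajectory point per frame: the centroid of deviating positions.
    flow_mags: iterable of 2-D arrays of per-pixel flow magnitudes."""
    points = []
    for mag in flow_mags:
        ys, xs = np.nonzero(deviating_mask(mag, threshold))
        if xs.size:  # skip frames where nothing moved enough
            points.append((float(xs.mean()), float(ys.mean())))
    return points
```

Linking the per-frame centroids in order yields the motion trajectory of the tracked target.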
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware.
Example Two
Features that this embodiment shares with the first embodiment are not described again; the features that differ from the first embodiment are as follows.
S2 more specifically includes:
S21, acquiring the optical flow values of the multi-frame optical flow sequence images, and marking the image positions whose optical flow values deviate from a preset threshold with a first identification mark to generate a reference image. Here the first identification mark sets the gray value to 0; any other gray value in the range 0-255 may be used except 128 and values close to it, provided it is distinguishable from the second identification mark for convenience of later statistics;
S22, acquiring the region of interest corresponding to the multi-frame optical flow sequence images, and counting the number of tracking-target feature points in the region of interest whose optical flow values deviate from the preset threshold. If the number exceeds a preset number, the corresponding tracking-target feature points are given the first identification mark and the remaining tracking-target feature points the second identification mark, generating a comparison feature image. Here the second identification mark sets the gray value to 255; any other gray value in the range 0-255 may be used except 128 and values close to it, provided it is distinguishable from the first identification mark for convenience of later statistics.
And S23, identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number of first identification marks in the comparison feature image.
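Steps S21 to S23 can be sketched as below. This is an illustrative sketch only: it assumes the per-pixel optical-flow magnitudes are already available as NumPy arrays, and the function names and array conventions are ours; the mark values follow the gray values used in this example.

```python
import numpy as np

FIRST_MARK = 0     # first identification mark (any value far from 128 works)
SECOND_MARK = 255  # second identification mark, distinguishable from the first

def reference_image(flow_mag, threshold):
    """S21: positions whose optical-flow value deviates from the preset
    threshold receive the first identification mark."""
    ref = np.full(flow_mag.shape, SECOND_MARK, dtype=np.uint8)
    ref[flow_mag > threshold] = FIRST_MARK
    return ref

def comparison_image(flow_mag, roi, threshold, preset_count):
    """S22: inside the region of interest (y0, y1, x0, x1), if more than
    preset_count feature points deviate, mark those points with the first
    identification mark; everything else keeps the second mark."""
    y0, y1, x0, x1 = roi
    img = np.full(flow_mag.shape, SECOND_MARK, dtype=np.uint8)
    deviating = flow_mag[y0:y1, x0:x1] > threshold
    if deviating.sum() > preset_count:
        img[y0:y1, x0:x1][deviating] = FIRST_MARK
    return img

def first_mark_counts(ref, cmp_img):
    """S23: the two counts whose comparison identifies the tracking target."""
    return int((ref == FIRST_MARK).sum()), int((cmp_img == FIRST_MARK).sum())
```

Because the slice `img[y0:y1, x0:x1]` is a view, assigning through the boolean mask writes the marks directly into the comparison feature image.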
Example Three
Features that this embodiment shares with the first embodiment are not described again; the features that differ from the first embodiment are as follows.
S2 more specifically includes:
S21, acquiring the optical flow values of the multi-frame optical flow sequence images, and marking the image positions whose optical flow values deviate from a preset threshold with a first identification mark to generate a reference image. Here the first identification mark sets the gray value to 20; any other gray value in the range 0-255 may be used except 128 and values close to it, provided it is distinguishable from the second identification mark for convenience of later statistics;
S22, acquiring the region of interest corresponding to the multi-frame optical flow sequence images, and counting the number of tracking-target feature points in the region of interest whose optical flow values deviate from the preset threshold. If the number exceeds a preset number, the corresponding tracking-target feature points are given the first identification mark and the remaining tracking-target feature points the second identification mark, generating a comparison feature image. Here the second identification mark sets the gray value to 210; any other gray value in the range 0-255 may be used except 128 and values close to it, provided it is distinguishable from the first identification mark for convenience of later statistics.
And S23, identifying the corresponding tracking target by comparing the number of first identification marks in the reference image with the number of first identification marks in the comparison feature image.
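Only the gray values differ between Examples Two and Three; in both, the count comparison of S23 supports the discrimination described later in the claims, where the tracked region whose first-mark count changes little across frames is the execution body of the throwing motion and the region whose count changes a lot is the thrown object. A hypothetical sketch of that comparison (the function name and interface are ours):

```python
def classify_by_mark_change(counts_a, counts_b):
    """Given per-frame first-identification-mark counts for two tracked
    regions, label the region with the smaller count variation as the
    execution body and the other as the thrown object."""
    var_a = max(counts_a) - min(counts_a)
    var_b = max(counts_b) - min(counts_b)
    if var_a <= var_b:
        return ("execution body", "thrown object")
    return ("thrown object", "execution body")
```

The intuition is that the person performing the throw occupies a roughly stable area of moving pixels, while the thrown object's deviating-flow footprint grows and shrinks sharply as it leaves the hand and travels.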
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention referred to in the present application is not limited to embodiments with the specific combination of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, embodiments in which the above features are interchanged with technical features of similar function disclosed in this application (but not limited to these).

Claims (10)

1. A track calculation method is characterized by comprising the following steps:
generating respective corresponding image characteristics of different tracking targets in a multi-frame sequential image according to respective motion characteristics of the different tracking targets in the multi-frame sequential image;
and identifying corresponding tracking targets in the multi-frame sequential images according to the image characteristics corresponding to the different tracking targets respectively, and generating motion tracks of the corresponding tracking targets.
2. The trajectory calculation method according to claim 1,
the multi-frame sequential image is a multi-frame optical flow sequence image, the image features corresponding to the different tracking targets in the multi-frame sequential image are the optical flow features corresponding to the different tracking targets in the multi-frame optical flow sequence image, and the optical flow features are that the optical flow values of a preset number of feature points of the different tracking targets deviate from the corresponding preset thresholds.
3. The trajectory calculation method according to claim 2, wherein identifying a corresponding tracked object in the multi-frame sequential images according to the image features corresponding to the different tracked objects, and generating the motion trajectory of the corresponding tracked object comprises:
according to the optical flow features corresponding to the different tracking targets, segmenting the different tracking targets in the corresponding regions of interest in the multi-frame optical flow sequence images, identifying the corresponding tracking targets, and generating the motion trajectories of the corresponding tracking targets according to the corresponding tracking targets identified in the multi-frame sequential images.
4. The trajectory calculation method according to claim 3, wherein the step of segmenting the different tracking targets in the corresponding regions of interest in the images of the plurality of frames of the optical flow sequence according to the optical flow features corresponding to the different tracking targets, and identifying the corresponding tracking targets comprises:
acquiring optical flow values of the multi-frame optical flow sequence images, wherein image positions whose optical flow values deviate from a preset threshold are marked with a first identification mark to generate a reference image;
acquiring a region of interest corresponding to the multi-frame optical flow sequence images, counting the number of tracking-target feature points in the region of interest whose optical flow values deviate from the preset threshold, and, if the number exceeds a preset number, giving the corresponding tracking-target feature points the first identification mark and the remaining tracking-target feature points the second identification mark, so as to generate a comparison feature image;
and identifying a corresponding tracking target according to the quantity comparison of the first identification marks in the reference image and the first identification marks in the comparison characteristic image.
5. The trajectory calculation method according to any one of claims 1 to 4, wherein the different tracking targets include an execution body that executes a throwing motion and an object being thrown.
6. The trajectory calculation method of claim 5, wherein identifying the corresponding tracked object according to the comparison of the number of the first identification marks in the reference image and the number of the first identification marks in the comparison feature image, and generating the motion trajectory of the corresponding tracked object according to the identified corresponding tracked object in the multi-frame sequential image comprises:
comparing the number of the first identification marks in the reference image with the number of the first identification marks in the comparison feature image, and identifying the corresponding tracking targets, namely the execution body executing the throwing motion and the object being thrown, according to the change in the number of the first identification marks, wherein the tracking target with the smaller change in first identification marks is the region where the execution body executing the throwing motion is located and the tracking target with the larger change is the object being thrown; and generating the motion trajectory of the corresponding tracking target according to the corresponding tracking targets identified in the multi-frame sequential images.
7. A trajectory calculation device, comprising:
the image feature generation unit is configured to generate respective corresponding image features of different tracking targets in a multi-frame sequential image according to respective motion features of the different tracking targets in the multi-frame sequential image;
and the motion track generation unit of the tracking target is configured to identify a corresponding tracking target in the multi-frame sequential image according to the image features corresponding to the different tracking targets respectively, and generate a motion track of the corresponding tracking target.
8. The trajectory calculation device of claim 7,
the multi-frame sequential image is a multi-frame optical flow sequence image, the image features corresponding to the different tracking targets in the multi-frame sequential image are the optical flow features corresponding to the different tracking targets in the multi-frame optical flow sequence image, and the optical flow features are that the optical flow values of a preset number of feature points of the different tracking targets deviate from the corresponding preset thresholds.
9. The trajectory calculation device according to claim 7, wherein the motion trajectory generation unit of the tracking target is further configured to:
according to the optical flow features corresponding to the different tracking targets, segment the different tracking targets in the corresponding regions of interest in the multi-frame optical flow sequence images, identify the corresponding tracking targets, and generate the motion trajectories of the corresponding tracking targets according to the corresponding tracking targets identified in the multi-frame sequential images.
10. The trajectory calculation device according to claim 9, wherein the motion trajectory generation unit of the tracking target is further configured to:
acquire optical flow values of the multi-frame optical flow sequence images, wherein image positions whose optical flow values deviate from a preset threshold are marked with a first identification mark to generate a reference image;
acquire a region of interest corresponding to the multi-frame optical flow sequence images, count the number of tracking-target feature points in the region of interest whose optical flow values deviate from the preset threshold, and, if the number exceeds a preset number, give the corresponding tracking-target feature points the first identification mark and the remaining tracking-target feature points the second identification mark, so as to generate a comparison feature image;
and identifying a corresponding tracking target according to the quantity comparison of the first identification marks in the reference image and the first identification marks in the comparison characteristic image.
CN201811620137.1A 2018-12-28 2018-12-28 Track calculation device and method Active CN111382603B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811620137.1A CN111382603B (en) 2018-12-28 2018-12-28 Track calculation device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811620137.1A CN111382603B (en) 2018-12-28 2018-12-28 Track calculation device and method

Publications (2)

Publication Number Publication Date
CN111382603A true CN111382603A (en) 2020-07-07
CN111382603B CN111382603B (en) 2023-09-26

Family

ID=71217133

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811620137.1A Active CN111382603B (en) 2018-12-28 2018-12-28 Track calculation device and method

Country Status (1)

Country Link
CN (1) CN111382603B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0907145A2 (en) * 1997-10-03 1999-04-07 Nippon Telegraph and Telephone Corporation Method and equipment for extracting image features from image sequence
CN101854466A (en) * 2010-05-13 2010-10-06 北京英泰智软件技术发展有限公司 Moving area detection method and device
CN102222341A (en) * 2010-04-16 2011-10-19 东软集团股份有限公司 Method and device for detecting motion characteristic point and method and device for detecting motion target
CN104408743A (en) * 2014-11-05 2015-03-11 百度在线网络技术(北京)有限公司 Image segmentation method and device
CN105261042A (en) * 2015-10-19 2016-01-20 华为技术有限公司 Optical flow estimation method and apparatus
CN105654031A (en) * 2014-10-22 2016-06-08 通用汽车环球科技运作有限责任公司 Systems and methods for object detection
CN105827951A (en) * 2016-01-29 2016-08-03 维沃移动通信有限公司 Moving object photographing method and mobile terminal
CN107103615A (en) * 2017-04-05 2017-08-29 合肥酷睿网络科技有限公司 A kind of monitor video target lock-on tracing system and track lock method
CN107590453A (en) * 2017-09-04 2018-01-16 腾讯科技(深圳)有限公司 Processing method, device and the equipment of augmented reality scene, computer-readable storage medium
CN108280843A (en) * 2018-01-24 2018-07-13 新华智云科技有限公司 A kind of video object detecting and tracking method and apparatus

Also Published As

Publication number Publication date
CN111382603B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN108985259B (en) Human body action recognition method and device
US10810748B2 (en) Multiple targets—tracking method and apparatus, device and storage medium
CN110378264B (en) Target tracking method and device
CN109584276B (en) Key point detection method, device, equipment and readable medium
CN110443210B (en) Pedestrian tracking method and device and terminal
CN108960090A (en) Method of video image processing and device, computer-readable medium and electronic equipment
JP2018063236A (en) Method and apparatus for annotating point cloud data
CN104937638A (en) Systems and methods for tracking and detecting a target object
US10671887B2 (en) Best image crop selection
CN110163096B (en) Person identification method, person identification device, electronic equipment and computer readable medium
CN104732187A (en) Method and equipment for image tracking processing
CN110059623B (en) Method and apparatus for generating information
CN111783626B (en) Image recognition method, device, electronic equipment and storage medium
CN110335313B (en) Audio acquisition equipment positioning method and device and speaker identification method and system
CN111079638A (en) Target detection model training method, device and medium based on convolutional neural network
CN108288025A (en) A kind of car video monitoring method, device and equipment
CN110688873A (en) Multi-target tracking method and face recognition method
TWI729587B (en) Object localization system and method thereof
CN115063454A (en) Multi-target tracking matching method, device, terminal and storage medium
Stadler et al. Bytev2: Associating more detection boxes under occlusion for improved multi-person tracking
CN111382603B (en) Track calculation device and method
CN111898529B (en) Face detection method and device, electronic equipment and computer readable medium
KR20150137698A (en) Method and apparatus for movement trajectory tracking of moving object on animal farm
CN113657219A (en) Video object detection tracking method and device and computing equipment
KR102029860B1 (en) Method for tracking multi objects by real time and apparatus for executing the method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant