CN111665490B - Target tracking method and device, storage medium and electronic device - Google Patents


Info

Publication number
CN111665490B
CN111665490B (application CN202010491672.2A)
Authority
CN
China
Prior art keywords
target
target object
distance
robot
height
Prior art date
Legal status
Active
Application number
CN202010491672.2A
Other languages
Chinese (zh)
Other versions
CN111665490A (en)
Inventor
王林源
马子昂
卢维
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202010491672.2A
Publication of CN111665490A
Application granted
Publication of CN111665490B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning

Abstract

The invention discloses a target tracking method and device, a storage medium and an electronic device. Wherein the method comprises the following steps: obtaining a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object; determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing a current actual distance between the robot and the target object; determining a target tracking distance of a robot tracking target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height; and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object. By adopting the technical scheme, the problem that the following of any given target at any specified distance cannot be flexibly realized in the related art is solved.

Description

Target tracking method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of computers, and in particular, to a target tracking method and apparatus, a storage medium, and an electronic apparatus.
Background
At present, as robots are increasingly widely applied in industries such as warehouse logistics and factory security inspection, the demand for intelligence in robot platforms is increasingly prominent. In business scenarios where an assembled robot platform must track a moving target, the robot is required to have the capability of following the target.
Because the target to be followed may be given arbitrarily and its movement is random, and because the computing power of the platform is limited, the following algorithm deployed on the robot platform must have high execution efficiency; as a result, following an arbitrarily given target at an arbitrarily specified distance cannot be realized flexibly.
Therefore, the related art suffers from the problem that following an arbitrarily given target at an arbitrarily specified distance cannot be flexibly realized, and no effective technical solution has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides a target tracking method and device, a storage medium and an electronic device, which at least solve the technical problem that the following of any given target at any specified distance cannot be flexibly realized in the related art.
According to an aspect of an embodiment of the present invention, there is provided a target tracking method including: obtaining a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object; determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing a current actual distance between the robot and the target object; determining a target tracking distance of a robot tracking target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height; and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.
According to another aspect of the embodiment of the present invention, there is also provided an object tracking apparatus including: the first acquisition unit is used for acquiring the target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object; a first determining unit, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used to represent a current actual distance between the robot and the target object; the second determining unit is used for determining a target tracking distance of the robot for tracking the target object according to the current distance and determining a target included angle between the robot and the target object according to the target height; and the tracking unit is used for tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.
According to yet another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the above-described object tracking method when run.
According to still another aspect of the embodiments of the present invention, there is further provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the target tracking method described above through the computer program.
In the embodiment of the invention, firstly, a target object currently tracked by a robot is obtained, the target height representing the actual height of the target object is obtained, the current distance between the robot and the target object is determined according to the target height, then, the target tracking distance of the target object tracked by the robot is determined according to the current distance, and the target included angle between the robot and the target object is determined according to the target height, so that the target object is tracked by the robot according to the target tracking distance and the target included angle. The technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle is achieved, and the technical problem that the following of any given target at any specified distance cannot be flexibly achieved in the related technology is solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a schematic illustration of an application environment of a target tracking method according to an embodiment of the invention;
FIG. 2 is a flow chart of an alternative target tracking method according to an embodiment of the invention;
FIG. 3 is a schematic illustration of an alternative target height according to an embodiment of the invention;
FIG. 4 is a schematic illustration of an alternative target angle according to an embodiment of the invention;
FIG. 5 is a flow chart of another alternative target tracking method according to an embodiment of the invention;
FIG. 6 is a flow diagram of an alternative target tracking algorithm according to an embodiment of the invention;
FIG. 7 is a schematic diagram of an alternative target tracking apparatus according to an embodiment of the invention;
FIG. 8 is a schematic structural diagram of an alternative electronic device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to one aspect of an embodiment of the present invention, a target tracking method is provided. Alternatively, the above-described target tracking method may be applied, but is not limited to, in an application environment as shown in fig. 1. As shown in fig. 1, the terminal device 102 obtains a target height of a target object, where the target object is the tracked object currently tracked by the robot and the target height is used to represent the actual height of the target object, and sends the target height to the server 104 through a network. After receiving the target height, the server 104 determines a current distance between the robot and the target object according to the target height, where the current distance is used to represent the actual distance between the robot and the target object; the server 104 then determines a target tracking distance for the robot to track the target object according to the current distance, and determines a target included angle between the robot and the target object according to the target height. The server 104 sends the target tracking distance and the target included angle to the terminal device 102 through the network, and after receiving them the terminal device 102 tracks the target object according to the target tracking distance and the target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object. The above is merely an example, and embodiments of the present application are not limited herein.
Or, acquiring a target height of a target object at the terminal device 102, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object; determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing an actual distance between the robot and the target object; determining a target tracking distance of a robot tracking target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height; and tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object. The above is merely an example, and the present embodiment is not limited in any way.
Alternatively, in this embodiment, the terminal device may include, but is not limited to, at least one of: a mobile phone (e.g., android mobile phone, iOS mobile phone, etc.), a notebook computer, a tablet computer, a palm computer, MID (Mobile Internet Devices, mobile internet device), PAD, desktop computer, etc. The network may include, but is not limited to: a wired network, a wireless network, wherein the wired network comprises: local area networks, metropolitan area networks, and wide area networks, the wireless network comprising: bluetooth, WIFI, and other networks that enable wireless communications. The server may be a single server or a server cluster composed of a plurality of servers. The above is merely an example, and the present embodiment is not limited thereto.
Alternatively, in the present embodiment, as an optional implementation manner, the method may be performed by a server, may be performed by a terminal device, or may be performed by both the server and the terminal device, and in the present embodiment, the description is given by way of example by the terminal device (for example, the above-described terminal device 102). As shown in fig. 2, the flow of the target tracking method may include the steps of:
step S202, obtaining the target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object;
step S204, determining the current distance between the robot and the target object according to the target height, wherein the current distance is used for representing the actual distance between the robot and the target object;
step S206, determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;
step S208, tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.
Optionally, the target tracking method can be applied, but is not limited, to robots in industrial application scenarios such as warehouse logistics and factory security inspection.
Through the embodiment, firstly, a target object currently tracked by the robot is obtained, a target height representing the actual height of the target object is obtained, the current distance between the robot and the target object is determined according to the target height, then, the target tracking distance of the target object tracked by the robot is determined according to the current distance, and the target included angle between the robot and the target object is determined according to the target height, so that the target object is tracked by the robot according to the target tracking distance and the target included angle. The technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle is achieved, and the technical problem that the following of any given target at any specified distance cannot be flexibly achieved in the related technology is solved.
In an alternative embodiment, obtaining the target height of the target object includes: acquiring a pre-stored target height under the condition that the target height of the target object is known; or under the condition that the target height of the target object is unknown, acquiring a previous frame image and a current frame image containing the target object, and acquiring a first moving distance of the robot in a first time interval for generating the previous frame image and the current frame image; the target height of the target object is determined according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image and the focal length of the image acquisition device of the robot.
Alternatively, if the target height of the target object is known and stored in the robot system in advance, the target height may be acquired directly. Alternatively, if the target height of the target object is unknown to the robot, a previous frame image and a current frame image containing the target object can be acquired, together with the first moving distance of the robot in the first time interval between the moments at which the two frame images were generated; the target height of the target object is then determined according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of an image acquisition device (such as a monocular camera) mounted on the robot.
In an alternative embodiment, determining the target height of the target object based on the first movement distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of the image acquisition device of the robot includes: the product of the first movement distance, the first pixel height and the second pixel height is determined as a first value, the product of the focal length and the difference between the second pixel height and the first pixel height is determined as a second value, and the ratio between the first value and the second value is determined as a target height.
Alternatively, assume that the first moving distance of the robot between the two frames of images is l, the first pixel height of the target object in the previous frame image is v_0, the second pixel height of the target object in the current frame image is v_t, and the focal length f of the camera mounted on the robot is known. The starting distance x_0 between the target object and the robot platform camera, the distance x_t between the target and the robot platform during following, and the height h of the followed target are then calculated as follows.
As shown in fig. 3, the similar-triangle proportional relationship between the image and the real environment, combined with the robot movement distance between the two frames, gives:

h·f = v_0·x_0 = v_t·x_t,  x_0 − x_t = l

from which the target height is obtained:

h = l·v_0·v_t / ((v_t − v_0)·f)

where l·v_0·v_t is the first value and (v_t − v_0)·f is the second value.
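A minimal numerical sketch of this height estimate, assuming the focal length is expressed in pixels and the displacement l in metres (variable names and the error check are illustrative, not part of the patent text):

```python
def estimate_target_height(l: float, v0: float, vt: float, f: float) -> float:
    """Target height h from robot displacement l between two frames, the
    target pixel heights v0 (previous frame) and vt (current frame), and
    the camera focal length f in pixels."""
    if vt == v0:
        raise ValueError("no scale change between frames; h is unobservable")
    return (l * v0 * vt) / ((vt - v0) * f)  # first value / second value
```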
In an alternative embodiment, determining the current distance between the robot and the target object from the target height comprises: the ratio between the product of the target height and the focal length and the second pixel height is determined as the current distance.
Alternatively, in the case where the target height h is known in advance, the current distance between the robot and the target may be calculated as:

x_t = h·f / v_t
in an alternative embodiment, the method further comprises: a third value is determined as a product of the first movement distance and the first pixel height, and a ratio of the third value to a difference between the second pixel height and the first pixel height is determined as the current distance.
Optionally, the distance between the current frame target object and the robot platform (i.e., the current distance) is

x_t = l·v_0 / (v_t − v_0)

and the distance between the starting target position and the robot platform is

x_0 = l·v_t / (v_t − v_0)

where l·v_0 is the above-mentioned third value.
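A companion sketch of the two distance formulas, under the same assumptions as the height routine above:

```python
def estimate_distances(l: float, v0: float, vt: float) -> tuple[float, float]:
    """Current distance x_t and starting distance x_0 to the target from the
    pixel heights alone (no prior knowledge of the target height needed)."""
    dv = vt - v0
    if dv == 0:
        raise ValueError("no scale change between frames; distances unobservable")
    return l * v0 / dv, l * vt / dv  # (x_t, x_0)

def distance_from_height(h: float, f: float, vt: float) -> float:
    """Current distance when the target height h is known in advance."""
    return h * f / vt
```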
In an alternative embodiment, determining a target angle between the robot and the target object from the target height comprises: acquiring a third pixel height of the target object in the next frame image under the condition that the target tracking distance between the robot and the target object is changed and the target object moves transversely in a second time interval between the next frame image and the current frame image; determining a ratio of a product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and determining a ratio of a product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, wherein the first distance is a distance between the robot and the target object in a first shooting direction for acquiring a next frame image, and the second distance is a distance between the robot and the target object in a second shooting direction for acquiring a current frame image; and determining a target included angle according to the first distance, the second pixel height and a target proportionality coefficient, wherein the target proportionality coefficient is a proportionality coefficient between the pixel coordinates of the target object in the image and the real distance of the target object.
Alternatively, the robot platform should rotate through a corresponding angle as the target object moves laterally in the image plane. Assuming that the distance between the robot and the target object changes while the target object also moves laterally, the relation between the robot platform and the target follows the pinhole imaging principle, as shown in fig. 4.
The target height h can be calculated by the above steps. Using the pixel heights v_t and v_{t+1} of the target in the image (v_{t+1} corresponding to the above third pixel height), the distances l_t and l_{t+1} between the target object and the robot platform (l_t having the same meaning as x_t above) can be estimated from the relation:

l_t = h·f / v_t,  l_{t+1} = h·f / v_{t+1}

Mapping the target object of frame t+1 into the frame-t image, the imaging characteristics of the pinhole camera of fig. 4 should satisfy the proportional relation:

u′_{t+1} / u_{t+1} = l_{t+1} / l_t

where l_{t+1} is the first distance, l_t is the second distance, u_t and u_{t+1} are the lateral pixel coordinates of the target object, and k is the proportionality coefficient between pixel coordinates and real distance (corresponding to the above target proportionality coefficient), which satisfies:

k = v_t / h

The change of the included angle between the robot and the target object is thereby:

α = arctan( (u′_{t+1} − u_t) / (k·l_{t+1}) )

where α is the target included angle.
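A sketch of the included-angle estimate under the relations reconstructed above; the mapping step and the final arctangent are assumptions consistent with the surrounding text, so the routine is illustrative rather than authoritative:

```python
import math

def estimate_included_angle(h: float, f: float, vt: float, vt1: float,
                            ut: float, ut1: float) -> float:
    """Included angle between robot heading and target, from target height h,
    focal length f, pixel heights vt/vt1 and lateral pixel coordinates
    ut/ut1 of frames t and t+1."""
    lt = h * f / vt             # second distance (frame t)
    lt1 = h * f / vt1           # first distance (frame t+1)
    u_mapped = ut1 * lt1 / lt   # frame t+1 coordinate mapped into frame t
    k = vt / h                  # pixels per unit of real length at frame t
    return math.atan((u_mapped - ut) / (k * lt1))
```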
In an alternative embodiment, determining a target tracking distance for the robot to track the target object based on the current distance includes: when the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is smaller than the target tracking distance, the moving speed of the robot is reduced, so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value; or under the condition that the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is larger than the target tracking distance, the moving speed of the robot is increased, so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value.
In an alternative embodiment, tracking the target object according to the target tracking distance and the target included angle includes: in the process of tracking the target object by the robot, keeping the distance between the robot and the target object as a target tracking distance and keeping the target included angle smaller than or equal to a first preset angle.
Alternatively, the real pose relationship from the robot to the followed target object can be obtained through the above steps. During tracking, the robot can keep the included angle between the target object and the robot platform (corresponding to the target included angle) at zero and keep the distance between the robot and the target object at a fixed value (e.g., the target tracking distance). The target tracking distance may be a pre-specified tracking distance.
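The speed adjustment described above reduces to a simple threshold rule; a sketch follows, with the tolerance and step size as assumed tuning parameters:

```python
def adjust_speed(speed: float, current: float, target: float,
                 threshold: float, step: float = 0.05) -> float:
    """Threshold rule from the embodiment: slow down when closer than the
    target tracking distance, speed up when farther, until the difference
    falls within the preset threshold (the step size is an assumed gain)."""
    if abs(current - target) <= threshold:
        return speed  # within tolerance: hold the current speed
    return speed - step if current < target else speed + step
```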
In an alternative embodiment, the method further comprises: acquiring an (i−1)-th frame image of the target object, a first pixel height and a first pixel width of a sampling frame for sampling the target object, and an initial position coordinate of the target object in the (i−1)-th frame image; extracting image features of the (i−1)-th frame image to obtain a first feature vector of the (i−1)-th frame image; training classifier parameters according to the first feature vector, the first pixel height, the first pixel width and the initial position coordinate, wherein the initial position coordinate is the center pixel coordinate of the target object in the (i−1)-th frame image, the previous frame image comprises the (i−1)-th frame image, and i is a natural number; sampling the target object at the initial position coordinate by using a sampling frame corresponding to the first pixel height and the first pixel width to obtain an i-th frame image, performing log-polar conversion on the (i−1)-th frame image and the i-th frame image, and determining a scale difference and a rotation angle of the target object between the (i−1)-th frame image and the i-th frame image, wherein the scale difference is used for representing the size change of the target object between the (i−1)-th frame image and the i-th frame image, the rotation angle is used for representing the angle change of the target object between the (i−1)-th frame image and the i-th frame image relative to the same reference direction, and the current frame image comprises the i-th frame image; acquiring the current position coordinate of the target object in the i-th frame image; extracting image features of the i-th frame image to obtain a second feature vector of the i-th frame image, and determining a second pixel height and a second pixel width of the target object in the i-th frame image according to the first pixel height, the first pixel width, the scale difference and the rotation angle, wherein the current position coordinate is the center pixel coordinate of the target object in the i-th frame image; and training and updating the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height and the second pixel width, so that the image acquisition device samples an (i+1)-th frame image according to the current position coordinate, the second pixel height and the second pixel width, wherein the (i+1)-th frame image comprises the next frame image.
Optionally, when the target object moves, the robot platform may collect image information by using a camera, perform pose estimation of the target object on the image by using a target tracking algorithm, including pixel position estimation and target inclination angle (corresponding to the rotation angle) estimation, and obtain the position information of the target object on the image, and then obtain the current distance and the target included angle between the target and the robot platform by using the above steps.
It should be noted that tracking of the target object may be achieved by training a classifier in advance; the classifier is trained on the obtained i-th frame image and its parameters are updated, and when the (i+1)-th frame image is obtained it is used to continue training the classifier, so that tracking of the target object is achieved by continuously updating the classifier online.
By means of the continuous updating of the classifier, the position of the target object and the rotation angle of the target object can be updated in real time, and the tracking accuracy of the target object is improved.
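The continual-update scheme of the preceding paragraphs can be summarized as a frame loop; `train` and `detect` stand in for the classifier routines detailed later, and their signatures are assumptions:

```python
def track_sequence(frames, init_box, train, detect):
    """Online tracking loop: train on the previous frame, locate the target in
    the next one, then retrain at the new position and size."""
    box = init_box
    model = train(frames[0], box)
    for frame in frames[1:]:
        box, angle = detect(model, frame, box)  # position, scale and rotation
        model = train(frame, box)               # continual online update
        yield box, angle
```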
In an alternative embodiment, after determining the scale difference and the rotation angle of the scale change of the target object in the i-1 th frame image and the i-th frame image, the method further comprises: and in the process of tracking the target object by the robot, stopping tracking the target object by the robot under the condition that the rotation angle is larger than or equal to a second preset angle.
Alternatively, a safety-state judgment for the followed target object is made based on the obtained target inclination angle (corresponding to the above rotation angle). A safe angle range is designated, and when the inclination angle of the tracked target in the image plane exceeds this limit, the robot platform stops following. In addition, the robot platform is also stopped when tracking of the target object fails, when the target moves completely out of the image, or when the robot platform encounters an obstacle.
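These stop conditions amount to a single boolean test, sketched below; the condition set mirrors the paragraph above, and the flag names are illustrative:

```python
def should_stop(tilt: float, safe_limit: float, tracking_ok: bool,
                target_in_view: bool, obstacle: bool) -> bool:
    """True when the platform must stop: excessive in-plane tilt, tracking
    failure, target out of the image, or an obstacle ahead."""
    return (abs(tilt) >= safe_limit or not tracking_ok
            or not target_in_view or obstacle)
```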
The flow of the target tracking method is described below in conjunction with an alternative example; as shown in fig. 5, the method may include the following steps:
step S501, calibrating a camera installed on the robot, determining the internal reference focal length f and the optical center pixel position of the camera, and ensuring that the optical center of the camera is adjusted to the image center in the subsequent image processing process. The robot platform odometer or accelerometer is calibrated and used to estimate the displacement distance of the robot platform during the initialization phase.
Step S502, determining a following target (corresponding to the target object) by a detection algorithm or an artificially specified method, and initializing the following target, including determining a tracking target frame and calculating a target height. And designating the expected distance between the robot platform and the following target, and determining the corresponding relation between the target and the target pixels in the image by combining the following target height.
Alternatively, assume that the first moving distance of the robot between the two frame images (the previous frame image and the current frame image) is l, the pixel height of the following target in the previous frame image (corresponding to the first pixel height) is v_0, and the pixel height of the following target in the current frame (corresponding to the second pixel height) is v_t. With the camera focal length f known, the starting distance x_0 between the target and the robot platform camera, the distance x_t between the target and the robot platform during following, and the following target height h are calculated as follows.
As shown in fig. 3, the similar-triangle proportional relationship between the image and the real environment, combined with the robot movement distance between the two frames, gives:

h·f = v_0·x_0 = v_t·x_t,  x_0 − x_t = l

from which the target height is

h = l·v_0·v_t / ((v_t − v_0)·f)

the distance between the current target object and the robot platform is

x_t = l·v_0 / (v_t − v_0)

and the distance between the starting target position and the robot platform is

x_0 = l·v_t / (v_t − v_0)

When the target height h is known in advance, the current distance between the robot and the target may instead be calculated directly as:

x_t = h·f / v_t
after the corresponding transformation relation between the camera pixels and the actual environment is determined, the robot platform can be presumed to follow the rotated angle when the target object moves transversely in the image plane. It is generally assumed that the image acquisition frequency is high, the distance between the robot and the object is unchanged, and only the lateral movement is performed. If the target moves transversely with the distance change, the relation between the robot platform and the target is shown in figure 4 according to the pinhole imaging principle of the pinhole camera.
The target height h can be calculated by the above steps, using the pixel height v of the target height in the image t And v t-1 (corresponding to the third pixel height described above), the target object and robot platform distance l can be estimated t And l t-1 Relationship between:
Figure BDA0002521298730000121
mapping the target object of t+1 frames into t frame images, the imaging characteristics of the pinhole camera of fig. 4 should satisfy the proportional relation:
Figure BDA0002521298730000122
wherein l t+1 For the first distance l t Is the second distance.
Thereby, the angle between the robot and the target object is changed as follows:
Figure BDA0002521298730000123
wherein alpha is the upperAnd the target included angle.
The real pose relation from the robot to the following target object can be obtained through the steps. During the tracking process of the robot, the included angle between the target object and the robot platform (corresponding to the target included angle) can be kept to be zero, and the distance between the robot and the target object is kept to be a fixed distance (such as the target tracking distance). The target tracking distance may be a pre-specified tracking distance.
In step S503, when the target object moves, the robot platform may acquire image information by using a camera, perform pose estimation of the target object on the image by using a target tracking algorithm (e.g., a tracking algorithm KSCFrot), including pixel position estimation and target inclination angle (corresponding to the rotation angle) estimation, and obtain the position information of the target object on the image, and then obtain the current distance and the target included angle between the target and the robot platform by using the above steps.
Step S504, performing a safety-state judgment of the followed target object according to the target inclination angle (corresponding to the above rotation angle) obtained in step S503. A safe angle range is designated, and when the inclination angle of the tracked target in the image plane exceeds this limit, the robot platform stops following. In addition, the robot platform is also stopped when tracking of the target object fails, when the target moves completely out of the image, or when the robot platform encounters an obstacle. If no stop condition is met, the process advances to step S505.
Step S505, judging whether the robot platform reaches a designated position, and if not, adjusting the robot platform to enable the distance and the included angle between the robot and the target object to be consistent with expected conditions. After reaching the new position, step S503 is executed again.
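One iteration of steps S503 to S505 can be sketched as a pure decision function; the command vocabulary and tolerances are assumptions:

```python
def following_step(distance: float, angle: float, tilt: float,
                   desired_distance: float, safe_angle: float,
                   dist_tol: float = 0.05, ang_tol: float = 0.02):
    """Decide the next robot command from the estimated current distance,
    included angle and in-plane tilt of the target."""
    if abs(tilt) >= safe_angle:
        return ("stop",)                                       # step S504
    if abs(distance - desired_distance) > dist_tol or abs(angle) > ang_tol:
        return ("adjust", distance - desired_distance, angle)  # step S505
    return ("hold",)
```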
Alternatively, the principle and flow of the target tracking algorithm (e.g., the tracking algorithm KSCFrot) in the above step S503 are as follows: firstly, estimating the pixel coordinate position of a tracking target by using a tracking classifier, then estimating the scale and the angle, and finally updating the tracking classifier with new positions and sizes, as shown in fig. 6, wherein the specific process is as follows:
step 1, KSCFrot position estimation process:
given a picture containing a tracked target object, the target object's width, height and initial position coordinates. Taking the center of a target object as an origin, sampling with the size which is 2.5 times of the width and the height of the target object to obtain a base sample, and circularly shifting an image block of the base sample to form a group of initial training samples xi.
A two-class classifier separating the tracked target object from the background is trained using these samples, which first need to be processed: feature extraction is generally performed on the sample image blocks, mapping pixel information into a nonlinear space, which facilitates classification of the target object by a support vector machine. The mapping of an image into this nonlinear space is denoted φ(·). The mapping may extract HOG features, statistical color features, features extracted by a neural network, or the like. It will be appreciated that the above is only an example, and the present embodiment is not limited in any way herein.
The training sample labeling process of the support vector machine classifier first establishes a two-dimensional Gaussian response p_i with amplitude 1 according to the size of the sampled image block, and then divides the labels into three classes via two labeling thresholds: positions where the Gaussian response is larger than the upper threshold θ_u are labeled 1, positions where it is smaller than the lower threshold θ_l are labeled −1, and all remaining positions are labeled 0. The classifier is trained so that the error between the output result y and these labels is minimized (equation (1)).
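A sketch of this three-class labeling, assuming a centered Gaussian; the threshold values and bandwidth are illustrative assumptions:

```python
import numpy as np

def make_labels(h: int, w: int, sigma: float,
                theta_u: float = 0.7, theta_l: float = 0.3) -> np.ndarray:
    """Three-class labels from a peak-1 two-dimensional Gaussian response."""
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-(((ys - h / 2) ** 2 + (xs - w / 2) ** 2) / (2 * sigma ** 2)))
    labels = np.zeros((h, w), dtype=np.int8)
    labels[g > theta_u] = 1    # confident target region
    labels[g < theta_l] = -1   # confident background region
    return labels              # remaining entries stay 0 (ignored)
```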
The classifier w is expressed through the dual parameters α and the feature vectors:

w = Σ_i α_i·φ(x_i)

Combining the feature vectors in the classifier with the image feature vectors yields a kernel matrix K, with entries K_ij = φ(x_i)ᵀφ(x_j). The kernel matrix is circulant, which simplifies the computation, and through the kernel matrix the solution of the classifier can be transferred to the dual space, where the dual parameters are solved for. Meanwhile, the convolution operation in the time domain can be transferred to the frequency domain as a point-wise multiplication; with the fast Fourier transform F(·) the processing is faster, and the response of the classifier to a candidate sample z can be obtained as:

f(z) = F⁻¹( k̂_xz ⊙ α̂ )   (2)

where k_xz denotes the kernel correlation between the training sample x and the candidate sample z, the hat denotes the Fourier transform, and ⊙ denotes element-wise multiplication.
The optimization objective function of the classifier is:

min_{w,b,e} (1/2)‖w‖² + C·Σ_i e_i
s.t. y_i·(wᵀφ(x_i) + b) ≥ 1 − e_i,  e ≥ 0   (3)

where C is an adjustment coefficient.
The classifier parameters {α, b} satisfy the dual-form solution of this objective, obtained by solving the dual problem in the frequency domain (equations (4) and (5)).
Correlation calculation of each sample x_i generated by the circulant matrix with the classifier {w, b} (equivalently {α, b}) yields the corresponding response result p_i. The result is then converted back to the time domain through the inverse Fourier transform, so that it can be displayed more intuitively. The response results are arranged in the cyclic order in which the samples were generated, and the maximum response position is found and compared with the position of the maximum of the standard Gaussian response, so as to determine the displacement of the tracked target; the width and height of the object in the image remain unchanged at this step:

p = F⁻¹( k̂_xz ⊙ α̂ )

max(p_i) → (u, v)   (6)
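A sketch of this frequency-domain evaluation and peak search (NumPy; `k_hat` and `alpha_hat` are assumed to already hold the FFTs of the kernel correlation and of the dual parameters):

```python
import numpy as np

def detect_shift(k_hat: np.ndarray, alpha_hat: np.ndarray):
    """Correlation response in the frequency domain, inverse-transformed back
    to the time domain, then searched for its peak position (u, v)."""
    response = np.real(np.fft.ifft2(k_hat * alpha_hat))
    v, u = np.unravel_index(np.argmax(response), response.shape)
    return u, v, response
```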
Step 2, a scale and angle estimation process:
after the preliminary position estimate is obtained, the target scale and angle change are estimated. The process is to sample the same width and height of one frame or initial frame base sample above the initial target position. And circularly generating a group of scale and angle training samples by the base samples, and converting Cartesian coordinates of the samples into polar-logarithmic coordinates. By pole-to-pole (·) conversion, the change in angle and scale can be made to be linear. Assuming that the template is I0, the current frame target image is I1. Two image blocks x center pixel (u 0 ,v 0 ) The Cartesian coordinates u/v are exchanged for polar coordinates s/θ for the origin.
Figure BDA0002521298730000151
Wherein, the liquid crystal display device comprises a liquid crystal display device,
Figure BDA0002521298730000152
r is the reference direction in cartesian coordinates.
Figure BDA0002521298730000153
Figure BDA0002521298730000154
Features Φ are extracted from the training samples, and phase-correlation estimation is performed in the nonlinear feature space between the feature vectors of the training samples and the tracking template h_ρ. The obtained results are arranged according to the cyclic order of the generated samples, and the coordinate information of the maximum response result is taken. Because the coordinates of the log-polar coordinate system represent length and angle, the coordinates of this maximum response can be used to determine the scale and angle differences between the tracked target and the template. The image of the tracked target object obtained by the step 1 process is then scaled at the preliminary Cartesian position coordinates, and the obtained angle differences are accumulated to determine the angle change between the tracked target and its initial state. Finally, the Cartesian coordinate position and the inclination angle estimate of the tracked target are obtained:

max(h_ρ ⋆ Φ) → (s′, θ′)   (10)
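A sketch of the log-polar resampling that makes scale and rotation appear as translations; it uses nearest-neighbour sampling, assumes a single-channel patch, and the bin counts are arbitrary choices:

```python
import numpy as np

def to_log_polar(patch: np.ndarray, n_s: int = 64, n_t: int = 64) -> np.ndarray:
    """Resample a 2-D image patch from Cartesian to log-polar coordinates,
    indexed by (log-radius, angle) about the patch center."""
    h, w = patch.shape
    cy, cx = h / 2.0, w / 2.0
    max_r = np.hypot(cx, cy)
    s = np.linspace(0.0, np.log(max_r), n_s)            # log-radius axis
    t = np.linspace(-np.pi, np.pi, n_t, endpoint=False)  # angle axis
    rr = np.exp(s)[:, None]
    u = np.clip((cx + rr * np.cos(t)[None, :]).astype(int), 0, w - 1)
    v = np.clip((cy + rr * np.sin(t)[None, :]).astype(int), 0, h - 1)
    return patch[v, u]
```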
Step 3, updating the template:
Training sample collection is performed again, with the method of step 1, at the new position and with the new size, and the classifier parameters are recalculated. The next image after the current frame is then sampled at this position and size, and a new round of target tracking calculation is performed.
Through this embodiment, target pose estimation, including target image position and in-plane rotation angle estimation, is first carried out with a camera (such as a monocular industrial camera) using the target tracking algorithm; secondly, the following distance and following direction of the robot platform are calculated from the camera intrinsic parameters and the target size, via the conversion relation from the planar image to the real environment; finally, the pose of the robot is adjusted so that the followed target stays at a fixed distance in front of the camera, while the estimated rotation angle safeguards the operation of the robot platform. Based on the support vector machine tracking algorithm KSCF, a new tracking algorithm KSCFrot is designed by introducing log-polar coordinates to estimate the scale and angle of the tracked target, so that rotation of the tracked target can be estimated. The method achieves following of any given target at a specified distance by means of a monocular vision sensor alone, can safely judge abnormal states in which the followed target tilts or rolls over, improves the flexibility with which the robot tracks the target object, and improves the safety of the robot through the estimation of the rotation angle.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
According to still another aspect of the embodiment of the present invention, there is also provided an object tracking apparatus, as shown in fig. 7, including:
a first obtaining unit 702, configured to obtain a target height of a target object, where the target object is a tracking object currently tracked by the robot, and the target height is used to represent an actual height of the target object;
a first determining unit 704, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used to represent a current actual distance between the robot and the target object;
a second determining unit 706, configured to determine a target tracking distance for the robot to track the target object according to the current distance, and determine a target included angle between the robot and the target object according to the target height;
and a tracking unit 708, configured to track the target object according to a target tracking distance and a target included angle, where the target tracking distance is a preset tracking distance between the robot and the target object.
Alternatively, the first acquisition unit 702 may be used to perform step S202, the first determination unit 704 may be used to perform step S204, the second determination unit 706 may be used to perform step S206, and the tracking unit 708 may be used to perform step S208.
Through the embodiment, firstly, a target object currently tracked by the robot is obtained, a target height representing the actual height of the target object is obtained, the current distance between the robot and the target object is determined according to the target height, then, the target tracking distance of the target object tracked by the robot is determined according to the current distance, and the target included angle between the robot and the target object is determined according to the target height, so that the target object is tracked by the robot according to the target tracking distance and the target included angle. The technical effect that the robot can flexibly track the target object according to the target tracking distance and the target included angle is achieved, and the technical problem that the following of any given target at any specified distance cannot be flexibly achieved in the related technology is solved.
As an optional solution, the first obtaining unit includes: the first acquisition module is used for acquiring a pre-stored target height under the condition that the target height of the target object is known; or a second acquisition module, configured to acquire a previous frame image and a current frame image including the target object, and acquire a first moving distance of the robot in a first time interval for generating the previous frame image and the current frame image, in a case that the target height of the target object is unknown; the first determining module is used for determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image and the focal length of the image acquisition device of the robot.
As an optional technical solution, the first determining module is further configured to determine a product of the first moving distance, the first pixel height and the second pixel height as a first value, determine a product of the focal length and a difference between the second pixel height and the first pixel height as a second value, and determine a ratio between the first value and the second value as the target height.
As an optional solution, the first determining unit is further configured to determine a ratio between a product of the target height and the focal length and the second pixel height as the current distance.
As an optional technical solution, the apparatus further includes: and a third determining unit for determining a third value as a product between the first moving distance and the first pixel height, and determining a ratio of the third value to a difference between the second pixel height and the first pixel height as the current distance.
As an optional technical solution, the second determining unit includes: a third obtaining module, configured to obtain a third pixel height of the target object in the next frame image when the target tracking distance between the robot and the target object changes and the target object moves laterally in a second time interval between the next frame image and the current frame image; a second determining module, configured to determine the ratio of the product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and to determine the ratio of the product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, wherein the first distance is the distance between the robot and the target object in a first shooting direction for acquiring the next frame image, and the second distance is the distance between the robot and the target object in a second shooting direction for acquiring the current frame image; and a third determining module, configured to determine the target included angle according to the first distance, the second pixel height and the target proportionality coefficient, wherein the target proportionality coefficient is the proportionality coefficient between the pixel coordinates of the target object in the image and the real distance of the target object.
As an optional solution, the second determining unit includes: the first processing module is used for reducing the moving speed of the robot under the condition that the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is smaller than the target tracking distance so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value; or the second processing module is used for improving the moving speed of the robot under the condition that the difference value between the current distance and the target tracking distance is larger than a preset threshold value and the current distance is larger than the target tracking distance, so that the difference value between the current distance and the target tracking distance is smaller than or equal to the preset threshold value.
As an optional technical solution, the tracking unit is further configured to keep a distance between the robot and the target object as a target tracking distance and keep a target included angle smaller than or equal to a first preset angle in a process of tracking the target object by the robot.
As an optional technical solution, the apparatus further includes: a first processing unit, configured to acquire an (i−1)-th frame image of the target object, a first pixel height and a first pixel width of a sampling frame for sampling the target object, and an initial position coordinate of the target object in the (i−1)-th frame image; to extract image features of the (i−1)-th frame image to obtain a first feature vector of the (i−1)-th frame image; and to train classifier parameters according to the first feature vector, the first pixel height, the first pixel width and the initial position coordinate, wherein the initial position coordinate is the center pixel coordinate of the target object in the (i−1)-th frame image, the previous frame image comprises the (i−1)-th frame image, and i is a natural number; a second processing unit, configured to sample the target object at the initial position coordinate by using a sampling frame corresponding to the first pixel height and the first pixel width to obtain an i-th frame image, to perform log-polar conversion on the (i−1)-th frame image and the i-th frame image, and to determine a scale difference and a rotation angle of the target object between the (i−1)-th frame image and the i-th frame image, wherein the scale difference is used for representing the size change of the target object between the (i−1)-th frame image and the i-th frame image, the rotation angle is used for representing the angle change of the target object between the (i−1)-th frame image and the i-th frame image relative to the same reference direction, and the current frame image comprises the i-th frame image; a third processing unit, configured to acquire the current position coordinate of the target object in the i-th frame image, to extract image features of the i-th frame image to obtain a second feature vector of the i-th frame image, and to determine a second pixel height and a second pixel width of the target object in the i-th frame image according to the first pixel height, the first pixel width, the scale difference and the rotation angle, wherein the current position coordinate is the center pixel coordinate of the target object in the i-th frame image; and a fourth processing unit, configured to train and update the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height and the second pixel width, so that the image acquisition device samples an (i+1)-th frame image according to the current position coordinate, the second pixel height and the second pixel width, wherein the (i+1)-th frame image comprises the next frame image.
As an optional technical solution, the apparatus further includes: and a fifth processing unit, configured to stop tracking of the target object by the robot when the rotation angle is greater than or equal to the second preset angle in the process of tracking the target object by the robot after determining the scale difference and the rotation angle of the scale change of the target object in the i-1 th frame image and the i-th frame image.
According to a further aspect of embodiments of the present invention there is also provided a storage medium having stored therein a computer program, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
S1, acquiring a target height of a target object, wherein the target object is a tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object;
S2, determining the current distance between the robot and the target object according to the target height, wherein the current distance is used for representing the current actual distance between the robot and the target object;
S3, determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;
S4, tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.
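For illustration, steps S1 and S2 reduce to pinhole-model arithmetic once the focal length is expressed in pixels. A minimal sketch under that assumption follows; the function and variable names are hypothetical:

```python
def estimate_target_height(move_dist, h1_px, h2_px, focal_px):
    """Unknown-height case of S1: the robot moves move_dist toward the
    object between two frames whose pixel heights are h1_px and h2_px;
    the pinhole model gives H = move_dist * h1 * h2 / (f * (h2 - h1))."""
    return move_dist * h1_px * h2_px / (focal_px * (h2_px - h1_px))

def estimate_current_distance(target_height, focal_px, h_px):
    """S2: current distance d = H * f / h from the pinhole model."""
    return target_height * focal_px / h_px

# Example: advancing 0.5 m grows the pixel height from 100 px to 125 px
# with an 800 px focal length, so H = 0.3125 m and d = 2.0 m.
H = estimate_target_height(0.5, 100, 125, 800)
d = estimate_current_distance(H, 800, 125)
```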
Optionally, in this embodiment, it will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be completed by a program instructing hardware associated with a terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
According to a further aspect of the embodiments of the present invention, there is also provided an electronic device for implementing the above target tracking method. As shown in fig. 8, the electronic device includes a memory 802 and a processor 804, the memory 802 having a computer program stored therein, and the processor 804 being arranged to perform the steps of any of the method embodiments described above by means of the computer program.
Optionally, in this embodiment, the electronic device may be located in at least one of a plurality of network devices of a computer network.
Optionally, in this embodiment, the above processor may be configured to execute the following steps by means of the computer program:
S1, acquiring a target height of a target object, wherein the target object is a tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object;
S2, determining the current distance between the robot and the target object according to the target height, wherein the current distance is used for representing the current actual distance between the robot and the target object;
S3, determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;
S4, tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object.
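One plausible reading of the distance-keeping logic in steps S3 and S4 (elaborated in claim 6 below) is a simple speed regulator around the preset tracking distance. A minimal sketch, in which the tolerance threshold and speed step are assumed values:

```python
def regulate_speed(current_dist, track_dist, speed, threshold=0.2, step=0.05):
    """Adjust the robot's moving speed so that the gap between the current
    distance and the preset tracking distance shrinks below the threshold:
    decelerate when too close, accelerate when too far."""
    if abs(current_dist - track_dist) <= threshold:
        return speed                    # within tolerance: hold speed
    if current_dist < track_dist:
        return max(0.0, speed - step)   # too close: slow down
    return speed + step                 # too far: speed up
```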
Optionally, it will be understood by those skilled in the art that the structure shown in fig. 8 is only schematic. The electronic device may also be a terminal device such as a smart phone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, or the like. Fig. 8 does not limit the structure of the electronic device. For example, the electronic device may include more or fewer components (e.g., network interfaces) than shown in fig. 8, or have a configuration different from that shown in fig. 8.
The memory 802 may be used to store software programs and modules, such as the program instructions/modules corresponding to the target tracking method and apparatus in the embodiments of the present invention; the processor 804 executes the software programs and modules stored in the memory 802, thereby performing various functional applications and data processing, that is, implementing the target tracking method described above. The memory 802 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, a flash memory, or other non-volatile solid-state memory. In some examples, the memory 802 may further include memory remotely located relative to the processor 804, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 802 may be used to store, but is not limited to storing, information such as the target height of the target object. As an example, as shown in fig. 8, the memory 802 may include, but is not limited to, the first acquiring unit 702, the first determining unit 704, the second determining unit 706, and the tracking unit 708 of the target tracking apparatus. Other module units of the target tracking apparatus may also be included, and are not described again in this example.
Optionally, the transmission device 806 is used to receive or transmit data via a network. Specific examples of the network may include wired networks and wireless networks. In one example, the transmission device 806 includes a Network Interface Controller (NIC) that can be connected to other network devices and routers via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 806 is a Radio Frequency (RF) module configured to communicate with the internet wirelessly.
In addition, the electronic device further includes: a display 808 and a connection bus 810 for connecting the various modular components of the electronic device described above.
In other embodiments, the terminal or the server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by connecting a plurality of nodes through network communication. The nodes may form a peer-to-peer (P2P) network, and any type of computing device, such as a server or a terminal, may become a node in the blockchain system by joining the peer-to-peer network.
Optionally, in this embodiment, those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be completed by a program instructing hardware associated with a terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
If the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to perform all or part of the steps of the methods of the various embodiments of the present invention.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for a part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of units is merely a logical functional division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that those skilled in the art may make various improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the scope of protection of the present invention.

Claims (12)

1. A target tracking method, applied to a robot, comprising:
obtaining a target height of a target object, wherein the target object is a tracking object currently tracked by the robot, and the target height is used for representing the actual height of the target object;
determining a current distance between the robot and the target object according to the target height, wherein the current distance is used for representing a current actual distance between the robot and the target object;
determining a target tracking distance of the robot for tracking the target object according to the current distance, and determining a target included angle between the robot and the target object according to the target height;
tracking the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object;
wherein determining the target included angle between the robot and the target object according to the target height includes: acquiring a third pixel height of the target object in the next frame image under the condition that the target tracking distance between the robot and the target object changes and the target object moves transversely in a second time interval between the next frame image and the current frame image;
determining a ratio of a product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and determining a ratio of the product of the target height and the focal length to a second pixel height of the target object in the current frame image as a second distance between the robot and the target object, wherein the first distance is a distance between the robot and the target object in a first shooting direction in which the next frame image is acquired, and the second distance is a distance between the robot and the target object in a second shooting direction in which the current frame image is acquired;
and determining the target included angle according to the first distance, the second pixel height and a target proportionality coefficient, wherein the target proportionality coefficient is a proportionality coefficient between the pixel coordinates of the target object in the image and the real distance of the target object.
2. The method of claim 1, wherein obtaining the target height of the target object comprises:
acquiring a pre-stored target height in the case that the target height of the target object is known; or
acquiring a previous frame image and a current frame image containing the target object, and acquiring a first moving distance of the robot in a first time interval for generating the previous frame image and the current frame image, in the case that the target height of the target object is unknown;
and determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of the image acquisition device of the robot.
3. The method of claim 2, wherein determining the target height of the target object according to the first moving distance, the first pixel height of the target object in the previous frame image, the second pixel height of the target object in the current frame image, and the focal length of the image acquisition device of the robot comprises:
determining a product of the first moving distance, the first pixel height and the second pixel height as a first value, determining a product of the focal length and a difference between the second pixel height and the first pixel height as a second value, and determining a ratio between the first value and the second value as the target height.
4. The method of claim 2, wherein determining the current distance between the robot and the target object according to the target height comprises:
determining a ratio between a product of the target height and the focal length and the second pixel height as the current distance.
5. The method of claim 2, wherein the method further comprises:
determining a third value as a product of the first moving distance and the first pixel height, and determining a ratio of the third value to a difference between the second pixel height and the first pixel height as the current distance.
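The closed-form expressions of claims 3 to 5 are mutually consistent under the pinhole model. Writing $H$ for the target height, $f$ for the focal length, $h_1$ and $h_2$ for the first and second pixel heights, and $\Delta d$ for the first moving distance:

$$d_1 = \frac{Hf}{h_1}, \qquad d_2 = \frac{Hf}{h_2}, \qquad \Delta d = d_1 - d_2 = Hf\,\frac{h_2 - h_1}{h_1 h_2},$$

so that $H = \Delta d\, h_1 h_2 / \big(f\,(h_2 - h_1)\big)$ (claim 3) and, substituting back, $d_2 = \Delta d\, h_1 / (h_2 - h_1)$ (claim 5).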
6. The method of claim 1, wherein the determining a target tracking distance for the robot to track the target object based on the current distance comprises:
when the difference between the current distance and the target tracking distance is greater than a preset threshold and the current distance is less than the target tracking distance, reducing the moving speed of the robot so that the difference between the current distance and the target tracking distance is less than or equal to the preset threshold; or
when the difference between the current distance and the target tracking distance is greater than the preset threshold and the current distance is greater than the target tracking distance, increasing the moving speed of the robot so that the difference between the current distance and the target tracking distance is less than or equal to the preset threshold.
7. The method of claim 1, wherein tracking the target object according to the target tracking distance and the target included angle comprises:
in the process of tracking the target object by the robot, keeping the distance between the robot and the target object at the target tracking distance and keeping the target included angle smaller than or equal to a first preset angle.
8. The method according to any one of claims 2 to 5, further comprising:
acquiring an (i-1)-th frame image of the target object, the first pixel height and the first pixel width of a sampling frame for sampling the target object, and an initial position coordinate of the target object in the (i-1)-th frame image; extracting image features of the (i-1)-th frame image to obtain a first feature vector of the (i-1)-th frame image; and training classifier parameters according to the first feature vector, the first pixel height, the first pixel width and the initial position coordinate, wherein the initial position coordinate is the center pixel coordinate of the target object in the (i-1)-th frame image, the previous frame image comprises the (i-1)-th frame image, and i is a natural number;
sampling the target object at the initial position coordinate by using the sampling frame corresponding to the first pixel height and the first pixel width to obtain an i-th frame image, performing a log-polar (pole-to-logarithmic) conversion on the (i-1)-th frame image and the i-th frame image, and determining a scale difference and a rotation angle of the target object between the (i-1)-th frame image and the i-th frame image, wherein the scale difference is used for representing the size change of the target object between the (i-1)-th frame image and the i-th frame image, the rotation angle is used for representing the angle change of the target object between the (i-1)-th frame image and the i-th frame image in the same reference direction, and the current frame image comprises the i-th frame image;
acquiring the current position coordinate of the target object in the i-th frame image; extracting image features of the i-th frame image to obtain a second feature vector of the i-th frame image, and determining the second pixel height and the second pixel width of the target object in the i-th frame image according to the first pixel height, the first pixel width, the scale difference and the rotation angle, wherein the current position coordinate is the center pixel coordinate of the target object in the i-th frame image;
training and updating the classifier parameters according to the second feature vector, the current position coordinate, the second pixel height and the second pixel width, so that the image acquisition device samples an (i+1)-th frame image according to the current position coordinate, the second pixel height and the second pixel width, wherein the (i+1)-th frame image comprises the next frame image.
9. The method of claim 8, wherein after the determining of the scale difference and the rotation angle of the target object between the (i-1)-th frame image and the i-th frame image, the method further comprises:
stopping the tracking of the target object by the robot in the case that the rotation angle is greater than or equal to a second preset angle in the process of tracking the target object by the robot.
10. A target tracking apparatus, comprising:
a first acquiring unit, configured to acquire a target height of a target object, wherein the target object is a tracking object currently tracked by a robot, and the target height is used for representing the actual height of the target object;
a first determining unit, configured to determine a current distance between the robot and the target object according to the target height, where the current distance is used to represent a current actual distance between the robot and the target object;
a second determining unit, configured to determine a target tracking distance of the robot for tracking the target object according to the current distance, and determine a target included angle between the robot and the target object according to the target height;
a tracking unit, configured to track the target object according to the target tracking distance and the target included angle, wherein the target tracking distance is a preset tracking distance between the robot and the target object;
wherein the second determining unit further includes: a third acquiring module, configured to acquire a third pixel height of the target object in the next frame image under the condition that the target tracking distance between the robot and the target object changes and the target object moves transversely in a second time interval between the next frame image and the current frame image; a second determining module, configured to determine a ratio of the product of the target height and the focal length to the third pixel height as a first distance between the robot and the target object, and to determine a ratio of the product of the target height and the focal length to the second pixel height as a second distance between the robot and the target object, wherein the first distance is the distance between the robot and the target object in a first shooting direction in which the next frame image is acquired, and the second distance is the distance between the robot and the target object in a second shooting direction in which the current frame image is acquired; and a third determining module, configured to determine the target included angle according to the first distance, the second pixel height and a target proportionality coefficient, wherein the target proportionality coefficient is a proportionality coefficient between the pixel coordinates of the target object in the image and the real distance of the target object.
11. A computer-readable storage medium comprising a stored program, wherein the program, when run, performs the method of any one of claims 1 to 9.
12. An electronic device comprising a memory and a processor, wherein the memory has a computer program stored therein, and the processor is arranged to execute the method of any one of claims 1 to 9 by means of the computer program.
CN202010491672.2A 2020-06-02 2020-06-02 Target tracking method and device, storage medium and electronic device Active CN111665490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010491672.2A CN111665490B (en) 2020-06-02 2020-06-02 Target tracking method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN111665490A CN111665490A (en) 2020-09-15
CN111665490B true CN111665490B (en) 2023-07-14

Family

ID=72383713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010491672.2A Active CN111665490B (en) 2020-06-02 2020-06-02 Target tracking method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN111665490B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113457153A (en) * 2021-06-24 2021-10-01 深圳市瑞立视多媒体科技有限公司 Virtual engine-based vehicle plane movement control method and device
CN113591722B (en) * 2021-08-02 2023-09-12 山东大学 Target person following control method and system for mobile robot

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012215549A (en) * 2011-04-01 2012-11-08 Mitsubishi Electric Corp Tracking device
CN105488815A (en) * 2015-11-26 2016-04-13 北京航空航天大学 Real-time object tracking method capable of supporting target size change
EP3121791A1 (en) * 2015-07-24 2017-01-25 Ricoh Company, Ltd. Method and system for tracking objects
CN106502272A (en) * 2016-10-21 2017-03-15 上海未来伙伴机器人有限公司 A kind of target following control method and device
CN106605154A (en) * 2016-05-24 2017-04-26 英华达(上海)科技有限公司 Moving object monitoring method, wearing type apparatus and server
CN107992052A (en) * 2017-12-27 2018-05-04 纳恩博(北京)科技有限公司 Method for tracking target and device, mobile equipment and storage medium
CN108646741A (en) * 2018-05-31 2018-10-12 哈尔滨工程大学 A kind of unmanned boat method for tracking target of view-based access control model feedback
CN109716256A (en) * 2016-08-06 2019-05-03 深圳市大疆创新科技有限公司 System and method for tracking target
CN109816698A (en) * 2019-02-25 2019-05-28 南京航空航天大学 Unmanned plane visual target tracking method based on dimension self-adaption core correlation filtering
CN110276786A (en) * 2015-09-15 2019-09-24 深圳市大疆创新科技有限公司 Determine method and device, tracking device and the system of the location information of tracking target
CN111127518A (en) * 2019-12-24 2020-05-08 深圳火星探索科技有限公司 Target tracking method and device based on unmanned aerial vehicle

Also Published As

Publication number Publication date
CN111665490A (en) 2020-09-15

Similar Documents

Publication Publication Date Title
JP7106665B2 (en) MONOCULAR DEPTH ESTIMATION METHOD AND DEVICE, DEVICE AND STORAGE MEDIUM THEREOF
EP3766044B1 (en) Three-dimensional environment modeling based on a multicamera convolver system
KR102175491B1 (en) Method and apparatus for tracking object based on correlation filter
CN107369166B (en) Target tracking method and system based on multi-resolution neural network
KR102459853B1 (en) Method and device to estimate disparity
KR20180087994A (en) Stero matching method and image processing apparatus
US11049270B2 (en) Method and apparatus for calculating depth map based on reliability
US9165365B2 (en) Method and system for estimating attitude of camera
EP2352128B1 (en) Mobile body detection method and mobile body detection apparatus
WO2018128667A1 (en) Systems and methods for lane-marker detection
WO2018063608A1 (en) Place recognition algorithm
CN107516322B (en) Image object size and rotation estimation calculation method based on log polar space
CN111665490B (en) Target tracking method and device, storage medium and electronic device
CN110176024B (en) Method, device, equipment and storage medium for detecting target in video
JP2021077353A (en) Drone vision slam method based on gpu acceleration
CN112184757A (en) Method and device for determining motion trail, storage medium and electronic device
JP5674550B2 (en) Status tracking apparatus, method, and program
CN113112542A (en) Visual positioning method and device, electronic equipment and storage medium
CN116097307A (en) Image processing method and related equipment
CN113052907B (en) Positioning method of mobile robot in dynamic environment
CN114299230A (en) Data generation method and device, electronic equipment and storage medium
CN113639782A (en) External parameter calibration method and device for vehicle-mounted sensor, equipment and medium
CN110175523B (en) Self-moving robot animal identification and avoidance method and storage medium thereof
CN113936042B (en) Target tracking method and device and computer readable storage medium
KR101480824B1 (en) Background motion compensation method using multi-homography scheme

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant