CN111784224A - Object tracking method and device, control platform and storage medium

Info

Publication number
CN111784224A
Authority
CN
China
Prior art keywords
tracked
positioning information
current positioning
current
objects
Prior art date
Legal status
Granted
Application number
CN202010223928.1A
Other languages
Chinese (zh)
Other versions
CN111784224B (en)
Inventor
吴迪
万保成
Current Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN202010223928.1A
Publication of CN111784224A
Application granted
Publication of CN111784224B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083Shipping
    • G06Q10/0833Tracking

Abstract

The present disclosure relates to an object tracking method and apparatus, a control platform, and a storage medium. The object tracking method includes: acquiring current positioning information of all tabletop objects on a control platform; matching the current positioning information of all tabletop objects on the control platform against the pre-stored historical motion trajectory of each object to be tracked; and taking the current positioning information of each successfully matched tabletop object as the current positioning information of the corresponding object to be tracked. The method and apparatus can track the motion of objects without prediction.

Description

Object tracking method and device, control platform and storage medium
Technical Field
The present disclosure relates to the field of logistics, and in particular, to an object tracking method and apparatus, a control platform, and a storage medium.
Background
With the current rapid development of logistics transportation and warehouse automation, some unmanned-warehouse scenarios require low-speed motion control of objects such as square boxes (low speed here means below 2 m/s), for example on intelligent conveyor belts and similar platform-type control devices. The square boxes therefore first need to be accurately positioned and tracked, and the acquired box information (such as geometric dimensions, position, attitude, box number, and historical movement path) provided to the control device so that real-time closed-loop control can be completed jointly.
Disclosure of Invention
The inventors have found through research that, in the related art, since the control part holds an expected control path for each box, prediction can first be performed based on the positioning result of the previous frame (or earlier frames), the time interval t from that frame to the current frame, and the expected path, so as to estimate the predicted positioning information for the current frame; the positioning result actually obtained in the current frame is then matched against this prediction, and tracking is completed based on the matching result.
However, such prediction requires the positioning result of the previous frame (or earlier frames), the time interval t from that frame to the current frame, and the expected path, and therefore naturally consumes corresponding time and space resources (such as computing resources). If the results of more earlier frames are used for more accurate prediction, even more resources are required.
In view of at least one of the above technical problems, the present disclosure provides an object tracking method and apparatus, a control platform, and a storage medium that can track the motion of an object without prediction.
According to an aspect of the present disclosure, there is provided an object tracking method including:
pre-storing the historical motion trajectory of each object to be tracked;
acquiring current positioning information of all tabletop objects on a control platform;
for each object to be tracked, searching the current positioning information of all tabletop objects for current positioning information that matches the historical motion trajectory of that object to be tracked;
and taking the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked.
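For illustration only, the following minimal Python sketch (not part of the original disclosure; the names Track and match_current_frame, the reduction of positioning information to center-point coordinates, and the 0.07 m default are assumptions) outlines these four steps using the distance-threshold matching described in later embodiments:

```python
import math
from dataclasses import dataclass

@dataclass
class Track:
    """Pre-stored historical motion trajectory of one object to be tracked."""
    track_id: int
    history: list          # (x, y) center coordinates in metres, oldest first

def match_current_frame(tracks, current_positions, threshold=0.07):
    """Assign each current tabletop position to the track whose latest point
    lies within `threshold` metres, and append it as the track's newest point."""
    unmatched = list(current_positions)
    for track in tracks:
        last_x, last_y = track.history[-1]
        for pos in unmatched:
            if math.hypot(pos[0] - last_x, pos[1] - last_y) < threshold:
                # the matched current positioning info becomes the track's newest point
                track.history.append(pos)
                unmatched.remove(pos)
                break
    return unmatched        # positions that matched no track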
In some embodiments of the present disclosure, the successfully matched tabletop object and the object to be tracked are the same object.
In some embodiments of the present disclosure, the object tracking method further comprises:
outputting the current motion trajectory of the object to be tracked, where the current motion trajectory comprises the historical motion trajectory and the current positioning information.
In some embodiments of the present disclosure, searching, for each object to be tracked, the current positioning information of all tabletop objects for current positioning information that matches the historical motion trajectory of that object includes:
for each object to be tracked, searching the current positioning information of all tabletop objects for current positioning information that matches the latest positioning information in the historical motion trajectory of that object.
In some embodiments of the present disclosure, for each object to be tracked, searching the current positioning information of all tabletop objects for current positioning information that matches the latest positioning information in the historical motion trajectory of that object includes:
calculating the distance between the current positioning information of each tabletop object and the latest positioning information of the object to be tracked;
judging whether the distance is smaller than a predetermined threshold;
and, if the distance is smaller than the predetermined threshold, determining that the tabletop object whose distance is below the threshold is successfully matched with the object to be tracked.
In some embodiments of the present disclosure, acquiring current positioning information of all tabletop objects on the control platform includes: acquiring the current positioning information of all tabletop objects on the control platform at predetermined positioning time intervals.
In some embodiments of the present disclosure, the predetermined threshold is not less than the distance an object can move within one predetermined positioning time interval, and the predetermined threshold is less than the minimum distance from the center of the object to its edge.
In some embodiments of the present disclosure, the object tracking method further comprises:
judging whether there is a tabletop object whose current positioning information fails to match the historical motion trajectories of all objects to be tracked;
and, if such a tabletop object exists, treating it as a new object to be tracked and taking its current positioning information as the current positioning information of the new object to be tracked.
In some embodiments of the present disclosure, the object tracking method further comprises:
judging whether there is an object to be tracked whose historical motion trajectory fails to match the current positioning information of all tabletop objects;
and, if such an object to be tracked exists, deleting its historical motion trajectory from the set of objects to be tracked.
In some embodiments of the present disclosure, deleting the historical motion trajectory of the object to be tracked from the set of objects to be tracked includes:
adding 1 to the tracking-anomaly count of the object to be tracked;
judging whether the tracking-anomaly count of the object to be tracked is greater than or equal to a predetermined count;
and deleting the historical motion trajectory of the object to be tracked from the set of objects to be tracked if the tracking-anomaly count is greater than or equal to the predetermined count.
According to another aspect of the present disclosure, there is provided an object tracking apparatus including:
a historical trajectory storage module configured to pre-store the historical motion trajectory of each object to be tracked;
a positioning information acquisition module configured to acquire the current positioning information of all tabletop objects on the control platform;
a motion trajectory matching module configured to, for each object to be tracked, search the current positioning information of all tabletop objects for current positioning information that matches the historical motion trajectory of that object;
and an object motion tracking module configured to take the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked.
In some embodiments of the present disclosure, the object tracking device is configured to perform operations for implementing the object tracking method according to any of the above embodiments.
According to another aspect of the present disclosure, there is provided an object tracking apparatus including:
a memory to store instructions;
a processor configured to execute the instructions to cause the apparatus to perform operations to implement the object tracking method according to any of the embodiments described above.
According to another aspect of the present disclosure, there is provided a control platform, comprising an object tracking device as described in any one of the above embodiments.
According to another aspect of the present disclosure, a computer-readable storage medium is provided, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the object tracking method according to any one of the above embodiments.
The method and apparatus can track the motion of objects without prediction.
Drawings
To illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and that other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 and 2 are schematic diagrams of a control platform in some embodiments of the present disclosure.
FIG. 3 is a schematic diagram of some embodiments of a related art object tracking method.
FIG. 4 is a schematic diagram of some embodiments of object tracking methods of the present disclosure.
FIG. 5 is a schematic diagram of further embodiments of object tracking methods of the present disclosure.
Fig. 6 is a schematic view of a scenario in accordance with some embodiments of the present disclosure.
FIG. 7 is a schematic diagram of some embodiments of object tracking devices of the present disclosure.
FIG. 8 is a schematic view of additional embodiments of object tracking devices according to the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will now be described clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present disclosure. The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the disclosure, its application, or its uses. All other embodiments obtained by a person skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that, for convenience of description, the parts shown in the drawings are not drawn to scale.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Fig. 1 and 2 are schematic diagrams of a control platform in some embodiments of the present disclosure, showing an intelligent conveyor belt and similar control platforms. Objects such as square boxes and packages need to be accurately positioned and tracked, and the acquired object information (such as geometric dimensions, position, attitude, box number, and historical movement path) provided to the control platform so that real-time closed-loop control can be completed jointly.
In the scenarios of Figs. 1 and 2, typical control generally requires a capture rate of at least 30 frames per second, i.e. positioning and tracking information must be updated at least 30 times per second to complete real-time control (the interval between successive positioning updates is less than 33 ms). Typical 3D depth cameras can reach a data acquisition rate of 30 frames per second, and 2D color cameras or other sensors can run at even higher frame rates. It is assumed here that both data acquisition and positioning already meet this requirement.
When objects such as square boxes move continuously at low speed, the positioning part is assumed to be complete, that is, the geometric position and attitude information of all such objects on the platform is acquired at every capture (every frame); the square boxes then need to be tracked.
In the present disclosure, tracking means matching the positioning results of all square boxes on the tabletop in the current frame (for example, for N boxes, the size, center-point coordinates, rotation angle, and other information of each box are known; this is referred to herein as geometric pose information and is considered to correspond one-to-one with the boxes) against the positioning results acquired in the previous frame (for example, the geometric pose information of M boxes), in order to determine: 1) which earlier box each box of the current frame corresponds to, so that it is given the same number; and 2) the historical motion trajectory of every box. Once the tracking part is finished, the tracking result of the current frame can be provided to the control device, which performs real-time closed-loop control of each box along its expected path based on the box number, historical motion trajectory, and related information.
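For concreteness, the per-box geometric pose information mentioned above could be held in a structure like the following Python sketch (a hypothetical representation; the field names are not defined in the original text):

```python
from dataclasses import dataclass

@dataclass
class GeometricPose:
    """Positioning result of one square box in one frame."""
    box_id: int        # number assigned when the box is first tracked
    length: float      # horizontal dimensions of the box, in metres
    width: float
    center_x: float    # center-point coordinates on the tabletop, in metres
    center_y: float
    yaw: float         # rotation angle about the vertical axis, in radians
```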
FIG. 3 is a schematic diagram of some embodiments of a related-art object tracking method. As shown in FIG. 3, the related-art object tracking method may proceed as follows: since the control part holds an expected control path for each box, prediction is first performed based on the positioning result of the previous frame (or earlier frames), the time interval t from that frame to the current frame, and the expected path, so as to estimate the predicted positioning information for the current frame; the positioning result actually obtained in the current frame is then matched against this prediction, and tracking is completed based on the matching result.
Through research, the inventors have identified several technical problems in the related art:
1. The positioning result in the related art is generally obtained by a camera or other sensors; the module providing this function is referred to here as the positioning module, and the control device as the control module. Whether the tracking function (module) is placed in the positioning module, in the control module, or in a separate module, each tracking calculation requires multiple interactions with the positioning module and/or the control module: the tracking module needs not only the current-frame positioning result provided by the positioning module but also the predicted positioning information provided by the control and positioning modules, so every tracking calculation involves additional information transfer and interaction.
2. The prediction in the related art requires the positioning result of the previous frame (or earlier frames), the time interval t from that frame to the current frame, and the expected path, and therefore naturally consumes corresponding time and space resources (such as computing resources). If the results of more earlier frames are used for more accurate prediction, even more resources are required.
3. The related art may also adopt template-based or deep-learning-based techniques, but these have their own limitations: template-based methods require the target to be modeled or constrain the target's type and features to some extent, and deep-learning-based methods likewise constrain the target type and require pre-training. In the scenarios shown in Figs. 1 and 2, the length, width, and height of the square boxes are unknown (and can vary widely), and the box texture is very plain (and essentially identical across boxes apart from size). In addition, these techniques demand more time or space resources.
4. The related art does not adequately handle situations such as boxes entering and leaving the platform, or other abnormal conditions.
In view of at least one of the above technical problems, the present disclosure provides an object tracking method and apparatus, a control platform, and a storage medium, which are explained below through embodiments.
The present disclosure provides a prediction-free, logically simple, and complete object tracking method for the tracking part of low-speed motion control of square boxes.
The present disclosure addresses the tracking part only; the positioning results of all boxes in each positioning period (e.g., each frame) are assumed to have been obtained. A positioning result mainly refers to geometric pose information, including the geometric size of a square box (length and width) and its pose (center-point coordinates and rotation angle).
For the scenarios and the typical control conditions mentioned in the present disclosure, the following prerequisites generally hold:
1) The square boxes move at low speed, below 2 m/s. At 30 frames per second, each positioning information acquisition interval is about 33 ms, so the distance a box moves between acquisitions is at most 2 m/s × 33 ms = 66 mm = 0.066 m.
2) Objects such as square boxes cannot be too small, or they become difficult for the control device (platform) to control. In general, the side length of a square box is at least 15 cm.
3) Multiple boxes may be present on the control platform at the same time, and each box needs to be controlled individually from the moment it enters the platform until it leaves.
For the control requirements in these scenarios, the present disclosure provides a tracking method for low-speed motion control of objects such as square boxes.
FIG. 4 is a schematic diagram of some embodiments of object tracking methods of the present disclosure. Preferably, this embodiment may be performed by the object tracking device of the present disclosure. The method may comprise the steps of:
and step 40, pre-storing the historical motion track of each object to be tracked.
And step 41, acquiring the current positioning information of all table objects on the control platform.
In some embodiments of the present disclosure, step 41 may comprise: and acquiring the current positioning information of all table objects on the control platform at preset positioning time intervals.
In some embodiments of the present disclosure, step 41 may comprise: and acquiring the current positioning information of all moving table objects on the table top with coordinate information by using sensors such as a camera and the like.
And 42, searching current positioning information matched with the historical motion trail of the object to be tracked from the current positioning information of all table objects aiming at each object to be tracked.
In some embodiments of the present disclosure, step 42 may comprise: and aiming at each object to be tracked, searching current positioning information matched with the latest positioning information in the historical motion trail of the object to be tracked from the current positioning information of all table objects.
In some embodiments of the present disclosure, for each object to be tracked, the step of searching current positioning information matching the latest positioning information in the historical motion trail of the object to be tracked from the current positioning information of all table objects includes: calculating the distance between the current positioning information of each table object and the latest positioning information of the object to be tracked; judging whether the distance is smaller than a preset threshold value; and if the distance is smaller than a preset threshold value, determining that the table object with the distance smaller than the preset threshold value is successfully matched with the object to be tracked.
In some embodiments of the present disclosure, the distance between the current positioning information of each table top object and the latest positioning information of each object to be tracked may be a distance between a center point coordinate position in the current positioning information of each table top object and a center point coordinate position in the latest positioning information of each object to be tracked.
In some embodiments of the present disclosure, the distance between the coordinate position of the center point of each table object and the coordinate position of the center point of each object to be tracked may be a euclidean distance or other distance therebetween.
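Written out, with $(x_m, y_m)$ denoting the center point of a tabletop object in the current frame, $(x_n, y_n)$ the center point in the latest positioning information of an object to be tracked, and $d_{th}$ the predetermined threshold (the symbols are introduced here only for clarity), the Euclidean distance criterion is:

$$\sqrt{(x_m - x_n)^2 + (y_m - y_n)^2} < d_{th}$$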
In some embodiments of the present disclosure, the predetermined threshold is not less than the distance an object can move within one predetermined positioning time interval, and the predetermined threshold is less than the minimum distance from the center of the object to its edge.
In other embodiments of the present disclosure, step 42 may include: judging whether a tabletop object matches an object to be tracked by using the overlap (for example, the intersection area) between the objects in two adjacent pieces of positioning information as the criterion.
For example, step 42 may include: comparing the intersection area of one face (for example, the bottom face) in the current positioning information of each tabletop object and the corresponding face (for example, the bottom face) in the latest positioning information of the historical motion trajectory of each object to be tracked against a preset area, so as to judge whether the tabletop object matches the object to be tracked.
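As an illustration of this alternative criterion, the sketch below computes the bottom-face intersection area; it is a rough approximation that ignores box rotation, and both the comparison direction and the example area threshold are assumptions rather than values from the original text:

```python
def bottom_face_overlap(box_a, box_b):
    """Intersection area of two axis-aligned bottom faces.

    Each box is (center_x, center_y, length, width) in metres; rotation is
    ignored here, so this is only an approximation for rotated boxes.
    """
    ax0, ax1 = box_a[0] - box_a[2] / 2, box_a[0] + box_a[2] / 2
    ay0, ay1 = box_a[1] - box_a[3] / 2, box_a[1] + box_a[3] / 2
    bx0, bx1 = box_b[0] - box_b[2] / 2, box_b[0] + box_b[2] / 2
    by0, by1 = box_b[1] - box_b[3] / 2, box_b[1] + box_b[3] / 2
    dx = min(ax1, bx1) - max(ax0, bx0)
    dy = min(ay1, by1) - max(ay0, by0)
    return dx * dy if dx > 0 and dy > 0 else 0.0

def overlap_match(current_box, latest_box, min_area=0.01):
    # Assumed direction: an overlap larger than min_area (in square metres) counts as a match.
    return bottom_face_overlap(current_box, latest_box) >= min_area
```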
For the scenarios and control requirements mentioned in the above embodiments of the present disclosure, the threshold-based Euclidean distance criterion is recommended; the criterion of the above embodiments is simple, computationally light, and sufficiently effective.
Step 43: take the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked.
In some embodiments of the present disclosure, the successfully matched tabletop object and the object to be tracked are the same object.
In some embodiments of the present disclosure, the object tracking method may further include: judging whether there is a tabletop object whose current positioning information fails to match the historical motion trajectories of all objects to be tracked; and, if such a tabletop object exists, treating it as a new object to be tracked and taking its current positioning information as the current positioning information of the new object to be tracked.
In some embodiments of the present disclosure, the object tracking method may further include: when a new object to be tracked is created, adding a path state value to it, the path state being either normal or abnormal. A normal(-state) path corresponds to a box that has genuinely just entered the control platform, which can only happen at a designated position of the control platform; an abnormal(-state) path corresponds to any entry other than at the designated position (for example, a box that appears on the platform out of nowhere), characterizing a situation in which the box did not enter the control platform from the conveyor in the normal way.
In some embodiments of the present disclosure, the object tracking method may further include: judging whether there is an object to be tracked whose historical motion trajectory fails to match the current positioning information of all tabletop objects; and, if such an object exists, deleting its historical motion trajectory from the set of objects to be tracked.
In some embodiments of the present disclosure, deleting the historical motion trajectory of the object to be tracked from the set of objects to be tracked includes: adding 1 to the tracking-anomaly count of the object; judging whether the tracking-anomaly count is greater than or equal to a predetermined count; and deleting the historical motion trajectory of the object from the set of objects to be tracked if the tracking-anomaly count is greater than or equal to the predetermined count.
In some embodiments of the present disclosure, the object tracking method may further include: when the current positioning information of a tabletop object is successfully matched with the historical motion trajectory of an object to be tracked, judging whether the tabletop object has left the control platform; if it has not left the control platform, performing the operation of taking the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked (i.e. the path update operation), and then deleting the object to be tracked from the set of objects to be tracked and the tabletop object from the set of tabletop objects; and if it has left the control platform, deleting the object to be tracked from the set of objects to be tracked and the tabletop object from the set of tabletop objects.
The object tracking method provided by the above embodiments of the present disclosure gives a complete tracking flow and method for square boxes, aimed at the low-speed motion control (real-time closed-loop control) that platform-type control devices such as intelligent conveyor belts require for objects such as square boxes. The above embodiments of the present disclosure require no interaction with other modules, no expected movement path of the box, and no real-time prediction of current positioning information.
Based on the actual control and operating conditions of the scenario, the above embodiments of the present disclosure can track multiple boxes using only the positioning information of each frame (each predetermined positioning time interval), covering the complete life cycle of a box from entering the platform to leaving it, as well as possible abnormal situations. The above embodiments provide a clear and complete "path set" viewpoint and require little time and space resources.
The present disclosure tracks boxes from the viewpoint of a "path set" and completes real-time tracking of every box. The "path set" of the present disclosure is essentially the historical motion trajectory of each box and differs from the control path of the control part: the control part sends control commands to a box, based on its expected path and current pose, so that the box moves along the expected path, whereas the "path set" records the state of each box, in terms of its historical motion trajectory, over the whole process from entering the platform to leaving it.
FIG. 5 is a schematic diagram of further embodiments of object tracking methods of the present disclosure. Preferably, this embodiment may be performed by the object tracking apparatus of the present disclosure. The object in the embodiment of FIG. 5 is a square box, but it may equally be another object such as a package. The object tracking method of the FIG. 5 embodiment may include the following steps:
Step 1: obtain the positioning result list of the current frame, representing the positioning information of all square boxes on the tabletop in the current frame, M boxes in total. One of these positioning results is denoted m, with 0 ≤ m ≤ (M-1).
Step 2: the "path set" stores the historical motion trajectory of each square box (again, these are not control paths).
In some embodiments of the present disclosure, only the latest pose information is stored in the path set, i.e. the most recent update up to the current frame.
In the above embodiments of the present disclosure, the path set contains N paths in total, meaning the historical motion trajectories (pose information, etc.) of the N boxes currently being tracked. One of these paths is denoted n, with 0 ≤ n ≤ (N-1). Path n holds the most recently updated (e.g. latest) geometric pose information of its square box as of the current frame.
In some embodiments of the present disclosure, this can be extended, as required, to all historical pose information, which can then be used to weight the matching decision.
Step 3: match path n against positioning result m based on a preset threshold. If the matching succeeds, positioning result m belongs to path n, i.e. the square box corresponding to positioning result m is the square box corresponding to (tracked by) path n, and step 4-1 is executed. Otherwise, if the matching fails, step 5-1 is performed.
In some embodiments of the present disclosure, threshold-based matching is judged mainly on the center-point coordinates in the pose information: the Euclidean distance between the latest position information (center-point coordinates) of path n and the position information (center-point coordinates) of positioning result m is calculated.
In some embodiments of the present disclosure, a predetermined threshold of 0.07 m may be set; when the Euclidean distance between the two is less than the threshold, the matching is considered successful, otherwise it fails.
In some embodiments of the present disclosure, the threshold is valid if it simultaneously satisfies two conditions. 1) It is not less than the distance a box can move within one frame interval: the square boxes move at low speed, below 2 m/s, and the control requires positioning and tracking updates at least 30 times per second, i.e. an interval of less than 33 ms, so the movement distance per interval is at most 2 m/s × 33 ms = 0.066 m, and condition 1) is satisfied. 2) It is less than half the shortest side length of a box: given the control requirements of the control device, the side length of a square box is at least 15 cm, so condition 2) is satisfied.
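Restating the numbers above as one inequality chain, the 0.07 m threshold sits between the largest per-interval displacement and half the shortest box side:

$$2\,\mathrm{m/s} \times 0.033\,\mathrm{s} = 0.066\,\mathrm{m} < 0.07\,\mathrm{m} < \tfrac{1}{2} \times 0.15\,\mathrm{m} = 0.075\,\mathrm{m}$$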
Step 4-1: judge, from positioning result m, whether the box has left the platform. If it has left, execute step 4-2; otherwise, if it has not left the platform, the box is still within the control range of the control platform, and step 4-3 is performed.
In some embodiments of the present disclosure, the criterion for whether a box has left the platform is whether all four corners of the box have left the control platform.
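A possible implementation of this criterion is sketched below; it assumes the platform can be approximated by an axis-aligned rectangle with hypothetical bounds, which the original text does not specify:

```python
import math

def has_left_platform(center_x, center_y, length, width, yaw,
                      plat_x_min, plat_y_min, plat_x_max, plat_y_max):
    """True when all four corners of the box footprint lie outside the platform."""
    half_l, half_w = length / 2.0, width / 2.0
    cos_a, sin_a = math.cos(yaw), math.sin(yaw)
    corners = [
        (center_x + dx * cos_a - dy * sin_a, center_y + dx * sin_a + dy * cos_a)
        for dx, dy in ((half_l, half_w), (half_l, -half_w),
                       (-half_l, half_w), (-half_l, -half_w))
    ]

    def on_platform(x, y):
        return plat_x_min <= x <= plat_x_max and plat_y_min <= y <= plat_y_max

    return not any(on_platform(x, y) for x, y in corners)
```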
Step 4-2: if the box has left the platform, delete positioning result m from the positioning result list of the current frame. Since the matching succeeded, positioning result m needs no further matching, and deleting it from the list reduces the subsequent amount of computation. After step 4-2, the flow jumps to step 6, i.e. the "path extinction" operation is performed.
Step 4-3: enter the path update operation, and then execute step 4-4.
In some embodiments of the present disclosure, the "path update" operation may include: updating the information of positioning result m into path n, so that path n acquires the latest geometric pose information of its box and associates it with the past historical motion trajectory. At the same time, the update exception counter of path n (explained in detail under "path establishment") is reset to 0.
Step 4-4: delete positioning result m from the positioning result list of the current frame; the flow then proceeds to step 7 to move on to the next path.
Step 5-1: judge whether all positioning results of the current frame have been traversed. If not, execute step 5-2. If they have all been traversed, path n matches none of the positioning results of the current frame, so path n enters the "path extinction" operation, i.e. the flow jumps to step 6.
Step 5-2: set m = m + 1, and then continue threshold-based matching of the next positioning result (positioning result m+1) in the list against path n, i.e. jump back to step 3.
Step 6: perform the "path extinction" operation.
In some embodiments of the present disclosure, the "path extinction" operation denotes the deletion of a path, indicating that the path's life cycle has ended and that the path (and its corresponding box) no longer needs to be tracked.
In some embodiments of the present disclosure, step 6 may comprise:
and 6-1, judging whether the box body leaves the platform. The method of determining whether the pod has left the platform is the same as described in step 4-1.
And 6-2, if the box body is separated, namely under the condition of the step 4-2, the life cycle of the path is naturally ended, and the path belongs to normal extinction: path n is immediately deleted from the "path set".
And 6-3, if the box body does not leave, namely the path n is not matched with any positioning result, indicating that abnormal conditions such as loss of the heel and the like may exist. To address this type of anomaly, the present disclosure contemplates forced (abnormal) extinction of path n.
In some embodiments of the present disclosure, step 6-3 may comprise: setting an update exception counter in path n (explained in detail in "path setup"); the counter is first incremented by +1 and then a decision is made as to whether a preset value has been reached, for example 3 times (indicating that 3 consecutive frames have not been updated. this value can be set according to the scene, but is not recommended to be too large, otherwise the possibility of errors is increased during the matching of the path with the positioning result). If the threshold is reached, immediately deleting the path n from the "path set" (forced extinction); if not, go directly to the subsequent step 7.
Step 7: judge whether all paths in the path set have been traversed. If not, enter the next cycle and repeat steps 3-6 for the next path (path n+1); if all paths in the path set have been traversed, they have all been processed, and the flow proceeds to step 8.
Step 8: judge whether the positioning result list is empty. If the list is not empty, there is a positioning result (denoted j) that matches none of the paths in the path set, and step 9 is performed. If the list is empty, all positioning results of the current frame have been matched successfully, and step 10 is performed.
In steps 4-2 and 4-4 (threshold-based matching, when matching succeeds), the above embodiments of the present disclosure delete the matched positioning result m, so all matched positioning results are removed from the list; if every positioning result of the current frame was matched, the list will be empty.
Step 9: enter the "path establishment" operation, and then proceed to step 10.
In some embodiments of the present disclosure, step 9 may comprise: when positioning result j matches none of the paths in the path set, the present disclosure creates a new path in the path set (if the path set previously contained N paths, the new path is the (N+1)-th). The new path has the following characteristics:
a) A path has a state that indicates whether the path is normal or abnormal.
A normal(-state) path corresponds to a box that has genuinely just entered the control platform; such a box can only enter from a designated position of the control platform, as shown in the scenario diagram of FIG. 6.
An abnormal(-state) path corresponds to any entry other than at the designated position (for example, a box that appears on the platform out of nowhere), characterizing a situation in which the box did not enter the control platform from the conveyor in the normal way.
When positioning result j appears (and matches nothing), a new path is established for it, and the positioning result is written into the newly established path. Apart from whether the state is normal or abnormal, all other aspects and operations of the path (such as path update and extinction) are identical.
The advantage of dividing paths into normal and abnormal in the above embodiments of the present disclosure is that it distinguishes whether a box entered the platform normally at all or appeared abnormally (for example, was dropped onto the platform out of nowhere). More importantly, an abnormal path can trigger an alarm or a corresponding operation on the box, such as moving it to a designated location, and only a uniform handling procedure needs to be defined for abnormal paths. For the path itself (and for all flows of the tracking method), however, an abnormal path is operated on exactly like a normal one.
b) A path also has an "update exception counter", which is set to 0 when the path is established (this holds for newly established paths in both the normal and the abnormal state, since both have just been "updated" at that moment). A non-zero value indicates how many consecutive times the path has not been updated; for example, an update exception counter of 3 means the path has failed to be updated 3 consecutive times (that is, for 3 frames it did not match any positioning result of the current frame).
Step 10: output the tracking information of the current frame, i.e. the current "path set". Tracking is then complete.
In some embodiments of the present disclosure, step 10 may comprise: providing the latest state of each path in the path set (the essential information of the path set, indicating the latest situation of each tracked box) to the control part, so that the control part knows the number, geometric position and attitude, and historical motion trajectory of every box currently on the tabletop.
This completes the full flow of one tracking cycle (performed after the positioning result of the current frame is obtained).
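The complete per-frame cycle above can be condensed into the following Python sketch. It is a simplified reconstruction under stated assumptions rather than the original implementation: positioning results are reduced to center-point coordinates, and entered_normally and left_platform are hypothetical caller-supplied predicates for the designated entry side and the platform-boundary test:

```python
import math
from dataclasses import dataclass

THRESHOLD = 0.07      # metres, matching threshold from the description
MAX_MISSES = 3        # forced extinction after this many frames without an update

@dataclass
class Path:
    """One entry of the 'path set': the track of a single box."""
    path_id: int
    points: list          # (x, y) center coordinates, oldest first
    normal: bool = True   # False for a box that did not enter at the designated side
    misses: int = 0       # the 'update exception counter'

def track_one_frame(path_set, detections, next_id, entered_normally, left_platform):
    """One tracking cycle: match, update, extinguish, and establish paths."""
    remaining = list(detections)          # step 1: current-frame positioning results
    survivors = []
    for path in path_set:                 # steps 3-7: traverse the path set
        last_x, last_y = path.points[-1]
        match = next((d for d in remaining
                      if math.hypot(d[0] - last_x, d[1] - last_y) < THRESHOLD), None)
        if match is not None:
            remaining.remove(match)       # steps 4-2 / 4-4: result needs no further matching
            if left_platform(match):      # step 6-2: normal extinction, path dropped
                continue
            path.points.append(match)     # step 4-3: path update
            path.misses = 0
            survivors.append(path)
        else:                             # step 6-3: possible forced extinction
            path.misses += 1
            if path.misses < MAX_MISSES:
                survivors.append(path)
    for det in remaining:                 # steps 8-9: path establishment
        survivors.append(Path(path_id=next_id, points=[det],
                              normal=entered_normally(det)))
        next_id += 1
    return survivors, next_id             # step 10: the current 'path set'
```

In use, such a function would be called once per frame with the fresh positioning result list, and the returned path set (step 10) would be handed to the control part and reused in the next frame.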
Each path in the path set of the above embodiments of the present disclosure goes through three stages: path establishment → path update → path extinction. Compared with reasoning about each square box at every tracking step (i.e. deciding, after each positioning, whether a box has newly entered, whether it has left, which earlier box it corresponds to, and so on), the path-set viewpoint proposed by the above embodiments only needs to consider the latest state of each path, and whether that path should be established, updated, or extinguished, even when boxes continuously enter or leave the platform and when anomalies occur.
The above embodiments of the present disclosure do not need to predict the positioning information of the current frame from the expected path of the control module (no prediction is required) and do not need to interact with other modules (no interaction is required); the square boxes are tracked using only the positioning result of each frame. The present disclosure therefore requires fewer time and space resources.
The above-described embodiments of the present disclosure may be applied to the exemplary scenarios of the embodiments of fig. 1 and 2.
The above embodiments of the present disclosure may also be applied to the scenario shown in FIG. 6. As shown in FIG. 6, square boxes move off the conveyor belt and enter the platform-type control device from a designated side (the designated position; in a real scenario only the entry position is fixed, while there can be many exit positions), and are then controlled along a preset path until they leave the platform (for example, as indicated by the green or yellow arrows). The square boxes are tracked throughout the process of entering and leaving the platform, and their real-time geometric pose information is output to the control platform.
The tabletop of the embodiments of the present disclosure can hold several square boxes at the same time, and each box needs to be controlled individually.
FIG. 7 is a schematic diagram of some embodiments of object tracking devices of the present disclosure. As shown in FIG. 7, the object tracking apparatus of the present disclosure may include a historical trajectory storage module 70, a positioning information acquisition module 71, a motion trajectory matching module 72, and an object motion tracking module 73, wherein:
the historical trajectory storage module 70 is configured to pre-store the historical motion trajectory of each object to be tracked;
the positioning information acquisition module 71 is configured to acquire the current positioning information of all tabletop objects on the control platform.
In some embodiments of the present disclosure, the positioning information acquisition module 71 may be configured to acquire the current positioning information of all tabletop objects on the control platform at predetermined positioning time intervals.
The motion trajectory matching module 72 is configured to, for each object to be tracked, search the current positioning information of all tabletop objects for current positioning information that matches the historical motion trajectory of that object.
In some embodiments of the present disclosure, the motion trajectory matching module 72 may be configured to, for each object to be tracked, search the current positioning information of all tabletop objects for current positioning information that matches the latest positioning information in the historical motion trajectory of that object.
In some embodiments of the present disclosure, the motion trajectory matching module 72 may be configured to calculate the distance between the current positioning information of each tabletop object and the latest positioning information of each object to be tracked; judge whether the distance is smaller than a predetermined threshold; and, when the distance is smaller than the predetermined threshold, determine that the tabletop object whose distance is below the threshold is successfully matched with the object to be tracked.
In some embodiments of the present disclosure, the distance between the current positioning information of a tabletop object and the latest positioning information of an object to be tracked may be the distance between the center-point coordinates in the current positioning information of the tabletop object and the center-point coordinates in the latest positioning information of the object to be tracked.
In some embodiments of the present disclosure, the predetermined threshold is not less than the distance an object can move within one predetermined positioning time interval, and the predetermined threshold is less than the minimum distance from the center of the object to its edge.
In some embodiments of the present disclosure, the motion trajectory matching module 72 may be configured to judge whether a tabletop object matches an object to be tracked by using the overlap (for example, the intersection area) between the objects in two adjacent pieces of positioning information as the criterion.
In some embodiments of the present disclosure, the motion trajectory matching module 72 may be configured to compare the intersection area of one face (for example, the bottom face) in the current positioning information of each tabletop object and the corresponding face (for example, the bottom face) in the latest positioning information of the historical motion trajectory of each object to be tracked against a predetermined area, so as to judge whether the tabletop object matches the object to be tracked.
For the scenarios and control requirements mentioned in the above embodiments of the present disclosure, the threshold-based Euclidean distance criterion is recommended; the criterion of the above embodiments is simple, computationally light, and sufficiently effective.
The object motion tracking module 73 is configured to take the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked.
In some embodiments of the present disclosure, the successfully matched tabletop object and the object to be tracked are the same object.
In some embodiments of the present disclosure, the object tracking apparatus may be further configured to output the current motion trajectory of the object to be tracked, where the current motion trajectory comprises the historical motion trajectory and the current positioning information.
In some embodiments of the present disclosure, the object tracking apparatus may be further configured to judge whether there is a tabletop object whose current positioning information fails to match the historical motion trajectories of all objects to be tracked; and, if such a tabletop object exists, to treat it as a new object to be tracked and take its current positioning information as the current positioning information of the new object to be tracked.
In some embodiments of the present disclosure, the object tracking apparatus may be further configured to judge whether there is an object to be tracked whose historical motion trajectory fails to match the current positioning information of all tabletop objects; and, if such an object exists, to delete its historical motion trajectory from the set of objects to be tracked.
In some embodiments of the present disclosure, when deleting the historical motion trajectory of an object to be tracked from the set of objects to be tracked, the object tracking apparatus may be configured to add 1 to the tracking-anomaly count of the object; judge whether the tracking-anomaly count is greater than or equal to a predetermined count; and delete the historical motion trajectory of the object from the set of objects to be tracked if the tracking-anomaly count is greater than or equal to the predetermined count.
In some embodiments of the present disclosure, the object tracking apparatus may be further configured to judge, when the current positioning information of a tabletop object is successfully matched with the historical motion trajectory of an object to be tracked, whether the tabletop object has left the control platform; if it has not left the control platform, to perform the operation of taking the current positioning information of the successfully matched tabletop object as the current positioning information of the corresponding object to be tracked (i.e. the path update operation) and then delete the object to be tracked from the set of objects to be tracked and the tabletop object from the set of tabletop objects; and if it has left the control platform, to delete the object to be tracked from the set of objects to be tracked and the tabletop object from the set of tabletop objects.
In some embodiments of the present disclosure, the object tracking apparatus is configured to perform operations for implementing the object tracking method according to any of the embodiments described above (e.g., any of the embodiments of fig. 4 or 5).
The object tracking apparatus provided by the above embodiments of the present disclosure gives a complete tracking flow and method for square boxes, aimed at the low-speed motion control (real-time closed-loop control) that platform-type control devices such as intelligent conveyor belts require for objects such as square boxes. The above embodiments of the present disclosure require no interaction with other modules, no expected movement path of the box, and no real-time prediction of current positioning information.
FIG. 8 is a schematic view of additional embodiments of object tracking devices according to the present disclosure. As shown in fig. 8, the object tracking device of the present disclosure may include a memory 81 and a processor 82, wherein:
a memory 81 for storing instructions.
A processor 82 configured to execute the instructions to cause the apparatus to perform operations for implementing the object tracking method according to any of the embodiments described above (e.g., any of the embodiments of fig. 4 or fig. 5).
Based on the actual control and operating conditions of the scenario, the above embodiments of the present disclosure can track multiple boxes using only the positioning information of each frame (each predetermined positioning time interval), covering the complete life cycle of boxes and similar objects from entering the platform to leaving it, as well as possible abnormal situations. The above embodiments provide a clear and complete "path set" viewpoint and require little time and space resources.
According to another aspect of the present disclosure, there is provided a control platform including an object tracking device as described in any of the above embodiments (e.g., any of the embodiments of fig. 7 or 8).
The control platform provided by the above embodiments of the present disclosure tracks boxes from the viewpoint of a "path set" and completes real-time tracking of every box. The "path set" of the present disclosure is essentially the historical motion trajectory of each box and differs from the control path of the control part: the control part sends control commands to a box, based on its expected path and current pose, so that the box moves along the expected path, whereas the "path set" records the state of each box, in terms of its historical motion trajectory, over the whole process from entering the platform to leaving it.
According to another aspect of the present disclosure, a computer-readable storage medium is provided, wherein the computer-readable storage medium stores computer instructions, which when executed by a processor, implement the object tracking method according to any of the above embodiments (e.g., any of fig. 4 or fig. 5).
Each path in the path set of the above embodiments of the present disclosure goes through three stages: path establishment → path update → path extinction. Compared with approaches that reason about each box individually at every tracking step (i.e., after each positioning, deciding whether a box has newly entered, whether a box has left, which previous box it corresponds to, and so on), the "path set" viewpoint provided by the embodiments of the present disclosure only needs to consider the latest state of each path, namely whether the path is established, updated or extinguished, even when boxes continuously enter or leave the platform and when abnormal situations occur.
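By way of example and not limitation, the following minimal Python sketch shows how one frame of positioning results might drive these three stages; the preset threshold value, the tolerated number of anomalies, the resetting of the anomaly count on a successful match, and all names are assumptions made for illustration and do not form part of the present disclosure.

```python
# Illustrative sketch only; the threshold value, the tolerated number of anomalies,
# the reset of the anomaly count on a successful match, and all names are assumptions.
import math

PRESET_THRESHOLD = 0.05   # metres between a track's latest point and a current positioning result
MAX_ANOMALIES = 3         # preset number of consecutive unmatched frames before extinction

def update_path_set(path_set, table_objects):
    """Apply one frame of positioning results to the path set.

    path_set      -- list of dicts {'points': [(x, y), ...], 'anomalies': int}
    table_objects -- current positioning information of all table objects, list of (x, y)
    """
    unmatched = list(table_objects)

    for path in list(path_set):
        last = path["points"][-1]
        # Path update: find a current positioning result close enough to the latest point.
        match = next((p for p in unmatched if math.dist(p, last) < PRESET_THRESHOLD), None)
        if match is not None:
            path["points"].append(match)
            path["anomalies"] = 0
            unmatched.remove(match)
        else:
            # Path extinction: unmatched for too many consecutive frames.
            path["anomalies"] += 1
            if path["anomalies"] >= MAX_ANOMALIES:
                path_set.remove(path)

    # Path establishment: any leftover positioning result belongs to a newly entered object.
    for pos in unmatched:
        path_set.append({"points": [pos], "anomalies": 0})

    return path_set
```

Called once per preset positioning time interval, a routine of this kind keeps the path set current without any prediction step and without interaction with other modules, which is consistent with the viewpoint described above.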
The above-described embodiments of the present disclosure do not require predicting the positioning information of the current frame from the expected path of the control module (no prediction is required) and do not require interaction with other modules (no interaction is required); the square box is tracked using only the positioning result of each frame. The present disclosure therefore requires fewer time and space resources.
The object tracking devices described above may be implemented as a general-purpose processor, a Programmable Logic Controller (PLC), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof, for performing the functions described herein.
Thus far, the present disclosure has been described in detail. Some details that are well known in the art have not been described in order to avoid obscuring the concepts of the present disclosure. It will be fully apparent to those skilled in the art from the foregoing description how to practice the presently disclosed embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware to implement the above embodiments, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The description of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, and to enable others of ordinary skill in the art to understand the disclosure and its various embodiments with the various modifications that are suited to the particular use contemplated.

Claims (13)

1. An object tracking method, comprising:
pre-storing the historical motion track of each object to be tracked;
acquiring current positioning information of all table objects on a control platform;
for each object to be tracked, searching, from the current positioning information of all table objects, for current positioning information matching the historical motion track of the object to be tracked; and
taking the current positioning information of the successfully matched table object as the current positioning information of the corresponding object to be tracked.
2. The object tracking method according to claim 1, wherein the table object and the object to be tracked that are successfully matched are the same object;
the object tracking method further includes:
outputting the current motion track of the object to be tracked, wherein the current motion track comprises the historical motion track and the current positioning information.
3. The object tracking method according to claim 1, wherein the searching, for each object to be tracked, for current positioning information matching the historical motion track of the object to be tracked from the current positioning information of all table objects comprises:
for each object to be tracked, searching, from the current positioning information of all table objects, for current positioning information matching the latest positioning information in the historical motion track of the object to be tracked.
4. The object tracking method according to claim 3, wherein the searching, for each object to be tracked, for current positioning information matching the latest positioning information in the historical motion track of the object to be tracked from the current positioning information of all table objects comprises:
calculating the distance between the current positioning information of each table object and the latest positioning information of the object to be tracked;
judging whether the distance is smaller than a preset threshold value;
and if the distance is smaller than a preset threshold value, determining that the table object with the distance smaller than the preset threshold value is successfully matched with the object to be tracked.
5. The object tracking method according to claim 4,
the acquiring of the current positioning information of all table objects on the control platform comprises: acquiring current positioning information of all table objects on the control platform at preset positioning time intervals;
the preset threshold value is greater than the distance the object moves within one preset positioning time interval, and the preset threshold value is smaller than the minimum distance from the center of the object to the edge of the object.
6. The object tracking method according to any one of claims 1 to 5, further comprising:
judging whether there is current positioning information of a table object that fails to match the historical motion tracks of all the objects to be tracked; and
in the case that there is current positioning information of a table object that fails to match the historical motion tracks of all the objects to be tracked, taking the table object as a new object to be tracked and taking the current positioning information of the table object as the current positioning information of the new object to be tracked.
7. The object tracking method according to any one of claims 1 to 5, further comprising:
judging whether there is a historical motion track of an object to be tracked that fails to match the current positioning information of all table objects; and
in the case that there is a historical motion track of an object to be tracked that fails to match the current positioning information of all table objects, deleting the historical motion track of the object to be tracked from the set of objects to be tracked.
8. The object tracking method according to claim 7, wherein the deleting the historical motion track of the object to be tracked from the set of objects to be tracked comprises:
adding 1 to the number of tracking anomalies of the object to be tracked;
judging whether the number of tracking anomalies of the object to be tracked is greater than or equal to a preset number; and
deleting the historical motion track of the object to be tracked from the set of objects to be tracked in the case that the number of tracking anomalies of the object to be tracked is greater than or equal to the preset number.
9. An object tracking device, comprising:
a historical track storage module configured to pre-store the historical motion track of each object to be tracked;
a positioning information acquisition module configured to acquire the current positioning information of all table objects on the control platform;
a motion track matching module configured to search, for each object to be tracked, for current positioning information matching the historical motion track of the object to be tracked from the current positioning information of all table objects; and
an object motion tracking module configured to take the current positioning information of the successfully matched table object as the current positioning information of the corresponding object to be tracked.
10. The object tracking device according to claim 9, wherein the object tracking device is configured to perform operations to implement the object tracking method according to any one of claims 2 to 8.
11. An object tracking device, comprising:
a memory configured to store instructions; and
a processor configured to execute the instructions to cause the device to perform operations implementing the object tracking method according to any one of claims 1 to 8.
12. A control platform, characterized by comprising an object tracking device as claimed in any one of claims 9 to 11.
13. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement an object tracking method as claimed in any one of claims 1 to 8.
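By way of example and not limitation, the following Python sketch checks a candidate preset threshold value against the two conditions recited in claim 5 as understood here; the function name, parameter names and numerical values are assumptions made for illustration and do not form part of the claims.

```python
# Illustrative sketch only; names and numerical values are assumptions, and the
# reading of claim 5 expressed in the docstring is this example's interpretation.
def threshold_is_valid(threshold, max_speed, positioning_interval, box_width, box_length):
    """Check a candidate preset threshold value against the conditions of claim 5
    as understood here: the threshold should exceed the distance a box can move
    within one positioning interval (so a box always re-matches its own track),
    while staying below the minimum distance from the box centre to its edge
    (so it does not match a neighbouring box instead)."""
    max_move_per_interval = max_speed * positioning_interval
    min_centre_to_edge = min(box_width, box_length) / 2.0
    return max_move_per_interval < threshold < min_centre_to_edge

# Example with assumed numbers: a 0.4 m x 0.6 m box moving at up to 0.1 m/s,
# positioned every 0.1 s, checked against a 5 cm threshold.
print(threshold_is_valid(0.05, max_speed=0.1, positioning_interval=0.1,
                         box_width=0.4, box_length=0.6))   # prints True
```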
CN202010223928.1A 2020-03-26 2020-03-26 Object tracking method and device, control platform and storage medium Active CN111784224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010223928.1A CN111784224B (en) 2020-03-26 2020-03-26 Object tracking method and device, control platform and storage medium

Publications (2)

Publication Number Publication Date
CN111784224A true CN111784224A (en) 2020-10-16
CN111784224B CN111784224B (en) 2024-08-20

Family

ID=72753058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010223928.1A Active CN111784224B (en) 2020-03-26 2020-03-26 Object tracking method and device, control platform and storage medium

Country Status (1)

Country Link
CN (1) CN111784224B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050185822A1 (en) * 2004-02-20 2005-08-25 James Slaski Component association tracker system and method
CN102509457A (en) * 2011-10-09 2012-06-20 青岛海信网络科技股份有限公司 Vehicle tracking method and device
CN106767829A (en) * 2016-12-31 2017-05-31 佛山潮伊汇服装有限公司 Open country swimming automatic navigation method
CN108230353A (en) * 2017-03-03 2018-06-29 北京市商汤科技开发有限公司 Method for tracking target, system and electronic equipment
CN110046548A (en) * 2019-03-08 2019-07-23 深圳神目信息技术有限公司 Tracking, device, computer equipment and the readable storage medium storing program for executing of face
CN110414447A (en) * 2019-07-31 2019-11-05 京东方科技集团股份有限公司 Pedestrian tracting method, device and equipment

Also Published As

Publication number Publication date
CN111784224B (en) 2024-08-20

Similar Documents

Publication Publication Date Title
CN109724612B (en) AGV path planning method and device based on topological map
US10293483B2 (en) Apparatus and methods for training path navigation by robots
CN109163722B (en) Humanoid robot path planning method and device
CN106682572A (en) Target tracking method, target tracking system and first electronic device
CN106780484A (en) Robot interframe position and orientation estimation method based on convolutional neural networks Feature Descriptor
Bipin et al. Autonomous navigation of generic monocular quadcopter in natural environment
CN112965496B (en) Path planning method and device based on artificial potential field algorithm and storage medium
Chang et al. Accuracy improvement of autonomous straight take-off, flying forward, and landing of a drone with deep reinforcement learning
CN111292352A (en) Multi-target tracking method, device, equipment and storage medium
Xing et al. Autonomous power line inspection with drones via perception-aware mpc
CN110383192A (en) Moveable platform and its control method
EP2899706A1 (en) Method and system for analyzing human behavior in an intelligent surveillance system
CN111784224A (en) Object tracking method and device, control platform and storage medium
Bobkov et al. Vision-based navigation method for a local maneuvering of the autonomous underwater vehicle
CN112987713B (en) Control method and device for automatic driving equipment and storage medium
CN113156961A (en) Driving control model training method, driving control method and related device
CN116540715A (en) Target tracking method, target tracking device, robot, and storage medium
CN112766764A (en) Security monitoring method and device based on intelligent robot and storage medium
Hofmann et al. The Carologistics RoboCup Logistics Team 2018
CN111738152B (en) Image determining method and device, storage medium and electronic device
CN115371685B (en) Method and device for planning dominant path of unmanned equipment in industrial control scene and storage medium
Lőrincz et al. Imitation learning for generalizable self-driving policy with sim-to-real transfer
KR102238780B1 (en) A automatic guiding method for unmanned areal vehicle using object detection
Ji Target tracking trajectory generation for quadrotors in static complex environments
Hossain et al. Follow the Soldiers with Optimized Single-Shot Multibox Detection and Reinforcement Learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant