CN111986224B - Target behavior prediction tracking method and device


Info

Publication number: CN111986224B
Application number: CN202010777700.7A
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN111986224A
Inventor: 郭霄
Assignee (original and current): Seven Seas Shenzhen Technology Co., Ltd.
Legal status: Active (granted)
Prior art keywords: target, behavior prediction, tracked, tracking device, moving

Classifications

    • G06T 7/20: Image analysis; analysis of motion
    • G06N 3/045: Computing arrangements based on biological models; neural networks; architecture; combinations of networks
    • G06Q 10/047: Forecasting or optimisation for administrative or management purposes; optimisation of routes or paths, e.g. travelling salesman problem
    • G06V 40/10: Recognition of human or animal bodies in image or video data, e.g. vehicle occupants or pedestrians; body parts, e.g. hands


Abstract

The embodiment of the invention provides a target behavior prediction tracking method and device. The method acquires an image of a target to be tracked and recognizes it to obtain a pose image; determines a first movement direction from the pose image; acquires the distance between the target to be tracked and the target behavior prediction tracking device at a first time point and a second time point, as a first distance and a second distance respectively; calculates the moving speed of the target to be tracked in the front-rear direction from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device, and determines a second movement direction in the front-rear direction; estimates the target position of the target to be tracked after a preset duration from the first movement direction, the second movement direction, and the moving speed of the target to be tracked; and plans a tracking path based on the coordinates and speed of the target behavior prediction tracking device in a preset planning area. The method uses the movement direction to narrow the planning range of the tracking path, so the tracking path can be planned quickly.

Description

Target behavior prediction tracking method and device
Technical Field
The invention relates to the technical field of tracking robots, in particular to a target behavior prediction tracking method and device.
Background
At present, security prevention and control in public places is becoming widespread. In existing security practice, a personnel tracking device is usually relied on to patrol public places such as airports and stations; once a suspicious person is detected, the device tracks that person, so that suspicious persons can be found and followed in time and security prevention and control is accomplished.
In the tracking process, because the movement of a suspicious person is uncertain, being able to quickly plan a tracking path and follow the person is critical to whether security prevention and control can be carried out effectively.
Disclosure of Invention
The embodiment of the invention provides a target behavior prediction tracking method and device, which aim to achieve the technical effect of quickly planning a tracking path and tracking a person.
In one aspect of the present invention, a target behavior prediction tracking method is provided, applied to a target behavior prediction tracking device that includes a depth camera and an image acquisition device, the method comprising:
while the target behavior prediction tracking device patrols a preset planning area, acquiring an image of a target to be tracked through the image acquisition device;
recognizing the image using a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked;
determining a first movement direction of the target to be tracked in the left-right direction according to the pose represented by the pose image;
acquiring, through the depth camera, the distance of the target to be tracked relative to the target behavior prediction tracking device at a first time point and at a second time point, as a first distance and a second distance respectively;
calculating the moving speed of the target to be tracked in the front-rear direction from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and determining a second movement direction of the target to be tracked in the front-rear direction from the calculated moving speed, where the movement duration is the duration between the first time point and the second time point;
estimating the target position of the target to be tracked after a preset duration according to the first movement direction, the second movement direction, and the moving speed of the target to be tracked;
and planning a tracking path along which the target behavior prediction tracking device moves to the target position, based on the coordinates and speed of the target behavior prediction tracking device in the preset planning area, and tracking the target to be tracked along the tracking path.
Optionally, the step of determining the first movement direction of the target to be tracked in the left-right direction according to the pose represented in the pose image includes:
determining target pose keypoints in the pose image, where the target pose keypoints include: left and right shoulder joint points, left and right hip joint points, and left and right knee joint points;
fitting, by a least squares method, a straight line representing the pose of the target to be tracked from the target pose keypoints;
and determining that the target to be tracked moves rightward when the slope of the fitted line is positive, and that it moves leftward when the slope is negative.
Optionally, the step of calculating the moving speed of the target to be tracked in the front-rear direction using the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and determining the second movement direction of the target to be tracked in the front-rear direction according to the calculated moving speed, includes:
converting the first distance and the second distance into distances along the y direction of a geodetic coordinate system, obtaining a first test distance and a second test distance;
calculating the moving speed of the target to be tracked in the front-rear direction from the first test distance, the second test distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, using the expression:
v_ty = ((v_ry * dt + D_2y) - D_1y) / dt
where v_ty is the moving speed of the target to be tracked in the front-rear direction, v_ry is the moving speed of the target behavior prediction tracking device in the front-rear direction within the movement duration, dt is the movement duration, D_1y is the first test distance, and D_2y is the second test distance;
and determining that the target to be tracked moves forward when the value of v_ty is positive, and that it moves backward when the value of v_ty is negative.
Optionally, before the step of planning the tracking path along which the target behavior prediction tracking device moves toward the target to be tracked based on the coordinates of the target behavior prediction tracking device in the preset planning area and the target position, the method further includes:
cropping the preset planning area by an expression in which rect denotes the cropping result, (x_r, y_r) the coordinates of the target behavior prediction tracking device in the preset planning area, (x_t, y_t) the coordinates of the target to be tracked in the preset planning area, W the width and H the height of the preset planning area, and V_tx and V_ty the moving speeds of the target to be tracked in the first and second directions.
Optionally, the step of planning the tracking path along which the target behavior prediction tracking device moves to the target position, based on the coordinates and speed of the target behavior prediction tracking device in the preset planning area, includes:
calculating the circle centers of i tracking paths from the speed of the target behavior prediction tracking device and its coordinates in the preset planning area, by an expression in which v denotes the moving speed of the target behavior prediction tracking device, w its rotating speed, θ its steering angle, and (fx, fy) its coordinates; the circle center of the i-th tracking path has radius v_i / w_i, where i indexes the paths, i = 1, 2, 3, …;
determining, from the i computed circle centers and the speed of the target behavior prediction tracking device, i feasible paths along which the target behavior prediction tracking device can move to the target position;
and selecting one of the determined feasible paths as the tracking path along which the target behavior prediction tracking device moves toward the target position.
Optionally, the step of selecting one of the determined feasible paths as the tracking path along which the target behavior prediction tracking device moves toward the target position includes:
determining a speed value range V_s of the target behavior prediction tracking device when moving along each feasible path;
determining, by an expression, a safe speed value range V_a of the target behavior prediction tracking device when moving along each feasible path;
determining, by an expression, the speed value range V_d constrained by the maximum acceleration of the target behavior prediction tracking device when moving along each feasible path;
determining a speed search space Vr, where Vr = V_s ∩ V_a ∩ V_d;
selecting, from the determined feasible paths, the feasible path corresponding to each speed in the speed search space Vr as a candidate path;
and selecting one of the candidate paths as the tracking path along which the target behavior prediction tracking device moves toward the target position.
Optionally, the step of selecting one of the candidate paths as the tracking path along which the target behavior prediction tracking device moves toward the target position includes:
scoring each speed in the speed search space Vr using the evaluation function:
G(v, w) = σ(α * heading(v, w) + β * dist(v, w) + γ * vel(v, w))
where heading(v, w) denotes the degree of alignment between the target behavior prediction tracking device and the target to be tracked, dist(v, w) denotes the distance to the nearest obstacle intersecting the trajectory of the target behavior prediction tracking device, vel(v, w) denotes the speed v of a given trajectory of the target behavior prediction tracking device, and α, β, γ are the weights of the respective terms of the evaluation function;
and taking the candidate path corresponding to the highest-scoring speed as the tracking path along which the target behavior prediction tracking device moves toward the target position.
Optionally, the method further comprises:
after the target behavior prediction tracking device moves to the target position, acquiring face information of the target to be tracked through the image acquisition device.
Optionally, the method further comprises:
acquiring and recording the time at which the face data was captured and the geographic position information represented by the target position.
In still another aspect of the present invention, there is also provided a target behavior prediction tracking device, including a depth camera and an image acquisition device, and further including:
an image acquisition module, connected to the image acquisition device and configured to acquire an image of a target to be tracked through the image acquisition device while the target behavior prediction tracking device patrols a preset planning area;
an image recognition module, connected to the image acquisition module and configured to recognize the image using a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked;
a direction determination module, connected to the image recognition module and configured to determine a first movement direction of the target to be tracked in the left-right direction according to the pose represented by the pose image;
a distance acquisition module, connected to the direction determination module and the depth camera and configured to acquire, through the depth camera, the distance of the target to be tracked relative to the target behavior prediction tracking device at a first time point and at a second time point, as a first distance and a second distance respectively;
a speed calculation module, connected to the distance acquisition module and configured to calculate the moving speed of the target to be tracked from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and to determine a second movement direction of the target to be tracked in the front-rear direction from the calculated moving speed, where the movement duration is the duration between the first time point and the second time point;
a position estimation module, connected to the speed calculation module and configured to estimate the target position of the target to be tracked after a preset duration according to the first movement direction, the second movement direction, and the moving speed;
and a personnel tracking module, connected to the position estimation module and configured to plan a tracking path along which the target behavior prediction tracking device moves toward the target to be tracked, based on the coordinates of the target behavior prediction tracking device in the preset planning area and the target position, and to track the target to be tracked along the tracking path.
According to the target behavior prediction tracking method and device provided by the embodiments of the invention, an image of the target to be tracked is acquired through the image acquisition device while the device patrols a preset planning area; the image is recognized with a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked; a first movement direction of the target in the left-right direction is determined from the pose represented by the pose image; the distance of the target relative to the device is acquired through the depth camera at a first time point and a second time point, as a first distance and a second distance; the moving speed of the target in the front-rear direction is calculated from the first distance, the second distance, the movement duration, and the device's moving speed within that duration, and a second movement direction in the front-rear direction is determined from the calculated speed; the target position after a preset duration is estimated from the first movement direction, the second movement direction, and the target's moving speed; and a tracking path to the target position is planned based on the device's coordinates and speed in the preset planning area, and the target is tracked along it. With this scheme, the movement direction of the target to be tracked can be recognized and used, together with the moving speed, to estimate the target position after the preset duration; that is, the movement direction narrows the planning range of the tracking path, so a tracking path to the target position can be planned quickly.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention. In the drawings:
FIG. 1 is a schematic structural diagram of a first target behavior prediction tracking device according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a target behavior prediction tracking method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of target pose keypoints according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of cropping results according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a second target behavior prediction tracking device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following embodiments and the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent. The exemplary embodiments of the present invention and the descriptions thereof are used herein to explain the present invention, but are not intended to limit the invention.
Referring to fig. 1, a schematic structural diagram of a first target behavior prediction tracking device according to an embodiment of the present invention; in this implementation, the target behavior prediction tracking device includes: a depth camera, an image acquisition device, a searchlight, an antenna, and an emergency stop button;
In this implementation, while the target behavior prediction tracking device patrols a preset planning area, an image of the target to be tracked is acquired through the image acquisition device; the image is recognized with a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked; a first movement direction of the target in the left-right direction is determined from the pose represented by the pose image; the distance of the target relative to the device is acquired through the depth camera at a first time point and a second time point, as a first distance and a second distance respectively; the moving speed of the target in the front-rear direction is calculated from the first distance, the second distance, the movement duration (the duration between the two time points), and the device's moving speed within the movement duration, and a second movement direction in the front-rear direction is determined from the calculated speed; the target position after a preset duration is estimated from the first movement direction, the second movement direction, and the target's moving speed; and, based on the device's coordinates and speed in the preset planning area, a tracking path to the target position is planned and the target is tracked along it. In this way, the movement direction of the target narrows the planning range of the tracking path, so a tracking path to the target position can be planned quickly.
Referring to fig. 2, a schematic flow chart of a target behavior prediction tracking method provided by an embodiment of the present invention; the method is applied to a target behavior prediction tracking device that includes a depth camera and an image acquisition device, and comprises the following steps:
s200, in the process that the target behavior prediction tracking device performs inspection in a preset planning area, an image of a target to be tracked is acquired through image acquisition equipment.
In implementation, the preset planning area may be a public area such as an airport or a station. Specifically, before the target behavior prediction tracking device starts its patrol, map information of the preset planning area may be loaded into the device, so that it can patrol the preset planning area autonomously.
The target to be tracked may be a person to be tracked, a vehicle to be tracked, or the like.
S210: the image is recognized using a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked.
In implementations, a large number of pose images may be collected to train the neural network model to obtain a pose recognition neural network model.
The specific training process may include: first, inputting an RGB image carrying the pose information of the target person and normalizing it to a uniform resolution; second, applying an 11×11 convolution to the image and recording the result; third, applying local response normalization to the output of the previous step; fourth, applying a 5×5 convolution and recording the result; fifth, applying local response normalization again; sixth, applying a 2×2 pooling operation and recording the result; seventh, applying a 3×3 convolution to the output of the previous step three times; eighth, applying a 2×2 pooling operation; ninth, flattening the data; and tenth, fully connecting the result of the previous step and applying the softmax activation function.
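The sequence above resembles an AlexNet-style image classifier. Below is a minimal PyTorch sketch of such a network for illustration only; the channel counts, strides, input resolution, number of pose classes, and the ReLU activations are assumptions not specified in the text.

```python
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    """Sketch of the pose-recognition network described above (assumed 3x224x224 input)."""
    def __init__(self, num_classes: int = 16):  # number of pose classes is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),  # step 2: 11x11 convolution
            nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5),                           # step 3: local response normalization
            nn.Conv2d(64, 192, kernel_size=5, padding=2),           # step 4: 5x5 convolution
            nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5),                           # step 5: local response normalization
            nn.MaxPool2d(kernel_size=2),                            # step 6: 2x2 pooling
            nn.Conv2d(192, 256, kernel_size=3, padding=1),          # step 7: three 3x3 convolutions
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),                            # step 8: 2x2 pooling
        )
        self.classifier = nn.Linear(256 * 13 * 13, num_classes)     # step 10: fully connected

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)                                     # step 9: flatten
        return torch.softmax(self.classifier(x), dim=1)             # step 10: softmax activation
```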
S220: a first movement direction of the target to be tracked in the left-right direction is determined according to the pose represented by the pose image.
In practice, the left-right direction is the horizontal direction perpendicular to the line of sight of the target to be tracked.
In implementation, target pose keypoints in the pose image may be determined. Referring to fig. 3, a schematic diagram of target pose keypoints provided by an embodiment of the present invention, the target pose keypoints may include: left and right shoulder joint points, left and right hip joint points, and left and right knee joint points;
a straight line representing the pose of the target to be tracked is fitted from the target pose keypoints by a least squares method;
in implementation, with continued reference to fig. 3, to obtain the fitted line, the midpoint of the line connecting the left and right shoulder joint points, the midpoint of the line connecting the left and right hip joint points, and the midpoint of the line connecting the left and right knee joint points may be calculated, and a straight line representing the pose of the target to be tracked is then fitted through the three midpoints by least squares.
The target to be tracked is determined to move rightward when the slope of the fitted line is positive, and leftward when the slope is negative.
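A minimal numpy sketch of this direction test follows. The keypoint format, the image coordinate convention, and the handling of a perfectly upright pose (slope exactly zero) are illustrative assumptions.

```python
import numpy as np

def first_motion_direction(shoulders, hips, knees):
    """Each argument is a ((x_left, y_left), (x_right, y_right)) keypoint pair in pixels."""
    # Midpoints of the shoulder, hip, and knee connecting lines.
    midpoints = np.array([np.mean(pair, axis=0) for pair in (shoulders, hips, knees)])
    # Least-squares fit of x = k*y + b through the three midpoints; fitting x on y
    # keeps the slope finite for a near-vertical body axis.
    k, _ = np.polyfit(midpoints[:, 1], midpoints[:, 0], deg=1)
    if k > 0:
        return "right"
    if k < 0:
        return "left"
    return "upright"  # zero slope: no left-right lean (case not covered by the text)
```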
S230: the distance of the target to be tracked relative to the target behavior prediction tracking device is acquired through the depth camera at a first time point and at a second time point, as a first distance and a second distance respectively.
In practice, the target behavior prediction tracking device controls the depth camera to capture a depth image containing the target to be tracked; the depth information contained in that image indicates the distance between the target to be tracked and the device.
In implementation, the target to be tracked generally moves along its own line of sight; during acquisition, the target behavior prediction tracking device can keep moving at a constant speed along that line of sight and acquire the relative distance at the first time point and at the second time point.
S240: the moving speed of the target to be tracked in the front-rear direction is calculated from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and a second movement direction of the target in the front-rear direction is determined from the calculated speed.
The movement duration is the duration between the first time point and the second time point.
In practice, the front-rear direction is the horizontal direction parallel to the line of sight of the target to be tracked.
In implementation, the first distance and the second distance are data acquired by the target behavior prediction tracking device through the depth camera that represent the distance between the target to be tracked and the device; in the actual calculation, they are first converted into distances along the y direction of a geodetic coordinate system, giving a first test distance and a second test distance.
The moving speed of the target to be tracked in the front-rear direction is then calculated from the first test distance, the second test distance, the movement duration, and the moving speed of the device within the movement duration, using the expression:
v_ty = ((v_ry * dt + D_2y) - D_1y) / dt
where v_ty is the moving speed of the target to be tracked in the front-rear direction, v_ry is the moving speed of the target behavior prediction tracking device in the front-rear direction within the movement duration, dt is the movement duration, D_1y is the first test distance, and D_2y is the second test distance.
The target to be tracked is determined to move forward when the value of v_ty is positive, and backward when the value of v_ty is negative.
In practice, for ease of calculation, the target behavior prediction tracking device may keep a constant speed for the movement duration. Alternatively, if a constant speed is not maintained, v_ry may be obtained as the average speed over the movement duration.
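A short sketch of this speed estimate, with a worked example; the function and argument names are illustrative.

```python
def front_rear_speed(d1y: float, d2y: float, v_ry: float, dt: float):
    """d1y, d2y: first/second test distances (m); v_ry: device front-rear speed (m/s); dt: movement duration (s)."""
    # Target displacement = device displacement (v_ry * dt) plus the change in
    # relative distance (d2y - d1y); dividing by dt gives the target's speed.
    v_ty = ((v_ry * dt + d2y) - d1y) / dt
    direction = "forward" if v_ty > 0 else "backward" if v_ty < 0 else "stationary"
    return v_ty, direction

# Device advances at 1.2 m/s for 2 s while the measured gap grows from 5.0 m
# to 5.6 m, so the target moves forward at (2.4 + 5.6 - 5.0) / 2 = 1.5 m/s.
print(front_rear_speed(5.0, 5.6, 1.2, 2.0))  # (1.5, 'forward')
```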
S250: the target position of the target to be tracked after a preset duration is estimated according to the first movement direction, the second movement direction, and the moving speed of the target to be tracked.
In practice, the target to be tracked usually moves along its line of sight; its lateral speed is negligible and serves only to determine the lateral movement direction. On this basis, the moving speed of the target to be tracked can also be taken as the front-rear moving speed v_ty.
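Under that simplification, the position estimate reduces to a short extrapolation along the front-rear axis. The sketch below assumes a geodetic frame in which y is the target's line-of-sight direction, and keeps the left-right direction only as a label for narrowing the planning region later.

```python
def estimate_target_position(x_t: float, y_t: float, v_ty: float, preset_duration: float):
    """Extrapolate the target's position after the preset duration (coordinates in m)."""
    return (x_t, y_t + v_ty * preset_duration)  # signed v_ty encodes forward/backward
```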
S260: based on the coordinates and speed of the target behavior prediction tracking device in the preset planning area, a tracking path along which the device moves to the target position is planned, and the target to be tracked is tracked along that path.
In practice, to reduce the planning range of the tracking path, the preset planning area may be cropped by an expression in which rect denotes the cropping result, (x_r, y_r) the coordinates of the target behavior prediction tracking device in the preset planning area, (x_t, y_t) the coordinates of the target to be tracked in the preset planning area, W the width and H the height of the preset planning area, and V_tx and V_ty the moving speeds of the target to be tracked in the first and second directions.
This cropping strategy shrinks the preset planning area and thereby the planning range of the tracking path. Referring to fig. 4, a schematic diagram of cropping results provided by an embodiment of the present invention: the left diagram shows the cropping result for V_tx > 0, V_ty > 0, and the right diagram the result for V_tx < 0, V_ty < 0.
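The cropping expression itself is not reproduced in this text, so the sketch below is only a guessed reconstruction from fig. 4: keep the axis-aligned sub-rectangle of the W x H area that lies ahead of the device in the target's direction of motion. Every detail here is an assumption.

```python
def crop_planning_area(x_r: float, y_r: float, W: float, H: float,
                       v_tx: float, v_ty: float):
    """Return rect as (x_min, y_min, x_max, y_max) in the planning-area frame."""
    x_min, x_max = (x_r, W) if v_tx > 0 else (0.0, x_r)  # keep the half ahead in x
    y_min, y_max = (y_r, H) if v_ty > 0 else (0.0, y_r)  # keep the half ahead in y
    return (x_min, y_min, x_max, y_max)
```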
In implementation, based on the motion dynamics of the target behavior prediction tracking device, the circle centers of i candidate paths can be calculated from the device's speed and its coordinates in the preset planning area, by an expression in which v denotes the moving speed of the target behavior prediction tracking device, w its rotating speed, θ its steering angle, and (fx, fy) its coordinates; the center of the i-th path has radius v_i / w_i, where i indexes the paths, i = 1, 2, 3, ….
The i feasible paths along which the target behavior prediction tracking device can move to the target position are then determined from the i computed circle centers and the device's speed; in practice, the device's speed is used to calculate the radius of each feasible path, and the radius and circle center together determine the trajectory of the feasible path.
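The patent's own center expression is likewise not reproduced in the text; the sketch below uses the standard differential-drive relation, in which the instantaneous center of rotation lies at distance r = v/w to the side of the heading. The sign convention is an assumption.

```python
import math

def arc_center(fx: float, fy: float, theta: float, v: float, w: float):
    """Circle center and radius of the arc driven at linear speed v and angular speed w."""
    r = v / w                       # arc radius; w != 0 assumed (w == 0 is a straight path)
    cx = fx - r * math.sin(theta)   # standard instantaneous center of rotation
    cy = fy + r * math.cos(theta)
    return cx, cy, abs(r)
```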
One of the determined feasible paths is selected as the tracking path along which the target behavior prediction tracking device moves toward the target position.
In one implementation, a feasible path may be selected arbitrarily as the tracking path; alternatively, the shortest feasible path, or the feasible path farthest from obstacles, may be selected.
In practice, each selectable path is determined by a speed of the target behavior prediction tracking device, so the device's speed space can be downsampled; this reduces the speed value range, the number of selectable paths to plan, and the amount of calculation.
Specifically, the speed value range V_s of the target behavior prediction tracking device when moving along each feasible path can be determined;
the safe speed value range V_a of the device when moving along each feasible path is determined by an expression;
the speed value range V_d constrained by the maximum acceleration of the device when moving along each feasible path is determined by another expression;
a speed search space Vr is then determined, where Vr = V_s ∩ V_a ∩ V_d;
from the determined feasible paths, the feasible path corresponding to each speed in the speed search space Vr is selected as a candidate path, and one of the candidate paths is selected as the tracking path along which the target behavior prediction tracking device moves toward the target position.
A dot above a variable in the above expressions denotes the time derivative of that variable.
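The expressions for V_a and V_d are not reproduced in the text; the quantities named above (safe speeds, maximum accelerations, time derivatives) match the standard Dynamic Window Approach, so the sketch below uses the standard DWA forms as an assumption: a speed is safe if the device can still stop before the nearest obstacle, and reachable if it lies within one control period of the current speed.

```python
import math

def velocity_search_space(vs, v_now, w_now, dist_fn, v_dot_max, w_dot_max, dt):
    """vs: (v, w) pairs sampled from V_s; dist_fn(v, w): distance to the nearest obstacle on that arc."""
    vr = []
    for v, w in vs:
        d = dist_fn(v, w)
        admissible = (v <= math.sqrt(2.0 * d * v_dot_max) and
                      abs(w) <= math.sqrt(2.0 * d * w_dot_max))              # V_a: stoppable in time
        reachable = (v_now - v_dot_max * dt <= v <= v_now + v_dot_max * dt and
                     w_now - w_dot_max * dt <= w <= w_now + w_dot_max * dt)  # V_d: dynamic window
        if admissible and reachable:                                         # Vr = V_s ∩ V_a ∩ V_d
            vr.append((v, w))
    return vr
```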
In implementation, when selecting one of the candidate paths as the tracking path along which the target behavior prediction tracking device moves toward the target position, each speed in the speed search space Vr may be scored using the evaluation function:
G(v, w) = σ(α * heading(v, w) + β * dist(v, w) + γ * vel(v, w))
where heading(v, w) denotes the degree of alignment between the target behavior prediction tracking device and the target to be tracked, dist(v, w) denotes the distance to the nearest obstacle intersecting the device's trajectory, vel(v, w) denotes the speed v of a given trajectory of the device, and α, β, γ are the weights of the respective terms of the evaluation function. The candidate path corresponding to the highest-scoring speed is taken as the tracking path along which the target behavior prediction tracking device moves toward the target position.
The alignment degree heading(v, w) can be expressed as 180 - θ, where θ is the angle between the heading of the target behavior prediction tracking device and the direction of the target to be tracked; the larger this angle, the worse the device is aligned with, and the less directly it approaches, the target to be tracked, and the smaller the value of heading(v, w).
The larger the value of dist(v, w), the farther the target behavior prediction tracking device is from obstacles on the path, and the safer the trajectory.
vel(v, w) represents the speed v of a given trajectory of the target behavior prediction tracking device. In implementation, the closer this speed is to the device's optimal moving speed, the better the device performs and the faster it closes in on the target to be tracked; vel(v, w) can therefore be scored by how close the trajectory speed is to the optimal moving speed, with closer speeds scoring higher.
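Putting the three terms together, a sketch of the scoring step; the weight values and the constant σ here are illustrative choices, not values given by the text.

```python
def best_velocity(vr, heading_fn, dist_fn, vel_fn,
                  alpha=0.8, beta=0.1, gamma=0.1, sigma=1.0):
    """Score every (v, w) in the search space Vr and return the best pair."""
    def score(vw):
        v, w = vw
        return sigma * (alpha * heading_fn(v, w)
                        + beta * dist_fn(v, w)
                        + gamma * vel_fn(v, w))
    return max(vr, key=score)  # the arc of the best-scoring (v, w) is the tracking path
```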
In implementation, when the target to be tracked is a person, the image acquisition device can further capture the face information of the target to be tracked after the target behavior prediction tracking device has moved to the target position; the time at which the face data was captured and the geographic position information of the target position can also be acquired and recorded. In one implementation, the face data and the geographic position information may be encrypted with an RSA encryption algorithm before being stored.
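A minimal sketch of the storage-encryption idea using the `cryptography` package. Plain RSA-OAEP only fits small payloads, so a real system would typically wrap a symmetric key for the bulky face image; here only a short record of capture time and position (both values invented for illustration) is encrypted.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
record = b"2020-08-05T12:00:00Z;22.5431N,114.0579E"   # capture time; latitude/longitude

ciphertext = private_key.public_key().encrypt(record, oaep)  # store ciphertext at rest
assert private_key.decrypt(ciphertext, oaep) == record       # recover when authorized
```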
In implementation, while tracking the target to be tracked, the target behavior prediction tracking device can also send alarm information containing the face data and position of the person to be tracked to security personnel in real time, so that the security personnel can handle the situation subsequently.
Referring to fig. 5, a schematic structural diagram of a second target behavior prediction tracking device according to an embodiment of the present invention; the target behavior prediction tracking device includes a depth camera 500 and an image acquisition device 510, and further includes:
an image acquisition module 520, connected to the image acquisition device 510 and configured to acquire an image of a target to be tracked through the image acquisition device while the target behavior prediction tracking device patrols a preset planning area;
an image recognition module 530, connected to the image acquisition module 520 and configured to recognize the image using a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked;
a direction determination module 540, connected to the image recognition module 530 and configured to determine a first movement direction of the target to be tracked in the left-right direction according to the pose represented by the pose image;
a distance acquisition module 550, connected to the direction determination module 540 and the depth camera 500 and configured to acquire, through the depth camera, the distance of the target to be tracked relative to the target behavior prediction tracking device at a first time point and at a second time point, as a first distance and a second distance respectively;
a speed calculation module 560, connected to the distance acquisition module 550 and configured to calculate the moving speed of the target to be tracked from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and to determine a second movement direction of the target to be tracked in the front-rear direction from the calculated speed, where the movement duration is the duration between the first time point and the second time point;
a position estimation module 570, connected to the speed calculation module 560 and configured to estimate the target position of the target to be tracked after a preset duration according to the first movement direction, the second movement direction, and the moving speed;
and a personnel tracking module 580, connected to the position estimation module 570 and configured to plan a tracking path along which the target behavior prediction tracking device moves toward the target to be tracked, based on the coordinates of the target behavior prediction tracking device in the preset planning area and the target position, and to track the target to be tracked along the tracking path.
In implementation, the direction determination module 540 is further configured to:
determine target pose keypoints in the pose image, where the target pose keypoints include: left and right shoulder joint points, left and right hip joint points, and left and right knee joint points;
fit, by a least squares method, a straight line representing the pose of the target to be tracked from the target pose keypoints;
and determine that the target to be tracked moves rightward when the slope of the fitted line is positive, and leftward when the slope is negative.
In implementation, the speed calculation module 560 is further configured to:
convert the first distance and the second distance into distances along the y direction of a geodetic coordinate system, obtaining a first test distance and a second test distance;
calculate the moving speed of the target to be tracked in the front-rear direction from the first test distance, the second test distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, using the expression:
v_ty = ((v_ry * dt + D_2y) - D_1y) / dt
where v_ty is the moving speed of the target to be tracked in the front-rear direction, v_ry is the moving speed of the target behavior prediction tracking device in the front-rear direction within the movement duration, dt is the movement duration, D_1y is the first test distance, and D_2y is the second test distance;
and determine that the target to be tracked moves forward when the value of v_ty is positive, and backward when the value of v_ty is negative.
In an implementation, the apparatus further comprises a region cropping module, configured to:
crop the preset planning area by an expression in which rect denotes the cropping result, (x_r, y_r) the coordinates of the target behavior prediction tracking device in the preset planning area, (x_t, y_t) the coordinates of the target to be tracked in the preset planning area, W the width and H the height of the preset planning area, and V_tx and V_ty the moving speeds of the target to be tracked in the first and second directions.
In implementation, the personnel tracking module 580 includes:
a circle center calculation unit, configured to calculate the circle centers of i tracking paths from the speed of the target behavior prediction tracking device and its coordinates in the preset planning area, by an expression in which v denotes the moving speed of the target behavior prediction tracking device, w its rotating speed, θ its steering angle, and (fx, fy) its coordinates; the circle center of the i-th tracking path has radius v_i / w_i, where i indexes the paths, i = 1, 2, 3, …;
a path determination unit, configured to determine, from the i computed circle centers and the speed of the target behavior prediction tracking device, i feasible paths along which the target behavior prediction tracking device can move to the target position;
and a path selection unit, configured to select one of the determined feasible paths as the tracking path along which the target behavior prediction tracking device moves toward the target position.
In an implementation, the path selection unit includes:
a first determination subunit, configured to determine a speed value range V_s of the target behavior prediction tracking device when moving along each feasible path;
a second determination subunit, configured to determine, by an expression, a safe speed value range V_a of the target behavior prediction tracking device when moving along each feasible path;
a third determination subunit, configured to determine, by an expression, the speed value range V_d constrained by the maximum acceleration of the target behavior prediction tracking device when moving along each feasible path;
a fourth determination subunit, configured to determine a speed search space Vr, where Vr = V_s ∩ V_a ∩ V_d;
a first selection subunit, configured to select, from the determined feasible paths, the feasible path corresponding to each speed in the speed search space Vr as a candidate path;
and a second selection subunit, configured to select one of the candidate paths as the tracking path along which the target behavior prediction tracking device moves toward the target position.
In an implementation, the second selection subunit is further configured to:
score each speed in the speed search space Vr using the evaluation function:
G(v, w) = σ(α * heading(v, w) + β * dist(v, w) + γ * vel(v, w))
where heading(v, w) denotes the degree of alignment between the target behavior prediction tracking device and the target to be tracked, dist(v, w) denotes the distance to the nearest obstacle intersecting the device's trajectory, vel(v, w) denotes the speed v of a given trajectory of the device, and α, β, γ are the weights of the respective terms of the evaluation function;
and take the candidate path corresponding to the highest-scoring speed as the tracking path along which the target behavior prediction tracking device moves toward the target position.
In an implementation, the apparatus further comprises:
an information acquisition module, configured to acquire face information of the target to be tracked through the image acquisition device after the target behavior prediction tracking device has moved to the target position.
In an implementation, the apparatus further comprises:
an information recording module, configured to acquire and record the time at which the face data was captured and the geographic position information represented by the target position.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The embodiments in this specification are described in a progressive manner; identical or similar parts of the embodiments may be referred to mutually, and each embodiment focuses on its differences from the others. In particular, the device embodiments are described relatively briefly because they are substantially similar to the method embodiments; for the relevant parts, refer to the description of the method embodiments.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A target behavior prediction tracking method, applied to a target behavior prediction tracking device, wherein the target behavior prediction tracking device comprises a depth camera and an image acquisition device, the method comprising:
while the target behavior prediction tracking device patrols a preset planning area, acquiring an image of a target to be tracked through the image acquisition device;
recognizing the image using a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked;
determining a first movement direction of the target to be tracked in the left-right direction according to the pose represented by the pose image;
acquiring, through the depth camera, the distance of the target to be tracked relative to the target behavior prediction tracking device at a first time point and at a second time point, as a first distance and a second distance respectively;
calculating the moving speed of the target to be tracked in the front-rear direction from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and determining a second movement direction of the target to be tracked in the front-rear direction from the calculated moving speed, wherein the movement duration is the duration between the first time point and the second time point;
estimating the target position of the target to be tracked after a preset duration according to the first movement direction, the second movement direction, and the moving speed of the target to be tracked;
and planning a tracking path along which the target behavior prediction tracking device moves to the target position, based on the coordinates and speed of the target behavior prediction tracking device in the preset planning area, and tracking the target to be tracked along the tracking path.
2. The method of claim 1, wherein the step of determining the first movement direction of the target to be tracked in the left-right direction according to the pose represented in the pose image comprises:
determining target pose keypoints in the pose image, wherein the target pose keypoints comprise: left and right shoulder joint points, left and right hip joint points, and left and right knee joint points;
fitting, by a least squares method, a straight line representing the pose of the target to be tracked from the target pose keypoints;
and determining that the target to be tracked moves rightward when the slope of the fitted line is positive, and that it moves leftward when the slope is negative.
3. The method of claim 1, wherein the step of calculating the moving speed of the target to be tracked in the front-rear direction using the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and determining the second movement direction of the target to be tracked in the front-rear direction based on the calculated moving speed, comprises:
converting the first distance and the second distance into distances along the y direction of a geodetic coordinate system, obtaining a first test distance and a second test distance;
calculating the moving speed of the target to be tracked in the front-rear direction from the first test distance, the second test distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, using the expression:
v_ty = ((v_ry * dt + D_2y) - D_1y) / dt
wherein v_ty is the moving speed of the target to be tracked in the front-rear direction, v_ry is the moving speed of the target behavior prediction tracking device in the front-rear direction within the movement duration, dt is the movement duration, D_1y is the first test distance, and D_2y is the second test distance;
and determining that the target to be tracked moves forward when the value of v_ty is positive, and that it moves backward when the value of v_ty is negative.
4. The method of claim 1, wherein, before the step of planning a tracking path along which the target behavior prediction tracking device moves toward the target to be tracked based on the coordinates of the target behavior prediction tracking device within the preset planning area and the target position, the method further comprises:
cropping the preset planning area by an expression in which rect denotes the cropping result, (x_r, y_r) the coordinates of the target behavior prediction tracking device in the preset planning area, (x_t, y_t) the coordinates of the target to be tracked in the preset planning area, W the width and H the height of the preset planning area, and V_tx and V_ty the moving speeds of the target to be tracked in the first and second directions.
5. The method of claim 1, wherein the step of planning a tracking path along which the target behavior prediction tracking device moves to the target position based on the coordinates and the speed of the target behavior prediction tracking device within a preset planning area comprises:
calculating the circle centers of i tracking paths from the speed of the target behavior prediction tracking device and its coordinates in the preset planning area, using the following expression:
[expression omitted in source]
wherein v represents the moving speed of the target behavior prediction tracking device, w represents its rotating speed, θ represents its steering angle, (fx, fy) are its coordinates, the radius of the i-th tracking-path arc about its circle center is v_i / w_i, and i denotes the path index, i = 1, 2, 3, ...;
determining i feasible paths along which the target behavior prediction tracking device moves to the target position, using the i calculated circle centers and the speed of the target behavior prediction tracking device;
and selecting one of the determined feasible paths as a tracking path along which the target behavior prediction tracking device moves toward the target position.
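The patent's center expression is omitted in the source, so the sketch below uses the standard circular-arc geometry as an assumption: a device at pose (fx, fy, θ) moving at linear speed v and rotating speed w traces a circle of radius r = v / w whose center lies perpendicular to the heading.

    import math

    def arc_center(fx, fy, theta, v, w):
        """Circle center of the arc traced from pose (fx, fy, theta)
        at linear speed v and rotating speed w; radius r = v / w."""
        if abs(w) < 1e-9:
            return None  # w = 0 means straight-line motion, no finite center
        r = v / w
        return (fx - r * math.sin(theta), fy + r * math.cos(theta))

    # one center per candidate pair (v_i, w_i), i = 1, 2, 3, ...
    pairs = [(0.4, 0.2), (0.4, -0.2), (0.6, 0.1)]
    print([arc_center(0.0, 0.0, 0.0, v, w) for v, w in pairs])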
6. The method of claim 5, wherein the step of selecting one of the determined feasible paths as a tracking path along which the target behavior prediction tracking device moves toward the target position comprises:
determining a speed value range V_s of the target behavior prediction tracking device when moving along each feasible path;
determining, using the following expression, a safe speed value range V_a of the target behavior prediction tracking device when moving along each feasible path:
[expression omitted in source]
determining, using the following expression, a speed value range V_d reachable under the maximum acceleration of the target behavior prediction tracking device when moving along each feasible path:
[expression omitted in source]
determining a speed search space V_r, wherein V_r = V_s ∩ V_a ∩ V_d;
selecting, from the determined feasible paths, the feasible path corresponding to each speed in the speed search space V_r as a candidate path;
and selecting one of the candidate paths as a tracking path along which the target behavior prediction tracking device moves toward the target position.
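Claim 6's expressions for V_a and V_d are omitted in the source, so the sketch below assumes the classic dynamic-window forms in their place: V_s as the hardware speed limits, V_a as the speeds from which the device can still stop before the nearest obstacle (v <= sqrt(2 * dist * a_v)), and V_d as the speeds reachable within one control period at maximum acceleration.

    import math

    def speed_search_space(v_now, dist_to_obstacle,
                           v_max=1.0, a_v=0.5, dt=0.1):
        """Intersect V_s, V_a and V_d over the linear speed v;
        all limit values are illustrative, not from the patent."""
        v_s = (0.0, v_max)                                  # V_s
        v_a_hi = math.sqrt(2.0 * dist_to_obstacle * a_v)    # V_a upper bound
        v_d = (v_now - a_v * dt, v_now + a_v * dt)          # V_d
        lo = max(v_s[0], v_d[0])
        hi = min(v_s[1], v_a_hi, v_d[1])
        # V_r = V_s ∩ V_a ∩ V_d; empty if the bounds cross
        return (lo, hi) if lo <= hi else None

    # e.g. device at 0.6 m/s with the nearest obstacle 1 m ahead
    print(speed_search_space(0.6, 1.0))  # -> (0.55, 0.65)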
7. The method of claim 6, wherein the step of selecting one of the candidate paths as a tracking path along which the target behavior prediction tracking device moves toward the target position comprises:
scoring each speed in the speed search space V_r using the following evaluation function:
G(v, w) = σ(α*heading(v, w) + β*dist(v, w) + γ*vel(v, w))
wherein heading(v, w) represents the degree of alignment between the target behavior prediction tracking device and the target to be tracked, dist(v, w) represents the distance from the target behavior prediction tracking device to the nearest obstacle intersecting its trajectory, vel(v, w) represents the speed v of a given trajectory of the target behavior prediction tracking device, and α, β and γ are the weights of the respective terms of the evaluation function;
and taking the candidate path corresponding to the highest-scoring speed as the tracking path along which the target behavior prediction tracking device moves toward the target position.
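A minimal sketch of the evaluation function in claim 7, assuming the three terms are pre-normalized to [0, 1]; the weight values and σ are illustrative, not from the patent:

    def score(v, w, heading_term, dist_term, vel_term,
              alpha=0.8, beta=0.1, gamma=0.1, sigma=1.0):
        """G(v, w) = sigma * (alpha*heading + beta*dist + gamma*vel)."""
        return sigma * (alpha * heading_term + beta * dist_term
                        + gamma * vel_term)

    # dummy candidate trajectories as (v, w, heading, dist, vel) tuples;
    # the highest-scoring (v, w) selects the tracking path
    candidates = [(0.4, 0.2, 0.9, 0.6, 0.4), (0.6, 0.0, 0.7, 0.9, 0.6)]
    best = max(candidates, key=lambda c: score(*c))
    print("chosen (v, w):", best[0], best[1])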
8. The method of any one of claims 1-7, wherein the method further comprises:
and after the target behavior prediction tracking device moves to the target position, acquiring the face information of the target to be tracked through the image acquisition device.
9. The method of claim 8, wherein the method further comprises:
and acquiring and recording the time at which the face information was acquired and the geographic position information represented by the target position.
10. A target behavior prediction tracking device, characterized in that the target behavior prediction tracking device comprises a depth camera and an image acquisition device, and further comprises:
an image acquisition module, connected with the image acquisition device, for acquiring an image of a target to be tracked through the image acquisition device while the target behavior prediction tracking device patrols a preset planning area;
an image recognition module, connected with the image acquisition module, for recognizing the image with a pre-trained pose recognition neural network model to obtain a pose image of the target to be tracked;
a direction determination module, connected with the image recognition module, for determining a first moving direction of the target to be tracked in the left-right direction from the pose represented by the pose image;
a distance acquisition module, connected with the direction determination module and the depth camera, for acquiring, through the depth camera, the distance of the target to be tracked relative to the target behavior prediction tracking device at a first time point and at a second time point, as a first distance and a second distance respectively;
a speed calculation module, connected with the distance acquisition module, for calculating the moving speed of the target to be tracked from the first distance, the second distance, the movement duration, and the moving speed of the target behavior prediction tracking device within the movement duration, and determining a second moving direction of the target to be tracked in the front-rear direction from the calculated moving speed, wherein the movement duration is the duration between the first time point and the second time point;
a position estimation module, connected with the speed calculation module, for estimating the target position of the target to be tracked after a preset duration from the first moving direction, the second moving direction, and the moving speed;
and a personnel tracking module, connected with the position estimation module, for planning a tracking path along which the target behavior prediction tracking device moves toward the target to be tracked based on the coordinates of the target behavior prediction tracking device in the preset planning area and the target position, and tracking the target to be tracked according to the tracking path.
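As a structural illustration of how the modules in claim 10 might chain together, a sketch in which every class, method and attribute name is invented; it reuses the lateral_direction and front_rear_speed helpers sketched under claims 2 and 3, and stands in for the position estimation module with a plain constant-velocity extrapolation:

    class TargetBehaviorPredictionTracker:
        """Hypothetical wiring of the modules in claim 10."""

        def __init__(self, depth_camera, image_device, pose_model, planner):
            self.depth_camera = depth_camera  # yields distances at time points
            self.image_device = image_device  # yields camera frames
            self.pose_model = pose_model      # pre-trained pose recognition model
            self.planner = planner            # path planner of claims 5-7

        def step(self, t1, t2, device_speed_y, target_xy, v_tx, horizon):
            frame = self.image_device.capture()            # image acquisition module
            keypoints = self.pose_model.infer(frame)       # image recognition module
            first_dir = lateral_direction(keypoints)       # direction determination module
            d1 = self.depth_camera.distance_at(t1)         # distance acquisition module
            d2 = self.depth_camera.distance_at(t2)
            v_ty, second_dir = front_rear_speed(d1, d2, t2 - t1, device_speed_y)
            # position estimation module: constant-velocity extrapolation (assumed)
            target_pos = (target_xy[0] + v_tx * horizon,
                          target_xy[1] + v_ty * horizon)
            return self.planner.plan(target_pos)           # personnel tracking module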
CN202010777700.7A 2020-08-05 2020-08-05 Target behavior prediction tracking method and device Active CN111986224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010777700.7A CN111986224B (en) 2020-08-05 2020-08-05 Target behavior prediction tracking method and device

Publications (2)

Publication Number Publication Date
CN111986224A CN111986224A (en) 2020-11-24
CN111986224B true CN111986224B (en) 2024-01-05

Family

ID=73445131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010777700.7A Active CN111986224B (en) 2020-08-05 2020-08-05 Target behavior prediction tracking method and device

Country Status (1)

Country Link
CN (1) CN111986224B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113255534B (en) * 2021-05-28 2022-08-12 河北幸福消费金融股份有限公司 Early warning method, system, device and storage medium based on video image analysis
CN113465505B (en) * 2021-06-28 2024-03-22 七海测量技术(深圳)有限公司 Visual detection positioning system and method
CN113744299B (en) * 2021-09-02 2022-07-12 上海安维尔信息科技股份有限公司 Camera control method and device, electronic equipment and storage medium
CN114202581A (en) * 2021-12-09 2022-03-18 云赛智联股份有限公司 Intelligent control system and method for fish prohibition in Yangtze river
CN115512479B (en) * 2022-09-09 2024-04-09 北海市冠标智慧声谷科技有限责任公司 Method for managing reception information and back-end equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016034008A1 (en) * 2014-09-04 2016-03-10 华为技术有限公司 Target tracking method and device
CN106289180A (en) * 2015-05-21 2017-01-04 中兴通讯股份有限公司 The computational methods of movement locus and device, terminal
CN106682572A (en) * 2016-10-12 2017-05-17 纳恩博(北京)科技有限公司 Target tracking method, target tracking system and first electronic device
CN106651916A (en) * 2016-12-29 2017-05-10 深圳市深网视界科技有限公司 Target positioning tracking method and device
CN107643752A (en) * 2017-05-09 2018-01-30 清研华宇智能机器人(天津)有限责任公司 Omni-directional mobile robots path planning algorithm based on pedestrian track prediction
CN110086992A (en) * 2019-04-29 2019-08-02 努比亚技术有限公司 Filming control method, mobile terminal and the computer storage medium of mobile terminal
CN110018689A (en) * 2019-05-15 2019-07-16 福州大学 A kind of more virtual target point overall situation active path planning algorithms based on dynamic window

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A local path planning algorithm fusing pedestrian prediction information; Zhao Qing; Geomatics and Information Science of Wuhan University; Vol. 45, No. 5; pp. 667-674 *
Research on a target tracking algorithm for quadrotor aircraft based on monocular vision; Zhang Liguo; Acta Metrologica Sinica; Vol. 39, No. 3; pp. 342-347 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant