CN113313819A - Visualization method for a space-time trajectory - Google Patents

Visualization method for a space-time trajectory

Info

Publication number
CN113313819A
CN113313819A
Authority
CN
China
Prior art keywords
dimensional
track
space
time
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110644063.0A
Other languages
Chinese (zh)
Inventor
大方
刘志超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qingzhou Zhihang Technology Co ltd
Original Assignee
Beijing Qingzhou Zhihang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qingzhou Zhihang Technology Co ltd
Priority to CN202110644063.0A
Publication of CN113313819A
Legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of automatic driving of vehicles, in particular to a visualization method for a space-time trajectory, which comprises the following steps: step S1, obtaining the two-dimensional track points in a two-dimensional space-time track of a vehicle in an automatic driving application scene; step S2, calculating the spatial coordinate of each two-dimensional track point in the two-dimensional space-time track; step S3, generating a three-dimensional track point in three-dimensional space according to the abscissa, ordinate and spatial coordinate of each two-dimensional track point in the two-dimensional space-time track; step S4, forming a three-dimensional track curve of the vehicle from all the generated three-dimensional track points; step S5, rendering the three-dimensional track curve of the vehicle in three-dimensional space. The invention also relates to a visualization method for a vehicle body contour space-time trajectory, an apparatus for visualizing a space-time trajectory, and a computer-readable storage medium.

Description

Visualization method for a space-time trajectory
Technical Field
The invention relates to the field of automatic driving of vehicles, in particular to a visualization method for a space-time trajectory.
Background
Visualization is a technique that transforms complex, abstract information and data into a form that developers can analyze intuitively and conveniently. In an automatic driving system, the intermediate results computed by each module usually require different forms of visualization. Trajectory planning is an important technical link in an automatic driving system: it computes the driving trajectory of the vehicle from information such as the driving decision and the destination. The trajectory describes the spatial position the autonomous vehicle is expected to reach at each future time, and can be regarded as a curve in space-time.
In addition to the trajectory, the motion of an autonomous vehicle is often described with a path and a speed profile. The path represents the positions the vehicle passes through in space, independent of speed. For example, a vehicle traveling on a straight road has a straight path, and the path is the same regardless of the speed, whether the vehicle brakes, or even whether it stops during the drive. The speed profile represents the distance traveled by the vehicle over time, independent of the path. For example, when the vehicle approaches its destination and brakes to decelerate, the speed profile is the same gradually decelerating curve regardless of whether the lane is curved or straight. The path and the speed profile taken together are equivalent to the trajectory; the trajectory is a complete description of the vehicle's position in space-time.
Trajectory planning techniques can be roughly classified into space-time decoupled planning and space-time joint planning, according to whether a decomposition of the degrees of freedom is performed. Space-time decoupled planning usually decomposes the space-time trajectory into a path and a speed profile, and reduces the difficulty of each sub-problem by solving the path and the speed separately, but it sacrifices, to a certain extent, the optimality of the resulting trajectory. Space-time joint planning does not perform this decomposition and directly solves for the optimal space-time trajectory, so a higher-quality trajectory can be obtained, but the requirements on the adaptability, robustness and computational efficiency of the solver are higher.
Existing trajectory visualization methods are generally based on path visualization in a world coordinate system; because the path curve corresponds directly to positions in space, such visualization is intuitive. The speed information in the trajectory, beyond the path itself, is usually visualized by drawing the path as a strip with a width, with the speed reflected in the width of the strip. There are also methods that visualize by plotting coordinate and speed information in a two-dimensional chart.
In space-time joint planning, complete visualization of trajectories is crucial to development efficiency. A complete trajectory contains both spatial information (the path) and temporal information (the speed); the two are indispensable, influence each other, and need to be visualized synchronously. As shown in fig. 1, the autonomous vehicle (vehicle A) and the test vehicle (vehicle B) interact at an intersection and pass through it in a staggered manner. Viewed in space alone, the paths of the two vehicles intersect; viewed in time alone, the two vehicles interact within the same time period. Therefore, from either the spatial or the temporal dimension alone, it cannot be accurately analyzed whether the autonomous vehicle and the test vehicle collide, or in what manner they complete the interaction. Obviously, the traditional path-based visualization method cannot meet such research and development needs. In the development of space-time joint planning algorithms, the need for a novel space-time synchronized trajectory visualization method is particularly evident.
Trajectories in space-time joint optimization can be expressed in various forms; the two most common are coordinates in a Cartesian frame as functions of time, (x(t), y(t)), and coordinates relative to the road reference line in a Frenet frame, (s(t), l(t)), as functions of time. Each has its merits and its own use scenarios. The visualization method proposed by the present invention is applicable to both.
Disclosure of Invention
The invention aims to provide a visualization method for a space-time trajectory. Compared with the traditional method, the method provided by the invention helps developers to intuitively observe the mutual influence of path and speed and to judge the exact positional relationship between the vehicle and other objects at each moment. The technical solution of the invention is as follows:
A visualization method for a space-time trajectory, comprising the following steps:
s1, obtaining two-dimensional track points p in a two-dimensional space-time track of a vehicle in an automatic driving application scene;
p=(x,y,t);
wherein x and y are respectively the horizontal and vertical coordinates of the two-dimensional track point p, and t is time;
s2, calculating the space coordinate of each two-dimensional track point p in the two-dimensional space-time track;
step S3, correspondingly generating a three-dimensional track point p' in a three-dimensional space according to the horizontal and vertical coordinates and the space coordinates of each two-dimensional track point p in the two-dimensional space-time track;
step S4, forming a three-dimensional track curve { p '} of the vehicle by using all the generated three-dimensional track points p';
and step S5, rendering the three-dimensional track curve { p' } of the vehicle in a three-dimensional space.
Further, the step S2 includes the following steps:
step S21, obtaining the characteristic speed of the two-dimensional track point p in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step S22, multiplying the space coordinate conversion coefficient v̄ by the time t to obtain the space coordinate of the two-dimensional track point p.
Further, in step S3, for one three-dimensional track point p' in the three-dimensional space:
p' = (x, y, v̄·t);
wherein the spatial coordinate conversion coefficient v̄ is set to a constant.
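For illustration (a worked numerical example added here, assuming the constant takes the preferred value of 10 m/s given below): a two-dimensional track point p = (x, y, t) = (35.0 m, 12.0 m, 2.5 s) maps to the three-dimensional track point p' = (x, y, v̄·t) = (35.0 m, 12.0 m, 25.0 m); the point is therefore drawn 25 m above the ground plane, and the height axis directly encodes the time at which the vehicle is expected to occupy that position.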
A visualization method for a vehicle body contour space-time trajectory comprises the following steps:
step s1, determining the vehicle body contour, comprising: determining a two-dimensional outline frame of the vehicle according to the size of the vehicle body;
step s2, according to the set track sampling time points T_i, obtaining the two-dimensional track sampling points p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene:
p_i = (x_i, y_i, t_i);
wherein i = 0, 1, 2, …; x_i and y_i are respectively the abscissa and ordinate of the two-dimensional track sampling point p_i, and t_i is the sampling time;
step s3, calculating the spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
step s4, correspondingly generating a three-dimensional track sampling point p_i' in three-dimensional space according to the abscissa, ordinate and spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
Step s5, connecting the vertexes of the corresponding vehicle body contours of the adjacent three-dimensional track sampling points in sequence to fill gaps between the adjacent three-dimensional track sampling points, and finally obtaining a three-dimensional track curve of the vehicle;
and step s6, rendering the three-dimensional track curve of the vehicle in the three-dimensional space.
Further, the step s3 includes the steps of:
step s31, obtaining the characteristic speed of the two-dimensional track sampling point p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step s32, multiplying the space coordinate conversion coefficient v̄ by the track sampling time to calculate the conversion height h_i = v̄·t_i;
step s33, taking the conversion height h_i as the spatial coordinate of the two-dimensional track sampling point p_i.
Further, the spatial coordinate conversion coefficient v̄ is set to a constant.
An apparatus for visualizing a space-time trajectory, the apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of at least one of the above methods when executing the computer program.
A computer-readable storage medium in which a computer program is stored which, when executed by a processor, carries out the steps of at least one of the above methods.
The invention has the beneficial effects that: the core of the visualization method for a space-time trajectory is to convert the time dimension into the vertical (height) dimension in space, so that the whole trajectory can be displayed completely and uniformly in the world coordinate system. Therefore, compared with the traditional method, the method provided by the invention helps developers to intuitively observe the mutual influence of path and speed and to judge the exact positional relationship between the vehicle and other objects at each moment.
Drawings
FIG. 1 is a prior art path diagram of an autonomous vehicle (vehicle A) and a test vehicle (vehicle B) interacting at an intersection;
FIG. 2 is a flow diagram of the visualization method for a space-time trajectory according to an embodiment of the invention;
FIGS. 3a and 3b are three-dimensional effect diagrams of the visualization method for a space-time trajectory according to an embodiment of the invention;
FIG. 4 is a flow chart of the visualization method for a space-time trajectory according to another embodiment of the present invention;
FIGS. 5a and 5b are three-dimensional effect diagrams of the visualization method for a space-time trajectory according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The core of the invention is to convert the time dimension into the vertical (height) dimension in space, so that the whole trajectory can be displayed completely and uniformly in the world coordinate system.
In the field of autonomous driving, the spatial dimension of a trajectory is usually only two-dimensional, since the movement of the vehicle is confined to the ground. Modern visualization tools all have three-dimensional rendering capability, that is, three spatial dimensions in the world coordinate system are available for visualization; for such a trajectory, the height dimension would otherwise go unused.
The invention uses the spatial dimension of height to represent time: the time value of each point in the trajectory is converted into a height value through a conversion coefficient (whose dimension is speed), thereby converting the trajectory from a space-time curve into a curve in three-dimensional space.
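As a minimal illustration of this conversion (a sketch only; the function name, data layout, and the choice of Python are assumptions made here, not specified by the patent), the following lifts a list of (x, y, t) track points into three-dimensional points (x, y, v̄·t):

```python
from typing import List, Tuple

def spacetime_to_3d(trajectory_2d: List[Tuple[float, float, float]],
                    v_bar: float = 10.0) -> List[Tuple[float, float, float]]:
    """Convert 2D space-time track points (x, y, t) into 3D points (x, y, z).

    The time value of each point is projected onto the height axis via
    z = v_bar * t, where v_bar is the space coordinate conversion
    coefficient (dimension: speed, e.g. the constant 10 m/s used below).
    """
    return [(x, y, v_bar * t) for (x, y, t) in trajectory_2d]

# A vehicle moving at a constant 5 m/s along the x-axis, sampled every 0.5 s.
points_2d = [(5.0 * t, 0.0, t) for t in (0.0, 0.5, 1.0, 1.5, 2.0)]
points_3d = spacetime_to_3d(points_2d, v_bar=10.0)
# points_3d == [(0.0, 0.0, 0.0), (2.5, 0.0, 5.0), (5.0, 0.0, 10.0),
#               (7.5, 0.0, 15.0), (10.0, 0.0, 20.0)]
```

A stationary vehicle would produce a vertical line of points (only z grows), while a fast vehicle produces a flatter, more inclined curve, matching the examples discussed in embodiment 1.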
Example 1
As shown in fig. 2, the present invention provides a visualization method for a space-time trajectory, comprising the following steps:
s1, obtaining two-dimensional track points p in a two-dimensional space-time track of a vehicle in an automatic driving application scene;
p=(x,y,t);
wherein x and y are respectively the horizontal and vertical coordinates of the two-dimensional track point p, and t is time;
s2, calculating the space coordinate of each two-dimensional track point p in the two-dimensional space-time track;
step S3, correspondingly generating a three-dimensional track point p' in a three-dimensional space according to the horizontal and vertical coordinates and the space coordinates of each two-dimensional track point p in the two-dimensional space-time track;
step S4, forming a three-dimensional track curve { p '} of the vehicle by using all the generated three-dimensional track points p';
and step S5, rendering the three-dimensional track curve { p' } of the vehicle in a three-dimensional space.
Further, the step S2 includes the following steps:
step S21, obtaining the characteristic speed of the two-dimensional track point p in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step S22, multiplying the space coordinate conversion coefficient v̄ by the time t to obtain the space coordinate of the two-dimensional track point p.
Further, in step S3, for one three-dimensional track point p' in the three-dimensional space:
p' = (x, y, v̄·t);
wherein the spatial coordinate conversion coefficient v̄ is the coefficient by which the time t is projected onto the height axis (z-axis) in a linear relation, giving the spatial coordinate z = v̄·t; that is, the height on the z-axis simply represents time multiplied by the conversion coefficient. For example: if a vehicle is stationary somewhere, its trajectory in the 3D coordinates is, by the above definition, a vertical cylinder; if a vehicle runs at high speed, its trajectory in the 3D coordinates is a relatively flat cylinder, and the greater the speed, the more inclined the trajectory cylinder; conversely, the lower the speed, the more vertical the trajectory cylinder.
Therefore, the method of the present invention can easily represent the trajectory of an object at any speed (uniform speed, variable speed, etc.).
Preferably, the spatial coordinate conversion coefficient v̄ is set to a constant, for example 10 m/s.
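A hedged sketch of steps S1 to S5 using matplotlib's 3D plotting (the rendering library, the scenario data, and all names here are illustrative assumptions; the patent does not prescribe a particular visualization tool):

```python
import matplotlib.pyplot as plt

V_BAR = 10.0  # space coordinate conversion coefficient, m/s (constant, as above)

def render_trajectory_3d(trajectory_2d, ax, **plot_kwargs):
    """Steps S2-S5: lift each (x, y, t) to (x, y, V_BAR * t) and draw the curve."""
    xs = [p[0] for p in trajectory_2d]
    ys = [p[1] for p in trajectory_2d]
    zs = [V_BAR * p[2] for p in trajectory_2d]  # time projected onto the height axis
    ax.plot(xs, ys, zs, **plot_kwargs)

# Two vehicles whose paths cross at (20, 0) but at different times.
vehicle_a = [(2.5 * t, 0.0, float(t)) for t in range(11)]           # slower: reaches (20, 0) at t = 8 s
vehicle_b = [(20.0, -40.0 + 8.0 * t, float(t)) for t in range(11)]  # faster: reaches (20, 0) at t = 5 s

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
render_trajectory_3d(vehicle_a, ax, label="vehicle A")
render_trajectory_3d(vehicle_b, ax, label="vehicle B")
ax.set_xlabel("x [m]")
ax.set_ylabel("y [m]")
ax.set_zlabel("z = v_bar * t [m]")
ax.legend()
plt.show()
```

Because vehicle A reaches the crossing point three seconds later, its curve passes above vehicle B's at that (x, y) location, which is the kind of spatial separation along the z-axis that fig. 3a and 3b illustrate.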
An apparatus for visualizing a space-time trajectory, the apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of at least one of the above methods when executing the computer program.
A computer-readable storage medium in which a computer program is stored which, when executed by a processor, carries out the steps of at least one of the above methods.
Fig. 3a and 3b show three-dimensional effect diagrams of the visualization method for a space-time trajectory according to embodiment 1 of the present invention. As shown in fig. 3a and 3b: at the same location (i.e., the intersection of the driving tracks of vehicles A and B in the two-dimensional map), if the track of vehicle A lies above that of vehicle B, then under the proposed visualization vehicle A arrives at the meeting location later than vehicle B; the Z-axis coordinate of vehicle A's 3D track at the meeting place is larger, i.e., the time required to reach it is longer.
The novel visualization method provided by embodiment 1 of the invention helps developers clearly observe the complete interaction process of two objects. For example, in fig. 3a and 3b it can be clearly seen that the trajectory of the autonomous vehicle (vehicle A) is above that of the test vehicle (vehicle B), i.e., the autonomous vehicle passes through the intersection after the test vehicle, which corresponds to the autonomous vehicle yielding. Such a space-time relationship cannot be observed directly with a traditional visualization method; developers have to reach the conclusion through indirect reasoning, which increases their mental burden.
From the visualization of embodiment 1 of the present invention, richer motion states can also be observed intuitively. The inclination of the trajectory curve toward the horizontal in three-dimensional space is proportional to the speed, and its rate of change (curvature) reflects the acceleration. As can be seen from fig. 3a and 3b, although the autonomous vehicle and the test vehicle are both close to the interaction point of the intersection, the autonomous vehicle is slower and is accelerating, and therefore passes through the intersection later than the test vehicle.
Example 2
One of the key indicators in evaluating the interaction of an autonomous vehicle with other objects is whether a collision occurs (i.e., whether the two vehicles arrive at the intersection of their tracks in the two-dimensional map at the same time), and how far they are from colliding. When two vehicles interact in a 2D view, it is difficult to determine whether a collision occurs, or what the minimum distance between the two vehicles is over the whole process. With the 3D trajectory, whether a collision occurs can be determined visually, and the moment of minimum distance can be found.
In the visualization method provided by the embodiment of the invention, it is easy to add visualization of the collision volume of the vehicle and other objects: the object contour polygon only needs to be expanded in the horizontal direction at each track point.
In the embodiment of the invention, the order in which the two vehicles reach the intersection point is reflected in the Z-axis coordinate. By expanding the trajectory into a vehicle body contour 3D trajectory, the contour size of the vehicle can also be taken into account in collision detection, making the collision detection more accurate.
As shown in fig. 4, the present invention provides a visualization method for a vehicle body contour space-time trajectory, comprising the following steps:
step s1, determining the vehicle body contour, comprising: determining a two-dimensional outline frame of the vehicle according to the size of the vehicle body;
step s2, according to the set track sampling time points T_i, obtaining the two-dimensional track sampling points p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene:
p_i = (x_i, y_i, t_i);
wherein i = 0, 1, 2, …; x_i and y_i are respectively the abscissa and ordinate of the two-dimensional track sampling point p_i, and t_i is the sampling time;
step s3, calculating the spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
step s4, correspondingly generating a three-dimensional track sampling point p_i' in three-dimensional space according to the abscissa, ordinate and spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
Step s5, connecting the vertexes of the corresponding vehicle body contours of the adjacent three-dimensional track sampling points in sequence to fill gaps between the adjacent three-dimensional track sampling points, and finally obtaining a three-dimensional track curve of the vehicle;
For example, in the ideal case where the temporal sampling of the track is dense, it is sufficient to expand each track point independently.
In practice, if the time sampling is sparse and there is an obvious gap between adjacent track sampling points, the gap in time can be filled by connecting corresponding vertices of the contours at adjacent sampling times in the vertical direction (height, which is also the time direction); a code sketch after the detailed steps below illustrates this.
And step s6, rendering the three-dimensional track curve of the vehicle in the three-dimensional space.
Further, the step s3 includes the steps of:
step s31, obtaining the characteristic speed of the two-dimensional track sampling point p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step s32, multiplying the space coordinate conversion coefficient v̄ by the track sampling time to calculate the conversion height h_i = v̄·t_i;
step s33, taking the conversion height h_i as the spatial coordinate of the two-dimensional track sampling point p_i.
Further, the spatial coordinate conversion coefficient v̄ is set to a constant, for example 10 m/s.
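A minimal sketch of steps s1 to s5 (an illustration under added assumptions: a rectangular body outline of fixed size and a per-sample heading angle, neither of which is specified in this form by the patent). It lifts each sampled contour to its conversion height and connects corresponding vertices of adjacent contours to form the side faces that fill the temporal gaps:

```python
import math
from typing import List, Tuple

V_BAR = 10.0  # space coordinate conversion coefficient, m/s

def body_contour(x: float, y: float, heading: float,
                 length: float = 4.8, width: float = 1.9) -> List[Tuple[float, float]]:
    """Step s1: rectangular 2D outline of the vehicle body, centered at (x, y)."""
    c, s = math.cos(heading), math.sin(heading)
    corners = [(length / 2, width / 2), (length / 2, -width / 2),
               (-length / 2, -width / 2), (-length / 2, width / 2)]
    return [(x + c * dx - s * dy, y + s * dx + c * dy) for dx, dy in corners]

def contour_trajectory_3d(samples: List[Tuple[float, float, float, float]]):
    """Steps s2-s5 for samples (x_i, y_i, t_i, heading_i).

    Returns the lifted contours (one polygon per sampling time, at height
    V_BAR * t_i) and the quadrilateral side faces that connect corresponding
    vertices of adjacent contours, filling the gap between sampling times.
    """
    contours = []
    for x, y, t, heading in samples:
        z = V_BAR * t  # conversion height h_i
        contours.append([(cx, cy, z) for cx, cy in body_contour(x, y, heading)])

    faces = []
    for lower, upper in zip(contours, contours[1:]):  # adjacent sampling times
        n = len(lower)
        for k in range(n):  # connect corresponding vertices
            faces.append([lower[k], lower[(k + 1) % n],
                          upper[(k + 1) % n], upper[k]])
    return contours, faces
```

The returned polygons can be handed to any 3D renderer; with matplotlib, for instance, mpl_toolkits.mplot3d.art3d.Poly3DCollection can draw the faces produced in step s5 before step s6 renders the complete trajectory.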
An apparatus for visualizing a space-time trajectory, the apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of at least one of the above methods when executing the computer program.
A computer-readable storage medium in which a computer program is stored which, when executed by a processor, carries out the steps of at least one of the above methods.
Fig. 5a and 5b show three-dimensional effect diagrams of the visualization method for a space-time trajectory according to embodiment 2 of the present invention.
The results of the collision volume visualization in this embodiment of the invention are shown in fig. 5a and 5b: by visual inspection in three-dimensional space, it can easily be determined whether the future tracks of the two vehicles overlap and collide; at the same time, the minimum distance between the two vehicles over a future period of time is also computed in the visualized result.
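A sketch of the minimum-distance computation referred to above, under the simplifying assumptions that both trajectories are sampled at the same time points and that the distance is measured between vehicle centers rather than between body contours:

```python
import math
from typing import List, Tuple

def min_distance_over_time(traj_a: List[Tuple[float, float, float]],
                           traj_b: List[Tuple[float, float, float]]) -> Tuple[float, float]:
    """Return (minimum distance, time of minimum) for two (x, y, t) trajectories.

    Distances are only compared between samples that share the same time,
    which is what makes the comparison meaningful for collision analysis.
    """
    best_d, best_t = math.inf, float("nan")
    for (xa, ya, ta), (xb, yb, tb) in zip(traj_a, traj_b):
        assert abs(ta - tb) < 1e-9, "trajectories must share sampling times"
        d = math.hypot(xa - xb, ya - yb)
        if d < best_d:
            best_d, best_t = d, ta
    return best_d, best_t

# The crossing scenario sketched for embodiment 1: closest approach is 7.5 m at t = 5 s.
traj_a = [(2.5 * t, 0.0, float(t)) for t in range(11)]
traj_b = [(20.0, -40.0 + 8.0 * t, float(t)) for t in range(11)]
d_min, t_min = min_distance_over_time(traj_a, traj_b)
print(f"minimum distance {d_min:.1f} m at t = {t_min:.1f} s")
```

A distance of zero at some shared time (or, when contours are used, a distance smaller than the combined contour extents) would indicate a collision.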
The above examples show only some embodiments of the present invention, and although they are described in some detail, this should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the appended claims.

Claims (8)

1. A visualization method for a space-time trajectory, comprising the following steps:
s1, obtaining two-dimensional track points p in a two-dimensional space-time track of a vehicle in an automatic driving application scene;
p=(x,y,t);
wherein x and y are respectively the horizontal and vertical coordinates of the two-dimensional track point p, and t is time;
s2, calculating the space coordinate of each two-dimensional track point p in the two-dimensional space-time track;
step S3, correspondingly generating a three-dimensional track point p' in a three-dimensional space according to the horizontal and vertical coordinates and the space coordinates of each two-dimensional track point p in the two-dimensional space-time track;
step S4, forming a three-dimensional track curve { p '} of the vehicle by using all the generated three-dimensional track points p';
and step S5, rendering the three-dimensional track curve { p' } of the vehicle in a three-dimensional space.
2. The visualization method for a space-time trajectory as claimed in claim 1, wherein said step S2 comprises the following steps:
step S21, obtaining the characteristic speed of the two-dimensional track point p in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step S22, multiplying the space coordinate conversion coefficient v̄ by the time t to obtain the space coordinate of the two-dimensional track point p.
3. The visualization method for a space-time trajectory according to claim 2, wherein in said step S3, for a three-dimensional track point p' in three-dimensional space:
p' = (x, y, v̄·t);
wherein the spatial coordinate conversion coefficient v̄ is set to a constant.
4. A visualization method for a vehicle body contour space-time trajectory, characterized by comprising the following steps:
step s1, determining the vehicle body contour, comprising: determining a two-dimensional outline frame of the vehicle according to the size of the vehicle body;
step s2, according to the set track sampling time points T_i, obtaining the two-dimensional track sampling points p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene:
p_i = (x_i, y_i, t_i);
wherein i = 0, 1, 2, …; x_i and y_i are respectively the abscissa and ordinate of the two-dimensional track sampling point p_i, and t_i is the sampling time;
step s3, calculating the spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
step s4, correspondingly generating a three-dimensional track sampling point p_i' in three-dimensional space according to the abscissa, ordinate and spatial coordinate of each two-dimensional track sampling point p_i in the two-dimensional space-time track;
Step s5, connecting the vertexes of the corresponding vehicle body contours of the adjacent three-dimensional track sampling points in sequence to fill gaps between the adjacent three-dimensional track sampling points, and finally obtaining a three-dimensional track curve of the vehicle;
and step s6, rendering the three-dimensional track curve of the vehicle in the three-dimensional space.
5. The visualization method for a vehicle body contour space-time trajectory as claimed in claim 4, wherein said step s3 comprises the following steps:
step s31, obtaining the characteristic speed of the two-dimensional track sampling point p_i in the two-dimensional space-time track of the vehicle in the automatic driving application scene, and taking the characteristic speed as the space coordinate conversion coefficient v̄;
step s32, multiplying the space coordinate conversion coefficient v̄ by the track sampling time to calculate the conversion height h_i = v̄·t_i;
step s33, taking the conversion height h_i as the spatial coordinate of the two-dimensional track sampling point p_i.
6. The visualization method for a vehicle body contour space-time trajectory as defined in claim 5, wherein said spatial coordinate conversion coefficient v̄ is set to a constant.
7. An apparatus for visualizing a space-time trajectory, the apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 6 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202110644063.0A 2021-06-09 2021-06-09 Visualization method for a space-time trajectory Withdrawn CN113313819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110644063.0A CN113313819A (en) 2021-06-09 2021-06-09 Visualization method for time-lapse trajectory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110644063.0A CN113313819A (en) 2021-06-09 2021-06-09 Visualization method for time-lapse trajectory

Publications (1)

Publication Number Publication Date
CN113313819A true CN113313819A (en) 2021-08-27

Family

ID=77378353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110644063.0A Withdrawn CN113313819A (en) 2021-06-09 2021-06-09 Visualization method for time-lapse trajectory

Country Status (1)

Country Link
CN (1) CN113313819A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113885771A (en) * 2021-12-09 2022-01-04 北京车网科技发展有限公司 Visual processing method of track information

Similar Documents

Publication Publication Date Title
CN110658531B (en) Dynamic target tracking method for port automatic driving vehicle
CN110531770B (en) RRT path planning method and system based on improvement
CN108898676B (en) Method and system for detecting collision and shielding between virtual and real objects
EP2209091B1 (en) System and method for object motion detection based on multiple 3D warping and vehicle equipped with such system
CN112389440B (en) Vehicle driving risk prediction method in off-road environment based on vehicle-road action mechanism
US8229249B2 (en) Spatial motion calculation apparatus and method for the same
WO2022016311A1 (en) Point cloud-based three-dimensional reconstruction method and apparatus, and computer device
CN109116867A (en) A kind of unmanned plane during flying barrier-avoiding method, device, electronic equipment and storage medium
CN110197027A (en) A kind of automatic Pilot test method, device, smart machine and server
KR100657937B1 (en) Real time 3 dimensional transformation method for 2 dimensional linear data and apparatus therefor, and real time 3 dimensional visualization method for 2 dimensional linear data and apparatus therefor
CN111771207A (en) Enhanced vehicle tracking
CN109682336B (en) Automatic planning and optimizing method for three-coordinate measurement path for vehicle body precision detection
CN105096381A (en) Collision detecting method using moving three-dimension ship models in navigation channel
JP7306766B2 (en) Target motion information detection method, apparatus, equipment and medium
Mouhagir et al. A markov decision process-based approach for trajectory planning with clothoid tentacles
Agafonov et al. 3D objects detection in an autonomous car driving problem
CN114647246A (en) Local path planning method and system for time-space coupling search
CN113313819A Visualization method for a space-time trajectory
CN117805848A (en) Train loading condition detection system and method
Lyssenko et al. Towards safety-aware pedestrian detection in autonomous systems
Franke et al. Bounding box dataset augmentation for long-range object distance estimation
WO2021109166A1 (en) Three-dimensional laser positioning method and system
JP2023146041A (en) machine learning system
Han et al. Novel cartographer using an oak-d smart camera for indoor robots location and navigation
Lu et al. Rail Track Area Environment Perception Based on Rader Target Gird

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210827)