CN112947401A - Method for displaying perception data in automatic driving system - Google Patents


Info

Publication number
CN112947401A
CN112947401A
Authority
CN
China
Prior art keywords
barrier
obstacle
static
automatic driving
dynamic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911375088.4A
Other languages
Chinese (zh)
Inventor
李�昊
郭坤
张弛
Current Assignee
Xiaomi Automobile Technology Co Ltd
Original Assignee
Shendong Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Shendong Technology Beijing Co ltd filed Critical Shendong Technology Beijing Co ltd
Publication of CN112947401A

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention discloses a method for displaying perception data in an automatic driving system, comprising: acquiring perceived obstacle information, wherein the obstacle is represented by a static fence and a dynamic fence, the static fence and the dynamic fence present corresponding morphological characteristics according to the position information and dwell-time information of the obstacle, and by continuously acquiring obstacle information the automatic driving system continuously and stably displays the changes in the morphological characteristics presented by the static fence and the dynamic fence.

Description

Method for displaying perception data in automatic driving system
Technical Field
The invention relates to the field of driving-assistance displays, and in particular to a method for displaying perception data in an automatic driving system.
Background
The display methods currently applied in automatic driving systems produce unsatisfactory results. For example, when obstacles are marked with 3D boxes, the boxes inevitably jitter because the perception data changes in real time and is difficult to make sufficiently accurate, so the display is very unstable. Marking obstacles with flat 2D boxes compensates to some extent for the negative effects of inaccurate perception data, but the result lacks a sense of space, and when too many obstacles are shown the lack of depth further degrades the visual effect.
Disclosure of Invention
The invention provides a method for displaying perception data in an automatic driving system, aiming to overcome defects such as the unstable display effects of current automatic driving systems.
According to one aspect of the present invention, there is provided a method for displaying perception data in an automatic driving system, comprising: acquiring perceived obstacle information, wherein the obstacle is represented by a static fence and a dynamic fence, the static fence and the dynamic fence present corresponding morphological characteristics according to the position information and dwell-time information of the obstacle, and by continuously acquiring obstacle information the changes in the morphological characteristics presented by the static fence and the dynamic fence are continuously and stably displayed.
Preferably, the static fences include a left static fence and a right static fence, displayed on the front-left and front-right sides of the autonomous vehicle respectively, and the dynamic fence is displayed in front of the autonomous vehicle, perpendicular to the direction of the static fences.
Preferably, the different hues presented by the static fence and the dynamic fence represent the distance between the obstacle and the autonomous vehicle.
Preferably, the closer the hue is to red, the closer the obstacle is to the autonomous vehicle; the closer the hue is to green, the farther away it is.
Preferably, the height of the static fence represents how long the obstacle has stayed: a greater height represents a longer stay, and a lower height a shorter one.
Preferably, continuously and stably displaying the morphological characteristics of the static fence and the dynamic fence comprises: a gradient-tracking data compensation scheme, by which the automatic driving system continuously and stably displays the changes in the morphological characteristics presented by the static fence and the dynamic fence.
Preferably, the automatic driving system is suitable for autonomous vehicles travelling on a fixed route.
Preferably, the automatic driving system is one with perception capability at any automation level from L0 to L5.
According to another aspect of the present invention, there is provided a method for displaying perception data in an automatic driving system, comprising: acquiring perceived obstacle information, wherein the obstacle is represented by a virtual wall, the virtual wall presents corresponding morphological characteristics according to the position information and dwell-time information of the obstacle, and by continuously acquiring obstacle information the changes in the morphological characteristics presented by the virtual wall are continuously and stably displayed.
Preferably, the virtual wall includes a first virtual wall and a second virtual wall.
Preferably, the first virtual wall and the second virtual wall are represented by a static fence and a dynamic fence.
Compared with the prior art, the invention has the following beneficial effects:
The method can project a stable display from inaccurate and/or unstable data while retaining a 3D effect, so it is both visually pleasing and effective for early warning. Representing obstacles with virtual walls, and further with fences, resembles the guard rails of a real road and gives passengers in the autonomous vehicle a sense of safety. Because the number of virtual walls can be set flexibly, the autonomous vehicle can present the most suitable display for different road conditions. Different colors and heights let passengers and/or other viewers easily recognize and be warned of the danger an obstacle poses; using the height of the static fence to represent how long an obstacle has been captured makes the display friendlier and avoids the jarring effect of a static fence that occasionally flickers out of view. A compensation scheme keeps the display continuously stable, free of jitter and discontinuity.
Drawings
The foregoing summary, as well as the following detailed description, will be better understood when read in conjunction with the appended drawings. For the purpose of illustration, certain embodiments of the disclosure are shown in the drawings. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate implementations of systems and apparatus according to the invention and, together with the description, serve to explain the advantages and principles of the invention.
In the drawings:
Fig. 1 is a flow chart of the method for displaying perception data in an automatic driving system according to the present invention.
Fig. 2 is a structural schematic diagram of the method for displaying perception data in the automatic driving system of the present invention.
Detailed Description
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The drawings and written description are provided to guide those skilled in the art in making and using the invention for which patent protection is sought. The invention is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will appreciate that not all features of a commercial embodiment are shown for the sake of clarity and understanding. Those skilled in the art will also appreciate that the development of an actual commercial embodiment incorporating aspects of the present inventions will require numerous implementation-specific decisions to achieve the developer's ultimate goal for the commercial embodiment. While these efforts may be complex and time consuming, these efforts will be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting. For example, use of singular terms, such as "a," "an," and "the" is not intended to limit the number of items. Also, the use of relational terms, such as, but not limited to, "top," "bottom," "left," "right," "upper," "lower," "down," "up," "side," and the like are used in this description with specific reference to the figures for clarity and are not intended to limit the scope of the invention or the appended claims. Furthermore, it will be appreciated that any of the features of the present invention may be used alone, or in combination with other features. Other systems, methods, features and advantages of the invention will be, or will become, apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
Referring to fig. 1 and 2, an embodiment of a method for displaying perception data in an automatic driving system is schematically illustrated according to the present invention.
The method comprises the following specific steps:
step S101: the obstacle vehicle approaches the autonomous vehicle from the front left, the obstacle vehicle being captured by an on-board sensing device, wherein the on-board sensing device comprises an image sensor, an ultrasonic radar, a lidar, and/or a millimeter wave radar, etc.
In some embodiments, the obstacle vehicle includes an approaching autonomous vehicle from a rear and/or other direction.
In some embodiments, the obstacle vehicle further includes humans, animals, and/or other moving and/or stationary objects captured by the on-board sensing device.
In some embodiments, the autonomous vehicle may be a remote control car, a robot dog, a boat, a remote control boat, an aircraft, a drone.
Step S102: the obstacle vehicle is captured by the on-board sensing device, and the captured information about it is displayed as an image by the automatic driving system; the small rectangles 1 enclosed by dotted lines in fig. 2 represent recognized obstacle vehicles. The image is displayed as fences, comprising a dynamic fence F and two side static fences L and R. The dynamic fence F is displayed across the front of the autonomous vehicle and can move left and right. The two side static fences are perpendicular to the dynamic fence F and are displayed on the left and right sides of the autonomous vehicle.
In some embodiments, the objects recognized by the automatic driving system also include people, animals, and/or other moving and/or stationary objects captured by the on-board sensing device.
In some embodiments, the image is presented by objects of other shapes, such as virtual walls, virtual dams, or virtual collision-avoidance devices.
When the on-board sensing device captures no obstacle, no fence is displayed.
When the obstacle vehicle gradually approaches the autonomous vehicle from the front left and has just been captured by the on-board sensing device, the left static fence L appears at a certain height, and its overall color is green.
When the obstacle vehicle approaches further and part of its body enters a warning area preset by the automatic driving system, the left static fence L grows taller while its color gradually changes from green to orange. At the same time, the dynamic fence F appears, displayed in front of the autonomous vehicle according to the part of the body inside the warning area. At this point the dynamic fence F is reddish, meaning the obstacle vehicle is close to the autonomous vehicle.
In the technical solution provided in step S102, the warning area preset by the automatic driving system may be set as follows. Take the horizontal projection line of the front end of the autonomous vehicle as the x axis and the central axis of the autonomous vehicle as the y axis, and divide the whole display area of the automatic driving system into four static regions SL1, SL2, SR1 and SR2 and one dynamic region SF. SL1 and SR1 together constitute the warning area of step S102; the dynamic region SF is a subset of the warning area, its range lying within the area jointly formed by SL1 and SR1. SL2 and SR2 lie on either side of the warning area.
Set the x-direction width of the warning area to the width of the lane in which the autonomous vehicle travels, keeping the position of the y axis unchanged; set the x-direction width of SL1 smaller than that of SR1; and give SL2 and SR2 the same width, here 0.5 meter. The left static fence is then displayed closer to the central axis than the right static fence.
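The patent describes this region layout only in prose; the following Python sketch encodes one reading of it. The lane-width warning area and the 0.5 m SL2/SR2 strips follow the text, while the `Region` type, the function name, and the `sl1_frac` split (how much of the lane SL1 takes) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Region:
    x_min: float  # left boundary along the x axis (metres)
    x_max: float  # right boundary along the x axis (metres)

    @property
    def width(self) -> float:
        return self.x_max - self.x_min

def build_regions(lane_width: float, sl1_frac: float = 0.4,
                  side_width: float = 0.5) -> dict:
    """Split the display area into SL1/SR1 (the warning area, spanning the
    lane width, with SL1 narrower than SR1) plus the outer strips SL2/SR2.
    The y axis (x = 0) is the vehicle's central axis."""
    sl1_w = lane_width * sl1_frac          # SL1 narrower than SR1 (sl1_frac < 0.5)
    sr1_w = lane_width - sl1_w
    return {
        "SL2": Region(-sl1_w - side_width, -sl1_w),
        "SL1": Region(-sl1_w, 0.0),
        "SR1": Region(0.0, sr1_w),
        "SR2": Region(sr1_w, sr1_w + side_width),
    }

regions = build_regions(lane_width=3.5)
```

Setting `sl1_frac` to 0.5 reproduces the variant in which SL1 and SR1 each occupy half of the lane or passage.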
In some embodiments, the x-direction width of SL1 may instead be set greater than that of SR1.
In some embodiments, the widths of SR1 and SR2 may be set to zero, for example when the autonomous vehicle travels on a fixed route with no moving obstacles to its right, e.g. because the right side is a cliff.
In some embodiments, the widths of SL1 and SL2 may be set to zero, as when an autonomous vehicle travels on a fixed route with no moving obstacles to the left of the route.
In some embodiments, the widths of SL2 and SR2 may be set to zero, as for autonomous vehicles traveling on fixed tracks, which typically only need to be concerned with obstacles directly ahead.
In some embodiments, SL1 and SR1 of the four static regions may be arranged as a single area; for example, the x-direction width of the warning area formed by SL1 and SR1 is set to the width of a lane or a passage, with SL1 and SR1 each occupying half of the lane or passage.
In some embodiments, the number of captured obstacles may be represented by the thickness of the fence, a thicker fence representing a larger number of captured obstacles.
When the obstacle vehicle is within static region SL1 and/or SL2, it is captured by the automatic driving system, and after processing, the captured information is displayed as the image represented by the left static fence L in step S102; similarly, information about obstacle vehicles within static regions SR1 and/or SR2 is represented by the right static fence R; and the information captured while the obstacle vehicle is within the warning area is represented by the dynamic fence F.
In some embodiments, the left static fence L and the right static fence R capture an obstacle vehicle simultaneously when it is in the right part of SL1 and/or the left part of SR1; that is, the left and right static fences have an overlapping recognition range. This overlapping range may be set smaller than the x-direction width of the warning area, for example half of it.
In the technical solution provided in step S102, the left static fence L may be generated from the obstacle-vehicle information as follows.
First, all obstacle vehicles within any sensing area C of the display area of the automatic driving system are reduced to an equivalent obstacle point M. For any obstacle point M there is a corresponding point P on the left static fence L, and the positional relation between M and P determines the hue and the height at P.
In the technical solution provided in step S102, the method for reducing all obstacle vehicles in sensing area C to one obstacle point M is: take the average of all perceivable points on the obstacle vehicles as the obstacle point M.
The hue at the corresponding point P on the left static fence L is calculated from the obstacle point M as follows: linearly map the distance from point M to point P into the red-to-green hue interval to obtain the ideal hue of point P.
To make the display more stable and friendly, the actual hue of point P may be set to deviate from the ideal hue by a certain ratio, starting from the hue currently displayed.
The closer the x coordinate of the obstacle point M is to x = 0, the redder the color displayed at point P; conversely, the greener it is.
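The centroid reduction and the distance-to-hue mapping can be sketched as follows. Hue is expressed in degrees with red = 0 and green = 120; those endpoints, the clamp range `d_max`, and the deviation `ratio` are assumed values, since the patent names no concrete scale.

```python
def obstacle_point(points):
    """Equivalent obstacle point M: the average of all perceivable points
    on the obstacle vehicles in sensing area C."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def ideal_hue(distance, d_max, hue_red=0.0, hue_green=120.0):
    """Linearly map the M-to-P distance onto the red..green hue interval:
    0 (red) when M coincides with P, green at d_max and beyond."""
    t = max(0.0, min(1.0, distance / d_max))
    return hue_red + t * (hue_green - hue_red)

def displayed_hue(previous, ideal, ratio=0.3):
    """Actual hue deviates from the ideal by a fixed ratio, starting from
    the hue already on screen, so colour changes stay smooth."""
    return previous + ratio * (ideal - previous)
```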
The height at the corresponding point P on the left static fence L is calculated from the obstacle point M as follows: if an obstacle vehicle is captured in sensing area C, the fence height increases by one unit accordingly, and the maximum height may be capped at a specific upper limit; if no obstacle vehicle is captured in sensing area C, the fence height decreases by one unit, with a minimum height of 0.
It follows from this height calculation that the longer an obstacle vehicle stays in sensing area C, the more the height at point P accumulates and the taller it is finally displayed; conversely, the more it decays and the lower it is displayed. In other words, the height at point P indicates how long the obstacle vehicle has lingered in sensing area C.
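The height rule above amounts to a saturating counter; a minimal sketch, with the unit height and upper limit chosen arbitrarily:

```python
def update_height(height, captured, unit=1.0, h_max=10.0):
    """Raise the fence one unit per frame while an obstacle is captured in
    sensing area C; lower it one unit otherwise; clamp to [0, h_max]."""
    return min(h_max, height + unit) if captured else max(0.0, height - unit)

# The longer the obstacle lingers, the taller point P is finally displayed:
h = 0.0
for frame_has_obstacle in [True, True, True, False]:
    h = update_height(h, frame_has_obstacle)
```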
In the technical solution provided in step S102, the dynamic fence F may be generated from the obstacle-vehicle information as follows.
First, the sensing area SF of the dynamic fence F is set; its maximum range is the common area formed by SL1 and SR1.
In some embodiments, the sensing area of the dynamic fence F may be set to a different range; for example, its x-direction extent may be customized for different road conditions, such as twice the distance between L_x1 and R_x1 in fig. 2, or half of it, or less.
Next, the left and right boundaries of the dynamic fence F are set to the minimum and maximum x coordinates of the body parts of all obstacle vehicles within sensing area SF, respectively.
The hue at the center point of the dynamic fence F is set from the y coordinate of the obstacle point within sensing area SF as follows: linearly map the y coordinate into the red-to-green hue interval to obtain the ideal hue of the center point.
To make the display more stable and friendly, the actual hue of the center point may be set to deviate from the ideal hue by a certain ratio, starting from the hue currently displayed.
The closer the center point of the dynamic fence is to the obstacle vehicle, the redder the color displayed at the center point; conversely, the greener it is.
Through gradient tracking, the color displayed at the center point transitions smoothly and gradually.
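The patent does not spell out its gradient-tracking compensation; one common reading is per-frame exponential smoothing toward the newly perceived value, sketched below with an assumed `rate`:

```python
def gradient_track(displayed, target, rate=0.2):
    """Each frame, move the displayed value a fixed fraction of the way
    toward the freshly perceived target instead of jumping to it, so noisy
    perception data does not make the display jitter."""
    return displayed + rate * (target - displayed)

hue = 0.0
for _ in range(3):            # the target hue jumps to 100; the display eases in
    hue = gradient_track(hue, 100.0)
```

The same update can be applied to fence heights, so the whole display changes continuously even when the underlying data is unstable.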
The hue at the other points of the dynamic fence F is set as follows: linearly map the distance from each point of the dynamic fence F to its center point into the inverted red-to-green hue interval to obtain the hue of each point. To make the display more stable and friendly, a proportionality constant k may be introduced to adjust the mapping ratio.
Thus the dynamic fence F shows a color that grades from red at the middle to green at both sides.
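The inverted center-to-edge hue mapping can be sketched as follows; the red/green hue endpoints and the default value of `k` are assumptions:

```python
def point_hue(x, x_center, half_width, k=1.0, hue_red=0.0, hue_green=120.0):
    """Inverted mapping for the dynamic fence F: red at the fence centre,
    shading linearly to green at the edges. k scales the mapping ratio."""
    t = min(1.0, k * abs(x - x_center) / half_width)
    return hue_red + t * (hue_green - hue_red)
```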
The height at each point of the dynamic fence F is set by introducing four variables: the sine of the point's position, a negative exponential of the position, the y value of the point on the dynamic fence F, and a proportionality constant k.
As a result, the dynamic fence F presents a sine wave that changes over time, and the closer the autonomous vehicle is to the obstacle, the greater the overall height of the dynamic fence F; conversely, the lower it is.
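The boundary rule and the four-variable height rule can be sketched as below. The patent does not give the exact formula combining the sine, the negative exponential, the y value, and k; `fence_height` is one plausible reading in which nearer obstacles (smaller y) produce a taller wave.

```python
import math

def fence_bounds(body_xs):
    """Left/right edges of the dynamic fence F: the min and max x
    coordinates of all obstacle body points inside sensing area SF."""
    return min(body_xs), max(body_xs)

def fence_height(x, t, y, k=0.5):
    """Assumed combination of the four named variables: a sine of the
    position x, animated over time t, attenuated by a negative exponential
    weighted by the y value (distance ahead) and the constant k."""
    return (1.0 + math.sin(x + t)) * math.exp(-k * y)
```

Varying `t` each frame animates the sine wave; the exponential factor raises the whole fence as the obstacle closes in.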
In some embodiments, the perception data is displayed as a semi-transparent overlay.
Step S103: when the obstacle vehicle enters the warning area, the two side static fences L and R change from orange to red, meaning the obstacle vehicle is even closer to the autonomous vehicle. The dynamic fence F rises higher and is displayed in red, indicating that the obstacle vehicle is very close to the autonomous vehicle.
Next, the obstacle vehicle continues toward the right static fence R. The left static fence L turns green, indicating that the left side of the autonomous vehicle is relatively safe, and its height decreases, indicating that no obstacle vehicle is sensed there; at the same time, the right static fence R turns orange, indicating that the obstacle vehicle is close to the right side of the autonomous vehicle.
Next, the obstacle vehicle moves away from the right static fence R, which turns green, indicating the right side is safe, while the left static fence L, still perceiving no obstacle, drops in height until it disappears completely.
On the basis of the above embodiments, another embodiment of the method for displaying perception data in an automatic driving system according to the present invention is described below.
The method comprises the following specific steps:
S201: the obstacle moves from near to far, and the autonomous vehicle acquires obstacle data using sensors such as an image sensor, ultrasonic radar, lidar, and millimeter-wave radar.
S202: when the obstacle is close to the autonomous vehicle, the dynamic fence representing it is tall and reddish in color; refer to S102.
S203: as the obstacle gradually moves away from the autonomous vehicle, the height of the dynamic fence gradually decreases and its color changes from red to green; refer to S103.
On the basis of the above embodiments, another embodiment of the method for displaying perception data in an automatic driving system according to the present invention is described below.
The method comprises the following specific steps:
S301: the obstacle moves from far to near, and the autonomous vehicle acquires obstacle data using sensors such as an image sensor, ultrasonic radar, lidar, and millimeter-wave radar.
S302: when the obstacle is far from the autonomous vehicle, the dynamic fence representing it is low in height and green in color; refer to S102.
S303: as the obstacle gradually approaches the autonomous vehicle, the height of the dynamic fence gradually increases and its color changes from green to red; refer to S103.
On the basis of the above embodiments, another embodiment of the method for displaying perception data in an automatic driving system according to the present invention is described below.
The method comprises the following specific steps:
S401: the obstacle moves from near to far directly in front of the autonomous vehicle, and the autonomous vehicle acquires obstacle data using sensors such as an image sensor, ultrasonic radar, lidar, and millimeter-wave radar.
S402: when the obstacle is close to the autonomous vehicle, the dynamic fence representing it is tall and red. The two side static fences representing it are reddish, indicating the obstacle is close to the autonomous vehicle, and their height keeps increasing, indicating the obstacle remains directly in front of the autonomous vehicle. Refer to S102.
In some embodiments, if the obstacle has not driven off, or has parked directly in front of the autonomous vehicle, the two side static fences may remain at a greater height throughout, indicating the obstacle has stayed for a long time.
S403: as the obstacle gradually moves away from the autonomous vehicle, the height of the dynamic fence gradually decreases and its color changes from reddish to greenish. The color of the two side static fences changes from reddish to greenish, indicating the obstacle is far from the autonomous vehicle, and their height keeps decreasing, indicating the obstacle is gradually moving away. Refer to S103. When the height of the two side static fences drops to zero, the obstacle has left the vicinity of the autonomous vehicle.
On the basis of the above embodiments, another embodiment of the method for displaying perception data in an automatic driving system according to the present invention is described below.
The method comprises the following specific steps:
s501: the obstacle moves from far to near from the front of the autonomous vehicle or the autonomous vehicle gradually approaches the obstacle in front, and the autonomous vehicle acquires obstacle data by using sensors such as an image sensor, an ultrasonic radar, a laser radar, and a millimeter wave radar.
S502: when the barrier is far from the autonomous vehicle, the barrier represented by the dynamic barrier is low in height and green in color. The barrier represented by the static barriers on the two sides is green in color, the barrier is far away from the automatic driving vehicle, and the barrier represented by the static barriers on the two sides is zero in height. Refer to S102.
S503: As the obstacle gradually approaches the autonomous vehicle, or the autonomous vehicle gradually approaches the obstacle, the dynamic barrier rises and its color changes from green to red. The static barriers on both sides likewise change from green to red, indicating that the obstacle is getting closer, and their height keeps increasing, indicating that the obstacle continues to approach the autonomous vehicle. Refer to S103.
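The continuous, smooth display required by claim 6 implies interpolating between discrete perception updates, since sensor data arrives at a lower rate than the display refreshes. The patent names a "gradient tracking" data compensation scheme without disclosing its details; as a hypothetical stand-in, simple exponential smoothing of the barrier height toward each new perception target illustrates the idea:

```python
class BarrierSmoother:
    """Moves the displayed barrier height toward each new perception target
    so the rendering changes continuously between sensor updates.

    This exponential-smoothing scheme is an assumed stand-in for the
    'gradient tracking' data compensation named in claim 6, whose actual
    algorithm the patent does not specify.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # per-frame smoothing factor (assumed value)
        self.height = 0.0    # currently displayed height

    def update(self, target_height):
        # Close a fixed fraction of the remaining gap on each display frame,
        # so abrupt jumps in perception output never appear on screen.
        self.height += self.alpha * (target_height - self.height)
        return self.height
```

For example, with `alpha=0.5`, a sudden jump of the target height from 0 to 1 renders as successive displayed heights 0.5, 0.75, 0.875, ..., converging smoothly rather than snapping.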
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention in any way; all modifications and equivalents of the above embodiments made in accordance with the present invention fall within the scope of the invention.

Claims (8)

1. A method of displaying perception data in an autonomous driving system, comprising:
acquiring perceived obstacle information;
representing the obstacle by a static barrier and a dynamic barrier, the static barrier and the dynamic barrier presenting corresponding morphological characteristics according to position information and dwell-time information of the obstacle; and
continuously and smoothly displaying the changes in the morphological characteristics presented by the static barrier and the dynamic barrier by continuously acquiring the obstacle information.
2. The display method as claimed in claim 1, wherein the static barrier comprises a left static barrier and a right static barrier displayed on the left and right front sides of the autonomous vehicle, respectively, and the dynamic barrier is displayed in front of the autonomous vehicle, perpendicular to the direction of the static barriers.
3. The display method according to claim 1 or 2, wherein the different hues presented by the static barrier and the dynamic barrier represent the distance of the obstacle from the autonomous vehicle.
4. The display method according to claim 3, wherein a hue closer to red indicates that the obstacle is closer to the autonomous vehicle, and a hue closer to green indicates that the obstacle is farther away.
5. The display method according to claim 1 or 2, wherein the dwell time of the obstacle is represented by the height of the static barrier: the greater the height, the longer the obstacle has stayed, and vice versa.
6. The method as claimed in claim 1, wherein continuously and smoothly displaying the changes in the morphological characteristics presented by the static barrier and the dynamic barrier comprises:
the automatic driving system continuously and smoothly displaying the changes in the morphological characteristics presented by the static barrier and the dynamic barrier through a gradient-tracking data compensation scheme.
7. The display method according to any one of claims 1 to 6, wherein the autonomous driving system is an autonomous vehicle adapted to travel on a fixed route.
8. The display method according to any one of claims 1 to 7, wherein the automatic driving system is an automatic driving system having perception capability at any of automation levels L0 to L5.
CN201911375088.4A 2019-12-09 2019-12-27 Method for displaying perception data in automatic driving system Pending CN112947401A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019112530020 2019-12-09
CN201911253002 2019-12-09

Publications (1)

Publication Number Publication Date
CN112947401A 2021-06-11

Family

ID=76234452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911375088.4A Pending CN112947401A (en) 2019-12-09 2019-12-27 Method for displaying perception data in automatic driving system

Country Status (1)

Country Link
CN (1) CN112947401A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012037924A (en) * 2010-08-03 2012-02-23 Aisin Seiki Co Ltd Driving support device
CN103675827A (en) * 2013-11-18 2014-03-26 法雷奥汽车内部控制(深圳)有限公司 Vehicle-mounted radar detection virtual panorama system
CN106314440A (en) * 2015-07-07 2017-01-11 大陆汽车电子(芜湖)有限公司 Vehicle distance display method
CN109017786A (en) * 2018-08-09 2018-12-18 北京智行者科技有限公司 Vehicle obstacle-avoidance method
CN109934164A (en) * 2019-03-12 2019-06-25 杭州飞步科技有限公司 Data processing method and device based on Trajectory Safety degree
CN110210280A (en) * 2019-03-01 2019-09-06 北京纵目安驰智能科技有限公司 A kind of over the horizon cognitive method, system, terminal and storage medium
CN110386152A (en) * 2019-06-17 2019-10-29 江铃汽车股份有限公司 The human-computer interaction display control method and system driven based on L2 grades of intelligence navigators
CN110525360A (en) * 2019-08-26 2019-12-03 广汽蔚来新能源汽车科技有限公司 Auxiliary driving method, device, system and storage medium based on car-mounted terminal



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220805

Address after: Room 618, 6 / F, building 5, courtyard 15, Kechuang 10th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing 100176

Applicant after: Xiaomi Automobile Technology Co.,Ltd.

Address before: 1219, floor 11, SOHO, Zhongguancun, No. 8, Haidian North Second Street, Haidian District, Beijing 100089

Applicant before: SHENDONG TECHNOLOGY (BEIJING) Co.,Ltd.