CN111623795B - Live-action navigation icon display method, device, equipment and medium


Info

Publication number
CN111623795B
CN111623795B (application CN202010469716.1A)
Authority
CN
China
Prior art keywords
icon
live
vehicle
lane line
road
Prior art date
Legal status
Active
Application number
CN202010469716.1A
Other languages
Chinese (zh)
Other versions
CN111623795A (en)
Inventor
黄庆
Current Assignee
Apollo Intelligent Connectivity Beijing Technology Co Ltd
Original Assignee
Apollo Intelligent Connectivity Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Apollo Intelligent Connectivity Beijing Technology Co Ltd filed Critical Apollo Intelligent Connectivity Beijing Technology Co Ltd
Priority to CN202010469716.1A priority Critical patent/CN111623795B/en
Publication of CN111623795A publication Critical patent/CN111623795A/en
Priority to JP2021086777A priority patent/JP7258078B2/en
Priority to KR1020210067095A priority patent/KR102559269B1/en
Application granted granted Critical
Publication of CN111623795B publication Critical patent/CN111623795B/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3632Guidance using simplified or iconic instructions, e.g. using arrows
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3647Guidance involving output of stored or live camera images or video streams
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3667Display of a road map
    • G01C21/367Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose a live-action navigation icon display method, device, equipment and medium, relating to live-action navigation technology. The method comprises the following steps: determining the driving road of a vehicle by matching the positioning coordinates of the vehicle against an electronic map; determining a steering navigation path of the vehicle according to the driving road; fitting the steering navigation path, and generating an indication icon based on the fitted steering navigation path; determining a target lane line of the driving road on the live-action image while the vehicle is driving; and displaying the indication icon on the live-action image according to the positional relationship between the target lane line and the indication icon. The embodiments of the application can solve the problem of unsmooth display of the live-action navigation icon while the vehicle is steering, and can optimize the display effect of the navigation icon during live-action navigation steering.

Description

Live-action navigation icon display method, device, equipment and medium
Technical Field
The embodiments of the application relate to computer technology, in particular to live-action navigation technology, and specifically to a live-action navigation icon display method, device, equipment and medium.
Background
Live-action navigation is the product of combining navigation technology with Augmented Reality (AR) technology; it provides more visual and intuitive navigation guidance so that the user does not get lost.
However, in live-action navigation, particularly when the vehicle turns or changes lanes, the navigation icon is often displayed unsmoothly, and as the vehicle position changes the displayed icon may even deviate from the physical road. How to optimize the display of the live-action navigation icon while the vehicle is steering therefore remains an open problem.
Disclosure of Invention
The embodiment of the application discloses a method, a device, equipment and a medium for displaying a live-action navigation icon, so as to optimize the display effect of the navigation icon in the live-action navigation steering process.
In a first aspect, an embodiment of the present application discloses a method for displaying live-action navigation icons, including:
determining a driving road of a vehicle through matching of the positioning coordinates of the vehicle and an electronic map;
determining a steering navigation path of the vehicle according to the driving road;
fitting the steering navigation path, and generating an indication icon based on the fitted steering navigation path;
determining a target lane line of the driving road on the live-action image in the driving process of the vehicle;
and displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon.
In a second aspect, an embodiment of the present application discloses a live-action navigation icon display device, including:
the driving road determining module is used for determining the driving road of the vehicle through matching of the positioning coordinates of the vehicle and the electronic map;
the steering navigation path determining module is used for determining a steering navigation path of the vehicle according to the running road;
the icon generation module is used for fitting the steering navigation path and generating an indication icon based on the fitted steering navigation path;
the target lane line determining module is used for determining a target lane line of the driving road on the live-action image in the driving process of the vehicle;
and the icon display module is used for displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon.
In a third aspect, an embodiment of the present application further discloses an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of displaying a live-action navigation icon according to any of the embodiments of the present application.
In a fourth aspect, embodiments of the present application further disclose a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the live-action navigation icon display method according to any of the embodiments of the present application.
According to the technical scheme of the embodiment of the application, after the current steering navigation path of the vehicle is obtained in the steering process of the vehicle, the steering navigation path is subjected to fitting processing, and the indication icon is displayed on the live-action image according to the determined position relation between the target lane line and the indication icon, so that the navigation icon display effect in the live-action navigation process is optimized integrally.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a flowchart of a live-action navigation icon display method disclosed according to an embodiment of the present application;
fig. 2 is a flowchart of another live-action navigation icon display method disclosed according to an embodiment of the present application;
fig. 3 is a schematic diagram of a live-action navigation icon display effect disclosed according to an embodiment of the present application;
fig. 4 is a flowchart of yet another live-action navigation icon display method disclosed according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a live-action navigation icon display apparatus disclosed according to an embodiment of the present application;
fig. 6 is a block diagram of an electronic device disclosed according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details should be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a flowchart of a live-action navigation icon display method disclosed according to an embodiment of the present application, which may be applied to displaying navigation icons in real time during live-action navigation, for example when turning or changing lanes. The method of this embodiment may be executed by a live-action navigation icon display device, which may be implemented by software and/or hardware and may be integrated on any electronic device with computing capability, including but not limited to a vehicle-mounted device.
As shown in fig. 1, the method for displaying live-action navigation icons disclosed in this embodiment may include:
s101, determining a driving road of the vehicle through matching of the positioning coordinates of the vehicle and the electronic map.
The positioning coordinates of the vehicle may be obtained by a positioning device on the vehicle, for example by using a Global Positioning System (GPS) receiver to obtain the longitude and latitude coordinates of the vehicle in real time. The electronic map data contains the coordinates of each road, so the road on which the vehicle is currently driving can be determined by matching the positioning coordinates with the electronic map while the vehicle is driving.
Optionally, the determining the driving road of the vehicle by matching the positioning coordinates of the vehicle with the electronic map includes: acquiring a positioning coordinate set of a vehicle within a preset time, and performing data filtering on the positioning coordinate set; and matching the filtered positioning coordinate set with an electronic map to determine the driving road of the vehicle.
The length of the preset time can be set flexibly as required; for example, the positioning data of the vehicle can be acquired periodically during driving to obtain a positioning coordinate set, which is then used for road matching. Determining the driving road from the positioning coordinates collected over the set time improves the stability of road matching. Filtering the positioning coordinate set removes positioning points with large deviations, which helps improve the accuracy of road matching and in turn ensures that the steering navigation path is acquired accurately. The filtering algorithm is not specifically limited in this embodiment; any available algorithm in the prior art may be adopted as long as data filtering can be implemented.
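For illustration only, the filtering step could be as simple as the Python sketch below, which rejects positioning points that jump too far from the previous accepted fix. The function names and the 30 m threshold are hypothetical choices, not values specified by the embodiment, and the subsequent road matching against the electronic map is only indicated in a comment.

```python
import math

def haversine_m(p, q):
    """Approximate distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 6371000 * 2 * math.asin(math.sqrt(a))

def filter_fixes(fixes, max_jump_m=30.0):
    """Drop GPS fixes that deviate sharply from the previously accepted fix."""
    if not fixes:
        return []
    kept = [fixes[0]]
    for p in fixes[1:]:
        if haversine_m(kept[-1], p) <= max_jump_m:
            kept.append(p)
    return kept

# The filtered coordinate set would then be matched against the road geometry
# stored in the electronic map, e.g. by a nearest-road search (not shown here).
```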
And S102, determining a steering navigation path of the vehicle according to the running road.
The steering navigation path is a part of a complete navigation path obtained by using any available path planning algorithm based on the current driving road and the driving destination of the vehicle. The steering navigation path of the vehicle is obtained according to the current driving road of the vehicle, and the road or lane to which the vehicle will drive subsequently can be determined, that is, the steering (changing the driving direction) mentioned in the embodiment includes the situations of vehicle turning, vehicle lane changing and the like.
S103, fitting the steering navigation path, and generating an indication icon based on the fitted steering navigation path.
The indication icon is a navigation icon used for indicating a lane line on the inner side of the vehicle in the vehicle steering process, and can be used for visually prompting the lane line which is currently approached by the vehicle or is about to cross the vehicle. And generating an indication icon based on the current steering navigation path, so that the indication icon is matched with the current steering navigation path, for example, the indication icon and the current steering navigation path are consistent in shape, the correctness of the navigation guidance is ensured, and the world coordinate and the image coordinate of the indication icon can be determined according to the position coordinate included in the steering navigation path. A smooth curve can be obtained by fitting the steering navigation path, and then a current indication icon generated based on the smooth curve can ensure certain display fluency in the aspect of subsequent display, so that the display effect of the indication icon is prevented from being influenced by the appearance of some jumping coordinate points on the steering navigation path.
The fitting algorithm is not specifically limited in this embodiment, and any available algorithm in the prior art can be flexibly selected. Illustratively, the fitting process is performed on the steering navigation path using a Bezier curve algorithm.
The Bezier curve is a basic tool of computer graphics modeling. A curve is created and edited by controlling four points: a start point, an end point, and two separate intermediate control points, with a virtual control line through the middle of the curve playing an important role; the middle of this line crosses the Bezier curve and its ends are the control endpoints. Moving the end points changes the curvature of the curve, while moving the middle points, i.e., the virtual control line, shifts the Bezier curve uniformly with the start point and end point locked. Since all control points and nodes on a Bezier curve can be edited, the Bezier curve algorithm is generally the preferred algorithm in graphic editing.
In addition, as the vehicle travels in real time, the steering navigation path of the vehicle changes, and the indication icon generated from the fitted steering navigation path changes with it, so the indication icon in this embodiment may also be referred to as a dynamic flow mark that presents a dynamically changing effect while the vehicle is driving. The length of the curve corresponding to the indication icon, and the width and height of the indication icon, can be set flexibly. Optionally, the indication icon may be composed of a plurality of preset icon units, for example V-shaped icon units, arranged in sequence along the smooth curve obtained by the fitting process, with the curve as their centre line; a minimal sketch of this fitting and arrangement is given below.
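The following Python sketch illustrates, under simplifying assumptions, how a steering navigation path reduced to four control points could be smoothed with a single cubic Bezier segment and how icon-unit positions and headings could then be placed along the resulting centre line. All names, the single-segment fit and the fixed sample spacing are hypothetical, not requirements of the embodiment.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Sample a cubic Bezier curve defined by four 2-D control points."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def place_icon_units(path_pts, spacing=5):
    """Pick a centre point and heading (from the local tangent) for each
    V-shaped icon unit arranged along the smoothed centre line."""
    units = []
    for i in range(0, len(path_pts) - 1, spacing):
        p = path_pts[i]
        q = path_pts[min(i + spacing, len(path_pts) - 1)]
        heading = np.arctan2(q[1] - p[1], q[0] - p[0])
        units.append((p, heading))
    return units

# Example: a raw turn path reduced to four control points, then smoothed.
raw = np.array([[0.0, 0.0], [8.0, 0.5], [14.0, 4.0], [18.0, 10.0]])
smooth = cubic_bezier(*raw)          # smooth centre line of the flow mark
icon_units = place_icon_units(smooth)
```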
And S104, determining a target lane line of a driving road on the live-action image in the driving process of the vehicle.
During live-action navigation, a camera mounted on the vehicle may acquire live-action images of the driving environment in real time, after which a lane line recognition technique is used to recognize the lane lines in the live-action image; the recognition technique may include, but is not limited to, recognition based on image processing, recognition based on a trained model, and the like. The specific lane line recognition may be executed locally on the device or on a cloud server, which is not specifically limited in this embodiment. The recognition may yield at least one lane line, and the target lane line is the lane line on the inner side of the vehicle during steering, such as turning or changing lanes. During lane line recognition, the target lane line can be determined by combining the driving direction the vehicle is about to take, and whether the vehicle is steering can also be determined by analyzing the deflection angle of the vehicle body. For example, when the vehicle is about to turn right, the lane line closest to the right side of the vehicle on the live-action image is determined as the target lane line; when the vehicle is about to turn left, the lane line closest to the left side of the vehicle on the live-action image is determined as the target lane line.
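As an illustrative sketch only, the selection of the inner-side lane line could be expressed as below; the detector output format, the function name and the use of the lane line's x-position at the image bottom are assumptions, not part of the embodiment.

```python
def select_target_lane_line(lane_lines, turn_direction, vehicle_x):
    """Pick the lane line nearest to the vehicle on the side it is turning toward.

    lane_lines: list of (line_id, x_at_image_bottom) pairs from the detector.
    turn_direction: "left" or "right".
    vehicle_x: image x-coordinate of the vehicle centre (e.g. image_width / 2).
    """
    if turn_direction == "right":
        candidates = [(lid, x) for lid, x in lane_lines if x > vehicle_x]
        return min(candidates, key=lambda c: c[1] - vehicle_x, default=None)
    candidates = [(lid, x) for lid, x in lane_lines if x < vehicle_x]
    return min(candidates, key=lambda c: vehicle_x - c[1], default=None)
```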
And S105, displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon.
The indication icon is used to indicate the target lane line to the user while the vehicle is steering. Displaying the indication icon on the live-action image according to the positional relationship between the indication icon and the target lane line, for example the relationship between their image coordinates on the live-action image, yields a display effect in which the navigation icon fits the live-action scene.
It should be noted that there is no restriction on the execution sequence between operations S101-S103 and operation S104, and the operation sequence shown in fig. 1 should not be understood as a specific restriction on the embodiment.
According to the technical scheme of the embodiment, after the current steering navigation path of the vehicle is obtained in the vehicle steering process, the steering navigation path is fitted to obtain a smooth curve, and then the current indication icon is generated based on the smooth curve, so that the phenomenon that the display effect of the indication icon is influenced by the appearance of some jumping coordinate points on the steering navigation path in the vehicle driving process is avoided, the display smoothness of the indication icon is ensured, and the problem that the display of the live-action navigation icon is not smooth in the vehicle steering process is solved; meanwhile, the indication icon is displayed on the live-action image according to the determined position relation between the target lane line and the indication icon, so that the attaching degree between the indication icon and the lane line is ensured, the display effect that the indication icon deviates from the lane line is avoided, and the display effect of the navigation icon in the live-action navigation process is integrally optimized.
Fig. 2 is a flowchart of another live-action navigation icon display method disclosed according to an embodiment of the present application, which is further optimized and expanded on the basis of the above technical solution and can be combined with the above optional embodiments. As shown in fig. 2, the method may include:
s201, determining a driving road of the vehicle through matching of the positioning coordinates of the vehicle and the electronic map.
S202, determining a steering navigation path of the vehicle according to the running road.
And S203, fitting the steering navigation path.
And S204, drawing the indication icon by using a three-dimensional icon drawing mode according to the shape of the fitted steering navigation path.
The three-dimensional (3D) icon drawing manner may be implemented by a 3D icon drawing tool such as OpenGL. For example, the indication icon may be composed of a plurality of preset icon units, all of which are 3D icon units (for example, 3D V-shaped icon units) arranged in sequence along the smooth curve obtained by the fitting process, with the curve as their centre line.
And S205, determining world coordinates corresponding to the indication icon according to the position coordinates included in the fitted steering navigation path.
Since the indication icon is consistent with the shape of the steering navigation path and can be regarded as a part of it, the position coordinates (i.e., world coordinates) included in the fitted steering navigation path can be used directly as the world coordinates of the indication icon; in particular, the world coordinates of the indication icon include the coordinates of the icon's centre line along its length direction. After the coordinates of the centre line are determined, the world coordinates of all contour lines of the indication icon can be determined by combining the preset width and height of the icon. Of course, within a preset floating display range, i.e., where a small floating change of the world coordinates has no significant adverse effect on the final display of the navigation icon, the position coordinates in the fitted steering navigation path may all be given an appropriate translation before being used to determine the world coordinates of the indication icon.
And S206, determining the image coordinates of the indication icon on the live-action image by using the conversion relation between the world coordinate system and the image coordinate system.
The conversion relationship between the world coordinate system and the image coordinate system may be obtained based on internal reference and external reference of the vehicle-mounted camera, which is not described in detail in this embodiment.
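As a hedged illustration of this conversion, the sketch below projects the icon's world-coordinate contour points into the live-action image with a standard pinhole model; the intrinsic matrix K and the extrinsics R, t would come from calibration of the vehicle-mounted camera, and the function name is hypothetical.

```python
import numpy as np

def project_to_image(points_world, K, R, t):
    """Project Nx3 world points of the indication icon into pixel coordinates
    using a pinhole camera model (K: 3x3 intrinsics; R, t: extrinsics that
    map world coordinates into the camera frame)."""
    pts_cam = R @ points_world.T + t.reshape(3, 1)   # 3xN, camera frame
    pts_img = K @ pts_cam                            # 3xN, homogeneous pixels
    return (pts_img[:2] / pts_img[2]).T              # Nx2 (u, v) image coords

# e.g. icon_uv = project_to_image(icon_world_pts, K, R, t)
```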
And S207, determining a target lane line of a driving road on the live-action image in the driving process of the vehicle.
And S208, displaying the indication icon on the live-action image along the steering direction according to the image coordinates of the target lane line and the indication icon on the live-action image respectively.
In the process of identifying the target lane line, the image coordinates thereof may be determined together. The indication icon is used for indicating the target lane line to the user, so that after the image coordinates of the indication icon and the target lane line are obtained, the indication icon is displayed on the live-action image along the current steering direction, and the display effect that the navigation icon is attached to the live-action can be obtained. It should be noted that there is no limitation on the execution sequence between operations S201-S206 and operation S207, and the operation sequence shown in fig. 2 should not be understood as a specific limitation to the present embodiment.
According to the technical scheme of the embodiment, after the current turning navigation path of the vehicle is obtained in the vehicle turning process, the turning navigation path is subjected to fitting processing to obtain a smooth curve, and then the indication icon is generated based on the smooth curve, so that the phenomenon that the display effect of the indication icon is influenced by the appearance of some jumping coordinate points on the turning navigation path in the vehicle driving process is avoided, the display smoothness of the indication icon is ensured, and the problem that the display of the live-action navigation icon is not smooth in the vehicle turning process is solved; moreover, the shapes of the indication icon and the steering navigation path are consistent, so that the accuracy of navigation guidance is ensured; meanwhile, the indication icon is displayed on the live-action image along the current steering direction based on the image coordinate, so that the attaching degree between the indication icon and the lane line is ensured, the display effect that the indication icon deviates from the lane line is avoided, and the display effect of the navigation icon in the live-action navigation process is integrally optimized.
On the basis of the above technical solution, optionally, after the indication icon is drawn in a three-dimensional icon drawing manner according to the shape of the fitted steering navigation path, the method of this embodiment further includes:
drawing a turning icon with the same direction as the fitted turning navigation path by using a three-dimensional icon drawing mode;
correspondingly, after the indication icon is displayed on the live-action image, the method of this embodiment further includes: displaying the steering icon above the indication icon.
The turn icon is used to indicate that the vehicle is about to change direction of travel. The indication icon and the steering icon are displayed in a 3D mode, so that good stereoscopic impression and sense of reality can be presented, the visual effect of the display of the live-action navigation icon is improved, and good navigation guide is provided for a user.
Fig. 3 is a schematic diagram of the live-action navigation icon display effect, taking a lane change as an example. In fig. 3, the steering icon is displayed above the indication icon, both are 3D icons, and the indication icon is displayed on the live-action image along the current steering direction, i.e., toward the target lane line. Fig. 3 is only a simple illustration of the navigation icon display effect of this embodiment and should not be understood as a specific limitation. A partial diagram of the navigation path may also be displayed on the live-action image, as shown in the lower right of fig. 3, together with information such as the vehicle speed, the current road name, the name of the road to be entered, the driving distance, the driving time, and the electronic eye; this embodiment is not specifically limited in this respect.
Fig. 4 is a flowchart of yet another live-action navigation icon display method disclosed according to an embodiment of the present application, which is further optimized and expanded on the basis of the above technical solution and can be combined with the above optional embodiments. Specifically, fig. 4 takes a curve scene as an example to illustrate the live-action navigation icon display method of this embodiment. As shown in fig. 4, the method may include:
s301, determining a driving road of the vehicle through matching of the positioning coordinates of the vehicle and the electronic map.
And S302, determining a steering navigation path of the vehicle according to the running road.
And S303, fitting the steering navigation path, and generating an indication icon based on the fitted steering navigation path.
S304, recognizing the live-action image in the vehicle driving process by using a preset image recognition algorithm to obtain an initial recognition result of the target lane line on the driving road.
The preset image recognition algorithm is used for recognizing the lane line on the image, and the embodiment is not particularly limited with respect to the specific implementation of the algorithm.
S305, acquiring at least one of a vehicle body deflection angle and road coordinates of a driving road in the electronic map.
S306, correcting the initial recognition result of the target lane line by using at least one of the acquired vehicle body deflection angle and the acquired road coordinate to obtain the target recognition result of the target lane line.
In a curve scene, the vehicle body deflection angle changes with the curvature of the road, so the change in the deflection angle can be used to correct the recognition result of the curve lane line. The body deflection angle can be obtained from an Inertial Measurement Unit (IMU) on the vehicle. The road coordinates are the world coordinates, stored in the electronic map, of the road the vehicle is currently driving on. Changes in the road coordinates reflect changes in the road shape, and the world coordinates of the lane lines of the current road can be estimated from the road coordinates combined with the road width, so the road coordinates of the curve can also be used to correct the recognition result of the curve lane line.
When the vehicle body deflection angle and the road coordinates are used simultaneously to correct the lane line recognition result, the two can be combined by weighted fusion. On the basis of ensuring correction accuracy, the weight assignment may be set flexibly and is not specifically limited in this embodiment. Furthermore, the lane line recognition result can be corrected using an extended Kalman filter algorithm.
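Purely as an illustration of the weighted fusion and the Kalman-style correction mentioned above (the weights, the scalar lateral-offset formulation and all names are assumptions rather than values fixed by the embodiment), a sketch could look like:

```python
def fuse_lane_offsets(vision_offset, geometry_offset, w_vision=0.7, w_geometry=0.3):
    """Weighted fusion of the lateral lane-line offset recognized from the image
    with the offset predicted from the body deflection angle / map road shape."""
    return w_vision * vision_offset + w_geometry * geometry_offset

def kalman_correct(pred, pred_var, meas, meas_var):
    """Scalar Kalman-style update: blend the geometry-based prediction with the
    image-based measurement according to their (assumed) variances."""
    gain = pred_var / (pred_var + meas_var)
    est = pred + gain * (meas - pred)
    return est, (1.0 - gain) * pred_var
```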
The initial recognition result of the lane line is corrected by utilizing at least one of the vehicle body deflection angle and the road coordinate, so that the accuracy of the final recognition result of the target lane line is ensured, the accuracy of the display position of the indication icon is further ensured, and the display effect of the navigation icon fitting the real scene is ensured.
And S307, displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon.
According to the technical scheme of the embodiment, in the vehicle steering process, firstly, after the current steering navigation path of the vehicle is obtained, the steering navigation path is fitted to obtain a smooth curve, and then the indication icon is generated based on the smooth curve, so that the phenomenon that the display effect of the indication icon is influenced by the appearance of some jumping coordinate points on the steering navigation path in the vehicle driving process is avoided, the display smoothness of the indication icon is ensured, and the problem that the display of the live-action navigation icon is not smooth in the vehicle steering process is solved; then, by utilizing at least one of the acquired vehicle body deflection angle and the acquired road coordinate, the initial recognition result of the target lane line is corrected, so that the accuracy of the final recognition result of the target lane line is ensured, the accuracy of the display position of the indication icon is further ensured, and the problem of unsmooth display of the live-action navigation icon caused by inaccurate and unstable recognition results of the lane line can be solved; and finally, displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon, so that the attaching degree between the indication icon and the lane line is ensured, the display effect that the indication icon deviates from the lane line is avoided, and the display effect of the navigation icon in the live-action navigation process is integrally optimized.
In general, the image recognition algorithm can satisfy most lane line recognition situations, and compared with a lane line recognition algorithm based on a trained model it has relatively low computational complexity and consumes relatively few computing resources on the navigation equipment, so it can be used as the preferred lane line recognition method. However, in some special cases, such as night scenes, the acquired images are of poor quality and lane line recognition with the image recognition algorithm may have a large error; in that case, recognition accuracy can be improved by using a lane line recognition algorithm based on a trained model.
On the basis of the above technical solution, optionally, determining a target lane line of a driving road on the live-action image in the driving process of the vehicle includes:
determining a target lane line of a driving road on a live-action image in the driving process of the vehicle by using a pre-trained neural network model; the neural network model is obtained based on sample image training of different image parameters, and the image parameters comprise image brightness and image contrast.
For example, a large number of sample images taken at night with different image parameters may be collected as the input for model training, and the neural network model may be trained with the lane line labeling results on the sample images as the output. The neural network structure that can be adopted is not specifically limited in this embodiment and can be selected flexibly as required. For example, the network may be divided into a front-end network and a back-end network; the front-end (backbone) network may include, but is not limited to, ResNet101, Xception, etc., and the back-end network may include, but is not limited to, an encoder-decoder network such as DeepLabv3+. Multi-layer network connections enhance the expressive power of the model, further improving its robustness and adaptability to night scenes, so that it outputs more accurate lane line recognition results and a better live-action navigation effect can be displayed in night scenes.
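As a hedged sketch of such a model (assuming a PyTorch/torchvision environment; torchvision's DeepLabv3 with a ResNet101 backbone is used here as a stand-in for the encoder-decoder networks named above, and the binary label format and hyper-parameters are illustrative assumptions, not requirements of the embodiment):

```python
import torch
import torchvision

# Binary lane-line segmentation using torchvision's DeepLabv3 + ResNet101.
model = torchvision.models.segmentation.deeplabv3_resnet101(num_classes=2)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, masks):
    """images: Nx3xHxW night-scene samples with varied brightness/contrast;
    masks: NxHxW integer lane-line labels (0 = background, 1 = lane line)."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)["out"]      # Nx2xHxW segmentation logits
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```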
Certainly, for the identification of the lane line, after the local device acquires the live-action image, the local device may further send the live-action image to the cloud server, and the cloud server performs the identification processing of the lane line, for example, the cloud server identifies the live-action image by using a pre-trained neural network model, or identifies the live-action image by using a preset image identification algorithm, determines a target lane line of the current driving road of the vehicle, and then feeds back the identification result of the lane line to the local device, thereby reducing the resource consumption of the local device.
Fig. 5 is a schematic structural diagram of a live-action navigation icon display apparatus according to an embodiment of the present application, which is suitable for displaying navigation icons in real time during live-action navigation, for example when turning or changing lanes. The apparatus can be implemented by software and/or hardware, and can be integrated on any electronic device with computing capability, including but not limited to an in-vehicle device.
As shown in fig. 5, the live-action navigation icon display apparatus 400 disclosed in this embodiment may include a driving road determining module 401, a steering navigation path determining module 402, an icon generating module 403, a target lane line determining module 404, and an icon display module 405, wherein:
a driving road determining module 401, configured to determine a driving road of a vehicle through matching of the positioning coordinates of the vehicle and the electronic map;
a steering navigation path determining module 402, configured to determine a steering navigation path of the vehicle according to a driving road;
an icon generating module 403, configured to perform fitting processing on the turning navigation path, and generate an indication icon based on the fitted turning navigation path;
a target lane line determining module 404, configured to determine a target lane line of a driving road on the live-action image in the driving process of the vehicle;
and an icon display module 405, configured to display the indication icon on the live-action image according to a position relationship between the target lane line and the indication icon.
Optionally, the icon generating module 403 includes:
the fitting processing unit is used for fitting the steering navigation path;
the indication icon drawing unit is used for drawing an indication icon by using a three-dimensional icon drawing mode according to the shape of the fitted steering navigation path;
the world coordinate determination unit is used for determining a world coordinate corresponding to the indication icon according to the position coordinate included in the fitted steering navigation path;
and the image coordinate determination unit is used for determining the image coordinates of the indication icon on the live-action image by utilizing the conversion relation between the world coordinate system and the image coordinate system.
Optionally, the icon display module 405 is specifically configured to:
and displaying the indication icon on the live-action image along the steering direction according to the image coordinates of the target lane line and the indication icon on the live-action image respectively.
Optionally, the icon generating module 403 further includes a turning icon drawing unit, configured to draw a turning icon having the same direction as the fitted turning navigation path in the three-dimensional icon drawing manner after the indication icon drawing unit performs the operation of drawing the indication icon according to the shape of the fitted turning navigation path in the three-dimensional icon drawing manner;
accordingly, the icon display module 405 includes an indication icon display unit and a turning icon display unit, in which:
an indication icon display unit for displaying the indication icon on the live-action image according to a positional relationship between the target lane line and the indication icon;
a steering icon display unit for displaying the steering icon above the indication icon after the indication icon display unit performs an operation of displaying the indication icon on the live-action image.
Optionally, the icon generating module 403 includes:
the fitting processing unit is used for fitting the steering navigation path by using a Bezier curve algorithm;
and the indication icon generating unit is used for generating an indication icon based on the fitted steering navigation path.
Optionally, the driving road determining module 401 includes:
the data filtering unit is used for acquiring a positioning coordinate set of the vehicle within preset time and filtering data of the positioning coordinate set;
and the driving road determining unit is used for matching the positioning coordinate set after the filtering processing with the electronic map to determine the driving road of the vehicle.
Alternatively, if the driving road is a curve, the target lane line determination module 404 includes:
the initial recognition result determining unit is used for recognizing the live-action image in the driving process of the vehicle by using a preset image recognition algorithm to obtain an initial recognition result of a target lane line on a driving road;
the system comprises a deflection angle and road coordinate acquisition unit, a deflection angle and road coordinate acquisition unit and a control unit, wherein the deflection angle and road coordinate acquisition unit is used for acquiring at least one of a vehicle body deflection angle and road coordinates of a driving road in an electronic map;
and the recognition result correction unit is used for correcting the initial recognition result of the target lane line by using at least one of the acquired vehicle body deflection angle and the acquired road coordinate to obtain the target recognition result of the target lane line.
Optionally, the target lane line determining module 404 is specifically configured to:
determining a target lane line of a driving road on a live-action image in the driving process of the vehicle by using a pre-trained neural network model;
the neural network model is obtained based on sample image training of different image parameters, and the image parameters comprise image brightness and image contrast.
The live-action navigation icon display apparatus 400 disclosed in the embodiment of the present application can execute any of the live-action navigation icon display methods disclosed in the embodiments of the present application, and has functional modules and beneficial effects corresponding to the executed methods. For details not explicitly described in this embodiment, reference may be made to the description of any method embodiment of the present application.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 6, fig. 6 is a block diagram of an electronic device for implementing the live-action navigation icon display method of an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the embodiments of the present application described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 501, a memory 502, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory, to display graphical information for a Graphical User Interface (GUI) on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used with multiple memories, as desired. Likewise, multiple electronic devices may be connected, with each device providing part of the necessary operations, e.g., as a server array, a group of blade servers, or a multi-processor system. In fig. 6, one processor 501 is taken as an example.
The memory 502 is a non-transitory computer readable storage medium provided by the embodiments of the present application. The memory stores instructions executable by the at least one processor, so that the at least one processor executes the method for displaying the live-action navigation icon provided by the embodiment of the application. The non-transitory computer-readable storage medium of the embodiments of the present application stores computer instructions for causing a computer to execute the live-action navigation icon display method provided by the embodiments of the present application.
The memory 502, which is a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the live-action navigation icon display method in the embodiment of the present application, for example, the driving road determination module 401, the turning navigation path determination module 402, the icon generation module 403, the target lane line determination module 404, and the icon display module 405 shown in fig. 5. The processor 501 executes various functional applications and data processing of the electronic device by running non-transitory software programs, instructions and modules stored in the memory 502, that is, implements the live-action navigation icon display method in the above method embodiment.
The memory 502 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 502 may optionally include a memory remotely disposed from the processor 501, and these remote memories may be connected to an electronic device for implementing the live-action navigation icon display method in the present embodiment through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for implementing the live-action navigation icon display method in the embodiment of the present application may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or other means, and fig. 6 illustrates the connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus implementing the live-action navigation icon display method of this embodiment; examples include a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, and a joystick. The output device 504 may include a display device, an auxiliary lighting device such as a Light Emitting Diode (LED), a tactile feedback device (for example, a vibration motor), and the like. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), an LED display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, Integrated circuitry, Application Specific Integrated Circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs, also known as programs, software applications, or code, include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or Device for providing machine instructions and/or data to a Programmable processor, such as a magnetic disk, optical disk, memory, Programmable Logic Device (PLD), including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device for displaying information to a user, for example, a Cathode Ray Tube (CRT) or an LCD monitor; and a keyboard and a pointing device, such as a mouse or a trackball, by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical scheme of the embodiment of the application, after the current steering navigation path of the vehicle is obtained in the vehicle steering process, the steering navigation path is subjected to fitting processing to obtain a smooth curve, and then the indication icon is generated based on the smooth curve, so that the phenomenon that the display effect of the indication icon is influenced by the appearance of some jumping coordinate points on the steering navigation path in the vehicle driving process is avoided, the display smoothness of the indication icon is ensured, and the problem that the display of the live-action navigation icon is not smooth in the vehicle steering process is solved; meanwhile, the indication icon is displayed on the live-action image according to the determined position relation between the target lane line and the indication icon, so that the attaching degree between the indication icon and the lane line is ensured, and the display effect that the indication icon deviates from the lane line is avoided; therefore, the display effect of the navigation icons in the live-action navigation process is optimized integrally.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A live-action navigation icon display method is characterized by comprising the following steps:
determining a driving road of a vehicle through matching of the positioning coordinates of the vehicle and an electronic map;
determining a steering navigation path of the vehicle according to the driving road;
fitting the steering navigation path, and generating an indication icon based on the fitted steering navigation path;
determining a target lane line of the driving road on the live-action image in the driving process of the vehicle;
displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon;
wherein, in a case that the driving road is a curve, determining a target lane line of the driving road on the live-action image in the driving process of the vehicle includes:
recognizing the live-action image in the vehicle driving process by using a preset image recognition algorithm to obtain an initial recognition result of a target lane line on the driving road;
acquiring at least one of a vehicle body deflection angle and a road coordinate of the driving road in the electronic map;
and correcting the initial recognition result of the target lane line by using at least one of the acquired vehicle body deflection angle and the acquired road coordinate to obtain the target recognition result of the target lane line.
2. The method of claim 1, wherein generating an indicator icon based on the fitted steering navigation path comprises:
drawing the indication icon by using a three-dimensional icon drawing mode according to the shape of the fitted steering navigation path;
determining a world coordinate corresponding to the indication icon according to the position coordinate included in the fitted steering navigation path;
and determining the image coordinates of the indication icon on the live-action image by using the conversion relation between the world coordinate system and the image coordinate system.
3. The method according to claim 2, wherein displaying the indication icon on the live-action image according to a positional relationship between the target lane line and the indication icon includes:
and displaying the indication icon on the live-action image along a steering direction according to the image coordinates of the target lane line and the indication icon on the live-action image respectively.
4. The method of claim 2, wherein after the indication icon is drawn by using the three-dimensional icon drawing mode according to the shape of the fitted steering navigation path, the method further comprises:
drawing a turning icon with the same direction as the fitted turning navigation path by using the three-dimensional icon drawing mode;
correspondingly, after the indication icon is displayed on the live-action image, the method further comprises: displaying the turning icon above the indication icon.
5. The method of claim 1, wherein fitting the steering navigation path comprises:
and fitting the steering navigation path by using a Bezier curve algorithm.
6. The method of claim 1, wherein determining the driving road of the vehicle by matching the positioning coordinates of the vehicle with the electronic map comprises:
acquiring a positioning coordinate set of the vehicle within a preset time, and performing data filtering on the positioning coordinate set;
and matching the filtered positioning coordinate set with the electronic map to determine the driving road of the vehicle.
7. The method of claim 1, wherein determining a target lane line of the driving road on the live-action image during the driving of the vehicle comprises:
determining a target lane line of the driving road on the live-action image in the driving process of the vehicle by using a pre-trained neural network model;
the neural network model is obtained by training on sample images with different image parameters, and the image parameters comprise image brightness and image contrast.
8. A live-action navigation icon display device, comprising:
the driving road determining module is used for determining the driving road of the vehicle through matching of the positioning coordinates of the vehicle and the electronic map;
the steering navigation path determining module is used for determining a steering navigation path of the vehicle according to the running road;
the icon generation module is used for fitting the steering navigation path and generating an indication icon based on the fitted steering navigation path;
the target lane line determining module is used for determining a target lane line of the driving road on the live-action image in the driving process of the vehicle;
the icon display module is used for displaying the indication icon on the live-action image according to the position relation between the target lane line and the indication icon;
wherein, in the case where it is determined that the driving road is a curve, the target lane line determining module includes:
the initial recognition result determining unit is used for recognizing the live-action image in the driving process of the vehicle by using a preset image recognition algorithm to obtain an initial recognition result of a target lane line on the driving road;
the deflection angle and road coordinate acquisition unit is used for acquiring at least one of a vehicle body deflection angle and road coordinates of the driving road in the electronic map;
and the recognition result correction unit is used for correcting the initial recognition result of the target lane line by using at least one of the acquired vehicle body deflection angle and the acquired road coordinate to obtain the target recognition result of the target lane line.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the live-action navigation icon display method according to any one of claims 1-7.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to execute the live-action navigation icon display method according to any one of claims 1-7.
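Claim 6 above filters the positioning coordinate set collected over a preset time before matching it against the electronic map, without fixing a particular filter. A minimal sketch, assuming a sliding median filter (an illustrative choice rather than the disclosed one), is shown below:

```python
import numpy as np

def filter_positions(coords: np.ndarray, window: int = 5) -> np.ndarray:
    """Median-filter a positioning coordinate set collected over a preset time.

    coords: (N, 2) array of positioning fixes, e.g. (longitude, latitude).
    A sliding median suppresses isolated positioning jumps before the filtered
    set is matched against the electronic map to determine the driving road.
    """
    filtered = np.empty_like(coords)
    half = window // 2
    for i in range(len(coords)):
        lo, hi = max(0, i - half), min(len(coords), i + half + 1)
        filtered[i] = np.median(coords[lo:hi], axis=0)
    return filtered

# Example: one jumpy fix (the third point) is pulled back toward its neighbours.
fixes = np.array([[116.300, 39.980],
                  [116.301, 39.981],
                  [116.330, 39.999],   # outlier
                  [116.303, 39.983],
                  [116.304, 39.984]])
smoothed = filter_positions(fixes, window=3)
```

Map matching would then snap the filtered fixes to the nearest road in the electronic map to determine the driving road of the vehicle.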
CN202010469716.1A 2020-05-28 2020-05-28 Live-action navigation icon display method, device, equipment and medium Active CN111623795B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010469716.1A CN111623795B (en) 2020-05-28 2020-05-28 Live-action navigation icon display method, device, equipment and medium
JP2021086777A JP7258078B2 (en) 2020-05-28 2021-05-24 Real scene navigation icon display method, apparatus, equipment and medium
KR1020210067095A KR102559269B1 (en) 2020-05-28 2021-05-25 Real-scene navigation icon display method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010469716.1A CN111623795B (en) 2020-05-28 2020-05-28 Live-action navigation icon display method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN111623795A CN111623795A (en) 2020-09-04
CN111623795B true CN111623795B (en) 2022-04-15

Family

ID=72259229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010469716.1A Active CN111623795B (en) 2020-05-28 2020-05-28 Live-action navigation icon display method, device, equipment and medium

Country Status (3)

Country Link
JP (1) JP7258078B2 (en)
KR (1) KR102559269B1 (en)
CN (1) CN111623795B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112304330B (en) * 2020-10-29 2024-05-24 腾讯科技(深圳)有限公司 Method for displaying running state of vehicle and electronic equipment
CN112325896B (en) * 2020-10-30 2023-03-14 上海商汤临港智能科技有限公司 Navigation method, navigation device, intelligent driving equipment and storage medium
CN112556685B (en) * 2020-12-07 2022-03-25 腾讯科技(深圳)有限公司 Navigation route display method and device, storage medium and electronic equipment
CN112665608B (en) * 2021-01-05 2024-04-09 腾讯科技(深圳)有限公司 Road navigation indication method, device, computer equipment and storage medium
CN113566836A (en) * 2021-06-28 2021-10-29 阿波罗智联(北京)科技有限公司 Road guiding method, device, electronic equipment and storage medium
CN113483774B (en) * 2021-06-29 2023-11-03 阿波罗智联(北京)科技有限公司 Navigation method, navigation device, electronic equipment and readable storage medium
CN113701773B (en) * 2021-08-16 2023-07-18 深蓝汽车科技有限公司 ARHUD navigation curve indication method and system based on lane line equation
CN113686348B (en) * 2021-08-18 2024-04-16 京东鲲鹏(江苏)科技有限公司 Path planning method and device, storage medium and electronic equipment
CN114184208A (en) * 2021-12-06 2022-03-15 北京中交兴路信息科技有限公司 Method, apparatus, electronic device, and medium for providing navigation route for vehicle
CN114670922B (en) * 2022-04-07 2024-07-09 合众新能源汽车股份有限公司 Vehicle steering control method and control system
CN115273515B (en) * 2022-06-23 2024-05-07 智道网联科技(北京)有限公司 Method, apparatus and readable storage medium for displaying navigation screen at turning position of vehicle
CN115158157B (en) * 2022-06-28 2024-09-13 长城汽车股份有限公司 Star top display method and device based on user-defined input and vehicle
CN115406462A (en) * 2022-08-31 2022-11-29 重庆长安汽车股份有限公司 Navigation and live-action fusion method and device, electronic equipment and storage medium
CN115218919B (en) * 2022-09-21 2022-12-13 泽景(西安)汽车电子有限责任公司 Optimization method and system of flight path line and display
CN115683152A (en) * 2022-10-27 2023-02-03 长城汽车股份有限公司 Vehicle navigation guiding method and device based on coordinate transformation and electronic equipment
CN117191072B (en) * 2023-11-07 2024-01-26 山东高速信息集团有限公司 Highway road live-action navigation system
CN117422808B (en) * 2023-12-19 2024-03-19 中北数科(河北)科技有限公司 Three-dimensional scene data loading method and electronic equipment
CN118209129B (en) * 2024-01-31 2024-10-01 深圳市八方达电子有限公司 Navigation picture intelligent switching method and system based on vehicle-mounted navigator

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682646A (en) * 2017-01-16 2017-05-17 北京新能源汽车股份有限公司 Lane line identification method and device
CN110610137A (en) * 2019-08-21 2019-12-24 北京地平线机器人技术研发有限公司 Method and device for detecting vehicle running state, electronic equipment and storage medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100697098B1 (en) * 2004-06-30 2007-03-20 에스케이 주식회사 System and method for providing telematics service using guidance point map
JP2013130552A (en) * 2011-12-22 2013-07-04 Aisin Aw Co Ltd Display system, display method, and display program
CN104515529A (en) * 2013-09-27 2015-04-15 高德软件有限公司 Real-scenery navigation method and navigation equipment
CN104697545A (en) * 2013-12-04 2015-06-10 大陆汽车投资(上海)有限公司 Method and apparatus for processing navigation prompt information
JP5972301B2 (en) * 2014-02-20 2016-08-17 本田技研工業株式会社 Visit plan creation system, terminal device, and visit plan creation method
CN105333883B (en) * 2014-08-07 2018-08-14 深圳点石创新科技有限公司 A kind of guidance path track display method and device for head up display
PL3253692T3 (en) * 2015-02-05 2021-05-17 Grey Orange Pte. Ltd. Apparatus and method for handling goods
JP6540453B2 (en) * 2015-10-28 2019-07-10 株式会社デンソー Information presentation system
JP6432116B2 (en) * 2016-05-23 2018-12-05 本田技研工業株式会社 Vehicle position specifying device, vehicle control system, vehicle position specifying method, and vehicle position specifying program
CN106767853B (en) * 2016-12-30 2020-01-21 中国科学院合肥物质科学研究院 Unmanned vehicle high-precision positioning method based on multi-information fusion
JP6946760B2 (en) * 2017-06-08 2021-10-06 株式会社デンソー Transfer control device and control program
CN107894237A (en) * 2017-11-16 2018-04-10 百度在线网络技术(北京)有限公司 Method and apparatus for showing navigation information
CN111919211A (en) * 2018-03-09 2020-11-10 福特全球技术公司 Turn path visualization for improved spatial and situational awareness in turn maneuvers
JP7346859B2 (en) * 2018-03-29 2023-09-20 株式会社リコー Control device, display device, moving object, control method, and program
KR102682524B1 (en) * 2018-09-11 2024-07-08 삼성전자주식회사 Localization method and apparatus of displaying virtual object in augmented reality
CN110926487A (en) * 2018-09-19 2020-03-27 阿里巴巴集团控股有限公司 Driving assistance method, driving assistance system, computing device, and storage medium
CN109141464B (en) * 2018-09-30 2020-12-29 百度在线网络技术(北京)有限公司 Navigation lane change prompting method and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682646A (en) * 2017-01-16 2017-05-17 北京新能源汽车股份有限公司 Lane line identification method and device
CN110610137A (en) * 2019-08-21 2019-12-24 北京地平线机器人技术研发有限公司 Method and device for detecting vehicle running state, electronic equipment and storage medium

Also Published As

Publication number Publication date
JP7258078B2 (en) 2023-04-14
KR20210070250A (en) 2021-06-14
CN111623795A (en) 2020-09-04
KR102559269B1 (en) 2023-07-25
JP2021131895A (en) 2021-09-09

Similar Documents

Publication Publication Date Title
CN111623795B (en) Live-action navigation icon display method, device, equipment and medium
CN111649739B (en) Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
CN111797187A (en) Map data updating method and device, electronic equipment and storage medium
CN111860304B (en) Image labeling method, electronic device, equipment and storage medium
CN111231950A (en) Method, device and equipment for planning lane change path of vehicle and readable storage medium
CN113223113B (en) Lane line processing method and device, electronic equipment and cloud control platform
CN111693059B (en) Navigation method, device and equipment for roundabout and storage medium
CN111626206A (en) High-precision map construction method and device, electronic equipment and computer storage medium
CN111767360B (en) Method and device for marking virtual lane at intersection
CN111767853B (en) Lane line detection method and device
CN111523471B (en) Method, device, equipment and storage medium for determining lane where vehicle is located
CN113570664B (en) Augmented reality navigation display method and device, electronic equipment and computer medium
CN112270669A (en) Human body 3D key point detection method, model training method and related device
CN110619312B (en) Method, device and equipment for enhancing positioning element data and storage medium
US20190130631A1 (en) Systems and methods for determining how to render a virtual object based on one or more conditions
CN111784835A (en) Drawing method, drawing device, electronic equipment and readable storage medium
CN110595490A (en) Preprocessing method, device, equipment and medium for lane line perception data
CN111079079A (en) Data correction method and device, electronic equipment and computer readable storage medium
Zhao et al. Real-time visual-inertial localization using semantic segmentation towards dynamic environments
CN112581533A (en) Positioning method, positioning device, electronic equipment and storage medium
CN113483774A (en) Navigation method, navigation device, electronic equipment and readable storage medium
CN113483771A (en) Method, device and system for generating live-action map
CN111767844A (en) Method and apparatus for three-dimensional modeling
CN113763504B (en) Map updating method, system, vehicle-mounted terminal, server and storage medium
CN111612851B (en) Method, apparatus, device and storage medium for calibrating camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211022

Address after: 100176 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd.

Address before: 2 / F, baidu building, 10 Shangdi 10th Street, Haidian District, Beijing 100085

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant