CN115691266A - Driving guide method and device for track scene, electronic equipment and vehicle - Google Patents


Info

Publication number
CN115691266A
Authority
CN
China
Prior art keywords
driving
track
vehicle
virtual
scene
Prior art date
Legal status
Pending
Application number
CN202211331602.6A
Other languages
Chinese (zh)
Inventor
张晓龙 (Zhang Xiaolong)
段迎新 (Duan Yingxin)
李缘 (Li Yuan)
Current Assignee
Great Wall Motor Co Ltd
Original Assignee
Great Wall Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Great Wall Motor Co Ltd filed Critical Great Wall Motor Co Ltd
Priority to CN202211331602.6A
Publication of CN115691266A

Landscapes

  • Navigation (AREA)

Abstract

The application provides a driving guidance method and apparatus for a track scene, an electronic device, and a vehicle. Driving guidance is performed only after the usage scene is determined to be a track scene, so that driving on ordinary road sections is unaffected. If the driving mode is a track training mode, the user can download game scenes of different tracks from the cloud, enabling virtual simulation learning on a real vehicle so that the user adapts to the track scene more quickly. If the driving mode is a track real-driving mode, after a target track is determined, virtual display content containing driving guidance information is projected onto an AR projection display virtual image. The user can thus see the guidance trend and operation guidance ahead without lowering the head, which avoids the driving burden of frequently refocusing the eyes up and down in a high-speed track scene, improves the safety of driving guidance, and gives users without racing experience a driving experience on the track comparable to that of a racing driver.

Description

Driving guide method and device for track scene, electronic equipment and vehicle
Technical Field
The application relates to the technical field of intelligent driving, in particular to a driving guide method and device for a track scene, electronic equipment and a vehicle.
Background
With the growing popularity of sports cars and of standard racing circuits, many of which now hold periodic open track days, track driving scenarios are becoming increasingly common. At present, however, driving experience and safety on a track can only be ensured through the training and instruction of a professional coach. No dedicated display area is provided in the vehicle to guide the driver in a track scene; related information is shown on the instrument panel or the in-vehicle screen, so the driver must lower the head to check it, and the resulting frequent up-and-down refocusing of the eyes adds driving burden and creates safety hazards. Ordinary users without racing experience therefore can hardly obtain a driving experience comparable to that of a racing driver. Meanwhile, reliance on professional coaching limits how many people can learn and restricts the driving experience available to them.
Disclosure of Invention
In view of this, an object of the present application is to provide a driving guidance method and apparatus for a track scene, an electronic device, and a vehicle, so as to solve the problem that a user without racing experience cannot obtain the driving experience of a racing driver.
In view of the above, a first aspect of the present application provides a method for guiding driving in a track scene, including:
determining a driving mode in response to the usage scene of the vehicle being a track scene;
determining a target track in response to the driving mode being a track real-driving mode;
in response to acquiring projection construction data, determining virtual display content of an AR projection display virtual image according to the projection construction data;
determining driving guidance information based on acquired vehicle control data;
and projecting the virtual display content containing the driving guidance information onto the AR projection display virtual image for display.
Optionally, the determining driving guidance information based on the vehicle control data includes:
constructing, in a plane parallel to the target track, a coordinate system with the front-axle center point of the vehicle as the origin, the driving direction as the X-axis direction, and the direction perpendicular to the driving direction as the Y-axis direction, and acquiring historical driving information of the target track;
determining a vehicle position of the vehicle on the target track based on the intelligent driving data and the track map information;
in the coordinate system, the driving guidance information is determined based on the historical driving information, the vehicle position, and the vehicle control data.
Optionally, the virtual display content includes a virtual guideline, and the driving guidance information includes a guidance trend;
the determining the driving guidance information based on the historical driving information, the vehicle position, and the vehicle control data in the coordinate system includes:
acquiring an optimal driving path in the historical driving information;
calculating an offset difference between the vehicle position and the optimal driving path in the coordinate system;
determining the guidance trend of the virtual guide line based on the offset difference, and updating the guidance trend in real time according to changes in the vehicle position.
Optionally, the determining the driving guidance information based on the historical driving information, the vehicle location, and the vehicle control data comprises:
acquiring optimal control data in the historical driving information;
determining recommended control information by comparing the vehicle control data and the optimal control data;
determining an operation guidance for driving the vehicle to the optimal driving path based on the recommended control information, and updating the operation guidance in real time according to a change in the vehicle position.
Optionally, calculating an offset difference between the vehicle position and the optimal driving path in the coordinate system comprises:
acquiring the optimal driving path in the historical driving information, and calculating a first distance between the optimal driving path and the track edge;
determining a second distance between the vehicle and the track edge based on the vehicle position;
and calculating the difference between the first distance and the second distance to obtain the offset difference, and updating the offset difference in real time according to changes in the vehicle position.
Optionally, performing AR fusion processing on the virtual display elements to obtain the virtual display content of the AR projection display virtual image includes:
adjusting the size and angle of the virtual display elements so that they fuse with the target track, to obtain the virtual display content.
A second aspect of the present application provides a driving guide apparatus for a track scene, including:
a scenario confirmation module configured to: determining a driving mode in response to the use scene of the vehicle being a track scene;
a track confirmation module configured to: determining a target track in response to the driving mode being a track real-driving mode;
a display content determination module configured to: in response to acquiring projection construction data applied to the target track, determining virtual display content of an AR projection display virtual image according to the projection construction data;
a guidance information determination module configured to: determining driving guidance information based on the acquired vehicle control data;
a projection module configured to: and projecting the virtual display content containing the driving guide information to an AR projection display virtual image for display.
A third aspect of the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as provided by the first aspect of the present application when executing the program.
A fourth aspect of the present application provides a vehicle comprising the driving guide apparatus for a track scene as provided by the second aspect of the present application or the electronic device as provided by the third aspect of the present application.
As can be seen from the above, with the driving guidance method and apparatus, electronic device, and vehicle for a track scene provided by the application, driving guidance is performed only after the usage scene is determined to be a track scene, so the driving experience on ordinary road sections is unaffected. If the driving mode is a track training mode, the driving styles and operation teaching of professional racers and coaches on different tracks are stored through the cloud platform to form teaching game scenes; the user downloads the game scenes of different tracks from the cloud through the in-vehicle system, realizing virtual simulation learning on a real vehicle, and the combination of guidance on the actual track with virtual-real fusion lets the user adapt to the track scene more quickly. If the driving mode is a track real-driving mode, a target track is first determined; virtual display elements are determined once the vehicle control data, the intelligent driving data, and the track map information of the target track are acquired; AR fusion processing is then performed on the virtual display elements to obtain the virtual display content of the AR projection display virtual image; and finally the virtual display content containing the driving guidance information is projected onto the AR projection display virtual image for display. From the user's viewing angle, the projected virtual display content is attached to the road surface, so the user can see the guidance trend and operation guidance ahead without lowering the head. This avoids the driving burden of frequently refocusing the eyes up and down in a high-speed track scene, improves the safety of driving guidance, makes safe driving on the target track easier by providing convenient guidance, and gives inexperienced users a driving experience on the track like that of a track driver.
Drawings
To illustrate the technical solutions of the present application or the related art more clearly, the drawings required by the embodiments or the related art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a driving guiding method for a track scene according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating track scene simulation training performed within a game scene according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of the downward viewing angle according to an embodiment of the present application;
FIG. 4 is a flowchart of determining driving directions based on vehicle control data according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a racetrack scene according to an embodiment of the present application;
fig. 6 is a flowchart of determining driving guidance information according to an embodiment of the present application;
FIG. 7 is a schematic view of a virtual guideline according to an embodiment of the present application;
FIG. 8 is a schematic diagram of driving guidance for the corner-entry optimal path according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an AR projection display virtual image of corner-entry optimal path driving guidance according to an embodiment of the present disclosure;
fig. 10 is a flowchart of another method for determining driving guidance information according to the embodiment of the present application;
FIG. 11 is a schematic diagram of brake corner-entry point driving guidance according to an embodiment of the present disclosure;
FIG. 12 is a schematic view of an AR projection display virtual image of brake corner-entry point driving guidance according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a driving guide device for a track scene according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 15 is a flowchart illustrating another description manner of the driving guidance method for a track scene according to the embodiment of the present application.
Reference numerals: 1. front-axle center point of the vehicle; 2. pre-corner deceleration-and-downshift point; 3. corner-entry optimal path; 4. brake corner-entry point; 5. corner-entry transition path; 6. balanced-throttle and forced-steering point; 7. corner-exit transition path; 8. increased-throttle point; 9. increased-throttle upshift path; 10. maximum throttle point; 11. track edge; 12. virtual guide line; 13. AR projection display virtual image; 14. human eye position; 15. virtual braking point.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described in detail below with reference to the accompanying drawings in combination with specific embodiments.
It should be noted that, unless otherwise defined, technical or scientific terms used in the embodiments of the present application have the ordinary meaning understood by those skilled in the art to which the application belongs. The words "first," "second," and similar terms used in the embodiments do not denote any order, quantity, or importance, but merely distinguish one element from another. "Comprising," "comprises," and the like mean that the element or item preceding the word encompasses the elements or items listed after the word and their equivalents, without excluding other elements or items. "Connected," "coupled," and the like are not restricted to physical or mechanical connections but may include electrical connections, whether direct or indirect. "Upper," "lower," "left," "right," and so on indicate only relative positional relationships, which may change accordingly when the absolute position of the described object changes.
In some embodiments, as shown in fig. 1, a method for guiding driving of a track scene includes:
step 100: and determining the driving mode in response to the use scene of the vehicle being a track scene.
In this step: a track scene differs greatly from an ordinary urban road scene. Driving guidance on urban roads mainly serves to avoid traffic accidents such as rear-end collisions, running red lights, and hitting pedestrians under the constraint of traffic rules, whereas a track is driven at the highest possible speed over the whole course, free of conventional traffic-rule constraints. Road conditions also differ greatly: urban roads generally have traffic sign lines, traffic lights, and the like to provide guidance, the roads intersect and interconnect, conditions are complex, and speed limits apply; track conditions are generally simple, mostly narrow roads with many corners, and to preserve the smoothness and grip of the surface a track carries almost no guidance or indication content. The driving guidance methods for urban road scenes in the related art therefore differ greatly from driving guidance in a track scene and generally cannot be applied to one another. Accordingly, the driving guidance method provided by the embodiments of the application performs guidance only after the usage scene is determined to be a track scene, so that driving of the vehicle on ordinary roads is not disturbed and no driving danger is introduced.
Step 200: and determining a target track in response to the driving mode being the track real-driving mode.
In this step, after the driving mode is determined to be the track real-driving mode, since the map data of different tracks differ, the target track on which the user intends to drive must first be determined.
Step 300: and in response to the acquired projection construction data applied to the target track, determining virtual display content of the AR projection display virtual image according to the projection construction data.
In this step, after the target track is determined, it must be determined whether the AR-HUD can acquire the projection construction data, namely the vehicle control data, the intelligent driving data, and the track map information of the target track. Driving guidance is established and updated in real time based on these data, so only when all three are acquired can the virtual projection elements to be added to the AR projection display virtual image 13 of the AR-HUD be determined, for example: driving guide lines, braking points, corner-entry points, a dynamic throttle-travel display, and a dynamic steering-wheel-angle display. Illustratively, virtual display elements such as the driving guide line, braking point, corner-entry point, dynamic throttle-travel display, and dynamic steering-wheel-angle display undergo AR fusion processing to obtain virtual display content such as the virtual guide line 12, a virtual throttle, the virtual braking point 15, and a virtual steering wheel.
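The fusion step — adjusting each element's size and angle so it appears attached to the road surface — can be illustrated with a minimal pinhole-camera sketch. The focal length and eye height below are assumed values for illustration, not taken from the application:

```python
import math

def scale_virtual_element(width_m, distance_m, focal_px=1200.0):
    # Pinhole model: the apparent on-screen width of a virtual element
    # shrinks linearly with its distance along the target track.
    return width_m * focal_px / distance_m

def foreshorten(length_m, distance_m, eye_height_m=1.2, focal_px=1200.0):
    # An element lying flat on the road surface is viewed at a shallow
    # downward angle; its apparent length is compressed by the sine of
    # that viewing angle, which makes it look "painted" on the track.
    view_angle = math.atan2(eye_height_m, distance_m)
    return length_m * focal_px / distance_m * math.sin(view_angle)
```

Under these assumptions, a 2 m wide guide-line segment 40 m ahead renders 60 px wide, while its along-road extent is compressed far more strongly — which is what makes it read as lying on the road rather than floating.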
Step 400: the driving guidance information is determined based on the acquired vehicle control data.
In this step, driving guidance information is determined from the vehicle control data (the vehicle's ECU data). For example, the guidance trend of the virtual guide line 12 is determined from the current vehicle position, and the user controls the driving direction of the vehicle according to that trend. After the driving direction is determined, the user may still be unsure how far to turn the steering wheel; the recommended steering angle to add or remove is therefore determined from the current steering-wheel angle of the vehicle, displayed as a numeric value, and updated in real time as the user turns the wheel according to it. Likewise, driving with a heavy throttle after the steering guidance is determined is difficult to control, so the recommended throttle force to add or remove is determined from the current throttle force of the vehicle, displayed numerically, and updated in real time as the user adjusts the throttle. If the distance between the vehicle position and a braking point falls below a preset first distance threshold, the vehicle is about to enter a section requiring braking and deceleration; the virtual braking point 15 is then shown in the projection display virtual image of the AR-HUD, the distance between the vehicle position and the braking point is updated in real time and displayed numerically, and the smaller the distance, the larger the virtual braking point 15 may be displayed.
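A rough sketch of the guidance logic in this step follows; the threshold value, units, and marker sizing are illustrative assumptions, not values from the application:

```python
def steering_recommendation(current_angle_deg, target_angle_deg):
    # Recommended steering-wheel change, shown as a numeric value and
    # refreshed in real time as the driver turns the wheel.
    delta = target_angle_deg - current_angle_deg
    return ("increase" if delta >= 0 else "decrease"), abs(delta)

def brake_point_marker_size(distance_m, threshold_m=150.0, base_px=24.0):
    # The virtual braking point appears only once the vehicle is within a
    # preset first distance threshold; the closer the vehicle gets, the
    # larger the marker is drawn (capped so it stays readable).
    if distance_m >= threshold_m:
        return None  # not yet within the braking section
    return base_px * min(threshold_m / max(distance_m, 1.0), 4.0)
```

The same add-or-remove pattern would apply to the recommended throttle force, with throttle travel in place of steering angle.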
Step 500: and projecting the virtual display content containing the driving guide information to an AR projection display virtual image for display.
In this step, as shown in fig. 3, after the projection distance and the downward viewing angle are determined, the virtual display content with the driving guidance information is projected by the AR-HUD into the AR projection display virtual image 13 for display, and the AR projection display virtual image 13 is fused with the target track; that is, viewed from the user's perspective, the virtual display content appears on the road surface of the target track. The projection distance and downward viewing angle are inherent attributes of the vehicle and take different values for different vehicles.
In some embodiments, the driving guidance method for a track scene further includes: and responding to the fact that the driving mode is the track training mode, downloading a game scene which is constructed in advance, and performing track scene simulation training in the game scene after the target simulation track is determined.
After the usage scene of the vehicle is determined to be a track scene, the driving mode of the vehicle must be determined. If the driving mode is the track training mode, the user does not need to drive the vehicle onto the track for real driving. To ensure the authenticity of the training process and the validity of the training result, as shown in fig. 2, high-precision map data of the track are collected and integrated with the whole-vehicle navigation calibration, together with the driving scheme for each corner of the track, containing information such as the optimal cornering route, brake downshift point, brake corner-entry point 4, steering angle, throttle force, and brake force. At the same time, the road-edge information stored by the intelligent driving controller and the data of the vehicle's Electronic Control Units (ECUs) — vehicle speed, brake, throttle, gear, steering-wheel angle, vehicle angle, and the like — are obtained, and a track game scene is constructed by combining them with a first-person track video of a driver in the track scene. The constructed game scene is stored in the cloud. Once the driving mode is determined to be the track training mode, the pre-constructed game scene is downloaded from the cloud, and after a target simulation track is determined, virtual simulation learning is performed on the real vehicle: the game scene is combined with various real-vehicle data through an Augmented Reality Head-Up Display (AR-HUD), and the user performs track-scene simulation training of the real-vehicle game scene in combination with the AR-HUD projection.
In summary, the driving guidance method for a track scene provided by the embodiments of the application performs driving guidance only after the usage scene is determined to be a track scene, avoiding any influence on the driving experience of ordinary road sections. If the driving mode is a track training mode, the driving styles and operation teaching of professional racers and coaches on different tracks are stored through a cloud platform to form teaching game scenes; once the mode is confirmed, the pre-constructed game scene is downloaded from the cloud, and after a target simulation track is determined, the game scene is combined with various real-vehicle data through the Augmented Reality Head-Up Display (AR-HUD), so that the user can perform track-scene simulation training of the real-vehicle game scene with the AR-HUD projection and adapt to the track scene more quickly. If the driving mode is a track real-driving mode, after the projection distance and downward viewing angle are determined, the virtual display content containing the driving guidance information is projected onto the AR projection display virtual image 13 for display. The downward viewing angle lets the user see the guidance trend and operation guidance ahead without lowering the head, avoiding the driving burden of frequent up-and-down refocusing of the eyes in a high-speed track scene, improving the safety of driving guidance, making safe driving on the target track easier through convenient guidance, and giving users without racing experience the driving experience of a racer on the track.
In some embodiments, the projection construction data includes vehicle control data, intelligent driving data, and track map information of the target track, and as shown in fig. 4, determining driving guidance information based on the acquired vehicle control data includes:
step 410: and constructing a coordinate system which takes the central point of the front axle of the whole car as the original point, the driving direction as the X-axis direction and the vertical direction of the driving direction as the Y-axis direction in a plane parallel to the target track, and acquiring historical driving information of the target track.
In this step, exemplarily, as shown in fig. 5, a coordinate system is constructed in a plane parallel to the target track, with the front-axle center point 1 of the vehicle as the origin, the driving direction as the X-axis direction, and the direction perpendicular to the driving direction as the Y-axis direction. In this frame the offset difference between the vehicle and the optimal driving path in the historical driving information (which includes the cornering scheme for each corner of the track) can be measured conveniently, the amount of computation is reduced, and the real-time data are more accurate. As shown in fig. 5, the historical driving data may include guidance information such as the pre-corner deceleration-and-downshift point 2, corner-entry optimal path 3, brake corner-entry point 4, corner-entry transition path 5, balanced-throttle and forced-steering point 6, corner-exit transition path 7, increased-throttle point 8, increased-throttle upshift path 9, and maximum throttle point 10.
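The vehicle-centered frame described above can be sketched as a simple rotation-plus-translation of track-map coordinates; the world frame and heading convention here are assumptions for illustration:

```python
import math

def to_vehicle_frame(point_world, axle_center_world, heading_rad):
    # Express a track-map point in the coordinate system of fig. 5:
    # origin at the front-axle center point 1, X along the driving
    # direction, Y perpendicular to it in the track plane.
    dx = point_world[0] - axle_center_world[0]
    dy = point_world[1] - axle_center_world[1]
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    # Rotate the displacement by -heading so X aligns with travel.
    return (dx * cos_h + dy * sin_h, -dx * sin_h + dy * cos_h)
```

With the vehicle at world position (10, 5) heading along the world +Y axis, a point 3 m further up the track maps to (3, 0): directly ahead, zero lateral offset.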
Step 420: the vehicle position on the target track is determined based on the intelligent driving data and the track map information.
In this step, the AR-HUD controller receives the map information, takes a positioning device installed in the vehicle as the positioning identification point, determines the position of this point on the map of the target track by combining the navigation information with the track-edge information identified in the intelligent driving data, and determines the vehicle position on the target track from that map position.
Step 430: in the coordinate system, the driving guidance information is determined based on the historical driving information, the vehicle position, and the vehicle control data.
In this step, driving guidance information of the virtual display content is determined in real time based on the historical driving information, the vehicle position, and the vehicle control data, and the driving guidance information is updated based on a change in the vehicle position and a change in the vehicle control data.
In some embodiments, the virtual display content includes the virtual guide line 12, and the driving guidance information includes a guidance trend. As shown in fig. 6, step 430 — determining the driving guidance information based on the historical driving information, the vehicle position, and the vehicle control data in the coordinate system — includes:
step 431: an optimal travel path in the history driving information is acquired.
In this step, the optimal travel paths include, for example, the corner-entry optimal path 3, corner-entry transition path 5, corner-exit transition path 7, and increased-throttle upshift path 9, which are the result of subdividing the optimal travel path of the target track at each corner so as to give the user more detailed driving guidance. For example: the path a professional racer follows when entering a corner in the historical driving information can serve as the corner-entry optimal path 3; the path driven while transitioning into the corner can serve as the corner-entry transition path 5; the path driven while transitioning out of the corner can serve as the corner-exit transition path 7; the section where the professional racer opens the throttle or upshifts can serve as the increased-throttle upshift path 9; and the boundary points of these sections can be obtained by consulting professional racers. Straight sections need no subdivision — the user may drive freely on the straights of the target track — but on approaching a corner the user should move onto the optimal travel path as soon as possible, so that the corner can be passed smoothly and quickly according to the driving guidance information.
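The subdivision described above can be sketched as a lookup over arc-length ranges; the segment names and boundary values below are hypothetical, standing in for the boundary points obtained by consulting professional racers:

```python
def current_segment(progress_m, boundaries):
    # Return which guidance sub-path applies at the vehicle's current
    # arc-length progress along the track; straights are not subdivided.
    for name, (start, end) in boundaries.items():
        if start <= progress_m < end:
            return name
    return "straight"

# Hypothetical boundaries (metres from the start line) for one corner.
SEGMENTS = {
    "corner_entry_optimal_path": (800.0, 950.0),
    "corner_entry_transition": (950.0, 1020.0),
    "corner_exit_transition": (1020.0, 1100.0),
    "increased_throttle_upshift_path": (1100.0, 1250.0),
}
```

Anywhere outside the listed ranges the guidance falls back to "straight", matching the text's note that straight sections are not subdivided.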
Step 432: an offset difference between the vehicle position and the optimal driving path is calculated in a coordinate system.
In this step, the optimal travel path in the historical driving information is first acquired, and a first distance between the optimal travel path and the track edge 11 is calculated; a second distance between the vehicle and the track edge 11 is then determined from the vehicle position; finally, the difference between the first distance and the second distance is calculated to obtain the offset difference, which is updated in real time as the vehicle position changes. With the front-axle center point of the vehicle (the positioning identification point) as the origin of the coordinate system, once a reference track edge 11 (for example, the track edge 11 on the outer side of the target track) is selected from the track map information and the vehicle position, the second distance between the vehicle position and the track edge 11 is easily computed with positioning or navigation software. The first distance between the optimal driving path and the same track edge 11 can be computed likewise from the historical vehicle position at the same point along that edge, or determined in advance. The offset difference is then obtained as the difference between the first and second distances. The computation is simple, which keeps the calculation fast, preserves the timeliness of the driving guidance information, and reduces the update delay when the offset difference is refreshed in real time as the vehicle position changes.
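The first-minus-second distance computation can be sketched as follows, with the track edge 11 approximated as a polyline; the min-over-segments distance and the sign convention are illustrative assumptions:

```python
import math

def dist_to_segment(p, a, b):
    # Distance from point p to segment a-b, clamped to the endpoints.
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    len2 = dx * dx + dy * dy
    if len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def offset_difference(vehicle_pos, edge_polyline, first_distance_m):
    # Second distance: vehicle to the reference track edge 11;
    # offset difference = first distance - second distance.
    second = min(dist_to_segment(vehicle_pos, a, b)
                 for a, b in zip(edge_polyline, edge_polyline[1:]))
    return first_distance_m - second
```

Called once per position update, this keeps the offset difference refreshed as the vehicle position changes.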
Step 433: and determining the guiding trend of the virtual guide line based on the deviation difference value, and updating the guiding trend in real time according to the change of the vehicle position.
In this step, the center point 1 of the front axle of the vehicle serves as the reference point both for locating the vehicle position and for the AR projection display virtual image 13. With this reference point, the positional relationships in the coordinate system between the center point of the front axle (the vehicle position) and the pre-bend deceleration downshift point 2, the optimal corner-entry path 3, the braking corner-entry point 4, the corner transition path 5, the balanced-throttle force-application steering point 6, the corner-exit transition path 7, the throttle-increase point 8, the throttle-increase upshift guide path 9, and the maximum throttle point 10 are determined so as to obtain the guiding trend of the virtual guide line 12. The correspondence between the virtual guide line, the human eye position 14, and the AR projection display virtual image 13 is then established through coordinate conversion, so that the AR display is realized after the virtual image is fused with the actual track. For example, as shown in fig. 7, if the offset difference is -0.5 m, the vehicle position (the origin of coordinates) can be made to coincide with the optimal driving path by moving 0.5 m in the negative Y-axis direction; the guiding trend of the virtual guide line 12 is therefore a curve that leads the vehicle onto the optimal driving path. The curvature of this curve is related to information such as the vehicle speed, and the calculation process is not described herein again. After the guiding trend of the virtual guide line 12 is determined, the user can determine the driving direction and the traveling path of the vehicle according to the guiding trend updated in real time.
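The text leaves the speed-dependent curvature calculation unspecified; the sketch below assumes, purely for illustration, a cosine blend onto the optimal path whose preview length grows with vehicle speed. The heuristic constants and function names are assumptions, not figures from the patent.

```python
import math

def guide_line_points(offset_difference, speed_mps, n=5):
    """Sample points of the virtual guide line in the vehicle frame
    (X forward, Y lateral). The line starts at the origin and merges
    onto the optimal path, which lies offset_difference metres away
    on the Y axis. Assumed heuristic: preview length covers 2 s of
    travel, with a 10 m floor."""
    preview = max(10.0, 2.0 * speed_mps)
    points = []
    for i in range(n + 1):
        x = preview * i / n
        # Cosine blend: zero slope at both ends for a smooth merge.
        blend = 0.5 * (1.0 - math.cos(math.pi * x / preview))
        points.append((x, offset_difference * blend))
    return points

pts = guide_line_points(-0.5, 30.0)  # 108 km/h, vehicle 0.5 m off the optimal path
```

Sampling more points (larger `n`) yields a smoother drawn curve; the final point lies exactly on the optimal path at the end of the preview distance.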
Exemplarily, as shown in fig. 8, the AR-HUD controller receives the high-precision map information and fuses it with the navigation information and the track edge information recognized by the intelligent driving system to locate the vehicle position. A coordinate system is established in which, within a plane parallel to the horizontal plane, the vehicle traveling direction is the X axis, the direction perpendicular to the traveling direction is the Y axis, and the center point of the front axle of the vehicle is the origin. The coordinates of the optimal corner-entry path 3 in this coordinate system are determined according to the relationship between the vehicle position and the track edge 11, the correspondence with the human eye position 14 and the AR projection display virtual image 13 is established through coordinate transformation, and the position and angle of the virtual guide line 12 in the AR projection display virtual image 13 are determined according to the offset difference between the vehicle position and the optimal corner-entry path. In fig. 9, the virtual guide line 12 is a straight line, which indicates that the vehicle position (the coordinate origin) coincides with the optimal corner-entry path 3 at this time and no steering operation is required; in actual use, the AR projection display virtual image 13 is a virtual image without the frame shown in fig. 9. Fig. 9 also shows the effect of the coordinate transformation: the offset difference is calculated with the center point 1 of the front axle as the origin, but the user operates with the human eye as the origin, so although the human eye position 14 does not coincide with the optimal corner-entry path 3, the virtual guide line 12 is still a straight line. The display size, display position, display angle, and display shape of the virtual guide line 12 are updated in real time as the vehicle moves, so that the AR display fusing the optimal corner-entry path 3 with the actual target track lets the user know the driving direction and driving route in real time.
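The coordinate conversion from the vehicle frame to the AR projection display virtual image can be illustrated with a simple pinhole projection through the eye position; the eye height and image-plane distance used here are assumed values, not figures from the patent.

```python
def project_to_virtual_image(ground_point, eye=(0.0, 0.0, 1.2), image_plane_x=7.5):
    """Project a ground point (x, y) at z = 0 in the vehicle frame onto
    the vertical virtual-image plane at x = image_plane_x, as seen from
    the eye position (assumes the point lies beyond the image plane)."""
    gx, gy = ground_point
    ex, ey, ez = eye
    t = (image_plane_x - ex) / (gx - ex)  # ray parameter from eye toward the ground point
    # Return (lateral, height) coordinates on the image plane.
    return (ey + t * (gy - ey), ez + t * (0.0 - ez))

# A point on the guide line 30 m ahead and 0.5 m left of the optimal path:
u, v = project_to_virtual_image((30.0, -0.5))
```

Applying this to every sampled point of the guide line yields the shape to be drawn on the virtual image so that, from the eye position, the line appears laid on the track surface.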
In some embodiments, as shown in fig. 10, determining driving guidance information based on the historical driving information, the vehicle location, and the vehicle control data includes:
step 434: acquiring optimal control data in historical driving information;
In this step, after the guiding trend of the virtual guide line 12 is determined, the user knows the driving direction of the vehicle, but deviations may occur in actual operation, making it difficult to drive the vehicle along the virtual guide line 12. It is therefore also necessary to guide the user's control of the vehicle to ensure that the user can keep the vehicle advancing along the virtual guide line 12; at this time, the optimal control data in the historical driving information must first be acquired as the reference for controlling the vehicle.
Step 435: determining recommended control information by comparing the vehicle control data with the optimal control data;
In this step, after the user determines the driving direction according to the virtual guide line 12, the user may not know how far the steering wheel should be turned to follow the guiding trend of the virtual guide line 12. At this time, the current steering wheel angle in the vehicle control data can be compared with the optimal steering wheel angle in the optimal control data, and the recommended steering angle by which the user needs to increase or decrease the steering wheel angle is determined by calculating the difference between the two. The magnitude of the recommended steering angle can be displayed as a numerical value and the turning direction as the sign of that value; the closer the displayed value is to zero, the closer the current steering wheel angle is to the optimal steering wheel angle, and the recommended steering angle can be updated in real time as the user turns the steering wheel accordingly. The throttle is likewise difficult to control after the steering guidance is determined; the current throttle force in the vehicle control data can then be compared with the optimal throttle force in the optimal control data, and the recommended throttle force that the user needs to increase or decrease is determined by calculating the difference between them. The magnitude of the recommended throttle force can be displayed as a numerical value; the closer the displayed value is to zero, the closer the current throttle force is to the optimal throttle force, and the recommended throttle force can be updated in real time as the user adjusts the throttle. If the distance between the vehicle position and the braking point is smaller than a preset first distance threshold, the vehicle is about to enter a road section requiring braking and deceleration; at this time, the virtual braking point 15 is displayed in the projection display virtual image of the AR-HUD, the distance between the vehicle position and the braking point is updated in real time and displayed as data, and the smaller the distance, the larger the display size of the virtual braking point 15 can be. Similarly, for the shift point, if the distance between the vehicle position and the shift point is smaller than a preset second distance threshold, the vehicle is about to enter a road section requiring a gear shift; at this time, the virtual shift point is displayed in the projection display virtual image of the AR-HUD, the recommended gear (the optimal gear) to which the vehicle needs to shift is updated in real time and displayed in text or numerical form, and the smaller the distance between the vehicle position and the shift point, the larger the display size of the virtual shift point can be. The principle is that the same object appears smaller to the human eye at a greater distance and larger at a closer distance.
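The comparison of current and optimal control data in this step can be sketched as signed differences; the dictionary keys, units, and function name are illustrative assumptions.

```python
def recommended_adjustments(current, optimal):
    """Signed differences between optimal and current control values.
    A value near zero means the current input already matches the
    optimal control data; the sign gives the direction of correction."""
    return {
        "steering_deg": optimal["steering_deg"] - current["steering_deg"],
        "throttle_pct": optimal["throttle_pct"] - current["throttle_pct"],
    }

rec = recommended_adjustments(
    current={"steering_deg": 30.0, "throttle_pct": 40.0},
    optimal={"steering_deg": 42.0, "throttle_pct": 55.0},
)
# rec["steering_deg"] == 12.0: turn the wheel a further 12 degrees in the positive direction
```

As the user corrects the steering wheel and throttle, both differences shrink toward zero, which is exactly the real-time update behavior described above.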
Step 436: an operation guidance for driving the vehicle to the optimal driving path is determined based on the recommended control information, and the operation guidance is updated in real time according to a change in the position of the vehicle.
In this step, taking the braking point as an example as shown in fig. 11, the AR-HUD controller receives the high-precision map information and fuses it with the navigation information and the track edge information recognized by the intelligent driving system to locate the vehicle position. A coordinate system is established with the direction perpendicular to the vehicle traveling direction in the parallel horizontal plane as the Y axis, the vehicle traveling direction in the parallel horizontal plane as the X axis, and the center point of the front axle of the vehicle as the origin. The braking corner-entry point 4 is determined in this coordinate system according to the vehicle position and the track turning position, the correspondence with the human eye position 14 and the AR projection display virtual image 13 is established through coordinate transformation, and the position and size of the virtual braking point 15 in the AR projection display virtual image 13 are determined according to the distance between the vehicle position and the braking corner-entry point 4. The display form of the virtual braking point 15 in the AR projection display virtual image 13 is shown in fig. 12: to inform the user that this point is the virtual braking point 15, a foot-on-brake pattern is displayed on its right side, and the display is updated in real time together with the virtual guide line, so that the user can control the vehicle to brake at the right moment according to the virtual braking point 15 and thus drive toward the target path.
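The threshold test and the nearer-is-larger sizing rule for the virtual braking point can be sketched as follows; the threshold, the reference distance, and the inverse-distance scaling are illustrative assumptions rather than values from the patent.

```python
def virtual_brake_point_display(distance_m, first_threshold_m=200.0,
                                base_size=1.0, ref_distance_m=50.0):
    """Return None until the vehicle is within the first distance
    threshold of the braking point; afterwards, scale the display size
    inversely with distance so the marker grows as the point nears."""
    if distance_m >= first_threshold_m:
        return None  # no virtual braking point shown yet
    size = base_size * ref_distance_m / max(distance_m, 1.0)
    return {"distance_m": distance_m, "size": size}

print(virtual_brake_point_display(250.0))   # None: not yet displayed
print(virtual_brake_point_display(100.0))   # displayed at size 0.5
print(virtual_brake_point_display(25.0))    # nearer, so larger: size 2.0
```

The same structure would apply to the shift point, with the second distance threshold and the recommended gear in place of the distance readout.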
In some embodiments, performing AR fusion processing on the virtual display element to obtain virtual display content of an AR projection display virtual image includes:
Adjusting the size and the angle of the virtual display element so that the virtual display element is fused with the target track, thereby obtaining the virtual display content. Illustratively, as the vehicle approaches the optimal corner-entry point, the size of the virtual guide line 12 shown in fig. 7 grows larger and larger, which conforms to the rule that an object closer to the human eye forms a larger image; this gives the user a better visual experience, avoids visual fatigue, and indirectly indicates that the vehicle is getting closer to the optimal corner-entry point. Within the user's viewing angle, the virtual guide line 12 appears attached to the ground; to maintain this attached visual effect as the vehicle moves, the angle of the virtual display element must also be adjusted. For example, for the triangular area at the head of the virtual guide line 12, as the vehicle approaches the optimal corner-entry point, the vertex angle of the triangle becomes larger and larger while the base angles become smaller and smaller.
It can be seen from fig. 8 that the AR projection display virtual image 13 is perpendicular to the ground, and if the display effect shown in fig. 7 is to be achieved, the size and angle of the virtual display element need to be adjusted so that the virtual display element is merged with the target track, so as to obtain the virtual display content shown in fig. 7.
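The nearer-is-larger rule on which this fusion relies can be checked with a simple pinhole model of the perpendicular virtual-image plane; the eye height and image-plane distance used here are assumed values.

```python
def apparent_height(ground_x, length_m=1.0, eye_height=1.2, plane_x=7.5):
    """Height, on the vertical virtual-image plane at x = plane_x, of a
    length_m strip of ground starting ground_x metres ahead of the eye
    (pinhole model; constants are illustrative assumptions)."""
    def v(x):
        # Vertical image-plane coordinate of the ground point x metres ahead.
        return eye_height * (1.0 - plane_x / x)
    return v(ground_x + length_m) - v(ground_x)

near = apparent_height(10.0)   # strip 10 m ahead
far = apparent_height(30.0)    # same strip 30 m ahead
assert near > far  # the same strip must be drawn larger as the vehicle closes in
```

This is the quantity the size adjustment must track: the drawn element on the perpendicular virtual image is continuously rescaled and re-angled so its projection keeps matching the ground strip it represents.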
It should be noted that the method of the embodiment of the present application may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the multiple devices may only perform one or more steps of the method of the embodiment, and the multiple devices interact with each other to complete the method.
It should be noted that the above describes some embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Based on the same inventive concept and corresponding to the method of any of the above embodiments, the present application further provides a driving guide apparatus for a track scene.
Referring to fig. 13, the driving guide apparatus for a racetrack scene includes:
a scenario confirmation module 10 configured to: determining a driving mode in response to the use scene of the vehicle being a track scene;
a track confirmation module 20 configured to: responding to the driving mode as a track real driving mode, and determining a target track;
a display content determination module 30 configured to: in response to the fact that projection construction data applied to the target track are obtained, virtual display content of an AR projection display virtual image is determined according to the projection construction data;
a guidance information determination module 40 configured to: determining driving guidance information based on the acquired vehicle control data;
a projection module 50 configured to: and projecting the virtual display content containing the driving guide information to an AR projection display virtual image for display.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, the functionality of the various modules may be implemented in the same one or more pieces of software and/or hardware in the practice of the present application.
The device of the above embodiment is used for implementing the driving guidance method of the corresponding track scene in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept and corresponding to the method of any of the above embodiments, the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements the driving guidance method for a track scene according to any of the above embodiments.
Fig. 14 is a schematic diagram illustrating a more specific hardware structure of an electronic device according to this embodiment, where the electronic device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 are communicatively coupled to each other within the device via bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present disclosure.
The Memory 1020 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 1020 and called to be executed by the processor 1010.
The input/output interface 1030 is used for connecting an input/output module to input and output information. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various sensors, etc., and the output devices may include a display, speaker, vibrator, indicator light, etc.
The communication interface 1040 is used for connecting a communication module (not shown in the drawings) to implement communication interaction between the present apparatus and other apparatuses. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, bluetooth and the like).
The bus 1050 includes a path to transfer information between various components of the device, such as the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040.
It should be noted that although the above-mentioned device only shows the processor 1010, the memory 1020, the input/output interface 1030, the communication interface 1040 and the bus 1050, in a specific implementation, the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the above embodiment is used to implement the driving guidance method of the corresponding track scene in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to any of the above-mentioned embodiment methods, the present application further provides a computer-readable storage medium storing computer instructions for causing the computer to execute the driving guidance method for a track scene according to any of the above-mentioned embodiments.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The computer instructions stored in the storage medium of the above embodiment are used to enable the computer to execute the driving guidance method for the track scene according to any of the above embodiments, and have the beneficial effects of the corresponding method embodiments, which are not described herein again.
Based on the same inventive concept and corresponding to the method of any of the above embodiments, the present application further provides a vehicle including the driving guide apparatus for a track scene as described in the above embodiments, or the electronic device as described in the above embodiments, or the computer-readable storage medium as described in the above embodiments.
It should be noted that, as shown in fig. 15, the embodiment of the present application can be further described in the following manner:
After the vehicle is started, it is first determined whether the driving scene is a track scene; if it is not a track scene, no driving guidance is performed. If it is a track scene and the driving mode is the track training mode, the cloud game scene is invoked and a target simulation track is selected; after the track selection is completed, real-vehicle game track scene simulation training is performed. If it is a track scene and the driving mode is the track real driving mode, a target track is selected; when the vehicle control data, the intelligent driving data, and the high-precision track map information of the target track can all be acquired simultaneously, the virtual display elements of the virtual display content are determined and subjected to AR fusion processing to obtain projectable virtual display content, an appropriate track position is then selected to draw the AR driving guidance information and complete the virtual display content, and finally the virtual display content carrying the driving guidance information is projected onto the projection plane of the AR projection display virtual image 13 for display. If the vehicle control data, the intelligent driving data, and the high-precision track map information of the target track cannot be acquired simultaneously, no projection is performed and no driving guidance is provided.
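The decision flow of fig. 15 can be sketched as follows; the string labels and dictionary keys are illustrative, not identifiers from the patent.

```python
def racetrack_guidance_flow(scene, mode, data_available):
    """Top-level decision flow: guidance is produced only in a track
    scene, and projection only when all three data sources are present."""
    if scene != "track":
        return "no guidance"
    if mode == "training":
        # Cloud game scene: pick a target simulation track, then train.
        return "simulation training"
    if mode == "real_driving":
        # Vehicle control data, intelligent driving data, and the track
        # map must all be available before anything is projected.
        if all(data_available.get(k) for k in
               ("vehicle_control", "smart_driving", "track_map")):
            return "project AR guidance"
        return "no projection"
    return "no guidance"

print(racetrack_guidance_flow("track", "real_driving",
      {"vehicle_control": True, "smart_driving": True, "track_map": True}))
```

Running the example prints "project AR guidance"; dropping any one of the three data sources yields "no projection", matching the fallback described above.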
Those of ordinary skill in the art will understand that: the discussion of any embodiment above is meant to be exemplary only, and is not intended to intimate that the scope of the disclosure, including the claims, is limited to these examples; within the context of the present application, technical features in the above embodiments or in different embodiments may also be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present application described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the application. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the application, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the application are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that the embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures, such as Dynamic RAM (DRAM), may use the discussed embodiments.
The present embodiments are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present application are intended to be included within the scope of the present application.

Claims (10)

1. A driving guide method for a track scene is characterized by comprising the following steps:
determining a driving mode in response to the use scene of the vehicle being a track scene;
responding to the fact that the driving mode is a track real driving mode, and determining a target track;
in response to the fact that projection construction data applied to the target track are obtained, virtual display content of an AR projection display virtual image is determined according to the projection construction data;
determining driving guidance information based on the acquired vehicle control data;
and projecting the virtual display content containing the driving guide information to an AR projection display virtual image for display.
2. The method of claim 1, wherein the projection build data includes the vehicle control data, smart driving data, and track map information for the target track;
the determining driving guidance information based on the acquired vehicle control data includes:
constructing a coordinate system which takes the central point of the front axle of the whole car as the origin, takes the driving direction as the X-axis direction and takes the vertical direction of the driving direction as the Y-axis direction in a plane parallel to the target track, and acquiring the historical driving information of the target track;
determining a vehicle location of a vehicle on the target track based on the smart driving data and the track map information;
determining the driving guidance information based on the historical driving information, the vehicle position, and the vehicle control data in the coordinate system.
3. The method of claim 2, wherein the virtual display content comprises a virtual guideline, the driving guidance information comprises a guidance heading;
the determining, in the coordinate system, the driving guidance information based on the historical driving information, the vehicle position, and the vehicle control data includes:
acquiring an optimal driving path in the historical driving information;
calculating an offset difference between the vehicle position and the optimal driving path in the coordinate system;
determining the guiding direction of the virtual guide line based on the offset difference value, and updating the guiding direction in real time according to the change of the vehicle position.
4. The method of claim 2, wherein the determining the driving guidance information based on the historical driving information, the vehicle location, and the vehicle control data comprises:
acquiring optimal control data in the historical driving information;
determining recommended control information by comparing the vehicle control data and the optimal control data;
determining an operation guidance for driving the vehicle to the optimal driving path based on the recommended control information, and updating the operation guidance in real time according to a change in the vehicle position.
5. The method of claim 3, wherein calculating an offset difference between the vehicle position and the optimal driving path in the coordinate system comprises:
calculating a first distance between the optimal driving path and the edge of the track;
determining a second spacing between the vehicle and the edge of the track based on the vehicle position;
and calculating the difference value between the first distance and the second distance to obtain the offset difference value, and updating the offset difference value in real time according to the change of the vehicle position.
6. The method according to claim 1, wherein performing AR fusion processing on the virtual display element to obtain virtual display content of an AR projection display virtual image includes:
and adjusting the size and the angle of the virtual display element to enable the virtual display element to be fused with the target track to obtain the virtual display content.
7. The method of claim 1, further comprising:
and responding to the fact that the driving mode is a track training mode, downloading a pre-constructed game scene, and performing track scene simulation training in the game scene after a target simulation track is determined.
8. A driving guide device for a racing scene, comprising:
a scenario confirmation module configured to: determining a driving mode in response to the use scene of the vehicle being a track scene;
a track confirmation module configured to: responding to the driving mode as a track real driving mode, and determining a target track;
a display content determination module configured to: responding to the acquired projection construction data applied to the target track, and determining virtual display content of an AR projection display virtual image according to the projection construction data;
a guidance information determination module configured to: determining driving guidance information based on the acquired vehicle control data;
a projection module configured to: and projecting the virtual display content containing the driving guide information to an AR projection display virtual image for display.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 6 when executing the program.
10. A vehicle comprising the driving guide device for a racing scene of claim 8 or the electronic device of claim 9.
CN202211331602.6A 2022-10-27 2022-10-27 Driving guide method and device for track scene, electronic equipment and vehicle Pending CN115691266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211331602.6A CN115691266A (en) 2022-10-27 2022-10-27 Driving guide method and device for track scene, electronic equipment and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211331602.6A CN115691266A (en) 2022-10-27 2022-10-27 Driving guide method and device for track scene, electronic equipment and vehicle

Publications (1)

Publication Number Publication Date
CN115691266A true CN115691266A (en) 2023-02-03

Family

ID=85045238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211331602.6A Pending CN115691266A (en) 2022-10-27 2022-10-27 Driving guide method and device for track scene, electronic equipment and vehicle

Country Status (1)

Country Link
CN (1) CN115691266A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117850602A (en) * 2024-03-08 2024-04-09 厦门精图信息技术有限公司 AI electronic map and intelligent application method thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination