CN110920604A - Driving assistance method, driving assistance system, computing device, and storage medium - Google Patents

Driving assistance method, driving assistance system, computing device, and storage medium

Info

Publication number
CN110920604A
CN110920604A (application CN201811089401.3A)
Authority
CN
China
Prior art keywords
information
vehicle
driving
lane
traffic sign
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811089401.3A
Other languages
Chinese (zh)
Inventor
徐嘉南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811089401.3A
Publication of CN110920604A
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
        • B60 - VEHICLES IN GENERAL
            • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
                • B60R1/00 - Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
            • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
                • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
                • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
                    • B60W40/02 - related to ambient conditions
                        • B60W40/04 - Traffic conditions
                • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
                    • B60W50/08 - Interaction between the driver and the control system
                        • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
                            • B60W2050/146 - Display means
    • G - PHYSICS
        • G08 - SIGNALLING
            • G08G - TRAFFIC CONTROL SYSTEMS
                • G08G1/00 - Traffic control systems for road vehicles
                    • G08G1/09 - Arrangements for giving variable traffic instructions
                        • G08G1/0962 - having an indicator mounted inside the vehicle, e.g. giving voice messages
                            • G08G1/0968 - Systems involving transmission of navigation instructions to the vehicle
                                • G08G1/096805 - where the transmitted instructions are used to compute a route
                                • G08G1/096833 - where different aspects are considered when computing the route
                                • G08G1/096855 - where the output is provided in a suitable form to the driver

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Transportation (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The disclosure provides a driving assistance method and a driving assistance system. The method identifies environmental information around a vehicle, determines the lane in which the vehicle is located, and, based on the environmental information and the lane information, generates traffic sign information for guiding driving behavior, making the vehicle easier to drive. In addition, the method can be combined with a navigation tool (such as a navigation engine) to convert navigation information into more refined traffic sign information, such as lane-level sign information, so as to provide a more accurate and convenient navigation service.

Description

Driving assistance method, driving assistance system, computing device, and storage medium
Technical Field
The present disclosure relates to the field of vehicle driving, and in particular, to a driving assistance method, a driving assistance system, a computing device, and a storage medium.
Background
Most existing vehicle navigation schemes plan a navigation path from the vehicle's current position and destination, and generate navigation prompt information from the vehicle's real-time position while driving. Such schemes mainly broadcast the navigation prompt information to the user by voice, and the broadcast prompts are mostly rough phrases such as "turn right after driving 500 meters ahead" or "turn right at the intersection ahead". After receiving a prompt, the user must still combine it with the actual road ahead to decide on a driving action, which makes the whole driving process demanding.
Some navigation schemes use 2D or 3D pictures to depict the road layout so that the user can grasp the navigation prompt information intuitively. The drawback of such schemes is that, after seeing both the cartoon picture and the real road surface, the user cannot effectively map one onto the other; in particular at complex intersections, where guidance is given through an enlarged cartoon image of the intersection, no intuitive cognitive mapping is achieved.
Disclosure of Invention
An object of the present disclosure is to provide a driving assistance scheme that makes it easier for a user to drive a vehicle.
According to a first aspect of the present disclosure, there is provided a driving assist method including: identifying environmental information around the vehicle; determining lane information of the vehicle; based on the environmental information and the lane information, traffic sign information for guiding driving behavior is generated.
Optionally, the traffic sign information is lane-level sign information.
Optionally, the step of generating traffic sign information further comprises: based on the navigation information, the environment information, and the lane information, traffic sign information corresponding to the navigation information is generated.
Optionally, the step of generating traffic sign information corresponding to the navigation information comprises: if the navigation information is a prompt to go straight ahead, generating a straight-ahead sign for guiding the vehicle to continue along the current lane.
Optionally, the length of the straight-ahead sign varies with the distance between the vehicle and the vehicle ahead.
Optionally, the step of generating traffic sign information corresponding to the navigation information comprises: if the navigation information is a prompt to turn, generating a turn sign for guiding the vehicle from the current lane to the turning position and through the turn.
Optionally, when the distance between the vehicle and the turning position is greater than a first distance, the turn sign is a straight guide sign; when the distance is less than the first distance but greater than a second distance, the second distance being less than the first distance, the turn sign is a combination of the straight guide sign and a turn guide sign; and when the distance is less than the second distance, the turn sign is the turn guide sign.
Optionally, the step of generating traffic sign information corresponding to the navigation information comprises: when it is determined from the navigation information that the vehicle should move from the current lane to a target lane, generating a merge sign for guiding the vehicle from the current lane to the target lane.
Optionally, the method further comprises: performing the step of generating traffic sign information for guiding driving behavior in response to the vehicle arriving at, or being about to arrive at, a predetermined scene.
Optionally, the predetermined scene comprises at least one of: a scene for which the navigation engine generates an enlarged intersection view; an overpass scene; a roundabout scene.
Optionally, the environmental information comprises at least one of: a lane line; a vehicle; a pedestrian; a traffic sign; road surface obstacles.
Optionally, the traffic sign information comprises at least one of: a straight-lane straight-ahead sign; a curved-lane straight-ahead sign; a U-turn sign; a turn sign; a merge sign; a roundabout entry sign; a roundabout exit sign; a highway entry sign; a highway exit sign; an error sign.
Optionally, the method further comprises: the traffic sign information is visually presented.
Optionally, the step of visually presenting the traffic sign information comprises: displaying the traffic sign information overlaid on the live-action road.
Optionally, the step of displaying the traffic sign information overlaid on the live-action road comprises: acquiring a live-action road image; determining rendering parameter information based on the live-action road image; and rendering using the rendering parameter information, so that the traffic sign information is displayed overlaid on the live-action road.
Optionally, the step of rendering using the rendering parameter information comprises: drawing the traffic sign information onto the live-action road image based on the rendering parameter information; the method further comprises: presenting the rendered live-action road image on a vehicle-mounted display screen.
Optionally, the step of rendering using the rendering parameter information comprises: drawing the traffic sign information on the vehicle-mounted display screen based on the rendering parameter information, so that the traffic sign information displayed on the screen matches the live-action road.
Optionally, the step of determining rendering parameter information comprises: identifying image feature points in the live-action road image; determining, based on the image feature points, the relation between on-screen road size and actual distance in the live-action road image; and generating the rendering parameter information from that relation.
Optionally, the rendering parameter information comprises at least one of: a position; a rotation angle; a size change; a duration.
Optionally, the step of identifying environmental information around the vehicle comprises: imaging the surroundings of the vehicle; and analyzing the resulting image to identify the environmental information around the vehicle.
Optionally, the step of determining the lane information where the vehicle is located comprises: determining the lane in which the vehicle is located based on one or more of a signal positioning algorithm, a dead reckoning algorithm, and an environmental feature matching algorithm.
Optionally, the method further comprises: playing voice broadcast information consistent with the traffic sign information.
According to a second aspect of the present disclosure, there is also provided a driving assistance system including: the system comprises an image acquisition module, a display module and a display module, wherein the image acquisition module is used for imaging the periphery of the vehicle to obtain an image containing environmental information of the periphery of the vehicle; the image recognition module is used for analyzing the image generated by the image acquisition module so as to recognize the environmental information around the vehicle; the lane positioning module is used for determining lane information of the vehicle; and a sign generation module for generating traffic sign information for guiding driving behavior based on the environment information and the lane information.
According to a third aspect of the present disclosure, there is also provided a computing device comprising: a processor; and a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform a method as set forth in the first aspect of the disclosure.
According to a fourth aspect of the present disclosure, there is also provided a non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the method as set forth in the first aspect of the present disclosure.
The present disclosure generates traffic sign information for guiding driving behavior by means of lane-level positioning information and environmental information around the vehicle. This makes driving easier for a user and can also support unmanned driving. In addition, the scheme can be combined with a navigation tool (such as a navigation engine) to convert navigation information into more refined traffic sign information, so as to provide users with a more accurate and convenient navigation service.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 is a schematic block diagram showing the structure of a driving assistance system according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram showing the display of traffic sign information on the in-vehicle display screen.
Fig. 3 is a schematic flowchart illustrating a driving assistance method according to an embodiment of the present disclosure.
Fig. 4 is a schematic structural diagram of a computing device that can be used to implement the driving assistance method according to an embodiment of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The present disclosure provides a driving assistance scheme in which traffic sign information for guiding driving behavior is generated by means of lane-level positioning information and environmental information around the vehicle. The traffic sign information is sign information that can guide driving behavior; for example, it may guide key driving actions such as going straight, turning, merging, making a U-turn, and entering a roundabout. The traffic sign information may be lane-level sign information, in order to guide driving behavior at the finer lane level.
The driving assistance scheme can be combined with an existing navigation tool (such as a navigation engine) to convert navigation information into more refined traffic sign information, such as lane-level traffic sign information, so as to provide users with a more accurate and convenient navigation service. For example, when the navigation engine generates the prompt "go straight ahead for 100 meters" from the vehicle position and the navigation path, a straight-ahead sign for guiding the vehicle along the current lane may be generated from the recognized environmental information and the lane in which the vehicle is located; when the navigation engine generates a turn prompt, a turn sign for guiding the vehicle from the current lane toward the target lane may be generated from the environmental information and lane information acquired in real time.
Moreover, when combined with an existing navigation tool (such as a navigation engine), the driving assistance scheme of the present disclosure can also, to a certain extent, avoid accidents caused by inaccurate navigation. For example, suppose the navigation information suggests traveling 300 meters ahead along the current road, but the road actually ends 100 meters ahead; a user who drove exactly as navigated could cause a traffic accident. With the driving assistance scheme of the present disclosure, traffic sign information for proceeding along the current lane is displayed, and when no road is detected ahead, the traffic sign information changes to an error sign that alerts the user.
In addition, the driving assistance scheme of the present disclosure may also automatically generate traffic sign information for guiding driving behavior during driving, without depending on the navigation information of a navigation engine. For example, whether the current driving behavior violates a traffic rule may be determined from the environmental information and lane information acquired in real time, and, in case of a violation, traffic sign information guiding the user toward the correct driving action may be generated. As another example, driving errors may be counted from the historical driving behavior of users (a group of users or a single user) to identify scenes with high error rates (such as complex viaduct or roundabout scenes); when the vehicle travels into such a scene, the driving assistance scheme of the present disclosure may be executed automatically and lane-level sign information for guiding driving behavior may be generated, reducing the user's error rate in that scene. Special scenes may also be defined from expert experience, including but not limited to scenes for which the navigation engine generates an enlarged intersection view, viaduct scenes, and roundabout scenes; when the vehicle travels into such a special scene, the driving assistance scheme may likewise be executed automatically. The screening of special scenes can thus be seeded by expert experience in an early stage, while during execution the driver's error rate can be learned by machine learning and special scenes marked automatically.
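As an illustration of the scene-based triggering described above, the following sketch flags a scene as "special" either by expert definition or when its learned historical error rate exceeds a threshold. The scene names, the 10% threshold, and the statistics structure are illustrative assumptions, not values from the patent.

```python
# Hedged sketch: deciding when to activate lane-level sign generation.
# Scene names and the threshold are hypothetical.

EXPERT_SPECIAL_SCENES = {"enlarged_intersection", "viaduct", "roundabout"}

def error_rate(stats, scene):
    """Fraction of past traversals of `scene` that involved a driving error.
    `stats` maps scene name -> (total traversals, traversals with errors)."""
    total, errors = stats.get(scene, (0, 0))
    return errors / total if total else 0.0

def should_assist(scene, stats, threshold=0.10):
    """Activate for expert-defined special scenes or scenes whose learned
    historical error rate exceeds the threshold."""
    return scene in EXPERT_SPECIAL_SCENES or error_rate(stats, scene) > threshold

stats = {"highway_merge": (200, 30), "straight_road": (1000, 5)}
print(should_assist("roundabout", stats))     # expert-defined special scene
print(should_assist("highway_merge", stats))  # 15% error rate exceeds 10%
print(should_assist("straight_road", stats))  # 0.5% error rate, no trigger
```

In this design the expert list seeds the early stage, and the statistics side takes over as driving history accumulates, matching the two-phase screening the text describes.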
Further, the traffic sign information generated by the driving assistance scheme of the present disclosure can be displayed on vehicle-mounted screens such as the center console screen, the instrument panel, and a HUD, matched to the road scene, thereby realizing an AR display. This gives the user a more intuitive and friendly navigation experience and overcomes the difficulty of interpreting traditional 2D or 3D navigation pictures.
The following further describes aspects of the present disclosure.
[ term interpretation ]
AR: augmented Reality (AR) is a technology for calculating the position and angle of a camera image in real time and adding corresponding images and videos.
Image recognition technology: a technique of processing, analyzing and understanding images with a computer in order to recognize targets and objects of various kinds.
Lane positioning: positioning accurate to the specific lane in which the vehicle travels, whereas traditional navigation positioning supports only road-level positioning and navigation.
[ Driving assistance System ]
Fig. 1 is a schematic block diagram showing the structure of a driving assistance system according to an embodiment of the present disclosure. The driving assistance system 100 may implement the driving assistance scheme of the present disclosure by combining technologies such as image recognition, lane positioning, and image rendering. The driving assistance system 100 may be installed in an in-vehicle operating system to provide a driving assistance service for a user or for unmanned driving; for example, it may be combined with a navigation module 140 to provide a more refined navigation service.
FIG. 1 illustrates a number of modules that may be involved in some embodiments. It should be understood that not all of these modules are necessary to implement the technical solution of the present disclosure; as the detailed description below makes clear, some scenarios do not require all of them.
The image capturing module 110 is configured to image the surroundings of the vehicle to obtain images containing environmental information around the vehicle; the captured images may be video or still pictures. Optionally, the image capturing module 110 may be a video signal capture module comprising a right-side road surface capture module, a front main road capture module, and a left-side road surface capture module mounted on the vehicle. For example, the video signal capture module may be an in-vehicle camera that captures the road conditions in front of and to the left and right of the vehicle in real time, producing a video stream containing the environmental information around the vehicle.
The images captured by the image capturing module 110 may be passed to the image recognition module 120, which analyzes them (image recognition) to identify the environmental information around the vehicle. The environmental information mentioned here can be one or more kinds of road condition information, such as the road on which the vehicle is located and the lane lines, pedestrians, traffic signs, and road surface obstacles on that road.
The present disclosure mainly generates lane-level sign information, and therefore requires the image recognition module 120 to recognize lane lines in greater detail. Preferably, the identified lane lines may include, but are not limited to, single white solid lines, double yellow solid lines, white dashed lines, yellow dashed lines, diversion lines, zebra crossings, stop lines, road speed-limit markings, yellow dashed-solid lines, heavily worn or occluded lines, and edge lines occluded by vehicles and pedestrians. The specific image recognition algorithm is not described here.
The lane positioning module 130 is used to determine the lane in which the vehicle is located. It may do so by fusing several positioning algorithms, such as signal positioning (e.g., GPS), dead reckoning, and environmental feature matching. In addition, the lane positioning module 130 may combine the lane lines identified by the image recognition module 120 to achieve more accurate lane positioning. Signal positioning, dead reckoning, and environmental feature matching are all existing positioning technologies, and the specific implementation of lane positioning based on multiple positioning technologies is not repeated here.
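One simple way to picture the fusion of positioning sources described above is a weighted average of lateral-offset estimates that is then snapped to a lane index. The weights, the 3.5 m lane width, and all function names below are illustrative assumptions; the patent does not specify a fusion formula.

```python
# Hedged sketch of multi-source lane positioning: fuse lateral-offset
# estimates (meters from the road's left edge) and snap to a lane index.

LANE_WIDTH_M = 3.5  # assumed typical lane width

def fuse_lateral_offset(estimates):
    """estimates: list of (offset_m, weight) pairs, one per positioning
    source (GPS signal positioning, dead reckoning, feature matching)."""
    total_w = sum(w for _, w in estimates)
    return sum(o * w for o, w in estimates) / total_w

def lane_index(offset_m, num_lanes):
    """Convert a fused lateral offset into a 0-based lane index."""
    idx = int(offset_m // LANE_WIDTH_M)
    return max(0, min(num_lanes - 1, idx))

estimates = [(5.6, 0.2),   # GPS signal positioning (low weight: noisy)
             (5.1, 0.3),   # dead reckoning
             (5.3, 0.5)]   # matching against recognized lane lines
offset = fuse_lateral_offset(estimates)
print(lane_index(offset, num_lanes=3))  # second lane from the left: index 1
```

Giving the lane-line matching source the largest weight reflects the text's point that the recognized lane lines sharpen the positioning beyond what signal positioning alone provides.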
The sign generation module 150 is the core rule module of the driving assistance system 100 and generates traffic sign information for guiding driving behavior according to specific rules. The user may perform the corresponding driving action according to the traffic sign information, and the generated traffic sign information can also inform decisions in unmanned driving.
[ Generation of sign information ]
The traffic sign information is sign information capable of guiding driving behavior, and may optionally be lane-level sign information capable of guiding driving behavior at the lane level, such as sign information for guiding key driving actions like going straight, turning, merging, making a U-turn, entering and exiting a roundabout, and entering and exiting a highway. Specifically, the traffic sign information may include, but is not limited to, a straight-lane straight-ahead sign, a curved-lane straight-ahead sign, a U-turn sign, a turn sign, a merge sign, a roundabout entry sign, a roundabout exit sign, a highway entry sign, a highway exit sign, and an error sign. Briefly: the straight-lane straight-ahead sign guides the vehicle straight ahead along the extension of the current lane; the curved-lane straight-ahead sign guides the vehicle along the curve of the current lane, and is generated when the lane bends with a certain curvature; the U-turn sign guides the vehicle to turn around from the current lane to a target lane; the turn sign may be a left-turn or right-turn sign; the merge sign guides the vehicle to merge from the current lane to a target lane, either left or right; the roundabout entry sign is generated when the vehicle is entering or about to enter a roundabout, and the roundabout exit sign when it is exiting or about to exit one; the highway entry sign is generated when the vehicle is entering or about to enter a highway section, and the highway exit sign when it is exiting or about to exit one; the error sign may be a lane error sign, or another error sign such as a route error sign.
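The sign vocabulary just listed can be captured as a small enumeration. The member names below are illustrative; the patent defines the ten categories in prose only.

```python
# Hedged sketch: the sign categories described above as a Python Enum.
from enum import Enum, auto

class TrafficSign(Enum):
    STRAIGHT_LANE_AHEAD = auto()  # straight ahead along a straight lane
    CURVED_LANE_AHEAD = auto()    # follow the curvature of the current lane
    U_TURN = auto()
    TURN = auto()                 # left or right turn
    MERGE = auto()                # left or right merge to a target lane
    ROUNDABOUT_ENTRY = auto()
    ROUNDABOUT_EXIT = auto()
    HIGHWAY_ENTRY = auto()
    HIGHWAY_EXIT = auto()
    ERROR = auto()                # e.g. wrong lane or wrong route

print(len(TrafficSign))  # the ten sign categories listed in the text
```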
Generation method one
In one embodiment of the present disclosure, the navigation module 140 may be used to obtain navigation information. For example, the navigation module 140 may be an in-vehicle navigation engine that plans routes in real time based on the position and destination of the vehicle and generates navigation prompt information. Alternatively, the navigation module 140 may connect to an external navigation engine (such as a mobile phone navigation engine) and receive the navigation information that engine generates. The navigation information may include route planning information and navigation prompt information generated based on the vehicle position.
The sign generation module 150 may generate traffic sign information corresponding to the navigation information according to the navigation information, the identified environmental information, and the lane in which the vehicle is located. Since the navigation information may include both route planning information and navigation prompt information, the traffic sign information may be generated with reference to either.
For example, whether the vehicle is traveling in the correct lane may be determined from the route planning information and the lane in which the vehicle currently is, and, if it is not, a merge sign may be generated to guide the vehicle from the current lane to the correct lane. As another example, whether the vehicle's driving direction is wrong may be determined from the route planning information and the current lane information, and, if so, a U-turn sign guiding the vehicle from the current lane to the target lane may be generated. Traffic sign information may also be generated from the navigation prompt information produced in real time; the generated traffic sign information can then be regarded as the lane-level translation of that prompt.
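The rules just described can be sketched as a single decision function: compare the planned lane with the current lane and check whether the heading opposes the route. The function name, field names, and the 90-degree heading threshold are assumptions for illustration, not values from the patent.

```python
# Hedged sketch of the sign-generation rules above: merge when in the
# wrong lane, U-turn when driving against the route, otherwise straight.

def generate_sign(current_lane, correct_lane, heading_error_deg):
    """Lanes are 0-based indices from the left; heading_error_deg is the
    signed angle between the vehicle heading and the planned route."""
    if abs(heading_error_deg) > 90:  # driving away from the route
        return "u_turn"
    if current_lane != correct_lane:
        return "merge_left" if correct_lane < current_lane else "merge_right"
    return "straight"

print(generate_sign(current_lane=2, correct_lane=0, heading_error_deg=3))
print(generate_sign(current_lane=1, correct_lane=1, heading_error_deg=170))
print(generate_sign(current_lane=1, correct_lane=1, heading_error_deg=0))
```

The direction check comes first so that a wrong-direction vehicle is not merely told to merge within a road it should leave, mirroring the priority implied by the text.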
The following is an exemplary description of the generation process and animation display effect of several kinds of traffic sign information.
1. Straight-ahead sign
In the case where the navigation information is prompt information for going straight (such as "go straight for 300 m"), a straight-ahead sign for guiding the vehicle to continue along the current lane may be generated. The length of the straight-ahead sign may change as the distance between the vehicle and the vehicle ahead changes.

As an example, when the navigation function is turned on, a straight guide line for guiding the vehicle to keep going straight may be generated as the straight-ahead sign. The length of the straight guide line may vary with the distance to the vehicle ahead. The animation display rule of the straight guide line may be set so that the fluid effect and the line length do not extend past the horizon or the tail of the vehicle ahead, leaving a pixel gap before the tail of the vehicle ahead. When another sign (such as a turn sign or a merge sign) is triggered, the straight-ahead sign may be closed.
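The length rule above can be sketched as a simple clamp. The gap value and the metric units are assumptions made for illustration (the disclosure only specifies "a pixel gap"):

```python
def straight_guide_length(dist_to_front_vehicle_m: float,
                          dist_to_horizon_m: float,
                          gap_m: float = 0.5) -> float:
    """Length of the straight-ahead guide line.

    The line follows the vehicle ahead but never crosses the horizon or
    the leading vehicle's tail, leaving a small gap before the tail."""
    limit = min(dist_to_front_vehicle_m - gap_m, dist_to_horizon_m)
    return max(0.0, limit)
```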
2. Turn driving sign
In the case where the navigation information is prompt information for a turn (e.g., "turn right at the intersection ahead"), a turn driving sign for guiding the vehicle to travel from the current lane to the turn position and make the turn may be generated.
As an example, detection may be initiated when the navigation engine generates turn prompt information, and a corresponding turn driving sign may be generated according to the distance between the vehicle and the turn position. The turn driving sign may be a straight guide sign when the vehicle is more than a first distance (e.g., 30 meters) from the turn position, a combination of a straight guide sign and a turn guide sign when the vehicle is less than the first distance but more than a second distance (e.g., 10 meters) from the turn position, and a turn guide sign when the vehicle is less than the second distance from the turn position, where the second distance is smaller than the first distance. The straight guide sign may be a straight guide line, and the turn guide sign may be a turn guide arrow. The direction of the turn guide sign needs to be consistent with the navigation prompt direction, distinguishing left from right. The turn driving sign may be closed after the navigation engine detects that the turn is completed, when a vehicle approaches from behind, or when the turn signal is switched off.
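The three-stage distance rule for the turn driving sign might be implemented as below. The returned labels are illustrative, and the 30 m / 10 m defaults are taken from the example distances in the text:

```python
def turn_sign_form(distance_to_turn_m: float,
                   first_m: float = 30.0,
                   second_m: float = 10.0) -> tuple:
    """Choose the turn sign's form from the distance to the turn position.

    Far away: only a straight guide line; approaching: line plus arrow;
    close: only the turn guide arrow."""
    if distance_to_turn_m > first_m:
        return ("straight_guide_line",)
    if distance_to_turn_m > second_m:
        return ("straight_guide_line", "turn_guide_arrow")
    return ("turn_guide_arrow",)
```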
3. Merge sign
In the case where the vehicle needs to travel from the current lane to a target lane based on the navigation information, a merge sign for guiding the vehicle from the current lane to the target lane may be generated. For example, when it is determined that the vehicle is currently in a lane that does not correspond to the navigation information and needs to travel to the target lane, a merge sign for traveling from the current lane to the target lane may be generated. Generation of the merge sign may be triggered when the distance from a solid line (a single solid line, a double solid line, or the solid side of a dashed-solid line) is more than 3 m. The trigger condition for the merge sign can be judged in various ways, which are not described herein again.
In the case where it is determined that a merge is required, a merge sign visually guiding the user to perform the merge driving operation may be generated when there is no vehicle in the target lane, or when there is a vehicle in the target lane more than 10 m away. For example, the merge sign may be a visual guide arrow that extends at most to the tail of the leading vehicle; when there is a vehicle in the target lane less than 10 m away, the merge sign may be displayed normally and the vehicle may be highlighted.
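The merge-sign trigger rules above can be sketched as a small state decision. The 3 m and 10 m thresholds come from the text; the state names and function signature are assumptions:

```python
def merge_sign_state(dist_to_solid_line_m: float,
                     target_lane_vehicle_dist_m: float = None) -> str:
    """Decide whether to show the merge sign and whether to highlight
    the vehicle in the target lane."""
    if dist_to_solid_line_m <= 3.0:
        return "off"                       # too close to a solid line to merge
    if target_lane_vehicle_dist_m is None:
        return "show"                      # target lane is empty
    if target_lane_vehicle_dist_m > 10.0:
        return "show"
    return "show_with_vehicle_highlight"   # vehicle closer than 10 m
```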
As shown in fig. 2, when the vehicle is far from the leading vehicle in the target lane, the merge sign may be displayed at a specific position on a head-up display (HUD) so that it is visually superimposed on the live-action road. The drawing shows two merge signs, merge sign 1 and merge sign 2. Merge sign 1 is a short visual guide arrow used to guide the merge operation; merge sign 2 is displayed from farther away and extends to the tail of the leading vehicle in the target lane, identifying that vehicle. Merge sign 1 and merge sign 2 may be rendered in a color different from that of the road and highlighted, for example in bright green.
When it is detected that the vehicle enters an area with a lane line that forbids lane changing (a solid line, or the solid side of a dashed-solid line), the lane-change reminder may be suppressed. After the merge, the navigation engine may re-plan the route according to the lane the vehicle has entered.
So far, taking the straight-ahead sign, the turn driving sign, and the merge sign as examples, the generation mechanisms and display effects of several kinds of traffic sign information have been exemplarily described. As described above, the traffic sign information may further include various other signs, such as a curve straight-ahead sign, a u-turn driving sign, a roundabout entry sign, a roundabout exit sign, a highway entry sign, a highway exit sign, and an error sign; lane-level signs of different categories may have corresponding generation mechanisms and display rules, which are not described in detail in this disclosure.
Generation mode two
In another embodiment of the present disclosure, the sign generation module 150 may also automatically generate traffic sign information for guiding driving behavior while the user is driving, without depending on navigation information from a navigation engine. For example, the sign generation module 150 may determine, from the environment information and the lane information acquired in real time, whether the current driving behavior violates a traffic rule, and generate traffic sign information guiding the user to take the correct driving action when a rule is violated. For another example, driving errors may be counted from the historical driving behavior of users (a group of users or a single user) to identify scenes with high error rates (such as complex scenes like viaducts and roundabouts); when the vehicle travels into such an error-prone scene, the sign generation module 150 may generate lane-level sign information for guiding driving behavior, reducing the user's driving error rate. In addition, some special complex scenes may be defined from expert experience, including but not limited to scenes in which the navigation engine produces an enlarged intersection view (e.g., a scene in which 2D navigation shows an enlarged intersection), viaduct scenes, and roundabout scenes; the sign generation module 150 may generate lane-level sign information for guiding driving behavior when the vehicle travels into such a special scene. Complex scenes may initially be screened based on expert experience; during operation, drivers' error rates may be learned through machine learning and the special scenes marked automatically.
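Counting driving errors per scene type to mark error-prone scenes might look like the following. The record format and the error-rate threshold are assumptions used purely for illustration:

```python
from collections import defaultdict

def mark_error_prone_scenes(drive_records, threshold: float = 0.3) -> set:
    """Mark scene types with a high historical driving-error rate.

    `drive_records` is an iterable of (scene_type, made_error) pairs
    gathered from historical driving behavior."""
    counts = defaultdict(lambda: [0, 0])   # scene -> [errors, total]
    for scene, made_error in drive_records:
        counts[scene][1] += 1
        if made_error:
            counts[scene][0] += 1
    return {s for s, (err, total) in counts.items() if err / total > threshold}
```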
Display of sign information
The generated traffic sign information may be displayed on the display module 170. The display module 170 may be an on-vehicle display module, and may include, but is not limited to, an on-vehicle display screen such as a center control screen, a dashboard, and a HUD screen. Preferably, the display module 170 may display the traffic sign information on the live-action road in an overlapping manner, that is, in an AR display manner, so as to help the user obtain a more intuitive and friendly driving experience and solve the problem of cognitive difficulty of the conventional 2D or 3D navigation picture. Preferably, the traffic sign information may be displayed superimposed on the corresponding lane in the live-action road, so that the user may make the corresponding driving action directly based on what is seen, without any further cognitive conversion of what is seen.
The traffic sign information may be drawn onto the display module 170 by the rendering module 160. Specifically, the rendering module 160 may first acquire the live-action road image from the image acquisition module 110 and determine rendering parameter information based on it. For example, the rendering module 160 may identify image feature points such as the road center axis and the road horizon in the live-action road image and, based on these feature points, determine the relationship between road picture size and actual distance, for example by comparing the image feature points with a distance feature library prepared in advance. Based on this relationship, parameter information for rendering may be generated, such as position, rotation angle, size change, and duration. Finally, the rendering module 160 may perform image rendering based on the generated rendering parameters to draw the traffic sign information onto the display module 170.
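A deliberately simplified mapping from the picture-size/actual-distance relationship to rendering parameters is sketched below. Real perspective projection is nonlinear and camera-calibrated; all names and the linear model here are assumptions for illustration:

```python
def rendering_params(horizon_y_px: float, bottom_y_px: float,
                     visible_distance_m: float,
                     sign_distance_m: float, sign_length_m: float) -> dict:
    """Map a sign's real-world position to screen-space rendering parameters
    using a linear pixels-per-meter relation between the image bottom and
    the detected horizon."""
    px_per_m = (bottom_y_px - horizon_y_px) / visible_distance_m
    y = bottom_y_px - sign_distance_m * px_per_m      # screen y of sign start
    h = sign_length_m * px_per_m                      # on-screen sign length
    return {"position_y_px": y, "size_px": h}
```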
It should be noted that corresponding animation display effects and display rules may be set in advance for each type of traffic sign information. For example, the animation display rule of the straight-ahead sign may be: the length of the straight-ahead sign changes with the distance to the vehicle ahead. The animation display rule of the turn sign may be: the turn sign is a straight guide line when the distance to the turn position is more than 30 meters, changes to a straight guide line plus a turn guide arrow when the distance is less than 30 meters, and changes to a turn guide arrow alone when the distance is less than 10 meters. The rendering module 160 may refer to the corresponding animation display effect and display rule when determining the rendering parameter information, and may render based on the display effect and display rule corresponding to the traffic sign information to be rendered once the rendering parameters are determined.
In the present disclosure, the rendering module 160 may draw the traffic sign information onto the live-action road image and present the rendered image on the display module 170 to implement the AR display. In addition, as shown in fig. 2, the display module 170 may be a vehicle-mounted display screen (e.g., a HUD head-up display), and the rendering module 160 may also draw the traffic sign information directly onto that screen so that the displayed traffic sign information matches the real road, likewise implementing the AR display. Optionally, since different display screens differ in picture and screen size, image effect adaptation may be performed for the characteristics of the current screen by a screen adaptation module (not shown in the figure); the adaptation process is not described herein again.
As an example of the present disclosure, the sign generation module 150 may be used to generate and schedule various types of traffic sign information, display time, display location, and the like. The rendering module 160 may obtain the generated traffic sign information sequence from the sign generation module 150, the traffic sign information sequence including the queue time and the traffic sign information. The rendering module 160 may render each traffic sign information in the sequence according to rendering parameter information such as a position, a rotation angle, a size change, a sequence duration, and the like, and finally display the rendered traffic sign information on the display module 170.
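The sign-information sequence above, ordered by queue time for the rendering module, can be sketched with a priority queue. The `(time, sign)` tuple format is an assumption:

```python
import heapq

def schedule_signs(sign_events):
    """Yield generated traffic sign information in queue-time order so the
    rendering module can draw each sign at its scheduled moment."""
    heap = list(sign_events)
    heapq.heapify(heap)          # order by the queue-time field
    while heap:
        yield heapq.heappop(heap)
```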
The prompt module 180 may play voice broadcast information consistent with the traffic sign information to help the user better understand the traffic sign information. Optionally, the prompt module 180 may integrate the navigation information and the traffic sign information into a voice broadcast message to help the user to better understand the navigation intention.
In summary, the driving assistance system of the present disclosure may be combined with a navigation engine to convert navigation information into finer sign information for guiding driving behavior at a lane level, and may combine traffic sign information with live-action roads for AR display. Therefore, the driving assistance system of the present disclosure can be implemented as an AR navigation system, and the following system modifications can be performed on the original vehicle-mounted operating system to implement the AR navigation system.
1) An additional fixed front camera, used to support the image acquisition module 110 in acquiring road images. A camera signal acquisition and processing unit must be added to the in-vehicle system, along with configuration of image size, frame rate, and image transmission format. Optionally, on head units that can bear higher hardware cost, left and right fisheye cameras may be added to capture the left and right lane pictures, which can improve the safety of turn and merge guidance. If these cameras are added, the video signal processing units in the system need to be expanded accordingly.
2) Increased computing power. To support real-time image recognition, rendering, and related work, the hardware computing power of the whole in-vehicle system needs to be increased; the specific increase is computed dynamically according to the function points actually used. A computing-power resource allocation unit needs to be added to the in-vehicle system to allocate computing and image-processing resources.
3) An AR navigation mode switching control unit. The AR-effect navigation mode may be provided as an alternative navigation view mode in the in-vehicle operating system. After the user actively clicks the switching control unit, the default 2D navigation picture can be switched to the AR navigation mode. In addition, in some special complex scenes, including but not limited to scenes in which 2D navigation shows an enlarged intersection view, viaduct scenes, and roundabout scenes, the system may switch to the AR navigation mode automatically. Complex scenes may initially be screened based on navigation expert experience; during operation, the system may learn drivers' error rates through machine learning and mark complex scenes automatically.
[ METHOD FOR VEHICLE NAVIGATION ]
Fig. 3 is a schematic flowchart illustrating a driving assistance method according to an embodiment of the present disclosure. The method shown in fig. 3 may be implemented by the driving assistance system shown in fig. 1. The basic implementation process of the driving assistance method is described below, and for the details involved therein, the above description may be referred to, and details are not described below.
Referring to fig. 3, in step S310, environmental information around the vehicle is identified.
The environmental information around the vehicle may be one or more kinds of road condition information, such as the road on which the vehicle is located, lane lines on the road, pedestrians, traffic signs, and road surface obstacles. The surroundings of the vehicle may be imaged, and the resulting image may be analyzed to identify the environmental information around the vehicle.
In step S320, the lane information where the vehicle is located is determined.
The lane information in which the vehicle is located may be determined based on one or more of a signal location algorithm, a dead reckoning algorithm, and an environmental feature matching algorithm.
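As one of the positioning techniques listed above, a single dead-reckoning step can be sketched as follows. This is illustrative only; a real system would fuse dead reckoning with GNSS signal positioning and environmental feature matching:

```python
import math

def dead_reckon(x: float, y: float, heading_deg: float,
                speed_mps: float, dt_s: float) -> tuple:
    """One dead-reckoning step: propagate the vehicle position from its
    speed and heading over a small time interval."""
    rad = math.radians(heading_deg)
    return (x + speed_mps * dt_s * math.sin(rad),   # east displacement
            y + speed_mps * dt_s * math.cos(rad))   # north displacement
```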
In step S330, traffic sign information for guiding driving behavior is generated based on the environmental information and the lane information. For the traffic sign information and the generation process thereof, reference may be made to the above description, which is not repeated herein.
After the traffic sign information is obtained, it may also be visually displayed. For example, the traffic sign information may be displayed superimposed on the live-action road to realize AR display of the traffic sign information. For the display of the traffic sign information, reference may be made to the above description, which is not repeated here.
In addition, voice broadcast information consistent with the traffic sign information can be played. Also, the step of generating traffic sign information for guiding the driving behavior may be performed in response to detecting that the vehicle arrives or is about to arrive at a predetermined scene. The predetermined scene may be a scene with a high driving error rate of the user or a preset special scene, such as but not limited to a scene in which the navigation engine generates an enlarged intersection, a viaduct scene, a roundabout scene, and the like.
[ calculating device ]
Fig. 4 is a schematic structural diagram of a computing device that can be used to implement the driving assistance method according to an embodiment of the present disclosure.
Referring to fig. 4, computing device 400 includes memory 410 and processor 420.
The processor 420 may be a multi-core processor or may include a plurality of processors. In some embodiments, processor 420 may include a general-purpose host processor and one or more special coprocessors such as a Graphics Processor (GPU), a Digital Signal Processor (DSP), or the like. In some embodiments, processor 420 may be implemented using custom circuits, such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).
The memory 410 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions required by the processor 420 or other modules of the computer. The permanent storage device may be a readable and writable storage device, and may be a non-volatile storage device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is used as the permanent storage device. In other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or an optical drive). The system memory may be a readable and writable memory device or a volatile readable and writable memory device, such as dynamic random access memory. The system memory may store instructions and data that some or all of the processors require at runtime. In addition, the memory 410 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory); magnetic disks and/or optical disks may also be employed. In some embodiments, the memory 410 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM or dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., an SD card, a mini SD card, or a Micro-SD card), or a magnetic floppy disk. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 410 has stored thereon executable code that, when processed by the processor 420, may cause the processor 420 to perform the driving assistance methods described above.
The driving assistance method, system, and computing device according to the present disclosure have been described in detail above with reference to the accompanying drawings.
The present disclosure can combine image recognition, high-precision positioning, and navigation technology with AR display technology in an in-vehicle system to achieve lane-level drawing and display effects. Compared with this scheme, current products differ greatly in ease of use, practicality, and user experience.
Specifically, manufacturers and university research institutes have proposed AR navigation in various forms and schemes, whose disadvantages include: a) they can only display schematic icons and cannot accurately identify the specific lane, offering no great practical improvement over a traditional navigation map; due to the limitations of lane positioning technology, products on the market can only prompt two functions, going straight and turning, while driving behaviors that strongly depend on accurate lane information (such as merging, u-turns, and roundabouts) cannot be effectively guided or covered, which is the greatest limitation of current market schemes; b) the schematic icons are displayed in a relatively fixed manner, mostly as static icons (as in HUD AR schemes); the reminding effect is weak, the icons are hard to notice when superimposed on the real scene, and they are not noticeably easier for the owner to attend to than traditional navigation information, so the technology's value cannot be effectively realized.
By comprehensively using image recognition, high-precision positioning, and navigation technology, the lane positioning scheme of the present disclosure achieves more accurate AR navigation display and more forms of navigation sign prompts, improving the owner's driving experience. The specific advantages include: 1) the types of AR navigation signs supported by current market products are greatly expanded, including left and right merges, u-turns, roundabout entry and exit, highway entry and exit, and the like; 2) the AR navigation forms already on the market (straight-ahead and turn) are displayed with higher precision, accurately attached to the current lane, helping the user understand the navigation more intuitively and effectively.
Furthermore, the method according to the present disclosure may also be implemented as a computer program or computer program product comprising computer program code instructions for performing the above-mentioned steps defined in the above-mentioned method of the present disclosure.
Alternatively, the present disclosure may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the various steps of the above-described method according to the present disclosure.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (25)

1. A driving assist method characterized by comprising:
identifying environmental information around the vehicle;
determining lane information of the vehicle;
and generating traffic sign information for guiding driving behaviors based on the environment information and the lane information.
2. The driving assist method according to claim 1, characterized in that the traffic sign information is lane-level sign information.
3. The driving assist method according to claim 1, wherein the step of generating traffic sign information further includes:
generating traffic sign information corresponding to the navigation information based on the navigation information, the environment information, and the lane information.
4. The driving assist method according to claim 3, wherein the step of generating traffic sign information corresponding to the navigation information includes:
in the case where the navigation information is prompt information for going straight, generating a straight-ahead mark for guiding the vehicle to go forward along the current lane.
5. The driving assist method according to claim 4,
the length of the straight ahead mark varies with the distance between the vehicle and the preceding vehicle.
6. The driving assist method according to claim 3, wherein the step of generating traffic sign information corresponding to the navigation information includes:
in the case where the navigation information is prompt information for steering travel, generating a steering driving mark for guiding the vehicle to travel from the current lane to a steering position and make the turn.
7. The driving assist method according to claim 6,
the steering driving mark is a straight-ahead guiding mark when the distance between the vehicle and the steering position is larger than a first distance,
the steering driving mark is a combination of a straight guide mark and a steering guide mark when the distance between the vehicle and the steering position is less than a first distance and greater than a second distance, and the second distance is less than the first distance,
and in the case that the distance from the vehicle to the turning position is less than a second distance, the turning driving mark is a turning guide mark.
8. The driving assist method according to claim 3, wherein the step of generating traffic sign information corresponding to the navigation information includes:
when it is determined according to the navigation information that the vehicle needs to travel from the current lane to a target lane, generating a merging mark for guiding the vehicle from the current lane to the target lane.
9. The driving assist method according to claim 1, characterized by further comprising:
the step of generating traffic sign information for guiding driving behavior is performed in response to the vehicle arriving at or being about to arrive at a predetermined scene.
10. The driving assist method according to claim 9, characterized in that the predetermined scene includes at least one of:
the navigation engine generates a scene of an enlarged intersection;
an overpass scene;
a roundabout scenario.
11. The driving assist method according to claim 1, characterized in that the environmental information includes at least one of:
a lane line;
a vehicle;
a pedestrian;
a traffic sign;
road surface obstacles.
12. The driving assist method according to claim 1, characterized in that the traffic sign information includes at least one of:
a straight-ahead sign;
a curve straight-ahead sign;
a u-turn driving sign;
a steering driving sign;
a merging sign;
a roundabout entry sign;
a roundabout exit sign;
a highway entry sign;
a highway exit sign;
an error sign.
13. The driving assist method according to claim 1, characterized by further comprising:
visually displaying the traffic sign information.
14. The driving assist method according to claim 13, wherein the step of visually presenting the traffic sign information includes:
displaying the traffic sign information superimposed on the live-action road.
15. The driving assist method according to claim 14, wherein the step of displaying the traffic sign information superimposed on the live-action road includes:
acquiring a live-action road image;
determining rendering parameter information based on the live-action road image;
performing image rendering using the rendering parameter information so as to display the traffic sign information superimposed on the live-action road.
16. The driving assistance method according to claim 15, wherein the step of image rendering using the rendering parameter information includes: drawing the traffic sign information onto the live-action road image based on the rendering parameter information, the method further comprising:
and presenting the rendered live-action road image on the vehicle-mounted display screen.
17. The driving assistance method according to claim 15, wherein the step of image rendering using the rendering parameter information includes:
drawing the traffic sign information onto a vehicle-mounted display screen based on the rendering parameter information, so that the traffic sign information displayed on the vehicle-mounted display screen matches the live-action road.
18. The driving assistance method according to claim 15, wherein the step of determining rendering parameter information includes:
identifying image feature points in the live-action road image;
determining, based on the image feature points, a relationship between road image dimensions and actual distances in the live-action road image; and
generating the rendering parameter information based on the relationship between road image dimensions and actual distances.
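The image-dimension-to-actual-distance relationship in claim 18 can be illustrated with a minimal scale estimate taken from two lane-boundary feature points on the same image row. The 3.5 m lane width and all names below are illustrative assumptions.

```python
def pixels_per_meter(left_px, right_px, lane_width_m=3.5):
    """Estimate the image scale on one image row from two lane-boundary
    feature points, assuming a known real-world lane width."""
    return abs(right_px - left_px) / lane_width_m

# 140 px between lane boundaries across an assumed 3.5 m lane.
scale = pixels_per_meter(left_px=250.0, right_px=390.0)  # 40 px per metre
gap_m = 80.0 / scale                                     # an 80 px gap spans 2 m
```

A production system would estimate this scale per row (perspective foreshortening) and feed it into the rendering parameters.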
19. The driving assistance method according to claim 7, wherein the rendering parameter information includes at least one of:
a position;
a rotation angle;
a size change;
a duration.
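The four parameter types listed in claim 19 map naturally onto a small container; the field names below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class RenderParams:
    position: tuple       # (row, col) anchor of the sign in the frame
    rotation_deg: float   # rotation angle applied to the sign glyph
    scale: float          # size change relative to the base glyph
    duration_s: float     # how long the sign remains on screen

params = RenderParams(position=(200, 300), rotation_deg=15.0,
                      scale=1.5, duration_s=3.0)
```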
20. The driving assistance method according to claim 1, wherein the step of identifying environmental information around the vehicle includes:
imaging the surroundings of the vehicle; and
analyzing the resulting image to identify the environmental information around the vehicle.
21. The driving assistance method according to claim 1, wherein the step of determining the lane information of the vehicle includes:
determining the lane information of the vehicle based on one or more of a signal positioning algorithm, a dead reckoning algorithm, and an environmental feature matching algorithm.
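Of the three positioning techniques named in claim 21, dead reckoning is the simplest to sketch: integrate speed along the current heading between fixes. All names and units below are assumptions.

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, dt):
    """Advance a 2-D pose estimate by integrating vehicle speed along
    the heading for dt seconds; a lane locator can fuse this with
    satellite fixes and environmental feature matches."""
    return (x + speed_mps * dt * math.cos(heading_rad),
            y + speed_mps * dt * math.sin(heading_rad))

# 10 m/s at heading 0 for half a second -> 5 m along the x axis.
x, y = dead_reckon(0.0, 0.0, heading_rad=0.0, speed_mps=10.0, dt=0.5)
```

Because dead reckoning drifts, it is typically corrected by the other two techniques whenever they produce a fix.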
22. The driving assistance method according to claim 1, characterized by further comprising:
playing voice broadcast information consistent with the traffic sign information.
23. A driving assistance system, characterized by comprising:
an image acquisition module configured to image the surroundings of the vehicle to obtain an image containing environmental information around the vehicle;
an image recognition module configured to analyze the image generated by the image acquisition module, so as to identify the environmental information around the vehicle;
a lane positioning module configured to determine lane information of the vehicle; and
a sign generation module configured to generate traffic sign information for guiding driving behavior based on the environmental information and the lane information.
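The four modules of claim 23 compose into a simple pipeline. Everything below (class names, the stubbed detections, the sign strings) is an illustrative assumption, not the patented implementation.

```python
class ImageAcquisition:
    def capture(self):
        # Stand-in for grabbing a camera frame of the vehicle's surroundings.
        return "raw-frame"

class ImageRecognition:
    def analyze(self, frame):
        # Stand-in for detecting environmental features in the frame.
        return {"objects": ["lane-line", "roundabout-entry"]}

class LanePositioning:
    def locate(self):
        # Stand-in for lane determination (e.g. second lane of three).
        return {"lane": 2, "lane_count": 3}

class SignGeneration:
    def generate(self, environment, lane):
        # Combine environment and lane context into a guidance sign.
        if "roundabout-entry" in environment["objects"]:
            return "enter-roundabout sign"
        return "no sign"

frame = ImageAcquisition().capture()
sign = SignGeneration().generate(ImageRecognition().analyze(frame),
                                 LanePositioning().locate())
```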
24. A computing device, comprising:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 22.
25. A non-transitory machine-readable storage medium having executable code stored thereon which, when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1 to 22.
CN201811089401.3A 2018-09-18 2018-09-18 Driving assistance method, driving assistance system, computing device, and storage medium Pending CN110920604A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811089401.3A CN110920604A (en) 2018-09-18 2018-09-18 Driving assistance method, driving assistance system, computing device, and storage medium

Publications (1)

Publication Number Publication Date
CN110920604A true CN110920604A (en) 2020-03-27

Family

ID=69855788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811089401.3A Pending CN110920604A (en) 2018-09-18 2018-09-18 Driving assistance method, driving assistance system, computing device, and storage medium

Country Status (1)

Country Link
CN (1) CN110920604A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1503354A1 (en) * 2003-07-30 2005-02-02 Robert Bosch Gmbh Generating traffic information by interpretation of traffic sign scenarios and navigation information in a vehicle
WO2009084135A1 (en) * 2007-12-28 2009-07-09 Mitsubishi Electric Corporation Navigation system
JP2011149835A (en) * 2010-01-22 2011-08-04 Clarion Co Ltd Car navigation device
US20130151145A1 (en) * 2011-12-13 2013-06-13 Ken Ishikawa Display system, display method, and display program
KR20150054022A (en) * 2013-11-08 2015-05-20 현대오트론 주식회사 Apparatus for displaying lane changing information using head-up display and method thereof
US20160327402A1 (en) * 2014-02-05 2016-11-10 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for vehicle and display method of display apparatus for vehicle
CN106448260A (en) * 2015-08-05 2017-02-22 Lg电子株式会社 Driver assistance apparatus and vehicle including the same
CN107665506A (en) * 2016-07-29 2018-02-06 成都理想境界科技有限公司 Realize the method and system of augmented reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
车云: "智能汽车:决战2020", pages: 165 - 168 *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583696A (en) * 2020-05-15 2020-08-25 咸阳师范学院 Unmanned vehicle control system and operation method thereof
CN111967301A (en) * 2020-06-30 2020-11-20 北京百度网讯科技有限公司 Positioning navigation method, device, electronic equipment and storage medium
WO2022016953A1 (en) * 2020-07-22 2022-01-27 Oppo广东移动通信有限公司 Navigation method and apparatus, storage medium and electronic device
CN112683293A (en) * 2020-12-15 2021-04-20 东风汽车有限公司 Vehicle-mounted navigation method, electronic equipment and storage medium
CN112683293B (en) * 2020-12-15 2024-06-21 东风汽车有限公司 Vehicle navigation method, electronic equipment and storage medium
CN112860924A (en) * 2020-12-31 2021-05-28 广州方纬智慧大脑研究开发有限公司 Road traffic auxiliary sign generation method, device, equipment and medium
CN112860924B (en) * 2020-12-31 2024-06-04 广州方纬智慧大脑研究开发有限公司 Road traffic auxiliary sign generation method, device, equipment and medium
CN112885087A (en) * 2021-01-22 2021-06-01 北京嘀嘀无限科技发展有限公司 Method, apparatus, device and medium for determining road condition information and program product
CN112885132A (en) * 2021-01-27 2021-06-01 安徽蓝鸟智能停车科技产业化有限公司 Intelligent unmanned dispatching system based on AI and automatic driving method
CN113237490A (en) * 2021-02-08 2021-08-10 上海博泰悦臻网络技术服务有限公司 AR navigation method, system, electronic device and storage medium
CN113352890B (en) * 2021-06-29 2022-05-13 广州小鹏汽车科技有限公司 Display method, vehicle-mounted terminal, vehicle and storage medium
CN113352890A (en) * 2021-06-29 2021-09-07 广州小鹏汽车科技有限公司 Display method, vehicle-mounted terminal, vehicle and storage medium
CN113352889A (en) * 2021-06-29 2021-09-07 广州小鹏汽车科技有限公司 Display method, vehicle-mounted terminal, vehicle and storage medium
CN113362629A (en) * 2021-07-21 2021-09-07 张铂虎 Area positioning navigation system based on traffic sign information line and working method thereof
CN113362629B (en) * 2021-07-21 2024-02-27 张铂虎 Regional positioning navigation system based on traffic sign information line and working method thereof
CN114435403B (en) * 2022-02-22 2023-11-03 重庆长安汽车股份有限公司 Navigation positioning checking system and method based on environment information
CN114435403A (en) * 2022-02-22 2022-05-06 重庆长安汽车股份有限公司 Navigation positioning checking system and method based on environmental information
CN114572112A (en) * 2022-02-25 2022-06-03 智己汽车科技有限公司 Augmented reality method and system for automobile front windshield
CN115071733A (en) * 2022-07-21 2022-09-20 成都工业职业技术学院 Auxiliary driving method and device based on computer
CN115071733B (en) * 2022-07-21 2022-10-25 成都工业职业技术学院 Auxiliary driving method and device based on computer
CN114993337A (en) * 2022-08-08 2022-09-02 泽景(西安)汽车电子有限责任公司 Navigation animation display method and device, ARHUD and storage medium
CN114993337B (en) * 2022-08-08 2022-11-15 泽景(西安)汽车电子有限责任公司 Navigation animation display method and device, ARHUD and storage medium
CN116489318B (en) * 2023-06-25 2023-08-22 北京易控智驾科技有限公司 Remote driving method and device for automatic driving vehicle
CN116489318A (en) * 2023-06-25 2023-07-25 北京易控智驾科技有限公司 Remote driving method and device for automatic driving vehicle

Similar Documents

Publication Publication Date Title
CN110920604A (en) Driving assistance method, driving assistance system, computing device, and storage medium
US10733462B2 (en) Travel assistance device and computer program
CN110926487A (en) Driving assistance method, driving assistance system, computing device, and storage medium
JP4886597B2 (en) Lane determination device, lane determination method, and navigation device using the same
KR20210038633A (en) Conditional availability of vehicle mixed reality
CN111460865A (en) Driving assistance method, driving assistance system, computing device, and storage medium
EP3627110B1 (en) Method for planning trajectory of vehicle
CN111207768B (en) Information prompting method, device, equipment and storage medium for navigation process
CN111351503A (en) Driving assistance method, driving assistance system, computing device, and storage medium
JP2019164611A (en) Traveling support device and computer program
JP6444508B2 (en) Display control device and navigation device
KR20130066210A (en) Device and method of acquiring traffic-control sign information using a camera
JP2022006844A (en) Object detection method and object detection device
CN114582153B (en) Ramp entry long solid line reminding method, system and vehicle
CN111348055A (en) Driving assistance method, driving assistance system, computing device, and storage medium
CN112683293A (en) Vehicle-mounted navigation method, electronic equipment and storage medium
JP5308810B2 (en) In-vehicle video display
JP2022041286A (en) Display control device, display control method, and display control program
WO2018207308A1 (en) Display control device and display control method
JP6388723B2 (en) Display control device and navigation device
JP2024018205A (en) Superimposed image display device
JP2023038558A (en) Superimposed image display device
JP2023131981A (en) Superimposed image display device
JP2023097939A (en) Superimposed image display device
JP2021081232A (en) Superimposed image display device and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201125

Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Limited

Address before: 4th Floor, Capital Building, P.O. Box 847, Grand Cayman, Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40027800

Country of ref document: HK