CN113743526A - Lane line fusion method and system of AR-HUD - Google Patents

Lane line fusion method and system of AR-HUD

Info

Publication number
CN113743526A
CN113743526A (application CN202111074620.6A)
Authority
CN
China
Prior art keywords
lane line
hud
data
road surface
line image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111074620.6A
Other languages
Chinese (zh)
Inventor
王双军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hozon New Energy Automobile Co Ltd
Original Assignee
Hozon New Energy Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hozon New Energy Automobile Co Ltd filed Critical Hozon New Energy Automobile Co Ltd
Priority to CN202111074620.6A
Publication of CN113743526A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a lane line fusion method and system for an AR-HUD. The method comprises the following steps: acquiring a lane line image of the road surface; detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the image to obtain fold line change data; receiving human eye position data in the vehicle and calibration data of the AR-HUD device; and obtaining the lane line image to be displayed by the AR-HUD device from the fold line change data in combination with the eye position data and the calibration data. The invention allows the lane line image displayed by the AR-HUD device to fuse well with the actual road lane lines, thereby improving the safety of driver assistance and automated driving.

Description

Lane line fusion method and system of AR-HUD
Technical Field
The invention relates generally to the field of intelligent driving, and in particular to a lane line fusion method, system, apparatus and computer-readable medium for an AR-HUD.
Background
Currently, AR-HUDs (Augmented Reality Head-Up Displays) are increasingly fitted to vehicles. An AR-HUD projects an image of the driving environment onto the windshield, so the driver no longer needs to look down at the instrument cluster or the center-console display of the in-vehicle infotainment system, which improves driving safety and the overall driving experience.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a lane line fusion method, system, apparatus and computer-readable medium for an AR-HUD that achieve good fusion between the lane line image displayed by the AR-HUD device and the actual road lane lines, thereby improving the safety of driver assistance and automated driving.
To solve this technical problem, the invention provides an AR-HUD lane line fusion method comprising the following steps:
acquiring a lane line image of the road surface; detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the image to obtain fold line change data; receiving human eye position data in the vehicle and calibration data of the AR-HUD device; and obtaining the lane line image to be displayed by the AR-HUD device from the fold line change data in combination with the eye position data and the calibration data.
In an embodiment of the present invention, the fold line change data include the position of the fold point, the angle of the fold line, and the height ratio of the fold line changed portion in the lane line image of the road surface.
In an embodiment of the present invention, the method further comprises projecting the image of the lane line displayed by the AR-HUD device to a windshield surface of the vehicle.
In an embodiment of the present invention, obtaining the lane line image displayed by the AR-HUD device according to the fold line change data in combination with the eye position data and the calibration data includes:
converting the road-surface lane line image data in which the fold line change occurs, the eye position data and the calibration data into the same coordinate system, and calculating the lane line image to be displayed by the AR-HUD device.
The invention also provides an AR-HUD lane line fusion system, which comprises:
the road surface image acquisition module is used for acquiring lane line image data of the road surface; the image processing module is used for detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the image to obtain fold line change data; the image fusion module receives human eye position data in the vehicle and calibration data of the AR-HUD device, and obtains the lane line image to be displayed by the AR-HUD device from the fold line change data, the eye position data and the calibration data.
The invention also provides an AR-HUD lane line fusion device, comprising: a memory for storing instructions executable by a processor; and a processor that executes the instructions to implement any of the methods described above.
The invention also provides a computer-readable medium having computer program code stored thereon which, when executed by a processor, implements any of the methods described above.
Compared with the prior art, the invention has the following advantages: the AR-HUD lane line fusion method can adjust the lane line image displayed by the AR-HUD device when the vehicle drives up or down a slope, so that it fuses better with the lane lines of the actual road surface, which contributes to the safe and stable driving of the vehicle.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the principle of the application. In the drawings:
fig. 1 is a flowchart of a lane line fusion method of the AR-HUD according to an embodiment of the present application.
FIG. 2 is a schematic diagram of a lane-line fusion system of the AR-HUD according to an embodiment of the present application.
Fig. 3A is a schematic diagram of a road lane line image when a vehicle ascends an incline according to an embodiment of the present application.
Fig. 3B is a schematic diagram of a road lane line image when a vehicle descends a slope according to an embodiment of the present application.
Fig. 4 is a schematic diagram of calculating the fold line change data of the road lane line image according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a lane line fusion apparatus of the AR-HUD according to an embodiment of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are merely examples or embodiments of the application, on the basis of which a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In the description of the present application, it is to be understood that the orientation or positional relationship indicated by the directional terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom", etc., are generally based on the orientation or positional relationship shown in the drawings, and are used for convenience of description and simplicity of description only, and in the case of not making a reverse description, these directional terms do not indicate and imply that the device or element being referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore, should not be considered as limiting the scope of the present application; the terms "inner and outer" refer to the inner and outer relative to the profile of the respective component itself.
It should be noted that the terms "first", "second", and the like are used to define the components, and are only used for convenience of distinguishing the corresponding components, and the terms have no special meanings unless otherwise stated, and therefore, the scope of protection of the present application is not to be construed as being limited. Further, although the terms used in the present application are selected from publicly known and used terms, some of the terms mentioned in the specification of the present application may be selected by the applicant at his or her discretion, the detailed meanings of which are described in relevant parts of the description herein. Further, it is required that the present application is understood not only by the actual terms used but also by the meaning of each term lying within.
It will be understood that when an element is referred to as being "on," "connected to," "coupled to" or "contacting" another element, it can be directly on, connected or coupled to, or contacting the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly on," "directly connected to," "directly coupled to" or "directly contacting" another element, there are no intervening elements present. Similarly, when a first component is said to be "in electrical contact with" or "electrically coupled to" a second component, there is an electrical path between the first component and the second component that allows current to flow. The electrical path may include capacitors, coupled inductors, and/or other components that allow current to flow even without direct contact between the conductive components.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order in which they are performed. Rather, various steps may be processed in reverse order or simultaneously. Meanwhile, other operations are added to or removed from these processes.
Embodiments of the present application describe a lane line fusion method, system, apparatus, and computer-readable medium for AR-HUD.
Fig. 1 is a flowchart of a lane line fusion method of the AR-HUD according to an embodiment of the present application.
FIG. 2 is a schematic diagram of a lane-line fusion system of the AR-HUD according to an embodiment of the present application.
Referring to fig. 1, the lane line fusion method of the AR-HUD includes: step 101, acquiring a lane line image of the road surface; step 102, detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the image to obtain fold line change data; step 103, receiving eye position data of the occupant in the vehicle and calibration data of the AR-HUD device; and step 104, obtaining the lane line image to be displayed by the AR-HUD device from the fold line change data in combination with the eye position data and the calibration data.
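For orientation only, the following Python sketch arranges these four steps as functions; the camera, dms and hud objects, the function names and the return values are illustrative assumptions and are not part of the described method.

```python
# Illustrative sketch of the four-step flow of Fig. 1 (all names are assumptions).

def acquire_lane_line_image(camera):
    """Step 101: capture one frame of the road-surface lane line image."""
    return camera.read()

def detect_fold_line_change(lane_image):
    """Step 102: detect whether the lane lines bend (e.g. at an uphill or
    downhill transition) and, if so, return the fold line change data."""
    # Lane line detection and bend analysis would go here.
    return None

def fuse(fold_change, eye_position, calibration):
    """Step 104: combine fold line change data, eye position data and AR-HUD
    calibration data into the lane line image to be displayed."""
    # Coordinate transformation and drawing would go here
    # (see the sketch accompanying step 104 below).
    return None

def ar_hud_lane_line_fusion(camera, dms, hud):
    lane_image = acquire_lane_line_image(camera)              # step 101
    fold_change = detect_fold_line_change(lane_image)         # step 102
    eye_position = dms.eye_position()                         # step 103
    calibration = hud.calibration_data()                      # step 103
    if fold_change is not None:
        return fuse(fold_change, eye_position, calibration)   # step 104
    return None
```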
Referring to fig. 2, the lane line fusion system 200 of the AR-HUD includes a road surface image acquisition module 202, an image processing module 204, and an image fusion module 206.
Some AR-HUD devices already collect ADAS data, vehicle body data and high-precision map data. However, one problem remains unsolved across manufacturers: accurate virtual-real fusion. On a flat, straight road with clear dashed or solid markings on both sides of the lane in which the vehicle is travelling, the ADAS can easily capture those markings, the data are sent to the AR-HUD for computation, and the drawn lane lines fit the real ones accurately; in this case the AR-HUD works normally. But when the vehicle approaches an uphill or downhill section, the lane lines ahead change shape compared with a level road. The virtual image computed from the ADAS data then no longer fits the lane lines on the slope ahead. This noticeably affects applications such as driver assistance, because the planned driving trajectory can deviate from the lane and disturb normal driving.
The technical solution of the AR-HUD lane line fusion method, system, apparatus and computer-readable medium of the present application can solve this problem.
In the lane line fusion method of the AR-HUD of the present application, specifically, in step 101, a lane line image of the road surface is acquired. In some embodiments, the lane line image of the road surface is acquired, for example, by a camera mounted on the vehicle. The camera may be mounted on the side or front of the vehicle body, or near the vehicle cab.
In some embodiments, the accuracy of lane line image acquisition can be further improved by using a distance-measuring device in combination with the camera.
Fig. 3A is a schematic diagram of a road lane line image when a vehicle ascends an incline according to an embodiment of the present application. Fig. 3B is a schematic diagram of a road lane line image when a vehicle descends a slope according to an embodiment of the present application.
In fig. 3A, a vehicle 301 travels uphill, for example in direction A, and a first fold line change appears in the lane lines 302 and 303 on both sides. In fig. 3B, a vehicle 304 travels downhill, for example in direction B, and a second fold line change appears in the lane lines 305 and 306 on both sides.
In step 102, a fold line change in the lane line image of the road surface is detected, and when such a change occurs, the lane line image of the road surface is processed to obtain fold line change data.
In some embodiments, the fold line change data include the position of the fold point, the angle of the fold line, and the height ratio of the fold line changed portion in the lane line image of the road surface.
Fig. 4 is a schematic diagram of calculating the fold line change data of the road lane line image according to an embodiment of the present application.
Specifically, fig. 4 illustrates the calculation of the fold line change data, taking as an example the fold line change in the road-surface lane line image when the vehicle ascends a slope.
Referring to fig. 4, the height of the lane line (more precisely, of the captured lane line image) 305 in the vertical direction of the image is L2, and the fold points of the lane line are C and D. A fold point may also be taken as the point at which the lane line image bends, for example the midpoint of C and D. The fold angle, denoted γ in fig. 4, can be obtained as the angle between the extension of the lane line direction before the bend and the lane line direction after the bend.
With continued reference to fig. 4, the height of the fold line changed portion of the lane line in the vertical direction of the image (also called the height direction) is L1; in other words, the projected height of the changed portion in the vertical image direction is L1. The height ratio of the fold line changed portion in the lane line image of the road surface can then be calculated as L1/L2. The fold line change data of the road-surface lane line image when the vehicle descends a slope can be calculated in a similar manner.
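Assuming the segments of one lane line before and after the bend have already been fitted as (x, y) points in image coordinates, the fold line change data of Fig. 4 could be computed roughly as in the following Python sketch; the function name, the choice of the post-bend segment as the changed portion, and the point representation are assumptions made for illustration.

```python
import numpy as np

def fold_line_change_data(pre_bend_pts, post_bend_pts):
    """Compute fold point, fold angle gamma and height ratio L1/L2 from two
    fitted lane line segments given as (x, y) image points (y grows downward)."""
    pre = np.asarray(pre_bend_pts, dtype=float)
    post = np.asarray(post_bend_pts, dtype=float)

    # Fold point: where the segment before the bend meets the segment after it
    # (point C or D in Fig. 4); here taken as the midpoint of the two endpoints.
    fold_point = (pre[-1] + post[0]) / 2.0

    # Fold angle gamma: angle between the extension of the lane line direction
    # before the bend and the lane line direction after the bend.
    d1 = pre[-1] - pre[0]
    d2 = post[-1] - post[0]
    cos_g = np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
    gamma_deg = np.degrees(np.arccos(np.clip(cos_g, -1.0, 1.0)))

    # Height ratio L1/L2: vertical extent of the changed (post-bend) portion
    # over the vertical extent of the whole lane line in the image.
    L1 = post[:, 1].max() - post[:, 1].min()
    L2 = max(pre[:, 1].max(), post[:, 1].max()) - min(pre[:, 1].min(), post[:, 1].min())
    height_ratio = L1 / L2 if L2 > 0 else 0.0

    return fold_point, gamma_deg, height_ratio
```

For instance, with pre-bend points [(100, 700), (110, 500)] and post-bend points [(110, 500), (118, 420)], this sketch yields a fold point near (110, 500), a fold angle of roughly 3 degrees, and a height ratio of about 0.29.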
Next, in step 103, eye position data and calibration data for the AR-HUD device within the vehicle are received.
In some embodiments, the eye position data in the vehicle are acquired, for example, by a DMS (Driver Monitoring System) camera. Specifically, the head position of the occupant can be represented by the coordinates of the point between the occupant's eyebrows.
As for the calibration data of the AR-HUD device: during manufacturing, the assembly of each vehicle's components introduces slight differences. Even when the hardware of the AR-HUD device is identical, its position parameters after installation differ slightly from vehicle to vehicle, which affects how the device is used. For example, across thousands of vehicles the angle between the windshield and the vehicle console may vary by about one degree between the largest and smallest values. In addition, after the AR-HUD device (or simply the AR-HUD) is installed, its left-right tilt is not perfectly horizontal in every vehicle; the maximum difference may be 3 degrees, 5 degrees or some other value. These small angles are further magnified after the light is refracted. Therefore, before the vehicle leaves the factory, each AR-HUD device must be adjusted so that its optical output matches a common reference, and the resulting calibration parameters are stored and used as the reference thereafter; these parameters are the calibration data.
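Purely as an illustration, the two inputs of step 103 might be carried in simple records such as the following; the field names, units and coordinate frames are assumptions, since the application does not prescribe a data layout.

```python
from dataclasses import dataclass

@dataclass
class EyePosition:
    # Coordinates of the point between the occupant's eyebrows, as reported by
    # the DMS camera, expressed in the DMS camera coordinate system (metres).
    x: float
    y: float
    z: float

@dataclass
class HudCalibration:
    # Per-vehicle calibration stored at the end of the production line so that
    # every AR-HUD produces a consistent optical reference.
    windshield_console_angle_deg: float   # angle between windshield and console
    hud_roll_deg: float                   # left-right tilt of the installed HUD
    virtual_image_distance_m: float       # distance to the HUD virtual image plane
```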
After the eye position data and the calibration data of the AR-HUD device in the vehicle are received in step 103, the lane line image to be displayed by the AR-HUD device is obtained in step 104 from the fold line change data in combination with the eye position data and the calibration data.
In some embodiments, obtaining the lane line image displayed by the AR-HUD device from the fold line change data in combination with the eye position data and the calibration data includes:
converting the road-surface lane line image data in which the fold line change occurs, the eye position data and the calibration data into the same coordinate system, and calculating the lane line image to be displayed by the AR-HUD device. The same coordinate system is, for example, the vehicle coordinate system or the world coordinate system.
In an embodiment of the application, the lane line fusion method of the AR-HUD further includes projecting the lane line image displayed by the AR-HUD device onto the surface of the vehicle windshield, so that the driver sees the displayed lane line image. When the driver-assistance or automated-driving function of the vehicle is active, the driving trajectory can be planned on the basis of the lane line image displayed by the AR-HUD device, which enables safe and stable driving, avoids situations such as the vehicle leaving the lane area, and improves driving safety.
As previously mentioned, in some embodiments, the lane line fusion system 200 of the AR-HUD includes a road surface image acquisition module 202, an image processing module 204, and an image fusion module 206.
The road surface image acquisition module 202, the image processing module 204 and the image fusion module 206 may be coupled to each other by an electrical connection or a network connection, so as to transmit data and instructions.
Specifically, the road surface image acquisition module 202 may be configured to acquire lane line image data of the road surface. The image processing module 204 may detect a fold line change in the lane line image of the road surface and, when such a change occurs, process the image to obtain fold line change data. The image fusion module 206 may receive the eye position data in the vehicle and the calibration data of the AR-HUD device, and obtain the lane line image to be displayed by the AR-HUD device from the fold line change data, the eye position data and the calibration data.
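A minimal structural sketch of the system of Fig. 2 in Python follows; the class and method names are assumptions chosen for illustration, and the wiring in LaneLineFusionSystem simply mirrors the module responsibilities described above.

```python
class RoadSurfaceImageAcquisitionModule:      # module 202
    def __init__(self, camera):
        self.camera = camera

    def acquire(self):
        """Return one frame of road-surface lane line image data."""
        return self.camera.read()


class ImageProcessingModule:                  # module 204
    def process(self, lane_image):
        """Detect a fold line change and return fold line change data,
        or None when the lane lines ahead are straight."""
        ...


class ImageFusionModule:                      # module 206
    def fuse(self, fold_change, eye_position, calibration):
        """Combine fold line change data, eye position data and AR-HUD
        calibration data into the lane line image to display."""
        ...


class LaneLineFusionSystem:                   # system 200 wiring the modules
    def __init__(self, acquisition, processing, fusion):
        # The modules may be coupled by an electrical or network connection.
        self.acquisition = acquisition
        self.processing = processing
        self.fusion = fusion

    def step(self, eye_position, calibration):
        image = self.acquisition.acquire()
        fold_change = self.processing.process(image)
        if fold_change is not None:
            return self.fusion.fuse(fold_change, eye_position, calibration)
        return None
```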
The AR-HUD lane line fusion method and system of the present application can adjust the lane line image displayed by the AR-HUD device when the vehicle drives up or down a slope, achieving better virtual-real fusion with the lane lines of the actual road surface, contributing to safe and stable driving of the vehicle and providing a good user experience.
The application also provides an AR-HUD lane line fusion device, comprising: a memory for storing instructions executable by the processor; and a processor for executing the instructions to implement the method as previously described.
FIG. 5 is a schematic diagram of the lane line fusion apparatus of the AR-HUD according to an embodiment of the present application. The lane line fusion apparatus 500 of the AR-HUD may include an internal communication bus 501, a processor 502, a read-only memory (ROM) 503, a random access memory (RAM) 504, and a communication port 505. The apparatus 500 is connected to a network through the communication port and can thereby be connected to other devices. The internal communication bus 501 enables data communication among the components of the apparatus 500. The processor 502 performs the determinations and issues prompts; in some embodiments, the processor 502 may consist of one or more processors. The communication port 505 enables sending and receiving of information and data over the network. The apparatus 500 may also include various forms of program storage units and data storage units, such as the read-only memory (ROM) 503 and the random access memory (RAM) 504, capable of storing data files used for computer processing and/or communication as well as the program instructions executed by the processor 502. The processor executes these instructions to implement the main parts of the method. The results of the processing may be communicated to a user device via the communication port for display on a user interface.
The lane line fusion apparatus 500 of the AR-HUD may be implemented as a computer program, stored in the memory and loaded into the processor 502 for execution, so as to implement the lane line fusion method of the AR-HUD of the present application.
The present application also provides a computer readable medium having stored thereon computer program code which, when executed by a processor, implements the lane line fusion method of the AR-HUD as described above.
Aspects of the present application may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software, any of which may be referred to as a "data block", "module", "engine", "unit", "component", or "system". The processor may be one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or a combination thereof. Furthermore, aspects of the present application may take the form of a computer product, embodied in one or more computer-readable media, that includes computer-readable program code. For example, computer-readable media may include, but are not limited to, magnetic storage devices (e.g., hard disks, floppy disks, magnetic strips), optical disks (e.g., compact disks (CDs), digital versatile disks (DVDs)), smart cards, and flash memory devices (e.g., cards, sticks, key drives).
The computer readable medium may comprise a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. The computer readable medium can be any computer readable medium that can communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, radio frequency signals, or the like, or any combination of the preceding.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Although the present application has been described with reference to the specific embodiments above, those skilled in the art will recognize that these embodiments are merely illustrative. Various changes and equivalent substitutions may be made without departing from the spirit of the application, and all changes and modifications to the above embodiments that come within the spirit of the application fall within the scope of its claims.

Claims (10)

1. An AR-HUD lane line fusion method, comprising the following steps:
acquiring a lane line image of a road surface;
detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the lane line image of the road surface to obtain fold line change data;
receiving human eye position data in the vehicle and calibration data of the AR-HUD device;
and obtaining a lane line image displayed by the AR-HUD device according to the fold line change data in combination with the human eye position data and the calibration data.
2. The AR-HUD lane line fusion method according to claim 1, wherein the fold line change data include the position of the fold point, the angle of the fold line, and the height ratio of the fold line changed portion in the lane line image of the road surface.
3. The method of lane line fusion of an AR-HUD according to claim 1, further comprising projecting a lane line image displayed by said AR-HUD device onto a vehicle windshield surface.
4. The AR-HUD lane line fusion method according to claim 1, wherein obtaining the lane line image displayed by the AR-HUD device according to the fold line change data in combination with the human eye position data and the calibration data comprises:
converting the road-surface lane line image data in which the fold line change occurs, the human eye position data and the calibration data into the same coordinate system, and calculating the lane line image displayed by the AR-HUD device.
5. An AR-HUD lane line fusion system, comprising:
a road surface image acquisition module for acquiring lane line image data of a road surface;
an image processing module for detecting a fold line change in the lane line image of the road surface and, when such a change occurs, processing the lane line image of the road surface to obtain fold line change data;
an image fusion module for receiving human eye position data in the vehicle and calibration data of the AR-HUD device, and obtaining a lane line image displayed by the AR-HUD device according to the fold line change data, the eye position data and the calibration data.
6. The AR-HUD lane line fusion system according to claim 5, wherein the fold line change data include the position of the fold point, the angle of the fold line, and the height ratio of the fold line changed portion in the lane line image of the road surface.
7. The AR-HUD lane line fusion system according to claim 5, further configured to project the lane line image displayed by the AR-HUD device onto a vehicle windshield surface.
8. The AR-HUD lane line fusion system according to claim 5, wherein obtaining the lane line image displayed by the AR-HUD device according to the fold line change data in combination with the eye position data and the calibration data comprises:
converting the road-surface lane line image data in which the fold line change occurs, the eye position data and the calibration data into the same coordinate system, and calculating the lane line image displayed by the AR-HUD device.
9. An AR-HUD lane line fusion device comprising:
a memory for storing instructions executable by the processor;
a processor executing the instructions to implement the method of any of claims 1-4.
10. A computer-readable medium having stored thereon computer program code which, when executed by a processor, implements the method of any of claims 1-4.
CN202111074620.6A 2021-09-14 2021-09-14 Lane line fusion method and system of AR-HUD Pending CN113743526A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111074620.6A CN113743526A (en) 2021-09-14 2021-09-14 Lane line fusion method and system of AR-HUD

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111074620.6A CN113743526A (en) 2021-09-14 2021-09-14 Lane line fusion method and system of AR-HUD

Publications (1)

Publication Number Publication Date
CN113743526A true CN113743526A (en) 2021-12-03

Family

ID=78738657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111074620.6A Pending CN113743526A (en) 2021-09-14 2021-09-14 Lane line fusion method and system of AR-HUD

Country Status (1)

Country Link
CN (1) CN113743526A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1388028A (en) * 2001-05-25 2003-01-01 现代自动车株式会社 Method for sampling to road information using vehicle-carrying camera and detecting for spaces of vehicles
JP2013092944A (en) * 2011-10-26 2013-05-16 Isuzu Motors Ltd Lane identification device
CN107679496A (en) * 2017-10-10 2018-02-09 深圳地平线机器人科技有限公司 Control the method and apparatus of vehicle and the vehicle including the device
CN109353279A (en) * 2018-12-06 2019-02-19 延锋伟世通电子科技(上海)有限公司 A kind of vehicle-mounted head-up-display system of augmented reality
CN110525342A (en) * 2019-08-30 2019-12-03 的卢技术有限公司 A kind of vehicle-mounted auxiliary driving method of AR-HUD based on deep learning and its system
CN110758286A (en) * 2019-10-22 2020-02-07 同济大学 AR-HUD (augmented reality) -based automobile side and rear blind spot detection system and method based on head-up display
US20200311442A1 (en) * 2017-11-10 2020-10-01 Denso Corporation Orientation detection device and non-transitory computer readable medium
CN113103965A (en) * 2021-04-15 2021-07-13 舍弗勒技术股份两合公司 Lane keeping control method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1388028A (en) * 2001-05-25 2003-01-01 现代自动车株式会社 Method for sampling to road information using vehicle-carrying camera and detecting for spaces of vehicles
JP2013092944A (en) * 2011-10-26 2013-05-16 Isuzu Motors Ltd Lane identification device
CN107679496A (en) * 2017-10-10 2018-02-09 深圳地平线机器人科技有限公司 Control the method and apparatus of vehicle and the vehicle including the device
US20200311442A1 (en) * 2017-11-10 2020-10-01 Denso Corporation Orientation detection device and non-transitory computer readable medium
CN109353279A (en) * 2018-12-06 2019-02-19 延锋伟世通电子科技(上海)有限公司 A kind of vehicle-mounted head-up-display system of augmented reality
CN110525342A (en) * 2019-08-30 2019-12-03 的卢技术有限公司 A kind of vehicle-mounted auxiliary driving method of AR-HUD based on deep learning and its system
CN110758286A (en) * 2019-10-22 2020-02-07 同济大学 AR-HUD (augmented reality) -based automobile side and rear blind spot detection system and method based on head-up display
CN113103965A (en) * 2021-04-15 2021-07-13 舍弗勒技术股份两合公司 Lane keeping control method and device

Similar Documents

Publication Publication Date Title
CN113370982B (en) Road bump area detection method and device, electronic equipment and storage medium
CN109284348A (en) A kind of update method of electronic map, device, equipment and storage medium
JP6376429B2 (en) Target point arrival detection device, target point arrival detection program, mobile device control system, and mobile
CN110909705B (en) Road side parking space sensing method and system based on vehicle-mounted camera
CN109102711A (en) The method for determining road safety velocity information
CN109196309A (en) Method for providing the method for track of vehicle information and for positioning pothole
CN107800747A (en) A kind of passenger's locating and displaying processing method and processing device
KR101448506B1 (en) Measurement Method and Apparatus for Measuring Curvature of Lane Using Behavior of Preceding Vehicle
US11475679B2 (en) Road map generation system and road map generation method
WO2018030010A1 (en) Road surface estimation device, vehicle control device, road surface estimation method, and program
JP2018092604A (en) Information processing device, imaging device, apparatus control system, movable body, information processing method, and program
CN109229109A (en) Judge the method, apparatus, equipment and computer storage medium of vehicle heading
CN107688174A (en) A kind of image distance-finding method, system, storage medium and vehicle-mounted visually-perceptible equipment
CN112991732A (en) Real-time curve rollover early warning system and method based on binocular camera
CN110458080A (en) The pre-judging method and system of front pit-hole in a kind of running car
CN109886088A (en) The determination method and Related product in congestion lane
CN110843775B (en) Obstacle identification method based on pressure sensor
CN105103211B (en) Place shows system, method and program
JP2020003463A (en) Vehicle's self-position estimating device
CN104471436A (en) Method and device for calculating a change in an image scale of an object
CN109859464A (en) Congestion Lane determining method and Related product
CN112183206A (en) Traffic participant positioning method and system based on roadside monocular camera
CN104697491A (en) Distance determination using a monoscopic imager in a vehicle
CN113743526A (en) Lane line fusion method and system of AR-HUD
CN114333390B (en) Method, device and system for detecting shared vehicle parking event

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 314500 988 Tong Tong Road, Wu Tong Street, Tongxiang, Jiaxing, Zhejiang

Applicant after: United New Energy Automobile Co.,Ltd.

Address before: 314500 988 Tong Tong Road, Wu Tong Street, Tongxiang, Jiaxing, Zhejiang

Applicant before: Hezhong New Energy Vehicle Co.,Ltd.