CN115423987A - Vehicle-mounted live-action navigation system based on AR technology - Google Patents

Vehicle-mounted live-action navigation system based on AR technology

Info

Publication number
CN115423987A
Authority
CN
China
Prior art keywords
image
navigation
data
information
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210973547.4A
Other languages
Chinese (zh)
Inventor
孟庆贺
马文峰
马良
李振龙
于振勇
姜杨阳
李文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Bestune Car Co Ltd
Original Assignee
FAW Bestune Car Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Bestune Car Co Ltd filed Critical FAW Bestune Car Co Ltd
Priority to CN202210973547.4A
Publication of CN115423987A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)

Abstract

The invention relates to a vehicle-mounted live-action navigation system based on AR technology, which comprises an image information acquisition module, an AR engine processing module and an image rendering module. The image information acquisition module is used for acquiring a real-time video stream and transmitting it to the AR engine processing module through LVDS. The AR engine processing module extracts image frames from the real-time video stream input by the camera, performs data fusion with the image recognition data and the map basic data, calculates the image data actually to be depicted, and sends it to the image rendering module. After receiving the data, the image rendering module performs AR image rendering, superimposed display and spatial coordinate system conversion. Relying on the underlying map data and the navigation engine framework, the system combines image processing, data training and AR (augmented reality) technology, fuses data from multiple sensors such as GPS and the camera, enhances the visual effect on the real road, realizes accurate immersive navigation, and helps the user understand live-action road navigation information more intuitively.

Description

Vehicle-mounted live-action navigation system based on AR technology
Technical Field
The invention belongs to the technical field of intelligent connected vehicles, and particularly relates to a vehicle-mounted live-action navigation system based on AR (augmented reality) technology.
Background
Traditional navigation is mainly displayed on a 2D planar map, whose visual effect is not intuitive for the user, and it is easy to miss an intersection on an unfamiliar route or when guidance is unclear. To let the user better understand the real road conditions ahead, a live-action navigation system fuses the road live-action picture with navigation information, so that the user can grasp the navigation guidance intent and make driving decisions in a shorter time, bringing a more convenient interactive experience.
Disclosure of Invention
The invention aims to provide a vehicle-mounted live-action navigation system based on AR technology, so as to realize accurate immersive navigation and help the user understand live-action road navigation information more intuitively.
The purpose of the invention is realized by the following technical scheme:
a vehicle-mounted live-action navigation system based on AR technology comprises an image information acquisition module, an AR engine processing module and an image rendering module;
the image information acquisition module is used for acquiring a real-time video stream and transmitting it to the AR engine processing module through LVDS; the AR engine processing module comprises a navigation engine SDK unit, an ADAS SDK unit and a map basic data unit, extracts image frames from the real-time video stream input by the camera, performs data fusion with the image recognition data and the map basic data, calculates the image data actually to be depicted and sends it to the image rendering module; after receiving the data, the image rendering module performs AR image rendering, superimposed display, spatial coordinate system conversion and animation effects according to the vehicle speed.
Furthermore, the image information acquisition module acquires the real-time video stream through a live-action navigation camera.
Furthermore, the resolution of the live-action navigation camera is 1 million pixels, the installed horizontal viewing angle is 52° (±5°), and the vertical viewing angle is 38° (±5°).
Further, the navigation engine SDK unit extracts image frames from the real-time video stream input by the camera and transmits them to the ADAS SDK unit for processing; meanwhile, it performs data fusion with the image recognition data and the map basic data, and calculates the image data actually to be depicted.
Furthermore, the ADAS SDK unit performs image recognition on the image frame sequence using OpenCV, and feeds back the recognized preceding vehicle information, lane lines and other data to the navigation engine SDK unit.
Further, the map basic data unit transmits basic information such as real-time GPS position information, navigation TBT (turn-by-turn) route information and map basic data to the navigation engine SDK unit.
Further, the image rendering module can perform 2D/3D spatial coordinate system conversion on the data fused by the AR engine processing module.
Furthermore, the image rendering module comprises an OpenGL calibration unit, and can render the navigation TBT information or POI positions in real time by combining the fused data, the vehicle speed and the coordinate positions, superimposing them on the navigation live-action picture to realize the AR navigation effect.
Compared with the prior art, the invention has the beneficial effects that:
the vehicle-mounted live-action navigation system based on the AR technology is based on the bottom map data and the navigation engine framework, combines the technologies such as image processing, data training and AR Augmented Reality technology (Augmented Reality), fuses the data of multiple sensors such as a GPS (global positioning system) and a camera, enhances the visual effect on a real road, realizes accurate immersive navigation, and is convenient for a user to more intuitively understand the navigation information of the live-action road.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a functional block diagram of a vehicle-mounted live-action navigation system based on AR technology.
Detailed Description
The invention is further illustrated by the following examples:
the present invention will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some structures related to the present invention are shown in the drawings, not all of them.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, the vehicle-mounted live-action navigation system based on the AR technology of the present invention includes an image information collecting module, an AR engine processing module, and an image rendering module.
The image information acquisition module mainly provides the video stream picture: a live-action navigation camera (resolution 1 million pixels, installed horizontal viewing angle 52° (±5°), vertical viewing angle 38° (±5°)) collects the real-time video stream and transmits it to the AR engine processing module through LVDS.
The AR engine processing module is the core processing unit of the live-action navigation engine and comprises a navigation engine SDK unit, an ADAS SDK unit and a map basic data unit.
1. Navigation engine SDK: extracts image frames from the real-time video stream input by the camera and transmits them to the ADAS SDK unit for processing; meanwhile, it performs data fusion with the image recognition data and the map basic data, and calculates the image data actually to be depicted (see the data-flow sketch following this list).
2. ADAS SDK: performs image recognition on the image frame sequence using OpenCV, and feeds back the recognized preceding vehicle information, lane lines and other data to the navigation engine SDK (see the lane-line recognition sketch following this list).
3. Map basic data: transmits basic information such as real-time GPS position information, navigation TBT route information and map basic data to the navigation engine SDK.
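
As an illustration of the data flow described in items 1-3 above, the following minimal Python sketch pulls frames from the camera stream, hands each frame to an image-recognition step, and fuses the result with map/route guidance into items to be depicted. All names here (DrawItem, recognize, get_guidance, render) are hypothetical placeholders for illustration only, not the patent's actual API.

```python
# Hedged data-flow sketch of the navigation engine SDK step; names are
# illustrative assumptions, not the patent's actual interfaces.
from dataclasses import dataclass, field

import cv2  # OpenCV, which the patent names for the image recognition step


@dataclass
class DrawItem:
    kind: str                  # e.g. "lane_line", "tbt_arrow", "poi"
    geometry: list             # points in the chosen coordinate system
    meta: dict = field(default_factory=dict)


def fuse(recognition, guidance):
    """Data fusion: combine image recognition results with map/route data."""
    items = [DrawItem("lane_line", line) for line in recognition.get("lanes", [])]
    items.append(DrawItem("tbt_arrow",
                          guidance["maneuver_points"],
                          {"instruction": guidance["instruction"]}))
    return items


def run_engine(video_source, recognize, get_guidance, render):
    """Frame extraction -> recognition -> fusion -> hand-off to rendering."""
    cap = cv2.VideoCapture(video_source)     # real-time video stream from the camera
    while cap.isOpened():
        ok, frame = cap.read()               # extract one image frame
        if not ok:
            break
        recognition = recognize(frame)       # ADAS SDK: lane lines, preceding vehicles
        guidance = get_guidance()            # map data: GPS fix, TBT maneuver, POIs
        render(frame, fuse(recognition, guidance))  # image rendering module
    cap.release()
```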
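
The patent states only that the ADAS SDK performs image recognition with OpenCV; one plausible minimal sketch of the lane-line part, using standard OpenCV calls (Canny edge detection followed by a probabilistic Hough transform), is shown below. The region of interest and all thresholds are illustrative assumptions.

```python
# Minimal OpenCV lane-line detection sketch; thresholds and the region of
# interest are assumptions, not values taken from the patent.
import cv2
import numpy as np


def detect_lane_lines(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blur, 50, 150)

    # Keep only a trapezoidal region in front of the vehicle.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w, h),
                     (int(0.55 * w), int(0.6 * h)),
                     (int(0.45 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(mask, roi, 255)
    edges = cv2.bitwise_and(edges, mask)

    # Probabilistic Hough transform: candidate segments as (x1, y1, x2, y2).
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=100)
    return [] if lines is None else [seg[0].tolist() for seg in lines]
```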
The image rendering module comprises an OpenGL calibration unit and can realize coordinate system conversion and AR effect rendering.
1. Coordinate system conversion: first, 2D/3D spatial coordinate system conversion is performed on the data fused by the AR engine processing module. Specifically, POI coordinates are geographic coordinates, the camera works in its own camera coordinate system, and the screen displays a pixel coordinate system; to present geographic and camera coordinates in the pixel coordinate system, the conversion must account for different cameras, installation positions and screens. Taking the world coordinate system as the reference, all coordinates are converted into the world coordinate system through a series of calibration transforms, so that the relative relationships are presented correctly and match what the human eye sees (see the projection sketch following this list).
2. AR effect rendering: based on OpenGL, the navigation TBT information or POI positions are rendered in real time by combining the fused data, the vehicle speed and the coordinate positions, and are superimposed on the navigation live-action picture to achieve the AR navigation effect (see the simplified overlay sketch following this list).
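
As a worked illustration of the calibration chain described in item 1 above, the sketch below projects a point already expressed in world (vehicle) coordinates into the pixel coordinate system with the standard pinhole model pixel = K·[R|t]·X_world, using OpenCV. The intrinsic and extrinsic values are placeholders that would come from camera calibration and the mounting position, not figures from the patent.

```python
# Hedged projection sketch: world coordinates -> pixel coordinates with the
# pinhole model. All numeric values below are placeholders for illustration.
import cv2
import numpy as np

# Intrinsic matrix K (focal lengths and principal point are placeholders).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)                       # assume lens distortion already corrected

# Extrinsics from the world (vehicle) frame to the camera frame, which in
# practice come from the camera's installation position and calibration.
rvec = np.array([0.05, 0.0, 0.0])        # small pitch, as a Rodrigues vector
tvec = np.array([0.0, 1.4, 0.0])         # placeholder mounting offset

# A POI roughly 30 m ahead of the vehicle, given in world coordinates (x, y, z).
poi_world = np.array([[0.0, 0.0, 30.0]])

pixels, _ = cv2.projectPoints(poi_world, rvec, tvec, K, dist)
print(pixels.reshape(-1, 2))             # pixel position at which to draw the POI
```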
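
The patent renders the overlay with OpenGL; purely as a simplified 2D stand-in, the sketch below superimposes a projected TBT guidance polyline and a POI marker on the live-action frame using OpenCV drawing calls. The function name, colors and blending weights are illustrative assumptions, not the patent's rendering pipeline.

```python
# Simplified 2D overlay sketch standing in for the OpenGL rendering step.
import cv2
import numpy as np


def overlay_guidance(frame, tbt_pixels, poi_pixel=None, poi_label=""):
    out = frame.copy()

    # Turn-by-turn path as a thick polyline, alpha-blended for an AR look.
    pts = np.array(tbt_pixels, dtype=np.int32).reshape(-1, 1, 2)
    layer = frame.copy()
    cv2.polylines(layer, [pts], isClosed=False, color=(0, 200, 0), thickness=12)
    out = cv2.addWeighted(layer, 0.6, out, 0.4, 0)

    # Mark the POI position and label it, if one is currently in view.
    if poi_pixel is not None:
        x, y = int(poi_pixel[0]), int(poi_pixel[1])
        cv2.circle(out, (x, y), 8, (0, 0, 255), -1)
        cv2.putText(out, poi_label, (x + 12, y),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)
    return out
```

In a full implementation these pixel positions would come from the coordinate conversion step above, updated every frame and animated according to the vehicle speed.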
Example 1
The invention relates to a vehicle-mounted live-action navigation system based on AR technology, which comprises an image information acquisition module, an AR engine processing module and an image rendering module.
The image information acquisition module is used for acquiring the real-time video stream and transmitting it to the AR engine processing module through LVDS. The AR engine processing module comprises a navigation engine SDK unit, an ADAS SDK unit and a map basic data unit; it extracts image frames from the real-time video stream input by the camera, performs data fusion with the image recognition data and the map basic data, calculates the image data actually to be depicted, and sends it to the image rendering module. After receiving the data, the image rendering module performs AR (augmented reality) picture rendering and superimposed display, converts the spatial coordinate system, and realizes animation effects according to the vehicle speed.
Specifically, the image information acquisition module acquires the real-time video stream through a live-action navigation camera with a resolution of 1 million pixels, a horizontal viewing angle of 52° (±5°) and a vertical viewing angle of 38° (±5°).
The navigation engine SDK unit extracts image frames from the real-time video stream input by the camera and transmits them to the ADAS SDK unit for processing; meanwhile, it performs data fusion with the image recognition data and the map basic data and calculates the image data actually to be depicted. The ADAS SDK unit performs image recognition on the image frame sequence using OpenCV and feeds back the recognized preceding vehicle information, lane lines and other data to the navigation engine SDK unit. The map basic data unit transmits basic information such as real-time GPS position information, navigation TBT route information and map basic data to the navigation engine SDK unit.
The image rendering module can perform 2D/3D spatial coordinate system conversion on the data fused by the AR engine processing module. The image rendering module comprises an OpenGL calibration unit and can render the navigation TBT information or POI positions in real time by combining the fused data, the vehicle speed and the coordinate positions, superimposing them on the navigation live-action picture to realize the AR navigation effect.
The AR live-action navigation system function list is shown in Table 1:
Table 1: AR live-action navigation system function list (the table contents are provided as an image in the original publication)
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in some detail by the above embodiments, the invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the invention, and the scope of the invention is determined by the scope of the appended claims.

Claims (8)

1. A vehicle-mounted live-action navigation system based on AR technology, characterized in that: the system comprises an image information acquisition module, an AR engine processing module and an image rendering module;
the image information acquisition module is used for acquiring a real-time video stream and transmitting it to the AR engine processing module through LVDS; the AR engine processing module comprises a navigation engine SDK unit, an ADAS SDK unit and a map basic data unit, and can extract image frames from the real-time video stream input by the camera, perform data fusion with the image recognition data and the map basic data, calculate the image data actually to be depicted, and send it to the image rendering module; after receiving the data, the image rendering module can perform AR (augmented reality) picture rendering and superimposed display, convert the spatial coordinate system, and realize animation effects according to the vehicle speed.
2. The vehicle-mounted live-action navigation system based on AR technology according to claim 1, characterized in that: the image information acquisition module acquires the real-time video stream through a live-action navigation camera.
3. The vehicle-mounted live-action navigation system based on AR technology according to claim 2, characterized in that: the resolution of the live-action navigation camera is 1 million pixels, the installed horizontal viewing angle is 52° (±5°), and the vertical viewing angle is 38° (±5°).
4. The vehicle-mounted live-action navigation system based on AR technology according to claim 1, characterized in that: the navigation engine SDK unit extracts image frames from the real-time video stream input by the camera and transmits them to the ADAS SDK unit for processing; meanwhile, data fusion is performed with the image recognition data and the map basic data, and the image data actually to be depicted is calculated.
5. The vehicle-mounted live-action navigation system based on AR technology according to claim 4, characterized in that: the ADAS SDK unit performs image recognition on the image frame sequence using OpenCV, and feeds back the recognized preceding vehicle information, lane lines and other data to the navigation engine SDK unit.
6. The vehicle-mounted live-action navigation system based on AR technology according to claim 1, characterized in that: the map basic data unit transmits basic information such as real-time GPS position information, navigation TBT route information and map basic data to the navigation engine SDK unit.
7. The vehicle-mounted live-action navigation system based on AR technology according to claim 1, characterized in that: the image rendering module can perform 2D/3D spatial coordinate system conversion on the data fused by the AR engine processing module.
8. The vehicle-mounted live-action navigation system based on AR technology according to claim 1, characterized in that: the image rendering module comprises an OpenGL calibration unit, and can render the navigation TBT information or POI positions in real time by combining the fused data, the vehicle speed and the coordinate positions, superimposing them on the navigation live-action picture to realize the AR navigation effect.
CN202210973547.4A 2022-08-15 2022-08-15 Vehicle-mounted live-action navigation system based on AR technology Pending CN115423987A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210973547.4A CN115423987A (en) 2022-08-15 2022-08-15 Vehicle-mounted live-action navigation system based on AR technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210973547.4A CN115423987A (en) 2022-08-15 2022-08-15 Vehicle-mounted live-action navigation system based on AR technology

Publications (1)

Publication Number Publication Date
CN115423987A true CN115423987A (en) 2022-12-02

Family

ID=84199214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210973547.4A Pending CN115423987A (en) 2022-08-15 2022-08-15 Vehicle-mounted live-action navigation system based on AR technology

Country Status (1)

Country Link
CN (1) CN115423987A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination