CN110349078B - AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display - Google Patents

AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display

Info

Publication number
CN110349078B
CN110349078B (application CN201910437528.8A)
Authority
CN
China
Prior art keywords
coordinate
icons
graph
initial
gridding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910437528.8A
Other languages
Chinese (zh)
Other versions
CN110349078A (en
Inventor
Inventor not disclosed (non-publication requested)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Ruisi Huachuang Technology Co ltd
Original Assignee
Shenzhen Ruisi Huachuang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ruisi Huachuang Technology Co ltd filed Critical Shenzhen Ruisi Huachuang Technology Co ltd
Priority to CN201910437528.8A priority Critical patent/CN110349078B/en
Publication of CN110349078A publication Critical patent/CN110349078A/en
Application granted granted Critical
Publication of CN110349078B publication Critical patent/CN110349078B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/06
    • G06T3/08
    • G06T5/80

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention provides an AR (augmented reality) graphics algorithm for HUD (head-up display) orthostatic live-action display. The algorithm builds a reference icon library stored in an orthostatic live-action gridded format and then proceeds as follows: first, coordinate systems are established for the initial graph and for the orthostatic live-action graph; second, a multivariate polynomial relation model function is constructed between the two coordinate systems; third, the parameter values of the model function are solved to obtain a numerical conversion function; and fourth, all corresponding coordinate points of the initial coordinate graph are converted through this numerical function relationship, yielding the orthostatic live-action image corresponding to the initial coordinate graph. By defining a dedicated reference-object coordinate conversion system and a multivariate polynomial relation model function, the algorithm obtains a numerical AR graphics conversion function, so that images captured at various angles can be displayed as orthostatic live-action scenes consistent with the perspective rules of the human eye.

Description

AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display
Technical Field
The invention relates to augmented reality graphics conversion technology, and in particular to an AR graphics algorithm that derives an AR graphics conversion function from a multivariate polynomial relation model function and can display images captured at different angles as orthostatic live-action scenes.
Background
An advanced driver assistance system (ADAS) uses the various sensors mounted on a vehicle to sense the surrounding environment at all times while driving, collects data, identifies, detects and tracks static and dynamic objects, and combines this with the navigator's map data for systematic computation and analysis, so that the driver can perceive possible dangers in advance, effectively improving the comfort and safety of driving.
Augmented reality (AR) is a technology that calculates the position and angle of a camera image in real time and superimposes a corresponding image; its aim is to overlay a virtual world onto the real world on a screen and allow interaction between the two.
At present, vehicle-mounted head-up display optical systems cannot fully realize ADAS functions, with the following main defects:
1. The existing conventional vehicle-mounted head-up display system directly projects a display picture that is unrelated to the road scene (the actual world) ahead; that is, the conventional head-up display image is not fused with the external real world. Objects in the real world are seen by the human eye in perspective: nearer objects appear larger and farther objects smaller. The image in a conventional head-up display undergoes no corresponding coordinate conversion to match this perspective effect, so the projected image is inconsistent with the coordinates of the real scene; when the eye views the head-up display image, the two images cannot be well fused and aligned, and there is no true augmented-display function. If head-up display content is to be fused with the real world, the projected image features must conform to the perspective effect that real-world objects present to the human eye.
2. Static and dynamic objects at oblique viewing angles, such as pedestrians and vehicles to the side or vehicles behind, cannot be identified, detected and tracked. In this case, the existing head-up display system cannot display the live-action image in the correct position, so the driver easily makes subjective misjudgments while the other party receives no necessary warning, creating a potential safety hazard.
Disclosure of Invention
The invention aims to solve the above technical problems and provides an AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display.
To solve the above problems of the prior art, the technical solution of the invention is as follows:
An AR graphics method for HUD orthostatic live-action display: the AR graphics algorithm includes constructing a reference icon library stored in an orthostatic live-action gridded format, and adopts the following steps:
First, establish a coordinate system for the initial graph and a coordinate system for the orthostatic live-action graph;
Second, construct a multivariate polynomial relation model function between the two coordinate systems;
Third, select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, substitute them into the polynomial relation model function, and solve for its parameter values, thereby obtaining a numerical conversion function;
Fourth, convert all corresponding coordinate points of the initial coordinate graph through this numerical function relationship, thereby obtaining the orthostatic live-action image corresponding to the initial coordinate graph (a brief code sketch of the model used in these steps follows below).
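None of the following code appears in the patent; it is a minimal illustrative sketch (all names are assumptions) of the two-dimensional model used in the steps above. The key observation is that, although the relation is cubic in the coordinates, it is linear in its unknown parameters, which is what makes the third step solvable from a small set of point correspondences.

```python
import numpy as np

def cubic_basis(xa, ya):
    """Monomials of the cubic relation model: [Xa^3, Xa^2, Xa, Ya^3, Ya^2, Ya, 1]."""
    xa, ya = float(xa), float(ya)
    return np.array([xa**3, xa**2, xa, ya**3, ya**2, ya, 1.0])

# Step two, written out: Xn and Yn are each the dot product of a 7-parameter
# vector (A..G for Xn, H..N for Yn) with the same monomial basis.
params_x = np.zeros(7)            # placeholder values for A..G
params_y = np.zeros(7)            # placeholder values for H..N
xn = cubic_basis(0.5, 0.2) @ params_x
yn = cubic_basis(0.5, 0.2) @ params_y
```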
The orthostatic live-action gridded format of the icons in the reference icon library may be two-dimensional or three-dimensional;
when the orthostatic live-action gridded format of the icons in the reference icon library is two-dimensional, the AR graphics algorithm comprises the following steps:
(I) Let the coordinate system of the initial coordinate graph be coordinate system I, with image points denoted (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn);
the coordinate system of the icons in the reference icon library is coordinate system II, with image points denoted (Xa1, Ya1), (Xa2, Ya2), (Xa3, Ya3) … (Xan, Yan);
(II) Assume that the X-axis and Y-axis coordinates of a point in coordinate system I and of the corresponding point in coordinate system II are related by a polynomial of degree at least 3; when a cubic relation is selected, the functional relation model of the coordinates is:
Xn = A·Xan³ + B·Xan² + C·Xan + D·Yan³ + E·Yan² + F·Yan + G,
Yn = H·Xan³ + I·Xan² + J·Xan + K·Yan³ + L·Yan² + M·Yan + N;
(III) Select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, and substitute them into this cubic equation system with fourteen unknowns; selecting 14 characteristic points yields the values of the fourteen parameters A to N;
(IV) Substitute the parameter values back into the functional relation model to obtain a numerical functional relation, and convert all corresponding coordinate points of the initial coordinate graph through this relation to obtain the orthostatic coordinate image corresponding to the initial coordinate graph. A least-squares sketch of steps (III) and (IV) is given below.
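Because the model is linear in the parameters A to N, they can be recovered by ordinary linear least squares once enough point correspondences are chosen. The sketch below is an assumed NumPy implementation, not the patent's own code; fit_cubic_mapping and convert_points are illustrative names.

```python
import numpy as np

def design_matrix(reference_pts):
    """Rows [Xa^3, Xa^2, Xa, Ya^3, Ya^2, Ya, 1] for each gridded icon point (Xan, Yan)."""
    xa, ya = reference_pts[:, 0].astype(float), reference_pts[:, 1].astype(float)
    return np.column_stack([xa**3, xa**2, xa, ya**3, ya**2, ya, np.ones_like(xa)])

def fit_cubic_mapping(initial_pts, reference_pts):
    """Step (III): solve for A..G (X equation) and H..N (Y equation).

    initial_pts   : (K, 2) array of (Xn, Yn) points in coordinate system I
    reference_pts : (K, 2) array of (Xan, Yan) points in coordinate system II
    Each equation has 7 unknowns, so K >= 7 correspondences determine them;
    the patent selects 14 characteristic points.
    """
    B = design_matrix(reference_pts)
    params_x, *_ = np.linalg.lstsq(B, initial_pts[:, 0], rcond=None)   # A..G
    params_y, *_ = np.linalg.lstsq(B, initial_pts[:, 1], rcond=None)   # H..N
    return params_x, params_y

def convert_points(reference_pts, params_x, params_y):
    """Step (IV): evaluate the numerical conversion function at every point.

    As the model is written, a point (Xan, Yan) of coordinate system II is mapped
    to its corresponding point (Xn, Yn) of coordinate system I.
    """
    B = design_matrix(reference_pts)
    return np.column_stack([B @ params_x, B @ params_y])
```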
The reference icon library is built by inputting different road-sign icons, traffic-sign icons, pedestrian icons and vehicle icons, and then storing each input icon in the orthostatic live-action gridded format, so as to obtain a reference icon library containing a large number of reference icons; a minimal data-structure sketch follows.
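The patent does not prescribe a storage format for this library; the following is one plausible in-memory layout (all names and dimensions are assumptions), keeping each reference icon as a grid of two-dimensional sample points in the orthostatic live-action coordinate system.

```python
import numpy as np

def make_grid(width, height, step):
    """Gridded representation of an icon: an (N, 2) array of (Xa, Ya) sample points."""
    xs, ys = np.meshgrid(np.arange(0, width + 1, step),
                         np.arange(0, height + 1, step))
    return np.column_stack([xs.ravel(), ys.ravel()]).astype(float)

# Reference icon library: icon name -> gridded points in coordinate system II.
reference_icon_library = {
    "road_sign":  make_grid(60, 60, 10),
    "pedestrian": make_grid(40, 90, 10),
    "vehicle":    make_grid(90, 60, 10),
}
```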
The AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display disclosed by the invention has the following beneficial effects:
1. By defining a dedicated reference-object coordinate conversion system and a multivariate polynomial relation model function, the algorithm obtains a numerical graphics conversion function, so that images captured at different angles can be displayed as orthostatic live-action scenes;
2. The algorithm supports two-dimensional or three-dimensional graph conversion, its precision can be adjusted as needed, and it is reversible, so various special graph conversion effects can also be realized (see the short note below).
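On point 2, reversibility simply means the same least-squares fit can be run with the two coordinate systems swapped; the fragment below builds on the hypothetical fit_cubic_mapping helper sketched earlier and is illustrative only.

```python
# Forward model, as in the patent's equations: coordinate system II -> coordinate system I.
params_x, params_y = fit_cubic_mapping(initial_pts, reference_pts)

# Reverse model, obtained by swapping the roles of the two point sets:
# coordinate system I -> coordinate system II.  Precision can be tuned by
# raising the polynomial degree or supplying more characteristic points.
inv_params_x, inv_params_y = fit_cubic_mapping(reference_pts, initial_pts)
```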
Description of the drawings:
FIG. 1 is a schematic diagram of an initial image in the AR graphics method for HUD orthostatic live-action display according to the present invention;
FIG. 2 is a schematic diagram of the gridding of orthostatic live-action icons in the reference icon library of the AR graphics method for HUD orthostatic live-action display according to the present invention;
FIG. 3 is a comparison chart of the initial image and the converted orthostatic live-action icon in the AR graphics method for HUD orthostatic live-action display according to the present invention.
Detailed description of the embodiments:
The invention is further illustrated by the following embodiment:
Embodiment:
An AR graphics method for HUD orthostatic live-action display: the AR graphics algorithm includes constructing a reference icon library stored in an orthostatic live-action gridded format, and adopts the following steps:
First, establish a coordinate system for the initial graph and a coordinate system for the orthostatic live-action graph;
Second, construct a multivariate polynomial relation model function between the two coordinate systems;
Third, select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, substitute them into the polynomial relation model function, and solve for its parameter values, thereby obtaining a numerical conversion function;
Fourth, convert all corresponding coordinate points of the initial coordinate graph through this numerical function relationship, thereby obtaining the orthostatic live-action image corresponding to the initial coordinate graph.
The orthostatic live-action gridded format of the icons in the reference icon library may be two-dimensional or three-dimensional;
when the orthostatic live-action gridded format of the icons in the reference icon library is two-dimensional, the AR graphics algorithm comprises the following steps:
(I) Let the coordinate system of the initial coordinate graph be coordinate system I, with image points denoted (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn);
the coordinate system of the icons in the reference icon library is coordinate system II, with image points denoted (Xa1, Ya1), (Xa2, Ya2), (Xa3, Ya3) … (Xan, Yan);
(II) Assume that the X-axis and Y-axis coordinates of a point in coordinate system I and of the corresponding point in coordinate system II are related by a polynomial of degree at least 3; when a cubic relation is selected, the functional relation model of the coordinates is:
Xn = A·Xan³ + B·Xan² + C·Xan + D·Yan³ + E·Yan² + F·Yan + G,
Yn = H·Xan³ + I·Xan² + J·Xan + K·Yan³ + L·Yan² + M·Yan + N;
(III) Select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, and substitute them into this cubic equation system with fourteen unknowns; selecting 14 characteristic points yields the values of the fourteen parameters A to N;
(IV) Substitute the parameter values back into the functional relation model to obtain a numerical functional relation, and convert all corresponding coordinate points of the initial coordinate graph through this relation to obtain the orthostatic coordinate image corresponding to the initial coordinate graph. A sketch of applying the fitted numerical function to an entire captured frame follows below.
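To connect step (IV) to an actual captured frame, one possible implementation (an assumption; the patent names no library) resamples the initial image with OpenCV's remap. As the model is written, the fitted function takes an orthostatic-frame point (Xan, Yan) and returns the matching initial-frame point (Xn, Yn), which is exactly the backward map that remap expects.

```python
import numpy as np
import cv2  # assumption: OpenCV is available; any resampling routine would do

def eval_cubic(xa, ya, params_x, params_y):
    """Evaluate Xn = A*Xa^3 + ... + G and Yn = H*Xa^3 + ... + N on arrays of points."""
    basis = np.stack([xa**3, xa**2, xa, ya**3, ya**2, ya, np.ones_like(xa)], axis=-1)
    return basis @ params_x, basis @ params_y

def warp_to_orthostatic(initial_image, params_x, params_y, out_w, out_h):
    """Resample the captured (initial) image onto the orthostatic live-action grid."""
    u, v = np.meshgrid(np.arange(out_w, dtype=np.float32),
                       np.arange(out_h, dtype=np.float32))
    map_x, map_y = eval_cubic(u, v, params_x, params_y)   # initial-frame coordinates
    return cv2.remap(initial_image,
                     map_x.astype(np.float32), map_y.astype(np.float32),
                     interpolation=cv2.INTER_LINEAR)
```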
The reference icon library is built by inputting different road-sign icons, traffic-sign icons, pedestrian icons and vehicle icons, and then storing each input icon in the orthostatic live-action gridded format, so as to obtain a reference icon library containing a large number of reference icons.
The present invention has been described in detail, and it should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.

Claims (2)

1. An AR graphics method for HUD orthostatic live-action display, characterized in that the AR graphics algorithm comprises constructing a reference icon library stored in an orthostatic live-action gridded format, and the AR graphics algorithm adopts the following first to fourth steps:
First step: establish a coordinate system for the initial graph and a coordinate system for the orthostatic live-action graph;
Second step: construct a multivariate polynomial relation model function between the two coordinate systems;
Third step: select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, substitute them into the multivariate polynomial relation model function, and solve for its parameter values to obtain a numerical conversion function;
Fourth step: convert all corresponding coordinate points of the initial coordinate graph through this numerical function relationship, thereby obtaining the orthostatic live-action image corresponding to the initial coordinate graph;
the normal real scene gridding format of the icons in the reference icon library is a two-dimensional or three-dimensional mode;
when the orthostatic live-action gridded format of the icons in the reference icon library is two-dimensional, the AR graphics algorithm comprises the following steps (I) to (IV):
(I) Let the coordinate system of the initial coordinate graph be coordinate system I, with image points denoted (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn);
the coordinate system of the icons in the reference icon library is coordinate system II, with image points denoted (Xa1, Ya1), (Xa2, Ya2), (Xa3, Ya3) … (Xan, Yan);
(II) Assume that the X-axis and Y-axis coordinates of a point in coordinate system I and of the corresponding point in coordinate system II are related by a polynomial of degree at least 3; when a cubic relation is selected, the functional relation model of the coordinates is:
Xn = A·Xan³ + B·Xan² + C·Xan + D·Yan³ + E·Yan² + F·Yan + G,
Yn = H·Xan³ + I·Xan² + J·Xan + K·Yan³ + L·Yan² + M·Yan + N;
(III) select coordinate points from the orthostatic live-action gridded format of icons in the reference icon library, together with the corresponding coordinate points in the initial coordinate graph, substitute them into the functional relation model, and select 14 characteristic points to obtain the values of the fourteen parameters A to N;
(IV) substitute the parameter values into the functional relation model to obtain a numerical functional relation, and convert all corresponding coordinate points of the initial coordinate graph through this relation to obtain the orthostatic coordinate image corresponding to the initial coordinate graph;
and when the orthostatic live-action gridded format of the icons in the reference icon library is three-dimensional, the AR graphics algorithm performs three-dimensional coordinate conversion.
2. The AR graphics method for HUD orthostatic live-action display according to claim 1, characterized in that the reference icon library is built by inputting different road-sign icons, traffic-sign icons, pedestrian icons and vehicle icons, and then storing each input icon in the orthostatic live-action gridded format, to obtain a reference icon library containing the reference icons.
CN201910437528.8A 2019-05-24 2019-05-24 AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display Active CN110349078B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910437528.8A CN110349078B (en) 2019-05-24 2019-05-24 AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910437528.8A CN110349078B (en) 2019-05-24 2019-05-24 AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display

Publications (2)

Publication Number Publication Date
CN110349078A CN110349078A (en) 2019-10-18
CN110349078B true CN110349078B (en) 2022-07-15

Family

ID=68174283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910437528.8A Active CN110349078B (en) 2019-05-24 2019-05-24 AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display

Country Status (1)

Country Link
CN (1) CN110349078B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2976543A1 (en) * 2016-08-23 2018-02-23 8696322 Canada Inc. System and method for augmented reality head up display for vehicles

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106570938A (en) * 2016-10-21 2017-04-19 哈尔滨工业大学深圳研究生院 OPENGL based panoramic monitoring method and system
CN107783937A (en) * 2017-10-19 2018-03-09 西安科技大学 A kind of method for solving any anglec of rotation three-dimensional coordinate conversion parameter
CN109493321A (en) * 2018-10-16 2019-03-19 中国航空工业集团公司洛阳电光设备研究所 A kind of vehicle-mounted HUD visual system parallax calculation method
CN109688392A (en) * 2018-12-26 2019-04-26 联创汽车电子有限公司 AR-HUD optical projection system and mapping relations scaling method and distortion correction method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on the calibration method of an optical see-through AR-HUD system; An Zhe et al.; Acta Photonica Sinica; 2019-01-29; vol. 48, no. 4; full text *
Research on gaze tracking technology and its applications; Xu Xingmin; China Excellent Master's and Doctoral Dissertations Full-text Database (electronic journal), Information Science and Technology series; 2008-01-15; full text *

Also Published As

Publication number Publication date
CN110349078A (en) 2019-10-18

Similar Documents

Publication Publication Date Title
JP6580800B2 (en) Accelerated light field display
JP3286306B2 (en) Image generation device and image generation method
JP6079131B2 (en) Image processing apparatus, method, and program
US10539790B2 (en) Coordinate matching apparatus for head-up display
US11302086B1 (en) Providing features of an electronic product in an augmented reality environment
JP2002135765A (en) Camera calibration instruction device and camera calibration device
CN106454311A (en) LED three-dimensional imaging system and method
US8749547B2 (en) Three-dimensional stereoscopic image generation
US10634504B2 (en) Systems and methods for electronic mapping and localization within a facility
CA2796514A1 (en) Method and device for representing synthetic environments
US11212501B2 (en) Portable device and operation method for tracking user's viewpoint and adjusting viewport
US10937215B1 (en) Techniques for enabling drawing in a computer-generated reality environment
US11227494B1 (en) Providing transit information in an augmented reality environment
CN109764888A (en) Display system and display methods
US8896631B2 (en) Hyper parallax transformation matrix based on user eye positions
JPH09265550A (en) Three-dimensional display device
CN110349078B (en) AR (augmented reality) graphics method for HUD (head-up display) orthostatic live-action display
CN110347241B (en) AR head-up display optical system capable of realizing normal live-action display
US20190137770A1 (en) Display system and method thereof
CN115984122A (en) HUD backlight display system and method
CN107784693B (en) Information processing method and device
KR101001856B1 (en) System and method for displaying 3d virtual image using moving picture input device
Borsoi et al. On the performance and implementation of parallax free video see-through displays
CN112634342A (en) Method for computer-implemented simulation of optical sensors in a virtual environment
Ueno et al. [Poster] Overlaying navigation signs on a road surface using a head-up display

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20191017

Address after: 518112 5b, unit 1, building 10, Zhonghai yicui villa, Zhonghai yicui community, Jihua street, Longgang District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Ruisi Huachuang Technology Co.,Ltd.

Address before: Room 205, No. 16, Haomen East Lake, Jiaocheng District, Ningde City, Fujian Province

Applicant before: Yang Qiaoxue

SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221222

Address after: No. 901-902, Building 2, Hunan Military civilian Integration Science and Technology Innovation Industrial Park, No. 699, Qingshan Road, Changsha Hi tech Development Zone, Changsha, Hunan 410000

Patentee after: Hunan Ruisi Huachuang Technology Co.,Ltd.

Address before: 518112 5b, unit 1, building 10, Zhonghai yicui villa, Zhonghai yicui community, Jihua street, Longgang District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Ruisi Huachuang Technology Co.,Ltd.