CN116883625B - Image display method and device, electronic equipment and storage medium - Google Patents

Info

Publication number: CN116883625B (application CN202310682698.9A; earlier publication CN116883625A)
Authority: CN (China)
Prior art keywords: angle, image, target, image generator, determining
Legal status: Active (granted)
Other languages: Chinese (zh)
Inventors: 张涛 (Zhang Tao), 王熊熊 (Wang Xiongxiong), 张宁波 (Zhang Ningbo)
Assignee (original and current): Jiangsu Zejing Automobile Electronic Co., Ltd.
Application filed by Jiangsu Zejing Automobile Electronic Co., Ltd.; priority to CN202310682698.9A

Classifications

    • G06T 19/006 — Mixed reality (G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 19/00 Manipulating 3D models or images for computer graphics)
    • G06T 2219/028 — Multiple view windows (top-side-front-sagittal-orthogonal) (indexing scheme G06T 2219/00 for manipulating 3D models or images for computer graphics)


Abstract

The application discloses an image display method and apparatus, an electronic device, and a storage medium, relating to the technical field of head-up displays. It addresses the technical problem that an AR-HUD virtual image cannot be well fused with different driving scenes. The method comprises the following steps: acquiring the vehicle speed at the current moment, and determining, according to that speed, the AR element to be displayed and the target imaging angle corresponding to the AR element; determining a target installation angle of the image generator according to the target imaging angle; driving the rotating mechanism to rotate so as to adjust the installation angle of the image generator to the target installation angle; and, at the target installation angle, generating the AR element with the image generator and presenting it in the image display component. With this technical scheme, the display fuses better with different driving scenes, improving both driving safety and the driving experience.

Description

Image display method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of heads-up display technologies, and in particular, to an image display method, an image display device, an electronic device, and a storage medium.
Background
Augmented reality head-up display (Augmented Reality Head Up Display, AR-HUD) technology is gradually being applied in automobiles. After virtual-real fusion, driving-assistance information (such as markings for lane lines, obstacles, and the like) can be projected onto the windshield in front of the driver's field of view, so that the driver can take in both instrument readings and the external environment while keeping the head up.
In the prior art, the imaging angle of the AR-HUD virtual image is generally fixed: either the virtual image is approximately perpendicular to the ground, or it approximately conforms to the road surface, and each scheme has drawbacks. When the virtual image is nearly perpendicular to the ground, it cannot fully conform to the road surface and its imaging distance is short, which satisfies low-speed urban driving; during high-speed driving, however, the human eye generally looks more than 20 m ahead while the virtual image lies within 10 m, so the driver must switch the gaze back and forth between the road surface and the AR-HUD image, which is detrimental to driving safety. When the virtual image approximately conforms to the road surface, its ability to fit the road and its long imaging distance satisfy high-speed driving; during low-speed urban driving, however, the inclined long-distance image can overlap the vehicle ahead, causing the driver to misjudge the road conditions and again compromising safety. How to image the AR-HUD virtual image so that it fuses better with different driving scenes (such as urban and high-speed driving scenes), thereby improving driving safety and the driving experience, has therefore become an urgent problem to be solved.
Disclosure of Invention
The application provides an image display method and apparatus, an electronic device, and a storage medium that fuse the displayed image better with different driving scenes (such as urban and high-speed driving scenes) and can improve driving safety and the driving experience.
In a first aspect, the present application provides an image display method applied to a head-up display in which an image generator, a rotation mechanism, and an image display assembly are arranged, the rotation mechanism being mounted at the geometric center point of the image generator and used to adjust its installation angle. The method includes:
acquiring the speed of a vehicle at the current moment, and determining an Augmented Reality (AR) element to be displayed and a target imaging angle corresponding to the AR element according to the speed;
determining a target installation angle of the image generator according to the target imaging angle;
driving the rotating mechanism to rotate so as to adjust the installation angle of the image generator to be at the target installation angle;
at the target installation angle, generating the AR element by the image generator and displaying the AR element in the image display component.
The embodiment of the application provides an image display method in which, on the hardware side, a rotating mechanism mounted at the geometric center point of the image generator makes the installation angle of the image generator adjustable. The method determines the imaging angle of the AR element from the vehicle speed at the current moment, calculates from that imaging angle the installation angle of the image generator and the driving parameters of the rotating mechanism, and finally drives the rotating mechanism with those parameters so that the image generator reaches the target installation angle. By displaying AR elements at an imaging angle appropriate to the current vehicle speed, the method fuses the display better with different driving scenes (such as urban and high-speed driving scenes) and improves driving safety and the driving experience.
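The steps of the method can be sketched in code. This is an illustrative sketch only: the 60 km/h threshold and the 0°/50° preset imaging angles are taken from the embodiment described later, while the conversion coefficient K, the function names, and the return conventions are assumptions, not values fixed by the claims.

```python
# Illustrative sketch of the claimed display loop (not actual firmware).
# Assumed: K (conversion coefficient), the 60 km/h threshold, and the
# 0-degree / 50-degree preset imaging angles.

K = 13.28  # imaging-angle change (deg) per degree of generator rotation (assumed)

def target_imaging_angle(speed_kmh: float) -> float:
    """S110: choose the preset imaging angle from the current vehicle speed."""
    return 0.0 if speed_kmh < 60 else 50.0

def target_install_angle(init_install: float, init_imaging: float,
                         target_imaging: float) -> float:
    """S120: convert the imaging-angle difference into an installation-angle
    difference (imaging diff = install diff * K) and offset the initial angle."""
    return init_install + (target_imaging - init_imaging) / K

def display_cycle(speed_kmh: float, init_install: float, init_imaging: float):
    """S110-S120; S130 would then drive the rotating mechanism to the returned
    installation angle and S140 would render the AR element there."""
    tgt_imaging = target_imaging_angle(speed_kmh)
    return tgt_imaging, target_install_angle(init_install, init_imaging, tgt_imaging)
```

In use, the HUD would call `display_cycle` once per speed update and pass the result to the rotation-mechanism driver.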
Further, the determining the target installation angle of the image generator according to the target imaging angle includes: acquiring an initial imaging angle of the AR element displayed by the image display assembly at the last moment, and acquiring an initial installation angle of the image generator at the last moment; determining an imaging angle difference based on the initial imaging angle and the target imaging angle; converting the imaging angle difference value into an installation angle difference value of the image generator according to a conversion coefficient, wherein the conversion coefficient is determined by the magnification of an optical system of the head-up display and the inclination angle of a windshield; and determining the target installation angle of the image generator according to the initial installation angle and the installation angle difference value.
Further, the driving the rotation mechanism to rotate to adjust the installation angle of the image generator to be at the target installation angle includes: determining a first direction of imaging angle change based on the initial imaging angle and the target imaging angle, the first direction being either a clockwise direction or a counter-clockwise direction; determining a second direction of the change of the installation angle of the image generator according to the first direction, wherein the second direction and the first direction are opposite to each other; determining a driving parameter of the rotating mechanism based on the mounting angle difference and the second direction; and driving the rotating mechanism to rotate through pulse width modulation based on the driving parameters, and adjusting the image generator from the initial installation angle to the target installation angle under the rotation of the rotating mechanism.
Further, the head-up display further includes a mirror assembly configured to reflect the AR element to the image display assembly. At least one mirror in the mirror assembly is a free-form surface mirror, whose surface-type parameters and installation position are jointly determined by a plurality of preset imaging angles supported by the head-up display. The target imaging angle is one of the preset imaging angles; a preset imaging angle is the angle between the virtual image of the AR element in the image display assembly and a reference line, the reference line being a line perpendicular to the ground.
Further, the target surface-type parameters and target installation position of the free-form surface mirror and the target installation position of the image generator are determined by: determining, for each of the plurality of preset imaging angles at which the AR element may be displayed, initial surface-type parameters and a first installation position of the free-form surface mirror, and a second installation position of the image generator; determining, from the initial surface-type parameters, the first installation position, and the second installation position, a visual evaluation function and a visual constraint condition for viewing the AR element from the driving eye point; and adjusting the initial surface-type parameters, the first installation position, and the second installation position until the visual evaluation function holds under the visual constraint condition, thereby obtaining the target surface-type parameters and target installation position of the free-form surface mirror and the target installation position of the image generator. The visual evaluation function is a functional relation whose independent variables are the initial surface-type parameters, the first installation position, and the second installation position, and whose dependent variable is the imaging effect of the AR element viewed from the driving eye point. The visual constraint condition is that the distortion rate of the virtual image of the AR element in the image display assembly is less than or equal to a preset first distortion rate, and that the modulation transfer function value of the virtual image at 6 line pairs/mm is greater than or equal to a preset first function value.
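A minimal sketch of checking the two visual constraints named above. The patent fixes only the form of the constraints (distortion rate ≤ a preset first distortion rate; MTF at 6 line pairs/mm ≥ a preset first function value); the 3% and 0.5 thresholds below are invented placeholders for illustration.

```python
def meets_visual_constraints(distortion_rate: float, mtf_at_6_lp_mm: float,
                             max_distortion: float = 0.03,
                             min_mtf: float = 0.5) -> bool:
    # Both numeric thresholds are placeholder assumptions; the patent only
    # states that such preset limits exist, not their values.
    return distortion_rate <= max_distortion and mtf_at_6_lp_mm >= min_mtf
```

An optimizer adjusting the mirror surface-type parameters and positions would accept a candidate configuration only when this check passes.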
Further, the determining, according to the vehicle speed, the augmented reality AR element to be displayed and the target imaging angle corresponding to the AR element includes: determining a vehicle speed interval range corresponding to the vehicle speed; determining the AR element corresponding to the vehicle speed interval range; and determining the target imaging angle from the plurality of preset imaging angles based on the vehicle speed interval range.
Further, the AR element includes a first type element including at least one of a vehicle speed icon and a gear icon, and a second type element including at least one of a navigation icon, a lane line icon, an obstacle icon, a collision warning icon, and a following distance icon.
In a second aspect, the present application provides an image display apparatus configured in a head-up display, in which an image generator, a rotation mechanism, and an image display assembly are configured, the rotation mechanism being configured on a geometric center point of the image generator, for adjusting an installation angle of the image generator, the apparatus comprising:
the imaging angle determining module is used for obtaining the speed of the vehicle at the current moment, and determining an Augmented Reality (AR) element to be displayed and a target imaging angle corresponding to the AR element according to the speed;
the installation angle determining module is used for determining a target installation angle of the image generator according to the target imaging angle;
the installation angle adjusting module is used for driving the rotating mechanism to rotate so as to adjust the installation angle of the image generator to be at the target installation angle;
and the image display module is used for generating the AR element through the image generator under the target installation angle and displaying the AR element in the image display component.
In a third aspect, the present application provides an electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the image presentation method of any embodiment of the present application.
In a fourth aspect, the present application provides a computer readable storage medium storing computer instructions for causing a processor to execute the image presentation method according to any embodiment of the present application.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the image display device or may be packaged separately from the processor of the image display device, which is not limited in this application.
The description of the second, third and fourth aspects of the present application may refer to the detailed description of the first aspect; moreover, the advantages described in the second aspect, the third aspect and the fourth aspect may refer to the analysis of the advantages of the first aspect, and are not described herein.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
It can be appreciated that, before the technical solutions disclosed in the embodiments of the present application are used, the user should, in accordance with relevant laws and regulations, be informed in an appropriate manner of the type, scope of use, and usage scenarios of the personal information involved, and the user's authorization should be obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1A is a schematic diagram of a prior art simultaneous imaging of a far view image and a near view image;
FIG. 1B is a schematic illustration of HUD imaging with virtual images approximately perpendicular to the ground;
FIG. 1C is a schematic illustration of HUD imaging with a virtual image approximately conforming to a road surface;
fig. 2 is a first flow chart of an image display method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of different AR virtual image imaging angles implemented in the same optical system;
fig. 4 is a schematic structural diagram of an image display device according to an embodiment of the present application;
fig. 5 is a block diagram of an electronic device for implementing an image presentation method according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," "target," and "original," etc. in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the present application described herein may be capable of executing sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
Before the embodiments of the present application are described, the application scenarios and the defects of the prior art are explained. A HUD system mainly comprises an image generator, an imaging light-path component, and an image display component. The image generator generates an image digital signal (i.e., a real image) and converts it into light rays carrying the image information; it may be an optical engine built with digital light processing (Digital Light Processing, DLP) or liquid crystal on silicon (Liquid Crystal On Silicon, LCOS) technology, and includes an illumination assembly and a projection assembly, the projection assembly possibly being a micro-projection lens. The imaging light-path component reflects and projects the real image via mirrors. The image display component displays the virtual image picture and differs by application scenario: in a cinema screening a movie, the image display component is a projection curtain or a display screen; when driving information is displayed on a vehicle windshield, the image display component is the windshield itself.
In the embodiments of the present application, the image display method is described using the example of displaying AR elements on a vehicle windshield. The prior art is described in detail below. Fig. 1A is a schematic diagram of simultaneous imaging of a far-view image and a near-view image, where reference numeral 1 denotes the image generator and mirror assembly, 3 the windshield, 4 the driving eye point, 11 the near-view image, and 12 the far-view image. The far-view and near-view images share the mirror assembly: the image projected by the image generator passes through the mirror assembly and is finally reflected off the inner surface of the windshield into the human eye, forming the perceived far-view and near-view images. The defect of this scheme is that both images are perpendicular to the ground and the imaging angle cannot be adjusted, so an AR display effect fused with the ground scene (such as the road and obstacles) cannot be achieved adaptively for different driving scenes.
The image display method of the corresponding embodiment of fig. 2 involves adjusting the HUD imaging angle, which can solve the above-mentioned drawbacks of the prior art scheme. The HUD imaging angle will be briefly described. Fig. 1B is a schematic diagram of HUD imaging performed in a manner that a virtual image is nearly perpendicular to the ground, in fig. 1B, reference numeral 1 is a driving eye point, reference numeral 2 is a windshield, reference numeral 3 is a HUD structural component, reference numerals 4 and 5 are mirror components (including a secondary mirror and a primary mirror), reference numeral 6 is an image generator, and reference numeral 8 is an AR virtual image, that is, a virtual image that is visible at the driving eye point after the AR element generated by the image generator is projected by the HUD. The imaging angle of the AR virtual image is nearly perpendicular to the ground, at which point the mounting angle of the image generator 6 is a, and the mirror assemblies 4 and 5 match the current imaging system for free-form surface mirrors such that viewing the AR virtual image 8 at the driving eyepoint 1 satisfies the human eye visual constraint. Fig. 1C is a schematic diagram of HUD imaging in a manner that a virtual image is approximately fitted to a road surface, reference numerals 4 'and 5' in fig. 1C are mirror assemblies, reference numeral 6 'is an image generator, and reference numeral 8' is an AR virtual image. The imaging angle of the AR virtual image is approximately fitted to the road surface, and at this time, the installation angle of the image generator 6 'is b, and the mirror assemblies 4' and 5 'are free-form surface mirrors matching the current imaging system, so that viewing the AR virtual image 8' at the driving eyepoint 1 satisfies the human eye visual constraint condition.
In the two solutions of fig. 1B and 1C, respectively, the surface type and position of the free-form surface mirrors 4 and 5 are generally different from those of the free-form surface mirrors 4' and 5', and the position and the mounting angle (i.e., angle a) of the image generator 6 are also different from those of the image generator 6' (i.e., angle B). How to integrate the two schemes of fig. 1B and fig. 1C into the same optical system and adapt to different driving scenarios (such as urban driving scenario and high-speed driving scenario) is a problem to be solved by the present application.
Fig. 2 is a first flow chart of an image display method provided in an embodiment of the present application, applicable to determining the AR elements to be displayed and the imaging mode of an AR-HUD in different driving scenarios (e.g., urban and high-speed driving scenarios). The method can be executed by the image display apparatus provided in an embodiment of the present application; the apparatus can be realized in software and/or hardware and integrated in an electronic device that executes the method. Preferably, the electronic device in the embodiments of the present application may be a HUD in which an image generator, a rotation mechanism, and an image display assembly are arranged, the rotation mechanism being mounted at the geometric center point of the image generator to adjust its installation angle.
Referring to fig. 2, the method of the present embodiment includes, but is not limited to, the following steps:
s110, acquiring the speed of the vehicle at the current moment, and determining the AR element to be displayed and the target imaging angle corresponding to the AR element according to the speed.
Wherein the AR element is driving-assistance information related to road conditions during driving. Different vehicle speeds correspond to different types of AR elements; the number of AR element types equals the number of vehicle-speed interval ranges, which in turn equals the number of preset HUD imaging angles and the number of preset image-generator installation angles. In theory, any number of vehicle-speed interval ranges may be set, with a corresponding number of preset HUD imaging angles and preset image-generator installation angles. In the embodiments of the present application, the image display method is explained with two types of AR elements as the example, so the number of vehicle-speed interval ranges, preset HUD imaging angles, and preset image-generator installation angles is also two.
The AR element includes a first type element including at least any one of a vehicle speed icon and a gear icon, and a second type element including at least any one of a navigation icon, a lane line icon, an obstacle icon, a collision warning icon, and a following distance icon.
In the embodiment of the application, a micro control unit (Micro Controller Unit, MCU) in the HUD reads the speed of the vehicle at the current moment through a controller area network (Controller Area Network, CAN) interface. After the vehicle speed at the current moment is obtained, the MCU judges a vehicle speed interval range corresponding to the current vehicle speed, determines an AR element corresponding to the vehicle speed interval range, and determines a target imaging angle from a plurality of preset imaging angles based on the vehicle speed interval range.
For example, when the vehicle speed is 20 km/h, the MCU judges that the current speed falls in the interval below 60 km/h and determines that the vehicle is in an urban driving scene. The matching AR elements are then the first-type elements, i.e., the vehicle speed icon, gear icon, and the like. In consideration of driving safety and the driving experience, the HUD imaging angle is set so that the virtual image is perpendicular to the ground, e.g., the angle between the AR virtual image and the reference line (a line perpendicular to the ground) is 0°.
As another example, when the vehicle speed is 85 km/h, the MCU judges that the current speed falls in the interval of 60 km/h or above and determines that the vehicle is in a high-speed driving scene. The matching AR elements are then the second-type elements, enabling AR fusion and interaction with the external environment: the navigation icon, lane line icon, obstacle icon, collision warning icon, following distance icon, and the like. In consideration of driving safety and the driving experience, the HUD imaging angle is set so that the virtual image lies close to the ground, e.g., the angle between the AR virtual image and the reference line is 50°.
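The two worked examples above (20 km/h and 85 km/h) amount to a lookup from a speed interval to an element type and a preset imaging angle. A hedged sketch follows; the interval boundary and both angles come from this embodiment, while the icon identifiers are illustrative names:

```python
FIRST_TYPE = ["vehicle_speed_icon", "gear_icon"]
SECOND_TYPE = ["navigation_icon", "lane_line_icon", "obstacle_icon",
               "collision_warning_icon", "following_distance_icon"]

def scene_for_speed(speed_kmh: float):
    """Map the current speed to (scene, AR elements, imaging angle in degrees
    from the vertical reference line), per the two-interval embodiment."""
    if speed_kmh < 60:                       # urban driving scene
        return "urban", FIRST_TYPE, 0.0
    return "highway", SECOND_TYPE, 50.0      # high-speed driving scene
```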
S120, determining a target installation angle of the image generator according to the target imaging angle.
In the embodiment of the application, the imaging angle of the AR virtual image in the HUD imaging optical system affects the mounting angle of the image generator, and the AR virtual image imaging angle has a correlation with the image generator mounting angle.
Specifically, determining the target installation angle of the image generator according to the target imaging angle includes: acquiring the initial imaging angle at which the image display assembly displayed the AR element at the previous moment, and acquiring the initial installation angle of the image generator at the previous moment; determining an imaging-angle difference from the initial imaging angle and the target imaging angle; converting the imaging-angle difference into an installation-angle difference of the image generator according to a preset conversion formula; and determining the target installation angle of the image generator from the initial installation angle and the installation-angle difference. The preset conversion formula expresses the preconfigured relation between the imaging-angle difference of the AR element and the installation-angle difference of the image generator.
Assume the installation-angle difference of the image generator is x and the imaging-angle difference of the AR virtual image is y; the preset conversion formula between them is y = x × K, where K is a conversion coefficient whose value depends on the magnification of the HUD optical system and the inclination angle of the windshield, and K can be determined from those two quantities. For a specific HUD model and a specific windshield model, the optical-system magnification and the windshield inclination are fixed and known; by calibrating the imaging angle against the image-generator installation angle in repeated tests beforehand, the conversion coefficient between the two angle differences can be determined and preconfigured in the HUD system. The imaging-angle difference of the AR virtual image can then be converted into the installation-angle difference of the image generator using this preset conversion coefficient.
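Given a calibrated coefficient K, the relation y = x × K can be applied in both directions; a sketch (the value of K used in the test is back-derived from the 3.765°/50° application example in the next paragraph, and is not stated in the patent as a constant):

```python
def install_angle_diff(imaging_angle_diff_deg: float, k: float) -> float:
    """Invert y = x * K: the installation-angle difference x of the image
    generator needed to change the AR virtual image angle by y degrees."""
    return imaging_angle_diff_deg / k

def imaging_angle_diff(install_angle_diff_deg: float, k: float) -> float:
    """Forward direction y = x * K: imaging-angle change produced by
    rotating the image generator by x degrees."""
    return install_angle_diff_deg * k
```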
Illustratively, in one specific application example, the image generator mounting angle is rotated clockwise by 3.765 °, and correspondingly the AR virtual image imaging angle is rotated counterclockwise by 50 °. In different application examples, as the conversion coefficients have different values, the corresponding relationship between the installation angle of the image generator and the imaging angle of the AR virtual image is different, but the following rule is followed: the clockwise rotation of the installation angle of the image generator corresponds to the anticlockwise rotation of the imaging angle of the AR virtual image, and the preset conversion relation is met between the installation angle difference value of the image generator and the imaging angle difference value of the AR virtual image.
S130, driving the rotation mechanism to rotate so as to adjust the installation angle of the image generator to be at a target installation angle.
The rotation mechanism is arranged at the geometric center point of the image generator (reference numeral 7 in fig. 3) and is used to adjust the mounting angle of the image generator. It should be noted that when the mounting angle of the image generator is rotated, the geometric center of the image generator does not move; that is, the image generator rotates around its geometric center.
Specifically, driving the rotation mechanism to rotate so as to adjust the installation angle of the image generator to the target installation angle includes: determining a first direction of imaging angle change based on the initial imaging angle and the target imaging angle; determining a second direction of installation angle change of the image generator according to the first direction; determining driving parameters of the rotation mechanism based on the installation angle difference and the second direction; and driving the rotation mechanism to rotate through pulse width modulation based on the driving parameters, so that the image generator is adjusted from the initial installation angle to the target installation angle by the rotation of the rotation mechanism. The first direction is either clockwise or counterclockwise, and the driving parameters include the rotation direction and the number of rotation steps of the rotation mechanism.
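The steps above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the per-step resolution `STEP_ANGLE_DEG` is a hypothetical value, and positive angles are arbitrarily taken as counterclockwise.

```python
STEP_ANGLE_DEG = 0.05  # hypothetical per-step resolution of the stepper motor

def driving_parameters(initial_imaging_deg: float,
                       target_imaging_deg: float,
                       k: float) -> tuple[str, int]:
    """Return (rotation direction, step count) for the rotation mechanism.

    By light-path reversibility the mounting angle turns opposite to the
    imaging angle, so the second direction reverses the first.
    """
    imaging_diff = target_imaging_deg - initial_imaging_deg
    # First direction: how the imaging angle changes.
    first_dir = "counterclockwise" if imaging_diff >= 0 else "clockwise"
    # Second direction: how the image generator must be rotated.
    second_dir = "clockwise" if first_dir == "counterclockwise" else "counterclockwise"
    # Convert the imaging-angle difference to a mounting-angle difference
    # with the preset relation y = x * K, then quantise into motor steps.
    mounting_diff_deg = abs(imaging_diff) / k
    steps = round(mounting_diff_deg / STEP_ANGLE_DEG)
    return second_dir, steps
```

The returned direction and step count would then parameterise the pulse-width-modulation drive of the mechanism.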
According to the principle of reversibility of the light path, when the mounting angle of the image generator rotates clockwise, the corresponding imaging angle of the AR virtual image rotates counterclockwise. The second direction and the first direction are therefore opposite to each other.
Fig. 3 is a schematic diagram of implementing different AR virtual image imaging angles in the same optical system, where reference numeral 1 is the driving eyepoint, reference numeral 2 is the windshield glass, reference numeral 3 is the HUD structural component, reference numerals 4 and 5 are the mirror components (a secondary mirror and a primary mirror), reference numerals 6 and 6' are the image generator at two mounting angles, reference numeral 7 is the rotation mechanism, and reference numerals 8 and 8' are the AR virtual image at two imaging angles. The solid lines show the optical transmission corresponding to the AR virtual image in one imaging mode, and the dotted lines show the optical transmission corresponding to the AR virtual image in the other imaging mode.
For example, if the vehicle speed at the previous moment was greater than 60 km/h, the image generator is at the mounting angle of reference numeral 6' in fig. 3 and the AR virtual image is at the imaging angle of reference numeral 8'. When the vehicle speed at the current moment is 20 km/h, the MCU determines that the vehicle is currently in an urban driving scene and determines the AR elements and the HUD imaging angle matching that scene. The MCU then calculates the driving parameters of the rotation mechanism from the HUD imaging angle and drives the rotation mechanism so that the image generator rotates to the installation angle corresponding to that imaging angle: as shown in fig. 3, the image generator rotates counterclockwise from mounting angle 6' to mounting angle 6, and the AR virtual image rotates clockwise from imaging angle 8' to imaging angle 8.
Also by way of example, if the vehicle speed at the previous moment was less than 60 km/h, the image generator is at the mounting angle of reference numeral 6 in fig. 3 and the AR virtual image is at the imaging angle of reference numeral 8. When the vehicle speed at the current moment is 85 km/h, the MCU determines that the vehicle is currently in a high-speed driving scene and determines the AR elements and the HUD imaging angle matching that scene. The MCU then calculates the driving parameters of the rotation mechanism from the HUD imaging angle and drives the rotation mechanism so that the image generator rotates to the installation angle corresponding to that imaging angle: as shown in fig. 3, the image generator rotates clockwise from mounting angle 6 to mounting angle 6', and the AR virtual image rotates counterclockwise from imaging angle 8 to imaging angle 8'.
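The two examples above follow one speed-to-scene rule, sketched below. The 60 km/h threshold comes from the examples; the imaging angles and AR element sets are illustrative placeholders, not values from the patent.

```python
SPEED_THRESHOLD_KMH = 60.0  # urban / high-speed boundary used in the examples

# Hypothetical per-scene configuration; the two imaging modes correspond
# to reference numerals 8 and 8' in fig. 3.
SCENES = {
    "urban": {"imaging_angle_deg": 90.0,
              "elements": ["vehicle_speed", "navigation", "lane_line"]},
    "highway": {"imaging_angle_deg": 40.0,
                "elements": ["vehicle_speed", "following_distance",
                             "collision_warning"]},
}

def select_scene(speed_kmh: float) -> str:
    """Map the current vehicle speed to a driving scene."""
    return "highway" if speed_kmh > SPEED_THRESHOLD_KMH else "urban"

def target_imaging_angle(speed_kmh: float) -> float:
    """Target HUD imaging angle for the current speed."""
    return SCENES[select_scene(speed_kmh)]["imaging_angle_deg"]
```

A production system would use a table of speed interval ranges rather than a single threshold, as described later for the imaging angle determining module.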
And S140, generating AR elements through an image generator under the target installation angle, and displaying the AR elements in an image display component.
In the embodiment of the application, after the image generator has been adjusted from the initial installation angle to the target installation angle, the AR element is generated by the image generator at the target installation angle and displayed to the driver through the image display component in the HUD, thereby assisting the driver in safe driving. Alternatively, the image display component may be an area of the vehicle windshield.
According to the technical scheme provided by this embodiment, the vehicle speed at the current moment is obtained, and the AR element to be displayed and the target imaging angle corresponding to the AR element are determined according to the speed; the target installation angle of the image generator is determined according to the target imaging angle; the rotation mechanism is driven to rotate so as to adjust the installation angle of the image generator to the target installation angle; and at the target installation angle, the AR element is generated by the image generator and presented in the image display component. By configuring a rotation mechanism at the geometric center point of the image generator, the installation angle of the image generator can be adjusted in hardware. The method determines the imaging angle of the AR element from the current vehicle speed, calculates the installation angle of the image generator and the driving parameters of the rotation mechanism from that imaging angle, and finally drives the rotation mechanism based on the driving parameters so that the image generator reaches the target installation angle. Controlling the AR element to be displayed at an imaging angle appropriate to the current vehicle speed allows different driving scenes (such as urban and high-speed driving scenes) to be fused better, improving driving safety and the driving experience.
In an alternative embodiment, the head-up display further includes a mirror assembly configured to reflect the AR element to the image display assembly. At least one mirror in the mirror assembly is a free-form surface mirror, whose surface-type parameters and installation position are jointly determined by a plurality of preset imaging angles supported by the head-up display. The target imaging angle is one of the plurality of preset imaging angles, each preset imaging angle being the angle between the virtual image of the AR element presented in the image display assembly and a reference line perpendicular to the ground.
Fig. 3 is a schematic diagram of implementing different AR virtual image imaging angles in the same optical system. The embodiment corresponding to fig. 2 solves how to integrate the two schemes of fig. 1B and fig. 1C into the same optical system so as to adapt to different driving scenes (such as urban and high-speed driving scenes), but before executing the method of the embodiment corresponding to fig. 2, it must also be ensured that the AR virtual images at the different imaging angles in fig. 3 satisfy the visual constraint condition. This step therefore describes how to adjust the surface type and position of the mirror assembly (reference numerals 4 and 5 in the figure) and the position of the image generator (reference numerals 6 and 6' in the figure) so that the AR elements viewed at the driving eyepoint at the different imaging angles all meet the visual constraint condition. The visual constraint condition includes that the distortion rate of the virtual image of the AR element in the image display assembly is less than or equal to a preset first distortion rate, and that the modulation transfer function value of that virtual image at 6 line pairs/mm is greater than or equal to a preset first function value.
In an alternative embodiment, the preset first distortion rate is 5% and the preset first function value is 0.3; that is, the visual constraint condition includes that the distortion rate of the virtual image presented by the AR element in the image display assembly is less than or equal to 5% and that the modulation transfer function value of that virtual image at 6 line pairs/mm is greater than or equal to 0.3. The modulation transfer function is the ratio of the contrast of the output image of the AR element to that of the input image during transmission through the HUD imaging light path assembly.
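The constraint check itself is a simple predicate. A minimal sketch using the values of this embodiment (5% distortion, MTF ≥ 0.3 at 6 line pairs/mm), with the measurement of distortion and MTF assumed to come from the optical design tooling:

```python
def meets_visual_constraints(distortion_pct: float,
                             mtf_at_6lp_per_mm: float,
                             max_distortion_pct: float = 5.0,
                             min_mtf: float = 0.3) -> bool:
    """Visual constraint condition of this embodiment: distortion <= 5 %
    and modulation transfer function >= 0.3 at 6 line pairs per mm."""
    return (distortion_pct <= max_distortion_pct
            and mtf_at_6lp_per_mm >= min_mtf)
```
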
Specifically, the target surface-type parameters and target installation position of the free-form surface mirror and the target installation position of the image generator are determined as follows: determining the initial surface-type parameters and a first installation position of the free-form surface mirror, and a second installation position of the image generator, for the process of displaying the AR element at each of the plurality of preset imaging angles; determining, according to the initial surface-type parameters, the first installation position, and the second installation position, a visual evaluation function and a visual constraint condition corresponding to the AR element viewed at the driving eyepoint; and adjusting the initial surface-type parameters, the first installation position, and the second installation position so that the visual evaluation function is satisfied under the visual constraint condition, thereby obtaining the target surface-type parameters and target installation position of the free-form surface mirror and the target installation position of the image generator. The visual evaluation function is a functional relationship whose independent variables are the initial surface-type parameters, the first installation position, and the second installation position, and whose dependent variable is the imaging effect of the AR element viewed at the driving eyepoint.
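The adjust-until-the-constraint-holds procedure can be framed as a constrained search. The toy sketch below is only a stand-in for the patent's optical optimization: `evaluate` (the visual evaluation function) and `feasible` (the visual constraint check) are placeholder callables, and in practice ray-tracing software would supply both.

```python
import itertools

def optimize_layout(evaluate, feasible,
                    surface_params, mirror_positions, generator_positions):
    """Exhaustively search candidate free-form-surface parameters and
    installation positions, keep only combinations satisfying the visual
    constraint (`feasible`), and return the combination with the best
    visual-evaluation score, or None if no candidate is feasible."""
    best, best_score = None, float("-inf")
    for cand in itertools.product(surface_params, mirror_positions,
                                  generator_positions):
        if not feasible(*cand):
            continue
        score = evaluate(*cand)
        if score > best_score:
            best, best_score = cand, score
    return best
```

Real free-form-surface design would use gradient-based or damped-least-squares optimization over continuous surface coefficients rather than exhaustive search; the sketch only shows the role the evaluation function and constraint play.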
Fig. 4 is a schematic structural diagram of an image display device provided in an embodiment of the present application. The image display device is configured in a head-up display in which an image generator, a rotation mechanism, and an image display assembly are arranged, the rotation mechanism being arranged at the geometric center point of the image generator and used to adjust the installation angle of the image generator. As shown in fig. 4, the device 400 may include:
the imaging angle determining module 410 is configured to obtain a vehicle speed of the vehicle at a current moment, and determine an augmented reality AR element to be displayed and a target imaging angle corresponding to the AR element according to the vehicle speed;
a mounting angle determining module 420 for determining a target mounting angle of the image generator according to the target imaging angle;
a mounting angle adjusting module 430 for driving the rotating mechanism to rotate so as to adjust the mounting angle of the image generator to be at the target mounting angle;
an image display module 440 for generating the AR element by the image generator at the target installation angle and displaying the AR element in the image display component.
Further, the installation angle determining module 420 may be specifically configured to: acquiring an initial imaging angle of the AR element displayed by the image display assembly at the last moment, and acquiring an initial installation angle of the image generator at the last moment; determining an imaging angle difference based on the initial imaging angle and the target imaging angle; converting the imaging angle difference value into an installation angle difference value of the image generator according to a conversion coefficient, wherein the conversion coefficient is determined by the magnification of an optical system of the head-up display and the inclination angle of a windshield; and determining the target installation angle of the image generator according to the initial installation angle and the installation angle difference value.
Further, the installation angle adjusting module 430 may be specifically configured to: determining a first direction of imaging angle change based on the initial imaging angle and the target imaging angle, the first direction being either a clockwise direction or a counter-clockwise direction; determining a second direction of the change of the installation angle of the image generator according to the first direction, wherein the second direction and the first direction are opposite to each other; determining a driving parameter of the rotating mechanism based on the mounting angle difference and the second direction; and driving the rotating mechanism to rotate through pulse width modulation based on the driving parameters, and adjusting the image generator from the initial installation angle to the target installation angle under the rotation of the rotating mechanism.
Optionally, the head-up display further includes a mirror assembly, the mirror assembly is configured to reflect the AR element to the image display assembly, at least one mirror in the mirror assembly is a free-form surface mirror, a surface parameter and an installation position of the free-form surface mirror are jointly determined by a plurality of preset imaging angles supported by the head-up display, the target imaging angle is one of the preset imaging angles, the preset imaging angle is an angle between a virtual image and a reference line of the AR element in the image display assembly, and the reference line is a line perpendicular to the ground.
Optionally, the target surface type parameter and the target mounting position of the free-form surface mirror and the target mounting position of the image generator are determined by: determining initial surface type parameters and a first installation position of the free-form surface mirror, and a second installation position of the image generator, in the process of displaying the AR element through each preset imaging angle in the plurality of preset imaging angles; determining a visual evaluation function and a visual constraint condition corresponding to the AR element viewed at a driving eyepoint according to the initial surface type parameter, the first installation position and the second installation position; and adjusting the initial surface type parameter, the first installation position and the second installation position so that the visual evaluation function is satisfied under the visual constraint condition, thereby obtaining the target surface type parameter and the target installation position of the free-form surface mirror and the target installation position of the image generator. The visual evaluation function is a functional relationship whose independent variables are the initial surface type parameter, the first installation position and the second installation position, and whose dependent variable is the imaging effect of the AR element viewed at the driving eyepoint. The visual constraint condition includes that a distortion rate of the virtual image of the AR element in the image display assembly is less than or equal to a preset first distortion rate and that a modulation transfer function value of that virtual image at 6 line pairs/mm is greater than or equal to a preset first function value.
Further, the imaging angle determining module 410 may be specifically configured to: determining a vehicle speed interval range corresponding to the vehicle speed; determining the AR element corresponding to the vehicle speed interval range; and determining the target imaging angle from the plurality of preset imaging angles based on the vehicle speed interval range.
Optionally, the AR element includes a first type element and a second type element, the first type element includes at least one of a vehicle speed icon and a gear icon, and the second type element includes at least one of a navigation icon, a lane line icon, an obstacle icon, a collision warning icon, and a following distance icon.
The image display device provided by the embodiment is applicable to the image display method provided by any embodiment, and has corresponding functions and beneficial effects.
Fig. 5 is a block diagram of an electronic device for implementing an image presentation method according to an embodiment of the present application. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. It may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches), and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be exemplary only and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11 and memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13, in which a computer program executable by the at least one processor is stored. The processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14, to which an input/output (I/O) interface 15 is also connected.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various processors running machine learning model algorithms, digital signal processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the various methods and processes described above, such as the image presentation method.
In some embodiments, the image presentation method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the image presentation method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the image presentation method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above can be realized in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out the methods of the present application may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), a middleware component (e.g., an application server), a front-end component through which a user can interact with an implementation of the systems and techniques, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network; the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service scalability of traditional physical hosts and VPS services.
Note that the above are only preferred embodiments of the present application and the technical principles applied. Those skilled in the art will appreciate that the present application is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made without departing from the scope of the present application. For example, the steps recited in the present application may be reordered, added, or deleted, and may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions of the present application are achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (8)

1. The image display method is characterized by being applied to a head-up display, wherein an image generator, a rotating mechanism and an image display assembly are arranged in the head-up display, and the rotating mechanism is arranged on the geometric center point of the image generator and used for adjusting the installation angle of the image generator; the method comprises the following steps:
Acquiring the speed of a vehicle at the current moment, and determining an Augmented Reality (AR) element to be displayed and a target imaging angle corresponding to the AR element according to the speed;
determining a target mounting angle of the image generator according to the target imaging angle, comprising: acquiring an initial imaging angle of the AR element displayed by the image display assembly at the last moment, and acquiring an initial installation angle of the image generator at the last moment; determining an imaging angle difference based on the initial imaging angle and the target imaging angle; converting the imaging angle difference value into an installation angle difference value of the image generator according to a conversion coefficient, wherein the conversion coefficient is determined by the magnification of an optical system of the head-up display and the inclination angle of a windshield; determining the target installation angle of the image generator according to the initial installation angle and the installation angle difference value;
driving the rotation mechanism to rotate so as to adjust the installation angle of the image generator to be at the target installation angle, comprising: determining a first direction of imaging angle change based on the initial imaging angle and the target imaging angle, the first direction being either a clockwise direction or a counter-clockwise direction; determining a second direction of the change of the installation angle of the image generator according to the first direction, wherein the second direction and the first direction are opposite to each other; determining a driving parameter of the rotating mechanism based on the mounting angle difference and the second direction; driving the rotating mechanism to rotate through pulse width modulation based on the driving parameters, and adjusting the image generator from the initial installation angle to the target installation angle under the rotation of the rotating mechanism;
At the target installation angle, generating the AR element by the image generator and displaying the AR element in the image display component.
2. The image display method according to claim 1, wherein the head-up display further comprises a mirror assembly for reflecting the AR element to the image display assembly, at least one mirror of the mirror assembly is a free-form surface mirror, a surface type parameter and a mounting position of the free-form surface mirror are commonly determined by a plurality of preset imaging angles supported by the head-up display, the target imaging angle is one of the plurality of preset imaging angles, the preset imaging angle is an angle between a virtual image presented by the AR element in the image display assembly and a reference line, and the reference line is a line perpendicular to the ground.
3. The image presentation method of claim 2, wherein the target surface type parameter and target mounting position of the freeform mirror and the target mounting position of the image generator are determined by:
determining initial surface type parameters and a first installation position of the free-form surface mirror corresponding to the AR element in the process of displaying the AR element through each preset imaging angle in the plurality of preset imaging angles, and determining a second installation position of the image generator;
Determining a visual evaluation function and a visual constraint condition corresponding to the AR element watched at a driving eye point according to the initial surface type parameter, the first installation position and the second installation position;
the initial surface type parameter, the first installation position and the second installation position are adjusted to enable the visual evaluation function to be established under the visual constraint condition, so that the target surface type parameter and the target installation position of the free-form surface mirror and the target installation position of the image generator are obtained;
the visual evaluation function is a functional relationship whose independent variables are the initial surface type parameter, the first installation position and the second installation position, and whose dependent variable is the imaging effect of the AR element viewed at the driving eyepoint; the visual constraint condition includes that a distortion rate of a virtual image of the AR element in the image display assembly is less than or equal to a preset first distortion rate and that a modulation transfer function value of the virtual image of the AR element in the image display assembly at 6 line pairs/mm is greater than or equal to a preset first function value.
4. The image display method according to claim 2, wherein the determining, according to the vehicle speed, the augmented reality AR element to be displayed and the target imaging angle corresponding to the AR element comprises:
determining the vehicle speed interval range corresponding to the vehicle speed;
determining the AR element corresponding to the vehicle speed interval range; and
determining the target imaging angle from the plurality of preset imaging angles based on the vehicle speed interval range.
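The speed-to-angle lookup of claim 4 amounts to an interval table. The sketch below is illustrative only: the interval bounds and imaging angles in `SPEED_INTERVALS` are invented for the example, as the patent does not disclose concrete ranges or angle values.

```python
# (lower bound km/h, upper bound km/h, preset imaging angle in degrees)
# All numbers are assumed for illustration.
SPEED_INTERVALS = [
    (0.0, 40.0, 90.0),           # low speed
    (40.0, 80.0, 75.0),          # medium speed
    (80.0, float("inf"), 60.0),  # high speed
]

def target_imaging_angle(speed_kmh: float) -> float:
    """Return the preset imaging angle for the interval containing the
    given vehicle speed (half-open intervals, lower bound inclusive)."""
    for low, high, angle in SPEED_INTERVALS:
        if low <= speed_kmh < high:
            return angle
    raise ValueError("vehicle speed out of range")
```

The same table could carry a third field mapping each interval to the AR elements to display, mirroring the two determinations in the claim.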
5. The image display method according to claim 1, wherein the AR element comprises a first type element and a second type element, the first type element comprising at least one of a vehicle speed icon and a gear icon, and the second type element comprising at least one of a navigation icon, a lane line icon, an obstacle icon, a collision warning icon, and a following distance icon.
6. An image display device, provided for a head-up display, wherein an image generator, a rotating mechanism, and an image display assembly are arranged in the head-up display, the rotating mechanism being arranged at the geometric center point of the image generator and configured to adjust the installation angle of the image generator; the device comprising:
an imaging angle determining module, configured to obtain the vehicle speed at the current moment and to determine, according to the vehicle speed, an augmented reality (AR) element to be displayed and a target imaging angle corresponding to the AR element;
an installation angle determining module, configured to determine a target installation angle of the image generator according to the target imaging angle;
an installation angle adjusting module, configured to drive the rotating mechanism to rotate so as to adjust the installation angle of the image generator to the target installation angle; and
an image display module, configured to generate the AR element through the image generator at the target installation angle and to display the AR element in the image display assembly;
wherein the installation angle determining module is specifically configured to: obtain an initial imaging angle of the AR element displayed by the image display assembly at a previous moment, and obtain an initial installation angle of the image generator at the previous moment; determine an imaging angle difference based on the initial imaging angle and the target imaging angle; convert the imaging angle difference into an installation angle difference of the image generator according to a conversion coefficient, the conversion coefficient being determined by the magnification of the optical system of the head-up display and the inclination angle of the windshield; and determine the target installation angle of the image generator according to the initial installation angle and the installation angle difference;
and the installation angle adjusting module is specifically configured to: determine a first direction of the imaging angle change based on the initial imaging angle and the target imaging angle, the first direction being clockwise or counterclockwise; determine a second direction of the installation angle change of the image generator according to the first direction, the second direction being opposite to the first direction; determine a driving parameter of the rotating mechanism based on the installation angle difference and the second direction; and drive the rotating mechanism to rotate through pulse width modulation based on the driving parameter, the image generator being adjusted from the initial installation angle to the target installation angle by the rotation of the rotating mechanism.
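The angle-update procedure in claim 6 can be sketched as follows. This is a minimal illustration under stated assumptions: the conversion coefficient is modeled as a single constant `conversion_coeff` (the patent derives it from the optical magnification and windshield inclination but gives no formula), and the sign convention for "opposite direction" is an assumption; the PWM drive of the rotating mechanism is omitted.

```python
def new_installation_angle(initial_imaging: float,
                           target_imaging: float,
                           initial_install: float,
                           conversion_coeff: float = 0.5) -> float:
    """Compute the target installation angle of the image generator.

    The imaging-angle difference is scaled by the conversion coefficient
    (assumed constant here) to obtain the installation-angle difference,
    which is applied in the direction OPPOSITE to the imaging-angle
    change, per the first/second direction relation in claim 6.
    """
    imaging_diff = target_imaging - initial_imaging
    install_diff = conversion_coeff * abs(imaging_diff)
    if imaging_diff > 0:
        # imaging angle increases -> installation angle decreases
        return initial_install - install_diff
    # imaging angle decreases (or is unchanged) -> installation angle increases
    return initial_install + install_diff
```

A controller would then translate the signed installation-angle difference into a pulse-width-modulation duty cycle and direction bit for the rotating mechanism.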
7. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the image display method of any one of claims 1 to 5.
8. A computer-readable storage medium storing computer instructions for causing a processor to perform the image display method of any one of claims 1 to 5.
CN202310682698.9A 2023-06-09 2023-06-09 Image display method and device, electronic equipment and storage medium Active CN116883625B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310682698.9A CN116883625B (en) 2023-06-09 2023-06-09 Image display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310682698.9A CN116883625B (en) 2023-06-09 2023-06-09 Image display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116883625A CN116883625A (en) 2023-10-13
CN116883625B true CN116883625B (en) 2024-03-22

Family

ID=88253841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310682698.9A Active CN116883625B (en) 2023-06-09 2023-06-09 Image display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116883625B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB287608A (en) * 1926-12-10 1928-03-12 Barr & Stroud Ltd Apparatus for the solution by optical means of data capable of representation in theform of triangles
JP2019191721A (en) * 2018-04-20 2019-10-31 名古屋電機工業株式会社 Movement controller, movement control method, and movement control program
CN110471082A (en) * 2019-08-13 2019-11-19 西安电子科技大学 Single pixel laser calculates imaging device and method
WO2020124992A1 (en) * 2018-12-19 2020-06-25 南京理工大学 Aperture coding imaging system based on transmission-type dual slits, and super-resolution method therefor
CN114616817A (en) * 2020-01-22 2022-06-10 奥迪股份公司 Method for producing reproducible visual angle of photographic image of object and mobile equipment with integrated camera
CN114877871A (en) * 2022-05-06 2022-08-09 中国人民解放军国防科技大学 Attitude staring control method for deep space target observation by uncalibrated video satellite

Also Published As

Publication number Publication date
CN116883625A (en) 2023-10-13

Similar Documents

Publication Publication Date Title
EP3368965B1 (en) Remote rendering for virtual images
CN106502427B (en) Virtual reality system and scene presenting method thereof
WO2022188096A1 (en) Hud system, vehicle, and virtual image position adjustment method
US10482666B2 (en) Display control methods and apparatuses
US11232602B2 (en) Image processing method and computing device for augmented reality device, augmented reality system, augmented reality device as well as computer-readable storage medium
CN207557584U (en) Augmented reality head-up display device
GB2532954A (en) Display control system for an augmented reality display system
JP2017030737A (en) Display device for vehicle and display method for vehicle
JP2019174693A (en) Display system, control unit, control method, program, and movable body
CN112384883A (en) Wearable device and control method thereof
CN112596247A (en) Image display method and device and head-mounted display equipment
US11394938B2 (en) Projection apparatus, system and projection method
CN115525152A (en) Image processing method, system, device, electronic equipment and storage medium
US20220013046A1 (en) Virtual image display system, image display method, head-up display, and moving vehicle
EP3961353A1 (en) Method and apparatus for controlling head-up display based on eye tracking status
CN116883625B (en) Image display method and device, electronic equipment and storage medium
CN116449569A (en) AR-HUD imaging system and imaging display control method
CN115128815A (en) Image display method and device, electronic equipment and storage medium
CN116974084B (en) Image display method and device, electronic equipment and storage medium
CN115431764B (en) AR scale display method and device, electronic equipment and storage medium
CN115542557A (en) Image display method and device, electronic equipment and storage medium
CN111241946A (en) Method and system for increasing FOV (field of view) based on single DLP (digital light processing) optical machine
CN116243880B (en) Image display method, electronic equipment and storage medium
CN115665400B (en) Augmented reality head-up display imaging method, device, equipment and storage medium
CN116338958A (en) Double-layer image imaging method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant