CN108896281B - Viewing zone width measuring method and system based on Hud system - Google Patents

Viewing zone width measuring method and system based on Hud system

Info

Publication number
CN108896281B
CN108896281B CN201810864231.5A CN201810864231A
Authority
CN
China
Prior art keywords
image
cameras
width
images
shot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810864231.5A
Other languages
Chinese (zh)
Other versions
CN108896281A (en)
Inventor
杨云召
罗伟慧
夏正国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhangjiagang Kangdexin Optronics Material Co Ltd
Original Assignee
Zhangjiagang Kangdexin Optronics Material Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhangjiagang Kangdexin Optronics Material Co Ltd filed Critical Zhangjiagang Kangdexin Optronics Material Co Ltd
Priority to CN201810864231.5A priority Critical patent/CN108896281B/en
Publication of CN108896281A publication Critical patent/CN108896281A/en
Application granted granted Critical
Publication of CN108896281B publication Critical patent/CN108896281B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for

Abstract

The embodiment of the invention discloses a method and a system for measuring the width of a visual area based on a Hud system, wherein the method comprises the following steps: sequentially adjusting the central view offset of the content displayed on the screen by a set step length until the central view offset equals one layout period width; for each adjustment, shooting virtual images of the display content with a left camera and a right camera, and marking the images shot by the two cameras with the same number; selecting an image from the images shot by the left camera, and selecting, from the images shot by the right camera, an image with the same brightness as the image selected from the images shot by the left camera; and determining the width of the visual area at the center of the two cameras according to the numbers of the two selected images, the layout period width, the step length and the distance between the two cameras. The embodiment of the invention can accurately measure the width of the visual area at the optimal viewing distance and improve the accuracy and reliability of data measurement.

Description

Viewing zone width measuring method and system based on Hud system
Technical Field
The embodiment of the invention relates to the technical field of Hud, in particular to a method and a system for measuring the width of a visual area based on a Hud system.
Background
A Head-Up Display (Hud) system is an auxiliary vision system for safe driving. A Hud system projects important information needed by the driver onto the windshield; the projected light is reflected into the driver's eyes, so that the driver can see the instrument display and its data without lowering the head. This reduces the delay and discomfort caused by missing rapid changes in the external environment while lowering and raising the head and by having to continuously re-adjust the focus of the eyes. A 3D Augmented Reality (AR) head-up display, i.e. a 3D-AR-Hud system, is a Hud system combined with AR technology. It not only outputs information on the front windshield but also fuses more of that information with the environment outside the vehicle, such as the road, pedestrians ahead, vehicles and surrounding buildings, giving the driver friendly reminders and thereby effectively preventing accidents.
In a 3D-AR-Hud system, if the driver's viewing position is inappropriate, light intended for the left eye may enter the right eye; the right eye then sees both the left image and the right image, which easily produces 3D crosstalk. This directly affects the in-and-out-of-screen (depth) effect and the viewing comfort of the system, and the amount of 3D crosstalk is strongly affected by the design of the visual area width (i.e. the cone width) of the 3D-AR-Hud system. Existing 3D-AR-Hud systems measure an approximate cone width by having a single eye view a layout image, and the accuracy and reliability of the resulting measurement data are low.
Disclosure of Invention
The embodiment of the invention provides a method and a system for measuring the width of a visual area based on a Hud system, which can accurately measure the width of the visual area and improve the accuracy and reliability of data measurement.
In a first aspect, an embodiment of the present invention provides a method for measuring a width of a viewing area based on a head-up display system, including:
sequentially adjusting the central view offset of the displayed content on the screen by a set step length until the central view offset is the width of a layout period; wherein, the screen displays left and right viewpoint images with different brightness;
for each adjustment, shooting virtual images of the display content through a left camera and a right camera, and marking the images shot by the two cameras respectively by adopting the same number; wherein the number of the image is the same as the adjusted order number;
selecting an image from the images shot by the left camera, and selecting an image with the same brightness as the image selected from the images shot by the left camera from the images shot by the right camera;
and determining the width of a visual area at the centers of the two cameras according to the number of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras.
In a second aspect, an embodiment of the present invention further provides a system for measuring a width of a viewing zone based on a head-up display system, including:
the device comprises a head-up display system, a left camera, a right camera and a main control device;
the head-up display system is used for displaying display contents through a screen and sequentially adjusting the central view offset of the display contents on the screen by a set step length until the central view offset is one arrangement period width; wherein two viewpoint images of different brightness are displayed on the screen;
the two cameras are used for shooting virtual images of the display content for each adjustment;
the master control equipment is respectively connected with the two cameras and used for:
for each adjustment, marking the images shot by the two cameras respectively by adopting the same serial number; wherein the number of the image is the same as the adjusted order number;
selecting an image from the images shot by the left camera, and selecting an image with the same brightness as the image selected from the images shot by the left camera from the images shot by the right camera;
and determining the width of a visual area at the centers of the two cameras according to the number of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras.
According to the embodiment of the invention, the central view offset of the content displayed on the screen is sequentially adjusted by the set step length until the central view offset equals one layout period width, and during the adjustment the virtual images of the display content are shot by the two cameras and marked respectively. Two images with the same brightness are then selected from the images shot by the two cameras, and the width of the visual area at the center of the two cameras is determined according to the numbers of the two selected images, the layout period width, the step length and the distance between the two cameras. The visual area width can thus be measured accurately, and the accuracy and reliability of data measurement are improved.
Drawings
Fig. 1a is a flowchart illustrating a method for measuring a viewing area width based on a head-up display system according to an embodiment of the present invention;
fig. 1b is a schematic view of an application scenario of a head-up display system according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for measuring an in-out screen distance based on a head-up display system according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a viewing area width measuring system based on a head-up display system according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an in-out screen distance measuring system based on a head-up display (Hud) system according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention.
It should be further noted that, for the convenience of description, only some but not all of the relevant aspects of the present invention are shown in the drawings. Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Example one
Fig. 1a is a flowchart of a method for measuring a viewing area width based on a head-up display system according to an embodiment of the present invention, where the method is applicable to a situation of accurately measuring a viewing area width at an optimal viewing distance, and the method can be executed by a viewing area width measuring system based on a head-up display system, where the system may include a head-up display system, two left and right cameras, and a main control device. Accordingly, as shown in fig. 1a, the method comprises the following operations:
s110, sequentially adjusting the central view offset of the content displayed on the screen by a set step length until the central view offset is one arrangement period width; wherein, the left and right two-viewpoint images with different brightness are displayed on the screen.
The set step length may be set according to Hud system layout period width, and if the layout period width is 1, the step length may be 0.25 or 0.5, etc. The embodiments of the present invention do not limit this.
Here, the screen is the display screen in the Hud system, and the screen can display image content in 3D form. The center view offset may be the offset of the center of each viewpoint on the screen, or the offset of the center viewpoint of the display content. The two-viewpoint images may be black and white two-viewpoint images intended for the left eye and the right eye respectively. Optionally, the two-viewpoint images displayed on the screen are in black-and-white format so that the brightness information of the images captured by the two cameras in the 3D-AR-Hud system can be clearly distinguished.
In an embodiment of the present invention, the heads-up display system may be a 3D-AR-Hud system. Fig. 1b is a schematic diagram of an application scenario of a head-up display system according to an embodiment of the present invention, and in a specific example, as shown in fig. 1b, a 3D-AR-HUD system 101 projects an image onto a windshield 102, where the windshield has a PVB film thereon, and a user can see a virtual 3D image at a position 103 through the windshield 102. The image distance (i.e. the virtual image distance from the virtual image to the human eye) is typically between 5 and 8 meters.
According to the embodiment of the invention, the two eyes of a viewer are simulated by arranging a left camera and a right camera. The display content on the screen is reflected by the windshield, the reflected light enters the human eyes, and the human eyes see a virtual image of the display content that appears in front of the windshield.
In an optional embodiment of the present invention, before sequentially adjusting the center view offset of the content displayed on the screen by the set step length, the method may further include: and adjusting the layout parameters of the display content on the screen so that the virtual image of the display content shot by the left camera is a single-viewpoint image, and/or the virtual image of the display content shot by the right camera is a single-viewpoint image.
When the width of the visual area is measured, the two cameras simulate the eyes of a viewer. The layout parameters of the display content, such as the layout period width, may first be adjusted so that the left and right cameras each capture a single-viewpoint image: the left camera acquires an all-black full-screen image and the right camera acquires an all-white full-screen image, or the right camera acquires the all-black full-screen image and the left camera acquires the all-white full-screen image. The distance from the center of the two cameras to the plane where the virtual image is located is thereby determined.
Optionally, the 3D-AR-HUD interleaving software may adjust the layout parameters using its own functions. When both cameras can acquire single-viewpoint images, the distance from the camera center to the plane where the virtual image is located is the optimal viewing distance. After the layout parameters of the display content have been adjusted in this way, the visual area width at the center of the two cameras is the visual area width at the optimal viewing distance. Different distances from the center of the two cameras to the plane of the virtual image correspond to different visual area widths.
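The calibration condition above can be checked automatically. The following is a minimal sketch, not taken from the patent, of how one might verify that each camera sees a single-viewpoint image; it assumes the captured frames are available as 8-bit grayscale NumPy arrays, and the threshold values are illustrative.

```python
import numpy as np

def is_full_black(frame: np.ndarray, threshold: float = 10.0) -> bool:
    """Heuristic: the mean gray level of an all-black full-screen view stays near 0."""
    return float(frame.mean()) < threshold

def is_full_white(frame: np.ndarray, threshold: float = 245.0) -> bool:
    """Heuristic: the mean gray level of an all-white full-screen view stays near 255."""
    return float(frame.mean()) > threshold

def single_viewpoint_calibrated(left_frame: np.ndarray, right_frame: np.ndarray) -> bool:
    """True when one camera sees the all-black view and the other sees the all-white view."""
    return ((is_full_black(left_frame) and is_full_white(right_frame)) or
            (is_full_white(left_frame) and is_full_black(right_frame)))
```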
In an optional embodiment of the present invention, the two cameras are respectively disposed at positions where both eyes of the viewer are located. Correspondingly, in order to realize effective simulation of the cameras, the two cameras can be respectively arranged at the positions of the eyes of the viewer. Of course, the positions of the two eyes of the viewer can also be set according to actual requirements, such as 1 meter or 1.2 meters away from the windshield, and the embodiment of the present invention does not limit this.
After the layout parameters of the display content have been adjusted, and the distance from the two cameras to the plane of the virtual image has thus been determined, the display content of the screen in the 3D-AR-Hud system can be adjusted by the set step length. Specifically, the central view offset of the content displayed on the screen is sequentially adjusted by the set step length until the central view offset equals one layout period width. For example, in a layout mode with a 28-view arrangement period, the center view offset is switched 56 times, from 0 to 27.5 in steps of 0.5.
And S120, for each adjustment, shooting virtual images of the display content through the left camera and the right camera, and marking the images shot by the two cameras respectively by adopting the same serial numbers.
Wherein the number of the picture is the same as the adjusted order number.
In the embodiment of the invention, during each adjustment the left camera and the right camera shoot virtual images of the display content in turn. That is, each time the central view offset is switched, the left and right cameras each take one image. When the 3D-AR-Hud system uses a 28-view layout mode and switches the center view offset in steps of 0.5, the left and right cameras each acquire 56 images, numbered from 0 to 55. In other words, the numbers of the images taken by the left and right cameras remain the same as the order number of the adjustment of the display content on the screen.
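As a sketch of the acquisition loop described above (56 offsets of 0.5 over a 28-view layout period), the snippet below is illustrative only: set_center_view_offset, grab_left and grab_right are placeholder callables standing in for whatever interleaving-software and camera APIs are actually used.

```python
def sweep_and_capture(set_center_view_offset, grab_left, grab_right,
                      pitch: float = 28.0, step: float = 0.5):
    """Step the center view offset through one layout period and number each image pair.

    Returns two lists; the list index of each image equals the order number of the adjustment.
    """
    num_steps = int(round(pitch / step))          # 56 in the example above
    left_images, right_images = [], []
    for k in range(num_steps):                    # image numbers 0..55
        set_center_view_offset(k * step)          # offsets 0, 0.5, ..., 27.5
        left_images.append(grab_left())           # image number k from the left camera
        right_images.append(grab_right())         # image number k from the right camera
    return left_images, right_images
```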
When the head-up display system is a vehicle-mounted head-up display system, shooting the virtual images of the display content through the left and right cameras may include: shooting, through the two cameras, the virtual image formed by light reflected from the vehicle windshield, and extracting, from the captured images, the virtual image formed by the display-content light reflected by the windshield.
The windshield reflects not only the light of the display content but also light from other objects. When the reflected light enters the cameras, the captured images contain the virtual image of the display content and may also contain light from other objects, so the virtual image formed by the display-content light reflected by the windshield needs to be extracted from the captured images in order to obtain the virtual image accurately. A black-and-white difference method can be used for the extraction.
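The black-and-white difference mentioned above could, for example, be implemented by subtracting a reference frame captured with the screen showing black from a frame captured with the content displayed, so that static reflections from other objects cancel out. This is only an assumed realization of the idea; the patent does not spell out the exact procedure.

```python
import numpy as np

def extract_virtual_image(frame_with_content: np.ndarray,
                          frame_blanked: np.ndarray) -> np.ndarray:
    """Isolate the display-content virtual image by differencing two grayscale frames.

    frame_with_content: windshield reflection captured with the test pattern displayed.
    frame_blanked:      the same scene captured with the screen showing black, so that
                        only ambient reflections remain.
    """
    diff = frame_with_content.astype(np.int16) - frame_blanked.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)  # keep only the light added by the display
```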
S130, selecting an image from the images shot by the left camera, and selecting an image with the same brightness as the image selected from the images shot by the left camera from the images shot by the right camera.
Here, the images shot by the left camera are the images taken by the left camera while the central view offset moves from 0 to one layout period width. Correspondingly, the images shot by the right camera are the images taken by the right camera while the central view offset moves from 0 to one layout period width.
In an optional embodiment of the present invention, selecting an image from the images shot by the left camera, and selecting from the images shot by the right camera an image with the same brightness as the image selected from the images shot by the left camera, may include: selecting the image with the maximum brightness from the images shot by the left camera and from the images shot by the right camera, respectively. Selecting the image with the maximum brightness ensures the accuracy of the selected images and improves the accuracy of the measured data. Optionally, the brightness of each image may be calculated by a brightness accumulation method to determine the image with the maximum brightness; specifically, the brightness values of all pixels in the image may be summed to obtain the brightness of the image.
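A minimal sketch of the brightness-accumulation selection described above; it assumes the numbered images are grayscale NumPy arrays stored in the order in which they were captured.

```python
import numpy as np

def image_brightness(image: np.ndarray) -> float:
    """Brightness accumulation: sum the brightness values of every pixel in the image."""
    return float(np.asarray(image, dtype=np.float64).sum())

def index_of_brightest(images) -> int:
    """Return the number (list index) of the image with the maximum accumulated brightness."""
    return int(np.argmax([image_brightness(img) for img in images]))

# Usage: the numbers of the two selected images feed the cone-width calculation in S140.
# l_index = index_of_brightest(left_images)
# r_index = index_of_brightest(right_images)
```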
S140, determining the width of a visual area at the centers of the two cameras according to the numbers of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras.
Correspondingly, once the image selection is determined, the corresponding numbers are also determined, and the width of the visual area at the center of the two cameras can finally be determined according to the numbers of the two selected images, the layout period width, the step length, the distance between the two cameras and other parameters, thereby obtaining the measured value of the visual area width corresponding to the current camera position.
In an optional embodiment of the present invention, the determining, according to the number of the two selected images, the period width of the layout, the step length, and the distance between the two cameras, the width of the view zone at the centers of the two cameras may include:
determining the width of the visual area at the centers of the two cameras according to the following formula:
cone = (pitch × d1) / (d2 × |R_index − L_index|)
where cone is the width of the visual area at the center of the two cameras; R_index is the number of the image selected from the images shot by the right camera; L_index is the number of the image selected from the images shot by the left camera; d1 is the distance between the two cameras; pitch is the layout period width; and d2 is the step length. Optionally, the ratio of the layout period width to the step length is 56, and the distance between the two cameras is 65 mm.
In the embodiment of the invention, the cameras may be industrial cameras, and the distance between the two cameras may be set to 65 mm to simulate the interpupillary distance of human eyes. Meanwhile, the ratio of the layout period width to the step length may be set to 56.
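Putting the parameters together, the sketch below computes the viewing zone width from the two selected image numbers. It assumes the relation cone = (pitch × d1) / (d2 × |R_index − L_index|), which is one consistent reading of the variable definitions given above; treat it as an illustrative sketch rather than the authoritative formula.

```python
def viewing_zone_width(l_index: int, r_index: int,
                       pitch: float = 28.0, step: float = 0.5,
                       camera_distance_mm: float = 65.0) -> float:
    """Viewing zone (cone) width at the center of the two cameras, in millimetres.

    l_index, r_index: numbers of the brightest images from the left and right cameras.
    pitch / step:     layout period width over step length (56 in the embodiment).
    """
    delta = abs(r_index - l_index)
    if delta == 0:
        raise ValueError("Selected image numbers must differ to compute the cone width.")
    return camera_distance_mm * (pitch / step) / delta

# With the embodiment's parameters, an index difference of 8 steps would give
# 65 mm * 56 / 8 = 455 mm under the assumed relation.
```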
According to the embodiment of the invention, the central view offset of the content displayed on the screen is sequentially adjusted by the set step length until the central view offset equals one layout period width, and during the adjustment the virtual images of the display content are shot by the two cameras and marked respectively. Two images with the same brightness are then selected from the images shot by the left and right cameras, and the width of the visual area at the center of the two cameras is determined according to the numbers of the two selected images, the layout period width, the step length and the distance between the two cameras. This solves the problem of the low accuracy and reliability of the visual area width measurement data of existing 3D-AR-Hud systems, realizes accurate measurement of the visual area width, and improves the accuracy and reliability of data measurement.
Example two
Fig. 2 is a flowchart of a method for measuring an in-out screen distance based on a head-up display system according to a second embodiment of the present invention. The method is applicable to accurately measuring the in-out screen distance of a Hud system and can be executed by an in-out screen distance measuring system based on a head-up display system, where the system may include a head-up display system, left and right cameras, a main control device and a measurement contrast plate. Accordingly, as shown in fig. 2, the method includes the following operations:
s210, shooting virtual images of display contents on a screen through the left camera and the right camera, and shooting images of the measurement contrast plate.
The image content on the measurement contrast plate is the same as that of a source image, and the source image is displayed in a 3D mode through the screen.
The measurement contrast plate is used to assist in measuring the in-out screen distance of the 3D-AR-Hud system, and may be a plate of any shape, material, size and color, as long as the image can be clearly displayed; the embodiment of the present invention does not limit this. The source image may be the source image of the display content to be presented by the 3D-AR-Hud system. In the embodiment of the invention, when the in-out screen distance of the 3D-AR-Hud system is measured, the virtual image of the content displayed on the screen can be captured by the left and right cameras, and the left and right cameras also need to capture the image of the measurement contrast plate at the same time.
In an alternative embodiment of the present invention, the two cameras are respectively disposed at positions where both eyes of the viewer are located, and the distance between the two cameras is 65 mm.
Similarly, two cameras may be disposed at positions where both eyes of the viewer are located, respectively, and a distance between the two cameras may be set to 65mm to simulate human eyes.
In an alternative embodiment of the invention, the measurement contrast plate is a white contrast plate with black dots; the source image is a white disk image with a black central point.
In the embodiment of the invention, in order to simplify the measurement and improve the efficiency of measuring the in-out screen distance of the 3D-AR-Hud system, a white contrast plate with black dots can be used as the measurement contrast plate. Correspondingly, the source image to be displayed by the 3D-AR-Hud system can be set to a white disk image with a black central point, which makes image comparison convenient.
S220, when the image center of the display content virtual image shot by the left camera and the image center of the measurement contrast plate shot by the left camera coincide and the image center of the display content virtual image shot by the right camera and the image center of the measurement contrast plate shot by the right camera coincide, the distance between the measurement contrast plate and a zero plane is used as a screen-out distance or a screen-in distance.
And the zero plane is a plane where a virtual image of the display content is located when the screen displays the source image in 2D.
Wherein the in-out screen distance may be a distance between a virtual image plane displayed in the 3D-AR-Hud system and a zero plane.
In the embodiment of the invention, when the in-out screen distance is measured, the white measurement contrast plate printed with black dots may be moved in the area near the virtual image. During the movement of the measurement contrast plate, when the image center of the virtual image of the display content shot by the left camera coincides with the image center of the measurement contrast plate shot by the left camera (that is, the black dot in the virtual image shot by the left camera coincides with the black dot in the image of the measurement contrast plate shot by the left camera), and the image center of the virtual image of the display content shot by the right camera coincides with the image center of the measurement contrast plate shot by the right camera (that is, the black dot in the virtual image shot by the right camera coincides with the black dot in the image of the measurement contrast plate shot by the right camera), the distance between the measurement contrast plate and a zero plane can be measured at that moment and used as the screen-out distance or the screen-in distance.
In an optional embodiment of the invention, the method may further comprise: and when the image center of the display content virtual image shot by the left camera and the image center of the measurement contrast plate shot by the left camera do not coincide, and/or the image center of the display content virtual image shot by the right camera and the image center of the measurement contrast plate shot by the right camera do not coincide, the measurement contrast plate is moved.
Correspondingly, if during the measurement of the in-out screen distance the image center of the virtual image of the display content shot by the left camera or the right camera does not coincide with the image center of the measurement contrast plate shot by that camera, the measurement contrast plate can be moved back and forth and left and right until the image centers of the virtual images shot by both cameras coincide with the image centers of the measurement contrast plate shot by the respective cameras.
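As a sketch of how the coincidence check might be automated, the snippet below locates the dark dot in each grayscale image via thresholding and image moments and compares the resulting centers; OpenCV is assumed to be available, and the threshold and tolerance values are illustrative.

```python
import cv2
import numpy as np

def dark_dot_center(gray: np.ndarray):
    """Centroid (x, y) of the dark dot in a grayscale image, via inverse threshold + moments."""
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # no dark dot found
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def centers_coincide(virtual_image: np.ndarray, board_image: np.ndarray,
                     tolerance_px: float = 3.0) -> bool:
    """True when the dark dot of the virtual image and that of the contrast plate coincide."""
    c1, c2 = dark_dot_center(virtual_image), dark_dot_center(board_image)
    if c1 is None or c2 is None:
        return False
    return float(np.hypot(c1[0] - c2[0], c1[1] - c2[1])) <= tolerance_px
```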
In the embodiment of the invention, the images of the measurement contrast plate are shot by two cameras; when the image centers of the virtual images of the display contents shot by the left camera and the right camera and the image centers of the measurement contrast plates shot by the left camera and the right camera are respectively superposed, the distance between the measurement contrast plates and a zero plane is used as a screen outlet distance or a screen inlet distance, the problems of low accuracy and reliability of measurement data of the screen inlet and outlet distance of the existing 3D-AR-Hud system are solved, the screen inlet and outlet distance is accurately measured, and the accuracy and reliability of data measurement are improved.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a system for measuring a width of a viewing area based on a head-up display system according to a third embodiment of the present invention, where the system is capable of executing a method for measuring a width of a viewing area based on a head-up display system according to any embodiment of the present invention, and has functional modules and beneficial effects corresponding to the execution method. As shown in fig. 3, the system may include:
a head-up display system 310, a left camera 320, a right camera 320 and a main control device 330;
the head-up display system 310 is configured to display content through a screen, and sequentially adjust a center view offset of the display content on the screen by a set step length until the center view offset is a drawing arrangement period width; wherein two viewpoint images of different brightness are displayed on the screen;
two cameras 320 for shooting a virtual image of the display content for each adjustment;
the main control device 330 is connected to the two cameras 320, respectively, and is configured to:
for each adjustment, the images shot by the two cameras 320 are respectively marked by the same number; wherein the number of the image is the same as the adjusted order number;
selecting an image from the images shot by the left camera 320, and selecting an image with the same brightness as the image selected from the images shot by the left camera 320 from the images shot by the right camera 320;
and determining the width of a visual area at the center of the two cameras 320 according to the number of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras 320.
As shown in fig. 3, the 3D-AR-Hud system may include a high-brightness LED (Light-Emitting Diode) backlight 301, a screen, a 3D lenticular lens 302 and an aspheric reflector. The high-brightness LED backlight 301 and the 3D lenticular grating 302 may take the form of a Thin Film Transistor (TFT) panel plus a lenticular lens film, and the aspheric reflector may take the form of a concave reflector 303 and a flat reflector 304. These devices may be assembled and protected by a protective case 305, such as a glass cover. The virtual image is formed in front of the driver, superimposed on the lane ahead. The distance of the virtual image is determined by the design parameters of the reflector set, and the zero-plane distance of the virtual image directly determines how well the displayed information fits the road. Only when the image distance L is accurately measured can the size, perspective and in-out screen effect of the display content be adjusted so that the image fits the lane accurately.
According to the embodiment of the invention, the central view offset of the displayed content on the screen is sequentially adjusted by the set step length progress until the central view offset is equal to the width of a layout period, and the virtual images of the displayed content are shot by the two cameras and are respectively marked in the adjustment process. Then selecting two images with the same image brightness from the images shot by the left camera and the right camera; according to the number, the arrangement period width, the stepping length and the distance between the two cameras of the two selected images, the width of the visual area at the center of the two cameras is determined, the problem that the accuracy and the reliability of the measurement data of the visual area width of the existing 3D-AR-Hud system are low is solved, the accurate measurement of the width of the visual area is realized, and the accuracy and the reliability of data measurement are improved.
Optionally, the main control device 330 is further configured to determine the width of the viewing zone at the center of the two cameras 320 according to the following formula:
cone = (pitch × d1) / (d2 × |R_index − L_index|)
where cone is the width of the viewing zone at the center of the two cameras 320; R_index is the number of the image selected from the images shot by the right camera 320; L_index is the number of the image selected from the images shot by the left camera 320; d1 is the distance between the two cameras 320; pitch is the layout period width; and d2 is the step length.
Optionally, the main control device 330 is further configured to select an image with the largest brightness from the image captured by the left camera 320 and the image captured by the right camera 320.
Optionally, the head-up display system 310 is further configured to adjust layout parameters of the display content on the screen, so that a virtual image of the display content, which is captured by the left camera 320, is a single-viewpoint image, and/or a virtual image of the display content, which is captured by the right camera 320, is a single-viewpoint image.
Optionally, the head-up display system is a vehicle-mounted head-up display system, and the two cameras 320 are further used to shoot the virtual image formed by light reflected from the vehicle windshield, and to extract, from the captured images, the virtual image formed by the display-content light reflected by the windshield.
Optionally, the two cameras 320 are respectively disposed at positions of two eyes of the viewer.
Optionally, a two-viewpoint image in a black-and-white format is displayed on the screen.
Optionally, the ratio of the layout period width to the step length is 56, and the distance between the two cameras 320 is 65 mm.
The system for measuring the width of the visual area based on the head-up display system can execute the method for measuring the width of the visual area based on the head-up display system provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method. For details of the viewing area width measurement method based on the head-up display system according to any embodiment of the present invention, reference may be made to the following description.
Example four
Fig. 4 is a schematic structural diagram of an in-out screen distance measuring system based on a head-up display system according to a fourth embodiment of the present invention. The system can execute the in-out screen distance measuring method based on a head-up display system provided by any embodiment of the present invention, and has functional modules corresponding to the executed method and its beneficial effects. As shown in fig. 4, the system may include:
the head-up display system 410, the left camera 420, the right camera 420, the main control device 430 and the measurement comparison board 440;
the head-up display system 410 is used for performing 3D display on a source image through a screen;
the measurement contrast plate 440 is provided with an image; wherein the image content on the measurement contrast plate 440 is the same as the content of the source image;
two cameras 420 for taking virtual images of the contents displayed on the screen and taking images of the measurement contrast plate 440;
the main control device 430 is configured to, when it is determined that the image center of the virtual display content image captured by the left camera 420 coincides with the image center of the measurement contrast plate 440 captured by the left camera 420 and the image center of the virtual display content image captured by the right camera 420 coincides with the image center of the measurement contrast plate 440 captured by the right camera 420, use the distance between the measurement contrast plate 440 and a zero plane as a screen exit distance or a screen entry distance;
and the zero plane is a plane where a virtual image of the display content is located when the screen displays the source image in 2D.
The backlight 501, the display screen, the 3D lenticular lens 402, the concave mirror 403, the plane mirror 404, and the protective shell 405 in the head-up display system 410 may refer to the corresponding descriptions of the third embodiment. The two cameras 420 may be connected to the main control device 430 by using a USB (Universal Serial Bus).
Specifically, in the process of measuring the in-out screen distance, the measurement contrast plate 440 may be moved near the virtual image display area, when the image center of the virtual image of the display content captured by the left camera 420 coincides with the image center of the measurement contrast plate 440 captured by the left camera 420, and the image center of the virtual image of the display content captured by the right camera 420 coincides with the image center of the measurement contrast plate 440 captured by the right camera 420, the movement of the measurement contrast plate 440 is stopped, and the distance between the measurement contrast plate 440 and the zero plane at this time is used as the out-screen distance or the in-screen distance.
In the embodiment of the invention, the images of the measurement contrast plate are shot by two cameras; when the image centers of the virtual images of the display contents shot by the left camera and the right camera and the image centers of the measurement contrast plates shot by the left camera and the right camera are respectively superposed, the distance between the measurement contrast plates and a zero plane is used as a screen outlet distance or a screen inlet distance, the problems of low accuracy and reliability of measurement data of the screen inlet and outlet distance of the existing 3D-AR-Hud system are solved, the screen inlet and outlet distance is accurately measured, and the accuracy and reliability of data measurement are improved.
Alternatively, the measurement contrast plate 440 is a white contrast plate with black dots; the source image is a white disk image with a black central point.
Optionally, the main control device 430 is further configured to move the measurement contrast plate 440 when it is determined that the center of the image of the virtual display content image captured by the left camera 420 does not coincide with the center of the image of the measurement contrast plate captured by the left camera 420, and/or the center of the image of the virtual display content image captured by the right camera 420 does not coincide with the center of the image of the measurement contrast plate 440 captured by the right camera 420.
Optionally, the two cameras 420 are respectively disposed at positions of two eyes of the viewer, and a distance between the two cameras 420 is 65 mm.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (8)

1. A method for measuring the width of a visual area based on a head-up display system is characterized by comprising the following steps:
sequentially adjusting the central view offset of the displayed content on the screen by a set step length until the central view offset is the width of a layout period; wherein, the screen displays left and right viewpoint images with different brightness;
for each adjustment, shooting virtual images of the display content through a left camera and a right camera, and marking the images shot by the two cameras respectively by adopting the same number; wherein the number of the image is the same as the adjusted order number;
selecting an image from the images shot by the left camera, and selecting an image with the same brightness as the image selected from the images shot by the left camera from the images shot by the right camera;
determining the width of a visual area at the centers of the two cameras according to the number of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras;
determining the width of the visual area at the centers of the two cameras according to the following formula:
cone = (pitch × d1) / (d2 × |R_index − L_index|)
wherein cone is the width of the visual area at the center of the two cameras; R_index is the number of the image selected from the images shot by the right camera; L_index is the number of the image selected from the images shot by the left camera; d1 is the distance between the two cameras; pitch is the layout period width; and d2 is the step length.
2. The method of claim 1, wherein selecting an image from the images captured by the left camera and an image from the images captured by the right camera with the same brightness as the image selected from the images captured by the left camera comprises:
and respectively selecting the image with the maximum brightness from the image shot by the left camera and the image shot by the right camera.
3. The method of claim 1, further comprising, before sequentially adjusting the center view offset of the content displayed on the screen by the set step length:
and adjusting the layout parameters of the display content on the screen so that the virtual image of the display content shot by the left camera is a single-viewpoint image, and/or the virtual image of the display content shot by the right camera is a single-viewpoint image.
4. The method of claim 1, wherein the head-up display system is an on-vehicle head-up display system, and the capturing virtual images of the display content by the left and right cameras comprises:
through two cameras shoot the virtual image that light formed through the reflection of locomotive windshield to follow the image of shooing and extract the light of display content warp the virtual image that windshield reflection formed.
5. The method of claim 1, wherein the two cameras are respectively disposed at positions of eyes of a viewer.
6. The method of claim 1, wherein the two-viewpoint image in black and white format is displayed on the screen.
7. The method of any of claims 1-6, wherein the ratio of the pitch period width to the step length is 56 and the distance between the two cameras is 65 mm.
8. A viewing zone width measurement system based on a head-up display system, comprising:
the device comprises a head-up display system, a left camera, a right camera and a main control device;
the head-up display system is used for displaying display contents through a screen and sequentially adjusting the central view offset of the display contents on the screen by a set step length until the central view offset is one arrangement period width; wherein two viewpoint images of different brightness are displayed on the screen;
the two cameras are used for shooting virtual images of the display content for each adjustment;
the master control equipment is respectively connected with the two cameras and used for:
for each adjustment, marking the images shot by the two cameras respectively by adopting the same serial number; wherein the number of the image is the same as the adjusted order number;
selecting an image from the images shot by the left camera, and selecting an image with the same brightness as the image selected from the images shot by the left camera from the images shot by the right camera;
determining the width of a visual area at the centers of the two cameras according to the number of the two selected images, the arrangement period width, the stepping length and the distance between the two cameras;
determining the width of the visual area at the centers of the two cameras according to the following formula:
cone = (pitch × d1) / (d2 × |R_index − L_index|)
wherein cone is the width of the visual area at the center of the two cameras; R_index is the number of the image selected from the images shot by the right camera; L_index is the number of the image selected from the images shot by the left camera; d1 is the distance between the two cameras; pitch is the layout period width; and d2 is the step length.
CN201810864231.5A 2018-08-01 2018-08-01 Viewing zone width measuring method and system based on Hud system Active CN108896281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810864231.5A CN108896281B (en) 2018-08-01 2018-08-01 Viewing zone width measuring method and system based on Hud system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810864231.5A CN108896281B (en) 2018-08-01 2018-08-01 Viewing zone width measuring method and system based on Hud system

Publications (2)

Publication Number Publication Date
CN108896281A CN108896281A (en) 2018-11-27
CN108896281B true CN108896281B (en) 2020-06-12

Family

ID=64352834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810864231.5A Active CN108896281B (en) 2018-08-01 2018-08-01 Viewing zone width measuring method and system based on Hud system

Country Status (1)

Country Link
CN (1) CN108896281B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114061911B (en) * 2021-11-16 2023-09-15 深圳市锐思华创技术有限公司 Dazzle light detection system of head-up display

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101865763A (en) * 2010-07-14 2010-10-20 西安工业大学 Parallax measurement device of airborne head-up display
WO2018061745A1 (en) * 2016-09-30 2018-04-05 日本精機株式会社 Head-up display device
CN107941464A (en) * 2017-11-09 2018-04-20 中国航空工业集团公司洛阳电光设备研究所 A kind of head-up display visual field testboard
CN108174182A (en) * 2017-12-30 2018-06-15 上海易维视科技股份有限公司 Three-dimensional tracking mode bore hole stereoscopic display vision area method of adjustment and display system
CN108181094A (en) * 2017-12-25 2018-06-19 中国航空工业集团公司洛阳电光设备研究所 A kind of head-up display Character Intensity and line width measuring method
CN108225734A (en) * 2018-01-05 2018-06-29 宁波均胜科技有限公司 A kind of error calibration system and its error calibrating method based on HUD systems
CN108344562A (en) * 2018-03-12 2018-07-31 福耀集团(上海)汽车玻璃有限公司 A kind of HUD windshields detection device and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5359958B2 (en) * 2010-03-29 2013-12-04 株式会社デンソー Method for manufacturing head-up display device
DE102015226425A1 (en) * 2015-12-22 2017-06-22 Robert Bosch Gmbh Backlight device for backlighting a display unit for a head-up display, head-up display and method for backlighting a display unit for a head-up display

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101865763A (en) * 2010-07-14 2010-10-20 西安工业大学 Parallax measurement device of airborne head-up display
WO2018061745A1 (en) * 2016-09-30 2018-04-05 日本精機株式会社 Head-up display device
CN107941464A (en) * 2017-11-09 2018-04-20 中国航空工业集团公司洛阳电光设备研究所 A kind of head-up display visual field testboard
CN108181094A (en) * 2017-12-25 2018-06-19 中国航空工业集团公司洛阳电光设备研究所 A kind of head-up display Character Intensity and line width measuring method
CN108174182A (en) * 2017-12-30 2018-06-15 上海易维视科技股份有限公司 Three-dimensional tracking mode bore hole stereoscopic display vision area method of adjustment and display system
CN108225734A (en) * 2018-01-05 2018-06-29 宁波均胜科技有限公司 A kind of error calibration system and its error calibrating method based on HUD systems
CN108344562A (en) * 2018-03-12 2018-07-31 福耀集团(上海)汽车玻璃有限公司 A kind of HUD windshields detection device and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Survey and Classification of Head-Up Display Presentation Principles; Marcus Tonnis et al.; Proceedings of the 17th World Congress on Ergonomics; 2009-08-09; full text *

Also Published As

Publication number Publication date
CN108896281A (en) 2018-11-27


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200408

Address after: 215634 north side of Chengang road and west side of Ganghua Road, Jiangsu environmental protection new material industrial park, Zhangjiagang City, Suzhou City, Jiangsu Province

Applicant after: ZHANGJIAGANG KANGDE XIN OPTRONICS MATERIAL Co.,Ltd.

Address before: 201203, room 5, building 690, No. 202 blue wave road, Zhangjiang hi tech park, Shanghai, Pudong New Area

Applicant before: WZ TECHNOLOGY Inc.

GR01 Patent grant
GR01 Patent grant