CN108444448B - Test method and device - Google Patents


Info

Publication number
CN108444448B
Authority
CN
China
Prior art keywords
position information
image
observation point
shooting
range
Prior art date
Legal status
Active
Application number
CN201810311530.6A
Other languages
Chinese (zh)
Other versions
CN108444448A (en)
Inventor
魏伟
Current Assignee
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd
Priority to CN201810311530.6A
Publication of CN108444448A
Application granted
Publication of CN108444448B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying

Abstract

The invention provides a test method and a test device. The test method comprises the following steps: acquiring a projection image of a projection device; determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection device; testing, within preset ranges of the farthest observation point and the nearest observation point respectively, the position information from which the projected image can be viewed; and determining, according to the test results, the window range within which the projected image can be viewed. Because the eye box size is measured at the farthest and nearest test points that actually occur while driving, the result reflects real use: it can be used to determine whether a designed HUD is qualified, and a HUD can be designed around the measured eye box size, improving HUD design accuracy and, in turn, user experience.

Description

Test method and device
Technical Field
The invention relates to the field of vehicle-mounted systems, in particular to a testing method and a testing device.
Background
A head-up display (HUD) is expected to become a mainstream automotive accessory. It projects driving-assistance information (such as vehicle speed, navigation, fuel consumption and engine speed) onto the windshield by optical projection, so the driver can read this information without looking down at the instrument panel, improving driving safety.
The eye box is the window range within which the HUD image can be viewed. It is an important technical index of HUD technology, and its size reflects whether a HUD design is qualified. While a vehicle is being driven, the driver's seat can be adjusted up, down, forward and backward, and whether the complete HUD image remains visible from these different positions is a key concern. No scheme has yet been established for testing the eye box size of a HUD.
Therefore, how to measure a realistic eye box size, so as to determine whether a designed HUD is qualified, is a problem that remains to be solved.
Disclosure of Invention
The invention provides a testing method and a testing device to solve the problem that the prior art offers no way to measure a realistic eye box size and thereby determine whether a designed HUD is qualified.
In order to solve the above problem, the present invention discloses a test method comprising: acquiring a projection image of a projection device;
determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection device;
testing, within preset ranges of the farthest observation point and the nearest observation point respectively, the position information from which the projected image can be viewed;
and determining, according to the test results, the window range for viewing the projected image.
Preferably, the step of testing the position information from which the projected image can be viewed, within the preset ranges of the farthest observation point and the nearest observation point respectively, includes:
continuously shooting the projected image within a first preset range of the farthest observation point to obtain continuously shot first images, and recording the first position information at which each first image was shot;
and continuously shooting the projected image within a second preset range of the nearest observation point to obtain continuously shot second images, and recording the second position information at which each second image was shot.
Preferably, the step of determining a window range for viewing the projected image according to the test result includes:
drawing, in advance, critical lines in the projected image near its boundary lines;
searching the first images for a first target image whose boundary line overlaps a critical line, and searching the second images for a second target image whose boundary line overlaps a critical line;
determining, according to the first position information of the first target image, first edge position information for viewing the projected image within the first preset range; and determining, according to the second position information of the second target image, second edge position information for viewing the projected image within the second preset range;
and determining a shooting range for shooting the projected image according to the area enclosed by the first edge position information and the second edge position information, and taking that shooting range as the window range of the projected image.
Preferably, the step of testing the position information of the projected image viewed in the preset ranges of the farthest observation point and the nearest observation point respectively includes:
calculating the distance between the farthest observation point and the nearest observation point;
equally dividing the distance, and taking an equally divided point as an interval observation point;
continuously shooting the projection image within a third preset range of the interval observation point to obtain continuously shot third images, and recording third position information of all the third images;
searching a third target image with a boundary line overlapped with the critical line from the third image;
determining third edge position information for viewing the projected image within the third preset range according to third position information of the third target image;
the step of determining the window range for viewing the projected image according to the test result comprises:
determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
In order to solve the above problem, an embodiment of the present invention further discloses a testing apparatus, including:
the acquisition module is used for acquiring a projection image of the projection equipment;
the observation point determining module is used for determining the farthest observation point and the closest observation point of the projected image according to the position information of the projection equipment;
the testing module is used for testing the position information of the viewed projection image in the preset ranges of the farthest observation point and the nearest observation point respectively;
and the window range determining module is used for determining, according to the test results, the window range for viewing the projected image.
Preferably, the test module comprises:
the first shooting submodule is used for continuously shooting the projection image within a first preset range of the farthest observation point so as to obtain continuously shot first images and recording first position information for shooting each first image;
and the second shooting submodule is used for continuously shooting the projection image within a second preset range of the nearest observation point so as to obtain continuously shot second images and record second position information for shooting each second image.
Preferably, the window range determining module includes:
a critical line drawing submodule for drawing, in advance, critical lines in the projected image near its boundary lines;
the object image searching submodule is used for searching a first object image of which the boundary line is overlapped with the critical line from the first image and searching a second object image of which the boundary line is overlapped with the critical line from the second image;
an edge position information determining submodule, configured to determine, according to first position information of the first target image, first edge position information of the projected image viewed within the first preset range; determining second edge position information of the projected image viewed in the second preset range according to second position information of the second target image;
and the first window range determining submodule is used for determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information and the second edge position information, and taking the shooting range as a window range of the projection image.
Preferably, the test module comprises:
the distance calculation submodule is used for calculating the distance between the farthest observation point and the nearest observation point;
the interval test point determining submodule is used for equally dividing the distance and taking the equally divided points as interval observation points;
the third image shooting submodule is used for continuously shooting the projection image within a third preset range of the interval observation point so as to obtain continuously shot third images and recording third position information for shooting each third image;
the third target image searching submodule is used for searching a third target image of which the boundary line is overlapped with the critical line from the third image;
a third edge position information determining submodule, configured to determine, according to third position information of the third target image, third edge position information of viewing the projected image within the third preset range;
the window range determination module comprises:
and the second window range determining submodule is used for determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
Compared with the prior art, the invention has the following advantages:
the embodiment of the invention provides a test method and a test device, wherein a projection image of projection equipment is obtained; determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection equipment; testing the position information of the viewed projected image in the preset ranges of the farthest observation point and the nearest observation point respectively; and determining the window range for watching the projection image according to the test result. According to the embodiment of the invention, the most real eye box size is obtained according to the farthest test point and the nearest test point in the actual driving process, so that whether the designed HUD is qualified or not can be determined, the HUD can be designed according to the tested eye box size, the HUD design accuracy is improved, and the user experience is further improved.
Drawings
FIG. 1 is a flow chart illustrating the steps of a testing method provided by an embodiment of the present invention;
FIG. 1a is a schematic view of an eye-box provided by an embodiment of the invention;
FIG. 2 is a flow chart illustrating the steps of a testing method provided by an embodiment of the present invention;
FIG. 2a is a schematic diagram illustrating drawing of critical lines in a projected image according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a testing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Example one
Referring to fig. 1, a flowchart illustrating steps of a testing method provided in an embodiment of the present invention is shown, which may specifically include:
step 101: and acquiring a projection image of the projection equipment.
In the embodiment of the present invention, the projection device may be a HUD or the like, which is not limited in this respect.
The embodiments of the present invention are described taking a HUD projecting images as an example; this is not intended to limit the invention.
The test of this embodiment is carried out outside the vehicle. A space resembling the vehicle interior may first be arranged, and the projection device, such as a HUD, installed so that it projects onto a front windshield similar to that of the vehicle.
After the arrangement according to the environment of the vehicle-mounted device is completed, a projection image projected by a projection device (such as a HUD) can be acquired.
After the projection image of the projection apparatus is acquired, step 102 is performed.
Step 102: and determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection equipment.
In the embodiment of the invention, because the projection device is installed according to the spatial environment of the vehicle, once the position information of the projection device is determined, the seat position of the vehicle can be derived from it. The seat can move forward and backward, for example over a travel of 20 cm.
After the position information of the projection device is determined, the farthest observation point and the nearest observation point of the projected image can be determined from it; these are, respectively, the farthest point to which the driver's seat can move backward and the nearest point to which it can move forward.
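As a hypothetical illustration of this step (the coordinate convention, names and travel figures below are assumptions for illustration, not part of the patent), the two observation points can be derived from a reference eye position and the seat's travel range:

```python
# Hypothetical sketch: derive the farthest and nearest observation points along
# the driving axis from a reference eye position and the seat travel range.
# All names and numbers are illustrative assumptions, not from the patent.

def observation_points(eye_ref_cm: float, travel_back_cm: float,
                       travel_forward_cm: float):
    """Return (farthest, nearest) observation-point distances, in cm,
    measured from the projection device along the driving axis."""
    farthest = eye_ref_cm + travel_back_cm    # seat moved fully backward
    nearest = eye_ref_cm - travel_forward_cm  # seat moved fully forward
    return farthest, nearest

# e.g. a reference eye position 80 cm from the HUD and 10 cm of travel each way
far_pt, near_pt = observation_points(80.0, 10.0, 10.0)
print(far_pt, near_pt)  # 90.0 70.0
```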
After the farthest observation point and the nearest observation point of the projected image are determined, step 103 is entered.
Step 103: and testing the position information of the viewed projected image in the preset ranges of the farthest observation point and the nearest observation point respectively.
Generally, because the front windshield of a vehicle is curved, there is a specific range around each of the farthest and nearest observation points within which the user can see the complete projected image; beyond that range, the complete projected image cannot be seen.
In the embodiment of the present invention, before performing the test, the preset ranges for viewing the projected image at the farthest observation point and the closest observation point may be set according to preset rules, for example, the preset ranges may be set according to the work experience of the developer, and the like, which is not limited in this embodiment of the present invention.
The test can be performed by shooting continuously within the preset ranges of the farthest and nearest observation points. For example, a camera continuously shoots the projected image within the preset range of each of the two points, and the position of each shot is recorded. From the continuously shot images, the edge positions at which the complete projected image can still be captured can be determined, and from that edge position information the window range for viewing the projected image within the preset ranges of the farthest and nearest test points, that is, the eye box size, can be determined.
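The shoot-and-record loop just described can be sketched as follows. This is a simulation, not the patent's apparatus: the grid of camera positions and the visibility predicate are illustrative assumptions, and on a real rig the predicate would come from analysing the captured frames.

```python
# Illustrative sketch of the test loop in step 103: sweep a camera over a grid
# of positions around an observation point, record each position, and keep the
# positions from which the complete projected image is visible.

from typing import Callable, List, Tuple

Pos = Tuple[float, float]  # (horizontal, vertical) offset from the point, cm

def sweep(grid: List[Pos], sees_full_image: Callable[[Pos], bool]) -> List[Pos]:
    """Return the recorded positions at which the full image was visible."""
    return [p for p in grid if sees_full_image(p)]

# Toy predicate: the full image is visible within a 10 cm radius of the centre.
grid = [(x, y) for x in range(-15, 16, 5) for y in range(-15, 16, 5)]
visible = sweep(grid, lambda p: p[0] ** 2 + p[1] ** 2 <= 100)
print(len(visible))  # number of positions with a complete view
```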
Of course, in practical applications, the test may be performed in other manners, and the embodiment of the present invention is not limited thereto.
The method of testing will be described in detail in the following example two, which is not repeated herein.
After the position information of the viewed projection image is tested within the preset range of the farthest observation point and the nearest observation point, the process goes to step 104.
Step 104: and determining the window range for watching the projection image according to the test result.
In the embodiment of the present invention, after the test has been performed within the preset ranges of the farthest and nearest observation points, the edge position information for viewing the projected image at those two points can be determined from the test results, and the window range for viewing the projected image can then be determined from that edge position information.
For example, referring to fig. 1a, a schematic view of an eye box according to an embodiment of the present invention is shown, as shown in fig. 1a, a formed window range may be a rectangular parallelepiped structure, that is, an eye box structure, and a user can view a complete projection image from the rectangular parallelepiped structure range.
Of course, in practical applications, the formed eye box structure may also be other structures, such as an oval shape or an irregular shape, and the above examples are only examples for better understanding of the technical solution of the embodiments of the present invention, and are not meant to be the only limitation to the embodiments of the present invention.
The embodiment of the invention provides a test method in which a projection image of a projection device is acquired; the farthest and nearest observation points of the projected image are determined according to the position information of the projection device; the position information from which the projected image can be viewed is tested within preset ranges of the farthest and nearest observation points respectively; and the window range for viewing the projected image is determined according to the test results. Because the eye box size is measured at the farthest and nearest test points that actually occur while driving, the result reflects real use: it can be used to determine whether a designed HUD is qualified, and a HUD can be designed around the measured eye box size, improving HUD design accuracy and, in turn, user experience.
Example two
Referring to fig. 2, a flowchart illustrating steps of a testing method provided in an embodiment of the present invention is shown, which may specifically include:
step 201: and acquiring a projection image of the projection equipment.
Step 202: and determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection equipment.
In the embodiment of the present invention, the implementation manner of the step 201 to the step 202 is similar to the implementation manner of the step 101 to the step 102 in the first embodiment, and the embodiment of the present invention is not described herein again.
After the farthest observation point and the nearest observation point for viewing the projected image are determined, step 203 is entered.
Step 203: and continuously shooting the projected image within a first preset range of the farthest observation point to obtain continuously shot first images, and recording first position information of each first image.
In the embodiment of the present invention, the first preset range is a range extending outward by a set distance from the farthest observation point, taken as its centre; the set distance may for example be 30 cm or 15 cm, and the embodiment of the invention does not limit its value. Moreover, because the front windshield of a vehicle is curved, the distances extended in the various directions around the farthest observation point need not be equal, although they may be; the embodiment of the invention is not limited in this respect.
The projection image is continuously shot in a first preset range of the farthest observation point by adopting shooting equipment, so that a plurality of images shot in the first preset range can be obtained and recorded as first images, and shooting position information of each shot first image is recorded and recorded as first position information while shooting each first image.
After the first images continuously taken are acquired and the first position information of the first images taken is recorded, step 204 is entered.
Step 204: and continuously shooting the projected image within a second preset range of the nearest observation point to acquire continuously shot second images, and recording second position information of shooting each second image.
In the embodiment of the present invention, the second preset range is a range extending outward by a set distance from the nearest observation point, taken as its centre; the set distance may for example be 25 cm or 12 cm, and the embodiment of the invention does not limit its value. Likewise, because the front windshield is curved, the distances extended in the various directions around the nearest observation point need not be equal, although they may be; the embodiment of the invention is not limited in this respect.
The projected image is shot continuously by a camera within the second preset range of the nearest observation point, so that a plurality of images shot within the second preset range can be obtained and recorded as second images, and as each second image is shot, its shooting position is recorded as second position information.
After the second images continuously taken are acquired and the second position information of taking each second image is recorded, step 205 is entered.
Step 205: drawing a critical line in the projection image at a boundary line close to the projection image in advance.
In an embodiment of the present invention, critical lines are drawn in advance in the projected image, near its boundary lines. For example, fig. 2a shows a schematic diagram of drawing critical lines in a projected image according to an embodiment of the invention. As shown in fig. 2a, one or more critical lines may be drawn in advance near each boundary line of the projected image, and the number of critical lines drawn for each boundary line should be the same: in fig. 2a, four critical lines are drawn near the left boundary line of the projected image, so four should also be drawn near the right, upper and lower boundary lines. The several critical lines drawn near each boundary line may be given different colours, to distinguish the edge positions captured when the projected image is shot, or the same colour; the embodiment of the invention is not limited in this respect.
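The placement rule above, equal numbers of critical lines near each boundary, can be sketched as follows. The function name and the margin/spacing parameters are illustrative assumptions, not terms from the patent:

```python
# Minimal sketch of step 205: place n_lines critical lines near a boundary of
# the projected image, as offsets measured inward from that boundary (px).
# Using the same offsets for all four edges keeps the per-edge line count
# equal, as the description requires.

def critical_line_offsets(n_lines: int, margin_px: int, spacing_px: int):
    """Inward pixel offsets of n_lines critical lines near one boundary."""
    return [margin_px + i * spacing_px for i in range(n_lines)]

offsets = critical_line_offsets(4, 5, 3)
print(offsets)  # [5, 8, 11, 14]
```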
Step 206: and searching a first target image of which the boundary line is overlapped with the critical line from the first image, and searching a second target image of which the boundary line is overlapped with the critical line from the second image.
In the embodiment of the invention, a first target image is a first image in which the complete projected image is captured within the first preset range of the farthest observation point, and a second target image is a second image in which the complete projected image is captured within the second preset range of the nearest observation point; there are at least four first target images and at least four second target images.
After the projected image has been shot continuously within the first preset range, the first target images whose boundary lines overlap the critical lines can be searched for among the continuously shot first images. As shown in the left half of fig. 2a, the right boundary line of the shot first image does not overlap the critical line corresponding to the left boundary line of the projected image, so the complete projected image cannot be seen from the position at which that first image was shot. In the right half of fig. 2a, the boundary lines of the shot first image overlap the critical lines in the projected image, so that first image is taken as a first target image.
Further, the second target image in which the second image is captured may be determined in the above manner.
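The overlap test of step 206 can be sketched in one dimension as below. Coordinates are in projected-image pixels; the function name, tolerance and values are illustrative assumptions:

```python
# Hedged 1-D sketch of step 206: a captured frame is a "target image" when one
# of its boundary lines coincides (within tolerance) with a critical line
# drawn in the projected image.

def is_target_frame(frame_left: float, frame_right: float,
                    critical_xs, tol: float = 0.5) -> bool:
    """True if either frame boundary overlaps a critical line."""
    return any(abs(frame_left - x) <= tol or abs(frame_right - x) <= tol
               for x in critical_xs)

critical_xs = [5, 8, 11, 14]                      # lines near the left edge
print(is_target_frame(8.2, 120.0, critical_xs))   # True: left edge on a line
print(is_target_frame(20.0, 120.0, critical_xs))  # False: no overlap
```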
After the first target image and the second target image are found, step 207 is entered.
Step 207: determining first edge position information of the projected image viewed in the first preset range according to the first position information of the first target image; and determining second edge position information of the projected image viewed in the second preset range according to the second position information of the second target image.
In the embodiment of the present invention, the first edge position information is first position information at which the projected image can be viewed exactly within a first preset range, and the second edge position information is second position information at which the projected image can be viewed exactly within a second preset range.
Because the position information of each first image and each second image has been recorded in advance, once the first and second target images are found among the first and second images, their first and second position information can be determined from the recorded positions; and from the first position information of the first target images and the second position information of the second target images, the first edge position information for viewing the projected image within the first preset range and the second edge position information for viewing it within the second preset range can be determined.
In a preferred embodiment of the present invention, other observation points may be determined between the farthest observation point and the nearest observation point for testing, and specifically, the following steps may be included:
step S1: calculating the distance between the farthest observation point and the nearest observation point;
step S2: equally dividing the distance, and taking an equally divided point as an interval observation point;
step S3: continuously shooting the projection image within a third preset range of the interval observation point to obtain continuously shot third images, and recording third position information of all the third images;
step S4: searching a third target image with a boundary line overlapped with the critical line from the third image;
step S5: and determining third edge position information for viewing the projected image within the third preset range according to third position information of the third target image.
In the embodiment of the invention, when further observation points are to be determined between the farthest and nearest observation points for testing, the distance between the farthest and nearest observation points is first calculated, that distance is divided into equal parts, and the division points are taken as interval observation points. For example, if the distance between the farthest and nearest observation points is 24 cm and one interval observation point is needed, the 24 cm is halved, and the position 12 cm from each of the two points is used as the interval observation point. Likewise, if three observation points are to be tested between the farthest and nearest observation points, the distance is divided into four equal parts and the three interior division points are used as interval observation points.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present invention, and are not to be taken as the only limitation of the embodiments of the present invention.
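The equal-division rule above can be sketched as: to obtain k interval observation points, divide the far-to-near distance into k + 1 equal parts and take the k interior division points. Names and units are illustrative:

```python
# Sketch of steps S1-S2: compute k equally spaced interval observation points
# strictly between the farthest and nearest observation points (positions in cm
# along the driving axis).

def interval_points(farthest: float, nearest: float, k: int):
    """k equally spaced observation points between the two extremes."""
    step = (farthest - nearest) / (k + 1)
    return [nearest + step * i for i in range(1, k + 1)]

print(interval_points(24.0, 0.0, 1))  # [12.0]
print(interval_points(24.0, 0.0, 3))  # [6.0, 12.0, 18.0]
```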
Each interval observation point, like the farthest and nearest observation points, has a range within which the complete projected image can be viewed; this is taken as the third preset range. The projected image is shot continuously within the third preset range to obtain continuously shot third images, and the positions of all shot third images are recorded. Third target images can then be searched for among the third images in a manner similar to step 206; a third target image is a third image in which the complete projected image is captured within the third preset range of an interval observation point.
And further determining third edge position information for viewing the projected image within a third preset range according to the recorded third position information of the third target image.
And then executing the step of determining the window range according to the test result.
After the first edge location information and the second edge location information are determined, step 208 is entered.
Step 208: and determining a shooting range for shooting the projection image according to the area surrounded by the first edge position information and the second edge position information, and taking the shooting range as a window range of the projection image.
In the embodiment of the present invention, the first edge position information and the second edge position information each comprise a plurality of position entries. The range formed by the plurality of first edge positions is a first range within which the complete projected image can be viewed at the farthest observation point, and the range formed by the plurality of second edge positions is a second range within which the complete projected image can be viewed at the nearest observation point.
In the embodiment of the present invention, the first range and the second range may be set as approximately parallel shapes, such as circular, square, or irregular contours.
The approximate center point of the first shape formed by the first range is connected to the approximate center point of the second shape formed by the second range by a straight line perpendicular to both shapes. A point given by a piece of first edge position information is then connected to a point given by a piece of second edge position information, the connecting line lying in the same plane as the straight line perpendicular to the first and second shapes.
In the same manner, the points formed by the plurality of first edge positions and the plurality of second edge positions are connected one by one with straight lines, each connecting line lying in a plane containing the straight line perpendicular to the first and second shapes. This yields the area enclosed by the first edge position information and the second edge position information; this area is the shooting range within which the projected image can be captured, and the shooting range is taken as the window range of the projected image, i.e., the eyebox.
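The point-by-point connection above can be sketched as pairing corresponding edge points of the two ranges; each pair defines one straight edge of the eyebox volume. The 3-D coordinate layout and the one-to-one correspondence of the point lists are assumptions for illustration:

```python
def eyebox_edges(first_edge_points, second_edge_points):
    """Pair corresponding edge points of the first range (at the farthest
    observation point) and the second range (at the nearest observation
    point); each pair is one straight edge of the eyebox volume."""
    if len(first_edge_points) != len(second_edge_points):
        raise ValueError("edge point lists must correspond one to one")
    return list(zip(first_edge_points, second_edge_points))

# A square first range at depth z = 0 and a slightly smaller square
# second range at z = 24 cm give a frustum-like eyebox with four edges.
first = [(0, 0, 0), (10, 0, 0), (10, 10, 0), (0, 10, 0)]
second = [(1, 1, 24), (9, 1, 24), (9, 9, 24), (1, 9, 24)]
print(len(eyebox_edges(first, second)))  # 4
```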
In a preferred embodiment of the present invention, if there are other observation points between the farthest observation point and the nearest observation point, step 208 above may include:
substep N1: determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
Specifically, the points given by the first edge position information may be connected to the points given by the third edge position information in the manner described above to form a first shooting range; the points given by the second edge position information are then connected to the points given by the third edge position information to form a second shooting range. The area formed by the first shooting range and the second shooting range together is the window range of the projected image.
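With an interval observation point in play, the window range is thus the union of two stacked segments. A cross-section at any depth can be sketched by linear interpolation between the bounding ranges of whichever segment encloses that depth; the rectangular (width, height) cross-section model and the depth values are assumptions for illustration:

```python
def cross_section(depth, far_range, interval_range, near_range,
                  far_z=0.0, interval_z=12.0, near_z=24.0):
    """Return the (width, height) of the eyebox cross-section at the given
    depth, interpolating within the far-to-interval segment or the
    interval-to-near segment, whichever encloses the depth."""
    if depth <= interval_z:
        t = (depth - far_z) / (interval_z - far_z)
        lo, hi = far_range, interval_range
    else:
        t = (depth - interval_z) / (near_z - interval_z)
        lo, hi = interval_range, near_range
    return tuple(a + t * (b - a) for a, b in zip(lo, hi))

# Halfway through the first segment, the section lies halfway between the
# first range (10 x 6) and the interval range (8 x 5).
print(cross_section(6.0, (10, 6), (8, 5), (6, 4)))  # (9.0, 5.5)
```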
Of course, in practical applications, a person skilled in the art may also use other ways to determine the window scope, and the embodiment of the present invention is not limited thereto.
The embodiment of the invention provides a test method comprising: acquiring a projection image of a projection device; determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection device; testing the position information from which the projected image can be viewed within the preset ranges of the farthest observation point and the nearest observation point respectively; and determining the window range for viewing the projected image according to the test results. The embodiment obtains the true eyebox size from the farthest and nearest test points encountered in actual driving, so that it can be determined whether a designed HUD is qualified, and the HUD can be designed according to the tested eyebox size, improving HUD design accuracy and, in turn, the user experience.
EXAMPLE III
Referring to fig. 3, a schematic structural diagram of a testing apparatus provided in an embodiment of the present invention is shown; the apparatus may specifically include the following modules:
an obtaining module 310, configured to obtain a projection image of a projection apparatus; an observation point determining module 320, configured to determine a farthest observation point and a closest observation point of the projected image according to the position information of the projection device; the testing module 330 is configured to test the position information of the viewed projection image in the preset ranges of the farthest observation point and the closest observation point respectively; and a window range determining module 340, configured to determine a window range for viewing the projected image according to the test result.
Preferably, the test module 330 includes: the first shooting submodule is used for continuously shooting the projection image within a first preset range of the farthest observation point so as to obtain continuously shot first images and recording first position information for shooting each first image; and the second shooting submodule is used for continuously shooting the projection image within a second preset range of the nearest observation point so as to obtain continuously shot second images and record second position information for shooting each second image.
Preferably, the window range determining module 340 includes: a critical line drawing submodule for drawing, in advance, a critical line in the projected image at a position close to the boundary line of the projected image; a target image searching submodule for searching the first images for a first target image whose boundary line overlaps the critical line, and searching the second images for a second target image whose boundary line overlaps the critical line; an edge position information determining submodule for determining, according to the first position information of the first target image, first edge position information for viewing the projected image within the first preset range, and determining, according to the second position information of the second target image, second edge position information for viewing the projected image within the second preset range; and a first window range determining submodule for determining a shooting range for shooting the projected image according to the area enclosed by the first edge position information and the second edge position information, and taking the shooting range as the window range of the projected image.
Preferably, the test module 330 includes: the distance calculation submodule is used for calculating the distance between the farthest observation point and the nearest observation point; the interval test point determining submodule is used for equally dividing the distance and taking the equally divided points as interval observation points; the third image shooting submodule is used for continuously shooting the projection image within a third preset range of the interval observation point so as to obtain continuously shot third images and recording third position information for shooting each third image; the third target image searching submodule is used for searching a third target image of which the boundary line is overlapped with the critical line from the third image; a third edge position information determining submodule, configured to determine, according to third position information of the third target image, third edge position information of viewing the projected image within the third preset range; the window range determination module 340 includes: and the second window range determining submodule is used for determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
The embodiment of the invention provides a testing apparatus that acquires a projection image of a projection device; determines the farthest observation point and the nearest observation point of the projected image according to the position information of the projection device; tests the position information from which the projected image can be viewed within the preset ranges of the farthest observation point and the nearest observation point respectively; and determines the window range for viewing the projected image according to the test results. The embodiment obtains the true eyebox size from the farthest and nearest test points encountered in actual driving, so that it can be determined whether a designed HUD is qualified, and the HUD can be designed according to the tested eyebox size, improving HUD design accuracy and, in turn, the user experience.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present invention is not limited by the illustrated ordering of acts, as some steps may occur in other orders or concurrently with other steps in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above detailed description of the testing method and the testing apparatus provided by the present invention, and the specific examples applied herein have been provided to explain the principles and embodiments of the present invention, and the above descriptions of the embodiments are only used to help understand the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (6)

1. A method of testing, comprising:
acquiring a projection image of projection equipment;
determining the farthest observation point and the nearest observation point of the projected image according to the position information of the projection equipment;
testing the position information of the viewed projected image in the preset ranges of the farthest observation point and the nearest observation point respectively;
determining a window range for watching the projection image according to the test result;
wherein the step of testing the position information of the projected image viewed in the preset ranges of the farthest observation point and the nearest observation point respectively comprises:
calculating the distance between the farthest observation point and the nearest observation point;
equally dividing the distance, and taking an equally divided point as an interval observation point;
continuously shooting the projection image within a third preset range of the interval observation point to obtain continuously shot third images, and recording third position information of all the third images;
drawing, in advance, a critical line at a position close to the boundary line of the projected image; the critical line is a line drawn in the projected image at a position close to the boundary line of the projected image;
searching a third target image of which the boundary line is overlapped with the critical line from the third image;
determining third edge position information for viewing the projected image within the third preset range according to third position information of the third target image;
the step of determining the window range for viewing the projected image according to the test result comprises:
determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image; the first edge position information is first position information which can just see the projected image within a first preset range of the farthest observation point; the second edge position information is second position information of just seeing the projected image within a second preset range of the nearest observation point.
2. The method of claim 1, wherein the step of testing the position information of the projected image viewed in the preset ranges of the farthest observation point and the nearest observation point respectively further comprises:
continuously shooting the projection image within a first preset range of the farthest observation point to obtain continuously shot first images, and recording first position information of each first image;
and continuously shooting the projected image within a second preset range of the nearest observation point to acquire continuously shot second images, and recording second position information of shooting each second image.
3. The method of claim 2, wherein the step of determining the window range for viewing the projected image based on the test result further comprises:
searching the first images for a first target image whose boundary line overlaps the critical line, and searching the second images for a second target image whose boundary line overlaps the critical line;
determining first edge position information of the projected image viewed in the first preset range according to the first position information of the first target image; determining second edge position information of the projected image viewed in the second preset range according to second position information of the second target image;
determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
4. A test apparatus, comprising:
the acquisition module is used for acquiring a projection image of the projection equipment;
the observation point determining module is used for determining the farthest observation point and the closest observation point of the projected image according to the position information of the projection equipment;
the testing module is used for testing the position information of the viewed projection image in the preset ranges of the farthest observation point and the nearest observation point respectively;
the window range determining module is used for determining the window range for watching the projection image according to the test result;
wherein the test module comprises:
the distance calculation submodule is used for calculating the distance between the farthest observation point and the nearest observation point;
the interval test point determining submodule is used for equally dividing the distance and taking the equally divided points as interval observation points;
the third image shooting submodule is used for continuously shooting the projection image within a third preset range of the interval observation point so as to obtain continuously shot third images and recording third position information for shooting each third image;
a third target image searching submodule, configured to search, from the third image, a third target image in which a boundary line of the third image overlaps a critical line; the critical line is a line drawn at a boundary line close to the projection image in the projection image;
a third edge position information determining submodule, configured to determine, according to third position information of the third target image, third edge position information of viewing the projected image within the third preset range;
the window range determination module comprises:
the second window range determining submodule is used for determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image; the first edge position information is first position information which can just see the projected image within a first preset range of the farthest observation point; the second edge position information is second position information of just seeing the projected image within a second preset range of the nearest observation point;
the window range determination module further comprises:
and the critical line drawing submodule is used for drawing a critical line in the projection image at a position close to the boundary line of the projection image in advance.
5. The apparatus of claim 4, wherein the test module further comprises:
the first shooting submodule is used for continuously shooting the projection image within a first preset range of the farthest observation point so as to obtain continuously shot first images and recording first position information for shooting each first image;
and the second shooting submodule is used for continuously shooting the projection image within a second preset range of the nearest observation point so as to obtain continuously shot second images and record second position information for shooting each second image.
6. The apparatus of claim 5, wherein the window extent determination module further comprises:
the target image searching submodule is used for searching the first images for a first target image whose boundary line overlaps the critical line and searching the second images for a second target image whose boundary line overlaps the critical line;
an edge position information determining submodule, configured to determine, according to first position information of the first target image, first edge position information of the projected image viewed within the first preset range; determining second edge position information of the projected image viewed in the second preset range according to second position information of the second target image;
and the first window range determining submodule is used for determining a shooting range for shooting the projection image according to an area surrounded by the first edge position information, the second edge position information and the third edge position information, and taking the shooting range as a window range of the projection image.
CN201810311530.6A 2018-04-09 2018-04-09 Test method and device Active CN108444448B (en)


Publications (2)

Publication Number Publication Date
CN108444448A CN108444448A (en) 2018-08-24
CN108444448B true CN108444448B (en) 2021-05-25


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5731902A (en) * 1996-08-19 1998-03-24 Delco Electronics Corporation Head-up display combiner binocular test fixture
CN101166288A (en) * 2006-10-17 2008-04-23 精工爱普生株式会社 Calibration technique for heads up display system
CN206132356U (en) * 2016-09-18 2017-04-26 惠州市华阳多媒体电子有限公司 HUD image test equipment
CN106657979A (en) * 2016-09-18 2017-05-10 惠州市华阳多媒体电子有限公司 HUD image testing system and method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant