CN108444448A - Test method and device - Google Patents
- Publication number: CN108444448A (application CN201810311530.6A)
- Authority
- CN
- China
- Prior art keywords
- image
- projection image
- observation point
- location information
- projection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
Abstract
The present invention provides a test method and a test device. The test method includes: obtaining the projection image of a projection device; determining, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image; testing, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed; and determining the viewing window of the projection image according to the test results. By measuring at the farthest and nearest test points that occur under actual driving conditions, the present invention obtains a realistic eye box size, from which it can be determined whether a HUD design is qualified. The HUD can then be designed against the tested eye box size, improving the accuracy of HUD design and further improving the user experience.
Description
Technical field
The present invention relates to the field of vehicle-mounted systems, and in particular to a test method and device.
Background technology
A HUD (Head Up Display) is a mainstream automotive auxiliary device that projects driving information (such as speed, navigation, fuel consumption, and engine RPM) onto the front windshield by optical projection. The driver can obtain this information without looking down at the instrument panel, which improves driving safety.
The eye box is the window within which the HUD image can be viewed, and it is an important technical index of HUD technology: the eye box size reflects whether a HUD design is qualified. While a vehicle is in use, the driver's seat can be adjusted up, down, forward, and backward, and whether the complete HUD image remains visible from these different positions is a matter of concern. At present, however, no scheme exists for testing the eye box size of a HUD.
Therefore, how to obtain a realistic eye box size in order to determine whether a designed HUD is qualified is an urgent problem to be solved.
Summary of the invention
The present invention provides a test method and device, to solve the problem that the prior art offers no test for obtaining a realistic eye box size with which to determine whether a designed HUD is qualified.
To solve the above problems, the invention discloses a test method, including:
obtaining the projection image of a projection device;
determining, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image;
testing, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed;
determining the viewing window of the projection image according to the test results.
Preferably, the step of testing, within the preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed includes:
continuously shooting the projection image within a first preset range of the farthest observation point to obtain continuously shot first images, and recording the first position information at which each first image was shot;
continuously shooting the projection image within a second preset range of the nearest observation point to obtain continuously shot second images, and recording the second position information at which each second image was shot.
Preferably, the step of determining the viewing window of the projection image according to the test results includes:
drawing critical lines in advance in the projection image, at the boundary lines of the projection image;
searching the first images for first target images whose boundary lines coincide with the critical lines, and searching the second images for second target images whose boundary lines coincide with the critical lines;
determining, according to the first position information of the first target images, the first edge position information for viewing the projection image within the first preset range; and determining, according to the second position information of the second target images, the second edge position information for viewing the projection image within the second preset range;
determining the shooting range of the projection image according to the region enclosed by the first edge position information and the second edge position information, and taking the shooting range as the viewing window of the projection image.
Preferably, the step of testing, within the preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed includes:
calculating the distance between the farthest observation point and the nearest observation point;
equally dividing the distance, and taking the equal-division points as section observation points;
continuously shooting the projection image within a third preset range of a section observation point to obtain continuously shot third images, and recording the third position information at which each third image was shot;
searching the third images for third target images whose boundary lines coincide with the critical lines;
determining, according to the third position information of the third target images, the third edge position information for viewing the projection image within the third preset range;
and the step of determining the viewing window of the projection image according to the test results includes:
determining the shooting range of the projection image according to the region enclosed by the first edge position information, the second edge position information, and the third edge position information, and taking the shooting range as the viewing window of the projection image.
To solve the above problems, an embodiment of the invention also discloses a test device, including:
an acquisition module, for obtaining the projection image of a projection device;
an observation point determining module, for determining, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image;
a test module, for testing, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed;
a window determining module, for determining the viewing window of the projection image according to the test results.
Preferably, the test module includes:
a first shooting submodule, for continuously shooting the projection image within a first preset range of the farthest observation point to obtain continuously shot first images, and recording the first position information at which each first image was shot;
a second shooting submodule, for continuously shooting the projection image within a second preset range of the nearest observation point to obtain continuously shot second images, and recording the second position information at which each second image was shot.
Preferably, the window determining module includes:
a critical line drawing submodule, for drawing critical lines in advance in the projection image, at the boundary lines of the projection image;
a target image searching submodule, for searching the first images for first target images whose boundary lines coincide with the critical lines, and searching the second images for second target images whose boundary lines coincide with the critical lines;
an edge position information determining submodule, for determining, according to the first position information of the first target images, the first edge position information for viewing the projection image within the first preset range, and determining, according to the second position information of the second target images, the second edge position information for viewing the projection image within the second preset range;
a first window determining submodule, for determining the shooting range of the projection image according to the region enclosed by the first edge position information and the second edge position information, and taking the shooting range as the viewing window of the projection image.
Preferably, the test module includes:
a distance calculating submodule, for calculating the distance between the farthest observation point and the nearest observation point;
a section test point determining submodule, for equally dividing the distance and taking the equal-division points as section observation points;
a third image shooting submodule, for continuously shooting the projection image within a third preset range of a section observation point to obtain continuously shot third images, and recording the third position information at which each third image was shot;
a third target image searching submodule, for searching the third images for third target images whose boundary lines coincide with the critical lines;
a third edge position information determining submodule, for determining, according to the third position information of the third target images, the third edge position information for viewing the projection image within the third preset range;
and the window determining module includes:
a second window determining submodule, for determining the shooting range of the projection image according to the region enclosed by the first edge position information, the second edge position information, and the third edge position information, and taking the shooting range as the viewing window of the projection image.
Compared with the prior art, the present invention has the following advantages:
An embodiment of the present invention provides a test method and device that obtain the projection image of a projection device; determine, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image; test, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed; and determine the viewing window of the projection image according to the test results. By measuring at the farthest and nearest test points that occur under actual driving conditions, the embodiment obtains a realistic eye box size, from which it can be determined whether the HUD design is qualified; the HUD can then be designed against the tested eye box size, improving the accuracy of HUD design and further improving the user experience.
Description of the drawings
Fig. 1 shows a step flow chart of a test method provided in an embodiment of the present invention;
Fig. 1a shows a schematic diagram of an eye box provided in an embodiment of the present invention;
Fig. 2 shows a step flow chart of a test method provided in an embodiment of the present invention;
Fig. 2a shows a schematic diagram of critical lines drawn in a projection image, provided in an embodiment of the present invention;
Fig. 3 shows a structural schematic diagram of a test device provided in an embodiment of the present invention.
Detailed description of the embodiments
To make the above objectives, features, and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Embodiment one
Referring to Fig. 1, a step flow chart of a test method provided in an embodiment of the present invention is shown. The method may specifically include:
Step 101: obtain the projection image of the projection device.
In embodiments of the present invention, the projection device may be equipment such as a HUD; the embodiment of the present invention places no limitation on this. The embodiments are described in terms of HUD projection images, but this is not the sole limitation on the present invention.
The test of the embodiments of the present invention is carried out outside the vehicle. A space environment similar to that of the vehicle can first be arranged, and a projection device such as a HUD installed so that it projects onto a windshield similar to the vehicle's front windshield.
After the environment has been arranged to match the vehicle, the projection image projected by the projection device (such as a HUD) can be obtained.
After the projection image of the projection device has been obtained, proceed to step 102.
Step 102: determine, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image.
In embodiments of the present invention, since the projection device is installed according to the space environment of the vehicle, once the location information of the projection device is determined, the seat position information of the vehicle can be calculated. The seat can be moved back and forth; for example, it may travel 20 cm forward and backward.
After the location information of the projection device is determined, the farthest observation point and the nearest observation point of the projection image can be determined from it; that is, the farthest point to which the driver's seat can move backward and the nearest point to which it can move forward.
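Under the seat-travel description above, the two observation points can be sketched as follows. The function name, the 80 cm nominal eye-to-HUD distance, and the symmetric split of the 20 cm travel are illustrative assumptions, not details fixed by the patent.

```python
# Sketch: derive the nearest/farthest observation points from the HUD
# position and the seat's fore-aft travel. The nominal distance and the
# symmetric travel split are illustrative assumptions.

def observation_points(hud_to_nominal_eye_cm: float, seat_travel_cm: float = 20.0):
    """Return (nearest, farthest) eye-to-HUD distances in cm, assuming
    the seat slides seat_travel_cm/2 forward and backward of nominal."""
    half = seat_travel_cm / 2.0
    nearest = hud_to_nominal_eye_cm - half   # seat moved fully forward
    farthest = hud_to_nominal_eye_cm + half  # seat moved fully backward
    return nearest, farthest

nearest, farthest = observation_points(80.0)  # assume 80 cm nominal distance
print(nearest, farthest)  # 70.0 90.0
```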
After the farthest observation point and the nearest observation point of the projection image have been determined, proceed to step 103.
Step 103: test, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed.
Normally, since the front windshield of the vehicle has an arc-shaped structure, there is a specific range around the farthest observation point and the nearest observation point within which the user can view the complete projection image; beyond that range, the complete projection image cannot be seen.
In embodiments of the present invention, before testing, the preset ranges for viewing the projection image at the farthest observation point and the nearest observation point can be set according to a preset rule; for example, they can be set according to the working experience of the developers. The embodiment of the present invention places no limitation on this.
Testing may be carried out by continuous shooting within the preset ranges of the farthest observation point and the nearest observation point. For example, shooting equipment continuously shoots the projection image within each preset range, and the shooting position of each image is recorded, so that the edge position information of the shot projection image can be determined from the continuously shot images. From the edge position information, the viewing window of the projection image within the preset ranges of the farthest test point and the nearest test point, that is, the size of the eye box, can then be determined.
Of course, in practical applications, other testing manners can also be used; the embodiment of the present invention places no limitation on this.
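A minimal sketch of the continuous-shooting test described above: a camera is stepped through positions within the preset range around an observation point, and each frame is stored together with the position at which it was shot. The `capture_frame` callback and the fixed step size are assumptions standing in for whatever shooting equipment is actually used.

```python
# Sketch: sweep a camera through the preset range around an observation
# point, capturing one frame per position and recording each frame's
# shooting position. capture_frame() is a hypothetical stand-in for the
# real camera API.

def sweep_capture(center_cm, preset_range_cm, step_cm, capture_frame):
    """Capture frames at evenly spaced positions in
    [center - preset_range, center + preset_range], pairing each frame
    with the position at which it was shot."""
    records = []
    pos = center_cm - preset_range_cm
    while pos <= center_cm + preset_range_cm + 1e-9:
        frame = capture_frame(pos)     # hypothetical camera call
        records.append((pos, frame))   # position info recorded per frame
        pos += step_cm
    return records

# Usage with a dummy camera:
records = sweep_capture(90.0, 15.0, 5.0, capture_frame=lambda p: f"img@{p}")
print([p for p, _ in records])  # [75.0, 80.0, 85.0, 90.0, 95.0, 100.0, 105.0]
```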
The testing method will be described in detail in Embodiment two below and is not repeated here.
After the viewing positions of the projection image have been tested within the preset ranges of the farthest observation point and the nearest observation point, proceed to step 104.
Step 104: determine the viewing window of the projection image according to the test results.
In embodiments of the present invention, after testing within the preset ranges of the farthest observation point and the nearest observation point, the edge position information for viewing the projection image at the farthest observation point and the nearest observation point can be determined from the test results, and the viewing window of the projection image can then be determined from the edge position information. This is described in detail in Embodiment two below and is not repeated here.
For example, referring to Fig. 1a, a schematic diagram of an eye box provided in an embodiment of the present invention is shown. As shown in Fig. 1a, the formed viewing window can be a rectangular parallelepiped structure, that is, an eye box structure; the user can view the complete projection image from within this rectangular parallelepiped.
Of course, in practical applications, the formed eye box structure may take other shapes, such as an ellipse or an irregular shape. The above example is given merely to aid understanding of the technical solution of the embodiment of the present invention and is not the sole limitation on it.
An embodiment of the present invention provides a test method that obtains the projection image of a projection device; determines, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image; tests, within preset ranges of the farthest observation point and the nearest observation point respectively, the positions from which the projection image can be viewed; and determines the viewing window of the projection image according to the test results. By measuring at the farthest and nearest test points that occur under actual driving conditions, the embodiment obtains a realistic eye box size, from which it can be determined whether the HUD design is qualified; the HUD can then be designed against the tested eye box size, improving the accuracy of HUD design and further improving the user experience.
Embodiment two
Referring to Fig. 2, a step flow chart of a test method provided in an embodiment of the present invention is shown. The method may specifically include:
Step 201: obtain the projection image of the projection device.
Step 202: determine, according to the location information of the projection device, the farthest observation point and the nearest observation point of the projection image.
In embodiments of the present invention, the implementation of steps 201 to 202 is similar to that of steps 101 to 102 in Embodiment one above and is not repeated here.
After the farthest observation point and the nearest observation point for viewing the projection image have been determined, proceed to step 203.
Step 203: continuously shoot the projection image within the first preset range of the farthest observation point to obtain continuously shot first images, and record the first position information at which each first image was shot.
In embodiments of the present invention, the first preset range is a set range that extends outward from the farthest observation point by a set distance; for example, the set distance may be 30 cm, 15 cm, and so on. The embodiment of the present invention places no limitation on this. Of course, since the vehicle's front windshield has an arcuate shape, the distances extended in different directions from the farthest observation point need not be identical, though they may be; the embodiment places no limitation on this either.
The projection image is continuously shot with shooting equipment within the first preset range of the farthest observation point, so that multiple images, denoted the first images, are obtained from shooting within the first preset range. While each first image is being shot, its shooting position is also recorded, denoted the first position information.
After the continuously shot first images have been obtained and the first position information of each first image recorded, proceed to step 204.
Step 204: continuously shoot the projection image within the second preset range of the nearest observation point to obtain continuously shot second images, and record the second position information at which each second image was shot.
In embodiments of the present invention, the second preset range is a set range that extends outward from the nearest observation point by a set distance; for example, the set distance may be 25 cm, 12 cm, and so on. The embodiment of the present invention places no limitation on this. Of course, since the vehicle's front windshield has an arcuate shape, the distances extended in different directions from the nearest observation point need not be identical, though they may be; the embodiment places no limitation on this either.
The projection image is continuously shot with shooting equipment within the second preset range of the nearest observation point, so that multiple images, denoted the second images, are obtained from shooting within the second preset range. While each second image is being shot, its shooting position is also recorded, denoted the second position information.
After the continuously shot second images have been obtained and the second position information of each second image recorded, proceed to step 205.
Step 205: draw critical lines in advance in the projection image, at the boundary lines of the projection image.
In embodiments of the present invention, critical lines are drawn in advance in the projection image at positions close to its boundary lines. For example, referring to Fig. 2a, a schematic diagram of critical lines drawn in a projection image provided in an embodiment of the present invention is shown. As shown in Fig. 2a, one or more critical lines can be drawn in advance near each boundary line of the projection image, and the number of critical lines drawn at each boundary line should be identical. For example, in Fig. 2a, four critical lines are drawn near the left boundary line of the projection image, so four critical lines should likewise be drawn at the right, upper, and lower boundary lines. Of course, the critical lines drawn near each boundary line can be of different colors, so that the shot edge positions can be distinguished when the projection image is shot, or of the same color; the embodiment of the present invention places no limitation on this.
Step 206: search the first images for first target images whose boundary lines coincide with the critical lines, and search the second images for second target images whose boundary lines coincide with the critical lines.
In embodiments of the present invention, a first target image is a first image in which the complete projection image has been captured within the first preset range of the farthest observation point, and a second target image is a second image in which the complete projection image has been captured within the second preset range of the nearest observation point; there are at least four first target images and at least four second target images.
After the projection image has been continuously shot within the first preset range, the first target images whose boundary lines coincide with the critical lines can be searched for among the continuously shot first images. In the left half of Fig. 2a, the right boundary line of the shot first image does not coincide with the corresponding critical line in the projection image, which indicates that the complete projection image cannot be seen from the position at which that first image was shot. In the right half of Fig. 2a, each boundary line of the shot first image coincides with the corresponding critical line in the projection image, so that first image is taken as a first target image.
The same approach can then be used to determine the second target images among the shot second images.
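The target-image search can be sketched as follows, assuming an upstream vision step has already reported, per frame and per side, whether the image boundary coincides with its critical line; that per-side flag format is an assumption, not something the patent specifies.

```python
# Sketch: select target images from a capture sweep. Each frame carries
# per-side flags (assumed to come from an upstream vision step) saying
# whether that side's boundary line coincides with its critical line.
# A frame is a target image only when all four sides coincide, i.e. the
# complete projection image is visible in it.

SIDES = ("left", "right", "top", "bottom")

def find_target_images(frames):
    """frames: list of (position, {side: coincides_bool}) tuples.
    Returns positions whose frame shows the complete projection image."""
    return [pos for pos, sides in frames
            if all(sides.get(s, False) for s in SIDES)]

frames = [
    (75.0, {"left": False, "right": True, "top": True, "bottom": True}),
    (80.0, {"left": True, "right": True, "top": True, "bottom": True}),
    (85.0, {"left": True, "right": True, "top": True, "bottom": True}),
]
print(find_target_images(frames))  # [80.0, 85.0]
```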
After the first target images and the second target images have been found, proceed to step 207.
Step 207: determine, according to the first position information of the first target images, the first edge position information for viewing the projection image within the first preset range; and determine, according to the second position information of the second target images, the second edge position information for viewing the projection image within the second preset range.
In embodiments of the present invention, the first edge position information is the position information at which the projection image can just be viewed within the first preset range, and the second edge position information is the position information at which the projection image can just be viewed within the second preset range.
Since the shooting position of each first image and each second image has been recorded in advance, once the first target images and second target images have been found among them, the first position information of the first target images and the second position information of the second target images can be read from the pre-recorded position information. From these, the first edge position information for viewing the projection image within the first preset range and the second edge position information for viewing the projection image within the second preset range can then be determined.
In a preferred embodiment of the present invention, other observation points between the farthest observation point and the nearest observation point can also be determined and tested; specifically, this may include the following steps:
Step S1: calculate the distance between the farthest observation point and the nearest observation point;
Step S2: equally divide the distance, and take the equal-division points as section observation points;
Step S3: continuously shoot the projection image within a third preset range of a section observation point to obtain continuously shot third images, and record the third position information at which each third image was shot;
Step S4: search the third images for third target images whose boundary lines coincide with the critical lines;
Step S5: determine, according to the third position information of the third target images, the third edge position information for viewing the projection image within the third preset range.
In embodiments of the present invention, when other observation points between the farthest observation point and the nearest observation point need to be determined and tested, the distance between the farthest observation point and the nearest observation point is first calculated, the calculated distance is then equally divided, and the equal-division points are taken as section observation points. For example, if the distance between the farthest observation point and the nearest observation point is 24 cm and one section observation point is needed, the 24 cm distance is bisected, and the position 12 cm from both the farthest observation point and the nearest observation point is used as the section observation point for testing. Of course, when three observation points between the farthest observation point and the nearest observation point need to be tested, the distance between them is divided into quarters, and the three equal-division points are taken as section observation points.
It should be appreciated that the above example is given merely to aid understanding of the technical solution of the embodiment of the present invention and is not the sole limitation on it.
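The equal-division step can be sketched as follows; it reproduces the worked example above (one section point on a 24 cm span sits at 12 cm, and three section points come from quartering). The function name and the measurement origin at the nearest observation point are illustrative choices.

```python
# Sketch of the equal-division step: splitting the far-to-near distance
# into n+1 equal parts yields n section observation points.

def section_points(distance_cm: float, n_points: int):
    """Return positions (measured from the nearest observation point)
    of n_points equally spaced section observation points."""
    step = distance_cm / (n_points + 1)
    return [round(step * i, 6) for i in range(1, n_points + 1)]

print(section_points(24.0, 1))  # [12.0]          (bisection)
print(section_points(24.0, 3))  # [6.0, 12.0, 18.0] (quartering)
```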
Like the farthest observation point and the nearest observation point, each section observation point has a preset range within which the complete projection image can be viewed; this is taken as the third preset range. The projection image is continuously shot within the third preset range to obtain continuously shot third images, and the shooting position of each third image is recorded. The third target images can then be searched for among the third images in a manner similar to step 206 above; a third target image is a third image in which the complete projection image has been captured within the third preset range of the section observation point.
From the recorded third position information of the third target images, the third edge position information for viewing the projection image within the third preset range can then be determined.
The step of determining the viewing window according to the test results is then executed.
After the first edge position information and the second edge position information have been determined, proceed to step 208.
Step 208: determine the shooting range of the projection image according to the region enclosed by the first edge position information and the second edge position information, and take the shooting range as the viewing window of the projection image.
In embodiments of the present invention, the first edge position information and the second edge position information each comprise multiple positions. The range formed by the multiple first edge positions is the first range, within which the complete projection image can be viewed at the farthest observation point; the range formed by the multiple second edge positions is the second range, within which the complete projection image can be viewed at the nearest observation point.
In embodiments of the present invention, the first range and the second range can be set as approximately parallel structures, such as circular structures, square structures, or irregularly shaped structures.
The approximate center point of the first structure formed by the first range and the approximate center point of the second structure formed by the second range are connected by a straight line, which is perpendicular to both the first structure and the second structure. A point formed by a certain piece of first edge position information is then connected to a point formed by a certain piece of second edge position information, and this connecting line lies in the same plane as the above-mentioned straight line perpendicular to the first structure and the second structure.
Further, in the above manner, the points formed by the plurality of first edge position information items are connected one by one, by straight lines, to the points formed by the plurality of second edge position information items, each connecting line lying in the same plane as the straight line perpendicular to the first structure and the second structure. The region enclosed according to the first edge position information and the second edge position information can thus be obtained; this region is the shooting range within which the projection image is captured, and the shooting range is taken as the window of the projection image, namely the eye box.
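The enclosing of a region from the two sets of edge positions can be illustrated with a much-simplified sketch. Instead of the patent's line-connection construction between the first and second structures, the snippet below (all names hypothetical) simply takes the axis-aligned bounds of all recorded edge camera positions as a rough eye-box estimate:

```python
def eye_box_bounds(first_edges, second_edges):
    """Combine the edge camera positions measured at the farthest (first) and
    nearest (second) observation points into one axis-aligned bounding box.
    Each argument is a list of (x, y, z) positions at which the complete
    projection image was just still visible."""
    pts = list(first_edges) + list(second_edges)
    xs, ys, zs = zip(*pts)
    return {"x": (min(xs), max(xs)),
            "y": (min(ys), max(ys)),
            "z": (min(zs), max(zs))}

far = [(-0.10, 1.15, 2.8), (0.10, 1.25, 2.8)]   # first edge positions
near = [(-0.08, 1.17, 2.2), (0.08, 1.23, 2.2)]  # second edge positions
print(eye_box_bounds(far, near))
# → {'x': (-0.1, 0.1), 'y': (1.15, 1.25), 'z': (2.2, 2.8)}
```

A bounding box overstates the true region (the enclosed volume usually tapers between the two planes), which is why the patent connects corresponding edge points with straight lines rather than taking global extrema.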
In a preferred embodiment of the present invention, when other observation points also exist between the farthest observation point and the nearest observation point, the above step 208 may include:
Sub-step N1: determining, according to the region enclosed by the first edge position information, the second edge position information and the third edge position information, the shooting range within which the projection image is captured, and taking the shooting range as the window of the projection image.
Specifically, in the above manner, the points formed by the first edge position information may be connected to the points formed by the third edge position information to form a first shooting range; the points formed by the second edge position information are then connected to the points formed by the third edge position information in the same manner to form a second shooting range. The region formed by the first shooting range and the second shooting range is then the window of the projection image.
Of course, in practical applications, those skilled in the art may also determine the window in other manners, which is not limited by the embodiments of the present invention.
An embodiment of the present invention provides a test method: the projection image of a projection device is obtained; the farthest observation point and the nearest observation point of the projection image are determined according to the position information of the projection device; within the preset ranges of the farthest observation point and the nearest observation point respectively, the position information for viewing the projection image is tested; and the window for viewing the projection image is determined according to the test result. In the embodiment of the present invention, the most realistic eye-box size is obtained from the farthest and nearest test points under actual driving conditions, so that it can be determined whether a designed HUD is qualified, and a HUD can be designed with the tested eye-box size, improving the accuracy of HUD design and further improving the user experience.
Embodiment three
Referring to Fig. 3, a structural schematic diagram of a test device provided in an embodiment of the present invention is shown, which may specifically include the following modules:
an acquisition module 310, configured to obtain the projection image of a projection device; an observation point determining module 320, configured to determine the farthest observation point and the nearest observation point of the projection image according to the position information of the projection device; a test module 330, configured to test, within the preset ranges of the farthest observation point and the nearest observation point respectively, the position information for viewing the projection image; and a window determining module 340, configured to determine the window for viewing the projection image according to the test result.
Preferably, the test module 330 includes: a first shooting submodule, configured to continuously photograph the projection image within a first preset range of the farthest observation point to obtain continuously captured first images, and to record first position information of each captured first image; and a second shooting submodule, configured to continuously photograph the projection image within a second preset range of the nearest observation point to obtain continuously captured second images, and to record second position information of each captured second image.
Preferably, the window determining module 340 includes: a critical line drawing submodule, configured to draw, in the projection image in advance, a critical line at the boundary line of the projection image; a target image searching submodule, configured to search the first images for a first target image whose boundary line coincides with the critical line, and to search the second images for a second target image whose boundary line coincides with the critical line; an edge position information determining submodule, configured to determine, according to the first position information of the first target image, first edge position information for viewing the projection image within the first preset range, and to determine, according to the second position information of the second target image, second edge position information for viewing the projection image within the second preset range; and a first window determining submodule, configured to determine, according to the region enclosed by the first edge position information and the second edge position information, the shooting range within which the projection image is captured, and to take the shooting range as the window of the projection image.
Preferably, the test module 330 includes: a distance calculating submodule, configured to calculate the distance between the farthest observation point and the nearest observation point; a section test point determining submodule, configured to equally divide the distance and take the equal-division points as section observation points; a third image shooting submodule, configured to continuously photograph the projection image within a third preset range of a section observation point to obtain continuously captured third images, and to record third position information of each captured third image; a third target image searching submodule, configured to search the third images for a third target image whose boundary line coincides with the critical line; and a third edge position information determining submodule, configured to determine, according to the third position information of the third target image, third edge position information for viewing the projection image within the third preset range. The window determining module 340 includes: a second window determining submodule, configured to determine, according to the region enclosed by the first edge position information, the second edge position information and the third edge position information, the shooting range within which the projection image is captured, and to take the shooting range as the window of the projection image.
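The equal-division step performed by the distance calculating and section test point determining submodules can be sketched as follows; a minimal illustration in which the 3-D point representation and function name are assumptions:

```python
def section_points(farthest, nearest, n_sections):
    """Equally divide the segment between the farthest and nearest observation
    points; the interior division points serve as intermediate (section)
    observation points at which additional test shots are taken."""
    fx, fy, fz = farthest
    nx, ny, nz = nearest
    pts = []
    for i in range(1, n_sections):      # interior points only
        t = i / n_sections
        pts.append((fx + t * (nx - fx),
                    fy + t * (ny - fy),
                    fz + t * (nz - fz)))
    return pts

# e.g. eye height 1.2 m, camera moved from 3.0 m to 1.0 m from the combiner
print(section_points((0.0, 1.2, 3.0), (0.0, 1.2, 1.0), 4))
# → [(0.0, 1.2, 2.5), (0.0, 1.2, 2.0), (0.0, 1.2, 1.5)]
```

Each returned point would then get its own (third) preset range, photographed and searched exactly like the farthest and nearest observation points.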
An embodiment of the present invention provides a test device: the projection image of a projection device is obtained; the farthest observation point and the nearest observation point of the projection image are determined according to the position information of the projection device; within the preset ranges of the farthest observation point and the nearest observation point respectively, the position information for viewing the projection image is tested; and the window for viewing the projection image is determined according to the test result. In the embodiment of the present invention, the most realistic eye-box size is obtained from the farthest and nearest test points under actual driving conditions, so that it can be determined whether a designed HUD is qualified, and a HUD can be designed with the tested eye-box size, improving the accuracy of HUD design and further improving the user experience.
For the foregoing method embodiments, for simplicity of description, each is expressed as a series of action combinations; however, those skilled in the art should understand that the present invention is not limited by the described action sequence, because according to the present invention, certain steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in the specification are preferred embodiments, and the actions and modules involved are not necessarily essential to the invention.
Each embodiment in this specification is described in a progressive manner; each embodiment highlights its differences from the other embodiments, and the same or similar parts between the embodiments may be referred to each other.
Finally, it should be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another entity or operation, without necessarily requiring or implying any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, commodity or device including a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, commodity or device. In the absence of further restrictions, an element defined by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, commodity or device including that element.
A test method and a test device provided by the present invention have been described in detail above. Specific examples are applied herein to explain the principle and implementation of the present invention, and the explanation of the above embodiments is only intended to help understand the method of the present invention and its core concept. Meanwhile, for those of ordinary skill in the art, according to the idea of the present invention, there will be changes in the specific implementation manner and application range. In conclusion, the content of this specification should not be construed as a limitation of the present invention.
Claims (8)
1. A test method, characterized in that it comprises:
obtaining a projection image of a projection device;
determining a farthest observation point and a nearest observation point of the projection image according to position information of the projection device;
testing, within preset ranges of the farthest observation point and the nearest observation point respectively, position information for viewing the projection image;
determining a window for viewing the projection image according to a test result.
2. The method according to claim 1, characterized in that the step of testing, within the preset ranges of the farthest observation point and the nearest observation point respectively, the position information for viewing the projection image comprises:
continuously photographing the projection image within a first preset range of the farthest observation point to obtain continuously captured first images, and recording first position information of each captured first image;
continuously photographing the projection image within a second preset range of the nearest observation point to obtain continuously captured second images, and recording second position information of each captured second image.
3. The method according to claim 2, characterized in that the step of determining the window for viewing the projection image according to the test result comprises:
drawing, in the projection image in advance, a critical line at a boundary line of the projection image;
searching the first images for a first target image whose boundary line coincides with the critical line, and searching the second images for a second target image whose boundary line coincides with the critical line;
determining, according to the first position information of the first target image, first edge position information for viewing the projection image within the first preset range; and determining, according to the second position information of the second target image, second edge position information for viewing the projection image within the second preset range;
determining, according to a region enclosed by the first edge position information and the second edge position information, a shooting range within which the projection image is captured, and taking the shooting range as the window of the projection image.
4. The method according to claim 2, characterized in that the step of testing, within the preset ranges of the farthest observation point and the nearest observation point respectively, the position information for viewing the projection image comprises:
calculating a distance between the farthest observation point and the nearest observation point;
equally dividing the distance, and taking the equal-division points as section observation points;
continuously photographing the projection image within a third preset range of a section observation point to obtain continuously captured third images, and recording third position information of each captured third image;
searching the third images for a third target image whose boundary line coincides with the critical line;
determining, according to the third position information of the third target image, third edge position information for viewing the projection image within the third preset range;
the step of determining the window for viewing the projection image according to the test result comprising:
determining, according to a region enclosed by the first edge position information, the second edge position information and the third edge position information, a shooting range within which the projection image is captured, and taking the shooting range as the window of the projection image.
5. A test device, characterized in that it comprises:
an acquisition module, configured to obtain a projection image of a projection device;
an observation point determining module, configured to determine a farthest observation point and a nearest observation point of the projection image according to position information of the projection device;
a test module, configured to test, within preset ranges of the farthest observation point and the nearest observation point respectively, position information for viewing the projection image;
a window determining module, configured to determine a window for viewing the projection image according to a test result.
6. The device according to claim 5, characterized in that the test module comprises:
a first shooting submodule, configured to continuously photograph the projection image within a first preset range of the farthest observation point to obtain continuously captured first images, and to record first position information of each captured first image;
a second shooting submodule, configured to continuously photograph the projection image within a second preset range of the nearest observation point to obtain continuously captured second images, and to record second position information of each captured second image.
7. The device according to claim 6, characterized in that the window determining module comprises:
a critical line drawing submodule, configured to draw, in the projection image in advance, a critical line at a boundary line of the projection image;
a target image searching submodule, configured to search the first images for a first target image whose boundary line coincides with the critical line, and to search the second images for a second target image whose boundary line coincides with the critical line;
an edge position information determining submodule, configured to determine, according to the first position information of the first target image, first edge position information for viewing the projection image within the first preset range, and to determine, according to the second position information of the second target image, second edge position information for viewing the projection image within the second preset range;
a first window determining submodule, configured to determine, according to a region enclosed by the first edge position information and the second edge position information, a shooting range within which the projection image is captured, and to take the shooting range as the window of the projection image.
8. The device according to claim 6, characterized in that the test module comprises:
a distance calculating submodule, configured to calculate a distance between the farthest observation point and the nearest observation point;
a section test point determining submodule, configured to equally divide the distance and take the equal-division points as section observation points;
a third image shooting submodule, configured to continuously photograph the projection image within a third preset range of a section observation point to obtain continuously captured third images, and to record third position information of each captured third image;
a third target image searching submodule, configured to search the third images for a third target image whose boundary line coincides with the critical line;
a third edge position information determining submodule, configured to determine, according to the third position information of the third target image, third edge position information for viewing the projection image within the third preset range;
the window determining module comprising:
a second window determining submodule, configured to determine, according to a region enclosed by the first edge position information, the second edge position information and the third edge position information, a shooting range within which the projection image is captured, and to take the shooting range as the window of the projection image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810311530.6A CN108444448B (en) | 2018-04-09 | 2018-04-09 | Test method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108444448A true CN108444448A (en) | 2018-08-24 |
CN108444448B CN108444448B (en) | 2021-05-25 |
Family
ID=63199452
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810311530.6A Active CN108444448B (en) | 2018-04-09 | 2018-04-09 | Test method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108444448B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5731902A (en) * | 1996-08-19 | 1998-03-24 | Delco Electronics Corporation | Head-up display combiner binocular test fixture |
CN101166288A (en) * | 2006-10-17 | 2008-04-23 | 精工爱普生株式会社 | Calibration technique for heads up display system |
CN206132356U (en) * | 2016-09-18 | 2017-04-26 | 惠州市华阳多媒体电子有限公司 | HUD image test equipment |
CN106657979A (en) * | 2016-09-18 | 2017-05-10 | 惠州市华阳多媒体电子有限公司 | HUD image testing system and method |
Also Published As
Publication number | Publication date |
---|---|
CN108444448B (en) | 2021-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103501409B (en) | Ultrahigh resolution panorama speed dome AIO (All-In-One) system | |
CN101673395B (en) | Image mosaic method and image mosaic device | |
KR102198352B1 (en) | Apparatus and method for 3d image calibration in tiled display | |
CN102446048B (en) | Information processing device and information processing method | |
WO2022088103A1 (en) | Image calibration method and apparatus | |
US9696798B2 (en) | Eye gaze direction indicator | |
CN103188434B (en) | Method and device of image collection | |
CN107452031B (en) | Virtual ray tracking method and light field dynamic refocusing display system | |
CN109579868A (en) | The outer object localization method of vehicle, device and automobile | |
CN108848374B (en) | Display parameter measuring method and device, storage medium and measuring system | |
CN105847662A (en) | Moving object shooting method based on mobile terminal, and mobile terminal | |
CN111625091B (en) | Label overlapping method and device based on AR glasses | |
CN102523462B (en) | Method and device for rapidly acquiring elemental image array based on camera array | |
EP3496041A1 (en) | Method and apparatus for estimating parameter of virtual screen | |
CN106204431A (en) | The display packing of intelligent glasses and device | |
JP6126501B2 (en) | Camera installation simulator and its computer program | |
WO2016183954A1 (en) | Calculation method and apparatus for movement locus, and terminal | |
EP3062506B1 (en) | Image switching method and apparatus | |
CN115769592A (en) | Image acquisition method, image acquisition device, electronic device, and medium | |
CN113474810A (en) | Bending detection method and device of flexible screen and readable storage medium | |
US20230328400A1 (en) | Auxiliary focusing method, apparatus, and system | |
CN110458104B (en) | Human eye sight direction determining method and system of human eye sight detection system | |
US20180052584A1 (en) | 3d display ray principles and methods, zooming, and real-time demonstration | |
CN108444448A (en) | A kind of test method and device | |
CN116978010A (en) | Image labeling method and device, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||