CN112393881A - Vehicle-mounted W-shaped HUD visual range whole vehicle testing method - Google Patents
- Publication number
- CN112393881A (application CN202110065699.XA)
- Authority
- CN
- China
- Prior art keywords
- display
- manipulator
- hud
- visual
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M11/00—Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
Abstract
The invention provides a vehicle-mounted W-shaped HUD visible-range whole-vehicle testing method, which comprises the following steps. S1, connecting and fixing the test equipment: a HUD head-up display is arranged on a movable support of a test bench, the bottom end of a manipulator is fixed at the driver's position of the test bench, and a vision camera is arranged on the forearm end of the manipulator. S2, line connection: a computer is connected with a controller of the manipulator through a control line, the controller drives the manipulator to move, the vision camera is connected with the computer through a USB data line, and the vision camera transmits captured images to the computer through network communication. S3, calibrating the position of the vision camera: the computer adjusts the angle of the vision camera by controlling the manipulator so that the imaging center of the HUD head-up display is located at the center of the field of view of the vision camera. S4, calculating and analyzing the image of the HUD head-up display. The method can acquire, calculate and analyze the image of the HUD head-up display in a short time and can accurately measure the visible range of the HUD head-up display.
Description
Technical Field
The invention belongs to the field of automobiles, and particularly relates to a whole-vehicle testing method for the visible range of a vehicle-mounted W-shaped HUD.
Background
With the rapid development of the intelligent automobile industry, emerging human-computer interaction technologies such as the HUD head-up display are being widely applied in the automotive industry. A HUD head-up display can project vehicle-condition information such as speed, engine speed and coolant temperature onto the windshield. The quality of HUD head-up display products varies widely, and HUD legibility has an important influence on usability and, in turn, on driving safety. At present there is no relevant test evaluation standard or test technique: evaluation of such equipment relies mainly on subjective assessment or on simple, low-precision instruments, without professional, complete objective data for control. The uneven quality of human-computer interaction components is unfavorable to the high-end brand development of vehicle manufacturers. A complete set of whole-vehicle test equipment and a method for measuring the HUD visible range are therefore urgently needed.
Disclosure of Invention
In view of this, the invention aims to provide a vehicle-mounted W-shaped HUD visible-range whole-vehicle testing method, to solve the problem that no equipment currently exists for systematic, objective measurement of the HUD visible range.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a vehicle-mounted W-shaped HUD visual range whole vehicle testing method comprises the following steps:
S1, connecting and fixing the test equipment: a HUD head-up display is arranged on a movable support of a test bench, the bottom end of a manipulator is fixed at the driver's position of the test bench, and a vision camera is arranged on the forearm end of the manipulator;
S2, line connection: a computer is connected with a controller of the manipulator through a control line, the controller drives the manipulator to move, the vision camera is connected with the computer through a USB data line, and the vision camera transmits captured images to the computer through network communication;
S3, calibrating the position of the vision camera: the computer adjusts the angle of the vision camera by controlling the manipulator so that the imaging center of the HUD head-up display is located at the center of the field of view of the vision camera; this comprises the following steps:
S301, adjusting the posture of the manipulator so that the vision camera directly faces the HUD head-up display and, in the world coordinate system, the imaging center of the HUD head-up display and the center of the vision camera have equal Y-direction and Z-direction coordinates, so that the imaging center of the HUD head-up display is located at the center of the field of view of the vision camera; the spatial position point of the manipulator is recorded as the initial point Phome;
S302, gradually raising the manipulator along the Z-axis direction of the world coordinate system and finely adjusting the end posture so that the vision camera still faces the imaging center of the HUD head-up display, ensuring that the imaging center remains at the center of the camera's field of view; the spatial position coordinate points Pn of the manipulator at the different heights are recorded;
S4, calculating and analyzing the image of the HUD head-up display.
Further, step S4 is executed as follows:
S401, after the recording of the spatial coordinate points is completed, the spatial position coordinate points Pn at the different heights are taken in turn as rotation starting points, and the manipulator drives the vision camera to rotate clockwise about the X-axis direction of the world coordinate system; the vision camera records how the imaging center of the HUD head-up display changes within its field of view during the rotation, and when the image of the HUD head-up display becomes obviously darker and unclear in the camera's field of view, that point is judged to be a visible boundary point of the HUD head-up display imaging;
S402, a Solidworks model of the visible range of the HUD head-up display is built;
S403, the measurement of the visible-range data is completed, and imaging calculation and analysis of the HUD head-up display are performed.
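The darkening criterion in S401 ("obviously darkened and unclear") can be illustrated with a simple mean-brightness comparison. This is only a sketch: the patent specifies no numeric threshold, so the 50% drop ratio and the function name `is_visibility_boundary` are assumptions introduced here for illustration.

```python
import numpy as np

def is_visibility_boundary(frame_gray, reference_brightness, drop_ratio=0.5):
    """Judge whether the HUD image has 'obviously darkened' in the camera view.

    frame_gray: 2-D array of grayscale pixel values at the current camera pose.
    reference_brightness: mean brightness measured at the rotation start point Pn.
    drop_ratio: assumed cutoff; the patent only says the image becomes
    'obviously darkened and unclear', so 0.5 is a hypothetical choice.
    """
    current = float(frame_gray.mean())
    return current < drop_ratio * reference_brightness

# Simulated frames: a bright HUD image region and a dimmed one.
bright = np.full((100, 100), 200.0)   # mean brightness 200 at the start point
dim = np.full((100, 100), 80.0)       # mean brightness 80 near the boundary

ref = bright.mean()
print(is_visibility_boundary(bright, ref))  # False: still clearly visible
print(is_visibility_boundary(dim, ref))     # True: judged a boundary point
```

In practice the reference brightness would be sampled from the frame captured at the starting point Pn, and the threshold tuned against subjective judgments of "unclear".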
Compared with the prior art, the vehicle-mounted W-shaped HUD visible-range whole-vehicle testing method has the following advantages:
by fixedly connecting the HUD head-up display and the manipulator device to the test bench, the method can accurately measure the HUD visible range, remedying the current lack of equipment for systematic, objective measurement of the HUD visible range; HUD imaging can be acquired, calculated and analyzed in a short time, achieving the expected effect.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a diagram of various spatial positions of a robotic arm according to an embodiment of the present invention;
FIG. 2 is a diagram of a position of a motion process of a visual camera with P33 as a starting point according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a position of a motion process of a visual camera with P44 as a starting point according to an embodiment of the present invention;
FIG. 4 is a schematic view of a HUD head-up display according to an embodiment of the present invention;
FIG. 5 is a robot coordinate system according to an embodiment of the present invention.
Description of reference numerals:
1-test bench; 2-a manipulator; 3-a visual camera; 4-HUD head-up display; 5-movable support.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art through specific situations.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 1 to 5, a vehicle-mounted W-shaped HUD visual range vehicle test method includes the following steps:
S1, connecting and fixing the test equipment: a HUD head-up display 4 is arranged on a movable support 5 of a test bench 1, the bottom end of a manipulator 2 is fixed at the driver's position of the test bench 1, and a vision camera 3 is arranged at the forearm end of the manipulator 2;
S2, line connection: a computer is connected with a controller of the manipulator 2 through a control line, the controller drives the manipulator 2 to move, the vision camera 3 is connected with the computer through a USB data line, and the vision camera 3 transmits captured images to the computer through network communication;
S3, calibrating the position of the vision camera: the computer adjusts the angle of the vision camera 3 by controlling the manipulator 2 so that the imaging center of the HUD head-up display 4 is located at the center of the field of view of the vision camera 3; this comprises the following steps:
S301, adjusting the posture of the manipulator 2 so that the vision camera 3 directly faces the HUD head-up display 4 and, in the world coordinate system (whose origin is set at the imaging center of the HUD head-up display), the imaging center of the HUD head-up display 4 and the center of the vision camera 3 have equal Y-direction and Z-direction coordinates, so that the imaging center of the HUD head-up display 4 is located at the center of the field of view of the vision camera 3; the spatial position point of the manipulator 2 is recorded as the initial point Phome;
S302, gradually raising the manipulator 2 along the Z-axis direction of the world coordinate system in steps of 10 mm, with an allowed error of ±1.2 mm per step, and finely adjusting the end posture so that the vision camera 3 still faces the imaging center of the HUD head-up display 4, ensuring that the imaging center remains at the center of the camera's field of view; the spatial position coordinate points Pn of the manipulator 2 at the different heights are recorded in the robot coordinate system, whose origin is located at the center of the robot base and whose X, Y and Z axes have the same directions as the world coordinate system; taking the points after the 4th, 6th, 10th and 13th adjustments as examples and recording them as P11, P22, P33 and P44 respectively, the coordinate values of each point are shown in Table 1.
Table 1
| Point | Phome | P11 | P22 | P33 | P44 |
|---|---|---|---|---|---|
| X (mm) | -2.65 | -2.65 | -2.65 | -2.65 | -2.65 |
| Y (mm) | 0 | 0 | 0 | 0 | 0 |
| Z (mm) | 949.54 | 993.47 | 1015.59 | 1054.3 | 1088.55 |
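As a consistency check, the Z values recorded in Table 1 can be compared with the stated stepping procedure (a nominal rise of 10 mm per adjustment, ±1.2 mm allowed per step). The step counts 4, 6, 10 and 13 are taken from the description above; the sketch below simply verifies that each recorded rise is within tolerance:

```python
# Verify Table 1 against the stepping procedure: nominal 10 mm per
# adjustment, each step allowed an error of +/- 1.2 mm.
Z_HOME = 949.54  # Z coordinate of Phome in the robot coordinate system (mm)

points = {          # point: (number of adjustments, recorded Z in mm)
    "P11": (4, 993.47),
    "P22": (6, 1015.59),
    "P33": (10, 1054.30),
    "P44": (13, 1088.55),
}

for name, (steps, z) in points.items():
    rise = z - Z_HOME
    per_step_error = rise / steps - 10.0   # deviation from nominal 10 mm
    ok = abs(per_step_error) <= 1.2
    print(f"{name}: rise {rise:.2f} mm over {steps} steps, "
          f"mean step error {per_step_error:+.2f} mm, within tolerance: {ok}")
```

All four recorded heights fall within the stated ±1.2 mm per-step tolerance, which supports reading the table's Z column as cumulative 10 mm adjustments from Phome.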
S4, calculating and analyzing the image of the HUD head-up display 4.
Step S4 is executed as follows:
S401, after the spatial coordinate points have been recorded, the spatial position points Pn at the different heights are taken in turn as rotation starting points, and the manipulator 2 drives the vision camera 3 to rotate clockwise about the X-axis direction of the world coordinate system, with the vision camera 3 recording how the HUD head-up display 4 changes within its field of view during the rotation, taking the rotation from starting point P33 as an example;
S402, a Solidworks model of the visible range of the HUD head-up display 4 is built;
S403, the measurement of the visible-range data is completed, imaging calculation and analysis of the HUD head-up display 4 are performed, and the test angle of the visible range of the HUD head-up display 4 is calculated.
The test result when point P44 is taken as the test starting point is as follows: when the manipulator is located at the Phome point, the distance from the center of the vision camera lens to the imaging center point of the HUD head-up display is 612 mm, and the difference between position point P44 and Phome in the world-coordinate Z direction is 139.01 mm, so the visible angle of the HUD head-up display on the side of the view cone is:
visible angle of the HUD head-up display α = 2 × arctan(139.01 ÷ 612) ≈ 25.58°.
In the implementation of this scheme, starting from the position points Pn at different heights, the manipulator 2 drives the vision camera 3 to rotate about the line connecting the center of the HUD head-up display 4 and Phome as the rotation axis; as the rotation radius increases, the viewing angle increases continuously. By observing the video of the vision camera 3, when the motion starts from point P44 the HUD head-up display 4 appears obviously darker in the field of view of the vision camera 3 than in the previous test, and this point is judged to be the boundary point of the imaging visible range of the HUD head-up display 4.
Test calculation shows that when the manipulator 2 is located at the Phome point, the distance between the center point of the vision camera lens and the imaging center point of the HUD head-up display 4 is 612 mm, the difference between position point P44 and Phome in the world-coordinate Z direction is 139.01 mm, and the visible angle of the HUD head-up display 4 on the side of the view cone is 25.58 degrees.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (2)
1. A vehicle-mounted W-shaped HUD visible-range whole-vehicle testing method, characterized by comprising the following steps:
S1, connecting and fixing the test equipment: a HUD head-up display is arranged on a movable support of a test bench, the bottom end of a manipulator is fixed at the driver's position of the test bench, and a vision camera is arranged on the forearm end of the manipulator;
S2, line connection: a computer is connected with a controller of the manipulator through a control line, the controller drives the manipulator to move, the vision camera is connected with the computer through a USB data line, and the vision camera transmits captured images to the computer through network communication;
S3, calibrating the position of the vision camera: the computer adjusts the angle of the vision camera by controlling the manipulator so that the imaging center of the HUD head-up display is located at the center of the field of view of the vision camera; this comprises the following steps:
S301, adjusting the posture of the manipulator so that the vision camera directly faces the HUD head-up display and, in the world coordinate system, the imaging center of the HUD head-up display and the center of the vision camera have equal Y-direction and Z-direction coordinates, so that the imaging center of the HUD head-up display is located at the center of the field of view of the vision camera; the spatial position point of the manipulator is recorded as the initial point Phome;
S302, gradually raising the manipulator along the Z-axis direction of the world coordinate system and finely adjusting the end posture so that the vision camera still faces the imaging center of the HUD head-up display, ensuring that the imaging center remains at the center of the camera's field of view; the spatial position coordinate points Pn of the manipulator at the different heights are recorded;
S4, calculating and analyzing the image of the HUD head-up display.
2. The vehicle-mounted W-shaped HUD visible-range whole-vehicle testing method according to claim 1, characterized in that step S4 is executed as follows:
S401, after the recording of the spatial coordinate points is completed, the spatial position coordinate points Pn at the different heights are taken in turn as rotation starting points, and the manipulator drives the vision camera to rotate clockwise about the X-axis direction of the world coordinate system; the vision camera records how the imaging center of the HUD head-up display changes within its field of view during the rotation, and when the image of the HUD head-up display becomes obviously darker and unclear in the camera's field of view, that point is judged to be a visible boundary point of the HUD head-up display imaging;
S402, a Solidworks model of the visible range of the HUD head-up display is built;
S403, the measurement of the visible-range data is completed, and imaging calculation and analysis of the HUD head-up display are performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110065699.XA CN112393881A (en) | 2021-01-19 | 2021-01-19 | Vehicle-mounted W-shaped HUD visual range whole vehicle testing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110065699.XA CN112393881A (en) | 2021-01-19 | 2021-01-19 | Vehicle-mounted W-shaped HUD visual range whole vehicle testing method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112393881A (en) | 2021-02-23 |
Family
ID=74625318
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110065699.XA Pending CN112393881A (en) | 2021-01-19 | 2021-01-19 | Vehicle-mounted W-shaped HUD visual range whole vehicle testing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112393881A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2733602A1 (en) * | 1995-04-28 | 1996-10-31 | Sextant Avionique | Laser aligning device for head up display collimator of aircraft |
CN102681312A (en) * | 2011-03-16 | 2012-09-19 | 宏瞻科技股份有限公司 | Human eye safety protection system of laser projection system |
CN109581658A (en) * | 2017-09-29 | 2019-04-05 | 中华映管股份有限公司 | Head up display |
CN109791300A (en) * | 2016-10-24 | 2019-05-21 | 麦克赛尔株式会社 | Head-up display system |
CN109827756A (en) * | 2019-03-15 | 2019-05-31 | 延锋伟世通电子科技(上海)有限公司 | A kind of vehicle-mounted WHUD virtual image test macro and test method |
CN209559470U (en) * | 2019-03-15 | 2019-10-29 | 延锋伟世通电子科技(上海)有限公司 | A kind of vehicle-mounted WHUD virtual image test device |
2021-01-19: CN application CN202110065699.XA filed (published as CN112393881A, status Pending)
Non-Patent Citations (2)
Title |
---|
何焰蓝 (He Yanlan): "Technical Physics Experiments" (《技术物理实验》), 30 June 2009, National University of Defense Technology Press * |
张好运 (Zhang Haoyun): "Vehicle-level objective testing method for vehicle-mounted head-up displays" (《车载抬头显示器的整车级别客观测试方法》), China Science and Technology Information (《中国科技信息》) * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10508907B2 (en) | Wheel aligner with advanced diagnostics and no-stop positioning | |
CN100415460C (en) | Robot system | |
AU732066B2 (en) | Apparatus and method for determining axial stability | |
CN109798855B (en) | Calibration method and system of machine tool | |
CN109188449B (en) | Method and system for positioning position of charging port | |
CN112393881A (en) | Vehicle-mounted W-shaped HUD visual range whole vehicle testing method | |
CN113513999B (en) | Large-view-field real-time deformation measurement system and method for static test of spacecraft structure | |
CN212692809U (en) | Camera assembly and wheel positioning system | |
JP3095463B2 (en) | 3D measurement method | |
JP2009258098A (en) | Lens measuring apparatus, method for measuring lens, and method for manufacturing lens | |
CN112082482A (en) | Visual positioning method for object with edge characteristic only, application and precision evaluation method | |
CN114279303B (en) | Device and method for detecting verticality of double-sided micro-cylindrical lens array | |
CN113670203B (en) | Threaded hole center positioning device and method combining PNP algorithm with contact measurement | |
CN100398990C (en) | Revolving type spherometer, and measuring method | |
CN112082483B (en) | Positioning method and application of workpiece with edge characteristics only and precision evaluation method | |
CN113375907A (en) | Performance test system of high-precision six-degree-of-freedom optical assembly | |
CN114234887A (en) | Free-form surface lens curvature measuring device and application thereof | |
CN218156867U (en) | Finished automobile-based offline AR-HUD detection equipment | |
CN112937448A (en) | Method, device and equipment for adjusting position of sight glass and storage medium | |
IL288303B (en) | Method for a stereo camera system for detecting divergences from a calibration, and stereo camera system therefor | |
CN109062138A (en) | A kind of five shaft platform system calibrating schemes based on stereo calibration block | |
CN115284079B (en) | Magnetorheological polishing calibration method | |
CN219854654U (en) | Mechanical arm repeated positioning precision testing device and system | |
EP3889542B1 (en) | Method for estimating normal vector to die and/or normal vector to attachment | |
CN216482847U (en) | Glass automatic checkout device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20210223 |