CN112733633B - Method for predicting eye position of driver of high-power wheeled tractor - Google Patents


Info

Publication number: CN112733633B (application CN202011585120.4A)
Authority: CN (China)
Prior art keywords: driver; tractor; eye position; point; height
Legal status: Active
Inventors: Jin Xiaoping (金晓萍), Jiang Jianhua (蒋建华), Chen Rui (陈瑞), Sun Houjie (孙厚杰)
Current assignee: China Agricultural University
Original assignee: China Agricultural University
Application filed by China Agricultural University
Priority to CN202011585120.4A
Publication of application CN112733633A (in Chinese)
Application granted; publication of CN112733633B


Classifications

    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • G06F30/15 Geometric CAD: vehicle, aircraft or watercraft design
    • G06F30/20 CAD: design optimisation, verification or simulation
    • G06V40/165 Human face detection; localisation; normalisation using facial parts and geometric relationships
    • G06V40/171 Face feature extraction: local features and components; facial parts
    • Y02T10/40 Engine management systems (climate change mitigation technologies for transportation)


Abstract

The invention discloses a method for predicting the eye position of a driver of a high-power wheeled tractor, belonging to the field of applied ergonomics. First, the eye position distribution of tractor drivers is described quantitatively through an eye position measurement experiment; the experimental data are then analysed and an eye position scatter diagram is drawn; candidate predictors are determined from the cab layout parameters and the driver anthropometric parameters; a tractor driver eye position prediction model based on the anthropometric parameters and the tractor cab layout is then established by multiple linear regression; finally, the prediction performance of the model is verified. The proposed method accurately establishes an eye position prediction model for drivers of high-power wheeled tractors, offers broad applicability and good prediction accuracy, and effectively addresses the difficulties of cab field-of-view design and driver field-of-view verification.

Description

Method for predicting eye position of driver of high-power wheeled tractor
Technical Field
The invention belongs to the technical field of applied ergonomics, and particularly relates to a method for predicting the eye position of a driver of a high-power wheeled tractor.
Background
The field of view is critical to safe driving, and the eye ellipse is the principal tool for cab field-of-view design and for layout design related to the field of view. An eye ellipse is the statistical distribution of the eye positions of drivers of different body sizes seated in a vehicle in a normal driving posture, so named because of its elliptical shape. Eye ellipses are widely used in the field-of-view design of cars and commercial trucks as the main tool for vision-related ergonomic design. By contrast, ergonomic methods are rarely applied in tractor field-of-view design, and no eye ellipse model currently exists for tractors. On the one hand, with the development of electronic technology, intelligent display devices are gradually being introduced into tractors, and a tractor eye position prediction model would provide a tool for the field-of-view design of such devices, which is of great significance for improving the tractor driver's visual performance and optimizing the cab layout. On the other hand, as a non-road vehicle, a tractor differs greatly from road vehicles in body structure, travelling road conditions and working environment, and the driver's direction of visual attention also differs greatly, so the eye position prediction models of other vehicle types cannot be applied directly to tractor field-of-view design. A method for building an eye position prediction model specific to tractor drivers is therefore needed.
Current high-power wheeled tractor cab field-of-view design lacks an eye position prediction model, which makes cab design inconvenient and driver field-of-view verification difficult. A method for predicting the eye position of a high-power wheeled tractor driver is therefore urgently needed, with which an eye ellipse model of the high-power tractor can be built and used in cab field-of-view design.
Disclosure of Invention
The invention aims to provide a method for predicting the eye position of a driver of a high-power wheeled tractor, characterized by comprising the following steps:
step one: adopting a photography method to perform an eye position measurement experiment of a driver of the high-power wheeled tractor, collecting front and rear position data and eye position images of a seat selected by the driver in the tractor, and quantitatively describing eye position distribution of the driver of the tractor;
step two: carrying out data processing on the tractor driver eyepoint information acquired through experiments, and drawing a scatter diagram of eye position distribution;
step three: determining candidate predictors from the cab layout parameters and the driver anthropometric parameters, and testing the correlation of the candidate predictors to avoid multicollinearity in the regression analysis, thereby selecting parameters that respectively characterize the tractor cab and the driver's body;
step four: establishing a tractor driver eye position prediction model based on the anthropometric parameters and the tractor cab layout by regression analysis, and finally building the high-power wheeled tractor driver eye position prediction model from the centre positions, directions and sizes of the side-view and front-view prediction models;
step five: and drawing a two-dimensional scatter diagram by using the experimentally collected eye position data to verify the prediction effect of the prediction model.
In step one, the eye position measurement experiment on high-power wheeled tractor drivers is performed by photography, specifically comprising the following steps:
erecting a side-view camera off the symmetry plane of the tractor cab and a front-view camera directly ahead of the steering wheel centre, so that images of the driver are captured simultaneously in the front-view and side-view directions;
measuring the driver's stature, weight and sitting height, marking the position of the AHP point, sticking a scale with the AHP point as its base point to the side of the seat, and recording the fore-aft position of the seat;
the subject enters the cab to simulate driving; a marker indicating the H point is moved to coincide with the H point; the scale reading of the seat fore-aft position selected by the driver relative to the AHP point is recorded; and side-view and front-view eye position pictures are taken after the driving posture has been held for 30 seconds.
In step two, the tractor driver eye point information collected in the experiment is processed, specifically comprising the following steps:
reading the fore-aft seat position, i.e. the actual X-direction distance of the H point from the AHP point; marking the eye position by the pupil coordinates and converting them to the actual distance of the eyes from the H point using the scale of the camera image; combining the seat position with the eye-to-H-point distance to obtain the actual distance of the eyes from the AHP point; and drawing a scatter diagram of the eye position distribution with the AHP point as origin.
The candidate predictors among the cab layout parameters in step three are the horizontal distance L11 from the steering wheel centre to the AHP point, the vertical distance H17 from the steering wheel centre to the AHP point, the vertical distance H30 from the seat position reference point SgRP to the AHP point, and the initial seat backrest angle A40; the fore-aft position of the seat is calibrated with the AHP point as reference. The layout parameters L11, H17, H30 and A40 of a given tractor cab are constants, so only the adjustable fore-aft seat position is treated as a variable. The candidate predictors among the driver anthropometric parameters are the driver's Stature, Weight, Sitting Height (SH), Body Mass Index (BMI) and sitting height to stature ratio (SH/S). Because the anthropometric parameters may be correlated with one another, and to avoid multicollinearity of the predictors in the regression analysis, BMI is replaced by its logarithm Ln(BMI), giving a measure with an approximately normal distribution. Correlation analysis of the measured anthropometric parameters in SPSS (SPSS Inc., USA) shows that the collinearity among Stature, SH/S and Ln(BMI) is not significant, so Stature, SH/S and Ln(BMI) are selected as the parameters characterizing the driver's body.
In step four, regression analysis is used: tractor and anthropometric characteristic parameters related to the driver's eye position are selected as independent variables, equations relating the selected parameters to the dependent variables are obtained by regression in the SPSS data analysis software, and the adjusted R-squared value is used to evaluate the goodness of fit of each equation.
In step five, a two-dimensional scatter diagram of the experimentally collected eye position data is drawn to verify the prediction effect of the prediction model.
The invention has the following beneficial effects: it establishes a driver eye ellipse model whose parameters differ from those of the automobile eye ellipse; it accurately models the eye position of high-power wheeled tractor drivers and can describe the eye position distributions of different tractors and driver populations; it can serve as a reference for building eye ellipse models of other tractors and driver groups; and it provides a tool for the field-of-view design of intelligent tractor display devices, improving the tractor driver's visual performance and effectively addressing the problems in tractor field-of-view design. The model offers broad applicability and good prediction accuracy, and is of great significance for cab layout optimization.
Drawings
FIG. 1 is a schematic diagram of a driver eye position prediction flow;
FIG. 2 is a schematic diagram of cab layout parameters and driver eye position distribution;
FIG. 3 is a side view eye position scatter plot from experimental data;
FIG. 4 is a graph comparing the eye ellipse drawn from the prediction model with the experimentally obtained scatter for a model 1254 tractor;
FIG. 5 is a graph comparing the eye ellipse drawn from the prediction model with the experimentally obtained scatter for a model 1354 tractor;
FIG. 6 is a graph comparing the eye ellipse drawn from the prediction model with the experimentally obtained scatter for a model 1804 tractor.
Detailed Description
The invention provides a method for predicting the eye position of a driver of a high-power wheeled tractor. The invention is described in detail below with reference to the drawings and examples.
Example 1
FIG. 1 is a schematic diagram of a driver eye position prediction process; the method for predicting the eye position of the driver of the high-power wheeled tractor comprises the following steps:
step one: the eye positions of tractor drivers are sampled by photography; two cameras, one in front of and one to the side of the driver, operate simultaneously to obtain the fore-aft position of the seat selected by the driver in the tractor together with eye position images, quantitatively describing the eye position distribution of tractor drivers;
step two: carrying out data processing on the tractor driver eyepoint information acquired through experiments, and drawing a scatter diagram of eye position distribution;
step three: through correlation tests on the candidate predictors among the cab layout parameters and the driver anthropometric parameters, the parameters characterizing the tractor cab and the driver's body are selected;
step four: a high-power wheeled tractor driver eye position prediction model based on the anthropometric parameters and the tractor cab layout is established by regression analysis; the centre position and the direction and size of each axis are predicted in the side-view and front-view directions, finally yielding the tractor driver eye ellipse model.
Step five: and drawing a two-dimensional graph by using the experimentally collected eye position data to verify the prediction effect of the prediction model.
The first step comprises the following steps:
three types of high-power wheeled tractors 1254, 1354 and 1804 are selected as experimental equipment, wherein the cab layout is similar, and the size difference among the types is large.
A total of 180 male drivers aged 18 to 60, taller than 150 cm and holding a C1 driving licence, were enrolled and divided into three groups of 60 subjects according to tractor model. Eye position was measured in the tractor cab; owing to limited conditions, the subjects were only required to maintain a comfortable driving posture and to simulate driving by looking at a reasonable area ahead.
Fig. 2 shows the cab layout parameters and the driver eye position distribution. The experimental scene was arranged as follows: a front-view camera was erected 5000 mm in front of the centre of the tractor steering wheel and a side-view camera 1500 mm from the symmetry plane of the cab, at a height level with the eyes of a 50th-percentile male seated in the driving position, to capture images of the driver simultaneously in the front-view and side-view directions. The position of the accelerator heel point (AHP) was marked, and a scale with the AHP point as its base point was stuck to the side of the seat to record the fore-aft position of the seat.
The driver's stature (wearing shoes), weight and sitting height were measured in the experiment. Before each trial the tractor seat was returned to its initial position, i.e. at its rearmost travel with the backrest at the preset angle. The subject then entered the cab to simulate driving: the right foot rested lightly on the accelerator pedal with the heel in contact with the floor, the left foot was placed on the floor, the back rested against the backrest, the fore-aft seat position was adjusted to a comfortable driving posture, both hands held the steering wheel at its horizontal centreline, and the seat belt was fastened.
With the subject seated, a marker indicating the H point was moved to coincide with the H point, the scale reading of the seat fore-aft position selected by the driver relative to the AHP point was recorded, and side-view and front-view eye position pictures were taken after the driving posture had been held for 30 seconds.
The tractor was then changed and the above steps repeated to complete data acquisition for the other two tractor models.
The process of the second step is as follows:
First, the fore-aft seat position, i.e. the actual X-direction distance of the H point from the AHP point, is obtained from the recorded scale readings. Next, the eye position is marked by the pupil coordinates and converted to the actual distance of the eyes from the H point using the scale of the camera image. Finally, the seat position and the eye-to-H-point distance are combined to obtain the actual distance of the eyes from the AHP point, and a scatter diagram of the eye position distribution is drawn with the AHP point as origin (shown in fig. 3).
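The coordinate conversion just described (pixel coordinates scaled to millimetres, then referenced to the AHP point) can be sketched as follows. This is an illustrative sketch only: the function name, the image scale and all numeric inputs are hypothetical, not values from the experiment, and the image coordinates are assumed to be already oriented with X forward and Z upward.

```python
def eye_position_re_ahp(eye_px, h_point_px, mm_per_px, seat_x_re_ahp_mm, h_point_z_mm):
    """Convert pupil pixel coordinates from a side-view image into
    millimetre coordinates relative to the AHP point.

    eye_px, h_point_px : (x, z) pixel coordinates in the image, z upward
    mm_per_px          : image scale taken from a reference scale in the frame
    seat_x_re_ahp_mm   : recorded fore-aft seat (H point) position relative to AHP
    h_point_z_mm       : height of the H point above the AHP point
    """
    # Eye position relative to the H point, scaled to real-world units.
    dx_mm = (eye_px[0] - h_point_px[0]) * mm_per_px
    dz_mm = (eye_px[1] - h_point_px[1]) * mm_per_px
    # Combine with the seat position so everything is referenced to the AHP point.
    return seat_x_re_ahp_mm + dx_mm, h_point_z_mm + dz_mm

# Hypothetical example: eye 120 px forward of and 310 px above the H-point marker,
# at a scale of 1.25 mm per pixel, with the seat 540 mm behind the AHP point.
x, z = eye_position_re_ahp((820, 570), (700, 260), 1.25, 540.0, 350.0)
```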
The process of the third step is as follows:
the candidate predictors of the cab layout parameters are a horizontal distance (L11) from the center of the steering wheel to the AHP point, a vertical distance (H17) from the center of the steering wheel to the AHP point, a vertical distance (H30) from the seat position reference point SgRP to the AHP point, and a seat initial back angle (a 40), and the front-rear position of the seat is calibrated based on the AHP point. Table 1 shows the main layout parameters of the test tractor cab, and is constant, so only the seat front-rear position is considered to be adjusted to the variation.
Table 1 major layout parameters for tractor cab
In Table 1, the three test tractors share the same value of H17, so H17 is not considered in the subsequent analysis, and L11, H30 and A40 are selected as the parameters characterizing the tractor cab.
The candidate predictors among the driver anthropometric parameters are the driver's Stature, Weight, Sitting Height (SH), Body Mass Index (BMI) and sitting height to stature ratio (SH/S); Table 2 gives the statistics of the measured driver anthropometric parameters.
Table 2 statistics of driver anthropometric parameters (n=180)
Because the driver's body parameters may be correlated with one another, and to avoid multicollinearity of the predictors in the regression analysis, BMI is replaced by its logarithm Ln(BMI) to obtain a measure with an approximately normal distribution. Correlation analysis of the measured anthropometric parameters in SPSS (SPSS Inc., USA), with results shown in Table 3, indicates that the collinearity among Stature, SH/S and Ln(BMI) is not significant, so Stature, SH/S and Ln(BMI) are selected as the parameters characterizing the driver's body.
TABLE 3 correlation matrix of candidate predictors
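The multicollinearity screening above was done in SPSS. As an illustrative sketch only, using synthetic driver data rather than the experimental sample (all numbers below are made up for the example), the Ln(BMI) transform and a Pearson correlation check can be written as:

```python
import math
import random
import statistics

random.seed(0)
# Synthetic stand-ins for the measured sample (the experiment used n = 180).
stature = [random.gauss(1700, 60) for _ in range(180)]    # mm
sh_ratio = [random.gauss(0.53, 0.01) for _ in range(180)]  # sitting height / stature
bmi = [random.gauss(24, 3) for _ in range(180)]            # kg/m^2
ln_bmi = [math.log(b) for b in bmi]                        # log transform, near-normal

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A candidate predictor pair is kept only if its pairwise correlation is weak.
r = pearson(stature, ln_bmi)
```

The design choice mirrors the text: transform BMI to Ln(BMI) first, then inspect the pairwise correlation matrix before admitting predictors into the regression.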
The process of the fourth step is as follows:
and establishing an eye position prediction model by using a regression analysis method in statistics, and finally establishing an eye position prediction model of the driver of the high-power wheeled tractor from the central positions, directions and sizes of the side view and front view direction prediction models.
(1) Prediction of side view direction model
First, the prediction of the center position is performed:
With Stature, Ln(BMI), SH/S, L11, H30 and A40 as predictor variables and the fore-aft driver seat position X_seat as the dependent variable, the regression parameters of the prediction model are obtained as shown in Table 4.
TABLE 4 regression coefficients of driver seat front-rear position prediction models
Table 4 shows that the adjusted R-squared is 0.400, indicating acceptable goodness of fit. The regression equation is given in formula (1):
X_seat = 928.31 + 0.139Stature + 10.451Ln(BMI) - 418.785SH/S - 25.69A40 (1)
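Formula (1) can be evaluated directly from a driver's measurements. The coefficients below are those of formula (1); the driver values (stature, BMI, sitting-height ratio, backrest angle) are hypothetical inputs chosen only to illustrate the units, not measurements from the study.

```python
import math

def seat_x_position(stature_mm, bmi, sh_over_s, a40_deg):
    """Fore-aft seat position X_seat from formula (1) of the text."""
    return (928.31
            + 0.139 * stature_mm
            + 10.451 * math.log(bmi)       # Ln(BMI)
            - 418.785 * sh_over_s          # sitting height / stature ratio
            - 25.69 * a40_deg)             # initial backrest angle, degrees

# Hypothetical driver: 1700 mm tall, BMI 24, SH/S 0.53, backrest angle 10 degrees.
x_seat = seat_x_position(1700.0, 24.0, 0.53, 10.0)
```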
With Stature, Ln(BMI), SH/S, L11, H30 and A40 as predictor variables and the seat H point as reference, the regression coefficients of the models of the driver's eye position in the X and Z directions are obtained as shown in Tables 5 and 6.
TABLE 5 regression coefficients of the driver X-direction eye position prediction model
TABLE 6 regression coefficients for a driver Z-direction eye position prediction model
According to tables 5 and 6, regression equations for obtaining the eye positions of the driver in the X direction and the Z direction are shown in the formulas (2) and (3):
X_eyeReH = -177.722 - 0.025Stature + 48.215Ln(BMI) - 327.779SH/S + 14.998A40 (2)
Z_eyeReH = -1377.985 + 0.307Stature - 18.629Ln(BMI) + 368.311SH/S + 60.513A40 (3)
The X-direction eye position relative to the AHP point is obtained by adding formulas (1) and (2), and the Z-direction eye position by adding formula (3) to the height H_SgRP of the seat position reference point SgRP, giving the following prediction model of eye position relative to the AHP point:
X_ReAHP = 1106.032 + 0.164Stature - 37.764Ln(BMI) - 91.006SH/S - 40.688A40 (4)
Z_ReAHP = -1377.985 + 0.307Stature - 18.629Ln(BMI) + 368.311SH/S + 60.513A40 + H_SgRP (5)
Next, the inclination angle of the principal axis is predicted:
the principal component analyzes the side view position data, establishes a covariance matrix of the eye position relative to the H point abscissa and the H point ordinate, calculates a first eigenvector of the covariance matrix by Matlab, and calculates an inclination angle of a main shaft in the side view.
Taking the model 1254 tractor as an example, the Matlab commands for calculating the eigenvalues and eigenvectors of the covariance matrix are as follows:
>> a = [1381.973 -196.446; -196.446 662.613];
>> [V, D] = eig(a)
V =
   -0.247354082078766  -0.968925156077068
   -0.968925156077068   0.247354082078766
D =
   1.0e+03 *
   0.612462872610615                   0
                   0   1.432123127389386
The maximum eigenvalue is 1.432×10³ and the corresponding eigenvector is (-0.969, 0.247); formula (6) gives the X-axis inclination angle:
arctan[0.247/(-0.969)]=-14.3° (6)
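The Matlab computation above can be reproduced with the closed-form eigendecomposition of a symmetric 2×2 matrix; this is an equivalent sketch in Python, not part of the original method, and assumes a nonzero off-diagonal covariance (true for the data shown).

```python
import math

def principal_axis_angle(sxx, sxz, szz):
    """Largest eigenvalue and major-axis inclination angle (degrees) of the
    symmetric 2x2 covariance matrix [[sxx, sxz], [sxz, szz]].
    Assumes sxz != 0."""
    mean = (sxx + szz) / 2.0
    radius = math.hypot((sxx - szz) / 2.0, sxz)
    lam_max = mean + radius                 # largest eigenvalue
    vx, vz = sxz, lam_max - sxx             # its eigenvector, up to scale
    # An axis direction is sign-ambiguous, so report the angle in (-90, 90].
    return lam_max, math.degrees(math.atan(vz / vx))

# Covariance matrix of the model 1254 tractor given in the text.
lam, angle = principal_axis_angle(1381.973, -196.446, 662.613)
# lam is about 1.432e3 and angle about -14.3 degrees, matching formula (6).
```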
The covariance matrices, first eigenvectors and X-axis inclination angles of the three tractors are calculated in the same way; the X-axis inclination predictions are shown in Table 7.
Table 7 X-axis inclination angle predictions
Finally, the X-axis and Z-axis lengths are predicted:
the axial length of the prediction model in the side view direction is provided with an X-axis axial length and a Z-axis axial length, the X-axis axial length and the Z-axis axial length are mainly influenced by human body measurement parameters, and the correlation with cab layout parameters is smaller, so that the size of the prediction model of each tractor cab is the same. And selecting a set of regression equations with the axial length range capable of describing the relation between the characteristic parameters of the driver and eyes to the maximum extent, as shown in formulas (7) and (8).
X_ReH = -53.565 + 0.023Stature - 94.547Ln(BMI) + 1068.981SH/S, RMSE = 49.036 (7)
Z_ReH = -350.106 + 0.479Stature + 6.174Ln(BMI) + 676.524SH/S, RMSE = 28.009 (8)
where RMSE is the root mean square error.
Taking the X axis as an example, the specific calculation process of the axial length is as follows:
and (3) calculating the standard deviation of the eye position of the driver along the X axis according to a regression equation shown in the formula (9).
Then the X-axis length covering 95% of eye positions is calculated: let X_1 and X_2 be two cut-off points on the X axis of the prediction model such that the area under the cumulative distribution function of the standard normal distribution between X_1 and X_2 is 0.90, and calculate the X-axis length accordingly, where Φ is the cumulative standard normal distribution and X_centroid is the X coordinate of the centre of the prediction model.
The Z-axis length is calculated in the same way, substituting the corresponding regression coefficients.
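The 95% axis-length computation described above can be sketched with the standard library's NormalDist. The cut-off convention follows the text (central area 0.90 for a 95% ellipse); the σ value of 49.0 mm used in the example is an assumed illustration, not a result from the study.

```python
from statistics import NormalDist

def ellipse_axis_length(sigma, coverage=0.95):
    """Axis length of a `coverage` eye ellipse along one direction.

    The two cut-off points X1 and X2 are placed symmetrically about the
    centroid so that the central area of the standard normal distribution
    between them is 2*coverage - 1 (0.90 for a 95% ellipse).
    """
    z = NormalDist().inv_cdf(coverage)  # about 1.645 for coverage = 0.95
    return 2.0 * z * sigma

# Assumed X-direction standard deviation of 49.0 mm (illustrative value only).
x_axis_len = ellipse_axis_length(49.0)
```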
(2) Front view direction prediction model
The front-view prediction model is constructed from the Y and Z coordinates of the driver's eye position. Correlation analysis shows that the correlation between the Y and Z coordinates of the eye position in the front view is not significant; the Z coordinate and Z-axis length in the front view are the same as in the side view; and neither the driver anthropometric parameters nor the cab layout parameters affect the Y coordinate of the front-view prediction model. Taking the driver's median plane as the base plane, the Y coordinates of the centroids of the driver's left-eye and right-eye prediction models are mirror images, with mean value μ_y = ±27.09 mm and standard deviation σ_y = 25.38 mm.
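Using the reported front-view parameters μ_y = ±27.09 mm and σ_y = 25.38 mm, the Y extents of the left- and right-eye models can be sketched under the same cut-off convention as the side view; treating each eye's Y position as an independent normal distribution is this sketch's assumption, not a claim of the text.

```python
from statistics import NormalDist

# Reported front-view parameters: eye-centroid offset from the median plane
# and the Y-direction standard deviation.
MU_Y = 27.09      # mm
SIGMA_Y = 25.38   # mm

def front_view_y_extents(coverage=0.95):
    """Y-axis extents (min, max) of the left and right front-view models,
    centred at -MU_Y and +MU_Y about the driver's median plane."""
    half = NormalDist().inv_cdf(coverage) * SIGMA_Y
    return (-MU_Y - half, -MU_Y + half), (MU_Y - half, MU_Y + half)

left, right = front_view_y_extents()
```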
(3) Prediction result analysis
The tractor driver eye position prediction model describes the population eye position distribution determined by the anthropometric parameters and the tractor cab layout parameters: the tractor layout parameters influence the centroid position and principal-axis inclination of the prediction model, while the size of the model is mainly related to the anthropometric characteristic parameters. Substituting the data into the series of mathematical models established above yields the prediction results of the high-power wheeled tractor driver eye position prediction model shown in Table 8.
Table 8 prediction results of model
The fifth step comprises the following steps:
and (3) drawing an eye ellipse model of the three types of tractors according to the prediction result of the eye position prediction model of the high-power wheeled tractor driver obtained in the step (IV), and comparing the eye ellipse model with experimental data obtained in the step (II), as shown in fig. 5. Finally, the prediction effects of the model 1254, 1354 and 1804 tractor eye ellipse models are 82.3%, 81.7% and 80% respectively; the X-axis inclination angle is horizontal or forwards and upwards inclined, and is greatly different from the X-axis inclination angle of the automobile eye ellipse (shown in fig. 4, 5 and 6).
The preferred embodiments of the invention have been described in detail above with reference to the drawings, but the invention is not limited to the specific details of these embodiments: within the scope of its technical idea, the driver's eye position can be predicted for tractor models other than the three described in the specific embodiments.

Claims (1)

1. The method for predicting the eye position of the driver of the high-power wheeled tractor is characterized by comprising the following steps of:
step one: an eye position measurement experiment for drivers of a high-power wheeled tractor is carried out by a photographic method; the fore-aft position of the seat selected by each driver and eye position images are collected in the tractor, and the eye position distribution of tractor drivers is described quantitatively; specifically: a side-view camera is erected in the symmetry plane of the tractor cab and a front-view camera directly ahead of the steering wheel centre, so that images of the driver are captured simultaneously in the front-view and side-view directions; the height, weight and sitting height of each driver are measured, the position of the AHP point is marked, a scale with the AHP point as its base point is glued to the side of the seat, and the fore-aft position of the seat is recorded; the subject enters the cab and simulates driving, a marker indicating the H point is moved until it coincides with the H point, the scale reading of the fore-aft seat position selected by the driver relative to the AHP point is recorded, and side-view and front-view eye position photographs are taken after the driving posture has been held for 30 seconds;
step two: the tractor driver eye point information acquired in the experiment is processed and a scatter diagram of the eye position distribution is drawn; specifically: the actual distance of the H point relative to the AHP point in the X direction, i.e. the fore-aft seat position, is read; the eye position is marked by the pupil coordinates, and the actual distance of the eye relative to the H point is obtained by conversion according to the scale of the image captured by the camera; the seat position and the distance of the eye relative to the H point are then combined to obtain the actual distance of the eye relative to the AHP point; and a scatter diagram of the eye position distribution is drawn with the AHP point as the origin;
step three: candidate predictors among the cab layout parameters and the driver anthropometric parameters are determined, and a correlation check of the candidate predictors is performed to avoid multicollinearity of the predictors in the regression analysis, so that parameters representing the tractor cab characteristics and the driver body characteristics are selected respectively; the candidate predictors among the cab layout parameters are the horizontal distance L11 from the steering wheel centre to the AHP point, the vertical distance H17 from the steering wheel centre to the AHP point, the vertical distance H30 from the seating reference point SgRP to the AHP point, and the initial seat-back angle A40; the fore-aft seat position is calibrated with the AHP point as reference, and since the cab layout parameters L11, H17, H30 and A40 of a given tractor are constant, only the adjustable fore-aft seat position is treated as a variable; the candidate predictors among the driver anthropometric parameters are the driver's height, weight, sitting height, body-mass index BMI and sitting-height-to-height ratio; because the driver's body parameters are mutually correlated, and to avoid multicollinearity of the predictors in the regression analysis, the body-mass index BMI is replaced by its logarithmic form Ln(BMI), giving an approximately normally distributed measure; correlation analysis of the obtained driver anthropometric parameters with SPSS software shows that the collinearity among height, the sitting-height-to-height ratio and the logarithmic body-mass index Ln(BMI) is not significant, so height, the sitting-height-to-height ratio and Ln(BMI) are selected as the parameters representing the human body characteristics;
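The Ln(BMI) transform and the pairwise correlation check of step three can be illustrated with a small sketch; the driver measurements below are hypothetical, and Pearson correlation is computed directly rather than with SPSS:

```python
import math

drivers = [
    # (height cm, weight kg, sitting height cm) -- hypothetical measurements
    (175, 70, 92), (182, 85, 95), (168, 62, 88), (178, 90, 93), (171, 66, 90),
]

def ln_bmi(height_cm, weight_kg):
    """Logarithm of the body-mass index, used instead of BMI itself to obtain
    an approximately normally distributed predictor."""
    h_m = height_cm / 100.0
    return math.log(weight_kg / h_m ** 2)

def pearson(xs, ys):
    """Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

heights = [d[0] for d in drivers]
lnbmi = [ln_bmi(d[0], d[1]) for d in drivers]
ratio = [d[2] / d[0] for d in drivers]  # sitting-height-to-height ratio

# a low |r| between retained predictors indicates little collinearity
r = pearson(heights, lnbmi)
```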
step four: a tractor driver eye position prediction model based on the anthropometric parameters and the tractor cab layout is established by regression analysis, and the high-power wheeled tractor driver eye position prediction model is finally built from the centre position, direction and size of the side-view and front-view prediction models; specifically: the tractor and anthropometric characteristic parameters related to driver eye position prediction are selected as independent variables, the equation relating the selected parameters to the dependent variables is obtained by regression with SPSS data analysis software, and the goodness of fit of the equation is evaluated by the adjusted R-squared value;
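A minimal single-predictor sketch of the regression and adjusted R-squared evaluation in step four, using hypothetical stature and eye-height data rather than the patent's measured values:

```python
def fit_simple_regression(xs, ys):
    """Ordinary least squares for y = b0 + b1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

def adjusted_r2(xs, ys, b0, b1, k=1):
    """Adjusted R-squared for a fitted model with k predictors."""
    n = len(ys)
    my = sum(ys) / n
    ss_res = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Hypothetical data: driver stature (cm) vs. eye-point height Z (mm)
stature = [165, 170, 175, 180, 185]
eye_z = [700, 720, 745, 760, 785]
b0, b1 = fit_simple_regression(stature, eye_z)
adj = adjusted_r2(stature, eye_z, b0, b1)
```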
step five: a two-dimensional scatter diagram is drawn with the eye position data acquired in the experiment to verify the prediction effect of the prediction model; specifically: the tractor eye ellipse model is drawn from the prediction results of the high-power wheeled tractor driver eye position prediction model and compared with the experimental data obtained in step two.
CN202011585120.4A 2020-12-28 2020-12-28 Method for predicting eye position of driver of high-power wheeled tractor Active CN112733633B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011585120.4A CN112733633B (en) 2020-12-28 2020-12-28 Method for predicting eye position of driver of high-power wheeled tractor


Publications (2)

Publication Number Publication Date
CN112733633A CN112733633A (en) 2021-04-30
CN112733633B true CN112733633B (en) 2023-07-28

Family

ID=75606969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011585120.4A Active CN112733633B (en) 2020-12-28 2020-12-28 Method for predicting eye position of driver of high-power wheeled tractor

Country Status (1)

Country Link
CN (1) CN112733633B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006163900A (en) * 2004-12-08 2006-06-22 Nissan Motor Co Ltd Driver-monitoring system and processing method therefor
KR20180012603A (en) * 2016-07-27 2018-02-06 현대자동차주식회사 Apparatus and method for controlling driving posture of vehicle
SE1830358A1 (en) * 2018-12-12 2020-06-13 Innovationcare Alcolock device using mapping gaze and motion parameters
CN111666634A (en) * 2020-06-12 2020-09-15 吉林大学 Method for establishing ellipse of driver's eye based on human motion simulation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10357195B2 (en) * 2017-08-01 2019-07-23 Panasonic Intellectual Property Management Co., Ltd. Pupillometry and sensor fusion for monitoring and predicting a vehicle operator's condition



Similar Documents

Publication Publication Date Title
CN107390205B (en) A kind of monocular vision vehicle odometry method obtaining front truck feature using car networking
Reed et al. A statistical method for predicting automobile driving posture
CN103927754B (en) A kind of scaling method of vehicle-mounted vidicon
CN102975718B (en) In order to determine that vehicle driver is to method, system expected from object state and the computer-readable medium including computer program
CN109263652B (en) Method for measuring and checking front visual field of driver
CN101253385A (en) Distortion evaluating apparatus and distortion evaluating method
DE112018006164B4 (en) VISUAL DIRECTION CALIBRATION DEVICE, VISUAL DIRECTION CALIBRATION PROCEDURE, AND VISUAL DIRECTION CALIBRATION PROGRAM
CN112815907B (en) Vehicle wading monitoring method, device and system, computer equipment and storage medium
CN111735385A (en) Method for determining seat reference point in reverse engineering of competitive product vehicle
Reed et al. A new approach to modeling driver reach
CN110555885A (en) calibration method and device of vehicle-mounted camera and terminal
CN109345591B (en) Vehicle posture detection method and device
CN110239556A (en) A kind of driver manipulates ability cognitive method immediately
CN112733633B (en) Method for predicting eye position of driver of high-power wheeled tractor
CN105737743A (en) Measurement method for dummy H point displacement in impact test based on photographed image analysis
US8098887B2 (en) Face tracking device
CN112598212B (en) Driving visual field evaluation system combining virtuality and reality
CN108256487B (en) Driving state detection device and method based on reverse dual-purpose
CN105389464B (en) A kind of method and apparatus calculating driver's spatial impression parameter
Ziraknejad et al. The effect of Time-of-Flight camera integration time on vehicle driver head pose tracking accuracy
CN114926729A (en) High-risk road section identification system and method based on driving video
CN114880804A (en) Data-driven trailer motor home frame structure safety monitoring method
CN113044045A (en) Self-adaptive adjustment method for seats in intelligent cockpit
CN116659436B (en) Measurement platform for human body driving gesture acquisition and debugging method
CN106845404B (en) Foot posture testing system and foot posture testing method for automobile driver and auxiliary testing shoes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant