WO2021020385A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2021020385A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
pitch angle
display control
control device
Application number
PCT/JP2020/028877
Other languages
French (fr)
Japanese (ja)
Inventor
泉樹 立入
大祐 竹森
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 株式会社デンソー
Publication of WO2021020385A1 publication Critical patent/WO2021020385A1/en

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 - Arrangement of adaptations of instruments
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • The present disclosure relates to a technique for controlling display on an in-vehicle display.
  • A head-up display having an AR function that superimposes a virtual image on an object in the scenery outside the vehicle, seen through the windshield, is known (hereinafter, AR-HUD).
  • AR is an abbreviation for Augmented Reality.
  • HUD is an abbreviation for Head Up Display.
  • An AR-HUD has the problem that the position at which the virtual image is superimposed on the scenery shifts as the pitch angle and roll angle of the vehicle change.
  • In the prior art, a technique is disclosed that estimates the vehicle posture using an angle sensor and a roll sensor and, based on the estimated posture, adjusts the angle of the mirror that projects the image onto the windshield and the display position within the image, thereby suppressing the positional deviation of the virtual image.
  • One aspect of the present disclosure is to provide a technique for accurately suppressing, by a simple method, the positional deviation of a virtual image superimposed on an object.
  • One aspect of the present disclosure is a display control device that includes an information generation unit, a vehicle height acquisition unit, an acceleration acquisition unit, a characteristic storage unit, a correction amount calculation unit, a pitch angle calculation unit, an eye information acquisition unit, and a projection correction unit.
  • The information generation unit acquires information on an object present in the scenery visually recognized by the driver through the projection area, and sets the projection position, within the projection area, of a superimposed image overlaid on the object according to the object's three-dimensional position information.
  • The projection area is the region of the vehicle's front window onto which an image to be recognized by the driver as a virtual image is projected.
  • The vehicle height acquisition unit acquires a vehicle height detection value indicating the displacement of the vehicle's height.
  • The acceleration acquisition unit acquires the lateral acceleration, that is, the acceleration applied in the vehicle width direction.
  • The characteristic storage unit stores conversion information representing the correlation between the lateral acceleration and the displacement of the vehicle height detection value.
  • The correction amount calculation unit calculates a correction amount for the vehicle height detection value using the lateral acceleration acquired by the acceleration acquisition unit and the conversion information stored in the characteristic storage unit.
  • The pitch angle calculation unit calculates the pitch angle, the inclination angle of the vehicle in the front-rear direction, using the vehicle height detection value acquired by the vehicle height acquisition unit corrected by the correction amount calculated by the correction amount calculation unit.
  • The eye information acquisition unit acquires eye information, which is the detected amount of deviation of the driver's eye position from a reference position.
  • The projection correction unit corrects the projection position of the superimposed image set by the information generation unit according to both the pitch angle and the eye information, so that the superimposed image visually recognized by the driver is displayed superimposed on the object.
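To make the claimed decomposition concrete, the following is a minimal interface sketch of these units; all names and signatures are illustrative assumptions, since the disclosure defines the units as functional blocks rather than as an API.

```python
# Hypothetical interface for the claimed units (illustrative only).
from typing import Protocol

class DisplayControlUnits(Protocol):
    def acquire_height(self) -> float: ...                  # vehicle height acquisition unit: detection value H
    def acquire_lateral_accel(self) -> float: ...           # acceleration acquisition unit
    def correction_amount(self, accel: float) -> float: ...  # correction amount calculation unit, using stored conversion information
    def pitch_angle(self, corrected_height: float) -> float: ...  # pitch angle calculation unit
    def acquire_eye_offset(self) -> float: ...              # eye information acquisition unit: deviation from the reference position
    def correct_projection(self, pitch: float, eye_offset: float) -> None: ...  # projection correction unit
```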
  • The information display system 1 shown in FIGS. 1 to 3 is mounted on and used in a vehicle.
  • Hereinafter, the vehicle equipped with the information display system 1 is referred to as the own vehicle.
  • The information display system 1 includes a display control device 10.
  • The information display system 1 may also include a peripheral monitoring unit 2, a behavior detection unit 3, a driver detection unit 4, a map storage unit 5, a positioning unit 6, a navigation device 7, a characteristic storage unit 8, and a head-up display (hereinafter, HUD) device 9.
  • The parts constituting the information display system 1 may transmit and receive information via an in-vehicle LAN.
  • LAN is an abbreviation for Local Area Network.
  • The information display system 1 projects an image onto the windshield 120, located in front of the driver's seat, using the HUD device 9, thereby displaying various information superimposed on the actual scenery visible to the driver through the windshield 120.
  • Hereinafter, an image displayed superimposed on the actual scenery in this way is referred to as an AR image.
  • The peripheral monitoring unit 2 includes at least one of a radar sensor and a camera.
  • The radar sensor uses infrared light, millimeter waves, ultrasonic waves, or the like as radar waves, and detects the distance to a target that reflects the radar wave, the direction in which the target exists, and so on.
  • A visible light camera, an infrared camera, or the like is used as the camera.
  • The camera is arranged so that its imaging range includes the area visually recognized by the driver through the windshield (hereinafter, the viewing area).
  • Like the camera, the radar sensor is arranged so that its detection range includes the viewing area.
  • The peripheral monitoring unit 2 detects targets on the traveling path of the own vehicle with the radar sensor and the camera, and generates target information including the position of each detected target.
  • The detection targets of the peripheral monitoring unit 2 include, for example, the various targets processed by an advanced driver assistance system (ADAS: Advanced Driver Assistance System).
  • The peripheral monitoring unit 2 may also generate target information including the position of a target based on the map information stored in the map storage unit 5, described later.
  • The behavior detection unit 3 includes various sensors that output a signal representing the driver's driving operation, a signal representing the behavior of the own vehicle resulting from that operation, and a signal representing the vehicle state that affects the behavior of the own vehicle.
  • The behavior detection unit 3 includes at least a height sensor 31, an acceleration sensor 32, a seat sensor 33, and a trunk sensor 34.
  • The height sensor 31 is provided on one of the wheels of the own vehicle and outputs a detection signal corresponding to the relative displacement (hereinafter, vehicle height detection value) H between that wheel's axle and the vehicle body. In this embodiment, it is provided on the right rear wheel.
  • The acceleration sensor 32 detects the magnitude of the lateral acceleration, the acceleration in the vehicle width direction applied to the own vehicle.
  • The acceleration value is signed: acceleration applied to the right relative to the traveling direction of the own vehicle is positive, and acceleration applied to the left is negative.
  • The seat sensor 33 is provided on each occupant seat of the own vehicle and outputs a detection signal indicating the presence or absence of an occupant on that seat.
  • The occupant seats are a driver's seat (hereinafter, D seat), a passenger seat (P seat), a rear right seat (RR seat), and a rear left seat (RL seat).
  • The trunk sensor 34 is provided in the trunk, located near the rear end of the own vehicle, and outputs a detection signal indicating the presence or absence of luggage in the trunk.
  • The seat sensor 33 and the trunk sensor 34 may instead be configured to output detection signals indicating the weight of a load such as an occupant or luggage.
  • The positions where the seat sensor 33 and the trunk sensor 34 are installed correspond to the measurement positions.
  • In addition to the height sensor 31, the acceleration sensor 32, the seat sensor 33, and the trunk sensor 34, the behavior detection unit 3 may further include an accelerator pedal sensor, a brake pedal sensor, a steering angle sensor, a direction indicator switch, a vehicle speed sensor, a yaw rate sensor, and the like.
  • The driver detection unit 4 is a device that detects the driver's state, such as face position, face orientation, eye position, and line-of-sight direction, from the driver's face image captured by the in-vehicle camera.
  • The driver detection unit 4 is known as a so-called driver status monitoring system (DSM: Driver Status Monitoring system).
  • The map storage unit 5 stores map information, AR information, and the like.
  • The map information is used for route guidance by the navigation device 7 and for superimposing AR images on the actual scenery.
  • The map information includes information on roads, information on lane markings such as white lines and on road markings, and information on structures.
  • Information on roads includes, for example, position information for each point and shape information such as curve curvature, gradient, and connection relationships with other roads.
  • Information on lane markings and road markings includes, for example, their type information, position information, and three-dimensional shape information.
  • Information on structures includes, for example, the type information, position information, and shape information of each structure.
  • Structures include road signs, traffic lights, street lights, tunnels, overpasses, buildings facing the road, and the like.
  • The map information holds the above position information and shape information in the form of point cloud data or vector data of feature points expressed in three-dimensional coordinates. That is, the map information represents a three-dimensional map whose position information includes altitude in addition to latitude and longitude. Consequently, information on the road gradient at each point on a road, specifically the longitudinal gradient along the traveling direction and the cross gradient along the width direction, can be extracted from the map information.
  • The position information contained in the map information has a relatively small error, on the order of centimeters.
  • The map information is thus high-precision map data, both in that it holds three-dimensional position information including height and in that the error of that position information is relatively small.
  • The AR information is data used for displaying AR images and includes symbols, characters, icons, and the like that are displayed superimposed on the background.
  • The AR information may include information for route guidance linked with the navigation device 7 (for example, an arrow superimposed on the road surface).
  • The positioning unit 6 is a device that generates position information for identifying the current position of the own vehicle.
  • The positioning unit 6 includes, for example, a GNSS receiver and autonomous navigation sensors such as a gyroscope and a distance sensor.
  • GNSS is an abbreviation for Global Navigation Satellite System.
  • The GNSS receiver receives transmission signals from satellites and detects the position coordinates and altitude of the vehicle.
  • The gyroscope outputs a detection signal corresponding to the angular velocity of the rotational motion applied to the vehicle.
  • The distance sensor outputs the mileage of the vehicle.
  • The positioning unit 6 generates high-precision position information indicating the current position of the own vehicle by combined positioning, which fuses information based on the output signal of the GNSS receiver with information based on the output signals of the autonomous navigation sensors.
  • The position information generated by the positioning unit 6 may, for example, be accurate enough to identify which of a plurality of lanes the vehicle is traveling in.
  • The navigation device 7 provides route guidance based on the current position of the own vehicle and the map data.
  • The navigation device 7 identifies the current position and traveling direction of the own vehicle on the road from the positioning result of the positioning unit 6 and map matching using the map data.
  • The navigation device 7 provides the display control device 10 with the current position and traveling direction of the own vehicle, the route to the destination, and map information and AR information on the roads and facilities present in the driver's visual area.
  • The characteristic storage unit 8 includes a non-volatile memory and stores at least pitch angle conversion information Gp and correction amount conversion information Gh as characteristic information.
  • The pitch angle conversion information Gp is used when calculating the vehicle pitch angle from the vehicle height detection value H.
  • The correction amount conversion information Gh is used when calculating the correction amount ΔH for the detection value H of the height sensor 31 from the detection value of the acceleration sensor 32 (that is, the lateral acceleration).
  • The characteristic information Gp and Gh may be expressed as mathematical formulas or as table-format data. In particular, a plurality of types of pitch angle conversion information Gp are prepared according to the detection results of the seat sensor 33 and the trunk sensor 34, that is, according to the load state determined from the arrangement of occupants and cargo in the vehicle.
  • For example, the seat sensor 33 detects the presence or absence of an occupant in each of the four occupant seats (the D seat, the P seat, the RR seat, and the RL seat), and the trunk sensor 34 detects the presence or absence of luggage in the single trunk.
  • With the D seat always occupied by the driver, the load state has 16 patterns. Characteristic information may be prepared for each of the 16 patterns, but their number may be reduced by merging load states with similar conversion characteristics: for example, four patterns classified by the presence or absence of a P-seat occupant and of luggage in the trunk, or two patterns classified only by the presence or absence of a P-seat occupant, as sketched below.
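As an illustration of how this selection might be implemented, the following sketch maps the sensor detections to one of the four assumed patterns; the function and table names are hypothetical, not taken from the patent.

```python
# Minimal sketch of selecting pitch angle conversion information Gp from the
# load state. The four-pattern classification follows the example above;
# names and data layout are assumptions.

def select_gp(p_seat_occupied: bool, trunk_loaded: bool, gp_table: dict):
    """Return the Gp conversion data for the detected load state."""
    return gp_table[(p_seat_occupied, trunk_loaded)]

# Hypothetical table: four Gp variants keyed by P-seat occupancy and trunk load.
gp_table = {
    (False, False): "Gp_driver_only",
    (True, False): "Gp_with_p_seat",
    (False, True): "Gp_with_trunk",
    (True, True): "Gp_with_p_seat_and_trunk",
}
gp = select_gp(p_seat_occupied=True, trunk_loaded=False, gp_table=gp_table)
```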
  • The HUD device 9 is arranged in the instrument panel 110.
  • The HUD device 9 includes a projector 91 and an optical system 92.
  • The projector 91 includes a liquid crystal display (hereinafter, LCD) panel and a backlight.
  • The projector 91 is fixed with the display screen of the LCD panel facing the optical system 92.
  • The projector 91 displays an image on the LCD panel according to instructions from the display control device 10 and back-illuminates it with the backlight, emitting the light to be formed into a virtual image toward the optical system 92.
  • The optical system 92 has at least a concave mirror; it reflects and magnifies the light emitted from the projector 91 and projects it onto the projection area 121, a region of the windshield 120 set within the driver's viewing area. By this projection, the AR image is displayed superimposed on the actual scenery in the driver's viewing area.
  • The display control device 10 includes a microcomputer having a CPU 11 and a semiconductor memory (hereinafter, simply memory) 12 such as ROM or RAM. The functions of the display control device 10 are realized by the CPU 11 executing a program stored in a non-transitory tangible recording medium.
  • The display control device 10 is also called an HCU.
  • HCU is an abbreviation for HMI Control Unit, and HMI is an abbreviation for Human Machine Interface.
  • The display control device 10 includes an information generation unit 101 and a position correction unit 102 as functional configurations realized by executing the program stored in the memory 12.
  • Based on the detection results of the peripheral monitoring unit 2, the route information from the navigation device 7, the map information, and the like, the information generation unit 101 extracts objects that are visible through the projection area 121 and are targets for providing information to the driver.
  • The information generation unit 101 generates position information representing the three-dimensional position of each object and an AR image to be superimposed on it. The projection position of the AR image within the projection area 121 is set, based on the three-dimensional position of the object and a standard driver eye position, so that the AR image is recognized as superimposed on the object when the vehicle is in a standard load state with only the driver aboard.
  • Based on the detection results of the sensors 31 to 34 and the vehicle pitch angle calculated from the characteristic information Gp and Gh stored in the characteristic storage unit 8, the position correction unit 102 corrects the projection position, within the projection area 121, of the AR image generated by the information generation unit 101.
  • The display control device 10 causes the HUD device 9 to display the AR image generated by the information generation unit 101 and position-corrected by the position correction unit 102.
  • In S110, the display control device 10 acquires the detection results of the seat sensor 33 and the trunk sensor 34 as load detection values indicating the load state of the vehicle. Here, the presence or absence of an occupant and the presence or absence of a load such as luggage are used as the load detection values.
  • In S120, the display control device 10 acquires from the characteristic storage unit 8 the pitch angle conversion information Gp corresponding to the load state indicated by the load detection values.
  • In S130, the display control device 10 acquires, based on the map information stored in the map storage unit 5, the gradient information θv and θh on the road gradient at the current position acquired from the positioning unit 6, that is, of the road on which the own vehicle is traveling.
  • The gradient information θv concerns the longitudinal gradient, and the gradient information θh concerns the cross gradient.
  • θv is expressed as a positive value if the longitudinal gradient is a forward gradient and as a negative value if it is a backward gradient; θh is expressed as a positive value if the cross gradient slopes to the right and as a negative value if it slopes to the left.
  • In S140, the display control device 10 acquires the vehicle height detection value H, the detection value of the height sensor 31.
  • In S150, the display control device 10 acquires the lateral acceleration α, the detection result of the acceleration sensor 32.
  • In S160, the display control device 10 calculates the correction amount ΔH for the vehicle height detection value H using the lateral acceleration α acquired in S150, the cross gradient information θh acquired in S130, and the correction amount conversion information Gh stored in the characteristic storage unit 8. The method of calculating ΔH is described later.
  • In S170, the display control device 10 calculates the vehicle pitch angle θ, the inclination angle of the vehicle body in the front-rear direction with respect to the horizontal plane, using equation (1).
  • In equation (1), Gp is the pitch angle conversion information acquired in S120, θv is the longitudinal gradient information acquired in S130, H is the vehicle height detection value acquired in S140, and ΔH is the correction amount for H calculated in S160. Gp(X) denotes the vehicle pitch angle estimated from the pitch angle conversion information Gp with X as the vehicle height detection value, and C is an experimentally determined constant.
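Equation (1) itself is not reproduced in this text. As a rough sketch, assuming that the corrected value H + ΔH feeds the conversion Gp and that the gradient term θv and the constant C enter additively (the actual signs and composition in equation (1) may differ):

```python
# Illustrative only: the exact form of equation (1) is not given in this text.
# Assumed form: theta = Gp(H + dH) + theta_v + C.

def vehicle_pitch_angle(gp, h: float, dh: float, theta_v: float, c: float) -> float:
    """Estimate the vehicle pitch angle theta from the corrected height value.

    gp:      callable implementing Gp(X), the table or formula converting a
             vehicle height detection value X into a pitch angle (S120)
    h:       vehicle height detection value H from the height sensor (S140)
    dh:      correction amount dH computed from the lateral acceleration (S160)
    theta_v: longitudinal gradient information at the current position (S130)
    c:       experimentally determined constant C
    """
    return gp(h + dh) + theta_v + c
```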
  • In S180, the display control device 10 acquires the eye information indicating the driver's eye position detected by the driver detection unit 4. The eye information is expressed as the amount of deviation from the standard eye position.
  • In S190, the display control device 10 calculates the correction amount for the projection position of the AR image in the projection area 121 based on the vehicle pitch angle θ calculated in S170, the eye information acquired in S180, and the three-dimensional position of the object on which the AR image is superimposed. The projection position of the AR image generated by the information generation unit 101 is corrected by this correction amount, and the AR image is supplied to the HUD device 9.
  • The display control device 10 then determines whether an end condition is satisfied; if it is not, the process returns to S130, and if it is, the process ends.
  • The display control device 10 determines that the end condition is satisfied, for example, when the ignition switch is turned off or when a command to stop operation of the HUD device 9 is input.
  • In this process, S110 corresponds to the load acquisition unit, S130 to the gradient acquisition unit, S140 to the vehicle height acquisition unit, S150 to the acceleration acquisition unit, S160 to the correction amount calculation unit, S170 to the pitch angle calculation unit, S180 to the eye information acquisition unit, and S190 to the projection correction unit.
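Taken together, S110 to S190 form a repeated correction loop. The following schematic of that control flow is a sketch under the same naming assumptions as the earlier examples, not the patent's implementation:

```python
# Schematic of the S110-S190 loop; every name and signature is an assumption.
def display_correction_loop(sensors, storage, map_db, hud, end_condition):
    load_state = sensors.read_load_state()         # S110: seat/trunk sensors
    gp = storage.pitch_conversion(load_state)      # S120: select Gp for the load state
    while not end_condition():                     # e.g. ignition off or HUD stop command
        theta_v, theta_h = map_db.gradients_at(sensors.position())  # S130
        h = sensors.height()                       # S140: vehicle height detection value H
        alpha = sensors.lateral_acceleration()     # S150
        dh = storage.correction_amount(alpha, theta_h)  # S160: via conversion information Gh
        pitch = gp(h + dh) + theta_v               # S170: form of equation (1) assumed
        eye_offset = sensors.eye_offset()          # S180: deviation from the reference eye position
        hud.correct_projection(pitch, eye_offset)  # S190: shift the AR image's projection position
```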
  • In the following explanation, the driver's eye position is denoted Ei, the projection area 121 is denoted Di, and the position from which the light formed into the virtual image is emitted, that is, the position of the projector 91, is denoted Si.
  • When the vehicle pitches, the projection position Pb in the projection area D1 is shifted upward compared with the projection position Pa in the projection area D0.
  • When the driver's posture is displaced and the eye position shifts, so that the eye position E2 is located lower than the eye position E1, the AR image superimposed on the object O needs to be projected at the position Pc in the projection area D1.
  • The projection position Pc with the eye position displaced is shifted downward compared with the projection position Pb without the displacement.
  • The correction amount for the projection position of the AR image is therefore calculated by summing the correction amount according to the vehicle pitch angle θ, that is, the shift from Pa to Pb, and the correction amount caused by the eye position deviation, that is, the shift from Pb to Pc.
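A minimal sketch of this summation follows; the text states only that the two contributions are added, so the linear gains that map the pitch angle and the eye deviation to on-screen shifts are assumptions.

```python
# Sketch of the summed correction (gains are illustrative assumptions).
GAIN_PITCH = -120.0  # assumed screen shift per radian of vehicle pitch angle
GAIN_EYE = 0.8       # assumed screen shift per millimetre of eye deviation

def projection_correction(pitch_angle: float, eye_offset: float) -> float:
    pitch_shift = GAIN_PITCH * pitch_angle  # shift from Pa to Pb
    eye_shift = GAIN_EYE * eye_offset       # shift from Pb to Pc
    return pitch_shift + eye_shift          # total correction applied in S190
```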
  • Next, the relationship between the vehicle height detection value H and the vehicle pitch angle θ is derived theoretically.
  • Let a be the distance from the front wheels to the position of the center of gravity of the load applied to the vehicle, and b the distance from the front wheels to the rear wheels.
  • Let Kf be the spring constant of the front suspension, Kr the spring constant of the rear suspension, xf the displacement of the front suspension, and xr the displacement of the rear suspension.
  • Equation (2) is obtained from the balance of forces, and equation (3) from the balance of moments around the front wheels.
  • The vehicle pitch angle θ is expressed by equation (6), and solving equation (6) for θ gives equation (7).
  • The vehicle pitch angle θ can thus be obtained from the values of xf and xr and equation (7). From this relationship it can be seen that the vehicle height detection value H (that is, xf or xr) and the vehicle pitch angle θ correspond to each other.
  • When the vehicle turns, a centrifugal force Fc is generated in the left-right direction.
  • With the mass of the vehicle denoted M, the centrifugal force Fc is calculated by equation (8). Alternatively, given the vehicle speed V and the turning radius R of the vehicle, the centrifugal force Fc is calculated by equation (9).
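Equations (8) and (9) are presumably the standard expressions for the centrifugal force, from the measured lateral acceleration α or from the vehicle speed V and turning radius R:

```latex
% Presumed content of Eqs. (8) and (9).
\begin{align}
F_c &= M \alpha && \text{cf. Eq. (8)}\\
F_c &= \frac{M V^2}{R} && \text{cf. Eq. (9)}
\end{align}
```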
  • The centrifugal force Fc changes the roll angle of the vehicle and, as a result, changes the vehicle height detection value H detected by the height sensor 31.
  • Specifically, the vehicle height detection value H is detected to be lower than the actual vehicle height when turning left, and higher than the actual vehicle height when turning right.
  • Meanwhile, the pitch angle θ itself does not change. That is, if the pitch angle θ is estimated from the vehicle height detection value H using the pitch angle conversion information Gp while the vehicle is turning, the estimated pitch angle θ includes the error in the vehicle height detection value H caused by the centrifugal force Fc.
  • Let T be the tread width of the vehicle, Mr the weight of the rear wheel portion of the vehicle, h the height of the center of gravity of the vehicle, and Kr the spring constant of the rear suspension.
  • The rear weight Mr is the value obtained by multiplying the mass of the rear wheel portion of the vehicle by the gravitational acceleration. Further, let Wr be the right load applied to the right rear wheel and Wl the left load applied to the left rear wheel.
  • Equation (10) is obtained from the balance of moments around the left rear wheel under the centrifugal force Fc. Rearranging equation (10), the right load Wr applied to the right rear wheel is expressed by equation (11).
  • When there is a cross gradient θh, the vehicle weight acting on the rear wheels becomes Mr·cos θh, and a force of magnitude Mr·sin θh acts in the direction opposite to the centrifugal force Fc. The right load Wr in this case is therefore expressed by equation (13).
  • From the change in the right load and the spring constant Kr, equation (14), which gives the displacement amount Hc, is obtained. Since the suspension of the right wheel changes so that the detected vehicle height becomes higher, the correction amount ΔH is determined so as to offset that increase.
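Equations (10) to (14) are likewise not shown in this text. A plausible reconstruction from the stated moment balance about the left rear wheel, using the symbols defined above, is the following; the exact grouping in the patent may differ.

```latex
% Assumed reconstruction (not verbatim from the patent).
\begin{align}
W_r T &= M_r \frac{T}{2} + F_c h && \text{moment about the left rear wheel, cf. Eq. (10)}\\
W_r &= \frac{M_r}{2} + \frac{F_c h}{T} && \text{cf. Eq. (11)}\\
W_r &= \frac{M_r \cos\theta_h}{2} + \frac{(F_c - M_r \sin\theta_h) h}{T} && \text{with cross gradient, cf. Eq. (13)}\\
H_c &= \frac{W_r - M_r/2}{K_r} && \text{suspension displacement, cf. Eq. (14)}
\end{align}
% The correction amount is then taken to offset this displacement,
% e.g. Delta H = -H_c (sign convention assumed).
```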
  • FIG. 10 is a graph showing the error of the pitch angle θ calculated from the vehicle height detection value H while the vehicle is turning, and the error of the pitch angle θ calculated from the corrected vehicle height detection value H + ΔH, where ΔH is calculated using the lateral acceleration α or the centrifugal force Fc. It can be seen that reflecting the correction amount ΔH suppresses the detection error during turning.
  • In the above embodiment, the spring characteristic Kr of the suspension is treated as constant when calculating the correction amount ΔH (that is, the displacement amount Hc), but the present disclosure is not limited to this.
  • The spring characteristic Kr used to calculate the correction amount ΔH may be varied according to the direction of the lateral acceleration α or the centrifugal force Fc. In this case, the calculation accuracy of the correction amount ΔH can be further improved.
  • The seat sensor 33 and the trunk sensor 34 have been described as detecting the presence or absence of a load, but they may be configured to detect the weight of the load. In this case, the load state can be determined more accurately, and the calculation accuracy of the vehicle pitch angle θ can be further improved.
  • The display control device 10 and its method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program. Alternatively, they may be realized by a dedicated computer whose processor consists of one or more dedicated hardware logic circuits, or by one or more dedicated computers combining a processor and memory programmed to execute one or more functions with a processor composed of one or more hardware logic circuits.
  • The computer program may be stored in a computer-readable non-transitory tangible recording medium as instructions to be executed by a computer.
  • The functions of the parts included in the display control device 10 need not all be realized by software; all or some of them may be realized using one or more pieces of hardware.
  • A plurality of functions of one component in the above embodiment may be realized by a plurality of components, and one function of one component may be realized by a plurality of components. A plurality of functions of a plurality of components may be realized by one component, and one function realized by a plurality of components may be realized by one component. Part of the configuration of the above embodiment may be omitted, and at least part of the configuration of the above embodiment may be added to or substituted for the configuration of another embodiment.

Abstract

A pitch angle calculation unit (S170) calculates the pitch angle, which is the vehicle's inclination angle in the front-rear direction, using a vehicle height detection value, indicating the displacement of the vehicle's height, corrected by a correction amount according to the lateral acceleration. An eye information acquisition unit (S180) acquires eye information, which is the detected amount of deviation of the driver's eye position from a reference position. A projection correction unit (S190) corrects the projection position of a superimposed image according to both the pitch angle and the eye information so that the superimposed image visually recognized by the driver is displayed superimposed on a target object.

Description

Display control device
Cross-reference of related applications
 This international application claims priority based on Japanese Patent Application No. 2019-139711, filed with the Japan Patent Office on July 30, 2019; the entire contents of that application are incorporated into this international application by reference.
 The present disclosure relates to a technique for controlling display on an in-vehicle display.
 A head-up display having an AR function that superimposes a virtual image on an object in the scenery outside the vehicle, seen through the windshield, is known (hereinafter, AR-HUD). AR is an abbreviation for Augmented Reality. HUD is an abbreviation for Head Up Display. An AR-HUD has the problem that the position at which the virtual image is superimposed on the scenery shifts as the pitch angle and roll angle of the vehicle change.
 In Patent Document 1 below, the vehicle posture is estimated using an angle sensor and a roll sensor, and based on the estimated vehicle posture, the angle of the mirror that projects the image onto the windshield and the display position within the image are adjusted, thereby suppressing the positional deviation of the virtual image.
International Publication No. WO 2018/042898
 However, as a result of the inventors' detailed examination, the prior art of Patent Document 1 was found to require two expensive sensors to estimate the vehicle posture and to impose a high computational load for calculating the vehicle posture from the sensors' detection results. It was also found that when the driver's eye position shifts due to vehicle behavior, the positional deviation of the virtual image becomes large.
 One aspect of the present disclosure is to provide a technique for accurately suppressing, by a simple method, the positional deviation of a virtual image superimposed on an object.
 One aspect of the present disclosure is a display control device that includes an information generation unit, a vehicle height acquisition unit, an acceleration acquisition unit, a characteristic storage unit, a correction amount calculation unit, a pitch angle calculation unit, an eye information acquisition unit, and a projection correction unit.
 The information generation unit acquires information on an object present in the scenery visually recognized by the driver through the projection area, and sets the projection position, within the projection area, of a superimposed image overlaid on the object according to the object's three-dimensional position information. The projection area is the region of the vehicle's front window onto which an image to be recognized by the driver as a virtual image is projected. The vehicle height acquisition unit acquires a vehicle height detection value indicating the displacement of the vehicle's height. The acceleration acquisition unit acquires the lateral acceleration, the acceleration applied in the vehicle width direction. The characteristic storage unit stores conversion information representing the correlation between the lateral acceleration and the displacement of the vehicle height detection value. The correction amount calculation unit calculates a correction amount for the vehicle height detection value using the lateral acceleration acquired by the acceleration acquisition unit and the conversion information stored in the characteristic storage unit. The pitch angle calculation unit calculates the pitch angle, the inclination angle of the vehicle in the front-rear direction, using the vehicle height detection value acquired by the vehicle height acquisition unit corrected by the correction amount calculated by the correction amount calculation unit. The eye information acquisition unit acquires eye information, the detected amount of deviation of the driver's eye position from a reference position. The projection correction unit corrects the projection position of the superimposed image set by the information generation unit according to both the pitch angle and the eye information, so that the superimposed image visually recognized by the driver is displayed superimposed on the object.
 According to such a configuration, even when an error arises in the vehicle height detection value while the vehicle is turning, the error is suppressed by the correction amount calculated from the lateral acceleration, so the vehicle pitch angle can be calculated accurately from the vehicle height detection value. Furthermore, since the projection position of the superimposed image is corrected according to not only the calculated pitch angle but also the driver's eye position, the deviation of the virtual image visually recognized by the driver from the object can be accurately suppressed.
 FIG. 1 is a block diagram showing the configuration of the information display system. FIG. 2 is an explanatory diagram showing the projection area. FIG. 3 is an explanatory diagram showing the arrangement of the HUD device. FIG. 4 is a flowchart of the process executed by the position correction unit. FIG. 5 is an explanatory diagram showing the relationship between changes in the pitch angle and deviations of the eye position, and the superimposed display position on the HUD display surface. FIG. 6 is a simple model showing the relationship between the detection value of the height sensor and the vehicle pitch angle. FIG. 7 is an explanatory diagram of the centrifugal force applied to the vehicle. FIG. 8 is an explanatory diagram showing the parameters needed to explain the relationship between the centrifugal force applied to the vehicle and the vehicle roll angle. FIG. 9 is an explanatory diagram showing the relationship of the parameters when there is a cross gradient. FIG. 10 is a graph showing the measured error of the vehicle pitch angle detected when centrifugal force is applied to the vehicle and the error of the vehicle pitch angle with a correction based on the centrifugal force applied.
 Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
 [1. Configuration]
 The information display system 1 shown in FIGS. 1 to 3 is mounted on and used in a vehicle. Hereinafter, the vehicle equipped with the information display system 1 is referred to as the own vehicle. The information display system 1 includes a display control device 10. The information display system 1 may also include a peripheral monitoring unit 2, a behavior detection unit 3, a driver detection unit 4, a map storage unit 5, a positioning unit 6, a navigation device 7, a characteristic storage unit 8, and a head-up display (hereinafter, HUD) device 9. HUD is an abbreviation for Head Up Display. The parts constituting the information display system 1 may transmit and receive information via an in-vehicle LAN. LAN is an abbreviation for Local Area Network.
 The information display system 1 projects an image onto the windshield 120, located in front of the driver's seat, using the HUD device 9, thereby displaying various information superimposed on the actual scenery visible to the driver through the windshield 120. Hereinafter, an image displayed superimposed on the actual scenery in this way is referred to as an AR image. AR is an abbreviation for Augmented Reality.
 The peripheral monitoring unit 2 includes at least one of a radar sensor and a camera. The radar sensor uses infrared light, millimeter waves, ultrasonic waves, or the like as radar waves, and detects the distance to a target that reflects the radar wave, the direction in which the target exists, and so on. A visible light camera, an infrared camera, or the like is used as the camera. The camera is arranged so that its imaging range includes the area visually recognized by the driver through the windshield (hereinafter, the viewing area). Like the camera, the radar sensor is arranged so that its detection range includes the viewing area.
 The peripheral monitoring unit 2 detects targets on the traveling path of the own vehicle with the radar sensor and the camera, and generates target information including the position of each detected target. The detection targets of the peripheral monitoring unit 2 include, for example, the various targets processed by an advanced driver assistance system (ADAS: Advanced Driver Assistance System). The peripheral monitoring unit 2 may also generate target information including the position of a target based on the map information stored in the map storage unit 5, described later.
 The behavior detection unit 3 includes various sensors that output a signal representing the driver's driving operation, a signal representing the behavior of the own vehicle resulting from that operation, and a signal representing the vehicle state that affects the behavior of the own vehicle.
 The behavior detection unit 3 includes at least a height sensor 31, an acceleration sensor 32, a seat sensor 33, and a trunk sensor 34.
 The height sensor 31 is provided on one of the wheels of the own vehicle and outputs a detection signal corresponding to the relative displacement (hereinafter, vehicle height detection value) H between that wheel's axle and the vehicle body. In this embodiment, it is provided on the right rear wheel.
 The acceleration sensor 32 detects the magnitude of the lateral acceleration, the acceleration in the vehicle width direction applied to the own vehicle. The acceleration value is signed: acceleration applied to the right relative to the traveling direction of the own vehicle is positive, and acceleration applied to the left is negative.
 The seat sensor 33 is provided on each occupant seat of the own vehicle and outputs a detection signal indicating the presence or absence of an occupant on that seat. Here, there are a driver's seat (hereinafter, D seat), a passenger seat (P seat), a rear right seat (RR seat), and a rear left seat (RL seat).
 The trunk sensor 34 is provided in the trunk, located near the rear end of the own vehicle, and outputs a detection signal indicating the presence or absence of luggage in the trunk. The seat sensor 33 and the trunk sensor 34 may instead be configured to output detection signals indicating the weight of a load such as an occupant or luggage. The positions where the seat sensor 33 and the trunk sensor 34 are installed correspond to the measurement positions.
 In addition to the above, the behavior detection unit 3 may further include an accelerator pedal sensor, a brake pedal sensor, a steering angle sensor, a direction indicator switch, a vehicle speed sensor, a yaw rate sensor, and the like.
 The driver detection unit 4 is a device that detects the driver's state, such as face position, face orientation, eye position, and line-of-sight direction, from the driver's face image captured by an in-vehicle camera. The driver detection unit 4 is known as a so-called driver status monitoring system (DSM: Driver Status Monitoring system).
 The map storage unit 5 stores map information, AR information, and the like. The map information is used for route guidance by the navigation device 7 and for superimposing AR images on the actual scenery.
 The map information includes information on roads, information on lane markings such as white lines and on road markings, and information on structures. Information on roads includes, for example, position information for each point and shape information such as curve curvature, gradient, and connection relationships with other roads. Information on lane markings and road markings includes, for example, their type information, position information, and three-dimensional shape information. Information on structures includes, for example, the type information, position information, and shape information of each structure. Structures include road signs, traffic lights, street lights, tunnels, overpasses, buildings facing the road, and the like.
 The map information holds the above position information and shape information in the form of point cloud data or vector data of feature points expressed in three-dimensional coordinates. That is, the map information represents a three-dimensional map whose position information includes altitude in addition to latitude and longitude. Consequently, information on the road gradient at each point on a road, specifically the longitudinal gradient along the traveling direction and the cross gradient along the width direction, can be extracted from the map information. The position information contained in the map information has a relatively small error, on the order of centimeters. The map information is thus high-precision map data, both in that it holds three-dimensional position information including height and in that the error of that position information is relatively small.
 The AR information is data used for displaying AR images and includes symbols, characters, icons, and the like that are displayed superimposed on the background. The AR information may include information for route guidance linked with the navigation device 7 (for example, an arrow superimposed on the road surface).
 The positioning unit 6 is a device that generates position information for identifying the current position of the own vehicle. The positioning unit 6 includes, for example, a GNSS receiver and autonomous navigation sensors such as a gyroscope and a distance sensor. GNSS is an abbreviation for Global Navigation Satellite System. The GNSS receiver receives transmission signals from satellites and detects the position coordinates and altitude of the vehicle. The gyroscope outputs a detection signal corresponding to the angular velocity of the rotational motion applied to the vehicle. The distance sensor outputs the mileage of the vehicle. The positioning unit 6 generates high-precision position information indicating the current position of the own vehicle by combined positioning, which fuses information based on the output signal of the GNSS receiver with information based on the output signals of the autonomous navigation sensors. The position information generated by the positioning unit 6 may, for example, be accurate enough to identify which of a plurality of lanes the vehicle is traveling in.
 The navigation device 7 provides route guidance based on the current position of the own vehicle and the map data. The navigation device 7 identifies the current position and traveling direction of the own vehicle on the road from the positioning result of the positioning unit 6 and map matching using the map data. The navigation device 7 provides the display control device 10 with the current position and traveling direction of the own vehicle, the route to the destination, and map information and AR information on the roads and facilities present in the driver's visual area.
 The characteristic storage unit 8 includes a non-volatile memory and stores at least pitch angle conversion information Gp and correction amount conversion information Gh as characteristic information. The pitch angle conversion information Gp is used when calculating the vehicle pitch angle from the vehicle height detection value H. The correction amount conversion information Gh is used when calculating the correction amount ΔH for the detection value H of the height sensor 31 from the detection value of the acceleration sensor 32 (that is, the lateral acceleration). The characteristic information Gp and Gh may be expressed as mathematical formulas or as table-format data. In particular, a plurality of types of pitch angle conversion information Gp are prepared according to the detection results of the seat sensor 33 and the trunk sensor 34, that is, according to the load state determined from the arrangement of occupants and cargo in the vehicle.
 For example, the seat sensor 33 detects the presence or absence of an occupant in each of the four occupant seats (the D seat, the P seat, the RR seat, and the RL seat), and the trunk sensor 34 detects the presence or absence of luggage in the single trunk. In this case, there are 32 load-state patterns depending on the presence or absence of a load (an occupant or luggage) at the above five positions. If the D seat is always occupied, the load state has 16 patterns. Characteristic information may be prepared for each of the 16 patterns, but their number may be reduced by merging load states with similar conversion characteristics: for example, four patterns classified by the presence or absence of a P-seat occupant and of luggage in the trunk, or two patterns classified only by the presence or absence of a P-seat occupant.
 As shown in FIGS. 2 and 3, the HUD device 9 is arranged in the instrument panel 110. The HUD device 9 includes a projector 91 and an optical system 92. The projector 91 includes a liquid crystal display (hereinafter, LCD) panel and a backlight, and is fixed with the display screen of the LCD panel facing the optical system 92. The projector 91 displays an image on the LCD panel according to instructions from the display control device 10 and back-illuminates it with the backlight, emitting the light to be formed into a virtual image toward the optical system 92. The optical system 92 has at least a concave mirror; it reflects and magnifies the light emitted from the projector 91 and projects it onto the projection area 121, a region of the windshield 120 set within the driver's viewing area. By this projection, the AR image is displayed superimposed on the actual scenery in the driver's viewing area.
 The display control device 10 includes a microcomputer having a CPU 11 and a semiconductor memory (hereinafter, simply memory) 12 such as a ROM or a RAM. Each function of the display control device 10 is realized by the CPU 11 executing a program stored in a non-transitory tangible recording medium. The display control device 10 is also called an HCU. HCU is an abbreviation for HMI Control Unit, and HMI is an abbreviation for Human Machine Interface.
 The display control device 10 includes an information generation unit 101 and a position correction unit 102 as functional blocks realized by executing the program stored in the memory 12.
 Based on the detection results of the periphery monitoring unit 2, the route information from the navigation device 7, the map information, and the like, the information generation unit 101 extracts objects that are visible through the projection area 121 and that are targets of information presentation to the driver. The information generation unit 101 generates position information representing the three-dimensional position of each object and an AR image to be superimposed on the object. At this time, based on the three-dimensional position of the object and a standard eye position of the driver, the projection position of the AR image within the projection area 121 is set so that the AR image is perceived as superimposed on the object when the vehicle is in a standard load state in which only the driver is on board.
 The position correction unit 102 corrects the projection position of the AR image in the projection area 121 generated by the information generation unit 101, based on the vehicle pitch angle calculated from the detection results of the sensors 31 to 34 and the characteristic information Gp, Gh stored in the characteristic storage unit 8.
 The display control device 10 causes the HUD device 9 to display the AR image that is generated by the information generation unit 101 and whose projection position has been corrected by the position correction unit 102.
 [2. Processing]
 Next, the processing executed by the display control device 10 to realize the function of the position correction unit 102 will be described with reference to the flowchart of FIG. 4. This processing is executed when power supply to the HUD device 9 is started and display becomes possible.
 In S110, the display control device 10 acquires the detection results of the seat sensor 33 and the trunk sensor 34 as load detection values representing the load state applied to the vehicle. Here, the presence or absence of occupants and the presence or absence of loads such as luggage are used as the load detection values.
 In the following S120, the display control device 10 acquires, from the characteristic storage unit 8, the pitch angle conversion information Gp corresponding to the load state indicated by the load detection values.
 In the following S130, the display control device 10 acquires, based on the map information stored in the map storage unit 5, gradient information ψv and ψh on the road gradient at the current position acquired from the positioning unit 6, that is, of the road on which the own vehicle is traveling.
 The gradient information ψv relates to the longitudinal gradient, and the gradient information ψh relates to the cross gradient. The gradient information ψv is expressed as a positive value if the longitudinal gradient slopes downward toward the front, and as a negative value if it slopes downward toward the rear. Likewise, the gradient information ψh is expressed as a positive value if the cross gradient slopes downward toward the right, and as a negative value if it slopes downward toward the left.
 In the following S140, the display control device 10 acquires the vehicle height detection value H, which is the detection value of the height sensor 31.
 In the following S150, the display control device 10 acquires the lateral acceleration α, which is the detection result of the acceleration sensor 32.
 In the following S160, the display control device 10 calculates the correction amount ΔH for the vehicle height detection value H using the lateral acceleration α acquired in S150, the cross-gradient information ψh acquired in S130, and the correction amount conversion information Gh stored in the characteristic storage unit 8. The method of calculating the correction amount ΔH will be described later.
 In the following S170, the display control device 10 calculates the vehicle pitch angle θ, which is the inclination angle of the vehicle body in the front-rear direction with respect to the horizontal plane, using equation (1). Here, Gp is the pitch angle conversion information acquired in S120, ψv is the longitudinal-gradient information acquired in S130, H is the vehicle height detection value acquired in S140, and ΔH is the correction amount for the vehicle height detection value H calculated in S160. Gp(X) denotes the vehicle pitch angle estimated from the pitch angle conversion information Gp with X as the vehicle height detection value. C is an experimentally determined constant.
 [Equation (1) appears as an image in the source document.]
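 The source renders equation (1) only as an image. A plausible reconstruction from the variable definitions above, offered as an assumption rather than the patent's verbatim formula, is:

\[ \theta = G_p(H + \Delta H) + C\,\psi_v \tag{1} \]

that is, the pitch angle estimated from the corrected vehicle height, offset by the longitudinal road gradient scaled by the experimentally determined constant C.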
 In the following S180, the display control device 10 acquires eye information indicating the eye position of the driver detected by the driver detection unit 4. The eye information is expressed as an amount of deviation from a standard eye position.
 In the following S190, the display control device 10 calculates a correction amount for the projection position of the AR image in the projection area 121, based on the vehicle pitch angle θ calculated in S170, the eye information acquired in S180, and the three-dimensional position of the object on which the AR image is to be superimposed. The projection position of the AR image generated by the information generation unit 101 is corrected by this correction amount, and the AR image is supplied to the HUD device 9.
 In the following S200, the display control device 10 determines whether an end condition is satisfied; if the end condition is not satisfied, the processing returns to S130, and if it is satisfied, the processing ends. The display control device 10 determines that the end condition is satisfied, for example, when the ignition switch is turned off or when a command to stop the operation of the HUD device 9 is input.
 In the above processing, S110 corresponds to a load acquisition unit, S140 corresponds to a vehicle height acquisition unit, S130 corresponds to a gradient acquisition unit, and S150 corresponds to an acceleration acquisition unit. Further, S160 corresponds to a correction amount calculation unit, S170 corresponds to a pitch angle calculation unit, S180 corresponds to an eye information acquisition unit, and S190 corresponds to a projection correction unit.
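 As a summary of the flow of FIG. 4, a hedged Python sketch of the S110 to S200 loop is given below. The sensors, storage, and hud objects and all of their methods are assumed stand-ins for the interfaces described in the text, not actual APIs, and the form used for equation (1) is the reconstruction assumed above with C = 1.

```python
import numpy as np

def run_position_correction(sensors, storage, hud, end_condition):
    """Sketch of the position correction processing of FIG. 4 (S110-S200)."""
    load_state = sensors.read_load_state()                   # S110: seat/trunk sensors
    heights, pitches = storage.pitch_conversion(load_state)  # S120: select Gp table
    while not end_condition():                               # S200: ignition off, etc.
        psi_v, psi_h = sensors.road_gradients()              # S130: map-based gradients
        h = sensors.vehicle_height()                         # S140: detection value H
        alpha = sensors.lateral_acceleration()               # S150: lateral acceleration
        dh = storage.height_correction(alpha, psi_h)         # S160: correction amount dH
        # S170: assumed form of equation (1), theta = Gp(H + dH) + C * psi_v with C = 1
        theta = float(np.interp(h + dh, heights, pitches)) + psi_v
        eye_offset = sensors.eye_position_offset()           # S180: driver eye information
        hud.correct_projection(theta, eye_offset)            # S190: shift the AR image
```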
 [3. Method of calculating the correction amount]
 Next, an outline of the processing for calculating the correction amount of the projection position of the AR image, performed in S190, will be described with reference to FIG. 5.
 Here, for simplicity, the case where the longitudinal gradient is ψv = 0° will be described.
 Consider the case where the vehicle pitches and the vehicle body leans forward. The eye position of the driver is denoted by Ei, the projection area 121 by Di, and the position from which the light formed into the virtual image is emitted, that is, the position of the projector 91, by Si. Here, i = 0 indicates the position when the vehicle pitch angle is θ = 0°, and i = 1 indicates the position when the vehicle pitch angle is θ > 0°.
 When the vehicle leans forward, the driver's eye position E0, the projection area D0, and the light emission position S0 all move to positions rotated by the angle θ about the vehicle center of gravity J, that is, E0 → E1, D0 → D1, and S0 → S1.
 When the vehicle pitch angle is θ = 0°, that is, when the eye position is at E0, the AR image to be superimposed on the object O must be projected at the position Pa where the straight line connecting the eye position E0 and the object O intersects the projection area D0.
 車両ピッチ角がθ>0°の場合、すなわち眼位置がE1にある場合、対象物Oに重畳されるAR画像は、眼位置E1と対象物Oとを接続する直線が、投影領域D1と交差する位置Pbに投影される必要がある。 When the vehicle pitch angle is θ> 0 °, that is, when the eye position is at E1, in the AR image superimposed on the object O, the straight line connecting the eye position E1 and the object O intersects the projection area D1. It needs to be projected on the position Pb.
 That is, compared with the projection position Pa in the projection area D0, the projection position Pb in the projection area D1 is shifted upward.
 Furthermore, if the driver's posture shifts when the vehicle leans forward so that the eye position deviates, and the eye position E2 is lower than the eye position E1, the AR image to be superimposed on the object O must be projected at the position Pc on the projection area D1.
 That is, compared with the projection position Pb when no eye-position deviation has occurred, the projection position Pc when an eye-position deviation has occurred is shifted downward.
 In this way, the correction amount of the projection position of the AR image is calculated by summing the correction amount corresponding to the vehicle pitch angle θ, that is, the shift from Pa to Pb, and the correction amount caused by the eye-position deviation, that is, the shift from Pb to Pc.
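 Geometrically, the positions Pa, Pb, and Pc are intersections of an eye-to-object sight line with the projection plane before and after the rotation about the center of gravity J. The following 2-D Python sketch illustrates this relationship; all coordinates, angles, and the plane orientation are hypothetical.

```python
import numpy as np

def rotate_about(p, center, theta):
    """Rotate point p about center by angle theta [rad]."""
    c, s = np.cos(theta), np.sin(theta)
    d = p - center
    return center + np.array([c * d[0] - s * d[1], s * d[0] + c * d[1]])

def sight_line_intersection(eye, obj, plane_point, plane_normal):
    """Intersect the eye->object sight line with the projection plane."""
    direction = obj - eye
    t = np.dot(plane_point - eye, plane_normal) / np.dot(direction, plane_normal)
    return eye + t * direction

theta = np.deg2rad(1.0)                      # forward pitch (illustrative)
J = np.array([0.0, 0.5])                     # vehicle center of gravity
eye0 = np.array([0.0, 1.2])                  # eye position E0
plane_pt0 = np.array([1.0, 1.0])             # a point on projection area D0
normal0 = np.array([1.0, 0.0])               # normal of D0 (vertical plane assumed)
obj = np.array([30.0, 0.0])                  # object O on the road ahead

eye1 = rotate_about(eye0, J, -theta)                          # E1
plane_pt1 = rotate_about(plane_pt0, J, -theta)                # point on D1
normal1 = rotate_about(plane_pt0 + normal0, J, -theta) - plane_pt1

pa = sight_line_intersection(eye0, obj, plane_pt0, normal0)   # Pa on D0
pb = sight_line_intersection(eye1, obj, plane_pt1, normal1)   # Pb on D1
print(pb - pa)  # shift to apply to the AR image for the pitch component
```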
 [4. Pitch angle conversion information]
 The pitch angle conversion information Gp stored in the characteristic storage unit 8 will now be described.
 The relationship between the vehicle height detection value H and the vehicle pitch angle θ is derived theoretically using the simple two-wheel model shown in FIG. 6. Here, the distance from the front wheel to the center of gravity of the load applied to the vehicle is a, and the distance from the front wheel to the rear wheel is b. The spring constant of the front suspension is Kf, the spring constant of the rear suspension is Kr, the displacement of the front suspension is xf, and the displacement of the rear suspension is xr. The load applied to the vehicle by occupants and luggage is F, and the vehicle pitch angle, that is, the inclination angle of the vehicle body caused by this load, is θ.
 Equation (2) is obtained from the balance of forces, and equation (3) is obtained from the balance of moments about the front wheel.
 [Equations (2) and (3) appear as images in the source document.]
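 The source shows these equations only as images; a reconstruction from the stated force and moment balances (an assumption, not the verbatim formulas) is:

\[ F = K_f\,x_f + K_r\,x_r \tag{2} \]
\[ F\,a = K_r\,x_r\,b \tag{3} \]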
 Solving equation (3) for xr yields equation (4), and substituting equation (4) into equation (2) and rearranging yields equation (5).
 [Equations (4) and (5) appear as images in the source document.]
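 Continuing the same reconstruction (again an assumption), solving (3) for xr and eliminating xr from (2) gives:

\[ x_r = \frac{a\,F}{b\,K_r} \tag{4} \]
\[ F = \frac{b\,K_f\,x_f}{b - a} \tag{5} \]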
 At this point, the vehicle pitch angle θ is expressed by equation (6), and equation (7) is obtained by solving equation (6) for θ.
 [Equations (6) and (7) appear as images in the source document.]
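 If the pitch angle is taken from the front-rear difference in suspension displacement over the wheelbase, a plausible form (an assumption) is:

\[ \tan\theta = \frac{x_f - x_r}{b} \tag{6} \]
\[ \theta = \tan^{-1}\!\left(\frac{x_f - x_r}{b}\right) \tag{7} \]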
 That is, if xf is obtained from the height sensor 31, F can be obtained from equation (5), and xr is then calculated from equation (4) using that F. Similarly, if xr is obtained from the height sensor 31, F can be obtained from equation (4), and xf is then calculated from equation (5) using that F.
 The vehicle pitch angle θ can then be obtained using the values of xf and xr obtained in this way together with equation (7). From this relationship, it can be seen that there is a corresponding relationship between the vehicle height detection value H (that is, xf or xr) and the vehicle pitch angle θ.
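 A numeric sketch of this derivation in Python is shown below. It relies on the equation forms reconstructed above, which are assumptions, and all parameter values are illustrative.

```python
import math

def pitch_from_front_height(xf, a, b, kf, kr):
    """Estimate pitch angle theta from front suspension displacement xf,
    following the assumed forms of equations (5), (4) and (7)."""
    f = kf * xf * b / (b - a)        # load F from the assumed eq. (5)
    xr = f * a / (b * kr)            # rear displacement from the assumed eq. (4)
    return math.atan((xf - xr) / b)  # pitch angle from the assumed eq. (7)

# Illustrative values: wheelbase 2.7 m, load center 1.2 m behind the front axle.
theta = pitch_from_front_height(xf=0.010, a=1.2, b=2.7, kf=30000.0, kr=35000.0)
print(math.degrees(theta))
```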
 [5. Correction amount conversion information]
 The correction amount conversion information Gh stored in the characteristic storage unit 8 will now be described.
 First, the centrifugal force Fc will be described with reference to FIG. 7. When the vehicle turns, a centrifugal force Fc is generated in the left-right direction. When the lateral acceleration α of the vehicle can be detected, the centrifugal force Fc is calculated by equation (8), with M denoting the mass of the vehicle. When the vehicle speed V and the turning radius R of the vehicle are available, the centrifugal force Fc is calculated by equation (9).
 [Equations (8) and (9) appear as images in the source document.]
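 These are the standard centrifugal force relations, reconstructed here from the surrounding definitions (the source shows them only as images):

\[ F_c = M\,\alpha \tag{8} \]
\[ F_c = \frac{M\,V^2}{R} \tag{9} \]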
 The centrifugal force Fc changes the roll angle of the vehicle and, as a result, changes the vehicle height detection value H, which is the detection value of the height sensor 31. When the height sensor 31 is arranged at the right rear wheel, the vehicle height detection value H is lower than the actual vehicle height during a left turn and higher than the actual vehicle height during a right turn. However, the pitch angle θ does not change at this time. That is, if the pitch angle θ is estimated from the vehicle height detection value H using the pitch angle conversion information Gp while the vehicle is turning, the estimated pitch angle θ includes the error in the vehicle height detection value H caused by the centrifugal force Fc.
 Next, the relationship between the centrifugal force Fc and the displacement amount Hc of the height sensor 31 will be described with reference to FIGS. 8 and 9.
 As shown in FIG. 8, the tread width of the vehicle is T, the vehicle weight on the rear wheel portion of the vehicle is Mr, the height of the center of gravity of the vehicle is h, and the spring constant of the rear-wheel suspension is Kr. The vehicle weight Mr is the value obtained by multiplying the mass of the rear wheel portion of the vehicle by the gravitational acceleration. Further, the load applied to the right rear wheel is denoted by Wr, and the load applied to the left rear wheel by Wl.
 First, the case where the cross gradient of the road is ψh = 0° will be described.
 Equation (10) is obtained from the balance of moments about the left rear wheel under the centrifugal force Fc. By rearranging equation (10), the right load Wr applied to the right rear wheel by this moment is expressed by equation (11).
 [Equations (10) and (11) appear as images in the source document.]
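 A plausible reconstruction of the moment balance about the left rear wheel and the resulting right load (an assumption, not the verbatim formulas):

\[ W_r\,T = M_r\,\frac{T}{2} + F_c\,h \tag{10} \]
\[ W_r = \frac{M_r}{2} + \frac{F_c\,h}{T} \tag{11} \]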
 The relationship between the right load Wr and the displacement amount Hc of the height sensor 31 is expressed by equation (12). Accordingly, by substituting equation (11) into the right-load term Wr of equation (12), a relational expression between the centrifugal force Fc and the displacement amount Hc is obtained.
 [Equation (12) appears as an image in the source document.]
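 One plausible form of equation (12), assuming the sensor displacement reflects the right-rear spring load in excess of its static share (a guess based on the surrounding text; the source shows only an image):

\[ H_c = \frac{1}{K_r}\left(W_r - \frac{M_r}{2}\right) \tag{12} \]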
 Next, the case where the cross gradient is ψh ≠ 0° will be described.
 As shown in FIG. 9, in this case, the vehicle weight applied to the right rear wheel becomes Mr·cosψh due to the cross gradient ψh. In addition, due to the cross gradient ψh, a force of magnitude Mr·sinψh acts in the direction opposite to the centrifugal force Fc. Accordingly, the right load Wr in this case is expressed by equation (13). If it is assumed that ψh << 1, then cosψh ≈ 1 and sinψh ≈ ψh, so equation (14) is obtained.
 [Equations (13) and (14) appear as images in the source document.]
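 Plausible reconstructions from the description above (assumptions):

\[ W_r = \frac{M_r\cos\psi_h}{2} + \frac{\left(F_c - M_r\sin\psi_h\right)h}{T} \tag{13} \]
\[ W_r \approx \frac{M_r}{2} + \frac{\left(F_c - M_r\,\psi_h\right)h}{T} \tag{14} \]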
 Then, by substituting equation (13) or (14) into the right-load term Wr of equation (12), a relational expression between the centrifugal force Fc and the displacement amount Hc that takes the cross gradient ψh into account is obtained. Further, by substituting equation (8) into the centrifugal-force term Fc of that relational expression, the relationship between the lateral acceleration α and the displacement amount Hc is obtained, and the correction amount conversion information Gh is generated from this relational expression. The correction amount conversion information Gh may be the relational expression itself, or may be table information generated based on the relational expression.
 During a right turn, the right-wheel suspension changes so that the vehicle height becomes lower; to cancel this decrease, the correction amount is calculated as ΔH = Hc, that is, as a value with a positive sign. Conversely, during a left turn, the right-wheel suspension changes so that the vehicle height becomes higher; to cancel this increase, the correction amount is calculated as ΔH = −Hc, that is, as a value with a negative sign.
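 Combining equations (8), (12), and (14) as reconstructed above, the correction amount ΔH could be computed as in the following Python sketch. The equation forms are assumptions, alpha > 0 is an assumed convention for a right turn, and all units must be consistent (forces in N, lengths in m).

```python
def height_correction(alpha, psi_h, mass, mr_weight, h_cg, tread, kr):
    """Correction amount dH for a right-rear height sensor during a turn.

    alpha: lateral acceleration [m/s^2] (> 0 assumed to mean a right turn)
    psi_h: cross slope [rad]; mass: vehicle mass M [kg]
    mr_weight: rear-axle weight Mr [N]; h_cg: center-of-gravity height h [m]
    tread: tread width T [m]; kr: rear spring constant Kr [N/m]
    """
    fc = mass * abs(alpha)                                        # eq. (8)
    wr = mr_weight / 2 + (fc - mr_weight * psi_h) * h_cg / tread  # eq. (14)
    hc = (wr - mr_weight / 2) / kr                                # assumed eq. (12)
    return hc if alpha > 0 else -hc  # sign convention: +Hc right turn, -Hc left turn

# Example: roughly 0.3 g right turn on a 2 % right-leaning cross slope.
dh = height_correction(alpha=2.9, psi_h=0.02, mass=1500.0,
                       mr_weight=7000.0, h_cg=0.55, tread=1.55, kr=35000.0)
```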
 [6. Measurement]
 FIG. 10 is a graph showing the error of the pitch angle θ calculated from the vehicle height detection value H while the vehicle is turning, together with the error of the pitch angle θ calculated from the corrected vehicle height detection value H + ΔH, in which the detection value H is corrected by the correction amount ΔH calculated using the lateral acceleration α or the centrifugal force Fc. It can be seen that reflecting the correction amount ΔH suppresses the detection error during turning.
 [7. Effects]
 According to the embodiment described in detail above, the following effects are obtained.
 (7a) According to the information display system 1, even while the vehicle is turning and an error arises in the vehicle height detection value H, the error is suppressed by the correction amount ΔH calculated based on the lateral acceleration α or the centrifugal force Fc, so the pitch angle θ of the vehicle can be estimated accurately from the vehicle height detection value H. As a result, the projection position of the AR image can be correctly corrected using the estimated pitch angle θ, and misalignment of the virtual image visually recognized by the driver with respect to the object can be suppressed.
 (7b) In the information display system 1, the influence of the cross gradient ψh is also taken into account when calculating the correction amount ΔH, so an accurate correction amount ΔH can be obtained, and as a result, the calculation accuracy of the vehicle pitch angle θ can be further improved.
 (7c) In the information display system 1, the projection position of the AR image is corrected according to not only the vehicle pitch angle θ but also the position of the driver's eyes, so misalignment of the virtual image visually recognized by the driver with respect to the object can be suppressed with high accuracy.
 [8. Other embodiments]
 Although an embodiment of the present disclosure has been described above, the present disclosure is not limited to the above-described embodiment and can be implemented with various modifications.
 (8a) In the above embodiment, the case where the spring characteristic Kr of the suspension is constant when calculating the correction amount ΔH (that is, the displacement amount Hc) has been described, but the present disclosure is not limited to this. For example, when the spring characteristic Kr of the suspension differs between the compression direction and the extension direction, the correction amount ΔH may be calculated by changing the spring characteristic Kr according to the direction of the acceleration α or the centrifugal force Fc. In this case, the calculation accuracy of the correction amount ΔH can be further improved.
 (8b) In the above embodiment, the seat sensor 33 and the trunk sensor 34 detect the presence or absence of a load, but they may instead be configured to detect the weight of the load. In this case, the load state can be calculated more accurately, and consequently the calculation accuracy of the vehicle pitch angle θ can be further improved.
 (8c) The display control device 10 and its method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program. Alternatively, the display control device 10 and its method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor with one or more dedicated hardware logic circuits. Alternatively, the display control device 10 and its method described in the present disclosure may be realized by one or more dedicated computers configured by combining a processor and a memory programmed to execute one or more functions with a processor configured by one or more hardware logic circuits. The computer program may also be stored in a computer-readable non-transitory tangible recording medium as instructions to be executed by a computer. The method for realizing the functions of the units included in the display control device 10 does not necessarily need to include software, and all of the functions may be realized using one or more pieces of hardware.
 (8d) A plurality of functions of one component in the above embodiment may be realized by a plurality of components, and one function of one component may be realized by a plurality of components. Further, a plurality of functions of a plurality of components may be realized by one component, and one function realized by a plurality of components may be realized by one component. A part of the configuration of the above embodiment may be omitted. Further, at least a part of the configuration of the above embodiment may be added to or substituted for the configuration of another embodiment described above.
 (8e) In addition to the display control device 10 described above, the present disclosure can also be realized in various forms, such as a system including the display control device 10 as a component, a program for causing a computer to function as the display control device 10, a non-transitory tangible recording medium such as a semiconductor memory on which this program is recorded, and a display control method.

Claims (4)

  1.  A display control device comprising:
     an information generation unit (101) configured to acquire information on an object present in the scenery visually recognized by a driver through a projection area, the projection area being an area in a front window of a vehicle onto which an image to be recognized by the driver as a virtual image is projected, and to set a projection position, within the projection area, of a superimposed image to be superimposed on the object according to three-dimensional position information of the object;
     a vehicle height acquisition unit (102: S140) configured to acquire a vehicle height detection value representing a displacement amount of the vehicle height of the vehicle;
     an acceleration acquisition unit (102: S150) configured to acquire a lateral acceleration, which is an acceleration applied in the vehicle width direction of the vehicle;
     a characteristic storage unit (8) configured to store conversion information representing a correlation between the lateral acceleration and the displacement amount of the vehicle height detection value;
     a correction amount calculation unit (102: S160) configured to calculate a correction amount for the vehicle height detection value using the lateral acceleration acquired by the acceleration acquisition unit and the conversion information stored in the characteristic storage unit;
     a pitch angle calculation unit (102: S170) configured to calculate a pitch angle, which is an inclination angle of the vehicle in the front-rear direction, using a value obtained by correcting the vehicle height detection value acquired by the vehicle height acquisition unit with the correction amount calculated by the correction amount calculation unit;
     an eye information acquisition unit (102: S180) configured to acquire eye information, which is a detection result of an amount of deviation of the driver's eye position from a reference position; and
     a projection correction unit (102: S190) configured to correct the projection position of the superimposed image set by the information generation unit according to both the pitch angle and the eye information so that the superimposed image visually recognized by the driver is displayed superimposed on the object.
  2.  The display control device according to claim 1, further comprising
     a load acquisition unit (102: S110) configured to acquire load detection values, which are detection results of a load state applied to the vehicle at a plurality of measurement positions,
     wherein the pitch angle calculation unit is configured to change the conversion characteristic from the vehicle height detection value to the pitch angle of the vehicle according to the load detection values acquired by the load acquisition unit.
  3.  The display control device according to claim 1 or 2, further comprising
     a gradient acquisition unit (102: S130) configured to acquire gradient information representing the gradient of the road on which the vehicle is traveling,
     wherein at least one of the correction amount calculation unit and the pitch angle calculation unit is configured to use the gradient information acquired by the gradient acquisition unit for calculating the correction amount or the pitch angle.
  4.  The display control device according to any one of claims 1 to 3,
     wherein the correction amount calculation unit is configured to use the conversion information having different characteristics according to the direction of the acceleration acquired by the acceleration acquisition unit.
PCT/JP2020/028877 2019-07-30 2020-07-28 Display control device WO2021020385A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019139711A JP7057327B2 (en) 2019-07-30 2019-07-30 Display control device
JP2019-139711 2019-07-30

Publications (1)

Publication Number Publication Date
WO2021020385A1 true WO2021020385A1 (en) 2021-02-04

Family

ID=74228489

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/028877 WO2021020385A1 (en) 2019-07-30 2020-07-28 Display control device

Country Status (2)

Country Link
JP (1) JP7057327B2 (en)
WO (1) WO2021020385A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023184140A1 (en) * 2022-03-29 2023-10-05 华为技术有限公司 Display method, apparatus and system
WO2023240918A1 (en) * 2022-06-17 2023-12-21 江苏泽景汽车电子股份有限公司 Display picture compensation method and apparatus, and electronic device and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018042898A1 (en) * 2016-08-29 2018-03-08 マクセル株式会社 Head-up display device
WO2018070252A1 (en) * 2016-10-14 2018-04-19 日立マクセル株式会社 Vehicle image display apparatus
WO2018088360A1 (en) * 2016-11-08 2018-05-17 日本精機株式会社 Head-up display device
JP2018120141A (en) * 2017-01-26 2018-08-02 日本精機株式会社 Head-up display
WO2019092995A1 (en) * 2017-11-10 2019-05-16 株式会社デンソー Display control device and display control program

Also Published As

Publication number Publication date
JP7057327B2 (en) 2022-04-19
JP2021020625A (en) 2021-02-18

Similar Documents

Publication Publication Date Title
JP5161760B2 (en) In-vehicle display system and display method
US8558758B2 (en) Information display apparatus
KR101558353B1 (en) Head-up display apparatus for vehicle using aumented reality
JP6201690B2 (en) Vehicle information projection system
JP6756327B2 (en) Posture detection device and posture detection program
WO2021020145A1 (en) Display control device
WO2015060193A1 (en) Vehicle information projection system, and projection device
WO2018025624A1 (en) Head-up display device, display control method, and control program
JP2004538530A (en) METHOD AND APPARATUS FOR DISPLAYING DRIVING INSTRUCTIONS IN A CAR NAVIGATION SYSTEM
JP6787297B2 (en) Display control device and display control program
CN109927552B (en) Display device for vehicle
JP6724886B2 (en) Virtual image display
JP6724885B2 (en) Virtual image display
JP6952899B2 (en) Head-up display
WO2021020385A1 (en) Display control device
CN111094898A (en) Method, device, and computer-readable storage medium having instructions for controlling display of an augmented reality heads-up display device for a motor vehicle
US11945309B2 (en) Display system
US20220270527A1 (en) Display control device
WO2021044741A1 (en) Display control device, display control program, and heads-up display
JP6891863B2 (en) Display control device and display control program
JP7417907B2 (en) display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20848484

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20848484

Country of ref document: EP

Kind code of ref document: A1