WO2020246114A1 - Display control device and display control program - Google Patents
Display control device and display control program
- Publication number
- WO2020246114A1 (PCT/JP2020/012383)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- display control
- content
- vehicle
- superimposed
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
Definitions
- the present disclosure relates to a display control device and a display control program that control the display of a virtual image.
- Patent Document 1 proposes a vehicle display device that superimposes and displays content by a head-up display. This vehicle display device superimposes, on the driver's forward view, content indicating the route from the traveling position of the own vehicle to a guidance point.
- The technique of Patent Document 1 is presumed to superimpose the content on an area of the foreground that is visible to the occupant. However, depending on surrounding conditions such as the road structure, the content may need to present information about both an area visible to the occupant and an area invisible to the occupant in the foreground. Patent Document 1 does not propose displaying the content in an easy-to-understand manner even in such a situation.
- An object of the present disclosure is to provide a display control device and a display control program capable of presenting a display that is easy for occupants to understand.
- The display control device is used in a vehicle and controls the superimposed display of content by a head-up display.
- The display control device includes an estimation unit that estimates, for the road surface in the foreground, a visible area that is visible to the occupant of the vehicle and an invisible area that is invisible to the occupant, and a display control unit that superimposes, onto the road surface, route content presenting the planned travel route of the vehicle at a specific point.
- When the specific point is included in the invisible area, the display control unit stops the superimposed display of the route content on the invisible area.
- Instead, information content presenting information about the specific point is superimposed at a display position associated with a predetermined position on the planned travel route, at a display height such that at least part of the content appears above the visible area.
- The display control program is used in a vehicle and controls the superimposed display of content by a head-up display.
- The display control program causes at least one processing unit to estimate, for the road surface in the foreground, a visible area that is visible to the occupant of the vehicle and an invisible area that is invisible to the occupant.
- When the specific point is included in the invisible area, the program stops the superimposed display, on the invisible area, of the route content that is superimposed on the road surface and presents the planned travel route of the vehicle at the specific point, and superimposes information content presenting information about the specific point at a display position associated with a predetermined position on the planned travel route, at a display height such that at least part of the content appears above the visible area.
- According to these configurations, when the specific point is included in the invisible area, the superimposed display of the route content on the invisible area is stopped, and the information content is superimposed so that at least part of it is positioned above the visible area. The occupant can therefore recognize the information about the specific point separately from the route content superimposed on the visible area. Further, since the information content is superimposed at a display position associated with a predetermined position on the planned travel route, the occupant can easily recognize that information on the planned travel route is being presented. It is thus possible to provide a display control device and a display control program capable of presenting a display that is easy for the occupant to understand.
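The display-switching scheme described above can be sketched in code. The following Python fragment is a hypothetical illustration, not part of the patent: `Segment`, `plan_guidance_display`, and the dictionary returned for the information content are invented names, and the visibility of each route segment is assumed to be supplied by the estimation unit.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """A stretch of the planned travel route, tagged by occupant visibility."""
    start_m: float   # distance from the own vehicle along the route
    end_m: float
    visible: bool    # True if the road surface here is visible to the occupant

def plan_guidance_display(segments, guide_point_m):
    """Route content is drawn only on visible road surface; when the guide
    point falls in an invisible area, the route overlay there is suppressed
    and information content is raised above the visible area instead."""
    route = [s for s in segments if s.visible]
    gp_seg = next((s for s in segments
                   if s.start_m <= guide_point_m < s.end_m), None)
    if gp_seg is not None and not gp_seg.visible:
        # Anchor the information content to a predetermined position on the
        # planned travel route, displayed above the visible area.
        info = {"anchor_m": guide_point_m, "above_visible_area": True}
    else:
        info = None
    return route, info
```

When the guide point lies on visible road surface, the ordinary route overlay suffices and no separate information content is produced.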
- FIG. 1 is a diagram showing an overall image of an in-vehicle network including an HCU according to the first embodiment of the present disclosure.
- FIG. 2 is a diagram showing an example of a head-up display mounted on a vehicle.
- FIG. 3 is a diagram showing an example of a schematic configuration of the HCU.
- FIG. 4 is a diagram that visualizes and shows an example of a display layout simulation performed by the display generation unit.
- FIG. 5 is a diagram showing an example of a guidance display.
- FIG. 6 is a diagram showing an example of a guidance display.
- FIG. 7 is a diagram showing an example of a guidance display.
- FIG. 8 is a flowchart showing the display control method of the first embodiment.
- FIG. 9 is a diagram showing an example of the guidance display of the second embodiment.
- FIG. 10 is a diagram showing an example of a guidance display according to the third embodiment.
- FIG. 11 is a diagram showing an example of the guidance display of the fourth embodiment.
- FIG. 12 is a diagram that visualizes and shows an example of a display layout simulation performed by the display generation unit of the fifth embodiment.
- FIG. 13 is a diagram showing an example of the guidance display of the fifth embodiment.
- FIG. 14 is a diagram that visualizes and shows an example of a display layout simulation performed by the display generation unit of another embodiment.
- FIG. 15 is a diagram showing an example of a guidance display of another embodiment.
- the function of the display control device according to the first embodiment of the present disclosure is realized by the HCU (Human Machine Interface Control Unit) 100 shown in FIGS. 1 and 2.
- The HCU 100, together with the head-up display (hereinafter, "HUD") 20 and the like, constitutes the HMI (Human Machine Interface) system 10 used in the vehicle A.
- The HMI system 10 further includes an operation device 26, a DSM (Driver Status Monitor) 27, and the like.
- the HMI system 10 has an input interface function for accepting user operations by an occupant (for example, a driver) of the vehicle A, and an output interface function for presenting information to the driver.
- the HMI system 10 is communicably connected to the communication bus 99 of the vehicle-mounted network 1 mounted on the vehicle A.
- the HMI system 10 is one of a plurality of nodes provided in the vehicle-mounted network 1.
- A peripheral monitoring sensor 30, a locator 40, a DCM 49, a driving support ECU (Electronic Control Unit) 50, a navigation device 60, and the like are also connected as nodes to the communication bus 99 of the vehicle-mounted network 1. These nodes connected to the communication bus 99 can communicate with each other.
- The peripheral monitoring sensor 30 is an autonomous sensor that monitors the surrounding environment of the vehicle A. Within its detection range around the own vehicle, it can detect moving objects such as pedestrians, cyclists, animals, and other vehicles, as well as stationary objects such as fallen objects, guardrails, curbs, road markings, lane markings, and roadside structures.
- the peripheral monitoring sensor 30 provides the detection information of detecting an object around the vehicle A to the driving support ECU 50 and the like through the communication bus 99.
- the peripheral monitoring sensor 30 has a front camera 31 and a millimeter wave radar 32 as a detection configuration for object detection.
- the front camera 31 outputs at least one of the imaging data obtained by photographing the front range of the vehicle A and the analysis result of the imaging data as detection information.
- a plurality of millimeter-wave radars 32 are arranged, for example, on the front and rear bumpers of the vehicle A at intervals from each other.
- the millimeter wave radar 32 irradiates the millimeter wave or the quasi-millimeter wave toward the front range, the front side range, the rear range, the rear side range, and the like of the vehicle A.
- the millimeter wave radar 32 generates detection information by a process of receiving reflected waves reflected by a moving object, a stationary object, or the like.
- The peripheral monitoring sensor 30 may include additional detection configurations such as LiDAR and sonar.
- the locator 40 generates highly accurate position information of vehicle A and the like by compound positioning that combines a plurality of acquired information.
- the locator 40 can specify, for example, the lane in which the vehicle A travels among a plurality of lanes.
- the locator 40 includes a GNSS (Global Navigation Satellite System) receiver 41, an inertial sensor 42, a high-precision map database (hereinafter, “high-precision map DB”) 43, and a locator ECU 44.
- the GNSS receiver 41 receives positioning signals transmitted from a plurality of artificial satellites (positioning satellites).
- the GNSS receiver 41 can receive a positioning signal from each positioning satellite of at least one satellite positioning system among satellite positioning systems such as GPS, GLONASS, Galileo, IRNSS, QZSS, and Beidou.
- the inertial sensor 42 has, for example, a gyro sensor and an acceleration sensor.
- the high-precision map DB 43 is mainly composed of a non-volatile memory, and stores map data with higher accuracy than that used for normal navigation (hereinafter, “high-precision map data”).
- The high-precision map data contains detailed information including at least the height (z) direction.
- The high-precision map data includes information usable for advanced driving support and autonomous driving, such as three-dimensional road shape information (road structure information), the number of lanes, and information indicating the permitted direction of travel for each lane.
- the locator ECU 44 has a configuration mainly including a microcomputer provided with a processor, a RAM, a storage unit, an input / output interface, a bus connecting these, and the like.
- the locator ECU 44 combines the positioning signal received by the GNSS receiver 41, the measurement result of the inertial sensor 42, the vehicle speed information output to the communication bus 99, and the like, and sequentially positions the own vehicle position and the traveling direction of the vehicle A.
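The compound positioning step can be illustrated with a minimal sketch: an inertial dead-reckoning update from vehicle-speed and gyro measurements, periodically blended with a GNSS fix. The function names, the flat x/y coordinate frame, and the fixed blend gain are all assumptions made for illustration; a production locator ECU would use a proper estimator such as a Kalman filter.

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, yaw_rate_dps, dt):
    """Advance the pose using wheel-speed and gyro measurements
    (the inertial-sensor part of the compound positioning)."""
    heading_deg += yaw_rate_dps * dt
    h = math.radians(heading_deg)
    x += speed_mps * dt * math.cos(h)
    y += speed_mps * dt * math.sin(h)
    return x, y, heading_deg

def fuse_gnss(x, y, gnss_x, gnss_y, gain=0.2):
    """Pull the dead-reckoned position toward the GNSS fix
    (a simple complementary blend standing in for the real fusion)."""
    return x + gain * (gnss_x - x), y + gain * (gnss_y - y)
```

Sequentially alternating these two steps yields the own-vehicle position and traveling direction that the locator ECU 44 provides to the other nodes.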
- the locator ECU 44 provides the position information and the direction information of the vehicle A based on the positioning result to the HCU 100, the driving support ECU 50, and the like through the communication bus 99.
- the vehicle speed information is information indicating the current traveling speed of the vehicle A, and is generated based on the detection signal of the wheel speed sensor provided in the hub portion of each wheel of the vehicle A.
- the node (ECU) that generates vehicle speed information and outputs it to the communication bus 99 may be appropriately changed.
- A brake control ECU that controls the distribution of braking force to each wheel, or another in-vehicle ECU such as the HCU 100, may be electrically connected to the wheel speed sensor of each wheel to generate vehicle speed information and output it to the communication bus 99.
- the locator ECU 44 determines whether or not the required high-precision map data is in the high-precision map DB 43 in response to a request from the HCU 100, the driving support ECU 50, and the like. When the requested high-precision map data is in the high-precision map DB 43, the locator ECU 44 reads the corresponding high-precision map data from the high-precision map DB 43 and provides it to the request source ECU.
- The DCM (Data Communication Module) 49 is a communication module mounted on the vehicle A.
- The DCM 49 transmits and receives radio waves to and from base stations around the vehicle A by wireless communication in accordance with communication standards such as LTE (Long Term Evolution) and 5G.
- The driving support ECU 50 has a configuration mainly including a computer with a processor, RAM, a storage unit, an input/output interface, and a bus connecting these.
- The driving support ECU 50 has a driving support function that supports the driver's driving operation. As an example, at the automated driving levels defined by SAE (Society of Automotive Engineers), the driving support ECU 50 enables partially automated driving control (advanced driving support) of Level 2 or lower.
- the driving support ECU 50 recognizes the driving environment around the vehicle A based on the detection information acquired from the peripheral monitoring sensor 30.
- the driving support ECU 50 provides the HCU 100 with the analysis result of the detection information carried out for recognizing the driving environment as the analyzed detection information.
- the driving support ECU 50 can provide the HCU 100 with the relative positions of the left and right lane markings or road edges of the lane in which the vehicle A is currently traveling (hereinafter, "own lane Lns", see FIG. 4).
- the left-right direction is a direction that coincides with the width direction of the vehicle A stationary on the horizontal plane, and is set with reference to the traveling direction of the vehicle A.
- the driving support ECU 50 can exert a plurality of functions for realizing advanced driving support by executing the program stored in the storage unit by the processor.
- the driving support ECU 50 has an ACC (Adaptive Cruise Control) control unit and an LTC control unit.
- the ACC control unit is a functional unit that realizes the functions of the ACC.
- The ACC control unit causes the vehicle A to travel at a constant target speed, or causes it to follow the preceding vehicle while maintaining the inter-vehicle distance.
- the LTC control unit is a functional unit that realizes the function of the LTC (Lane Trace Control).
- the LTC control unit causes the vehicle A to travel in the own lane in cooperation with the ACC control unit according to the planned traveling line generated along the running own lane Lns.
- the navigation device 60 searches for a route to the set destination and guides the traveling along the searched route.
- the navigation device 60 includes a navigation map database (hereinafter, navigation map DB) 61 and a navigation ECU 62.
- the navigation ECU 62 is mainly composed of a microcomputer provided with a processor, RAM, a storage unit, an input / output interface, a bus connecting these, and the like.
- the navigation ECU 62 acquires the position information and the direction information of the vehicle A (own vehicle) from the locator ECU 44 through the communication bus 99.
- the navigation ECU 62 acquires the operation information input to the operation device 26 through the communication bus 99 and the HCU 100, and sets the destination based on the driver operation.
- the navigation ECU 62 searches for a plurality of routes to the destination so as to satisfy conditions such as time priority and distance priority. When one of the plurality of searched routes is selected, the navigation ECU 62 provides the route information based on the set route to the HCU 100 through the communication bus 99 together with the related navigation map data.
- the navigation ECU 62 sequentially outputs a guidance implementation request toward the HCU 100.
- the guide point GP is set near the center of each of the intersection section and the branchable section as an example.
- the guide point GP may be set on the front side or the back side of each of the intersection section and the branchable section.
- the guide point GP is an example of a specific point.
- The guidance implementation request is guidance information used for route guidance to the driver, and specifically includes information on the position of the guidance point GP and information indicating the direction in which the vehicle A should proceed at the guidance point GP.
- the guidance implementation request is output when the remaining distance Lr (see FIG. 4) from the vehicle A to the guidance point GP becomes less than the first threshold value (for example, about 300 m).
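As a minimal sketch of this trigger (the function name and structure are hypothetical; only the 300 m example threshold comes from the description):

```python
FIRST_THRESHOLD_M = 300.0  # example value of the first threshold in the description

def guidance_request_needed(remaining_distance_m, already_requested):
    """Output the guidance implementation request once the remaining distance
    Lr from the vehicle to the guide point GP falls below the first threshold."""
    return (not already_requested) and remaining_distance_m < FIRST_THRESHOLD_M
```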
- the HCU 100 presents information related to route guidance based on the acquisition of the guidance implementation request from the navigation ECU 62.
- the operation device 26 is an input unit that accepts user operations by a driver or the like.
- User operations for switching functions such as the driving support function and the automated driving function between active and inactive, for example, are input to the operation device 26.
- the operation device 26 includes a steering switch provided on the spoke portion of the steering wheel, an operation lever provided on the steering column portion 8, a voice input device for detecting the driver's utterance, and the like.
- The DSM 27 includes a near-infrared light source, a near-infrared camera, and a control unit that controls them.
- the DSM 27 is installed in a posture in which the near-infrared camera is directed toward the headrest portion of the driver's seat, for example, on the upper surface of the steering column portion 8 or the upper surface of the instrument panel 9.
- The DSM 27 uses the near-infrared camera to photograph the driver's head, which is irradiated with near-infrared light by the near-infrared light source.
- the image captured by the near-infrared camera is image-analyzed by the control unit.
- the control unit extracts information such as the position of the eye point EP and the line-of-sight direction from the captured image, and sequentially outputs the extracted state information to the HCU 100.
- the HUD 20 is mounted on the vehicle A as one of a plurality of in-vehicle display devices together with a meter display, a center information display, and the like.
- the HUD 20 is electrically connected to the HCU 100 and sequentially acquires video data generated by the HCU 100. Based on the video data, the HUD 20 presents various information related to the vehicle A, such as route information, sign information, and control information of each vehicle-mounted function, to the driver using the virtual image Vi.
- the HUD 20 is housed in the storage space inside the instrument panel 9 below the windshield WS.
- the HUD 20 projects the light formed as a virtual image Vi toward the projection range PA of the windshield WS.
- the light projected on the windshield WS is reflected toward the driver's seat side in the projection range PA and is perceived by the driver.
- the driver visually recognizes the display in which the virtual image Vi is superimposed on the foreground seen through the projection range PA.
- the HUD 20 includes a projector 21 and a magnifying optical system 22.
- the projector 21 has an LCD (Liquid Crystal Display) panel and a backlight.
- the projector 21 is fixed to the housing of the HUD 20 with the display surface of the LCD panel facing the magnifying optical system 22.
- the projector 21 displays each frame image of video data on the display surface of the LCD panel, and transmits and illuminates the display surface with a backlight to emit light formed as a virtual image Vi toward the magnifying optical system 22.
- the magnifying optical system 22 is configured to include at least one concave mirror in which a metal such as aluminum is vapor-deposited on the surface of a base material made of synthetic resin or glass.
- the magnifying optical system 22 projects the light emitted from the projector 21 onto the upper projection range PA while spreading it by reflection.
- An angle of view VA is set for the HUD 20 described above. Taking the virtual range in space within which the HUD 20 can form the virtual image Vi as the image plane IS, the angle of view VA is defined as the viewing angle based on the virtual lines connecting the driver's eye point EP to the outer edge of the image plane IS.
- The angle of view VA is the angular range within which the driver can visually recognize the virtual image Vi from the eye point EP. In the HUD 20, the horizontal angle of view is larger than the vertical angle of view. When viewed from the eye point EP, the forward range overlapping the image plane IS is the range within the angle of view VA.
- the HUD 20 displays superimposed content CTs (see FIG. 6 and the like) and non-superimposed content as virtual image Vi.
- Superimposed content CTs are AR display objects used for augmented reality (hereinafter referred to as “AR”) display.
- the display position of the superimposed content CTs is associated with a specific superimposed object existing in the foreground, such as a specific position on the road surface, a vehicle in front, a pedestrian, and a road sign.
- The superimposed content CTs are superimposed and displayed on a specific superimposition target in the foreground, and move, as seen by the driver, so as to follow the target and appear relatively fixed to it.
- the shape of the superimposed content CTs may be continuously updated at a predetermined cycle according to the relative position and shape of the superimposed object.
- the superimposed content CTs are displayed in a posture closer to horizontal than the non-superimposed content, and have a display shape extending in the depth direction (traveling direction) as seen from the driver, for example.
- the non-superimposed content is a non-AR display object excluding the superimposed content CTs among the display objects superimposed and displayed in the foreground. Unlike the superimposed content CTs, the non-superimposed content is displayed superimposed on the foreground without specifying the superimposed target.
- the non-superimposed content is displayed at a fixed position in the projection range PA, so that it is displayed as if it is relatively fixed to the vehicle configuration such as the windshield WS.
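The distinction between superimposed (AR) and non-superimposed content can be sketched as a per-frame layout step. Everything below is an illustrative assumption rather than the patent's implementation: the content records, the `world_to_screen` projection callback, and the field names are invented.

```python
def layout_frame(contents, world_to_screen):
    """Compute per-frame screen positions: superimposed (AR) content follows
    its superimposition target in the world, while non-superimposed content
    stays fixed relative to the projection range (i.e. to the windshield)."""
    placed = []
    for c in contents:
        if c["kind"] == "AR":
            # Re-project the world anchor every cycle so the content appears
            # relatively fixed to the superimposition target.
            placed.append((c["id"], world_to_screen(c["world_anchor"])))
        else:
            placed.append((c["id"], c["screen_pos"]))
    return placed
```

Calling this every display cycle reproduces the behavior described above: AR content tracks its target as the vehicle moves, and non-AR content remains stationary in the projection range PA.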
- the HCU 100 is an electronic control device that integrally controls the display by a plurality of in-vehicle display devices including the HUD 20 in the HMI system 10.
- the HCU 100 mainly includes a computer including a processing unit 11, a RAM 12, a storage unit 13, an input / output interface 14, and a bus connecting them.
- the processing unit 11 is hardware for arithmetic processing combined with the RAM 12.
- the processing unit 11 has a configuration including at least one arithmetic core such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
- the processing unit 11 may be configured to further include an FPGA (Field-Programmable Gate Array) and an IP core having other dedicated functions.
- the RAM 12 may be configured to include a video RAM for video generation.
- the processing unit 11 executes various processes for realizing the functions of the functional units described later.
- the storage unit 13 is configured to include a non-volatile storage medium.
- Various programs (display control programs, etc.) executed by the processing unit 11 are stored in the storage unit 13.
- the HCU 100 shown in FIGS. 1 to 3 has a plurality of functional units for controlling the superimposed display of the content by the HUD 20 by executing the display control program stored in the storage unit 13 by the processing unit 11.
- The HCU 100 is constructed with functional units such as a viewpoint position specifying unit 71, a locator information acquisition unit 72, a guidance information acquisition unit 73, an external world information acquisition unit 74, a virtual layout unit 75, and a display generation unit 76.
- the viewpoint position specifying unit 71 identifies the position of the eye point EP of the driver seated in the driver's seat based on the state information acquired from the DSM 27.
- the viewpoint position specifying unit 71 generates three-dimensional coordinates (hereinafter, “eye point coordinates”) indicating the position of the eye point EP, and sequentially provides the generated eye point coordinates to the virtual layout unit 75.
- the locator information acquisition unit 72 acquires the latest position information and direction information about the vehicle A from the locator ECU 44 as own vehicle position information. In addition, the locator information acquisition unit 72 acquires high-precision map data of the peripheral range of the vehicle A from the locator ECU 44. The locator information acquisition unit 72 sequentially provides the acquired vehicle position information and high-precision map data to the virtual layout unit 75.
- the external world information acquisition unit 74 acquires detection information about the peripheral range of the vehicle A, particularly the front range, from the driving support ECU 50. Specifically, the outside world information acquisition unit 74 acquires detection information indicating the relative positions of the left and right lane markings or road edges of the own lane Lns. The external world information acquisition unit 74 sequentially provides the acquired detection information to the virtual layout unit 75. The external world information acquisition unit 74 may acquire the imaging data of the front camera 31 as the detection information instead of the detection information as the analysis result acquired from the driving support ECU 50.
- the virtual layout unit 75 has a function of selecting contents to be used for information presentation based on various acquired information and a function of simulating the display layout of superimposed contents CTs (see FIG. 6 and the like).
- the virtual layout unit 75 selects the content to be used for route guidance at the guidance point GP when the guidance implementation request is acquired from the navigation device 60. Specifically, content that guides a right or left turn at a branch point such as an intersection, content that guides a straight ahead of a predetermined distance, content that guides a lane change, and the like are appropriately selected.
- the virtual layout unit 75 executes a virtual layout function that simulates the display layout of the superimposed content CTs based on various provided information.
- the virtual layout unit 75 reproduces the current traveling environment of the vehicle A in the virtual space based on the own vehicle position information, high-precision map data, detection information, and the like.
- the display generation unit 76 sets the own vehicle object AO at the reference position in the virtual three-dimensional space.
- the display generation unit 76 maps the road model of the shape indicated by the high-precision map data to the three-dimensional space in association with the own vehicle object AO based on the own vehicle position information.
- the display generation unit 76 sets a planned travel route based on the guidance information on the road model.
- the display generation unit 76 sets the virtual camera position CP and the superimposition range SA in association with the own vehicle object AO.
- the virtual camera position CP is a virtual position corresponding to the driver's eye point EP.
- the display generation unit 76 sequentially corrects the virtual camera position CP with respect to the own vehicle object AO based on the latest eye point coordinates acquired by the viewpoint position specifying unit 71.
- the superimposition range SA is the range in which the virtual image Vi can be superimposed and displayed. Based on the virtual camera position CP and the outer-edge position (coordinate) information of the projection range PA stored in advance in the storage unit 13 (see FIG. 1) or the like, the display generation unit 76 sets, as the superimposition range SA, the front range that falls inside the projection range PA when looking forward from the virtual camera position CP.
- the superimposition range SA corresponds to the angle of view VA of the HUD 20.
- the virtual layout unit 75 arranges the first virtual object V1 and the second virtual object V2 in the virtual space.
- the first virtual object V1 is arranged so as to overlap the planned travel route arranged on the road surface of the road model in the three-dimensional space.
- the first virtual object V1 is set in the virtual space when the path content CTr described later is displayed as a virtual image.
- the first virtual object V1 is a strip-shaped object arranged so as to cover the virtual road surface of the planned travel route in a plane.
- the first virtual object V1 is arranged in a traveling section including at least the guide point GP.
- the first virtual object V1 has a curved shape connecting the approach lane and the exit lane, as shown in FIG. 4, in a right/left-turn scene at an intersection.
- the first virtual object V1 defines the position and shape of the route content CTr. That is, the shape of the first virtual object V1 as seen from the virtual camera position CP becomes the virtual image shape of the route content CTr visually recognized from the eye point EP.
- the virtual layout unit 75 estimates, for the road surface in the foreground, the visible area Av that the driver can see and the non-visible area Ai that the driver cannot see, in the virtual space, based on the virtual camera position CP and the road structure model.
- the virtual layout unit 75 estimates the visible area Av and the non-visible area Ai at least with respect to the range in which the first virtual object V1 is arranged on the virtual road surface. For example, as shown in FIG. 4, within the range in which the first virtual object V1 is arranged, the virtual layout unit 75 sets, as the visible area Av, the virtual road surface that intersects straight lines extended from the virtual camera position CP within the superimposition range SA, and sets, as the non-visible area Ai, the virtual road surface that does not intersect such straight lines.
- the virtual layout unit 75 may be configured to estimate the areas Av and Ai indirectly, by estimating the object portion of the first virtual object V1 arranged on the virtual road surface that is visible from the virtual camera position CP and the object portion that is not.
- for example, when there is an uphill road ahead, the virtual layout unit 75 estimates that the virtual road surface of the uphill road is the visible area Av, and that the virtual road surface of the flat road beyond its crest, which cannot be visually recognized from the virtual camera position CP, is the non-visible area Ai.
- the virtual layout unit 75 is an example of an estimation unit.
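The visible/non-visible estimation described above can be sketched as a simple line-of-sight test over a sampled height profile of the road ahead. The following is only an illustrative sketch, not the implementation of the virtual layout unit 75; `RoadSample`, the sampling scheme, and all numbers are assumptions for the example:

```python
from dataclasses import dataclass

@dataclass
class RoadSample:
    distance: float  # longitudinal distance ahead of the virtual camera [m]
    height: float    # road-surface height in the same frame [m]

def classify_visibility(camera_height: float, profile: list[RoadSample]) -> list[bool]:
    """Mark each road sample as visible (True) or occluded (False).

    A sample is visible when the sight line from the camera to it is not
    blocked by nearer terrain, i.e. its elevation angle is not below the
    maximum elevation angle of any nearer sample.
    """
    flags = []
    max_tangent = float("-inf")
    for s in profile:
        tangent = (s.height - camera_height) / s.distance
        flags.append(tangent >= max_tangent)
        max_tangent = max(max_tangent, tangent)
    return flags

# An uphill followed by a flat road beyond the crest: the uphill surface
# is classified visible, the flat road past the crest occluded.
profile = [RoadSample(10.0 * i, h)
           for i, h in enumerate([0.0, 1.0, 2.0, 3.0, 3.0, 3.0], start=1)]
print(classify_visibility(1.2, profile))  # → [True, True, True, True, False, False]
```

The same ray test applied to the sample points of the first virtual object, rather than to the road surface, corresponds to the indirect estimation mentioned above.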
- the second virtual object V2 is set in the virtual space when the non-visual information content CTi, which will be described later, is displayed as a virtual image. More specifically, the second virtual object V2 is arranged when the guide point GP is included in the invisible non-visible area Ai from the virtual camera position CP.
- the second virtual object V2 is a plane-shaped object that floats on the virtual road surface.
- the second virtual object V2 is arranged with the plane of the plane shape facing the own vehicle object side.
- the second virtual object V2 is arranged above the guide point GP set on the virtual road surface. More specifically, the second virtual object V2 sets the position (x, y) on the two-dimensional coordinates parallel to the horizontal plane in the virtual space to be the same as the position on the two-dimensional coordinates of the guide point GP.
- the position of the second virtual object V2 in the height (z) direction is a height position where the entire second virtual object V2 can be visually recognized from the virtual camera position CP. That is, the lower edge of the second virtual object V2 is positioned above the straight line connecting the virtual camera position CP and the top of the road surface.
- the upper edge of the second virtual object V2 is positioned below the upper edge of the superimposition range SA. When the second virtual object V2, at its preset initial size, would protrude above the upper edge of the superimposition range SA, its vertical size is adjusted to be smaller so that it fits below the upper edge of the superimposition range SA.
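The height placement described above (lower edge above the sight line grazing the road crest, upper edge kept below the superimposition range, with the preset size shrunk when it would protrude) can be sketched as follows. The function and parameter names are assumptions for the example, not the patent's API:

```python
def place_second_object(sightline_top_z: float, sa_upper_z: float,
                        initial_height: float, margin: float = 0.05) -> tuple[float, float]:
    """Return (lower_z, upper_z) for the plane-shaped second virtual object.

    The lower edge sits just above the straight line grazing the crest of
    the road surface; the upper edge must stay below the upper edge of the
    superimposition range SA. If the preset initial size would protrude,
    the vertical size is reduced to fit below the upper edge.
    """
    lower = sightline_top_z + margin
    upper = lower + initial_height
    if upper > sa_upper_z:   # would protrude above the superimposition range
        upper = sa_upper_z   # shrink vertically to fit
    if upper <= lower:
        raise ValueError("no room for the object inside the range")
    return lower, upper
```

With a generous range the preset size is kept (e.g. `place_second_object(1.0, 3.0, 1.5)` gives roughly `(1.05, 2.55)`); with a tight range the upper edge is clamped (`place_second_object(1.0, 2.0, 1.5)` gives an upper edge of `2.0`).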
- the virtual layout unit 75 estimates the position of the visible point VP at which the guide point GP can be visually recognized from the driver. Based on the driver's field of view range set in advance, the virtual layout unit 75 estimates that the point where the guide point GP enters the field of view range when viewed from the virtual camera position CP is the visible point VP.
- the visible point VP is a point from which the guide point GP can be kept continuously visible from the time of passing the visible point VP until the time of entering the guide point GP.
- the virtual layout unit 75 does not regard as the visible point VP a point beyond which the guide point GP becomes invisible again after passing, due to the road structure or the like.
- the virtual layout unit 75 may be configured to regard the guide point GP as remaining visible when the length of the section in which the guide point GP again becomes invisible is less than a threshold value.
- the virtual layout unit 75 may simply estimate the visible point VP based on the road structure.
- the visible point VP is a point in the section from the vehicle A to the guide point GP where the amount of change in the gradient is equal to or greater than a predetermined amount.
- a point where the magnitude of the gradient is equal to or less than a predetermined value may be regarded as a visible point VP.
- the highest altitude point in the section from the vehicle A to the guide point GP may be regarded as the visible point VP.
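The simplified estimation based on road structure described above can be sketched as a scan of a sampled gradient profile: report the first point whose gradient change reaches a threshold, falling back to the highest-altitude point in the section. All names and numbers below are invented for the illustration:

```python
def estimate_visible_point(distances: list[float], heights: list[float],
                           grade_change_min: float = 0.03) -> float:
    """Estimate the visible point VP from a sampled height profile.

    Returns the distance of the first sample where the gradient change
    is at least `grade_change_min`; if no sample qualifies, falls back
    to the highest-altitude sample in the section.
    """
    grades = [(heights[i + 1] - heights[i]) / (distances[i + 1] - distances[i])
              for i in range(len(heights) - 1)]
    for i in range(1, len(grades)):
        if abs(grades[i] - grades[i - 1]) >= grade_change_min:
            return distances[i]
    return distances[max(range(len(heights)), key=heights.__getitem__)]

# The crest at 20 m, where an uphill levels off, is picked as the VP.
print(estimate_visible_point([0, 10, 20, 30, 40], [0, 1, 2, 2, 2]))  # → 20
```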
- the display generation unit 76 controls the presentation of information to the driver by the HUD 20 by a process of generating video data that is sequentially output to the HUD 20.
- the display generation unit 76 is an example of a display control unit.
- the display generation unit 76 has a function of drawing the content and a function of controlling the display period of the content based on various acquired information.
- the display generation unit 76 determines the original image to be drawn in each frame image constituting the video data based on the selection result of the content acquired from the virtual layout unit 75.
- the display generation unit 76 corrects the drawing position and drawing shape of the original image in the frame image according to the eye point EP and the position of each superimposition target. As a result, the superimposed content CTs are displayed with the position and shape correctly superimposed on the superimposition target when viewed from the eye point EP.
- the display generation unit 76 selects the content to be drawn on the video data based on the content selection result of the virtual layout unit 75 and the layout information as the simulation result using the virtual space. As an example, the display generation unit 76 draws the route content CTr (see FIGS. 5 to 7) and the non-visual information content CTi (see FIG. 6), which are contents related to the route guidance processing, and presents them to the driver.
- Route content CTr is content used to display the planned travel route of vehicle A.
- the route content CTr is superimposed content CTs whose superimposition target is the road surface of the planned travel route, and its drawing shape is determined based on the first virtual object V1 arranged in the display simulation.
- the route content CTr is drawn in a shape along the planned travel route, and indicates the lane in which vehicle A should travel, the point where right / left turn and lane change are required, and the like.
- the route content CTr has a drawing shape that imitates the shape of the lane of the planned travel route, and is a sheet shape that extends in a strip along the traveling direction of the vehicle A.
- the route content CTr has a linear shape in a straight-traveling scene, and, in a right/left-turn scene, has a mode that connects the approach lane and the exit lane of the planned travel route within the intersection.
- the route content CTr updates its drawing shape at a predetermined update cycle so as to match the road surface shape seen from the eye point EP as the vehicle A travels.
- the display generation unit 76 stops the superimposed display of the route content CTr on the non-visible area Ai when the guide point GP is included in the non-visible area Ai. That is, the display generation unit 76 superimposes the route content CTr only on the visible area Av, and does not present information on the non-visible area Ai by the route content CTr. Based on the result of the display simulation, the display generation unit 76 generates, as the route content CTr, only the content portion corresponding to the portion of the first virtual object V1 arranged in the visible area Av.
- the non-visible information content CTi is a content that presents information about the non-visible area Ai, particularly the guide point GP, to the driver in a content mode different from the route content CTr.
- the non-visual information content CTi is superimposed content CTs whose superimposition target is the space above the road surface of the planned travel route, and its drawing shape and drawing position are determined based on the second virtual object V2 arranged in the display simulation.
- the non-visual information content CTi is imaged at the positions on the image plane IS where the virtual lines connecting the virtual camera position CP and each point of the second virtual object V2 intersect the image plane.
- the non-visual information content CTi is drawn at the display position associated with the position of the guide point GP, which is a point on the planned travel route.
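The imaging described above (intersecting the lines from the virtual camera position CP through each object point with the image plane IS) is an ordinary perspective projection. A minimal sketch under assumed coordinate conventions (forward x axis, lateral y, vertical z; none of these names come from the patent):

```python
def project_to_image_plane(camera: tuple[float, float, float],
                           point: tuple[float, float, float],
                           plane_distance: float):
    """Project a 3-D point onto an image plane `plane_distance` ahead
    of the camera along the forward (x) axis.

    Returns the (y, z) position where the straight line from the camera
    through the point crosses the plane, or None if the point lies
    behind the camera.
    """
    cx, cy, cz = camera
    px, py, pz = point
    depth = px - cx
    if depth <= 0:
        return None
    t = plane_distance / depth  # similar-triangle scale factor
    return (cy + t * (py - cy), cz + t * (pz - cz))

# A point 20 m ahead, 4 m to the side and 2 m above the camera,
# projected onto a plane 5 m in front of it:
print(project_to_image_plane((0, 0, 1), (20, 4, 3), 5))  # → (1.0, 1.5)
```

Re-running this projection every frame with the latest eye point coordinates corresponds to the per-cycle correction of the drawing position and shape.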
- the non-visible information content CTi is generated when the guide point GP is in the non-visible area Ai and the distance from the visible point VP to the guide point GP is less than the threshold value.
- the non-visual information content CTi presents the direction in which the vehicle A should travel at the guidance point GP as information regarding the guidance point GP.
- the non-visual information content CTi is content in which an arrow that simply indicates the shape of the traveling path near the guide point GP (an arrow curved in the left-turn direction) is drawn on the surface of a rectangular object.
- the non-visual information content CTi is drawn at a display height away from the road surface in the visible area Av, and is displayed at a display position that does not overlap with the route content CTr.
- the non-visual information content CTi is displayed above the guide point GP, as if floating above the road surface, at a height visible to the driver.
- the drawing shape and drawing position of the non-visual information content CTi are updated at a predetermined update cycle so that it stays at the spatial position above the guide point GP as seen from the eye point EP while the vehicle A travels.
- the display generation unit 76 changes the display state of the above-mentioned route content CTr and non-visual information content CTi according to the remaining distance Lr to the guide point GP of the vehicle A. This change in the display state will be described below with reference to FIGS. 4 to 7.
- in the traveling section where the remaining distance Lr to the guide point GP is less than the first threshold value and equal to or greater than the second threshold value (for example, about 100 m), the display generation unit 76 displays only the route content CTr, prior to the non-visual information content CTi (see FIG. 5).
- in this section, the display generation unit 76 hides the non-visual information content CTi even when the guide point GP is included in the non-visible area Ai. As a result, at a stage relatively far from the guide point GP, the display generation unit 76 presents to the driver only the planned travel route on the road surface visible to the driver.
- when the remaining distance Lr becomes less than the second threshold value, the display generation unit 76 displays both the route content CTr and the non-visual information content CTi (see FIG. 6). As a result, at a stage relatively close to the guide point GP, the display generation unit 76 presents to the driver the information of the guide point GP that the driver cannot yet see.
- the non-visual information content CTi indicates that a left turn is required beyond the visible area Av.
- the display generation unit 76 hides the non-visual information content CTi when the guide point GP becomes visible as the vehicle A travels. Then, the display generation unit 76 presents the planned travel route at the guide point GP, now included in the visible area Av, by generating the route content CTr superimposed on the guide point GP (see FIG. 7). For example, when the vehicle A moves from an uphill road to a flat road, the guide point GP becomes visible. The display generation unit 76 determines that the guide point GP has become visible when the vehicle A reaches the visible point VP.
- the display control process shown in FIG. 8 is started by the HCU 100 that has completed the start-up process or the like, for example, by switching the vehicle power supply to the on state.
- the HCU 100 determines in S10 whether or not the destination is set based on the information from the navigation ECU 62. If it is determined that the destination has not been set, the determination in S10 is repeated until the destination is set. If it is determined that the destination has been set, the process proceeds to S20.
- in S20, it is determined whether or not there is a guidance implementation request from the navigation ECU 62, in other words, whether or not the remaining distance Lr from the vehicle A to the guide point GP is less than the first threshold value. If it is determined that there is no guidance implementation request, the determination in S20 is repeated until the guidance implementation request is obtained. On the other hand, if it is determined that there is a guidance implementation request, the process proceeds to S30.
- in S30, the display layout is simulated to estimate the visible area Av and the non-visible area Ai of the road surface.
- the process proceeds to S40, and the route content CTr is superimposed and displayed only on the visible area Av. That is, the superimposition of the route content CTr on the non-visual area Ai is interrupted.
- the process proceeds to S50.
- in S50, it is determined whether or not the remaining distance Lr from the vehicle A to the guide point GP is less than the second threshold value, based on the own vehicle position information from the locator ECU 44 and the position information of the guide point GP from the navigation ECU 62. While it is determined that the remaining distance Lr is equal to or greater than the second threshold value, the processes of S30 and S40 are repeated, and the drawing shape of the route content CTr is updated according to the changes of the areas Av and Ai accompanying travel. On the other hand, if it is determined that the remaining distance Lr is less than the second threshold value, the process proceeds to S60.
- in S60, the estimation of the areas Av and Ai is performed again, and the process proceeds to S70.
- in S70, it is determined whether or not the guide point GP is included in the visible area Av, based on the areas Av and Ai estimated in S60 and the position information of the guide point GP.
- if it is determined that the guide point GP is included in the visible area Av, the process proceeds to S80, the route content CTr is superimposed and displayed on the visible area Av in the same manner as in S40, and the process proceeds to S110.
- if it is determined that the guide point GP is not included in the visible area Av, the process proceeds to S90.
- in S90, it is determined whether or not the distance from the visible point VP to the guide point GP is less than the threshold value. If it is determined that the distance is equal to or greater than the threshold value, the process proceeds to S80. That is, when the visible point VP is relatively far from the guide point GP, the display of the non-visible information content CTi is stopped. On the other hand, if it is determined that the distance from the visible point VP to the guide point GP is less than the threshold value, the process proceeds to S100, in which the non-visible information content CTi is superimposed and displayed.
- in S110, it is determined whether or not the end condition of the route guidance display is satisfied. The end condition is determined to be satisfied based on, for example, passage of the guide point GP or passage of a guidance end point set ahead of the guide point GP in the traveling direction. While it is determined that the end condition is not satisfied, the processes of S60 to S100 are repeated, updating the drawing shape and display state of the superimposed content CTs (the route content CTr and the non-visible information content CTi) related to the route guidance. On the other hand, when it is determined that the end condition is satisfied, the process proceeds to S120, the superimposed content CTs related to the route guidance are hidden, and the series of processes ends.
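The S10 to S120 flow amounts to a per-cycle decision about which guidance contents to superimpose. The sketch below mirrors the branch structure only; the function, its string-set return value, and the parameter names are invented for the illustration and are not the HCU 100's actual interface:

```python
def guidance_display_state(remaining: float, first_th: float, second_th: float,
                           gp_visible: bool, dist_vp_to_gp: float,
                           vp_th: float) -> set:
    """Decide the contents to superimpose in the current cycle.

    Far from the guide point only the route content is shown; once the
    remaining distance drops below the second threshold, the non-visible
    information content is added when the guide point is hidden and the
    visible point is close enough to it (corresponding to S20-S100).
    """
    if remaining >= first_th:
        return set()                        # no guidance request yet (S20)
    if remaining >= second_th:
        return {"route"}                    # S30 -> S40
    if gp_visible:
        return {"route"}                    # S70 -> S80
    if dist_vp_to_gp >= vp_th:
        return {"route"}                    # S90 -> S80: VP still far from GP
    return {"route", "non_visible_info"}    # S90 -> S100

print(guidance_display_state(50, 300, 100, False, 50, 80))
```

Calling this every update cycle and redrawing accordingly corresponds to the repetition of S60 to S100 until the end condition is satisfied.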
- the display generation unit 76 of the HCU 100 stops the superimposed display of the route content CTr on the non-visible area Ai.
- the display generation unit 76 superimposes and displays the non-visual information content CTi on the display position associated with the guide point GP at the display height displayed above the visible area Av.
- according to the first embodiment, when the guide point GP is included in the non-visible area Ai, the superimposed display of the route content CTr on the non-visible area Ai is stopped, and the non-visible information content CTi is superimposed and displayed at a position above the visible area Av. Therefore, the driver, who is an occupant, can recognize the information about the guide point GP separately from the route content CTr superimposed on the visible area Av. In addition, since the non-visible information content CTi is superimposed and displayed at the display position associated with the guide point GP, which is a predetermined position on the planned travel route, the occupant easily recognizes that information on the planned travel route is being presented.
- as described above, the HCU 100 and the display control program can provide a display that is easy for the driver to understand. Moreover, since the information of the guide point GP located in the non-visible area Ai can be presented to the driver in advance by the non-visible information content CTi, the HCU 100 can provide a highly convenient display.
- the display generation unit 76 displays the non-visual information content CTi when the distance from the visible point VP to the guide point GP is less than the threshold value. Therefore, the display generation unit 76 can reliably present the information of the guide point GP in advance in a situation where the time from when the guide point GP becomes visible until the vehicle reaches it is relatively short. The display generation unit 76 can thereby give a sense of security to the driver.
- when the distance from the visible point VP to the guide point GP is equal to or greater than the threshold value, the display generation unit 76 stops the display of the non-visible information content CTi. According to this, the display generation unit 76 can avoid displaying the non-visible information content CTi in a situation where the need to present the information of the guide point GP in advance is relatively small. Therefore, the display generation unit 76 can prevent the inside of the angle of view VA from becoming cluttered with displayed objects.
- the display generation unit 76 displays the entire non-visual information content CTi at a display position above the route content CTr superimposed on the visible area Av and not overlapping the route content CTr. According to this, since the non-visual information content CTi and the route content CTr are displayed apart from each other, the display generation unit 76 can provide a display that is easier to see. In particular, since the non-visual information content CTi of the first embodiment is displayed directly above the upper edge of the route content CTr, it is easy to understand that the information on the route continuous with the route content CTr is presented.
- the display generation unit 76 superimposes and displays the non-visual information content CTi on the display position associated with the position of the guide point GP. According to this, it is possible to more clearly present that the non-visual information content CTi is a display related to the guide point GP.
- the display generation unit 76 hides the non-visual information content CTi when the driver can visually recognize the guide point GP. According to this, the display generation unit 76 hides the non-visual information content CTi when it is no longer needed, so that it is possible to prevent the inside of the angle of view VA from becoming complicated by the displayed object.
- the display generation unit 76 displays the route content CTr prior to the non-visual information content CTi. According to this, the display generation unit 76 can display the non-visual information content CTi, whose superimposition target is farther from the vehicle A, after the route content CTr. Therefore, it is possible to avoid a situation in which the non-visual information content CTi is displayed at a stage where the vehicle is still relatively far from the non-visible area Ai and the need for it is small, cluttering the inside of the angle of view VA.
- the display generation unit 76 of the second embodiment implements an animation display of the non-visual information content CTi.
- the non-visual information content CTi is content having a linear arrow shape pointing in the left direction, which is the planned traveling direction at the guide point GP (see FIG. 9).
- the non-visual information content CTi is animated and displayed so as to move in the traveling direction at the guidance point GP.
- the non-visual information content CTi is drawn so as to move continuously and smoothly to the left from the movement start position indicated by the dotted line. After reaching the movement end position, the non-visual information content CTi disappears and reappears at the movement start position, or returns to the movement start position faster than its leftward movement, and is then animated to move to the left again.
- the non-visual information content CTi has its movement start position above the guide point GP, and is displayed so as to move from that position as the starting point.
- the non-visual information content CTi may have its movement end position above the guide point GP, or may be displayed so as to pass above the guide point GP between the movement start position and the movement end position.
- since the display generation unit 76 of the second embodiment animates the non-visual information content CTi so as to move toward the planned traveling direction at the guidance point GP, the guidance information can be presented to the driver using the movement of the non-visual information content CTi. Therefore, the display generation unit 76 can present information in a more intuitive and easy-to-understand manner.
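The looping movement of the second embodiment can be sketched as a phase that wraps at the end of each cycle, which reads as the arrow snapping back to the movement start position and setting off again. The function name, period, and travel distance are assumptions for the example:

```python
def arrow_offset(t: float, period: float = 1.5, travel: float = 0.8) -> float:
    """Leftward offset of the arrow content at time t [s].

    The arrow moves smoothly from the movement start position (offset 0)
    by `travel` meters over one period, then snaps back to the start and
    repeats.
    """
    phase = (t % period) / period  # 0.0 -> 1.0 within each cycle
    return travel * phase

# Mid-cycle the arrow sits halfway along its travel; at each period
# boundary it is back at the movement start position.
print(arrow_offset(0.75))  # → 0.4
print(arrow_offset(1.5))   # → 0.0
```

A faster return instead of an instantaneous snap, as also described above, would replace the wraparound with a short reverse phase at the end of each cycle.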
- the non-visual information content CTi presents the road shape of the guidance point GP as guidance information.
- the non-visual information content CTi is content that schematically illustrates a plan view, seen from above, of the intersection where the guide point GP is set.
- the non-visual information content CTi presents a warning as guidance information that the road surface including the guidance point GP cannot be seen by the driver.
- the non-visual information content CTi is content in which an exclamation mark is drawn on the surface of a circular object.
- the virtual layout unit 75 arranges the second virtual object V2 having a shape that follows the planned travel route in the same manner as the first virtual object V1 (see FIG. 12).
- the second virtual object V2 is a band-shaped object that floats above the non-visible area Ai.
- the tip of the second virtual object V2 has an arrow shape that fits within the superposition range SA.
- the second virtual object V2 is arranged in a posture in which a strip-shaped object, arranged so as to planarly cover the virtual road surface of the planned travel route in the non-visible area Ai, is lifted to a position visible from the virtual camera position CP.
- the second virtual object V2 is arranged so that the entire second virtual object V2 fits in the superposition range SA.
- the second virtual object V2 has a shape that is continuous with the portion of the first virtual object V1 arranged in the visible area Av and extends with the same gradient. If, with the above posture and shape, the second virtual object V2 would extend beyond the superimposition range SA, it is arranged in a posture that fits within the superimposition range SA, or is deformed into a shape that fits within the superimposition range SA.
- the display generation unit 76 draws the non-visual information content CTi as superimposed content CTs whose superimposition target is the space that lies above the road surface of the planned travel route in the non-visible area Ai and is visible to the driver (see FIG. 13).
- the non-visible information content CTi has a shape in which the route content CTr is extended in the air along the planned travel route of the non-visible area Ai.
- the non-visual information content CTi is superimposed and displayed at the display position associated with the cut-off portion of the route content CTr, that is, the boundary point between the visible area Av and the non-visible area Ai.
- the non-visual information content CTi is different from the path content CTr in display modes such as brightness, transmittance, display color, and pattern. As a result, the non-visual information content CTi is visually distinguished from the route content CTr superimposed on the road surface of the visible area Av.
- the non-visual information content CTi may have the same display mode as the route content CTr.
- the HCU 100 superimposes and displays the route content CTr and the non-visual information content CTi that present the route to the destination set by the navigation device 60.
- the route content CTr and the non-visual information content CTi that present the planned travel route of the vehicle A by the LTC or LCA may be displayed.
- a point where a right or left turn or a lane change is made, a point where a plurality of roads, such as an intersection, are connected, and the like are set as specific points.
- the HCU 100 displays the non-visual information content CTi as if it were floating above the guide point GP at a height that can be visually recognized by the driver.
- the HCU 100 may display the non-visual information content CTi as if it were floating at a point on the front side of the guide point GP.
- the HCU 100 may place the second virtual object V2 above the top point.
- since the non-visual information content CTi is drawn based on the second virtual object V2 arranged above the top point, its display size for the same remaining distance Lr becomes larger than when it is superimposed above the guide point GP.
- the HCU 100 displays the entire non-visual information content CTi so as not to overlap the visible area Av above the visible area Av.
- the HCU 100 may display a part of the non-visual information content CTi so as to overlap the visible area Av.
- the non-visual information content CTi is displayed so as to partially overlap the path content CTr superimposed on the visible area Av.
- the non-visual information content CTi is displayed as if it were floating at a point on the planned travel route in the visible area Av.
- for example, the non-visual information content CTi may be displayed so that a part of it overlaps the visible area Av when it would otherwise protrude beyond the angle of view VA. Whether or not it protrudes from the angle of view VA may be determined by the display generation unit 76 based on the width from the upper edge of the angle of view VA to the upper edge of the visible area Av.
- in the above embodiments, the non-visible area Ai is an area of the road surface blocked by an uphill road; however, the non-visible area Ai may instead be an area of the road surface blocked by a roadside structure on a curved road.
- the processing unit and processor of the above-described embodiment include one or more CPUs (Central Processing Units).
- a processing unit and a processor may be a processing unit including a GPU (Graphics Processing Unit), a DFP (Data Flow Processor), and the like in addition to the CPU.
- the processing unit and the processor may be a processing unit including an FPGA (Field-Programmable Gate Array) and an IP core specialized for specific processing such as AI learning and inference.
- Each arithmetic circuit unit of such a processor may be individually mounted on a printed circuit board, or may be mounted on an ASIC (Application Specific Integrated Circuit), an FPGA, or the like.
- the display control program may be stored in a non-transitory tangible storage medium such as a flash memory or a hard disk, and the form of such a storage medium may be changed as appropriate.
- the storage medium may be in the form of a memory card or the like, and may be inserted into a slot portion provided in an in-vehicle ECU and electrically connected to a control circuit.
- the control unit and its method described in the present disclosure may be realized by a dedicated computer constituted by a processor programmed to execute one or more functions embodied by a computer program.
- the apparatus and method thereof described in the present disclosure may be realized by a dedicated hardware logic circuit.
- the apparatus and method thereof described in the present disclosure may be realized by one or more dedicated computers configured by a combination of a processor that executes a computer program and one or more hardware logic circuits.
- the computer program may be stored in a computer-readable non-transitory tangible recording medium as instructions executed by the computer.
- each section is expressed as, for example, S10. Further, each section can be divided into a plurality of subsections, while a plurality of sections can be combined into one section.
- each section thus constructed can be referred to as a device, module, or means.
Landscapes
- Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Automation & Control Theory (AREA)
- Navigation (AREA)
- Instrument Panels (AREA)
- Controls And Circuits For Display Device (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-105597 | 2019-06-05 | ||
JP2019105597A JP7014206B2 (ja) | 2019-06-05 | 2019-06-05 | Display control device and display control program
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020246114A1 (ja) | 2020-12-10 |
Family
ID=73649501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/012383 WO2020246114A1 (ja) | 2020-03-19 | Display control device and display control program
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7014206B2 (ja)
WO (1) | WO2020246114A1 (ja)
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220160279A (ko) | 2021-05-27 | 2022-12-06 | Hyundai Motor Company | Mobile device and vehicle
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010156608A (ja) * | 2008-12-26 | 2010-07-15 | Toshiba Corp | In-vehicle display system and display method
JP2015128956A (ja) * | 2014-01-08 | 2015-07-16 | Pioneer Corporation | Head-up display, control method, program, and storage medium
WO2018051912A1 (ja) * | 2016-09-14 | 2018-03-22 | Panasonic IP Management Co., Ltd. | Display device
- 2019-06-05: JP application JP2019105597A filed; granted as patent JP7014206B2 (legal status: active)
- 2020-03-19: PCT application PCT/JP2020/012383 filed; published as WO2020246114A1 (legal status: application filing)
Also Published As
Publication number | Publication date |
---|---|
JP7014206B2 (ja) | 2022-02-01 |
JP2020196417A (ja) | 2020-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11996018B2 (en) | Display control device and display control program product | |
JP7052786B2 (ja) | Display control device and display control program | |
US20220118983A1 (en) | Display control device and display control program product | |
US20220024314A1 (en) | Display device and non-transitory computer-readable storage medium for display control on head-up display | |
US11850940B2 (en) | Display control device and non-transitory computer-readable storage medium for display control on head-up display | |
US20220058998A1 (en) | Display control device and non-transitory computer-readable storage medium for display control on head-up display | |
JP7420165B2 (ja) | Display control device and display control program | |
US11710429B2 (en) | Display control device and non-transitory computer readable storage medium for display control by head-up display | |
JP7283448B2 (ja) | Display control device and display control program | |
JP7338735B2 (ja) | Display control device and display control program | |
JP7416114B2 (ja) | Display control device and display control program | |
JP7255429B2 (ja) | Display control device and display control program | |
JP7697309B2 (ja) | Display control device and display control program | |
JP7243660B2 (ja) | Display control device and display control program | |
JP7092158B2 (ja) | Display control device and display control program | |
JP7111137B2 (ja) | Display control device and display control program | |
JP7111121B2 (ja) | Display control device and display control program | |
WO2021065735A1 (ja) | Display control device and display control program | |
JP2021094965A (ja) | Display control device and display control program | |
JP7014206B2 (ja) | Display control device and display control program | |
JP2021060808A (ja) | Display control system and display control program | |
JP7259802B2 (ja) | Display control device, display control program, and in-vehicle system | |
JP2021037895A (ja) | Display control system, display control device, and display control program | |
JP2021066197A (ja) | Display control device, display control program, and in-vehicle system | |
JP2020138610A (ja) | Vehicle display control device, vehicle display control method, and vehicle display control program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 20818681; Country of ref document: EP; Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | EP: PCT application non-entry in European phase |
Ref document number: 20818681; Country of ref document: EP; Kind code of ref document: A1 |