WO2020158601A1 - Display control device, method, and computer program

Info

Publication number
WO2020158601A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, update cycle, real object, display update, setting process
Application number
PCT/JP2020/002504
Other languages
French (fr), Japanese (ja)
Inventor
誠 秦
Original Assignee
日本精機株式会社
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Publication of WO2020158601A1

Classifications

    • B60K35/00 Arrangement of adaptations of instruments (B60K: instrumentation or dashboards for vehicles)
    • B60R11/02 Arrangements for holding or mounting radio sets, television sets, telephones, or the like; arrangement of controls thereof
    • G02B27/01 Head-up displays
    • G08G1/16 Anti-collision systems (G08G: traffic control systems for road vehicles)
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Indicators characterised by the display of a graphic pattern, with means for controlling the display position

Definitions

  • The present disclosure relates to a display control device, method, and computer program that are used in a vehicle and superimpose an image on the vehicle's foreground for visual recognition.
  • Patent Document 1 discloses a display device that detects the position of an object (another vehicle) around the host vehicle, generates data predicting the size and position of the object after a predetermined time from the relative speed between the host vehicle and the object, and performs display based on the generated data, thereby displaying in accordance with the position of the object.
  • The outline of this disclosure relates to directing visual attention to a target quickly. More specifically, it also relates to directing visual attention promptly while smoothing the display transition of an image based on a predicted position of a real object.
  • The display control device described in this specification can execute a first position setting process, which sets the position of the image 200 based on the position of the real object acquired immediately before, and a second position setting process, which sets the position of the image 200 based on a predicted position of the real object in the display update cycle of the image 200, predicted from one or more past positions of the real object including at least the position acquired immediately before. The device executes the first position setting process in a first display update cycle Fα that immediately follows acquisition of the position of the real object, and executes the second position setting process in a second display update cycle Fβ that does not immediately follow such acquisition.
  • In some embodiments, for a first image 220 the first position setting process is executed in the first display update cycle Fα and the second position setting process is executed in the second display update cycle Fβ, while for a second image 230 the second position setting process is executed in both the first display update cycle Fα and the second display update cycle Fβ.
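The patent does not give an implementation, but the cycle-selection rule above can be sketched in a few lines of Python. All names here (observations, predict_from, new_observation_arrived) are illustrative assumptions, not part of the disclosure:

```python
# Minimal sketch of the first/second position setting processes: use the raw
# observation in the display update cycle immediately after it arrives
# (F-alpha), otherwise fall back to a predicted position (F-beta).

def set_image_position(observations, new_observation_arrived, predict_from):
    """Return the real-object position used to place the AR image this cycle.

    observations: past observation positions, newest last (assumed non-empty).
    predict_from: callable that predicts this cycle's position from past ones.
    """
    if new_observation_arrived:
        # First display update cycle F-alpha: first position setting process.
        return observations[-1]
    # Second display update cycle F-beta: second position setting process.
    return predict_from(observations)
```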
  • FIG. 2 is a block diagram of a vehicular display system according to some embodiments. FIG. 3 is a diagram showing a first AR image displayed in association with a first real object and a second AR image displayed in association with a second real object, according to some embodiments. FIG. 4 is a diagram explaining how the specific position of the first real object and the specific position of the second real object are set for each display update cycle of the AR image, according to some embodiments.
  • Below, FIGS. 1 and 2 provide a description of the configuration of an exemplary vehicular display system, FIG. 3 provides a description of the image displayed by the vehicular display system, and FIG. 4 provides a description of the processing method. The present invention is not limited to the following embodiments (including the contents of the drawings); changes (including deletion of components) can of course be made to them. In the following description, explanations of known technical matters are omitted as appropriate to facilitate understanding of the present invention.
  • The image display unit 11 in the vehicle display system 10 is a head-up display (HUD) device provided in the dashboard 5 of the host vehicle 1. The HUD device emits display light 11a toward the front windshield 2 (an example of a projection member), thereby displaying the image 200 in a virtual display area 100 so that the image 200 is visually recognized superimposed on the foreground 300, the real space seen through the front windshield 2.
  • In the following description, the horizontal direction when the driver 4 seated in the driver's seat of the host vehicle 1 faces the front of the host vehicle 1 is defined as the X axis (the left direction being the positive X-axis direction), the vertical direction as the Y axis (the upward direction being the positive Y-axis direction), and the front-back direction as the Z axis (the forward direction being the positive Z-axis direction).
  • The image display unit 11 may instead be a head-mounted display (HMD) device. The driver 4 wears the HMD device on his or her head and sits in the seat of the host vehicle 1, visually recognizing the displayed image 200 superimposed on the foreground 300 through the front windshield 2 of the host vehicle 1.
  • The display area 100 in which the vehicle display system 10 displays the predetermined image 200 is fixed at a specific position in the coordinate system of the host vehicle 1; when the driver 4 looks in that direction, the driver can visually recognize the image 200 displayed in the display area 100 fixed at that specific position.
  • Under the control of a display control device 13 described later, the image display unit 11 displays the image 200 in the vicinity of a real object 310 existing in the foreground 300, the real space (real scene) visually recognized through the front windshield 2 of the host vehicle 1, such as an obstacle (pedestrian, bicycle, motorcycle, another vehicle, etc.), a road surface, a road sign, or a feature (building, bridge, etc.); at a position overlapping the real object 310; or at a position set with reference to the real object 310 (each being an example of a specific positional relationship between the image and the real object).
  • The image display unit 11 can display an AR (augmented reality) image 210, whose display position changes according to the position of the real object 310 (described in detail later), and a non-AR image (not shown), whose display position does not change according to the position of the real object 310.
  • FIG. 2 is a block diagram of a vehicle display system 10 according to some embodiments.
  • The vehicle display system 10 includes the image display unit 11 and a display control device 13 that controls the image display unit 11. The display control device 13 comprises one or more I/O interfaces 14, one or more processors 16, one or more memories 18, and one or more image processing circuits 20. The various functional blocks depicted in FIG. 2 may be implemented in hardware, software, or a combination of both. FIG. 2 shows only one embodiment of an implementation; the illustrated components may be combined into fewer components, or there may be additional components.
  • For example, the image processing circuit 20 (e.g., a graphics processing unit) may be included in the one or more processors 16.
  • The processor 16 and the image processing circuit 20 are operably connected to the memory 18. More specifically, the processor 16 and the image processing circuit 20 can operate the vehicular display system 10, for example generating and/or transmitting image data, by executing a program stored in the memory 18.
  • The processor 16 and/or the image processing circuit 20 can include at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • The memory 18 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, and any type of semiconductor memory, both volatile and non-volatile; the volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVROM.
  • The processor 16 is operably connected to the I/O interface 14. The I/O interface 14 may include a wireless communication interface for connecting the vehicle display system 10 to a personal area network (PAN) such as a Bluetooth (registered trademark) network, a local area network (LAN) such as an 802.11x Wi-Fi (registered trademark) network, and/or a wide area network (WAN) such as a 4G or LTE cellular network. The I/O interface 14 may also include a wired communication interface such as a USB port, a serial port, a parallel port, an OBD II port, and/or any other suitable wired communication port.
  • The processor 16 is operably connected to the I/O interface 14 so that information can be exchanged with various other electronic devices connected to the vehicle display system 10 (I/O interface 14). For example, a vehicle ECU 401 provided in the host vehicle 1, a road information database 403, a host vehicle position detection unit 405, a vehicle exterior sensor 407, a line-of-sight direction detection unit 409, an eye position detection unit 411, a portable information terminal 413, and a vehicle exterior communication connection device 420 are operably connected to the I/O interface 14.
  • The image display unit 11 is operably connected to the processor 16 and the image processing circuit 20. Accordingly, the image displayed by the image display unit 11 may be based on image data received from the processor 16 and/or the image processing circuit 20. The processor 16 and the image processing circuit 20 control the image displayed by the image display unit 11 based on information obtained from the I/O interface 14. The I/O interface 14 may include a function of processing (converting, calculating, analyzing) information received from other electronic devices connected to the vehicle display system 10.
  • The host vehicle 1 includes the vehicle ECU 401, which acquires the state of the host vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, and other vehicle states). The vehicle ECU 401 controls each unit of the host vehicle 1 and can, for example, transmit vehicle speed information indicating the current vehicle speed of the host vehicle 1 to the processor 16. In addition to, or instead of, simply transmitting data detected by sensors to the processor 16, the vehicle ECU 401 may transmit determination results and/or analysis results of that data; for example, it may transmit information indicating whether the host vehicle 1 is traveling at low speed or is stopped. The vehicle ECU 401 may also transmit to the I/O interface 14 an instruction signal specifying the image 200 to be displayed by the vehicle display system 10; in that case, the coordinates of the image 200, the notification necessity degree of the image 200, and/or necessity-related information serving as a basis for determining the notification necessity degree may be added to the instruction signal.
  • The host vehicle 1 may include a road information database 403, such as part of a navigation system. Based on the position of the host vehicle 1 acquired from the host vehicle position detection unit 405 described later, the road information database 403 may read road information around the host vehicle 1 (an example of real-object-related information): lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, and the like; the presence or absence, position (including the distance to the host vehicle 1), direction (relative to the host vehicle 1), shape, type, and detailed information of features (buildings, bridges, rivers, etc.); and so on, and transmit it to the processor 16. The road information database 403 may also calculate an appropriate route from the departure point to the destination and transmit it to the processor 16 as navigation information.
  • The host vehicle 1 may include a host vehicle position detection unit 405 such as a GNSS (Global Navigation Satellite System) receiver. The road information database 403, the portable information terminal 413 described later, and/or the vehicle exterior communication connection device 420 acquire the position information of the host vehicle 1 from the host vehicle position detection unit 405 continuously, intermittently, or at every predetermined event, and can select and/or generate information around the host vehicle 1 and transmit it to the processor 16.
  • The host vehicle 1 may include one or more vehicle exterior sensors 407 that detect real objects 310 existing around the host vehicle 1 (front, side, and rear). The real objects 310 detected by the vehicle exterior sensor 407 may include, for example, pedestrians, bicycles, motorcycles, other vehicles (such as the preceding vehicle 320), road surfaces (such as the traveling lane 330), marking lines, roadside objects, and/or features (such as buildings).
  • The vehicle exterior sensor 407 may consist of, for example, a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, and/or a camera sensor consisting of a camera and an image processing device; it may be configured as a combination of the radar sensor and the camera sensor, or with only one of them. A conventionally known method is applied to object detection by these sensors. For example, the sensors may detect the position of a real object (the relative distance from the host vehicle 1, its position in the left-right direction and the front-back direction with respect to the traveling direction of the host vehicle 1, its vertical position, etc.), its size (in the horizontal (left-right) and height (up-down) directions, etc.), its movement direction (lateral (left-right) and depth (front-back) directions), its movement speed (lateral and depth directions), and/or its type.
  • The one or more vehicle exterior sensors 407 detect real objects in front of the host vehicle 1 in each detection cycle of each sensor and can transmit real-object-related information (an example being the presence or absence of a real object and, if a real object exists, information such as the position, size, and/or type of each real object) to the processor 16. The real-object-related information may also be transmitted to the processor 16 via another device (for example, the vehicle ECU 401). When a camera is used as a sensor, an infrared or near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night; in addition, a stereo camera capable of acquiring distance by parallax is desirable.
  • The host vehicle 1 may include a line-of-sight direction detection unit 409, including an infrared camera that captures the face of the driver 4, for detecting the gaze direction of the driver 4 (hereinafter also referred to as the "line-of-sight direction"). The processor 16 can acquire the image captured by the infrared camera (an example of information from which the line-of-sight direction can be estimated) and specify the line-of-sight direction of the driver 4 by analyzing the captured image. Alternatively, the processor 16 may acquire from the I/O interface 14 the line-of-sight direction of the driver 4 specified from the captured image by the line-of-sight direction detection unit 409 (or another analysis unit). The method of acquiring the line-of-sight direction of the driver 4 of the host vehicle 1, or information from which it can be estimated, is not limited to these; other known gaze-direction detection (estimation) techniques may be used, such as the EOG (electro-oculogram) method, the corneal reflex method, the scleral reflection method, the Purkinje image detection method, the search coil method, and the infrared fundus camera method.
  • The host vehicle 1 may include an eye position detection unit 411 including an infrared camera that detects the position of the eyes of the driver 4. The processor 16 can acquire the image captured by the infrared camera (an example of information from which the eye position can be estimated) and specify the eye position of the driver 4 by analyzing the captured image. Alternatively, the processor 16 may acquire from the I/O interface 14 information on the eye position of the driver 4 identified from the captured image. The method of acquiring the eye position of the driver 4 of the host vehicle 1, or information from which it can be estimated, is not limited to these; it may be acquired using known eye position detection (estimation) techniques. The processor 16 may adjust at least the position of the image 200 based on the detected eye position of the driver 4 so that the image 200 is visually recognized by the viewer (driver 4) superimposed on a desired position of the foreground 300.
  • The portable information terminal 413 is a smartphone, laptop computer, smartwatch, or other information device that can be carried by the driver 4 (or another occupant of the host vehicle 1). The I/O interface 14 can communicate with the portable information terminal 413 and acquire data recorded in the portable information terminal 413 (or in a server accessed via it). The portable information terminal 413 may, for example, have the same functions as the road information database 403 and the host vehicle position detection unit 405 described above, acquire the road information (an example of real-object-related information), and transmit it to the processor 16. The portable information terminal 413 may also acquire commercial information (an example of real-object-related information) related to commercial facilities near the host vehicle 1 and transmit it to the processor 16. The portable information terminal 413 may further transmit schedule information of its owner (for example, the driver 4), incoming call information, mail reception information, and the like to the processor 16, and the processor 16 and the image processing circuit 20 may generate and/or transmit image data relating to these.
  • The vehicle exterior communication connection device 420 is a communication device for exchanging information with the host vehicle 1; it includes, for example, other vehicles connected to the host vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), pedestrians (portable information terminals carried by pedestrians) connected by vehicle-to-pedestrian communication (V2P: Vehicle To Pedestrian), network communication devices connected by road-to-vehicle communication (V2I: Vehicle To road Infrastructure), and everything connected to the host vehicle 1 by V2X (Vehicle To Everything). The vehicle exterior communication connection device 420 may acquire the position of, for example, a pedestrian, a bicycle, a motorcycle, another vehicle (such as a preceding vehicle), a road surface, a marking line, a roadside object, and/or a feature (such as a building), and transmit it to the processor 16. The vehicle exterior communication connection device 420 may also have the same function as the host vehicle position detection unit 405 described above, acquiring the position information of the host vehicle 1 and transmitting it to the processor 16; it may further have the function of the road information database 403, acquiring the road information (an example of real-object-related information) and transmitting it to the processor 16. The information acquired from the vehicle exterior communication connection device 420 is not limited to the above.
  • The software components stored in the memory 18 include a real object related information detection module 502, a real object position setting module 504, a difference determination module 506, a distance determination module 508, a speed determination module 510, a notification necessity determination module 512, an image position determination module 514, an image size determination module 516, and a graphics module 518.
  • The real object related information detection module 502 acquires information (also called real-object-related information) including at least the position of a real object 310 existing in front of the host vehicle 1. For example, using the vehicle exterior sensor 407, it may acquire the position of the real object 310 existing in the foreground 300 of the host vehicle 1 (its position in the height direction (vertical) and lateral direction (horizontal) as seen by the driver 4 looking in the traveling direction (forward) from the driver's seat of the host vehicle 1, to which the position in the depth direction (forward distance) may be added) and the size of the real object 310. Using the vehicle exterior communication connection device 420, the module 502 may also acquire information indicating the position, relative speed, and type of a real object (another vehicle), the lighting state of the direction indicator of the other vehicle, its steering operation state, and/or its planned traveling route and traveling schedule from a driving support system (each an example of real-object-related information). The real object related information detection module 502 may further detect the positions of the left lane marking 331 (see FIG. 3) and the right lane marking 332 (see FIG. 3) of the traveling lane 330 (see FIG. 3) on which the host vehicle 1 travels, and recognize the region between the left and right marking lines 331 and 332 as the traveling lane 330.
  • The real object position setting module 504 acquires, via the I/O interface 14, an observation position indicating the current position of the real object 310 from the road information database 403, the vehicle exterior sensor 407, the portable information terminal 413, or the vehicle exterior communication connection device 420 (or an observation position obtained by fusing two or more of these observation positions), and sets the position of the real object 310 (also called the specific position) based on the acquired observation position. The image position determination module 514, described later, determines the position of the AR image 210 based on the specific position of the real object 310 set by the real object position setting module 504.
  • The real object position setting module 504 can execute a first position setting process, which sets the specific position of the real object 310 based on the observation position of the real object 310 acquired immediately before, and a second position setting process, which sets the position of the AR image 210 based on the predicted position of the real object 310 in the display update cycle of the AR image 210, predicted from one or more observation positions of the real object 310 acquired in the past, including at least the observation position acquired immediately before. In other words, for each display update cycle the real object position setting module 504 sets the specific position of the real object 310 that serves as the reference for the display position, calculating it by either the first position setting process or the second position setting process.
  • FIG. 3 is a diagram showing a first AR image 220 displayed in association with a first real object 320 and a second AR image 230 displayed in association with a second real object 330. The first AR image 220 is an arc-shaped warning image that draws attention to the preceding vehicle (first real object) 320 traveling ahead on the traveling lane 330 of the host vehicle 1 and is visually recognized so as to surround the rear of the preceding vehicle 320. The second AR image 230 is a route image having a single arrow shape that shows the planned route of the host vehicle 1 and is visually recognized superimposed on the traveling lane (second real object) 330 of the host vehicle 1.
  • The processor 16 uses the real object related information detection module 502 to acquire the observation position Im of the preceding vehicle (first real object) 320, which serves as the reference for the position where the attention image (first AR image) 220 is displayed, and the observation position In indicating the position of the traveling lane (second real object) 330, which serves as the reference for the position where the route image (second AR image) 230 is displayed. From the observation positions Im to Im-5 of the first real object 320 acquired in the past, including the observation position Im acquired immediately before, the processor 16 sets the specific positions Pk+2 to Pk-2 of the first real object 320 for each display update cycle k+2 to k-2 of the AR image 210; likewise, it sets the specific positions Qk+2 to Qk-2 of the second real object 330 for each display update cycle k+2 to k-2.
  • FIG. 4 is a diagram explaining how the specific position of the first real object and the specific position of the second real object are set for each display update cycle of the AR image. The display update cycle k+2 is the newest display update cycle, becoming older in the order k+1, k, k-1, k-2; accordingly, Pk+2 (Qk+2) is the newest specific position in the figure. Im (In) is the newest observation position of the first real object 320 (second real object 330) in the figure, becoming older in the order Im-1 (In-1), Im-2 (In-2), and so on. For convenience, the cycle in which the observation position Im of the first real object 320 is acquired and the cycle in which the observation position In of the second real object 330 is acquired are described as different, but the acquisition cycles may be aligned.
  • First, the setting of the specific position Pk of the first real object 320 will be described. In setting the specific position of the first real object 320, the processor 16 executes the first position setting process in a first display update cycle Fα immediately after the observation position Im of the first real object 320 existing in front of the host vehicle 1 is acquired from the one or more I/O interfaces 14, and executes the second position setting process in a second display update cycle Fβ that is not immediately after the observation position Im of the first real object 320 is acquired.
  • In FIG. 4, the display update cycles k-1 and k+1 are first display update cycles Fα immediately after the observation positions Im-1 and Im are acquired, and the display update cycles k-2, k, and k+2 are second display update cycles Fβ that are not immediately after the observation positions Im-1 and Im are acquired. Since the display update cycle k is a second display update cycle Fβ, the specific position Pk in the display update cycle k is determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions Im-1, Im-2, Im-3, and Im-4. Since the next display update cycle k+1 is a first display update cycle Fα, the specific position Pk+1 is determined by the first position setting process; specifically, it is set to the observation position Im of the first real object 320 acquired immediately before. The specific position Pk+2 in the display update cycle k+2 is again determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions Im, Im-1, Im-2, and Im-3.
  • Any method may be used to calculate the predicted position, as long as the real object position setting module 504 predicts the position in the display update cycle being processed (e.g., the display update cycle k in FIG. 4) based on observation positions acquired in the past (the observation positions Im-1, Im-2, ... in FIG. 4). For example, the real object position setting module 504 may predict the next value from one or more past observation positions using the least squares method or a prediction algorithm such as a Kalman filter, an α-β filter, or a particle filter.
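As one concrete example of the prediction algorithms named above, here is a minimal α-β filter sketch in Python; the gains and the 60 Hz update period are assumed values for illustration, and any of the other named algorithms (least squares, Kalman filter, particle filter) could back the same interface, e.g. the predict_from callable in the earlier sketch:

```python
# Illustrative alpha-beta filter for one coordinate of the real object's
# position. update() runs when a new observation arrives (cycle F-alpha);
# predict() supplies positions for the cycles in between (F-beta).

class AlphaBetaPredictor:
    def __init__(self, x0, alpha=0.85, beta=0.005, dt=1.0 / 60.0):
        self.x = x0      # estimated position
        self.v = 0.0     # estimated velocity
        self.alpha, self.beta, self.dt = alpha, beta, dt

    def predict(self):
        # Extrapolate one display update cycle ahead.
        return self.x + self.v * self.dt

    def update(self, z):
        # Blend the new observation z into the position and velocity estimates.
        x_pred = self.predict()
        residual = z - x_pred
        self.x = x_pred + self.alpha * residual
        self.v += self.beta * residual / self.dt
        return self.x
```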
  • In setting the specific position Qk of the second real object 330, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In FIG. 4, the display update cycles k-2, k, and k+2 are first display update cycles Fα immediately after the observation positions In-2, In-1, and In are acquired, and the display update cycles k-1 and k+1 are second display update cycles Fβ that are not immediately after the observation positions In-2, In-1, and In are acquired. That is, although the display update cycle k is a first display update cycle Fα, the specific position Qk in the display update cycle k is determined by the second position setting process. The specific position Qk+1 in the next display update cycle k+1 is also determined by the second position setting process; specifically, it is predicted from the four preceding observation positions In-1, In-2, In-3, and In-4. The next display update cycle k+2 is again a first display update cycle Fα, but the specific position Qk+2 is likewise determined by the second position setting process; specifically, it is predicted from the four preceding observation positions.
  • The difference determination module 506 of FIG. 2 compares, in the first display update cycle Fα, the observation position acquired immediately before with the predicted position predicted from one or more observation positions including at least that observation position, and determines whether the difference between them is larger than a predetermined difference threshold stored in the memory 18. When the difference is larger than the predetermined threshold, the processor 16 executes the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; after continuing this for a predetermined number of display update cycles, it may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. When the difference is not larger than the predetermined threshold, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
  • The memory 18 may store two or more difference thresholds, and the difference determination module 506 may determine the degree of difference between the most recently acquired observation position and the predicted position in three or more stages. These difference thresholds may also be variable; for example, they may be changed according to the relative speed between the real object 310 and the host vehicle 1, in which case the higher the relative speed, the larger the difference threshold may be set.
  • The difference determination module 506 may also determine whether the observation position is closer to the host vehicle 1 than the predicted position. When the observation position acquired immediately before is closer to the host vehicle 1 than the predicted position, the processor 16 executes the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; when it is not closer, the processor 16 may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the real object can be assumed to be closer than the predicted position, the AR image 210 can quickly be displayed based on the observation position where the real object is most likely to exist.
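A sketch of this gating, reduced to one dimension (the forward distance to the real object) for clarity; the 1-D simplification and the names are our assumptions for illustration:

```python
# Combine the two checks made by the difference determination module 506 in
# cycle F-alpha: a large observed-vs-predicted gap, or an observation nearer
# to the host vehicle than the prediction, both favor the raw observation.

def choose_specific_position(observed_z, predicted_z, diff_threshold):
    """observed_z / predicted_z: forward distance [m] to the real object."""
    large_difference = abs(observed_z - predicted_z) > diff_threshold
    observed_is_nearer = observed_z < predicted_z
    if large_difference or observed_is_nearer:
        return observed_z   # first position setting process in F-alpha
    return predicted_z      # stay with the second position setting process
```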
  • The distance determination module 508 determines the degree of distance between the real object 310 and the host vehicle 1. For example, it may determine whether the distance between the real object 310 and the host vehicle 1, obtainable by executing the real object related information detection module 502, is longer than a predetermined distance threshold stored in the memory 18. The memory 18 may store two or more distance thresholds, and the distance determination module 508 may determine the degree of distance between the real object 310 and the host vehicle 1 in three or more stages. These distance thresholds may also be variable; for example, they may be changed according to the relative speed between the real object 310 and the host vehicle 1, in which case the faster the relative speed, the longer the distance threshold may be set.
  • The speed determination module 510 determines the degree of relative speed between the real object 310 and the host vehicle 1. For example, it may calculate the relative speed from the time change of the distance between the real object 310 and the host vehicle 1, obtainable by executing the real object related information detection module 502, and determine whether that relative speed is faster than a predetermined relative speed threshold stored in the memory 18. The memory 18 may store two or more relative speed thresholds, and the speed determination module 510 may determine the degree of relative speed between the real object 310 and the host vehicle 1 in three or more stages. These relative speed thresholds may also be variable; for example, they may be changed according to the distance between the real object 310 and the host vehicle 1, in which case the longer the distance, the higher the relative speed threshold may be set.
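The staged, variable thresholds used by modules 506, 508, and 510 might look like the following; the breakpoints and the linear scaling rule are invented here purely to make the idea concrete:

```python
# Grade a value against a sorted list of thresholds (two thresholds give
# three stages, as the text allows), and scale distance thresholds with the
# relative speed so that faster closing speeds use longer thresholds.

def grade(value, thresholds):
    return sum(value > t for t in sorted(thresholds))  # 0 .. len(thresholds)

def distance_thresholds(relative_speed_mps, base=(20.0, 40.0)):
    scale = 1.0 + 0.1 * max(relative_speed_mps, 0.0)
    return tuple(t * scale for t in base)

# Example: closing at 10 m/s stretches (20, 40) m to (40, 80) m, so a gap of
# 55 m falls in stage 1 instead of stage 2.
stage = grade(55.0, distance_thresholds(10.0))
```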
  • The notification necessity determination module 512 determines whether the content of each image 200 displayed by the vehicle display system 10 should be notified to the driver 4. The notification necessity determination module 512 may acquire information from the various other electronic devices connected to the I/O interface 14 and calculate a notification necessity degree; alternatively, an electronic device connected to the I/O interface 14 in FIG. 2 may transmit information to the vehicle ECU 401, and the module 512 may detect (acquire) the notification necessity degree determined by the vehicle ECU 401 based on the received information. The "notification necessity degree" can be determined based on, for example, a risk degree derived from the seriousness of what could happen, an urgency degree derived from the length of the reaction time required to take a reaction action, an effectiveness degree derived from the situation of the host vehicle 1 or the driver 4 (or other occupants of the host vehicle 1), or a combination of these (the indicators of the notification necessity degree are not limited to these). That is, the notification necessity determination module 512 may determine whether to notify the driver 4, and may choose not to display the warning image 220, the route image 230 described below, or both. The vehicle display system 10 need not itself have the function of estimating (calculating) the notification necessity degree; part or all of that function may be provided separately from the display control device 13 of the vehicle display system 10.
  • The image position determination module 514 determines the coordinates of the image 200 (including at least the left-right (X-axis) and up-down (Y-axis) directions as seen by the driver 4 looking toward the display area 100 from the driver's seat of the host vehicle 1) based on the specific position (observation position or predicted position) of the real object 310 set by the real object position setting module 504, so that the image 200 is visually recognized in a specific positional relationship with the real object 310. In addition, the image position determination module 514 may determine the front-back (Z-axis) position, as seen by the driver 4 looking toward the display area 100, based on the specific position of the real object 310 set by the real object position setting module 504. The image position determination module 514 also adjusts the position of the image 200 based on the eye position of the driver 4 detected by the eye position detection unit 411; for example, it determines the horizontal and vertical positions of the image 200 so that the center of the image 200 is visually recognized overlapping the center of the real object. The "specific positional relationship" can be adjusted depending on the situation of the real object or the host vehicle 1, the type of the real object, the type of the displayed image, and the like.
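One way to realize the eye-position-based placement described above is to intersect the eye-to-object ray with the virtual display plane; treating the display area 100 as a plane at a fixed forward distance is our simplifying assumption for this sketch:

```python
# Place the AR image so that, seen from the detected eye position, it overlaps
# the real object. Axes follow the text: X left, Y up, Z forward, in the
# host vehicle's coordinate system.

def project_to_display(eye, obj, display_z):
    """eye, obj: (x, y, z) points; returns (x, y) on the plane z = display_z."""
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = (display_z - ez) / (oz - ez)        # where the ray crosses the plane
    return (ex + t * (ox - ex), ey + t * (oy - ey))

# Example: eyes 1.2 m above origin, object 40 m ahead, display plane 3 m ahead.
x, y = project_to_display((0.0, 1.2, 0.0), (0.5, 0.8, 40.0), 3.0)
```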
  • The image size determination module 516 may change the size of the AR image 210 in accordance with the position and/or size of the real object 310 to be associated with it. For example, it can reduce the size of the AR image 210 if the associated real object 310 is far away, and increase the size of the AR image 210 if the associated real object 310 is large. The image size determination module 516 may also determine the size of the image 200 based on the type and number of real objects, detected by the real object related information detection module 502, with which the image 200 is displayed in association, and/or on the notification necessity degree detected (estimated) by the notification necessity determination module 512.
  • The image size determination module 516 may have a function of predicting the size of the AR image 210 to be displayed in the current display update cycle based on the size of the real object over a predetermined number of past cycles. For example, it may track pixels of the real object 310 between two past captured images captured by the camera (an example of the vehicle exterior sensor 407) using the Lucas-Kanade method, predict the size of the real object in the current display update cycle, and determine the size of the AR image according to the predicted size of the real object. Alternatively, the rate of change of the size of the real object may be obtained from the change in its size between the two past captured images, and the size of the AR image determined according to that rate of change. The method of estimating the size change of a real object seen from a viewpoint that changes in time series is not limited to the above; known methods may be used, including optical flow estimation algorithms such as the Horn-Schunck method, the Buxton-Buxton method, and the Black-Jepson method.
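The size-change-rate variant is the easiest to sketch; assuming, for illustration only, that the per-cycle scale ratio stays constant for one more cycle:

```python
# Extrapolate the apparent size of the real object from its sizes in the two
# most recent captured frames, then scale the AR image by the same ratio.

def predict_ar_image_size(size_prev, size_curr, ar_size_curr):
    """size_prev / size_curr: apparent object size [px] in the last two frames."""
    rate = size_curr / size_prev if size_prev > 0 else 1.0
    return ar_size_curr * rate   # predicted AR image size for this cycle
```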
  • The graphics module 518 includes various known software components for modifying the visual effects (e.g., brightness, transparency, saturation, contrast, or other visual characteristics), size, display position, and display distance (the distance from the driver 4 to the image 200) of the displayed image 200. The graphics module 518 displays the image 200 at the coordinates set by the image position determination module 514 (the left-right (X-axis) and up-down (Y-axis) directions as seen by the driver 4 looking toward the display area 100 from the driver's seat of the host vehicle 1) and with the image size set by the image size determination module 516, so that the image is visually recognized by the driver 4.
  • As described above, the display control device of the present embodiment controls the image display unit 11, which superimposes and displays the image 200 at a position associated with a real object existing in the foreground seen from the driver 4 of the host vehicle 1, and comprises the one or more I/O interfaces 14, the one or more processors 16, the memory 18, and one or more computer programs stored in the memory 18 and configured to be executed by the one or more processors 16. The one or more processors 16 can execute a first position setting process that sets the position of the image 200 based on the most recently acquired position of the real object, and a second position setting process that sets the position of the image 200 based on the predicted position of the real object in the display update cycle of the image, predicted from one or more past positions of the real object including at least the most recently acquired position. The processors execute the first position setting process in the first display update cycle Fα immediately after the position of the real object existing in front of the host vehicle 1 is acquired from the one or more I/O interfaces 14, and execute the second position setting process in the second display update cycle Fβ that is not immediately after that position is acquired. According to this, in cycles with no fresh observation the display position of the AR image is determined from the position predicted from past observation positions, preventing the display position of the image from changing abruptly, while in the display update cycle immediately after an observation position of the real object is acquired, the display position of the AR image is determined from that observation position, so that the viewer can quickly recognize the accurate position of the real object.
  • In some embodiments, the image 200 includes a first AR image 220 and a second AR image 230 of a different type from the first AR image 220. For the first AR image 220, the one or more processors 16 execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; for the second AR image 230, they may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. According to this, for one of the displayed image types the position can be updated using only the predicted position of the real object, so that a smoothly changing image in which abrupt changes are suppressed can be displayed.
  • The first AR image 220 may be a warning image that calls attention to a real object, and the second AR image 230 may be a route image that shows the route of the host vehicle 1. More generally, the first AR image 220 is an image with a relatively high notification necessity, and may be, for example, an image indicating a sign such as "stop" on the road surface, a road surface condition that may cause slipping (wet or frozen), or an automatic/manual driving switching point of the host vehicle 1. The second AR image 230 is an image with a lower notification necessity than the first AR image 220, and may be, for example, an image indicating a road sign, a signboard, POI information, or information about the direction of the final destination, which are less critical than "stop".
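Putting the two image types together, the per-type policy reads naturally as a small table; the strings are our own shorthand for the behavior the text assigns to each image type:

```python
# First AR image (warning, 220): observation in F-alpha, prediction in F-beta.
# Second AR image (route, 230): prediction in every display update cycle.

POLICY = {
    "warning_image": "observe_in_f_alpha",
    "route_image": "always_predict",
}

def position_for(image_type, observed, predicted, new_observation_arrived):
    if POLICY[image_type] == "observe_in_f_alpha" and new_observation_arrived:
        return observed    # first position setting process (F-alpha)
    return predicted       # second position setting process
```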
  • In some embodiments, in the first display update cycle Fα, the one or more processors 16 compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more past positions of the real object including at least that most recent position. If the difference between them is larger than a predetermined threshold, the processors execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the difference is not larger than the predetermined threshold, they may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. According to this, when the prediction deviates greatly from the latest observation, the image is quickly displayed at the position based on the latest observation position, so that the visual attention of the driver 4 can be promptly directed to the AR image and the real object associated with it.
  • In some embodiments, in the first display update cycle Fα, the one or more processors 16 compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more past positions of the real object including at least that most recent position. If the most recently acquired position of the real object is closer to the host vehicle 1 than the predicted position, the processors execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the most recently acquired position is not closer, they may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. According to this, when the latest observation position of the real object is closer to the host vehicle 1 than the predicted position and a rapid approach of the real object can be assumed, the image is displayed at the position based on the latest observation position, so that the visual attention of the driver 4 can be promptly directed to the AR image and the real object associated with it.
  • In some embodiments, when the relative speed between the host vehicle 1 and the real object serving as the reference for setting the position of the image 200, obtained from the one or more I/O interfaces 14, is faster than a predetermined threshold, the one or more processors 16 execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the relative speed is not faster than the predetermined threshold, they may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
  • In some embodiments, when it is determined that the real object serving as the reference for setting the position of the image 200 is approaching, the one or more processors 16 execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; when it is not determined that the real object is approaching, they may execute the second position setting process in both cycles. According to this, when the real object can be assumed to be approaching, displaying the image at the position based on the latest observation position allows the visual attention of the driver 4 to be promptly directed to the AR image and the real object associated with it.
  • The display area 100 is not limited to an arrangement substantially along the plane (XY plane) of the up-down and left-right directions as seen from the driver 4. For example, the display area 100 may be rotated about the left-right direction (X-axis direction) as seen from the driver 4 and arranged substantially along the traveling lane 330 (ZX plane). The display area 100 may also be a curved surface instead of a flat surface, and a stereoscopic display may be adopted as the image display unit 11 so that the image 200 is displayed in a three-dimensional display area 100. Further, the second AR image 230 may be a route image consisting of two or more illustrations or the like that are visually recognized superimposed on the traveling lane 330 of the host vehicle 1.
  • 409... Line-of-sight direction detection unit, 411... Eye position detection unit, 413... Portable information terminal, 420... Vehicle exterior communication connection device, 502... Real object related information detection module, 504... Real object position setting module, 506... Difference determination module, 508... Distance determination module, 510... Speed determination module, 512... Notification necessity determination module, 514... Image position determination module, 516... Image size determination module, 518... Graphics module, Fα... First display update cycle, Fβ... Second display update cycle

Abstract

The present invention smooths the display transition of images based on a predicted position of a real object and quickly directs visual attention. A display control device can execute a first position setting process that sets the position of an image 210 based on the most recently acquired position of a real object 310, and a second position setting process that sets the position of an image 220 based on a predicted position of the real object 310 in a display update cycle of the image 210, predicted from one or more previously acquired positions of the real object 310 including at least the most recently acquired position. The display control device executes the first position setting process in a first display update cycle Fα immediately after the acquisition of the position of the real object 310 existing in front of the host vehicle, and executes the second position setting process in a second display update cycle Fβ that is not immediately after the acquisition of the position of the real object 310.

Description

表示制御装置、方法、及びコンピュータ・プログラムDisplay control device, method, and computer program
 本開示は、車両で使用され、車両の前景に画像を重畳して視認させる表示制御装置、方法、及びコンピュータ・プログラムに関する。 The present disclosure relates to a display control device, a method, and a computer program that are used in a vehicle and that superimpose an image on the foreground of the vehicle for visual recognition.
 特許文献1には、自車両周辺の対象物(他の車両)の位置を検出し、自車両と対象物との相対速度から、対象物の所定時間後の大きさ及び位置を予測したデータを生成し、これに基づいて表示を行うことで、対象物の位置に合わせた表示を行う表示装置が開示されている。 Patent Document 1 discloses data in which the position of an object (other vehicle) around the vehicle is detected, and the size and position of the object after a predetermined time is predicted from the relative speed between the vehicle and the object. There is disclosed a display device that performs display according to the position of an object by generating the image and performing display based on the generated image.
特開2014-177275号公報JP, 2014-177275, A
 しかしながら、特許文献1の表示装置のように、対象物の所定時間後の大きさ及び位置を予測したデータのみで画像の位置などを調整すると、予測と実際の観測位置とに大きな違いがある場合でも実際の観測位置が画像に迅速に反映されにくく、注意すべき実オブジェクトに対して視認者が注意を向けることが遅くなることが想定される。 However, when the position of an image or the like is adjusted only by the data of predicting the size and position of an object after a predetermined time as in the display device of Patent Document 1, there is a large difference between the predicted position and the actual observed position. However, it is assumed that it is difficult for the actual observation position to be reflected in the image quickly, and it becomes slower for the viewer to pay attention to the real object to be noted.
 本明細書に開示される特定の実施形態の要約を以下に示す。これらの態様が、これらの特定の実施形態の概要を読者に提供するためだけに提示され、この開示の範囲を限定するものではないことを理解されたい。実際に、本開示は、以下に記載されない種々の態様を包含し得る。 A summary of the specific embodiments disclosed herein is provided below. It should be understood that these aspects are presented only to provide an overview of the reader to these particular embodiments and are not intended to limit the scope of this disclosure. Indeed, the present disclosure may include various aspects not described below.
 The present disclosure relates to directing visual attention to a target quickly. More specifically, it also relates to directing visual attention quickly while smoothing the display transition of an image based on a predicted position of a real object.
 Accordingly, the display control device described in this specification can execute a first position setting process that sets the position of an image 200 based on the position of a real object acquired immediately before, and a second position setting process that sets the position of the image 200 based on the position of the real object predicted for the display update cycle of the image 200 from one or more previously acquired positions of the real object, including at least the position acquired immediately before. In a first display update cycle Fα, immediately after the position of the real object is acquired, the device executes the first position setting process; in a second display update cycle Fβ, which does not immediately follow such an acquisition, it executes the second position setting process. Furthermore, in some embodiments, for a first image 220 the device executes the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ, while for a second image 230 it executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
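 Purely as an illustration of this selection between the two processes, a minimal Python sketch follows. All names are hypothetical, and the simple linear extrapolation stands in for whatever predictor an actual implementation would use; the disclosure does not fix an API.

```python
# Hypothetical sketch of selecting the first or second position setting process.

def predict_position(observations):
    """Predict the position for the coming display update cycle.

    Simple linear extrapolation from the last two observations; any
    predictor could be substituted here.
    """
    if len(observations) < 2:
        return observations[-1]
    (x0, y0), (x1, y1) = observations[-2], observations[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

def set_image_position(observations, in_cycle_f_alpha):
    """Return the position used to place the AR image this update cycle.

    observations -- past observed positions of the real object, newest last
    in_cycle_f_alpha -- True in the first display update cycle F_alpha,
                        i.e. immediately after a new position was acquired
    """
    if in_cycle_f_alpha:
        # First position setting process: use the latest observation as-is.
        return observations[-1]
    # Second position setting process: use the predicted position.
    return predict_position(observations)
```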
FIG. 1 is a diagram showing an example of application of a vehicular display system to a vehicle, according to some embodiments.
FIG. 2 is a block diagram of a vehicular display system, according to some embodiments.
FIG. 3 is a diagram showing a first AR image displayed in association with a first real object and a second AR image displayed in association with a second real object, according to some embodiments.
FIG. 4 is a diagram explaining how the specific positions of the first real object and the second real object are set for each display update cycle of the AR image, according to some embodiments.
 FIGS. 1 and 2 below provide a description of the configuration of an exemplary vehicular display system. FIG. 3 provides a description of the images displayed by the vehicular display system, and FIG. 4 provides a description of the processing method. The present invention is not limited to the following embodiments (including the contents of the drawings); changes (including deletion of components) may of course be made to the embodiments below. In the following description, explanations of well-known technical matters are omitted as appropriate to facilitate understanding of the present invention.
 Refer to FIG. 1. The image display unit 11 of the vehicular display system 10 is a head-up display (HUD) device provided in the dashboard 5 of the host vehicle 1. The HUD device emits display light 11a toward the front windshield 2 (an example of a projection-target member) and displays an image 200 within a virtual display area 100, so that the image 200 is visually recognized superimposed on the foreground 300, the real space seen through the front windshield 2. In the description of this embodiment, with the driver 4 seated in the driver's seat of the host vehicle 1 and facing forward, the left-right direction is taken as the X axis (left is the positive X direction), the up-down direction as the Y axis (up is the positive Y direction), and the front-rear direction as the Z axis (forward is the positive Z direction).
 The image display unit 11 may instead be a head-mounted display (HMD) device. The driver 4 wears the HMD device on the head and sits in the seat of the host vehicle 1, visually recognizing the displayed image 200 superimposed on the foreground 300 through the front windshield 2 of the host vehicle 1. The display area 100 in which the vehicular display system 10 displays the predetermined image 200 is fixed at a specific position in the coordinate system of the host vehicle 1, and when the driver 4 faces in that direction, the image 200 displayed within the display area 100 fixed at that specific position can be visually recognized.
 Under the control of the display control device 13 described later, the image display unit 11 can display the image 200 near a real object 310 present in the foreground 300, the real space (real scene) seen through the front windshield 2 of the host vehicle 1, such as an obstacle (pedestrian, bicycle, motorcycle, other vehicle, and the like), a road surface, a road sign, or a feature (building, bridge, and the like) (one example of a specific positional relationship between the image and the real object); at a position overlapping the real object 310 (another example of the specific positional relationship); or at a position set with reference to the real object 310 (a further example of the specific positional relationship), thereby forming a visual augmented reality (AR). The image display unit 11 can display an AR image 210 (described in detail later) whose display position changes according to the position of the real object 310, and a non-AR image (not shown) whose display position does not change according to the position of the real object 310.
 FIG. 2 is a block diagram of the vehicular display system 10 according to some embodiments. The vehicular display system 10 comprises the image display unit 11 and the display control device 13 that controls it. The display control device 13 comprises one or more I/O interfaces 14, one or more processors 16, one or more memories 18, and one or more image processing circuits 20. The various functional blocks depicted in FIG. 2 may be implemented in hardware, software, or a combination of both. FIG. 2 is only one embodiment of an implementation; the illustrated components may be combined into fewer components, or there may be additional components. For example, the image processing circuit 20 (e.g., a graphics processing unit) may be included in the one or more processors 16.
 As illustrated, the processor 16 and the image processing circuit 20 are operably coupled to the memory 18. More specifically, by executing a program stored in the memory 18, the processor 16 and the image processing circuit 20 can operate the vehicular display system 10, for example generating and/or transmitting image data. The processor 16 and/or the image processing circuit 20 may include at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof. The memory 18 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, any type of semiconductor memory such as volatile memory, and non-volatile memory. Volatile memory may include DRAM and SRAM, and non-volatile memory may include ROM and NVRAM.
 As illustrated, the processor 16 is operably coupled to the I/O interface 14. The I/O interface 14 may include, for example, a wireless communication interface that connects the vehicular display system 10 to a personal area network (PAN) such as a Bluetooth (registered trademark) network, a local area network (LAN) such as an 802.11x Wi-Fi (registered trademark) network, or a wide area network (WAN) such as a 4G or LTE (registered trademark) cellular network. The I/O interface 14 may also include a wired communication interface such as a USB port, a serial port, a parallel port, OBDII, and/or any other suitable wired communication port.
 As illustrated, the processor 16 is interoperably coupled to the I/O interface 14, enabling it to exchange information with the various other electronic devices connected to the vehicular display system 10 (the I/O interface 14). Operably connected to the I/O interface 14 are, for example, a vehicle ECU 401 provided in the host vehicle 1, a road information database 403, a host vehicle position detection unit 405, a vehicle exterior sensor 407, a gaze direction detection unit 409, an eye position detection unit 411, a portable information terminal 413, and an out-of-vehicle communication connection device 420. The image display unit 11 is operably coupled to the processor 16 and the image processing circuit 20. Accordingly, the image displayed by the image display unit 11 may be based on image data received from the processor 16 and/or the image processing circuit 20. The processor 16 and the image processing circuit 20 control the image displayed by the image display unit 11 based on information obtained from the I/O interface 14. The I/O interface 14 may include a function of processing (converting, computing, analyzing) information received from the other electronic devices connected to the vehicular display system 10.
 The host vehicle 1 includes a vehicle ECU 401 that detects the state of the host vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, and various warning states). The vehicle ECU 401 controls each part of the host vehicle 1 and can, for example, transmit vehicle speed information indicating the current vehicle speed of the host vehicle 1 to the processor 16. In addition to, or instead of, simply transmitting data detected by sensors to the processor 16, the vehicle ECU 401 can transmit determination results and/or analysis results of that data. For example, it may transmit to the processor 16 information indicating whether the host vehicle 1 is traveling at low speed or is stopped.
 The vehicle ECU 401 may also transmit to the I/O interface 14 an instruction signal designating the image 200 to be displayed by the vehicular display system 10; in doing so, it may append to the instruction signal the coordinates of the image 200, the notification necessity degree of the image 200, and/or necessity-related information from which the notification necessity degree is determined.
 The host vehicle 1 may include a road information database 403 comprising a navigation system or the like. Based on the position of the host vehicle 1 acquired from the host vehicle position detection unit 405 described later, the road information database 403 may read out and transmit to the processor 16 road information for the road on which the host vehicle 1 travels (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, and the like) and feature information (buildings, bridges, rivers, and the like), each an example of real object related information, including presence or absence, position (including the distance to the host vehicle 1), direction (relative to the host vehicle 1), shape, type, and detailed information. The road information database 403 may also calculate an appropriate route from the departure point to the destination and transmit it to the processor 16 as navigation information.
 The host vehicle 1 may include a host vehicle position detection unit 405 such as a GNSS (Global Navigation Satellite System) receiver. The road information database 403, the portable information terminal 413 described later, and/or the out-of-vehicle communication connection device 420 can acquire the position information of the host vehicle 1 from the host vehicle position detection unit 405 continuously, intermittently, or at each predetermined event, and can thereby select and/or generate information about the surroundings of the host vehicle 1 and transmit it to the processor 16.
 The host vehicle 1 may include one or more vehicle exterior sensors 407 that detect real objects 310 present around the host vehicle 1 (ahead, to the sides, and behind). The real objects 310 detected by the vehicle exterior sensor 407 may include, for example, pedestrians, bicycles, motorcycles, other vehicles (such as the preceding vehicle 320), the road surface (the traveling lane 330), lane markings, roadside objects, and/or features (such as buildings). Vehicle exterior sensors include, for example, radar sensors such as millimeter-wave radar, ultrasonic radar, and laser radar, and camera sensors comprising a camera and an image processing device; they may be configured as a combination of radar and camera sensors or as only one of the two. Conventional, well-known techniques are applied to object detection by these radar and camera sensors. Object detection by these sensors may determine the presence or absence of a real object in three-dimensional space and, when a real object exists, its position (the relative distance from the host vehicle 1, its left-right position with the traveling direction of the host vehicle 1 taken as front-rear, its up-down position, and so on), size (in the lateral (left-right) and height (up-down) directions, and so on), movement direction (lateral (left-right) and depth (front-rear)), movement speed (lateral (left-right) and depth (front-rear)), and/or type. The one or more vehicle exterior sensors 407 can detect real objects in front of the host vehicle 1 at each sensor's detection cycle and transmit real object related information (for example, the presence or absence of a real object and, when one exists, the position, size, and/or type of each real object) to the processor 16. This real object related information may instead be transmitted to the processor 16 via another device (for example, the vehicle ECU 401). When a camera is used as the sensor, an infrared or near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night. Also, when a camera is used as the sensor, a stereo camera, which can also acquire distance and the like from parallax, is desirable.
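 As a concrete and purely hypothetical illustration of the real object related information listed above, one possible record type is sketched below; the field names and units are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RealObjectInfo:
    """One detected real object, as reported each sensor detection cycle."""
    kind: str                             # e.g. "pedestrian", "other_vehicle"
    position: Tuple[float, float, float]  # (x, y, z) relative to host [m]
    size: Tuple[float, float]             # (width, height) [m]
    velocity: Tuple[float, float]         # relative speed (lateral, depth) [m/s]
```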
 The host vehicle 1 may include a gaze direction detection unit 409, comprising an infrared camera or the like that images the face of the driver 4, for detecting the gaze direction of the driver 4 (hereinafter also referred to as the "line-of-sight direction"). The processor 16 can acquire an image captured by the infrared camera (an example of information from which the gaze direction can be estimated) and identify the gaze direction of the driver 4 by analyzing the captured image. Alternatively, the processor 16 may acquire from the I/O interface 14 the gaze direction of the driver 4 identified by the gaze direction detection unit 409 (or another analysis unit) from the image captured by the infrared camera. The method of acquiring the gaze direction of the driver 4 of the host vehicle 1, or information from which it can be estimated, is not limited to these; other known gaze direction detection (estimation) techniques may be used, such as the EOG (electro-oculogram) method, the corneal reflection method, the scleral reflection method, the Purkinje image detection method, the search coil method, and the infrared fundus camera method.
 The host vehicle 1 may include an eye position detection unit 411 comprising an infrared camera or the like that detects the positions of the eyes of the driver 4. The processor 16 can acquire an image captured by the infrared camera (an example of information from which the eye positions can be estimated) and identify the eye positions of the driver 4 by analyzing the captured image. Alternatively, the processor 16 may acquire from the I/O interface 14 information on the eye positions of the driver 4 identified from the image captured by the infrared camera. The method of acquiring the eye positions of the driver 4 of the host vehicle 1, or information from which they can be estimated, is not limited to these; known eye position detection (estimation) techniques may be used. By adjusting at least the position of the image 200 based on the eye positions of the driver 4, the processor 16 may cause the viewer (driver 4) whose eye positions have been detected to visually recognize the image 200 superimposed at a desired position in the foreground 300.
 The portable information terminal 413 is a smartphone, laptop computer, smartwatch, or other information device that can be carried by the driver 4 (or another occupant of the host vehicle 1). The I/O interface 14 can communicate with the portable information terminal 413 and acquires data recorded in the portable information terminal 413 (or in a server accessed through it). The portable information terminal 413 may, for example, have the same functions as the road information database 403 and the host vehicle position detection unit 405 described above, acquiring the road information (an example of real object related information) and transmitting it to the processor 16. It may also acquire commercial information (an example of real object related information) related to commercial facilities near the host vehicle 1 and transmit it to the processor 16. The portable information terminal 413 may further transmit to the processor 16 schedule information of its owner (for example, the driver 4), incoming call information, mail reception information, and the like, and the processor 16 and the image processing circuit 20 may generate and/or transmit image data relating to these.
 The out-of-vehicle communication connection device 420 is a communication device that exchanges information with the host vehicle 1: for example, another vehicle connected to the host vehicle 1 by vehicle-to-vehicle communication (V2V, Vehicle-to-Vehicle), a pedestrian (a portable information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P, Vehicle-to-Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I, Vehicle-to-roadside-Infrastructure); in a broad sense, it includes everything connected by communication with the host vehicle 1 (V2X, Vehicle-to-Everything). The out-of-vehicle communication connection device 420 may acquire the positions of, for example, pedestrians, bicycles, motorcycles, other vehicles (such as a preceding vehicle), the road surface, lane markings, roadside objects, and/or features (such as buildings), and transmit them to the processor 16. It may also have the same function as the host vehicle position detection unit 405 described above, acquiring the position information of the host vehicle 1 and transmitting it to the processor 16, and may further have the function of the road information database 403, acquiring the road information (an example of real object related information) and transmitting it to the processor 16. The information acquired from the out-of-vehicle communication connection device 420 is not limited to the above.
 The software components stored in the memory 18 include a real object related information detection module 502, a real object position setting module 504, a difference determination module 506, a distance determination module 508, a speed determination module 510, a notification necessity determination module 512, an image position determination module 514, an image size determination module 516, and a graphic module 518.
 The real object related information detection module 502 acquires information (also called real object related information) including at least the position of a real object 310 present in front of the host vehicle 1. For example, the real object related information detection module 502 may acquire from the vehicle exterior sensor 407 information (an example of real object related information) including the position of the real object 310 present in the foreground 300 of the host vehicle 1 (its position in the height direction (up-down) and lateral direction (left-right) as seen by the driver 4 in the driver's seat looking in the traveling direction (forward) of the host vehicle 1, to which the position in the depth direction (forward) may be added), the size of the real object 310 (its size in the height and lateral directions), and its relative speed with respect to the host vehicle 1 (including the relative movement direction). The real object related information detection module 502 may also acquire, via the out-of-vehicle communication connection device 420, information (an example of real object related information) indicating the position, relative speed, and type of a real object (another vehicle), the lighting state of the direction indicators of the real object (another vehicle), the state of its steering operations, and/or the planned travel route and travel schedule determined by its driving support system.
 The real object related information detection module 502 may also acquire from the vehicle exterior sensor 407 the position of the left lane marking 331 (see FIG. 3) and the position of the right lane marking 332 (see FIG. 3) of the traveling lane 330 (see FIG. 3) of the host vehicle 1, and recognize the region between the left and right lane markings 331 and 332 (the traveling lane 330).
 The real object position setting module 504 acquires, via the I/O interface 14, an observation position indicating the current position of the real object 310 from the road information database 403, the vehicle exterior sensor 407, the portable information terminal 413, or the out-of-vehicle communication connection device 420, or acquires an observation position of the real object obtained by fusing two or more of these observation positions, and sets the position of the real object 310 (also called the specific position) based on the acquired observation position. The image position determination module 514, described later, determines the position of the AR image 210 with reference to the specific position of the real object 310 set by the real object position setting module 504.
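 The fusion of two or more observation positions is not specified further; as one plausible reading, a confidence-weighted average is sketched below (names and weights are illustrative).

```python
def fuse_observations(weighted_positions):
    """Fuse ((x, y), weight) pairs from several sources into one position.

    The weight might reflect each source's confidence; a weighted mean is
    only one of many possible fusion rules.
    """
    total = sum(w for _, w in weighted_positions)
    x = sum(p[0] * w for p, w in weighted_positions) / total
    y = sum(p[1] * w for p, w in weighted_positions) / total
    return (x, y)

# e.g. fusing a radar fix with a camera fix, trusting the radar a little more:
fused = fuse_observations([((12.0, 0.5), 0.6), ((11.6, 0.4), 0.4)])
print(fused)  # (11.84, 0.46)
```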
 The real object position setting module 504 can set the specific position of the real object 310 based on the observation position of the real object 310 acquired immediately before, and can set the specific position of the real object 310 based on a predicted position of the real object at a given time, predicted from one or more previously acquired observation positions of the real object 310 including at least the observation position acquired immediately before. That is, by executing the real object position setting module 504 and the image position determination module 514 described later, the processor 16 can execute a first position setting process that sets the position of the AR image 210 based on the observation position of the real object 310 acquired immediately before, and a second position setting process that sets the position of the AR image 210 based on the position of the real object 310 predicted for the display update cycle of the AR image 210 from one or more previously acquired observation positions of the real object 310, including at least the observation position acquired immediately before.
 For example, for the first AR image 220, the real object position setting module 504 calculates the specific position of the real object 310 that serves as the reference for the display position by switching between the first position setting process and the second position setting process over time, whereas for the second AR image 230 it calculates the specific position of the real object 310 that serves as the reference for the display position by the second position setting process alone. FIG. 3 shows the first AR image 220 displayed in association with the first real object 320 and the second AR image 230 displayed in association with the second real object 330. Here, the first AR image 220 is an arc-shaped attention-calling image that calls attention to the preceding vehicle (first real object) 320 traveling ahead in the traveling lane 330 of the host vehicle 1 and is seen surrounding the preceding vehicle 320 from behind. The second AR image 230 is a route image consisting of a single arrow shape that indicates the planned route of the host vehicle 1 and is seen superimposed on the traveling lane (second real object) 330 of the host vehicle 1.
 Using the real object related information detection module 502, the processor 16 acquires an observation position Im indicating the position of the preceding vehicle (first real object) 320, which serves as the reference for the position at which the attention-calling image (first AR image) 220 is displayed, and an observation position In indicating the position of the traveling lane (second real object) 330, which serves as the reference for the position at which the route image (second AR image) 230 is displayed. Next, from the previously acquired observation positions Im to Im-5 of the first real object 320, including the most recently acquired observation position Im, the processor 16 sets the specific positions Pk+2 to Pk-2 of the first real object 320 for each of the display update cycles k+2 to k-2 of the AR image 210, and, from the observation positions In to In-5 of the second real object 330, including those acquired in the past, it sets the specific positions Qk+2 to Qk-2 of the second real object 330 for each of the display update cycles k+2 to k-2.
 FIG. 4 explains how the specific position of the first real object and the specific position of the second real object are set for each display update cycle of the AR image. In the figure, display update cycle k+2 is the newest, with k+1, k, k-1, and k-2 progressively older. The same holds for the specific positions of the first real object 320 and the second real object 330; in the figure, Pk+2 (Qk+2) is the newest. For the observation positions of the first real object 320 (second real object 330), Im (In) is the newest, with Im-1 (In-1), Im-2 (In-2), and so on progressively older. In the figure, the cycle at which the observation position Im of the first real object 320 is acquired and the cycle at which the observation position In of the second real object 330 is acquired are drawn as different, but the acquisition cycles may be aligned.
 First, the setting of the specific position Pk of the first real object 320 will be described. In setting the specific position Pk of the first real object 320, the processor 16 executes the first position setting process in the first display update cycle Fα, which immediately follows acquisition of the observation position of the first real object 320 present in front of the host vehicle 1 from the one or more I/O interfaces 14, and executes the second position setting process in the second display update cycle Fβ, which does not immediately follow acquisition of the observation position Im of the first real object 320. In the figure, display update cycles k-1 and k+1 are first display update cycles Fα immediately following acquisition of the observation positions Im-1 and Im, and display update cycles k-2, k, and k+2 are second display update cycles Fβ that do not immediately follow acquisition of the observation positions Im-1 and Im. Since display update cycle k is a second display update cycle Fβ, the specific position Pk in display update cycle k is determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions Im-1, Im-2, Im-3, and Im-4. Since the next display update cycle k+1 is a first display update cycle Fα, the specific position Pk+1 in display update cycle k+1 is determined by the first position setting process; specifically, it is set to the observation position Im of the first real object 320 acquired immediately before. Since the next display update cycle k+2 is a second display update cycle Fβ, the specific position Pk+2 in display update cycle k+2 is determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions Im, Im-1, Im-2, and Im-3. There is no particular restriction on how the real object position setting module 504 calculates the predicted position; any method may be used as long as the prediction is based on observation positions acquired before the display update cycle being processed (for example, display update cycle k in FIG. 4, using observation positions Im-1, Im-2, and so on). The real object position setting module 504 may, for example, predict the next value from one or more past observation positions using the least squares method or a prediction algorithm such as a Kalman filter, an α-β filter, or a particle filter.
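 Of the predictors named above, the α-β filter is compact enough to show in full. The sketch below is a generic textbook α-β filter applied to one axis, not an implementation taken from the disclosure; the gains and time step are illustrative.

```python
def alpha_beta_predict(observations, dt=1.0, alpha=0.85, beta=0.05):
    """Predict the next position on one axis with a textbook alpha-beta filter.

    observations -- past observed positions, oldest first
    dt -- time between observations (here, one update cycle)
    alpha, beta -- filter gains
    """
    x_est, v_est = observations[0], 0.0
    for z in observations[1:]:
        # Predict one step ahead, then correct with the new observation.
        x_pred = x_est + v_est * dt
        residual = z - x_pred
        x_est = x_pred + alpha * residual
        v_est = v_est + (beta / dt) * residual
    return x_est + v_est * dt  # predicted position for the coming cycle

# e.g. predicting Pk from the four prior observations Im-4 .. Im-1:
print(alpha_beta_predict([10.0, 10.4, 10.9, 11.3]))
```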
 Next, the setting of the specific position Qk of the second real object 330 will be described. In setting the specific position Qk of the second real object 330, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In the figure, display update cycles k-2, k, and k+2 are first display update cycles Fα immediately following acquisition of the observation positions In-2, In-1, and In, and display update cycles k-1 and k+1 are second display update cycles Fβ that do not immediately follow acquisition of the observation positions In-2, In-1, and In. That is, although display update cycle k is a first display update cycle Fα, the specific position Qk in display update cycle k is determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions In-1, In-2, In-3, and In-4. The specific position Qk+1 in the next display update cycle k+1 is likewise determined by the second position setting process; specifically, it is predicted based on the same four preceding observation positions In-1, In-2, In-3, and In-4. Although the next display update cycle k+2 is a first display update cycle Fα, the specific position Qk+2 in display update cycle k+2 is also determined by the second position setting process; specifically, it is predicted based on the four preceding observation positions In, In-1, In-2, and In-3.
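 The resulting per-image policy (observation-based updates only for the first AR image, prediction in every cycle for the second) fits in a small dispatch table; the sketch below is hypothetical.

```python
# Hypothetical policy table: which position setting process runs per cycle.
POLICY = {
    "first_ar_image":  {"F_alpha": "first_process",  "F_beta": "second_process"},
    "second_ar_image": {"F_alpha": "second_process", "F_beta": "second_process"},
}

def choose_process(image_type, cycle):
    return POLICY[image_type][cycle]

assert choose_process("first_ar_image", "F_alpha") == "first_process"
assert choose_process("second_ar_image", "F_alpha") == "second_process"
```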
 In the first display update cycle Fα, the difference determination module 506 of FIG. 2 compares the observation position acquired immediately before with the predicted position predicted from one or more observation positions including at least that observation position, and determines whether their difference is larger than a predetermined difference threshold stored in the memory 18. If the difference is larger than the predetermined threshold, the processor 16 may execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ, continue this for a predetermined number of display update cycles, and thereafter execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. If the difference is not larger than the predetermined threshold, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. The memory 18 may store two or more difference thresholds, and the difference determination module 506 may grade the difference between the observation position acquired immediately before and the predicted position in three or more levels. These difference thresholds may also be variable. For example, the difference determination module 506 may vary them according to the relative speed between the real object 310 and the host vehicle 1, in which case the difference threshold may be set larger as the relative speed increases.
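 One way to read this behavior is sketched below: a large observation/prediction difference enables observation-based updates in Fα for a fixed number of cycles, after which the display reverts to prediction only. The threshold, hold count, and names are all invented for illustration.

```python
def select_process(observed, predicted, cycle, cycles_since_large_diff,
                   diff_threshold=0.5, hold_cycles=5):
    """Return (process, updated cycles_since_large_diff) for this cycle.

    cycles_since_large_diff -- cycles elapsed since the last large
    observation/prediction difference, or None if none was seen.
    """
    if cycle == "F_alpha" and abs(observed - predicted) > diff_threshold:
        cycles_since_large_diff = 0  # large difference: start the hold window
    if cycles_since_large_diff is not None and cycles_since_large_diff < hold_cycles:
        # Within the hold window: reflect observations quickly in F_alpha.
        process = "first_process" if cycle == "F_alpha" else "second_process"
        cycles_since_large_diff += 1
    else:
        # Otherwise: smooth display, prediction-based process only.
        process = "second_process"
    return process, cycles_since_large_diff
```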
 In some embodiments, the difference determination module 506 may also determine whether the observation position is nearer to the host vehicle 1 than the predicted position. When the difference determination module 506 determines that the observation position of the real object acquired immediately before is nearer to the host vehicle 1 than the predicted position, the processor 16 may execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; when the observation position acquired immediately before is not nearer to the host vehicle 1 than the predicted position, it may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the real object is assumed to be closer than predicted, the AR image 210 can quickly be displayed with reference to the observation position, where the real object is most likely actually located.
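 In host-vehicle coordinates, this nearness test can be as simple as comparing depth distances; the one-liner below is a hypothetical rendering of it.

```python
def observation_is_nearer(observed_z, predicted_z):
    # Hypothetical: both values are depth distances [m] ahead of the host
    # vehicle; the observation-based process is preferred when the object
    # is observed closer than predicted.
    return observed_z < predicted_z
```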
 The distance determination module 508 determines the degree of the distance between the real object 310 and the host vehicle 1. For example, the distance determination module 508 may determine whether the distance between the real object 310 and the host vehicle 1, obtainable by executing the real object related information detection module 502, is longer than a predetermined distance threshold stored in the memory 18. The memory 18 may store two or more distance thresholds, and the distance determination module 508 may grade the distance between the real object 310 and the host vehicle 1 in three or more levels. These distance thresholds may also be variable. For example, the distance determination module 508 may vary them according to the relative speed between the real object 310 and the host vehicle 1, in which case the distance threshold may be set longer as the relative speed increases.
 The speed determination module 510 determines the degree of the relative speed between the real object 310 and the host vehicle 1. For example, the speed determination module 510 may determine whether the relative speed between the real object 310 and the host vehicle 1, calculated from the change over time in the distance between them obtainable by executing the real object related information detection module 502, is faster than a predetermined relative speed threshold stored in the memory 18. The memory 18 may store two or more relative speed thresholds, and the speed determination module 510 may grade the relative speed between the real object 310 and the host vehicle 1 in three or more levels. These relative speed thresholds may also be variable. For example, the speed determination module 510 may vary them according to the distance between the real object 310 and the host vehicle 1, in which case the relative speed threshold may be set faster as the distance increases.
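 A minimal sketch of this determination follows: the relative speed is derived from the time change of the measured distance and compared against a threshold that grows with distance. All constants are illustrative.

```python
def relative_speed(dist_prev, dist_now, dt):
    """Closing speed [m/s] from two distance samples taken dt seconds apart."""
    return (dist_prev - dist_now) / dt  # positive when the gap is shrinking

def speed_degree_is_high(speed, distance, base_threshold=2.0, per_meter=0.02):
    # Hypothetical variable threshold: the farther the object, the faster the
    # relative speed must be before the degree is judged high.
    return speed > base_threshold + per_meter * distance

v = relative_speed(30.0, 28.5, dt=0.5)  # 3.0 m/s closing speed
print(speed_degree_is_high(v, 28.5))    # True with these constants
```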
 The notification necessity determination module 512 determines whether each image 200 displayed by the vehicular display system 10 is content that should be notified to the driver 4. The notification necessity determination module 512 may acquire information from the various other electronic devices connected to the I/O interface 14 and calculate a notification necessity degree. Alternatively, an electronic device connected to the I/O interface 14 in FIG. 2 may transmit information to the vehicle ECU 401, and the notification necessity determination module 512 may detect (acquire) the notification necessity degree determined by the vehicle ECU 401 based on the received information. The "notification necessity degree" may be determined, for example, from a degree of danger derived from the seriousness of a possible situation, a degree of urgency derived from the length of the reaction time required before a responsive action must be taken, a degree of effectiveness derived from the circumstances of the host vehicle 1 or the driver 4 (or other occupants of the host vehicle 1), or a combination of these (the indicators of the notification necessity degree are not limited to these). That is, the notification necessity determination module 512 determines whether the driver 4 should be notified, and may also choose not to display the attention-calling image 220, the route image 230, or both. The vehicular display system 10 need not have the function of estimating (calculating) the notification necessity degree, and part or all of that function may be provided separately from the display control device 13 of the vehicular display system 10.
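 The combination of danger, urgency, and effectiveness is left open; a weighted sum is one obvious hypothetical choice, sketched below with invented weights and threshold.

```python
def notification_necessity(danger, urgency, effectiveness,
                           weights=(0.5, 0.3, 0.2)):
    """Combine three factors (each normalized to 0..1) into one score."""
    w_d, w_u, w_e = weights
    return w_d * danger + w_u * urgency + w_e * effectiveness

def should_display(score, threshold=0.4):
    # Below the threshold, the module may choose not to display the image.
    return score >= threshold

print(should_display(notification_necessity(0.8, 0.5, 0.3)))  # True (score 0.61)
```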
 The image position determination module 514 determines, based on the determined position (observation position or predicted position) of the real object 310 set by the real object position setting module 504, the coordinates of the image 200 (including at least the left-right direction (X-axis direction) and the up-down direction (Y-axis direction) as seen when the driver 4 looks toward the display area 100 from the driver's seat of the host vehicle 1) so that the image 200 is visually recognized in a specific positional relationship with the real object 310. In addition, the image position determination module 514 may determine the front-rear direction (Z-axis direction) as seen when the driver 4 looks toward the display area 100 from the driver's seat, based on the determined position of the real object 310 set by the real object position setting module 504. The image position determination module 514 adjusts the position of the image 200 based on the eye positions of the driver 4 detected by the eye position detection unit 411. For example, the image position determination module 514 determines the left-right and up-down position of the image 200 so that the center of the image 200 is seen overlapping the center of the real object. The "specific positional relationship" can be adjusted according to the circumstances of the real object or the host vehicle 1, the type of the real object, the type of the image to be displayed, and so on.
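 In the simplest pinhole model, placing the image so that its center is seen over the object's center amounts to intersecting the eye-to-object line with the display plane. The sketch below assumes a flat virtual image plane at depth z_disp in the vehicle coordinate system defined earlier; this geometry is an illustration, not the disclosed optics.

```python
def image_position_on_display(eye, obj, z_disp):
    """Intersect the eye-to-object line of sight with the plane z = z_disp.

    eye, obj -- (x, y, z) in vehicle coordinates (Z forward, Y up)
    Returns the (x, y) at which to center the image so that its center is
    seen overlapping the object's center from the detected eye position.
    """
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = (z_disp - ez) / (oz - ez)  # fraction of the way to the object
    return (ex + t * (ox - ex), ey + t * (oy - ey))

# Eye at the driver's head, object 30 m ahead, virtual image plane 3 m ahead:
print(image_position_on_display((0.0, 1.2, 0.0), (1.5, 0.8, 30.0), 3.0))
```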
 The image size determination module 516 may change the size of the AR image 210 according to the position and/or size of the real object 310 with which it is associated. For example, the image size determination module 516 can reduce the size of the AR image 210 if the associated real object 310 is located far away, and can increase the size of the AR image 210 if the associated real object 310 is large.
 The image size determination module 516 may also determine the size of the image 200 based on the type and number of real objects with which the image 200 is displayed in association, as detected by the real object related information detection module 502, and/or on the magnitude of the notification necessity degree detected (estimated) by the notification necessity determination module 512.
 The image size determination module 516 may have a function of predictively calculating the size at which to display the AR image 210 in the current display update cycle, based on the sizes of the real object over a predetermined number of past cycles. As a first technique, the image size determination module 516 may track the pixels of the real object 310 between two past images captured by a camera (an example of the vehicle exterior sensor 407), using for example the Lucas-Kanade method, thereby predicting the size of the real object in the current display update cycle and determining the size of the AR image to match the predicted size. As a second technique, it may obtain the rate of change of the real object's size from the change in that size between two past captured images, and determine the size of the AR image according to that rate of change. The method of estimating the size change of a real object from a viewpoint that changes over time is not limited to the above; known techniques may be used, including optical flow estimation algorithms such as the Horn-Schunck method, the Buxton-Buxton method, and the Black-Jepson method.
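 The second technique above, extrapolating size from the frame-to-frame change rate, is short enough to sketch; the assumption that the rate persists for one more cycle is ours.

```python
def predict_size(size_prev, size_now):
    """Extrapolate the object's apparent size one display update cycle ahead,
    assuming the frame-to-frame change rate r = size_now / size_prev persists."""
    return size_now * (size_now / size_prev)

# An object that grew from 48 px to 54 px is expected at ~60.8 px next cycle;
# the AR image size would be scaled to match.
print(predict_size(48.0, 54.0))  # 60.75
```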
 The graphic module 518 includes various known software components for changing the visual effects (for example, luminance, transparency, saturation, contrast, or other visual characteristics), size, display position, and distance (the distance from the driver 4 to the image 200) of the displayed image 200. The graphic module 518 displays the image 200 so that it is visually recognized by the driver 4 at the coordinates set by the image position determination module 514 (including at least the left-right direction (X-axis direction) and the up-down direction (Y-axis direction) as seen when the driver 4 looks toward the display area 100 from the driver's seat of the host vehicle 1) and at the image size set by the image size determination module 516.
 As described above, the display control device 13 of the present embodiment, which controls the image display unit 11 that displays the image 200 superimposed at a position associated with a real object present in the foreground seen by the driver 4 of the host vehicle 1, comprises one or more I/O interfaces 14, one or more processors 16, a memory 18, and one or more computer programs stored in the memory 18 and configured to be executed by the one or more processors 16. The one or more processors 16 can execute a first position setting process, which sets the position of the image 200 based on the most recently acquired position of the real object, and a second position setting process, which sets the position of the image 200 based on the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position. In a first display update cycle Fα, which immediately follows acquisition of the position of a real object ahead of the host vehicle 1 from the one or more I/O interfaces 14, the first position setting process is executed; in a second display update cycle Fβ, which does not immediately follow such an acquisition, the second position setting process is executed. In this way, the display position of the AR image is normally determined from the position predicted from past observations, making abrupt changes in the display position unlikely, while in the display update cycle immediately after an observed position of the real object is acquired, the display position of the AR image is determined from that observed position, allowing the viewer to quickly recognize the accurate position of the real object.
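 This selection between the two position setting processes can be sketched as follows, assuming a render loop that updates more often than the sensor delivers object positions, and using linear extrapolation as one possible predictor; all identifiers are illustrative rather than prescribed by this disclosure:

    from typing import List, Tuple

    Position = Tuple[float, float]  # (x, y) in display-area coordinates

    def predict_position(history: List[Position]) -> Position:
        # Second position setting process: predict from past observations
        # (here, linear extrapolation of the two most recent positions).
        if len(history) < 2:
            return history[-1]
        (x0, y0), (x1, y1) = history[-2], history[-1]
        return (2 * x1 - x0, 2 * y1 - y0)

    def image_position(new_observation: bool,
                       history: List[Position]) -> Position:
        if new_observation:
            # First display update cycle Fα: first position setting process,
            # i.e. place the image at the most recently observed position.
            return history[-1]
        # Second display update cycle Fβ: second position setting process,
        # i.e. place the image at the predicted position.
        return predict_position(history)

 In Fα the image snaps to the fresh observation; in each Fβ in between it follows the extrapolated path, which is what keeps its motion smooth.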
 In some embodiments, the image 200 includes a first AR image 220 and a second AR image 230 of a type different from the first AR image 220, and the one or more processors 16 may, for the first AR image 220, execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ, and, for the second AR image 230, execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, for one of the two types of displayed images, the position can be updated using only the predicted position of the real object, so that a smoothly changing image in which abrupt changes are suppressed can be displayed.
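 Read as a per-image-type policy, this embodiment might look like the following sketch, reusing the illustrative helpers above:

    def position_for(image_type: str, new_observation: bool,
                     history: List[Position]) -> Position:
        if image_type == "first_ar":  # e.g. the warning image 220
            # Fα: observed position; Fβ: predicted position.
            return image_position(new_observation, history)
        # e.g. the route image 230: predicted position in both Fα and Fβ,
        # so that its motion stays smooth.
        return predict_position(history)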
 In some embodiments, the first AR image 220 may be a warning image that calls attention to a real object, and the second AR image 230 may be a route image showing the route of the host vehicle 1.
 The first AR image 220 is an image with a relatively high notification necessity, and may be an image indicating, for example, a sign such as "Stop" on the road surface, a road surface condition with a risk of slipping (wet or frozen), or a point where the host vehicle 1 switches between automated and manual driving. The second AR image 230 is an image with a lower notification necessity than the first AR image 220, and may be an image indicating, for example, road signs of lower importance than "Stop", billboards, POI information, information on the direction of the final destination, and the like.
 In some embodiments, in the first display update cycle Fα, the one or more processors 16 compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position. If the difference between them is larger than a predetermined threshold, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the difference is not larger than the predetermined threshold, the second position setting process is executed in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the observed position and the predicted position of the real object diverge, the image is quickly displayed at a position based on the latest observed position, so that the visual attention of the driver 4 can be promptly directed to the AR image and to the real object associated with it.
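 A sketch of this divergence gate, again reusing the illustrative helpers above; the Euclidean distance and the single scalar threshold are assumptions, since the text only specifies a "difference" and a "predetermined threshold":

    import math

    def gated_position(new_observation: bool, history: List[Position],
                       threshold: float) -> Position:
        predicted = predict_position(history)
        if new_observation and math.dist(history[-1], predicted) > threshold:
            # Fα with a large observed-vs-predicted divergence: snap to the
            # observation (first position setting process).
            return history[-1]
        return predicted  # otherwise keep the smooth predicted position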
 In some embodiments, in the first display update cycle Fα, the one or more processors 16 compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position. If the most recently acquired position of the real object is closer to the host vehicle 1 than the predicted position, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if it is not closer, the second position setting process is executed in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the latest observed position of the real object is closer to the host vehicle 1 than the predicted position and a rapid approach of the real object can therefore be assumed, displaying the image at a position based on the latest observed position allows the visual attention of the driver 4 to be promptly directed to the AR image and the real object associated with it.
 In some embodiments, the one or more processors 16 acquire, from the one or more I/O interfaces 14, information from which the relative speed between the host vehicle 1 and the real object serving as the reference for setting the position of the image 200 can be estimated. If the relative speed is faster than a predetermined threshold, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the relative speed is not faster than the predetermined threshold, the second position setting process is executed in both the first display update cycle Fα and the second display update cycle Fβ. In this way, for a real object with a high relative speed, the latest observed position is quickly reflected in the image to attract visual attention, while for a real object with a low relative speed, a momentary difference between the observed and predicted positions is not immediately reflected in the image, so that visual attention is not drawn to it excessively.
 In some embodiments, when it is determined that the real object serving as the reference for setting the position of the image 200 is approaching, the one or more processors 16 execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; when it is not determined that the real object is approaching, they execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the real object can be assumed to be approaching, displaying the image at a position based on the latest observed position allows the visual attention of the driver 4 to be promptly directed to the AR image and the real object associated with it.
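 The conditions used in the last three embodiments (observed position closer than predicted, high relative speed, object approaching) all gate the same decision in Fα and could be expressed as interchangeable predicates; a sketch under the same illustrative assumptions, where dist_to_vehicle is an assumed helper returning the distance from the host vehicle to a position:

    def closer_than_predicted(observed: Position, predicted: Position,
                              dist_to_vehicle) -> bool:
        return dist_to_vehicle(observed) < dist_to_vehicle(predicted)

    def relative_speed_exceeds(relative_speed: float,
                               threshold: float) -> bool:
        return relative_speed > threshold

    def object_approaching(closing_speed: float) -> bool:
        # One simple criterion: a positive closing speed, i.e. the distance
        # to the real object is shrinking.
        return closing_speed > 0.0

 Whichever predicate an embodiment uses, a True result selects the first position setting process in Fα, and a False result keeps the second position setting process in both Fα and Fβ.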
 The display area 100 is not limited to an arrangement that, as shown in FIG. 1, lies roughly along a plane of the up-down and left-right directions (the XY plane) as seen from the driver 4. For example, the display area 100 may be rotated about the left-right direction (X-axis direction) as seen from the driver 4 and arranged roughly along the traveling lane 330 (the ZX plane). The display area 100 may also be a curved surface rather than a plane. Further, a stereoscopic display may be adopted as the image display unit 11, and the image 200 may be displayed in a display area 100 that is a three-dimensional region.
 The second AR image 230 may be a route image composed of two or more illustrations or the like visually recognized as superimposed on the traveling lane 330 of the host vehicle 1.
1…host vehicle, 2…front windshield, 4…driver, 5…dashboard, 10…vehicle display system, 11…image display unit, 11a…display light, 13…display control device, 14…I/O interface, 16…processor, 18…memory, 20…image processing circuit, 100…display area, 200…image, 210…AR image, 220…first AR image (warning image), 230…second AR image (route image), 300…foreground, 310…real object, 320…first real object (preceding vehicle), 330…second real object (traveling lane), 401…vehicle ECU, 403…road information database, 405…host vehicle position detection unit, 407…vehicle exterior sensor, 409…gaze direction detection unit, 411…eye position detection unit, 413…portable information terminal, 420…external communication connection device, 502…real object related information detection module, 504…real object position setting module, 506…difference determination module, 508…distance determination module, 510…speed determination module, 512…notification necessity determination module, 514…image position determination module, 516…image size determination module, 518…graphics module, Fα…first display update cycle, Fβ…second display update cycle

Claims (15)

  1.  A display control device for controlling an image display unit that displays an image superimposed at a position associated with a real object present in the foreground seen by a driver of a host vehicle, the display control device comprising:
     one or more I/O interfaces;
     one or more processors;
     a memory; and
     one or more computer programs stored in the memory and configured to be executed by the one or more processors,
     wherein the one or more processors are capable of executing:
      a first position setting process of setting the position of the image based on the most recently acquired position of the real object; and
      a second position setting process of setting the position of the image based on a predicted position of the real object in a display update cycle of the image, the predicted position being predicted from one or more previously acquired positions of the real object including at least the most recently acquired position,
     and wherein the one or more processors:
      execute the first position setting process in a first display update cycle immediately after the position of the real object present ahead of the host vehicle is acquired from the one or more I/O interfaces; and
      execute the second position setting process in a second display update cycle that is not immediately after the position of the real object present ahead of the host vehicle is acquired from the one or more I/O interfaces.
  2.  The display control device according to claim 1, wherein the image includes a first image and a second image of a type different from the first image, and
     the one or more processors:
      for the first image,
       execute the first position setting process in the first display update cycle, and
       execute the second position setting process in the second display update cycle; and
      for the second image,
       execute the second position setting process in both the first display update cycle and the second display update cycle.
  3.  The display control device according to claim 2, wherein the first image is a warning image that calls attention to the real object, and
     the second image is a route image showing a route of the host vehicle.
  4.  The display control device according to claim 1, wherein, in the first display update cycle, the one or more processors compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position, and:
      when the difference between them is larger than a predetermined threshold,
       execute the first position setting process in the first display update cycle, and
       execute the second position setting process in the second display update cycle; and
      when the difference is not larger than the predetermined threshold,
       execute the second position setting process in both the first display update cycle and the second display update cycle.
  5.  The display control device according to claim 1, wherein, in the first display update cycle, the one or more processors compare the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position, and:
      when the most recently acquired position of the real object is closer to the host vehicle than the predicted position of the real object,
       execute the first position setting process in the first display update cycle, and
       execute the second position setting process in the second display update cycle; and
      when the most recently acquired position of the real object is not closer to the host vehicle than the predicted position of the real object,
       execute the second position setting process in both the first display update cycle and the second display update cycle.
  6.  The display control device according to claim 1, wherein the one or more processors acquire, from the one or more I/O interfaces, information from which a relative speed between the host vehicle and the real object serving as a reference for setting the position of the image can be estimated, and:
      when the relative speed is faster than a predetermined threshold,
       execute the first position setting process in the first display update cycle, and
       execute the second position setting process in the second display update cycle; and
      when the relative speed is not faster than the predetermined threshold,
       execute the second position setting process in both the first display update cycle and the second display update cycle.
  7.  The display control device according to claim 1, wherein the one or more processors:
      when it is determined that the real object serving as a reference for setting the position of the image is approaching,
       execute the first position setting process in the first display update cycle, and
       execute the second position setting process in the second display update cycle; and
      when it is not determined that the real object serving as a reference for setting the position of the image is approaching,
       execute the second position setting process in both the first display update cycle and the second display update cycle.
  8.  A method of controlling an image display unit that displays an image superimposed at a position associated with a real object present in the foreground seen by a driver of a host vehicle, wherein a first position setting process of setting the position of the image based on the most recently acquired position of the real object, and a second position setting process of setting the position of the image based on a predicted position of the real object in a display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position, are executable, the method comprising:
      executing the first position setting process in a first display update cycle immediately after the position of the real object present ahead of the host vehicle is acquired; and
      executing the second position setting process in a second display update cycle that is not immediately after the position of the real object present ahead of the host vehicle is acquired.
  9.  The method according to claim 8, wherein the image includes a first image and a second image of a type different from the first image, the method comprising:
      for the first image,
       executing the first position setting process in the first display update cycle, and
       executing the second position setting process in the second display update cycle; and
      for the second image,
       executing the second position setting process in both the first display update cycle and the second display update cycle.
  10.  The method according to claim 9, wherein the first image is a warning image that calls attention to the real object, and
      the second image is a route image showing a route of the host vehicle.
  11.  The method according to claim 8, comprising:
      comparing, in the first display update cycle, the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position;
      when the difference between them is larger than a predetermined threshold,
       executing the first position setting process in the first display update cycle, and
       executing the second position setting process in the second display update cycle; and
      when the difference is not larger than the predetermined threshold,
       executing the second position setting process in both the first display update cycle and the second display update cycle.
  12.  The method according to claim 8, comprising:
      comparing, in the first display update cycle, the most recently acquired position of the real object with the predicted position of the real object in the display update cycle of the image, predicted from one or more previously acquired positions of the real object including at least the most recently acquired position;
      when the most recently acquired position of the real object is closer to the host vehicle than the predicted position of the real object,
       executing the first position setting process in the first display update cycle, and
       executing the second position setting process in the second display update cycle; and
      when the most recently acquired position of the real object is not closer to the host vehicle than the predicted position of the real object,
       executing the second position setting process in both the first display update cycle and the second display update cycle.
  13.  The method according to claim 8, comprising:
      acquiring information from which a relative speed between the host vehicle and the real object serving as a reference for setting the position of the image can be estimated;
      when the relative speed is faster than a predetermined threshold,
       executing the first position setting process in the first display update cycle, and
       executing the second position setting process in the second display update cycle; and
      when the relative speed is not faster than the predetermined threshold,
       executing the second position setting process in both the first display update cycle and the second display update cycle.
  14.  The method according to claim 8, comprising:
      when it is determined that the real object serving as a reference for setting the position of the image is approaching,
       executing the first position setting process in the first display update cycle, and
       executing the second position setting process in the second display update cycle; and
      when it is not determined that the real object serving as a reference for setting the position of the image is approaching,
       executing the second position setting process in both the first display update cycle and the second display update cycle.
  15.  A computer program comprising instructions for executing the method according to any one of claims 8 to 14.
PCT/JP2020/002504 2019-01-29 2020-01-24 Display control device, method, and computer program WO2020158601A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-013681 2019-01-29
JP2019013681 2019-01-29

Publications (1)

Publication Number Publication Date
WO2020158601A1 (en)

Family

ID=71840946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/002504 WO2020158601A1 (en) 2019-01-29 2020-01-24 Display control device, method, and computer program

Country Status (1)

Country Link
WO (1) WO2020158601A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023010236A1 (en) * 2021-07-31 2023-02-09 Huawei Technologies Co., Ltd. Display method, device and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011119917A (en) * 2009-12-02 2011-06-16 Denso Corp Display device for vehicle
JP2015141155A (en) * 2014-01-30 2015-08-03 Pioneer Corporation Virtual image display device, control method, program, and storage medium
WO2017069038A1 (en) * 2015-10-22 2017-04-27 Nippon Seiki Co., Ltd. Onboard display system
WO2018105052A1 (en) * 2016-12-07 2018-06-14 Mitsubishi Electric Corporation Display control device, display system, and display control method
JP2018151903A (en) * 2017-03-14 2018-09-27 Aisin AW Co., Ltd. Virtual image display device and computer program

Similar Documents

Publication Publication Date Title
EP3339124B1 (en) Autonomous driving system
JP6223630B2 (en) Display control apparatus, display system, display control method, and display control program
US11803053B2 (en) Display control device and non-transitory tangible computer-readable medium therefor
US20210016793A1 (en) Control apparatus, display apparatus, movable body, and image display method
JP7255608B2 (en) DISPLAY CONTROLLER, METHOD, AND COMPUTER PROGRAM
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
JP7014205B2 (en) Display control device and display control program
KR20200131832A (en) Information processing devices, mobile devices and methods, and programs
JP7459883B2 (en) Display control device, head-up display device, and method
WO2020158601A1 (en) Display control device, method, and computer program
WO2016056199A1 (en) Head-up display device, and display method for head-up display
WO2022230995A1 (en) Display control device, head-up display device, and display control method
JP2020086884A (en) Lane marking estimation device, display control device, method and computer program
WO2021200914A1 (en) Display control device, head-up display device, and method
JP2020121607A (en) Display control device, method and computer program
JP2020121704A (en) Display control device, head-up display device, method and computer program
JP7302311B2 (en) Vehicle display control device, vehicle display control method, vehicle display control program
JP7434894B2 (en) Vehicle display device
JP2020199883A (en) Display control device, head-up display device, method and computer program
WO2021200913A1 (en) Display control device, image display device, and method
WO2023145852A1 (en) Display control device, display system, and display control method
JP2020106911A (en) Display control device, method, and computer program
WO2023003045A1 (en) Display control device, head-up display device, and display control method
JP2022077138A (en) Display controller, head-up display device, and display control method
JP2020086882A (en) Display control device, method and computer program

Legal Events

Date Code Title Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20749375; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry in European phase (Ref document number: 20749375; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: JP)