WO2018167815A1 - Display control device and display control method - Google Patents

Display control device and display control method Download PDF

Info

Publication number
WO2018167815A1
WO2018167815A1 (PCT/JP2017/009932, JP2017009932W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
vehicle
display control
control device
obstacle
Prior art date
Application number
PCT/JP2017/009932
Other languages
French (fr)
Japanese (ja)
Inventor
下谷 光生
龍太 久良木
井崎 公彦
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2019505313A priority Critical patent/JP6727400B2/en
Priority to PCT/JP2017/009932 priority patent/WO2018167815A1/en
Publication of WO2018167815A1 publication Critical patent/WO2018167815A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement or adaptations of instruments
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to a display control device and a display control method for controlling a display unit.
  • Patent Documents 1 and 2 propose techniques for displaying a corner pole as a virtual image or hologram of a HUD (head-up display), so that hardware cost can be reduced.
  • However, the driver may also want to visually check the positional relationship between an obstacle and a part of the vehicle other than the part where the corner pole is provided.
  • Since the corner pole is fixed, such visual recognition cannot be performed.
  • To address this, installing a plurality of corner poles or providing a mechanism that mechanically moves a single corner pole is conceivable.
  • However, this degrades the vehicle's appearance and increases hardware cost.
  • the present invention has been made in view of the above-described problems, and an object thereof is to provide a technique capable of virtually moving a corner pole.
  • the display control device is a display control device that controls a display unit, and the display unit can display one or more display objects that are visible from the driver's seat of the vehicle over the scenery outside the vehicle.
  • The display control device includes an information acquisition unit that acquires information, and a control unit that displays a first display object, which is a display object, on the display unit and that, based on the information acquired by the information acquisition unit, moves the first display object within a predetermined range corresponding to the front end of the vehicle body.
  • the first display object is moved within a predetermined range corresponding to the front end of the vehicle body. According to such a configuration, the corner pole can be virtually moved.
  • FIG. 1 is a block diagram illustrating a configuration of a display control device according to a first embodiment.
  • 6 is a block diagram illustrating a configuration of a display control device according to Embodiment 2.
  • FIG. 10 is a flowchart illustrating an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the display control apparatus according to the second embodiment.
  • FIG. 10 is a flowchart illustrating an operation of a display control apparatus according to a second modification of the second embodiment.
  • 14 is a flowchart illustrating an operation of a display control apparatus according to a fourth modification of the second embodiment.
  • FIG. 16 is a diagram for explaining an operation of a display control apparatus according to Modification 4 of Embodiment 2.
  • FIG. 16 is a diagram for explaining an operation of a display control apparatus according to Modification 4 of Embodiment 2.
  • 10 is a block diagram illustrating a configuration of a display control device according to Embodiment 3.
  • FIG. 14 is a flowchart illustrating an operation of the display control apparatus according to the third embodiment.
  • 10 is a diagram for explaining an operation of a display control apparatus according to Embodiment 3.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to Embodiment 3.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to Embodiment 3.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to Embodiment 3.
  • FIG. 10 is a diagram for explaining the operation of a display control apparatus according to a second modification of the third embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a third modification of the third embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a third modification of the third embodiment.
  • FIG. 25 is a diagram for explaining an operation of a display control apparatus according to Modification 5 of Embodiment 3.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to Modification 5 of Embodiment 3.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to Modification 5 of Embodiment 3.
  • FIG. 25 is a diagram for explaining an operation of a display control apparatus according to Modification 5 of Embodiment 3.
  • FIG. 10 is a diagram for explaining the operation of the display control apparatus according to the fourth embodiment.
  • FIG. 10 is a diagram for explaining the operation of the display control apparatus according to the fourth embodiment.
  • 10 is a flowchart showing the operation of the display control apparatus according to the fourth embodiment.
  • FIG. 10 is a block diagram illustrating a configuration of a display control device according to a fifth embodiment.
  • 10 is a flowchart illustrating an operation of the display control apparatus according to the fifth embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a fifth embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a fifth embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a fifth embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a fifth embodiment
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to a modification of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to a modification of the fifth embodiment.
  • FIG. 25 is a diagram for describing an operation of a display control apparatus according to Modification 1 of Embodiment 5.
  • FIG. 25 is a diagram for describing an operation of a display control apparatus according to Modification 1 of Embodiment 5.
  • FIG. 25 is a diagram for explaining an operation of a display control apparatus according to Modification 2 of Embodiment 5.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to Modification 2 of Embodiment 5.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to Modification 2 of Embodiment 5.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to Modification 2 of Embodiment 5.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 3 of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 3 of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 3 of the fifth embodiment.
  • FIG. 25 is a diagram for describing an operation of a display control apparatus according to Modification 4 of Embodiment 5.
  • FIG. 25 is a diagram for describing an operation of a display control apparatus according to Modification 4 of Embodiment 5.
  • FIG. 25 is a diagram for describing an operation of a display control apparatus according to Modification 4 of Embodiment 5.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 5 of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 5 of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 5 of the fifth embodiment.
  • FIG. 25 is a diagram for explaining the operation of a display control apparatus according to modification 6 of the fifth embodiment.
  • FIG. 10 is a block diagram illustrating a configuration of a display control device according to a sixth embodiment.
  • 18 is a flowchart showing the operation of the display control apparatus according to the sixth embodiment.
  • FIG. 10 is a diagram for explaining an operation of a display control apparatus according to a sixth embodiment.
  • 25 is a diagram for explaining an operation of a display control apparatus according to the first modification of the sixth embodiment.
  • A block diagram showing the hardware configuration of a display control device according to another modification.
  • A block diagram showing the hardware configuration of a display control device according to another modification.
  • A block diagram showing the configuration of a server according to another modification.
  • A block diagram showing the configuration of a communication terminal according to another modification.
  • Embodiment 1: The display control device according to Embodiment 1 of the present invention will be described below as being mounted on a vehicle.
  • Hereinafter, the vehicle on which the display control device is mounted and to which attention is directed is referred to as the "own vehicle" (host vehicle).
  • FIG. 1 is a block diagram showing a configuration of the display control apparatus 1 according to the first embodiment.
  • the display control device 1 in FIG. 1 controls a virtual display unit 21 that is a display unit.
  • the virtual display unit 21 can display one or more display objects that are visually recognized from the driver's seat of the host vehicle over the scenery outside the host vehicle. That is, the virtual display unit 21 can display a virtual display object that can be seen by the driver of the host vehicle as if it actually exists in a three-dimensional space in the real world.
  • a HUD that displays a virtual image or a holography, or an autostereoscopic display device is applied to the virtual display unit 21.
  • the display control device 1 in FIG. 1 includes an information acquisition unit 11 and a control unit 12.
  • the information acquisition unit 11 acquires information that can be used to move the display object.
  • This display object includes a first display object indicating a pole for the driver to visually recognize the positional relationship between the vehicle and the obstacle, that is, a pole corresponding to a corner pole.
  • the first display object is described as a “pole object”.
  • As this information, for example, a user operation for moving the pole object is used.
  • The control unit 12 displays the pole object on the virtual display unit 21 and, based on the information acquired by the information acquisition unit 11, moves the pole object within a predetermined range corresponding to the front end of the body of the host vehicle.
  • the predetermined range includes at least one of a range overlapping the front end portion of the body of the host vehicle and a range around the front end portion of the body of the host vehicle. For example, a front bumper of the own vehicle, or a fender and a front bumper of the own vehicle are applied to the front end of the body of the own vehicle.
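  • The behaviour of Embodiment 1 can be pictured with the following minimal Python sketch. It is only an illustration under assumed names and values (the range limits, the class names and the stub display are not part of this disclosure): the control unit clamps every requested position of the first display object to the predetermined range along the front end of the vehicle body.

        from dataclasses import dataclass

        # Assumed vehicle-fixed lateral coordinate (metres) measured along the
        # front end of the body (fender and front bumper); values are examples.
        FRONT_EDGE_LEFT = -0.9    # passenger-seat-side end of the predetermined range
        FRONT_EDGE_RIGHT = 0.9    # driver-seat-side end of the predetermined range

        @dataclass
        class PoleObject:
            x: float  # lateral position of the first display object (pole object)

        class VirtualDisplayStub:
            """Stands in for the virtual display unit 21 in this sketch."""
            def show(self, pole: PoleObject) -> None:
                print(f"pole displayed at x = {pole.x:+.2f} m")

        class ControlUnit:
            """Displays the pole object and moves it only within the predetermined range."""
            def __init__(self, display: VirtualDisplayStub) -> None:
                self.display = display
                self.pole = PoleObject(x=FRONT_EDGE_LEFT)   # like a conventional corner pole

            def on_information(self, requested_x: float) -> None:
                # Information from the information acquisition unit (e.g. a move
                # operation) becomes a new position, clamped to the front-edge range.
                self.pole.x = max(FRONT_EDGE_LEFT, min(FRONT_EDGE_RIGHT, requested_x))
                self.display.show(self.pole)

        unit = ControlUnit(VirtualDisplayStub())
        unit.on_information(0.3)   # moves the pole toward the driver-seat side
        unit.on_information(5.0)   # an out-of-range request is clamped to the right end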
  • FIG. 2 is a block diagram showing a configuration of the display control apparatus 1 according to Embodiment 2 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • the virtual display unit 21 is the same as that in the first embodiment.
  • the operation input unit 22 receives various operations such as an operation of moving the pole object from the driver, and outputs an operation signal indicating the received operation to the display control device 1.
  • the operation input unit 22 will be described below as a touch panel.
  • the operation input unit 22 includes, for example, a switch that receives a driver's push operation, a voice operation input device that receives a driver's voice operation, a gesture operation input device that includes a camera that receives a driver's gesture for space as an operation, And at least any one of the line-of-sight operation input devices including a camera or the like that accepts the movement of the driver's line of sight as an operation may be used. Any device that can input the driver's intention may be used. Some of these will be described in detail in later-described modifications.
  • the display control device 1 includes an operation signal input unit 11a, a virtual display object generation unit 12a, a display position control unit 12b, and a display control unit 12c.
  • the operation signal input unit 11a is included in the concept of the information acquisition unit 11 in FIG. 1 of the first embodiment.
  • the virtual display object generation unit 12a, the display position control unit 12b, and the display control unit 12c are included in the concept of the control unit 12 in FIG. 1 of the first embodiment, as indicated by a broken line in FIG.
  • the operation signal input unit 11a acquires an operation signal indicating the operation from the operation input unit 22. For this reason, when the operation input unit 22 receives an operation of moving the pole object, the operation signal input unit 11a acquires information on the operation of moving the pole object from the operation input unit 22.
  • an operation for moving the pole object is referred to as a “movement operation”.
  • the virtual display object generation unit 12a generates a display object to be displayed on the virtual display unit 21 based on the operation acquired by the operation signal input unit 11a.
  • the virtual display object generation unit 12a generates a pole object based on the moving operation acquired by the operation signal input unit 11a.
  • the virtual display object generation unit 12a may generate a pole object based on a signal indicating that the accessory power supply of the host vehicle is turned on.
  • the display position control unit 12b determines the display position of the display object generated by the virtual display object generation unit 12a based on the operation acquired by the operation signal input unit 11a.
  • the display position control unit 12b determines the position of the virtual image as the display position.
  • the position of the virtual image is a position in a three-dimensional coordinate space with a specific position (for example, a driver's seat or a windshield) of the host vehicle as a reference such as an origin.
  • When the three-dimensional coordinate space is a polar coordinate space, the position of the virtual image is defined by, for example, the virtual image direction, which is the direction of the virtual image, and the virtual image distance, which is the distance to the virtual image.
  • When the coordinate space is an orthogonal coordinate space, the position of the virtual image is defined by, for example, coordinates on three orthogonal coordinate axes defined in the front-rear, left-right, and up-down directions of the vehicle.
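  • For illustration, the two representations mentioned above can be converted into each other as in the short sketch below; the driver's seat is assumed as the origin, and the axis conventions and function names are assumptions rather than part of the disclosure.

        import math

        def cartesian_to_direction_distance(x, y, z):
            """x: forward, y: left, z: up, in metres from the reference point
            (e.g. the driver's seat). Returns (azimuth_deg, elevation_deg, distance)."""
            distance = math.sqrt(x * x + y * y + z * z)        # virtual image distance
            azimuth = math.degrees(math.atan2(y, x))           # virtual image direction (horizontal)
            elevation = math.degrees(math.asin(z / distance))  # virtual image direction (vertical)
            return azimuth, elevation, distance

        def direction_distance_to_cartesian(azimuth_deg, elevation_deg, distance):
            az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
            return (distance * math.cos(el) * math.cos(az),
                    distance * math.cos(el) * math.sin(az),
                    distance * math.sin(el))

        # A pole displayed 2.5 m ahead, 0.8 m to the left and 0.2 m below the driver's eyes:
        print(cartesian_to_direction_distance(2.5, 0.8, -0.2))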
  • the display position control unit 12b determines the display position of the pole object at a position overlapping with the fender and the front bumper of the host vehicle based on the moving operation acquired by the operation signal input unit 11a. .
  • the display control unit 12c controls the virtual display unit 21 so that the display object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b.
  • Specifically, the display control unit 12c controls the virtual display unit 21 so that the pole object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b.
  • FIG. 3 is a flowchart showing the operation of the display control apparatus 1 according to the second embodiment.
  • step S1 the virtual display object generation unit 12a generates a pole object based on the operation acquired by the operation signal input unit 11a.
  • the display position control unit 12b determines the display position of the pole object as the initial position.
  • the display control unit 12c controls the virtual display unit 21 so as to display the pole object at the initial position.
  • the initial position is assumed to be the position of the front end portion on the passenger seat side of the body of the host vehicle, like the position of a general corner pole.
  • FIG. 4 is a view showing a state where the vehicle can be seen through the windshield 31 from the interior of the host vehicle when step S1 is performed.
  • FIG. 4 shows the bonnet 32 and the rearview mirror 33 of the host vehicle, and the pole object 34 displayed at the same initial position as a corner pole. Note that the bonnet 32 may not be visible depending on the vehicle type of the host vehicle.
  • FIG. 5 is a diagram schematically showing the state that can be seen through the windshield 31 from the interior of the host vehicle.
  • In FIG. 5, the reference sign of the host vehicle 2 is attached to the bonnet 32.
  • In step S2, the display position control unit 12b determines whether or not the operation signal input unit 11a has acquired a moving operation. If it is determined that the moving operation has been acquired, the process proceeds to step S3; if not, the process of step S2 is performed again.
  • FIG. 6 is a diagram illustrating an example of an arrangement relationship between the host vehicle 2 and another vehicle 51a that is an obstacle.
  • FIG. 6 shows not only the pole object 34 displayed at the initial position 2a in step S1, but also the positions 2b and 52 of the respective parts that form the shortest distance between the host vehicle 2 and the other vehicle 51a.
  • It is thought that a driver who is about to drive the host vehicle 2 in the vicinity of the other vehicle 51a gauges the distance between the positions 2b and 52 in order to judge whether the host vehicle 2 will collide with the other vehicle 51a. At this time, if the pole object 34 is displayed at the position 2b instead of the initial position 2a, the collision judgment can be expected to become easier. Therefore, in the case of FIG. 6, the driver performs a moving operation in step S2 of FIG. 3, and the process proceeds to step S3.
  • step S3 the display position control unit 12b determines the display position of the pole object 34 based on the moving operation acquired by the operation signal input unit 11a.
  • the display control unit 12c controls the virtual display unit 21 so that the pole object 34 is displayed at the display position determined by the display position control unit 12b. Thereby, the pole object 34 moves. Thereafter, the process returns to step S2.
  • FIG. 7 is a view showing a state where the vehicle can be seen through the windshield 31 from the interior of the host vehicle when Step S2 and Step S3 are performed. As shown in FIG. 7, step S2 and step S3 are repeatedly performed, so that the pole object 34 moves from the initial position to a position intended by the driver.
  • When the virtual display unit 21 is a HUD that displays a virtual image and the virtual image distance is defined with the driver's seat as the reference, the virtual image distance of the pole object 34 becomes longer as the pole object 34 moves from the driver-seat side toward the passenger-seat side.
  • FIG. 8 is a diagram showing a state in which the pole object 34 has moved from the initial position 2a to the position 2b in the arrangement relationship of FIG.
  • The driver of the host vehicle 2 can thus drive the host vehicle 2 so that it does not collide with the other vehicle 51a while gauging the distance between the positions 2b and 52 using the pole object 34 as an index.
  • the pole object 34 can be moved based on the operation of moving the pole object 34. Therefore, the user can move the pole object 34 to the intended position.
  • In addition, since the initial position of the pole object 34 is the position of the front end on the passenger-seat side of the body of the host vehicle, the pole object 34 can also be used in the same manner as a conventional corner pole.
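  • The loop of steps S1 to S3 can be condensed into the following sketch; the initial position, the step size and the event encoding are assumptions made only for illustration.

        INITIAL_X = -0.9   # assumed front end on the passenger-seat side, like a corner pole
        STEP_M = 0.05      # assumed distance one move operation shifts the pole

        class Display:
            def show(self, x: float) -> None:
                print(f"pole at x = {x:+.2f} m")

        def run(display: Display, operations) -> None:
            """operations: move operations acquired by the operation signal input unit,
            encoded here as +1 (toward the driver side) or -1 (toward the passenger side)."""
            x = INITIAL_X
            display.show(x)                          # step S1: display at the initial position
            for op in operations:                    # step S2: a move operation was acquired
                x = max(-0.9, min(0.9, x + op * STEP_M))
                display.show(x)                      # step S3: redisplay at the new position

        run(Display(), [+1, +1, +1])                 # the pole moves toward the driver-seat side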
  • <Modification 1 of Embodiment 2> In Embodiment 2 described above, it is assumed that, when the virtual display unit 21 is a HUD, the display control unit 12c changes both the virtual image distance and the virtual image direction of the pole object 34 based on the moving operation. However, the present invention is not limited to this, and the display control unit 12c may change the virtual image direction of the pole object 34 without changing the virtual image distance of the pole object 34 based on the moving operation. According to such a configuration, a HUD that does not change the virtual image distance can be used as the virtual display unit 21.
  • the initial position of the pole object may be appropriately calibrated.
  • The calibration may be performed automatically based on a detection result or an estimation result of the driver's eye position, or may be performed manually by the driver setting the pole object to an appropriate display position through an operation on a menu screen.
  • In this modification, the display control device 1 includes a vehicle information acquisition unit included in the concept of the information acquisition unit 11 of FIG. 1.
  • FIG. 9 is a flowchart showing the operation of the display control apparatus 1 according to this modification.
  • the operation in FIG. 9 is the same as that in which step S11 is added between step S1 and step S2 in FIG.
  • In step S11, the vehicle information acquisition unit acquires the speed of the host vehicle from an ECU (Electronic Control Unit) of the host vehicle.
  • The display position control unit 12b determines whether or not the speed of the host vehicle acquired by the vehicle information acquisition unit is lower than a predetermined speed (for example, 10 km/h). If the speed of the host vehicle is lower than the predetermined speed, the process proceeds to step S2; if it is equal to or higher than the predetermined speed, the process returns to step S1.
  • If it is determined in step S2 that the moving operation has been acquired, the process proceeds to step S3; if not, the process returns to step S11. After the process of step S3 is performed, the process also returns to step S11.
  • As a result, while the host vehicle is traveling at or above the predetermined speed, the position of the pole object 34 is fixed at the initial position. According to such a configuration, the pole object 34 can be prevented from interfering with the driver's vision while the host vehicle is traveling.
  • The vehicle information acquisition unit described above may acquire, instead of the speed of the host vehicle, automatic driving information indicating whether or not automatic driving is being performed in the host vehicle from the ECU of the host vehicle. In that case, in step S11 of FIG. 9, the display position control unit 12b may determine whether or not the automatic driving information acquired by the vehicle information acquisition unit indicates that automatic driving is being performed in the host vehicle, and the process may proceed to step S2 when automatic driving is not indicated and return to step S1 when it is indicated.
  • the display position control unit 12b configured in this manner fixes the position of the pole object 34 at the initial position when the automatic driving information indicates that automatic driving is being performed in the host vehicle. According to such a configuration, similarly to the above, it is possible to suppress the pole object 34 from interfering with the driver's vision during the automatic driving of the host vehicle.
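  • The gating of this modification can be sketched as below; the 10 km/h threshold comes from the text, while the function name and the ECU interface are assumptions.

        from typing import Optional

        SPEED_LIMIT_KMH = 10.0   # predetermined speed

        def may_move_pole(speed_kmh: float, autonomous: Optional[bool] = None) -> bool:
            """Return True when a move operation may move the pole object.
            `autonomous` replaces the speed test in the variant that uses
            automatic-driving information instead of the vehicle speed."""
            if autonomous is not None:
                return not autonomous            # fixed at the initial position during automatic driving
            return speed_kmh < SPEED_LIMIT_KMH   # fixed at the initial position at or above 10 km/h

        print(may_move_pole(5.0))         # True  -> move operations are honoured
        print(may_move_pole(30.0))        # False -> pole stays at the initial position
        print(may_move_pole(5.0, True))   # False -> automatic driving is in progress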
  • the display control device 1 includes a peripheral brightness acquisition unit included in the concept of the information acquisition unit 11 of FIG.
  • The surrounding brightness acquisition unit acquires the brightness around the host vehicle measured by a brightness sensor, or acquires an illumination signal indicating that the lights of the host vehicle are turned on.
  • When the brightness around the host vehicle acquired by the surrounding brightness acquisition unit is equal to or less than a threshold value, or when the illumination signal is acquired by the surrounding brightness acquisition unit, the control unit 12 changes the color and brightness of the pole object 34 to a color and brightness that remain conspicuous in a dark environment. According to such a configuration, the pole object 34 can be easily seen even in a dark environment.
  • the color in this specification includes not only a single color but also a color tone such as a combination of a plurality of colors.
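  • As a small sketch of this rule (the threshold value and the colour names are assumptions, not values given in the text):

        DARK_THRESHOLD_LUX = 50.0   # assumed ambient-brightness threshold

        def pole_colour(ambient_lux: float, illumination_signal: bool) -> str:
            # Switch to a colour and brightness that stand out in a dark environment
            # when it is dark outside or the vehicle illumination is on.
            if illumination_signal or ambient_lux <= DARK_THRESHOLD_LUX:
                return "high-visibility colour (e.g. bright amber)"
            return "daytime colour"

        print(pole_colour(20.0, False))    # dark surroundings -> conspicuous colour
        print(pole_colour(800.0, False))   # bright daylight   -> normal colour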
  • the operation input unit 22 in FIG. 2 has been described as using a touch panel.
  • the present invention is not limited to this.
  • a gesture operation input device that detects an instruction direction indicated by the driver's finger may be used as the operation input unit 22. Then, when the pointing direction is detected by the gesture operation input device, the display control device 1 may move the pole object 34 to a position on the pointing direction.
  • the display control apparatus 1 determines that the pole object 34 exists in the instruction direction detected by the gesture operation input device, and when another instruction direction is detected by the gesture operation input device, the other instruction direction The pole object 34 may be moved to the upper position.
  • a line-of-sight operation input device that detects the driver's line-of-sight direction may be used as the operation input unit 22. Then, after determining that the pole object 34 is present in the line-of-sight direction detected by the line-of-sight operation input device, the display control device 1 determines another line-of-sight direction when another line-of-sight direction is detected by the line-of-sight operation input device. The pole object 34 may be moved to the upper position.
  • both the gesture operation input device and the voice operation input device may be used for the operation input unit 22.
  • FIG. 10 is a flowchart showing the operation of the display control apparatus 1 according to this modification when a gesture operation input device and a voice operation input device are used as the operation input unit 22.
  • the operation in FIG. 10 is the same as that in which step S21 is added before step S1 in FIG. 3 and step S2 in FIG. 3 is replaced with steps S22 to S25.
  • In step S21, the control unit 12 determines whether or not the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of a display command for displaying the pole object 34, for example the voice of "corner pole display". If it is determined that the voice of the display command has been acquired, the process proceeds to step S1; if not, the process of step S21 is performed again.
  • step S1 the control unit 12 controls the virtual display unit 21 so as to display the pole object 34 at the initial position.
  • In step S22, the control unit 12 determines whether or not the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of an erasure command for erasing the display of the pole object 34, for example the voice of "corner pole erase". If it is determined that the voice of the erasure command has been acquired, the process proceeds to step S23; if not, the process proceeds to step S24.
  • step S23 the control unit 12 controls the virtual display unit 21 so as to erase the display of the pole object 34. Thereby, the pole object 34 is deleted. Thereafter, the process returns to step S21.
  • In step S24, the control unit 12 determines whether or not the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of a movement command for moving the pole object 34, for example the voice of "movement of the corner pole", together with a gesture operation pointing at the pole object 34. That is, the control unit 12 determines whether the driver has issued the movement command while pointing in the same direction as the pole object 34, as shown in FIG. 11. If it is determined that the voice of the movement command and the gesture operation indicating that direction have been acquired, the process proceeds to step S25; if not, the process returns to step S22. When the process proceeds to step S25, the control unit 12 may cause the tip of the pole object 34 to blink.
  • In step S25, the control unit 12 determines whether or not the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of a determination command for determining the display position of the pole object 34, for example the voice of "decide a corner pole", together with a gesture operation indicating an instruction direction different from that of the pole object 34. That is, the control unit 12 determines whether the driver has issued the determination command while pointing in a direction different from the pole object 34, as shown in FIG. 12. If it is determined that the voice of the determination command and the gesture operation pointing in the other instruction direction have been acquired, the process proceeds to step S3; if not, the process of step S25 is performed again. If the voice of the determination command and the gesture operation indicating another instruction direction have not been acquired even after the process of step S25 has been performed a certain number of times, the process may return to step S22.
  • step S3 the control unit 12 controls the virtual display unit 21 so that the pole object 34 is displayed in the different instruction direction described above. Thereafter, the process returns to step S22.
  • Note that even with a configuration in which only a gesture operation input device is used as the operation input unit 22, and the various commands such as the display command are acquired not from voice but from a specific gesture operation, such as a swing of the driver's finger, the same operation as described above can be realized.
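  • The command handling of steps S21 to S25 can be pictured as the small state machine sketched below. The command strings are simplified versions of the spoken examples in the text, and the state names and function signature are assumptions.

        HIDDEN, DISPLAYED, SELECTED = "hidden", "displayed", "selected"

        def handle(state, voice=None, pointing_at_pole=None):
            """One pass of the S21-S25 loop. pointing_at_pole is True when the gesture
            input device reports the driver pointing in the pole object's direction."""
            if state == HIDDEN and voice == "corner pole display":              # S21 -> S1
                return DISPLAYED
            if state != HIDDEN and voice == "corner pole erase":                # S22 -> S23
                return HIDDEN
            if state == DISPLAYED and voice == "corner pole move" and pointing_at_pole:
                return SELECTED                                                 # S24 (tip may blink)
            if state == SELECTED and voice == "corner pole decide" and pointing_at_pole is False:
                return DISPLAYED            # S25 -> S3: the pole moves to the newly pointed direction
            return state

        s = handle(HIDDEN, voice="corner pole display")
        s = handle(s, voice="corner pole move", pointing_at_pole=True)
        s = handle(s, voice="corner pole decide", pointing_at_pole=False)
        print(s)   # "displayed" again, after the pole has been repositioned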
  • FIG. 13 is a block diagram showing the configuration of the display control apparatus 1 according to Embodiment 3 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • Before describing the internal configuration of the display control device 1, the periphery detection unit 23 will be described.
  • the virtual display unit 21 and the operation input unit 22 are the same as those in the second embodiment.
  • The periphery detection unit 23 acquires information related to obstacles, such as other vehicles, around the host vehicle. This information includes the relative position between the obstacle and the host vehicle. For example, at least one of an ultrasonic sensor, an image recognition device, a laser radar, a millimeter-wave radar, an acoustic recognition device, and a night-vision camera may be used as the periphery detection unit 23; any sensing device capable of detecting obstacles around the host vehicle may be used.
  • the display control device 1 in FIG. 13 has the same configuration as the display control device 1 in FIG. 2 in which an outside information input unit 11b and a relative position acquisition unit 11c are added.
  • the vehicle outside information input unit 11b and the relative position acquisition unit 11c are included in the concept of the information acquisition unit 11 in FIG.
  • the outside-vehicle information input unit 11b acquires information related to obstacles around the host vehicle from the surroundings detection unit 23. This information includes information on relative positions of the obstacle and the host vehicle.
  • the relative position acquisition unit 11c stores in advance the own vehicle internal position, which is any position in the own vehicle necessary for obtaining the relative position acquired by the outside information input unit 11b.
  • the relative position acquisition unit 11c includes the relative position of the obstacle and the host vehicle acquired by the vehicle outside information input unit 11b, the host vehicle internal position stored in advance, and the pole object 34 determined by the display position control unit 12b. Based on the position, a relative position between the obstacle and the pole object 34 is acquired.
  • the control unit 12 changes the display mode of the pole object 34 based on the relative position of the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c.
  • For example, the control unit 12 obtains the pole distance, which is the distance between the obstacle and the pole object 34, based on the relative position acquired by the outside-vehicle information input unit 11b, and changes the color of the pole object 34 based on the pole distance.
  • FIG. 14 is a flowchart showing the operation of the display control device 1 according to Embodiment 3. The operation in FIG. 14 is the same as the operation in FIG. 3 with step S31 added after step S3.
  • In step S31, the control unit 12 obtains the above-described pole distance based on the relative position acquired by the outside-vehicle information input unit 11b, and changes the color of the pole object 34 based on the pole distance. For example, as shown in FIG. 15, when the pole distance dp between the obstacle 51 and the pole object 34 is larger than a predetermined first distance (for example, 40 cm), the control unit 12 changes the color of the pole object 34 to a safety color such as light blue. As shown in FIG. 16, when the pole distance dp is equal to or smaller than the first distance and larger than a predetermined second distance (for example, 20 cm), the control unit 12 changes the color of the pole object 34 to a caution color such as yellow. As shown in FIG. 17, when the pole distance dp is equal to or smaller than the second distance, the control unit 12 changes the color of the pole object 34 to an alarm color such as red. After step S31, the process returns to step S2.
  • As described above, in Embodiment 3, the display mode of the pole object 34 is changed based on the relative position between the obstacle around the host vehicle and the pole object 34. According to such a configuration, when moving the pole object 34, the driver can learn the relative positional relationship, such as the pole distance, from the change in the display mode, such as the color, of the pole object 34.
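  • Step S31 can be sketched with the two thresholds named in the text (40 cm and 20 cm); the colour strings are merely examples.

        FIRST_DISTANCE_M = 0.40    # predetermined first distance
        SECOND_DISTANCE_M = 0.20   # predetermined second distance

        def pole_colour_for_distance(pole_distance_m: float) -> str:
            """Colour of the pole object as a function of the pole distance dp."""
            if pole_distance_m > FIRST_DISTANCE_M:
                return "light blue"   # safety colour
            if pole_distance_m > SECOND_DISTANCE_M:
                return "yellow"       # caution colour
            return "red"              # alarm colour

        for dp in (0.60, 0.30, 0.10):
            print(f"dp = {dp:.2f} m -> {pole_colour_for_distance(dp)}")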
  • In Embodiment 3, the control unit 12 changes the color of the pole object 34 based on the pole distance, but the present invention is not limited to this. For example, as shown in FIG. 18, the control unit 12 may change the color of the pole object 34 based on whether or not the portion 53 of the obstacle 51 that is at the shortest distance from the pole object 34 is located within a fan-shaped range 35 extending forward from the pole object 34.
  • the display mode changed by the control unit 12 has been described as the color of the pole object 34.
  • However, the present invention is not limited to this, and the display mode changed by the control unit 12 may be at least one of the presence or absence of an animation such as blinking of the pole object 34, the color, the shape, and the pattern.
  • In Embodiment 3, the control unit 12 changes the display mode of the pole object 34 based on the relative position between the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c, but the present invention is not limited to this.
  • For example, the control unit 12 may change the display mode of the pole object 34 based on the relative position between the obstacle and the host vehicle acquired by the outside-vehicle information input unit 11b.
  • the control unit 12 may change the display mode of the pole object 34 based on the distance between the obstacle and the bumper of the host vehicle. More specifically, the control unit 12 determines whether or not the distance between the obstacle and the bumper of the host vehicle is equal to or less than a distance (for example, 5 cm) where the obstacle and the bumper are likely to contact each other. Based on the above, the display mode of the pole object 34 may be changed.
  • the periphery detection unit 23 in FIG. 13 may further detect the color of an obstacle around the host vehicle, and the vehicle outside information input unit 11b may further acquire the color of the obstacle detected by the periphery detection unit 23.
  • The control unit 12 may change the color of the pole object 34 based on the color of the obstacle acquired by the outside-vehicle information input unit 11b.
  • the control unit 12 may change the color of the pole object 34 to a color that is the same as or similar to the color of the obstacle acquired by the outside-vehicle information input unit 11b.
  • the color here includes not only one color but also a color tone such as a pattern in which a plurality of colors are combined.
  • The periphery detection unit 23 may detect, for each of a plurality of obstacles around the host vehicle, the relative position between the obstacle and the pole object 34 and the color of the obstacle, and the outside-vehicle information input unit 11b may acquire the information detected by the periphery detection unit 23.
  • Then, the control unit 12 may change the color of the pole object 34 based on the relative positions and colors of the plurality of obstacles acquired by the outside-vehicle information input unit 11b.
  • the control unit 12 may change the color of the pole object 34 based on the color of the obstacle closest to the pole object 34 among the plurality of obstacles 51.
  • FIG. 19 shows an example of this case.
  • the driver can determine which part of the own vehicle is close to which obstacle, and which obstacle affects the running of the own vehicle.
  • In the above description, the control unit 12 changes the color of all parts of the pole object 34 based on the color of the obstacle acquired by the outside-vehicle information input unit 11b, but the present invention is not limited to this.
  • the control unit 12 may change the color of a part of the pole object 34 (for example, the tip) based on the color of the obstacle acquired by the outside information input unit 11b.
  • the control unit 12 may change the color of the remaining part (for example, the side surface) of the pole object 34 based on the relative position between the obstacle and the pole object 34.
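  • One possible reading of this variant is sketched below: the tip of the pole object takes the colour of the nearest obstacle, while the side keeps a distance-based colour. The data layout, thresholds and colour values are assumptions.

        from dataclasses import dataclass

        @dataclass
        class Obstacle:
            distance_to_pole_m: float   # relative position reduced to a distance
            colour: str                 # colour detected by the periphery detection unit

        def pole_appearance(obstacles):
            nearest = min(obstacles, key=lambda o: o.distance_to_pole_m)
            tip_colour = nearest.colour                       # a part of the pole (the tip)
            side_colour = ("light blue" if nearest.distance_to_pole_m > 0.40
                           else "yellow" if nearest.distance_to_pole_m > 0.20
                           else "red")                        # the remaining part (the side)
            return tip_colour, side_colour

        print(pole_appearance([Obstacle(0.80, "white"), Obstacle(0.25, "dark blue")]))
        # ('dark blue', 'yellow'): nearest obstacle's colour on the tip, caution colour on the side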
  • In this modification, the control unit 12 can also control a display unit (not shown), different from the virtual display unit 21, that can perform display within the host vehicle, and the periphery detection unit 23 includes a camera.
  • The periphery detection unit 23 may capture an image of the periphery of the host vehicle with the camera, and the outside-vehicle information input unit 11b may acquire the image from the periphery detection unit 23.
  • Then, the control unit 12 may display the image acquired by the outside-vehicle information input unit 11b on the other display unit that can perform display within the host vehicle, and superimpose a display object corresponding to the pole object 34 on the image.
  • the display object corresponding to the pole object 34 is a display object that is the same as or similar to the pole object 34.
  • FIG. 20 is a diagram illustrating an example of an arrangement relationship between the host vehicle 2 and another vehicle 51a that is an obstacle.
  • FIG. 21 is a diagram showing a state in which the corresponding pole object 37, which is a display object corresponding to the pole object 34, is superimposed on the image 36 acquired by the outside-vehicle information input unit 11b and displayed on the other display unit in the arrangement relationship of FIG. 20. Since the front portion of the other vehicle 51a in FIG. 20 is inclined to the left, strictly speaking, the front portion of the other vehicle 51a should also appear inclined to the left in the image; however, since this is not an important point, the front portion of the other vehicle is drawn without the inclination in FIG. 21.
  • the driver can easily associate the scenery seen directly with the scenery seen from the camera by the pole object 34 and the corresponding pole object 37.
  • a general display device is used for another display unit.
  • the present invention is not limited to this, and a display device capable of displaying a virtual display object using stereoscopic vision may be used on another display unit. In that case, a virtual display object may be applied to the corresponding pole object 37.
  • In Embodiment 3, the control unit 12 moves the pole object 34 based on the driver's moving operation.
  • However, the present invention is not limited to this, and the control unit 12 may automatically move the pole object 34 from one of the passenger-seat side and the driver-seat side to the other within the range overlapping the fender and the front bumper of the host vehicle.
  • According to such a configuration, the driver does not need to perform the moving operation of the pole object 34 when trying to learn the relative positional relationship between an obstacle and the host vehicle from the change in the display mode, so the burden on the driver can be reduced.
  • In the above description, the control unit 12 moved the pole object 34 within the range overlapping the fender and the front bumper of the host vehicle based on the moving operation, as indicated by the two-dot chain lines in FIGS. 22 and 23.
  • However, the present invention is not limited to this, and as indicated by the solid lines in FIGS. 22 and 23, the control unit 12 may move the pole object 34 forward, away from the fender and the front bumper of the host vehicle 2, based on the moving operation. For example, a distance of about 10 cm is used as the maximum separation between the pole object 34 and the fender and front bumper of the host vehicle 2.
  • the control unit 12 can perform the operation described in the third embodiment even when the pole object 34 is moving away from the fender and the front bumper of the host vehicle 2. That is, even when the pole object 34 is moving away from the fender and the front bumper of the host vehicle 2, the control unit 12 determines the pole based on the pole distance between the obstacle around the host vehicle and the pole object 34. The display mode of the object 34 can be changed. Therefore, the driver can confirm how far the vehicle is from the obstacle by moving the pole object 34 to the obstacle side.
  • When the pole object 34 moved in this manner (virtually) reaches the obstacle, the control unit 12 may change the display mode of the pole object 34 to a special display mode indicating the contact.
  • a display object such as a character or a graphic indicating contact may be further displayed on the virtual display unit 21.
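  • A sketch of this modification, assuming the roughly 10 cm maximum forward separation mentioned above; the function name and the way contact is reported are illustrative.

        MAX_FORWARD_OFFSET_M = 0.10   # about 10 cm of separation from the fender/front bumper

        def pole_state(requested_offset_m: float, obstacle_distance_from_body_m: float):
            """Clamp the forward offset of the pole object from the body, and report
            virtual contact when the moved pole reaches the obstacle."""
            offset = max(0.0, min(MAX_FORWARD_OFFSET_M, requested_offset_m))
            if offset >= obstacle_distance_from_body_m:
                # the pole has (virtually) reached the obstacle
                return obstacle_distance_from_body_m, "contact indication"
            return offset, "normal display mode"

        print(pole_state(0.08, 0.05))   # -> (0.05, 'contact indication')
        print(pole_state(0.03, 0.50))   # -> (0.03, 'normal display mode')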
  • the block configuration of the display control apparatus 1 according to the fourth embodiment of the present invention is the same as the block configuration (FIG. 13) of the display control apparatus 1 according to the third embodiment.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • control unit 12 causes the virtual display unit 21 to further display the second display object.
  • As the second display object, a display object that corresponds to a corner pole and that indicates a fixed pole, in the same manner as an actual corner pole, is used.
  • Hereinafter, the first display object, described as the pole object in the above description, is referred to as the "auxiliary pole object", and the second display object is referred to as the "main pole object".
  • FIGS. 24 and 25 are views showing a state where the vehicle can be seen through the windshield 31 from the interior of the host vehicle.
  • Based on the movement operation acquired by the information acquisition unit 11, the control unit 12 moves the auxiliary pole object 38 within a predetermined range corresponding to the front end of the body of the host vehicle, in the same manner as the pole object described so far.
  • On the other hand, the control unit 12 fixes the main pole object 39 at the position of the front end on the passenger-seat side of the body of the host vehicle, like the position of a general corner pole.
  • The display mode of the auxiliary pole object 38 and the display mode of the main pole object 39 are made different from each other, yet similar.
  • FIG. 26 is a flowchart showing the operation of the display control apparatus 1 according to the fourth embodiment.
  • step S41 the virtual display object generation unit 12a generates the main pole object 39 and the auxiliary pole object 38 based on the movement operation acquired by the operation signal input unit 11a.
  • the display position control unit 12b determines the display positions of the main pole object 39 and the auxiliary pole object 38 as initial positions.
  • the display control unit 12c controls the virtual display unit 21 so that the main pole object 39 and the auxiliary pole object 38 are displayed at the initial positions. As a result, the state shown in FIG. 24 is obtained.
  • step S42 of FIG. 26 the display position control unit 12b determines whether or not the operation signal input unit 11a has acquired a movement operation. If it is determined that the moving operation has been acquired, the process proceeds to step S43. If it is determined that the moving operation has not been acquired, the process of step S42 is performed again.
  • step S43 the display position control unit 12b determines the display position of the auxiliary pole object 38 based on the moving operation acquired by the operation signal input unit 11a.
  • the display control unit 12c controls the virtual display unit 21 so that the auxiliary pole object 38 is displayed at the display position determined by the display position control unit 12b. As a result, the auxiliary pole object 38 moves without moving the main pole object 39. Thereafter, the process returns to step S42.
  • the auxiliary pole object 38 is moved based on the operation of moving the auxiliary pole object 38 without moving the main pole object 39. Therefore, the driver can use the main pole object 39 in the same manner as a conventional corner pole even after the auxiliary pole object 38 is moved.
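  • A short sketch of the two display objects of this embodiment; the coordinate values and class name are assumptions.

        class Embodiment4Controller:
            """Main pole object fixed like a real corner pole; auxiliary pole object movable."""
            MAIN_X = -0.9   # passenger-seat-side front end (fixed position)

            def __init__(self) -> None:
                self.auxiliary_x = self.MAIN_X      # step S41: both start at the initial position

            def on_move_operation(self, delta_m: float) -> None:
                # Steps S42/S43: only the auxiliary pole object follows the move operation.
                self.auxiliary_x = max(-0.9, min(0.9, self.auxiliary_x + delta_m))

            def positions(self) -> dict:
                return {"main": self.MAIN_X, "auxiliary": self.auxiliary_x}

        c = Embodiment4Controller()
        c.on_move_operation(+0.5)
        print(c.positions())   # the main pole stays put, the auxiliary pole has moved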
  • FIG. 27 is a block diagram showing a configuration of display control apparatus 1 according to Embodiment 5 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • the display control device 1 in FIG. 27 has the same configuration as that obtained by removing the operation signal input unit 11a and the relative position acquisition unit 11c from the display control device 1 in FIG.
  • the virtual display unit 21 and the periphery detection unit 23 are the same as those in the third embodiment.
  • the vehicle exterior information input unit 11b acquires the relative position of the obstacle around the host vehicle and the host vehicle from the periphery detection unit 23.
  • the display position control unit 12b determines the display position of the display object based on the relative position acquired by the vehicle exterior information input unit 11b.
  • The control unit 12 configured as described above can automatically move the pole object 34 based on the relative position between an obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b, even when there is no movement operation from the driver.
  • For example, based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b, the control unit 12 calculates the host vehicle distance, which is the distance between the obstacle and the host vehicle. When the host vehicle distance is larger than a predetermined threshold value (for example, 40 cm), the control unit 12 fixes the position of the pole object 34 at the initial position. On the other hand, when the host vehicle distance is equal to or less than the threshold value, the control unit 12 brings the pole object 34 closer to the obstacle based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b. Here, the control unit 12 moves the pole object 34 to the position where the host vehicle distance is shortest within the movable range of the pole object 34, that is, to the portion of the host vehicle closest to the obstacle.
  • FIG. 28 is a flowchart showing the operation of the display control apparatus 1 according to the fifth embodiment.
  • step S51 the virtual display object generation unit 12a generates the pole object 34 based on the relative position of the obstacle around the host vehicle and the host vehicle.
  • the display position control unit 12b determines the display position of the pole object 34 as the initial position.
  • the display control unit 12c controls the virtual display unit 21 so as to display the pole object 34 at the initial position.
  • step S52 the vehicle exterior information input unit 11b acquires the relative position of the obstacle around the host vehicle and the host vehicle.
  • step S53 the control unit 12 obtains the above-described host vehicle distance based on the relative positions of the obstacles around the host vehicle and the host vehicle acquired by the vehicle outside information input unit 11b. And the control part 12 determines whether the own vehicle distance is below a predetermined threshold value (for example, 40 cm). If it is determined that the host vehicle distance is equal to or less than the predetermined threshold, the process proceeds to step S54. If it is determined that the host vehicle distance is greater than the threshold, the process returns to step S51.
  • step S54 the control unit 12 moves the pole object 34 to a position where the own vehicle distance is the shortest. Thereafter, the process returns to step S52.
  • FIGS. 29 and 30 are diagrams showing examples of the result of the operation of FIG. 28.
  • FIG. 29 shows a case where the own vehicle distance dv, which is the distance between the obstacle 51 and the own vehicle 2, is larger than the threshold value.
  • the pole object 34 is fixed at the initial position.
  • FIG. 30 shows a case where the host vehicle distance dv is equal to or less than a threshold value. In this case, the pole object 34 moves to a position where the own vehicle distance dv is minimized.
  • the pole object 34 is automatically moved based on the relative positions of the obstacle 51 around the host vehicle 2 and the host vehicle 2. According to such a configuration, the driver does not need to perform the moving operation of the pole object 34, so the burden on the driver can be reduced.
  • the pole object 34 is brought closer to the obstacle based on the relative position of the obstacle around the host vehicle 2 and the host vehicle. According to such a configuration, the driver can know which part of the host vehicle 2 is closest to the obstacle 51 by the position of the pole object 34.
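  • Steps S51 to S54 can be sketched as follows, using the 40 cm threshold from the text; the straight front edge and the point obstacle are simplifying assumptions.

        THRESHOLD_M = 0.40            # predetermined threshold for the host vehicle distance
        EDGE_LEFT, EDGE_RIGHT = -0.9, 0.9
        EDGE_Y = 0.0                  # assumed longitudinal coordinate of the front edge
        INITIAL_X = EDGE_LEFT

        def pole_position(obstacle_x: float, obstacle_y: float) -> float:
            """Lateral pole position for one obstacle given in vehicle coordinates."""
            # Nearest point of the front edge to the obstacle, and the host vehicle distance dv.
            nearest_x = max(EDGE_LEFT, min(EDGE_RIGHT, obstacle_x))
            dv = ((obstacle_x - nearest_x) ** 2 + (obstacle_y - EDGE_Y) ** 2) ** 0.5
            if dv > THRESHOLD_M:
                return INITIAL_X      # step S53: obstacle is far -> keep the initial position
            return nearest_x          # step S54: move the pole to where dv is shortest

        print(pole_position(0.5, 2.0))   # obstacle far ahead -> initial position
        print(pole_position(0.5, 0.3))   # obstacle close     -> pole moves to x = 0.5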
  • The control unit 12 may change the virtual image direction of the pole object 34 without changing the virtual image distance of the pole object 34, instead of changing both of them based on the relative position between the obstacle around the host vehicle and the host vehicle.
  • each modification of the third embodiment may be appropriately combined with the fifth embodiment, or each modification of the fifth embodiment may be appropriately combined with the third embodiment.
  • For example, as in Modification 1 of Embodiment 3, the control unit 12 according to Embodiment 5 may change the display mode of the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b.
  • Specifically, the control unit 12 according to Embodiment 5 may change the display mode of the pole object 34 based on the host vehicle distance dv, which is the distance between the obstacle and the host vehicle obtained from the relative position.
  • the display mode changed by the control unit 12 may be at least one of presence / absence of animation such as blinking of the pole object 34, color, shape, and pattern.
  • For example, the color of the pole object 34 may be changed according to the host vehicle distance dv.
  • However, the change of the display mode is not limited to this.
  • the control unit 12 may increase or decrease the length of the pole object 34 as the host vehicle distance dv decreases.
  • the control unit 12 may blink the pole object 34 or add a pattern to the pole object 34 as the host vehicle distance dv decreases.
  • The control unit 12 may change the display mode of the pole object 34 either stepwise or continuously based on the host vehicle distance dv.
  • The fourth embodiment may be combined with the fifth embodiment. That is, as shown in FIGS. 33, 34, and 35, the control unit 12 may automatically move the auxiliary pole object 38, without moving the main pole object 39, based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle 2. Even in this case, the same effect as in the fifth embodiment can be obtained.
  • the periphery detection unit 23 in FIG. 27 detects information about obstacles around the host vehicle 2, and the vehicle outside information input unit 11 b acquires information detected by the periphery detection unit 23.
  • the information about the obstacle around the host vehicle 2 includes at least one of the relative position between the obstacle and the host vehicle 2, the attribute of the obstacle, the height of the obstacle, and the color of the obstacle. May be included.
  • control part 12 may make the virtual display part 21 further display the 3rd display object which is a display object which shows the information regarding the obstruction acquired by the vehicle exterior information input part 11b with the pole object 34.
  • the third display object is referred to as “additional object”.
  • the vehicle outside information input unit 11b acquires information on the relative position of the obstacle and the host vehicle as information on the obstacle, and the control unit 12 indicates the host vehicle distance dv based on the relative position.
  • the additional objects 40a, 40b, and 40c are attached to the pole object 34 and displayed on the virtual display unit 21.
  • the display object of the character which shows the value of the own vehicle distance dv is shown as the additional object 40a.
  • the display object of the arrow which shows the value of the own vehicle distance dv by length is shown as the additional object 40b.
  • a display object of two fingers indicating the value of the host vehicle distance dv is shown as an additional object 40c.
  • the driver can know information on obstacles around the host vehicle 2.
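  • As an illustration of attaching such an additional object, the following sketch (hypothetical names; the rendering itself is out of scope here) formats the host vehicle distance dv as a text label anchored to the pole object:

```python
import math

def make_distance_label(pole_xy, obstacle_xy, offset_m=0.3):
    """Build a hypothetical text additional object, in the spirit of 40a.

    pole_xy and obstacle_xy are positions in vehicle coordinates (meters).
    Returns the label text and the position at which to draw it, slightly
    offset from the pole so it does not hide the pole itself.
    """
    dv = math.hypot(obstacle_xy[0] - pole_xy[0], obstacle_xy[1] - pole_xy[1])
    label = f"{dv:.1f} m"                     # character object showing dv
    anchor = (pole_xy[0], pole_xy[1] + offset_m)
    return {"text": label, "anchor": anchor, "distance_m": dv}

print(make_distance_label((3.6, -0.9), (4.4, -1.5)))
```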
  • <Modification 3 of Embodiment 5> The periphery detection unit 23 in FIG. 27 may detect the attribute of an obstacle around the host vehicle, and the vehicle exterior information input unit 11b may acquire the attribute of the obstacle detected by the periphery detection unit 23.
  • The obstacle attribute mentioned here corresponds to any one of a stationary object, a vehicle, a person, and an animal other than a person.
  • The control unit 12 may change the display mode of the pole object 34 based on the attribute of the obstacle acquired by the vehicle exterior information input unit 11b.
  • FIGS. 39, 40, and 41 are diagrams illustrating examples in which the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle.
  • FIG. 39 shows a pole object 34 indicating a stationary object when the attribute of the obstacle is the stationary object 51d.
  • FIG. 40 shows a pole object 34 indicating a vehicle when the attribute of the obstacle is the vehicle 51e.
  • FIG. 41 shows a pole object 34 indicating a person when the attribute of the obstacle is the person 51f.
  • According to such a configuration, the driver can know the attribute of an obstacle around the host vehicle.
  • Here, the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle acquired by the vehicle exterior information input unit 11b, but the present invention is not limited to this.
  • For example, the control unit 12 may attach an additional object indicating the attribute of the obstacle to the pole object 34, as in Modification 2 of Embodiment 5.
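  • A simple attribute-to-appearance mapping of this kind could look like the Python sketch below; the attribute set follows the text, while the icons and colors are placeholder assumptions:

```python
from enum import Enum, auto

class ObstacleAttribute(Enum):
    STATIONARY_OBJECT = auto()
    VEHICLE = auto()
    PERSON = auto()
    ANIMAL = auto()          # animal other than a person

# Placeholder display modes keyed by the detected attribute.
ATTRIBUTE_STYLE = {
    ObstacleAttribute.STATIONARY_OBJECT: {"icon": "block", "color": (128, 128, 128)},
    ObstacleAttribute.VEHICLE: {"icon": "car", "color": (0, 0, 255)},
    ObstacleAttribute.PERSON: {"icon": "person", "color": (255, 0, 0)},
    ObstacleAttribute.ANIMAL: {"icon": "paw", "color": (255, 128, 0)},
}

def pole_style_for(attribute):
    """Return the display mode of the pole object 34 for a detected attribute."""
    return ATTRIBUTE_STYLE[attribute]

print(pole_style_for(ObstacleAttribute.PERSON))
```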
  • <Modification 4 of Embodiment 5> The periphery detection unit 23 in FIG. 27 may detect the height of an obstacle around the host vehicle, and the vehicle exterior information input unit 11b may acquire the height of the obstacle detected by the periphery detection unit 23. The control unit 12 may then change the display mode of the pole object 34 based on the height of the obstacle acquired by the vehicle exterior information input unit 11b.
  • FIGS. 42 and 43 are diagrams illustrating examples in which the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle.
  • FIG. 42 shows a pole object 34 in which the position of a mark indicates that the height of the obstacle is relatively low, for a case where the height of the obstacle is relatively low.
  • FIG. 43 shows a pole object 34 in which the position of the mark indicates that the height of the obstacle is relatively high, for a case where the height of the obstacle is relatively high.
  • Alternatively, the height of the obstacle may be indicated by the length of the pole object 34, or by scale marks (not shown) on the pole object 34.
  • According to such a configuration, the driver can know the height of an obstacle around the host vehicle.
  • Here, the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle acquired by the vehicle exterior information input unit 11b, but the present invention is not limited to this.
  • For example, the control unit 12 may attach an additional object indicating the height of the obstacle to the pole object 34, as in Modification 2 of Embodiment 5.
  • <Modification 5 of Embodiment 5> The pole object 34 may also be moved toward the obstacle. For example, the control unit 12 may first automatically move the pole object 34 to the position where the host vehicle distance is the shortest within the range in which the pole object 34 can move, and then automatically move the pole object 34 toward the obstacle.
  • FIGS. 44, 45, and 46 are diagrams sequentially illustrating how the pole object 34, after moving to the position where the host vehicle distance is the shortest, separates from the fender and the front bumper of the host vehicle 2.
  • In this case, the control unit 12 may change the display mode of the pole object 34 based on the pole distance, which is the distance between the obstacle 51 around the host vehicle 2 and the pole object 34.
  • For example, when the pole object 34 reaches the obstacle 51, the control unit 12 may change the display mode of the pole object 34 to a special display mode indicating that the pole object 34 has touched the obstacle.
  • Alternatively, a display object such as a character or a figure indicating that the pole object 34 has touched the obstacle may be further displayed on the virtual display unit 21. According to such a configuration, the driver can know the relative position of the obstacle 51 and the host vehicle 2.
  • In the above description, the control unit 12 first automatically moves the pole object 34 to the position where the host vehicle distance is the shortest within the movable range of the pole object 34, and then automatically moves the pole object 34 toward the obstacle.
  • However, the control unit 12 may combine the automatic movement of the pole object 34 with movement of the pole object 34 based on the driver's moving operation.
  • For example, the control unit 12 may automatically move the pole object 34 to the position where the host vehicle distance is the shortest within its movable range, based on the relative position of the obstacle 51 around the host vehicle 2 and the host vehicle, and may then move the pole object 34 toward the obstacle based on the driver's moving operation.
  • <Modification 6 of Embodiment 5> In the above description, the control unit 12 displays one pole object 34 on the virtual display unit 21.
  • However, the present invention is not limited to this, and the control unit 12 may display a plurality of pole objects 34 on the virtual display unit 21.
  • For example, the periphery detection unit 23 may detect the relative positions of a plurality of obstacles around the host vehicle with respect to the host vehicle, and the vehicle exterior information input unit 11b may acquire the relative positions of the plurality of obstacles from the periphery detection unit 23.
  • Then, the control unit 12 may display one pole object 34 on the virtual display unit 21 at the position where the host vehicle distance is the shortest for each of the plurality of obstacles.
  • Alternatively, the control unit 12 may cause the virtual display unit 21 to display one auxiliary pole object 38 at the position where the host vehicle distance is the shortest for each of the plurality of obstacles 51.
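  • The per-obstacle placement described above can be pictured with the following sketch, which reuses a hypothetical front-edge sampling and simply returns one pole position per detected obstacle:

```python
import math

# Hypothetical sample points along the front edge of the body (vehicle frame, meters).
FRONT_EDGE_POINTS = [(3.6, y / 10.0) for y in range(-9, 10)]

def closest_edge_point(obstacle_xy):
    """Return (edge point, distance) for the body point nearest the obstacle."""
    best = min(FRONT_EDGE_POINTS,
               key=lambda p: math.hypot(p[0] - obstacle_xy[0], p[1] - obstacle_xy[1]))
    return best, math.hypot(best[0] - obstacle_xy[0], best[1] - obstacle_xy[1])

def pole_positions(obstacles_xy):
    """One pole (or auxiliary pole) object per obstacle, at its closest body part."""
    return [closest_edge_point(o)[0] for o in obstacles_xy]

print(pole_positions([(4.5, -1.0), (4.0, 1.2)]))
```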
  • <Embodiment 6> FIG. 48 is a block diagram showing a configuration of the display control device 1 according to Embodiment 6 of the present invention.
  • Hereinafter, among the constituent elements described in Embodiment 6, constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • Before describing the internal configuration of the display control device 1, the in-vehicle LAN (Local Area Network) device 24 will be described.
  • The virtual display unit 21 and the periphery detection unit 23 are the same as those in Embodiment 3.
  • The in-vehicle LAN device 24 constitutes a CAN (Controller Area Network) or the like and communicates various information and control commands between devices in the host vehicle. The in-vehicle LAN device 24 thereby acquires, from the host vehicle, the position information of the host vehicle, control information on the traveling of the host vehicle, information related to the body of the host vehicle, unique information of the host vehicle, and the like.
  • Note that the periphery detection unit 23 may be connected to the in-vehicle LAN device 24. In this case, the vehicle exterior information input unit 11b acquires information about obstacles around the host vehicle via the in-vehicle LAN device 24.
  • The display control device 1 in FIG. 48 has a configuration in which the in-vehicle information input unit 11d is added to the display control device 1 in FIG. 27.
  • The in-vehicle information input unit 11d is included in the concept of the information acquisition unit 11 in FIG. 1.
  • The in-vehicle information input unit 11d obtains a first movement locus, which is a future movement locus of the host vehicle, based on the control information and the like acquired by the in-vehicle LAN device 24.
  • Hereinafter, the first movement locus is referred to as the "own vehicle movement locus".
  • The control unit 12, such as the display position control unit 12b, obtains, based on the relative position of the obstacle and the host vehicle acquired by the vehicle exterior information input unit 11b and the own vehicle movement locus obtained by the in-vehicle information input unit 11d, the part of the host vehicle that will come into contact with the obstacle when the host vehicle travels along the own vehicle movement locus.
  • The control unit 12 then moves the pole object 34 to the obtained part.
  • FIG. 49 is a flowchart showing the operation of the display control device 1 according to Embodiment 6. The operation in FIG. 49 is the same as that in FIG. 28 with step S54 replaced by steps S61 and S62.
  • In step S61, the control unit 12 obtains, based on the relative position of the obstacle and the host vehicle acquired by the vehicle exterior information input unit 11b and the own vehicle movement locus acquired by the in-vehicle information input unit 11d, the part of the host vehicle 2 that will come into contact with the obstacle when the host vehicle travels along the own vehicle movement locus.
  • In the illustrated example, the host vehicle 2 traveling along the own vehicle movement locus is indicated by a two-dot chain line.
  • In this example, the control unit 12 predicts that the portion 2c of the host vehicle 2 indicated by the two-dot chain line will come into contact with the portion 54 of the other vehicle 51a.
  • In step S62, the control unit 12 moves the pole object 34 to the obtained part.
  • In the illustrated example, the control unit 12 moves the pole object 34 to the part 2d, indicated by the solid line, that corresponds to the part 2c of the host vehicle 2. Thereafter, the process returns to step S52.
  • As described above, according to Embodiment 6, the pole object 34 is moved to the part of the host vehicle 2 that will come into contact with an obstacle in the future. Accordingly, the driver can change the movement trajectory of the host vehicle 2 with reference to the display of the pole object 34 so that the host vehicle 2 does not contact the obstacle.
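  • A coarse Python sketch of the prediction in steps S61 and S62 is shown below. It is only a geometric illustration under simplified assumptions (a straight-line own vehicle movement locus, a point obstacle, and a sampled body front edge); the embodiment does not prescribe this particular algorithm.

```python
import math

# Hypothetical front-edge sample points of the body in the vehicle frame (meters).
FRONT_EDGE_POINTS = [(3.6, y / 10.0) for y in range(-9, 10)]
CONTACT_RADIUS_M = 0.2   # assumed margin for treating a point as "in contact"

def predict_contact_part(obstacle_xy, heading_rad, step_m=0.1, horizon_m=5.0):
    """Return the body part predicted to touch the obstacle, or None.

    The own vehicle movement locus is approximated as straight-line motion
    with the given heading; each edge point is advanced along it and checked
    against the obstacle position given relative to the current vehicle pose.
    """
    dx, dy = math.cos(heading_rad), math.sin(heading_rad)
    steps = int(horizon_m / step_m)
    for i in range(steps + 1):
        travel = i * step_m
        for px, py in FRONT_EDGE_POINTS:
            moved = (px + dx * travel, py + dy * travel)
            if math.hypot(moved[0] - obstacle_xy[0], moved[1] - obstacle_xy[1]) <= CONTACT_RADIUS_M:
                return (px, py)        # body part that makes contact first
    return None

part = predict_contact_part((6.0, 0.55), heading_rad=0.0)
print("move pole object to:", part)
```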
  • In Embodiment 6, the control unit 12 moves the pole object 34 to the part of the host vehicle 2 that comes into contact with the obstacle when the host vehicle 2 travels along the own vehicle movement locus, based on the relative position acquired by the vehicle exterior information input unit 11b and the own vehicle movement locus acquired by the in-vehicle information input unit 11d.
  • However, the control unit 12 may instead move the pole object 34 to the part of the host vehicle 2 that does not contact the obstacle but comes closest to it, based on the relative position acquired by the vehicle exterior information input unit 11b and the own vehicle movement locus acquired by the in-vehicle information input unit 11d.
  • In this case, the control unit 12 may change the display mode of the pole object 34 depending on whether the host vehicle will contact the obstacle. According to such a configuration, the driver can change the movement trajectory of the host vehicle 2 with reference to the display of the pole object 34 so that the host vehicle 2 does not contact an obstacle.
  • Alternatively, the control unit 12 may move the main pole object 39 to the portion 2e of the host vehicle 2 that is currently closest to the other vehicle 51a, and move the auxiliary pole object 38 to the portion 2f of the host vehicle 2 that comes into contact with the obstacle when the host vehicle 2 travels along the own vehicle movement locus.
  • Alternatively, the control unit 12 may move one auxiliary pole object 38 to the currently closest portion 2e of the host vehicle 2, and move another auxiliary pole object 38 to the portion 2f that comes into contact with the obstacle when the host vehicle 2 travels along the own vehicle movement locus. In this case, the control unit 12 may fix the position of the main pole object 39 at the initial position.
  • In addition, the vehicle exterior information input unit 11b may obtain a second movement locus, which is a future movement locus of the obstacle, based on the information about the obstacle around the host vehicle detected by the periphery detection unit 23.
  • Hereinafter, the second movement locus is referred to as the "obstacle movement locus".
  • For example, when the obstacle is another vehicle, the vehicle exterior information input unit 11b obtains the movement locus of the other vehicle as the obstacle movement locus based on the tire direction of the other vehicle shown in a captured image.
  • Alternatively, the vehicle exterior information input unit 11b may obtain the movement locus of the other vehicle as the obstacle movement locus based on information on the automatic driving of the other vehicle acquired by inter-vehicle communication or the like.
  • Then, the control unit 12, such as the display position control unit 12b, obtains at least one of the part of the host vehicle closest to the obstacle and the part of the host vehicle that comes into contact with the obstacle when the host vehicle moves along the own vehicle movement locus and the obstacle moves along the obstacle movement locus, based on the relative position of the obstacle and the host vehicle acquired by the vehicle exterior information input unit 11b, the own vehicle movement locus obtained by the in-vehicle information input unit 11d, and the obstacle movement locus obtained by the vehicle exterior information input unit 11b.
  • The control unit 12 then moves the pole object 34 to at least one of the obtained parts.
  • Note that the operation signal input unit 11a may acquire a switching operation from the driver via the operation input unit 22.
  • In this case, based on the switching operation acquired by the operation signal input unit 11a, the control unit 12 may selectively perform the display described in Embodiment 5, the display described in Embodiment 6, and the displays described in Modifications 1 and 2 of Embodiment 6.
  • <Other Modifications> The information acquisition unit 11 and the control unit 12 in the display control device 1 described above are hereinafter referred to as the "information acquisition unit 11 and the like".
  • The information acquisition unit 11 and the like are realized by a processing circuit 81. That is, the processing circuit 81 includes the information acquisition unit 11 that acquires information, and the control unit 12 that displays the first display object, which is a display object, on the display unit and moves the first display object within a predetermined range corresponding to the front side end of the vehicle body based on the information acquired by the information acquisition unit 11.
  • Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied.
  • The processor corresponds to, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), or the like.
  • When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • The functions of the respective units such as the information acquisition unit 11 may each be realized by a separate processing circuit, or the functions of the units may be realized collectively by a single processing circuit.
  • When the processing circuit 81 is a processor, the functions of the information acquisition unit 11 and the like are realized in combination with software or the like.
  • The software or the like corresponds to, for example, software, firmware, or software and firmware.
  • The software or the like is described as a program and stored in the memory 83.
  • When the processor 82 is applied to the processing circuit 81, the processor 82 reads out and executes the program stored in the memory 83, thereby realizing the functions of the respective units. That is, the display control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of a step of acquiring information and a step of displaying the first display object, which is a display object, on the display unit and performing control to move the first display object within a predetermined range corresponding to the front side end of the vehicle body based on the acquired information.
  • In other words, this program causes a computer to execute the procedures and methods of the information acquisition unit 11 and the like.
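  • Purely as an illustration of the two steps such a program would carry out (an information acquisition step followed by a display control step), a minimal Python sketch is given below; the sensor and display interfaces are hypothetical placeholders, not part of the disclosed hardware.

```python
def acquire_information():
    """Information acquisition step: return data usable for moving the pole
    object, here a dummy obstacle position relative to the host vehicle."""
    return {"obstacle_xy": (4.2, -0.7)}

def clamp_to_front_edge(target_xy, edge_points):
    """Keep the display position inside the predetermined range, i.e. on the
    sampled front edge of the vehicle body."""
    return min(edge_points,
               key=lambda p: (p[0] - target_xy[0]) ** 2 + (p[1] - target_xy[1]) ** 2)

def display_control_step(info, edge_points):
    """Display control step: decide where the first display object is shown."""
    pole_position = clamp_to_front_edge(info["obstacle_xy"], edge_points)
    # A real implementation would now drive the HUD; here we just report it.
    print("display pole object at", pole_position)

front_edge = [(3.6, y / 10.0) for y in range(-9, 10)]
display_control_step(acquire_information(), front_edge)
```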
  • The memory 83 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disk), a drive device thereof, or any storage medium to be used in the future.
  • In the above description, each function of the information acquisition unit 11 and the like is realized by either hardware or software or the like.
  • However, the present invention is not limited to this, and a configuration may be adopted in which a part of the information acquisition unit 11 and the like is realized by dedicated hardware and another part is realized by software or the like.
  • For example, the function of the information acquisition unit 11 can be realized by the processing circuit 81 as dedicated hardware, while the functions of the other units can be realized by the processing circuit 81 as the processor 82 reading out and executing the program stored in the memory 83.
  • As described above, the processing circuit 81 can realize each of the functions described above by hardware, software or the like, or a combination thereof.
  • The display control device 1 described above can also be applied to a display control system constructed by appropriately combining a navigation device such as a PND (Portable Navigation Device), a communication terminal including a mobile terminal such as a mobile phone, a smartphone, or a tablet, the functions of applications installed on these devices, a server, and the like.
  • In this case, each function or each constituent element of the display control device 1 described above may be distributed among the devices constructing the system, or may be concentrated in any one of the devices.
  • Furthermore, the display control device may include the virtual display unit 21.
  • FIG. 54 is a block diagram showing a configuration of the server 91 according to this modification.
  • The server 91 in FIG. 54 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with a display control device 93 realized by, for example, a navigation device of the host vehicle 92.
  • The communication unit 91a, which is an information acquisition unit, receives the information acquired by the display control device 93 by performing wireless communication with the display control device 93.
  • The control unit 91b has a function similar to that of the control unit 12 in FIG. 1, realized by a processor (not illustrated) of the server 91 executing a program stored in a memory (not illustrated) of the server 91. That is, the control unit 91b generates a control signal for moving the first display object within a predetermined range corresponding to the front side end of the vehicle body, based on the information received by the communication unit 91a. The communication unit 91a then transmits the control signal generated by the control unit 91b to the display control device 93.
  • The display control device 93 moves the pole object displayed on the virtual display unit 21 based on the control signal transmitted from the communication unit 91a.
  • With the server 91 configured in this way, the same effects as those of the display control device 1 described in Embodiment 1 can be obtained.
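  • The division of roles between the server 91 and the in-vehicle display control device 93 can be pictured with the short sketch below; the message format and the placement logic are invented for illustration, and no particular communication protocol is implied.

```python
import json

def server_handle_request(message_json, edge_points):
    """Server side: turn received vehicle information into a control signal."""
    info = json.loads(message_json)
    ox, oy = info["obstacle_xy"]
    target = min(edge_points, key=lambda p: (p[0] - ox) ** 2 + (p[1] - oy) ** 2)
    # Control signal telling the vehicle where to show the first display object.
    return json.dumps({"pole_position": target})

def vehicle_apply_control_signal(control_json):
    """Vehicle side: move the pole object as instructed by the control signal."""
    signal = json.loads(control_json)
    print("virtual display unit 21: pole moved to", signal["pole_position"])

front_edge = [(3.6, y / 10.0) for y in range(-9, 10)]
request = json.dumps({"obstacle_xy": [4.3, 0.6]})
vehicle_apply_control_signal(server_handle_request(request, front_edge))
```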
  • FIG. 55 is a block diagram showing a configuration of the communication terminal 96 according to this modification.
  • The communication terminal 96 in FIG. 55 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with the display control device 98 of the host vehicle 97.
  • Here, a mobile terminal such as a mobile phone, a smartphone, or a tablet carried by the driver of the host vehicle 97 is applied to the communication terminal 96.
  • With the communication terminal 96 configured in this way, the same effects as those of the display control device 1 described in Embodiment 1 can be obtained.
  • In the present invention, the embodiments and the modifications can be freely combined, and each embodiment and each modification can be appropriately modified or omitted, within the scope of the invention.
  • Reference signs: 1 display control device, 2 host vehicle, 11 information acquisition unit, 12 control unit, 21 virtual display unit, 34 pole object, 36 image, 37 corresponding pole object, 38 auxiliary pole object, 39 main pole object, 40a, 40b, 40c additional objects, 51 obstacle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Instrument Panels (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Traffic Control Systems (AREA)

Abstract

The purpose of the present invention is to provide a technique capable of virtually moving a corner pole. This display control device is a display control device for controlling a display unit. The display unit can display one or more display objects that appear, when viewed from the driver's seat of a vehicle, to overlap the scenery outside the vehicle. The display control device is provided with: an information acquisition unit for acquiring information; and a control unit that causes a first display object, which is one of the display objects, to be displayed on the display unit and moves the first display object, on the basis of the information acquired by the information acquisition unit, within a predetermined range corresponding to a front-side end portion of the vehicle body.

Description

Patent Document 1: International Publication No. 2012/127681; Patent Document 2: Japanese Patent Application Laid-Open No. 07-140918.
The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
Brief Description of Drawings: FIG. 1 is a block diagram showing the configuration of the display control device according to Embodiment 1. FIGS. 2 to 12 are a block diagram, flowcharts, and explanatory diagrams showing the configuration and operation of the display control device according to Embodiment 2 and its modifications. FIGS. 13 to 23 similarly relate to Embodiment 3 and its modifications, FIGS. 24 to 26 to Embodiment 4, FIGS. 27 to 47 to Embodiment 5 and its modifications, and FIGS. 48 to 51 to Embodiment 6 and its modifications. FIGS. 52 and 53 are block diagrams showing hardware configurations of the display control device according to other modifications, FIG. 54 is a block diagram showing the configuration of a server according to another modification, and FIG. 55 is a block diagram showing the configuration of a communication terminal according to another modification.
<Embodiment 1>
Hereinafter, the display control device according to Embodiment 1 of the present invention will be described as being mounted on a vehicle. The vehicle on which the display control device is mounted and which is the target of attention is referred to as the "host vehicle".
 FIG. 1 is a block diagram showing a configuration of the display control device 1 according to Embodiment 1. The display control device 1 in FIG. 1 controls a virtual display unit 21, which is a display unit. The virtual display unit 21 can display one or more display objects that are visually recognized from the driver's seat of the host vehicle over the scenery outside the host vehicle. That is, the virtual display unit 21 can display a virtual display object that appears to the driver of the host vehicle as if it actually existed in the three-dimensional space of the real world. For such a virtual display unit 21, for example, a HUD that displays a virtual image or holography, or an autostereoscopic display device, is applied.
 The display control device 1 in FIG. 1 includes an information acquisition unit 11 and a control unit 12.
 The information acquisition unit 11 acquires information that can be used to move a display object. The display objects include a first display object indicating a pole for the driver to visually recognize the positional relationship between the vehicle and an obstacle, that is, a pole corresponding to a corner pole. In the following description, except in Embodiment 4, the first display object is referred to as the "pole object". As the information that can be used to move the pole object, a user operation for moving the pole object or the like is used, as described in Embodiment 2 and the subsequent embodiments.
 The control unit 12 displays the pole object on the virtual display unit 21 and moves the pole object within a predetermined range corresponding to the front side end of the body of the host vehicle based on the information acquired by the information acquisition unit 11. The predetermined range includes at least one of a range overlapping the front side end of the body of the host vehicle and a range around the front side end of the body of the host vehicle. For example, the front bumper of the host vehicle, or the fender and the front bumper of the host vehicle, corresponds to the front side end of the body of the host vehicle.
<Summary of Embodiment 1>
According to the display control device 1 according to Embodiment 1 as described above, the corner pole can be virtually moved. Therefore, an increase in hardware cost and deterioration of the appearance can be suppressed.
<Embodiment 2>
FIG. 2 is a block diagram showing a configuration of the display control apparatus 1 according to Embodiment 2 of the present invention. Hereinafter, among the constituent elements described in the second embodiment, constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
 Before describing the internal configuration of the display control device 1, the operation input unit 22 will be described. The virtual display unit 21 is the same as that in Embodiment 1.
 The operation input unit 22 receives various operations, such as an operation of moving the pole object, from the driver, and outputs an operation signal indicating the received operation to the display control device 1. In Embodiment 2, the operation input unit 22 is described below as a touch panel. However, the operation input unit 22 may be at least one of, for example, a switch that receives a driver's push operation, a voice operation input device that receives a driver's voice operation, a gesture operation input device including a camera or the like that receives a driver's gesture in space as an operation, and a line-of-sight operation input device including a camera or the like that receives the movement of the driver's line of sight as an operation. Any operation device capable of inputting the driver's intention may be used. Some of these will be described in detail in the modifications described later.
 Next, the internal configuration of the display control device 1 in FIG. 2 will be described. The display control device 1 includes an operation signal input unit 11a, a virtual display object generation unit 12a, a display position control unit 12b, and a display control unit 12c. The operation signal input unit 11a is included in the concept of the information acquisition unit 11 in FIG. 1 of Embodiment 1. The virtual display object generation unit 12a, the display position control unit 12b, and the display control unit 12c are included in the concept of the control unit 12 in FIG. 1 of Embodiment 1, as indicated by the broken line in FIG. 2.
 When the operation input unit 22 receives an operation, the operation signal input unit 11a acquires an operation signal indicating the operation from the operation input unit 22. Therefore, when the operation input unit 22 receives an operation of moving the pole object, the operation signal input unit 11a acquires information on the operation of moving the pole object from the operation input unit 22. In the following description, an operation of moving the pole object is referred to as a "moving operation".
 The virtual display object generation unit 12a generates a display object to be displayed on the virtual display unit 21 based on the operation acquired by the operation signal input unit 11a. In Embodiment 2, the virtual display object generation unit 12a generates the pole object based on the moving operation acquired by the operation signal input unit 11a. Note that the virtual display object generation unit 12a may generate the pole object based on a signal indicating that the accessory power supply of the host vehicle has been turned on.
 The display position control unit 12b determines the display position of the display object generated by the virtual display object generation unit 12a based on the operation acquired by the operation signal input unit 11a. When the virtual display unit 21 is a HUD that displays a virtual image, the display position control unit 12b determines the position of the virtual image as the display position. Here, the position of the virtual image is a position in a three-dimensional coordinate space whose origin or other reference is a specific position of the host vehicle (for example, the driver's seat or the windshield). When the three-dimensional coordinate space is a polar coordinate space, the position of the virtual image is defined by, for example, the virtual image direction, which is the direction of the virtual image, and the virtual image distance, which is the distance to the virtual image. When the coordinate space is an orthogonal coordinate space, the position of the virtual image is defined by, for example, coordinates on three orthogonal coordinate axes defined in the front-rear, left-right, and up-down directions of the vehicle.
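 As a small illustration of these two equivalent representations, the following sketch converts a virtual image position between the orthogonal (front-rear, left-right, up-down) form and a polar form given by a virtual image direction and a virtual image distance; the axis conventions are assumptions for the example only.

```python
import math

def cartesian_to_polar(x, y, z):
    """(x: forward, y: left, z: up) relative to the reference point ->
    (azimuth, elevation, virtual image distance), angles in radians."""
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(y, x)        # horizontal component of the virtual image direction
    elevation = math.asin(z / distance) if distance else 0.0
    return azimuth, elevation, distance

def polar_to_cartesian(azimuth, elevation, distance):
    """Inverse conversion back to the orthogonal coordinate representation."""
    x = distance * math.cos(elevation) * math.cos(azimuth)
    y = distance * math.cos(elevation) * math.sin(azimuth)
    z = distance * math.sin(elevation)
    return x, y, z

az, el, d = cartesian_to_polar(3.6, 0.9, -0.4)
print(az, el, d)
print(polar_to_cartesian(az, el, d))   # recovers approximately (3.6, 0.9, -0.4)
```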
 In Embodiment 2, the display position control unit 12b determines, based on the moving operation acquired by the operation signal input unit 11a, the display position of the pole object at a position overlapping some part of the fender and the front bumper of the host vehicle.
 The display control unit 12c controls the virtual display unit 21 so that the display object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b. In Embodiment 2, the display control unit 12c controls the virtual display unit 21 so that the pole object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b.
<Operation>
FIG. 3 is a flowchart showing the operation of the display control apparatus 1 according to the second embodiment.
 First, in step S1, the virtual display object generation unit 12a generates the pole object based on the operation or the like acquired by the operation signal input unit 11a. The display position control unit 12b determines the display position of the pole object to be the initial position. The display control unit 12c thereby controls the virtual display unit 21 so as to display the pole object at the initial position. In Embodiment 2, the initial position is described as the position of the front end on the passenger seat side of the body of the host vehicle, like the position of a general corner pole.
 FIG. 4 is a view showing what is seen through the windshield 31 from the interior of the host vehicle when step S1 is performed. FIG. 4 shows the bonnet 32 and the rearview mirror 33 of the host vehicle, and the pole object 34 displayed at the same initial position as a corner pole. Note that the bonnet 32 may not be visible depending on the vehicle type of the host vehicle.
 In the following description, the host vehicle is assumed to be a left-hand-drive vehicle. In addition, instead of FIG. 4, FIG. 5, which schematically shows what is seen through the windshield 31 from the interior of the host vehicle, may be used in the following description. In this schematic diagram, the bonnet 32 is given the reference sign of the host vehicle 2.
 In step S2 of FIG. 3, the display position control unit 12b determines whether or not the operation signal input unit 11a has acquired a moving operation. If it is determined that the moving operation has been acquired, the process proceeds to step S3; if it is determined that the moving operation has not been acquired, the process of step S2 is performed again.
 FIG. 6 is a diagram showing an example of the positional relationship between the host vehicle 2 and another vehicle 51a that is an obstacle. FIG. 6 shows not only the pole object 34 displayed at the initial position 2a in step S1, but also the positions 2b and 52 of the respective parts that form the shortest distance between the host vehicle 2 and the other vehicle 51a.
 In the positional relationship of FIG. 6, a driver who is about to drive the host vehicle 2 near the other vehicle 51a is expected to estimate the distance between the positions 2b and 52 by eye in order to judge whether the host vehicle 2 will collide with the other vehicle 51a. At this time, if the pole object 34 is displayed at the position 2b instead of the initial position 2a, an improvement in the collision judgment can be expected. Therefore, in a case such as FIG. 6, the driver performs the moving operation in step S2 of FIG. 3, and the process proceeds to step S3.
 In step S3, the display position control unit 12b determines the display position of the pole object 34 based on the moving operation acquired by the operation signal input unit 11a. The display control unit 12c controls the virtual display unit 21 so that the pole object 34 is displayed at the display position determined by the display position control unit 12b. The pole object 34 thereby moves. Thereafter, the process returns to step S2.
 FIG. 7 is a view showing what is seen through the windshield 31 from the interior of the host vehicle when steps S2 and S3 are performed. As shown in FIG. 7, by repeating steps S2 and S3, the pole object 34 moves from the initial position to a position intended by the driver. Note that, when the virtual display unit 21 is a HUD that displays a virtual image and the virtual image distance is defined with respect to the driver's seat, the virtual image distance of the pole object 34 becomes longer as the pole object 34 moves from the driver's seat side toward the passenger seat side.
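 The last point, that the virtual image distance grows as the pole object moves toward the passenger side, follows directly from the geometry; the toy calculation below, with assumed seat and bumper coordinates, makes it concrete.

```python
import math

SEAT_TO_BUMPER_M = 3.6   # assumed forward distance from the driver's seat to the front bumper

def virtual_image_distance(lateral_offset_m):
    """Distance from the driver's seat to a pole shown on the front bumper line,
    where lateral_offset_m is measured from the driver's seat toward the
    passenger side."""
    return math.hypot(SEAT_TO_BUMPER_M, lateral_offset_m)

# Moving the pole from directly ahead of the driver toward the passenger-side corner:
for offset in (0.0, 0.8, 1.6):
    print(f"offset {offset:.1f} m -> virtual image distance {virtual_image_distance(offset):.2f} m")
```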
 FIG. 8 is a diagram showing a state in which the pole object 34 has moved from the initial position 2a to the position 2b in the positional relationship of FIG. 6. In this case, the driver of the host vehicle 2 can drive the host vehicle 2 so that it does not collide with the other vehicle 51a while estimating the distance between the positions 2b and 52 by eye using the pole object 34 as an index.
<Summary of Embodiment 2>
According to the display control apparatus 1 according to the second embodiment as described above, the pole object 34 can be moved based on the operation of moving the pole object 34. Therefore, the user can move the pole object 34 to the intended position.
 In Embodiment 2, the initial position of the pole object 34 is the position of the front end on the passenger seat side of the body of the host vehicle, so the pole object 34 can be used in the same manner as a conventional corner pole.
<Each modification of Embodiment 2>
Next, modifications of Embodiment 2 will be described. In some of the modifications described below, the display mode of the pole object 34, such as its color, is changed. Such a change of the display mode may be performed when the virtual display object generation unit 12a generates the pole object 34, or when the display control unit 12c controls the virtual display unit 21. In the following description, in order to reduce redundancy, when an operation is performed by at least one of the virtual display object generation unit 12a, the display position control unit 12b, and the display control unit 12c, it may simply be described that the operation is performed by the control unit 12.
<Modification 1 of Embodiment 2>
The display control unit 12c according to Embodiment 2 described above is assumed to change both the virtual image distance and the virtual image direction of the pole object 34 based on the moving operation when the virtual display unit 21 is a HUD. However, the present invention is not limited to this, and the display control unit 12c may change the virtual image direction of the pole object 34 without changing its virtual image distance based on the moving operation. According to such a configuration, a HUD that does not change the virtual image distance can be used as the virtual display unit 21.
 In addition, since the position of the eyes differs from driver to driver, the initial position of the pole object may be calibrated as appropriate. The calibration may be performed automatically based on a detection result or an estimation result of the position of the driver's eyes, or may be performed by the driver operating a menu screen and manually setting the pole object to an appropriate display position.
<Modification 2 of Embodiment 2>
Although not shown, the display control apparatus 1 according to the present modification includes a vehicle information acquisition unit included in the concept of the information acquisition unit 11 of FIG.
 図9は、本変形例に係る表示制御装置1の動作を示すフローチャートである。この図9の動作は、図3のステップS1とステップS2との間にステップS11を追加したものと同様である。 FIG. 9 is a flowchart showing the operation of the display control apparatus 1 according to this modification. The operation in FIG. 9 is the same as that in which step S11 is added between step S1 and step S2 in FIG.
 ステップS1の後、ステップS11にて、車両情報取得部は、自車両の速度を自車両のECU(Electronic Control Unit)などから取得する。表示位置制御部12bは、車両情報取得部で取得された自車両の速度が、予め定められた速度(例えば時速10km)よりも小さいか否かを判定する。自車両の速度が、予め定められた速度よりも小さいと判定した場合には処理がステップS2に進み、自車両の速度が、予め定められた速度以上であると判定した場合には処理がステップS1に戻る。 After step S1, in step S11, the vehicle information acquisition unit acquires the speed of the host vehicle from an ECU (Electronic Control Unit) of the host vehicle. The display position control unit 12b determines whether or not the speed of the host vehicle acquired by the vehicle information acquisition unit is smaller than a predetermined speed (for example, 10 km / h). If it is determined that the speed of the host vehicle is lower than a predetermined speed, the process proceeds to step S2, and if it is determined that the speed of the host vehicle is equal to or higher than the predetermined speed, the process is performed. Return to S1.
If it is determined in step S2 that a moving operation has been acquired, the process proceeds to step S3; if it is determined that no moving operation has been acquired, the process returns to step S11. After the process of step S3 is performed, the process returns to step S11.
According to the display control device 1 of this modification as described above, when the speed of the host vehicle is equal to or higher than the predetermined speed, the position of the pole object 34 is fixed at the initial position. With such a configuration, the pole object 34 can be prevented from obstructing the driver's view while the host vehicle is traveling.
Note that the vehicle information acquisition unit described above may acquire, instead of the speed of the host vehicle, automatic driving information indicating whether or not the host vehicle is being driven automatically, from the ECU of the host vehicle or the like. Then, in step S11 of FIG. 9, the display position control unit 12b may determine whether or not the automatic driving information acquired by the vehicle information acquisition unit indicates that the host vehicle is being driven automatically. If the information does not indicate that automatic driving is being performed, the process may proceed to step S2; if it indicates that automatic driving is being performed, the process may return to step S1. The display position control unit 12b configured in this manner fixes the position of the pole object 34 at the initial position when the automatic driving information indicates that the host vehicle is being driven automatically. With such a configuration, as described above, the pole object 34 can be prevented from obstructing the driver's view during automatic driving of the host vehicle.
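As a purely illustrative sketch of the gating in FIG. 9 (not part of the disclosed embodiment), the speed check of step S11 and the automatic-driving variant could be expressed in Python as follows; all names, the threshold value, and the data layout are assumptions made for this example.

    SPEED_THRESHOLD_KMH = 10.0  # example value of the predetermined speed in step S11

    def step_s11_allows_moving(vehicle_info, use_autonomy_flag=False):
        """Return True when the process may proceed to step S2 (moving is allowed)."""
        if use_autonomy_flag:
            # Variant: block movement while automatic driving is being performed.
            return not vehicle_info.get("autonomous", False)
        # Default: block movement at or above the predetermined speed.
        return vehicle_info.get("speed_kmh", 0.0) < SPEED_THRESHOLD_KMH

    def one_pass(vehicle_info, move_operation, pole):
        """One pass of the FIG. 9 loop: S11 -> S2 -> S3."""
        if not step_s11_allows_moving(vehicle_info):
            return pole  # the pole object stays fixed at its current (initial) position
        if move_operation is not None:                    # step S2
            pole["position"] = move_operation["target"]   # step S3
        return pole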
<Modification 3 of Embodiment 2>
Although not shown, the display control device 1 according to the present modification includes a peripheral brightness acquisition unit included in the concept of the information acquisition unit 11 of FIG. 1. The peripheral brightness acquisition unit acquires the brightness around the host vehicle measured by a brightness sensor, or acquires an illumination signal when the illumination of the host vehicle is turned on.
When the brightness around the host vehicle acquired by the peripheral brightness acquisition unit is equal to or less than a threshold value, or when the illumination signal is acquired by the peripheral brightness acquisition unit, the control unit 12 changes the color and brightness of the pole object 34 to a color and brightness that are conspicuous even in a dark environment. With such a configuration, the pole object 34 can be made easy to see even in a dark environment. Note that in this specification, the term "color" includes not only a single color but also a color tone such as a pattern combining a plurality of colors.
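A minimal sketch of this brightness-dependent switch, assuming a hypothetical sensor threshold and simple style attributes (none of which appear in the embodiment), might look like this:

    BRIGHTNESS_THRESHOLD = 50.0  # example threshold in arbitrary sensor units

    def update_pole_style(pole, ambient_brightness=None, illumination_signal=False):
        """Use a conspicuous color and brightness when the surroundings are dark."""
        is_dark = illumination_signal or (
            ambient_brightness is not None and ambient_brightness <= BRIGHTNESS_THRESHOLD)
        if is_dark:
            pole["color"] = "high_visibility_amber"  # example of a conspicuous color
            pole["brightness"] = "high"
        else:
            pole["color"] = "default"
            pole["brightness"] = "normal"
        return pole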
<Modification 4 of Embodiment 2>
In the second embodiment, the operation input unit 22 in FIG. 2 has been described as a touch panel, but the present invention is not limited to this. For example, a gesture operation input device that detects the pointing direction indicated by the driver's finger may be used as the operation input unit 22. In that case, when a pointing direction is detected by the gesture operation input device, the display control device 1 may move the pole object 34 to a position in that pointing direction. Alternatively, after determining that the pole object 34 is present in the pointing direction detected by the gesture operation input device, the display control device 1 may move the pole object 34 to a position in another pointing direction when that other pointing direction is detected by the gesture operation input device.
Also, for example, a line-of-sight operation input device that detects the driver's line-of-sight direction may be used as the operation input unit 22. In that case, after determining that the pole object 34 is present in the line-of-sight direction detected by the line-of-sight operation input device, the display control device 1 may move the pole object 34 to a position in another line-of-sight direction when that other line-of-sight direction is detected by the line-of-sight operation input device.
Further, for example, both a gesture operation input device and a voice operation input device may be used as the operation input unit 22.
FIG. 10 is a flowchart showing the operation of the display control device 1 according to this modification when a gesture operation input device and a voice operation input device are used as the operation input unit 22. The operation in FIG. 10 is the same as the operation in FIG. 3 except that step S21 is added before step S1 and step S2 in FIG. 3 is replaced with steps S22 to S25.
First, in step S21, the control unit 12 determines whether or not the operation input unit 22, and thus the operation signal input unit 11a, has acquired the voice of a display command for displaying the pole object 34, for example the utterance "display corner pole". If it is determined that the voice of the display command has been acquired, the process proceeds to step S1; otherwise, the process of step S21 is performed again.
Thereafter, in step S1, the control unit 12 controls the virtual display unit 21 so as to display the pole object 34 at the initial position.
In step S22 after step S1, the control unit 12 determines whether or not the operation input unit 22, and thus the operation signal input unit 11a, has acquired the voice of an erase command for erasing the display of the pole object 34, for example the utterance "erase corner pole". If it is determined that the voice of the erase command has been acquired, the process proceeds to step S23; otherwise, the process proceeds to step S24.
In step S23, the control unit 12 controls the virtual display unit 21 so as to erase the display of the pole object 34. As a result, the pole object 34 is erased. Thereafter, the process returns to step S21.
In step S24, the control unit 12 determines whether or not the operation input unit 22, and thus the operation signal input unit 11a, has acquired the voice of a move command for moving the pole object 34, for example the utterance "move corner pole", and has also acquired a gesture operation pointing at the pole object 34. That is, as shown in FIG. 11, the control unit 12 determines whether the driver has uttered the move command while pointing in the same direction as the pole object 34. If it is determined that the voice of the move command and the gesture operation pointing in that direction have been acquired, the process proceeds to step S25; otherwise, the process returns to step S22. When the process proceeds to step S25, the control unit 12 may cause the tip of the pole object 34 to blink.
In step S25, the control unit 12 determines whether or not the operation input unit 22, and thus the operation signal input unit 11a, has acquired the voice of a decision command for deciding the display position of the pole object 34, for example the utterance "decide corner pole", and has also acquired a gesture operation pointing in a direction different from the pole object 34. That is, as shown in FIG. 12, the control unit 12 determines whether the driver has uttered the decision command while pointing in a direction different from that of the pole object 34. If it is determined that the voice of the decision command and the gesture operation pointing in that other direction have been acquired, the process proceeds to step S3; otherwise, the process of step S25 is performed again. Note that if the voice of the decision command and a gesture operation pointing in another direction have not been acquired even after the process of step S25 has been performed a certain number of times, the process may return to step S22.
In step S3, the control unit 12 controls the virtual display unit 21 so that the pole object 34 is displayed in the other pointing direction described above. Thereafter, the process returns to step S22.
In the above description, a configuration using both a gesture operation input device and a voice operation input device as the operation input unit 22 has been described. However, the present invention is not limited to this; for example, the same operation as described above can also be realized by using both a line-of-sight operation input device and a voice operation input device as the operation input unit 22.
Further, the same operation as described above can also be realized with a configuration in which only a gesture operation input device is used as the operation input unit 22 and, instead of acquiring various commands such as the display command from voice, specific gesture operations such as a swing of the driver's finger are acquired.
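The flow of FIG. 10 can be viewed as a small state machine. The following Python sketch is illustrative only: the command strings, the state names, and the assumption that a pointing direction can be compared directly with the pole object's direction are simplifications made for this example, and the actual recognition of voice and gestures is outside its scope.

    def handle_command(state, voice, gesture_dir, pole_dir):
        """One transition of the FIG. 10 flow.

        state: "hidden", "shown", or "selected" (pole picked by pointing at it)
        voice: recognized command string, or None
        gesture_dir: direction the driver is pointing in, or None
        pole_dir: current direction of the pole object as seen from the driver
        Returns (new_state, new_pole_dir).
        """
        if state == "hidden":
            if voice == "display corner pole":                 # step S21 -> S1
                return "shown", pole_dir
        elif state == "shown":
            if voice == "erase corner pole":                   # step S22 -> S23
                return "hidden", pole_dir
            if voice == "move corner pole" and gesture_dir == pole_dir:   # step S24
                return "selected", pole_dir
        elif state == "selected":
            if voice == "decide corner pole" and gesture_dir not in (None, pole_dir):
                return "shown", gesture_dir                    # step S25 -> S3
        return state, pole_dir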
<Embodiment 3>
FIG. 13 is a block diagram showing the configuration of the display control apparatus 1 according to Embodiment 3 of the present invention. Hereinafter, among the constituent elements described in the third embodiment, constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
Before describing the internal configuration of the display control device 1, the periphery detection unit 23 will be described. The virtual display unit 21 and the operation input unit 22 are the same as those in the second embodiment.
The periphery detection unit 23 acquires information on obstacles, such as other vehicles, around the host vehicle. This information includes the relative position between the obstacle and the host vehicle. For the periphery detection unit 23, at least one of an ultrasonic sensor, an image recognition device, a laser radar, a millimeter wave radar, an acoustic recognition device, and a night vision camera may be used, for example, or a sensing device other than those listed above may be used.
Next, the internal configuration of the display control device 1 in FIG. 13 will be described. The display control device 1 in FIG. 13 is the same as the display control device 1 in FIG. 2 except that an outside-vehicle information input unit 11b and a relative position acquisition unit 11c are added. Note that the outside-vehicle information input unit 11b and the relative position acquisition unit 11c are included in the concept of the information acquisition unit 11 in FIG. 1.
The outside-vehicle information input unit 11b acquires information on obstacles around the host vehicle from the periphery detection unit 23. This information includes information on the relative position between the obstacle and the host vehicle.
The relative position acquisition unit 11c stores, in advance, an in-vehicle position, which is a position in the host vehicle needed to interpret the relative position acquired by the outside-vehicle information input unit 11b. The relative position acquisition unit 11c acquires the relative position between the obstacle and the pole object 34 based on the relative position between the obstacle and the host vehicle acquired by the outside-vehicle information input unit 11b, the in-vehicle position stored in advance, and the position of the pole object 34 determined by the display position control unit 12b.
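As a rough two-dimensional illustration of how these three pieces of information could be combined (the coordinate convention and all names are assumptions for this sketch, not the implementation of the relative position acquisition unit 11c):

    def obstacle_to_pole_offset(obstacle_rel_to_sensor, sensor_pos_in_vehicle, pole_pos_in_vehicle):
        """Return the (x, y) offset from the pole object to the obstacle in vehicle coordinates.

        obstacle_rel_to_sensor: obstacle position relative to the detection point (from unit 11b)
        sensor_pos_in_vehicle:  pre-stored in-vehicle position of that detection point
        pole_pos_in_vehicle:    pole object position decided by the display position control unit 12b
        """
        obstacle_in_vehicle = (sensor_pos_in_vehicle[0] + obstacle_rel_to_sensor[0],
                               sensor_pos_in_vehicle[1] + obstacle_rel_to_sensor[1])
        return (obstacle_in_vehicle[0] - pole_pos_in_vehicle[0],
                obstacle_in_vehicle[1] - pole_pos_in_vehicle[1])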
The control unit 12 changes the display mode of the pole object 34 based on the relative position between the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c. In the third embodiment, the control unit 12 obtains a pole distance, which is the distance between the obstacle and the pole object 34, based on the relative position acquired by the outside-vehicle information input unit 11b, and changes the color of the pole object 34 based on the pole distance.
<Operation>
FIG. 14 is a flowchart showing the operation of the display control device 1 according to the third embodiment. The operation in FIG. 14 is the same as the operation in FIG. 3 except that step S31 is added after step S3.
In step S31, the control unit 12 obtains the above-described pole distance based on the relative position acquired by the outside-vehicle information input unit 11b, and changes the color of the pole object 34 based on the pole distance. For example, as shown in FIG. 15, when the pole distance dp between the obstacle 51 and the pole object 34 is larger than a predetermined first distance (for example, 40 cm), the control unit 12 changes the color of the pole object 34 to a safety color such as light blue. As shown in FIG. 16, when the pole distance dp is equal to or less than the first distance and larger than a predetermined second distance (for example, 20 cm), the control unit 12 changes the color of the pole object 34 to a caution color such as yellow. As shown in FIG. 17, when the pole distance dp is equal to or less than the second distance, the control unit 12 changes the color of the pole object 34 to a warning color such as red. After step S31, the process returns to step S2.
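A minimal sketch of the color selection in step S31, using the example distances of 40 cm and 20 cm mentioned above (the function name and color labels are hypothetical):

    import math

    FIRST_DISTANCE_M = 0.40   # example first distance
    SECOND_DISTANCE_M = 0.20  # example second distance

    def pole_color_for_offset(offset_to_obstacle):
        """Map the pole distance dp to a safety, caution, or warning color."""
        dp = math.hypot(offset_to_obstacle[0], offset_to_obstacle[1])
        if dp > FIRST_DISTANCE_M:
            return "light_blue"   # safety color
        if dp > SECOND_DISTANCE_M:
            return "yellow"       # caution color
        return "red"              # warning color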
<Summary of Embodiment 3>
According to the display control device 1 of the third embodiment as described above, the display mode of the pole object 34 is changed based on the relative position between an obstacle around the host vehicle and the pole object 34. With such a configuration, the driver can learn the relative positional relationship, such as the pole distance, from the change in the display mode, such as the color of the pole object 34, when the pole object 34 is moved.
In the above description, the control unit 12 changes the color of the pole object 34 based on the pole distance, but the present invention is not limited to this. For example, as shown in FIG. 18, the control unit 12 may change the color of the pole object 34 based on whether or not the portion 53 of the obstacle 51 that is closest to the pole object 34 is located within a fan-shaped range 35 extending forward from the pole object 34.
In the above description, the display mode changed by the control unit 12 is described as being the color of the pole object 34. However, the present invention is not limited to this, and the display mode changed by the control unit 12 may be, for example, at least one of the presence or absence of an animation such as blinking of the pole object 34, its color, its shape, and its pattern.
<Modifications of Embodiment 3>
Next, modifications of the third embodiment will be described. Although not described in detail, each modification of the second embodiment may be appropriately combined with the third embodiment, or each modification of the third embodiment may be appropriately combined with the second embodiment.
<Modification 1 of Embodiment 3>
In the third embodiment, the control unit 12 changes the display mode of the pole object 34 based on the relative position between the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c, but the present invention is not limited to this. For example, the control unit 12 may change the display mode of the pole object 34 based on the relative position between the obstacle and the host vehicle acquired by the outside-vehicle information input unit 11b. Specifically, the control unit 12 may change the display mode of the pole object 34 based on the distance between the obstacle and the bumper of the host vehicle. More specifically, the control unit 12 may change the display mode of the pole object 34 based on whether or not the distance between the obstacle and the bumper of the host vehicle is equal to or less than a distance at which the obstacle and the bumper are likely to come into contact (for example, 5 cm).
<Modification 2 of Embodiment 3>
The periphery detection unit 23 in FIG. 13 may further detect the color of an obstacle around the host vehicle, and the outside-vehicle information input unit 11b may further acquire the color of the obstacle detected by the periphery detection unit 23. The control unit 12 may then change the color of the pole object 34 based on the color of the obstacle acquired by the outside-vehicle information input unit 11b. For example, the control unit 12 may change the color of the pole object 34 to a color that is the same as or similar to the color of the obstacle acquired by the outside-vehicle information input unit 11b. As described above, the term "color" here includes not only a single color but also a color tone such as a pattern combining a plurality of colors.
Further, the periphery detection unit 23 may detect, for each of a plurality of obstacles around the host vehicle, the relative position between the obstacle and the pole object 34 and the color of the obstacle, and the outside-vehicle information input unit 11b may acquire the information detected by the periphery detection unit 23. The control unit 12 may then change the color of the pole object 34 based on the relative positions and colors acquired by the outside-vehicle information input unit 11b for the plurality of obstacles. For example, as illustrated in FIG. 19, the control unit 12 may change the color of the pole object 34 based on the color of the obstacle closest to the pole object 34 among the plurality of obstacles 51. In the example of FIG. 19, when the pole object 34 is close to the obstacle 51b, the color tone of the pole object 34 is changed to be the same as the color tone of the obstacle 51b, and when the pole object 34 is close to the obstacle 51c, the color of the pole object 34 is changed to be the same as the color of the obstacle 51c.
With such a configuration, the driver can determine which part of the host vehicle is close to which obstacle, and thus which obstacle affects the travel of the host vehicle.
In the example of FIG. 19, the control unit 12 changes the color of the entire pole object 34 based on the color of the obstacle acquired by the outside-vehicle information input unit 11b, but the present invention is not limited to this. For example, the control unit 12 may change the color of a part of the pole object 34 (for example, its tip) based on the color of the obstacle acquired by the outside-vehicle information input unit 11b. In this configuration, the control unit 12 may change the color of the remaining part of the pole object 34 (for example, its side surface) based on the relative position between the obstacle and the pole object 34.
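Choosing the pole object color from the nearest of several obstacles, as in FIG. 19, could be sketched as follows; the data layout (a list of dictionaries with position and color) is an assumption made only for this example.

    import math

    def color_from_nearest_obstacle(pole_pos, obstacles, default_color="default"):
        """obstacles: list of dicts with 'pos' as (x, y) in vehicle coordinates and 'color'."""
        if not obstacles:
            return default_color
        nearest = min(obstacles,
                      key=lambda ob: math.hypot(ob["pos"][0] - pole_pos[0],
                                                ob["pos"][1] - pole_pos[1]))
        return nearest["color"]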
<Modification 3 of Embodiment 3>
In this modification, the control unit 12 can control another display unit (not shown), different from the virtual display unit 21, that can perform display inside the host vehicle, and the periphery detection unit 23 in FIG. 13 includes a camera. In this case, the periphery detection unit 23 may capture an image of the surroundings of the host vehicle with the camera, and the outside-vehicle information input unit 11b may acquire the image from the periphery detection unit 23. The control unit 12 may then display the image acquired by the outside-vehicle information input unit 11b on the other display unit and superimpose a display object corresponding to the pole object 34 on the image. Note that the display object corresponding to the pole object 34 is a display object that is the same as or similar to the pole object 34.
FIG. 20 is a diagram showing an example of the positional relationship between the host vehicle 2 and another vehicle 51a that is an obstacle. FIG. 21 is a diagram showing a state in which, in the positional relationship of FIG. 20, a corresponding pole object 37, which is a display object corresponding to the pole object 34, is superimposed on the image 36 acquired by the outside-vehicle information input unit 11b and displayed on the other display unit. Since the front portion of the other vehicle 51a in FIG. 20 is inclined to the left, strictly speaking the front portion of the other vehicle 51a would also be inclined to the left in the image; however, as this is not an important point, the front portion of the other vehicle is shown without the inclination in the image of FIG. 21 for convenience.
According to the configuration of this modification as described above, the driver can easily associate the scenery seen directly with the scenery seen from the camera by means of the pole object 34 and the corresponding pole object 37.
In the above configuration, it is assumed that a general display device is used as the other display unit. However, the present invention is not limited to this, and a display device capable of displaying a virtual display object using stereoscopic vision may be used as the other display unit. In that case, a virtual display object may be used as the corresponding pole object 37.
<Modification 4 of Embodiment 3>
In the third embodiment, the control unit 12 is assumed to move the pole object 34 based on the driver's moving operation. However, the present invention is not limited to this, and the control unit 12 may automatically move the pole object 34 from the passenger-seat side to the driver-seat side, or vice versa, within the range that overlaps some part of the fender and the front bumper of the host vehicle. With such a configuration, the driver does not need to perform a moving operation of the pole object 34 when trying to learn the relative positional relationship between an obstacle and the host vehicle from the change in the display mode, so the burden on the driver can be reduced.
<Modification 5 of Embodiment 3>
As indicated by the two-dot chain lines in FIGS. 22 and 23, the control unit 12 according to the third embodiment moves the pole object 34, based on the moving operation, within the range that overlaps some part of the fender and the front bumper of the host vehicle. However, the present invention is not limited to this; as shown by the solid lines in FIGS. 22 and 23, the control unit 12 may move the pole object 34 forward of the host vehicle 2 based on the moving operation so that it is separated from the fender and the front bumper of the host vehicle 2. For example, a distance of about 10 cm is used as the maximum separation distance between the pole object 34 and the fender and front bumper of the host vehicle 2.
With such a configuration, the control unit 12 can perform the operation described in the third embodiment even while the pole object 34 is moving away from the fender and the front bumper of the host vehicle 2. That is, even while the pole object 34 is moving away from the fender and the front bumper of the host vehicle 2, the control unit 12 can change the display mode of the pole object 34 based on the pole distance between an obstacle around the host vehicle and the pole object 34. Therefore, by moving the pole object 34 toward the obstacle, the driver can check how far the host vehicle is from the obstacle.
When the pole object 34 reaches a position where it would contact the obstacle during this movement, the control unit 12 may change the display mode of the pole object 34 to a special display mode indicating the contact, or may cause the virtual display unit 21 to further display a display object such as a character string or a figure indicating the contact.
<Embodiment 4>
The block configuration of the display control apparatus 1 according to the fourth embodiment of the present invention is the same as the block configuration (FIG. 13) of the display control apparatus 1 according to the third embodiment. Hereinafter, among the constituent elements described in the fourth embodiment, constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
In the fourth embodiment, the control unit 12 causes the virtual display unit 21 to further display a second display object. Here, as the second display object, a display object representing a pole that corresponds to a corner pole and is fixed in the same manner as a real corner pole is used.
In the description of the fourth embodiment, the first display object, which has been referred to as the pole object in the description so far, is referred to as an "auxiliary pole object", and the second display object is referred to as a "main pole object".
FIGS. 24 and 25 are diagrams showing the view seen through the windshield 31 from the interior of the host vehicle. As shown in FIGS. 24 and 25, the control unit 12 moves the auxiliary pole object 38, in the same manner as the pole object described so far, within a predetermined range corresponding to the front end of the body of the host vehicle based on the moving operation acquired by the information acquisition unit 11. On the other hand, regardless of the moving operation acquired by the information acquisition unit 11, the control unit 12 fixes the main pole object 39 at the position of the front end on the passenger-seat side of the body of the host vehicle, like the position of a general corner pole. The display mode of the auxiliary pole object 38 and the display mode of the main pole object 39 are different from each other but similar.
<Operation>
FIG. 26 is a flowchart showing the operation of the display control apparatus 1 according to the fourth embodiment.
First, in step S41, the virtual display object generation unit 12a generates the main pole object 39 and the auxiliary pole object 38 based on, for example, the moving operation acquired by the operation signal input unit 11a. The display position control unit 12b sets the display positions of the main pole object 39 and the auxiliary pole object 38 to their initial positions. The display control unit 12c then controls the virtual display unit 21 so that the main pole object 39 and the auxiliary pole object 38 are displayed at the initial positions. As a result, the state shown in FIG. 24 is obtained.
Thereafter, in step S42 of FIG. 26, the display position control unit 12b determines whether or not the operation signal input unit 11a has acquired a moving operation. If it is determined that a moving operation has been acquired, the process proceeds to step S43; otherwise, the process of step S42 is performed again.
In step S43, the display position control unit 12b determines the display position of the auxiliary pole object 38 based on the moving operation acquired by the operation signal input unit 11a. The display control unit 12c controls the virtual display unit 21 so that the auxiliary pole object 38 is displayed at the display position determined by the display position control unit 12b. As a result, the auxiliary pole object 38 moves while the main pole object 39 does not. Thereafter, the process returns to step S42.
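The division of roles between the fixed main pole object and the movable auxiliary pole object in steps S42 and S43 can be sketched as follows (hypothetical names; not the actual implementation):

    def apply_move_operation(main_pole, aux_pole, move_operation):
        """Only the auxiliary pole object follows the moving operation; the main one stays fixed."""
        if move_operation is not None:                       # step S42
            aux_pole["position"] = move_operation["target"]  # step S43
        # main_pole["position"] is intentionally left unchanged
        return main_pole, aux_pole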
<Summary of Embodiment 4>
According to the display control device 1 of the fourth embodiment as described above, the auxiliary pole object 38 is moved based on the moving operation without moving the main pole object 39. Therefore, the driver can use the main pole object 39 in the same manner as a conventional corner pole even after the auxiliary pole object 38 has been moved.
<Embodiment 5>
FIG. 27 is a block diagram showing the configuration of the display control device 1 according to Embodiment 5 of the present invention. Hereinafter, among the constituent elements described in the fifth embodiment, constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
The display control device 1 in FIG. 27 is the same as the display control device 1 in FIG. 13 except that the operation signal input unit 11a and the relative position acquisition unit 11c are removed. The virtual display unit 21 and the periphery detection unit 23 are the same as those in the third embodiment.
In the fifth embodiment, the outside-vehicle information input unit 11b acquires, from the periphery detection unit 23, the relative position between an obstacle around the host vehicle and the host vehicle. The display position control unit 12b determines the display position of the display object based on the relative position acquired by the outside-vehicle information input unit 11b. The control unit 12 configured in this manner can automatically move the pole object 34, even without a moving operation from the driver, based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b.
In the fifth embodiment, the control unit 12 obtains the host vehicle distance, which is the distance between the obstacle and the host vehicle, based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b. When the host vehicle distance is larger than a predetermined threshold value (for example, 40 cm), the control unit 12 fixes the position of the pole object 34 at the initial position. On the other hand, when the host vehicle distance is equal to or less than the threshold value, the control unit 12 brings the pole object 34 closer to the obstacle based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b. Here, the control unit 12 moves the pole object 34 to the position, within the range in which the pole object 34 can move, at which the host vehicle distance is shortest, that is, to the part of the host vehicle that is closest to the obstacle.
<Operation>
FIG. 28 is a flowchart showing the operation of the display control apparatus 1 according to the fifth embodiment.
First, in step S51, the virtual display object generation unit 12a generates the pole object 34 based on, for example, the relative position between an obstacle around the host vehicle and the host vehicle. The display position control unit 12b sets the display position of the pole object 34 to the initial position. The display control unit 12c then controls the virtual display unit 21 so as to display the pole object 34 at the initial position.
In step S52, the outside-vehicle information input unit 11b acquires the relative position between the obstacle around the host vehicle and the host vehicle.
In step S53, the control unit 12 obtains the above-described host vehicle distance based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b. The control unit 12 then determines whether or not the host vehicle distance is equal to or less than a predetermined threshold value (for example, 40 cm). If it is determined that the host vehicle distance is equal to or less than the predetermined threshold value, the process proceeds to step S54; if it is determined that the host vehicle distance is larger than the threshold value, the process returns to step S51.
In step S54, the control unit 12 moves the pole object 34 to the position at which the host vehicle distance is shortest. Thereafter, the process returns to step S52.
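One way to express the decision of steps S53 and S54 is sketched below. The idea of a list of candidate positions along the fender and front bumper, the threshold value, and all names are assumptions made for this illustration.

    import math

    HOST_VEHICLE_DISTANCE_THRESHOLD_M = 0.40  # example threshold used in step S53

    def auto_position_pole(obstacle_pos, candidate_positions, initial_position):
        """Return the pole object position for one pass of steps S52 to S54.

        obstacle_pos: obstacle position in vehicle coordinates
        candidate_positions: positions along the front end where the pole object may be shown
        """
        def dist(p):
            return math.hypot(obstacle_pos[0] - p[0], obstacle_pos[1] - p[1])

        closest = min(candidate_positions, key=dist)
        if dist(closest) <= HOST_VEHICLE_DISTANCE_THRESHOLD_M:  # step S53
            return closest                                      # step S54
        return initial_position  # distance larger than the threshold: keep the initial position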
FIGS. 29 and 30 are diagrams showing examples of the result of the operation in FIG. 28.
FIG. 29 shows a case where the host vehicle distance dv, which is the distance between the obstacle 51 and the host vehicle 2, is larger than the threshold value. In this case, the pole object 34 is fixed at the initial position. FIG. 30 shows a case where the host vehicle distance dv is equal to or less than the threshold value. In this case, the pole object 34 moves to the position at which the host vehicle distance dv is minimized.
<Summary of Embodiment 5>
According to the display control device 1 of the fifth embodiment as described above, the pole object 34 is automatically moved based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle 2. With such a configuration, the driver does not need to perform a moving operation of the pole object 34, so the burden on the driver can be reduced.
In the fifth embodiment, the pole object 34 is brought closer to the obstacle based on the relative position between the obstacle around the host vehicle 2 and the host vehicle. With such a configuration, the driver can know, from the position of the pole object 34, which part of the host vehicle 2 is closest to the obstacle 51.
<Modifications of Embodiment 5>
Next, modifications of the fifth embodiment will be described. Although not described in detail, each modification of the second embodiment may be appropriately combined with the fifth embodiment, or each modification of the fifth embodiment may be appropriately combined with the second embodiment. For example, as in Modification 1 of the second embodiment, the control unit 12 according to the fifth embodiment may change the virtual image direction of the pole object 34 without changing its virtual image distance, instead of changing both the virtual image distance and the virtual image direction of the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle.
Although not described in detail, each modification of the third embodiment may be appropriately combined with the fifth embodiment, or each modification of the fifth embodiment may be appropriately combined with the third embodiment. For example, as in Modification 1 of the third embodiment, the control unit 12 according to the fifth embodiment may change the display mode of the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the outside-vehicle information input unit 11b. Specifically, the control unit 12 according to the fifth embodiment may change the display mode of the pole object 34 based on the host vehicle distance dv, which is the distance between the obstacle and the host vehicle derived from that relative position.
In this case, the display mode changed by the control unit 12 may be, for example, at least one of the presence or absence of an animation such as blinking of the pole object 34, its color, its shape, and its pattern. In the examples of FIGS. 31 and 32, as the host vehicle distance dv becomes shorter, the color of the pole object 34 is changed from the color of the pole object 34 shown in FIG. 29 and the like, through a caution color, to a warning color.
The change of the display mode is not limited to this. For example, the control unit 12 may lengthen or shorten the pole object 34 as the host vehicle distance dv decreases. Further, for example, the control unit 12 may blink the pole object 34 or add a pattern to the pole object 34 as the host vehicle distance dv decreases. In addition, the control unit 12 may change the display mode of the pole object 34 either stepwise or continuously based on the host vehicle distance dv.
<Modification 1 of Embodiment 5>
The fourth embodiment may be combined with the fifth embodiment. That is, as shown in FIGS. 33, 34, and 35, the control unit 12 may automatically move the auxiliary pole object 38, without moving the main pole object 39, based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle 2. Even in this case, the same effect as in the fifth embodiment can be obtained.
<Modification 2 of Embodiment 5>
As described above, the periphery detection unit 23 in FIG. 27 detects information on obstacles around the host vehicle 2, and the outside-vehicle information input unit 11b acquires the information detected by the periphery detection unit 23. Here, the information on the obstacle around the host vehicle 2 may include at least one of the relative position between the obstacle and the host vehicle 2, the attribute of the obstacle, the height of the obstacle, and the color of the obstacle.
The control unit 12 may then cause the virtual display unit 21 to further display, attached to the pole object 34, a third display object, which is a display object indicating the obstacle information acquired by the outside-vehicle information input unit 11b. In the following description, the third display object is referred to as an "additional object".
FIGS. 36, 37, and 38 are diagrams showing examples of additional objects. In these figures, the outside-vehicle information input unit 11b acquires, as the obstacle information, information on the relative position between the obstacle and the host vehicle, and the control unit 12 causes the virtual display unit 21 to display, attached to the pole object 34, additional objects 40a, 40b, and 40c indicating the host vehicle distance dv based on that relative position. In FIG. 36, a text display object indicating the value of the host vehicle distance dv is shown as the additional object 40a. In FIG. 37, an arrow display object whose length indicates the value of the host vehicle distance dv is shown as the additional object 40b. In FIG. 38, a display object of two fingers indicating the value of the host vehicle distance dv is shown as the additional object 40c.
According to the display control device 1 of this modification as described above, the driver can know information on the obstacles around the host vehicle 2.
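A trivial sketch of deriving the content of such additional objects from the host vehicle distance dv (the formatting, pixel scale, and names are illustrative assumptions):

    def distance_label(dv_meters):
        """Text for a character-type additional object such as 40a, e.g. '35 cm' or '1.2 m'."""
        if dv_meters < 1.0:
            return "{:.0f} cm".format(dv_meters * 100)
        return "{:.1f} m".format(dv_meters)

    def arrow_length_px(dv_meters, px_per_meter=200, max_px=400):
        """Length of an arrow-type additional object such as 40b, clamped to the display."""
        return min(int(dv_meters * px_per_meter), max_px)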
<Modification 3 of Embodiment 5>
The periphery detection unit 23 in FIG. 27 may detect the attribute of an obstacle around the host vehicle, and the outside-vehicle information input unit 11b may acquire the attribute of the obstacle detected by the periphery detection unit 23. The obstacle attribute referred to here is, for example, a stationary object, a vehicle, a person, or an animal other than a person. The control unit 12 may then change the display mode of the pole object 34 based on the attribute of the obstacle acquired by the outside-vehicle information input unit 11b.
FIGS. 39, 40, and 41 are diagrams showing examples in which the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle. FIG. 39 shows a pole object 34 indicating a stationary object when the attribute of the obstacle is the stationary object 51d. FIG. 40 shows a pole object 34 indicating a vehicle when the attribute of the obstacle is the vehicle 51e. FIG. 41 shows a pole object 34 indicating a person when the attribute of the obstacle is the person 51f.
According to the display control device 1 of this modification as described above, the driver can know the attribute of the obstacle around the host vehicle. Here, the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle acquired by the outside-vehicle information input unit 11b, but the present invention is not limited to this. For example, as in Modification 2 of the fifth embodiment, the control unit 12 may attach to the pole object 34 an additional object indicating the attribute of the obstacle.
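The attribute-dependent change of the display mode illustrated in FIGS. 39 to 41 can be sketched as a simple lookup; the attribute strings and style names are assumptions made for this example.

    ATTRIBUTE_STYLES = {
        "stationary_object": {"icon": None},
        "vehicle":           {"icon": "car"},
        "person":            {"icon": "person"},
        "animal":            {"icon": "animal"},
    }

    def style_for_attribute(attribute):
        """Return the display mode (here, an icon choice) for a detected obstacle attribute."""
        return ATTRIBUTE_STYLES.get(attribute, {"icon": None})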
<Modification 4 of Embodiment 5>
The periphery detection unit 23 in FIG. 27 may detect the height of an obstacle around the host vehicle, and the outside-vehicle information input unit 11b may acquire the height of the obstacle detected by the periphery detection unit 23. The control unit 12 may then change the display mode of the pole object 34 based on the height of the obstacle acquired by the outside-vehicle information input unit 11b.
FIGS. 42 and 43 are diagrams showing examples in which the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle. FIG. 42 shows a pole object 34 whose mark position indicates that the height of the obstacle is relatively low, for a case where the obstacle is relatively low. FIG. 43 shows a pole object 34 whose mark position indicates that the height of the obstacle is relatively high, for a case where the obstacle is relatively high. Instead of indicating the height of the obstacle by the position of a mark, the height of the obstacle may be indicated by the length of the pole object 34 itself, or by scale marks (not shown) on the pole object 34.
According to the display control device 1 of this modification as described above, the driver can know the height of the obstacle around the host vehicle. Here, the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle acquired by the outside-vehicle information input unit 11b, but the present invention is not limited to this. For example, as in Modification 2 of the fifth embodiment, the control unit 12 may attach to the pole object 34 an additional object indicating the height of the obstacle.
<Modification 5 of Embodiment 5>
Modification 5 of the third embodiment may be combined with the fifth embodiment. That is, the control unit 12 according to the fifth embodiment may move the pole object 34 forward of the host vehicle 2 so that it is separated from the fender and the front bumper of the host vehicle 2, based on the relative position between the obstacle around the host vehicle 2 and the host vehicle 2. For example, the control unit 12 may first automatically move the pole object 34 to the position, within the range in which the pole object 34 can move, at which the host vehicle distance is shortest, and may then automatically move the pole object 34 toward the obstacle.
 FIGS. 44, 45 and 46 sequentially illustrate how the pole object 34, having been moved to the position where the host-vehicle distance is shortest, moves away from the fender and the front bumper of the host vehicle 2. As shown in these figures, while the pole object 34 moves away from the fender and the front bumper of the host vehicle 2, the control unit 12 may change the display mode of the pole object 34 based on the pole distance between the pole object 34 and the obstacle 51 around the host vehicle 2. When the pole object 34 reaches the position where it contacts the obstacle 51, the control unit 12 may change the display mode of the pole object 34 to a special mode indicating the contact, or may additionally display on the virtual display unit 21 a display object such as text or a figure indicating the contact. With this configuration, the driver can grasp the relative position between the obstacle 51 and the host vehicle 2.
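 A minimal sketch of this forward sweep is given below; the step size, the distance thresholds, and the mode labels are assumptions introduced for illustration, not parameters disclosed in the embodiment.

```python
def sweep_pole_forward(start_x: float, obstacle_x: float, step: float = 0.1):
    """Advance the pole object from the vehicle's front end toward the obstacle
    along the longitudinal axis, yielding (position, display_mode) pairs
    (illustrative; assumes start_x < obstacle_x)."""
    x = start_x
    while x < obstacle_x:
        x = min(x + step, obstacle_x)
        pole_distance = obstacle_x - x
        if pole_distance == 0.0:
            mode = "contact"   # special display mode / extra text object at contact
        elif pole_distance < 0.5:
            mode = "warning"   # e.g. blinking or color change when close
        else:
            mode = "normal"
        yield x, mode
```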
 In the above description, the control unit 12 first automatically moves the pole object 34 to the position, within its movable range, where the host-vehicle distance is shortest, and then automatically moves the pole object 34 toward the obstacle. However, this is not restrictive; for example, the control unit 12 may combine automatic movement of the pole object 34 with movement based on the driver's moving operation. Specifically, the control unit 12 may automatically move the pole object 34 to the position, within its movable range, where the host-vehicle distance is shortest, based on the relative position between the host vehicle 2 and the obstacle 51 around it, and may thereafter move the pole object 34 toward the obstacle based on the driver's moving operation.
 <Modification 6 of Embodiment 5>
 In the description so far, the control unit 12 displays a single pole object 34 on the virtual display unit 21. However, this is not restrictive, and the control unit 12 may display a plurality of pole objects 34 on the virtual display unit 21.
 For example, the periphery detection unit 23 may detect, for each of a plurality of obstacles, the relative position between that obstacle and the host vehicle, and the vehicle-exterior information input unit 11b may acquire those relative positions from the periphery detection unit 23. The control unit 12 may then display on the virtual display unit 21 one pole object 34 for each of the plurality of obstacles, at the position where the host-vehicle distance to that obstacle is shortest. Alternatively, as shown in FIG. 47, the control unit 12 may display on the virtual display unit 21 one auxiliary pole object 38 for each of the plurality of obstacles 51, at the position where the host-vehicle distance is shortest.
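 The per-obstacle placement could look like the following sketch; the 2-D point representation, the vehicle-fixed frame, and the helper names are assumptions used only to make the idea concrete.

```python
import math

Point = tuple[float, float]  # (x, y) in a vehicle-fixed frame (assumed convention)

def nearest_body_point(obstacle: Point, body_outline: list[Point]) -> Point:
    """Body point closest to the given obstacle."""
    return min(body_outline, key=lambda p: math.dist(p, obstacle))

def pole_positions(obstacles: list[Point],
                   body_outline: list[Point]) -> list[Point]:
    """One pole (or auxiliary pole) object per obstacle, each placed at the body
    point whose distance to that obstacle is shortest."""
    return [nearest_body_point(o, body_outline) for o in obstacles]
```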
 <Embodiment 6>
 FIG. 48 is a block diagram showing the configuration of the display control device 1 according to Embodiment 6 of the present invention. Hereinafter, among the components described in Embodiment 6, components that are the same as or similar to those described above are given the same reference signs, and the description focuses mainly on the differing components.
 Before describing the internal configuration of the display control device 1, the in-vehicle LAN (Local Area Network) device 24 is described. The virtual display unit 21 and the periphery detection unit 23 are the same as in Embodiment 3.
 The in-vehicle LAN device 24 constitutes a CAN (Controller Area Network) or the like, and communicates various information and control commands between devices in the host vehicle. The in-vehicle LAN device 24 thereby acquires from the host vehicle its position information, control information on its travel, information on its body, its vehicle-specific information, and so on. The periphery detection unit 23 may be connected to the in-vehicle LAN device 24; in this case, the vehicle-exterior information input unit 11b acquires information on obstacles around the host vehicle via the in-vehicle LAN device 24.
 Next, the internal configuration of the display control device 1 in FIG. 48 is described. The display control device 1 in FIG. 48 has a configuration in which an in-vehicle information input unit 11d is added to the display control device 1 in FIG. 27. The in-vehicle information input unit 11d is included in the concept of the information acquisition unit 11 in FIG. 1.
 The in-vehicle information input unit 11d obtains a first movement trajectory, which is the future movement trajectory of the host vehicle, based on the control information and the like acquired by the in-vehicle LAN device 24. In the following description, the first movement trajectory is referred to as the "host-vehicle movement trajectory".
 The control unit 12, such as the display position control unit 12b, obtains, based on the relative position between the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b and on the host-vehicle movement trajectory obtained by the in-vehicle information input unit 11d, the part of the host vehicle that will contact the obstacle if the host vehicle moves along the host-vehicle movement trajectory. The control unit 12 then moves the pole object 34 to the obtained part.
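 One possible way to compute that contact part is to step the vehicle outline along the predicted trajectory and report the first body point that would touch the obstacle; the sampling scheme, the contact radius, and the helper names below are assumptions, not the disclosed implementation.

```python
import math
from typing import Optional

Point = tuple[float, float]            # (x, y); frame conventions are assumed
Pose = tuple[float, float, float]      # (x, y, heading) sampled along the trajectory

def transform(p: Point, pose: Pose) -> Point:
    """Place a body point given in the vehicle frame into the road frame."""
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (x + c * p[0] - s * p[1], y + s * p[0] + c * p[1])

def predicted_contact_part(body_outline: list[Point],
                           trajectory: list[Pose],
                           obstacle: Point,
                           contact_radius: float = 0.05) -> Optional[Point]:
    """Return the body point (vehicle frame) that first touches the obstacle
    while the vehicle follows its predicted trajectory, or None if no contact."""
    for pose in trajectory:
        for body_point in body_outline:
            if math.dist(transform(body_point, pose), obstacle) <= contact_radius:
                return body_point
    return None
```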
 <Operation>
 FIG. 49 is a flowchart showing the operation of the display control device 1 according to Embodiment 6. The operation in FIG. 49 is the same as that of FIG. 28 with step S54 replaced by steps S61 and S62.
 In step S61, the control unit 12 obtains, based on the relative position between the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b and on the host-vehicle movement trajectory acquired by the in-vehicle information input unit 11d, the part of the host vehicle that will contact the obstacle if the host vehicle moves along the host-vehicle movement trajectory. In FIG. 50, the host vehicle 2 when traveling along the host-vehicle movement trajectory is shown by a two-dot chain line. In the arrangement of FIG. 50, the control unit 12 predicts that, if the host vehicle 2 shown by the solid line moves along the host-vehicle movement trajectory, the part 2c of the host vehicle 2 shown by the two-dot chain line will contact the part 54 of the other vehicle 51a.
 In step S62, the control unit 12 moves the pole object 34 to the obtained part. In the case of FIG. 50, the control unit 12 moves the pole object 34 to the part 2d shown by the solid line, which corresponds to the part 2c of the host vehicle 2. Thereafter, the process returns to step S52.
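 Steps S61 and S62 could then sit in the periodic update loop roughly as follows; the interfaces are assumed, and the prediction function is the illustrative helper sketched above rather than the embodiment's own implementation.

```python
def update_pole(display, predict_contact_part, body_outline, trajectory, obstacle):
    """One pass of steps S61/S62 (illustrative): predict the contact part,
    then move the pole object there. `display` and `predict_contact_part`
    are assumed interfaces, e.g. the sketch given above."""
    contact_part = predict_contact_part(body_outline, trajectory, obstacle)  # step S61
    if contact_part is not None:
        display.move_pole_object(contact_part)                               # step S62
```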
 <Summary of Embodiment 6>
 According to the display control device 1 of Embodiment 6 described above, the pole object 34 is moved to the part of the host vehicle 2 that will contact an obstacle in the future. The driver can therefore change the movement trajectory of the host vehicle 2, referring to the display of the pole object 34, so that the host vehicle 2 does not contact the obstacle.
 <Modifications of Embodiment 6>
 Next, modifications of Embodiment 6 are described. Although not described in detail, the modifications of Embodiments 2, 3, and 5 may be combined with Embodiment 6 as appropriate, and the modifications of Embodiment 6 may be combined with Embodiments 2, 3, and 5 as appropriate.
 <Modification 1 of Embodiment 6>
 In Embodiment 6, based on the relative position acquired by the vehicle-exterior information input unit 11b and on the host-vehicle movement trajectory acquired by the in-vehicle information input unit 11d, the control unit 12 moves the pole object 34 to the part of the host vehicle 2 that will contact the obstacle if the host vehicle 2 moves along the host-vehicle movement trajectory.
 However, this is not restrictive. For example, based on the relative position acquired by the vehicle-exterior information input unit 11b and on the host-vehicle movement trajectory acquired by the in-vehicle information input unit 11d, the control unit 12 may move the pole object 34 to the part of the host vehicle 2 that, if the host vehicle 2 moves along the host-vehicle movement trajectory, does not contact the obstacle but comes closest to it. Alternatively, the control unit 12 may change the display mode of the pole object 34 according to whether the host vehicle will contact the obstacle. With such a configuration, the driver can change the movement trajectory of the host vehicle 2, referring to the display of the pole object 34, so that the host vehicle 2 does not contact the obstacle.
 Also, for example, as shown in FIG. 51, the control unit 12 may move the main pole object 39 to the part 2e of the host vehicle 2 that is currently closest to the other vehicle 51a, and move the auxiliary pole object 38 to the part 2f of the host vehicle 2 that will contact the obstacle if the host vehicle 2 moves along the host-vehicle movement trajectory. Also, for example, the control unit 12 may move one auxiliary pole object 38 to the currently closest part 2e of the host vehicle 2 and move another auxiliary pole object 38 to the part 2f that will contact the obstacle if the host vehicle 2 moves along the host-vehicle movement trajectory; in this case, the control unit 12 may fix the position of the main pole object 39 at its initial position.
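 When no contact is predicted, the closest-approach part can be obtained by minimizing the body-point-to-obstacle distance over the sampled trajectory; the sketch below is an assumption-level illustration and repeats the pose helper so that it stands alone.

```python
import math

def _place(p, pose):
    # Pose transform (vehicle frame -> road frame), same form as the earlier sketch.
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (x + c * p[0] - s * p[1], y + s * p[0] + c * p[1])

def closest_approach_part(body_outline, trajectory, obstacle):
    """Return the body point (vehicle frame) that comes closest to the obstacle
    anywhere along the predicted trajectory (illustrative)."""
    best_point, best_dist = None, float("inf")
    for pose in trajectory:
        for body_point in body_outline:
            d = math.dist(_place(body_point, pose), obstacle)
            if d < best_dist:
                best_point, best_dist = body_point, d
    return best_point
```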
 <Modification 2 of Embodiment 6>
 Embodiment 6 assumes that the obstacle is stationary, but the obstacle may move. In this case, the vehicle-exterior information input unit 11b obtains a second movement trajectory, which is the future movement trajectory of the obstacle, based on the information on the obstacle around the host vehicle detected by the periphery detection unit 23. In the following description, the second movement trajectory is referred to as the "obstacle movement trajectory".
 For example, if the obstacle is another vehicle that is not capable of automated driving, the vehicle-exterior information input unit 11b obtains the movement trajectory of that other vehicle as the obstacle movement trajectory based on, for example, the direction of the other vehicle's tires shown in a captured image. If the obstacle is another vehicle capable of automated driving, the vehicle-exterior information input unit 11b obtains the movement trajectory of that other vehicle as the obstacle movement trajectory based on the other vehicle's automated-driving information acquired through vehicle-to-vehicle communication or the like.
 The control unit 12, such as the display position control unit 12b, obtains, based on the relative position between the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b, the host-vehicle movement trajectory obtained by the in-vehicle information input unit 11d, and the obstacle movement trajectory obtained by the vehicle-exterior information input unit 11b, at least one of the part of the host vehicle that comes closest to the obstacle and the part of the host vehicle that contacts the obstacle if the host vehicle moves along the host-vehicle movement trajectory and the obstacle moves along the obstacle movement trajectory. The control unit 12 then moves the pole object 34 to the at least one obtained part.
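 With a moving obstacle, the two predicted trajectories can be stepped together at matching time indices; the pairwise sampling below is an illustrative assumption (in particular, it presumes both trajectories share the same time base) and again repeats the pose helper so that it stands alone.

```python
import math

def _place(p, pose):
    # Pose transform (vehicle frame -> road frame), as in the earlier sketches.
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (x + c * p[0] - s * p[1], y + s * p[0] + c * p[1])

def joint_closest_part(body_outline, vehicle_trajectory, obstacle_trajectory,
                       contact_radius=0.05):
    """Step both predicted trajectories together and return the body point that
    first contacts the obstacle, or failing that the closest one (illustrative;
    assumes both trajectories are sampled on the same time base)."""
    best_point, best_dist = None, float("inf")
    for pose, obstacle in zip(vehicle_trajectory, obstacle_trajectory):
        for body_point in body_outline:
            d = math.dist(_place(body_point, pose), obstacle)
            if d <= contact_radius:
                return body_point          # contact predicted at this time step
            if d < best_dist:
                best_point, best_dist = body_point, d
    return best_point
```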
 With such a configuration, the accuracy of obtaining the part of the host vehicle that comes closest to the obstacle, and the accuracy of obtaining the part that contacts the obstacle, can be improved.
 <Modification 3 of Embodiment 6>
 In a configuration in which an operation signal input unit 11a and an operation input unit 22 similar to those of Embodiment 2 are added to the configuration of Embodiment 5, the operation signal input unit 11a may acquire a switching operation from the driver via the operation input unit 22. Based on the switching operation acquired by the operation signal input unit 11a, the control unit 12 may then selectively perform the display described in Embodiment 5, the display described in Embodiment 6, and the displays described in Modifications 1 and 2 of Embodiment 6.
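 A minimal dispatch for that switching operation might look like the sketch below; the mode names and the cycling policy are placeholders introduced for illustration, not behavior specified by the embodiment.

```python
from enum import Enum, auto

class DisplayMode(Enum):
    EMBODIMENT_5 = auto()      # pole follows the currently closest part
    EMBODIMENT_6 = auto()      # pole placed at the predicted contact part
    MODIFICATION_1_2 = auto()  # closest-approach / moving-obstacle variants

def on_switch_operation(current: DisplayMode) -> DisplayMode:
    """Cycle through the selectable displays each time the driver operates the
    switch (illustrative policy)."""
    order = list(DisplayMode)
    return order[(order.index(current) + 1) % len(order)]
```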
 <Other Modifications>
 The information acquisition unit 11 and the control unit 12 in the display control device 1 described above are hereinafter referred to as the "information acquisition unit 11 and the like". The information acquisition unit 11 and the like are realized by a processing circuit 81 shown in FIG. 52. That is, the processing circuit 81 includes the information acquisition unit 11 that acquires information, and the control unit 12 that causes the display unit to display the first display object, which is a display object, and moves the first display object within a predetermined range corresponding to the front end of the vehicle body based on the information acquired by the information acquisition unit 11. Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied. The processor corresponds to, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
 When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these. The functions of the respective units such as the information acquisition unit 11 may each be realized by a separate processing circuit, or the functions of the units may be realized collectively by a single processing circuit.
 When the processing circuit 81 is a processor, the functions of the information acquisition unit 11 and the like are realized in combination with software or the like. The software or the like corresponds to, for example, software, firmware, or software and firmware. The software or the like is described as a program and stored in a memory 83. As shown in FIG. 53, the processor 82 applied to the processing circuit 81 realizes the functions of the respective units by reading and executing the program stored in the memory 83. That is, the display control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in execution of a step of acquiring information and a step of causing the display unit to display the first display object, which is a display object, and performing control to move the first display object within a predetermined range corresponding to the front end of the vehicle body based on the acquired information. In other words, this program can be said to cause a computer to execute the procedures and methods of the information acquisition unit 11 and the like. Here, the memory 83 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a MiniDisc, a DVD (Digital Versatile Disc), a drive device therefor, or any storage medium to be used in the future.
 The above description covers configurations in which each function of the information acquisition unit 11 and the like is realized by either hardware or software and the like. However, this is not restrictive, and part of the information acquisition unit 11 and the like may be realized by dedicated hardware while another part is realized by software and the like. For example, the function of the information acquisition unit 11 can be realized by the processing circuit 81 as dedicated hardware, while the other functions can be realized by the processing circuit 81 as the processor 82 reading and executing the program stored in the memory 83. As described above, the processing circuit 81 can realize each of the functions described above by hardware, software, or the like, or a combination thereof.
 The display control device 1 described above can also be applied to a display control system constructed by appropriately combining a navigation device such as a PND (Portable Navigation Device), a communication terminal including a portable terminal such as a mobile phone, a smartphone, or a tablet, the functions of applications installed on these, and a server. In this case, each function or each component of the display control device 1 described above may be distributed among the devices constituting the system, or may be concentrated in one of the devices. As one example, the display control device may further include the virtual display unit 21 in FIG. 1.
 FIG. 54 is a block diagram showing the configuration of a server 91 according to the present modification. The server 91 in FIG. 54 includes a communication unit 91a and a control unit 91b, and is capable of wireless communication with a display control device 93 realized by, for example, the navigation device of a host vehicle 92.
 The communication unit 91a, which serves as an information acquisition unit, receives the information acquired by the display control device 93 by performing wireless communication with the display control device 93.
 The control unit 91b has the same function as the control unit 12 in FIG. 1, realized by a processor (not shown) of the server 91 executing a program stored in a memory (not shown) of the server 91. That is, the control unit 91b generates, based on the information received by the communication unit 91a, a control signal for moving the first display object within the predetermined range corresponding to the front end of the vehicle body. The communication unit 91a then transmits the control signal of the control unit 91b to the display control device 93. The display control device 93 moves the pole object displayed on the virtual display unit 21 based on the control signal transmitted from the communication unit 91a.
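 The server-side round trip could be organized roughly as below; the link interface, the message field, and the injected position computation are hypothetical and only illustrate the split between information reception, control-signal generation, and transmission.

```python
def server_step(link, compute_target_position):
    """Receive vehicle-side information, generate a control signal, and send it
    back to the in-vehicle display control device (illustrative; `link` and the
    message field are assumptions, not defined by the embodiment)."""
    info = link.receive()                    # e.g. relative obstacle position
    target = compute_target_position(info)   # where the first display object should go
    link.send({"move_first_display_object_to": target})
```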
 According to the server 91 configured in this way, the same effects as those of the display control device 1 described in Embodiment 1 can be obtained.
 FIG. 55 is a block diagram showing the configuration of a communication terminal 96 according to the present modification. The communication terminal 96 in FIG. 55 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and is capable of wireless communication with a display control device 98 of a host vehicle 97. A portable terminal such as a mobile phone, a smartphone, or a tablet carried by the driver of the host vehicle 97, for example, is applied as the communication terminal 96. According to the communication terminal 96 configured in this way, the same effects as those of the display control device 1 described in Embodiment 1 can be obtained.
 Within the scope of the invention, the embodiments and the modifications may be freely combined, and each embodiment and each modification may be modified or omitted as appropriate.
 Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that countless modifications not illustrated can be envisaged without departing from the scope of the present invention.
 DESCRIPTION OF SYMBOLS: 1 display control device, 2 host vehicle, 11 information acquisition unit, 12 control unit, 21 virtual display unit, 34 pole object, 36 image, 37 corresponding pole object, 38 auxiliary pole object, 39 main pole object, 40a, 40b, 40c additional objects, 51 obstacle.

Claims (22)

  1. A display control device for controlling a display unit, wherein
     the display unit is capable of displaying one or more display objects that are visible from a driver's seat of a vehicle superimposed on scenery outside the vehicle,
     the display control device comprising:
     an information acquisition unit that acquires information; and
     a control unit that causes the display unit to display a first display object, which is one of the display objects, and moves the first display object within a predetermined range corresponding to a front end portion of a body of the vehicle, based on the information acquired by the information acquisition unit.
  2. The display control device according to claim 1, wherein
     an initial position of the first display object is a position of a passenger-seat-side front end portion of the body.
  3. The display control device according to claim 1, wherein
     the control unit moves the first display object forward of the front end portion of the body, based on the information acquired by the information acquisition unit.
  4. The display control device according to claim 1, wherein
     the control unit causes the display unit to further display a second display object, which is one of the display objects, and fixes the second display object at a position of a passenger-seat-side front end portion of the body.
  5. The display control device according to claim 4, wherein
     a display mode of the first display object and a display mode of the second display object are different from, and similar to, each other.
  6. The display control device according to claim 1, wherein
     the information includes information on an operation for moving the first display object.
  7. The display control device according to claim 6, wherein
     the information acquisition unit further acquires a relative position between an obstacle around the vehicle and the first display object or the vehicle, and
     the control unit changes a display mode of the first display object based on the relative position acquired by the information acquisition unit.
  8. The display control device according to claim 1, wherein
     the information includes information on a relative position between an obstacle around the vehicle and the vehicle.
  9. The display control device according to claim 8, wherein
     the control unit fixes a position of the first display object at an initial position when a distance between the obstacle and the vehicle based on the relative position acquired by the information acquisition unit is larger than a predetermined threshold, and moves the first display object when the distance is equal to or less than the threshold.
  10. The display control device according to claim 8, wherein
     the control unit brings the first display object closer to the obstacle based on the relative position acquired by the information acquisition unit.
  11. The display control device according to claim 8, wherein
     the information further includes a first movement trajectory that is a future movement trajectory of the vehicle, and
     the control unit moves the first display object, based on the relative position and the first movement trajectory acquired by the information acquisition unit, to at least one of a part of the vehicle that comes closest to the obstacle and a part of the vehicle that contacts the obstacle when the vehicle moves along the first movement trajectory.
  12. The display control device according to claim 11, wherein
     the information further includes a second movement trajectory that is a future movement trajectory of the obstacle, and
     the control unit moves the first display object, based on the relative position, the first movement trajectory, and the second movement trajectory acquired by the information acquisition unit, to at least one of a part of the vehicle that comes closest to the obstacle and a part of the vehicle that contacts the obstacle when the vehicle moves along the first movement trajectory and the obstacle moves along the second movement trajectory.
  13. The display control device according to claim 8, wherein
     the control unit changes a display mode of the first display object based on a distance between the obstacle and the vehicle that is based on the relative position acquired by the information acquisition unit.
  14. The display control device according to claim 13, wherein
     the display mode of the first display object includes at least one of presence or absence of animation, a color, a shape, and a pattern of the first display object.
  15. The display control device according to claim 1, wherein
     the information acquisition unit further acquires an attribute of an obstacle around the vehicle, and
     the control unit changes a display mode of the first display object based on the attribute of the obstacle acquired by the information acquisition unit.
  16. The display control device according to claim 1, wherein
     the information acquisition unit further acquires a height of an obstacle around the vehicle, and
     the control unit changes a display mode of the first display object based on the height of the obstacle acquired by the information acquisition unit.
  17. The display control device according to claim 1, wherein
     the information acquisition unit further acquires a color of an obstacle around the vehicle, and
     the control unit changes a color of the first display object based on the color of the obstacle acquired by the information acquisition unit.
  18. The display control device according to claim 1, wherein
     the information acquisition unit acquires information on an obstacle around the vehicle, and
     the control unit causes the display unit to further display, accompanying the first display object, a third display object that is one of the display objects and indicates the information on the obstacle acquired by the information acquisition unit.
  19. The display control device according to claim 18, wherein
     the information on the obstacle includes information on a relative position between the obstacle around the vehicle and the first display object.
  20. The display control device according to claim 1, wherein
     the information acquisition unit acquires an image obtained by imaging surroundings of the vehicle, and
     the control unit causes the image acquired by the information acquisition unit to be displayed on another display unit, different from the display unit, that is capable of performing display inside the vehicle, and superimposes a display object corresponding to the first display object on the image.
  21. The display control device according to claim 1, wherein
     the information acquisition unit acquires automated driving information indicating whether or not automated driving is being performed in the vehicle, and
     the control unit fixes a position of the first display object at an initial position when the automated driving information acquired by the information acquisition unit indicates that automated driving is being performed in the vehicle.
  22. A display control method for controlling a display unit, wherein
     the display unit is capable of displaying one or more display objects that are visible from a driver's seat of a vehicle superimposed on scenery outside the vehicle,
     the display control method comprising:
     acquiring information; and
     causing the display unit to display a first display object, which is one of the display objects, and moving the first display object within a predetermined range corresponding to a front end portion of a body of the vehicle, based on the acquired information.
PCT/JP2017/009932 2017-03-13 2017-03-13 Display control device and display control method WO2018167815A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019505313A JP6727400B2 (en) 2017-03-13 2017-03-13 Display control device and display control method
PCT/JP2017/009932 WO2018167815A1 (en) 2017-03-13 2017-03-13 Display control device and display control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/009932 WO2018167815A1 (en) 2017-03-13 2017-03-13 Display control device and display control method

Publications (1)

Publication Number Publication Date
WO2018167815A1 true WO2018167815A1 (en) 2018-09-20

Family

ID=63521889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/009932 WO2018167815A1 (en) 2017-03-13 2017-03-13 Display control device and display control method

Country Status (2)

Country Link
JP (1) JP6727400B2 (en)
WO (1) WO2018167815A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026747A1 (en) * 2017-07-31 2019-02-07 日本精機株式会社 Augmented real image display device for vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07140918A (en) * 1993-11-16 1995-06-02 Toyohiko Hatada Display device for vehicle
JPH08295175A (en) * 1995-04-25 1996-11-12 Mitsubishi Motors Corp Vertual image corner marker
JP2010039508A (en) * 2008-07-31 2010-02-18 Honda Motor Co Ltd Vehicular information notification system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005138755A (en) * 2003-11-07 2005-06-02 Denso Corp Device and program for displaying virtual images
JP2010064713A (en) * 2008-09-12 2010-03-25 Toshiba Corp Image irradiation system, and image irradiation method
JP2010250057A (en) * 2009-04-15 2010-11-04 Stanley Electric Co Ltd Virtual image type sign display device
JP2012116400A (en) * 2010-12-02 2012-06-21 Jvc Kenwood Corp Corner pole projection device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07140918A (en) * 1993-11-16 1995-06-02 Toyohiko Hatada Display device for vehicle
JPH08295175A (en) * 1995-04-25 1996-11-12 Mitsubishi Motors Corp Vertual image corner marker
JP2010039508A (en) * 2008-07-31 2010-02-18 Honda Motor Co Ltd Vehicular information notification system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026747A1 (en) * 2017-07-31 2019-02-07 日本精機株式会社 Augmented real image display device for vehicle
JPWO2019026747A1 (en) * 2017-07-31 2020-05-28 日本精機株式会社 Augmented reality image display device for vehicles

Also Published As

Publication number Publication date
JPWO2018167815A1 (en) 2019-06-27
JP6727400B2 (en) 2020-07-22

Similar Documents

Publication Publication Date Title
US10591723B2 (en) In-vehicle projection display system with dynamic display area
JP6413207B2 (en) Vehicle display device
JP6340969B2 (en) Perimeter monitoring apparatus and program
CA3069114C (en) Parking assistance method and parking assistance device
EP3118047B1 (en) Display control device, display device, display control program, display control method, and recording medium
JP4770931B2 (en) Display device
JP6346614B2 (en) Information display system
EP2487906B1 (en) Control device and vehicle surrounding monitoring device
JP4475308B2 (en) Display device
JP7069548B2 (en) Peripheral monitoring device
JP6045796B2 (en) Video processing apparatus, video processing method, and video display system
CN108349503B (en) Driving support device
JP2015000630A (en) Onboard display device and program
WO2014156614A1 (en) Vehicular display device
US11181909B2 (en) Remote vehicle control device, remote vehicle control system, and remote vehicle control method
JP6720729B2 (en) Display controller
JP2017186008A (en) Information display system
US20190286118A1 (en) Remote vehicle control device and remote vehicle control method
JP6805974B2 (en) Driving support device and computer program
WO2018167815A1 (en) Display control device and display control method
JP2019069717A (en) Parking support device
JP2019119357A (en) Display system
JP7051263B2 (en) Operation plan change instruction device and operation plan change instruction method
JP2020121638A (en) Display control device
WO2022210172A1 (en) Vehicular display system, vehicular display method, and vehicular display program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900715

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019505313

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900715

Country of ref document: EP

Kind code of ref document: A1