JP6413207B2 - Vehicle display device - Google Patents

Vehicle display device

Info

Publication number
JP6413207B2
Authority
JP
Japan
Prior art keywords
image
vehicle
display
captured image
left
Prior art date
Legal status
Active
Application number
JP2013106522A
Other languages
Japanese (ja)
Other versions
JP2014229997A (en)
Inventor
聖弥 小堀内
聡 太田
郁代 笹島
悠樹 高橋
Original Assignee
日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 日本精機株式会社
Priority to JP2013106522A
Publication of JP2014229997A
Application granted
Publication of JP6413207B2
Legal status: Active
Anticipated expiration


Description

  The present invention relates to a vehicle display device mounted on a vehicle.

  For example, Patent Document 1 discloses a vehicle display device that includes a meter panel arranged on the instrument panel of a vehicle to display vehicle information, and a first display unit that allows the driver to visually recognize video of the blind spots to the left and right rear of the vehicle captured by imaging units. A virtual image generated by the first display unit is projected onto the instrument panel or the windshield of the vehicle in a direction and at a position corresponding to the direction and position at which the corresponding imaging unit is disposed, so that the driver, by viewing the video (virtual image) at a position corresponding to the imaging position, can visually and intuitively grasp where the displayed video was captured. The present applicant has previously proposed a specific configuration of such a vehicle display device in Japanese Patent Application No. 2013-068055, and that configuration is described below.

  This vehicle display device presents to an observer a driving assistance image including vehicle information, a left rear captured image obtained by imaging the left rear of the vehicle, and a right rear captured image obtained by imaging the right rear of the vehicle. When all of the driving assistance image is included within the effective visual field angle in the horizontal direction, the left rear captured image is located on the left side of the driving assistance image with at least a part of it included within that effective visual field angle, and the right rear captured image is located on the right side of the driving assistance image with at least a part of it likewise included. Because at least parts of the left and right rear captured images remain within the effective visual field angle, in which information can be received well, even while the driver is viewing the driving assistance image that is most important for vehicle operation, the driver can view the captured images without a significant viewpoint movement and intuitively grasp the positions at which they were captured.

JP 2005-329768 A

  However, in a vehicle display device as described above, although the vehicle information and the blind spots of the vehicle can be recognized intuitively without a large viewpoint movement, when the driver views the driving assistance image, which mainly needs to be recognized during vehicle operation, the nearby captured images (the left rear captured image and the right rear captured image, which lie within the effective viewing angle when the driving assistance image is viewed) are inevitably seen at the same time, and there was a risk that attention to each image would be reduced.

  The present invention has therefore been made in view of the above problem, and an object thereof is to provide a vehicle display device that allows the displayed images to be recognized intuitively without a large viewpoint movement and that does not reduce attention to the images.

In order to solve the above problem, the present invention is a vehicle display device that is arranged in an instrument panel of a vehicle and displays to an observer a driving assistance image including at least vehicle information or navigation information, a left rear captured image that is located on the left side of the driving assistance image and captures the left rear of the vehicle, and a right rear captured image that is located on the right side of the driving assistance image and captures the right rear of the vehicle. All of the driving assistance image is included within a range of 15° to the left and right of a center line given by the observer's line of sight facing the front of the vehicle, and at least a part of each of the left rear captured image and the right rear captured image is also included within this range. The device comprises display adjustment means that, while the vehicle is traveling, makes the visibility of both the left rear captured image and the right rear captured image lower than the visibility of the driving assistance image within a visible range, and that, when a specific condition is satisfied, increases the visibility of at least one of the left rear captured image and the right rear captured image, continues the increased visibility for a predetermined duration, and automatically changes that duration according to the timing at which the specific condition is satisfied.

  According to the present invention, it is possible to provide a vehicle display device in which the displayed images can be recognized intuitively without a large viewpoint movement and attention to the images is not reduced.

FIG. 1 shows the electrical configuration of a vehicle display device according to an embodiment of the present invention.
FIG. 2 is an overview of the vicinity of the driver's seat in a vehicle in which the vehicle display device is mounted.
FIG. 3 is a plan view showing an overview of the display means provided in the vehicle display device.
FIG. 4 is a side view showing an overview of the display means.
FIG. 5 is a perspective view showing an outline of the display means.
FIG. 6 is a plan view showing an overview of the display means.
FIG. 7 shows a display example of the display means.
FIG. 8 shows the relationship between the imaging positions of the virtual images in the vehicle display device and the driver's line of sight.
FIG. 9 shows a display example of the display means and illustrates the display adjustment processing.
FIG. 10 is a timing chart illustrating the display adjustment processing in the vehicle display device.
FIG. 11 shows a display example of the display means.
FIG. 12 is a timing chart illustrating the display adjustment processing in a modification of the vehicle display device of the present invention.

Embodiments to which the present invention is applied will be described below with reference to the accompanying drawings.
The vehicle display device according to the present embodiment is a vehicle display device 1 mounted on a host vehicle 2. The vehicle display device 1 displays, on display means 500, driving support information including vehicle information and navigation information as well as captured images of the surroundings of the host vehicle 2, and these images are visually recognized by the user (usually the driver) through a first opening 3a, a second opening 3b, and a third opening 3c provided in the instrument panel 3 of the host vehicle 2 (FIG. 2). When the driver of the host vehicle 2 operates first and second operation units 100 and 200 provided on the steering wheel 4 of the host vehicle 2, an image corresponding to the operation is displayed on the display means 500. In the present embodiment, the "instrument panel" refers to the entire area in front of the front seats of the host vehicle 2.

  As shown in FIG. 2, the host vehicle 2 has the first opening 3a, the second opening 3b, and the third opening 3c in the instrument panel 3 above the steering wheel 4. As seen from the driver, the first opening 3a is located in the center, the second opening 3b is located on the left side of the first opening 3a, and the third opening 3c is located on the right side of the first opening 3a. A first display image V1 generated by first display means 510 described later is visually recognized through the first opening 3a, a second display image V2 generated by second display means 520 described later is visually recognized through the second opening 3b, and a third display image V3 generated by third display means 530 described later is visually recognized through the third opening 3c. The first to third openings 3a to 3c may be covered with a transparent protective cover on the driver side.

  The vehicle control system unit 10 comprises an ECU (Engine Control Unit) for controlling the vehicle, various sensors, and the like, and outputs vehicle information such as the vehicle speed, engine speed, water temperature, remaining fuel amount, remaining electric energy, travel distance, and gear position to the control unit 300.

  The in-vehicle electronic device 20 is an audio device, a car navigation device, or the like, and includes not only electronic devices fitted into the instrument panel 3 of the host vehicle 2 but also electronic devices detachably placed on the dashboard and electronic devices simply brought into and operated in the host vehicle 2, including high-function mobile phones known as smartphones. The in-vehicle electronic device 20 is electrically connected to the control unit 300 described later and outputs audio information and navigation information to the control unit 300, and the control unit 300 causes the display means 500 to display images including this information. The in-vehicle electronic device 20 also operates according to operation signals from the first and second operation units 100 and 200 (described later) output via the control unit 300.

  The first and second imaging units 30 and 40 each comprise a well-known imaging device such as a CCD together with peripheral circuits and a lens for controlling the imaging device. The first and second imaging units 30 and 40 are installed near the door mirrors of the host vehicle 2, or forward on the left and right side surfaces such as in front of the door mirrors, image the left and right rear of the host vehicle 2, and output the captured images to the image analysis unit 50. In the present embodiment, the first imaging unit 30 is installed on the front left side of the host vehicle 2 and images the left rear of the host vehicle 2, and the second imaging unit 40 is installed on the front right side of the host vehicle 2 and images the right rear of the host vehicle 2.

  The image analysis unit 50 is an ECU having a CPU and memories such as a RAM and a ROM, and analyzes the captured images acquired by the first and second imaging units 30 and 40 and by a driver imaging unit 80 described later using a template matching method or the like. From the left and right rear captured images of the first and second imaging units 30 and 40, it recognizes the type (other vehicle, pedestrian, etc.), size, position, and moving direction of a specific object located at the left or right rear of the host vehicle 2, and from the captured image of the driver imaging unit 80 it recognizes a specific object (the driver's pupils) and its moving direction (gaze direction). The image analysis unit 50 outputs these image analysis results (type, size, position, and moving direction of the specific object) and the captured images to the control unit 300.
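
As a rough illustration of the template-matching analysis described above, the following Python sketch locates a template in a grayscale frame by normalized cross-correlation and estimates a moving direction from two successive detections. It is a minimal, hypothetical example: the function names, the brute-force search, and the direction labels are illustrative assumptions, not part of the patented image analysis unit 50.

```python
import numpy as np

def match_template(frame: np.ndarray, template: np.ndarray):
    """Return (best score, (row, col)) of the normalized cross-correlation peak."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum()) * t_norm
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_score, best_pos

def moving_direction(prev_pos, curr_pos):
    """Crude moving-direction estimate from two successive detections."""
    dr, dc = curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1]
    horizontal = "right" if dc > 0 else "left" if dc < 0 else "none"
    vertical = "down" if dr > 0 else "up" if dr < 0 else "none"
    return horizontal, vertical

# Tiny synthetic example: a diagonal 2x2 pattern shifted one column to the right.
template = np.array([[1.0, 0.0],
                     [0.0, 1.0]])
frame1 = np.zeros((8, 8)); frame1[3, 2] = 1.0; frame1[4, 3] = 1.0
frame2 = np.zeros((8, 8)); frame2[3, 3] = 1.0; frame2[4, 4] = 1.0
_, p1 = match_template(frame1, template)
_, p2 = match_template(frame2, template)
print(p1, p2, moving_direction(p1, p2))   # (3, 2) (3, 3) ('right', 'none')
```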

(Configuration of vehicle display device 1)
The vehicle display device 1 includes a first operation unit 100, a second operation unit 200, a control unit 300, a storage unit 400, and a display unit 500.

  Each of the first and second operation units 100 and 200 includes a contact sensor serving as an operation surface, and is a touch sensor that, under the control of the control unit 300, detects the position at which the driver's thumb or the like touches the operation surface when the driver performs an operation of touching the operation surface (touch operation) or an operation of tracing a predetermined locus on it (gesture operation).

  The control unit 300 includes a CPU and the like, and executes an operation program stored in the storage unit 400 to perform various processes and controls. Specifically, it performs display control of the display means 500, position detection control of operations on the first and second operation units 100 and 200, operation control of the in-vehicle electronic device 20, and the like. At least a part of the control unit 300 may be configured by various dedicated circuits such as an ASIC (Application Specific Integrated Circuit). The control unit 300 includes a display adjustment unit 310 and a selection unit 320, which will be described later, and executes display adjustment processing for adjusting the visibility of the second display image V2 and the third display image V3.

  The storage unit 400 includes a ROM, a RAM, a flash memory, and the like, and functions as a program area that stores the operation program executed by the control unit 300, a work area that temporarily stores the results of arithmetic processing performed by the control unit 300, a data area that temporarily stores the image analysis results and captured images from the image analysis unit 50, and the like.

  FIGS. 3 and 4 are schematic views showing the display means 500: FIG. 3 is a schematic view seen from above, and FIG. 4 is a schematic view seen from the side. The display means 500 comprises first display means 510 that generates the first display image V1 viewed through the first opening 3a of the host vehicle 2, second display means 520 that generates the second display image V2 viewed through the second opening 3b of the host vehicle 2, and third display means 530 that generates the third display image V3 viewed through the third opening 3c of the host vehicle 2. As shown in FIGS. 3 and 4, the first, second, and third display means 510, 520, and 530 include first, second, and third projectors (display units) 511, 521, and 531, first, second, and third screens 512, 522, and 532, and first, second, and third reflecting members (display surfaces) 513, 523, and 533, and first, second, and third partition walls 514, 524, and 534 are provided between the display surfaces of these display means (the first, second, and third reflecting members 513, 523, 533) and the first, second, and third openings 3a, 3b, 3c of the host vehicle 2 (FIGS. 5 and 6). In FIGS. 4 and 5, only the second projector 521, the second screen 522, and the second reflecting member 523 are shown in order to simplify the drawings.

  The first projector 511 emits first display light L1 representing the first display image V1 toward the first reflecting member 513 through the first screen 512, and is, for example, a liquid crystal projector that forms the first display light L1 by transmitting light from a light source through a liquid crystal panel. The first projector 511, together with the first screen 512, functions as a display device of the present invention.

  The first screen 512 is a transmissive planar screen that transmits the first display light L1 emitted from the first projector 511 and displays a real image of the first display image V1, and is formed by providing a louver layer that blocks incident external light, a colored layer, a diffusion layer, a hard coat layer, or the like on a substrate made of a light-transmissive resin material. The first screen 512 is visually recognized by the driver via the first reflecting member 513 through the first opening 3a provided in the instrument panel 3. That is, the display light L1 representing the first display image V1 transmitted through the first screen 512 is guided to the first reflecting member 513, the display light L1 reflected by the first reflecting member 513 reaches the viewpoint E of the driver, and the virtual image V1 representing the first display image V1 is visually recognized on the extension of the optical path. Note that a three-dimensional screen may be used as the first screen 512.

  The first reflecting member 513 is a concave mirror that magnifies and reflects the display light L1 from the first projector 511 toward the driver, and is formed by depositing a highly light-reflective metal film such as aluminum on the surface of a base material made of a glass material. In the drawings, the first reflecting member 513 is illustrated as planar in order to simplify the drawings. The first reflecting member 513 is disposed in the instrument panel 3 so as to correspond to the first opening 3a (so that its reflecting surface faces the first opening 3a).

  The first partition wall 514 is formed of a light-shielding resin material or the like, and as shown in FIG. 5 is a cylindrical member extending from the first opening 3a of the host vehicle 2 toward the first reflecting member 513, with a display hole 514a provided on the optical path of the display light L1 from the first screen 512 to the first reflecting member 513. As shown in FIG. 6, the first, second, and third partition walls 514, 524, and 534 are formed so as to surround the respective display surfaces (the first, second, and third reflecting members 513, 523, 533) of the first, second, and third display means 510, 520, and 530 so that each is not affected by display light from the adjacent display means. The second and third projectors 521 and 531, the second and third screens 522 and 532, the second and third reflecting members 523 and 533, and the second and third partition walls 524 and 534 of the second display means 520 and the third display means 530 are the same as the first projector 511, the first screen 512, the first reflecting member 513, and the first partition wall 514 of the first display means 510 described above, except that the optical path lengths of the display light differ, and their description is therefore omitted.

FIG. 7 shows a display example of the display means 500.
The first display image (virtual image V1) includes a high-priority driving assistance image containing vehicle information and navigation information. In FIG. 7, the first display image V1 shows the vehicle speed as vehicle information and route guidance to the destination as navigation information. In addition, as the first display image V1, audio information, a menu screen, and peripheral captured images of the front and rear (blind spot portions) of the host vehicle 2 may be displayed, switched appropriately with the driving support information.
The second display image (virtual image V2) includes the left rear captured image obtained by the first imaging unit 30 imaging the left rear of the surroundings of the host vehicle 2. The left rear captured image is a mirror-reversed image, like the image reflected in a door mirror or fender mirror. In the present embodiment, the left rear captured image is always displayed.
The third display image (virtual image V3) includes the right rear captured image obtained by the second imaging unit 40 imaging the right rear of the surroundings of the host vehicle 2. The right rear captured image is likewise a mirror-reversed image, like the image reflected in a door mirror or fender mirror. In the present embodiment, the right rear captured image is always displayed.

FIG. 8 shows the relationship between the imaging positions of the virtual images V1 to V3 and the driver's line of sight. FIG. 8(a) is a plan view of the host vehicle 2 seen from above, and FIG. 8(b) is a side view of the host vehicle 2.
Among the virtual images V1 to V3, the virtual image V1 of the first display image, located in the center as seen from the driver, is formed so as to be positioned rearward (nearer to the driver) of the front end of the host vehicle 2. Further, the virtual image V1 is entirely included within the effective visual field angle A1 in the horizontal direction of the host vehicle 2 with the line of sight E11 of the driver looking forward as the center line (specifically, a range of 15 degrees to the left and right of the line of sight E11), and is partly included (in the present embodiment, the upper half, from its upper end to its center) within the effective visual field angle A2 in the vertical direction of the host vehicle 2 with the line of sight E11 as the center line (specifically, a range of 10 degrees above and below the line of sight E11).

  The virtual image V2 of the second display image, located on the left side as seen from the driver among the virtual images V1 to V3, is formed so as to be positioned farther away than the imaging position of the virtual image V1 of the first display image and rearward (nearer to the driver) of the front end of the host vehicle 2. Further, the virtual image V2 is partly included (in the present embodiment, its right half, from its right end to its center) within the effective visual field angle A1 in the horizontal direction with the driver's line of sight E11 as the center line, and its upper half (from its upper end to its center) is included within the effective visual field angle A2 in the vertical direction with the line of sight E11 as the center line. Note that, in order to set the imaging position of the virtual image V2 farther away than the imaging position of the virtual image V1, the optical path length of the second display light L2 is made longer than the optical path length of the first display light L1. Further, in order to set the imaging position of the virtual image V2 to the left of the imaging position of the virtual image V1 as seen from the driver, the second reflecting member 523 is inclined by a predetermined angle (15° in the present embodiment) to the left, as seen from the driver, with respect to the driver's line of sight E11.

  The virtual image V3 of the third display image, located on the right side as seen from the driver among the virtual images V1 to V3, is formed so as to be positioned farther away than the imaging position of the virtual image V1 of the first display image, on an extension line S1 of the right end surface of the host vehicle 2, and rearward (nearer to the driver) of the front end of the host vehicle 2. Further, the virtual image V3 is partly included (in the present embodiment, its left half, from its left end to its center) within the effective visual field angle A1 in the horizontal direction with the driver's line of sight E11 as the center line, and its upper half (from its upper end to its center) is included within the effective visual field angle A2 in the vertical direction with the line of sight E11 as the center line. Note that, in order to set the imaging position of the virtual image V3 farther away than the imaging position of the virtual image V1, the optical path length of the third display light L3 is made longer than the optical path length of the first display light L1. Further, in order to set the imaging position of the virtual image V3 to the right of the imaging position of the virtual image V1 as seen from the driver, the third reflecting member 533 is inclined by a predetermined angle (15° in the present embodiment) to the right, as seen from the driver, with respect to the driver's line of sight E11.

  When the imaging positions of the virtual images V1 to V3 are determined as described above, as shown in FIG. 7, the effective visual field A with the driver's line of sight E11 as the center line includes approximately the upper half of the virtual image V1, approximately the upper-right quarter of the virtual image V2, and approximately the upper-left quarter of the virtual image V3. The effective visual field A is a region in which the observer can receive information instantaneously by eye movement alone. From the line of sight E11 (when looking forward), the driver can, with eye movement alone and using the virtual image V1 of the first display image (driving support image) as a reference, visually recognize a part of the virtual image V2 of the second display image (left rear captured image) to the left and a part of the virtual image V3 of the third display image (right rear captured image) to the right. The driver can thus intuitively grasp the directions in which the second and third display images V2 and V3 were captured from the directions of the virtual images V2 and V3 relative to the virtual image V1, using eye movement alone and without a large movement of the line of sight. However, when the observer views the virtual image V1 of the first display image (driving support image), the virtual image V2 of the second display image (left rear captured image) and the virtual image V3 of the third display image (right rear captured image) are also included within the effective visual field A, and there is a risk that attention to each display image may be reduced. In the present invention, this reduction in attention is suppressed by adjusting the visibility of the display images displayed by the display means 500 through the display adjustment processing described below.
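
To make the field-of-view numbers concrete, the sketch below checks whether a point on a virtual image lies within an effective visual field assumed to span 15° to the left and right and 10° above and below the forward line of sight E11, the ranges given in this description. The eye-centered coordinate convention and the example distances are assumptions for illustration only.

```python
import math

# Assumed half-angles of the effective visual field, taken from the description.
H_HALF_ANGLE_DEG = 15.0   # horizontal: 15 degrees left and right of the line of sight
V_HALF_ANGLE_DEG = 10.0   # vertical: 10 degrees above and below the line of sight

def in_effective_field(x: float, y: float, z: float) -> bool:
    """True if the point (x right, y up, z forward of the eye, in meters)
    lies within the effective visual field about the forward line of sight."""
    if z <= 0:
        return False  # behind the eye point
    yaw = math.degrees(math.atan2(x, z))    # horizontal angle from the line of sight
    pitch = math.degrees(math.atan2(y, z))  # vertical angle from the line of sight
    return abs(yaw) <= H_HALF_ANGLE_DEG and abs(pitch) <= V_HALF_ANGLE_DEG

# Example: a virtual-image point 2.5 m ahead, 0.6 m to the left, 0.3 m up.
print(in_effective_field(-0.6, 0.3, 2.5))  # about 13.5° left, 6.8° up -> True
```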

(Display adjustment processing)
The display adjustment unit 310 adjusts the luminance and brightness of the second and third display images V2 and V3 by driving and controlling the second and third projectors (display units) 521 and 531, thereby adjusting the visibility of the second and third display images V2 and V3 relative to the first display image V1 (display adjustment processing). The display adjustment unit 310 adjusts each display image between a low-visibility state L (V2a, V3a in FIG. 9) with low luminance and brightness and therefore low visibility, and a high-visibility state H (V2b, V3b in FIG. 9) with high luminance and brightness and therefore high visibility. As shown in FIG. 9, in response to selection by the selection unit 320 described later, the display adjustment unit 310 switches among a first display mode in which both the second and third display images V2 and V3 are set to the low-visibility state L (FIG. 9(a)), a second display mode in which one of the second and third display images V2 and V3 is in the low-visibility state L and the other is in the high-visibility state H (FIG. 9(b)), and a third display mode in which both the second and third display images V2 and V3 are set to the high-visibility state H (FIG. 9(c)). While the host vehicle 2 is traveling, the display adjustment unit 310 is in the first display mode, in which both the second and third display images V2 and V3 are in the low-visibility state L, and transitions to the second display mode or the third display mode in response to a selection from the selection unit 320 made while the host vehicle 2 is traveling. This suppresses a reduction in attention to the first display image V1, which has a high viewing priority, while the host vehicle 2 is traveling.
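
The three display modes reduce to a small mapping from mode to the (V2, V3) visibility pair. The following sketch is a hypothetical encoding of that mapping; the enum names and the high_side parameter are illustrative assumptions.

```python
from enum import Enum

class Visibility(Enum):
    LOW = "L"    # low luminance / low brightness
    HIGH = "H"   # high luminance / high brightness

class DisplayMode(Enum):
    FIRST = 1    # both V2 and V3 in the low-visibility state (default while traveling)
    SECOND = 2   # one of V2/V3 raised, the other kept low
    THIRD = 3    # both V2 and V3 raised (e.g. while the vehicle is stopped)

def mode_visibility(mode, high_side=None):
    """Return the (V2, V3) visibility pair for a display mode.
    high_side selects which image is raised in the second mode ('left' or 'right')."""
    if mode is DisplayMode.FIRST:
        return Visibility.LOW, Visibility.LOW
    if mode is DisplayMode.THIRD:
        return Visibility.HIGH, Visibility.HIGH
    # Second display mode: one side raised, the other kept low.
    if high_side == "left":
        return Visibility.HIGH, Visibility.LOW
    return Visibility.LOW, Visibility.HIGH

print(mode_visibility(DisplayMode.SECOND, "left"))  # (HIGH, LOW)
```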

  The selection unit 320 receives various information signals from the image analysis unit 50, the steering angle detection unit 60, the direction instruction unit 70, the first operation unit 100, and the second operation unit 200. Based on these input signals, it selects the display (the second projector 521 or the third projector 531) to be adjusted to the high-visibility state H and outputs a selection signal F based on the selection result to the display adjustment unit 310, whereupon the display adjustment unit 310 adjusts the target display image to the high-visibility state H. Likewise, based on the input signals from the image analysis unit 50, the steering angle detection unit 60, the direction instruction unit 70, the first operation unit 100, and the second operation unit 200, the selection unit 320 selects the display (the second projector 521 or the third projector 531) to be adjusted to the low-visibility state L and outputs a selection cancellation signal D based on the selection result to the display adjustment unit 310, whereupon the display adjustment unit 310 adjusts the target display image to the low-visibility state L. Specific examples of the control processing of the selection unit 320 are described below.

  The selection unit 320 receives, via the image analysis unit 50, object detection information (the type, size, position, moving direction, etc. of a specific object) that is the image analysis result of the first imaging unit 30 and the second imaging unit 40. When a specific object (another vehicle, a pedestrian, or the like) is detected from the image analysis result of the first imaging unit 30 or the second imaging unit 40, the selection unit 320 selects the display corresponding to the imaging unit (the first imaging unit 30 or the second imaging unit 40) in which the specific object was detected and outputs the selection signal F to the display adjustment unit 310, thereby raising the visibility of the target display image (transitioning it to the high-visibility state H). When the specific object is no longer detected in the captured image, the selection unit 320 outputs the selection cancellation signal D to the display adjustment unit 310, thereby lowering the visibility of the target display image again (transitioning it to the low-visibility state L).

  The selection unit 320 also receives, via the image analysis unit 50, gaze direction information (the driver's gaze direction) that is the image analysis result of the driver imaging unit 80. When it is detected from the image analysis result of the driver imaging unit 80 that the driver's gaze has moved toward the second opening 3b or the third opening 3c, the selection unit 320 selects the display in the direction in which the driver's gaze moved and outputs the selection signal F to the display adjustment unit 310, thereby raising the visibility of the target display image (transitioning it to the high-visibility state H). When the driver's gaze moves forward (toward the center), away from the second opening 3b and the third opening 3c, the selection unit 320 outputs the selection cancellation signal D to the display adjustment unit 310, thereby lowering the visibility of the target display image again (transitioning it to the low-visibility state L).

  Further, the selection unit 320 receives steering angle information of the steering wheel 4 of the host vehicle 2 from the steering angle detection unit 60. It selects the display in the direction in which the steering wheel 4 is steered based on this steering angle information and outputs the selection signal F to the display adjustment unit 310, thereby raising the visibility of the target display image (transitioning it to the high-visibility state H). When no left or right steering angle of the steering wheel 4 is detected by the steering angle detection unit 60, the selection unit 320 outputs the selection cancellation signal D to the display adjustment unit 310, thereby lowering the visibility of the target display image again (transitioning it to the low-visibility state L).

  Further, the selection unit 320 receives operation information of the turn signals of the host vehicle 2 from the direction instruction unit 70. It selects the display in the direction in which the turn signal is instructed to blink based on this operation information and outputs the selection signal F to the display adjustment unit 310, thereby raising the visibility of the target display image (transitioning it to the high-visibility state H). When no left or right turn signal operation information is detected from the direction instruction unit 70, the selection unit 320 outputs the selection cancellation signal D to the display adjustment unit 310, thereby lowering the visibility of the target display image again (transitioning it to the low-visibility state L).

  The selection unit 320 also receives operation information from the first operation unit 100 or the second operation unit 200. It selects the display in the direction indicated by the operation information and outputs the selection signal F to the display adjustment unit 310, thereby raising the visibility of the target display image (transitioning it to the high-visibility state H). When a release operation is performed on the first operation unit 100 or the second operation unit 200, the selection unit 320 outputs the selection cancellation signal D to the display adjustment unit 310, thereby lowering the visibility of the target display image again (transitioning it to the low-visibility state L).
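
Each of the selection examples above ends the same way: a side (left or right) is chosen and either the selection signal F or the selection cancellation signal D is emitted. The sketch below condenses that logic into one hypothetical function; the argument names, the checking order, and the steering dead band are assumptions, since the description does not state a priority among the trigger sources.

```python
def select_display(turn_signal=None, steering_angle_deg=0.0,
                   detected_side=None, gaze=None, operator_side=None,
                   deadband_deg=5.0):
    """Return ('F', side) to raise one captured image, or ('D', None) to deselect.
    Each argument mirrors one trigger source in the description:
    turn_signal / gaze / detected_side / operator_side take 'left' or 'right'."""
    # The ordering here is an illustrative assumption, not a claimed priority.
    for side in (operator_side, detected_side, gaze, turn_signal):
        if side in ("left", "right"):
            return "F", side
    if steering_angle_deg <= -deadband_deg:
        return "F", "left"
    if steering_angle_deg >= deadband_deg:
        return "F", "right"
    return "D", None   # no trigger active: selection-cancellation signal D

print(select_display(turn_signal="right"))       # ('F', 'right')
print(select_display(steering_angle_deg=-12.0))  # ('F', 'left')
print(select_display())                          # ('D', None)
```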

  As described above, the display adjustment unit 310 raises the visibility of the display image displayed by the display selected by the selection signal F (transitions it to the high-visibility state H), and lowers the visibility of a display image deselected by the selection cancellation signal D (transitions it to the low-visibility state L). The display adjustment unit 310 also has a timer function for measuring time, and after a display image has been set to the high-visibility state H and a predetermined first duration TA has elapsed, it transitions that display image to the low-visibility state L. With this configuration, the visibility of the display image selected by the selection unit 320 is temporarily raised, so that the display image of the selected display (the second display image V2 or the third display image V3) can be reliably presented to the driver, and once the driver has confirmed that display image and returns attention to the first display image V1, the visibility of the target display image is automatically lowered. In this way, by using the timer function to return a display image (the second or third display image V2 or V3) in the high-visibility state H to the low-visibility state L after a predetermined period (the first duration TA) has elapsed, the display adjustment unit 310 suppresses a reduction in attention to the first display image V1, which has a high viewing priority.

  In addition, the display adjustment unit 310 can change the duration for which a display image that has entered the high-visibility state H is held before being returned to the low-visibility state L. When a display image that has transitioned from the high-visibility state H to the low-visibility state L is selected again by the selection unit 320 within a predetermined period T0, the display adjustment unit 310 sets the duration of the high-visibility state H to a second duration TB that is longer than the first duration TA. This processing is described specifically with reference to FIG. 10.

  FIG. 10 shows whether the second display image V2 displayed on the display (second projector 521) is in the high-visibility state H or the low-visibility state L as the selection unit 320 outputs the selection signal F and the selection cancellation signal D. First, when the selection unit 320 selects the second projector 521 (second display image V2) at time t11, the display adjustment unit 310 adjusts the second display image V2 to the high-visibility state H. Thereafter, even if the selection cancellation signal D is input from the selection unit 320, the display adjustment unit 310 holds the second display image V2 in the high-visibility state H for the first duration TA, and transitions the second display image V2 to the low-visibility state L at time t12. When the second projector 521 is selected by the selection unit 320 within the predetermined period T0 after the second display image V2 entered the low-visibility state L, the display adjustment unit 310 sets the second display image V2 to the high-visibility state H and holds it there for the second duration TB, which is longer than the first duration TA (time t13). On the other hand, when the second projector 521 is selected by the selection unit 320 after the predetermined period T0 has elapsed since the second display image V2 entered the low-visibility state L (time t15), the second display image V2 is held in the high-visibility state H for the first duration TA.
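
The timing of FIG. 10 can be modeled as a small state machine: a selection raises the image to the high-visibility state H, the state is held even after the selection cancellation signal D, and a re-selection within the period T0 of the drop back to the low-visibility state L extends the hold from TA to TB. The sketch below is an illustrative model; the numeric durations are placeholders, not values from the patent.

```python
class VisibilityTimer:
    """Illustrative model of the display-adjustment timer (durations in seconds)."""
    def __init__(self, ta=3.0, tb=6.0, t0=5.0):
        self.ta, self.tb, self.t0 = ta, tb, t0
        self.state = "L"           # 'L' = low visibility, 'H' = high visibility
        self.hold_until = None     # time at which the high state expires
        self.last_drop = None      # time the image last returned to the low state

    def on_select(self, now):
        """Selection signal F: raise visibility; extend the hold if re-selected
        within T0 of the last drop to the low-visibility state."""
        duration = self.tb if (self.last_drop is not None
                               and now - self.last_drop <= self.t0) else self.ta
        self.state, self.hold_until = "H", now + duration

    def tick(self, now):
        """Advance time; the high state persists until its hold time expires,
        even if a selection-cancellation signal D has been received."""
        if self.state == "H" and now >= self.hold_until:
            self.state, self.last_drop = "L", now
        return self.state

timer = VisibilityTimer()
timer.on_select(0.0);  print(timer.tick(2.9))   # 'H' (still within TA)
print(timer.tick(3.1))                          # 'L' (TA elapsed)
timer.on_select(6.0);  print(timer.tick(11.5))  # 'H' (re-selected within T0 -> TB)
print(timer.tick(12.1))                         # 'L' (TB elapsed)
```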

In this way, by setting the target display image to the high-visibility state H immediately after the selection signal F is output by the selection unit 320, the display image required by the driver can be provided quickly.
Further, even if the selection cancellation signal D is input from the selection unit 320, the high-visibility state H is continued for a predetermined period (the first duration TA or the second duration TB), so that the annoyance the driver would feel from the target display image frequently switching between the high-visibility state H and the low-visibility state L can be reduced.
In addition, since the duration of the high-visibility state H can be lengthened, a display image that is frequently selected by the selection unit 320 and can therefore be considered to have a high viewing priority can be viewed for a long time, and the information of that high-priority display image can be reliably conveyed to the driver.

  Further, the selection unit 320 receives vehicle information such as the vehicle speed and the shift position from the vehicle control system unit 10, and outputs the selection signal F when it determines from the vehicle information that the host vehicle 2 is not traveling, whereupon the display adjustment unit 310 transitions to the third display mode in which both the second and third display images V2 and V3 are set to the high-visibility state H.
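
A minimal sketch of this stopped-vehicle behavior is shown below; the speed threshold and shift-position names used to judge "not traveling" are assumptions, since the description names the inputs (vehicle speed, shift position) but not the criteria.

```python
def is_not_traveling(vehicle_speed_kmh, shift_position):
    """Judge 'not traveling' from the vehicle information named in the description.
    Threshold and shift names are illustrative assumptions."""
    return shift_position in ("P", "N") or vehicle_speed_kmh < 1.0

def captured_image_visibility(vehicle_speed_kmh, shift_position):
    """Both the left and right rear captured images go to the high-visibility state H
    when the host vehicle is judged not to be traveling (third display mode)."""
    high = is_not_traveling(vehicle_speed_kmh, shift_position)
    return ("H", "H") if high else ("L", "L")

print(captured_image_visibility(0.0, "P"))   # ('H', 'H')
print(captured_image_visibility(60.0, "D"))  # ('L', 'L')
```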

In the third display mode, entered when the host vehicle 2 stops, the control unit 300 superimposes on the captured image, as shown in FIG. 11, a frame image P surrounding the specific object and an icon image Q1 or Q2 arranged outside the frame image P, in accordance with the image analysis result (the type, size, position, moving direction, etc. of the specific object) output from the image analysis unit 50 for the captured images of the first imaging unit 30 and the second imaging unit 40.
The icon images Q1 and Q2 are character images that differ according to the type of the specific object: when the specific object is a car, the icon image is a character image imitating a car (icon image Q1), and when the specific object is a person, it is a character image imitating a person (icon image Q2). The image analysis unit 50 determines the moving direction of the specific object from the temporal change of the captured images captured by the first imaging unit 30 and the second imaging unit 40, and the control unit 300 displays the icon images Q1 and Q2 as character images from which the moving direction of the specific object can be inferred, as shown in FIG. 11. The control unit 300 may also change the color of at least one of the frame image P and the icon image Q according to the type of the specific object.
Further, the image analysis unit 50 determines the distance between the specific object and the host vehicle 2 from the temporal change in size of the specific object in the captured images captured by the first imaging unit 30 and the second imaging unit 40, and when the specific object comes sufficiently close to the host vehicle 2, the color of at least one of the frame image P and the icon image Q may be changed or made to blink. Thus, when the host vehicle 2 stops, both the second and third display images V2 and V3 are in the high-visibility state H, and by displaying the frame image P and the icon image Q based on the detection of the specific object by the first imaging unit 30 and the second imaging unit 40, the specific object becomes easier to recognize and the driver can be made to perceive the differences in color and brightness (blinking) more impressively.
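
The overlay behavior in the stopped-vehicle mode can be sketched as a function that picks the frame image P and the icon image Q from the analysis result. The colors, the pixel bounding box, and the proximity threshold below are illustrative assumptions; only the car/person icon distinction and the change-or-blink-when-close behavior come from the description.

```python
def build_overlay(obj_type, bbox, distance_m, heading, near_threshold_m=5.0):
    """Return a description of the frame image P and icon image Q1/Q2 to superimpose.
    obj_type: 'car' or 'person'; bbox: (x, y, w, h) of the detected object;
    heading: estimated moving direction of the object."""
    icon = {"car": "Q1_car_icon", "person": "Q2_person_icon"}.get(obj_type, "Q_generic")
    color = {"car": "yellow", "person": "orange"}.get(obj_type, "white")
    near = distance_m <= near_threshold_m
    return {
        "frame_P": {"bbox": bbox, "color": "red" if near else color},
        "icon_Q": {"name": icon, "anchor": "outside_frame", "heading": heading},
        "blink": near,   # blink (or change color) when the object is close to the vehicle
    }

print(build_overlay("person", (120, 40, 60, 140), 3.2, "toward_vehicle"))
```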

  As described above, the vehicle display device 1 of the present invention is a vehicle display device that is arranged in the instrument panel 3 of the host vehicle 2 and displays to the observer E a driving support image V1 including at least vehicle information or navigation information, a left rear captured image V2 that is located on the left side of the driving support image V1 and captures the left rear of the vehicle, and a right rear captured image V3 that is located on the right side of the driving support image V1 and captures the right rear of the vehicle. When the observer E looks in the front direction of the host vehicle 2 and all of the driving support image V1 is included within the horizontal effective viewing angle A of the observer E, at least a part of each of the left rear captured image V2 and the right rear captured image V3 is also included within the horizontal effective viewing angle A of the observer E. The device includes the display adjustment means 310, which, while the host vehicle 2 is traveling, makes the visibility of both the left rear captured image V2 and the right rear captured image V3 lower than the visibility of the driving support image V1 within a visible range, and which raises the visibility of at least one of the left rear captured image V2 and the right rear captured image V3 when a specific condition is satisfied. With this configuration, the displayed images can be recognized intuitively without a large viewpoint movement, and a reduction in attention to the images can be suppressed.

  Further, in the vehicle display device 1 of the present invention, the display adjustment means 310 raises the visibility of the left rear captured image V2 or the right rear captured image V3 and continues that visibility for a predetermined duration TA. With this configuration, the driver can reliably recognize the target captured image (the left rear captured image V2 or the right rear captured image V3) when its viewing priority is estimated to be high.

  The present invention is not limited to the above embodiment and drawings, and changes (including deletion of components) can be made as appropriate without departing from the gist of the present invention. Examples of modifications are given below.

(Modification)
In the above embodiment, the displays that display the first display image (driving support image) V1 shown at the center in front of the driver's line of sight, the second display image (left rear captured image) V2 shown on the left side in front of the driver's line of sight, and the third display image (right rear captured image) V3 shown on the right side in front of the driver's line of sight were projectors, and the virtual images reflected by the reflecting members were visually recognized by the driver. Instead, the first, second, and third display images V1, V2, and V3 may be displayed on liquid crystal displays or the like and viewed directly by the driver.

  In the above embodiment, while the host vehicle 2 is traveling, the display adjustment unit 310 displays the left rear captured image as the second display image V2 and the right rear captured image as the third display image V3; however, only one of the left rear captured image and the right rear captured image may be selected and displayed while the host vehicle 2 is traveling. For example, the driver may select the captured image to be displayed (either the left rear captured image or the right rear captured image) by operating the first operation unit 100 or the second operation unit 200; the selected captured image is then displayed while the host vehicle 2 is traveling, the unselected captured image is not displayed, and driving assistance information different from the captured image is displayed in its display area. The display adjustment unit 310 can change the unselected captured image from the non-display state to the display state in accordance with the selection signal F from the selection unit 320, and can further transition it to the low-visibility state L or the high-visibility state H. In addition, when the selection cancellation signal D is input from the selection unit 320 for the captured image that was not selected by the driver, or when a certain period of time has elapsed under the timer function, the display adjustment unit 310 can change that captured image from the display state back to the non-display state. With this configuration, unnecessary images among those that enter the field of view when the first display image V1 with a high viewing priority is viewed while the host vehicle 2 is traveling (the images shown as the second display image V2 and the third display image V3) are not displayed, so that a reduction in attention to the first display image V1 with a high viewing priority can be suppressed.

  The display adjustment unit 310 switches the second display image V2 and the third display image V3 between the high-visibility state H with high visibility and the low-visibility state L with low visibility, but it may also have an intermediate visibility state M between the high-visibility state H and the low-visibility state L. When switching from the high-visibility state H to the low-visibility state L, or from the low-visibility state L to the high-visibility state H, the visibility may be changed stepwise via the intermediate visibility state M. In particular, when switching from the high-visibility state H to the low-visibility state L, changing the visibility of the target display image stepwise with elapsed time even while the driver is viewing it allows the driver to recognize that the visibility of that display image is being lowered, and raises attention to the target display image until the low-visibility state L is reached. When switching the target display image from the low-visibility state L to the high-visibility state H, however, it is desirable to switch quickly from the low-visibility state L to the high-visibility state H, as in the above embodiment, so that the driver promptly recognizes the target display image.
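
The asymmetric transition suggested here (gradual when dimming, immediate when brightening) can be sketched as a short brightness ramp; the number of intermediate steps and the brightness values are assumptions.

```python
def dim_steps(high=1.0, low=0.2, steps=3):
    """Brightness levels passed through when dropping H -> M -> ... -> L.
    Dropping is gradual so the driver notices the change; raising is immediate."""
    delta = (high - low) / steps
    return [round(high - delta * i, 3) for i in range(1, steps + 1)]

def next_brightness(current, target, going_up):
    """One scheduler tick: snap up to the high state, step down toward the low state."""
    if going_up:
        return target                      # L -> H switches at once
    ramp = dim_steps()
    lower = [b for b in ramp if b < current - 1e-9]
    return lower[0] if lower else target   # take the next step down, then settle at L

print(dim_steps())                                 # [0.733, 0.467, 0.2]
print(next_brightness(1.0, 0.2, going_up=False))   # 0.733 (first intermediate step)
```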

When the selection unit 320 selects either the second display image V2 or the third display image V3 to be placed in the high-visibility state H, the display adjustment unit 310 may cause the other image to follow and also enter the high-visibility state H (that is, enter the third display mode). With this configuration, the surroundings of the host vehicle 2 can be checked quickly. Specifically, for example, when the selection unit 320 detects an object (another vehicle, a pedestrian, or the like) around the host vehicle 2 from the first imaging unit 30 and the second imaging unit 40 and the object approaches the host vehicle 2 to within a predetermined distance or approaches it rapidly, it may determine that attention to the surroundings is required and, via the display adjustment unit 310, enter the third display mode in which the visibility of both the second display image V2 and the third display image V3 is raised. Likewise, when information that changes the course of the host vehicle 2 is input, such as steering angle information from the steering angle detection unit 60 or turn signal operation information from the direction instruction unit 70, the selection unit 320 may determine that attention to the surroundings is required and, via the display adjustment unit 310, enter the third display mode in which the visibility of both the second display image V2 and the third display image V3 is raised.
In addition, the other display image that is made to follow the selection by the selection unit 320 may instead be placed in the intermediate visibility state M. With this configuration, the surroundings of the host vehicle 2 can be checked quickly, and the driver can intuitively recognize which of the left and right display images should be viewed with priority.

  Further, in the above embodiment, when a display image that has transitioned from the high-visibility state H to the low-visibility state L is selected again by the selection unit 320 within the predetermined period T0, the duration of the high-visibility state H is set to the second duration TB, which is longer than the first duration TA; however, as shown in FIG. 12(a), when the target display image is selected again by the selection unit 320 within the predetermined period T0 (time t23), the duration of the high-visibility state H may instead be set to a second duration TC that is shorter than the first duration TA. With this configuration, the time spent gazing at the captured image can be shortened, and a reduction in attention to the driving assistance image V1, which is considered to have a high viewing priority, can be suppressed. If the captured image is still frequently viewed after the high-visibility state H has been continued for the second duration TC, the duration of the high-visibility state H may be shortened further, and may even be set to zero (so that no transition to the high-visibility state H is made).
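
The policy difference between the main embodiment (lengthen to TB on re-selection within T0) and the FIG. 12(a) modification (shorten to TC, eventually to zero) can be captured in one small function; the durations and the halving rule for repeated selections are placeholder assumptions.

```python
def hold_duration(reselect_count, ta=3.0, tb=6.0, tc=1.5, lengthen=False):
    """Hold time of the high-visibility state H (seconds; placeholder values).
    reselect_count: how many times the image was re-selected within T0.
    lengthen=True  -> main embodiment: TA on first selection, TB afterwards.
    lengthen=False -> FIG. 12(a) modification: TA, then TC, shrinking toward zero."""
    if reselect_count == 0:
        return ta
    if lengthen:
        return tb
    # Each further re-selection within T0 halves the shortened hold, down to zero.
    shortened = tc / (2 ** (reselect_count - 1))
    return shortened if shortened > 0.2 else 0.0

print(hold_duration(0))                 # 3.0  (first selection: TA)
print(hold_duration(1))                 # 1.5  (re-selected within T0: TC)
print(hold_duration(3))                 # 0.375
print(hold_duration(1, lengthen=True))  # 6.0  (main embodiment: TB)
```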

  Further, in the above embodiment, the display adjustment unit 310 determines whether to change the duration of the high-visibility state H based on whether the display image is selected again by the selection unit 320 within the predetermined period T0 measured from the time at which the high-visibility state H transitions to the low-visibility state L; however, as shown in FIG. 12(b), the duration of the high-visibility state H may be changed to the second duration TB, which is longer than the first duration TA, when the display image is selected again by the selection unit 320 within the predetermined period T0 measured from the time (time t33) at which the low-visibility state L transitions to the high-visibility state H.

  In addition, when the selection signal F from the selection unit 320 continues for a predetermined period, the display adjustment unit 310 may change the duration of the high-visibility state H to the second duration TB, which is longer than the first duration TA.

  The present invention is applicable to a vehicle display device mounted on a vehicle.

DESCRIPTION OF SYMBOLS
1 Vehicle display device
2 Host vehicle
3 Instrument panel
10 Vehicle control system unit
20 In-vehicle electronic device
30 First imaging unit
40 Second imaging unit
100 First operation unit
200 Second operation unit
300 Control unit
400 Storage unit
500 Display means
510 First display means
511 First projector
512 First screen
513 First reflecting member
514 First partition wall
520 Second display means
521 Second projector
522 Second screen
523 Second reflecting member
524 Second partition wall
530 Third display means
531 Third projector
532 Third screen
533 Third reflecting member
534 Third partition wall
L1 First display light
L2 Second display light
L3 Third display light
V1 Virtual image (first display image)
V2 Virtual image (second display image)
V3 Virtual image (third display image)

Claims (4)

  1. A vehicle display device that is arranged in an instrument panel of a vehicle and displays to an observer a driving assistance image including at least vehicle information or navigation information, a left rear captured image that is located on the left side of the driving assistance image and captures the left rear of the vehicle, and a right rear captured image that is located on the right side of the driving assistance image and captures the right rear of the vehicle, wherein
    all of the driving assistance image is included within a range of 15° to the left and right of a center line given by the observer's line of sight facing the front of the vehicle, and at least a part of each of the left rear captured image and the right rear captured image is also included within that range, and
    the vehicle display device comprises display adjusting means that, while the vehicle is traveling, makes the visibility of both the left rear captured image and the right rear captured image lower than the visibility of the driving assistance image within a visible range, and that, when a specific condition is satisfied, increases the visibility of at least one of the left rear captured image and the right rear captured image, continues the increased visibility for a predetermined duration, and automatically changes that duration according to the timing at which the specific condition is satisfied.
  2. The vehicle display device according to claim 1, wherein the display adjusting means determines that the specific condition is satisfied when any one of turn signal operation information of the vehicle, steering angle information of the steering of the vehicle, object detection information around the vehicle, gaze direction information of the observer, and operation information of the observer operating a predetermined operation unit satisfies a specific condition.
  3. The vehicle display device according to claim 2, wherein the turn signal operation information of the vehicle, the steering angle information of the steering of the vehicle, the object detection information around the vehicle, the gaze direction information of the observer, and the operation information of the observer operating the predetermined operation unit include left-right information relating to left and right, and
    the display adjusting means adjusts the visibility of the left rear captured image or the right rear captured image in accordance with the left-right information.
  4. A vehicle display device that is arranged in an instrument panel of a vehicle and displays to an observer a driving assistance image including at least vehicle information or navigation information, a left rear captured image that is located on the left side of the driving assistance image and captures the left rear of the vehicle, and a right rear captured image that is located on the right side of the driving assistance image and captures the right rear of the vehicle, wherein
    all of the driving assistance image is included within a range of 15° to the left and right of a center line given by the observer's line of sight facing the front of the vehicle, and at least a part of each of the left rear captured image and the right rear captured image is also included within that range, and
    the vehicle display device comprises display adjusting means that, while the vehicle is traveling, makes the visibility of one of the left rear captured image and the right rear captured image lower than the visibility of the driving assistance image within a visible range and does not display the other, and that increases the visibility of at least one of the left rear captured image and the right rear captured image when a specific condition is satisfied.
JP2013106522A 2013-05-20 2013-05-20 Vehicle display device Active JP6413207B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013106522A JP6413207B2 (en) 2013-05-20 2013-05-20 Vehicle display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013106522A JP6413207B2 (en) 2013-05-20 2013-05-20 Vehicle display device

Publications (2)

Publication Number Publication Date
JP2014229997A JP2014229997A (en) 2014-12-08
JP6413207B2 true JP6413207B2 (en) 2018-10-31

Family

ID=52129499

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013106522A Active JP6413207B2 (en) 2013-05-20 2013-05-20 Vehicle display device

Country Status (1)

Country Link
JP (1) JP6413207B2 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4233182B2 (en) * 1999-10-13 2009-03-04 本田技研工業株式会社 Vehicle rear view device
JP2002254985A (en) * 2001-02-28 2002-09-11 Auto Network Gijutsu Kenkyusho:Kk Device for visible identification around vehicle
JP2007522981A (en) * 2004-02-20 2007-08-16 シャープ株式会社 Situation detection display system, situation detection display method, situation detection display system control program, and recording medium recording the program
JP4857891B2 (en) * 2006-04-27 2012-01-18 株式会社デンソー Driving support device and program
JP5682304B2 (en) * 2010-12-27 2015-03-11 トヨタ自動車株式会社 Image providing device

Also Published As

Publication number Publication date
JP2014229997A (en) 2014-12-08


Legal Events

Date Code Title Description
2016-03-11 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2017-01-12 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2017-02-07 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2017-04-05 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2017-07-20 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2017-09-07 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2018-03-06 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2018-04-11 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2018-09-04 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2018-09-17 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 6413207; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)