WO2011070640A1 - Vehicle periphery image display system - Google Patents

Vehicle periphery image display system

Info

Publication number: WO2011070640A1
Authority: WO (WIPO, PCT)
Prior art keywords: image, vehicle, hue, vehicle interior, camera
Application number: PCT/JP2009/070488
Other languages: English (en), Japanese (ja)
Inventor: 佐藤 徳行
Original Assignee: クラリオン株式会社 (Clarion Co., Ltd.)
Application filed by クラリオン株式会社 (Clarion Co., Ltd.)
Priority to CN200980162790.0A (CN102714710B)
Priority to PCT/JP2009/070488 (WO2011070640A1)
Priority to EP09852037.2A (EP2512133B1)
Priority to US13/514,575 (US20120249789A1)
Publication of WO2011070640A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with a predetermined field of view
    • B60R1/26 Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with a predetermined field of view to the rear of the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements characterised by the type of image processing
    • B60R2300/304 Details of viewing arrangements characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60 Details of viewing arrangements characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/602 Details of viewing arrangements characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
    • B60R2300/605 Details of viewing arrangements characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint, the adjustment being automatic
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements characterised by the intended use of the viewing arrangement
    • B60R2300/802 Details of viewing arrangements characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views

Definitions

  • the present invention relates to a vehicle periphery image display system that displays a vehicle periphery image including a blind spot area on a monitor screen in a vehicle based on a camera image acquired from an in-vehicle blind spot camera.
  • the side view monitor system currently in practical use has a side camera (a CCD camera or the like) set inside the side mirror, and the actual camera image from the side camera is displayed on the monitor screen of the front display unit that is also used for the navigation system. In other words, the front side portion of the vehicle that becomes the driver's blind spot is displayed on the monitor screen, so that the driver can recognize the situation in that blind spot area.
  • however, since the side camera is placed inside the side mirror, there is a large parallax between the camera viewpoint and the driver viewpoint, and the shape of obstacles and other objects in the camera image is completely different from the shape seen from the driver's seat.
  • in a previously proposed system, a converted external image is generated by converting a camera image obtained by a blind spot camera provided outside the vehicle body into a virtual camera image viewed from the viewpoint position of the driver.
  • a visual recognition area image excluding the blind spot area is generated from a camera video obtained by a driver viewpoint camera provided near the driver's viewpoint position.
  • a vehicle periphery image display system has been proposed that obtains a composite image by combining the converted external image with this visual recognition area image.
  • in that system, the viewpoint of the rear camera attached to the trunk outside the vehicle is converted by image conversion into an image viewed rearward from the driver's viewpoint.
  • in the composited rear view image, the portion that can be seen through the window is a live image (raw image) from the interior camera, and the blind spot portion that cannot be captured by that camera is obtained by superimposing the image from the external camera by image processing.
  • the boundary line for cutting out the image is matched with the window frame of the vehicle shape, and the edge portion of the window frame or the like is superimposed as a thick frame-shaped overlay.
  • the conventional vehicle periphery image display system has the following problems.
  • the applicant previously proposed a vehicle periphery image display system that converts the viewpoint of the actual camera image input from the vehicle-mounted blind spot camera into a virtual camera image viewed from the viewpoint position of the driver, superimposes a vehicle interior image that has been made semi-transparent on the virtual camera image, and displays the virtual camera image so that it shows through the semi-transparent vehicle interior image (Japanese Patent Application No. 2008-39395, filed February 20, 2008).
  • however, when the translucent interior image is superimposed on the virtual camera image, if the brightness and hue of the virtual camera image and of the translucent interior image are close to each other, the outline of the vehicle and the colors of the passenger compartment blend into the subtle colors of the external image, and the semi-transparent vehicle interior image becomes entirely whitish and washed out, so that its visibility is impaired.
  • an object of the present invention is to provide a vehicle periphery image display system in which the external situation that is a blind spot from the driver can be clearly seen through in its positional relationship with the host vehicle, regardless of the external environment conditions.
  • to achieve this object, the vehicle periphery image display system of the present invention includes an in-vehicle blind spot camera that is attached to the host vehicle and images the periphery of the vehicle, a monitor that is set at a vehicle interior position that can be visually recognized by the driver, and monitor image generation means for generating a display image on the monitor based on the actual camera image input from the in-vehicle blind spot camera.
  • the monitor video generation means includes an image processing unit, an external video color determination unit, a vehicle interior image color automatic adjustment unit, and an image synthesis circuit.
  • the image processing unit converts an actual camera image input from the in-vehicle blind spot camera into a virtual camera image viewed from the viewpoint position of the driver.
  • the external video color determination unit determines the color of the external video from the in-vehicle blind spot camera according to at least one of luminance, hue, saturation, and brightness.
  • the vehicle interior image color automatic adjustment unit automatically adjusts at least one of the luminance, hue, saturation, and brightness of a translucent vehicle interior image, obtained by making the vehicle interior image translucent, based on the color determination result of the external video, so as to improve its visibility against the external video.
  • the image synthesis circuit superimposes the translucent vehicle interior image from the vehicle interior image color automatic adjustment unit on the virtual camera image from the image processing unit, and generates a composite image in which the virtual camera image shows through the translucent vehicle interior image.
  • accordingly, in the image composition circuit, the translucent vehicle interior image from the vehicle interior image color automatic adjustment unit is superimposed on the virtual camera image from the image processing unit, a composite image in which the virtual camera image shows through the translucent vehicle interior image is generated, and this composite image is displayed on the monitor.
  • since the real camera video input from the in-vehicle blind spot camera is converted into a virtual camera image viewed from the viewpoint position of the driver, the driver who sees the composite image displayed on the monitor can intuitively grasp, without parallax, the blind spot portion included in the virtual camera image.
  • in the external video color determination unit, the actual camera video input from the in-vehicle blind spot camera is used as the external video, and the average brightness and hue of the external video are determined. In the vehicle interior image color automatic adjustment unit, based on the determination result of the external video color determination unit, the brightness and hue of the translucent vehicle interior image obtained by making the vehicle interior image translucent are automatically adjusted so as to improve its visibility against the external video. Therefore, a semi-transparent vehicle interior image with improved visibility against the external video is superimposed, and the virtual camera image is displayed on the monitor so that it shows through it.
  • regardless of external environment conditions such as daytime, twilight, and nighttime, the distinction between the virtual camera image and the semi-transparent vehicle interior image remains clear, and the external situation that is a blind spot from the driver, shown by the virtual camera image, can be clearly seen through in its positional relationship with the host vehicle given by the semi-transparent vehicle interior image.
  • as a result, the distinction between the virtual camera image and the semi-transparent vehicle interior image is clarified, and the external situation that is a blind spot from the driver can be clearly seen through in its positional relationship with the host vehicle. A short sketch of this composition is given below.
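The compositing summarized in these paragraphs can be pictured with a short sketch. This is an illustrative outline only, not the patented implementation: it assumes the viewpoint-converted virtual camera image and the pre-captured interior image are available as 8-bit RGB arrays and that a per-pixel opacity map for the interior has already been prepared (see the transmittance-map sketch further below); the function and parameter names are hypothetical.

```python
import numpy as np

def compose_monitor_frame(virtual_cam_img, interior_img, alpha, adjust_interior):
    """Superimpose a semi-transparent interior image on the virtual camera image.

    virtual_cam_img, interior_img: HxWx3 uint8 arrays (viewpoint-converted external
    video and the pre-captured driver-viewpoint interior image).
    alpha: HxW float array in [0, 1]; 1.0 = opaque interior (DE), 0.0 = fully
    transparent (CE), intermediate values = semi-transparent portion (GE).
    adjust_interior: callable that recolors the interior image based on the
    external video (luminance/hue follow-up), e.g. the sketches further below.
    """
    interior = adjust_interior(interior_img, virtual_cam_img).astype(np.float32)
    external = virtual_cam_img.astype(np.float32)
    a = alpha[..., None]                      # broadcast the opacity over RGB channels
    composite = a * interior + (1.0 - a) * external
    return composite.clip(0, 255).astype(np.uint8)
```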
  • FIG. 1 is an overall system block diagram illustrating a see-through side view monitor system A1 (an example of a vehicle periphery image display system) according to the first embodiment (Example 1).
  • FIG. 2 is an explanatory drawing showing acquisition example 1 of the luminance/hue determination data in the see-through side view monitor system A1 of Example 1.
  • FIG. 3 is an explanatory drawing showing acquisition example 2 of the luminance/hue determination data.
  • FIG. 4 is an explanatory drawing showing acquisition example 3 of the luminance/hue determination data.
  • FIG. 5 is a flowchart illustrating the flow of the external video luminance follow-up display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Example 1.
  • FIG. 6 is a flowchart illustrating the flow of the sudden luminance change display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Example 1.
  • FIG. 7 is a flowchart illustrating the flow of the hue conversion display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Example 1.
  • FIG. 8 is a diagram illustrating the hue circle used in the hue conversion display control process of Example 1.
  • FIG. 9 is a flowchart showing the flow of the warning display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Example 1.
  • FIG. 10 is a diagram showing the vehicle interior still image captured in advance from the driver's viewpoint position in the see-through side view monitor system A1 of Example 1.
  • FIG. 11 is a perspective view illustrating a state in which the vehicle body shape is projected onto the road surface from a vehicle on which the see-through side view monitor system A1 of Example 1 is mounted.
  • FIG. 12 is a diagram showing the image (opaque portion) obtained when the projection of the vehicle body shape onto the road surface is seen from the driver viewpoint position.
  • FIG. 13 is a diagram illustrating an image in which the "opaque portion DE" of FIG. 12 is set with respect to the vehicle interior image RP shown in FIG. 10.
  • FIG. 14 is a diagram showing the translucent vehicle interior image RG in which the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE" are set in the vehicle interior image RP shown in FIG. 10.
  • FIG. 15 is an image diagram showing the luminance inversion screen in the see-through side view monitor system A1 of Example 1.
  • FIG. 16 is an overall system block diagram showing a see-through back view monitor system A2 (an example of a vehicle periphery image display system) of Example 2.
  • FIG. 17 is an overall system block diagram showing a see-through front view monitor system A3 (an example of a vehicle periphery image display system) of Example 3.
  • FIG. 18 is a diagram showing the translucent vehicle interior image RG in which the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE" are set for the vehicle interior image RP in the see-through front view monitor system A3 of Example 3.
  • the entire system configuration is composed of a blind spot eliminating camera, a digital image processing unit for processing the video, and a translucent video blending unit.
  • the basic configuration is common to the embodiments, and in each embodiment the configuration is devised in consideration of cost and the like.
  • the first embodiment is a see-through side view monitor system that uses, as the in-vehicle blind spot camera, a side camera for eliminating a blind spot that is built into a side mirror or disposed in the vicinity of the side mirror, and that displays an image of the front side portion of the vehicle which becomes a blind spot of the driver.
  • FIG. 1 is an overall system block diagram illustrating a see-through side view monitor system A1 (an example of a vehicle periphery image display system) according to the first embodiment.
  • as shown in FIG. 1, the see-through side view monitor system A1 includes a side camera 1 (in-vehicle blind spot camera), an image processing control unit 2 (monitor image generation means), a monitor 3, a blend ratio manual control interface 4 (blend ratio manual operation means), an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
  • the side camera 1 is mounted in the left side mirror or disposed near the left side mirror, and images the front side portion of the vehicle that becomes the blind spot of the driver.
  • the side camera 1 acquires actual camera video data of the front side portion of the vehicle by an image sensor (CCD, CMOS, etc.).
  • the monitor 3 is set to a vehicle interior position (for example, an instrument panel position, etc.) that can be visually recognized by the driver, and displays an image by inputting a display image from the image processing control unit 2.
  • the monitor 3 has a display screen 3a such as a liquid crystal display or an organic EL display.
  • as the monitor 3, a dedicated monitor may be provided for the see-through side view monitor system A1 (the blind spot eliminating camera system), or a monitor of another system, such as the navigation system, may be shared.
  • the image processing control unit 2 generates the display video for the monitor 3 based on the actual camera image input from the side camera 1 and on input information from the blend ratio manual control interface 4, the external sensor 5, and the hue manual control interface 6.
  • the blend ratio manual control interface 4 is constituted by a touch panel switch of the monitor 3, for example, and arbitrarily adjusts the transmittance of the “semi-transparent portion GE” set in the vehicle interior image by manual operation.
  • the external sensor 5 is a group of sensors and switches that provide input information to the image processing control unit 2, and as shown in FIG. 1 includes a steering angle sensor 51, a speed sensor 52, an illumination ON/OFF switch 53, a function switch 54, and other sensors and switches.
  • when the function switch 54 is turned on, the blend circuit unit 46a automatically adjusts the transmittance of the "semi-transparent portion GE" set in the vehicle interior image, based on external environment information (daytime, evening, nighttime, weather, etc.) and vehicle information (steering angle, vehicle speed, etc.) obtained by the external sensor 5, so as to enhance the visibility of the composite image displayed on the monitor 3.
  • the hue manual control interface 6 is configured by, for example, a touch panel switch of the monitor 3, and arbitrarily adjusts the overall hue of the superimposed vehicle interior image by manual operation.
  • the image processing control unit 2 includes a decoder 41, an image memory 42, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit).
  • the decoder 41 performs analog / digital conversion on the actual camera video data input from the side camera 1.
  • the image memory 42 stores the actual camera image data digitally converted from the decoder 41.
  • the image processing unit 43 converts the actual camera image data input from the image memory 42 into a virtual camera image viewed from the viewpoint position of the driver.
  • in the image processing unit 43, in addition to "viewpoint conversion processing as if a virtual camera were arranged near the driver's viewpoint," image processing including various adjustments (luminance adjustment, color correction, edge correction, etc.) is also performed. A sketch of one common way to perform such a viewpoint conversion is shown below.
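The patent does not spell out the viewpoint conversion algorithm; one common way to approximate the change of viewpoint for the road plane is a planar homography, sketched below with OpenCV. The calibration quadrilaterals are hypothetical values that would have to be measured per vehicle and camera mounting.

```python
import cv2
import numpy as np

def warp_to_driver_viewpoint(camera_frame, src_quad, dst_quad, out_size):
    """Re-project the side-camera frame so the road plane appears roughly as it
    would from a virtual camera near the driver's eye point.

    src_quad: four pixel coordinates of a road-plane reference rectangle as seen
    by the side camera; dst_quad: where the same points should appear in the
    driver-viewpoint image. Both are per-vehicle calibration data.
    """
    H = cv2.getPerspectiveTransform(np.float32(src_quad), np.float32(dst_quad))
    return cv2.warpPerspective(camera_frame, H, out_size)

# Example call with dummy calibration values (illustration only):
# virtual = warp_to_driver_viewpoint(frame,
#         src_quad=[(120, 300), (520, 300), (620, 470), (20, 470)],
#         dst_quad=[(100, 200), (540, 200), (540, 480), (100, 480)],
#         out_size=(640, 480))
```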
  • the image memory 44 is a memory for superimposing, and stores a vehicle interior image RP (FIG. 10) previously captured from the driver's viewpoint as a vehicle interior image.
  • the control circuit (CPU) 45 is a central processing circuit that manages all information processing and control output related to image processing in accordance with input information, and in which a control program is set for performing various image processing controls such as external video luminance follow-up display control, sudden luminance change display control, hue conversion display control, and warning display control.
  • the superimpose circuit 46 basically makes the vehicle interior image RP from the image memory 44 translucent to obtain a translucent vehicle interior image RG (FIG. 14) and, by compositing this translucent vehicle interior image RG onto the virtual camera image from the image processing unit 43, generates a composite image in which the virtual camera image shows through the translucent vehicle interior image RG.
  • the superimpose circuit 46 includes a blend circuit unit 46a that divides the vehicle interior image RP previously captured from the driver's viewpoint and sets different transmittances in the respective regions.
  • specifically, in the vehicle interior image RP (FIG. 10), a shadow area SE obtained by projecting the host vehicle onto the road surface is set as an "opaque portion DE" in which the transmittance of the vehicle interior image is 0%, the area corresponding to the window glass of the host vehicle is set as a "transparent portion CE" in which the transmittance of the vehicle interior image is 100%, and the remaining area other than the shadow area and the window glass area is set as a "semi-transparent portion GE" (FIG. 14). A sketch of how such a transmittance map could be built and applied is given below.
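As a rough illustration of how the three regions could be encoded, the sketch below builds a per-pixel opacity map from two boolean masks (the projected body-shadow area and the window-glass area). The masks, the default GE transmittance, and the function name are assumptions for illustration; they are not taken from the patent. The result can be fed directly to the compose_monitor_frame sketch shown earlier.

```python
import numpy as np

def build_alpha_map(shape, opaque_mask, window_mask, ge_transmittance=0.5):
    """Build the per-pixel interior opacity map used when superimposing.

    opaque_mask: True where the projected vehicle-body shadow area SE lies
    ("opaque portion DE", interior transmittance 0%).
    window_mask: True over the window-glass area ("transparent portion CE",
    interior transmittance 100%).
    Everything else is the "semi-transparent portion GE", whose transmittance
    (0..1) is adjustable manually or automatically.
    Returned values are interior opacity (1 - transmittance), matching the
    earlier blending sketch.
    """
    alpha = np.full(shape, 1.0 - ge_transmittance, dtype=np.float32)  # GE region
    alpha[opaque_mask] = 1.0    # DE: interior image shown as-is
    alpha[window_mask] = 0.0    # CE: external video fully visible
    return alpha
```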
  • the encoder 47 receives a composite image signal obtained by superimposing the translucent vehicle interior image RG on the virtual camera image from the superimpose circuit 46, and outputs the composite image signal to the monitor 3 through digital / analog conversion.
  • when a transmittance adjustment signal is input from the blend ratio manual control interface 4, the blend external control unit 48 outputs to the control circuit 45 a transmittance control command for arbitrarily adjusting the transmittance of the "semi-transparent portion GE" set in the vehicle interior image within the range of 0% to 100%.
  • the luminance/hue determination sensor 49a receives the actual camera image data of the external video from the side camera 1, determines the average luminance and hue based on the actual camera image data, and outputs the determination result to the control circuit 45.
  • the luminance/hue determination sensor 49a takes in the data of each pixel constituting the actual camera image as a digital luminance signal Y and color difference signals Cb/Cr (or as RGB data). Then, as shown in FIG. 2, the acquired original data (one screen) is accumulated for each horizontal line in the scanning-line direction, and the accumulated values of all the scanning lines are averaged to determine the luminance (brightness) and hue of the entire image. This determination method is the simplest. A sketch of this determination is shown below.
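A minimal sketch of this scanline-accumulation determination is shown below, assuming the frame is available as separate Y/Cb/Cr planes. Deriving the hue as the angle of the mean (Cb, Cr) chroma vector is one simple convention and is an assumption; the patent does not specify the exact formula.

```python
import numpy as np

def determine_average_luminance_and_hue(y, cb, cr):
    """Estimate the average luminance and hue of one external-video frame.

    y, cb, cr: HxW arrays of the digital Y / Cb / Cr pixel data. As in FIG. 2,
    each horizontal scanning line is accumulated first, then the per-line sums
    are averaged over all lines.
    """
    line_sums_y = y.astype(np.float64).sum(axis=1)    # accumulate each scanning line
    mean_y = line_sums_y.mean() / y.shape[1]          # average over all lines

    mean_cb = cb.astype(np.float64).mean() - 128.0    # chroma relative to neutral gray
    mean_cr = cr.astype(np.float64).mean() - 128.0
    hue_deg = np.degrees(np.arctan2(mean_cr, mean_cb)) % 360.0
    return mean_y, hue_deg
```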
  • the luminance/hue conversion block 49b receives the luminance/hue determination result of the external video from the control circuit 45 and the translucent vehicle interior image data from the image memory 44, automatically adjusts the luminance and hue of the translucent vehicle interior image so as to improve its visibility against the luminance and hue of the external video, and outputs the adjusted translucent vehicle interior image to the superimpose circuit 46.
  • in addition, when a hue control command from the hue external control unit 49c and the translucent vehicle interior image data from the image memory 44 are input, the luminance/hue conversion block 49b adjusts the hue of the translucent vehicle interior image and outputs the adjusted translucent vehicle interior image to the superimpose circuit 46.
  • when a hue adjustment signal is input from the hue manual control interface 6, the hue external control unit 49c outputs to the luminance/hue conversion block 49b a hue control command for arbitrarily adjusting the overall hue of the superimposed vehicle interior image according to the driver's preference.
  • FIG. 5 is a flowchart showing the flow of the external video luminance follow-up display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Example 1 (external video luminance follow-up display control mode). Hereinafter, each step of FIG. 5 will be described.
  • in step S51, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S52; if No (switch OFF), the process returns to step S51.
  • in step S52, following the determination in step S51 that the function switch 54 is ON, the luminance determination result and the hue determination result are obtained as determination data from the luminance/hue determination sensor 49a, and the process proceeds to step S53.
  • in step S53, following the acquisition of the luminance/hue determination data in step S52, it is determined whether or not the determined luminance detection value is lower than the first set value Y1 indicating the twilight threshold. If Yes (luminance detection value < Y1), the process proceeds to step S55; if No (luminance detection value ≥ Y1), the process proceeds to step S54.
  • in step S54, following the determination in step S53 that the luminance detection value ≥ Y1, the luminance of the vehicle interior image to be superimposed is set to the normal state, and the process returns to step S51.
  • here, "luminance" refers to a photometric quantity of the light emitted from a light source or a secondary light source (a reflecting or transmitting surface) toward the observer, evaluated with the sensitivity of the human eye (the CIE standard spectral luminous efficiency V(λ)) and considering only a specific direction (the observation direction).
  • the "normal luminance state" refers to a luminance at which the translucent vehicle interior image superimposed on the virtual camera image can be clearly distinguished when the brightness outside the vehicle is daytime brightness.
  • in step S55, following the determination in step S53 that the luminance detection value < Y1, it is determined whether the determined luminance detection value is lower than the second set value Y2 (< Y1) indicating the nighttime threshold. If Yes (luminance detection value < Y2), the process proceeds to step S57; if No (luminance detection value ≥ Y2), the process proceeds to step S56.
  • in step S56, following the determination in step S55 that the luminance detection value ≥ Y2, the luminance or brightness of the vehicle interior image to be superimposed is shifted up, and the process returns to step S51.
  • here, "brightness (lightness)" is one of the three attributes of color and refers to how light or dark the color is; the three attributes of color are "hue", "saturation" (the vividness of the color), and "lightness".
  • in step S57, following the determination in step S55 that the luminance detection value < Y2, the luminance of the vehicle interior image to be superimposed is inverted so that, for example, black lines are displayed as white lines, and the process returns to step S51. A sketch of this control is shown below.
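The following sketch mirrors the branch structure of FIG. 5 under stated assumptions: the thresholds Y1 and Y2 and the amount of the up-shift are hypothetical 8-bit values, and the "inversion" is implemented as a simple negative of the interior image.

```python
import numpy as np

# Hypothetical 8-bit thresholds; the patent only defines Y1 (twilight) > Y2 (night).
Y1_TWILIGHT = 90
Y2_NIGHT = 40

def luminance_follow_up(interior_rgb, external_mean_y,
                        y1=Y1_TWILIGHT, y2=Y2_NIGHT, up_shift=40):
    """Sketch of the FIG. 5 control: keep the superimposed interior image in the
    normal state in daylight, shift its luminance up at twilight, and invert it
    at night (black lines become white lines)."""
    img = interior_rgb.astype(np.int16)
    if external_mean_y >= y1:                 # daytime: step S54, normal state
        out = img
    elif external_mean_y >= y2:               # twilight: step S56, shift up
        out = img + up_shift
    else:                                     # night: step S57, invert luminance
        out = 255 - img
    return out.clip(0, 255).astype(np.uint8)
```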
  • FIG. 6 is a flowchart showing the flow of the sudden luminance change display control process executed by the control circuit 45 in the see-through side view monitor system A1 of the first embodiment (sudden luminance change display control mode). Hereinafter, each step of FIG. 6 will be described.
  • in step S61, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S62; if No (switch OFF), the process returns to step S61.
  • in step S62, following the determination in step S61 that the function switch 54 is ON, the luminance determination result and the hue determination result are obtained as determination data from the luminance/hue determination sensor 49a, and the process proceeds to step S63.
  • in step S63, following the acquisition of the luminance/hue determination data in step S62, it is determined whether or not the determined luminance detection value is higher than a third set value Y3 indicating an upper-limit threshold. If Yes (luminance detection value > Y3), the process proceeds to step S65; if No (luminance detection value ≤ Y3), the process proceeds to step S64.
  • in step S64, following the determination in step S63 that the luminance detection value ≤ Y3, the luminance of the vehicle interior image to be superimposed is set to the normal state, and the process returns to step S61.
  • in step S65, following the determination in step S63 that the luminance detection value > Y3, the luminance or brightness of the vehicle interior image to be superimposed is shifted down, and the process returns to step S61. A sketch of this control is shown below.
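A corresponding sketch of the FIG. 6 branches is given below; the third set value Y3 and the size of the down-shift are hypothetical values chosen only for illustration.

```python
import numpy as np

Y3_UPPER = 200   # hypothetical 8-bit upper-limit threshold

def sudden_brightness_control(interior_rgb, external_mean_y,
                              y3=Y3_UPPER, down_shift=60):
    """Sketch of the FIG. 6 control: when the external video suddenly becomes very
    bright (e.g. oncoming headlights) and the detected luminance exceeds Y3, the
    superimposed interior image is shifted down; otherwise it is left as set by
    the luminance follow-up control."""
    if external_mean_y <= y3:                           # step S64: keep current state
        return interior_rgb
    out = interior_rgb.astype(np.int16) - down_shift    # step S65: shift down
    return out.clip(0, 255).astype(np.uint8)
```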
  • FIG. 7 is a flowchart showing the flow of hue conversion display control processing executed by the control circuit 45 in the see-through side view monitor system A1 of the first embodiment (hue conversion display control mode).
  • FIG. 8 is a diagram illustrating a hue circle used in the hue conversion display control process according to the first embodiment. Hereinafter, each step of FIG. 7 will be described.
  • in step S71, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S72; if No (switch OFF), the process returns to step S71.
  • in step S72, following the determination in step S71 that the function switch 54 is ON, the luminance determination result and the hue determination result are obtained as determination data from the luminance/hue determination sensor 49a, and the process proceeds to step S73.
  • in step S73, following the acquisition of the luminance/hue determination data in step S72, the determined hue detection value is compared with the hue set value X, and it is determined whether the hue deviation between them is less than a first threshold Z1. If Yes (hue deviation < Z1), the process proceeds to step S75; if No (hue deviation ≥ Z1), the process proceeds to step S74.
  • the hue set value X is obtained by determining the hue of the vehicle interior image before hue conversion stored in the image memory 44.
  • in step S74, following the determination in step S73 that the hue deviation ≥ Z1, the hue of the vehicle interior image to be superimposed is maintained as it is, and the process returns to step S71.
  • in step S75, following the determination in step S73 that the hue deviation < Z1, the determined hue detection value is again compared with the hue set value X, and it is determined whether the hue deviation between them is less than a second threshold Z2 (< Z1). If Yes (hue deviation < Z2), the process proceeds to step S77; if No (hue deviation ≥ Z2), the process proceeds to step S76.
  • in step S76, following the determination in step S75 that the hue deviation ≥ Z2, the luminance or brightness of the vehicle interior image to be superimposed is shifted up, and the process returns to step S71.
  • in step S77, following the determination in step S75 that the hue deviation < Z2, the hue of the vehicle interior image to be superimposed is converted to and displayed in a complementary-system hue, and the process returns to step S71.
  • the conversion to a complementary-system hue makes use of the fact that colors located diagonally opposite each other in the hue circle shown in FIG. 8 are in a complementary color relationship.
  • for example, when the hue of the external video is "red", conversion to "cyan" is most preferable; since "blue" and "green" also lie in the diagonally opposite complementary-system region, the hue may instead be converted to "blue" or "green". A sketch of this control is shown below.
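The sketch below follows the Z1/Z2 branching of FIG. 7 and implements the complementary conversion as a 180-degree rotation on the hue circle of FIG. 8 (so red maps to cyan and green to magenta). The threshold values and the use of OpenCV's 8-bit HSV representation (hue range 0..179) are assumptions made for the example.

```python
import cv2
import numpy as np

def hue_deviation(h1_deg, h2_deg):
    """Smallest angular difference between two hues on the hue circle (degrees)."""
    d = abs(h1_deg - h2_deg) % 360.0
    return min(d, 360.0 - d)

def hue_conversion_control(interior_rgb, external_hue_deg, interior_hue_deg,
                           z1=60.0, z2=30.0, up_shift=40):
    """Sketch of the FIG. 7 control. Z1/Z2 values are hypothetical (Z2 < Z1).
    - deviation >= Z1: hues differ enough, keep the interior hue (S74)
    - Z2 <= deviation < Z1: raise the interior luminance instead (S76)
    - deviation < Z2: convert the interior to the complementary hue, i.e. rotate
      it half-way around the hue circle of FIG. 8 (S77)."""
    dev = hue_deviation(external_hue_deg, interior_hue_deg)
    if dev >= z1:
        return interior_rgb
    if dev >= z2:
        out = interior_rgb.astype(np.int16) + up_shift
        return out.clip(0, 255).astype(np.uint8)
    hsv = cv2.cvtColor(interior_rgb, cv2.COLOR_RGB2HSV)        # expects uint8 RGB
    hsv[..., 0] = (hsv[..., 0].astype(np.int16) + 90) % 180    # 180-degree hue rotation
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2RGB)
```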
  • FIG. 9 is a flowchart showing a flow of warning display control processing executed by the control circuit 45 in the see-through side view monitor system A1 of the first embodiment (warning display control mode). Hereinafter, each step of FIG. 9 will be described.
  • in step S91, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S92; if No (switch OFF), the process returns to step S91.
  • in step S92, following the determination in step S91 that the function switch 54 is ON, speed data is obtained from the speed sensor 52 and obstacle presence/absence data is obtained from the image processing unit 43, and the process proceeds to step S93.
  • the "obstacle presence/absence data" is obtained by analyzing the actual camera image data input to the image processing unit 43 and determining whether an image indicating an obstacle exists in the analyzed image.
  • in step S93, following the acquisition of the speed data and obstacle presence/absence data in step S92, it is determined whether or not the speed detection value is smaller than the set value V. If Yes (speed detection value < V), the process proceeds to step S95; if No (speed detection value ≥ V), the process proceeds to step S94.
  • the set value V is a vehicle speed value in the low speed range used as the judgment threshold between low speed driving and normal driving.
  • in step S94, following the determination in step S93 that the speed detection value ≥ V, the hue of the entire screen of the vehicle interior image to be superimposed is changed to a red-based hue that warns across the entire screen, and the process returns to step S91.
  • in step S95, following the determination in step S93 that the speed detection value < V, it is determined based on the acquired obstacle presence/absence data whether no obstacle is recognized. If Yes (no obstacle is recognized), the process proceeds to step S97; if No (an obstacle is recognized), the process proceeds to step S96.
  • in step S96, following the determination in step S95 that an obstacle is recognized, the hue of the entire screen of the vehicle interior image to be superimposed is changed in accordance with the degree of obstacle approach (for example, gradually from orange to red), and the process returns to step S91.
  • in step S97, following the determination in step S95 that no obstacle is recognized, the brightness and hue of the vehicle interior image to be superimposed are maintained, and the process returns to step S91. A sketch of this control is shown below.
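A sketch of the FIG. 9 warning logic is shown below. The low-speed threshold V, the proximity scale, and the tinting strength are hypothetical; the patent only states that the whole interior image is shifted toward a warning red, gradually from orange as an obstacle approaches.

```python
import numpy as np

V_LOW_SPEED_KMH = 15.0   # hypothetical low-speed threshold (set value V)

def warning_control(interior_rgb, speed_kmh, obstacle_proximity=None,
                    v=V_LOW_SPEED_KMH):
    """Sketch of the FIG. 9 control.
    speed >= V            -> tint the whole interior image red (S94)
    obstacle recognized   -> tint from orange toward red as it approaches (S96),
                             obstacle_proximity in [0, 1], 1 = closest
    otherwise             -> leave luminance and hue unchanged (S97)."""
    img = interior_rgb.astype(np.float32)
    if speed_kmh >= v:
        tint = np.array([255.0, 0.0, 0.0])                 # warning red
    elif obstacle_proximity is not None:
        p = float(np.clip(obstacle_proximity, 0.0, 1.0))
        tint = np.array([255.0, 165.0 * (1.0 - p), 0.0])   # orange shifting to red
    else:
        return interior_rgb                                 # S97: keep as-is
    out = 0.6 * img + 0.4 * tint                            # blend the tint over the image
    return out.clip(0, 255).astype(np.uint8)
```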
  • the object of the present invention, including the first to third embodiments, is to inexpensively construct a system in which the video from an external camera that contributes to eliminating blind spots is displayed using image processing and in which the driver viewing the video can intuitively recognize that the scene is being seen through the vehicle, and further to propose a vehicle periphery image display system in which the behavior of the host vehicle can be grasped in the image.
  • the main contents of the display system proposed by the present inventor are as follows.
  • - Build a display method that allows the driver to intuitively grasp the direction of travel, the size of the vehicle, and other aspects of the vehicle sensation simply by looking at the video displayed on the monitor.
  • - Build a system in which the image transmittance can be changed freely according to the driver's preference.
  • - Build a display system capable of automatically changing the basic transmittance according to the operating conditions; for example, the transmittance is changed according to the luminance of the external video at dusk so that the external video remains easy to see.
  • - Build a display system that automatically controls the brightness and hue of the superimposed translucent interior image according to the brightness and hue of the external video.
  • - Allow the brightness and hue to be changed by manual operation to suit preferences in visibility, which vary widely between individuals; for example, the brightness of the translucent interior image is increased at twilight, its luminance is inverted at night, and its brightness is decreased when an oncoming vehicle's headlights shine in.
  • - When the hue of the external video and that of the translucent vehicle interior image are close, convert the hue of the translucent vehicle interior image so as to create a hue difference.
  • - Since the hue can be controlled, use the whole screen displayed on the monitor to perform a visual warning action; for example, when safety is impaired, in conjunction with vehicle speed information or obstacle information, the hue of the entire translucent vehicle interior image is converted to a red hue or the like representing a warning.
  • the operations of the see-through side view monitor system A1 of the first embodiment are described below, divided into "monitor image display operation by transmission image", "transmittance change operation of the semi-transparent portion", "external video luminance follow-up display control operation", "sudden luminance change display control operation", "hue conversion display control operation", and "warning display control operation".
  • FIG. 10 is a diagram showing a vehicle interior still image previously captured from the driver's viewpoint position toward the left front side.
  • FIG. 11 is a perspective view illustrating a state in which a vehicle body shape is projected onto a road surface from a vehicle on which the see-through side view monitor system A1 according to the first embodiment is mounted.
  • FIG. 12 shows an image (opaque portion) when an image obtained by projecting the vehicle body shape onto the road surface from a vehicle on which the see-through side view monitor system A1 according to the first embodiment is mounted is seen from the driver viewpoint position.
  • FIG. 13 is a diagram illustrating an image in which the “opaque portion DE” of FIG. 12 is set with respect to the vehicle interior image RP illustrated in FIG.
  • FIG. 14 shows a translucent vehicle interior in which “opaque part DE”, “transparent part CE”, and “translucent part GE” are set for the vehicle interior image RP shown in FIG. 10 in the see-through side view monitor system A1 of the first embodiment. It is a figure which shows the image RG.
  • first, the monitor image display operation using a transmission image will be described with reference to FIGS. 10 to 14.
  • the real camera video input from the side camera 1 is analog/digital converted by the decoder 41 and stored in the image memory 42. Thereafter, in the image processing unit 43, "image processing including various processing (brightness adjustment, color correction, edge correction, etc.)" and "viewpoint conversion processing as if a virtual camera were placed near the driver's viewpoint" are performed to obtain a virtual camera image.
  • the image memory 44 stores a vehicle interior image RP (FIG. 10) previously captured from the driver's viewpoint as a vehicle interior image.
  • the superimpose circuit 46 makes the vehicle interior image RP from the image memory 44 translucent to obtain a translucent vehicle interior image RG (FIG. 14), superimposes this translucent vehicle interior image RG on the virtual camera image from the image processing unit 43, and generates a composite image in which the virtual camera image shows through the translucent vehicle interior image RG.
  • the composite image generated in the superimpose circuit 46 is sent to the encoder 47, undergoes digital/analog conversion there, is output to the monitor 3, and is displayed on the display screen 3a.
  • in FIG. 11, the vehicle is shown raised above the road surface so that the projection can be easily distinguished.
  • this projection plane is projected vertically onto the road surface and lies at the same height as the contact surface of the tires.
  • since the image shown in FIG. 12 represents the shape of the vehicle body in actual driving, touching this projected image, when it is superimposed on the image from the driver's viewpoint, means contact with the vehicle body.
  • by combining the viewpoint-converted image with this projection plane, the vehicle sensation needed to avoid dropping a wheel into a side gutter and to avoid obstacles can be grasped at a glance and intuitively understood just by looking at this image (the side view screen), so the contribution to safe driving increases. That is, as shown in FIG. 13, the projected part of the vehicle body shape is an "opaque portion DE" with a transmittance of 0%, where the vehicle interior image RP is displayed as it is.
  • in the other areas, the vehicle interior image RP is displayed with an arbitrary transmittance by using an α (alpha) blend with the side camera image.
  • for the window glass area, a "transparent portion CE" with a transmittance of 100% is set, and the remainder is the "semi-transparent portion GE".
  • in this way, the vehicle interior image RP from the driver's viewpoint is displayed as the translucent vehicle interior image RG.
  • the actual camera video of the side camera 1 installed on the side mirror is converted by the viewpoint conversion image processing, and converted into a virtual camera image as if it was photographed by the virtual camera from the driver viewpoint position.
  • a translucent image with a more realistic feeling can be expressed by superimposing and displaying the translucent vehicle interior image on the virtual camera image.
  • as a result, the positional relationship between the transmitted external image and the actual vehicle can be displayed so that it is clearly visible.
  • in addition, the shadow area projected, according to the size and shape of the vehicle body, onto the road that forms the virtual space screen used in the above-described viewpoint conversion is displayed, as viewed from the driver viewpoint position, as the "opaque portion DE" in which the transmittance of the interior image RP is 0%, and the other areas are displayed as areas with an arbitrary transmittance (the "transparent portion CE" and "semi-transparent portion GE"). Therefore, the monitor screen 3a is displayed as a superimposed screen in which the size and shape of the vehicle body are clear, while the other parts form a blend with the camera video. As a result, the behavior of the vehicle becomes clear at a glance, and it becomes easy to judge the possibility of a wheel dropping off the road edge.
  • in the prior art described above, the system becomes redundant: since two cameras, a blind spot camera and a driver viewpoint camera, are used, a driver viewpoint camera must be added to the existing system, which increases the cost of the system.
  • moreover, since the driver viewpoint camera can only be placed near, not at, the driver's viewpoint position, parallax occurs between the camera image obtained from the driver viewpoint camera and the image actually seen from the driver's viewpoint position.
  • in contrast, this proposal does not require the addition of a driver viewpoint camera to the existing system and does not increase costs.
  • in addition, because the image is converted to a virtual camera viewpoint at the driver's viewpoint position, no parallax occurs.
  • since the shape of the vehicle body can be grasped at a glance, a necessary and sufficient effect is obtained: the traveling direction and the avoidance of approaching obstacles become very easy to understand, which contributes to safe driving.
  • furthermore, because the entire screen is not a uniformly translucent image but combines areas with several levels of transparency, from fixed 0% and 100% regions to a blendable region adjustable from 100% to 0%, the misunderstandings and recognition errors that a uniformly blended screen can cause are reduced.
  • if a single predetermined fixed transmittance were used for the "semi-transparent portion GE", the user could not change the transmittance at will and usability would suffer; in addition, when the environment changes, visibility may drop because the screen keeps a uniform transmittance.
  • for this reason, the transmittance of the "semi-transparent portion GE" can be adjusted manually or automatically. That is, when a transmittance adjustment signal is input to the blend external control unit 48 by a manual operation on the blend ratio manual control interface 4, a transmittance control command is output to the control circuit 45, and the blend circuit unit 46a arbitrarily adjusts the transmittance of the "semi-transparent portion GE" set in the vehicle interior image within the range of 0% to 100%.
  • when the function switch 54 is turned on, the blend circuit unit 46a automatically adjusts the transmittance of the "semi-transparent portion GE" set in the vehicle interior image, based on external environment information (daytime, evening, nighttime, weather, etc.) and vehicle information (steering angle, vehicle speed, etc.) obtained by the external sensor 5, so as to enhance the visibility of the composite image displayed on the monitor 3.
  • by making the blend ratio variable by manual operation, the system allows the transmittance of the "semi-transparent portion GE" to be freely set and updated, which makes it very easy to use. Further, when the function switch 54 is turned on, the system automatically adjusts the transmittance of the "semi-transparent portion GE" without requiring any user operation, so that the composite image displayed on the monitor 3 maintains high visibility. A sketch of such a transmittance selection is shown below.
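How the GE transmittance might be chosen between the manual and automatic paths can be sketched as follows; the automatic mapping from external luminance to transmittance is purely illustrative, since the patent only states that external environment and vehicle information are used. The returned value could feed the ge_transmittance parameter of the build_alpha_map sketch shown earlier.

```python
def select_ge_transmittance(manual_value=None, external_mean_y=None,
                            auto_enabled=False):
    """Choose the transmittance (0.0..1.0) of the semi-transparent portion GE.

    A manual setting from the blend ratio interface always takes priority; with
    the function switch ON, a simple automatic rule follows the external
    luminance (brighter scene -> more transparent interior)."""
    if manual_value is not None:
        return min(max(manual_value, 0.0), 1.0)
    if auto_enabled and external_mean_y is not None:
        return min(max(external_mean_y / 255.0, 0.2), 0.9)  # clamp to a usable band
    return 0.5   # default transmittance when no adjustment is requested
```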
  • FIG. 15 is an image diagram showing a luminance reversal screen when the translucent vehicle interior image RG is reversed and displayed in the see-through side view monitor system A1 according to the first embodiment.
  • next, the display control action in the external video luminance follow-up display control mode will be described with reference to FIGS. 5 and 15.
  • when the external video captured by the side camera 1 is bright, as during the daytime, and the luminance detection value is equal to or greater than the first set value Y1, the flow of step S51 → step S52 → step S53 → step S54 in the flowchart of FIG. 5 is repeated. That is, in step S54, the luminance of the translucent vehicle interior image RG to be superimposed is set to the normal luminance state, which is easily distinguishable when the external video is bright.
  • when the external video captured by the side camera 1 becomes darker, as at twilight, and the luminance detection value falls below the first set value Y1 but remains equal to or greater than the second set value Y2, the flow of step S51 → step S52 → step S53 → step S55 → step S56 in the flowchart of FIG. 5 is repeated. That is, in step S56, the luminance or brightness of the translucent vehicle interior image RG to be superimposed is shifted up.
  • if the luminance (brightness) of the translucent vehicle interior image RG to be superimposed were kept in the normal state at twilight, the overall brightness of the display screen 3a of the monitor 3 would fall, the semi-transparent vehicle interior image RG could melt into the dim virtual camera image, and visibility could be impaired. By shifting the luminance up, the two superimposed images keep a luminance (brightness) difference, which is useful for maintaining visibility.
  • when the external video captured by the side camera 1 is dark, as at night, and the luminance detection value is lower than the second set value Y2, the flow of step S51 → step S52 → step S53 → step S55 → step S57 in the flowchart of FIG. 5 is repeated. That is, in step S57, as shown in FIG. 15, the luminance of the translucent vehicle interior image RG to be superimposed is inverted, and black lines are displayed as white lines.
  • if the luminance (brightness) of the translucent vehicle interior image RG to be superimposed were kept in the normal state at night, when the external video is dark, the overall brightness of the display screen 3a of the monitor 3 would be greatly reduced, the translucent vehicle interior image RG would dissolve completely into the dark virtual camera image, and visibility would be significantly impaired. Even if the luminance of the superimposed translucent vehicle interior image RG were simply increased, it would still merge into the dark virtual camera image and the two images could not be distinguished. In contrast, by inverting the translucent vehicle interior image RG superimposed on the dark external video, like the negative of a photograph, dark black line portions are displayed as white lines and bright portions as dark ones, so the two images can be clearly distinguished.
  • note that manual operation of the blend ratio manual control interface 4 can also achieve an increase in brightness by increasing the transmittance and a decrease in brightness by decreasing the transmittance.
  • next, the display control action in the sudden luminance change display control mode will be described with reference to FIG. 6. While the luminance detection value remains equal to or lower than the third set value Y3, the flow of step S61 → step S62 → step S63 → step S64 in the flowchart of FIG. 6 is repeated; that is, in step S64, the luminance of the translucent vehicle interior image RG to be superimposed remains in the normal state or in the state set by the external video luminance follow-up display control.
  • when the external video suddenly becomes bright and the luminance detection value exceeds the third set value Y3, the flow of step S61 → step S62 → step S63 → step S65 in the flowchart of FIG. 6 is repeated. That is, in step S65, the luminance or brightness of the translucent vehicle interior image RG to be superimposed is shifted down.
  • when a very bright light such as an oncoming vehicle's headlight enters the camera at night, the brightness adjustment of the in-vehicle camera, including the auto iris, normally provides a certain anti-glare effect so that the superimposed image can still be distinguished, but under a steep luminance change the brightness adjustment function of the in-vehicle camera cannot follow.
  • in that case, although a certain amount of brightness correction is applied, the external video on the monitor 3 is essentially whited out, and the superimposed translucent vehicle interior image RG becomes indistinguishable.
  • therefore, the luminance or brightness of the superimposed semi-transparent vehicle interior image RG is shifted down; in other words, a semi-transparent vehicle interior image RG that is darkened overall is superimposed on the bright external video (virtual camera image).
  • as a result, the distinction between the external video and the translucent vehicle interior image RG becomes clear, which is useful for maintaining visibility when there is a steep luminance change.
  • next, the display control action in the hue conversion display control mode will be described with reference to FIGS. 7 and 8. If the hue of the external video and the hue of the translucent vehicle interior image RG are different, and the hue deviation obtained by comparing the hue detection value with the set value X is equal to or greater than the first threshold Z1, the flow of step S71 → step S72 → step S73 → step S74 in the flowchart of FIG. 7 is repeated. That is, in step S74, the hue of the translucent vehicle interior image RG to be superimposed is maintained as it is.
  • when the hue of the external video approaches the hue of the translucent vehicle interior image RG, and the hue deviation obtained by comparing the hue detection value with the set value X is less than the first threshold Z1 but equal to or greater than the second threshold Z2, the flow of step S71 → step S72 → step S73 → step S75 → step S76 is repeated. That is, in step S76, the luminance (brightness) of the translucent vehicle interior image RG to be superimposed is increased.
  • when the hue deviation is less than the second threshold Z2, the flow of step S71 → step S72 → step S73 → step S75 → step S77 is repeated. That is, in step S77, the hue of the translucent vehicle interior image RG to be superimposed is converted into the complementary hue positioned diagonally opposite in the hue circle shown in FIG. 8 and displayed.
  • in this way, when the hue of the external video and the hue of the translucent vehicle interior image RG are close to, but still different from, each other, visibility can be improved by increasing the luminance (brightness) of the semi-transparent vehicle interior image RG to be superimposed.
  • for example, when the external video is mainly green and the translucent interior image RG to be superimposed is also mainly green, visibility deteriorates; if the translucent vehicle interior image RG is converted to magenta, the color that lies opposite green in the hue circle, the distinction becomes clear.
  • similarly, when the external video is mainly red and the semi-transparent vehicle interior image RG to be superimposed is also mainly red, visibility deteriorates; if the translucent vehicle interior image RG is converted to cyan, the color that lies opposite red in the hue circle, the distinction becomes clear.
  • thus, even when the external video and the translucent vehicle interior image RG are at the same hue level, the external video and the translucent vehicle interior image RG on the monitor can be clearly distinguished by changing the hue to a color system in a complementary relationship.
  • finally, the warning display control action will be described with reference to FIG. 9. When the side view monitor display is maintained even in the normal driving state and the speed detection value is equal to or higher than the set value V, the flow of step S91 → step S92 → step S93 → step S94 in the flowchart of FIG. 9 is repeated. That is, in step S94, the hue of the entire screen of the semi-transparent vehicle interior image RG to be superimposed is converted to a red-based warning hue.
  • when the vehicle is in the low speed range and an obstacle is recognized in the vicinity of the host vehicle, the flow of step S91 → step S92 → step S93 → step S95 → step S96 in the flowchart of FIG. 9 is repeated. That is, in step S96, the hue of the entire screen of the semi-transparent vehicle interior image RG to be superimposed is converted so that the red component gradually increases in accordance with the degree of approach of the obstacle.
  • when the vehicle is in the low speed range and no obstacle is recognized, the flow of step S91 → step S92 → step S93 → step S95 → step S97 in the flowchart of FIG. 9 is repeated. That is, in step S97, the luminance and hue of the translucent vehicle interior image RG to be superimposed are maintained.
  • the side view monitor system is used for safety checks, such as when pulling the vehicle close to the road edge or when starting the engine, and is mainly used when the vehicle is stopped or traveling at a very low speed. If the system is used while the vehicle is above a certain speed, the driver keeps watching the monitor 3, so the sense of speed and of space is often exaggerated compared with reality and feels unnatural. Since this would impair safety in the original sense, it is better to detect such a situation and issue a warning that the speed should be reduced for safe driving. Likewise, when the image processing control unit 2 recognizes an obstacle by image analysis, it is better to give a warning notifying the driver of the presence of the obstacle in order to encourage driving that avoids it.
  • therefore, in the first embodiment, a warning operation is performed in conjunction with the speed sensor 52 and the like, so that the driver is warned to reduce the vehicle speed and safety can be secured.
  • in addition, a warning display that changes the hue according to the degree of proximity of an obstacle is performed, so that collisions, entanglement, wheel drop-off, and the like involving the host vehicle can be responded to promptly and safety can be secured.
  • In summary, the vehicle peripheral image display system (see-through side view monitor system A1) comprises an in-vehicle blind spot camera (side camera 1) attached to the host vehicle to image the vehicle periphery, a monitor 3 set at a vehicle interior position visible to the driver, and monitor image generation means (image processing control unit 2) for generating the display image on the monitor 3 from the real camera image input from the in-vehicle blind spot camera (side camera 1).
  • The monitor image generation means (image processing control unit 2) includes an image processing unit 43 that converts the real camera image input from the in-vehicle blind spot camera (side camera 1) into a virtual camera image viewed from the driver's viewpoint position; an external video color determination unit (luminance/hue determination sensor 49a) that determines at least one of the luminance, hue, saturation, and lightness of the external video; a vehicle interior image color automatic adjustment unit (luminance/hue conversion block 49b) that, based on the color determination result for the external video, automatically adjusts at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG obtained by making the vehicle interior image translucent, so as to improve the visibility of the external video; and an image composition circuit (superimpose circuit 46) that superimposes the translucent vehicle interior image RG from the vehicle interior image color automatic adjustment unit (luminance/hue conversion block 49b) on the virtual camera image from the image processing unit 43 to generate a composite image in which the virtual camera image is seen through the translucent vehicle interior image RG.
  • Therefore, regardless of the external environmental conditions under which the real camera video is acquired, the virtual camera image and the translucent vehicle interior image RG are clearly distinguished, and the external situation in the driver's blind spot can be seen through in its positional relationship with the own vehicle. A minimal sketch of this superimposition step follows.
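  • The composition performed by the superimpose circuit 46 amounts to per-pixel alpha blending of the translucent interior image over the viewpoint-converted camera image. The NumPy sketch below is an illustrative assumption of how such a blend could be expressed in software; the patent realizes it as a circuit.

        import numpy as np

        def superimpose(virtual_camera, interior_rgba):
            """Blend a translucent interior image (RGBA, alpha = opacity) over the
            viewpoint-converted camera image (RGB). Both are HxW arrays of uint8.
            Illustrative sketch only."""
            cam = virtual_camera.astype(np.float32)
            overlay = interior_rgba[..., :3].astype(np.float32)
            alpha = interior_rgba[..., 3:4].astype(np.float32) / 255.0
            # alpha = 1 -> opaque portion DE, alpha = 0 -> transparent portion CE,
            # intermediate values -> semi-transparent portion GE.
            out = alpha * overlay + (1.0 - alpha) * cam
            return out.clip(0, 255).astype(np.uint8)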
  • The monitor image generation means (image processing control unit 2) has an image storage unit (image memory 44) that stores, as the vehicle interior image, a still image of the vehicle interior captured in advance from the driver's viewpoint, and the vehicle interior image color automatic adjustment unit (luminance/hue conversion block 49b) obtains the translucent vehicle interior image RG by making this stored vehicle interior image translucent.
  • For this reason, a driver-viewpoint camera can be omitted and the system remains inexpensive, using only the in-vehicle blind spot camera (side camera 1), while, just as with a driver-viewpoint camera, no parallax arises with respect to the actual driver viewpoint, so a translucent vehicle interior image RG from the driver's viewpoint can still be acquired.
  • The external image color determination unit is a luminance/hue determination sensor 49a that determines the average luminance and hue of the external video from the in-vehicle blind spot camera (side camera 1), and the vehicle interior image color automatic adjustment unit is a luminance/hue conversion block 49b that automatically adjusts the luminance and hue of the translucent vehicle interior image RG, based on the color determination result from the luminance/hue determination sensor 49a, so as to improve visibility against the luminance and hue of the external video.
  • For this reason, the brightness and hue of the translucent vehicle interior image RG can be automatically adjusted to improve visibility in accordance with the average brightness and hue of the external video from the in-vehicle blind spot camera (side camera 1).
  • The monitor image generation means (image processing control unit 2) reads the luminance detection value from the luminance/hue determination sensor 49a: when the luminance detection value is equal to or higher than a first set value Y1 indicating a twilight threshold, the semi-transparent vehicle interior image RG is displayed with its existing light/dark difference from the luminance of the external video; when the luminance detection value is lower than the first set value Y1 but equal to or higher than a second set value Y2 indicating a nighttime threshold, the brightness of the translucent vehicle interior image RG is increased to give a difference in brightness from the external video; and when the luminance detection value is lower than the second set value Y2, the brightness of the translucent vehicle interior image RG is made to follow the brightness of the external video. These branches constitute an external video luminance follow-up display control mode (FIG.
  • The monitor image generation means (image processing control unit 2) also reads the luminance detection value from the luminance/hue determination sensor 49a and has a display control mode corresponding to a sudden change in luminance (FIG. 6): when the luminance detection value exceeds a third set value Y3 indicating an upper-limit threshold, the luminance of the translucent vehicle interior image RG is reduced and the whole overlay is displayed as blackish. For this reason, when there is a steep luminance change, such as the light of an oncoming vehicle shining into the in-vehicle blind spot camera (side camera 1) at night, the distinction between the external video displayed on the monitor 3 and the semi-transparent vehicle interior image RG remains clear and visibility is improved. A combined sketch of these luminance thresholds follows.
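  • The two luminance-driven modes above reduce to a small threshold ladder on the measured average luminance. The sketch below is an illustrative assumption (the threshold values and gain factors are invented, and the night-time branch is a plausible reading of the follow-up mode), not the patent's concrete control law.

        def adjust_overlay_luminance(avg_luma, overlay_luma,
                                     y1=120, y2=60, y3=230):
            """Return a target luminance for the translucent interior image RG.

            avg_luma     : average luminance of the external video (0..255)
            overlay_luma : current luminance of the interior overlay (0..255)
            y1, y2, y3   : twilight, night-time and upper-limit thresholds
                           (illustrative values, not taken from the patent)
            """
            if avg_luma > y3:
                # Sudden-change mode: oncoming headlights etc. -> darken the overlay.
                return overlay_luma * 0.3
            if avg_luma >= y1:
                # Daytime: keep the existing light/dark difference.
                return overlay_luma
            if avg_luma >= y2:
                # Twilight: brighten the overlay to keep contrast with the scene.
                return min(255, overlay_luma * 1.3)
            # Night: let the overlay brightness follow the (dark) external video.
            return 0.5 * (overlay_luma + avg_luma)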
  • The monitor image generation means (image processing control unit 2) further reads the hue detection value from the luminance/hue determination sensor 49a: when the hue deviation between the hue detection value and the set value X is equal to or greater than a first threshold Z1, the hue of the translucent vehicle interior image RG is displayed as it is; when the hue deviation is less than the first threshold Z1 but equal to or greater than a second threshold Z2, the color of the translucent vehicle interior image RG is brightened while its hue is kept; and when the hue deviation is less than the second threshold Z2, the hue of the translucent vehicle interior image RG is converted to the complementary-color side of the hue circle with respect to the hue of the external video, as in the sketch below. These branches form a hue conversion display control mode (FIG.
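  • Like the luminance ladder, the hue handling can be summarized as a comparison of the hue deviation against two thresholds. The following sketch is an illustrative assumption (angles in degrees, invented threshold values), not the actual circuit behaviour.

        def adjust_overlay_hue(ext_hue_deg, overlay_hue_deg, overlay_value,
                               z1=90.0, z2=30.0):
            """Return (hue, value) for the interior overlay RG given the average hue
            of the external video. Thresholds z1/z2 are illustrative only."""
            # Smallest angular distance on the hue circle (0..180 degrees).
            deviation = abs(ext_hue_deg - overlay_hue_deg) % 360.0
            deviation = min(deviation, 360.0 - deviation)

            if deviation >= z1:
                return overlay_hue_deg, overlay_value                    # hues differ enough
            if deviation >= z2:
                return overlay_hue_deg, min(1.0, overlay_value * 1.2)    # brighten only
            # Hues are too similar: jump to the complementary side of the circle.
            return (ext_hue_deg + 180.0) % 360.0, overlay_value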
  • Vehicle speed detection means (speed sensor 52) for detecting the vehicle speed is provided, and the monitor image generation means (image processing control unit 2) reads the vehicle speed detection value from the vehicle speed detection means (speed sensor 52); when the vehicle speed detection value is equal to or higher than the set value V, a warning display control mode is provided that converts the entire hue of the translucent vehicle interior image RG into a hue that makes the driver recognize the warning and displays it. For this reason, when the vehicle speed is too high, the driver is warned that the vehicle speed should be reduced by changing the overall hue of the superimposed translucent vehicle interior image RG, for example to red, so that safety can be secured.
  • The monitor image generation means (image processing control unit 2) includes a vehicle interior image color external control unit (hue external control unit 49c) that arbitrarily adjusts at least one of the luminance, hue, saturation, and lightness of the translucent vehicle interior image RG in response to an external operation on the vehicle interior image color manual operation means (hue manual control interface 6). For this reason, at least one of the luminance, hue, saturation, and lightness of the translucent vehicle interior image RG can be adjusted by manual operation to suit user preferences, which vary widely between individuals, and the visibility of the display image on the monitor 3, so the system can be made easy to use.
  • The image composition circuit (superimpose circuit 46) has a blend circuit unit 46a that, in the vehicle interior image RP from the driver's viewpoint, sets the area SE obtained by projecting the host vehicle onto the road surface as an "opaque portion DE" with a transmittance of 0%, sets the region corresponding to the window glass of the host vehicle as a "transparent portion CE" with a transmittance of 100%, and sets the remaining area as a "semi-transparent portion GE" with an arbitrary transmittance. A minimal sketch of such a region mask follows.
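  • One way to realize such region-dependent transmittance is to store the interior image with an opacity map whose values encode the three regions; the NumPy sketch below (invented helper name and masks) is an illustrative assumption that pairs with the superimpose() sketch shown earlier.

        import numpy as np

        def build_region_alpha(shape_hw, opaque_mask, window_mask, ge_transmittance=0.6):
            """Build an alpha (opacity) map for the interior image RP.

            opaque_mask      : boolean HxW mask of the vehicle-projection area SE -> DE
            window_mask      : boolean HxW mask of the window-glass area          -> CE
            ge_transmittance : user-selectable transmittance of the remaining GE area
            Illustrative assumption only; the masks would come from the stored still image.
            """
            alpha = np.full(shape_hw, 1.0 - ge_transmittance, dtype=np.float32)  # GE
            alpha[opaque_mask] = 1.0   # DE: transmittance 0%   -> overlay fully opaque
            alpha[window_mask] = 0.0   # CE: transmittance 100% -> camera fully visible
            return alpha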
  • The in-vehicle blind spot camera is the side camera 1 used in the see-through side view monitor system A1, which displays the front side portion of the vehicle, a blind spot for the driver, on the monitor 3 in the vehicle interior as an image seen through the vehicle interior. For this reason, the sense of the vehicle's extent needed to avoid dropping a wheel into a side ditch and to avoid obstacles can be understood at a glance, the space can be grasped intuitively, and the contribution to safe driving is increased.
  • The second embodiment uses, as the in-vehicle blind spot camera, a back camera disposed at the rear of the vehicle to eliminate the blind spot there, and displays the rear part of the vehicle, which is a blind spot for the driver, on the monitor as an image seen through the vehicle interior; it is an example of a see-through back view monitor system.
  • FIG. 16 is an overall system block diagram illustrating a see-through back view monitor system A2 (an example of a vehicle periphery image display system) according to the second embodiment.
  • The see-through back view monitor system A2 includes a back camera 21 (in-vehicle blind spot camera), an image processing control unit 2 (monitor image generation means), a monitor 3, a blend ratio manual control interface 4, an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
  • The back camera 21 is mounted near the license plate on the trunk lid in the case of a passenger car, or near the upper end of the rear window in the case of a large vehicle such as an RV, and images the rear part of the vehicle, which is a blind spot for the driver.
  • the back camera 21 acquires real camera video data of the rear part of the vehicle by an image sensor (CCD, CMOS, etc.).
  • The image processing control unit 2 includes a decoder 41, an image memory 42, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit). Since each component is the same as in FIG. 1 of the first embodiment, corresponding components are denoted by the same reference numerals and their description is omitted.
  • FIG. 17 shows the translucent vehicle interior image RG in which an "opaque portion DE", a "transparent portion CE", and a "translucent portion GE" are set for the vehicle interior image RP behind the vehicle in the see-through back view monitor system A2 of the second embodiment.
  • The see-through back view monitor system A2 of the second embodiment corresponds to the see-through side view monitor system A1 of the first embodiment with the side camera 1 replaced by the back camera 21.
  • The real camera video from the back camera 21 is digitized and viewpoint-converted into a virtual camera image seen from the driver's viewpoint. The vehicle-body projection of FIG. 11 is now applied to the rear of the vehicle, and the vehicle-body projection area is mapped onto the vehicle interior image that is superimposed on the virtual camera image of the back camera 21. As in the first embodiment, the shaded portion corresponding to the shadow SE of the vehicle's vertical projection is set as an "opaque portion DE" with 0% transmittance, the window glass portion is a "transparent portion CE" with 100% transmittance, and the remaining region is set as a "semi-transparent portion GE" that is alpha-blended with an arbitrary, user-definable transmittance.
  • A camera may also be installed so that the vicinity of the bumper is shown, and a sense of the vehicle is obtained by displaying the visible bumper and the vehicle trajectory lines.
  • The in-vehicle blind spot camera is the back camera 21 used in the see-through back view monitor system A2, which displays the rear part of the vehicle, a blind spot for the driver, on the monitor 3 in the passenger compartment as an image seen through the passenger compartment.
  • For this reason, the spatial relationships needed when reversing into a parking space, such as the position and distance of the own vehicle relative to a parking line, a wheel stop, a curb, or a wall, or the position and distance of a following vehicle approaching the own vehicle while driving, can be grasped intuitively, increasing the contribution to quick parking and safe driving.
  • The third embodiment uses front cameras for eliminating blind spots as the in-vehicle blind spot cameras, and displays the front part of the vehicle, which is a blind spot for the driver, on the monitor as an image seen through the passenger compartment; it is an example of a see-through front view monitor system.
  • FIG. 18 is an overall system block diagram illustrating a see-through front view monitor system A3 (an example of a vehicle periphery image display system) according to the third embodiment.
  • The see-through front view monitor system A3 includes a left front camera 31L (in-vehicle blind spot camera), a right front camera 31R (in-vehicle blind spot camera), a central front camera 31C (in-vehicle blind spot camera), an image processing control unit 2 (monitor image generation means), a monitor 3, a blend ratio manual control interface 4, an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
  • The external sensor 5 includes a turn signal switch 55 in addition to the steering angle sensor 51, the speed sensor 52, the illumination ON/OFF switch 53, and the function switch 54.
  • The image processing control unit 2 includes a left decoder 41L, a right decoder 41R, a central decoder 41C, a left image memory 42L, a right image memory 42R, a central image memory 42C, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit).
  • These components are the same as those in FIG. 1 of the first embodiment.
  • FIG. 19 is a flowchart showing the flow of the blend ratio sensor interlocking control process executed by the control circuit 45 in the see-through front view monitor system A3 of the third embodiment.
  • Each step will be described below. The user has arbitrarily changed the left/right blend ratio and set the current transmittance Tr1 to a value such as 30%, for example.
  • In step S191, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S192; if No (switch OFF), the determination of step S191 is repeated.
  • In step S192, following the determination in step S191 that the function switch 54 is ON, it is determined whether or not an ON signal is output from the turn signal switch 55. If Yes (turn signal blinking), the process proceeds to step S193; if No (turn signal off), the process returns to step S191.
  • In step S193, following the determination in step S192 that the turn signal switch 55 outputs an ON signal, it is determined whether or not the signal from the turn signal switch 55 indicates a course change to the right. If Yes (turn signal is right), the process proceeds to step S194; if No (turn signal is left), the process proceeds to step S196.
  • In step S194, following the determination in step S193 that the turn signal is right, it is determined whether or not the current transmittance Tr1 is smaller than the set value Tr0. If Yes (Tr1 < Tr0), the process proceeds to step S195; if No (Tr1 ≥ Tr0), the process proceeds to step S198. Here, the set value Tr0 is a transmittance threshold for securing the field of view to the right, which is the lane-change direction.
  • In step S195, following the determination that Tr1 < Tr0 in step S194, the transmittance of the right front camera image area is forcibly changed from the current transmittance Tr1 to the transmittance T (for example, Tr0), and the process returns to step S191.
  • In step S196, following the determination in step S193 that the turn signal is left, it is determined whether or not the current transmittance Tr1 is smaller than the set value Tr0. If Yes (Tr1 < Tr0), the process proceeds to step S197; if No (Tr1 ≥ Tr0), the process proceeds to step S198. Here, the set value Tr0 is a transmittance threshold for securing the field of view to the left, which is the lane-change direction.
  • In step S197, following the determination that Tr1 < Tr0 in step S196, the transmittance of the left front camera image area is forcibly changed from the current transmittance Tr1 to the transmittance T (for example, Tr0), and the process returns to step S191.
  • In step S198, following the determination that Tr1 ≥ Tr0 in step S194 or step S196, the current transmittance Tr1 is maintained without change, and the process returns to step S191. A condensed sketch of this flow is given after this list of steps.
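  • The flow of FIG. 19 can be condensed into a small polling routine. The sketch below is an illustrative assumption (the signal names and the function shape are invented), not the firmware of the control circuit 45.

        def update_blend_ratio(function_on, turn_signal, tr1, tr0=0.6):
            """One pass of the blend-ratio sensor interlocking control (FIG. 19).

            function_on : state of the function switch 54 (bool)
            turn_signal : 'right', 'left', or None (turn signal switch 55)
            tr1         : current user-set transmittance of the GE area (0..1)
            tr0         : threshold transmittance securing the view in the
                          lane-change direction (illustrative value)
            Returns (left_transmittance, right_transmittance).
            """
            left, right = tr1, tr1
            if not function_on or turn_signal is None:
                return left, right                      # steps S191/S192: nothing to do
            if turn_signal == 'right' and tr1 < tr0:
                right = tr0                             # step S195: open up the right view
            elif turn_signal == 'left' and tr1 < tr0:
                left = tr0                              # step S197: open up the left view
            return left, right                          # step S198: otherwise keep Tr1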
  • Other configurations are the same as those in the first embodiment.
  • FIG. 20 is a diagram illustrating an image in which “opaque portions DE” are set in the divided regions of the left, right, and center front camera images in the see-through front view monitor system A3 according to the third embodiment.
  • FIG. 21 shows a translucent vehicle interior image RG in which “opaque portion DE”, “transparent portion CE”, and “translucent portion GE” are set for the vehicle interior image RP in the see-through front view monitor system A3 of the third embodiment.
  • The images from the front cameras 31L, 31R, and 31C are digitized and subjected to viewpoint-conversion image processing, and then a superimpose screen, which is a vehicle interior image that takes the vertical projection of the vehicle shape into account, is superimposed to obtain a composite image.
  • The region where the vehicle body shape is projected perpendicularly onto the road surface in the vehicle interior image RP, captured in advance toward the front from the driver's viewpoint position, is defined as an "opaque portion DE" with a transmittance of 0%. A wide-angle screen image of 180 degrees or more is then displayed in the area of the vehicle interior image RP excluding the "opaque portion DE".
  • As shown in FIG. 20, this wide-angle screen image of 180 degrees or more is a viewpoint-converted image from the driver's viewpoint, composed so that the camera video from the central front camera 31C is displayed in the center area, the camera video from the left front camera 31L in the left area, and the camera video from the right front camera 31R in the right area. That is, since the camera image is composed so as to secure the field of view, the camera images of the left, right, and center front cameras 31L, 31R, and 31C are commonly displayed on one screen; in such a case the respective camera images are usually combined into a wide-angle screen image of 180 degrees or more, as in the sketch below.
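  • A minimal way to picture this three-way composition is to place the three viewpoint-converted frames side by side on one wide canvas; the NumPy sketch below is an illustrative assumption (equal-width regions, no seam blending), not the actual stitching done by the image processing unit 43.

        import numpy as np

        def compose_wide_angle(left_img, center_img, right_img):
            """Place three viewpoint-converted front-camera frames (HxWx3, same height)
            into one wide-angle screen: left | center | right.
            Illustrative sketch only; a real system warps and blends the seams."""
            assert left_img.shape[0] == center_img.shape[0] == right_img.shape[0]
            return np.concatenate([left_img, center_img, right_img], axis=1)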
  • A semi-transparent vehicle interior image RG divided into the "opaque portion DE", the "transparent portion CE", and the "semi-transparent portion GE" is then superimposed on the video shown in FIG. 20. As shown in the figure, the driver is thereby provided with an image that looks through the vehicle interior to the outside at the front of the vehicle.
  • As with the see-through side view monitor system A1 and the see-through back view monitor system A2 described above, not only are the blind spots eliminated, but the video also makes the vehicle's shape and size, the so-called vehicle sensation, self-evident and easy to grasp intuitively when suddenly avoiding danger.
  • If the right turn signal is detected and the current transmittance Tr1 is smaller than the set value Tr0, the process proceeds from step S191 to step S192 to step S193 to step S194 to step S195 in the flowchart of FIG. 19, and the transmittance of the "semi-transparent portion GE" in the three-divided video area is automatically adjusted.
  • In step S195, since the visual field in the right region is more important than the central region of the screen, the system performs an alpha blending operation that automatically increases the transmittance in order to secure that field of view.
  • If the left turn signal is detected and the current transmittance Tr1 is smaller than the set value Tr0, the process proceeds from step S191 to step S192 to step S193 to step S196 to step S197 in the flowchart of FIG. 19.
  • In step S197, since the visual field in the left region is more important than the central region of the screen, the system performs an alpha blending operation that automatically increases the transmittance in order to secure that field of view.
  • Alternatively, the left and right turn signals may be discriminated and only the corresponding side weighted to change its transmittance. Since the other operations are the same as in the first embodiment, their description is omitted.
  • The in-vehicle blind spot cameras are the left, right, and center front cameras 31L, 31R, and 31C used in the see-through front view monitor system A3, which displays the front part of the vehicle, a blind spot for the driver, on the monitor 3 in the passenger compartment.
  • Embodiments 1 to 3 show examples in which the luminance and hue of the translucent vehicle interior image RG superimposed on the virtual camera image are changed according to the determination result of the luminance and hue of the external video.
  • However, an example in which at least one of the luminance, hue, saturation, and lightness of the translucent vehicle interior image RG to be superimposed on the virtual camera image is changed according to the determination result of at least one of the luminance, hue, saturation, and lightness of the external video is also included in the present invention.
  • In Embodiments 1 to 3, the vehicle interior image RP prepared in advance for the translucent vehicle interior image RG superimposed on the virtual camera image was shown divided into three parts: the "opaque portion DE", the "transparent portion CE", and the "translucent portion GE".
  • However, an example in which the entire vehicle interior image RP is a "translucent portion GE" may also be used. Alternatively, the vehicle interior image RP as a whole may be a "semi-transparent portion GE" with the "opaque portion DE" indicated by an outline, or by a filled "shadow portion". The vehicle interior image RP prepared in advance may also be divided into two parts, an "opaque portion DE" (or "shadow portion") and a "translucent portion GE", or may change continuously between an "opaque portion DE" (or "shadow portion") and a transparent portion whose transmittance is varied in a gradation.
  • Embodiment 1 showed an example of a see-through side view monitor system A1 using a side camera, Embodiment 2 an example of a see-through back view monitor system A2 using a back camera, and Embodiment 3 an example of a see-through front view monitor system A3 using front cameras.
  • However, the invention can also be applied to a monitor system in which a single monitor is shared and any one of a side view, a back view, a front view, and the like can be selected, or to a monitor system that switches between them automatically under predetermined conditions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses a vehicle peripheral image display system intended to show the distinction between a virtual camera image and a semi-transparent vehicle interior image regardless of the external environmental conditions under which the real camera video is acquired, and to allow clear, see-through visual recognition of an external situation lying in the driver's blind spot in its positional relationship with the vehicle. A see-through side view monitor system (A1) is provided with a side camera (1), a monitor (3), and an image processing control unit (2). The image processing control unit (2) comprises an image processing unit (43) that converts the external video sent from the side camera (1) into a virtual camera image viewed from the driver's viewpoint, a luminance/hue determination sensor (49a) that determines the color of the external video in terms of luminance and hue, a luminance/hue conversion block (49b) that automatically adjusts the luminance/hue of the semi-transparent vehicle interior image (RG) so that the visibility of the external video is improved, and a superimpose circuit (46) that creates a composite image showing the virtual camera image seen through the semi-transparent vehicle interior image (RG) by image composition in which the semi-transparent vehicle interior image (RG) is superimposed on the virtual camera image.
PCT/JP2009/070488 2009-12-07 2009-12-07 Système d'affichage d'image périphérique de véhicule WO2011070640A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN200980162790.0A CN102714710B (zh) 2009-12-07 2009-12-07 车辆周边图像显示系统
PCT/JP2009/070488 WO2011070640A1 (fr) 2009-12-07 2009-12-07 Système d'affichage d'image périphérique de véhicule
EP09852037.2A EP2512133B1 (fr) 2009-12-07 2009-12-07 Système d'affichage d'image périphérique de véhicule
US13/514,575 US20120249789A1 (en) 2009-12-07 2009-12-07 Vehicle peripheral image display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/070488 WO2011070640A1 (fr) 2009-12-07 2009-12-07 Système d'affichage d'image périphérique de véhicule

Publications (1)

Publication Number Publication Date
WO2011070640A1 true WO2011070640A1 (fr) 2011-06-16

Family

ID=44145212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/070488 WO2011070640A1 (fr) 2009-12-07 2009-12-07 Système d'affichage d'image périphérique de véhicule

Country Status (4)

Country Link
US (1) US20120249789A1 (fr)
EP (1) EP2512133B1 (fr)
CN (1) CN102714710B (fr)
WO (1) WO2011070640A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013162328A (ja) * 2012-02-06 2013-08-19 Fujitsu Ten Ltd 画像処理装置、画像処理方法、プログラム、及び画像処理システム
EP2744694A4 (fr) * 2011-08-17 2015-07-22 Lg Innotek Co Ltd Caméra destinée à un véhicule
WO2016027689A1 (fr) * 2014-08-21 2016-02-25 アイシン精機株式会社 Dispositif de commande d'affichage d'image et système d'affichage d'image
JP2016058801A (ja) * 2014-09-05 2016-04-21 アイシン精機株式会社 画像表示制御装置および画像表示システム
EP3089075A3 (fr) * 2015-04-09 2017-01-25 Kabushiki Kaisha Tokai Rika Denki Seisakusho Dispositif de visualisation d'un véhicule
WO2019074005A1 (fr) * 2017-10-10 2019-04-18 マツダ株式会社 Dispositif d'affichage pour véhicule
EP3967554A1 (fr) * 2020-09-15 2022-03-16 Mazda Motor Corporation Système d'affichage pour véhicule
US20220080884A1 (en) * 2020-09-15 2022-03-17 Hyundai Motor Company Device and method for controlling emotional lighting of vehicle

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11021136B1 (en) * 2011-08-29 2021-06-01 The Boeing Company Methods and systems for providing a remote virtual view
US20140114534A1 (en) * 2012-10-19 2014-04-24 GM Global Technology Operations LLC Dynamic rearview mirror display features
KR101393881B1 (ko) * 2012-10-24 2014-05-12 현대자동차주식회사 차량의 주차구획 인식방법
JP6115104B2 (ja) * 2012-12-04 2017-04-19 アイシン精機株式会社 車両の制御装置、及び制御方法
CN103856823A (zh) * 2012-12-06 2014-06-11 腾讯科技(深圳)有限公司 界面调整方法、装置及终端
JP6081570B2 (ja) * 2013-02-21 2017-02-15 本田技研工業株式会社 運転支援装置、および画像処理プログラム
JP6148887B2 (ja) * 2013-03-29 2017-06-14 富士通テン株式会社 画像処理装置、画像処理方法、及び、画像処理システム
WO2015104860A1 (fr) * 2014-01-10 2015-07-16 アイシン精機株式会社 Dispositif de commande d'affichage d'image et système d'affichage d'image
WO2015123173A1 (fr) 2014-02-11 2015-08-20 Robert Bosch Gmbh Vidéo de mise en correspondance de luminosité et de couleur provenant d'un système à multiples caméras
FR3018940B1 (fr) * 2014-03-24 2018-03-09 Survision Systeme de classification automatique de vehicules automobiles
JP5989701B2 (ja) * 2014-03-24 2016-09-07 トヨタ自動車株式会社 境界検出装置および境界検出方法
CN105216715A (zh) * 2015-10-13 2016-01-06 湖南七迪视觉科技有限公司 一种汽车驾驶员视觉辅助增强系统
CN105611308B (zh) * 2015-12-18 2018-11-06 盯盯拍(深圳)技术股份有限公司 视频画面处理方法、装置以及系统
WO2017130439A1 (fr) * 2016-01-28 2017-08-03 鴻海精密工業股▲ふん▼有限公司 Système d'affichage d'image véhiculaire et véhicule comportant ledit système d'affichage d'image monté dans celui-ci
WO2017154317A1 (fr) * 2016-03-09 2017-09-14 株式会社Jvcケンウッド Dispositif de commande d'affichage de véhicule, système d'affichage de véhicule, procédé de commande d'affichage de véhicule et programme
EP3444145A4 (fr) * 2016-04-14 2019-08-14 Nissan Motor Co., Ltd. Procédé d'affichage d'environnement de corps mobile et appareil d'affichage d'environnement de corps mobile
US10306289B1 (en) 2016-09-22 2019-05-28 Apple Inc. Vehicle video viewing systems
JP6876236B2 (ja) * 2016-09-30 2021-05-26 株式会社アイシン 表示制御装置
WO2018087625A1 (fr) 2016-11-10 2018-05-17 Semiconductor Energy Laboratory Co., Ltd. Dispositif d'affichage et procédé de pilotage de dispositif d'affichage
US10936884B2 (en) * 2017-01-23 2021-03-02 Magna Electronics Inc. Vehicle vision system with object detection failsafe
US10609398B2 (en) * 2017-07-28 2020-03-31 Black Sesame International Holding Limited Ultra-low bitrate coding based on 3D map reconstruction and decimated sub-pictures
DE102017216058A1 (de) * 2017-09-12 2019-03-14 Bayerische Motoren Werke Aktiengesellschaft Dynamisch kolorierte Anzeige eines Fahrzeugs
US11678035B2 (en) 2017-10-05 2023-06-13 University Of Utah Research Foundation Translucent imaging systems and related methods
JP2019073091A (ja) * 2017-10-13 2019-05-16 トヨタ自動車株式会社 車両用表示装置
DE102018100211A1 (de) * 2018-01-08 2019-07-11 Connaught Electronics Ltd. Verfahren zum Erzeugen einer Darstellung einer Umgebung durch Verschieben einer virtuellen Kamera in Richtung eines Innenspiegels eines Fahrzeugs; sowie Kameraeinrichtung
WO2019177036A1 (fr) * 2018-03-15 2019-09-19 株式会社小糸製作所 Système d'imagerie de véhicule
WO2020226072A1 (fr) * 2019-05-07 2020-11-12 Agc株式会社 Système d'affichage, procédé d'affichage, et programme d'affichage
JP7167853B2 (ja) * 2019-05-23 2022-11-09 株式会社デンソー 表示制御装置
JP7018923B2 (ja) * 2019-12-13 2022-02-14 本田技研工業株式会社 駐車支援装置、駐車支援方法およびプログラム

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002337605A (ja) * 2001-05-18 2002-11-27 Auto Network Gijutsu Kenkyusho:Kk 車両用周辺視認装置
JP2003244688A (ja) * 2001-12-12 2003-08-29 Equos Research Co Ltd 車両の画像処理装置
JP2004350303A (ja) 2004-06-11 2004-12-09 Equos Research Co Ltd 車両の画像処理装置
JP2005335410A (ja) * 2004-05-24 2005-12-08 Olympus Corp 画像表示装置
JP2008039395A (ja) 2006-08-01 2008-02-21 Dainippon Printing Co Ltd 粘度測定装置および粘度測定方法
JP2008171314A (ja) * 2007-01-15 2008-07-24 San Giken:Kk 速度警報機能付カーナビ装置

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5280344A (en) * 1992-04-30 1994-01-18 International Business Machines Corporation Method and means for adding an extra dimension to sensor processed raster data using color encoding
US6037914A (en) * 1997-08-25 2000-03-14 Hewlett-Packard Company Method and apparatus for augmented reality using a see-through head-mounted display
JP4114292B2 (ja) * 1998-12-03 2008-07-09 アイシン・エィ・ダブリュ株式会社 運転支援装置
EP1303140A4 (fr) * 2000-07-19 2007-01-17 Matsushita Electric Ind Co Ltd Systeme de controle
US7212653B2 (en) * 2001-12-12 2007-05-01 Kabushikikaisha Equos Research Image processing system for vehicle
CN1485227A (zh) * 2002-09-24 2004-03-31 李大民 一种后视的方法及其实现该方法的装置
US20060050983A1 (en) * 2004-09-08 2006-03-09 Everest Vit, Inc. Method and apparatus for enhancing the contrast and clarity of an image captured by a remote viewing device
WO2006103835A1 (fr) * 2005-03-25 2006-10-05 Mitsubishi Denki Kabushiki Kaisha Dispositif de traitement d’image, dispositif et procede d’affichage d’image
US7612813B2 (en) * 2006-02-03 2009-11-03 Aptina Imaging Corporation Auto exposure for digital imagers
JP4305540B2 (ja) * 2007-03-22 2009-07-29 村田機械株式会社 画像処理装置
JP2009171008A (ja) * 2008-01-11 2009-07-30 Olympus Corp 色再現装置および色再現プログラム
WO2009104675A1 (fr) * 2008-02-20 2009-08-27 クラリオン株式会社 Système d'affichage d'images périphériques pour véhicule
JP2009225322A (ja) * 2008-03-18 2009-10-01 Hyundai Motor Co Ltd 車両用情報表示システム
US8334876B2 (en) * 2008-05-22 2012-12-18 Sanyo Electric Co., Ltd. Signal processing device and projection display apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002337605A (ja) * 2001-05-18 2002-11-27 Auto Network Gijutsu Kenkyusho:Kk 車両用周辺視認装置
JP2003244688A (ja) * 2001-12-12 2003-08-29 Equos Research Co Ltd 車両の画像処理装置
JP2005335410A (ja) * 2004-05-24 2005-12-08 Olympus Corp 画像表示装置
JP2004350303A (ja) 2004-06-11 2004-12-09 Equos Research Co Ltd 車両の画像処理装置
JP2008039395A (ja) 2006-08-01 2008-02-21 Dainippon Printing Co Ltd 粘度測定装置および粘度測定方法
JP2008171314A (ja) * 2007-01-15 2008-07-24 San Giken:Kk 速度警報機能付カーナビ装置

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2744694A4 (fr) * 2011-08-17 2015-07-22 Lg Innotek Co Ltd Caméra destinée à un véhicule
US10155476B2 (en) 2011-08-17 2018-12-18 Lg Innotek Co., Ltd. Camera apparatus of vehicle
JP2013162328A (ja) * 2012-02-06 2013-08-19 Fujitsu Ten Ltd 画像処理装置、画像処理方法、プログラム、及び画像処理システム
WO2016027689A1 (fr) * 2014-08-21 2016-02-25 アイシン精機株式会社 Dispositif de commande d'affichage d'image et système d'affichage d'image
JP2016043778A (ja) * 2014-08-21 2016-04-04 アイシン精機株式会社 画像表示制御装置および画像表示システム
JP2016058801A (ja) * 2014-09-05 2016-04-21 アイシン精機株式会社 画像表示制御装置および画像表示システム
EP3089075A3 (fr) * 2015-04-09 2017-01-25 Kabushiki Kaisha Tokai Rika Denki Seisakusho Dispositif de visualisation d'un véhicule
WO2019074005A1 (fr) * 2017-10-10 2019-04-18 マツダ株式会社 Dispositif d'affichage pour véhicule
JP2019071547A (ja) * 2017-10-10 2019-05-09 マツダ株式会社 車両用表示装置
EP3967554A1 (fr) * 2020-09-15 2022-03-16 Mazda Motor Corporation Système d'affichage pour véhicule
US20220080884A1 (en) * 2020-09-15 2022-03-17 Hyundai Motor Company Device and method for controlling emotional lighting of vehicle
US11603040B2 (en) * 2020-09-15 2023-03-14 Hyundai Motor Company Device and method for controlling emotional lighting of vehicle

Also Published As

Publication number Publication date
CN102714710B (zh) 2015-03-04
EP2512133A1 (fr) 2012-10-17
EP2512133A4 (fr) 2016-11-30
US20120249789A1 (en) 2012-10-04
CN102714710A (zh) 2012-10-03
EP2512133B1 (fr) 2018-07-18

Similar Documents

Publication Publication Date Title
JP5118605B2 (ja) 車両周辺画像表示システム
WO2011070640A1 (fr) Système d'affichage d'image périphérique de véhicule
JP5421788B2 (ja) 車両周辺画像表示システム
US11572017B2 (en) Vehicular vision system
US8754760B2 (en) Methods and apparatuses for informing an occupant of a vehicle of surroundings of the vehicle
JP5121737B2 (ja) 画像データにおいて重要対象物を識別するためのバーチャルスポットライト
JP5251947B2 (ja) 車両用画像表示装置
JP2010109684A5 (fr)
WO2018105417A1 (fr) Dispositif d'imagerie, dispositif de traitement d'image, système d'affichage et véhicule
JP5459154B2 (ja) 車両用周囲画像表示装置及び方法
EP2476587B1 (fr) Dispositif de contrôle des environs d'un véhicule
JP5516997B2 (ja) 画像生成装置
KR101657673B1 (ko) 파노라마뷰 생성 장치 및 방법
JP6762863B2 (ja) 撮像装置、画像処理装置、表示システム、および車両
JP7073237B2 (ja) 画像表示装置、画像表示方法
CN111086518B (zh) 显示方法、装置、车载平视显示设备及存储介质
JP6781035B2 (ja) 撮像装置、画像処理装置、表示システム、および車両
JP2021007255A (ja) 撮像装置、画像処理装置、表示システム、および車両
JP7296490B2 (ja) 表示制御装置及び車両
JP7007438B2 (ja) 撮像装置、画像処理装置、表示装置、表示システム、および車両
WO2021240873A1 (fr) Dispositif de commande d'affichage, véhicule et procédé de commande d'affichage

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980162790.0

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09852037

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13514575

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009852037

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP