WO2011070640A1 - Vehicle periphery image display system - Google Patents
Vehicle periphery image display system
- Publication number
- WO2011070640A1 (PCT/JP2009/070488)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- hue
- vehicle interior
- camera
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
- B60R2300/605—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/802—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
Definitions
- the present invention relates to a vehicle periphery image display system that displays a vehicle periphery image including a blind spot area on a monitor screen in a vehicle based on a camera image acquired from an in-vehicle blind spot camera.
- the side view monitor system currently in practical use has a side camera (CCD camera, etc.) set inside the side mirror, and the actual camera image from the side camera is displayed on the monitor screen of the front display unit that also serves the navigation system. In other words, the front side portion of the vehicle that becomes the driver's blind spot is displayed on the monitor screen, so that the driver can recognize the situation in the blind spot.
- because the side camera is placed inside the side mirror, there is a large parallax between the camera viewpoint and the driver viewpoint, and the shape of an obstacle or other object in the camera image differs completely from its shape as seen from the driver's seat.
- a converted external image is generated by converting a camera image obtained by a blind spot camera provided outside the vehicle body into a virtual camera image viewed from the viewpoint position of the driver.
- a visual recognition area image excluding the blind spot area is generated from a camera video obtained by a driver viewpoint camera provided near the driver's viewpoint position.
- a vehicle periphery image display system is known that obtains a composite image by synthesizing the converted external image and the visual recognition area image.
- in that system, the image from the rear camera attached to the trunk portion outside the vehicle is converted, by image processing, into an image viewed from the driver's viewpoint.
- in the composited rear-view image, the portion visible through the window is a live (raw) image from the internal camera, while the blind spot that the internal camera cannot capture is filled in by superimposing the external camera image through image processing.
- the boundary line for cutting out the image is matched to the window frame of the vehicle shape and the like, and the edge portion of the window frame or the like is superimposed as a thick frame-shaped superimpose.
- the conventional vehicle periphery image display system has the following problems.
- the applicant previously proposed a vehicle periphery image display system in which the actual camera image input from the vehicle-mounted blind spot camera is viewpoint-converted into a virtual camera image viewed from the driver's viewpoint position, and a semi-transparent vehicle interior image is superimposed on the virtual camera image so that the virtual camera image is displayed through it (Japanese Patent Application No. 2008-39395, filed February 20, 2008).
- however, when the translucent interior image is superimposed on the virtual camera image, if the brightness and hue of the virtual camera image and the translucent interior image are close, the outline of the vehicle and the colors of the passenger compartment blend into the subtle colors of the external image; when the external image is whitish overall, the semi-transparent vehicle interior image likewise becomes entirely whitish and transparent, so the two are hard to distinguish.
- An object of the present invention is to provide a vehicle periphery image display system in which the external situation that is a blind spot from the driver can be clearly seen through in its positional relationship with the host vehicle.
- the system includes an in-vehicle blind spot camera that is attached to the host vehicle and images the periphery of the vehicle, a monitor set at a vehicle interior position that can be visually recognized by the driver, and monitor video generation means for generating the display image on the monitor based on the actual camera image input from the in-vehicle blind spot camera.
- the monitor video generation means includes an image processing unit, an external video color determination unit, a vehicle interior image color automatic adjustment unit, and an image synthesis circuit.
- the image processing unit converts an actual camera image input from the in-vehicle blind spot camera into a virtual camera image viewed from the viewpoint position of the driver.
- the external video color determination unit determines the color of the external video from the in-vehicle blind spot camera according to at least one of luminance, hue, saturation, and brightness.
- the vehicle interior image color automatic adjustment unit automatically adjusts at least one of the luminance, hue, saturation, and brightness of the translucent vehicle interior image, obtained by making the vehicle interior image translucent, based on the color determination result of the external video, so as to improve visibility against the external video.
- the image synthesizing circuit superimposes the translucent vehicle interior image from the vehicle interior image color automatic adjustment unit on the virtual camera image from the image processing unit, and generates a composite image in which the virtual camera image shows through the translucent vehicle interior image.
- in the image composition circuit, the translucent vehicle interior image from the vehicle interior image color automatic adjustment unit is superimposed on the virtual camera image from the image processing unit to generate a composite image in which the virtual camera image shows through the translucent vehicle interior image, and this composite image is displayed on the monitor.
- the real camera video input from the in-vehicle blind spot camera is converted into a virtual camera image viewed from the viewpoint position of the driver, so the driver viewing the composite image displayed on the monitor can intuitively grasp, without parallax, the blind spot portion included in the virtual camera image.
- in the external video color determination unit, the actual camera video input from the in-vehicle blind spot camera is used as the external video, and the average luminance and hue of the external video are determined.
- in the vehicle interior image color automatic adjustment unit, based on the determination result of the external video color determination unit, the luminance and color of the translucent vehicle interior image, obtained by making the vehicle interior image translucent, are automatically adjusted so as to improve visibility against the external video; therefore, a composite image in which the virtual camera image shows through a semi-transparent vehicle interior image of improved visibility is displayed on the monitor.
- as a result, regardless of external environment conditions such as daytime, twilight, and nighttime, the distinction between the virtual camera image and the semi-transparent vehicle interior image is made clear, and the external situation that is a blind spot from the driver, shown by the virtual camera image, can be clearly seen through in its positional relationship with the host vehicle, shown by the semi-transparent vehicle interior image.
- FIG. 1 is an overall system block diagram showing the see-through side view monitor system A1 (an example of a vehicle periphery image display system) of Embodiment 1.
- FIGS. 2 to 4 are explanatory drawings showing acquisition examples 1 to 3 of the luminance/hue determination data in Embodiment 1.
- FIG. 5 is a flowchart showing the flow of the external video luminance follow-up display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1.
- FIG. 6 is a flowchart showing the flow of the sudden luminance change display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1.
- FIG. 7 is a flowchart showing the flow of the hue conversion display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1.
- FIG. 8 is a diagram showing the hue circle used in the hue conversion display control process of Embodiment 1.
- A further flowchart shows the flow of the warning display control process executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1.
- FIG. 10 is a diagram showing the vehicle interior still image RP captured from the driver's viewpoint.
- FIG. 13 is a diagram showing an image in which the “opaque portion DE” of FIG. 12 is set for the vehicle interior image RP shown in FIG. 10, in the see-through side view monitor system A1 of Embodiment 1.
- FIG. 14 is a diagram showing the translucent vehicle interior image RG in which the “opaque portion DE”, “transparent portion CE”, and “translucent portion GE” are set for the vehicle interior image RP shown in FIG. 10, in the see-through side view monitor system A1 of Embodiment 1.
- A further figure is an image diagram relating to luminance.
- An overall system block diagram shows the see-through back view monitor system A2 (an example of a vehicle periphery image display system) of Embodiment 2.
- An overall system block diagram shows the see-through front view monitor system A3 (an example of a vehicle periphery image display system) of Embodiment 3.
- A diagram shows the translucent vehicle interior image RG in which the “opaque portion DE”, “transparent portion CE”, and “translucent portion GE” are set for the vehicle interior image RP in the see-through front view monitor system A3 of Embodiment 3.
- in each embodiment, the entire system is composed of a camera for eliminating blind spots, a digital image processing unit that processes its video, and a translucent video blending unit; the basic configuration is common to the embodiments, and each configuration is devised in consideration of cost and the like.
- the first embodiment is a see-through side view monitor system that uses, as the in-vehicle blind spot camera, a side camera built into the side mirror or disposed in its vicinity, and displays an image of the front side portion of the vehicle that becomes the driver's blind spot.
- FIG. 1 is an overall system block diagram illustrating a see-through side view monitor system A1 (an example of a vehicle periphery image display system) according to the first embodiment.
- the see-through side view monitor system A1 includes a side camera 1 (in-vehicle blind spot camera), an image processing control unit 2 (monitor video generation means), a monitor 3, a blend ratio manual control interface 4 (blend ratio manual operation means), an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
- the side camera 1 is mounted in the left side mirror or disposed near the left side mirror, and images the front side portion of the vehicle that becomes the blind spot of the driver.
- the side camera 1 acquires actual camera video data of the front side portion of the vehicle by an image sensor (CCD, CMOS, etc.).
- the monitor 3 is set to a vehicle interior position (for example, an instrument panel position, etc.) that can be visually recognized by the driver, and displays an image by inputting a display image from the image processing control unit 2.
- the monitor 3 has a display screen 3a such as a liquid crystal display or an organic EL display.
- as the monitor 3, a dedicated monitor may be provided for the see-through side view monitor system A1 (the blind-spot-eliminating camera system), or the monitor of another system, such as the navigation system, may be shared.
- the image processing control unit 2 generates the video displayed on the monitor 3 based on the actual camera image input from the side camera 1 together with input information from the blend ratio manual control interface 4, the external sensor 5, and the hue manual control interface 6.
- the blend ratio manual control interface 4 is constituted by a touch panel switch of the monitor 3, for example, and arbitrarily adjusts the transmittance of the “semi-transparent portion GE” set in the vehicle interior image by manual operation.
- the external sensor 5 is a sensor or switch group that provides input information to the image processing control unit 2; as shown in FIG. 1, it includes a steering angle sensor 51, a speed sensor 52, an illumination ON/OFF switch 53, a function switch 54, and other sensors and switches.
- when the function switch 54 is turned on, the blend circuit unit 46a automatically adjusts the transmittance of the “semi-transparent portion GE” set in the vehicle interior image, based on external environment information (daytime, evening, nighttime, weather, etc.) and vehicle information (steering angle, vehicle speed, etc.) obtained by the external sensor 5, so as to enhance the visibility of the composite image displayed on the monitor 3.
- the hue manual control interface 6 is configured by, for example, a touch panel switch of the monitor 3, and arbitrarily adjusts the overall hue of the superimposed vehicle interior image by manual operation.
- the image processing control unit 2 includes a decoder 41, an image memory 42, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit).
- the decoder 41 performs analog / digital conversion on the actual camera video data input from the side camera 1.
- the image memory 42 stores the actual camera image data digitally converted from the decoder 41.
- the image processing unit 43 converts the actual camera image data input from the image memory 42 into a virtual camera image viewed from the viewpoint position of the driver.
- in the image processing unit 43, “viewpoint conversion processing as if a virtual camera were arranged near the driver's viewpoint” is performed, and various image processing (luminance adjustment, color correction, edge correction, etc.) is also performed.
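As a strongly simplified illustration of such viewpoint conversion, if the imaged scene is approximated as the road plane, remapping the side-camera image to a virtual camera near the driver's eye point reduces to applying a planar homography. The numpy sketch below is not the patent's implementation; in practice the matrix H would come from camera calibration, and all names here are illustrative:

```python
import numpy as np

def warp_viewpoint(src, H, out_shape):
    """Warp `src` with the 3x3 homography H by inverse mapping with
    nearest-neighbor sampling; pixels that map outside `src` stay black."""
    h_out, w_out = out_shape
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    # Homogeneous coordinates of every output pixel, flattened row-major.
    pts = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    sp = Hinv @ pts
    sp /= sp[2]                                   # perspective divide
    sx = np.rint(sp[0]).astype(int)
    sy = np.rint(sp[1]).astype(int)
    valid = (sx >= 0) & (sx < src.shape[1]) & (sy >= 0) & (sy < src.shape[0])
    flat = np.zeros((h_out * w_out,) + src.shape[2:], dtype=src.dtype)
    flat[valid] = src[sy[valid], sx[valid]]
    return flat.reshape((h_out, w_out) + src.shape[2:])
```

With the identity matrix the warp returns the input unchanged; a translation homography shifts the image, which is a quick sanity check on the inverse-mapping direction.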
- the image memory 44 is a memory for superimposing, and stores a vehicle interior image RP (FIG. 10) previously captured from the driver's viewpoint as a vehicle interior image.
- the control circuit (CPU) 45 is a central processing circuit that manages all information processing and control output related to image processing in accordance with input information; control programs for various image processing controls, such as external video luminance follow-up display control, sudden luminance change display control, hue conversion display control, and warning display control, are set in it.
- the superimpose circuit 46 basically converts the vehicle interior image RP from the image memory 44 into a translucent vehicle interior image RG (FIG. 14) and synthesizes it with the virtual camera image from the image processing unit 43, generating a composite image in which the virtual camera image shows through the translucent vehicle interior image RG.
- the superimpose circuit 46 includes a blend circuit unit 46a that divides the vehicle interior image RP previously captured from the driver's viewpoint and sets different transmittances in the respective regions.
- in the vehicle interior image RP (FIG. 10), the shadow area SE obtained by projecting the host vehicle onto the road surface is set as the “opaque portion DE” with a vehicle interior image transmittance of 0%, the area corresponding to the window glass of the host vehicle is set as the “transparent portion CE” with a transmittance of 100%, and the remaining area, other than the shadow area and the window glass area, is set as the “semi-transparent portion GE” (FIG. 14).
- the encoder 47 receives a composite image signal obtained by superimposing the translucent vehicle interior image RG on the virtual camera image from the superimpose circuit 46, and outputs the composite image signal to the monitor 3 through digital / analog conversion.
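The region-wise superimposition of the “opaque portion DE”, “transparent portion CE”, and “semi-transparent portion GE” amounts to a per-pixel alpha blend. The numpy sketch below is illustrative only; the function name, array shapes, and sample values are assumptions, not taken from the patent:

```python
import numpy as np

def compose(virtual_cam, interior, transmittance):
    """Blend the interior image over the virtual camera image.
    transmittance: per-pixel map in [0, 1]; 0 = opaque interior (DE),
    1 = fully transparent window area (CE), in between = translucent (GE)."""
    t = transmittance[..., None]          # broadcast over the color channels
    return (t * virtual_cam + (1.0 - t) * interior).astype(virtual_cam.dtype)

# Illustrative 2x2 frame: left column opaque (DE), right column 50% (GE).
cam = np.full((2, 2, 3), 200, dtype=np.uint8)     # external virtual image
room = np.full((2, 2, 3), 100, dtype=np.uint8)    # interior image RP
tmap = np.array([[0.0, 0.5], [0.0, 0.5]])
out = compose(cam, room, tmap)
# left pixels show only the interior (100); right pixels blend to 150
```

Adjusting the value stored in the GE region of the transmittance map is the per-region transmittance control that the blend circuit unit 46a performs.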
- the blend external control unit 48 outputs to the control circuit 45 a transmittance control command for arbitrarily adjusting the transmittance of the “semi-transparent portion GE” set in the vehicle interior image within the range of 0% to 100%.
- the luminance/hue determination sensor 49a receives the actual camera image data of the external video from the side camera 1, determines the average luminance and hue based on that data, and outputs the determination result to the control circuit 45.
- the luminance/hue determination sensor 49a takes in each pixel of the actual camera image as a digital luminance signal Y and color difference signal CbCr (or RGB data). Then, as shown in FIG. 2, the acquired original data (one screen) is accumulated line by line in the horizontal scanning direction, and the accumulated data of all the scanning lines are averaged to determine the brightness (luminance) and hue of the entire image. This determination method is the simplest.
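This line-accumulation averaging can be sketched in numpy as follows. The YCbCr channel layout and the hue-from-chroma-angle formula are assumptions made for illustration; the patent does not fix a hue formula in this excerpt:

```python
import numpy as np

def average_luma_hue(ycbcr):
    """Determine the overall brightness and hue of a frame by accumulating
    each horizontal scan line, then averaging over all lines.
    ycbcr: (H, W, 3) array of Y, Cb, Cr with Cb/Cr centered on 128.
    Hue is approximated as the angle of the mean (Cb, Cr) chroma vector,
    in degrees (an assumed formula)."""
    line_sums = ycbcr.astype(float).sum(axis=1)   # accumulate per scan line
    y_mean, cb_mean, cr_mean = line_sums.mean(axis=0) / ycbcr.shape[1]
    hue_deg = np.degrees(np.arctan2(cr_mean - 128.0, cb_mean - 128.0)) % 360.0
    return y_mean, hue_deg
```

For a uniform gray frame (Cb = Cr = 128) the chroma vector is zero, so the returned hue is the neutral 0 degrees and the luminance is the frame's Y level.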
- the luminance/hue conversion block 49b receives the luminance/hue determination result of the external video from the control circuit 45 and the translucent vehicle interior image data from the image memory 44, automatically adjusts the luminance and hue of the translucent vehicle interior image so as to improve its visibility against the external video, and outputs the adjusted translucent vehicle interior image to the superimpose circuit 46.
- the luminance/hue conversion block 49b also receives the hue control command from the hue external control unit 49c and the translucent vehicle interior image data from the image memory 44, adjusts the hue of the translucent vehicle interior image, and outputs the adjusted translucent vehicle interior image to the superimpose circuit 46.
- when a hue adjustment signal is input from the hue manual control interface 6, the hue external control unit 49c outputs to the luminance/hue conversion block 49b a hue control command for arbitrarily adjusting the overall hue of the superimposed vehicle interior image according to preference.
- FIG. 5 is a flowchart showing the flow of the external video luminance follow-up display control process (external video luminance follow-up display control mode) executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1. Each step of FIG. 5 will be described below.
- in step S51, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S52; if No (switch OFF), the process returns to step S51.
- in step S52, following the determination that the function switch 54 is ON in step S51, the luminance determination result and the hue determination result are acquired as determination data from the luminance/hue determination sensor 49a, and the process proceeds to step S53.
- in step S53, following the acquisition of the luminance/hue determination data in step S52, it is determined whether or not the detected luminance value is lower than the first set value Y1 indicating the twilight threshold. If Yes (luminance detection value < Y1), the process proceeds to step S55; if No (luminance detection value ≥ Y1), the process proceeds to step S54.
- in step S54, following the determination that the luminance detection value ≥ Y1 in step S53, the luminance of the vehicle interior image to be superimposed is set to the normal state, and the process returns to step S51.
- here, “luminance” is a photometric quantity of light emitted from a light source or a secondary light source (a reflecting or transmitting surface) toward the observer, evaluated with the spectral sensitivity of the human eye (the CIE standard spectral luminous efficiency V(λ)), considering only a specific direction (the observation direction).
- the “normal luminance state” refers to a luminance at which the translucent vehicle interior image superimposed on the virtual camera image is clearly distinguished when the brightness outside the vehicle is daytime brightness.
- in step S55, following the determination that the luminance detection value < Y1 in step S53, it is determined whether the detected luminance value is lower than the second set value Y2 (< Y1) indicating the nighttime threshold. If Yes (luminance detection value < Y2), the process proceeds to step S57; if No (luminance detection value ≥ Y2), the process proceeds to step S56.
- in step S56, following the determination that the luminance detection value ≥ Y2 in step S55, the luminance of the vehicle interior image to be superimposed is shifted up, or its brightness is increased, and the process returns to step S51.
- here, “brightness (lightness)” is one of the three attributes of color and refers to how light or dark the color is; the three attributes of color are “lightness”, “hue”, and “saturation (vividness of the color)”.
- in step S57, following the determination that the luminance detection value < Y2 in step S55, the luminance of the vehicle interior image to be superimposed is inverted so that black lines are displayed as white lines, and the process returns to step S51.
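The three-way decision of FIG. 5 condenses into a few lines of illustrative Python; the threshold values and the returned mode labels are hypothetical stand-ins for the patent's internal processing:

```python
def interior_luma_mode(luma, Y1, Y2):
    """External-video luminance follow-up logic of FIG. 5 (sketch).
    Y1 = twilight threshold, Y2 = nighttime threshold (Y2 < Y1).
    Returns the adjustment to apply to the superimposed interior image."""
    if luma >= Y1:      # daytime brightness: keep the normal luminance state
        return "normal"
    if luma >= Y2:      # twilight: shift luminance up / raise brightness
        return "shift_up"
    return "invert"     # nighttime: invert so black lines display as white

# e.g. with Y1=120, Y2=40: 200 -> normal, 80 -> shift_up, 10 -> invert
```

The ordering of the tests mirrors steps S53 and S55: the twilight threshold Y1 is checked first, and only below it is the nighttime threshold Y2 consulted.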
- FIG. 6 is a flowchart showing the flow of the sudden luminance change display control process (sudden luminance change display control mode) executed by the control circuit 45 in the see-through side view monitor system A1 of Embodiment 1. Each step of FIG. 6 will be described below.
- in step S61, it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S62; if No (switch OFF), the process returns to step S61.
- in step S62, following the determination that the function switch 54 is ON in step S61, the luminance determination result and the hue determination result are acquired as determination data from the luminance/hue determination sensor 49a, and the process proceeds to step S63.
- in step S63, following the acquisition of the luminance/hue determination data in step S62, it is determined whether or not the detected luminance value is higher than the third set value Y3 indicating the upper limit threshold. If Yes (luminance detection value > Y3), the process proceeds to step S65; if No (luminance detection value ≤ Y3), the process proceeds to step S64.
- in step S64, following the determination that the luminance detection value ≤ Y3 in step S63, the luminance of the vehicle interior image to be superimposed is set to the normal state, and the process returns to step S61.
- in step S65, following the determination that the luminance detection value > Y3 in step S63, the luminance of the vehicle interior image to be superimposed is shifted down, or its brightness is decreased, and the process returns to step S61.
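Likewise, the FIG. 6 logic reduces to a single threshold test; the value of Y3 and the mode labels below are hypothetical illustrations:

```python
def sudden_brightness_mode(luma, Y3):
    """Sudden luminance change logic of FIG. 6 (sketch): when the detected
    luminance exceeds the upper limit threshold Y3, the superimposed interior
    image's luminance is shifted down; otherwise it stays in the normal
    state. Labels are illustrative, not the patent's terms."""
    return "shift_down" if luma > Y3 else "normal"
```

Used together with `interior_luma_mode`, this covers both the gradual follow-up of FIG. 5 and the sudden-change response of FIG. 6.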
- FIG. 7 is a flowchart showing the flow of hue conversion display control processing executed by the control circuit 45 in the see-through side view monitor system A1 of the first embodiment (hue conversion display control mode).
- FIG. 8 is a diagram illustrating a hue circle used in the hue conversion display control process according to the first embodiment. Hereinafter, each step of FIG. 7 will be described.
- step S71 it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S72. If No (switch OFF), the process returns to step S71.
- step S72 following the determination that the function switch 54 is ON in step S71, the luminance determination result and the hue determination result are obtained as determination data by the luminance / hue determination sensor 49a, and the process proceeds to step S73.
- step S73 following the acquisition of the luminance / hue determination data in step S72, the determined hue detection value is compared with the hue set value X, and it is determined whether or not the hue shift given by their difference is less than the first threshold Z1. If Yes (hue shift < Z1), the process proceeds to step S75. If No (hue shift ≥ Z1), the process proceeds to step S74.
- the set value X of the hue is obtained by determining the hue of the vehicle interior image before the hue conversion stored in the image memory 44.
- step S74 following the determination in step S73 that the hue shift ≥ Z1, the hue of the vehicle interior image to be superimposed is maintained as it is, and the process returns to step S71.
- step S75 following the determination in step S73 that the hue shift < Z1, the determined hue detection value is compared with the hue set value X, and it is determined whether or not the hue shift given by their difference is less than the second threshold Z2 (< Z1). If Yes (hue shift < Z2), the process proceeds to step S77. If No (hue shift ≥ Z2), the process proceeds to step S76.
- step S76 following the determination in step S75 that the hue shift ≥ Z2, the luminance of the vehicle interior image to be superimposed is shifted up (the brightness is increased), and the process returns to step S71.
- step S77 following the determination in step S75 that the hue shift is smaller than Z2, the hue of the vehicle interior image to be superimposed is converted, displayed in a complementary color, and the process returns to step S71.
- here, conversion to a complementary-system hue means conversion to the color located diagonally opposite in the hue circle shown in FIG. 8; diagonally opposite colors are in a complementary relationship.
- for example, when the hue of the external video is “red”, conversion to “cyan” is most preferable. In addition to “cyan”, “blue” and “green” also lie in the diagonally opposite complementary region, and therefore the hue may instead be converted to “blue” or “green”.
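The hue-deviation decision of steps S73 to S77 and the complementary conversion on the FIG. 8 hue circle can be sketched with hue angles in degrees. The threshold values Z1 and Z2 here are illustrative assumptions, not values from the specification.

```python
def hue_control(ext_hue, interior_hue, z1=60.0, z2=20.0):
    """FIG. 7 decision: keep, brighten, or convert to the complementary hue.

    Hues are angles in degrees on the hue circle; z1/z2 are assumed values.
    """
    # circular distance between the two hues, always in [0, 180]
    shift = abs((ext_hue - interior_hue + 180.0) % 360.0 - 180.0)
    if shift >= z1:
        return interior_hue, "keep"        # step S74: hues already distinct
    if shift >= z2:
        return interior_hue, "brighten"    # step S76: same hue, raise lightness
    # step S77: rotate 180 degrees to the diagonally opposite (complementary) hue
    return (interior_hue + 180.0) % 360.0, "complement"
```

For a red external video (0°) and a nearly red interior image (5°), the conversion yields a cyan-side hue (185°), matching the red-to-cyan example in the text.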
- FIG. 9 is a flowchart showing a flow of warning display control processing executed by the control circuit 45 in the see-through side view monitor system A1 of the first embodiment (warning display control mode). Hereinafter, each step of FIG. 9 will be described.
- step S91 it is determined whether or not the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S92. If No (switch OFF), the process returns to the determination in step S91.
- step S92 following the determination that the function switch 54 is ON in step S91, the speed sensor 52 obtains speed data and also obtains obstacle presence / absence data from the image processing unit 43, and the process proceeds to step S93.
- here, the “obstacle presence / absence data” is obtained by analyzing the actual camera image data input to the image processing unit 43 and determining whether an image indicating an obstacle exists in the analyzed image.
- step S93 following the acquisition of the speed data and obstacle presence / absence data in step S92, it is determined whether or not the speed detection value is smaller than the set value V. If Yes (speed detection value < V), the process proceeds to step S95. If No (speed detection value ≥ V), the process proceeds to step S94.
- the set value V is a vehicle speed value in the low-speed range used as a threshold for distinguishing low-speed driving from normal driving.
- step S94 following the determination in step S93 that the speed detection value ≥ V, the hue of the entire screen of the vehicle interior image to be superimposed is changed to a warning red system, and the process returns to step S91.
- step S95 following the determination in step S93 that the speed detection value < V, it is determined from the acquired obstacle presence / absence data whether or not no obstacle is recognized. If Yes (no obstacle recognized), the process proceeds to step S97. If No (an obstacle is recognized), the process proceeds to step S96.
- step S96 following the determination in step S95 that an obstacle is recognized, the hue of the entire screen of the vehicle interior image to be superimposed is changed in accordance with the degree of obstacle approach (for example, gradually from orange to red), and the process returns to step S91.
- step S97 following the determination that no obstacle is recognized in step S95, the brightness and hue of the vehicle interior image to be superimposed are maintained, and the process returns to step S91.
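The warning logic of FIG. 9 (steps S93 to S97) can be sketched as below. The speed threshold, warning distance, and RGB tint values are illustrative assumptions; the specification only states that the hue shifts gradually from orange toward red as an obstacle approaches.

```python
def warning_tint(speed_kmh, obstacle_distance_m, v=20.0, d_warn=5.0):
    """Return an RGB tint for the whole interior image, or None to keep it.

    v: low-speed threshold (set value V, assumed); d_warn: distance inside
    which the orange-to-red gradient of step S96 is applied (assumed).
    """
    if speed_kmh >= v:
        return (255, 0, 0)                      # step S94: red whole-screen warning
    if obstacle_distance_m is not None and obstacle_distance_m < d_warn:
        t = 1.0 - obstacle_distance_m / d_warn  # 0 = far, 1 = touching
        return (255, int(165 * (1 - t)), 0)     # step S96: orange fades to red
    return None                                  # step S97: keep luminance and hue
```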
- the object of the present invention, including the first to third embodiments, is to inexpensively construct a system that, using an external camera capable of contributing to the elimination of blind spots and image processing to display the camera video, lets the driver intuitively recognize that the displayed image is seen through the vehicle, and further to propose a vehicle peripheral image display system with which the behavior of the vehicle can be grasped in the image.
- the main contents of the display system proposed by the present inventor are as follows.
- -Build a display method that allows the driver to intuitively grasp the traveling direction, the vehicle size, and other aspects of vehicle sensation simply by looking at the video displayed on the monitor.
- -A system that can change the image transmissivity freely according to the driver's preference.
- a display system capable of automatically changing the basic transmittance according to the operating status is constructed. For example, the transmittance is changed according to the luminance of the external video at dusk so that the external video is easy to see.
- -Build a display system that automatically controls the brightness and hue of the translucent interior image to be superimposed according to the brightness and hue of the external video.
- the system will be able to change the brightness and hue by manual operation according to the preference of visibility with large individual differences. For example, the brightness of the translucent interior image is increased at twilight, the brightness of the translucent interior image is inverted at night, and the brightness of the translucent interior image is decreased when the oncoming vehicle lights.
- when the hue of the external video and that of the translucent vehicle interior image are similar, the hue of the translucent vehicle interior image is converted so as to create a hue difference.
- since the hue can be controlled, the entire screen displayed on the monitor is used to perform a visual warning action. For example, when safety is impaired, in conjunction with vehicle speed information or obstacle information, the hue of the entire translucent vehicle interior image is converted to a red hue or the like representing a warning.
- the operations of the see-through side view monitor system A1 of the first embodiment are described below under “monitor image display operation by transmission image”, “transmittance change operation of the translucent portion”, “external video luminance follow-up display control operation”, “sudden brightness change display control operation”, “hue conversion display control operation”, and “warning display control operation”.
- FIG. 10 is a diagram showing a vehicle interior still image previously captured from the driver's viewpoint position toward the left front side.
- FIG. 11 is a perspective view illustrating a state in which a vehicle body shape is projected onto a road surface from a vehicle on which the see-through side view monitor system A1 according to the first embodiment is mounted.
- FIG. 12 shows an image (opaque portion) when an image obtained by projecting the vehicle body shape onto the road surface from a vehicle on which the see-through side view monitor system A1 according to the first embodiment is mounted is seen from the driver viewpoint position.
- FIG. 13 is a diagram illustrating an image in which the “opaque portion DE” of FIG. 12 is set with respect to the vehicle interior image RP illustrated in FIG.
- FIG. 14 shows a translucent vehicle interior in which “opaque part DE”, “transparent part CE”, and “translucent part GE” are set for the vehicle interior image RP shown in FIG. 10 in the see-through side view monitor system A1 of the first embodiment. It is a figure which shows the image RG.
- a monitor image display operation using a transmission image will be described with reference to FIGS.
- the real camera video input from the side camera 1 is analog / digital converted by the decoder 41 and stored in the image memory 42. Thereafter, in the image processing unit 43, “image processing including various processing (brightness adjustment, color correction, edge correction, etc.)” and “viewpoint conversion processing as if a virtual camera had been placed near the driver's viewpoint” are performed to obtain a virtual camera image.
- the image memory 44 stores a vehicle interior image RP (FIG. 10) previously captured from the driver's viewpoint as a vehicle interior image.
- the superimpose circuit 46 makes the vehicle interior image RP from the image memory 44 translucent to obtain a translucent vehicle interior image RG (FIG. 14), and superimposes it on the virtual camera image from the image processing unit 43, generating a composite image in which the virtual camera image is seen through the translucent vehicle interior image RG.
- the composite image superimposed in the superimpose circuit 46 is sent to the encoder 47, and after that, undergoes digital / analog conversion in the encoder 47, is output to the monitor 3, and is displayed on the display screen 3a.
- in FIG. 11, the vehicle is drawn high above the road surface so that the projection can be easily discerned.
- this projection plane is projected vertically on the road surface such as a road and is at the same height as the ground contact surface of the tire.
- since the image shown in FIG. 12 shows the shape of the vehicle body during actual driving, touching this projected image when it is superimposed on the image from the driver's viewpoint means contact with the vehicle body.
- by superimposing the viewpoint-converted image and this projection plane, the vehicle sensation necessary for avoiding dropping a wheel into a side ditch or for avoiding obstacles can be grasped intuitively at a glance just by looking at this image (side view screen), so the contribution to safe driving increases. That is, as shown in FIG. 13, the projected part of the vehicle body shape is an “opaque part DE” having a transmittance of 0%, and the vehicle interior image RP is displayed as it is.
- the vehicle interior image RP is displayed with an arbitrary transmittance by using an α (alpha) blend with the side camera image.
- a “transparent portion CE” with a transmittance of 100% is set, and the remaining is a “translucent portion GE”.
- the vehicle interior image RP from the driver's viewpoint is displayed 100% as a translucent vehicle interior image RG.
- the actual camera video of the side camera 1 installed on the side mirror is converted by viewpoint conversion image processing into a virtual camera image as if it had been photographed by a virtual camera from the driver viewpoint position.
- a translucent image with a more realistic feeling can be expressed by superimposing and displaying the translucent vehicle interior image on the virtual camera image.
- the positional relationship between the transmitted external image and the actual vehicle can be displayed clearly and understandably.
- the transmittance of the interior image RP in the shadow area, projected according to the size and shape of the vehicle body onto the road surface (the virtual-space screen used when performing the viewpoint conversion described above) as seen from the driver viewpoint position, is displayed as the “opaque part DE” with 0% transmittance; the other areas are displayed as regions having arbitrary transmittance (the “transparent part CE” and the “semi-transparent part GE”). Therefore, the monitor screen 3a is displayed as a superimposed screen in which the size and shape of the vehicle body are clear, while the other part is a blended screen with the camera video. As a result, the behavior of the vehicle becomes clear at a glance, and it becomes easy to judge the possibility of a wheel dropping off the road.
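Per pixel, the three-region composition can be sketched as an α-blend. The region labels follow the text (“DE” opaque, “CE” transparent, “GE” adjustable); the 50% default for the semi-transparent part is only an assumption, since the text says its transmittance is arbitrary.

```python
def blend_pixel(camera_rgb, interior_rgb, region, ge_transmittance=0.5):
    """Blend one virtual-camera pixel with one interior-image pixel.

    Transmittance of the interior image: 0.0 in the opaque part DE (interior
    image shown as-is), 1.0 in the transparent part CE (camera image shown
    as-is), and an adjustable 0.0-1.0 value in the semi-transparent part GE.
    """
    alpha = {"DE": 0.0, "CE": 1.0, "GE": ge_transmittance}[region]
    return tuple(round(alpha * c + (1.0 - alpha) * i)
                 for c, i in zip(camera_rgb, interior_rgb))
```

The `ge_transmittance` parameter corresponds to the 0%–100% adjustment performed manually through the blend ratio manual control interface 4 or automatically by the blend circuit unit 46a.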
- if a driver viewpoint camera were used, the system would become redundant. That is, two cameras, a blind-spot camera and a driver viewpoint camera, would be used, and since a driver viewpoint camera would have to be added to the existing system, the cost of the system would increase.
- in addition, since a driver viewpoint camera can only be provided near, not exactly at, the driver viewpoint position, the camera image obtained from it exhibits parallax with respect to the image actually seen from the driver's viewpoint position.
- this proposal does not require the addition of a driver viewpoint camera to the existing system and does not increase costs.
- since the image is converted to a virtual camera viewpoint at the driver viewpoint position, no parallax occurs.
- moreover, since the shape of the vehicle body can be grasped at a glance, a necessary and sufficient effect is obtained: the traveling direction and the avoidance of approaching obstacles are very easy to understand, which can contribute to safe driving.
- the entire screen is not a uniform translucent image; by combining a 100%-transmittance region, a region blendable from 100% to 0%, and image parts with several levels of transparency, the various misunderstandings and recognition errors that a uniform screen can cause are reduced.
- if a single predetermined fixed transmittance were used for the “translucent portion GE”, the user could not change the transmittance arbitrarily and usability would suffer; moreover, under environmental changes, visibility could be reduced by a screen of uniform transmittance.
- in contrast, the transmittance of the “semi-transparent portion GE” can be adjusted manually or automatically. That is, when a transmittance adjustment signal is input to the blend external control unit 48 by a manual operation on the blend ratio manual control interface 4, a transmittance control command is output to the control circuit 45, and in the blend circuit unit 46a the transmittance of the “semi-transparent part GE” set in the vehicle interior image is arbitrarily adjusted in the range of 0% to 100%.
- when the function switch 54 is turned on, the blend circuit unit 46a automatically adjusts the transmittance of the “semi-transparent portion GE” set in the vehicle interior image, based on external environment information (daytime, evening, nighttime, weather, etc.) and vehicle information (steering angle, vehicle speed, etc.) obtained by the external sensor 5, so as to enhance the visibility of the composite image displayed on the monitor 3.
- by making the blend ratio variable by manual operation, the transmittance of the “semi-transparent part GE” can be freely set and updated, which makes the system very easy to use. Further, when the function switch 54 is turned on, the system automatically adjusts the transmittance of the “semi-transparent portion GE” without requiring any user operation, and high visibility of the composite image displayed on the monitor 3 can be maintained.
- FIG. 15 is an image diagram showing a luminance reversal screen when the translucent vehicle interior image RG is reversed and displayed in the see-through side view monitor system A1 according to the first embodiment.
- the display control action by the external video luminance follow-up display control mode will be described with reference to FIG.
- when the external video captured by the side camera 1 is bright, as in the daytime, and the luminance detection value is equal to or greater than the first set value Y1, the flow step S51 → step S52 → step S53 → step S54 in the flowchart of FIG. 5 is repeated. That is, in step S54, the luminance of the translucent vehicle interior image RG to be superimposed is set to a normal luminance state that is highly distinguishable when the external video is bright.
- when the luminance detection value is lower than the first set value Y1 but equal to or greater than the second set value Y2, the flow step S51 → step S52 → step S53 → step S55 → step S56 in the flowchart of FIG. 5 is repeated. That is, in step S56, the luminance or brightness of the translucent vehicle interior image RG to be superimposed is shifted up.
- if the brightness (lightness) of the translucent vehicle interior image RG to be superimposed were kept in its normal state at twilight, the overall brightness of the display screen 3a of the monitor 3 would fall, the semi-transparent vehicle interior image RG could melt into the dim virtual camera image, and visibility could be impaired. By shifting up the luminance or brightness, the two superimposed images are given a luminance difference (brightness difference), which is useful for improving visibility.
- when the external image captured by the side camera 1 is dark, as at night, and the luminance detection value is lower than the second set value Y2, the flow step S51 → step S52 → step S53 → step S55 → step S57 in the flowchart of FIG. 5 is repeated. That is, in step S57, as shown in FIG. 15, the luminance of the translucent vehicle interior image RG to be superimposed is inverted, and black lines are displayed as white lines.
- if the brightness (lightness) of the translucent vehicle interior image RG to be superimposed were kept in its normal state at night when the external video is dark, the overall brightness of the display screen 3a of the monitor 3 would be greatly reduced, the translucent vehicle interior image RG would dissolve completely into the dark virtual camera image, and visibility would be significantly impaired. Even if the luminance of the superimposed translucent vehicle interior image RG were simply increased, it would still merge into the dark virtual camera image, and the two images could not be distinguished. In contrast, by inverting the luminance of the translucent vehicle interior image RG superimposed on the dark external video, as in a photographic negative, dark black line portions are displayed as white lines and bright white line portions as dark lines, so the two images remain distinguishable.
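The negative-image inversion of step S57 amounts to complementing each pixel value of the interior image. A sketch on an assumed 8-bit grayscale:

```python
def invert_luminance(pixel_value):
    """Photographic-negative inversion: a dark black line (e.g. 20) becomes
    a bright white line (235) that stands out against a dark external video."""
    return 255 - pixel_value
```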
- the blend ratio manual control interface 4 achieves an increase in brightness by increasing the transmittance and a decrease in brightness by decreasing the transmittance.
- in step S64, the luminance of the translucent vehicle interior image RG to be superimposed is set to the normal state or to the luminance-changed state determined by the external video luminance follow-up display control.
- when the luminance detection value becomes higher than the third set value Y3, the flow step S61 → step S62 → step S63 → step S65 in the flowchart of FIG. 6 is repeated. That is, in step S65, the luminance or brightness of the translucent vehicle interior image RG to be superimposed is shifted down.
- in such a case, even though a certain amount of brightness correction is applied, the external image on the monitor 3 is essentially whited out, and the superimposed translucent vehicle interior image RG becomes indistinguishable.
- brightness adjustment of the in-vehicle camera, including auto iris, can provide a certain anti-glare effect and keep the superimposed images distinguishable, but when the luminance changes steeply, the brightness adjustment function of the in-vehicle camera cannot follow.
- therefore, the luminance or brightness of the superimposed semi-transparent vehicle interior image RG is shifted down, and a semi-transparent vehicle interior image RG that is darkened overall is superimposed on the whited-out external video (virtual camera image).
- the distinction between the external video and the translucent vehicle interior image RG becomes clear, which is useful for improving the visibility when there is a steep luminance change.
- when the hue of the external video differs from the hue of the translucent vehicle interior image RG and the hue deviation between the hue detection value and the set value X is equal to or greater than the first threshold Z1, the flow step S71 → step S72 → step S73 → step S74 in the flowchart of FIG. 7 is repeated. That is, in step S74, the hue of the translucent vehicle interior image RG to be superimposed is maintained at the current hue.
- when the hue of the external video approaches the hue of the translucent vehicle interior image RG and the hue deviation between the hue detection value and the set value X is less than the first threshold Z1 but equal to or greater than the second threshold Z2, the flow step S71 → step S72 → step S73 → step S75 → step S76 in the flowchart of FIG. 7 is repeated. That is, in step S76, the luminance (lightness) of the translucent vehicle interior image RG to be superimposed is increased.
- when the hue deviation is less than the second threshold Z2, the flow step S71 → step S72 → step S73 → step S75 → step S77 in the flowchart of FIG. 7 is repeated. That is, in step S77, the hue of the translucent vehicle interior image RG to be superimposed is converted into the complementary hue located diagonally opposite in the hue circle shown in FIG. 8 and displayed.
- that is, when the hue of the external video and the hue of the translucent vehicle interior image RG are close to each other but still different, visibility can be improved by increasing the brightness (lightness) of the superimposed semi-transparent vehicle interior image RG.
- for example, when the external video is mainly green, visibility deteriorates if the superimposed translucent interior image RG is also mainly green. By converting the translucent vehicle interior image RG to “magenta, which lies diagonally opposite in the hue circle”, the distinction becomes clear.
- likewise, when the external video is mainly red and the superimposed semi-transparent vehicle interior image RG is also mainly red, visibility deteriorates. By converting the translucent vehicle interior image RG to “cyan, which lies diagonally opposite in the hue circle”, the distinction becomes clear.
- in this way, when the hue of the external video and the hue of the translucent vehicle interior image RG are at the same hue level, changing the hue to a complementary color system makes the external video and the translucent vehicle interior image RG clearly distinguishable on the monitor image.
- when the side-view monitor display is maintained even in the normal driving state and the speed detection value is equal to or higher than the set value V, the flow step S91 → step S92 → step S93 → step S94 in the flowchart of FIG. 9 is repeated. That is, in step S94, the hue of the entire screen of the semi-transparent vehicle interior image RG to be superimposed is converted to red.
- when an obstacle is recognized in the vicinity of the own vehicle, the flow step S91 → step S92 → step S93 → step S95 → step S96 in the flowchart of FIG. 9 is repeated. That is, in step S96, the hue of the entire screen of the semi-transparent vehicle interior image RG to be superimposed is converted so as to gradually strengthen the red component in accordance with the degree of approach of the obstacle.
- when the speed detection value is lower than the set value V and no obstacle is recognized, the flow step S91 → step S92 → step S93 → step S95 → step S97 in the flowchart of FIG. 9 is repeated. That is, in step S97, the luminance and hue of the translucent vehicle interior image RG to be superimposed are maintained.
- the side view monitor system is used for safety checks, such as when pulling the vehicle alongside a road edge or when starting the engine, and is used frequently when the vehicle is stopped or traveling at very low speed. Therefore, if the system is used while the vehicle is above a certain speed, the driver keeps watching the monitor 3, so the sense of speed and of space is often emphasized more than in reality and feels incongruous. Since this would impair safety in the original sense, it is better to judge such a situation and issue a warning that the speed should be reduced for safe driving. Also, when the image processing control unit 2 recognizes an obstacle by image analysis, it is better to give a warning notifying the driver of the presence of the obstacle, in order to encourage driving that avoids it.
- therefore, in the see-through side view monitor system A1 of the first embodiment, a warning operation is performed in conjunction with the speed sensor 52 and the like, and when the vehicle speed is excessive the driver is warned that it should be reduced, so that safety can be secured.
- in addition, a warning display that uses a change in hue according to the degree of proximity of an obstacle prompts a quick response to collisions, entanglement, wheel dropping, and the like involving the host vehicle, so that safety can be secured.
- the vehicle peripheral image display system (see-through side view monitor system A1) comprises an in-vehicle blind-spot camera (side camera 1) that is attached to the host vehicle and images the periphery of the vehicle, a monitor 3 that is set at a vehicle interior position visible to the driver, and monitor image generation means (image processing control unit 2) for generating a display image for the monitor 3 based on the real camera video input from the in-vehicle blind-spot camera (side camera 1).
- the monitor image generation means (image processing control unit 2) includes: an image processing unit 43 that converts the viewpoint of the actual camera video input from the vehicle-mounted blind-spot camera (side camera 1) into a virtual camera image viewed from the driver viewpoint position; an external video color determination unit (luminance / hue determination sensor 49a) that determines at least one color attribute of the external video; a vehicle interior image color automatic adjustment unit (luminance / hue conversion block 49b) that, based on the color determination result of the external video, automatically adjusts at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG obtained by making the vehicle interior image translucent, so as to improve visibility relative to the external video; and an image composition circuit (superimpose circuit 46) that generates a composite image in which the virtual camera image is seen through the translucent vehicle interior image RG, by image composition in which the translucent vehicle interior image RG from the vehicle interior image color automatic adjustment unit (luminance / hue conversion block 49b) is superimposed on the virtual camera image from the image processing unit 43. Therefore, regardless of the external environmental conditions under which the real camera video is acquired, the distinction between the virtual camera image and the translucent vehicle interior image RG is made clear, and the external situation in the driver's blind spot can be seen through together with its positional relationship to the own vehicle.
- the monitor image generating means (image processing control unit 2) has an image storage unit (image memory 44) for storing a vehicle interior still image previously captured from the viewpoint of the driver as a vehicle interior image.
- the vehicle interior image color automatic adjustment unit (luminance / hue conversion block 49b) obtains a translucent vehicle interior image RG by making the vehicle interior image from the image storage unit (image memory 44) translucent. .
- for this reason, the driver viewpoint camera can be omitted, yielding an inexpensive system that uses only the in-vehicle blind-spot camera (side camera 1), and a translucent vehicle interior image RG from the driver's viewpoint can be acquired without the parallax with respect to the actual driver viewpoint that a driver viewpoint camera would introduce.
- the external video color determination unit is a luminance / hue determination sensor 49a that determines the average luminance and hue of the external video from the in-vehicle blind-spot camera (side camera 1), and the vehicle interior image color automatic adjustment unit is a luminance / hue conversion block 49b that automatically adjusts the luminance and hue of the translucent vehicle interior image RG, based on the color determination result of the external video by the luminance / hue determination sensor 49a, so as to improve visibility relative to the luminance and hue of the external video. For this reason, the luminance and hue of the translucent vehicle interior image RG can be automatically adjusted to improve visibility in accordance with the average luminance and hue of the external video from the in-vehicle blind-spot camera (side camera 1).
- the monitor image generation means (image processing control unit 2) reads the luminance detection value from the luminance / hue determination sensor 49a and has an external video luminance follow-up display control mode (FIG. 5) in which: when the luminance detection value is equal to or higher than the first set value Y1 indicating the twilight threshold, the semi-transparent vehicle interior image RG is displayed at its normal luminance; when the luminance detection value is lower than the first set value Y1 but equal to or higher than the second set value Y2 indicating the nighttime threshold, the luminance of the translucent vehicle interior image RG is increased to give a luminance difference from the external video; and when the luminance detection value is lower than the second set value Y2, the luminance of the translucent vehicle interior image RG is inverted.
- the monitor image generation means (image processing control unit 2) reads the luminance detection value from the luminance / hue determination sensor 49a and has a sudden-brightness-change display control mode (FIG. 6) in which, when the luminance detection value becomes higher than the third set value Y3 indicating the upper limit side threshold, the luminance of the translucent vehicle interior image RG is reduced and the image is displayed dark overall. For this reason, when there is a steep luminance change, such as light from an oncoming vehicle striking the in-vehicle blind-spot camera (side camera 1) at night, the distinction between the external video displayed on the monitor 3 and the translucent vehicle interior image RG can be made clear and visibility improved.
- the monitor image generation means (image processing control unit 2) reads the hue detection value from the luminance / hue determination sensor 49a and has a hue conversion display control mode (FIG. 7) in which: when the hue deviation between the hue detection value and the set value X is equal to or greater than the first threshold Z1, the hue of the translucent vehicle interior image RG is displayed as it is; when the hue deviation is less than the first threshold Z1 but equal to or greater than the second threshold Z2, the hue is kept and the color is brightened; and when the hue deviation is less than the second threshold Z2, the hue of the translucent vehicle interior image RG is converted to the complementary side of the hue circle with respect to the hue of the external video and displayed.
- vehicle speed detection means (speed sensor 52) for detecting the vehicle speed is provided, and the monitor image generation means (image processing control unit 2) reads the vehicle speed detection value from the vehicle speed detection means. When the vehicle speed detection value is equal to or greater than a set value, a warning display control mode converts the entire hue of the semi-transparent vehicle interior image RG into a hue that makes the driver recognize a warning. Thus, when the vehicle speed is too high, changing the overall hue of the superimposed semi-transparent vehicle interior image RG to, for example, red gives the driver a warning that the speed should be reduced, so that safety can be secured.
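The warning mode above reduces to a simple hue override; the speed threshold and the warning hue below are assumptions for illustration.

```python
SPEED_LIMIT_KMH = 40.0   # assumed warning threshold (set value)
WARNING_HUE = 0.0        # red on a 0-360 hue circle

def overlay_hue(current_hue: float, vehicle_speed_kmh: float) -> float:
    """Replace the overlay's overall hue with the warning hue when the
    detected vehicle speed reaches the set value; otherwise keep it."""
    if vehicle_speed_kmh >= SPEED_LIMIT_KMH:
        return WARNING_HUE
    return current_hue
```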
- the monitor image generation means (image processing control unit 2) has a vehicle interior image color external control unit (hue external control unit 49c) that arbitrarily adjusts at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG in response to an external operation on the vehicle interior image color manual operation means (hue manual control interface 6). Thus, at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG can be adjusted manually to suit the widely varying preferences of users and the visibility of the display image on the monitor 3, making the system easy to use.
- the image composition circuit (superimpose circuit 46) has a blend circuit unit 46a that sets the area SE, obtained by projecting the host vehicle onto the road surface within the vehicle interior image seen from the driver's viewpoint, as an "opaque portion DE" with a vehicle interior image RP transmittance of 0%, sets the region corresponding to the window glass of the host vehicle as a "transparent portion CE" with a transmittance of 100%, and sets the remaining region as a "semi-transparent portion GE" with an arbitrary transmittance.
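The three-region blend can be sketched as per-pixel alpha compositing, with the interior image carrying a transmittance map: 0.0 for the opaque portion DE, 1.0 for the transparent portion CE, and an arbitrary value for the semi-transparent portion GE. Function and variable names are illustrative.

```python
import numpy as np

def compose(virtual_cam: np.ndarray, interior: np.ndarray,
            transmittance: np.ndarray) -> np.ndarray:
    """Alpha-blend the vehicle interior image over the virtual camera image.

    transmittance: per-pixel map in [0, 1] - 0.0 for the opaque portion DE,
    1.0 for the transparent portion CE (window glass), and any intermediate
    value for the semi-transparent portion GE.
    """
    t = transmittance[..., None]  # broadcast over the RGB channels
    out = interior * (1.0 - t) + virtual_cam * t
    return out.astype(np.uint8)
```

At 0% transmittance only the interior image is visible, at 100% only the camera image; adjusting the GE value between these extremes is what the blend ratio manual control interface exposes to the user.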
- the in-vehicle blind spot camera is the side camera 1 used in a see-through side view monitor system A1 that displays the front side portion of the vehicle, which is a blind spot for the driver, on the monitor 3 in the vehicle interior as an image seen through the vehicle interior. Thus, the vehicle feel needed to avoid dropping a wheel into a side ditch or to avoid obstacles can be understood at a glance, enabling intuitive grasp of the surrounding space and increasing the contribution to safe driving.
- the second embodiment uses, as the in-vehicle blind spot camera, a back camera disposed at the rear of the vehicle to eliminate the blind spot, and is an example of a see-through back view monitor system that displays the rear part of the vehicle, which is a blind spot for the driver, on the monitor as an image seen through the vehicle interior.
- FIG. 16 is an overall system block diagram illustrating a see-through back view monitor system A2 (an example of a vehicle periphery image display system) according to the second embodiment.
- the see-through back view monitor system A2 includes a back camera 21 (in-vehicle blind spot camera), an image processing control unit 2 (monitor video generation means), a monitor 3, a blend ratio manual control interface 4, an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
- the back camera 21 is mounted near the license plate on the trunk lid in the case of a passenger car, or near the upper end of the rear window in the case of a large vehicle such as an RV, and images the rear part of the vehicle, which is a blind spot for the driver.
- the back camera 21 acquires real camera video data of the rear part of the vehicle by an image sensor (CCD, CMOS, etc.).
- the image processing control unit 2 includes a decoder 41, an image memory 42, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit). Since each component is the same as in FIG. 1 of the first embodiment, corresponding components are denoted by the same reference numerals and their description is omitted.
- FIG. 17 shows a semi-transparent vehicle interior image RG in which an "opaque portion DE", a "transparent portion CE", and a "semi-transparent portion GE" are set for the vehicle interior image RP behind the vehicle in the see-through back view monitor system A2 of the second embodiment.
- the see-through back view monitor system A2 of the second embodiment replaces the side camera 1 of the see-through side view monitor system A1 of the first embodiment with the back camera 21.
- the real camera video from the back camera 21 is digitized and viewpoint-converted into a virtual camera image from the driver's viewpoint.
- the vehicle body projection of FIG. 11 is applied to the rear of the vehicle, and the area of the vehicle body projection is mapped onto the vehicle interior image to be superimposed on the virtual camera image of the back camera 21.
- the shaded portion corresponding to the shadow SE produced by the vertical projection of the vehicle is set as an "opaque portion DE" with 0% transmittance.
- the window glass portion is set as a "transparent portion CE" with 100% transmittance.
- the remaining region is set as a "semi-transparent portion GE" that is alpha-blended with an arbitrary, user-definable transmittance.
- a camera is installed so as to show the vicinity of the bumper, and the vehicle feel is obtained by displaying the visible bumper and the vehicle trajectory lines.
- the in-vehicle blind spot camera is the back camera 21 used in a see-through back view monitor system A2 that displays the rear part of the vehicle, which is a blind spot for the driver, on the monitor 3 in the passenger compartment as an image seen through the passenger compartment.
- thus, the driver can intuitively grasp the space, such as the sense of position and distance between the host vehicle and a vehicle stop line, stop curb, wall, or the like needed when reversing into a parking space, or between the host vehicle and a following vehicle approaching while driving, increasing the contribution to quick parking and safe driving.
- the third embodiment uses, as the in-vehicle blind spot camera, a front camera for eliminating a blind spot, and is an example of a see-through front view monitor system that displays the front part of the vehicle, which is a blind spot for the driver, on the monitor as an image seen through the passenger compartment.
- FIG. 18 is an overall system block diagram illustrating a see-through front view monitor system A3 (an example of a vehicle periphery image display system) according to the third embodiment.
- the see-through front view monitor system A3 includes a left front camera 31L (in-vehicle blind spot camera), a right front camera 31R (in-vehicle blind spot camera), a central front camera 31C (in-vehicle blind spot camera), an image processing control unit 2 (monitor image generation means), a monitor 3, a blend ratio manual control interface 4, an external sensor 5, and a hue manual control interface 6 (vehicle interior image color manual operation means).
- the external sensor 5 includes a turn signal switch 55 in addition to the steering angle sensor 51, the speed sensor 52, the illumination ON/OFF switch 53, and the function switch 54.
- the image processing control unit 2 includes a left decoder 41L, a right decoder 41R, a central decoder 41C, a left image memory 42L, a right image memory 42R, a central image memory 42C, an image processing unit 43, an image memory 44 (image storage unit), a control circuit (CPU) 45, a superimpose circuit 46 (image composition circuit), an encoder 47, a blend external control unit 48, a luminance/hue determination sensor 49a (external video color determination unit), a luminance/hue conversion block 49b (vehicle interior image color automatic adjustment unit), and a hue external control unit 49c (vehicle interior image color external control unit). These components are the same as those in FIG. 1 of the first embodiment.
- FIG. 19 is a flowchart showing the flow of the blend ratio sensor interlocking control process executed by the control circuit 45 in the see-through front view monitor system A3 of the third embodiment.
- each step will be described.
- assume the user has arbitrarily changed the left/right blend ratio and set the current transmittance Tr1 to a value such as 30%.
- in step S191, it is determined whether the function switch 54 is ON. If Yes (switch ON), the process proceeds to step S192; if No (switch OFF), the determination in step S191 is repeated.
- in step S192, following the determination in step S191 that the function switch 54 is ON, it is determined whether an ON signal is output from the turn signal switch 55. If Yes (turn signal blinking), the process proceeds to step S193; if No (turn signal off), the process returns to step S191.
- in step S193, following the determination in step S192 that the turn signal switch 55 outputs an ON signal, it is determined whether the signal from the turn signal switch 55 indicates a course change to the right. If Yes (turn signal is right), the process proceeds to step S194; if No (turn signal is left), the process proceeds to step S196.
- in step S194, following the determination in step S193 that the turn signal is right, it is determined whether the current transmittance Tr1 is smaller than the set value Tr0. If Yes (Tr1 < Tr0), the process proceeds to step S195; if No (Tr1 ≥ Tr0), the process proceeds to step S198.
- the set value Tr0 is a transmittance threshold for securing a field of view to the right, the direction of the course change.
- in step S195, following the determination in step S194 that Tr1 < Tr0, the transmittance of the right front camera image area is forcibly changed from the current transmittance Tr1 to the transmittance T (e.g., Tr0), and the process returns to step S191.
- in step S196, following the determination in step S193 that the turn signal is left, it is determined whether the current transmittance Tr1 is smaller than the set value Tr0. If Yes (Tr1 < Tr0), the process proceeds to step S197; if No (Tr1 ≥ Tr0), the process proceeds to step S198.
- the set value Tr0 is a transmittance threshold for securing a field of view to the left, the direction of the course change.
- in step S197, following the determination in step S196 that Tr1 < Tr0, the transmittance of the left front camera image area is forcibly changed from the current transmittance Tr1 to the transmittance T (e.g., Tr0), and the process returns to step S191.
- in step S198, following the determination in step S194 or step S196 that Tr1 ≥ Tr0, the current transmittance Tr1 is maintained unchanged, and the process returns to step S191.
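The flow of steps S191 to S198 can be sketched as a single control pass; the function and signal names are illustrative, not the patent's identifiers.

```python
def blend_ratio_sensor_control(function_sw_on: bool, turn_signal: str,
                               tr1: float, tr0: float):
    """One pass of the blend-ratio sensor-interlocking control (S191-S198).

    Returns a (region, transmittance) pair when a region's transmittance is
    forcibly changed, or None when the current Tr1 is kept.
    turn_signal is 'right', 'left', or 'off'.
    """
    if not function_sw_on:                    # S191: function switch OFF
        return None
    if turn_signal == 'off':                  # S192: turn signal off
        return None
    if turn_signal == 'right':                # S193 -> S194
        if tr1 < tr0:                         # S194: Tr1 < Tr0
            return ('right_front', tr0)       # S195: force right area up to Tr0
    elif turn_signal == 'left':               # S193 -> S196
        if tr1 < tr0:                         # S196: Tr1 < Tr0
            return ('left_front', tr0)        # S197: force left area up to Tr0
    return None                               # S198: keep current Tr1
```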
- Other configurations are the same as those in the first embodiment.
- FIG. 20 is a diagram illustrating an image in which “opaque portions DE” are set in the divided regions of the left, right, and center front camera images in the see-through front view monitor system A3 according to the third embodiment.
- FIG. 21 shows a translucent vehicle interior image RG in which “opaque portion DE”, “transparent portion CE”, and “translucent portion GE” are set for the vehicle interior image RP in the see-through front view monitor system A3 of the third embodiment.
- the images of the front cameras 31L, 31R, and 31C are digitized and subjected to viewpoint conversion image processing, and then a superimpose screen, which is a vehicle interior image that takes the vertically projected vehicle shape into account, is superimposed to obtain a composite image.
- in the vehicle interior image RP, taken in advance facing forward from the driver's viewpoint position, the region where the vehicle body shape is projected perpendicularly onto the road surface is defined as an "opaque portion DE" with a transmittance of 0%. A wide-angle screen image of 180 degrees or more is then displayed in the area of the vehicle interior image RP excluding the "opaque portion DE".
- since this wide-angle screen image of 180 degrees or more is a viewpoint-converted image from the driver's viewpoint, the screen is composed, as shown in FIG. 20, by combining the camera video from the central front camera 31C in the center area, the camera video from the left front camera 31L in the left area, and the camera video from the right front camera 31R in the right area. That is, since the camera-based video is configured to secure the field of view, the camera images of the left, right, and central front cameras 31L, 31R, and 31C are often displayed on one screen; in that case, these camera videos are usually combined to display a wide-angle screen image of 180 degrees or more.
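The composition of the three viewpoint-converted camera videos into one wide-angle screen can be sketched as a simple side-by-side placement; the function name is illustrative.

```python
import numpy as np

def compose_wide_angle(left: np.ndarray, center: np.ndarray,
                       right: np.ndarray) -> np.ndarray:
    """Place the viewpoint-converted left, central, and right front camera
    images side by side into one wide-angle (180-degree-or-more) screen.
    All three images are assumed to share the same height and channel count."""
    return np.concatenate([left, center, right], axis=1)
```

A production system would also blend or warp the seams between adjacent camera areas; this sketch only shows the left/center/right layout described above.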
- a semi-transparent vehicle interior image RG, divided into the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE", is superimposed on the video shown in FIG. 20, so that, as shown in FIG. 21, the driver is provided with an image that looks through the vehicle interior toward the outside ahead of the vehicle.
- thus, as with the see-through side view monitor system A1 and the see-through back view monitor system A2 described above, blind spots are eliminated while the vehicle's shape and size, i.e., the vehicle feel, remain self-evident, providing a video that is easy to grasp intuitively when suddenly avoiding danger.
- linked to the turn signal, the transmittance of the "semi-transparent portion GE" in the three-division video area is automatically adjusted.
- in step S195, since the field of view in the right region is more important than the central region of the screen, the system performs an alpha blending operation that automatically raises the transmittance to secure that field of view.
- if the left turn signal is detected and the current transmittance Tr1 is smaller than the set value Tr0, the process proceeds through step S191 → step S192 → step S193 → step S196 → step S197 in the flowchart of FIG. 19.
- in step S197, since the field of view in the left region is more important than the central region of the screen, the system performs an alpha blending operation that automatically raises the transmittance to secure that field of view.
- during this operation, the left and right turn signals may be discriminated and only one of them weighted to change the transmittance. Since the other operations are the same as in the first embodiment, their description is omitted.
- the in-vehicle blind spot cameras are the left, right, and central front cameras 31L, 31R, and 31C used in a see-through front view monitor system A3 that displays the front part of the vehicle, which is a blind spot for the driver, on the monitor 3 in the passenger compartment.
- the first to third embodiments show examples in which the luminance and hue of the semi-transparent vehicle interior image RG superimposed on the virtual camera image are changed according to the determination result of the luminance and hue of the external video. However, the present invention also includes examples in which at least one of the luminance, hue, saturation, and lightness of the external video is determined and, according to that result, at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG to be superimposed on the virtual camera image is changed.
- the first to third embodiments show examples in which the vehicle interior image RP prepared in advance as the semi-transparent vehicle interior image RG to be superimposed on the virtual camera image is divided into three parts: the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE". However, the entire vehicle interior image RP may instead be a "semi-transparent portion GE"; the entire image may be a "semi-transparent portion GE" with the "opaque portion DE" indicated by outlining; or a "shadow portion" may be indicated by filling. The vehicle interior image RP prepared in advance may also be divided into two parts, an "opaque portion DE" (or "shadow portion") and a "semi-transparent portion GE", or may change continuously, with an "opaque portion DE" (or "shadow portion") and a transparent portion whose transmittance is varied in a gradation.
- the first embodiment shows an example of a see-through side view monitor system A1 using a side camera, the second embodiment an example of a see-through back view monitor system A2 using a back camera, and the third embodiment an example of a see-through front view monitor system A3 using a front camera. However, the invention can also be applied to a monitor system that shares one monitor and allows any one of a side view, back view, front view, and the like to be selected, or that switches between them automatically under predetermined conditions.
Abstract
Description
In this vehicle periphery image display system, the monitor video generation means includes an image processing unit, an external video color determination unit, a vehicle interior image color automatic adjustment unit, and an image composition circuit.
The image processing unit performs viewpoint conversion on the real camera video input from the in-vehicle blind spot camera, converting it into a virtual camera image seen from the driver's viewpoint position.
The external video color determination unit determines the color of the external video from the in-vehicle blind spot camera according to at least one of luminance, hue, saturation, and lightness.
Based on the color determination result of the external video, the vehicle interior image color automatic adjustment unit automatically adjusts at least one of the luminance, hue, saturation, and lightness of a semi-transparent vehicle interior image, obtained by making the vehicle interior image semi-transparent, so as to enhance its visibility against the external video.
The image composition circuit generates a composite image that shows the virtual camera image through the semi-transparent vehicle interior image, by image composition that superimposes the semi-transparent vehicle interior image from the vehicle interior image color automatic adjustment unit on the virtual camera image from the image processing unit.
That is, because the image processing unit viewpoint-converts the real camera video input from the in-vehicle blind spot camera into a virtual camera image seen from the driver's viewpoint position, a driver looking at the composite image displayed on the monitor can intuitively grasp, without parallax, the blind spot areas contained in the virtual camera image.
Then, in the external video color determination unit, the real camera video input from the in-vehicle blind spot camera is treated as the external video and its average brightness and tint are determined; based on that determination result, the vehicle interior image color automatic adjustment unit automatically adjusts the brightness and tint of the semi-transparent vehicle interior image so as to enhance its visibility against the external video.
Therefore, the virtual camera image is rendered on the monitor through a semi-transparent vehicle interior image whose visibility against the external video has been enhanced. Regardless of external environmental conditions such as daytime, twilight, or nighttime, the virtual camera image and the semi-transparent vehicle interior image remain clearly distinguishable, and the external situation in the driver's blind spot shown by the virtual camera image can be clearly seen through, in its positional relationship to the host vehicle shown by the semi-transparent vehicle interior image.
As a result, regardless of the external environmental conditions under which the real camera video is acquired, the virtual camera image and the semi-transparent vehicle interior image are clearly distinguished, and the external situation in the driver's blind spot can be clearly seen through in its positional relationship to the host vehicle.
FIG. 1 is an overall system block diagram showing the see-through side view monitor system A1 of the first embodiment (an example of the vehicle periphery image display system).
This superimpose circuit 46 has a blend circuit unit 46a that divides the vehicle interior image RP, taken in advance from the driver's viewpoint, into regions and sets a different transmittance for each region. In the blend circuit unit 46a, within the vehicle interior image RP (FIG. 10), the shadow area SE obtained by projecting the host vehicle onto the road surface is set as an "opaque portion DE" with a vehicle interior image transmittance of 0%, the region corresponding to the window glass of the host vehicle is set as a "transparent portion CE" with a transmittance of 100%, and the regions other than the shadow area and the window glass area are set as a "semi-transparent portion GE" with an arbitrary transmittance (FIG. 14).
The luminance/hue determination sensor 49a takes in the data of each pixel constituting the real camera image as a digital luminance signal Y and color-difference signals CbCr (or RGB data). As shown in FIG. 2, for the captured source data (screen), data is accumulated for each horizontal line in the horizontal scanning direction, and the accumulated data for all scanning lines is averaged to determine the brightness (luminance) and hue of the entire image. This determination method is the simplest. Alternatively, as shown in FIG. 3, specific blocks (determination areas 1 to 5) may be set at the four corners and the center of the captured source data, and the brightness (luminance) and hue of the entire image determined from the average values in each block. Also, as shown in FIG. 4, grid-like sample lines (representative vertical lines and representative horizontal lines) may be set every few pixels in the vertical and horizontal directions, data accumulated for each sample line, and the accumulated data averaged to determine the brightness (luminance) and hue of the entire image.
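All three sampling strategies (per-scan-line accumulation, corner/center blocks, grid sample lines) reduce to averaging a subset of pixels of the Y plane. A sketch of the grid-sampling variant of FIG. 4, with an assumed sampling step:

```python
import numpy as np

def average_luminance_grid(y_plane: np.ndarray, step: int = 4) -> float:
    """Estimate whole-image brightness from grid-like sample lines:
    every `step`-th row (representative horizontal lines) and every
    `step`-th column (representative vertical lines) of the Y plane."""
    rows = y_plane[::step, :]     # representative horizontal lines
    cols = y_plane[:, ::step]     # representative vertical lines
    return float((rows.sum() + cols.sum()) / (rows.size + cols.size))
```

The per-line and block variants differ only in which pixel subset is accumulated before averaging; the hue estimate works the same way on the CbCr (or RGB) planes.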
Here, "luminance" is a photometric quantity that evaluates the "intensity of light" emitted from a light source or secondary light source (a reflective or transmissive surface) toward the observer, weighted by the sensitivity of the human eye (the CIE standard spectral luminous efficiency V[λ]), and considers only a specific direction (the observation direction).
The "normal luminance state" refers to the luminance at which the semi-transparent vehicle interior image superimposed on the virtual camera image is clearly distinguishable when the brightness outside the vehicle is daytime brightness.
Here, "lightness" is one of the three attributes of color and refers to the brightness of a color. The three attributes of color are "lightness", "hue" (the tint), and "saturation" (the vividness of a color).
Here, the hue set value X is obtained by determining the hue of the vehicle interior image before hue conversion, which is stored in the image memory 44.
Here, conversion to a complementary hue means that, in the hue circle shown in FIG. 8, diagonally opposite colors are complementary; for example, when the hue of the external video is "red", converting to "cyan" is most preferable. However, since "blue" and "green" also lie in the diagonal complementary region for "red", conversion to "blue" or "green" is also acceptable.
Here, the "obstacle presence data" is acquired by analyzing the real camera image data input to the image processing unit 43 and judging whether an image indicating an obstacle exists in the analyzed image.
Here, the set value V is set to a low vehicle speed as the threshold for distinguishing low-speed driving from normal driving, since a side view monitor system is intrinsically used for pulling the vehicle close to the roadside and for safety confirmation at start-up.
The object of the present invention, including the first to third embodiments, is to inexpensively construct a system that has an external camera capable of contributing to blind spot elimination and can display the camera video using image processing, in which the driver can intuitively recognize, just by looking at the video, that it is an image seen through the vehicle; and further to propose a vehicle periphery image display system in which the behavior of the vehicle can be grasped in that video.
- Construct a display method in which the direction of travel, size, and other aspects of the vehicle feel can be intuitively understood just by looking at the video displayed on the monitor.
- Provide a system in which the transmittance and the like of the video can be freely changed to match the driver's preference. Further, construct a display system in which the basic transmittance can be changed automatically according to the driving situation; for example, the transmittance is changed according to the luminance of the external video, such as at dusk, to make the external video easier to see.
- Construct a display system that automatically controls the luminance and hue of the superimposed semi-transparent vehicle interior image according to the luminance and hue of the external video so as to enhance visibility. In addition, make the luminance and hue manually adjustable to suit visibility preferences, which vary greatly between individuals. For example, the luminance of the semi-transparent vehicle interior image is raised at dusk and inverted at night, lowered when the lights of an oncoming vehicle enter the camera, and when the hue of the external video approximates that of the semi-transparent vehicle interior image, the hue of the semi-transparent vehicle interior image is converted to give a hue difference, making the viewpoint-converted camera video easier to see.
- Taking advantage of the controllable hue, provide a system that performs a visually striking warning action using the entire monitor screen. For example, in conjunction with vehicle speed information or obstacle information, when safety would be compromised, the hue of the entire semi-transparent vehicle interior image is converted to a warning hue such as red.
FIG. 10 is a view showing a still image of the vehicle interior taken in advance from the driver's viewpoint position toward the left front side. FIG. 11 is a perspective view illustrating how the vehicle body shape is projected onto the road surface from the vehicle on which the see-through side view monitor system A1 of the first embodiment is mounted. FIG. 12 is a view showing the video (opaque portion) obtained when the image of the vehicle body shape projected onto the road surface is seen through from the driver's viewpoint position. FIG. 13 is a view showing an image in which the "opaque portion DE" of FIG. 12 is set for the vehicle interior image RP shown in FIG. 10. FIG. 14 is a view showing the semi-transparent vehicle interior image RG in which the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE" are set for the vehicle interior image RP shown in FIG. 10. The monitor image display action using the see-through video will be described below with reference to FIGS. 10 to 14.
That is, since two cameras are used, a blind spot camera and a driver viewpoint camera, the driver viewpoint camera must be added to an existing system, raising the system cost. In addition, because the driver viewpoint camera is installed near, but cannot be installed at, the driver's viewpoint position, the camera video obtained from it exhibits parallax with respect to the image actually seen from the driver's viewpoint.
As described above, in the blend circuit unit 46a of the superimpose circuit 46, the area SE of the vehicle interior image RP in which the shape of the host vehicle is projected perpendicularly onto the road surface is set as an "opaque portion DE" with a vehicle interior image transmittance of 0%, the region corresponding to the window glass of the host vehicle is set as a "transparent portion CE" with a transmittance of 100%, and the regions other than the shadow area and the window glass area are set as a "semi-transparent portion GE" with an arbitrary transmittance (FIG. 14).
That is, when a transmittance adjustment signal is input to the blend external control unit 48 by manual operation of the blend ratio manual control interface 4, a transmittance control command is output to the control circuit 45, and the blend circuit unit 46a arbitrarily adjusts the transmittance of the "semi-transparent portion GE" set in the vehicle interior image within the range of 0% to 100%.
FIG. 15 is an image diagram showing the luminance-inverted screen when the semi-transparent vehicle interior image RG is displayed inverted in the see-through side view monitor system A1 of the first embodiment. The display control action in the external video luminance follow-up display control mode will be described below with reference to FIG. 15, based on the flowchart shown in FIG. 5.
For example, at dusk, if the luminance (lightness) of the superimposed semi-transparent vehicle interior image RG is left in its normal state, the overall luminance of the display screen 3a of the monitor 3 drops, and the semi-transparent vehicle interior image RG blends into the dim virtual camera image, impairing visibility.
In contrast, by raising the luminance (lightness) of the semi-transparent vehicle interior image RG superimposed on the dim external video, the two superimposed images acquire a luminance (lightness) difference, which is useful for improving visibility.
For example, at night when the external video is dark, if the luminance (lightness) of the superimposed semi-transparent vehicle interior image RG is left in its normal state, the overall luminance of the display screen 3a of the monitor 3 drops significantly, the semi-transparent vehicle interior image RG completely blends into the dark virtual camera image, and visibility is severely impaired. Even if the luminance of the superimposed semi-transparent vehicle interior image RG is raised, it still blends into the dark virtual camera image, and the two images cannot be distinguished.
In contrast, by inverting the luminance data of the semi-transparent vehicle interior image RG superimposed on the dark external video, like a photographic negative, displaying dark black lines as white lines and bright white lines as black lines, a composite image with improved visibility, particularly at night, can be constructed. That is, by displaying the semi-transparent vehicle interior image RG inverted, the superimposed display screen of the monitor 3 becomes a video close to a white line drawing, as shown in FIG. 15, and the vehicle feel can be grasped intuitively even in dark external camera video.
The display control action in the display control mode for sudden luminance changes will be described below based on the flowchart shown in FIG. 6.
The display control action in the hue conversion display control mode will be described below with reference to FIG. 8, based on the flowchart shown in FIG. 7.
Therefore, when the hue of the external video and the hue of the semi-transparent vehicle interior image RG are at the same hue level, changing the hue to a complementary color system yields a monitor video in which the external video and the semi-transparent vehicle interior image RG are clearly distinguished.
The display control action in the warning display control mode will be described below based on the flowchart shown in FIG. 9.
The see-through side view monitor system A1 of the first embodiment provides the effects listed below.
Thus, regardless of the external environmental conditions under which the real camera video is acquired, the virtual camera image and the semi-transparent vehicle interior image RG are clearly distinguished, and the external situation in the driver's blind spot can be clearly seen through in its positional relationship to the host vehicle.
Thus, while omitting the driver viewpoint camera and obtaining an inexpensive system that uses only the in-vehicle blind spot camera (side camera 1), a semi-transparent vehicle interior image RG from the driver's viewpoint can be acquired without the parallax with respect to the actual driver viewpoint that a driver viewpoint camera would produce.
Thus, the luminance and hue of the semi-transparent vehicle interior image RG can be automatically adjusted to enhance visibility, corresponding to the average luminance and hue of the external video from the in-vehicle blind spot camera (side camera 1).
Thus, the system follows the luminance change of the external video from daytime through dusk to nighttime, keeping the distinction between the external video displayed on the monitor 3 and the semi-transparent vehicle interior image RG clear and improving visibility.
Thus, when there is a steep luminance change, such as the lights of an oncoming vehicle entering the in-vehicle blind spot camera (side camera 1) at night, the system responds to the steep luminance change, keeping the distinction between the external video displayed on the monitor 3 and the semi-transparent vehicle interior image RG clear and improving visibility.
Thus, even when the hue of the external video approaches or matches the hue of the semi-transparent vehicle interior image RG, the system responds to the hue deviation between them, keeping the distinction between the external video displayed on the monitor 3 and the semi-transparent vehicle interior image RG clear and improving visibility.
Thus, when the vehicle speed is too high, changing the overall hue of the superimposed semi-transparent vehicle interior image RG to, for example, red gives the driver a warning that the vehicle speed should be reduced, so that safety can be secured.
Thus, at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image RG can be adjusted manually to suit the widely varying preferences of users and the visibility of the display image on the monitor 3, making the system easy to use.
Thus, the size and shape of the vehicle body are displayed clearly on the monitor 3 as a superimposed screen, which corrects the various misunderstandings and recognition errors that would occur with, for example, a uniformly semi-transparent screen. As a result, the behavior of the vehicle becomes obvious at a glance, making it easy to judge how to avoid wheel drop-off and the like.
Thus, the vehicle feel needed to avoid dropping a wheel into a side ditch or to avoid obstacles can be understood at a glance, enabling intuitive grasp of the surrounding space and increasing the contribution to safe driving.
FIG. 16 is an overall system block diagram showing the see-through back view monitor system A2 of the second embodiment (an example of the vehicle periphery image display system).
FIG. 17 is a view showing the semi-transparent vehicle interior image RG in which the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE" are set for the vehicle interior image RP behind the vehicle in the see-through back view monitor system A2 of the second embodiment.
The see-through back view monitor system A2 of the second embodiment provides the following effect in addition to effects (1) to (9) of the first embodiment.
Thus, for example, the driver can intuitively grasp the space, such as the sense of position and distance between the host vehicle and a vehicle stop line, stop curb, wall, or the like needed when reversing into a parking space, or between the host vehicle and a following vehicle approaching while driving, increasing the contribution to quick parking and safe driving.
FIG. 18 is an overall system block diagram showing the see-through front view monitor system A3 of the third embodiment (an example of the vehicle periphery image display system).
Here, the set value Tr0 is a transmittance threshold for securing a field of view to the right, the direction of the course change.
Here, the set value Tr0 is a transmittance threshold for securing a field of view to the left, the direction of the course change.
The other configurations are the same as in the first embodiment.
FIG. 20 is a view showing an image in which "opaque portions DE" are set in the divided regions of the left, right, and central front camera videos in the see-through front view monitor system A3 of the third embodiment. FIG. 21 is a view showing the semi-transparent vehicle interior image RG in which the "opaque portion DE", "transparent portion CE", and "semi-transparent portion GE" are set for the vehicle interior image RP in the see-through front view monitor system A3 of the third embodiment.
That is, since the camera-based video is configured to secure the field of view, the camera images of the left, right, and central front cameras 31L, 31R, and 31C are often displayed on one screen. In this case, these camera videos are usually combined to display a wide-angle screen image of 180 degrees or more.
The turn signal switch 55 responds when the steering wheel is about to be turned, that is, when the course is changed to the left or right after slowing down or stopping. In this case, information on vehicles approaching from the left or right becomes more important than the central field of view.
During this operation, the left and right turn signals may be discriminated and only one of them weighted to change the transmittance. Since the other actions are the same as in the first embodiment, their description is omitted.
The see-through front view monitor system A3 of the third embodiment provides the following effect in addition to effects (1) to (9) of the first embodiment.
Thus, for example, the driver can intuitively grasp the space, such as the sense of position and distance between the host vehicle and obstacles ahead of the vehicle needed when starting straight ahead or turning from a stop or slow speed, or between the host vehicle and an approaching vehicle, increasing the contribution to safe driving.
Claims (12)
- A vehicle periphery image display system comprising: an in-vehicle blind spot camera attached to a host vehicle to image the vehicle periphery; a monitor set at a position in the vehicle interior visible to the driver; and monitor video generation means for generating the video displayed on the monitor based on the real camera video input from the in-vehicle blind spot camera, wherein
the monitor video generation means comprises:
an image processing unit that viewpoint-converts the real camera video input from the in-vehicle blind spot camera into a virtual camera image seen from the driver's viewpoint position;
an external video color determination unit that determines the color of the external video from the in-vehicle blind spot camera according to at least one of luminance, hue, saturation, and lightness;
a vehicle interior image color automatic adjustment unit that, based on the color determination result of the external video, automatically adjusts at least one of the luminance, hue, saturation, and lightness of a semi-transparent vehicle interior image obtained by making a vehicle interior image semi-transparent, so as to enhance its visibility against the external video; and
an image composition circuit that generates a composite image expressing the virtual camera image through the semi-transparent vehicle interior image, by image composition that superimposes the semi-transparent vehicle interior image from the vehicle interior image color automatic adjustment unit on the virtual camera image from the image processing unit. - The vehicle periphery image display system according to claim 1, wherein
the monitor video generation means has an image storage unit that stores, as the vehicle interior image, a still image of the vehicle interior taken in advance from the driver's viewpoint, and
the vehicle interior image color automatic adjustment unit acquires the semi-transparent vehicle interior image by making the vehicle interior image from the image storage unit semi-transparent. - The vehicle periphery image display system according to claim 1 or claim 2, wherein
the external video color determination unit is a luminance/hue determination sensor that determines the average luminance and hue of the external video from the in-vehicle blind spot camera, and
the vehicle interior image color automatic adjustment unit is a luminance/hue conversion block that, based on the color determination result of the external video by the luminance/hue determination sensor, automatically adjusts the luminance and hue of the semi-transparent vehicle interior image so as to enhance its visibility against the luminance and hue of the external video. - The vehicle periphery image display system according to claim 3, wherein
the monitor video generation means has an external video luminance follow-up display control mode in which it reads the luminance detection value from the luminance/hue determination sensor and, when the luminance detection value is equal to or higher than a first set value indicating a twilight threshold, displays with a luminance difference between the luminance of the semi-transparent vehicle interior image and the luminance of the external video; when the luminance detection value is lower than the first set value but equal to or higher than a second set value indicating a nighttime threshold, raises the luminance of the semi-transparent vehicle interior image to give a luminance difference from the luminance of the external video; and when the luminance detection value is lower than the second set value, inverts the luminance of the semi-transparent vehicle interior image so that black lines are displayed inverted as white lines. - The vehicle periphery image display system according to claim 3, wherein
the monitor video generation means has a display control mode for sudden luminance changes in which it reads the luminance detection value from the luminance/hue determination sensor and, when the luminance detection value rises above a third set value indicating an upper-limit threshold, reduces the luminance of the semi-transparent vehicle interior image and displays the whole image darker. - The vehicle periphery image display system according to any one of claims 2 to 5, wherein
the monitor video generation means has a hue conversion display control mode in which it reads the hue detection value from the luminance/hue determination sensor and, when the hue deviation between the hue detection value and a set value is equal to or greater than a first threshold, maintains the hue of the semi-transparent vehicle interior image; when the hue deviation is less than the first threshold but equal to or greater than a second threshold, keeps the hue of the semi-transparent vehicle interior image while brightening its color; and when the hue deviation is less than the second threshold, converts the hue of the semi-transparent vehicle interior image to a complementary hue located in the complementary region of the hue circle with respect to the hue of the external video. - The vehicle periphery image display system according to any one of claims 1 to 6, further comprising
vehicle speed detection means for detecting the vehicle speed, wherein
the monitor video generation means has a warning display control mode in which it reads the vehicle speed detection value from the vehicle speed detection means and, when the vehicle speed detection value is equal to or greater than a set value, converts the entire hue of the semi-transparent vehicle interior image to a hue that makes the driver recognize a warning and displays it. - The vehicle periphery image display system according to any one of claims 1 to 7, wherein
the monitor video generation means has a vehicle interior image color external control unit that arbitrarily adjusts at least one of the luminance, hue, saturation, and lightness of the semi-transparent vehicle interior image in response to an external operation on vehicle interior image color manual operation means. - The vehicle periphery image display system according to any one of claims 1 to 8, wherein
the image composition circuit has a blend circuit unit that sets, within the semi-transparent vehicle interior image superimposed on the virtual camera image, the area in which the host vehicle is projected onto the road surface as an opaque portion with a transmittance of 0%, the region corresponding to the window glass of the host vehicle as a transparent portion with a transmittance of 100%, and the regions other than the opaque portion and the transparent portion as a semi-transparent portion with an arbitrary transmittance. - The vehicle periphery image display system according to any one of claims 1 to 9, wherein
the in-vehicle blind spot camera is a side camera used in a see-through side view monitor system that displays the front side portion of the vehicle, which is a blind spot for the driver, on a monitor in the vehicle interior as an image seen through the vehicle interior. - The vehicle periphery image display system according to any one of claims 1 to 9, wherein
the in-vehicle blind spot camera is a back camera used in a see-through back view monitor system that displays the rear portion of the vehicle, which is a blind spot for the driver, on a monitor in the vehicle interior as an image seen through the vehicle interior. - The vehicle periphery image display system according to any one of claims 1 to 9, wherein
the in-vehicle blind spot camera is a single front camera or a plurality of front cameras used in a see-through front view monitor system that displays the front portion of the vehicle, which is a blind spot for the driver, on a monitor in the vehicle interior as an image seen through the vehicle interior.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/070488 WO2011070640A1 (ja) | 2009-12-07 | 2009-12-07 | 車両周辺画像表示システム |
US13/514,575 US20120249789A1 (en) | 2009-12-07 | 2009-12-07 | Vehicle peripheral image display system |
EP09852037.2A EP2512133B1 (en) | 2009-12-07 | 2009-12-07 | Vehicle periphery image display system |
CN200980162790.0A CN102714710B (zh) | 2009-12-07 | 2009-12-07 | 车辆周边图像显示系统 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/070488 WO2011070640A1 (ja) | 2009-12-07 | 2009-12-07 | 車両周辺画像表示システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011070640A1 true WO2011070640A1 (ja) | 2011-06-16 |
Family
ID=44145212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/070488 WO2011070640A1 (ja) | 2009-12-07 | 2009-12-07 | 車両周辺画像表示システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20120249789A1 (ja) |
EP (1) | EP2512133B1 (ja) |
CN (1) | CN102714710B (ja) |
WO (1) | WO2011070640A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013162328A (ja) * | 2012-02-06 | 2013-08-19 | Fujitsu Ten Ltd | 画像処理装置、画像処理方法、プログラム、及び画像処理システム |
EP2744694A4 (en) * | 2011-08-17 | 2015-07-22 | Lg Innotek Co Ltd | CAMERA DEVICE FOR A VEHICLE |
WO2016027689A1 (ja) * | 2014-08-21 | 2016-02-25 | アイシン精機株式会社 | 画像表示制御装置および画像表示システム |
JP2016058801A (ja) * | 2014-09-05 | 2016-04-21 | アイシン精機株式会社 | 画像表示制御装置および画像表示システム |
EP3089075A3 (en) * | 2015-04-09 | 2017-01-25 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Vehicular viewing device |
WO2019074005A1 (ja) * | 2017-10-10 | 2019-04-18 | マツダ株式会社 | 車両用ディスプレイ装置 |
EP3967554A1 (en) * | 2020-09-15 | 2022-03-16 | Mazda Motor Corporation | Vehicular display system |
US20220080884A1 (en) * | 2020-09-15 | 2022-03-17 | Hyundai Motor Company | Device and method for controlling emotional lighting of vehicle |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11021136B1 (en) * | 2011-08-29 | 2021-06-01 | The Boeing Company | Methods and systems for providing a remote virtual view |
US20140114534A1 (en) * | 2012-10-19 | 2014-04-24 | GM Global Technology Operations LLC | Dynamic rearview mirror display features |
KR101393881B1 (ko) * | 2012-10-24 | 2014-05-12 | 현대자동차주식회사 | 차량의 주차구획 인식방법 |
JP6115104B2 (ja) * | 2012-12-04 | 2017-04-19 | アイシン精機株式会社 | 車両の制御装置、及び制御方法 |
CN103856823A (zh) * | 2012-12-06 | 2014-06-11 | 腾讯科技(深圳)有限公司 | 界面调整方法、装置及终端 |
JP6081570B2 (ja) * | 2013-02-21 | 2017-02-15 | 本田技研工業株式会社 | 運転支援装置、および画像処理プログラム |
JP6148887B2 (ja) * | 2013-03-29 | 2017-06-14 | 富士通テン株式会社 | 画像処理装置、画像処理方法、及び、画像処理システム |
WO2015104860A1 (ja) * | 2014-01-10 | 2015-07-16 | アイシン精機株式会社 | 画像表示制御装置および画像表示システム |
CN105981367A (zh) * | 2014-02-11 | 2016-09-28 | 罗伯特·博世有限公司 | 对来自多摄像机系统的视频的亮度和颜色匹配 |
JP5989701B2 (ja) * | 2014-03-24 | 2016-09-07 | トヨタ自動車株式会社 | 境界検出装置および境界検出方法 |
FR3018940B1 (fr) * | 2014-03-24 | 2018-03-09 | Survision | Systeme de classification automatique de vehicules automobiles |
CN105216715A (zh) * | 2015-10-13 | 2016-01-06 | 湖南七迪视觉科技有限公司 | 一种汽车驾驶员视觉辅助增强系统 |
CN105611308B (zh) * | 2015-12-18 | 2018-11-06 | 盯盯拍(深圳)技术股份有限公司 | 视频画面处理方法、装置以及系统 |
US20190031102A1 (en) * | 2016-01-28 | 2019-01-31 | Hon Hai Precision Industry Co., Ltd. | Image display system for vehicle use and vehicle equipped with the image display system |
WO2017154317A1 (ja) * | 2016-03-09 | 2017-09-14 | 株式会社Jvcケンウッド | Vehicle display control device, vehicle display system, vehicle display control method, and program |
WO2017179174A1 (ja) * | 2016-04-14 | 2017-10-19 | 日産自動車株式会社 | Moving body surroundings display method and moving body surroundings display device |
US10306289B1 (en) | 2016-09-22 | 2019-05-28 | Apple Inc. | Vehicle video viewing systems |
JP6876236B2 (ja) * | 2016-09-30 | 2021-05-26 | 株式会社アイシン | Display control device |
WO2018087625A1 (en) * | 2016-11-10 | 2018-05-17 | Semiconductor Energy Laboratory Co., Ltd. | Display device and driving method of display device |
US10936884B2 (en) * | 2017-01-23 | 2021-03-02 | Magna Electronics Inc. | Vehicle vision system with object detection failsafe |
US10609398B2 (en) * | 2017-07-28 | 2020-03-31 | Black Sesame International Holding Limited | Ultra-low bitrate coding based on 3D map reconstruction and decimated sub-pictures |
DE102017216058A1 (de) * | 2017-09-12 | 2019-03-14 | Bayerische Motoren Werke Aktiengesellschaft | Dynamically colored display of a vehicle |
WO2019071155A1 (en) | 2017-10-05 | 2019-04-11 | University Of Utah Research Foundation | TRANSLUCENT IMAGING SYSTEM AND ASSOCIATED METHODS |
JP2019073091A (ja) * | 2017-10-13 | 2019-05-16 | トヨタ自動車株式会社 | Vehicle display device |
DE102018100211A1 (de) * | 2018-01-08 | 2019-07-11 | Connaught Electronics Ltd. | Method for generating a representation of surroundings by shifting a virtual camera towards an interior mirror of a vehicle; and camera device |
WO2019177036A1 (ja) * | 2018-03-15 | 2019-09-19 | 株式会社小糸製作所 | Vehicle video system |
JP7435601B2 (ja) * | 2019-05-07 | 2024-02-21 | Agc株式会社 | Display system, display method, and transparent display body |
JP7167853B2 (ja) * | 2019-05-23 | 2022-11-09 | 株式会社デンソー | Display control device |
JP7018923B2 (ja) * | 2019-12-13 | 2022-02-14 | 本田技研工業株式会社 | Parking assistance device, parking assistance method, and program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002337605A (ja) * | 2001-05-18 | 2002-11-27 | Auto Network Gijutsu Kenkyusho:Kk | Vehicle periphery visual recognition device |
JP2003244688A (ja) * | 2001-12-12 | 2003-08-29 | Equos Research Co Ltd | Vehicle image processing device |
JP2004350303A (ja) | 2004-06-11 | 2004-12-09 | Equos Research Co Ltd | Vehicle image processing device |
JP2005335410A (ja) * | 2004-05-24 | 2005-12-08 | Olympus Corp | Image display device |
JP2008039395A (ja) | 2006-08-01 | 2008-02-21 | Dainippon Printing Co Ltd | Viscosity measuring device and viscosity measuring method |
JP2008171314A (ja) * | 2007-01-15 | 2008-07-24 | San Giken:Kk | Car navigation device with speed warning function |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5280344A (en) * | 1992-04-30 | 1994-01-18 | International Business Machines Corporation | Method and means for adding an extra dimension to sensor processed raster data using color encoding |
US6037914A (en) * | 1997-08-25 | 2000-03-14 | Hewlett-Packard Company | Method and apparatus for augmented reality using a see-through head-mounted display |
JP4114292B2 (ja) * | 1998-12-03 | 2008-07-09 | アイシン・エィ・ダブリュ株式会社 | Driving support device |
EP1303140A4 (en) * | 2000-07-19 | 2007-01-17 | Matsushita Electric Ind Co Ltd | MONITORING SYSTEM |
US7212653B2 (en) * | 2001-12-12 | 2007-05-01 | Kabushikikaisha Equos Research | Image processing system for vehicle |
CN1485227A (zh) * | 2002-09-24 | 2004-03-31 | 李大民 | Rear-view method and device for implementing the method |
US20060050983A1 (en) * | 2004-09-08 | 2006-03-09 | Everest Vit, Inc. | Method and apparatus for enhancing the contrast and clarity of an image captured by a remote viewing device |
WO2006103835A1 (ja) * | 2005-03-25 | 2006-10-05 | Mitsubishi Denki Kabushiki Kaisha | Image processing device, image display device, and image display method |
US7612813B2 (en) * | 2006-02-03 | 2009-11-03 | Aptina Imaging Corporation | Auto exposure for digital imagers |
JP4305540B2 (ja) * | 2007-03-22 | 2009-07-29 | 村田機械株式会社 | Image processing device |
JP2009171008A (ja) * | 2008-01-11 | 2009-07-30 | Olympus Corp | Color reproduction device and color reproduction program |
WO2009104675A1 (ja) * | 2008-02-20 | 2009-08-27 | クラリオン株式会社 | Vehicle surroundings image display system |
JP2009225322A (ja) * | 2008-03-18 | 2009-10-01 | Hyundai Motor Co Ltd | Vehicle information display system |
US8334876B2 (en) * | 2008-05-22 | 2012-12-18 | Sanyo Electric Co., Ltd. | Signal processing device and projection display apparatus |
2009
- 2009-12-07 WO PCT/JP2009/070488 patent/WO2011070640A1/ja active Application Filing
- 2009-12-07 CN CN200980162790.0A patent/CN102714710B/zh not_active Expired - Fee Related
- 2009-12-07 EP EP09852037.2A patent/EP2512133B1/en active Active
- 2009-12-07 US US13/514,575 patent/US20120249789A1/en not_active Abandoned
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2744694A4 (en) * | 2011-08-17 | 2015-07-22 | Lg Innotek Co Ltd | CAMERA DEVICE FOR A VEHICLE |
US10155476B2 (en) | 2011-08-17 | 2018-12-18 | Lg Innotek Co., Ltd. | Camera apparatus of vehicle |
JP2013162328A (ja) * | 2012-02-06 | 2013-08-19 | Fujitsu Ten Ltd | Image processing device, image processing method, program, and image processing system |
WO2016027689A1 (ja) * | 2014-08-21 | 2016-02-25 | アイシン精機株式会社 | Image display control device and image display system |
JP2016043778A (ja) * | 2014-08-21 | 2016-04-04 | アイシン精機株式会社 | Image display control device and image display system |
JP2016058801A (ja) * | 2014-09-05 | 2016-04-21 | アイシン精機株式会社 | Image display control device and image display system |
EP3089075A3 (en) * | 2015-04-09 | 2017-01-25 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Vehicular viewing device |
WO2019074005A1 (ja) * | 2017-10-10 | 2019-04-18 | マツダ株式会社 | Vehicle display device |
JP2019071547A (ja) * | 2017-10-10 | 2019-05-09 | マツダ株式会社 | Vehicle display device |
EP3967554A1 (en) * | 2020-09-15 | 2022-03-16 | Mazda Motor Corporation | Vehicular display system |
US20220080884A1 (en) * | 2020-09-15 | 2022-03-17 | Hyundai Motor Company | Device and method for controlling emotional lighting of vehicle |
US11603040B2 (en) * | 2020-09-15 | 2023-03-14 | Hyundai Motor Company | Device and method for controlling emotional lighting of vehicle |
Also Published As
Publication number | Publication date |
---|---|
EP2512133B1 (en) | 2018-07-18 |
EP2512133A1 (en) | 2012-10-17 |
US20120249789A1 (en) | 2012-10-04 |
EP2512133A4 (en) | 2016-11-30 |
CN102714710A (zh) | 2012-10-03 |
CN102714710B (zh) | 2015-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5118605B2 (ja) | Vehicle surroundings image display system | |
WO2011070640A1 (ja) | Vehicle surroundings image display system | |
JP5421788B2 (ja) | Vehicle surroundings image display system | |
US11572017B2 (en) | Vehicular vision system | |
US8754760B2 (en) | Methods and apparatuses for informing an occupant of a vehicle of surroundings of the vehicle | |
JP5121737B2 (ja) | Virtual spotlight for identifying important objects in image data | |
JP5251947B2 (ja) | Vehicle image display device | |
JP2010109684A5 (ja) | ||
WO2018105417A1 (ja) | Imaging device, image processing device, display system, and vehicle | |
JP5459154B2 (ja) | Vehicle surroundings image display device and method | |
EP2476587B1 (en) | Vehicle surrounding monitor apparatus | |
JP5516997B2 (ja) | Image generation device | |
KR101657673B1 (ko) | Apparatus and method for generating a panoramic view | |
JP7073237B2 (ja) | Image display device and image display method | |
JP6781035B2 (ja) | Imaging device, image processing device, display system, and vehicle | |
JP6762863B2 (ja) | Imaging device, image processing device, display system, and vehicle | |
WO2021240872A1 (ja) | Display control device, vehicle, and display control method | |
JP2021007255A (ja) | Imaging device, image processing device, display system, and vehicle | |
CN111086518B (zh) | Display method and device, in-vehicle head-up display apparatus, and storage medium | |
JP7296490B2 (ja) | Display control device and vehicle | |
JP7007438B2 (ja) | Imaging device, image processing device, display device, display system, and vehicle | |
WO2021240873A1 (ja) | Display control device, vehicle, and display control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 200980162790.0; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09852037; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 13514575; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2009852037; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: JP |