WO2016027689A1 - Image display control device and image display system - Google Patents

Image display control device and image display system

Info

Publication number
WO2016027689A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle body
vehicle
unit
display
Prior art date
Application number
PCT/JP2015/072431
Other languages
English (en)
Japanese (ja)
Inventor
淳邦 橋本
Original Assignee
アイシン精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アイシン精機株式会社
Publication of WO2016027689A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Embodiments described herein relate generally to an image display control device and an image display system.
  • The image display control device of the embodiment includes, for example, a mirror image creation unit that creates a mirror image showing at least the area behind the vehicle based on an image captured by an imaging unit provided on the vehicle body; an image superimposing unit that superimposes a vehicle body image showing at least a part of the vehicle body on the mirror image created by the mirror image creation unit; and a display control unit that controls a display device so that an image in which the mirror image and the vehicle body image are superimposed by the image superimposing unit is displayed. At least one of the luminance and the color of the vehicle body image is adjusted in correspondence with the mirror image. Therefore, according to the present embodiment, for example, the discriminability of the vehicle body image is easily improved by adjusting at least one of the luminance and the color of the vehicle body image in correspondence with the mirror image.
  • At least one of the luminance and color of the vehicle body image differs depending on the location of the vehicle body image. Therefore, for example, at least one of the luminance and the color is adjusted corresponding to the mirror image for each location of the vehicle body image, so that the discriminability of the vehicle body image is more likely to be improved.
  • In the image display control device, for example, at least one of the luminance and the color of the vehicle body image changes as the mirror image changes. Therefore, for example, since at least one of the luminance and the color of the vehicle body image is adjusted according to the temporal change of the mirror image, the discriminability of the vehicle body image is more likely to be improved.
  • the brightness of the vehicle body image is set so that the difference from the brightness of the mirror image is a predetermined value or more. Therefore, for example, the discriminability of the body image is more likely to increase due to the difference between the brightness of the mirror image and the brightness of the body image.
  • For example, the color of the vehicle body image is set to the complementary color of the color of the region of the mirror image that the vehicle body image overlaps. Therefore, for example, the discriminability of the body image is more likely to increase due to the difference between the hue of the mirror image and the hue of the body image.
  • the image superimposing unit sets a boundary line for the mirror image, and the boundary line is arranged between a plurality of images formed with different luminances. Therefore, for example, the discriminability is more likely to increase due to the difference in luminance between both sides of the boundary line.
  • the image superimposing unit sets a boundary line for the mirror image, and the boundary line is arranged between a plurality of images formed with different colors. Therefore, for example, the distinguishability is likely to increase due to the difference in color between both sides of the boundary line.
  • The image display system includes, for example, an imaging unit that is provided on the vehicle body and captures at least the area behind the vehicle, a display device, and an image display control device.
  • FIG. 1 is an exemplary schematic configuration diagram of an image display system according to an embodiment.
  • FIG. 2 is a diagram illustrating an example of an image displayed on the display device of the image display system according to the embodiment.
  • FIG. 3 is a plan view of an example of an imaging range by the imaging unit of the image display system according to the embodiment.
  • FIG. 4 is a side view of an example of an imaging range by the imaging unit of the image display system of the embodiment.
  • FIG. 5 is a plan view of another example of the imaging range by the imaging unit of the image display system of the embodiment.
  • FIG. 6 is a side view of another example of the imaging range by the imaging unit of the image display system of the embodiment.
  • FIG. 7 is an exemplary explanatory diagram of the entire area and the display range of the outside image obtained by the image display system of the embodiment.
  • FIG. 8 is a diagram showing an example of a display range in an outside image obtained by the image display system of the embodiment.
  • FIG. 9 is a diagram illustrating an example of a vehicle body image displayed on the display device of the image display system according to the embodiment.
  • FIG. 10 is an exemplary block diagram of a control unit included in the image display system of the embodiment.
  • FIG. 11 is an exemplary flowchart of a processing procedure of the image display system according to the embodiment.
  • FIG. 12 is an exemplary conceptual diagram including the position of the vehicle at the time of parking and an image (output image) displayed on the display device of the image display system according to the embodiment at the position.
  • FIG. 13 is a diagram showing an example of an image displayed on the display device of the image display system of the embodiment, and is a diagram in a state where the vehicle is running on a road.
  • FIG. 14 is a diagram illustrating an example of a vehicle body image of the image display system according to the embodiment.
  • FIG. 15 is a diagram illustrating another example of the vehicle body image of the image display system according to the embodiment.
  • the image display system 100 equipped in the vehicle 1 includes an ECU 11 (electronic control unit) that controls an image displayed on a display unit 10 a as a display device.
  • the ECU 11 is an example of a display control unit or an image display control device.
  • the display unit 10a is provided in place of, for example, a rear view mirror (not shown) provided at the front and upper part of the vehicle interior. Based on the image captured by the imaging unit 12, the display unit 10a displays an image resembling a mirror image reflected in a room mirror provided at the front and upper part of the vehicle interior as illustrated in FIG.
  • A passenger such as the driver can use the display unit 10a as a room mirror or in place of a room mirror.
  • the room mirror can also be referred to as a rearview mirror.
  • the display unit 10a of the housing 10 can be attached to the rearview mirror by an attachment or an attachment so as to cover the mirror surface of the rearview mirror, for example.
  • On the display unit 10a, an image that is reversed left and right with respect to the image captured by the imaging unit 12 provided outside the vehicle, that is, outside the passenger compartment, is displayed.
  • the display unit 10a can be configured, for example, as an LCD (liquid crystal display), an OELD (organic electro-luminescent display), a projector device, or the like.
  • The ECU 11 may be accommodated in the housing 10.
  • a half mirror (not shown) may be provided on the front side, that is, the rear side of the display unit 10a. In this case, in a state where the image display system 100 is not used and no image is displayed on the display unit 10a, the half mirror can be used as a room mirror. Further, the housing 10 may be provided with an imaging unit 12I as shown in FIGS. 3 to 6 that captures indoor images.
  • the output image Im as an image displayed on the display unit 10 a includes an out-of-vehicle image Imo indicated by a solid line and a vehicle body image Imi indicated by a broken line.
  • the vehicle outside image Imo can be generated from images acquired by one or a plurality of imaging units 12.
  • the imaging unit 12 is a digital camera that incorporates an imaging element such as a CCD (charge coupled device) or a CIS (CMOS image sensor).
  • the imaging unit 12 can output image data, that is, moving image data at a predetermined frame rate.
  • The imaging unit 12 can include an imaging unit 12R that images the area behind the vehicle body, that is, the area to the rear outside the passenger compartment, and an imaging unit 12S that images the side of the vehicle body, that is, the area to the side outside the passenger compartment.
  • the imaging units 12R and 12S may capture images including both the rear and side of the vehicle body.
  • The imaging unit 12 may have a wide-angle lens or a fisheye lens.
  • the ECU 11 can synthesize images acquired by the plurality of imaging units 12 by a known technique to obtain a series of outside-vehicle images Imo as illustrated in FIG.
  • the vehicle outside image Imo may be a panoramic image.
  • The plurality of imaging units 12 image a relatively wide range behind and to the sides of the vehicle 1 so that the outside image Imo at each position within that range can be displayed on the display unit 10a. Then, as shown in FIG. 8, only a part of that wide range is used, that is, displayed, in the composite image as the output image Im.
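  • As a rough illustration of this composition, the sketch below simply concatenates three camera frames side by side and crops a horizontal window as the display range Ad. This is only a minimal stand-in: the frame sizes, the plain concatenation instead of boundary blending, and the window parameters are assumptions, not details taken from the embodiment.

```python
import numpy as np

def make_outside_image(left, rear, right):
    """Join three camera frames (H x W x 3 uint8 arrays) into one wide
    outside image Imo. A real system would blend the overlapping border
    regions; here the frames are simply concatenated."""
    assert left.shape == rear.shape == right.shape
    return np.hstack([left, rear, right])

def crop_display_range(imo, center_x, width):
    """Extract the display range Ad, a horizontal window of the wide
    outside image centered on center_x."""
    h, w, _ = imo.shape
    x0 = int(np.clip(center_x - width // 2, 0, w - width))
    return imo[:, x0:x0 + width]

# Usage: shift the window toward the right, e.g. when the right turn signal is on.
frames = [np.zeros((480, 640, 3), np.uint8) for _ in range(3)]
imo = make_outside_image(*frames)
ad = crop_display_range(imo, center_x=imo.shape[1] // 2 + 200, width=800)
```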
  • the imaging units 12 that image the outside of the vehicle are provided on each of the left and right sides of the vehicle 1 and are provided on the rear end 1 b of the vehicle 1.
  • the imaging unit 12S that images the side and the rear outside the vehicle compartment is provided in the door mirror
  • the imaging unit 12R that images the rear and the side outside the vehicle cabin is provided in the rear hatch.
  • the imaging unit 12 that images the outside of the vehicle is provided at the left end and the right end of the rear end portion 1 b of the vehicle 1 and at the rear end portion 1 b of the vehicle 1.
  • the imaging ranges of the plurality of imaging units 12 are vertically different.
  • The imaging units 12S that capture the area to the side of and behind the passenger compartment are provided at the upper portions of the left and right corners on the rear side of the vehicle body, respectively, and the imaging unit 12R that captures the area behind and to the side of the passenger compartment is provided on the rear bumper.
  • the outside image Imo may be an image captured by one imaging unit 12 or an image based on a part thereof.
  • The outside image Imo is an example of a mirror image. In FIG. 2, the outside image Imo is indicated by a solid line and the vehicle body image Imi by a broken line for the sake of convenience; the actual outside image Imo is not limited to solid lines, and the actual vehicle body image Imi is not limited to broken lines.
  • the imaging unit 12 may be provided in the vehicle interior.
  • the vehicle body image Imi includes a contour line Lo or an edge as a display element drawn in a three-dimensional frame shape showing the structure of the vehicle body.
  • the vehicle body components indicated by the contour line Lo are, for example, corners, edges, windows, pillars, doors, floors, ceilings, trims, wheels, axles, differential gears, and the like of the vehicle body.
  • the vehicle body image Imi is not limited to the vehicle body shape as long as the occupant can roughly recognize the position and shape of the vehicle 1.
  • the vehicle body image Imi may be a schematic one.
  • the region between the contour lines Lo may be colored in a state where the vehicle outside image Imo is transmitted.
  • the vehicle body image Imi is a line drawing (line diagram).
  • A line as a display element included in a line drawing can have various display modes. Examples of display modes include type, surface density, width, thickness, density, transparency, color, and pattern. The types include, for example, a solid line, a dashed line, a one-dot chain line, a two-dot chain line, a bent line, a jagged line, and a wavy line.
  • the surface density is a density per unit area of the screen or image. For example, in the case of the same thickness, the surface density of the solid line is larger than the surface density of the broken line.
  • the line drawing can include a plurality of lines having locally different display modes.
  • the line drawing may partially include dots, symbols, characters, figures, and the like. The display mode of these line drawings can be set or changed according to the vehicle state such as the traveling state or the operation state.
  • Since the vehicle body image Imi is displayed together with the outside image Imo on the display unit 10a, the occupant can recognize the relative position between the vehicle 1 and the object B outside the vehicle, the distance between the vehicle 1 and the object B, the direction of the object B, the size of the object B, and the like.
  • The vehicle body image Imi can include a portion Pw indicating an end portion of the vehicle body in the vehicle width direction, a portion Pr indicating the rear end portion of the vehicle body, and a portion Pb indicating the lower portion of the vehicle body. A portion Psr indicating a side or rear end portion of the vehicle body, or the like, may also be included.
  • The vehicle body image Imi is created so that at least the lower part of the vehicle body can be recognized planarly, that is, two-dimensionally. Therefore, the occupant can easily recognize, for example, the planar size, shape, and parts of the vehicle body. For example, with reference to the vehicle body image Imi, the occupant can easily recognize the size and height of the object B outside the passenger compartment, the positional relationship of the object B in the horizontal direction, and the like.
  • the vehicle body image Imi is stored in advance in a nonvolatile storage unit.
  • the storage unit can store vehicle body images Imi for a plurality of vehicle types. In this case, for example, the vehicle body image Imi selected according to the vehicle type of the vehicle 1, the user's preference, or the like can be used as the composite image.
  • the storage unit may be, for example, the SSD 11d shown in FIG.
  • The ECU 11 can deform the vehicle body image Imi based on an input instruction or operation at the operation input unit 10b during a setting operation such as calibration. Specifically, for example, the vehicle body image Imi is deformed such that it is stretched left and right toward its upper side, stretched up and down, or shifted vertically and horizontally.
  • the changed body image Imi is stored in the storage unit, and the changed body image Imi is used as the composite image.
  • As the vehicle body image Imi, an indoor image captured by the imaging unit 12I, an image obtained by modifying that indoor image, or the like may be used.
  • The ECU 11 can set or change the transmittance α of the vehicle body image Imi, that is, its composition ratio with the outside image Imo. When the vehicle body image Imi has a luminance of x1 and the outside image Imo has a luminance of x2, the luminance x at each point of the composite image is, for example, x = (1 − α) × x1 + α × x2. The transmittance α can be set to an arbitrary value.
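  • As a small sketch of this composition, the function below blends the two images per pixel with the transmittance α. The alpha-blend formula follows the expression above; treating the whole frame as covered by the body image (rather than only the drawn lines) is a simplification made for illustration.

```python
import numpy as np

def blend_body_over_outside(imi, imo, alpha):
    """Composite the vehicle body image Imi over the outside image Imo
    with transmittance alpha: alpha = 0 shows only the body image,
    alpha = 1 lets the outside image Imo show through completely."""
    out = (1.0 - alpha) * imi.astype(np.float32) + alpha * imo.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage: a fairly transparent body image over the outside image.
imi = np.full((480, 800, 3), 255, np.uint8)   # stand-in body image
imo = np.zeros((480, 800, 3), np.uint8)       # stand-in outside image
im = blend_body_over_outside(imi, imo, alpha=0.6)
```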
  • the ECU 11 can change the display range Ad of the output image Im and the vehicle outside image Imo as a composite image according to the situation of the vehicle 1.
  • the ECU 11 can use detection results of various sensors, instruction signals, and the like as signals or data serving as triggers for changing the display range Ad.
  • The detection results include, for example, those of the non-contact measuring device 13 shown in FIG. 1, the steering angle sensor 14 for the front wheels, the steering angle sensor 15a of the rear wheel steering system 15, the GPS 16 (global positioning system), the wheel speed sensor 17, the brake sensor 18a of the brake system 18, the accelerator sensor 19, the torque sensor 20a of the front wheel steering system 20, the shift sensor 21, and the like.
  • the instruction signal is, for example, an instruction signal acquired from the direction indicator 22 or the operation input unit 24b.
  • the instruction signal can also be referred to as a control signal, a switching signal, an operation signal, an input signal, instruction data, or the like.
  • According to the detection result of the object detection unit 111 shown in FIG. 10, the position of the vehicle 1 acquired by the vehicle position acquisition unit 113, the detection result of the background detection unit 114, and the like, the ECU 11 can set or change the display range Ad of the output image Im and the outside image Imo, as well as the transparency α, the color (hue), the luminance, the saturation, and the like of the vehicle body image Imi.
  • the ECU 11 may set or change the thickness of the outline Lo of the vehicle body image Imi, the presence or absence of a shadow, and the like.
  • the electrical components included in the image display system 100 are connected electrically or communicably via the in-vehicle network 23, for example.
  • The electrical components include, for example, the non-contact measuring device 13, the steering angle sensor 14, the steering angle sensor 15a, the GPS 16, the wheel speed sensor 17, the brake sensor 18a, the accelerator sensor 19, the torque sensor 20a, the shift sensor 21, the direction indicator 22, and the operation input unit 24b. The in-vehicle network 23 is, for example, a CAN (controller area network).
  • Each electrical component may be electrically or communicably connected via other than CAN.
  • the non-contact measuring device 13 is, for example, a sonar or radar that emits ultrasonic waves or radio waves and captures the reflected waves.
  • the ECU 11 can measure the presence / absence of the object B as an obstacle and the distance to the object B as shown in FIG. 2 positioned around the vehicle 1 based on the detection result of the non-contact measuring device 13. That is, the non-contact measuring device 13 is an example of a distance measuring unit and an object detecting unit.
  • the steering angle sensor 14 is a sensor that detects a steering amount of a steering wheel (not shown) as a steering unit, and is configured using, for example, a hall element.
  • the rudder angle sensor 15a is a sensor that detects the steering amount of the rear wheel 2R, and is configured using, for example, a hall element. Note that the steering amount is detected as, for example, a rotation angle.
  • the wheel speed sensor 17 is a sensor that detects the amount of rotation of the wheel 2 (2F, 2R) and the number of rotations per unit time, and is configured using, for example, a hall element.
  • the ECU 11 can calculate the amount of movement of the vehicle 1 based on the data acquired from the wheel speed sensor 17.
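  • The conversion from wheel rotation to travelled distance is simple; the sketch below assumes a pulse-type wheel speed sensor, and the pulse count per revolution and the tire diameter are illustrative values, not parameters given in the embodiment.

```python
import math

def travel_distance(pulse_count, pulses_per_rev=48, tire_diameter_m=0.65):
    """Estimate the distance travelled by the vehicle from wheel speed
    sensor pulses: revolutions times tire circumference."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions * math.pi * tire_diameter_m

# 480 pulses -> 10 revolutions -> roughly 20.4 m with a 0.65 m tire
print(round(travel_distance(480), 1))
```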
  • the wheel speed sensor 17 may be provided in the brake system 18.
  • The brake system 18 includes, for example, an ABS (anti-lock brake system) that suppresses brake locking, a skid prevention device (ESC: electronic stability control) that suppresses side slip of the vehicle 1 during cornering, an electric brake system (BBW: brake by wire) that enhances braking force, and the like.
  • the brake system 18 applies a braking force to the wheels 2 via an actuator (not shown), and decelerates the vehicle 1.
  • the brake sensor 18a is, for example, a sensor that detects an operation amount of a brake pedal.
  • the accelerator sensor 19 is a sensor that detects the amount of operation of the accelerator pedal.
  • the torque sensor 20a detects torque that the driver gives to the steering unit.
  • the shift sensor 21 is, for example, a sensor that detects the position of the movable part of the speed change operation part, and is configured using a displacement sensor or the like.
  • the movable part is, for example, a lever, an arm, a button, or the like. Note that the configuration, arrangement, electrical connection form, and the like of the various sensors and actuators described above are examples, and can be variously set or changed.
  • the direction indicator 22 outputs a signal for instructing to turn on, turn off, blink, etc. the direction indication light.
  • the display unit 10a can be covered with a transparent operation input unit 10b.
  • the operation input unit 10b is, for example, a touch panel.
  • a passenger or the like can visually recognize an image displayed on the display screen of the display unit 10a via the operation input unit 10b.
  • The occupant or the like can perform various operation inputs in the image display system 100 by touching, pressing, or moving the operation input unit 10b with a finger or the like at a position corresponding to the image displayed on the display screen of the display unit 10a.
  • a display unit 24a different from the display unit 10a and an audio output device 24c are provided in the vehicle.
  • the display unit 24a is, for example, an LCD or an OELD.
  • the audio output device 24c is, for example, a speaker.
  • the display unit 24a is covered with a transparent operation input unit 24b.
  • the operation input unit 24b is, for example, a touch panel.
  • a passenger or the like can visually recognize an image displayed on the display screen of the display unit 24a via the operation input unit 24b.
  • An occupant or the like can perform an operation input by operating the operation input unit 24b with a finger or the like at a position corresponding to the image displayed on the display screen of the display unit 24a.
  • the monitor device 24 can include an operation input unit (not shown) such as a switch, a dial, a joystick, and a push button, for example.
  • the monitor device 24 can also be used as a navigation system or an audio system.
  • the ECU 11 can cause the display unit 24a of the monitor device 24 to display an image similar to that of the display unit 10a.
  • the ECU 11 includes, for example, a CPU 11a (central processing unit), a ROM 11b (read only memory), a RAM 11c (random access memory), an SSD 11d (solid state drive), a display control unit 11e, a voice control unit 11f, and the like.
  • the SSD 11d may be a flash memory.
  • the CPU 11a can execute various calculations.
  • the CPU 11a can read a program installed and stored in a nonvolatile storage device such as the ROM 11b or the SSD 11d, and execute arithmetic processing according to the program.
  • the RAM 11c temporarily stores various types of data used in computations by the CPU 11a.
  • the SSD 11d is a rewritable nonvolatile storage unit, and can store data even when the ECU 11 is powered off.
  • Among the arithmetic processing in the ECU 11, the display control unit 11e mainly executes image processing using the image data obtained by the imaging unit 12 and image processing of the image data to be displayed on the display units 10a and 24a. The image processing is, for example, composition.
  • the voice control unit 11f mainly executes processing of voice data output from the voice output device 24c among the arithmetic processing in the ECU 11.
  • the CPU 11a, the ROM 11b, the RAM 11c, and the like can be integrated in the same package.
  • the ECU 11 may have a configuration in which another logical operation processor such as a DSP (digital signal processor) or a logic circuit is used instead of the CPU 11a.
  • An HDD (hard disk drive) may be provided instead of the SSD 11d, and the SSD 11d or the HDD may be provided separately from the ECU 11.
  • the output image Im corresponding to the mirror image of the rearview mirror is displayed on the display unit 10a by the image processing of the ECU 11.
  • Functions, coefficients, constants, data, and the like for performing the coordinate transformation from the outside image Imo to the output image Im corresponding to the mirror image of the room mirror can be obtained by actually acquiring the positions, within the mirror image of the room mirror, of a plurality of markers actually placed outside or inside the vehicle, by performing calibration based on imaging, or by performing a geometric calculation.
  • the function may be a conversion formula, a conversion matrix, or the like.
  • The output image Im is, for example, an image resembling the mirror image of a room mirror, an image registered (aligned) with it, or another suitable image.
  • Similarly, the position, size, shape, and the like of the vehicle body image Imi can be obtained by actually acquiring the positions, within the mirror image of the room mirror, of multiple markers actually placed outside or inside the vehicle, by performing calibration based on imaging, or by performing a geometric calculation.
  • The ECU 11 functions as at least a part of the image display control device through cooperation of hardware and software (program). That is, in this embodiment, as shown in FIG. 10 for example, the ECU 11 functions, in addition to the display control unit 11e and the voice control unit 11f also shown in FIG. 1, as an outside image creation unit 110, an object detection unit 111, an image creation unit 112, a vehicle position acquisition unit 113, a background detection unit 114, a display mode change unit 115, a display range determination unit 116, an additional image creation unit 117, and the like.
  • The program can include, for example, modules corresponding to the blocks shown in FIG. 10. The image processing can also be executed by the CPU 11a in addition to the display control unit 11e.
  • the vehicle exterior image creation unit 110 is an example of a mirror image creation unit
  • the image creation unit 112 is an example of an image superimposing unit.
  • the outside-vehicle image creation unit 110 connects, for example, a plurality of (for example, three) images captured by the imaging unit 12 outside the vehicle by combining their boundary portions, and creates a series of outside-vehicle images Imo.
  • the vehicle exterior image creation unit 110 creates a vehicle exterior image Imo that resembles the mirror image of the room mirror by the line of sight of the occupant by performing coordinate conversion or the like on the image captured by the imaging unit 12 or the synthesized image.
  • the coordinates of the outside-vehicle image Imo obtained from the imaging unit 12 are converted into coordinates corresponding to the vehicle body image Imi based on experimental results obtained in advance.
  • The size of the image Imb of the object B in the outside image Imo can be corrected using the result of measuring the distance to the object B with the non-contact measuring device 13. Note that the outside image Imo only needs to cause little or no discomfort to the occupant viewing it, and does not have to be perfectly aligned with the mirror image that would be obtained if a room mirror were provided.
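  • The embodiment obtains the transformation from the captured image to the mirror-like view by marker calibration or geometric calculation; the sketch below assumes that result is available as a 3x3 homography matrix H and applies it with a plain inverse-mapping warp plus a left-right flip. The nearest-neighbour sampling and the final flip are simplifications for illustration, not the method prescribed by the embodiment.

```python
import numpy as np

def warp_to_mirror_view(src, H, out_shape):
    """Warp a captured frame into the room-mirror-like view using a 3x3
    homography H obtained beforehand (e.g. by marker calibration).
    Inverse mapping with nearest-neighbour sampling, then a horizontal
    flip so the result reads like a mirror image."""
    h_out, w_out = out_shape
    h_in, w_in = src.shape[:2]
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    dst = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).T
    s = Hinv @ dst
    s /= s[2]
    sx, sy = np.round(s[0]).astype(int), np.round(s[1]).astype(int)
    valid = (sx >= 0) & (sx < w_in) & (sy >= 0) & (sy < h_in)
    out = np.zeros((h_out, w_out, 3), dtype=src.dtype)
    out.reshape(-1, 3)[valid] = src[sy[valid], sx[valid]]
    return np.fliplr(out)
```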
  • the object detection unit 111 detects the object B outside the vehicle by performing image processing on the vehicle outside image Imo created by the vehicle outside image creation unit 110, for example.
  • the object B is, for example, a vehicle, an object, a person, or the like.
  • The object detection unit 111 can also detect the object B outside the vehicle from the data obtained from the non-contact measuring device 13, or from both the image processing result of the outside image Imo and the data obtained from the non-contact measuring device 13.
  • the object detection unit 111 may acquire the distance from the vehicle 1 to the object B from the result of image processing of the vehicle outside image Imo or the data obtained from the non-contact measurement device 13.
  • the image creation unit 112 creates an output image Im including a composite image in which the vehicle body image Imi and the vehicle exterior image Imo are superimposed on at least the display range Ad displayed on the display unit 10a.
  • the image creating unit 112 may further synthesize an indoor image based on the imaging unit 12I (see FIGS. 3 to 6 and the like) that images the vehicle interior. In this case, for example, an indoor image in which a window portion is cut out by image processing can be superimposed as a transmission image.
  • The vehicle position acquisition unit 113 can acquire the position of the vehicle 1 from, for example, data from the GPS 16, detection results of the non-contact measuring device 13, the wheel speed detected by the wheel speed sensor 17, the steering angles detected by the steering angle sensors 14 and 15a, and the image processing result of the outside image Imo acquired by the imaging unit 12.
  • the background detection unit 114 detects the characteristics of the background image, for example, by performing image processing on the vehicle exterior image Imo created by the vehicle exterior image creation unit 110 serving as the background of the vehicle body image Imi.
  • the characteristics of the detected image are, for example, the color, brightness, and saturation of the image.
  • the background detection unit 114 can acquire image characteristics of a partial region of the outside-vehicle image Imo.
  • the background detection unit 114 can acquire image attributes of a region overlapping with the vehicle body image Imi and a region adjacent thereto.
  • the background detection unit 114 can acquire image characteristics for each part, for example, for each preset section. Note that the background detection unit 114 may detect an average value or a total value of image characteristics for the entire outside-vehicle image Imo.
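  • As a simple stand-in for this per-section analysis, the sketch below splits the outside image into a coarse grid and returns the mean colour and brightness of each cell. The 2x3 grid and the use of the plain RGB mean as the brightness value are assumptions made only for illustration.

```python
import numpy as np

def region_background_stats(imo, grid=(2, 3)):
    """Return the mean RGB colour and mean brightness of each cell of a
    coarse grid laid over the outside image Imo (H x W x 3 uint8)."""
    h, w, _ = imo.shape
    rows, cols = grid
    stats = []
    for r in range(rows):
        for c in range(cols):
            tile = imo[r * h // rows:(r + 1) * h // rows,
                       c * w // cols:(c + 1) * w // cols]
            mean_rgb = tile.reshape(-1, 3).mean(axis=0)
            stats.append({"cell": (r, c),
                          "mean_rgb": mean_rgb,
                          "brightness": float(mean_rgb.mean())})
    return stats
```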
  • The display mode changing unit 115 can change the display mode of at least one of the vehicle body image Imi and the outside image Imo according to, for example, information indicating the vehicle state, such as detection results acquired from the non-contact measuring device 13, the steering angle sensors 14 and 15a, the GPS 16, the wheel speed sensor 17, the brake sensor 18a, the accelerator sensor 19, the torque sensor 20a, the shift sensor 21, and the direction indicator 22, the instruction signal of the operation input unit 24b, the detection result of the object detection unit 111, the position of the vehicle 1 acquired by the vehicle position acquisition unit 113, and the detection result of the background detection unit 114.
  • The display mode changing unit 115 can change the transmittance α, the color, the luminance, the saturation, and the like of the vehicle body image Imi.
  • the display mode changing unit 115 can change the transparency of the indoor image.
  • The display range determination unit 116 can change the display range Ad according to, for example, detection results acquired from the non-contact measuring device 13, the steering angle sensors 14 and 15a, the GPS 16, the wheel speed sensor 17, the brake sensor 18a, the accelerator sensor 19, the torque sensor 20a, the shift sensor 21, and the direction indicator 22, the instruction signal of the operation input unit 24b, the detection result of the object detection unit 111, the position of the vehicle 1 acquired by the vehicle position acquisition unit 113, and the like.
  • the additional image creation unit 117 can add the additional image Ima to the output image Im, for example.
  • the additional image Ima is an artificial image such as an emphasis display of an object detected by the object detection unit 111 or a display of a line such as a lane or a parking frame.
  • the highlighting is, for example, a frame or a fill.
  • the image display system 100 can execute processing in a procedure as shown in FIG. 11, for example.
  • First, the ECU 11 compares the data and signals obtained from the non-contact measuring device 13, the steering angle sensors 14 and 15a, the GPS 16, the wheel speed sensor 17, the brake sensor 18a, the accelerator sensor 19, the torque sensor 20a, the shift sensor 21, the direction indicator 22, and the like with respective reference values, and determines whether a condition for changing the display range Ad is satisfied (S1).
  • When the condition for changing the display range Ad is satisfied, the ECU 11 functions as the display range determination unit 116 and changes the display range Ad to a position or width according to the condition (S2).
  • Next, the ECU 11 compares the data and signals acquired from the non-contact measuring device 13, the steering angle sensors 14 and 15a, the GPS 16, the wheel speed sensor 17, the brake sensor 18a, the accelerator sensor 19, the torque sensor 20a, the shift sensor 21, the direction indicator 22, and the like with respective reference values, and determines whether a condition for changing the display mode is satisfied (S3).
  • When the condition for changing the display mode is satisfied, the ECU 11 functions as the display mode changing unit 115 and changes the display mode according to the condition (S4).
  • An example of S4 will be described later.
  • The ECU 11 then functions as the outside image creation unit 110, the object detection unit 111, and the like, as well as the image creation unit 112, and creates an output image Im corresponding to the set (or changed) display mode and display range Ad (S5). In S5, an output image Im including the additional image Ima can be created. Then, the display control unit 11e controls the display unit 10a so that the created output image Im is displayed (S6).
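  • The sketch below strings steps S1 to S6 together as one display update cycle. It is only pseudocode-style Python: the `ecu` object and every method name on it are hypothetical stand-ins for the units described above, not an API defined by the embodiment.

```python
def display_cycle(ecu, sensors, camera_frames):
    """One pass through the procedure of FIG. 11 (S1-S6)."""
    # S1/S2: change the display range Ad when a trigger condition holds
    if ecu.display_range_condition_met(sensors):
        ecu.determine_display_range(sensors)

    # S3/S4: change the display mode (e.g. colour, luminance, transmittance)
    if ecu.display_mode_condition_met(sensors):
        ecu.change_display_mode(sensors)

    # S5: create the output image Im for the current mode and range
    imo = ecu.create_outside_image(camera_frames)      # mirror image Imo
    objects = ecu.detect_objects(imo, sensors)
    im = ecu.create_output_image(imo, objects)         # Imi + Imo (+ Ima)

    # S6: have the display control unit show the result
    ecu.display(im)
```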
  • the vehicle body image Imi includes a plurality of lines Ll along the vehicle longitudinal direction and a plurality of lines Lw along the vehicle width direction corresponding to the lower part of the vehicle 1.
  • the lines Ll and Lw are arranged in a frame shape or a lattice shape, and a display area Ab corresponding to the lower part of the vehicle body is formed.
  • the display mode of the lines Lcl and Lsw is different from the display mode of the other lines Ll and Lw.
  • the widths (thicknesses) of the lines Lcl and Lsw are larger than the widths of the lines Ll and Lw
  • the luminances of the lines Lcl and Lsw are higher than the luminances of the lines Ll and Lw.
  • the vehicle body image Imi includes a line Lv (part Psr) along the vehicle vertical direction at the side portion or the rear portion of the vehicle body.
  • the vehicle body image Imi includes lines Ll and Lw as display elements whose intervals become narrower toward the rear of the vehicle.
  • Each feature makes it easier for the occupant to recognize the position of the floor of the vehicle body, for example.
  • the occupant can easily recognize, for example, the position of the vehicle body in the vehicle longitudinal direction and the position of the vehicle body in the vehicle width direction.
  • Compared with the case where the lines Lcl and Lsw are provided at other heights, the occupant can recognize the position in the vehicle front-rear direction and the position in the vehicle width direction at a height corresponding to the lower part of the vehicle body, so the position of the object B in the vehicle front-rear direction or the vehicle width direction can be recognized more easily and with higher accuracy.
  • the occupant can easily recognize the position of the vehicle body in the vertical direction of the vehicle, for example.
  • Lines Ll, Lw, Lcl, Lsw, and Lv are examples of display elements.
  • the display area Ab is an example of an area.
  • the lines Ll, Lw, Lcl, Lsw, Lv and the display area Ab are examples of scales.
  • the line Lv does not need to extend in the vertical direction, and may be inclined with respect to the vertical direction.
  • a plurality of lines Lv may be arranged along the other lines Ll and Lw at intervals like a scale.
  • FIG. 12 shows, for example, the output image Im when the vehicle 1 is being parked by turning to the left and reversing toward the predicted arrival position.
  • The additional image creation unit 117 adds a frame-like highlight Imf1 surrounding the image Imb of the detected object B, adds a band-like highlight Imf2 overlapping the image Iml of the frame line L on the road surface, adds an image Imf3 indicating an expected arrival position separated from the current position by a predetermined distance, and further adds a linear image Imf4 indicating the movement route predicted from the expected arrival position, the steering angle, and the like.
  • the images Imf3 and Imf4 correspond to the display area Ab corresponding to the lower part of the vehicle body.
  • the side edges of the images Imf3 and Imf4 can be drawn so as to coincide with the side edges of the display area Ab at the predicted arrival position.
  • the driver can more easily recognize the surroundings of the vehicle 1, the situation outside the vehicle in the traveling direction, the parking target position P, the predicted arrival position, the movement route, and the like.
  • the occupant can easily grasp the expected movement path of the vehicle 1 based on the vehicle body image Imi including the display area Ab and the images Imf3 and Imf4 corresponding to the display area Ab.
  • The display mode changing unit 115 changes the color, brightness, saturation, and the like of the vehicle body image Imi for each location according to the outside image Imo serving as the background. Specifically, the display mode changing unit 115 sets the color of the vehicle body image Imi to, for example, black in a portion overlapping the sky region above the horizon in the outside image Imo, and to, for example, white in a portion overlapping the road surface region below the horizon in the outside image Imo. In FIG. 13, these are represented by the broken line and the white line, respectively.
  • the display mode changing unit 115 can be referred to as a display mode adjusting unit, a display mode setting unit, and a display mode determining unit.
  • The display mode changing unit 115 can adjust the characteristics of the vehicle body image Imi, that is, its color, brightness, and saturation, according to the characteristics of the outside image Imo serving as the background of the vehicle body image Imi, that is, its color, brightness, and saturation.
  • the luminance of the vehicle body image Imi is set so that the difference from the luminance of the vehicle exterior image Imo is equal to or greater than a predetermined value. Thereby, for example, when the vehicle outside image Imo is dark, the vehicle body image Imi becomes bright, and when the vehicle outside image Imo is bright, the vehicle body image Imi becomes dark.
  • the portion of the vehicle body image Imi that overlaps the dark region of the vehicle exterior image Imo becomes bright, and the portion of the vehicle body image Imi that overlaps the bright region of the vehicle exterior image Imo becomes dark. That is, the contrast as the brightness difference between the vehicle exterior image Imo and the vehicle body image Imi is set higher than a predetermined value. Therefore, the discriminability of the vehicle body image Imi is likely to increase.
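  • A minimal sketch of this luminance rule is given below: the body-image luminance is pushed away from the local background luminance by at least a fixed contrast. The threshold of 96 levels and the 8-bit range are assumptions; the embodiment only requires the difference to be at least a predetermined value.

```python
def body_line_luminance(background_luminance, min_contrast=96):
    """Choose a luminance (0-255) for the vehicle body image Imi that
    differs from the local background luminance of the outside image
    Imo by at least min_contrast."""
    if background_luminance >= 128:
        return max(0, background_luminance - min_contrast)   # dark lines over a bright background
    return min(255, background_luminance + min_contrast)     # bright lines over a dark background

# e.g. a dark road area of 40 gives lines at 136; a bright sky area of 220 gives 124
```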
  • The color (hue) of the outside image Imo and the color (hue) of the vehicle body image Imi may be set to colors on opposite sides of the hue circle, that is, complementary colors.
  • a difference is generated between the hue of the vehicle outside image Imo and the hue of the vehicle body image Imi. Therefore, the discriminability of the vehicle body image Imi is likely to increase.
  • Instead of an exact pair of complementary colors, a combination of colors on roughly opposite sides of the hue circle, such as colors in the vicinity of the two complementary colors, may be used.
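  • As a small illustration of picking a complementary color, the function below rotates the hue of the background color by 180 degrees while keeping saturation and value. Using the standard library HSV conversion and leaving saturation and value unchanged are choices made for this sketch, not requirements of the embodiment.

```python
import colorsys

def complementary_rgb(r, g, b):
    """Return the color on the opposite side of the hue circle for an
    RGB value in the range 0-255 (hue rotated by 180 degrees)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h = (h + 0.5) % 1.0  # half a turn around the hue circle
    rc, gc, bc = colorsys.hsv_to_rgb(h, s, v)
    return int(rc * 255), int(gc * 255), int(bc * 255)

# e.g. a light-blue sky color (135, 206, 235) yields an orange-ish line color
print(complementary_rgb(135, 206, 235))
```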
  • When the outside image Imo includes a bright, light-blue or white area such as the sky, the portion of the vehicle body image Imi that overlaps that area may be set to black.
  • the color saturation of the vehicle body image Imi may be set so that the difference from the color saturation of the vehicle exterior image Imo is a predetermined value or more.
  • With a large saturation difference, the outside image Imo and the vehicle body image Imi are more easily distinguished than when the saturation difference is small. Therefore, the discriminability of the vehicle body image Imi is likely to increase.
  • In addition, the line thickness, the line type, the presence or absence of a shadow, and the like may be adjusted.
  • The display mode changing unit 115 can acquire the characteristics of the vehicle body image Imi corresponding to the characteristics of the outside image Imo based on a map, a table, or the like stored as information in, for example, the SSD 11d as a storage unit. The display mode changing unit 115 can also acquire the characteristics of the vehicle body image Imi corresponding to the characteristics of the outside image Imo from a function expressing the relationship between the two. The display mode changing unit 115 may improve the discriminability between the outside image Imo and the vehicle body image Imi by any one of color, brightness, and saturation, or by a combination of two or more of them. Further, since the outside image Imo changes with time, the display mode changing unit 115 can, for example, determine the characteristics of the vehicle body image Imi each time the outside image Imo is updated, or determine them at predetermined time intervals. Alternatively, when a sensor that detects the brightness of the background, such as a luminance sensor (not shown) or an illuminance sensor, is provided, the display mode changing unit 115 may determine the characteristics of the vehicle body image Imi based on the detection result of that sensor.
  • the display mode changing unit 115 of the ECU 11 may select, for example, the vehicle body image Imi corresponding to the detection result from the plurality of stored vehicle body images Imi.
  • In that case, the arithmetic processing is easily simplified. Note that a brightness sensor can also be constituted by the imaging unit 12 and the background detection unit 114.
  • FIG. 14 illustrates a vehicle body image Imi having different characteristics for each of the areas A1 to A3.
  • In this example, the luminance of the line Lo1 of the vehicle body image Imi in the region A1 is the highest, the luminance of the line Lo2 in the region A2 is set lower than that of the line Lo1, and the luminance of the line Lo3 in the region A3 is set lower than that of the line Lo2.
  • the characteristics of the vehicle body image Imi can be changed in three or more stages. Further, the characteristics of the vehicle body image Imi may change gradually. In this case, gradation is generated in the vehicle body image Imi.
  • FIG. 15 illustrates a vehicle body image Imi configured as a transmission image instead of a line drawing.
  • The vehicle body image Imi is filled, in the region outside the frame line Low corresponding to the window, in a state that transmits the outside image Imo (not shown in FIG. 15) with a transmittance α.
  • the luminance of the vehicle body image Imi in the region A4 above the boundary line PL of the outside image is lower than the luminance of the vehicle body image Imi in the region A5 below the boundary line PL. Even in such an example, the discriminability of the vehicle body image Imi is likely to increase.
  • the boundary line PL is, for example, a boundary line between an inclined ground and the sky.
  • the vehicle body image Imi may be at least partially free of contour lines. At least one of the luminance and color of the vehicle body image Imi differs between the region A4 and the region A5 on both sides of the boundary line PL. Thereby, the discriminability of the body image Imi is further increased.
  • The boundary line PL is bent or curved at places where a mountain, a valley, a slope, or the like exists, and is straight at flat places.
  • a horizontal line as shown in FIG. 13 can also be such a boundary line PL.
  • the discriminability of the vehicle body image Imi is likely to be improved by adjusting at least one of the brightness and the color of the vehicle body image Imi corresponding to the vehicle exterior image Imo.
  • At least one of the luminance and color of the vehicle body image Imi differs depending on the location of the vehicle body image Imi. That is, at least one of luminance and color is adjusted for each location of the vehicle body image Imi corresponding to the vehicle exterior image Imo (mirror image). Therefore, for example, the discriminability of the vehicle body image Imi is likely to increase.
  • At least one of the luminance and the color of the vehicle body image Imi changes according to the change of the vehicle outside image Imo. Therefore, for example, at least one of the luminance and the color of the vehicle body image Imi is adjusted according to the time-dependent change of the vehicle exterior image Imo (mirror image), and thus the discriminability of the vehicle body image Imi is more likely to increase.
  • the luminance of the vehicle body image Imi is set so that the difference from the luminance of the vehicle exterior image Imo is a predetermined value or more. Therefore, for example, the discriminability of the vehicle body image Imi is more likely to increase due to the difference between the luminance of the vehicle exterior image Imo (mirror image) and the luminance of the vehicle body image Imi.
  • the color of the vehicle body image Imi is set to the complementary color of the color where the vehicle body image Imi of the vehicle exterior image Imo overlaps. Therefore, for example, the discriminability of the vehicle body image Imi is more likely to increase due to the difference between the hue of the vehicle exterior image Imo (mirror image) and the hue of the vehicle body image Imi.
  • the display unit 10a may be a device that displays an image on a front window, a screen in a vehicle, or the like, or may be a display panel provided on a dashboard or a center console in a vehicle.
  • the display panel may be provided in a cockpit module, an instrument panel, a fascia, or the like.
  • The signals obtained in connection with the driver's driving operation may be signals obtained from operations on driving operation units other than the direction indicator 22 and the shift lever described above, such as the steering wheel, a shift switch, the brake pedal, a clutch pedal, and the accelerator pedal.
  • the adjustment or change of the display mode for the operation of the operation unit can be variously set. For example, the display mode can be changed (adjusted) according to the operation of the brake pedal, that is, for example, the color or brightness can be changed, or the display range Ad can be enlarged.
  • DESCRIPTION OF REFERENCE SYMBOLS: 1 ... Vehicle (vehicle body); 10a ... Display unit (display device); 11 ... ECU (image display control device); 11e ... Display control unit; 12, 12R, 12S ... Imaging unit; 100 ... Image display system; 110 ... Outside image creation unit (mirror image creation unit); 112 ... Image creation unit (image superimposing unit); Imi ... Vehicle body image; Imo ... Outside image (mirror image)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Traffic Control Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

According to one embodiment, an image display control device includes: a mirror image creation unit that creates a mirror image showing at least the area behind a vehicle on the basis of an image captured by an imaging unit mounted on a vehicle body; an image superimposing unit that superimposes the mirror image created by the mirror image creation unit on a vehicle body image showing at least a part of the vehicle body; and a display control unit that controls a display device so as to display an image in which the mirror image and the vehicle body image are superimposed on each other by the image superimposing unit. The brightness and/or the color of the vehicle body image is/are adjusted in correspondence with the mirror image.
PCT/JP2015/072431 2014-08-21 2015-08-06 Image display control device and image display system WO2016027689A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-168770 2014-08-21
JP2014168770A JP6413477B2 (ja) 2014-08-21 2014-08-21 画像表示制御装置および画像表示システム

Publications (1)

Publication Number Publication Date
WO2016027689A1 true WO2016027689A1 (fr) 2016-02-25

Family

ID=55350635

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/072431 WO2016027689A1 (fr) 2014-08-21 2015-08-06 Dispositif de commande d'affichage d'image et système d'affichage d'image

Country Status (2)

Country Link
JP (1) JP6413477B2 (fr)
WO (1) WO2016027689A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111886858A (zh) * 2018-03-15 2020-11-03 株式会社小糸制作所 Image system for vehicle

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017179206A1 (fr) * 2016-04-15 2017-10-19 三菱電機株式会社 Dispositif de commande d'affichage pour aide au stationnement, et procédé de commande d'affichage pour aide au stationnement
JP6877115B2 (ja) * 2016-09-27 2021-05-26 株式会社東海理化電機製作所 車両用視認装置
WO2018092919A1 (fr) 2016-11-21 2018-05-24 京セラ株式会社 Dispositif de traitement d'image, dispositif d'imagerie et système d'affichage
JP2018085584A (ja) * 2016-11-21 2018-05-31 京セラ株式会社 画像処理装置、撮像装置、および表示システム
JP2019069717A (ja) * 2017-10-10 2019-05-09 アイシン精機株式会社 駐車支援装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003244688A (ja) * 2001-12-12 2003-08-29 Equos Research Co Ltd Image processing device for vehicle
JP2004258752A (ja) * 2003-02-24 2004-09-16 Fuji Photo Film Co Ltd Image segmentation method and apparatus
JP2007235532A (ja) * 2006-03-01 2007-09-13 Tokai Rika Co Ltd Monitoring device for vehicle
WO2011070640A1 (fr) * 2009-12-07 2011-06-16 クラリオン株式会社 Vehicle periphery image display system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010136289A (ja) * 2008-12-08 2010-06-17 Denso It Laboratory Inc Driving assistance device and driving assistance method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111886858A (zh) * 2018-03-15 2020-11-03 株式会社小糸制作所 Image system for vehicle
CN111886858B (zh) * 2018-03-15 2022-06-21 株式会社小糸制作所 Image system for vehicle

Also Published As

Publication number Publication date
JP2016043778A (ja) 2016-04-04
JP6413477B2 (ja) 2018-10-31

Similar Documents

Publication Publication Date Title
JP6380410B2 (ja) Image display control device and image display system
JP6056612B2 (ja) Image display control device and image display system
JP6446925B2 (ja) Image display control device and image display system
JP6565148B2 (ja) Image display control device and image display system
WO2014156788A1 (fr) Image display control device, image display system, and display unit
WO2016027689A1 (fr) Image display control device and image display system
US10474898B2 (en) Image processing apparatus for vehicle
US20190244324A1 (en) Display control apparatus
JP6876236B2 (ja) Display control device
JP2019028920A (ja) Display control device
CN109314770B (zh) Periphery monitoring device
WO2018220915A1 (fr) Periphery monitoring device
US10807529B2 (en) Driving assistant apparatus with lane marking
JP6781035B2 (ja) Imaging device, image processing device, display system, and vehicle
US11830409B2 (en) Peripheral image display device
JP6772716B2 (ja) Periphery monitoring device
JP2016060237A (ja) Parking assistance device and parking assistance system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15833775

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15833775

Country of ref document: EP

Kind code of ref document: A1