WO2022266704A1 - Display device, display system and methods therefor - Google Patents

Info

Publication number
WO2022266704A1
Authority
WO
WIPO (PCT)
Prior art keywords
display screen
image generating
camera
view
obstructed
Prior art date
Application number
PCT/AU2022/050626
Other languages
French (fr)
Inventor
William Hutchinson
Nelson MINO
Original Assignee
Thomas Global Systems (IP) Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomas Global Systems (IP) Pty Ltd filed Critical Thomas Global Systems (IP) Pty Ltd
Publication of WO2022266704A1 publication Critical patent/WO2022266704A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/30Collimators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/02Simple or compound lenses with non-spherical faces
    • G02B3/08Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • the present invention relates to display devices and, in particular, to a display device and methods therefor, for use in a vehicle or craft.
  • display devices may be used for displaying a wide variety of images. However, where ambient light incident on the display is reflected within the display, it may interfere with the image being displayed.
  • Any discussion of the background art throughout the specification should in no way be considered as an admission that such background art is prior art, nor that such background art is widely known or forms part of the common general knowledge in the field in Australia or any other country.
  • the invention seeks to provide a display device which will overcome or substantially ameliorate at least some of the deficiencies of the prior art, or to at least provide an alternative.
  • the invention seeks to provide a system for reducing focal plane disparity between an internal screen and an exterior environment.
  • the invention may be said to consist in a display system for providing a driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the system comprising:
    a. a layered display screen for displaying images;
    b. a camera able to be mounted on, or integrated into, the vehicle, the camera being configurable to capture images of the obstructed external area;
    c. an electronic control unit for communicating images captured by the camera to the layered display screen; wherein
    d. the camera is configurable to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position;
    e. the layered display screen includes an image generating layer; and
    f. the layered display screen includes one or more optical layers configured for collimating light from the image generating layer;
    g. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
  • the camera is configurable to capture images having a wider angle of view than a static angle of view obstructed by the view-obstructing part from the viewing position of the layered display screen.
  • the images of the obstructed external area displayed by the layered display screen dynamically adjust in accordance with a line of sight between the viewing position and the display screen.
  • the one or more optical layers of the layered display screen provide (to a viewer at the viewing position) an image plane behind the image generating layer (i.e. the image plane appears further away from the driver than it physically is).
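The look-around behaviour described in the preceding bullets can be sketched with simple similar-triangles geometry: because the image plane sits at an apparent depth behind the screen surface, a sideways movement of the eye shifts which portion of the wider captured image is seen, much like looking through a window. The following sketch is illustrative only; the distances used are assumed values, not figures taken from the specification.

```python
# Illustrative geometry (assumed values, not from the patent): the
# visible region of an image plane at depth d behind the screen shifts
# laterally as the eye moves, by similar triangles.

def visible_window_shift(eye_shift_mm: float,
                         viewing_distance_mm: float,
                         image_plane_depth_mm: float) -> float:
    """Lateral shift of the visible region on the deeper image plane
    when the eye moves sideways past the screen aperture."""
    return eye_shift_mm * image_plane_depth_mm / viewing_distance_mm

# A driver 600 mm from the screen moves their head 50 mm sideways;
# with an apparent image plane 57 mm behind the screen surface the
# visible window on the image plane shifts accordingly:
shift = visible_window_shift(50.0, 600.0, 57.0)
print(shift)  # 4.75 (mm)
```

This is why the captured view must be wider than the statically obstructed view: the extra margin is what the viewer "uncovers" as the viewing position changes.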
  • the image generating layer is located below, or behind, and at a distance within the focal length of, the one or more optical layers.
  • the camera is configurable to capture images at, or substantially at, an infinity focus.
  • the depth of field of the images captured by the camera match, or approximately match, the depth of field of a viewer’s eye viewing an unobstructed external area proximate to the view-obstructing part of the vehicle.
  • the layered display screen includes a screen surface.
  • the screen surface has a viewable area that is smaller than an image generating layer area of the image generating layer.
  • the image generating layer area is in the range of about 1.2 to 2.5 times the viewable area of the screen surface.
  • the image generating layer area is about 1.5 times the viewable area of the screen surface.
  • the one or more optical layers (and the screen surface) is distanced from the image generating layer.
  • the one or more optical layers (and the screen surface) is separated from the image generating layer by a pre-determined distance.
  • the pre-determined distance is in the range of about 1 inch (approx. 25 mm) to about 2.5 inches (approx. 64 mm).
  • the pre-determined distance is about 1.5 inches (approx. 38 mm).
  • a numerical aperture of the one or more optical layers is chosen to optimise the competing factors of (i) minimising the distance between the image generating layer and the one or more optical layers and (ii) maintaining driver visibility of all, or substantially all, the viewable area.
  • the numerical aperture is in the range of about 0.3 to about 0.5.
  • the numerical aperture is about 0.38.
  • the layered display screen provides a viewing cone in the range of about 25 to about 35 degrees.
  • the layered display screen provides a viewing cone of about 30 degrees.
  • the image generating layer is located below, or behind, the one or more optical layers.
  • the image generating layer is an emissive display layer.
  • the image generating layer is an organic light emitting diode (OLED) layer or micro-light emitting diode (micro-LED) layer.
  • the image generating layer is a non-emissive display layer.
  • the image generating layer is a liquid crystal display layer.
  • the screen surface is located adjacent to the one or more optical layers.
  • the screen surface is located above the one or more optical layers.
  • the screen surface is integral with, or adhered to, one of the one or more optical layers.
  • the view-obstructing part of the vehicle is the A-pillar.
  • the image displayed by the layered display screen is not visible to the driver’s side passenger.
  • the layered display screen is substantially rectangular in shape.
  • dynamic visibility comprises providing continuous visibility of the obstructed external area as the viewing position changes.
  • dynamic visibility comprises providing continuous visibility of the obstructed external area during movement of the vehicle.
  • the one or more optical layers comprise a collimator configured to provide to the driver collimated light from the image generating layer.
  • the collimator is a Fresnel lens.
  • the one or more optical layers comprise a magnifying lens.
  • one or more layers of the layered display screen include an anti-reflective coating.
  • the anti-reflective coating is on an uppermost layer of the display device.
  • the anti-reflective coating is on an upper surface of the uppermost layer of the display device.
  • the lowest one of the one or more optical layers includes an anti-reflective coating on a lower side.
  • the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF).
  • one or more interior surfaces of the layered display screen (e.g. the casing) comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
  • the anti-reflective substance and/or anti-reflective coating comprises a dark (light absorbing) substance.
  • the anti-reflective substance comprises black velvet.
  • the anti-reflective substance and/or anti-reflective coating comprises carbon nanotubes (CNTs).
  • the anti-reflective substance and/or anti-reflective coating comprises a material made out of vertically aligned nanotube arrays (VANTAs).
  • the camera is able to be mounted on, or integrated into, an exterior part of the vehicle.
  • the camera is able to be mounted on, or integrated into, a side mirror of the vehicle opposite the driver.
  • the camera is able to be mounted on, or integrated into, a front or side casing of the side mirror.
  • the area external to the vehicle captured by the camera is forward and to the side of the vehicle.
  • the electronic control unit is able to receive and communicate to the layered display screen images from a secondary camera.
  • the layered display screen is able to operate as a screen for use when merging and/or reversing and/or performing other driving actions.
  • the camera is a wide-angle camera configured to capture an external area beyond the obstructed external area and areas near to it (e.g. to the rear of the vehicle).
  • the electronic control unit is able to adjust and/or crop and/or enlarge by zooming, the images captured by the camera.
  • the camera is adapted to capture images of the obstructed external area that are larger than a pre-determined minimum capture area, wherein the larger capture area is able to be cropped by the electronic control unit according to one or more parameters.
  • the one or more parameters include distance between the visibility-obstructing vehicle part and the driver, driver height and/or internal dimensions of the car.
  • the driver can control cropping of the larger capture area using an onboard vehicle control system.
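The cropping step described above can be sketched as a mapping from the angular span hidden by the pillar to pixel columns of the wide-angle frame. The function below is a hypothetical illustration: the parameter names and the linear angle-to-pixel mapping are assumptions for the sketch, not details given in the patent.

```python
# Hypothetical ECU-side cropping sketch: select the columns of a
# wide-angle frame that cover the angular span obstructed by the
# pillar, assuming a linear angle-to-pixel mapping across the FOV.

def crop_for_viewpoint(frame_width: int,
                       pillar_angle_left_deg: float,
                       pillar_angle_right_deg: float,
                       camera_fov_deg: float):
    """Return (x0, x1) pixel columns covering the obstructed span."""
    def angle_to_column(angle_deg: float) -> int:
        # map [-fov/2, +fov/2] -> [0, frame_width]
        return int(round((angle_deg + camera_fov_deg / 2)
                         / camera_fov_deg * frame_width))
    x0 = angle_to_column(pillar_angle_left_deg)
    x1 = angle_to_column(pillar_angle_right_deg)
    return max(0, x0), min(frame_width, x1)

# Pillar obstructs -5 to +7 degrees of a 120-degree, 1920 px wide frame:
print(crop_for_viewpoint(1920, -5.0, 7.0, 120.0))  # (880, 1072)
```

In practice the angular span itself would be derived from the parameters the patent lists (pillar-to-driver distance, driver height, cabin dimensions), and re-evaluated as the viewing position changes.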
  • the system is adapted to identify objects in the external area image captured by the camera and the layered display screen is configured to display the identified objects to the driver using identification signs or icons.
  • the system is adapted to identify the movement of identifiable objects and to display the movement using moving identification signs or icons.
  • the identification signs or icons change position on the image generating layer according to movement of the identified object relative to the vehicle.
  • one or more elements of the device are configurable to auto-adjust, or to be adjustable by the driver, for the purposes of image calibration.
  • the electronic control unit of the system is configured to apply a software package incorporating machine vision and/or computer vision methods to identify objects in the images captured by the camera, such as road users, traffic lights, brake lights and other vehicles on the road.
  • the machine vision and/or computer vision methods for identification of objects comprise image processing techniques including one or more of:
    a. filtering;
    b. thresholding;
    c. segmentation;
    d. edge detection;
    e. colour analysis; and/or
    f. pattern recognition.
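Two of the listed techniques, thresholding (b) and edge detection (d), can be shown in a minimal, dependency-free form on a tiny grayscale grid. This is an illustrative sketch only; a production system would use an image processing library, and the pixel values below are invented for the example.

```python
# Minimal sketches of thresholding and edge detection on a tiny
# grayscale image represented as a list of rows (illustrative only).

def threshold(img, t):
    """Binary threshold: 1 where pixel > t, else 0."""
    return [[1 if p > t else 0 for p in row] for row in img]

def edge_strength(img):
    """Simple horizontal-gradient edge map: |img[y][x+1] - img[y][x]|."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in img]

img = [
    [10, 12, 200, 210],
    [11, 13, 205, 208],
]
print(threshold(img, 100))  # [[0, 0, 1, 1], [0, 0, 1, 1]]
print(edge_strength(img))   # [[2, 188, 10], [2, 192, 3]]
```

The large gradient values mark the vertical boundary between the dark and bright regions, which is the kind of low-level feature that segmentation and pattern recognition stages would then build on.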
  • the electronic control unit of the system is configured to apply a software package for identification of objects including artificial intelligence methods, such as neural nets, machine learning or deep learning, that learn from images processed by the machine vision and/or computer vision methods to improve accuracy of object identification.
  • the electronic control system is configured to communicate with a vehicle operation system, such that the electronic control system is able to operate or control the vehicle.
  • vehicle operation or control by the electronic control system includes one or more of the following:
    a. controlling acceleration of the vehicle;
    b. controlling braking of the vehicle;
    c. controlling steering of the vehicle; and/or
    d. controlling vehicle velocity.
  • the system, or individual components thereof (e.g. the layered display screen and/or the camera), is able to be retrofitted to the vehicle.
  • the invention may be said to consist in a layered display screen for displaying images and for providing dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, wherein:
    a. the layered display screen includes an image generating layer;
    b. the layered display screen includes one or more optical layers for collimating light from the image generating layer;
    c. the layered display screen is adapted to receive images from an electronic control unit configured for communicating images captured by a camera, the images capturing the obstructed external area and having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position;
    d. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
  • the layered display screen is able to be retrofit to the vehicle.
  • the invention may be said to consist in a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of:
    a. providing a layered display screen for displaying images;
    b. mounting a camera on, or integrating the camera into, the vehicle, the camera being configurable to capture images of the obstructed external area;
    c. providing an electronic control unit for communicating images captured by the camera to the layered display screen; wherein
    d. the camera is configured to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position;
    e. the layered display screen includes an image generating layer;
    f. the layered display screen includes one or more optical layers for collimating light from the image generating layer;
    g. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
  • the invention may be said to consist in a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of:
    a. receiving an image from a camera, the image having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position;
    b. generating an image from an image generating layer; and
    c. passing the generated image through a collimating optical layer in order to collimate the light passing through the optical layer.
  • the optical layer includes a length or area aspect that is smaller than the corresponding length or area aspect of the image generating layer.
  • the image generating layer is removed from the optical layer by a predetermined distance.
  • the method includes the step of providing a housing within which the image generating layer and the optical layer are housed.
  • the method includes the step of absorbing light incident from the image generating layer that does not pass directly through the optical layers.
  • the invention may be said to consist in a display device for restricting interference by ambient light, wherein:
    a. the device includes a screen viewable by a viewer, the screen defining a screen surface including a linear polariser;
    b. the device includes one or more inner layers underneath the screen surface, the one or more inner layers including an image generating layer;
    c. the screen surface is configured to be distanced from the inner layers by a predetermined distance; and
    d. a viewable area of the screen surface is smaller than an image generating layer area of the image generating layer;
    e. such that the screen surface and the one or more inner layers co-operate to reduce light contamination within the device.
  • the invention may be said to consist in a display device for restricting interference by ambient light, wherein:
    a. the device includes a screen surface comprising an outer linear polariser;
    b. the device includes one or more inner layers underneath the screen surface, the one or more inner layers including an image generating layer;
    c. the screen surface is configured to be distanced from the inner layers by a predetermined distance; and
    d. a viewable area of the screen surface is smaller than an image generating layer area of the image generating layer;
    e. such that the screen surface and the one or more inner layers co-operate to reduce light contamination within the device.
  • the image generating layer area is in the range of about 1.2 to about 2.5 times the viewable area of the screen surface.
  • the image generating layer area is about 1.5 times the viewable area of the screen surface.
  • the device further includes an intermediate layer that is adjacent to, or integral with, the screen surface.
  • the intermediate layer includes one or more optical layers.
  • the screen surface, including any intermediate layer is separated from the one or more inner layers by the pre-determined distance.
  • the pre-determined distance is determined by a function of:
    a. the optical properties of the intermediate layer;
    b. how much of the image generating layer is visible by a viewer at a preferred viewing distance; and/or
    c. the potential range of the preferred viewing distance (i.e. distance between viewer and screen).
  • the predetermined distance is chosen or choosable based on a compromise between magnification, image focus and distortion.
  • the predetermined distance is adjustable.
  • the display device includes an adjustment mechanism for adjusting the predetermined distance.
  • the pre-determined distance is in the range of about 1 inch (approx. 25 mm) to about 2.5 inches (approx. 64 mm).
  • the pre-determined distance is about 1.5 inches (approx. 38 mm).
  • the one or more optical layers comprise at least one converging lens.
  • the one or more optical layers comprise a collimator.
  • the collimator is a Fresnel lens.
  • the one or more optical layers comprise a magnifying lens.
  • a numerical aperture of the one or more optical layers is chosen to optimise the competing factors of (i) minimising the distance between the image generating layer and the intermediate layer and (ii) maintaining viewer visibility of all, or substantially all, the viewable area.
  • the numerical aperture is in the range of about 0.3 to about 0.5.
  • the numerical aperture is about 0.38.
  • the device provides a viewing cone in the range of about 25 to about 35 degrees.
  • the device provides a viewing cone of about 30 degrees.
  • the device includes an adhesive layer, and one of the one or more optical layers is adhered to the outer linear polariser by the adhesive layer.
  • the adhesive layer is an adhesive laminate.
  • one or more of the one or more optical layers includes an anti-reflective coating on an inner side.
  • one or more layers of the device include an anti-reflective coating.
  • an uppermost layer of the device has the anti-reflective coating.
  • an outer surface of the outermost layer of the device has the anti-reflective coating.
  • the outer polariser has an anti-reflective coating.
  • one or more interior surfaces of the device comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
  • the anti-reflective substance and/or coating comprises a dark (light absorbing) substance.
  • the anti-reflective substance comprises black velvet.
  • the anti-reflective substance and/or coating comprises carbon nanotubes (CNTs).
  • the anti-reflective substance and/or coating comprises a material made out of vertically aligned nanotube arrays (VANTAs).
  • the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF).
  • ambient light entering the device and reflected by the image generating layer is absorbed by the outer linear polariser.
  • the outer linear polariser reduces the ambient light entering into the device by an amount in the range of about 40 to about 75 percent.
  • the outer linear polariser reduces the ambient light entering into the device by an amount greater than about 50 percent.
  • the image generating layer includes an emissive display layer.
  • the image generating layer includes an OLED layer or micro-LED layer.
  • the image generating layer includes a non-emissive display layer.
  • the image generating layer includes a liquid crystal display layer.
  • the device further includes an inner linear polariser adjacent to an inner surface of the image generating layer.
  • the device further includes a backlighting layer.
  • the backlighting layer is the innermost layer of the device.
  • the backlighting layer is edge-lit.
  • the edge lighting is by LED.
  • the device further includes a directional backlight film below the inner linear polariser and above the backlighting layer.
  • the directional backlight film is configured to control the angular spread of light from the backlighting layer.
  • the outer linear polariser is the outermost layer of the device.
  • one or more of the layers of the device are rectangular in shape.
  • Other aspects of the invention are also disclosed.
  • Figure 1 shows a vehicle interior including an A-pillar covered by a display screen;
  • Figure 2 shows the vehicle interior of figure 1 with the display screen showing objects obscured by the A-pillar;
  • Figure 3 shows a schematic view of a display system;
  • Figure 4 shows a cutaway schematic plan view of a display screen;
  • Figure 5 shows a cutaway schematic plan view of a display screen with a viewer viewing the display screen from two viewing positions;
  • Figure 6 shows a schematic view of a control system; and
  • Figure 7 shows a flow chart of a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle.
  • a display system 1000 for providing a driver dynamic visibility of an external area, the external area being obstructed by a view-obstructing part of a vehicle 2000.
  • the view-obstructing part of the vehicle may be an A-pillar 2100 of the vehicle (as shown in figures 1 and 2), or any other portion, such as the dashboard, B-pillar, C-pillar, vehicle body or cab.
  • the display system includes a layered display screen 1100 for displaying images, a camera 1200, and an electronic control unit 1300.
  • the electronic control unit 1300 is configured for communicating images captured by the camera 1200 to the layered display screen 1100 for display by the layered display screen 1100.
  • the layered display screen 1100 preferably includes a housing 1140 (shown in figure 4), which will be described in more detail below.
  • the layered display screen 1100 includes a plurality of image generating layers 1110 and a plurality of optical layers 1120 for collimating light from the image generating layers 1110.
  • the display screen 1100 defines an outer surface through which light radiates to be viewed by a viewer.
  • the display screen 1100 further includes an intermediate layer which, together with a layer defining the screen surface, make up the optical layers.
  • the optical layers 1120 preferably include a lens 1122, which is preferably in the form of a Fresnel lens.
  • a linear polariser layer 1124 is adhered to the lens 1122 by an adhesive laminate layer 1126.
  • the outer linear polariser layer 1124 is located closer to the viewer than the lens 1122.
  • the outer linear polariser 1124 reduces the ambient light entering into the device by an amount in the range of about 40% to about 75%, and more preferably greater than 50%.
  • the optical layers 1120 further preferably includes an outer antireflective coating 1128, which is disposed on an outermost surface of the optical layers 1120.
  • the outermost surface of the optical layers 1120 preferably corresponds to a display screen surface of the layered display screen 1100, and in this sense, the optical layers 1120 may be said to be integral with the display screen surface. However, it is envisaged that a further clear protective layer (not shown) may be provided outside of the intermediate layer to make up the optical layers 1120.
  • the antireflective coating 1128 serves to prevent or restrict the ingress of ambient light into the layered display screen 1100.
  • a further antireflective coating may be provided on a side of the optical layers 1120 closer to the image generating layers 1110.
  • the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF).
  • the linear polariser layer 1124 will serve to restrict the ingress of ambient light into the display screen 1100.
  • the Fresnel lens may also provide a magnifying effect.
  • a separate magnifying layer (not shown) may be provided for this purpose.
  • the optical layers can include a converging lens.
  • the one or more optical layers 1120 of the layered display screen 1100 are preferably optically configured to display or provide (to a viewer in a viewing position) a focal plane or image plane behind the image generating layer (i.e. the image plane appears further away from the driver than it physically is).
  • the image generating layer 1110 is located below, or inwardly (i.e. further away from the viewer and/or into the housing), and at a predetermined distance within the focal length of, the one or more optical layers.
  • the image generating layers 1110 are located at a predetermined distance of about 25 mm (1 inch) to about 64 mm (2.5 inches) from the optical layers 1120, and more preferably at about 38 mm (1.5 inches).
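The bullets above can be connected by the thin-lens equation: when the image generating layer sits inside the focal length of a converging (e.g. Fresnel) lens, the lens forms a magnified virtual image farther from the viewer, which is why the image plane appears behind the layer. The focal length used below is an assumed illustrative value, not a figure given in the patent.

```python
# Thin-lens sketch (assumed focal length, illustrative only): for an
# object inside the focal length of a converging lens, the virtual
# image lies farther from the lens on the same side, magnified.

def virtual_image(object_distance_mm: float, focal_length_mm: float):
    """Return (virtual image distance, linear magnification) from the
    thin-lens relation for object_distance < focal_length."""
    assert object_distance_mm < focal_length_mm
    d_i = (object_distance_mm * focal_length_mm
           / (focal_length_mm - object_distance_mm))  # same side as object
    magnification = focal_length_mm / (focal_length_mm - object_distance_mm)
    return d_i, magnification

# Image layer 38 mm (about 1.5 inches) behind the lens, assumed f = 114 mm:
d_i, m = virtual_image(38.0, 114.0)
print(d_i, m)  # 57.0 1.5
```

With these assumed numbers the image plane appears 57 mm behind the lens rather than 38 mm, consistent with the stated goal of reducing focal plane disparity between the screen and the exterior scene.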
  • the predetermined distance is shown as arrow d on figure 4.
  • an adjustment mechanism (not shown) may be provided that will allow the predetermined distance to be adjustable by a viewer.
  • Such an adjustment mechanism may be mechanical or electronic in nature, and a person skilled in the art will appreciate that a wide variety of adjustment mechanisms are possible. Examples of adjustment mechanisms include threaded adjustment mechanisms, linear motors, snap-fit type adjustment mechanisms, or the like.
  • the display screen 1100 can include an adjustment mechanism (not shown) for adjusting the predetermined distance.
  • the predetermined distance is preferably determined by a function of one or more selected from: a. the optical properties of the intermediate layer (including the Fresnel lens); b. how much of the image generating layer is visible by a viewer at a preferred viewing distance; and c. the range of preferred viewing distances (i.e. distance between viewer and screen).
  • the distance d between the optical layer 1120 and image generating layer 1110 needs to be chosen to find the preferred compromise between magnification, image focus and distortion.
  • a numerical aperture of the one or more optical layers 1120 is configured to optimise the competing factors of (i) minimising the distance between the image generating layer and the one or more optical layers and (ii) maintaining driver visibility of all, or substantially all, the viewable area.
  • the numerical aperture is preferably in the range of about 0.3 to about 0.5, and more preferably about 0.38.
  • a distance (which may be the width and/or height) or area across the image generating layers 1110 will be larger than the corresponding distance or area across the optical layers 1120.
  • the viewable size, and preferably area, of at least one or more aspects (i.e. width and/or height and/or width x height, shown as arrow S in figure 4) of an outer surface of the layered display screen 1100 will preferably be smaller than the corresponding aspect (shown as arrow L in figure 4) of the image generating layers 1110.
  • the aspect L (height, width or area) of the image generating layers 1110 will preferably be about 1.2 to 2.5 times the corresponding aspect S of the optical layers 1120, and more preferably about 1.5 times.
  • the layered display screen 1100 provides a tapered viewing angle or cone that is in the range of about 25° to about 35°, and most preferably at about 30°, relative to the plane of the collimated light being displayed from the optical layers 1120.
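The relationship between the smaller viewable aspect S, the larger image generating aspect L, the gap d and the viewing cone can be checked with basic trigonometry. A rough sketch, where the 100 mm aperture width is an assumed illustrative value:

```python
import math

def required_layer_size(s_mm: float, d_mm: float, cone_deg: float) -> float:
    """Aspect the image generating layer needs so that any eye within the
    viewing cone still sees a full image through the smaller viewable
    aperture of aspect s_mm.

    An off-axis ray at half the cone angle, passing the aperture edge,
    lands d_mm * tan(cone/2) beyond it on the image layer, on each side.
    """
    half_angle = math.radians(cone_deg / 2.0)
    return s_mm + 2.0 * d_mm * math.tan(half_angle)

# Assumed 100 mm aperture, the preferred 38 mm gap and 30 degree cone:
L_mm = required_layer_size(100.0, 38.0, 30.0)  # ~120 mm
ratio = L_mm / 100.0                           # ~1.2, within the 1.2-2.5 range
```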
  • the image generating layers 1110 can include an emissive display layer, such as an organic light emitting diode (OLED) or other similar light emitting diode (LED) layer such as a micro-LED layer.
  • the image generating layers 1110 can include a non-emissive display layer, for example a liquid crystal display (LCD) layer.
  • the image generating layers 1110 include an edge lit backlight layer 1112 that includes LEDs 1113 arranged at least partly about the periphery of the backlight layer 1112.
  • the backlight layer 1112 can be directly lit and not edge lit.
  • the image generating layer 1110 further includes a narrow field of view directional backlight film 1114 and a rear or innermost polarising layer 1116.
  • the directional backlight film 1114 is configured to control the angular spread of light from the backlighting layer 1112.
  • the image generating layer 1110 further includes a liquid crystal cell layer 1118 that interacts together with the rear polarising layer 1116 and linear polarising layer 1126 of the optical layers 1120 in a known manner to thereby generate images.
  • the display screen 1100 could be composed of a resilient and/or flexible material that can be folded or bent in accordance with the requirements of the A-pillar configuration.
  • Such resilient and/or flexible display panels are known, and a discussion of these is considered beyond the scope of this specification.
  • the camera 1200 is preferably configured for being mounted on, or integrated into a vehicle 2000.
  • the camera 1200 is configured to be mountable, or integrated with a side mirror (not shown) of the vehicle 2000, in a position from which the camera is able to view the obstructed area.
  • the camera 1200 could be mountable to, or integrated with a front or side casing of a side mirror.
  • the area external to the vehicle captured by the camera would be forward and to the side of the vehicle.
  • the camera 1200 is configured, or configurable, for capturing images of an external area 3000 that is obstructed by the A pillar 2100 of the vehicle 2000.
  • the view obstructed by the A pillar 2100 will typically be from the point of view of a driver in one particular viewing position at any one time.
  • the view of the driver may be obstructed while the driver moves between a number of viewing positions, for example as the driver moves backwards and forwards in the driver’s seat, and side to side from the driver’s seat.
  • the camera 1200 will preferably be configured for capturing images or video having a wider angle of view than any one static angle of view obstructed from any one given particular viewing position.
  • the camera 1200 will also preferably be configured for capturing images of the entire external area that may be obstructed within the whole typical range of movement of the driver’s viewing position.
  • images captured by the camera 1200 will be fed to the control unit 1300, which will direct these to the display screen 1100. Images captured by the camera will be displayed on the display screen 1100. The displayed images of the obstructed external area will adjust dynamically in accordance with a line of sight between the viewing position from which the layered display screen is being viewed, and the display screen.
  • the camera 1200 is configured or configurable to capture images at, or substantially close to, an infinity focus. Images displayed by the image generating layers 1110 will be collimated by the optical layers 1120 as they pass on to the viewer. This will have the effect of giving light from the images a similar focal plane to light from the actual external area 3000. Because of this, a viewer looking between a viewable area adjacent the external area 3000, and at images on the display screen 1100, will not need to adjust their focus as much as if they were looking at a screen without collimation.
  • the depth of field of the images captured by the camera match, or approximately match, the depth of field of a viewer’s eye viewing an unobstructed external area proximate to the view obstructing part of the vehicle. This reduces effort required by the driver to be able to be aware of their surroundings.
  • the image displayed by the layered display screen 1100 may or may not be visible to the driver’s side passenger.
  • another viewing angle reduction layer (not shown) may be provided that prevents others that are outside of a particular viewing angle from seeing the display screen 1100.
  • the camera 1200 may be a wide-angle camera that is configured to capture an external area well beyond the obstructed external area, for example towards the rear of the vehicle.
  • the viewer may be able to see more of the underlying image generating layers, thereby providing continuously changing visibility of the obstructed external area as the viewing position changes.
  • This is illustrated in figure 5, where the viewing angles from two different positions are shown. From a first top viewing position, a portion of the image generating layer 1110 shown as V1 can be seen. Portion V1 shows an image corresponding to a field of view of the external area shown on figure 5 as R1. From a second bottom viewing position, a portion of the image generating layer 1110 shown as V2 can be seen. Portion V2 shows an image corresponding to a field of view of the external area shown on figure 5 as R2.
  • the visibility provided by the display screen 1100 will change to provide continuous visibility of the obstructed external area.
  • the image along lengths V1 and V2 generated by the display screen will be collimated to form the intended image at the far focus plane (at or towards infinity), thereby reducing the field disparity between the scene external to the vehicle and the scene being displayed from the display screen. This allows the observer to keep their eyes focused at a far field while dynamically changing the content the observer sees when moving their eye location.
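The parallax behaviour of figure 5 can be modelled with similar triangles: the window of the image generating layer visible through the aperture shifts opposite to the eye's movement, giving the different views V1 and V2. A hypothetical sketch, where all dimensions are illustrative rather than taken from the specification:

```python
def visible_window(eye_offset_mm: float, eye_dist_mm: float,
                   d_mm: float, s_mm: float) -> tuple:
    """Edges (left, right) of the strip of the image generating layer
    seen through an optical-layer aperture of width s_mm.

    eye_offset_mm: lateral eye position relative to the screen axis.
    eye_dist_mm:   eye-to-screen distance.
    d_mm:          gap between the aperture and the image layer.
    Rays from the eye through the aperture edges are extended by
    similar triangles onto the image layer plane.
    """
    k = d_mm / eye_dist_mm
    left = -s_mm / 2 + (-s_mm / 2 - eye_offset_mm) * k
    right = s_mm / 2 + (s_mm / 2 - eye_offset_mm) * k
    return left, right

# Eye above the axis (V1) versus below it (V2): the visible strip
# shifts in the opposite direction, changing the displayed content.
v1 = visible_window(+100.0, 600.0, 38.0, 100.0)
v2 = visible_window(-100.0, 600.0, 38.0, 100.0)
```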
  • the layered display screen will be substantially rectangular in shape.
  • the layered display screen may be configured to substantially correspond with the shape and size of the view obstructing part.
  • the display screen 1100 is configured to substantially match the shape and size of the A-pillar 2100.
  • one or more interior surfaces of the layered display screen may be composed of an anti-reflective substance and/or are coated with an anti-reflective coating 1142, shown as a broken line in figure 4.
  • the interior surfaces of the housing 1140 will be coated with a light absorbing substance such as black velvet, carbon nanotubes, vertically aligned nanotubes arrays (VANTAs), or any other suitably engineered material or coating.
  • Not all of the light generated by the image generating layer 1110 will be transmitted directly out of the display screen 1100.
  • Light from the image generating layer 1110 may be incident upon the internal walls and/or other interior components of the housing 1140. Importantly, this light will not be reflected internally, where it would interfere with and/or create light noise in the light that is transmitted out of the display screen 1100. Instead, light incident upon the internal walls (shown as arrow D in figure 4) will be absorbed by the antireflective coating 1142, thereby causing the light that is transmitted from the display screen 1100 to be of a clearer quality.
  • the linear polarising layer 1124 will cause ambient light that is incident on the display screen to be filtered, thereby restricting ingress of ambient light into the display screen.
  • Ambient light that may traverse through the optical layers may reflect off the image generating layers and onto the antireflective coating 1142, further reducing noise in the light that is transmitted out of the display screen 1100.
  • Figure 6 shows a control unit 1300.
  • the control unit 1300 preferably includes a processor 1310 and semiconductor memory 1320 comprising volatile memory such as random access memory (RAM) and/or non-volatile memory such as read only memory (ROM).
  • the memory 1320 may comprise either RAM or ROM or a combination of RAM and ROM.
  • the device further comprises I/O interface 1330 for communicating with one or more peripheral devices.
  • the I/O interface 1330 may offer both serial and parallel interface connectivity.
  • the I/O interface 1330 may also communicate with one or more human input devices (HID) 1340 such as keyboards, touchscreens, pointing devices, joysticks and the like.
  • the I/O interface 1330 may also comprise an audio interface for communicating audio signals to one or more audio devices 1350, such as a speaker or a buzzer.
  • the control unit 1300 also preferably comprises a network interface 1360 for communicating with one or more networks 1370.
  • the network 1370 may be a wired network, such as a wired Ethernet™ network or a similar wired network; or a wireless network, such as a Bluetooth™ network or IEEE 802.11 network.
  • the network 1370 may be a local area network (LAN), such as a home or office computer network, or a wide area network (WAN), such as the Internet or private WAN.
  • the network 1370 may be a wired connection to the vehicle control unit, or to the vehicle media control head unit.
  • the control unit 1300 further comprises a storage device 1380, such as a magnetic disk hard drive or a solid state disk drive.
  • the storage device 1380 may be configured for storage of software instructions and/or data.
  • the controller 1300 also comprises a video interface 1390 for conveying video signals to the display screen 1100, and possibly to a further display device such as the vehicle’s media head unit (not shown) or dashboard display, such as a liquid crystal display (LCD), cathode-ray tube (CRT) or similar display device.
  • the control unit 1300 also comprises a communication bus subsystem 1400 for interconnecting the various devices described above. Further, the control unit 1300 can comprise a clock device 1410 for timing data received and transmitted, and a geolocation device 1420 in order for the control unit 1300 to be able to track the location of the vehicle 2000, as well as the location in which particular objects may be identified (this will be described in more detail below).
  • the electronic control unit 1300 is able to receive and communicate to the layered display screen 1100 images from a secondary camera (not shown).
  • a secondary camera may be, for example, a DVR, side view camera or reversing camera located on the vehicle.
  • the display screen 1100 may be able to operate as a screen for use when merging and/or reversing and/or performing other driving actions.
  • control unit 1300 may be configured with software instructions on the digital storage media 1380 that allow for the adjustment and/or cropping and/or enlarging (by zooming) of the images captured by the camera.
  • the electronic control unit may be configured for automatically cropping the images according to one or more parameters.
  • parameters can include one or more selected from a. the distance between the visibility obstructing vehicle part and the driver, b. driver height, and c. internal dimensions of the vehicle.
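One way such parameter-driven cropping could work is to convert the angle the pillar subtends at the driver's eye into a slice of the camera frame. The sketch below is hypothetical: it assumes a linear angle-to-pixel mapping across the camera's field of view (a real wide-angle lens would need a distortion model), and all the numbers are illustrative:

```python
import math

def obstruction_crop_px(frame_w_px: int, pillar_w_mm: float,
                        driver_dist_mm: float, cam_fov_deg: float) -> int:
    """Width in pixels of the camera-frame region hidden behind the pillar.

    Assumes a linear angle-to-pixel mapping across the camera's horizontal
    field of view; a real wide-angle lens would need a distortion model.
    """
    # Angle the pillar subtends at the driver's eye.
    theta_deg = 2.0 * math.degrees(math.atan(pillar_w_mm / (2.0 * driver_dist_mm)))
    fraction = min(theta_deg / cam_fov_deg, 1.0)
    return round(frame_w_px * fraction)

# Hypothetical figures: a 100 mm pillar 700 mm from the driver, viewed by
# a 120 degree wide-angle camera producing 1920 px wide frames.
crop_px = obstruction_crop_px(1920, 100.0, 700.0, 120.0)
```

A shorter driver-to-pillar distance makes the pillar subtend a larger angle, so a taller or closer-seated driver needs a wider crop, which is why the listed parameters matter.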
  • control unit 1300 may be adapted to identify objects captured in images of the external area, and insert identification signs or icons into the feed to the display screen 1100 for display on the display screen. Such identification signs or icons may be configured to move together with the identified object on the display screen 1100.
  • control unit 1300 may include a transceiver and be configured for transmitting images to a remote server for processing of the images. Processing of the images could be for purposes of identifying objects, or the like.
  • Image processing carried out by the control unit 1300 may be configured for incorporating machine vision/computer vision methods to identify objects in images captured by the camera. Objects identified may include road users, pets, traffic lights, traffic signs, brake lights, or other vehicles on the road. Methods that could be used for identification of objects may comprise image processing techniques including one or more of: a. filtering b. thresholding c. segmentation d. edge detection e. colour analysis and/or f. pattern recognition
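Two of the listed techniques, thresholding and edge detection, can be illustrated in a few lines on a synthetic grayscale patch (the pixel values are invented for the example; a production system would use an optimised vision library):

```python
def threshold(img, t):
    """Binary threshold: 1 where a pixel is at least t, else 0."""
    return [[1 if p >= t else 0 for p in row] for row in img]

def edges(img):
    """Crude horizontal-gradient edge map: |right neighbour - pixel|."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in img]

# Synthetic 3x4 grayscale patch: dark region on the left, bright on the right.
patch = [[10, 12, 200, 210],
         [11, 13, 205, 208],
         [ 9, 14, 199, 212]]
binary = threshold(patch, 128)  # isolates the bright region
grad = edges(patch)             # the large gradients mark the boundary
```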
  • Computer vision/machine vision methods may be complemented by artificial intelligence methods such as neural nets, machine learning or deep learning that learn from images processed by the machine vision and/or computer vision methods to improve accuracy of object identification.
  • one or more elements of the device may be configurable to auto adjust, or to be adjustable by the driver, for the purposes of image calibration.
  • Such elements could include focal plane distance, or the like.
  • control unit 1300 may be configured to communicate with a vehicle operating system (not shown) to feed information about identified objects to the vehicle operating system, which can then be used for automated or semiautomated control of the vehicle.
  • functions that could be carried out by the vehicle operating system in response to information received from the control unit 1300 include, but are not limited to, one or more of: a. controlling acceleration of the vehicle; b. controlling braking of the vehicle; c. controlling steering of the vehicle; and d. controlling vehicle velocity.
  • the system as described may be built into the vehicle on manufacture, or could be retrofittable to the vehicle.
  • the system 1000 would be fitted with requisite panel pieces and connector formations that would allow it to be connected to existing connecting formations on the vehicle.
  • replacement vehicle mirrors may be provided with a camera built into the vehicle mirrors.
  • a panel piece for a vehicle mirror may be provided that slots into a space where an old panel piece is removed.
  • currently existing A-pillar coverings may be removed, and a replacement A-pillar covering including a display screen 1100 as described above may be provided with the requisite connector formations for fitting to the A-pillar.
  • a display system 1000 as described above will be provided 2, and installed 4 on a vehicle 2000.
  • the camera will be configured or configurable to capture 6 images of the external area 3000, including the obstructed area.
  • the camera will then be used to generate 8 images in the form of a video feed from the camera to the control unit.
  • After receiving 10 the video feed from the camera, the control unit 1300 then processes 12 the received video feed and transmits the processed video feed to the display screen 1100.
  • the display screen 1100 generates 14 an image on an image generating layer, which is then radiated out of the display screen via the optical layers 1120. As the image passes through the optical layers, it is collimated 16.
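The capture, receive, process, display and collimate steps above can be sketched as a minimal pipeline. The names and data structures below are illustrative stand-ins, not the actual system's interfaces:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    pixels: List[int]
    processed: bool = False

def capture() -> Frame:
    """Stand-in for the camera producing a video frame (steps 6 and 8)."""
    return Frame(pixels=[0, 1, 2, 3])

def process(frame: Frame) -> Frame:
    """Stand-in for control-unit processing (steps 10 and 12), e.g.
    cropping, scaling or overlaying object identification icons."""
    frame.processed = True
    return frame

def display(frame: Frame) -> str:
    """Stand-in for steps 14 and 16: the image generating layer renders
    the processed frame and the optical layers collimate it on the way
    out of the display screen."""
    assert frame.processed, "frame must be processed before display"
    return f"collimated frame of {len(frame.pixels)} px"

# One pass through the pipeline:
result = display(process(capture()))
```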
  • the term “real-time”, for example in “displaying real-time data”, refers to the display of the data without intentional delay, given the processing limitations of the system and the time required to accurately measure the data.
  • the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality, for example serving as a desirable model or representing the best of its kind.
  • a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • “or” should be understood to have the same meaning as “and/or” as defined above.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
  • bus and its derivatives, while being described in a preferred embodiment as being a communication bus subsystem for interconnecting various devices including by way of parallel connectivity such as Industry Standard Architecture (ISA), conventional Peripheral Component Interconnect (PCI) and the like or serial connectivity such as PCI Express (PCIe), Serial Advanced Technology Attachment (Serial ATA) and the like, should be construed broadly herein as any system for communicating data.
  • ‘in accordance with’ may also mean ‘as a function of’ and is not necessarily limited to the integers specified in relation thereto.
  • a computer implemented method should not necessarily be inferred as being performed by a single computing device; the steps of the method may be performed by more than one cooperating computing device.
  • objects as used herein such as ‘web server’, ‘server’, ‘client computing device’, ‘computer readable medium’ and the like should not necessarily be construed as being a single object, and may be implemented as two or more objects in cooperation, such as, for example, a web server being construed as two or more web servers in a server farm cooperating to achieve a desired goal, or a computer readable medium being distributed in a composite manner, such as program code being provided on a compact disk activatable by a license key downloadable from a computer network.
  • the invention may be embodied using devices conforming to other network standards and for other applications, including, for example other WLAN standards and other wireless standards.
  • Applications that can be accommodated include IEEE 802.11 wireless LANs and links, and wireless Ethernet.
  • wireless and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. In the context of this document, the term “wired” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a solid medium. The term does not imply that the associated devices are coupled by electrically conductive wires.

Processes:
  • processor may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a “computer” or a “computing device” or a “computing machine” or a “computing platform” may include one or more processors.
  • the methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein.
  • Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included.
  • a typical processing system that includes one or more processors.
  • the processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM.
  • a computer-readable carrier medium may form, or be included in a computer program product.
  • a computer program product can be stored on a computer usable carrier medium, the computer program product comprising a computer readable program means for causing a processor to perform a method as described herein.
  • the one or more processors operate as a standalone device or may be connected, e.g., networked to other processor(s); in a networked deployment, the one or more processors may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer or distributed network environment.
  • the one or more processors may form a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program, that is for execution on one or more processors.
  • embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium.
  • the computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause a processor or processors to implement a method.
  • aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
  • the present invention may take the form of carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
  • the software may further be transmitted or received over a network via a network interface device.
  • the carrier medium is shown in an example embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention.
  • a carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a processor device, computer system, or by other means of carrying out the function.
  • a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method.
  • an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
  • a device A connected to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • Connected may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

There is provided a display system and display device, and methods therefor, for use in a vehicle for providing a driver with dynamic visibility of an external area obstructed by a view obstructing part of the vehicle. The display system includes a camera and a display screen. The camera is configured to capture images having a wider angle of view than the view obstructed by the view obstructing part. The display screen is configured so that images of the obstructed external area are dynamically adjusted in accordance with the viewing position of the display screen. Further, there is provided a display device including a screen viewable by a viewer and inner layers including an image generating layer. The screen surface and the inner layers cooperate to reduce light contamination within the device.

Description

DISPLAY DEVICE, DISPLAY SYSTEM AND METHODS THEREFOR
Field of the Invention
[001] The present invention relates to a display device and in particular to a display device and methods therefor, for use in a vehicle or craft.
[002] The invention has been developed primarily for use in/with vehicles and will be described hereinafter with reference to this application. However, it will be appreciated that the invention is not limited to this particular field of use.
Background of the Invention
[003] At present, most vehicles include pillars which extend between the body of the vehicle and its roof. Such pillars present blind spots, behind which the drivers and passengers of vehicles cannot see, and the pillars may conceal vehicles and pedestrians, causing safety concerns while driving. The design size of such pillars has been increasing due to the requirements for increased safety in collisions and rollovers.

[004] Attempts have been made to overcome these issues by providing cameras for providing a view of the scene behind the pillars, with a feed from the camera being displayed by display devices mounted onto the pillars on the inside of the vehicle.
[005] However, such displays do not have the same focal plane as the external surroundings that the camera is viewing. A viewer from within the vehicle is required to adjust the focus of their eyes from a distant focus of external objects to a close-up focus of the display. For a viewer to constantly adjust their focus while maintaining correct situational awareness of the external environment requires effort by the viewer.

[006] Further, if the viewer changes location within the internal environment of the vehicle (for example while moving over rough terrain or going through turns), the perspective of prior art displays does not change: the display remains static based on the camera view, and is difficult to read and track with the eyes.
[007] Drivers of different sizes, and with different preferred seating positions, would require a recalibration of the internal display in order to make the scene overlay correctly.
[008] Further, display devices may be used for displaying a wide variety of images. However, where ambient light incident on the display is reflected within the display, it may interfere with the image being displayed.

[009] Any discussion of the background art throughout the specification should in no way be considered as an admission that such background art is prior art, nor that such background art is widely known or forms part of the common general knowledge in the field in Australia or any other country.
Summary of the Invention
[010] The invention seeks to provide a display device which will overcome or substantially ameliorate at least some of the deficiencies of the prior art, or to at least provide an alternative.
[011] Additionally and/or alternatively, the invention seeks to provide a system for reducing focal plane disparity between an internal screen and an exterior environment.
[012] In a first aspect, the invention may be said to consist in a display system for providing a driver dynamic visibility of an external area obstructed by a view- obstructing part of a vehicle, the system comprising: a. a layered display screen for displaying images; b. a camera able to be mounted on, or integrated into, the vehicle, the camera being configurable to capture images of the obstructed external area; c. an electronic control unit for communicating images captured by the camera to the layered display screen; wherein d. the camera is configurable to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; e. the layered display screen includes an image generating layer; and f. the layered display screen includes one or more optical layers configured for collimating light from the image generating layer; g. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
[013] In one embodiment, the camera is configurable to capture images having a wider angle of view than a static angle of view obstructed by the view-obstructing part from the viewing position of the layered display screen.
[014] In one embodiment, the images of the obstructed external area displayed by the layered display screen dynamically adjust in accordance with a line of sight between the viewing position and the display screen.
[015] In one embodiment, the one or more optical layers of the layered display screen provide (to a viewer at the viewing position) an image plane behind the image generating layer (i.e. the image plane appears further away from the driver than it physically is).
[016] In one embodiment, the image generating layer is located below, or behind, and at a distance within the focal length of, the one or more optical layers.
[017] In one embodiment, the camera is configurable to capture images at, or substantially at, an infinity focus.
[018] In one embodiment, the depth of field of the images captured by the camera matches, or approximately matches, the depth of field of a viewer’s eye viewing an unobstructed external area proximate to the view-obstructing part of the vehicle.
[019] In one embodiment, the layered display screen includes a screen surface.
[020] In one embodiment, the screen surface has a viewable area that is smaller than an image generating layer area of the image generating layer.
[021] In one embodiment, the image generating layer area is in the range of about 1.2 to 2.5 times the viewable area of the screen surface.
[022] In one embodiment, the image generating layer area is about 1.5 times the viewable area of the screen surface.
[023] In one embodiment, the one or more optical layers (and the screen surface) is distanced from the image generating layer.
[024] In one embodiment, the one or more optical layers (and the screen surface) is separated from the image generating layer by a pre-determined distance.
[025] In one embodiment, the pre-determined distance is in the range of about 1 inch (approx. 25 mm) to about 2.5 inches (approx. 64 mm).
[026] In one embodiment, the pre-determined distance is about 1.5 inches (approx. 38 mm).
[027] In one embodiment, a numerical aperture of the one or more optical layers is chosen to optimise the competing factors of (i) minimising the distance between the image generating layer and the one or more optical layers and (ii) maintaining driver visibility of all, or substantially all, the viewable area.
[028] In one embodiment, the numerical aperture is in the range of about 0.3 to about 0.5.
[029] In one embodiment, the numerical aperture is about 0.38.
[030] In one embodiment, the layered display screen provides a viewing cone in the range of about 25 to about 35 degrees.
[031] In one embodiment, the layered display screen provides a viewing cone of about 30 degrees.
[032] In one embodiment, the image generating layer is located below, or behind, the one or more optical layers.
[033] In one embodiment, the image generating layer is an emissive display layer.
[034] In one embodiment, the image generating layer is an organic light emitting diode (OLED) layer or micro-light emitting diode (micro-LED) layer.
[035] In one embodiment, the image generating layer is a non-emissive display layer.
[036] In one embodiment, the image generating layer is a liquid crystal display layer.
[037] In one embodiment, the screen surface is located adjacent to the one or more optical layers.
[038] In one embodiment, the screen surface is located above the one or more optical layers.
[039] In one embodiment, the screen surface is integral with, or adhered to, one of the one or more optical layers.
[040] In one embodiment, the view-obstructing part of the vehicle is the A-pillar.
[041] In one embodiment, the image displayed by the layered display screen is not visible to the driver’s side passenger.
[042] In one embodiment, the layered display screen is substantially rectangular in shape.
[043] In one embodiment, dynamic visibility comprises providing continuous visibility of the obstructed external area as the viewing position changes.
[044] In one embodiment, dynamic visibility comprises providing continuous visibility of the obstructed external area during movement of the vehicle.
[045] In one embodiment, the one or more optical layers comprise a collimator configured to provide to the driver collimated light from the image generating layer.
[046] In one embodiment, the collimator is a Fresnel lens.
[047] In one embodiment, the one or more optical layers comprise a magnifying lens.
[048] In one embodiment, one or more layers of the layered display screen include an anti-reflective coating.
[049] In one embodiment, the anti-reflective coating is on an uppermost layer of the display device.
[050] In one embodiment, the anti-reflective coating is on an upper surface of the uppermost layer of the display device.
[051] In one embodiment, the lowest one of the one or more optical layers includes an anti-reflective coating on a lower side.
[052] In one embodiment, the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF).
[053] In one embodiment, one or more interior surfaces of the layered display screen (e.g. the casing) comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
[054] In one embodiment, the anti-reflective substance and/or anti-reflective coating comprises a dark (light absorbing) substance.
[055] In one embodiment, the anti-reflective substance comprises black velvet.
[056] In one embodiment, the anti-reflective substance and/or anti-reflective coating comprises carbon nanotubes (CNTs).
[057] In one embodiment, the anti-reflective substance and/or anti-reflective coating comprises a material made out of vertically aligned nanotube arrays (VANTAs).
[058] In one embodiment, the camera is able to be mounted on, or integrated into, an exterior part of the vehicle.
[059] In one embodiment, the camera is able to be mounted on, or integrated into, a side mirror of the vehicle opposite the driver.
[060] In one embodiment, the camera is able to be mounted on, or integrated into, a front or side casing of the side mirror.
[061] In one embodiment, the area external to the vehicle captured by the camera is forward and to the side of the vehicle.
[062] In one embodiment, the electronic control unit is able to receive and communicate to the layered display screen images from a secondary camera.
[063] In one embodiment, the layered display screen is able to operate as a screen for use when merging and/or reversing and/or performing other driving actions.
[064] In one embodiment, the camera is a wide-angle camera configured to capture an external area beyond the obstructed external area and the area near to the obstructed external area (e.g. to the rear of the vehicle).
[065] In one embodiment, the electronic control unit is able to adjust and/or crop and/or enlarge by zooming, the images captured by the camera.
[066] In one embodiment, the camera is adapted to capture images of the obstructed external area that are larger than a pre-determined minimum capture area, wherein the larger capture area is able to be cropped by the electronic control unit according to one or more parameters.
[067] In one embodiment, the one or more parameters include the distance between the view-obstructing vehicle part and the driver, driver height and/or internal dimensions of the car.
[068] In one embodiment, the driver can control cropping of the larger capture area using an onboard vehicle control system.
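By way of illustration only, the cropping described in paragraphs [066] to [068] can be sketched as a simple window computation. This is a minimal sketch and not part of the specification: all function and parameter names (e.g. crop_window, scale) are hypothetical, as is the assumption that the crop is centred on the obstructed region and scaled by a driver-dependent factor.

```python
def crop_window(frame_w, frame_h, centre_x, centre_y, base_w, base_h, scale):
    """Return (left, top, right, bottom) of a crop window clamped to the frame.

    centre_x / centre_y: centre of the obstructed region in the wide capture;
    base_w / base_h: the pre-determined minimum capture area;
    scale: a driver-dependent factor (e.g. derived from seat position or
    driver height). All names and the centring assumption are illustrative.
    """
    w = base_w * scale
    h = base_h * scale
    left = max(0, centre_x - w / 2)
    top = max(0, centre_y - h / 2)
    right = min(frame_w, centre_x + w / 2)
    bottom = min(frame_h, centre_y + h / 2)
    return left, top, right, bottom
```

For example, a taller driver or a seat pushed further back might map to a larger scale factor, enlarging the cropped region before it is displayed.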
[069] In one embodiment, the system is adapted to identify objects in the external area image captured by the camera and the layered display screen is configured to display the identified objects to the driver using identification signs or icons.
[070] In one embodiment, the system is adapted to identify the movement of identifiable objects and to display the movement using moving identification signs or icons.
[071] In one embodiment, the identification signs or icons change position on the image generating layer according to movement of the identified object relative to the vehicle.
[072] In one embodiment, one or more elements of the device are configurable to auto-adjust, or to be adjustable by the driver, for the purposes of image calibration.
[073] In one embodiment, the electronic control unit of the system is configured to apply a software package incorporating machine vision and/or computer vision methods to identify objects in the images captured by the camera, such as road users, traffic lights, brake lights and other vehicles on the road.
[074] In one embodiment, the machine vision and/or computer vision methods for identification of objects comprise image processing techniques including one or more of: a. filtering; b. thresholding; c. segmentation; d. edge detection; e. colour analysis; and/or f. pattern recognition.
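By way of illustration only, two of the techniques listed in paragraph [074] (thresholding and edge detection) can be sketched in pure Python. This is a minimal, hedged sketch and not the claimed implementation; a real system would typically use optimised library routines (e.g. Sobel or Canny operators) rather than the naive neighbour comparison shown here.

```python
def threshold(img, t):
    """Binarise a grayscale image (a list of rows of 0-255 intensities):
    pixels at or above threshold t become 1, the rest become 0."""
    return [[1 if p >= t else 0 for p in row] for row in img]

def edges(img):
    """Mark pixels whose right or lower neighbour differs: a crude
    gradient-based edge map standing in for a Sobel/Canny operator."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            if img[y][x] != img[y][x + 1] or img[y][x] != img[y + 1][x]:
                out[y][x] = 1
    return out
```

Running a frame through threshold and then edges yields a binary map in which the boundary of a bright region (e.g. a brake light) is marked, which a later pattern recognition stage could consume.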
[075] In one embodiment, the electronic control unit of the system is configured to apply a software package for identification of objects including artificial intelligence methods, such as neural nets, machine learning or deep learning, that learn from images processed by the machine vision and/or computer vision methods to improve accuracy of object identification.
[076] In one embodiment, the electronic control system is configured to communicate with a vehicle operation system, such that the electronic control system is able to operate or control the vehicle.
[077] In one embodiment, vehicle operation or control by the electronic control system includes one or more of the following: a. controlling acceleration of the vehicle; b. controlling braking of the vehicle; c. controlling steering of the vehicle; and/or d. controlling vehicle velocity.
[078] In one embodiment, the system, or individual components thereof (e.g. the layered display screen and/or the camera) is able to be retrofit to the vehicle.
[079] In another aspect the invention may be said to consist in a layered display screen for displaying images and for providing dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, wherein: a. the layered display screen includes an image generating layer; b. the layered display screen includes one or more optical layers for collimating light from image generating layer; c. the layered display screen is adapted to receive images from an electronic control unit configured for communicating images captured by a camera, the images capturing the obstructed external area and having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; d. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
[080] In one embodiment, the layered display screen is able to be retrofit to the vehicle.
[081] In another aspect the invention may be said to consist in a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of: a. providing a layered display screen for displaying images; b. mounting a camera on, or integrating the camera into, the vehicle, the camera being configurable to capture images of the obstructed external area; c. providing an electronic control unit for communicating images captured by the camera to the layered display screen; wherein d. the camera is configured to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; e. the layered display screen includes an image generating layer; f. the layered display screen includes one or more optical layers for collimating light from the image generating layer; g. such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
[082] In another aspect the invention may be said to consist in a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of: a. receiving an image from a camera, the image having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; b. generating an image from an image generating layer; and c. passing the generated image through a collimating optical layer in order to collimate the light passing through the optical layer.
[083] In one embodiment, the optical layer includes a length or area aspect that is smaller than the corresponding length or area aspect of the image generating layer.
[084] In one embodiment, the image generating layer is removed from the optical layer by a predetermined distance.
[085] In one embodiment, the method includes the step of providing a housing within which the image generating layer and the optical layer are housed.
[086] In one embodiment, the method includes the step of absorbing light incident from the image generating layer that does not pass directly through the optical layers.
[087] In another aspect the invention may be said to consist in a display device for restricting interference by ambient light, wherein: a. the device includes a screen viewable by a viewer, the screen defining a screen surface including a linear polariser; b. the device includes one or more inner layers underneath the screen surface, the one or more inner layers including an image generating layer; c. the screen surface is configured to be distanced from the inner layers by a predetermined distance; and d. a viewable area of the screen surface is smaller than an image generating layer area of the image generating layer; e. such that the screen surface and the one or more inner layers co-operate to reduce light contamination within the device.
[088] In another aspect the invention may be said to consist in a display device for restricting interference by ambient light, wherein: a. the device includes a screen surface comprising an outer linear polariser; b. the device includes one or more inner layers underneath the screen surface, the one or more inner layers including an image generating layer; c. the screen surface is configured to be distanced from the inner layers by a predetermined distance; and d. a viewable area of the screen surface is smaller than an image generating layer area of the image generating layer; e. such that the screen surface and the one or more inner layers co-operate to reduce light contamination within the device.
[089] In one embodiment, the image generating layer area is in the range of about 1.2 to about 2.5 times the viewable area of the screen surface.
[090] In one embodiment, the image generating layer area is about 1.5 times the viewable area of the screen surface.
[091] In one embodiment, the device further includes an intermediate layer that is adjacent to, or integral with, the screen surface.
[092] In one embodiment, the intermediate layer includes one or more optical layers.
[093] In one embodiment, the screen surface, including any intermediate layer, is separated from the one or more inner layers by the pre-determined distance.
[094] In one embodiment, the pre-determined distance is determined by a function of: a. the optical properties of the intermediate layer; b. how much of the image generating layer is visible by a viewer at a preferred viewing distance; and/or c. the potential range of the preferred viewing distance (i.e. distance between viewer and screen).
[095] In one embodiment, the predetermined distance is chosen or choosable based on a compromise between magnification, image focus and distortion.
[096] In one embodiment, the predetermined distance is adjustable.
[097] In one embodiment, the display device includes an adjustment mechanism for adjusting the predetermined distance.
[098] In one embodiment, the pre-determined distance is in the range of about 1 inch (approx. 25 mm) to about 2.5 inches (approx. 64 mm).
[099] In one embodiment, the pre-determined distance is about 1.5 inches (approx. 38 mm).
[0100] In one embodiment, the one or more optical layers comprise at least one converging lens.
[0101] In one embodiment, the one or more optical layers comprise a collimator.
[0102] In one embodiment, the collimator is a Fresnel lens.
[0103] In one embodiment, the one or more optical layers comprise a magnifying lens.
[0104] In one embodiment, a numerical aperture of the one or more optical layers is chosen to optimise the competing factors of (i) minimising the distance between the image generating layer and the intermediate layer and (ii) maintaining viewer visibility of all, or substantially all, the viewable area.
[0105] In one embodiment, the numerical aperture is in the range of about 0.3 to about 0.5.
[0106] In one embodiment, the numerical aperture is about 0.38.
[0107] In one embodiment, the device provides a viewing cone in the range of about 25 to about 35 degrees.
[0108] In one embodiment, the device provides a viewing cone of about 30 degrees.
[0109] In one embodiment, the device includes an adhesive layer, and one of the one or more optical layers is adhered to the outer linear polariser by the adhesive layer.
[0110] In one embodiment, the adhesive layer is an adhesive laminate.
[0111] In one embodiment, one or more of the one or more optical layers includes an anti-reflective coating on an inner side.
[0112] In one embodiment, one or more layers of the device include an anti-reflective coating.
[0113] In one embodiment, an uppermost layer of the device has the anti-reflective coating.
[0114] In one embodiment, an outer surface of the outermost layer of the device has the anti-reflective coating.
[0115] In one embodiment, the outer polariser has an anti-reflective coating.
[0116] In one embodiment, one or more interior surfaces of the device comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
[0117] In one embodiment, the anti-reflective substance and/or coating comprises a dark (light absorbing) substance.
[0118] In one embodiment, the anti-reflective substance comprises black velvet.
[0119] In one embodiment, the anti-reflective substance and/or coating comprises carbon nanotubes (CNTs).
[0120] In one embodiment, the anti-reflective substance and/or coating comprises a material made out of vertically aligned nanotube arrays (VANTAs).
[0121] In one embodiment, the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF).
[0122] In one embodiment, the ambient light entering the device reflected by the image generating layer is absorbed by the outer linear polariser.
[0123] In one embodiment, the outer linear polariser reduces the ambient light entering into the device by an amount in the range of about 40 to about 75 percent.
[0124] In one embodiment, the outer linear polariser reduces the ambient light entering into the device by an amount greater than about 50 percent.
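By way of illustration only, the attenuation of ambient light by the outer linear polariser described in paragraphs [0122] to [0124] can be sketched with a simple round-trip calculation. The figures below assume an ideal polariser passing 50% of unpolarised light and a reflection off the inner layers that depolarises the light; both assumptions are illustrative and not taken from the specification.

```python
def round_trip_fraction(p=0.5, r=1.0, depolarising=True):
    """Fraction of ambient light re-emerging after entering through a
    linear polariser (which passes fraction p of unpolarised light),
    reflecting with reflectance r off an inner layer, and exiting back
    through the same polariser.

    If the reflection depolarises the light, the second pass through the
    polariser costs another factor p; if it preserves the aligned
    polarisation, the exit pass is (ideally) lossless. Values of p and r
    are illustrative assumptions, not specification figures."""
    return p * r * (p if depolarising else 1.0)
```

Under these assumptions a depolarising reflection lets only a quarter of the incident ambient light re-emerge, consistent with the stated reduction of greater than about 50 percent.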
[0125] In one embodiment, the image generating layer includes an emissive display layer.
[0126] In one embodiment, the image generating layer includes an OLED layer or micro-LED layer.
[0127] In one embodiment, the image generating layer includes a non-emissive display layer.
[0128] In one embodiment, the image generating layer includes a liquid crystal display layer.
[0129] In one embodiment, the device further includes an inner linear polariser adjacent to an inner surface of the image generating layer.
[0130] In one embodiment, the device further includes a backlighting layer.
[0131] In one embodiment, the backlighting layer is the innermost layer of the device.
[0132] In one embodiment, the backlighting layer is edge-lit.
[0133] In one embodiment, the edge lighting is by LED.
[0134] In one embodiment, the device further includes a directional backlight film below the inner linear polariser and above the backlighting layer.
[0135] In one embodiment, the directional backlight film is configured to control the angular spread of light from the backlighting layer.
[0136] In one embodiment, the outer linear polariser is the outermost layer of the device.
[0137] In one embodiment, one or more of the layers of the device are rectangular in shape.
[0138] Other aspects of the invention are also disclosed.
Brief Description of the Drawings
[0139] Notwithstanding any other forms which may fall within the scope of the present invention, a preferred embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
[0140] Figure 1 shows a vehicle interior including an A-pillar covered by a display screen;
[0141] Figure 2 shows the vehicle interior of figure 1 with the display screen showing objects obscured by the A-pillar;
[0142] Figure 3 shows a schematic view of a display system;
[0143] Figure 4 shows a cutaway schematic plan view of a display screen;
[0144] Figure 5 shows a cutaway schematic plan view of a display screen with a viewer viewing the display screen from two viewing points;
[0145] Figure 6 shows a schematic view of a control system; and
[0146] Figure 7 shows a flow chart of a method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle.
Description of Embodiments
[0147] It should be noted in the following description that like or the same reference numerals in different embodiments denote the same or similar features.
System for providing dynamic visibility
[0148] Now described with reference to figures 1 to 4, there is provided a display system 1000 for providing a driver dynamic visibility of an external area, the external area being obstructed by a view-obstructing part of a vehicle 2000. The view-obstructing part of the vehicle may be an A-pillar 2100 of the vehicle (as shown in figures 1 and 2), or any other portion, such as the dashboard, B-pillar, C-pillar, vehicle body or cab.
[0149] The display system includes a layered display screen 1100 for displaying images, a camera 1200, and an electronic control unit 1300. The electronic control unit 1300 is configured for communicating images captured by the camera 1200 to the layered display screen 1100 for display by the layered display screen 1100.
Display screen
[0150] The layered display screen 1100 preferably includes a housing 1140 (shown in figure 4), which will be described in more detail below. The layered display screen 1100 includes a plurality of image generating layers 1110 and a plurality of optical layers 1120 for collimating light from the image generating layers 1110. The display screen 1100 defines an outer surface through which light radiates to be viewed by a viewer. The display screen 1100 further includes an intermediate layer which, together with a layer defining the screen surface, makes up the optical layers.
[0151] The optical layers 1120 preferably include a lens 1122, preferably in the form of a Fresnel lens. A linear polariser layer 1124 is adhered to the lens 1122 by an adhesive laminate layer 1126. Preferably, the outer linear polariser layer 1124 is located closer to the viewer than the lens 1122. Preferably, the outer linear polariser 1124 reduces the ambient light entering into the device by an amount in the range of about 40% to about 75%, and more preferably greater than 50%.
[0152] The optical layers 1120 further preferably include an outer antireflective coating 1128, which is disposed on an outermost surface of the optical layers 1120. The outermost surface of the optical layers 1120 preferably corresponds to a display screen surface of the layered display screen 1100, and in this sense, the optical layers 1120 may be said to be integral with the display screen surface. However, it is envisaged that a further clear protective layer (not shown) may be provided outside of the intermediate layer to make up the optical layers 1120.
[0153] The antireflective coating 1128 serves to prevent or restrict the ingress of ambient light into the layered display screen 1100. A further antireflective coating (not shown) may be provided on a side of the optical layers 1120 closer to the image generating layers 1110. Preferably, the anti-reflective coating includes Magnesium Fluoride (MgF2) and/or Lithium Fluoride (LiF). In addition to the antireflective coating 1128, the linear polariser layer 1124 will serve to restrict the ingress of ambient light into the display screen 1100.
[0154] In addition to the collimating effect, it is envisaged that the Fresnel lens may also provide a magnifying effect. Alternatively, a separate magnifying layer (not shown) may be provided for this purpose.
[0155] In an alternative embodiment, the optical layers can include a converging lens (not shown).
[0156] The one or more optical layers 1120 of the layered display screen 1100 are preferably optically configured to display or provide (to a viewer in a viewing position) a focal plane or image plane behind the image generating layer (i.e. the image plane appears further away from the driver than it physically is).
[0157] The image generating layer 1110 is located below or inwardly (i.e. further away from the viewer and/or further into the housing), and at a predetermined distance within the focal length of, the one or more optical layers. Preferably the image generating layers 1110 are located at a predetermined distance of about 25 mm (1 inch) to about 64 mm (2.5 inches) from the optical layers 1120, and more preferably at about 38 mm (1.5 inches). The predetermined distance is shown as arrow d on figure 4.
[0158] It is envisaged that an adjustment mechanism (not shown) may be provided that will allow the predetermined distance to be adjustable by a viewer. Such an adjustment mechanism may be mechanical or electronic in nature, and a person skilled in the art will appreciate that a wide variety of adjustment mechanisms are possible. Examples of adjustment mechanisms include threaded adjustment mechanisms, linear motors, snap-fit type adjustment mechanisms, or the like.
[0159] In another embodiment (not shown) the display screen 1100 can include an adjustment mechanism (not shown) for adjusting the predetermined distance.
[0160] The predetermined distance is preferably determined by a function of one or more selected from: a. the optical properties of the intermediate layer (including the Fresnel lens); b. how much of the image generating layer is visible by a viewer at a preferred viewing distance; and c. the range of preferred viewing distances (i.e. distance between viewer and screen).
[0161] It should be noted that the distance d between the optical layers 1120 and the image generating layer 1110, as compared with the distance between the intermediate optical layer and the viewer, needs to be chosen to find the preferred compromise between magnification, image focus and distortion.
[0162] Preferably, a numerical aperture of the one or more optical layers 1120 is configured to optimise the competing factors of (i) minimising the distance between the image generating layer and the one or more optical layers and (ii) maintaining driver visibility of all, or substantially all, the viewable area. The numerical aperture is preferably in the range of about 0.3 to about 0.5, and more preferably about 0.38.
[0163] It is envisaged that a distance (which may be the width and/or height) or area across the image generating layers 1110 will be larger than the corresponding distance or area across the optical layers 1120. In other words, the viewable size, and preferably area, of at least one or more aspects (i.e. width and/or height and/or width x height, shown as arrow S in figure 4) of an outer surface of the layered display screen 1100 will preferably be smaller than the corresponding aspect (shown as arrow L in figure 4) of the image generating layers 1110.
[0164] It is envisaged that the aspect L (height, width or area) of the image generating layers 1110 will preferably be about 1.2 to 2.5 times the corresponding aspect S of the optical layers 1120, and more preferably about 1.5 times. In this way, the layered display screen 1100 provides a tapered viewing angle or cone that is in the range of about 25° to about 35°, and most preferably at about 30°, relative to the plane of the collimated light being displayed from the optical layers 1120.
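By way of illustration only, the relationship between the screen aspect S, the image generating layer aspect L and the separation d can be sketched geometrically: an eye remains within the viewing cone while its sight line through the smaller aperture still lands on the larger layer. This is one plausible geometric reading, not the specification's definition; the dimensions used below are hypothetical, and the simple model ignores refraction by the optical layers, which in practice reshapes the cone.

```python
import math

def viewing_half_angle_deg(S, L, d):
    """Half-angle (in degrees) of the viewing cone for a screen aperture
    of width S over an image generating layer of width L at separation d,
    taking tan(theta) = ((L - S) / 2) / d by similar triangles."""
    assert L >= S and d > 0
    return math.degrees(math.atan(((L - S) / 2) / d))
```

With hypothetical widths S = 76 mm and L = 114 mm (1.5 x S) at the preferred separation d = 38 mm, the half-angle comes out to about 26.6 degrees; shrinking the ratio L/S or increasing d narrows the cone.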
[0165] It is envisaged that in one embodiment, the image generating layers 1110 can include an emissive display layer, such as an organic light emitting diode (OLED) or other similar light emitting diode (LED) layer such as a micro-LED layer.
[0166] In another embodiment, the image generating layers 1110 can include a non-emissive display layer, for example a liquid crystal display (LCD) layer.
[0167] In the embodiment shown in figure 4, the image generating layers 1110 include an edge-lit backlight layer 1112 that includes LEDs 1113 arranged at least partly about the periphery of the backlight layer 1112. In an alternative embodiment, the backlight layer 1112 can be directly lit and not edge lit.
[0168] The image generating layer 1110 further includes a narrow field of view directional backlight film 1114 and a rear or innermost polarising layer 1116. The directional backlight film 1114 is configured to control the angular spread of light from the backlighting layer 1112.
[0169] The image generating layer 1110 further includes a liquid crystal cell layer 1118 that interacts together with the rear polarising layer 1116 and the linear polariser layer 1124 of the optical layers 1120 in a known manner to thereby generate images.
[0170] It is envisaged that in alternative embodiments, the display screen 1100 could be composed of a resilient and/or flexible material that can be folded or bent in accordance with the requirements of the A-pillar configuration. Such resilient and/or flexible display panels are known, and a discussion of these is considered beyond the scope of this specification.
Camera
[0171] The camera 1200 is preferably configured for being mounted on, or integrated into, a vehicle 2000. Preferably, the camera 1200 is configured to be mountable on, or integrated with, a side mirror (not shown) of the vehicle 2000, in a position from which the camera is able to view the obstructed area. In this regard, it is anticipated that the camera 1200 could be mountable to, or integrated with, a front or side casing of a side mirror. In the case of an A-pillar, the area external to the vehicle captured by the camera would be forward and to the side of the vehicle.
[0172] The camera 1200 is configured or configurable for capturing images of an external area 3000 that is obstructed by the A-pillar 2100 of the vehicle 2000. The view obstructed by the A-pillar 2100 will typically be from the point of view of a driver in one particular viewing position at any one time. However, such an obstruction to the view of the driver may persist while the driver moves between a number of viewing positions, for example as the driver moves backwards and forwards, or side to side, in the driver’s seat.
[0173] The camera 1200 will preferably be configured for capturing images or video having a wider angle of view than any one static angle of view obstructed from any given viewing position. The camera 1200 will also preferably be configured for capturing images of the entire external area that may be obstructed within the whole typical range of movement of the driver’s viewing position.
[0174] In this way, images captured by the camera 1200 will be fed to the control unit 1300, which will direct them to the display screen 1100 for display. The displayed images of the obstructed external area will adjust dynamically in accordance with a line of sight between the viewing position from which the layered display screen is being viewed and the display screen.
[0175] Preferably the camera 1200 is configured or configurable to capture images at, or substantially close to, an infinity focus. Images displayed by the image generating layers 1110 will be collimated by the optical layers 1120 as they pass on to the viewer. This will have the effect of giving light from the images a similar focal plane to light from the actual external area 3000. Because of this, a viewer looking between a viewable area adjacent the external area 3000 and images on the display screen 1100 will not need to adjust their focus as much as if they were looking at a screen without collimation. In other words, the depth of field of the images captured by the camera matches, or approximately matches, the depth of field of a viewer's eye viewing an unobstructed external area proximate to the view obstructing part of the vehicle. This reduces the effort required by the driver to remain aware of their surroundings.
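The refocusing saving described above can be illustrated with a simple accommodation calculation. The following is a hypothetical sketch only; the function name and the 0.7 m screen distance are illustrative assumptions and do not come from the specification:

```python
def accommodation_demand_dioptres(distance_m: float) -> float:
    """Accommodation demand (in dioptres) for an object at a given distance.

    An object at optical infinity demands 0 dioptres of accommodation,
    which is why a collimated image matches the far road scene.
    """
    if distance_m == float("inf"):
        return 0.0
    return 1.0 / distance_m

# An uncollimated A-pillar screen roughly 0.7 m from the driver's eyes
# (an assumed distance) demands about 1.43 dioptres of accommodation.
uncollimated = accommodation_demand_dioptres(0.7)

# A collimated display presents its image at or towards infinity,
# so the refocus step between road and screen is negligible.
collimated = accommodation_demand_dioptres(float("inf"))

print(f"Refocus step without collimation: {uncollimated - collimated:.2f} D")
```

The difference (about 1.4 dioptres here) is the accommodation change the collimating optical layers 1120 are intended to eliminate.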
[0176] It is envisaged that the image displayed by the layered display screen 1100 may or may not be visible to the driver’s side passenger. To this extent, another viewing angle reduction layer (not shown) may be provided that prevents others that are outside of a particular viewing angle from seeing the display screen 1100. [0177] It is further envisaged that the camera 1200 may be a wide-angle camera that is configured to capture an external area well beyond the obstructed external area, for example towards the rear of the vehicle.
Dynamic visibility
[0178] Further, as the viewing position of the viewer changes, the viewer may be able to see more of the underlying image generating layers, thereby providing continuously changing visibility of the obstructed external area as the viewing position changes. This is illustrated in figure 5, where the viewing angles from two different positions are shown. From a first top viewing position, a portion of the image generating layer 1110 shown as V1 can be seen. Portion V1 shows an image corresponding to a field of view of the external area shown on figure 5 as R1. From a second bottom viewing position, a portion of the image generating layer 1110 shown as V2 can be seen. Portion V2 shows an image corresponding to a field of view of the external area shown on figure 5 as R2. As the position of the viewer progresses from the first top viewing position to the second bottom viewing position, the visibility provided by the display screen 1100 will change to provide continuous visibility of the obstructed external area. In addition, the images along lengths V1 and V2 generated by the display screen will be collimated to form the intended image at the far focus plane (at or towards infinity), thereby reducing the field disparity between the scene external to the vehicle and the scene displayed by the display screen. This allows the observer to keep their eyes focused at a far field while dynamically changing the content the observer sees when moving their eye location.
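The geometry by which a screen portion V maps to an external field of view R can be sketched as a similar-triangles projection of sight lines through the screen edges. This is a hypothetical illustration only; the function, its name and every dimension below are assumptions, not values taken from the specification:

```python
def visible_band(eye_y: float, screen_top: float, screen_bottom: float,
                 eye_to_screen: float, screen_to_scene: float):
    """Project the eye's sight lines through the screen edges onto a
    distant scene plane, giving the band R of the external area that
    the visible screen portion V should reproduce for this eye height.

    All heights are in metres; eye_to_screen and screen_to_scene are
    horizontal distances. A purely geometric sketch -- a real system
    would work with angles and camera intrinsics instead.
    """
    def project(edge_y: float) -> float:
        # Extend the eye->edge ray onward by the screen-to-scene distance.
        slope = (edge_y - eye_y) / eye_to_screen
        return edge_y + slope * screen_to_scene

    return project(screen_top), project(screen_bottom)

# Two viewing positions (cf. figure 5): as the eye moves down from the
# first position to the second, the band of the scene seen "through"
# the pillar shifts upward, so the displayed content must shift too.
r1 = visible_band(eye_y=1.20, screen_top=1.4, screen_bottom=0.9,
                  eye_to_screen=0.7, screen_to_scene=5.0)
r2 = visible_band(eye_y=1.05, screen_top=1.4, screen_bottom=0.9,
                  eye_to_screen=0.7, screen_to_scene=5.0)
```

Evaluating the two positions shows both band edges moving upward as the eye lowers, which is the continuous change in visibility the display must track.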
[0179] Similarly, as the vehicle moves, the images generated by the camera will change, and the dynamic visibility provided by the display screen will give continuous visibility of the obstructed external area during movement of the vehicle.

[0180] It is envisaged that in one embodiment, the layered display screen will be substantially rectangular in shape. Alternatively, the layered display screen may be configured to substantially correspond with the shape and size of the view obstructing part. As shown in figure 1, the display screen 1100 is configured to substantially match the shape and size of the A-pillar 2100.
[0181] Further, in order to restrict and/or prevent ambient light from interfering with the display generated by the layered display screen 1100, it is envisaged that one or more interior surfaces of the layered display screen may be composed of an anti-reflective substance and/or coated with an anti-reflective coating 1142, shown as a broken line in figure 4. For example, it is envisaged that the interior surfaces of the housing 1140 will be coated with a light absorbing substance such as black velvet, carbon nanotubes, vertically aligned nanotube arrays (VANTAs), or any other suitably engineered material or coating.
[0182] Not all of the light generated by the image generating layer 1110 will be transmitted directly out of the display screen 1100. Light from the image generating layer 1110 may be incident upon the internal walls and/or other interior components of the housing 1140. Importantly, this light will not be reflected internally, where it would interfere with and/or create light noise in the light that is transmitted out of the display screen 1100. Instead, light incident upon the internal walls (shown as arrow D in figure 4) will be absorbed by the anti-reflective coating 1142, thereby causing the light that is transmitted from the display screen 1100 to be of a clearer quality.
[0183] Further, the linear polarising layer 1124 will cause ambient light that is incident on the display screen to be filtered, thereby restricting ingress of ambient light into the display screen. Ambient light that does traverse the optical layers may reflect off the image generating layers and onto the anti-reflective coating 1142, further reducing noise in the light that is transmitted out of the display screen 1100.
Control unit
[0184] Figure 6 shows a control unit 1300. In a preferred embodiment, the control unit 1300 includes a processor 1310 and semiconductor memory 1320 comprising volatile memory such as random access memory (RAM) and/or non-volatile memory such as read only memory (ROM). The memory 1320 may comprise either RAM or ROM, or a combination of RAM and ROM.
[0185] The device further comprises an I/O interface 1330 for communicating with one or more peripheral devices. The I/O interface 1330 may offer both serial and parallel interface connectivity. The I/O interface 1330 may also communicate with one or more human input devices (HID) 1340 such as keyboards, touchscreens, pointing devices, joysticks and the like. The I/O interface 1330 may also comprise an audio interface for communicating audio signals to one or more audio devices 1350, such as a speaker or a buzzer.
[0186] The control unit 1300 also preferably comprises a network interface 1360 for communicating with one or more networks 1370. The network 1370 may be a wired network, such as a wired Ethernet™ network or a similar wired network; or a wireless network, such as a Bluetooth™ network or IEEE 802.11 network. The network 1370 may be a local area network (LAN), such as a home or office computer network, or a wide area network (WAN), such as the Internet or private WAN. Alternatively, the network 1370 may be a wired connection to the vehicle control unit, or to the vehicle media control head unit.
[0187] The control unit 1300 further comprises a storage device 1380, such as a magnetic disk hard drive or a solid state disk drive. The storage device 1380 may be configured for storage of software instructions and/or data.
[0188] The controller 1300 also comprises a video interface 1390 for conveying video signals to the display screen 1100, and possibly to a further display device such as the vehicle's media head unit (not shown) or dashboard display, such as a liquid crystal display (LCD), cathode-ray tube (CRT) or similar display device.
[0189] The control unit 1300 also comprises a communication bus subsystem 1400 for interconnecting the various devices described above. Further, the control unit 1300 can comprise a clock device 1410 for timing data received and transmitted, and a geolocation device 1420 in order for the control unit 1300 to be able to track the location of the vehicle 2000, as well as the location at which particular objects may be identified (this will be described in more detail below).
[0190] In one embodiment, the electronic control unit 1300 is able to receive and communicate to the layered display screen 1100 images from a secondary camera (not shown). Such a secondary camera may be, for example, a DVR, side view camera or reversing camera located on the vehicle. In this way, the display screen 1100 may be able to operate as a screen for use when merging and/or reversing and/or performing other driving actions.
[0191] It is envisaged that the control unit 1300 may be configured with software instructions on the digital storage media 1380 that allow for the adjustment and/or cropping and/or enlarging, by zooming, of the images captured by the camera. For example, if the camera is adapted to capture images of an external area that is larger than the obstructed external area, then the control unit may be configured for automatically cropping the images according to one or more parameters. Such parameters can include one or more selected from:
a. the distance between the visibility obstructing vehicle part and the driver;
b. driver height; and
c. internal dimensions of the vehicle.
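A parameter-driven crop of the wide-angle frame might be sketched as follows. This is a hypothetical heuristic only: the function name, the baseline values and the scale/bias factors are illustrative assumptions, not values from the specification:

```python
def crop_to_obstruction(frame_w: int, frame_h: int,
                        driver_distance_m: float, driver_height_m: float,
                        base_distance_m: float = 0.7,
                        base_height_m: float = 1.7):
    """Return an (x, y, w, h) crop window for the wide camera frame.

    Illustrative heuristic: the window shrinks as the driver sits
    further from the view obstructing part (the obstructed solid angle
    narrows), and is biased upward for taller drivers, who look down
    at a steeper angle past the pillar.
    """
    # Parameter (a): driver-to-pillar distance scales the window size.
    scale = min(1.0, base_distance_m / driver_distance_m)
    w = int(frame_w * scale)
    h = int(frame_h * scale)
    x = (frame_w - w) // 2

    # Parameter (b): driver height shifts the window vertically.
    y_bias = (driver_height_m - base_height_m) * 0.2
    y = int(max(0, min(frame_h - h, (frame_h - h) / 2 - y_bias * frame_h)))
    return x, y, w, h

# A driver seated twice the baseline distance from the pillar:
# the crop halves in each dimension and stays centred.
window = crop_to_obstruction(1280, 720, driver_distance_m=1.4,
                             driver_height_m=1.7)
```

Parameter (c), the internal dimensions of the vehicle, would in practice feed into the baseline values rather than the per-driver inputs.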
[0192] In another preferred embodiment, the control unit 1300 may be adapted to identify objects captured in images of the external area, and insert identification signs or icons into the feed to the display screen 1100 for display on the display screen. Such identification signs or icons may be configured to move together with the identified object on the display screen 1100. Further, it is envisaged that the control unit 1300 may include a transceiver and be configured for transmitting images to a remote server for processing of the images. Processing of the images could be for purposes of identifying objects, or the like. Image processing carried out by the control unit 1300 may incorporate machine vision/computer vision methods to identify objects in images captured by the camera. Objects identified may include road users, pets, traffic lights, traffic signs, brake lights, or other vehicles on the road. Methods that could be used for identification of objects may comprise image processing techniques including one or more of:
a. filtering;
b. thresholding;
c. segmentation;
d. edge detection;
e. colour analysis; and/or
f. pattern recognition.
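Two of the listed techniques, edge detection and thresholding, can be combined in a minimal form as below. This is an illustrative sketch only (a plain Sobel operator over nested lists); a production control unit would use an optimised vision library rather than this code:

```python
def detect_edges(gray, threshold=0.25):
    """Sobel gradient magnitude followed by thresholding.

    `gray` is a list of rows of floats in [0, 1]; returns a boolean
    mask that is True where the normalised gradient exceeds threshold.
    """
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(gray), len(gray[0])

    def px(i, j):
        # Clamp coordinates at the border (edge padding).
        return gray[min(max(i, 0), h - 1)][min(max(j, 0), w - 1)]

    mags = [[0.0] * w for _ in range(h)]
    peak = 0.0
    for i in range(h):
        for j in range(w):
            gx = gy = 0.0
            for di in range(-1, 2):
                for dj in range(-1, 2):
                    v = px(i + di, j + dj)
                    gx += kx[di + 1][dj + 1] * v
                    gy += ky[di + 1][dj + 1] * v
            mags[i][j] = (gx * gx + gy * gy) ** 0.5
            peak = max(peak, mags[i][j])
    return [[peak > 0 and m / peak > threshold for m in row] for row in mags]

# A synthetic 8x8 frame: dark left half, bright right half.
frame = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
mask = detect_edges(frame)
# Edge responses cluster around the vertical boundary (columns 3-4),
# the kind of feature a later segmentation/recognition stage would use.
```

Segmentation, colour analysis and pattern recognition stages would then operate on masks like this one to delimit and label candidate objects.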
[0193] Computer vision/machine vision methods may be complemented by artificial intelligence methods such as neural nets, machine learning or deep learning that learn from images processed by the machine vision and/or computer vision methods to improve the accuracy of object identification.
[0194] It is further envisaged that one or more elements of the device may be configurable to auto adjust, or to be adjustable by the driver, for the purposes of image calibration. Such elements could include focal plane distance, or the like.
[0195] It is further envisaged that in one embodiment, the control unit 1300 may be configured to communicate with a vehicle operating system (not shown) to feed information about identified objects to the vehicle operating system, which can then be used for automated or semi-automated control of the vehicle. Functions that could be carried out by the vehicle operating system in response to information received from the control unit 1300 include, but are not limited to, one or more of:
a. controlling acceleration of the vehicle;
b. controlling braking of the vehicle;
c. controlling steering of the vehicle; and
d. controlling vehicle velocity.
[0196] It is envisaged that the system as described may be built into the vehicle on manufacture, or could be retrofittable to the vehicle. In the instance of a retrofittable system, the system 1000 would be fitted with requisite panel pieces and connector formations that would allow it to be connected to existing connecting formations on the vehicle. For example, replacement vehicle mirrors may be provided with a camera built into the vehicle mirrors. Alternatively, a panel piece for a vehicle mirror may be provided that slots into a space where an old panel piece is removed. Further, currently existing A-pillar coverings may be removed, and a replacement A-pillar covering including a display screen 1100 as described above may be provided with the requisite connector formations for fitting to the A-pillar.
[0197] A wide variety of vehicles and connector formations of all shapes and sizes are presently on the market, and it will be appreciated by persons skilled in the art that not all of these can be described here.
In use
[0198] In use, and with reference to figure 7, a display system 1000 as described above will be provided 2, and installed 4 on a vehicle 2000. The camera will be configured or configurable to capture 6 images of the external area 3000, including the obstructed area. The camera will then be used to generate 8 images in the form of a video feed from the camera to the control unit. After receiving 10 the video feed from the camera, the control unit 1300 then processes 12 the received video feed and transmits the processed video feed to the display screen 1100. The display screen 1100 generates 14 an image on an image generating layer, which is then radiated out of the display screen via the optical layers 1120. As the image passes through the optical layers, it is collimated 16.
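The in-use flow of figure 7 (capture, feed, receive, process, transmit, display, collimate) can be sketched as a simple pipeline. The class and method names below are illustrative assumptions for this sketch; they do not appear in the specification:

```python
# Hypothetical end-to-end sketch of the in-use flow of figure 7.

class Camera:
    """Stands in for camera 1200 capturing the external area 3000."""
    def capture(self):
        return {"frame": "raw-wide-angle-frame"}  # steps 6-8

class ControlUnit:
    """Stands in for control unit 1300 receiving and processing the feed."""
    def process(self, feed):
        # Cropping/adjustment per paragraph [0191] would happen here.
        return {"frame": feed["frame"], "processed": True}  # steps 10-12

class DisplayScreen:
    """Stands in for display screen 1100 with its layered optics."""
    def show(self, processed):
        # The image generating layer renders the frame; the optical
        # layers collimate it on the way out to the viewer (steps 14-16).
        return f"collimated({processed['frame']})"

camera, control_unit, screen = Camera(), ControlUnit(), DisplayScreen()
feed = camera.capture()
processed = control_unit.process(feed)
output = screen.show(processed)
```

Each stage maps to one numbered step of figure 7; in a real system the feed would be a continuous video stream rather than a single frame.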
Interpretation
[0199] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. For the purposes of the present invention, additional terms are defined below. Furthermore, all definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms unless there is doubt as to the meaning of a particular term, in which case the common dictionary definition and/or common usage of the term will prevail.

[0200] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular articles “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise and thus are used herein to refer to one or to more than one (i.e. to “at least one”) of the grammatical object of the article. By way of example, the phrase “an element” refers to one element or more than one element.
[0201] The term “about” is used herein to refer to quantities that vary by as much as 30%, preferably by as much as 20%, and more preferably by as much as 10% to a reference quantity. The use of the word ‘about’ to qualify a number is merely an express indication that the number is not to be construed as a precise value.
[0202] Throughout this specification, unless the context requires otherwise, the words “comprise”, “comprises” and “comprising” will be understood to imply the inclusion of a stated step or element or group of steps or elements but not the exclusion of any other step or element or group of steps or elements.
[0203] The term “real-time” for example “displaying real-time data,” refers to the display of the data without intentional delay, given the processing limitations of the system and the time required to accurately measure the data.
[0204] As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality, for example one serving as a desirable model or representing the best of its kind.
[0205] The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.

[0206] As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
[0207] As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
Bus
[0208] In the context of this document, the term “bus” and its derivatives, while being described in a preferred embodiment as being a communication bus subsystem for interconnecting various devices including by way of parallel connectivity such as Industry Standard Architecture (ISA), conventional Peripheral Component Interconnect (PCI) and the like or serial connectivity such as PCI Express (PCIe), Serial Advanced Technology Attachment (Serial ATA) and the like, should be construed broadly herein as any system for communicating data.
In accordance with:
[0209] As described herein, ‘in accordance with’ may also mean ‘as a function of’ and is not necessarily limited to the integers specified in relation thereto.
Composite items
[0210] As described herein, ‘a computer implemented method’ should not necessarily be inferred as being performed by a single computing device, in that the steps of the method may be performed by more than one cooperating computing device.
[0211] Similarly, objects as used herein such as ‘web server’, ‘server’, ‘client computing device’, ‘computer readable medium’ and the like should not necessarily be construed as being a single object, and may be implemented as two or more objects in cooperation, such as, for example, a web server being construed as two or more web servers in a server farm cooperating to achieve a desired goal, or a computer readable medium being distributed in a composite manner, such as program code being provided on a compact disk activatable by a license key downloadable from a computer network.
Wireless:
[0212] The invention may be embodied using devices conforming to other network standards and for other applications, including, for example other WLAN standards and other wireless standards. Applications that can be accommodated include IEEE 802.11 wireless LANs and links, and wireless Ethernet.
[0213] In the context of this document, the term “wireless” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. In the context of this document, the term “wired” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a solid medium. The term does not imply that the associated devices are coupled by electrically conductive wires.

Processes:
[0214] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “processing”, “computing”, “calculating”, “determining”, “analyzing” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
Processor:
[0215] In a similar manner, the term “processor” may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A “computer” or a “computing device” or a “computing machine” or a “computing platform” may include one or more processors.
[0216] The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM.
Computer-Readable Medium:
[0217] Furthermore, a computer-readable carrier medium may form, or be included in a computer program product. A computer program product can be stored on a computer usable carrier medium, the computer program product comprising a computer readable program means for causing a processor to perform a method as described herein.
Networked or Multiple Processors:
[0218] In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked, to other processor(s). In a networked deployment, the one or more processors may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
[0219] Note that while some diagram(s) only show(s) a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
Additional Embodiments:
[0220] Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program, that is for execution on one or more processors. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium. The computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause a processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
Carrier Medium:
[0221] The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an example embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
Implementation:
[0222] It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.
Means For Carrying out a Method or Function
[0223] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a processor device, computer system, or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
Connected
[0224] Similarly, it is to be noticed that the term connected, when used in the claims, should not be interpreted as being limitative to direct connections only. Thus, the scope of the expression a device A connected to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. “Connected” may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
Embodiments:
[0225] Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[0226] Similarly it should be appreciated that in the above description of example embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description of Specific Embodiments are hereby expressly incorporated into this Detailed Description of Specific Embodiments, with each claim standing on its own as a separate embodiment of this invention.
[0227] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Specific Details
[0228] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[0229] It will be appreciated that the methods/apparatus/devices/systems described/illustrated above at least substantially provide a display device for restricting interference by ambient light.
[0230] The display device described herein, and/or shown in the drawings, is presented by way of example only and is not limiting as to the scope of the invention. Unless otherwise specifically stated, individual aspects and components of the display device may be modified, or may be substituted with known equivalents, or with as yet unknown substitutes such as may be developed in the future or such as may be found to be acceptable substitutes in the future. The display device may also be modified for a variety of applications while remaining within the scope and spirit of the claimed invention, since the range of potential applications is great, and since it is intended that the present display device be adaptable to many such variations.
Terminology
[0232] In describing the preferred embodiment of the invention illustrated in the drawings, specific terminology will be resorted to for the sake of clarity. However, the invention is not intended to be limited to the specific terms so selected, and it is to be understood that each specific term includes all technical equivalents which operate in a similar manner to accomplish a similar technical purpose. Terms such as "forward", "rearward", "radially", "peripherally", "upwardly", "downwardly", and the like are used as words of convenience to provide reference points and are not to be construed as limiting terms.
Different instances of Objects
[0233] As used herein, unless otherwise specified the use of the ordinal adjectives “first”, “second”, “third”, etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
Comprising and Including
[0234] In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word “comprise” or variations such as “comprises” or “comprising” are used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
[0235] Any one of the terms: including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
Scope of Invention
[0236] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as fall within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.
[0237] Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms.
Chronological order
[0238] For the purpose of this specification, where method steps are described in sequence, the sequence does not necessarily mean that the steps are to be carried out in chronological order in that sequence, unless there is no other logical manner of interpreting the sequence.
Markush groups
[0239] In addition, where features or aspects of the invention are described in terms of Markush groups, those skilled in the art will recognise that the invention is also thereby described in terms of any individual member or subgroup of members of the Markush group.
Industrial Applicability
[0240] It is apparent from the above, that the arrangements described are applicable to the automotive, electronics and appliance industries.

Claims
1. A display system for providing a driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the system comprising: a layered display screen for displaying images; a camera able to be mounted on, or integrated into, the vehicle, the camera being configurable to capture images of the obstructed external area; an electronic control unit for communicating images captured by the camera to the layered display screen; wherein the camera is configurable to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; the layered display screen includes an image generating layer; and the layered display screen includes one or more optical layers configured for collimating light from the image generating layer; such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
2. The display system as claimed in claim 1, wherein the camera is configurable to capture images having a wider angle of view than a static angle of view obstructed by the view-obstructing part from the viewing position of the layered display screen.
3. The display system as claimed in claim 2, wherein the images of the obstructed external area displayed by the layered display screen are dynamically adjusted by the layered display screen in accordance with a line of sight between the viewing position and the display screen.
4. The display system as claimed in claim 1, wherein the one or more optical layers of the layered display screen display an image including an image plane behind the image generating layer.
5. The display system as claimed in claim 1, wherein the depth of field of the images captured by the camera matches, or approximately matches, the depth of field of a viewer’s eye viewing an unobstructed external area proximate to the view-obstructing part of the vehicle.
6. The display system as claimed in claim 1, wherein the layered display screen includes a screen surface including a viewable area that is smaller than an image generating layer area of the image generating layer.
7. The display system as claimed in claim 1, wherein the one or more optical layers are distanced from the image generating layer.
8. The display system as claimed in claim 1, wherein the image generating layer is located inside the one or more optical layers.
9. The display system as claimed in claim 1, wherein the one or more optical layers comprise a collimator configured to provide to the driver collimated light from the image generating layer.
10. The display system as claimed in claim 1, wherein the one or more optical layers comprise a magnifying lens.
11. The display system as claimed in claim 1, wherein one or more layers of the layered display screen include an anti-reflective coating.
12. The display system as claimed in claim 1, wherein the display system includes a housing for housing one or more selected from the image generating layer and the optical layer.
13. The display system as claimed in claim 12, wherein one or more interior surfaces of the housing comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
14. The display system as claimed in claim 1, wherein the electronic control unit is configured to receive and communicate to the layered display screen images from a secondary camera.
15. The display system as claimed in claim 1, wherein the electronic control unit is configured to adjust and/or crop and/or enlarge by zooming, the images captured by the camera.
16. The display system as claimed in claim 1, wherein the display system is adapted to identify objects in the external area image captured by the camera and the layered display screen is configured to display the identified objects to a user using identification signs or icons.
17. A layered display screen for displaying images and for providing dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, wherein: the layered display screen includes an image generating layer; the layered display screen includes one or more optical layers for collimating light from the image generating layer; the layered display screen is adapted to receive images from an electronic control unit configured for communicating images captured by a camera, the images capturing the obstructed external area and having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
18. A method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of: providing a layered display screen for displaying images; mounting a camera on, or integrating the camera into, the vehicle, the camera being configurable to capture images of the obstructed external area; providing an electronic control unit for communicating images captured by the camera to the layered display screen; wherein the camera is configured to capture images having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; the layered display screen includes an image generating layer; the layered display screen includes one or more optical layers for collimating light from the image generating layer; such that the displayed images of the obstructed external area dynamically adjust in accordance with the viewing position of the layered display screen.
19. A method of providing driver dynamic visibility of an external area obstructed by a view-obstructing part of a vehicle, the method including the steps of: receiving an image from a camera, the image having a wider angle of view than a view obstructed by the view-obstructing part from a viewing position; generating an image from an image generating layer; and passing the generated image through a collimating optical layer in order to collimate the light passing through the optical layer.
20. The method as claimed in claim 19, wherein the optical layer includes a length or area aspect that is smaller than the corresponding length or area aspect of the image generating layer.
21. A display device for restricting interference by ambient light, wherein: the device includes a screen viewable by a viewer, the screen defining a screen surface including a linear polariser; the device includes one or more inner layers underneath the screen surface, the one or more inner layers including an image generating layer; the screen surface is configured to be distanced from the inner layers by a predetermined distance; and a viewable area of the screen surface is smaller than an image generating layer area of the image generating layer; such that the screen surface and the one or more inner layers co-operate to reduce light contamination within the device.
22. The display device of claim 21, wherein the device further includes an intermediate layer that is adjacent to, or integral with, the screen, the intermediate layer including one or more optical layers.
23. The display device of claim 22, wherein the one or more optical layers includes at least one converging lens.
24. The display device of claim 22, wherein the one or more optical layers includes a collimator.
25. The display device of claim 22, wherein the one or more optical layers includes a magnifying lens.
26. The display device of claim 22, wherein one or more of the one or more optical layers includes an anti-reflective coating on an inner side.
27. The display device of claim 26, wherein an outermost layer of the device has the anti-reflective coating.
28. The display device of claim 19, wherein one or more interior surfaces of the device comprise an anti-reflective substance and/or are coated with an anti-reflective coating.
29. The display device of claim 28, wherein the anti-reflective substance and/or coating comprises a light absorbing substance.
30. The display device of claim 21, wherein the display device further includes an inner linear polariser adjacent to an inner surface of the image generating layer.
31. The display device of claim 21, wherein the display device further includes a backlighting layer.
32. The display device of claim 31, wherein the device further includes a directional backlight film below the inner linear polariser and above the backlighting layer, the directional backlight film being configured to control the angular spread of light from the backlighting layer.
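The dynamic view adjustment recited in claims 1-3 and in the methods of claims 18-19 (cropping a wider-angle camera frame so that the displayed region tracks the driver's viewing position) can be illustrated in software. The following sketch is not taken from the specification: the function name, the simple pinhole-geometry model, and all parameter values are assumptions for illustration only, and the eye position is assumed to come from some external tracking source.

```python
# Illustrative sketch only: one possible way to pick the crop window of a
# wide-angle camera frame so the displayed view matches the area hidden by a
# view-obstructing part (e.g. an A-pillar) from the current eye position.
# All geometry and parameter names are assumptions, not from the patent.

import math
from dataclasses import dataclass

@dataclass
class CropWindow:
    x: int       # left edge of crop, in camera pixels
    y: int       # top edge of crop, in camera pixels
    width: int
    height: int

def crop_for_viewpoint(eye_offset_x_m: float,
                       eye_distance_m: float,
                       pillar_width_m: float,
                       camera_fov_deg: float,
                       frame_width_px: int,
                       frame_height_px: int) -> CropWindow:
    """Select the sub-region of the camera frame covering the obstructed area.

    A lateral eye offset shifts the obstructed sight line by an angle
    theta = atan(offset / distance); the crop window slides by the matching
    fraction of the camera's horizontal field of view.
    """
    # Angular width of the obstruction as seen from the viewing position.
    obstructed_deg = 2.0 * math.degrees(
        math.atan((pillar_width_m / 2.0) / eye_distance_m))
    # Angular shift of the sight line caused by the lateral eye offset.
    shift_deg = math.degrees(math.atan(eye_offset_x_m / eye_distance_m))

    px_per_deg = frame_width_px / camera_fov_deg
    crop_w = int(obstructed_deg * px_per_deg)
    # Centre the crop, then slide it opposite to the eye movement and clamp
    # it to the frame bounds.
    centre = frame_width_px / 2.0 - shift_deg * px_per_deg
    x = int(max(0, min(frame_width_px - crop_w, centre - crop_w / 2.0)))
    return CropWindow(x=x, y=0, width=crop_w, height=frame_height_px)
```

In use, the electronic control unit would re-evaluate the crop each frame as the tracked eye position changes, which is one way the displayed image could "dynamically adjust in accordance with the viewing position" as the claims recite.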
PCT/AU2022/050626 2021-06-21 2022-06-21 Display device, display system and methods therefor WO2022266704A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163213184P 2021-06-21 2021-06-21
US202163213179P 2021-06-21 2021-06-21
US63/213,179 2021-06-21
US63/213,184 2021-06-21

Publications (1)

Publication Number Publication Date
WO2022266704A1 2022-12-29

Family

ID=84543791

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2022/050626 WO2022266704A1 (en) 2021-06-21 2022-06-21 Display device, display system and methods therefor

Country Status (1)

Country Link
WO (1) WO2022266704A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7058252B2 (en) * 2001-08-06 2006-06-06 Ocuity Limited Optical switching apparatus
EP2003019A2 (en) * 2007-06-13 2008-12-17 Aisin AW Co., Ltd. Driving assist apparatus for vehicle
US20150002642A1 (en) * 2013-07-01 2015-01-01 RWD Consulting, LLC Vehicle visibility improvement system
WO2016077309A2 (en) * 2014-11-12 2016-05-19 Corning Incorporated Contrast enhancement sheet and display device comprising the same
US20210078495A1 (en) * 2017-04-12 2021-03-18 Omron Corporation Image display unit
US10953797B2 (en) * 2018-04-05 2021-03-23 Toyota Motor Engineering & Manufacturing North America, Inc. Cloaking devices with converging lenses and coherent image guides and vehicles comprising the same



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22826879

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE