EP3877824A2 - Display presentation across plural display surfaces - Google Patents

Display presentation across plural display surfaces

Info

Publication number
EP3877824A2
EP3877824A2 (application EP19806075.8A)
Authority
EP
European Patent Office
Prior art keywords
display
display surface
image
edge portion
display system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19806075.8A
Other languages
German (de)
French (fr)
Inventor
Ziyad IBRAHIM
Glenn Frederick Evans
Nicholas Fredrick Ray
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Publication of EP3877824A2 publication Critical patent/EP3877824A2/en
Pending legal-status Critical Current

Classifications

    • G02F1/13336 Combining plural substrates to produce large-area displays, e.g. tiled displays
    • G02F1/133526 Lenses, e.g. microlenses or Fresnel lenses
    • G06F1/1616 Portable computers with several enclosures having relative motions, with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration
    • G06F1/1641 Display arrangement in which the display is formed by a plurality of foldable display components
    • G06F1/1652 Display arrangement in which the display is flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F1/1677 Detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. display lid position with respect to main body
    • G06F1/1686 Integrated I/O peripheral being an integrated camera
    • G06F1/1694 Integrated I/O peripheral being a single or a set of motion sensors for pointer control or gesture input
    • G06F3/013 Eye tracking input arrangements
    • G06F3/1431 Controlling a plurality of local displays using a single graphics controller
    • G06F3/1446 Display composed of modules, e.g. video walls
    • G09G5/14 Display of multiple viewports
    • G09G2300/026 Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
    • G09G2320/0233 Improving the luminance or brightness uniformity across the screen
    • G09G2340/045 Zooming at least part of an image, i.e. enlarging it or shrinking it
    • G09G2354/00 Aspects of interface with display user
    • G09G2380/02 Flexible displays
    • H04M1/0214 Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H04M1/0243 Changing the operational status of the telephone set using the relative angle between housings
    • H04M1/0268 Display module assembly including a flexible display panel
    • H04M1/22 Illumination; Arrangements for improving the visibility of characters on dials
    • H04M2250/16 Telephonic subscriber devices including more than one display unit
    • H04M2250/52 Telephonic subscriber devices including functional features of a camera
    • H10K2102/311 Flexible OLED

Definitions

  • One aspect of this disclosure is directed to an electronic display system comprising first and second display surfaces and a computer.
  • Each of the first and second display surfaces is configured to receive and transmit display light from an emissive element.
  • Each of the first and second display surfaces includes both a flat portion and an edge portion non- coplanar to the flat portion.
  • the computer is configured to control the emissive elements of the first and second display surfaces so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface.
  • one or more horizontal or vertical rows of the display image rendered on the flat portion of a display surface are rendered duplicatively on an edge portion.
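The duplicative edge rendering described above can be sketched in a few lines of Python (a toy illustration; the function and parameter names are hypothetical and not from the disclosure):

```python
# Toy sketch of duplicative rendering: the trailing row(s) of the section
# rendered on the flat portion are repeated on the edge portion, so the image
# appears continuous across the bend between display surfaces.
def render_with_duplicated_rows(section_rows, n_duplicated=1):
    flat = section_rows                    # full section on the flat portion
    edge = section_rows[-n_duplicated:]    # rows repeated on the edge portion
    return flat, edge
```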
  • FIGS. 1 and 2 show aspects of example electronic display systems.
  • FIGS. 3A, 3B, and 3C show aspects of an example electronic display system featuring hinged display surfaces arranged over curved organic light-emitting diode (OLED) matrices.
  • FIG. 4 shows aspects of an example electronic display system featuring hinged display surfaces arranged over liquid-crystal display (LCD) matrices.
  • FIG. 5 shows aspects of estimation of a user’s ocular position in an example electronic display system.
  • FIG. 6 illustrates an example method to present a display image on an electronic display system.
  • FIGS. 7, 8A, and 8B show aspects of an example implementation of the method of FIG. 6.
  • FIG. 1 shows aspects of an example electronic display system 10.
  • the electronic display system as illustrated takes the form of a foldable tablet computer.
  • the electronic display system may take the form of a laptop computer or dual-screen smart phone.
  • Electronic display systems of numerous other types, sizes, and form factors are equally envisaged.
  • electronic display system 10 is shown presenting a display image 12.
  • the nature of the display image is not particularly limited.
  • the display image may include a user interface of an application executing on the electronic display system.
  • the display image may include text.
  • the display image may be one of a sequence of video frames.
  • display image 12 comprises a set of discrete loci {Pi} arranged in three-dimensional (3D) space.
  • each locus Pi may correspond to a pixel of a display matrix used to form the display image.
  • each locus Pi may be associated with corresponding digital color values Ri, Gi, and Bi, which define the relative brightness of that locus in each of three different color channels. All of the loci of display image 12 may be coplanar in some examples, but that condition is not strictly necessary.
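The locus representation above can be sketched as a simple data structure (names and the placeholder white initialization are hypothetical, not from the disclosure):

```python
from dataclasses import dataclass

# Each locus P_i is a point in 3-D space carrying per-channel brightness
# values R_i, G_i, B_i, as described in the disclosure.
@dataclass
class Locus:
    x: float
    y: float
    z: float
    r: int  # red channel brightness, 0-255
    g: int  # green channel brightness, 0-255
    b: int  # blue channel brightness, 0-255

# A display image is simply the collection of its loci. Here all loci are
# coplanar (z = 0) and initialized to white, though neither is required.
def make_flat_image(width, height):
    return [Locus(float(x), float(y), 0.0, 255, 255, 255)
            for y in range(height) for x in range(width)]
```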
  • electronic display system 10 is a hinged device, and display image 12 is bent along the axis of hinge 14 to an angle a.
  • electronic display system 10 includes a first display surface 16A and a second display surface 16B.
  • Each display surface 16 (e.g., 16A and 16B in FIG. 1) is configured to receive and transmit display light from one or more light-emissive elements arranged beneath that display surface. In this manner, each display surface is configured to present at least a section of display image 12.
  • the first and second display surfaces are separated by an adjustable angle a, via hinge 14.
  • Some example electronic display systems may include only one display surface, or more than two.
  • the plural display surfaces may be the same or different from each other with respect to dimensions and/or technology.
  • the plural display surfaces need not be joined by a hinge, but may simply abut each other in a tiled arrangement, and thereby present a unitary display image.
  • FIG. 2 shows an example display system 10’ configured in this manner.
  • a single display surface 16 may be configured to present a display image 12 in its entirety.
  • a display image may be partitioned into a plurality of sections 18 (e.g., sections 18A and 18B), and each display surface 16 may be configured to present a different, corresponding section.
  • seamless, gapless presentation across the sections may be desired.
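The partitioning idea above can be illustrated with a toy Python sketch (function name hypothetical) in which an image, modelled as rows of pixels, is split column-wise between two abutting display surfaces:

```python
# Toy partition of a display image into two sections, one per display
# surface: columns [0, split) go to the first surface, [split, end) to the
# second, so together the sections reproduce the full image without a gap.
def partition_columns(image_rows, split):
    left = [row[:split] for row in image_rows]
    right = [row[split:] for row in image_rows]
    return left, right
```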
  • FIG. 3A shows an example display surface 16 and associated display componentry of electronic display system 10.
  • FIG. 3A shows an organic light-emitting diode (OLED) matrix 20, in which a plurality of individually addressable, emissive OLED elements 22 are arranged beneath the display surface.
  • the elements 22 of OLED matrix 20 are bordered by a nonemissive area 24, which may be configured to seal or secure the OLED matrix, or provide electrical connections for addressing the elements.
  • the nonemissive area of the OLED matrix may be bent or curved away from the primary viewing plane, which is defined by flat portion 28A of display surface 16.
  • the display surface is similarly bent or curved to follow substantially the contour of the OLED matrix.
  • display surface 16 includes, in addition to flat portion 28, an edge portion 30, which is non-coplanar (i.e., nonparallel in an areal sense) to the flat portion.
  • the edge portion follows the smooth curve of the OLED matrix.
  • the edge portion may comprise any number of curved and/or plane-bevel facets non-coplanar to the flat portion.
  • two hinged or otherwise abutting display surfaces 16, configured in this manner, may be used to present a continuous, essentially unbroken display image 12, where a different section 18 of the display image (e.g., 18A and 18B in FIG. 3B) is presented on each display surface.
  • hinge 14 is situated between adjacent edge portions 30 and is configured to pivotally couple display surface 16A to display surface 16B.
  • each display surface 16 is the outer surface of a glass or transparent-polymer cover layer 32 of substantially constant thickness.
  • the thickness and/or structure of the cover layer may be varied in order to impart desired ray-guiding properties.
  • the cover layer may be configured so as to collect the emission from the OLED elements 22 below edge portion 30 and to release the emission in a direction normal to the primary viewing plane. Used in conjunction with curved OLED matrix 20, this approach may be used to provide substantially distortion-free image display all the way to the visible edge of each display surface 16.
  • FIG. 4 shows aspects of an electronic display system 10” based on liquid-crystal display (LCD) matrices 34.
  • a suitably engineered cover layer 32” may be used to re-image the pixel elements 22” of the LCD display so that the pixel elements appear very close to the shared edge of each display surface 16.
  • image distortion resulting from the re-imaging may be corrected preemptively during rendering of the display image (vide infra).
  • electronic display system 10 includes a computer 36. Operatively coupled to the display componentry of the electronic display system, the computer may execute the rendering of display image 12 in addition to any other process described herein.
  • the computer may include at least one processor 38 and associated computer memory 40.
  • the computer may be configured to individually address and control the emission from each OLED element 22.
  • the same computer 36 may address and control plural associated display matrices. This tactic enables plural sections 18 of the display image to be rendered concertedly.
  • the computer may be configured to render a plurality of sections of a display image for presentation on a corresponding plurality of display surfaces of the electronic display system.
  • display-image rendering is responsive to one or more geometric inputs to computer 36 of electronic display system 10.
  • the geometric inputs may reflect (1) the configuration of the electronic display system, including the layout and conformation among the various display surfaces; (2) the orientation of the electronic display system as configured; and (3) one or more ocular positions O of the user in a frame of reference of the electronic display system.
  • each geometric input may be furnished by a sensor arranged in the electronic display system and coupled operatively to the computer.
  • each geometric input may be estimated heuristically by the computer based on the current usage scenario of the electronic display system.
  • each geometric input may be evaluated and re-evaluated in real time as the electronic display system is used, so that the computer is able to dynamically adjust the display-image presentation in response to changing geometry. Example geometric inputs are described below.
  • electronic display system 10 includes a hinge-angle sensor 42 and at least one abutment sensor 44.
  • the hinge-angle sensor is configured to furnish an output responsive to the angle of separation a between flat portion 28A and flat portion 28B of respective display surfaces 16A and 16B.
  • Each abutment sensor 44 is configured to furnish an output responsive to abutment of display surface 16A to any other display surface 16.
  • the hinge-angle sensor may be potentiometer-based, and the abutment sensor may be an electrostatic or Hall-effect sensor; in other examples, various other sensor technologies may be used.
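For a potentiometer-based hinge-angle sensor of the kind mentioned above, the raw reading might be converted to the separation angle roughly as follows (a minimal sketch; the ADC resolution and mechanical range are illustrative assumptions, not from the disclosure):

```python
# Hypothetical conversion of a potentiometer ADC count to the hinge
# separation angle: the count is clamped to the converter's range and mapped
# linearly onto the mechanical travel of the hinge, in degrees.
def hinge_angle(adc_count, adc_max=4095, angle_min=0.0, angle_max=360.0):
    frac = max(0, min(adc_count, adc_max)) / adc_max   # clamp, normalize
    return angle_min + frac * (angle_max - angle_min)
```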
  • Electronic display system 10 includes an inertial measurement unit (IMU) 46, magnetometer 48, and palm sensor 50.
  • the IMU may comprise either or both of a multi-axis accelerometer and a multi-axis gyroscope configured to sense, respectively, translational and rotational movement of the electronic display system.
  • the magnetometer may be configured to sense the absolute orientation of the electronic display system based on a geomagnetic measurement. Alone or in combination with the output from hinge-angle sensor 42, the outputs from the IMU and magnetometer are responsive to the orientation of each display surface 16 of the electronic display system. Accordingly, such output may be furnished to computer 36.
  • palm sensor 50 may be configured to sense the location of the user’s palm in scenarios in which the electronic display system is being held in the user’s hand.
  • the computer may be configured to estimate the orientation of each display surface heuristically, based on output from the palm sensor.
  • feature imaging based on a world-facing camera may be used to determine the orientation of the electronic display system.
  • Electronic display system 10 includes an optional user-facing camera 52 configured to acquire an ocular image of the user. More particularly, the user-facing camera is configured to image the user’s pupils, eyes, face, or head in real time. As shown in FIG. 5, user-facing camera 52 includes an on-axis lamp 54 and an off-axis lamp 56, these terms referring to the direction of illumination with respect to the optical axis Z of the user-facing camera. Each lamp may comprise a light-emitting diode (LED) or diode laser, for example, which emits infrared (IR) or near-IR illumination in a high-sensitivity band of the user-facing camera.
  • Off-axis illumination may create a specular glint 58 that reflects from the cornea 60 of the user’s eye. Off-axis illumination may also be used to illuminate the eye for a ‘dark pupil’ effect, where pupil 62 appears darker than the surrounding iris 64. By contrast, on-axis illumination may be used to create a ‘bright pupil’ effect, where the pupil appears brighter than the surrounding iris. More specifically, illumination of the retroreflective tissue of retina 66 may reflect back through the pupil, forming a bright image 68.
  • Ocular image data from user-facing camera 52 may be conveyed to computer 36. There, the data may be processed to resolve such features as the pupil center, pupil outline, and/or one or more specular glints 58. The locations of such features in the image data may be used as input parameters in a model, e.g., a polynomial model, that relates feature position to an estimate of the right or left ocular position O of the user.
  • the ocular position O may correspond to a pupil position itself.
  • the ocular position may correspond to a position of the dominant eye of the user, or to a position that bisects a line segment joining the right and left eyes, for instance.
  • the user-facing camera may be configured to recognize the user’s face, or head, and ocular positions may be estimated based on a suitable anatomical model. In examples in which two or more users are detected in the ocular image, additional ocular positions may be computed.
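A polynomial model of the kind mentioned above might look as follows (a hedged sketch: the choice of terms, and the idea that coefficients come from a separate calibration step, are illustrative assumptions rather than details from the disclosure):

```python
import numpy as np

# Hypothetical polynomial gaze model: the offset of the pupil center from a
# corneal glint in the ocular image is expanded into low-order polynomial
# terms, each weighted by a calibration coefficient, to estimate the ocular
# position O in the display's frame of reference.
def ocular_position(pupil, glint, coeffs_x, coeffs_y):
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    terms = np.array([1.0, dx, dy, dx * dy, dx ** 2, dy ** 2])
    return float(terms.dot(coeffs_x)), float(terms.dot(coeffs_y))
```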
  • user-facing camera 52 may be omitted in some implementations.
  • computer 36 may be configured to estimate the ocular positions based on a series of heuristics.
  • the user may be expected to view electronic display system 10 from a side opposite to the side that the operating system recognizes as the ‘top’.
  • the palm location may be sensed by palm sensor 50 and used to predict the likely vantage point of the user.
  • the user may be expected to view the display screen from the side which is opposite to the side where the palm is located.
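The palm heuristic above amounts to a simple lookup (a toy illustration; the side labels and function name are hypothetical):

```python
# Absent an ocular image, the user's vantage point is assumed to lie on the
# side of the device opposite the sensed palm location.
OPPOSITE_SIDE = {"left": "right", "right": "left", "top": "bottom", "bottom": "top"}

def estimated_viewing_side(palm_side):
    return OPPOSITE_SIDE[palm_side]
```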
  • FIG. 6 illustrates aspects of an example method 70 to present on an electronic display system 10 a display image 12 viewable from an ocular position O.
  • the display image may be presented on one or more display surfaces 16 of the electronic display system.
  • the configuration among two or more display surfaces 16 of electronic display system 10 is sensed by computer 36.
  • the configuration may include layout information defining the relative arrangement of the display surfaces.
  • the configuration may include conformational information defining the relative orientation of the display surfaces as determined by the hinge angle a.
  • the configuration may be sensed via output from hinge-angle sensor 42 and/or abutment sensors 44, in view of stored data reflecting the dimensions and static configuration of the various display surfaces.
  • the hinge angle a, in some implementations, affects not only the orientation of a hinged display surface, but also the viewable size of that surface as viewed from a given ocular position (vide infra).
  • the overall orientation of the one or more display surfaces 16 of electronic display system 10 is sensed.
  • the orientation may be sensed by computer 36 based on the output of IMU 46, magnetometer 48, and/or palm sensor 50, for example.
  • one or more ocular positions O of one or more users of electronic display system 10 are estimated in the frame of reference of the display image 12.
  • the ocular positions may be estimated by acquisition of an ocular image of the one or more users and subsequent analysis of the ocular image.
  • ocular positions may be estimated heuristically— e.g., based on palm positioning on the electronic display system.
  • control loop in which computer 36 iterates through each locus P of display image 12 and renders that locus at appropriate coordinates (U, V) of any display surface 16 on which the locus would be visible from ocular position O.
  • the control loop at 78 may be executed for each display surface 16 of electronic display system 10, in sequence or in parallel.
  • a locus P is mapped to coordinates (II, V) of display surface 16 where a straight line OP passing through O and P intersects the display surface. If the point of intersection of OP is within the boundaries of the display surface, then locus P will be presented on that display surface. Otherwise, P will not be presented on that display surface, but may be presented on another display surface.
  • method 70 effectively partitions display image 12 into one or more sections 18 based on the position of that locus in the display image, and further based on the geometric inputs specifying configuration, orientation, and ocular position.
  • the angle Q of the straight line OP is computed relative to the direction E of emergence of light from display surface 16.
  • the direction of emergence may be normal to the display surface at coordinates (U, V). More generally, the direction of emergence may be computed based on the orientation of the emissive element 22 and the ray-directing properties (e.g ., refractive index and thickness) of cover layer 32.
  • each OLED element 22 emits a Lambertian distribution centered normal (i.e., orthogonal) to the matrix.
  • the energy emitted at non-orthogonal angles ( Q > 0) may be computed in terms of the cosine of Q.
  • the projected off-orthogonal energy may be increased by the inverse of this value in order to compensate the observed brightness for the angle of observation.
  • the angle may also be useful, inter alia , for mapping display surface coordinates ( U , V) to a corresponding pixel position (X, Y).
  • the straight line OP is extrapolated in order to determine which pixel position (X, Y ) of display matrix 20 correspond to mapped coordinates (//, V) of display surface 16.
  • the extrapolation may be computed according to Snell’s Law applied at the display surface, in view of the refractive index of cover layer 32. Such extrapolation is appropriate in examples in which light from an emissive element 22 refracts through cover layer 32 en route to coordinates (//, V).
  • the corresponding pixel position (X, Y) is illuminated to a brightness which is based on the angle θ computed relative to the direction of emergence of light from display surface 16. Illumination of the pixel position may include actively increasing the level of illumination for coordinates (U, V) with increasing angle θ, for a given desired brightness of locus P. In some examples, the active illumination at pixel position (X, Y) varies as 1 / cos θ.
  • Steps 80 through 86 are now repeated, in sequence or in parallel, for all remaining loci P of display image 12.
  • method 70 may return to 72 for re-evaluation of the configuration, orientation, and ocular position, prior to presentation of the next display image. Accordingly, any change in the geometric inputs—the hinge angle or ocular position, for example—may be reflected in the subsequent presentation of the display image.
  • FIG. 7 demonstrates the implementation of method 70 under conditions of changing geometric inputs.
  • FIG. 7 shows the rendering of the same locus P of display image 12 in different usage scenarios.
  • When hinge 14 is set at an oblique angle and the display image is observed from ocular position O1, P is rendered on flat portion 28B of display surface 16B. As the hinge angle is opened further, more of the edge portions of both display surfaces become visible, extending the area to which the display image is rendered further into the edge portions.
  • O1P is rotated relative to its initial position, and P is now rendered on edge portion 30B of display surface 16B, even though the ocular position O1 has not changed.
  • the display image, when viewed from ocular position O1 of a user, flows continuously from flat portion 28B of display surface 16B to the edge portion 30B of the same display surface.
  • one or more rows of the display image rendered on flat portion 28B may be rendered duplicatively on (e.g., shifted over to) edge portion 30B.
  • the ‘rows’ of the display image may run horizontally or vertically with respect to the frame of reference of the electronic display system.
  • Duplicative rendering of display content from flat portion 28B onto edge portion 30B is also shown in FIG. 8A.
  • the row of pixel positions 88 from the flat portion are duplicated on the edge portion of the same display surface.
  • FIG. 7 also demonstrates the effect of method 70 for a case in which the user’s ocular position shifts relative to electronic display system 10.
  • P is rendered on edge portion 30B for observation from O1.
  • when the ocular position shifts, the display image flows continuously to edge portion 30A of adjacent display surface 16A.
  • one or more rows of the display image originally rendered on flat portion 28B may be rendered duplicatively on edge portion 30A.
  • Duplicative rendering of display content from flat portion 28B onto edge portion 30A is also shown in FIG. 8B.
  • the row of pixel positions 88 from the flat portion are duplicated on the edge portion of the adjacent display surface.
  • duplication of pixel rows may be concurrent, such that the duplicated rows are displayed at the same time.
  • duplication of pixel rows may be from frame to frame, such that a portion of a frame is displayed using one row of pixels at an earlier time (e.g., Frame0), and then a corresponding portion of a subsequent frame is displayed using another row of pixels at a later time (e.g., Frame1).
  • One aspect of this disclosure is directed to an electronic display system comprising: a first display surface configured to receive and transmit display light from a first emissive element, the first display surface having a first flat portion and a first edge portion non-coplanar to the first flat portion; a second display surface configured to receive and transmit display light from a second emissive element, the second display surface having a second flat portion and a second edge portion non-coplanar to the second flat portion; and a computer configured to control the first and second emissive elements so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface, such that one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first or second edge portion.
  • the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first edge portion. In some implementations, the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the second edge portion. In some implementations, the display image, when viewed from a first ocular position of a user, flows continuously from the first flat portion to the first or second edge portion. In some implementations, the first and second sections are rendered so that the display image, when viewed from a second ocular position of the user, flows continuously from the second flat portion to the first or second edge portion. In some implementations, the computer is further configured to partition the display image into the first and second sections based on an estimate of the first ocular position.
  • the electronic display system further comprises a hinge situated between the first and second edge portions and configured to pivotally couple the first display surface to the second display surface, and a sensor furnishing an output responsive to an angle of separation between the first and second flat portions; here the computer is further configured to partition the display image into the first and second sections based on the output.
  • the electronic display system further comprises a camera configured to acquire an ocular image of the user, and the computer is further configured to estimate the ocular position by analysis of the ocular image.
  • the electronic display system further comprises a sensor furnishing an output responsive to one or more of an orientation of the first display surface and an abutment of the first display surface to another display surface, and the computer is further configured to partition the display image into the first and second sections based on the output.
  • Another aspect of this disclosure is directed to a method to present on a display surface of an electronic display system a display image viewable from an ocular position O, the method comprising: rendering a locus P of the display image by illuminating coordinates (U, V) of an edge portion of the display surface where a straight line OP passing through O and P intersects the edge portion, including increasing active illumination of the coordinates (U, V) with increasing angle of the straight line OP relative to a direction of emergence of light from the coordinates (U, V).
  • the direction of emergence is normal to a surface of the edge portion at the coordinates (U, V), and the active illumination is increased with increasing angle of OP relative to the normal of the display surface.
  • increasing the active illumination includes increasing by a factor 1 / cos θ, where θ is the angle of OP relative to the direction of emergence of light from the coordinates (U, V).
  • illuminating the coordinates (U, V) includes refracting light from an emissive element associated with the coordinates.
  • the ocular position O is a position of a dominant eye of a user of the electronic display system, or a position that bisects a line segment joining right and left eyes of a user of the electronic display system.
  • the method further comprises estimating the ocular position O heuristically.
  • the method further comprises acquiring an ocular image of a user of the electronic display system, and the ocular position O is estimated by analysis of the ocular image.
  • the ocular position O is a pupil position.
  • Another aspect of this disclosure is directed to a method to present on first and second display surfaces of an electronic display system a display image viewable from an ocular position O, the method comprising: sensing a configuration of the first and second display surfaces; estimating the ocular position O; rendering a locus P of the display image by illuminating coordinates (U, V) of the first display surface where a straight line OP passing through O and P intersects the first display surface; and re-rendering the locus P of the display image by illuminating different coordinates (U’, V’) of the first or second display surface where the straight line OP intersects the first or second display surface, responsive to a change in OP.
  • the change in OP is responsive to a change in O.
  • the first and second display surfaces are held apart by an adjustable angle via a hinge situated between the first and second display surfaces, and the change in OP is responsive to a change in the angle.
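The concurrent and frame-to-frame duplication schemes described in the bullets above can be sketched in a few lines. This is an illustrative model only, not the disclosed implementation; the row-list framebuffer, the function name, and the mode strings are assumptions:

```python
def edge_rows(flat, n_dup, mode, frame_index=0):
    """Rows shown on the edge portion under two duplication schemes.

    'flat' is a list of pixel rows for the flat portion; its last
    n_dup rows border the edge. In 'concurrent' mode the edge shows
    those rows in the same frame as the flat portion. In
    'frame_to_frame' mode it shows them only on alternate frames:
    blank on even frames (Frame0), duplicated rows on odd frames
    (Frame1).
    """
    if mode == "concurrent":
        return [row[:] for row in flat[-n_dup:]]
    if mode == "frame_to_frame":
        if frame_index % 2 == 1:
            return [row[:] for row in flat[-n_dup:]]
        return [[0] * len(flat[0]) for _ in range(n_dup)]
    raise ValueError("unknown mode: " + mode)
```

Either scheme keeps content near the seam visible from off-axis vantage points; which rows to duplicate would, in practice, follow from the ray mapping described in the detailed description.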


Abstract

An electronic display system comprises first and second display surfaces and a computer. Each of the first and second display surfaces is configured to receive and transmit display light from an emissive element. Each of the first and second display surfaces includes both a flat portion and an edge portion non-coplanar to the flat portion. The computer is configured to control the emissive elements of the first and second display surfaces so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface. In this example, one or more rows of the display image rendered on the flat portion of a display surface are rendered duplicatively on an edge portion.

Description

DISPLAY PRESENTATION ACROSS PLURAL DISPLAY SURFACES
BACKGROUND
[0001] Electronic display technology has undergone rapid growth in recent years. Electronic display systems have become larger, flatter, brighter, more power-efficient, and capable of true-to-life color at high resolution. On the other hand, display technology does not currently leverage the advantages of modular design.
SUMMARY
[0002] One aspect of this disclosure is directed to an electronic display system comprising first and second display surfaces and a computer. Each of the first and second display surfaces is configured to receive and transmit display light from an emissive element. Each of the first and second display surfaces includes both a flat portion and an edge portion non- coplanar to the flat portion. The computer is configured to control the emissive elements of the first and second display surfaces so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface. In this example, one or more horizontal or vertical rows of the display image rendered on the flat portion of a display surface are rendered duplicatively on an edge portion.
[0003] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] FIGS. 1 and 2 show aspects of example electronic display systems.
[0005] FIGS. 3A, 3B, and 3C show aspects of an example electronic display system featuring hinged display surfaces arranged over curved organic light-emitting diode (OLED) matrices.
[0006] FIG. 4 shows aspects of an example electronic display system featuring hinged display surfaces arranged over liquid-crystal display (LCD) matrices.
[0007] FIG. 5 shows aspects of estimation of a user’s ocular position in an example electronic display system.
[0008] FIG. 6 illustrates an example method to present a display image on an electronic display system.
[0009] FIGS. 7, 8A, and 8B show aspects of an example implementation of the method of FIG. 6.
DETAILED DESCRIPTION
[0010] This disclosure is presented by way of example and with reference to the drawing figures listed above. Components, process steps, and other elements that may be substantially the same in one or more of the figures are identified coordinately and are described with minimal repetition. It will be noted, however, that elements identified coordinately may also differ to some degree. It will be further noted that the figures are schematic and generally not drawn to scale. Rather, the various drawing scales, aspect ratios, and numbers of components shown in the figures may be purposely distorted to make certain features or relationships easier to see.
[0011] FIG. 1 shows aspects of an example electronic display system 10. The electronic display system as illustrated takes the form of a foldable tablet computer. In other examples, the electronic display system may take the form of a laptop computer or dual-screen smart phone. Electronic display systems of numerous other types, sizes, and form factors are equally envisaged. In FIG. 1, electronic display system 10 is shown presenting a display image 12. The nature of the display image is not particularly limited. In some examples, the display image may include a user interface of an application executing on the electronic display system. In some examples, the display image may include text. In some examples, the display image may be one of a sequence of video frames.
[0012] Generally speaking, display image 12 comprises a set of discrete loci {Pi} arranged in three-dimensional (3D) space. In pixel-display implementations, each locus Pi may correspond to a pixel of a display matrix used to form the display image. In color-display implementations, each locus Pi may be associated with corresponding digital color values Ri, Gi, and Bi, which define the relative brightness of that locus in each of three different color channels. All of the loci of display image 12 may be coplanar in some examples, but that condition is not strictly necessary. In FIG. 1, for instance, electronic display system 10 is a hinged device, and display image 12 is bent along the axis of hinge 14 to an angle α.
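As a purely illustrative data model for this paragraph (the class name and fields are assumptions, not part of the disclosure), the loci and their color values might be represented as:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Locus:
    """One discrete locus Pi of a display image: a position in 3D
    space plus digital color values (Ri, Gi, Bi) defining relative
    brightness in three color channels."""
    x: float
    y: float
    z: float
    r: int
    g: int
    b: int

# A display image is then a collection of loci; they need not be
# coplanar (e.g., the image may be bent along a hinge axis).
DisplayImage = List[Locus]
```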
[0013] Continuing in FIG. 1, electronic display system 10 includes a first display surface 16A and a second display surface 16B. Each display surface 16 (e.g., 16A and 16B in FIG. 1) is configured to receive and transmit display light from one or more light-emissive elements arranged beneath that display surface. In this manner, each display surface is configured to present at least a section of display image 12. In FIG. 1, the first and second display surfaces are separated by an adjustable angle α via hinge 14.
[0014] Implementations that differ from FIG. 1 are also envisaged. Some example electronic display systems may include only one display surface, or more than two. In electronic display systems having plural display surfaces, the plural display surfaces may be the same or different from each other with respect to dimensions and/or technology. In some examples, the plural display surfaces need not be joined by a hinge, but may simply abut each other in a tiled arrangement, and thereby present a unitary display image. FIG. 2 shows an example display system 10’ configured in this manner.
[0015] In some examples, a single display surface 16 may be configured to present a display image 12 in its entirety. In other examples, as shown in FIGS. 1 and 2, a display image may be partitioned into a plurality of sections 18 (e.g., sections 18A and 18B), and each display surface 16 may be configured to present a different, corresponding section. In those examples in which a display image is presented in sections, seamless, gapless presentation across the sections may be desired. This aspect is now developed with reference to FIG. 3A, which shows an example display surface 16 and associated display componentry of electronic display system 10. In particular, FIG. 3A shows an organic light-emitting diode (OLED) matrix 20, in which a plurality of individually addressable, emissive OLED elements 22 are arranged beneath the display surface.
[0016] In the example of FIG. 3A, the elements 22 of OLED matrix 20 are bordered by a nonemissive area 24, which may be configured to seal or secure the OLED matrix, or provide electrical connections for addressing the elements. To conceal the display-image gap that would otherwise be observed at this edge, the nonemissive area of the OLED matrix may be bent or curved away from the primary viewing plane, which is defined by flat portion 28A of display surface 16. The display surface is similarly bent or curved to follow substantially the contour of the OLED matrix. Accordingly, display surface 16 includes, in addition to flat portion 28, an edge portion 30, which is non-coplanar (i.e., nonparallel in an areal sense) to the flat portion. In the illustrated example, the edge portion follows the smooth curve of the OLED matrix. More generally, the edge portion may comprise any number of curved and/or plane-bevel facets non-coplanar to the flat portion. As shown in FIG. 3B, two hinged or otherwise abutting display surfaces 16, configured in this manner, may be used to present a continuous, essentially unbroken display image 12, where a different section 18 of the display image (e.g., 18A and 18B in FIG. 3B) is presented on each display surface. In the example of FIG. 3B, hinge 14 is situated between adjacent edge portions 30 and is configured to pivotally couple display surface 16A to display surface 16B.
[0017] In the illustrated example, each display surface 16 is the outer surface of a glass or transparent-polymer cover layer 32 of substantially constant thickness. In other examples, the thickness and/or structure of the cover layer may be varied in order to impart desired ray-guiding properties. In particular, the cover layer may be configured so as to collect the emission from the OLED elements 22 below edge portion 30 and to release the emission in a direction normal to the primary viewing plane.
Used in conjunction with curved OLED matrix 20, this approach may be used to provide substantially distortion-free image display all the way to the visible edge of each display surface 16.
[0018] Despite the applicability of curved OLED matrix 20 to borderless image display, alternative display technologies are also consonant with this disclosure. FIG. 4 shows aspects of an electronic display system 10” based on liquid-crystal display (LCD) matrices 34. In this example, a suitably engineered cover layer 32” may be used to re-image the pixel elements 22” of the LCD display so that the pixel elements appear very close to the shared edge of each display surface 16. In some cases, image distortion resulting from the re-imaging may be corrected preemptively during rendering of the display image (vide infra).
[0019] Turning back to FIG. 3C, electronic display system 10 includes a computer 36. Operatively coupled to the display componentry of the electronic display system, the computer may execute the rendering of display image 12 in addition to any other process described herein. To that end, the computer may include at least one processor 38 and associated computer memory 40. In the illustrated example, the computer may be configured to individually address and control the emission from each OLED element 22. In electronic display systems having plural display surfaces 16, the same computer 36 may address and control plural associated display matrices. This tactic enables plural sections 18 of the display image to be rendered concertedly. Accordingly, the computer may be configured to render a plurality of sections of a display image for presentation on a corresponding plurality of display surfaces of the electronic display system.
[0020] In some examples, display-image rendering is responsive to one or more geometric inputs to computer 36 of electronic display system 10. The geometric inputs may reflect (1) the configuration of the electronic display system, including the layout and conformation among the various display surfaces; (2) the orientation of the electronic display system as configured; and (3) one or more ocular positions O of the user in a frame of reference of the electronic display system. In some examples, each geometric input may be furnished by a sensor arranged in the electronic display system and coupled operatively to the computer. Alternatively, or in addition, each geometric input may be estimated heuristically by the computer based on the current usage scenario of the electronic display system. Moreover, each geometric input may be evaluated and re-evaluated in real time as the electronic display system is used, so that the computer is able to dynamically adjust the display-image presentation in response to changing geometry. Example geometric inputs are described below.
[0021] Continuing in FIG. 3C, electronic display system 10 includes a hinge-angle sensor 42 and at least one abutment sensor 44. The hinge-angle sensor is configured to furnish an output responsive to the angle of separation α between flat portion 28A and flat portion 28B of respective display surfaces 16A and 16B. Each abutment sensor 44 is configured to furnish an output responsive to abutment of display surface 16A to any other display surface 16. In some examples, the hinge-angle sensor may be potentiometer-based, and the abutment sensor may be an electrostatic or Hall-effect sensor; in other examples, various other sensor technologies may be used.
[0022] Electronic display system 10 includes an inertial measurement unit (IMU) 46, magnetometer 48, and palm sensor 50. The IMU may comprise either or both of a multi-axis accelerometer and a multi-axis gyroscope configured to sense, respectively, translational and rotational movement of the electronic display system. The magnetometer may be configured to sense the absolute orientation of the electronic display system based on a geomagnetic measurement. Alone or in combination with the output from hinge-angle sensor 42, outputs from the IMU and magnetometer are responsive to the orientation of each display surface 16 of the electronic display system. Accordingly, such output may be furnished to computer 36. When included, palm sensor 50 may be configured to sense the location of the user’s palm in scenarios in which the electronic display system is being held in the user’s hand. In configurations in which the IMU and magnetometer are omitted, the computer may be configured to estimate the orientation of each display surface heuristically, based on output from the palm sensor. In still other examples, feature imaging based on a world-facing camera may be used to determine the orientation of the electronic display system.
[0023] Electronic display system 10 includes an optional user-facing camera 52 configured to acquire an ocular image of the user. More particularly, the user-facing camera is configured to image the user’s pupils, eyes, face, or head in real time. As shown in FIG. 5, user-facing camera 52 includes an on-axis lamp 54 and an off-axis lamp 56, these terms referring to the direction of illumination with respect to the optical axis Z of the user-facing camera. Each lamp may comprise a light-emitting diode (LED) or diode laser, for example, which emits infrared (IR) or near-IR illumination in a high-sensitivity band of the user-facing camera. Off-axis illumination may create a specular glint 58 that reflects from the cornea 60 of the user’s eye. Off-axis illumination may also be used to illuminate the eye for a ‘dark pupil’ effect, where pupil 62 appears darker than the surrounding iris 64. By contrast, on-axis illumination may be used to create a ‘bright pupil’ effect, where the pupil appears brighter than the surrounding iris. More specifically, illumination of the retroreflective tissue of retina 66 may reflect back through the pupil, forming a bright image 68.
[0024] Ocular image data from user-facing camera 52 may be conveyed to computer 36. There, the data may be processed to resolve such features as the pupil center, pupil outline, and/or one or more specular glints 58. The locations of such features in the image data may be used as input parameters in a model—e.g., a polynomial model—that relates feature position to an estimate of the right or left ocular position O of the user. In some implementations, the ocular position O may correspond to a pupil position itself. In other implementations, the ocular position may correspond to a position of the dominant eye of the user, or to a position that bisects a line segment joining the right and left eyes, for instance. In still other implementations, the user-facing camera may be configured to recognize the user’s face, or head, and ocular positions may be estimated based on a suitable anatomical model. In examples in which two or more users are detected in the ocular image, additional ocular positions may be computed.
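The simplest flavor of the polynomial model mentioned above is a degree-1 least-squares fit from a scalar image feature (say, the pupil-center x coordinate) to a calibrated ocular-position coordinate. The sketch below is an illustration only; the function name, the scalar-feature simplification, and the calibration data are all hypothetical:

```python
def fit_linear_map(features, positions):
    """Degree-1 least-squares map from an ocular-image feature to an
    ocular-position coordinate. Higher polynomial degrees and
    vector-valued features generalize the same way.
    """
    n = len(features)
    mx = sum(features) / n                      # mean feature value
    my = sum(positions) / n                     # mean position value
    sxx = sum((x - mx) ** 2 for x in features)
    sxy = sum((x - mx) * (y - my) for x, y in zip(features, positions))
    slope = sxy / sxx
    intercept = my - slope * mx
    return lambda x: slope * x + intercept      # feature -> position estimate
```

A calibration pass would collect (feature, position) pairs while the user fixates known targets; the fitted map then estimates O at runtime from each new ocular image.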
[0025] Despite the benefit of sensory estimation of ocular positions O, user-facing camera 52 may be omitted in some implementations. Instead, computer 36 may be configured to estimate the ocular positions based on a series of heuristics. For example, the user may be expected to view electronic display system 10 from a side opposite to the side that the operating system recognizes as the ‘top’. In addition, the palm location may be sensed by palm sensor 50 and used to predict the likely vantage point of the user. For example, the user may be expected to view the display screen from the side which is opposite to the side where the palm is located.
[0026] FIG. 6 illustrates aspects of an example method 70 to present on an electronic display system 10 a display image 12 viewable from an ocular position O. The display image may be presented on one or more display surfaces 16 of the electronic display system.
[0027] At 72 of method 70, the configuration among two or more display surfaces 16 of electronic display system 10 is sensed by computer 36. In examples involving tiled display surfaces, the configuration may include layout information defining the relative arrangement of the display surfaces. In some examples (e.g., those involving hinged display surfaces), the configuration may include conformational information defining the relative orientation of the display surfaces as determined by the hinge angle α. In some examples, accordingly, the configuration may be sensed via output from hinge-angle sensor 42 and/or abutment sensors 44, in view of stored data reflecting the dimensions and static configuration of the various display surfaces. It will be noted that the hinge angle α, in some implementations, affects not only the orientation of a hinged display surface, but also the viewable size of that surface as viewed from a given ocular position (vide infra).
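The conformational information can be reduced to per-surface orientations. A minimal geometric sketch, assuming the hinge lies along the y-axis and surface A is held in the xy-plane (the coordinate conventions are ours, not the patent's):

```python
import math

def surface_normals(alpha):
    """Outward normals of two hinged flat portions for hinge angle
    alpha (radians). Surface A stays in the xy-plane with normal
    (0, 0, 1); surface B pivots about the hinge. At alpha = pi the
    device is fully open and the normals coincide.
    """
    fold = math.pi - alpha                      # how far B is folded out of plane
    n_a = (0.0, 0.0, 1.0)
    n_b = (math.sin(fold), 0.0, math.cos(fold))
    return n_a, n_b
```

With the normals in hand, the viewable size of each surface from a given ocular position follows from the angle between the normal and the view direction.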
[0028] Continuing in FIG. 6, at 74 the overall orientation of the one or more display surfaces 16 of electronic display system 10 is sensed. The orientation may be sensed by computer 36 based on the output of IMU 46, magnetometer 48, and/or palm sensor 50, for example.
[0029] At 76 one or more ocular positions O of one or more users of electronic display system 10 are estimated in the frame of reference of the display image 12. In electronic display systems equipped with a user-facing camera 52, as described above, the ocular positions may be estimated by acquisition of an ocular image of the one or more users and subsequent analysis of the ocular image. In some examples, ocular positions may be estimated heuristically—e.g., based on palm positioning on the electronic display system.
[0030] At 78 there is a control loop, in which computer 36 iterates through each locus P of display image 12 and renders that locus at appropriate coordinates (U, V) of any display surface 16 on which the locus would be visible from ocular position O. In some implementations, the control loop at 78 may be executed for each display surface 16 of electronic display system 10, in sequence or in parallel.
[0031] At 80, and now referring also to FIG. 7, a locus P is mapped to coordinates (U, V) of display surface 16 where a straight line OP passing through O and P intersects the display surface. If the point of intersection of OP is within the boundaries of the display surface, then locus P will be presented on that display surface. Otherwise, P will not be presented on that display surface, but may be presented on another display surface. In this manner, method 70 effectively partitions display image 12 into one or more sections 18 based on the position of each locus in the display image, and further based on the geometric inputs specifying configuration, orientation, and ocular position.
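The mapping at 80 is, in effect, a ray-plane intersection followed by a bounds test. A minimal Python sketch, assuming each flat display surface is parameterized by an origin and two orthonormal in-plane axes (all names and the parameterization are illustrative, not from the disclosure):

```python
def map_locus_to_surface(O, P, origin, u_axis, v_axis, u_max, v_max):
    """Map display-image locus P to (U, V) on a flat display surface.

    The surface is parameterized as origin + U*u_axis + V*v_axis with
    0 <= U <= u_max and 0 <= V <= v_max; u_axis and v_axis are
    orthonormal 3-vectors.  Returns (U, V) where line OP crosses the
    surface plane, or None if the crossing lies outside the bounds.
    """
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]

    n = cross(u_axis, v_axis)        # surface normal
    d = sub(P, O)                    # direction of line OP
    denom = dot(n, d)
    if abs(denom) < 1e-12:           # line parallel to the surface
        return None
    t = dot(n, sub(origin, O)) / denom
    hit = [O[i] + t * d[i] for i in range(3)]
    U = dot(sub(hit, origin), u_axis)
    V = dot(sub(hit, origin), v_axis)
    if 0.0 <= U <= u_max and 0.0 <= V <= v_max:
        return (U, V)
    return None                      # P may fall on another surface
```

Running this test once per surface, for every locus P, is exactly the partitioning described: each locus lands on whichever surface returns in-bounds coordinates for the current ocular position O.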
[0032] At 82 the angle θ of the straight line OP is computed relative to the direction E of emergence of light from display surface 16. In some examples—e.g., when display surface 16 is locally parallel to underlying OLED matrix 20—the direction of emergence may be normal to the display surface at coordinates (U, V). More generally, the direction of emergence may be computed based on the orientation of the emissive element 22 and the ray-directing properties (e.g., refractive index and thickness) of cover layer 32. Typically, each OLED element 22 emits a Lambertian distribution centered normal (i.e., orthogonal) to the matrix. Accordingly, the energy emitted at non-orthogonal angles (θ > 0) may be computed in terms of the cosine of θ. In some examples, the projected off-orthogonal energy may be increased by the inverse of this value in order to compensate the observed brightness for the angle of observation. The angle may also be useful, inter alia, for mapping display surface coordinates (U, V) to a corresponding pixel position (X, Y).
[0033] At 84, accordingly, the straight line OP is extrapolated in order to determine which pixel position (X, Y) of display matrix 20 corresponds to mapped coordinates (U, V) of display surface 16. In some examples, the extrapolation may be computed according to Snell’s Law applied at the display surface, in view of the refractive index of cover layer 32. Such extrapolation is appropriate in examples in which light from an emissive element 22 refracts through cover layer 32 en route to coordinates (U, V).
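The Snell's-Law extrapolation can be sketched as follows. The refractive index and cover-layer thickness below are illustrative placeholder values, not figures from the disclosure, and the function name is an assumption:

```python
import math

def refracted_pixel_offset(theta_air, n_cover=1.5, thickness_mm=0.5):
    """Lateral offset (mm) between surface coordinates (U, V) and the
    emitting pixel position beneath them, per Snell's Law.

    theta_air: angle (radians) of the line OP relative to the surface
    normal, measured in air; n_cover and thickness_mm describe the
    cover layer (illustrative values).
    """
    # Snell's Law at the surface: sin(theta_air) = n_cover * sin(theta_cover)
    theta_cover = math.asin(math.sin(theta_air) / n_cover)
    # Inside the cover layer the ray travels at theta_cover, so the
    # exit point is displaced laterally from the pixel by t*tan(theta).
    return thickness_mm * math.tan(theta_cover)
```

At normal incidence the offset is zero and (U, V) sits directly above its pixel; at oblique viewing angles the offset shifts the (U, V)-to-(X, Y) mapping toward the viewer, which is the refinement step 84 accounts for.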
[0034] At 86 the corresponding pixel position (X, Y) is illuminated to a brightness which is based on the angle θ computed relative to the direction of emergence of light from display surface 16. Illumination of the pixel position may include actively increasing the level of illumination for coordinates (U, V) with increasing angle θ, for a given desired brightness of locus P. In some examples, the active illumination at pixel position (X, Y) varies as 1/cos θ.
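The 1/cos θ compensation can be written directly. The clamp to a maximum drive level is an added practical safeguard, not part of the disclosure, since 1/cos θ grows without bound at grazing angles; the function name is illustrative:

```python
import math

def compensated_level(desired_brightness, theta, max_level=1.0):
    """Drive level for pixel position (X, Y) that compensates the
    Lambertian cos(theta) falloff, so that locus P appears at
    desired_brightness when viewed from off-normal angle theta
    (radians).
    """
    # Boost by 1/cos(theta): emitted energy falls off as cos(theta).
    level = desired_brightness / math.cos(theta)
    # Clamp to the panel's maximum drive level (added safeguard).
    return min(level, max_level)
```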
[0035] Steps 80 through 86 are now repeated, in sequence or in parallel, for all remaining loci P of display image 12. After all loci of the display image are rendered, method 70 may return to 72 for re-evaluation of the configuration, orientation, and ocular position, prior to presentation of the next display image. Accordingly, any change in the geometric inputs— the hinge angle or ocular position, for example— may be reflected in the subsequent presentation of the display image.
[0036] FIG. 7 demonstrates the implementation of method 70 under conditions of changing geometric inputs. In particular, FIG. 7 shows the rendering of the same locus P of display image 12 in different usage scenarios. When hinge 14 is set at an oblique angle and the display image is observed from ocular position O1, P is rendered on flat portion 28B of display surface 16B. As the hinge angle is opened further, more of the edge portions of both display surfaces become visible, thereby extending the area to which the display image is rendered further into the edge portions. When the hinge is fully open, O1P is rotated relative to its initial position, and P is now rendered on edge portion 30B of display surface 16B, even though the ocular position O1 has not changed. Accordingly, as the hinge angle increases, the display image, when viewed from ocular position O1 of a user, flows continuously from flat portion 28B of display surface 16B to the edge portion 30B of the same display surface. To implement this effect, one or more rows of the display image rendered on flat portion 28B may be rendered duplicatively on (e.g., shifted over to) edge portion 30B. As used herein, the ‘rows’ of the display image may run horizontally or vertically with respect to the frame of reference of the electronic display system. Duplicative rendering of display content from flat portion 28B onto edge portion 30B is also shown in FIG. 8A. Here, the row of pixel positions 88 from the flat portion is duplicated on the edge portion of the same display surface.
[0037] FIG. 7 also demonstrates the effect of method 70 for a case in which the user’s ocular position shifts relative to electronic display system 10. As noted above, when hinge 14 is fully open P is rendered on edge portion 30B for observation from O1. When the ocular position shifts to O2, the display image flows continuously to edge portion 30A of adjacent display surface 16A. To implement this effect, one or more rows of the display image originally rendered on flat portion 28B may be rendered duplicatively on edge portion 30A. Duplicative rendering of display content from flat portion 28B onto edge portion 30A is also shown in FIG. 8B. Here, the row of pixel positions 88 from the flat portion is duplicated on the edge portion of the adjacent display surface. In some scenarios, duplication of pixel rows may be concurrent, such that the duplicated rows are displayed at the same time. In some scenarios (e.g., video scenarios), duplication of pixel rows may be from frame to frame, such that a portion of a frame is displayed using one row of pixels at an earlier time (e.g., Frame0), and then a corresponding portion of a subsequent frame is displayed using another row of pixels at a later time (e.g., Frame1).
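The duplicative row rendering described above amounts to copying selected rows of the flat-portion framebuffer into the edge-portion framebuffer. A minimal sketch, with hypothetical buffer and function names, and framebuffers modeled as lists of pixel rows:

```python
def duplicate_rows(flat_buffer, edge_buffer, rows):
    """Duplicate the given rows of a flat-portion framebuffer onto an
    edge-portion framebuffer (as in FIGS. 8A and 8B).

    flat_buffer, edge_buffer: lists of rows (lists of pixel values);
    rows: indices of flat-portion rows to mirror onto the edge.  The
    edge buffer is assumed to hold at least len(rows) rows.
    """
    for edge_row, flat_row in enumerate(rows):
        # Copy the pixel values; the flat-portion row keeps rendering.
        edge_buffer[edge_row] = list(flat_buffer[flat_row])
    return edge_buffer
```

Concurrent duplication would call this within a single frame; frame-to-frame duplication would instead apply the copy when composing the next frame, so the content appears on one row at Frame0 and on the other at Frame1.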
[0038] One aspect of this disclosure is directed to an electronic display system comprising: a first display surface configured to receive and transmit display light from a first emissive element, the first display surface having a first flat portion and a first edge portion non-coplanar to the first flat portion; a second display surface configured to receive and transmit display light from a second emissive element, the second display surface having a second flat portion and a second edge portion non-coplanar to the second flat portion; and a computer configured to control the first and second emissive elements so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface, such that one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first or second edge portion.
[0039] In some implementations, the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first edge portion. In some implementations, the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the second edge portion. In some implementations, the display image, when viewed from a first ocular position of a user, flows continuously from the first flat portion to the first or second edge portion. In some implementations, the first and second sections are rendered so that the display image, when viewed from a second ocular position of the user, flows continuously from the second flat portion to the first or second edge portion. In some implementations, the computer is further configured to partition the display image into the first and second sections based on an estimate of the first ocular position. In some implementations, the electronic display system further comprises a hinge situated between the first and second edge portions and configured to pivotally couple the first display surface to the second display surface, and a sensor furnishing an output responsive to an angle of separation between the first and second flat portions; here the computer is further configured to partition the display image into the first and second sections based on the output. In some implementations, the electronic display system further comprises a camera configured to acquire an ocular image of the user, and the computer is further configured to estimate the ocular position by analysis of the ocular image. In some implementations, the electronic display system further comprises a sensor furnishing an output responsive to one or more of an orientation of the first display surface and an abutment of the first display surface to another display surface, and the computer is further configured to partition the display image into the first and second sections based on the output.
[0040] Another aspect of this disclosure is directed to a method to present on a display surface of an electronic display system a display image viewable from an ocular position O, the method comprising: rendering a locus P of the display image by illuminating coordinates (U, V) of an edge portion of the display surface where a straight line OP passing through O and P intersects the edge portion, including increasing active illumination of the coordinates (U, V) with increasing angle of the straight line OP relative to a direction of emergence of light from the coordinates (U, V).
[0041] In some implementations, the direction of emergence is normal to a surface of the edge portion at the coordinates (U, V), and the active illumination is increased with increasing angle of OP relative to the normal of the display surface. In some implementations, increasing the active illumination includes increasing by a factor 1/cos θ, where θ is the angle of OP relative to the direction of emergence of light from the coordinates (U, V). In some implementations, illuminating the coordinates (U, V) includes refracting light from an emissive element associated with the coordinates. In some implementations, the ocular position O is a position of a dominant eye of a user of the electronic display system, or a position that bisects a line segment joining right and left eyes of a user of the electronic display system. In some implementations, the method further comprises estimating the ocular position O heuristically. In some implementations, the method further comprises acquiring an ocular image of a user of the electronic display system, and the ocular position O is estimated by analysis of the ocular image. In some implementations, the ocular position O is a pupil position.
[0042] Another aspect of this disclosure is directed to a method to present on first and second display surfaces of an electronic display system a display image viewable from an ocular position O, the method comprising: sensing a configuration of the first and second display surfaces; estimating the ocular position O; rendering a locus P of the display image by illuminating coordinates (U, V) of the first display surface where a straight line OP passing through O and P intersects the first display surface; and re-rendering the locus P of the display image by illuminating different coordinates (U’, V’) of the first or second display surface where the straight line OP passing through O and P intersects the first or second display surface, responsive to a change in OP.
[0043] In some implementations, the change in OP is responsive to a change in O. In some implementations, the first and second display surfaces are held apart by an adjustable angle via a hinge situated between the first and second display surfaces, and the change in OP is responsive to a change in the angle.
[0044] It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
[0045] The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims

1. An electronic display system comprising:
a first display surface configured to receive and transmit display light from a first emissive element, the first display surface having a first flat portion and a first edge portion non- coplanar to the first flat portion;
a second display surface configured to receive and transmit display light from a second emissive element, the second display surface having a second flat portion and a second edge portion non-coplanar to the second flat portion; and
a computer configured to control the first and second emissive elements so as to present a first section of a display image on the first display surface and a second section of the display image on the second display surface, such that one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first or second edge portion.
2. The electronic display system of claim 1 wherein the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the first edge portion.
3. The electronic display system of claim 1 wherein the one or more rows of the display image rendered on the first flat portion are rendered duplicatively on the second edge portion.
4. The electronic display system of claim 1 wherein the display image, when viewed from a first ocular position of a user, flows continuously from the first flat portion to the first or second edge portion.
5. The electronic display system of claim 4 wherein the first and second sections are rendered so that the display image, when viewed from a second ocular position of the user, flows continuously from the second flat portion to the first or second edge portion.
6. The electronic display system of claim 4 wherein the computer is further configured to partition the display image into the first and second sections based on an estimate of the first ocular position.
7. The electronic display system of claim 1 further comprising a hinge situated between the first and second edge portions and configured to pivotally couple the first display surface to the second display surface, and a sensor furnishing an output responsive to an angle of separation between the first and second flat portions, wherein the computer is further configured to partition the display image into the first and second sections based on the output.
8. The electronic display system of claim 1 further comprising a camera configured to acquire an ocular image of the user, wherein the computer is further configured to estimate the ocular position by analysis of the ocular image.
9. The electronic display system of claim 1 further comprising a sensor furnishing an output responsive to one or more of an orientation of the first display surface and an abutment of the first display surface to another display surface, and wherein the computer is further configured to partition the display image into the first and second sections based on the output.
10. A method to present on a display surface of an electronic display system a display image viewable from an ocular position O, the method comprising:
rendering a locus P of the display image by illuminating coordinates (U, V) of an edge portion of the display surface where a straight line OP passing through O and P intersects the edge portion, including increasing active illumination of the coordinates (U, V) with increasing angle of the straight line OP relative to a direction of emergence of light from the coordinates (U, V).
11. The method of claim 10 wherein the direction of emergence is normal to a surface of the edge portion at the coordinates (U, V), and wherein the active illumination is increased with increasing angle of OP relative to the normal of the display surface.
12. The method of claim 10 wherein increasing the active illumination includes increasing by a factor 1/cos θ, where θ is the angle of OP relative to the direction of emergence of light from the coordinates (U, V).
13. The method of claim 10 wherein illuminating the coordinates (U, V) includes refracting light from an emissive element associated with the coordinates.
14. The method of claim 10 wherein the ocular position O is a position of a dominant eye of a user of the electronic display system, or a position that bisects a line segment joining right and left eyes of a user of the electronic display system.
15. The method of claim 10 further comprising estimating the ocular position O heuristically.
EP19806075.8A 2018-11-05 2019-10-29 Display presentation across plural display surfaces Pending EP3877824A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/181,214 US11272045B2 (en) 2018-11-05 2018-11-05 Display presentation across plural display surfaces
PCT/US2019/058421 WO2020096801A2 (en) 2018-11-05 2019-10-29 Display presentation across plural display surfaces


Also Published As

Publication number Publication date
US20200142662A1 (en) 2020-05-07
WO2020096801A3 (en) 2020-10-01
US11272045B2 (en) 2022-03-08
WO2020096801A2 (en) 2020-05-14
CN112955812A (en) 2021-06-11

