WO2012030265A1 - Face screen orientation and related devices and methods - Google Patents

Info

Publication number
WO2012030265A1
WO2012030265A1 (PCT application PCT/SE2010/051337)
Authority
WO
WIPO (PCT)
Prior art keywords
display
content
responsive
image
faces
Prior art date
Application number
PCT/SE2010/051337
Other languages
French (fr)
Inventor
Ola Andersson
Johan Kwarnmark
Johan Svedberg
Michael Huber
Original Assignee
Telefonaktiebolaget L M Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget L M Ericsson (Publ) filed Critical Telefonaktiebolaget L M Ericsson (Publ)
Publication of WO2012030265A1 publication Critical patent/WO2012030265A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 - Head tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/24 - Aligning, centring, orientation detection or correction of the image
    • G06V10/242 - Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/24 - Aligning, centring, orientation detection or correction of the image
    • G06V10/245 - Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 - Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 - Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 - Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 - Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/04 - Changes in size, position or resolution of an image
    • G09G2340/0492 - Change of orientation of the displayed image, e.g. upside-down, mirrored
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 - Specific applications
    • G09G2380/16 - Digital picture frames
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • a display may be used in different physical orientations.
  • the display may be used in a portrait mode or in a landscape mode, and the user may orient or rotate the device as desired.
  • the user may be inconvenienced by having to configure the graphics subsystem that renders content (e.g., images, video, text, etc.) on the display for the selected orientation.
  • pattern recognition may be used to determine whether a display is being used in a first orientation or in a second orientation with respect to a user. More particularly, face recognition software detects face landmarks (e.g., nose, lips, eyes, etc.) of the user and determines whether the face is oriented by more than a threshold angle from a vertical axis. If the orientation of the user's face is less than the threshold angle, a graphics controller is configured for portrait mode. If the orientation of the user's face is greater than the threshold angle, the graphics controller is configured for landscape mode. Accordingly, the graphics controller renders images viewable with respect to the orientation that the user has selected for using the computing device. Such graphic control, however, may not be able to accommodate multiple users.
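The single-user background approach described above can be sketched as a simple threshold test. This is an illustrative sketch only: the angle convention (degrees from the vertical axis) and the 45-degree default threshold are assumptions, not values given in the patent, and the face-landmark detection itself is assumed to happen elsewhere.

```python
def select_mode(face_angle_deg: float, threshold_deg: float = 45.0) -> str:
    """Choose portrait mode if the detected face is within the threshold
    angle of the vertical axis, landscape mode otherwise."""
    # Fold the angle into the range [0, 90] measured from vertical.
    angle = abs(face_angle_deg) % 180
    if angle > 90:
        angle = 180 - angle
    return "portrait" if angle < threshold_deg else "landscape"
```

As the passage notes, a single threshold like this cannot accommodate multiple users viewing the display from different sides.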
  • a method may be provided to orient content on a display of an electronic device including a camera. More particularly, the method may include obtaining an image from the camera, identifying a plurality of faces in the image, and orienting content on the display responsive to the plurality of faces in the image. By orienting content on the display responsive to a plurality of faces in the image, the electronic device may orient content on the display to best accommodate viewing angles of a plurality of users who may be viewing the display from different angles, directions, and/or sides of the display.
  • the plurality of faces in the image may be used to determine a display orientation that best accommodates the plurality of users, but the image and the faces therein are not included in the content provided on the display (i.e., the image and the faces therein may be excluded from the content provided on the display).
  • Orienting content on the display may include orienting content responsive to an alignment of a largest of the faces identified in the image.
  • Orienting content on the display may include orienting content responsive to a common alignment of a group of the faces in the image.
  • Orienting content on the display may include orienting content responsive to numbers of faces arranged along different sides of the image.
  • Orienting content on the display may include orienting content responsive to numbers of faces located in different areas of the image.
  • the electronic device may include an orientation sensor used to determine an orientation of the display.
  • When the display is oriented vertically, content may be oriented on the display responsive to a rotational alignment of the display.
  • When the display is oriented horizontally, content may be oriented on the display responsive to the plurality of faces in the image.
  • Orienting content on the display responsive to a rotational alignment of the display may include orienting content in portrait mode on the display responsive to a first rotational alignment of the display and orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
  • a method of orienting content on a display of an electronic device including a camera and an orientation sensor may include determining an orientation of the display.
  • content may be oriented on the display responsive to a rotational alignment of the display.
  • an image may be obtained from the camera, and content may be oriented on the display responsive to at least one face in the image.
  • content orientation may be automatically adaptable to select content orientation based on display orientation when the display is being held upright (when all viewers are most likely viewing from the same orientation), and to select content orientation based on user orientation when the display is laid horizontally (when the orientation or orientations of a viewer or viewers is less predictable).
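The adaptive selection just described can be sketched from a three-axis gravity reading. This is a minimal illustration under assumed conventions (z is the display normal, so a dominant z component means the display is lying flat); the 0.8 threshold and function names are hypothetical, not from the patent.

```python
def orientation_source(gx: float, gy: float, gz: float,
                       flat_threshold: float = 0.8) -> str:
    """Return which input should drive content orientation."""
    mag = (gx * gx + gy * gy + gz * gz) ** 0.5
    if mag == 0:
        # No usable gravity reading (e.g., low-gravity environment):
        # fall back to face-based orientation.
        return "faces"
    # Gravity mostly along the display normal -> display is horizontal.
    if abs(gz) / mag > flat_threshold:
        return "faces"       # orient from camera image of viewers' faces
    return "rotation"        # orient from the display's rotational alignment
```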
  • Orienting content on the display may include orienting content on the display responsive to a plurality of faces in the image.
  • Orienting content on the display may include orienting content responsive to an alignment of a largest of the faces identified in the image.
  • Orienting content on the display may include orienting content responsive to a common alignment of a group of the faces in the image.
  • Orienting content on the display may include orienting content responsive to numbers of faces arranged along different sides of the image.
  • Orienting content on the display may include orienting content responsive to numbers of faces located in different areas of the image.
  • Orienting content on the display responsive to a rotational alignment of the display may include orienting content in portrait mode on the display responsive to a first rotational alignment of the display, and orienting content in landscape mode on the display responsive to a second rotational alignment of the display, with the first and second rotational alignments being different.
  • an electronic device may include a display and a camera.
  • the electronic device may include a processor coupled to the display and coupled to the camera.
  • the processor may be configured to identify a plurality of faces in the image obtained by the camera, and to orient content on the display responsive to the plurality of faces in the image.
  • the processor may be configured to orient content on the display responsive to an alignment of a largest of the faces identified in the image.
  • the processor may be configured to orient content on the display responsive to a common alignment of a group of the faces in the image.
  • the processor may be configured to orient content on the display responsive to numbers of faces arranged along different sides of the image.
  • the processor may be configured to orient content on the display responsive to numbers of faces located in different areas of the image.
  • An orientation sensor may be coupled to the processor.
  • the processor may be further configured to determine an orientation of the display responsive to a signal from the orientation sensor, to orient content on the display responsive to a rotational alignment of the display when the display is oriented vertically, and to orient content on the display responsive to the plurality of faces in the image when the display is oriented horizontally.
  • the processor may be configured to orient content on the display responsive to a rotational alignment of the display by orienting content in portrait mode on the display responsive to a first rotational alignment of the display and by orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
  • an electronic device may include a display, a camera, and an orientation sensor.
  • the electronic device may also include a processor coupled to the display, the camera, and the orientation sensor.
  • the processor may be configured to determine an orientation of the display responsive to a signal from the orientation sensor, to orient content on the display responsive to a rotational alignment of the display when the display is oriented vertically, and to orient content on the display responsive to at least one face in an image obtained by the camera when the display is oriented horizontally.
  • the processor may be configured to orient content on the display responsive to a plurality of faces in the image when the display is oriented horizontally.
  • the processor may be configured to orient content on the display responsive to an alignment of a largest of the faces identified in the image.
  • the processor may be configured to orient content on the display responsive to a common alignment of a group of the faces in the image.
  • the processor may be configured to orient content on the display responsive to numbers of faces arranged along different sides of the image.
  • the processor may be configured to orient content on the display responsive to numbers of faces located in different areas of the image.
  • the processor may be configured to orient content on the display responsive to a rotational alignment of the display by orienting content in portrait mode on the display responsive to a first rotational alignment of the display and by orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
  • Figure 1 is a plan view of an electronic device according to some embodiments.
  • Figure 2 is a block diagram of the electronic device of Figure 1 according to some embodiments.
  • Figures 3A1, 3A2, 3B1, 3B2, 4A1, 4A2, 4B1, 4B2, 5A1, 5A2, 5B1, 5B2, 6A1, 6A2, 6B1, and 6B2 are corresponding camera images and display outputs according to some embodiments.
  • Figure 7 is a flow chart illustrating operations of methods according to some embodiments.
  • an image may be oriented on the display in either a landscape mode (with width greater than height) or a portrait mode (with height greater than width).
  • the electronic device may automatically select the display mode responsive to an orientation of the device relative to the ground and/or floor.
  • the electronic device may include one or more orientation sensors (e.g., accelerometers) configured to detect a direction of a gravitational pull on the electronic device.
  • a processor may be coupled to the accelerometer(s) and to the display, and the processor may be configured to determine an orientation of the electronic device relative to the ground and/or floor.
  • the processor may be configured to render an image on the display in the landscape mode when the electronic device is oriented with the widest dimension of the display substantially horizontal (relative to the ground/floor), and the processor may be configured to render the image on the display in the portrait mode when the electronic device is oriented with the narrowest dimension of the display substantially horizontal (relative to the ground/floor).
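The accelerometer-based rule above can be sketched in a few lines. This is a hedged illustration assuming the document's axis convention (x along the display's widest dimension, y along the narrowest); the function name is hypothetical.

```python
def select_display_mode(gx: float, gy: float) -> str:
    """Choose landscape when the display's widest (x) dimension is
    substantially horizontal, portrait when the narrowest (y) dimension
    is substantially horizontal."""
    # Gravity pulling mostly along y means the x-axis is horizontal.
    return "landscape" if abs(gy) >= abs(gx) else "portrait"
```

The next bullet notes the limitation this sketch shares: when the display lies flat, gx and gy are both near zero and the reading carries no orientation information.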
  • Use of accelerometers to determine image orientation may not be suitable when the electronic device is laid down (e.g., on a table or desk) so that the display faces up, when the electronic device is used in a low gravity environment, when the display is viewed from an odd position, etc.
  • a digital camera may be provided in the electronic device, and the digital camera may be used to determine an orientation of the electronic device relative to a user or users viewing the display. Accordingly, the digital camera may be used to determine whether to display an image on the display in a landscape, portrait, or other mode.
  • a processor coupled to the camera may be configured to determine an orientation of the display relative to a user(s) face.
  • a mobile electronic device 101 may include a display 103 (e.g., a liquid crystal display or LCD, a touch sensitive display, etc.) and camera lens 105a (of camera 105) on a same surface of electronic device 101 so that camera lens 105a is configured to capture images of a user or users viewing display 103. More particularly, camera lens 105a may be oriented such that its field of view (FOV) is likely to include a face or faces of a user or users viewing display 103. Camera 105 (including camera lens 105a), for example, may be provided as a digital camera used for video telephony in a tablet PC or smartphone.
  • Electronic device 101 may also include user interface 107 including user interface elements such as speaker 107a, microphone 107b, touch input 107c, etc.
  • Touch input 107c may include a keypad, a touch pad, a dial, a joystick, a touch sensitive surface of display 103, etc. While particular user interface elements are illustrated by way of example, illustrated interface elements may be omitted and/or other user interface elements may be included.
  • speaker 107a and/or microphone 107b may be omitted if radiotelephone functionality is not provided by mobile electronic device 101.
  • electronic device 101 may include orientation sensor(s) 117 coupled to processor 109, and orientation sensor(s) 117 (e.g., gravitational sensors, such as three gravitational sensors for the x-axis, y-axis, and z-axis of electronic device 101) may be configured to detect a physical orientation of electronic device 101 (and display 103 thereof) relative to a ground/floor plane.
  • orientation sensor(s) 117 may include one or more of a mercury switch(es), an accelerometer(s) (such as a microelectromechanical accelerometer(s)), a gyroscope(s), a magnetometer(s), etc.
  • processor 109 may be coupled to each of display 103, digital camera 105, user interface 107, memory 111, wireless transceiver 115 (e.g., cellular radiotelephone transceiver, Bluetooth transceiver, WiFi transceiver, etc.), and orientation sensor(s) 117.
  • wireless transceiver 115 may be omitted if wireless communications are not supported by and/or required for functionality of electronic device 101.
  • Processor 109 may be configured to control functionality of electronic device 101 using instructions/information stored in memory 111 and/or received through transceiver 115 to provide one or more functionalities such as mobile telephony, mobile video telephony, internet browsing, text messaging, e-mail, document generation/display, video/audio reproduction/recording, etc.
  • processor 109 may be configured to control an orientation of content (e.g., an image, video, text, etc.) on display 103 responsive to a physical orientation of display 103 (relative to ground) determined using orientation sensor(s) 117 and/or responsive to information received from camera 105. For example, if display 103 is oriented in a plane that is substantially vertical (e.g., with the z-axis substantially parallel to the ground/floor), processor 109 may orient content on display 103 according to the orientation of the x-axis and y-axis relative to the ground/floor. If the x-axis of display 103 is substantially horizontal relative to the ground/floor (e.g., substantially orthogonal with respect to a gravitational pull on device 101) and the y-axis of display 103 is substantially vertical relative to ground (e.g., substantially aligned with a gravitational pull on device 101), processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode).
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode).
  • processor 109 may thus accurately predict whether portrait mode or landscape mode is most appropriate based on a rotational alignment of the x-axis and the y-axis.
  • orientation sensor(s) 117 may not provide sufficient information to choose an orientation for content on display 103.
  • a gravitational orientation of electronic device 101 is not necessarily indicative of an orientation of a user viewing display 103.
  • processor 109 may use input from camera 105 to determine how to orient content on display 103.
  • camera 105 may capture an image including a face or faces of a user or users, and processor 109 may use image recognition software to identify the face or faces and orientations/locations/features thereof.
  • Image recognition and more particularly facial image recognition software is discussed, for example, in U.S. Publication Nos. 2008/0239131 and
  • processor 109 may be configured to render content on display 103 in an orientation corresponding to an orientation of the face. If the image includes a single face substantially aligned with the y-axis as shown in Figure 3A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis of display 103 and with vertical elements of the content aligned with the y-axis of display 103 (e.g., in landscape mode) as shown in Figure 3A2.
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis of display 103 and with vertical elements of the content aligned with the x-axis of display 103 (e.g., in portrait mode) as shown in Figure 3B2.
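The single-face behavior of Figures 3A1-3B2 amounts to snapping the detected face alignment to the nearest of four content rotations. The following sketch assumes a face angle measured in degrees from the display's y-axis; the convention and function name are illustrative, not taken from the patent.

```python
def content_rotation(face_angle_deg: float) -> int:
    """Snap a detected face alignment to the nearest content rotation
    (0, 90, 180, or 270 degrees)."""
    return round((face_angle_deg % 360) / 90) % 4 * 90
```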
  • processor 109 may be configured to render content on display 103 according to relative sizes of the two faces in the image and/or relative distances of the two faces from electronic device 101. For example, a smaller face in the image (occupying a lesser area of the image, having a lesser width/height in the image, occupying fewer pixels, etc.) may be assumed to be more distant than a larger face in the image (occupying a greater area of the image, having a greater width/height in the image, occupying more pixels, etc.), and processor 109 may orient content on display 103 to accommodate the larger/closer face.
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 4A2.
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 4B2. While use of size of faces is discussed in the context of two faces, a largest of any number of faces may be used to orient the image on display 103.
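The largest-face strategy can be sketched as follows. Each face is represented here as an (area_in_pixels, alignment_angle) pair, a hypothetical minimal representation chosen for illustration, since the passage does not fix one; the face covering the most pixels is assumed closest, and its alignment drives orientation.

```python
def dominant_alignment(faces: list[tuple[int, float]]) -> float:
    """Return the alignment angle of the largest (assumed closest) face.

    faces: list of (area_in_pixels, alignment_angle_deg) pairs.
    """
    area, angle = max(faces, key=lambda f: f[0])
    return angle
```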
  • processor 109 may be configured to render content on display 103 according to an orientation most suitable for the largest number of users. According to some embodiments, if more faces in the image are substantially aligned with the y-axis of electronic device 101 (e.g., two faces of users looking from end 101a) than are aligned with the x-axis of electronic device 101 (e.g., one face of a user looking from side 101b) as shown in Figure 5A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 5A2.
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 5B2.
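The majority rule of Figures 5A1-5B2 can be sketched by counting faces per alignment. The classification of each face into 'x' or 'y' alignment is assumed to come from face-recognition preprocessing; the tie-breaking in favor of landscape is an illustrative choice, not specified in the passage.

```python
def mode_for_majority(alignments: list[str]) -> str:
    """Choose the display mode suiting the larger group of viewers.

    alignments: one entry per detected face, 'y' if the face is
    substantially aligned with the device y-axis, 'x' otherwise.
    """
    y_count = alignments.count("y")
    x_count = alignments.count("x")
    # Faces aligned with the y-axis are best served by landscape mode.
    return "landscape" if y_count >= x_count else "portrait"
```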
  • processor 109 may determine an orientation to render content on display 103 according to numbers of faces arranged along different sides and/or arranged in different quadrants of the image generated by camera 105. If more faces in the image are substantially aligned along a bottom of the image (e.g., three faces of users arranged along the bottom) than are arranged along a side of the image (e.g., one face of a user arranged along the left side) as shown in Figure 6A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 6A2.
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 6B2.
  • processor 109 may use images of Figures 6A1 and 6B1 to achieve the results of Figures 6A2 and 6B2 by counting faces substantially appearing in quadrants of the image. Regarding the image of Figure 6A1, for example, processor 109 may determine that there are 0 faces in the first quadrant, 1 face in the second quadrant, 1 face in the third quadrant, and 2 faces in the fourth quadrant.
  • processor 109 may orient content on display 103 as shown in Figure 6A2.
  • processor 109 may determine that there are 0 faces in the first quadrant, 2 faces in the second quadrant, 1 face in the third quadrant, and 1 face in the fourth quadrant.
  • processor 109 may orient content on display 103 as shown in Figure 6B2. Results of Figures 6A1, 6A2, 6B1, and 6B2 may thus be obtained using positions of faces without necessarily determining orientations of particular faces.
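The position-based counting of Figures 6A1-6B2 can be sketched by tallying face centers per image side. The coordinate convention (origin at top-left, coordinates normalized to [0, 1]) and the nearest-edge assignment are assumptions made for illustration.

```python
def side_counts(centers: list[tuple[float, float]]) -> dict[str, int]:
    """Count faces per image side, assigning each face center (x, y)
    to its nearest image edge."""
    counts = {"top": 0, "bottom": 0, "left": 0, "right": 0}
    for x, y in centers:
        distances = {"left": x, "right": 1 - x, "top": y, "bottom": 1 - y}
        counts[min(distances, key=distances.get)] += 1
    return counts

def orient_by_position(centers: list[tuple[float, float]]) -> str:
    """Pick the mode favoring the side with the most faces."""
    counts = side_counts(centers)
    best = max(counts, key=counts.get)
    return "landscape" if best in ("top", "bottom") else "portrait"
```

As the passage notes, this works from face positions alone, without determining the orientation of any individual face.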
  • Figure 7 is a flow chart illustrating operations of processor 109 as discussed above with respect to Figures 3-6.
  • processor 109 determines if a user override has been entered. If a user override has been entered, processor 109 orients content on display 103 according to user input at block 703. If not, processor 109 may determine an orientation of electronic device 101 using orientation sensor(s) 117 at block 705.
  • processor 109 may orient content on display 103 according to the orientation of the x-axis and the y-axis relative to ground at block 709. More particularly, if the x-axis of display 103 is substantially horizontal relative to ground and the y-axis of display 103 is substantially vertical relative to ground, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode).
  • processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode).
  • when display 103 is oriented in a plane that is substantially horizontal (relative to ground, with the z-axis substantially vertical relative to ground) at block 707 (e.g., when a tablet PC is laid on a table or desk), processor 109 may obtain an image from digital camera 105 at block 711 to determine how to orient content on display 103.
  • processor 109 may orient content on display 103 using a default setting at block 717, such as a last display orientation used, a portrait mode, a landscape mode, a best determination available using orientation sensor(s) 117, etc.
  • processor 109 may determine how many faces are present in the image. If only one face is detected in the image at block 719, processor 109 may match a display orientation to an orientation of the one face at block 721 as discussed above with respect to Figures 3A1, 3A2, 3B1, and 3B2. If only two faces are detected in the image at blocks 719 and 723, processor 109 may match a display orientation to an orientation of the larger/closer face in the image at block 725 as discussed above with respect to Figures 4A1, 4A2, 4B1, and 4B2.
  • processor 109 may match the display orientation to accommodate the largest numbers of users at block 727 as discussed above with respect to Figures 5A1, 5A2, 5B1, and 5B2 and/or with respect to Figures 6A1, 6A2, 6B1, and 6B2.
  • processor 109 may reorient content on display 103 as user override inputs change, as physical orientations of electronic device 101 change, as numbers/positions of a user(s) change, etc.
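The decision flow of Figure 7 can be condensed into a single sketch. All inputs (user override, sensor reading, detected faces) are passed in as plain values for illustration; in the actual device they would come from user input, orientation sensor(s) 117, and camera 105. The return strings and face-descriptor format are hypothetical.

```python
def choose_orientation(user_override, display_vertical, faces, default="last"):
    """Condensed Figure 7 flow.

    faces: list of face descriptors, each a dict with 'size' and 'angle'.
    Returns a string naming the chosen orientation source or result.
    """
    if user_override is not None:                # blocks 701/703
        return user_override
    if display_vertical:                         # blocks 705/707/709
        return "rotational-alignment"
    if not faces:                                # blocks 713/717
        return default                           # e.g., last orientation used
    if len(faces) == 1:                          # blocks 719/721
        return f"match-face-{faces[0]['angle']}"
    if len(faces) == 2:                          # blocks 723/725
        larger = max(faces, key=lambda f: f["size"])
        return f"match-face-{larger['angle']}"
    return "match-majority"                      # block 727
```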
  • While Figure 7 shows an example of operations according to some embodiments of the present invention, other orders of operations may be performed and/or particular operations may be performed individually.
  • operations of Figures 4A1, 4A2, 4B1, and 4B2, operations of Figures 5A1, 5A2, 5B1, and 5B2, and/or operations of Figures 6A1, 6A2, 6B1, and 6B2 may be performed individually according to some embodiments of the present invention.
  • an image from camera 105 (on a same side/surface of device 101 as display 103) may be used for display orientation control where display orientation is provided to accommodate a largest number of users based on facial orientations.
  • camera 105 may be positioned to capture an image(s) of a face(s) of a user(s) viewing display 103.
  • Processor 109 may be configured to recognize faces and orientations/features thereof using face recognition software, and to determine whether to render an image on display 103 in landscape or portrait mode responsive to the orientation(s) of the face(s).
  • processor 109 may count faces on the 4 sides of the image (corresponding to the 4 sides of electronic device 101), and the side with the highest count may determine the orientation of content on display 103 as discussed above with respect to Figures 6A1, 6A2, 6B1, and 6B2.
  • Face orientation mode may be activated all the time, activated only if an orientation sensor(s) gives specific readouts (e.g., if electronic device is laid on a table with display 103 up), and/or activated when not overridden by user input. If no faces are found in a face orientation mode, processor 109 may default to a last display orientation used, to a portrait or landscape mode, to a best determination available using orientation sensor(s) 117, etc. Face orientation according to embodiments of the present invention may allow display orientation determinations when multiple users are present.
  • a smart table and/or information/info desk may include a display facing up, and such a table/desk may use facial orientation(s) according to embodiments of the present invention to allow display orientation determinations when one or multiple users are gathered around the display.
  • Such a table/desk may be non-mobile, and display orientation may be performed according to the flow chart of Figure 7 omitting blocks 705, 707, and 709 so that the "No" output of block 701 feeds directly to block 711.
  • processor 109 may render video (e.g., streamed through transceiver 115 and/or stored in memory 111), photographs (e.g., using a photo editor running on processor 109), web pages (e.g., using a browser application running on processor 109), documents (e.g., using a word processor application running on processor 109), presentation slides (e.g., using a presentation application running on processor 109), spreadsheets (e.g., using a spreadsheet application running on processor 109), etc.
  • content rendered by processor 109 may be distinct from an image obtained from camera 105 so that the content rendered by processor 109 does not include the image obtained from camera 105. Stated in other words, the image obtained from camera 105 (including faces of a user or users viewing display 103) may be excluded from the content oriented on the display. Accordingly, the image provided by camera 105 may be used to orient the distinct content rendered by processor 109.
  • Exemplary embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
  • These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
  • a tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
  • the computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
  • the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as "circuitry," "a module," or variants thereof.

Abstract

Content may be oriented on a display (103) of an electronic device (101) including a camera (105) by obtaining an image (711) from the camera (105), identifying a plurality of faces (719, 723) in the image, and orienting the content on the display (725, 727) responsive to the plurality of faces in the image. Related devices are also discussed.

Description

FACE SCREEN ORIENTATION AND RELATED DEVICES AND METHODS
BACKGROUND
[0001] In some mobile electronic devices (such as tablet computers and/or smartphones), a display may be used in different physical orientations. For example, the display may be used in a portrait mode or in a landscape mode, and the user may orient or rotate the device as desired. The user, however, may be inconvenienced by having to configure the graphics subsystem that renders content (e.g., images, video, text, etc.) on the display for the selected orientation.
[0002] As discussed in U.S. Patent Publication No. 2008/0181502 to Yang entitled "Pattern
Recognition for During Orientation Of A Display Device" and published July 31, 2008, pattern recognition may be used to determine whether a display is being used in a first orientation or in a second orientation with respect to a user. More particularly, face recognition software detects face landmarks (e.g., nose, lips, eyes, etc.) of the user and determines whether the face is oriented by more than a threshold angle from a vertical axis. If the orientation of the user's face is less than the threshold angle, a graphics controller is configured for portrait mode. If the orientation of the user's face is greater than the threshold angle, the graphics controller is configured for landscape mode. Accordingly, the graphics controller renders images viewable with respect to the orientation that the user has selected for using the computing device. Such graphic control, however, may not be able to accommodate multiple users.
SUMMARY
[0003] According to some embodiments, a method may be provided to orient content on a display of an electronic device including a camera. More particularly, the method may include obtaining an image from the camera, identifying a plurality of faces in the image, and orienting content on the display responsive to the plurality of faces in the image. By orienting content on the display responsive to a plurality of faces in the image, the electronic device may orient content on the display to best accommodate viewing angles of a plurality of users who may be viewing the display from different angles, directions, and/or sides of the display. Accordingly, the plurality of faces in the image may be used to determine a display orientation that best accommodates the plurality of users, but the image and the faces therein are not included in the content provided on the display (i.e., the image and the faces therein may be excluded from the content provided on the display).
[0004] Orienting content on the display may include orienting content responsive to an alignment of a largest of the faces identified in the image. Orienting content on the display may include orienting content responsive to a common alignment of a group of the faces in the image. Orienting content on the display may include orienting content responsive to numbers of faces arranged along different sides of the image. Orienting content on the display may include orienting content responsive to numbers of faces located in different areas of the image.
[0005] The electronic device may include an orientation sensor used to determine an orientation of the display. When the display is oriented vertically, content may be oriented on the display responsive to a rotational alignment of the display. When the display is oriented horizontally, content may be oriented on the display responsive to the plurality of faces in the image. Orienting content on the display responsive to a rotational alignment of the display may include orienting content in portrait mode on the display responsive to a first rotational alignment of the display and orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
[0006] According to some other embodiments, a method of orienting content on a display of an electronic device including a camera and an orientation sensor may include determining an orientation of the display. When the display is oriented vertically, content may be oriented on the display responsive to a rotational alignment of the display. When the display is oriented horizontally, an image may be obtained from the camera, and content may be oriented on the display responsive to at least one face in the image. Accordingly, content orientation may be automatically adaptable to select content orientation based on display orientation when the display is being held upright (when all viewers are most likely viewing from the same orientation), and to select content orientation based on user orientation when the display is laid horizontally (when an orientation or orientations of a viewer or viewers is less predictable).
[0007] Orienting content on the display may include orienting content on the display responsive to a plurality of faces in the image. Orienting content on the display may include orienting content responsive to an alignment of a largest of the faces identified in the image. Orienting content on the display may include orienting content responsive to a common alignment of a group of the faces in the image. Orienting content on the display may include orienting content responsive to numbers of faces arranged along different sides of the image. Orienting content on the display may include orienting content responsive to numbers of faces located in different areas of the image.
[0008] Orienting content on the display responsive to a rotational alignment of the display may include orienting content in portrait mode on the display responsive to a first rotational alignment of the display, and orienting content in landscape mode on the display responsive to a second rotational alignment of the display, with the first and second rotational alignments being different.
[0009] According to still other embodiments, an electronic device may include a display and a camera. The electronic device may include a processor coupled to the display and coupled to the camera. The processor may be configured to identify a plurality of faces in an image obtained by the camera, and to orient content on the display responsive to the plurality of faces in the image.
[0010] The processor may be configured to orient content on the display responsive to an alignment of a largest of the faces identified in the image. The processor may be configured to orient content on the display responsive to a common alignment of a group of the faces in the image. The processor may be configured to orient content on the display responsive to numbers of faces arranged along different sides of the image. The processor may be configured to orient content on the display responsive to numbers of faces located in different areas of the image.
[0011] An orientation sensor may be coupled to the processor. The processor may be further configured to determine an orientation of the display responsive to a signal from the orientation sensor, to orient content on the display responsive to a rotational alignment of the display when the display is oriented vertically, and to orient content on the display responsive to the plurality of faces in the image when the display is oriented horizontally. The processor may be configured to orient content on the display responsive to a rotational alignment of the display by orienting content in portrait mode on the display responsive to a first rotational alignment of the display and by orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
[0012] According to yet other embodiments, an electronic device may include a display, a camera, and an orientation sensor. The electronic device may also include a processor coupled to the display, the camera, and the orientation sensor. The processor may be configured to determine an orientation of the display responsive to a signal from the orientation sensor, to orient content on the display responsive to a rotational alignment of the display when the display is oriented vertically, and to orient content on the display responsive to at least one face in an image obtained by the camera when the display is oriented horizontally.
[0013] The processor may be configured to orient content on the display responsive to a plurality of faces in the image when the display is oriented horizontally. The processor may be configured to orient content on the display responsive to an alignment of a largest of the faces identified in the image. The processor may be configured to orient content on the display responsive to a common alignment of a group of the faces in the image. The processor may be configured to orient content on the display responsive to numbers of faces arranged along different sides of the image. The processor may be configured to orient content on the display responsive to numbers of faces located in different areas of the image.
[0014] The processor may be configured to orient content on the display responsive to a rotational alignment of the display by orienting content in portrait mode on the display responsive to a first rotational alignment of the display and by orienting content in landscape mode on the display responsive to a second rotational alignment of the display with the first and second rotational alignments being different.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiment(s) of the invention. In the drawings:
[0016] Figure 1 is a plan view of an electronic device according to some embodiments;
[0017] Figure 2 is a block diagram of the electronic device of Figure 1 according to some embodiments;
[0018] Figures 3A1, 3A2, 3B1, 3B2, 4A1, 4A2, 4B1, 4B2, 5A1, 5A2, 5B1, 5B2, 6A1, 6A2, 6B1, and 6B2 are corresponding camera images and display outputs according to some embodiments; and
[0019] Figure 7 is a flow chart illustrating operations of methods according to some embodiments.
DETAILED DESCRIPTION
[0020] In an electronic device (e.g., a mobile radiotelephone, a smartphone, a tablet personal computer, a digital camera, etc.) including a rectangular display (e.g., a rectangular LCD display, a rectangular touch screen display, etc.), an image may be oriented on the display in either a landscape mode (with width greater than height) or a portrait mode (with height greater than width). Moreover, the electronic device may automatically select the display mode responsive to an orientation of the device relative to the ground and/or floor.
[0021] For example, the electronic device may include one or more orientation sensors (e.g., accelerometers) configured to detect a direction of a gravitational pull on the electronic device. A processor may be coupled to the accelerometer(s) and to the display, and the processor may be configured to determine an orientation of the electronic device relative to the ground and/or floor.
Accordingly, the processor may be configured to render an image on the display in the landscape mode when the electronic device is oriented with the widest dimension of the display substantially horizontal (relative to the ground/floor), and the processor may be configured to render the image on the display in the portrait mode when the electronic device is oriented with the narrowest dimension of the display substantially horizontal (relative to the ground/floor). Use of accelerometers to determine image orientation, however, may not be suitable when the electronic device is laid down (e.g., on a table or desk) so that the display faces up, when the electronic device is used in a low gravity environment, when the display is viewed from an odd position, etc.
[0022] According to some embodiments of the present invention, a digital camera may be provided in the electronic device, and the digital camera may be used to determine an orientation of the electronic device relative to a user or users viewing the display. Accordingly, the digital camera may be used to determine whether to display an image on the display in a landscape, portrait, or other mode. A processor coupled to the camera, for example, may be configured to determine an orientation of the display relative to a user(s) face.
[0023] According to some embodiments of the present invention shown in the plan view of Figure 1 and the block diagram of Figure 2, a mobile electronic device 101 (such as a tablet personal computer or tablet PC) may include a display 103 (e.g., a liquid crystal display or LCD, a touch sensitive display, etc.) and camera lens 105a (of camera 105) on a same surface of electronic device 101 so that camera lens 105a is configured to capture images of a user or users viewing display 103. More particularly, camera lens 105a may be oriented such that its field of view (FOV) is likely to include a face or faces of a user or users viewing display 103. Camera 105 (including camera lens 105a), for example, may be provided as a digital camera used for video telephony in a tablet PC or smartphone.
[0024] Electronic device 101 may also include user interface 107 including user interface elements such as speaker 107a, microphone 107b, touch input 107c, etc. Touch input 107c, for example, may include a keypad, a touch pad, a dial, a joystick, a touch sensitive surface of display 103, etc. While particular user interface elements are illustrated by way of example, illustrated interface elements may be omitted and/or other user interface elements may be included. For example, speaker 107a and/or microphone 107b may be omitted if radiotelephone functionality is not provided by mobile electronic device 101.
[0025] In addition, electronic device 101 may include orientation sensor(s) 117 coupled to processor 109, and orientation sensor(s) 117 (e.g., gravitational sensors) may be configured to detect a physical orientation of electronic device 101 (and display 103 thereof) relative to a ground/floor plane. For example, three gravitational sensors (for the x-axis, y-axis, and z-axis of electronic device 101) may be configured to detect an orientation of electronic device 101 in three dimensions relative to gravitational forces thereon (perpendicular to a ground/floor plane). The x-axis and y-axis of electronic device 101 (shown in Figure 1) may define axes of the surface of Figure 1 (including display 103 and camera lens 105a), and the z-axis of electronic device 101 may define an axis perpendicular with respect to the surface of Figure 1. By way of example, orientation sensor(s) 117 may include one or more of a mercury switch(es), an accelerometer(s) (such as a microelectromechanical accelerometer(s)), a gyroscope(s), a magnetometer(s), etc.
[0026] As shown in Figure 2, processor 109 may be coupled to each of display 103, digital camera 105, user interface 107, memory 111, wireless transceiver 115 (e.g., cellular radiotelephone transceiver, Bluetooth transceiver, WiFi transceiver, etc.), and orientation sensor(s) 117. Elements of Figures 1 and 2, however, may be omitted if not required for functionality of electronic device 101. For example, wireless transceiver 115 may be omitted if wireless communications are not supported by electronic device 101. Processor 109 may be configured to control functionality of electronic device 101 using instructions/information stored in memory 111 and/or received through transceiver 115 to provide one or more functionalities such as mobile telephony, mobile video telephony, internet browsing, text messaging, e-mail, document generation/display, video/audio reproduction/recording, etc.
[0027] According to some embodiments of the present invention, processor 109 may be configured to control an orientation of content (e.g., an image, video, text, etc.) on display 103 responsive to a physical orientation of display 103 (relative to ground) determined using orientation sensor(s) 117 and/or responsive to information received from camera 105. For example, if display 103 is oriented in a plane that is substantially vertical (e.g., the z-axis is substantially parallel to the ground/floor) as determined using orientation sensor(s) 117, processor 109 may orient content on display 103 according to the orientation of the x-axis and y-axis relative to the ground/floor. If the x-axis of display 103 is substantially horizontal relative to the ground/floor (e.g., substantially orthogonal with respect to a gravitational pull on device 101) and the y-axis of display 103 is substantially vertical relative to ground (e.g., substantially aligned with a gravitational pull on device 101), processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode). If the y-axis of display 103 is substantially horizontal relative to ground (e.g., substantially orthogonal with respect to a gravitational pull on device 101) and the x-axis of display 103 is substantially vertical relative to ground (e.g., substantially aligned with a gravitational pull on device 101), processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode). When display 103 is oriented in a plane that is substantially vertical, any user is very likely viewing display 103 from an upright position (e.g., sitting or standing up and looking forward), and processor 109 may thus accurately predict whether portrait mode or landscape mode is most appropriate based on a rotational alignment of the x-axis and the y-axis.
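The gravity-based mode selection described in paragraph [0027] can be sketched as a short function over normalized accelerometer readings. The function name and the 0.8 tilt threshold are illustrative assumptions, not values from the application:

```python
import math

def choose_mode_from_gravity(gx, gy, gz, tilt_threshold=0.8):
    """Pick a display mode from gravity components along the device's
    x-, y-, and z-axes. Returns "landscape" when the x-axis is roughly
    horizontal and the y-axis roughly vertical, "portrait" for the
    opposite alignment, and "use_camera" when the display lies roughly
    face-up or face-down (z-axis aligned with gravity)."""
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    gx, gy, gz = gx / norm, gy / norm, gz / norm
    if abs(gz) > tilt_threshold:
        # Display is roughly horizontal: gravity cannot tell us where a
        # viewer is, so defer to face detection on the camera image.
        return "use_camera"
    # Display is roughly vertical: the device axis most aligned with
    # gravity is the vertical axis of the content.
    return "landscape" if abs(gy) > abs(gx) else "portrait"
```

The "use_camera" branch corresponds to the horizontal-display case discussed in paragraph [0028] below.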
[0028] If display 103 is oriented in a plane that is substantially horizontal (relative to ground with the z-axis substantially aligned with a gravitational pull on device 101) as determined by orientation sensor(s) 117, however, orientation sensor(s) 117 may not provide sufficient information to choose an orientation for content on display 103. When a tablet PC is laid on a table or desk with display 103 facing up, for example, a gravitational orientation of electronic device 101 is not necessarily indicative of an orientation of a user viewing display 103. Stated in other words, when display 103 is oriented in a plane that is substantially horizontal, a user/users may be looking down from any side of display 103, and a direction/side from which a user/users is/are viewing display 103 is less predictable. Accordingly, processor 109 may use input from camera 105 to determine how to orient content on display 103.
[0029] More particularly, camera 105 may capture an image including a face or faces of a user or users, and processor 109 may use image recognition software to identify the face or faces and orientations/locations/features thereof. Image recognition (and more particularly facial image recognition) software is discussed, for example, in U.S. Publication Nos. 2008/0239131 and
2008/0181502, the disclosures of which are hereby incorporated herein in their entirety by reference. If only a single face is identified in the image captured by camera 105, processor 109 may be configured to render content on display 103 in an orientation corresponding to an orientation of the face. If the image includes a single face substantially aligned with the y-axis as shown in Figure 3A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis of display 103 and with vertical elements of the content aligned with the y-axis of display 103 (e.g., in landscape mode) as shown in Figure 3A2. If the image includes a single face substantially aligned with the x-axis as shown in Figure 3B1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis of display 103 and with vertical elements of the content aligned with the x-axis of display 103 (e.g., in portrait mode) as shown in Figure 3B2.
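The single-face mapping of Figures 3A1-3B2 reduces to classifying the face's rotation within the camera image. The sketch below assumes an angle convention in which 0 degrees means the face is upright along the image's y-axis; the convention and names are illustrative:

```python
def mode_from_face_angle(face_angle_deg):
    """Map a detected face's rotation to a display mode: faces aligned
    with the y-axis (near 0 or 180 degrees) yield landscape mode
    (Figures 3A1/3A2); faces aligned with the x-axis (near 90 or 270
    degrees) yield portrait mode (Figures 3B1/3B2)."""
    a = face_angle_deg % 360
    # Within 45 degrees of the y-axis in either direction -> landscape.
    if min(a, 360 - a) <= 45 or abs(a - 180) <= 45:
        return "landscape"
    return "portrait"
```

Real face detectors report orientation in various ways, so this is only a sketch of the decision, not of any particular recognition library.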
[0030] If two faces are identified in the image captured by camera 105 and the two faces have different orientations, processor 109 may be configured to render content on display 103 according to relative sizes of the two faces in the image and/or relative distances of the two faces from electronic device 101. For example, a smaller face in the image (occupying a lesser area of the image, having a lesser width/height in the image, occupying fewer pixels, etc.) may be assumed to be more distant than a larger face in the image (occupying a greater area of the image, having a greater width/height in the image, occupying more pixels, etc.), and processor 109 may orient content on display 103 to accommodate the closer user. If the larger face in the image is substantially aligned with the y-axis as shown in Figure 4A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 4A2. If the larger face in the image is substantially aligned with the x-axis as shown in Figure 4B1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 4B2. While use of size of faces is discussed in the context of two faces, a largest of any number of faces may be used to orient the image on display 103.
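A sketch of the larger/closer-face rule of paragraph [0030], assuming each detected face is reported as a (width, height, angle) tuple in pixels and degrees; the tuple layout is a hypothetical convention for illustration:

```python
def dominant_face(faces):
    """Return the orientation angle of the largest face by pixel area,
    or None if no faces were detected. The largest face is assumed to
    belong to the closest viewer, per paragraph [0030]."""
    if not faces:
        return None
    # Compare faces by bounding-box area (width * height in pixels).
    w, h, angle = max(faces, key=lambda f: f[0] * f[1])
    return angle
```

The returned angle could then be mapped to landscape or portrait mode in the same way as for a single face.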
[0031] If more than two faces with different orientations are identified in the image captured by camera 105, processor 109 may be configured to render content on display 103 according to an orientation most suitable for the largest number of users. According to some embodiments, if more faces in the image are substantially aligned with the y-axis of electronic device 101 (e.g., two faces of users looking from end 101a) than are aligned with the x-axis of electronic device 101 (e.g., one face of a user looking from side 101b) as shown in Figure 5A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 5A2. On the other hand, if more faces in the image are substantially aligned with the x-axis of electronic device 101 (e.g., two faces of users looking from side 101b) than are aligned with the y-axis of electronic device 101 (e.g., one face of a user looking from end 101a) as shown in Figure 5B1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 5B2.
[0032] According to other embodiments of the present invention, processor 109 may determine an orientation to render content on display 103 according to numbers of faces arranged along different sides and/or arranged in different quadrants of the image generated by camera 105. If more faces in the image are substantially aligned along a bottom of the image (e.g., three faces of users arranged along the bottom) than are arranged along a side of the image (e.g., one face of a user arranged along the left side) as shown in Figure 6A1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode) as shown in Figure 6A2. On the other hand, if more faces in the image are aligned along a side of the image (e.g., three faces of users arranged along the left side) than are arranged along a bottom of the image (e.g., one face of a user arranged along the bottom) as shown in Figure 6B1, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode) as shown in Figure 6B2.
[0033] According to other embodiments, processor 109 may use images of Figures 6A1 and 6B1 to achieve the results of Figures 6A2 and 6B2 by counting faces substantially appearing in quadrants of the image. Regarding the image of Figure 6A1, for example, processor 109 may determine that there are 0 faces in the first quadrant, 1 face in the second quadrant, 1 face in the third quadrant, and 2 faces in the fourth quadrant. One face in the 1st and 2nd quadrants indicates one face along the top of the image, two faces in the 2nd and 3rd quadrants indicates two faces along the left of the image, three faces in the 3rd and 4th quadrants indicates three faces along the bottom of the image, and two faces in the 4th and 1st quadrants indicates two faces along the right of the image. Accordingly, processor 109 may orient content on display 103 as shown in Figure 6A2. Regarding the image of Figure 6B1, processor 109 may determine that there are 0 faces in the first quadrant, 2 faces in the second quadrant, 1 face in the third quadrant, and 1 face in the fourth quadrant. Two faces in the 1st and 2nd quadrants indicates two faces along the top of the image, three faces in the 2nd and 3rd quadrants indicates three faces along the left of the image, two faces in the 3rd and 4th quadrants indicates two faces along the bottom of the image, and one face in the 4th and 1st quadrants indicates one face along the right of the image. Accordingly, processor 109 may orient content on display 103 as shown in Figure 6B2. Results of Figures 6A1, 6A2, 6B1, and 6B2 may thus be obtained using positions of faces without necessarily determining orientations of particular faces.
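The quadrant-to-side bookkeeping of paragraph [0033] can be reproduced directly. The sketch below uses the quadrant convention implied by the text (Q1 top-right, Q2 top-left, Q3 bottom-left, Q4 bottom-right); the names are illustrative:

```python
def side_counts(q1, q2, q3, q4):
    """Derive per-side face counts from per-quadrant counts: each side
    of the image is covered by two adjacent quadrants."""
    return {
        "top": q1 + q2,
        "left": q2 + q3,
        "bottom": q3 + q4,
        "right": q4 + q1,
    }

def mode_from_quadrants(q1, q2, q3, q4):
    """Orient content toward the side with the most faces: a top or
    bottom majority yields landscape mode, a left or right majority
    yields portrait mode. Ties resolve in dictionary order here, which
    is an arbitrary illustrative choice."""
    sides = side_counts(q1, q2, q3, q4)
    best = max(sides, key=sides.get)
    return "landscape" if best in ("top", "bottom") else "portrait"
```

With the Figure 6A1 counts (0, 1, 1, 2) this reproduces the landscape result of Figure 6A2, and with the Figure 6B1 counts (0, 2, 1, 1) the portrait result of Figure 6B2.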
[0034] Figure 7 is a flow chart illustrating operations of processor 109 as discussed above with respect to Figures 3-6. At block 701, processor 109 determines if a user override has been entered. If a user override has been entered, processor 109 orients content on display 103 according to user input at block 703. If not, processor 109 may determine an orientation of electronic device 101 using orientation sensor(s) 117 at block 705.
[0035] If display 103 is oriented in a substantially vertical plane (relative to the ground/floor) at block 707, processor 109 may orient content on display 103 according to the orientation of the x-axis and the y-axis relative to ground at block 709. More particularly, if the x-axis of display 103 is substantially horizontal relative to ground and the y-axis of display 103 is substantially vertical relative to ground, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the x-axis and with vertical elements of the content aligned with the y-axis (e.g., in landscape mode). If the y-axis of display 103 is substantially horizontal relative to ground and the x-axis of display 103 is substantially vertical relative to ground, processor 109 may orient content on display 103 with horizontal elements of the content aligned with the y-axis and with vertical elements of the content aligned with the x-axis (e.g., in portrait mode).

[0036] If display 103 is oriented in a plane that is substantially horizontal (relative to ground with the z-axis substantially vertical relative to ground) at block 707 (e.g., when a tablet PC is laid on a table or desk), processor 109 may obtain an image from digital camera 105 at block 711 to determine how to orient content on display 103. If no faces are detected in the image at block 715, processor 109 may orient content on display 103 using a default setting at block 717, such as a last display orientation used, a portrait mode, a landscape mode, a best determination available using orientation sensor(s) 117, etc.
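The vertical/horizontal classification of blocks 705-709 can be illustrated with a gravity-vector test of the kind an accelerometer-based orientation sensor 117 might support. The 30-degree tilt threshold and the axis convention (z normal to the screen) are illustrative assumptions; the patent does not specify them.

```python
import math

def display_mode_from_gravity(gx, gy, gz, tilt_threshold_deg=30.0):
    """Classify the display plane from a gravity reading (gx, gy, gz)
    expressed along the display's x/y/z axes (z normal to the screen)."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Angle between the display normal (z-axis) and the gravity direction.
    tilt = math.degrees(math.acos(min(1.0, abs(gz) / g)))
    if tilt < tilt_threshold_deg:
        # Display lies roughly horizontal (device on a table):
        # defer to the camera-based determination of blocks 711-727.
        return "flat"
    # Display is roughly vertical (block 709): align content with the
    # display axis closer to the gravity direction.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```

With this sketch, a device held upright (gravity along the y-axis) yields portrait, a device turned on its side (gravity along the x-axis) yields landscape, and a device lying flat (gravity along the z-axis) yields the "flat" case in which the camera image decides.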
[0037] If a face or faces are detected in the image at block 715, processor 109 may determine how many faces are present in the image. If only one face is detected in the image at block 719, processor 109 may match a display orientation to an orientation of the one face at block 721 as discussed above with respect to Figures 3A1, 3A2, 3B1, and 3B2. If only two faces are detected in the image at blocks 719 and 723, processor 109 may match a display orientation to an orientation of the larger/closer face in the image at block 725 as discussed above with respect to Figures 4A1, 4A2, 4B1, and 4B2. If more than two faces are detected in the image at blocks 719 and 723, processor 109 may match the display orientation to accommodate the largest numbers of users at block 727 as discussed above with respect to Figures 5A1, 5A2, 5B1, and 5B2 and/or with respect to Figures 6A1, 6A2, 6B1, and 6B2.
[0038] As shown in the flow chart of Figure 7, processor 109 may reorient content on display 103 as user override inputs change, as physical orientations of electronic device 101 change, as numbers/positions of a user(s) change, etc. Moreover, while Figure 7 shows an example of operations according to some embodiments of the present invention, other orders of operations may be performed and/or particular operations may be performed individually. For example, operations of Figures 4A1, 4A2, 4B1, and 4B2, operations of Figures 5A1, 5A2, 5B1, and 5B2, and/or operations of Figures 6A1, 6A2, 6B1, and 6B2, may be performed individually according to some embodiments of the present invention.
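The full decision flow of Figure 7 ([0034]-[0037]) can be condensed into one Python function. This is a minimal sketch under stated assumptions: face detection and sensor reading happen elsewhere, faces are passed in as hypothetical (size, orientation) pairs, and the tie-breaking among equally common orientations is unspecified by the patent.

```python
def orient_display(user_override, device_is_flat, sensor_orientation,
                   faces, default="landscape"):
    """Decide content orientation following the order of Figure 7.

    user_override:      user-chosen orientation string, or None (block 701)
    device_is_flat:     True when the display lies horizontally (block 707)
    sensor_orientation: orientation from orientation sensor(s) (block 709)
    faces:              detected faces as (size, orientation) pairs
    default:            fallback when no faces are found (block 717)
    """
    if user_override is not None:       # blocks 701/703: user input wins
        return user_override
    if not device_is_flat:              # block 709: sensor-based orientation
        return sensor_orientation
    if not faces:                       # block 717: default setting
        return default
    if len(faces) == 1:                 # block 721: match the single face
        return faces[0][1]
    if len(faces) == 2:                 # block 725: follow the larger/closer face
        return max(faces, key=lambda f: f[0])[1]
    # block 727: follow the orientation shared by the most faces
    orientations = [f[1] for f in faces]
    return max(set(orientations), key=orientations.count)
```

For example, with the device flat on a table and two faces detected, the sketch follows the larger face, mirroring Figures 4A1-4B2.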
[0039] According to embodiments of the present invention, an image from camera 105 (on a same side/surface of device 101 as display 103) may be used for display orientation control, where a display orientation is provided to accommodate a largest number of users based on facial orientations/positions in the image. More particularly, camera 105 may be positioned to capture an image(s) of a face(s) of a user(s) viewing display 103. Processor 109 may be configured to recognize faces and orientations/features thereof using face recognition software, and to determine whether to render an image on display 103 in landscape or portrait mode responsive to the orientation(s) of the face(s). According to some embodiments, processor 109 may count faces on the 4 sides of the image (corresponding to the 4 sides of electronic device 101), and the side with the highest count may determine the orientation of content on display 103 as discussed above with respect to Figures 6A1, 6A2, 6B1, and 6B2. Face orientation mode may be activated all the time, activated only if an orientation sensor(s) gives specific readouts (e.g., if electronic device 101 is laid on a table with display 103 up), and/or activated when not overridden by user input. If no faces are found in a face orientation mode, processor 109 may default to a last display orientation used, to a portrait or landscape mode, to a best determination available using orientation sensor(s) 117, etc. Face orientation according to embodiments of the present invention may allow display orientation determinations when multiple users are present.
[0040] While display orientation determinations using facial orientations have been discussed with regard to mobile electronic devices (e.g., mobile radiotelephones, tablet computers, etc.), display orientation according to embodiments of the present invention may be provided for other device types, such as non-mobile electronic devices. By way of example, a smart table and/or information/info desk may include a display facing up, and such a table/desk may use facial orientation(s) according to embodiments of the present invention to allow display orientation determinations when one or multiple users are gathered around the display. Such a table/desk may be non-mobile, and display orientation may be performed according to the flow chart of Figure 7 omitting blocks 705, 707, and 709 so that the "No" output of block 701 feeds directly to block 711.
[0041] As used herein, the term content includes any visual output rendered on display 103 by processor 109. By way of example, processor 109 may render video (e.g., streamed through transceiver 115 and/or stored in memory 111), photographs (e.g., using a photo editor running on processor 109), web pages (e.g., using a browser application running on processor 109), documents (e.g., using a word processor application running on processor 109), presentation slides (e.g., using a presentation application running on processor 109), spreadsheets (e.g., using a spreadsheet application running on processor 109), etc. According to some embodiments, content rendered by processor 109 may be distinct from an image obtained from camera 105 so that the content rendered by processor 109 does not include the image obtained from camera 105. Stated in other words, the image obtained from camera 105 (including faces of a user or users viewing display 103) may be excluded from the content oriented on the display. Accordingly, the image provided by camera 105 may be used to orient the distinct content rendered by processor 109.
[0042] Various embodiments are described fully herein with reference to the accompanying figures, in which various embodiments are shown. This invention may, however, be embodied in many alternate forms and should not be construed as limited to the embodiments set forth herein. Accordingly, while the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are described in detail herein. It should be understood, however, that there is no intent to limit the invention to the particular forms disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the claims. Like numbers refer to like elements throughout the description of the figures.
[0043] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," "including," "have," "having" or variants thereof when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Moreover, when an element is referred to as being "responsive" or "connected" to another element or variants thereof, it can be directly responsive or connected to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly responsive" or "directly connected" to another element or variants thereof, there are no intervening elements present. As used herein the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as "/".
[0044] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element without departing from the teachings of the disclosure. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
[0045] Exemplary embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means
(functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
[0046] These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
[0047] A tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
[0048] The computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as "circuitry," "a module" or variants thereof.
[0049] It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated.

[0050] Many different embodiments have been disclosed herein, in connection with the above description and the drawings. It will be understood that it would be unduly repetitious and obfuscating to literally describe and illustrate every combination and subcombination of these embodiments. Accordingly, the present specification, including the drawings, shall be construed to constitute a complete written description of all combinations and subcombinations of the embodiments described herein, and of the manner and process of making and using them, and shall support claims to any such combination or subcombination.
[0051] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0052] In the specification, there have been disclosed embodiments of the invention and, although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation. Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present invention. All such variations and modifications are intended to be included herein within the scope of the present invention.

Claims

What Is Claimed Is:
1. A method of orienting content on a display (103) of an electronic device (101) including a camera (105), the method comprising:
obtaining an image (711) from the camera (105);
identifying a plurality of faces (719, 723) in the image; and
orienting content on the display (725, 727) responsive to the plurality of faces in the image.
2. The method according to claim 1 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to an alignment of a largest of the faces (725) identified in the image.
3. The method according to claim 1 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to a common alignment of a group of the faces (727) in the image.
4. The method according to claim 1 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to numbers of faces arranged along different sides of the image.
5. The method according to claim 1 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to numbers of faces located in different areas of the image.
6. The method according to claim 1 wherein the electronic device (101) includes an orientation sensor (117), the method further comprising:
determining an orientation (705) of the display (103); and
when the display (103) is oriented vertically, orienting content (709) on the display (103) responsive to a rotational alignment of the display (103);
wherein orienting content on the display (725, 727) responsive to the plurality of faces in the image comprises orienting content on the display (725, 727) responsive to the plurality of faces in the image when the display (103) is oriented horizontally.
7. The method according to Claim 6 wherein orienting content (709) on the display (103) responsive to a rotational alignment of the display (103) comprises orienting content (709) in portrait mode on the display (103) responsive to a first rotational alignment of the display (103) and orienting content (709) in landscape mode on the display (103) responsive to a second rotational alignment of the display (103) wherein the first and second rotational alignments are different.
8. A method of orienting content on a display (103) of an electronic device (101) including a camera (105) and an orientation sensor (117), the method comprising:
determining an orientation (705) of the display (103);
when the display (103) is oriented vertically, orienting content (709) on the display (103) responsive to a rotational alignment of the display (103); and
when the display (103) is oriented horizontally, obtaining an image from the camera (105) and orienting content (721, 725, 727) on the display (103) responsive to at least one face in the image.
9. The method according to Claim 8 wherein orienting content on the display (103) comprises orienting content on the display (103) responsive to a plurality of faces in the image.
10. The method according to claim 9 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to an alignment of a largest of the faces (725) identified in the image.
11. The method according to claim 9 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to a common alignment of a group of the faces (727) in the image.
12. The method according to claim 9 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to numbers of faces arranged along different sides of the image.
13. The method according to claim 9 wherein orienting content (725, 727) on the display (103) comprises orienting content responsive to numbers of faces located in different areas of the image.
14. The method according to Claim 8 wherein orienting content (709) on the display (103) responsive to a rotational alignment of the display (103) comprises orienting content (709) in portrait mode on the display (103) responsive to a first rotational alignment of the display (103) and orienting content (709) in landscape mode on the display (103) responsive to a second rotational alignment of the display (103) wherein the first and second rotational alignments are different.
15. An electronic device (101) comprising:
a display (103);
a camera (105); and
a processor (109) coupled to the display (103) and coupled to the camera (105), wherein the processor (109) is configured to identify a plurality of faces in an image obtained by the camera (105), and to orient content on the display responsive to the plurality of faces in the image.
16. The electronic device (101) according to claim 15 wherein the processor (109) is configured to orient content on the display (103) responsive to an alignment of a largest of the faces identified in the image.
17. The electronic device (101) according to claim 15 wherein the processor (109) is configured to orient content on the display (103) responsive to a common alignment of a group of the faces in the image.
18. The electronic device (101) according to claim 15 wherein the processor (109) is configured to orient content on the display (103) responsive to numbers of faces arranged along different sides of the image.
19. The electronic device (101) according to claim 15 wherein the processor (109) is configured to orient content on the display (103) responsive to numbers of faces located in different areas of the image.
20. The electronic device (101) according to claim 15 further comprising:
an orientation sensor (117) coupled to the processor (109); wherein the processor (109) is further configured to determine an orientation of the display (103) responsive to a signal from the orientation sensor (117), to orient content on the display (103) responsive to a rotational alignment of the display (103) when the display (103) is oriented vertically, and to orient content on the display responsive to the plurality of faces in the image when the display (103) is oriented horizontally.
21. The electronic device (101) according to Claim 20 wherein the processor (109) is configured to orient content on the display (103) responsive to a rotational alignment of the display (103) by orienting content in portrait mode on the display (103) responsive to a first rotational alignment of the display (103) and by orienting content in landscape mode on the display (103) responsive to a second rotational alignment of the display (103) wherein the first and second rotational alignments are different.
22. An electronic device (101) comprising:
a display (103);
a camera (105);
an orientation sensor (117); and
a processor (109) coupled to the display (103), the camera (105), and the orientation sensor (117), wherein the processor (109) is configured to determine an orientation of the display (103) responsive to a signal from the orientation sensor (117), to orient content on the display (103) responsive to a rotational alignment of the display (103) when the display (103) is oriented vertically, and to orient content on the display (103) responsive to at least one face in an image obtained by the camera (105) when the display (103) is oriented horizontally.
23. The electronic device (101) according to Claim 22 wherein the processor (109) is configured to orient content on the display (103) responsive to a plurality of faces in the image when the display (103) is oriented horizontally.
24. The electronic device (101) according to claim 23 wherein the processor (109) is configured to orient content on the display (103) responsive to an alignment of a largest of the faces identified in the image.
25. The electronic device (101) according to claim 23 wherein the processor (109) is configured to orient content on the display (103) responsive to a common alignment of a group of the faces in the image.
26. The electronic device (101) according to claim 23 wherein the processor (109) is configured to orient content on the display (103) responsive to numbers of faces arranged along different sides of the image.
27. The electronic device (101) according to claim 23 wherein the processor (109) is configured to orient content on the display (103) responsive to numbers of faces located in different areas of the image.
28. The electronic device (101) according to Claim 22 wherein the processor (109) is configured to orient content on the display (103) responsive to a rotational alignment of the display (103) by orienting content in portrait mode on the display (103) responsive to a first rotational alignment of the display (103) and by orienting content in landscape mode on the display (103) responsive to a second rotational alignment of the display (103) wherein the first and second rotational alignments are different.
PCT/SE2010/051337 2010-08-30 2010-12-06 Face screen orientation and related devices and methods WO2012030265A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US37816710P 2010-08-30 2010-08-30
US61/378,167 2010-08-30

Publications (1)

Publication Number Publication Date
WO2012030265A1 true WO2012030265A1 (en) 2012-03-08



Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050104848A1 (en) * 2003-09-25 2005-05-19 Kabushiki Kaisha Toshiba Image processing device and method
US20060265442A1 (en) * 2003-10-01 2006-11-23 Saju Palayur Method and system for controlling a user interface a corresponding device and software devices for implementing the method
WO2007147449A1 (en) * 2006-06-21 2007-12-27 Sony Ericsson Mobile Communications Ab Device and method for adjusting image orientation
US20080152199A1 (en) * 2006-12-21 2008-06-26 Sony Ericsson Mobile Communications Ab Image orientation for display
US20080181502A1 (en) 2007-01-31 2008-07-31 Hsin-Ming Yang Pattern recognition for during orientation of a display device
US20080239131A1 (en) 2007-03-28 2008-10-02 Ola Thorn Device and method for adjusting orientation of a data representation displayed on a display
US20080266326A1 (en) * 2007-04-25 2008-10-30 Ati Technologies Ulc Automatic image reorientation
US20100066763A1 (en) * 2008-09-12 2010-03-18 Gesturetek, Inc. Orienting displayed elements relative to a user
US20100066667A1 (en) * 2008-09-12 2010-03-18 Gesturetek, Inc. Orienting a displayed element relative to a user
US20100295782A1 (en) * 2009-05-21 2010-11-25 Yehuda Binder System and method for control based on face ore hand gesture detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHRIS HALL: "AppBox Pro", 8 September 2009 (2009-09-08), pages 1 - 16, XP002637738, Retrieved from the Internet <URL:http://www.148apps.com/reviews/appbox-pro/> [retrieved on 20110518] *
EDWARD C. BAIG AND BOB "DR. MAC" LEVITUS: "iPhone for Dummies, 3rd ed.", 1 January 2009, WILEY, pages: 8, 126, XP002638168 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130182014A1 (en) * 2012-01-12 2013-07-18 Jieun Park Mobile terminal and control method thereof
US9424798B2 (en) * 2012-01-12 2016-08-23 Lg Electronics Mobile terminal and control method thereof
US9342143B1 (en) * 2012-04-17 2016-05-17 Imdb.Com, Inc. Determining display orientations for portable devices
US10186018B2 (en) 2012-04-17 2019-01-22 Imdb.Com, Inc. Determining display orientations for portable devices
US11100608B2 (en) 2012-04-17 2021-08-24 Imdb, Inc. Determining display orientations for portable devices
Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 10795481
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 10795481
Country of ref document: EP
Kind code of ref document: A1