US20210096611A1 - Camera and flashlight operation in hinged device - Google Patents
- Publication number: US20210096611A1 (application US16/719,740)
- Authority: United States (US)
- Prior art keywords
- display
- computing device
- camera
- pose
- user
- Prior art date
- Legal status: Granted (the status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06F1/1647 — display arrangement including at least an additional display
- G06F1/1681 — details related solely to hinges
- G03B17/48 — cameras adapted for combination with other photographic or optical apparatus
- G03B30/00 — camera modules specially adapted for being embedded in other devices, e.g. mobile phones
- G06F1/1618 — folding flat displays rotatable up to the back of the other housing, e.g. by 360° rotation
- G06F1/1641 — display formed by a plurality of foldable display components
- G06F1/1652 — flexible or rollable display
- G06F1/1677 — detecting open or closed state or particular intermediate positions of the enclosure
- G06F1/1686 — integrated I/O peripheral being an integrated camera
- H04M1/0214 — foldable telephones
- H04M1/0243 — changing operational status using the relative angle between housings
- H04N21/4316 — displaying supplemental content in a region of the screen
- H04N21/47 — end-user applications
- H04N23/53 — constructional details of electronic viewfinders
- H04N23/60 — control of cameras or camera modules
- H04N23/617 — upgrading or updating of programs or applications for camera control
- H04N23/631 — graphical user interfaces for controlling image capture or setting capture parameters
- H04N23/632 — GUIs for displaying or modifying preview images prior to image capturing
- H04N23/70 — circuitry for compensating brightness variation in the scene
- H04N5/44591
- G03B2217/002 — details of arrangement of components in or on camera body
- H04M2250/52 — subscriber devices including functional features of a camera
- Some mobile electronic devices, such as smart phones and tablets, have a monolithic handheld form in which a display occupies substantially an entire front side of the device.
- Other devices, such as laptop computers, include a hinge that connects a display to other hardware, such as a keyboard and cursor controller (e.g. a track pad).
- One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism including one or more sensors.
- the computing device further includes a logic device, and a storage device holding instructions executable by the logic device to execute a camera application, to receive sensor data from the one or more sensors, and based at least in part on the sensor data received from the one or more sensors, determine a device pose.
- the instructions are further executable to output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
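The pose-based display routing described in this example can be sketched as follows; the function, enum, and display names are illustrative assumptions, not taken from the patent:

```python
from enum import Enum

class Facing(Enum):
    USER = "user-facing"
    WORLD = "world-facing"

def route_camera_ui(camera_facing: Facing) -> str:
    """Route the camera application UI to a display based on the
    pose-derived facing direction of the camera (hypothetical sketch).

    The camera is on the second portion, so when the pose indicates
    the camera faces the world, the UI goes to the first display
    (which remains visible to the user), and vice versa."""
    if camera_facing is Facing.WORLD:
        return "first display"
    return "second display"
```

For example, `route_camera_ui(Facing.WORLD)` returns `"first display"`, matching the claim's world-facing case.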
- Another example provides a computing device comprising a first portion, a second portion comprising a light source, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism comprising one or more sensors.
- the computing device further includes a logic device, and a storage device holding instructions executable by the logic device to execute a flashlight application, to receive sensor data from the one or more sensors, and based at least in part on the sensor data received from the one or more sensors, determine a pose of the computing device.
- the instructions are further executable to control the light source to emit light at a relatively lower brightness when the pose of the computing device is indicative of the light source being user-facing, and control the light source to emit light at a relatively higher brightness when the pose of the computing device is indicative of the light source being world-facing.
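The flashlight claim can be sketched similarly; the concrete 0.25/1.0 brightness levels below are illustrative assumptions, since the text only specifies "relatively lower" and "relatively higher" brightness:

```python
def flashlight_brightness(user_facing: bool,
                          low: float = 0.25,
                          high: float = 1.0) -> float:
    """Return a relative brightness (0..1) for the light source.

    Dimmer when the pose indicates the light source is user-facing
    (so the user is not dazzled), brighter when it is world-facing.
    The low/high values are hypothetical, not from the patent."""
    return low if user_facing else high
```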
- FIGS. 1A-1F show different poses of an example multi-display computing device.
- FIG. 2 schematically shows example locations of integrated hardware devices for the computing device of FIGS. 1A-1F .
- FIG. 3 schematically illustrates example camera user interfaces displayed based upon computing device pose.
- FIGS. 4A-4C depict an example use scenario in which a camera application is launched from a lock screen, followed by the device being moved from a dual display mode to a flip mode.
- FIGS. 5A-5G depict example use scenarios in which a camera application is launched from an unlocked state, followed by the device being moved from a dual display mode to a flip mode and then rotated.
- FIGS. 6A-6E depict an example use scenario in which a device is switched from a self-facing camera mode to an outward facing camera mode based upon an angle of a hinge connecting a first portion and a second portion of the computing device and also on a camera orientation relative to a user.
- FIGS. 7A-7C illustrate spanning of a camera application across two displays.
- FIGS. 8A-8C illustrate an example use scenario in which a third-party camera application that comprises a control to select between a front-facing camera and a rear-facing camera is operated on the computing device of FIGS. 1A-1F .
- FIGS. 9A-9C illustrate another example use scenario in which a third-party camera application that comprises a control to select between a front-facing camera and a rear-facing camera is operated on the computing device of FIGS. 1A-1F .
- FIG. 10 illustrates example poses of the computing device of FIGS. 1A-1F used to control a flashlight.
- FIG. 11 illustrates an example invocation mechanism for launching a flashlight application.
- FIG. 12 illustrates an initial brightness of a flashlight when the computing device of FIGS. 1A-1F is in a double portrait configuration.
- FIGS. 13A-13D illustrate the control of a brightness of a flashlight based upon moving the computing device of FIGS. 1A-1F between the double portrait configuration and the flip configuration.
- FIGS. 14A-14D illustrate the control of a brightness of the flashlight as a function of flashlight orientation compared to a user.
- FIGS. 15A-15C illustrate the control of a brightness of the flashlight based upon launching a flashlight application when the flashlight is in an outward-facing orientation.
- FIGS. 16A-16E illustrate the control of a brightness of the flashlight based upon moving the flashlight between a user-facing orientation and an outward-facing orientation.
- FIGS. 17A-17B illustrate a brightness of a flashlight dimming as an orientation of the flashlight transitions from world-facing to user-facing.
- FIGS. 18A-18D illustrate changes in flashlight brightness and a brightness indicator based upon changes in hinge angle of the computing device of FIGS. 1A-1F .
- FIGS. 19A-19E illustrate changes in flashlight brightness based upon changes in hinge angle of the computing device of FIGS. 1A-1F .
- FIG. 20 is a block diagram illustrating an example computing system.
- an example dual-display device may comprise a camera located on a display-side of one of the portions, and sensor data related to a hinge angle between the display portions may be used as input to control the operation of the camera and/or flashlight.
- aspects of controlling the camera include determining on which display to display user interface features, providing for the support of third-party applications that are configured for use with dual front/rear facing camera systems, and controlling flashlight brightness, among other features.
- each device portion may include a six degree of freedom (6DOF) motion sensor.
- the device also may include a sensor that senses a fully closed and/or fully open position, such as a Hall effect sensor on one portion to sense a magnet located on the other portion. Based on data from such sensors, the computing device may determine a likely orientation of the camera relative to the user (e.g. world-facing or user-facing), and output an application user interface, such as a camera stream, user interface controls, etc., based upon a determined computing device pose.
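One way to derive a hinge-angle estimate from the two portions' motion sensors is to compare the gravity direction each accelerometer reports. The sketch below is a simplification under stated assumptions (hinge axis along y, so only the x/z components matter; a real device would also fuse gyroscope data). Note that gravity alone cannot distinguish the fully closed (0°) pose from the fully folded-back (360°) pose, which is exactly what the Hall effect sensor disambiguates:

```python
import math

def hinge_angle_deg(g_first, g_second):
    """Estimate the hinge angle (degrees) between the two display
    surfaces from per-portion gravity vectors (x, y, z), assuming
    the hinge axis lies along y (hypothetical convention).

    Each gravity vector is projected onto the x-z plane; identical
    projections mean the portions are coplanar (180 degrees), and
    the relative rotation shifts the angle from there."""
    a1 = math.atan2(g_first[2], g_first[0])
    a2 = math.atan2(g_second[2], g_second[0])
    rel = math.degrees(a2 - a1)
    # 0 -> fully closed, 180 -> flat open; 0 and 360 are ambiguous
    # from gravity alone and require the Hall effect sensor.
    return (180.0 + rel) % 360.0
```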
- FIGS. 1A-1F show various poses in which an example dual-screen computing device 100 may be held.
- Computing device 100 includes a first portion 102 and a second portion 104 that respectively include a first display 106 and a second display 108 .
- Each of the first display 106 and the second display 108 may comprise a touch sensor configured to sense touches from digits of users, styluses, and/or other objects.
- a hinge 110 connecting the first and second portions 102 and 104 allows the relative orientation between the portions and their displays to be adjusted by rotating one or both portions about the hinge 110 .
- This relative orientation is represented in FIGS. 1A-1B by a variable angle θ measured between the emissive surfaces of first and second displays 106 and 108 .
- first and second portions 102 and 104 form an acute angle θ1
- the first and second portions are further rotated away from each other to form a larger angle θ2 .
- the second portion 104 is folded via hinge 110 behind the first portion 102 , such that a display side of each portion faces outward. From the perspective of a user of computing device 100 , the second display 108 is not visible. In such a configuration, the device may inactivate the display not facing the user.
- the pose of the computing device 100 in FIG. 1C is referred to herein as a “single portrait” configuration, which in some examples comprises a hinge angle θ within a range of 235 to 360 degrees.
- FIG. 1D depicts the computing device 100 after rotating the second portion 180 degrees clockwise from the orientation shown in FIG. 1C .
- the first and second portions 102 and 104 are oriented in a “double portrait” configuration.
- First and second portions 102 and 104 may be rotatable throughout any suitable range of angles.
- the first and second portions 102 and 104 may be rotated in a range up to substantially 360 degrees from a fully open configuration, as shown in FIG. 1C , to a fully closed configuration in which the display side of the first portion 102 faces a display side of the second portion 104 .
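The pose names and hinge-angle ranges above can be combined into a small classifier. The 235-360 degree "single portrait" range comes from the text; the other boundaries below, and the function shape itself, are illustrative assumptions:

```python
def classify_pose(theta: float, hall_closed: bool = False) -> str:
    """Map a hinge angle theta (degrees, 0 = fully closed,
    360 = folded fully back) to a pose name (sketch).

    hall_closed models the Hall effect sensor reading used to
    disambiguate the fully closed pose."""
    if hall_closed or theta <= 5:   # closed tolerance is assumed
        return "closed"
    if theta < 235:                 # boundary below 235 is assumed
        return "double portrait"
    return "single portrait"        # 235-360 degrees per the text
```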
- FIGS. 1A-1D each depict the computing device in a portrait orientation.
- the computing device 100 may be used in a landscape orientation.
- FIG. 1E depicts an example single landscape orientation in which the first display 106 is folded behind (e.g. away from a user holding the computing device 100 ) the second display 108 via the hinge 110 . In such a configuration, either display may face the user and be active. Further, in such a configuration, the display not facing the user may be inactive.
- FIG. 1F depicts an example double landscape orientation.
- the first portion 102 includes a first 6DOF motion sensor 114 configured to measure the pose of the first portion 102 in six degrees of freedom, namely, x, y, z, pitch, roll, and yaw, as well as accelerations and rotational velocities, so as to track the rotational and translational motion of the first portion 102 .
- the second portion 104 includes a second 6DOF motion sensor 116 .
- Any suitable 6DOF motion sensors may be used.
- the first 6DOF motion sensor 114 and the second 6DOF motion sensor 116 may each include one or more accelerometers and one or more gyroscopes.
- each 6DOF motion sensor 114 and 116 may optionally include a magnetometer.
- any other suitable sensor or sensors may be used to detect the relative orientations of each device portion, such as an optical or mechanical encoder incorporated into the hinge of the device.
- the first and second portions 102 and 104 may include a sensor configured to sense when the first and second portions 102 and 104 are in the closed configuration or fully open configuration (0° or 360° rotation of either portion relative to the other portion).
- the sensor takes the form of a Hall effect sensor 120 configured to detect motion of a magnet 122 .
- the sensor may comprise an optical sensor, contact switch, or other suitable sensing mechanism.
- computing device 100 may include other numbers of displays.
- the computing device 100 may take any suitable form, including but not limited to various mobile devices (e.g., a foldable smart phone or tablet).
- the second portion 104 includes a camera 122 and flash 124 located generally at an upper right-hand side of the second display 108 .
- the computing device may be configured to display or modify a user interface in a manner adapted to the pose.
- FIG. 2 shows an example user interface for the computing device 100 of FIGS. 1A-1F when the computing device is in a locked state, and illustrates example locations of integrated device hardware.
- the first portion 102 includes a speaker and the second portion 104 includes a camera and flash, as described in more detail below.
- the first display 106 and the second display 108 each are directed towards a user of the computing device 100 , and each display 106 and 108 displays aspects of the lock screen user interface.
- the lock screen user interface includes a first user interface element 202 that is selectable to invoke a camera application.
- the user interface also includes a second user interface element 204 that is selectable by a user to invoke a flashlight application.
- the user interface elements 202 and 204 may provide faster access to the camera and flashlight, respectively, compared to unlocking the computing device and navigating to an application launcher.
- the user interface elements 202 and 204 are displayed on a bottom, left-hand side of the first display 106 .
- as a user may hold the computing device 100 at or around its periphery, such placement may allow for intuitive selection of a user interface element via touch input without significantly repositioning a hand on the computing device 100 .
- the lock screen user interface also may include a control 206 to change a lock status of the computing device 100 from the locked state to an unlocked state.
- the computing device 100 may comprise a fingerprint sensor and corresponding user interface element 206 indicating the location of the fingerprint sensor.
- the fingerprint sensor may be used to authenticate a user of the computing device 100 and thereby unlock the computing device.
- the fingerprint sensor 206 is located at a periphery of the second display 108 , which may allow for convenient placement of a user's right thumb (for example) while the user holds the computing device with the right hand or with both hands.
- the user interface elements 202 , 204 may be displayed in any other suitable location.
- the fingerprint sensor 206 may have any other suitable location.
- a computing device 100 may utilize any other suitable unlock mechanism (e.g., facial recognition) to change the lock state from locked to unlocked.
- When a user launches an application configured to access a camera stream of the camera 122 (referred to herein as a “camera application”), the computing device 100 automatically detects a likely orientation and direction of the camera 122 . For example, based on sensor data obtained from the first 6DOF motion sensor and the second 6DOF motion sensor, the computing device 100 may determine a relative orientation and/or motion of the second portion 104 relative to the first portion 102 . As the camera 122 is located at a display side of the second portion 104 , a direction of the camera may be detected using data from the first and second 6DOF motion sensors indicating a likely orientation of the camera compared to the user (e.g. user-facing or world-facing).
- Based on the orientation and direction detected for the camera, the computing device 100 mirrors or un-mirrors a camera stream of the camera 122 .
- mirroring and un-mirroring the camera stream may be determined at an application level, rather than a system level.
- a camera application may be posture-aware and configured to automatically switch between a user-facing or world-facing camera mode based upon computing device pose, adjusting mirroring accordingly.
- a camera application (e.g. a third-party camera application) may mirror the camera stream based on whether a user-facing or world-facing camera mode is active, including in instances where the camera mode does not correspond to the device pose.
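Mirroring a user-facing preview (so the stream behaves like a mirror) while leaving a world-facing stream un-mirrored amounts to a horizontal flip. A real implementation would flip a GPU texture or set a camera-pipeline flag; the row-major pixel list below is just a sketch of the idea:

```python
def present_frame(frame, user_facing: bool):
    """Return the frame to display: horizontally mirrored for a
    user-facing preview, unchanged for a world-facing stream.

    `frame` is a row-major 2D list of pixel values (illustrative)."""
    return [row[::-1] for row in frame] if user_facing else frame
```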
- FIG. 3 depicts example camera streams displayed in response to various detected orientations and directions of the camera 122 upon launch of a camera application.
- a user may modify these camera stream configurations, and/or set a desired location and/or orientation of a camera stream, for each of one or more selected device poses.
- the computing device 100 may output an application user interface (camera stream, user interface elements for operating a camera, etc.) to a specific display(s) based upon a lock state, hinge angle, and/or determined pose (position and/or orientation) of the computing device.
- FIG. 4A depicts an example use scenario in which a user launches a camera application from a lock screen user interface via a touch input 402 to user interface element 202 .
- Sensor data received from the first 6DOF motion sensor 114 and the second 6DOF motion sensor 116 indicates that the computing device 100 is in a double portrait configuration.
- the computing device 100 may anticipate a possible movement to world-facing image/video capture when the camera application is launched from a lock screen user interface, and output a camera user interface 404 to the first display 106 , as shown in FIG. 4B .
- the camera user interface 404 remains visible and operable to the user without the computing device 100 swapping the application user interface to a different display screen, as shown in FIG. 4C .
- the user first poses for a self-portrait in which the camera 122 is oriented in a user-facing direction.
- the user may manually move the camera application from the first display 106 to the second display 108 , e.g. by using a “touch and drag” gesture to select and move the camera application user interface 404 to the second display 108 .
- FIGS. 5A-5B depict an example use scenario in which a user launches a camera application from an application launcher 502 when the computing device is in an unlocked state.
- the application launcher 502 takes the form of a menu bar displayed near a bottom edge of the second display 108 .
- an application launcher may take any other suitable form.
- a user may launch an application from a location other than an application launcher 502 , and may launch the application from either display screen.
- a notification/action center that provides an overview of alerts from computing applications includes a user interface element 504 that is selectable to launch a camera application.
- a device home screen also may include an application icon that is selectable to launch the camera application.
- user selection of a camera application icon 506 within the application launcher 502 causes the computing device 100 to launch the corresponding camera application.
- the camera application launches to the second display 108 , which is the same display from which the camera application was invoked, as shown in FIG. 5B .
- the camera application is invoked from the first display 106 and the camera application launches on the first display, as shown in FIG. 5D .
- the computing device 100 is configured to automatically move any application that actively uses the camera 122 to a display that remains user-facing or becomes active (e.g., as a result of becoming user-facing) in the event of a fold, such as to a flip device pose configuration.
- a user folds the device into a flip configuration by rotating the second portion 104 to a location behind the first portion 102 .
- the computing device 100 may then determine from motion data that the camera is moving towards a world-facing direction. As a result, the computing device 100 moves the camera application to the first display 106 , and also places the second display 108 into an inactive state, as shown in FIG. 5F .
- the computing device 100 detects the flip transition for switching display screens when the hinge angle is greater than or equal to a threshold hinge angle (e.g. 345 degrees).
- upon the flip transition, the application user interface (camera stream, image capture settings, shutter control, etc.) is presented on the first display 106 while the camera 122 is directed away from the user.
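- The flip-transition check described above can be sketched as a simple threshold comparison. This is an illustrative sketch, not the patent's implementation; the function name is an assumption, and the 345-degree value is the example threshold from the scenario above.

```python
FLIP_THRESHOLD_DEG = 345  # example threshold hinge angle from the scenario above

def is_flip_transition(hinge_angle_deg: float) -> bool:
    """Return True when the hinge angle indicates the device has been
    folded into a flip pose, which triggers a display-screen switch."""
    return hinge_angle_deg >= FLIP_THRESHOLD_DEG
```

In practice the hinge angle would be derived from the two 6DOF motion sensors rather than passed in directly.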
- the user then rotates the computing device 100 without changing the hinge angle θ to change the direction of the camera 122 from a world-facing direction to a user-facing direction.
- the computing device 100 moves the application user interface from the first display 106 to the second display 108 , as shown in FIG. 5G , thereby automatically keeping the user interface facing toward the user.
- FIGS. 6A-6E illustrate additional examples of automatically moving an application user interface between display screens as a pose of a computing device changes.
- the computing device is in a double portrait configuration and the camera is in a user-facing direction.
- a camera application is output for display on the second display 108 .
- in FIG. 6B , a user folds the second portion 104 to a pose that is behind the first portion 102 from the perspective of the user, e.g. by rotating the second portion 104 counterclockwise about the hinge 110 .
- the computing device automatically detects the pose change based on sensor data.
- the computing device swaps the camera application user interface from the second display 108 to the first display 106 , as shown in FIG. 6C .
- the user turns the computing device around, such that the camera transitions from a world-facing direction to a user-facing direction.
- the computing device also detects this pose change based on sensor data, and automatically moves the camera application from the first display 106 to the second display 108 as shown in FIG. 6E .
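- The swap logic in FIGS. 6A-6E reduces to routing the camera application to whichever display currently faces the user. A minimal sketch, assuming the camera is mounted on the second portion (as in the examples above); the display names are illustrative:

```python
def active_display_for_camera_app(camera_world_facing: bool) -> str:
    # The camera sits on the second portion: when it faces the world,
    # the first display faces the user, and vice versa.
    return "first_display" if camera_world_facing else "second_display"
```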
- the camera 122 is active during a detected device flip transition.
- a camera stream may automatically move from one display to the other display screen—without user input confirming an intent to switch display screens—in response to a detected device flip transition.
- a user may be prompted to provide input (e.g. touch input, such as a tap or double tap on a display screen), to confirm intent to switch to a different active display screen. This may help to prevent accidental screen switching when the computing device 100 is in a single-screen pose (single portrait or single landscape), for example.
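- The confirmation policy described above can be expressed as gating automatic switching on the device pose. A sketch under the assumption that only single-screen poses warrant a confirmation tap; the pose strings are illustrative:

```python
def confirm_before_switch(pose: str) -> bool:
    """Require an explicit user tap before switching display screens
    only in single-screen poses, where accidental flips are most likely;
    double-screen poses switch automatically."""
    return pose in ("single_portrait", "single_landscape")
```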
- a computing device 100 may allow a user to span a computing application user interface across multiple displays. Some applications may not be aware of the multi-screen configuration. In such applications, the spanned view of the application may take the form of a single application user interface window displayed across both displays. Other applications may be multi-screen aware. In such applications, spanning may trigger the display of a multi-screen user interface (UI) mode that is different than the single-screen mode.
- FIGS. 7A-7C depict an example camera application UI that is multi-screen aware and thus that enters a specific multi-screen mode when spanned. In FIG. 7A , the computing device receives a touch input at the second display, which is currently displaying the camera application in a single screen mode.
- the touch input drags the camera application towards the first display and releases the application in a particular area, e.g. within a threshold distance of the hinge, as shown in FIG. 7B .
- the computing device spans the camera application across the first display 106 and the second display 108 .
- the camera application displays a UI comprising a collection of thumbnails of recently captured images or videos on the first display 106 , and displays the current camera stream on the second display 108 .
- the second display 108 may switch between displaying a camera stream and displaying a recently captured image(s) upon user input selecting a user interface element 702 .
- the computing device displays the camera stream preview and recently captured images on separate display screens simultaneously.
- the camera application may display a latest image or video capture as occupying a majority of the user interface on the first display 106 , and the current camera stream on the second display 108 .
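- The spanning behavior above can be sketched as a layout decision: a spanning-unaware application gets one window stretched across both displays, while a multi-screen-aware camera application splits its UI into thumbnails and a live stream. The dictionary keys and values are illustrative assumptions:

```python
def spanned_layout(multi_screen_aware: bool) -> dict:
    """Choose how a spanned application occupies the two displays."""
    if not multi_screen_aware:
        # single application window simply stretched across both displays
        return {"first_display": "app_window_left_half",
                "second_display": "app_window_right_half"}
    # multi-screen UI mode: recent captures on one display, live stream on the other
    return {"first_display": "recent_capture_thumbnails",
            "second_display": "camera_stream"}
```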
- Some third-party applications may be configured to select between a front-facing and a rear-facing camera.
- the computing device 100 comprises a single physical camera 122 .
- the computing device 100 may enumerate the physical camera 122 as two virtual cameras when the computing device 100 executes a third-party application.
- the third-party application may request to receive data from the virtual camera representing the user-facing camera, or the virtual camera representing the world-facing camera.
- the computing device provides the stream from the camera to the corresponding virtual camera, which provides the stream to the third-party application.
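- The virtual-camera arrangement above might be sketched as follows: the single physical stream is delivered to whichever of the two virtual cameras the third-party application requested. Class and method names here are assumptions for illustration, not an actual camera API:

```python
class VirtualCamera:
    """One of the two virtual cameras enumerated to third-party apps."""
    def __init__(self, name: str):
        self.name = name
        self.frames = []  # frames delivered to this virtual camera

class CameraRouter:
    """Routes the single physical camera stream to the requested virtual camera."""
    def __init__(self):
        self.user_facing = VirtualCamera("user-facing")
        self.world_facing = VirtualCamera("world-facing")
        self._selected = self.user_facing

    def request(self, name: str) -> None:
        # the third-party application asks for one of the two virtual cameras
        self._selected = (self.user_facing if name == "user-facing"
                          else self.world_facing)

    def deliver(self, frame) -> None:
        # the physical stream goes to the requested virtual camera,
        # which in turn provides it to the application
        self._selected.frames.append(frame)
```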
- FIGS. 8A-8C depict an example use scenario in which a user launches a third-party application that is configured for front-facing camera and rear-facing camera user experiences.
- the application launches on a display screen from which the application was invoked.
- the application launches on the second display 108 , as shown in FIG. 8A .
- the user selects an in-application user interface toggle 802 via a touch input 804 .
- the computing device 100 presents a notification 806 instructing the user to change a pose of the computing device 100 to complete the switch between camera modes, as shown in FIG. 8B .
- a user may change the pose of the computing device in various manners.
- the user rotates the second portion 104 counterclockwise relative to the first portion 102 , until the computing device 100 is in a single portrait configuration shown in FIG. 8C .
- the computing device 100 ceases presentation of the notification, and the second display 108 and associated touch sensor may be inactivated while facing away from the user. Further, the computing device 100 outputs the world-facing camera stream to the first display 106 .
- a user may inadvertently select an in-application user interface toggle to switch between rear- and front-facing camera modes, and the computing device may present a notification 806 directing the user to change the computing device pose.
- the user may undo the request to switch camera modes, and thus dismiss the notification 806 , by again selecting the user interface toggle 802 .
- Some third-party applications also may not be configured to automatically change camera modes based upon changes in device pose.
- a computing device 100 may present a notification to alert a user of a detected change in the camera mode.
- a user of the computing device then may confirm the change in camera mode to dismiss the notification, or may revert the computing device to the prior device pose and keep the camera in the same mode.
- FIGS. 9A-9C depict an example use scenario in which a third-party application that accesses a camera stream of the camera 122 is opened on the second portion 104 .
- a user folds the second portion 104 behind the first portion 102 , such that the camera 122 is directed away from the user ( FIG. 9B ).
- the computing device 100 detects the pose of the second portion 104 relative to the first portion 102 and determines that the pose is indicative of the camera being world-facing, e.g. based upon the data received from the motion sensors on each portion 102 and 104 and also information indicating most recent screen interactions.
- the computing device 100 presents a notification 902 alerting the user to the detected change of camera mode.
- the computing device 100 may present the notification 902 as an overlay stacked on top of the application without dismissing (e.g. closing) the application.
- the computing device 100 may present the notification 902 on a separate display than the application, e.g. to present the notification 902 on a user-facing display when the application is displayed on a world-facing display.
- the computing device 100 may output the notification in any other suitable manner.
- the user provides touch input to the computing device 100 by selecting the in-application user interface toggle 904 on the first display 106 .
- the computing device dismisses the notification.
- the user provides touch input to the computing device 100 by tapping the first display 106 at a location of the notification 902 with a finger (as shown in dotted lines in FIG. 9B ) or stylus.
- the computing device 100 dismisses the notification 902 and outputs the application to the first display 106 ( FIG. 9C ).
- the application also may switch virtual cameras (described above) from which the camera stream is obtained.
- the user may manifest intent to change camera modes by touching the first display 106 at an in-application user interface toggle 904 (e.g. while in the pose of FIG. 9A ), after which the user may be prompted to flip the camera around to a world-facing view, and after which the application obtains the camera stream from a different virtual camera.
- the computing device 100 includes a flash for the camera.
- the computing device 100 further is configured to operate the flash in a “flashlight mode” that is separate from camera operation.
- the flash may be utilized as a flashlight in any computing device pose.
- FIG. 10 depicts use of the flash 124 as a flashlight in single portrait 1002 , single landscape 1004 , double portrait 1006 , and double landscape 1008 poses of a computing device.
- a lock screen user interface may include a user interface element 204 that is selectable to turn on the flash 124 in a flashlight mode when the computing device is in a locked state.
- the user interface element 204 may be selectable to launch the flashlight mode without providing user credentials to change the lock state.
- a user may also invoke the flashlight mode when the computing device 100 is in an unlocked state.
- the notification/action center that provides an overview of alerts from computing applications may also include a user interface element that is selectable to turn on the camera flash in flashlight mode.
- FIG. 11 depicts the computing device 100 in an unlocked state.
- the computing device 100 displays a notification center via the first display 106 and displays a plurality of application icons via the second display 108 , wherein the notification center comprises a flashlight control 1102 .
- the computing device 100 may include any other suitable mechanism for launching the flashlight.
- the user interface element 1102 for controlling operation of the flash as a flashlight is displayed on the first display 106 , rather than on the second display that includes the flash.
- the user interface element 1102 for controlling the flash remains on the first display 106 for convenient user access.
- a flashlight control user interface element 1102 may be invoked from the second display 108 .
- the flash 124 is in a user-facing direction when the computing device is in a double portrait (or double landscape) configuration.
- the flash 124 initiates at low brightness (e.g., 0-20% of total brightness) as shown in FIG. 12 , which may help to prevent eye discomfort in the user-facing pose.
- a brightness of light emitted by the flash 124 in the flashlight mode may be controlled by the hinge angle, such that the brightness is increased once the hinge angle indicates that the light is likely rotated out of direct view.
- FIGS. 13A-13D depict an example use scenario for operating a flashlight application when a computing device is in a double portrait or a double landscape configuration.
- the flash is off, and a user navigates to a notification/action center.
- the user turns on the flashlight by selecting a user interface element 1102 displayed in the notification/action center.
- the flash 124 initiates at a lower brightness (e.g. 0-20% of full brightness).
- the computing device outputs a notification, e.g.
- the user rotates the second portion 104 counterclockwise while a pose of the first portion 102 remains relatively unchanged.
- the flash 124 emits increasingly brighter light as the hinge angle increases and the flash 124 moves in a direction away from the user.
- the flash emits light at or near full brightness.
- the computing device 100 may be utilized in various other configurations, in addition or alternatively to a double portrait or a double landscape configuration.
- initial flashlight brightness may be determined based on whether the flash is directed towards a user or the surrounding real-world environment (away from the user).
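- The initial-brightness rule above reduces to a facing check. A minimal sketch; the 20% dim level is taken from the examples above, and the function name is an assumption:

```python
def initial_flashlight_brightness(flash_user_facing: bool) -> float:
    """Pick the flashlight's starting brightness from the flash direction."""
    # a dim start helps prevent eye discomfort when the flash faces the
    # user; full brightness when it already points at the surroundings
    return 0.2 if flash_user_facing else 1.0
```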
- FIGS. 14A-14D depict example use scenarios for launching a flashlight application when a computing device is in a single portrait configuration.
- When a user turns on the flashlight and the flash 124 is user-facing (e.g. the first portion 102 is folded behind the second portion 104 from a perspective of a user), the flash 124 enters the flashlight mode at low brightness, as shown in FIGS. 14A and 14C . Further, the computing device may output a notification directing a user to switch displays for full flashlight brightness. In FIG. 14A , the flashlight is launched when the computing device is in a locked state, and the computing device outputs a notification 1404 to the lock screen directing the user to fold the second portion 104 behind the first portion 102 for full flashlight brightness. As shown in FIG. 14B , flashlight brightness is 100% when the flash 124 is moved to a world-facing direction.
- In FIG. 14C , the flashlight is launched from a notification/action center while the computing device is in an unlocked state, and the computing device outputs a notification 1406 in the notification/action center directing the user to switch screens for full flashlight brightness.
- the computing device 100 adjusts the flashlight brightness to 100%, as shown in FIG. 14D .
- the flash 124 may initiate at full or near-full flashlight brightness.
- FIGS. 15A-15C depict an example use scenario for operating a flashlight application when a computing device is in a flip mode, single portrait configuration in which the flash 124 is world-facing.
- the flash is off.
- a user opens a notification/action center, which includes the user interface element 1102 for invoking a flashlight mode (e.g. by launching a flashlight application).
- a user selects the user interface element 1102 , which turns on the flash.
- the flash 124 is world-facing (e.g. as determined from data received from motion sensors on each portion 102 and 104 and also information indicating most recent screen interactions) and initiates a flashlight mode at full brightness.
- a similar sequence of events may occur when the computing device is in a single landscape configuration in which the flash 124 is world-facing.
- FIGS. 16A-16E depict an example use scenario for operating a flashlight application when a computing device is in a single portrait or single landscape configuration in which the flash 124 is user-facing. Similar to the scenario depicted in FIGS. 15A-15C , the flash is off at FIG. 16A .
- a user opens a notification/action center.
- the user selects the user interface element 1102 , which invokes the flashlight application.
- the flash 124 is user-facing (e.g. as determined from motion sensors on each portion 102 and 104 and information regarding most recent screen interactions) and initiates a flashlight mode at a low/partial brightness.
- the user turns around the computing device such that the flash 124 transitions to a world-facing direction.
- the computing device detects the resulting change in orientation of each portion 102 and 104 via motion sensor data and displays a notification 1602 on the user-facing display 106 (previously world-facing) requesting that the user activate the display.
- the computing device 100 displays the user notification/action center on the user-facing display 106 and increases the flashlight brightness, as indicated at FIG. 16E .
- the flash 124 is active during a detected device flip transition.
- user interface controls for a flashlight application may automatically switch from one display screen to the other display screen—without user input confirming intent to change display screens—in response to a detected device flip transition.
- a user may be prompted to provide input (e.g. touch input, such as a tap or double tap on a display screen), to switch which display screen displays user interface flashlight controls. This may help to prevent accidental screen switching when the computing device 100 is in a single-screen pose (single portrait or single landscape).
- FIGS. 17A-17B depict an example use scenario in which a user moves a computing device 100 from a flip mode, single portrait (or landscape) configuration in which the flash 124 is world-facing to a double portrait (or landscape) configuration in which the flash 124 is user-facing.
- the flash 124 is at full flashlight brightness while the computing device 100 is in the flip mode, single portrait configuration and the flash 124 is world-facing.
- the flashlight brightness dims as the hinge angle decreases.
- FIGS. 18A-18D depict an example use scenario in which, at FIG. 18A , the computing device 100 is in the double portrait configuration and the locked state.
- a user turns on the flashlight by selecting the user interface element 204 , which initiates the flashlight at low brightness.
- the computing device 100 displays a notification 1802 instructing the user to fold back the second portion 104 to increase flashlight brightness.
- the user rotates the second portion 104 counterclockwise, at FIG. 18C , and flashlight brightness increases with increasing hinge angle.
- the computing device 100 also displays a brightness level indication 1804 to inform a user of a current flashlight brightness.
- the flashlight operates at full brightness when the computing device 100 is in the single portrait configuration and the flash 124 is world-facing.
- the computing device 100 automatically adjusts a brightness of a flashlight based upon changes in pose and hinge angle θ .
- the computing device 100 controls the brightness of the flashlight to illuminate the flash at a relatively lower brightness (e.g. 0-20% full brightness) when the computing device 100 is in the double portrait and double landscape configurations, and at a relatively higher brightness (90-100% full brightness) when the computing device is in the single portrait and single landscape configurations and the flashlight is facing away from the user (e.g. as determined from motion sensors on each portion 102 and 104 and also information indicating most recent screen interactions).
- the computing device 100 may control the flashlight to exhibit a brightness in a range of brightness values between the lowest brightness setting and the highest brightness setting.
- Adjustments to light source brightness may be continuous as the hinge angle increases or decreases, or may be incremental.
- a computing device in a double portrait or double landscape configuration may not increase flashlight brightness from the low brightness value to a higher brightness value until the computing device detects a threshold increase, e.g. 10 degrees, in the hinge angle θ , and then may increase flashlight brightness continuously or in increments as the hinge angle increases further.
- FIGS. 19A-19E depict the computing device 100 automatically adjusting flashlight brightness based upon changing hinge angle.
- the computing device is in a double portrait configuration and the light source brightness is 20% of full brightness.
- the hinge angle increases to the pose shown in FIG. 19B , and the flashlight brightness increases to 40% of full brightness.
- the hinge angle increases again to the pose shown in FIG. 19C , and the flashlight brightness increases to 60% of full brightness.
- in FIG. 19D , further increases in the hinge angle cause the computing device to further increase the light source brightness.
- in FIG. 19E , the computing device is in a single portrait configuration and the flashlight brightness is 100% of full brightness.
- the computing device described herein further is configured to decrease flashlight brightness as hinge angle decreases.
- the computing device decreases flashlight brightness from 100% to 80% upon detecting the change in hinge angle as the computing device 100 transitions from the single portrait pose shown in FIG. 19E to the pose shown in FIG. 19D .
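- The ramp in FIGS. 19A-19E is consistent with a linear mapping from hinge angle to brightness. A sketch under assumed endpoints — roughly 180 degrees for the flat double-portrait pose and 360 degrees for the fully folded-back pose — chosen to match the 20%-100% range above; the patent does not specify the exact curve:

```python
def brightness_for_hinge_angle(angle_deg: float) -> float:
    """Map hinge angle to flashlight brightness: 20% when flat
    (~180 degrees) ramping linearly to 100% when fully folded back
    (~360 degrees); clamped outside that range."""
    lo_angle, hi_angle = 180.0, 360.0
    lo_b, hi_b = 0.2, 1.0
    t = (angle_deg - lo_angle) / (hi_angle - lo_angle)
    t = max(0.0, min(1.0, t))  # clamp to the [180, 360] degree range
    return round(lo_b + (hi_b - lo_b) * t, 2)
```

This also models the decrease described above: moving back from 360 toward 180 degrees lowers the returned brightness symmetrically.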
- the methods and processes described herein may be tied to a computing system of one or more computing devices.
- such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
- FIG. 20 schematically shows a non-limiting example of a computing system 2000 that can enact one or more of the methods and processes described above.
- Computing system 2000 is shown in simplified form.
- Computing system 2000 may take the form of one or more personal computers, server computers, tablet computers, home-entertainment computers, network computing devices, gaming devices, mobile computing devices, mobile communication devices (e.g., smart phone), and/or other computing devices, and wearable computing devices such as smart wristwatches and head mounted augmented reality devices.
- Computing system 2000 includes a logic processor 2002 , volatile memory 2004 , and a non-volatile storage device 2006 .
- Computing system 2000 may optionally include a display subsystem 2008 , input subsystem 2010 , communication subsystem 2012 , and/or other components not shown in FIG. 20 .
- Logic processor 2002 includes one or more physical devices configured to execute instructions.
- the logic processor may be configured to execute instructions that are part of one or more applications, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
- the logic processor may include one or more physical processors (hardware) configured to execute software instructions. Additionally or alternatively, the logic processor may include one or more hardware logic circuits or firmware devices configured to execute hardware-implemented logic or firmware instructions. Processors of the logic processor 2002 may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic processor optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic processor may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration. In such a case, it will be understood that these virtualized aspects may be run on different physical logic processors of various different machines.
- Non-volatile storage device 2006 includes one or more physical devices configured to hold instructions executable by the logic processors to implement the methods and processes described herein. When such methods and processes are implemented, the state of non-volatile storage device 2006 may be transformed—e.g., to hold different data.
- Non-volatile storage device 2006 may include physical devices that are removable and/or built-in.
- Non-volatile storage device 2006 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., ROM, EPROM, EEPROM, FLASH memory, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), or other mass storage device technology.
- Non-volatile storage device 2006 may include nonvolatile, dynamic, static, read/write, read-only, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. It will be appreciated that non-volatile storage device 2006 is configured to hold instructions even when power is cut to the non-volatile storage device 2006 .
- Volatile memory 2004 may include physical devices that include random access memory. Volatile memory 2004 is typically utilized by logic processor 2002 to temporarily store information during processing of software instructions. It will be appreciated that volatile memory 2004 typically does not continue to store instructions when power is cut to the volatile memory 2004 .
- aspects of logic processor 2002 , volatile memory 2004 , and non-volatile storage device 2006 may be integrated together into one or more hardware-logic components.
- hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
- the term “module” may be used to describe an aspect of computing system 2000 typically implemented in software by a processor to perform a particular function using portions of volatile memory, which function involves transformative processing that specially configures the processor to perform the function.
- a module, program, or engine may be instantiated via logic processor 2002 executing instructions held by non-volatile storage device 2006 , using portions of volatile memory 2004 .
- modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc.
- the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc.
- the terms “module,” “program,” and “engine” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
- input subsystem 2010 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller.
- the input subsystem may comprise or interface with selected natural user input (NUI) componentry.
- Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board.
- NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity; and/or any other suitable sensor.
- communication subsystem 2012 may be configured to communicatively couple various computing devices described herein with each other, and with other devices.
- Communication subsystem 2012 may include wired and/or wireless communication devices compatible with one or more different communication protocols.
- the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network, such as an HDMI over Wi-Fi connection.
- the communication subsystem may allow computing system 2000 to send and/or receive messages to and/or from other devices via a network such as the Internet.
- a computing device comprising a first portion comprising a first display, a second portion comprising a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism comprising one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
- the instructions may additionally or alternatively be executable to detect a change in device pose, and when the change in device pose is indicative of a camera direction changing from a world-facing direction to a user-facing direction, then move the camera application from the first display to the second display.
- the instructions may additionally or alternatively be executable to detect a change in device pose, and when the change in device pose is indicative of a camera direction changing from a user-facing direction to a world-facing direction, then move the camera application from the second display to the first display.
- when the device pose is indicative of the camera being user-facing and also indicative of the computing device being in a double portrait or double landscape configuration, the instructions may additionally or alternatively be executable to receive a user input to move the camera application from the second display to the first display, and in response to receiving the user input, move the camera application from the second display to the first display.
- when the device pose is indicative of the camera being user-facing and also indicative of the computing device being in a double portrait or double landscape configuration, the instructions may additionally or alternatively be executable to receive a user input moving the camera application to a location spanning each of the first display and the second display, and in response to the user input, output the camera application in a spanning mode.
- the instructions may additionally or alternatively be executable to output the camera application in the spanning mode by outputting a camera stream to one of the first display and the second display and outputting a camera roll of captured photos to the other of the first display and the second display.
- the instructions may additionally or alternatively be executable to receive, via one of the first display and the second display, a touch input to launch the camera application, and in response to receiving the touch input, output the camera application to the one of the first display and the second display.
- the instructions may additionally or alternatively be executable to receive the sensor data by receiving sensor data indicative of a transition in device pose compared to an initial device pose at which the touch input was received.
- Another example provides a method enacted on a computing device comprising a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism comprising one or more sensors, the method comprising receiving, at one of the first display and the second display, an input to launch a camera application, in response to receiving the input, outputting the camera application to the one of the first display and the second display, receiving sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determining a change in pose of the computing device, and based at least in part on the change in pose of the computing device, outputting the camera application to the other of the first display and the second display to maintain a user-facing orientation.
- a computing device comprising a first portion, a second portion comprising a light source, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism comprising one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a flashlight application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a pose of the computing device, control the light source to emit light at a relatively lower brightness when the pose of the computing device is indicative of the light source being user-facing, and control the light source to emit light at a relatively higher brightness when the pose of the computing device is indicative of the light source being world-facing.
- the instructions may additionally or alternatively be executable to detect a change in pose of the computing device, when the change in the pose is indicative of a direction of the light source changing from a world-facing direction to a user-facing direction, decrease a brightness of the light source, and when the change in the pose is indicative of the direction of the light source changing from the user-facing direction to the world-facing direction, increase the brightness of the light source.
- the instructions may additionally or alternatively be executable to dynamically adjust the brightness of the light source during the change in pose of the computing device.
- the instructions may additionally or alternatively be executable to increase or decrease the brightness of the light source when a change in hinge angle exceeds a threshold hinge angle.
- the first portion may additionally or alternatively comprise a first display and the second portion may additionally or alternatively comprise a second display.
- the instructions may additionally or alternatively be executable to output a brightness indicator via at least one display of the first display and the second display.
- the instructions may additionally or alternatively be executable to, when the pose of the computing device is indicative of the light source being user-facing, output via at least one of the first display and the second display a notification instructing a user to change the pose of the computing device.
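As a rough illustration of the brightness behavior recited above, the following Python sketch maps a hinge angle to a flashlight brightness. The specific brightness levels, hinge-angle thresholds, and function name are illustrative assumptions only; the disclosure recites only a "relatively lower" and "relatively higher" brightness and a threshold hinge angle.

```python
def flashlight_brightness(hinge_angle_deg, low=0.2, high=1.0,
                          user_facing_max=180.0, world_facing_min=345.0):
    """Return a flashlight brightness for a given hinge angle.

    Assumed convention: small hinge angles (double portrait/landscape)
    place the light source user-facing, so brightness is low; angles at
    or past the flip threshold place it world-facing, so brightness is
    high; in between, brightness ramps dynamically during the fold.
    """
    if hinge_angle_deg <= user_facing_max:
        return low   # light source user-facing: relatively lower brightness
    if hinge_angle_deg >= world_facing_min:
        return high  # light source world-facing: relatively higher brightness
    # Dynamically adjust brightness while the pose is changing.
    t = (hinge_angle_deg - user_facing_max) / (world_facing_min - user_facing_max)
    return low + t * (high - low)
```

A brightness indicator, as recited above, could simply display the returned value via one of the displays.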
Description
- This application claims priority to U.S. Provisional Patent Application Ser. No. 62/909,199, filed Oct. 1, 2019, the entirety of which is hereby incorporated herein by reference for all purposes.
- Some mobile electronic devices, such as smart phones and tablets, have a monolithic handheld form in which a display occupies substantially an entire front side of the device. Other devices, such as laptop computers, include a hinge that connects a display to other hardware, such as a keyboard and cursor controller (e.g. a track pad).
- One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism including one or more sensors. The computing device further includes a logic device, and a storage device holding instructions executable by the logic device to execute a camera application, to receive sensor data from the one or more sensors, and based at least in part on the sensor data received from the one or more sensors, determine a device pose. The instructions are further executable to output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
- Another example provides a computing device comprising a first portion, a second portion comprising a light source, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism comprising one or more sensors. The computing device further includes a logic device, and a storage device holding instructions executable by the logic device to execute a flashlight application, to receive sensor data from the one or more sensors, and based at least in part on the sensor data received from the one or more sensors, determine a pose of the computing device. The instructions are further executable to control the light source to emit light at a relatively lower brightness when the pose of the computing device is indicative of the light source being user-facing, and control the light source to emit light at a relatively higher brightness when the pose of the computing device is indicative of the light source being world-facing.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
- FIGS. 1A-1F show different poses of an example multi-display computing device.
- FIG. 2 schematically shows example locations of integrated hardware devices for the computing device of FIGS. 1A-1F.
- FIG. 3 schematically illustrates example camera user interfaces displayed based upon computing device pose.
- FIGS. 4A-4C depict an example use scenario in which a camera application is launched from a lock screen, followed by the device being moved from a dual display mode to a flip mode.
- FIGS. 5A-5G depict example use scenarios in which a camera application is launched from an unlocked state, followed by the device being moved from a dual display mode to a flip mode and then rotated.
- FIGS. 6A-6E depict an example use scenario in which a device is switched from a self-facing camera mode to an outward-facing camera mode based upon an angle of a hinge connecting a first portion and a second portion of the computing device and also on a camera orientation relative to a user.
- FIGS. 7A-7C illustrate spanning of a camera application across two displays.
- FIGS. 8A-8C illustrate an example use scenario in which a third-party camera application that comprises a control to select between a front-facing camera and a rear-facing camera is operated on the computing device of FIGS. 1A-1F.
- FIGS. 9A-9C illustrate another example use scenario in which a third-party camera application that comprises a control to select between a front-facing camera and a rear-facing camera is operated on the computing device of FIGS. 1A-1F.
- FIG. 10 illustrates example poses of the computing device of FIGS. 1A-1F used to control a flashlight.
- FIG. 11 illustrates an example invocation mechanism for launching a flashlight application.
- FIG. 12 illustrates an initial brightness of a flashlight when the computing device of FIGS. 1A-1F is in a double portrait configuration.
- FIGS. 13A-13D illustrate the control of a brightness of a flashlight based upon moving the computing device of FIGS. 1A-1F between the double portrait configuration and the flip configuration.
- FIGS. 14A-14D illustrate the control of a brightness of the flashlight as a function of flashlight orientation compared to a user.
- FIGS. 15A-15C illustrate the control of a brightness of the flashlight based upon launching a flashlight application when the flashlight is in an outward-facing orientation.
- FIGS. 16A-16E illustrate the control of a brightness of the flashlight based upon moving the flashlight between a user-facing orientation and an outward-facing orientation.
- FIGS. 17A-17B illustrate a brightness of a flashlight dimming as an orientation of the flashlight transitions from world-facing to user-facing.
- FIGS. 18A-18D illustrate changes in flashlight brightness and a brightness indicator based upon changes in hinge angle of the computing device of FIGS. 1A-1F.
- FIGS. 19A-19E illustrate changes in flashlight brightness based upon changes in hinge angle of the computing device of FIGS. 1A-1F.
- FIG. 20 is a block diagram illustrating an example computing system.
- Examples are disclosed that relate to multi-display devices comprising two display portions connected via a hinge, and to the operation of a camera and flashlight of such devices. As described below, an example dual-display device may comprise a camera located on a display side of one of the portions, and sensor data related to a hinge angle between the display portions may be used as input to control the operation of the camera and/or flashlight. Aspects of controlling the camera include determining on which display to display user interface features and supporting third-party applications that are configured for use with dual front/rear-facing camera systems; aspects of controlling the flashlight include controlling flashlight brightness, among other features.
- Any suitable sensor configuration may be used to detect the hinge angle. In some implementations, each device portion may include a six degree of freedom (6DOF) motion sensor. Further, the device also may include a sensor that senses a fully closed and/or fully open position, such as a Hall effect sensor on one portion to sense a magnet located on the other portion. Based on data from such sensors, the computing device may determine a likely orientation of the camera relative to the user (e.g. world-facing or user-facing), and output an application user interface, such as a camera stream, user interface controls, etc., based upon a determined computing device pose.
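The pose heuristic described above might be sketched as follows. The field names, angle thresholds, and the use of the last-touched display as a user-facing hint are illustrative assumptions rather than details taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    hinge_angle_deg: float     # derived from the two 6DOF motion sensors
    fully_closed: bool         # Hall effect sensor sensing the magnet
    last_touched_display: int  # 1 or 2; recent interaction hints at the user-facing side

def camera_facing(data: SensorData) -> str:
    """Estimate whether the display-side camera (on portion 2) faces the user."""
    if data.fully_closed:
        return "none"  # displays folded together; camera occluded
    if data.hinge_angle_deg < 235.0:
        return "user-facing"  # dual-display poses: both display sides face the user
    # Flip poses: the camera faces away from whichever display the user
    # most recently interacted with.
    return "world-facing" if data.last_touched_display == 1 else "user-facing"
```

The application user interface can then be routed to the first or second display based on the returned direction.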
- FIGS. 1A-1F show various poses in which an example dual-screen computing device 100 may be held. Computing device 100 includes a first portion 102 and a second portion 104 that respectively include a first display 106 and a second display 108. Each of the first display 106 and the second display 108 may comprise a touch sensor configured to sense touches from digits of users, styluses, and/or other objects.
- A hinge 110 connecting the first and second portions 102 and 104 permits the portions to rotate relative to one another about the hinge 110. This relative orientation is represented in FIGS. 1A-1B by a variable angle θ measured between the emissive surfaces of the first and second displays 106 and 108. In FIG. 1A, the first and second portions 102 and 104 form an angle θ1, and in FIG. 1B the first and second portions are further rotated away from each other to form a larger angle θ2. - In
FIG. 1C, the second portion 104 is folded via hinge 110 behind the first portion 102, such that a display side of each portion faces outward. From the perspective of a user of computing device 100, the second display 108 is imperceptible. In such a configuration, the device may inactivate the display not facing the user. The pose of the computing device 100 in FIG. 1C is referred to herein as a "single portrait" configuration, which in some examples comprises a hinge angle θ within a range of 235 to 360 degrees. -
FIG. 1D depicts the computing device 100 after rotating the second portion 180 degrees clockwise from the orientation shown in FIG. 1C. In FIG. 1D, the first and second portions 102 and 104 may be rotated relative to one another, from the configuration shown in FIG. 1C, to a fully closed configuration in which the display side of the first portion 102 faces a display side of the second portion 104. - The examples shown in
FIGS. 1A-1D each depict the computing device in a portrait orientation. In other examples, the computing device 100 may be used in a landscape orientation. FIG. 1E depicts an example single landscape orientation in which the first display 106 is folded behind (e.g. away from a user holding the computing device 100) the second display 108 via the hinge 110. In such a configuration, either display may face the user and be active. Further, in such a configuration, the display not facing the user may be inactive. FIG. 1F depicts an example double landscape orientation. - As mentioned above, the
first portion 102 includes a first 6DOF motion sensor 114 configured to measure the pose of the first portion 102 in six degrees of freedom, namely, x, y, z, pitch, roll, and yaw, as well as accelerations and rotational velocities, so as to track the rotational and translational motion of the first portion 102. Likewise, the second portion 104 includes a second 6DOF motion sensor 116. Any suitable 6DOF motion sensors may be used. For example, the first 6DOF motion sensor 114 and the second 6DOF motion sensor 116 may each include one or more accelerometers and one or more gyroscopes. Additionally, each 6DOF motion sensor 114 and 116 may include other suitable motion sensors. - Further, as mentioned above, the first and
second portions 102 and 104 may include a sensor that senses a fully closed and/or fully open position. In the example of FIGS. 1A-1D, the sensor takes the form of a Hall effect sensor 120 configured to detect motion of a magnet 122. In other examples, the sensor may comprise an optical sensor, contact switch, or other suitable sensing mechanism. Further, while shown in FIGS. 1A-1F as including first and second displays 106 and 108, in other examples the computing device 100 may include other numbers of displays. The computing device 100 may take any suitable form, including but not limited to various mobile devices (e.g., a foldable smart phone or tablet). - In
FIGS. 1A-1F, the second portion 104 includes a camera 122 and flash 124 located generally at an upper right-hand side of the second display 108. Depending on a configuration and orientation of the computing device 100 (which together may be referred to herein as a pose of the device), the computing device may be configured to display or modify a user interface in a manner adapted to the pose. -
FIG. 2 shows an example user interface for the computing device 100 of FIGS. 1A-1F when the computing device is in a locked state, and illustrates example locations of integrated device hardware. In this example, the first portion 102 includes a speaker and the second portion 104 includes a camera and flash, as described in more detail below. - In the double portrait configuration shown in
FIG. 2, the first display 106 and the second display 108 each are directed towards a user of the computing device 100, and each display 106 and 108 displays a user interface element 202 that is selectable to invoke a camera application. The user interface also includes a second user interface element 204 that is selectable by a user to invoke a flashlight application. When the computing device 100 is in the locked state, the user interface elements 202 and 204 may be presented via the lock screen user interface. - In the example of
FIG. 2, the user interface elements 202 and 204 are displayed near a periphery of the first display 106. As a user may hold the computing device 100 at/around a periphery of the computing device 100, such placement may allow for intuitive selection of a user interface element via touch input without significantly repositioning a hand(s) on the computing device 100. - The lock screen user interface also may include a
control 206 to change a lock status of the computing device 100 from the locked state to an unlocked state. For example, the computing device 100 may comprise a fingerprint sensor and corresponding user interface element 206 indicating the location of the fingerprint sensor. The fingerprint sensor may be used to authenticate a user of the computing device 100 and thereby unlock the computing device. In the example of FIG. 2, the fingerprint sensor 206 is located at a periphery of the second display 108, which may allow for convenient placement of a user's right thumb (for example) while a user holds the computing device by the right hand or both hands. In other examples, the user interface elements 202 and 204 and the fingerprint sensor 206 may have any other suitable location. Further, in other examples, a computing device 100 may utilize any other suitable unlock mechanism (e.g., facial recognition) to change the lock state from locked to unlocked. - When a user launches an application configured to access a camera stream of the camera 122 (referred to herein as a "camera application"), the
computing device 100 automatically detects a likely orientation and a direction of the camera 122. For example, based on sensor data obtained from the first 6DOF motion sensor and the second 6DOF motion sensor, the computing device 100 may determine a relative orientation and/or motion of the second portion 104 relative to the first portion 102. As the camera 122 is located at a display side of the second portion 104, a direction of the camera may be detected using data indicating a likely orientation of the first and second 6DOF motion sensors compared to the user (e.g. based upon recent screen interactions indicating a screen that was likely user-facing), and also data indicating a hinge angle between the first portion and the second portion. Based on the orientation and direction detected for the camera, the computing device 100 mirrors or un-mirrors a camera stream of the camera 122. In some examples, mirroring and un-mirroring the camera stream may be determined at an application level, rather than a system level. For example, a camera application may be posture-aware and configured to automatically switch between a user-facing or world-facing camera mode based upon computing device pose, adjusting mirroring accordingly. As another example, a camera application (e.g. a third-party camera application) may mirror the camera stream based on whether a user-facing or world-facing camera mode is active, including in instances where the camera mode does not correspond to device pose. FIG. 3 depicts example camera streams displayed in response to various detected orientations and directions of the camera 122 upon launch of a camera application. In some examples, a user may modify these camera stream configurations, and/or set a desired location and/or orientation of a camera stream, for each of one or more selected device poses. - The
computing device 100 may output an application user interface (camera stream, user interface elements for operating a camera, etc.) to a specific display(s) based upon a lock state, hinge angle, and/or determined pose (position and/or orientation) of the computing device. FIG. 4A depicts an example use scenario in which a user launches a camera application from a lock screen user interface via a touch input 402 to user interface element 202. Sensor data received from the first 6DOF motion sensor 114 and the second 6DOF motion sensor 116 indicates that the computing device 100 is in a double portrait configuration. In some examples, the computing device 100 may anticipate a possible movement to world-facing image/video capture when the camera application is launched from a lock screen user interface, and output a camera user interface 404 to the first display 106, as shown in FIG. 4B. When a user rotates the second portion 104 counterclockwise while the first portion 102 remains relatively stationary, the camera user interface 404 remains visible and operable to the user without the computing device 100 swapping the application user interface to a different display screen, as shown in FIG. 4C. - In
FIG. 4B, the user first poses for a self-portrait in which the camera 122 is oriented in a user-facing direction. In such a device pose, if the user wishes to have the camera application displayed on the second display 108 rather than the first display 106 (such that the user interface is displayed on the device portion that has the camera), the user may manually move the camera application from the first display 106 to the second display 108, e.g. by using a "touch and drag" gesture to select and move the camera application user interface 404 to the second display 108. -
FIGS. 5A-5B depict an example use scenario in which a user launches a camera application from an application launcher 502 when the computing device is in an unlocked state. In this example, the application launcher 502 takes the form of a menu bar displayed near a bottom edge of the second display 108. In other examples, an application launcher may take any other suitable form. Further, in some examples, a user may launch an application from a location other than an application launcher 502, and may launch the application from either display screen. In the depicted example, a notification/action center that provides an overview of alerts from computing applications includes a user interface element 504 that is selectable to launch a camera application. As another example, a device home screen also may include an application icon that is selectable to launch the camera application. - Returning to
FIG. 5A, user selection of a camera application icon 506 within the application launcher 502 causes the computing device 100 to launch the corresponding camera application. In this example, the camera application launches to the second display 108, which is the same display from which the camera application was invoked, as shown in FIG. 5B. In the example of FIG. 5C, the camera application is invoked from the first display 106 and the camera application launches on the first display, as shown in FIG. 5D. - To facilitate a switch between user- and world-facing camera directions, the
computing device 100 is configured to automatically move any application that actively uses the camera 122 to a display that remains user-facing or becomes active (e.g., as a result of becoming user-facing) in the event of a fold, such as to a flip device pose configuration. In FIGS. 5E-5F, a user folds the device into a flip configuration by rotating the second portion 104 to a location behind the first portion 102. The computing device 100 may then determine from motion data that the camera is moving towards a world-facing direction. As a result, the computing device 100 moves the camera application to the first display 106, and also places the second display 108 into an inactive state, as shown in FIG. 5F. In some examples, the computing device 100 detects the flip transition for switching display screens when the hinge angle is greater than or equal to a threshold hinge angle (e.g. 345 degrees). When the computing device 100 detects the flip transition, the application user interface (camera stream, image capture settings, shutter control, etc.) is automatically kept viewable and accessible by a user as the camera 122 is directed away from the user. - Continuing with
FIG. 5F, the user then rotates the computing device 100 without changing the hinge angle θ to change the direction of the camera 122 from a world-facing direction to a user-facing direction. Based upon motion data, the computing device 100 moves the application user interface from the first display 106 to the second display 108, as shown in FIG. 5G, thereby automatically keeping the user interface facing toward the user. -
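The flip-transition routing described above can be summarized in a small sketch. The 345-degree threshold comes from the description; the display numbering and function name are assumptions.

```python
FLIP_THRESHOLD_DEG = 345.0  # example threshold from the description above

def active_display(hinge_angle_deg: float, camera_world_facing: bool) -> int:
    """Choose the display hosting an app that actively uses the camera.

    Display 2 is on the portion with the camera. Below the flip threshold
    (dual-display poses), the app stays on display 2. Past the threshold,
    the app follows whichever display faces the user: display 1 when the
    camera is world-facing, and display 2 again when the whole device has
    been rotated so the camera is user-facing.
    """
    if hinge_angle_deg < FLIP_THRESHOLD_DEG:
        return 2
    return 1 if camera_world_facing else 2
```

The display not returned would be placed into an inactive state, as in FIG. 5F.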
FIGS. 6A-6E illustrate additional examples of automatically moving an application user interface between display screens as a pose of a computing device changes. In FIG. 6A, the computing device is in a double portrait configuration and the camera is in a user-facing direction. In this example, a camera application is output for display on the second display 108. In FIG. 6B, a user folds the second portion 104 to a pose that is behind the first portion 102 from the perspective of the user, e.g. by rotating the second portion 104 counterclockwise about the hinge 110. The computing device automatically detects the pose change based on sensor data. In response, the computing device swaps the camera application user interface from the second display 108 to the first display 106, as shown in FIG. 6C. In FIG. 6D, the user turns the computing device around, such that the camera transitions from a world-facing direction to a user-facing direction. The computing device also detects this pose change based on sensor data, and automatically moves the camera application from the first display 106 to the second display 108, as shown in FIG. 6E. - In the examples of
FIGS. 5F-5G and 6D-6E, the camera 122 is active during a detected device flip transition. When the camera 122 is active, a camera stream may automatically move from one display screen to the other—without user input confirming an intent to switch display screens—in response to a detected device flip transition. In other examples, a user may be prompted to provide input (e.g. touch input, such as a tap or double tap on a display screen) to confirm intent to switch to a different active display screen. This may help to prevent accidental screen switching when the computing device 100 is in a single-screen pose (single portrait or single landscape), for example. - In some examples, a
computing device 100 may allow a user to span a computing application user interface across multiple displays. Some applications may not be aware of the multi-screen configuration. For such applications, the spanned view of the application may take the form of a single application user interface window displayed across both displays. Other applications may be multi-screen aware. For such applications, spanning may trigger the display of a multi-screen user interface (UI) mode that is different than the single-screen mode. FIGS. 7A-7C depict an example camera application UI that is multi-screen aware and thus enters a specific multi-screen mode when spanned. In FIG. 7A, the computing device receives a touch input at the second display, which is currently displaying the camera application in a single-screen mode. The touch input drags the camera application towards the first display and releases the application in a particular area, e.g. within a threshold distance of the hinge, as shown in FIG. 7B. In response, the computing device spans the camera application across the first display 106 and the second display 108. As shown in FIG. 7C, in the spanning mode, the camera application displays a UI comprising a collection of thumbnails of recently captured images or videos on the first display 106, and displays the current camera stream on the second display 108. In FIGS. 7A-7B, the second display 108 may switch between displaying a camera stream and displaying a recently captured image(s) upon user input selecting a user interface element 702. In contrast, in the spanning mode shown in FIG. 7C, the computing device displays the camera stream preview and recently captured images on separate display screens simultaneously. As another example, in the spanning mode, the camera application may display a latest image or video capture as occupying a majority of the user interface on the first display 106, and the current camera stream on the second display 108.
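A minimal sketch of the two spanning behaviors just described, assuming a simple display-to-content mapping (the dictionary keys and content labels are hypothetical):

```python
def span_layout(multi_screen_aware: bool) -> dict:
    """Return a display-to-content mapping for a camera app in spanning mode.

    Multi-screen-aware apps get the split UI described above (camera roll
    on one display, live camera stream on the other); span-unaware apps
    are shown as a single window stretched across both displays.
    """
    if multi_screen_aware:
        return {"display_1": "camera_roll", "display_2": "camera_stream"}
    return {"display_1+2": "single_window"}
```

A real implementation would also handle the drag-and-release gesture that triggers spanning, e.g. by checking whether the release point falls within a threshold distance of the hinge.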
- Some third-party applications may be configured to select between a front-facing and a rear-facing camera. However, in the depicted examples, the
computing device 100 comprises a single physical camera 122. Thus, to accommodate such third-party applications, the computing device 100 may enumerate the physical camera 122 as two virtual cameras when the computing device 100 executes a third-party application. In such an example, based upon a camera mode (e.g., user-facing or world-facing) selected in the third-party application, the third-party application may request to receive data from the virtual camera representing the user-facing camera, or the virtual camera representing the world-facing camera. In response, the computing device provides the stream from the camera to the corresponding virtual camera, which provides the stream to the third-party application. -
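The virtual-camera arrangement might be sketched as follows. The class and function names are hypothetical, and a real system would route actual frame data rather than a placeholder string.

```python
class PhysicalCamera:
    """Stand-in for the single physical camera on the device."""
    def stream(self) -> str:
        return "frames"  # placeholder for the real sensor stream

class VirtualCamera:
    """One of the two virtual cameras enumerated for third-party apps."""
    def __init__(self, physical: PhysicalCamera, facing: str):
        self.physical = physical
        self.facing = facing  # how the third-party app perceives this camera

    def stream(self) -> str:
        # Both virtual cameras are backed by the same physical sensor; the
        # system routes the single stream to whichever virtual camera the
        # third-party application requests.
        return self.physical.stream()

def enumerate_cameras(physical: PhysicalCamera) -> dict:
    return {"front": VirtualCamera(physical, "user-facing"),
            "rear": VirtualCamera(physical, "world-facing")}
```

When the app toggles between its "front" and "rear" cameras, it simply requests the other virtual device; the user is then prompted to change the device pose so the physical camera actually faces the requested direction.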
FIGS. 8A-8C depict an example use scenario in which a user launches a third-party application that is configured for front-facing camera and rear-facing camera user experiences. As mentioned above, the application launches on the display screen from which the application was invoked. In this example, the application launches on the second display 108, as shown in FIG. 8A. When the user wants to switch between world-facing and user-facing camera modes, the user selects an in-application user interface toggle 802 via a touch input 804. In response, the computing device 100 presents a notification 806 instructing the user to change a pose of the computing device 100 to complete the switch between camera modes, as shown in FIG. 8B. - A user may change the pose of the computing device in various manners. In
FIG. 8B, the user rotates the second portion 104 counterclockwise relative to the first portion 102, until the computing device 100 is in a single portrait configuration shown in FIG. 8C. When sensor data obtained from the first and second 6DOF motion sensors indicates that the user has completed the instructed pose change, the computing device 100 ceases presentation of the notification, and the second display 108 and associated touch sensor may be inactivated while facing away from the user. Further, the computing device 100 outputs the world-facing camera stream to the first display 106. - In some instances, a user may inadvertently select an in-application user interface toggle to switch between rear- and front-facing camera modes, and the computing device may present a
notification 806 directing the user to change the computing device pose. The user may undo the request to switch camera modes, and thus dismiss the notification 806, by again selecting the user interface toggle 802. - Some third-party applications also may not be configured to automatically change camera modes based upon changes in device pose. When a change in computing device pose is detected while the computing device runs a third-party application, a
computing device 100 may present a notification to alert a user of a detected change in the camera mode. A user of the computing device then may confirm the change in camera mode to dismiss the notification, or may revert the computing device to the prior device pose and keep the camera in the same mode. -
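The confirm-or-revert flow for pose-unaware third-party applications can be sketched as a small decision function; the mode strings, action names, and return convention are illustrative assumptions.

```python
def handle_pose_change(camera_mode: str, pose_world_facing: bool, user_action: str):
    """Resolve a detected pose change for a pose-unaware third-party app.

    Returns (resulting_camera_mode, notification_still_shown). user_action
    is "confirm" (user accepts the detected mode change, dismissing the
    notification) or "revert" (user intends to return to the prior pose).
    """
    detected_mode = "world-facing" if pose_world_facing else "user-facing"
    if detected_mode == camera_mode:
        return camera_mode, False  # pose matches the app's mode; nothing to do
    if user_action == "confirm":
        return detected_mode, False  # switch modes and dismiss the notification
    return camera_mode, True  # keep the prior mode; notification remains until the pose reverts
```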
FIGS. 9A-9C depict an example use scenario in which a third-party application that accesses a camera stream of the camera 122 is opened on the second portion 104. After launching the application (FIG. 9A), a user folds the second portion 104 behind the first portion 102, such that the camera 122 is directed away from the user (FIG. 9B). The computing device 100 detects the pose of the second portion 104 relative to the first portion 102 and determines that the pose is indicative of the camera being world-facing, e.g. based upon the data received from the motion sensors on each portion 102 and 104. The computing device 100 presents a notification 902 alerting the user to the detected change of camera mode. In some examples, the computing device 100 may present the notification 902 as an overlay stacked on top of the application without dismissing (e.g. closing) the application. In other examples, the computing device 100 may present the notification 902 on a separate display from the application, e.g. to present the notification 902 on a user-facing display when the application is displayed on a world-facing display. In further examples, the computing device 100 may output the notification in any other suitable manner. - To confirm the change in camera mode, the user provides touch input to the
computing device 100 by selecting the in-application user interface toggle 904 on the first display 106. In response to receiving user input, the computing device dismisses the notification. As another example, the user provides touch input to the computing device 100 by tapping the first display 106 at a location of the notification 902 with a finger (as shown in dotted lines in FIG. 9B) or stylus. In response to receiving user input, the computing device 100 dismisses the notification 902 and outputs the application to the first display 106 (FIG. 9C). The application also may switch virtual cameras (described above) from which the camera stream is obtained. As another example, the user may manifest intent to change camera modes by touching the first display 106 at an in-application user interface toggle 904 (e.g. while in the pose of FIG. 9A), after which the user may be prompted to flip the camera around to a world-facing view, and after which the application obtains the camera stream from a different virtual camera. - As mentioned above, the
computing device 100 includes a flash for the camera. The computing device 100 further is configured to operate the flash in a “flashlight mode” that is separate from camera operation. The flash may be utilized as a flashlight in any computing device pose. FIG. 10 depicts use of the flash 124 as a flashlight in single portrait 1002, single landscape 1004, double portrait 1006, and double landscape 1008 poses of a computing device. - A user may invoke the flashlight mode in any suitable manner. As mentioned above with reference to
FIG. 2, a lock screen user interface may include a user interface element 204 that is selectable to turn on the flash 124 in a flashlight mode when the computing device is in a locked state. In some examples, the user interface element 204 may be selectable to launch the flashlight mode without providing user credentials to change the lock state. - A user may also invoke the flashlight mode when the
computing device 100 is in an unlocked state. For example, the notification/action center that provides an overview of alerts from computing applications may also include a user interface element that is selectable to turn on the camera flash in flashlight mode. FIG. 11 depicts the computing device 100 in an unlocked state. In this example, the computing device 100 displays a notification center via the first display 106 and displays a plurality of application icons via the second display 108, wherein the notification center comprises a flashlight control 1102. In other examples, the computing device 100 may include any other suitable mechanism for launching the flashlight. - In
FIG. 11, the user interface element 1102 for controlling operation of the flash as a flashlight is displayed on the first display 106, rather than on the second display that includes the flash. In instances where a user rotates the second portion 104 comprising the flash to direct the flash away from the user, the user interface element 1102 for controlling the flash remains on the first display 106 for convenient user access. In other examples, a flashlight control user interface element 1102 may be invoked from the second display 108. - The
flash 124 is in a user-facing direction when the computing device is in a double portrait (or double landscape) configuration. Thus, when launched, the flash 124 initiates at low brightness (e.g., 0-20% of total brightness) as shown in FIG. 12, which may help to prevent eye discomfort in the user-facing pose. As explained in more detail below, a brightness of light emitted by the flash 124 in the flashlight mode may be controlled by the hinge angle, such that the brightness is increased once the hinge angle indicates that the light is likely rotated out of direct view. -
FIGS. 13A-13D depict an example use scenario for operating a flashlight application when a computing device is in a double portrait or a double landscape configuration. At FIG. 13A, the flash is off, and a user navigates to a notification/action center. At FIG. 13B, the user turns on the flashlight by selecting a user interface element 1102 displayed in the notification/action center. In a current user-facing orientation (e.g. as determined based upon motion data obtained from the motion sensors on each portion 102, 104), the flash 124 initiates at a lower brightness (e.g. 0-20% of full brightness). The computing device outputs a notification, e.g. in the notification/action center, instructing the user to adjust flashlight brightness by folding the second portion 104 behind the first portion 102. At FIG. 13C, the user rotates the second portion 104 counterclockwise while a pose of the first portion 102 remains relatively unchanged. The flash 124 emits increasingly brighter light as the hinge angle increases and the flash 124 moves in a direction away from the user. When the second portion 104 is folded behind the first portion 102, as shown at FIG. 13D, the flash emits light at or near full brightness. - As mentioned above, the
computing device 100 may be utilized in various other configurations, in addition or alternatively to a double portrait or a double landscape configuration. When the flashlight is invoked and a computing device is in a single portrait (or single landscape) configuration, initial flashlight brightness may be determined based on whether the flash is directed towards a user or the surrounding real-world environment (away from the user). FIGS. 14A-14D depict example use scenarios for launching a flashlight application when a computing device is in a single portrait configuration. - When a user turns on the flashlight and the
flash hardware 124 is user-facing (e.g. the first portion 102 is folded behind the second portion 104 from a perspective of a user), the flash 124 enters the flashlight mode at low brightness, as shown in FIGS. 14A and 14C. Further, the computing device may output a notification directing a user to switch displays for full flashlight brightness. In FIG. 14A, the flashlight is launched when the computing device is in a locked state, and the computing device outputs a notification 1404 to the lock screen directing the user to fold the second portion 104 behind the first portion 102 for full flashlight brightness. As shown in FIG. 14B, flashlight brightness is 100% when the flash 124 is moved to a world-facing direction. In FIG. 14C, the flashlight is launched from a notification/action center while the computing device is in an unlocked state, and the computing device outputs a notification 1406 in the notification/action center directing the user to switch screens for full flashlight brightness. Once the user rotates the computing device 100 such that the flash 124 is directed away from the user, the computing device 100 adjusts the flashlight brightness to 100%, as shown in FIG. 14D. When a user launches a flashlight application and the flash 124 is already in a world-facing direction, such as in the examples of FIGS. 14B and 14D, the flash 124 may initiate at full or near-full flashlight brightness. -
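The launch behavior in these scenarios amounts to picking an initial brightness from the device pose and flash direction. A minimal sketch, with 20% and 100% chosen as illustrative points in the stated "low" (0-20%) and full-brightness ranges, and with hypothetical pose names:

```python
def initial_flashlight_brightness(pose: str, flash_world_facing: bool) -> float:
    """Illustrative launch brightness for the flashlight mode.

    Pose names and the 0.2/1.0 values are assumptions for illustration:
    the passage above only states that the flash starts low when
    user-facing and at or near full brightness when already world-facing.
    """
    if pose in ("double_portrait", "double_landscape"):
        return 0.2  # flash faces the user in double poses
    # single portrait / single landscape: depends on flash direction
    return 1.0 if flash_world_facing else 0.2
```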
FIGS. 15A-15C depict an example use scenario for operating a flashlight application when a computing device is in a flip mode, single portrait configuration in which the flash 124 is world-facing. At FIG. 15A, the flash is off. At FIG. 15B, a user opens a notification/action center, which includes the user interface element 1102 for invoking a flashlight mode (e.g. by launching a flashlight application). At FIG. 15C, a user selects the user interface element 1102, which turns on the flash. In this example, the flashlight initiates at full brightness, as the flash 124 is world-facing (e.g. as determined from data received from motion sensors on each portion 102, 104). -
FIGS. 16A-16E depict an example use scenario for operating a flashlight application when a computing device is in a single portrait or single landscape configuration in which the flash 124 is user-facing. Similar to the scenario depicted in FIGS. 15A-15C, the flash is off at FIG. 16A. At FIG. 16B, a user opens a notification/action center. At FIG. 16C, the user selects the user interface element 1102, which invokes the flashlight application. However, in this scenario, the flash 124 is user-facing (e.g. as determined from motion sensors on each portion 102, 104), and thus initiates at low brightness. At FIG. 16D, the user turns the computing device around such that the flash 124 transitions to a world-facing direction. The computing device detects the resulting change in orientation of each portion 102, 104 and displays a notification 1602 on the user-facing display 106 (previously world-facing) requesting that the user activate the display. Once the user provides touch input to the display 106 or otherwise confirms intent to activate the display 106, the computing device 100 displays the notification/action center on the user-facing display 106 and increases the flashlight brightness, as indicated at FIG. 16E. - In the examples of
FIGS. 16C-16E, the flash 124 is active during a detected device flip transition. When the flash 124 is active, user interface controls for a flashlight application may automatically switch from one display screen to the other display screen—without user input confirming intent to change display screens—in response to a detected device flip transition. In other examples, a user may be prompted to provide input (e.g. touch input, such as a tap or double tap on a display screen) to switch which display screen displays user interface flashlight controls. This may help to prevent accidental screen switching when the computing device 100 is in a single-screen pose (single portrait or single landscape). - When the flashlight is on and the
computing device 100 is in an unlocked state, a brightness of the flash 124 in flashlight mode changes based upon changes in pose of the computing device. FIGS. 17A-17B depict an example use scenario in which a user moves a computing device 100 from a flip mode, single portrait (or landscape) configuration in which the flash 124 is world-facing to a double portrait (or landscape) configuration in which the flash 124 is user-facing. At FIG. 17A, the flash 124 is at full flashlight brightness while the computing device 100 is in the flip mode, single portrait configuration and the flash 124 is world-facing. As a user “unfolds” the computing device 100 from the flip mode, single portrait configuration to the double portrait configuration shown at FIG. 17B, the flashlight brightness dims as the hinge angle decreases. - Flashlight brightness also changes as a user moves the computing device from the double portrait (or landscape) configuration to the single portrait (or landscape) configuration to rotate the flash to an outward-facing pose when the computing device is in a locked state.
FIGS. 18A-18D depict an example use scenario in which, at FIG. 18A, the computing device 100 is in the double portrait configuration and the locked state. At FIG. 18B, a user turns on the flashlight by selecting the user interface element 204, which initiates the flashlight at low brightness. On the lock screen, the computing device 100 displays a notification 1802 instructing the user to fold back the second portion 104 to increase flashlight brightness. The user rotates the second portion 104 counterclockwise, at FIG. 18C, and flashlight brightness increases with increasing hinge angle. The computing device 100 also displays a brightness level indication 1804 to inform a user of a current flashlight brightness. At FIG. 18D, the flashlight operates at full brightness when the computing device 100 is in the single portrait configuration and the flash 124 is world-facing. - As mentioned above, the
computing device 100 automatically adjusts a brightness of a flashlight based upon changes in pose and hinge angle θ. The computing device 100 controls the brightness of the flashlight to illuminate the flash at a relatively lower brightness (e.g. 0-20% of full brightness) when the computing device 100 is in the double portrait and double landscape configurations, and at a relatively higher brightness (e.g. 90-100% of full brightness) when the computing device is in the single portrait and single landscape configurations and the flashlight is facing away from the user (e.g. as determined from motion sensors on each portion 102, 104). At intermediate hinge angles, the computing device 100 may control the flashlight to exhibit a brightness in a range of brightness values between the lowest brightness setting and the highest brightness setting. - Adjustments to light source brightness may be continuous as the hinge angle increases or decreases, or may be incremental. In a more specific example, a computing device in a double portrait or double landscape configuration may not increase flashlight brightness from the low brightness value to a higher brightness value until the computing device detects a threshold increase, e.g. 10 degrees, in the hinge angle θ, and then may increase flashlight brightness continuously or in increments as the hinge angle increases further.
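The pose- and hinge-angle-driven brightness control just described can be sketched as a single mapping from hinge angle to brightness. The 180° and 360° endpoints, the linear ramp, and the reuse of the 10° figure as the gating threshold are assumptions for illustration; the passage above only specifies the low (0-20%) and high (90-100%) bands and a threshold-then-incremental adjustment:

```python
def flashlight_brightness(hinge_angle: float, start_angle: float = 180.0,
                          threshold: float = 10.0) -> float:
    """Map hinge angle theta (degrees) to flashlight brightness (0.0-1.0).

    Assumed geometry, for illustration only: ~180 degrees for the
    double-portrait pose (held at 20% brightness) and ~360 degrees for the
    fully folded-back, world-facing pose (100%). Brightness holds at the
    low value until the hinge has opened at least `threshold` degrees past
    `start_angle`, then ramps linearly with the angle.
    """
    if hinge_angle - start_angle < threshold:
        return 0.2  # below the threshold increase: stay at low brightness
    frac = (hinge_angle - 180.0) / 180.0
    frac = max(0.0, min(1.0, frac))  # clamp outside the assumed range
    return 0.2 + 0.8 * frac
```

With these assumed endpoints, hinge angles of 180°, 225°, 270°, 315°, and 360° reproduce the 20/40/60/80/100% progression shown in FIGS. 19A-19E.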
-
FIGS. 19A-19E depict the computing device 100 automatically adjusting flashlight brightness based upon changing hinge angle. In FIG. 19A, the computing device is in a double portrait configuration and the light source brightness is 20% of full brightness. The hinge angle increases to the pose shown in FIG. 19B, and the flashlight brightness increases to 40% of full brightness. The hinge angle increases again to the pose shown in FIG. 19C, and the flashlight brightness increases to 60% of full brightness. As shown in FIG. 19D, further increases in the hinge angle cause the computing device to further increase the light source brightness. In FIG. 19E, the computing device is in a single portrait configuration and the flashlight brightness is 100% of full brightness. - As mentioned above, the computing device described herein further is configured to decrease flashlight brightness as hinge angle decreases. In the example of
FIGS. 19A-19E, the computing device decreases flashlight brightness from 100% to 80% upon detecting the change in hinge angle as the computing device 100 transitions from the single portrait pose shown in FIG. 19E to the pose shown in FIG. 19D. - In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
-
FIG. 20 schematically shows a non-limiting example of a computing system 2000 that can enact one or more of the methods and processes described above. Computing system 2000 is shown in simplified form. Computing system 2000 may take the form of one or more personal computers, server computers, tablet computers, home-entertainment computers, network computing devices, gaming devices, mobile computing devices, mobile communication devices (e.g., smart phone), and/or other computing devices, and wearable computing devices such as smart wristwatches and head mounted augmented reality devices. -
Computing system 2000 includes a logic processor 2002, volatile memory 2004, and a non-volatile storage device 2006. Computing system 2000 may optionally include a display subsystem 2008, input subsystem 2010, communication subsystem 2012, and/or other components not shown in FIG. 20. -
Logic processor 2002 includes one or more physical devices configured to execute instructions. For example, the logic processor may be configured to execute instructions that are part of one or more applications, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result. - The logic processor may include one or more physical processors (hardware) configured to execute software instructions. Additionally or alternatively, the logic processor may include one or more hardware logic circuits or firmware devices configured to execute hardware-implemented logic or firmware instructions. Processors of the
logic processor 2002 may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic processor optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic processor may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration. In such a case, it will be understood that these virtualized aspects may be run on different physical logic processors of various different machines. -
Non-volatile storage device 2006 includes one or more physical devices configured to hold instructions executable by the logic processors to implement the methods and processes described herein. When such methods and processes are implemented, the state of non-volatile storage device 2006 may be transformed—e.g., to hold different data. -
Non-volatile storage device 2006 may include physical devices that are removable and/or built-in. Non-volatile storage device 2006 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., ROM, EPROM, EEPROM, FLASH memory, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), or other mass storage device technology. Non-volatile storage device 2006 may include nonvolatile, dynamic, static, read/write, read-only, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. It will be appreciated that non-volatile storage device 2006 is configured to hold instructions even when power is cut to the non-volatile storage device 2006. -
Volatile memory 2004 may include physical devices that include random access memory. Volatile memory 2004 is typically utilized by logic processor 2002 to temporarily store information during processing of software instructions. It will be appreciated that volatile memory 2004 typically does not continue to store instructions when power is cut to the volatile memory 2004. - Aspects of
logic processor 2002, volatile memory 2004, and non-volatile storage device 2006 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example. - The terms “module,” “program,” and “engine” may be used to describe an aspect of
computing system 2000 typically implemented in software by a processor to perform a particular function using portions of volatile memory, which function involves transformative processing that specially configures the processor to perform the function. Thus, a module, program, or engine may be instantiated via logic processor 2002 executing instructions held by non-volatile storage device 2006, using portions of volatile memory 2004. It will be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. - When included,
display subsystem 2008 may be used to present a visual representation of data held by non-volatile storage device 2006. The visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the non-volatile storage device, and thus transform the state of the non-volatile storage device, the state of display subsystem 2008 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 2008 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic processor 2002, volatile memory 2004, and/or non-volatile storage device 2006 in a shared enclosure, or such display devices may be peripheral display devices. - When included,
input subsystem 2010 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity; and/or any other suitable sensor. - When included,
communication subsystem 2012 may be configured to communicatively couple various computing devices described herein with each other, and with other devices. Communication subsystem 2012 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network, such as an HDMI over Wi-Fi connection. In some embodiments, the communication subsystem may allow computing system 2000 to send and/or receive messages to and/or from other devices via a network such as the Internet. - Another example provides a computing device comprising a first portion comprising a first display, a second portion comprising a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism comprising one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing. In such an example, the instructions may additionally or alternatively be executable to detect a change in device pose, and when the change in device pose is indicative of a camera direction changing from a world-facing direction to a user-facing direction, then move the camera application from the first display to the second display. 
In such an example, the instructions may additionally or alternatively be executable to detect a change in device pose, and when the change in device pose is indicative of a camera direction changing from a user-facing direction to a world-facing direction, then move the camera application from the second display to the first display. In such an example, the device pose is indicative of the camera being user-facing and also indicative of the computing device being in a double portrait or double landscape configuration, and the instructions may additionally or alternatively be executable to receive a user input to move the camera application from the second display to the first display, and in response to receiving the user input, move the camera application from the second display to the first display. In such an example, the device pose is indicative of the camera being user-facing and also indicative of the computing device being in a double portrait or double landscape configuration, and the instructions may additionally or alternatively be executable to receive a user input moving the camera application to a location spanning each of the first display and the second display, and in response to the user input, output the camera application in a spanning mode. In such an example, the instructions may additionally or alternatively be executable to output the camera application in the spanning mode by outputting a camera stream to one of the first display and the second display and outputting a camera roll of captured photos to the other of the first display and the second display. In such an example, the instructions may additionally or alternatively be executable to receive, via one of the first display and the second display, a touch input to launch the camera application, and in response to receiving the touch input, output the camera application to the one of the first display and the second display. 
In such an example, the instructions may additionally or alternatively be executable to receive the sensor data by receiving sensor data indicative of a transition in device pose compared to an initial device pose at which the touch input was received.
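The pose-driven display selection summarized in this example can be sketched as follows. The function names and the "first"/"second" labels are illustrative assumptions, not an API defined by the patent:

```python
from typing import Optional

def camera_app_display(camera_world_facing: bool) -> str:
    """Select the display for the camera application, per the example
    above: the first display when the camera is world-facing, the second
    display when the camera is user-facing."""
    return "first" if camera_world_facing else "second"

def on_pose_change(prev_world_facing: bool,
                   new_world_facing: bool) -> Optional[str]:
    """Return the display to move the camera application to when a pose
    change flips the camera direction, or None when no move is needed."""
    if prev_world_facing == new_world_facing:
        return None
    return camera_app_display(new_world_facing)
```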
- Another example provides a method enacted on a computing device comprising a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, and a hinge angle sensing mechanism comprising one or more sensors, the method comprising receiving, at one of the first display and the second display, an input to launch a camera application, in response to receiving the input, outputting the camera application to the one of the first display and the second display, receiving sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determining a change in pose of the computing device, and based at least in part on the change in pose of the computing device, outputting the camera application to the other of the first display and the second display to maintain a user-facing orientation. In such an example, the change in device pose is indicative of the camera changing from a user-facing direction to a world-facing direction, and outputting the camera application to the other of the first display and the second display may additionally or alternatively comprise moving the camera application from the second display to the first display. In such an example, moving the camera application from the second display to the first display further may additionally or alternatively comprise powering off the second display. In such an example, determining the change in device pose may additionally or alternatively comprise determining a transition from a double portrait or double landscape device pose to a single portrait or single landscape device pose. 
In such an example, determining the change in device pose may additionally or alternatively comprise determining a rotation of the computing device in a single portrait or single landscape device pose such that the camera transitions from a user-facing direction to a world-facing direction or vice versa.
- Another example provides a computing device comprising a first portion, a second portion comprising a light source, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism comprising one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a flashlight application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a pose of the computing device, control the light source to emit light at a relatively lower brightness when the pose of the computing device is indicative of the light source being user-facing, and control the light source to emit light at a relatively higher brightness when the pose of the computing device is indicative of the light source being world-facing. In such an example, the instructions may additionally or alternatively be executable to detect a change in pose of the computing device, when the change in the pose is indicative of a direction of the light source changing from a world-facing direction to a user-facing direction, decrease a brightness of the light source, and when the change in the pose is indicative of the direction of the light source changing from the user-facing direction to the world-facing direction, increase the brightness of the light source. In such an example, the instructions may additionally or alternatively be executable to dynamically adjust the brightness of the light source during the change in pose of the computing device. In such an example, the instructions may additionally or alternatively be executable to increase or decrease the brightness of the light source when a change in hinge angle exceeds a threshold hinge angle. In such an example, the first portion may additionally or alternatively comprise a first display and the second portion may additionally or alternatively comprise a second display. 
In such an example, the instructions may additionally or alternatively be executable to output a brightness indicator via at least one display of the first display and the second display. In such an example, the instructions may additionally or alternatively be executable to, when the pose of the computing device is indicative of the light source being user-facing, output via at least one of the first display and the second display a notification instructing a user to change the pose of the computing device.
- It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
- The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/719,740 US11561587B2 (en) | 2019-10-01 | 2019-12-18 | Camera and flashlight operation in hinged device |
PCT/US2020/042434 WO2021066923A1 (en) | 2019-10-01 | 2020-07-17 | Camera and flashlight operation in hinged device |
US18/157,417 US20230152863A1 (en) | 2019-10-01 | 2023-01-20 | Camera and flashlight operation in hinged device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962909199P | 2019-10-01 | 2019-10-01 | |
US16/719,740 US11561587B2 (en) | 2019-10-01 | 2019-12-18 | Camera and flashlight operation in hinged device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/157,417 Continuation US20230152863A1 (en) | 2019-10-01 | 2023-01-20 | Camera and flashlight operation in hinged device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210096611A1 true US20210096611A1 (en) | 2021-04-01 |
US11561587B2 US11561587B2 (en) | 2023-01-24 |
Family ID: 75163137
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/719,740 Active 2040-04-07 US11561587B2 (en) | 2019-10-01 | 2019-12-18 | Camera and flashlight operation in hinged device |
US18/157,417 Pending US20230152863A1 (en) | 2019-10-01 | 2023-01-20 | Camera and flashlight operation in hinged device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/157,417 Pending US20230152863A1 (en) | 2019-10-01 | 2023-01-20 | Camera and flashlight operation in hinged device |
Country Status (2)
Country | Link |
---|---|
US (2) | US11561587B2 (en) |
WO (1) | WO2021066923A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11201962B2 (en) * | 2019-10-01 | 2021-12-14 | Microsoft Technology Licensing, Llc | Calling on a multi-display device |
US20210409610A1 (en) * | 2020-06-30 | 2021-12-30 | Snap Inc. | Third-party modifications for a camera user interface |
US11281419B2 (en) * | 2020-06-29 | 2022-03-22 | Microsoft Technology Licensing, Llc | Instruction color book painting for dual-screen devices |
US11416130B2 (en) | 2019-10-01 | 2022-08-16 | Microsoft Technology Licensing, Llc | Moving applications on multi-screen computing device |
US20220269464A1 (en) * | 2021-02-23 | 2022-08-25 | Beijing Xiaomi Mobile Software Co., Ltd. | Method for acquiring sensor data, terminal, and storage medium |
US11445097B2 (en) * | 2019-10-07 | 2022-09-13 | Samsung Electronics Co., Ltd | Apparatus and method for providing illumination of camera in electronic device |
US20220350373A1 (en) * | 2019-12-27 | 2022-11-03 | Intel Corporation | Hinge Angle Detection |
US20230276116A1 (en) * | 2022-02-28 | 2023-08-31 | Motorola Mobility Llc | Electronic device with automatic eye gaze tracking and camera adjustment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6069648A (en) * | 1997-08-22 | 2000-05-30 | Hitachi, Ltd. | Information communication terminal device |
US20120117495A1 (en) * | 2010-10-01 | 2012-05-10 | Imerj, Llc | Dragging an application to a screen using the application manager |
US20150365590A1 (en) * | 2013-03-06 | 2015-12-17 | Nec Corporation | Imaging device, imaging method and program |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7536650B1 (en) | 2003-02-25 | 2009-05-19 | Robertson George G | System and method that facilitates computer desktop use via scaling of displayed objects with shifts to the periphery |
US8984440B2 (en) | 2010-10-01 | 2015-03-17 | Z124 | Managing expose views in dual display communication devices |
US8504936B2 (en) | 2010-10-01 | 2013-08-06 | Z124 | Changing stack when swapping |
US20060107226A1 (en) | 2004-11-16 | 2006-05-18 | Microsoft Corporation | Sidebar autohide to desktop |
KR101217554B1 (en) | 2006-05-09 | 2013-01-02 | 삼성전자주식회사 | seamless foldable display device |
US8890802B2 (en) | 2008-06-10 | 2014-11-18 | Intel Corporation | Device with display position input |
US8947320B2 (en) | 2008-09-08 | 2015-02-03 | Qualcomm Incorporated | Method for indicating location and direction of a graphical user interface element |
US8194001B2 (en) | 2009-03-27 | 2012-06-05 | Microsoft Corporation | Mobile computer device display postures |
US20100321275A1 (en) | 2009-06-18 | 2010-12-23 | Microsoft Corporation | Multiple display computing device with position-based operating modes |
US8548523B2 (en) | 2009-07-01 | 2013-10-01 | At&T Intellectual Property I, L.P. | Methods, apparatus, and computer program products for changing ring method based on type of connected device |
US20110143769A1 (en) | 2009-12-16 | 2011-06-16 | Microsoft Corporation | Dual display mobile communication device |
US20120084736A1 (en) | 2010-10-01 | 2012-04-05 | Flextronics Id, Llc | Gesture controlled screen repositioning for one or more displays |
EP3734407A1 (en) | 2011-02-10 | 2020-11-04 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
US9351237B2 (en) | 2011-09-27 | 2016-05-24 | Z124 | Displaying of charging status on dual screen device |
US9524030B2 (en) | 2013-04-26 | 2016-12-20 | Immersion Corporation | Haptic feedback for interactions with foldable-bendable displays |
KR20150026403A (en) | 2013-09-03 | 2015-03-11 | 삼성전자주식회사 | Dual-monitoring system and method |
US20150100914A1 (en) | 2013-10-04 | 2015-04-09 | Samsung Electronics Co., Ltd. | Gestures for multiple window operation |
US10021247B2 (en) | 2013-11-14 | 2018-07-10 | Wells Fargo Bank, N.A. | Call center interface |
KR102561200B1 (en) | 2014-02-10 | 2023-07-28 | 삼성전자주식회사 | User terminal device and method for displaying thereof |
US20150370322A1 (en) | 2014-06-18 | 2015-12-24 | Advanced Micro Devices, Inc. | Method and apparatus for bezel mitigation with head tracking |
CN104239094B (en) | 2014-08-29 | 2017-12-08 | 小米科技有限责任公司 | Control method, device and the terminal device of background application |
US10558344B2 (en) | 2015-06-01 | 2020-02-11 | Apple Inc. | Linking multiple windows in a user interface display |
US10291873B2 (en) | 2015-11-20 | 2019-05-14 | Hattar Tanin, LLC | Dual-screen electronic devices |
KR102480462B1 (en) | 2016-02-05 | 2022-12-23 | 삼성전자주식회사 | Electronic device comprising multiple displays and method for controlling thereof |
KR102558164B1 (en) | 2016-08-30 | 2023-07-21 | 삼성전자 주식회사 | Method for providing notification service related to the call back and an electronic device |
US10346117B2 (en) | 2016-11-09 | 2019-07-09 | Microsoft Technology Licensing, Llc | Device having a screen region on a hinge coupled between other screen regions |
CN106681641A (en) | 2016-12-23 | 2017-05-17 | 珠海市魅族科技有限公司 | Split screen display method and device |
CN107040719A (en) | 2017-03-21 | 2017-08-11 | 宇龙计算机通信科技(深圳)有限公司 | Filming control method and imaging control device based on double screen terminal |
US10567630B2 (en) | 2017-05-12 | 2020-02-18 | Microsoft Technology Licensing, Llc | Image capture using a hinged device with multiple cameras |
US10204592B1 (en) | 2017-11-30 | 2019-02-12 | Dell Products L.P. | Configuring multiple displays of a computing device to have a similar perceived appearance |
US11169577B2 (en) | 2018-04-04 | 2021-11-09 | Microsoft Technology Licensing, Llc | Sensing relative orientation of computing device portions |
DK180316B1 (en) | 2018-06-03 | 2020-11-06 | Apple Inc | Devices and methods for interacting with an application switching user interface |
KR102638783B1 (en) | 2018-10-17 | 2024-02-22 | 삼성전자주식회사 | Electronic device for controlling application according to folding angle and method thereof |
US11416130B2 (en) | 2019-10-01 | 2022-08-16 | Microsoft Technology Licensing, Llc | Moving applications on multi-screen computing device |
US11201962B2 (en) | 2019-10-01 | 2021-12-14 | Microsoft Technology Licensing, Llc | Calling on a multi-display device |
2019
- 2019-12-18 US US16/719,740 patent/US11561587B2/en active Active
2020
- 2020-07-17 WO PCT/US2020/042434 patent/WO2021066923A1/en active Application Filing
2023
- 2023-01-20 US US18/157,417 patent/US20230152863A1/en active Pending
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11895261B2 (en) * | 2019-10-01 | 2024-02-06 | Microsoft Technology Licensing, Llc | Calling on a multi-display device |
US20220030104A1 (en) * | 2019-10-01 | 2022-01-27 | Microsoft Technology Licensing, Llc | Calling on a multi-display device |
US11201962B2 (en) * | 2019-10-01 | 2021-12-14 | Microsoft Technology Licensing, Llc | Calling on a multi-display device |
US11416130B2 (en) | 2019-10-01 | 2022-08-16 | Microsoft Technology Licensing, Llc | Moving applications on multi-screen computing device |
US11445097B2 (en) * | 2019-10-07 | 2022-09-13 | Samsung Electronics Co., Ltd | Apparatus and method for providing illumination of camera in electronic device |
US11809237B2 (en) * | 2019-12-27 | 2023-11-07 | Intel Corporation | Hinge angle detection |
US20220350373A1 (en) * | 2019-12-27 | 2022-11-03 | Intel Corporation | Hinge Angle Detection |
US11656830B2 (en) * | 2020-06-29 | 2023-05-23 | Microsoft Technology Licensing, Llc | Instruction color book painting for dual-screen devices |
US20220171592A1 (en) * | 2020-06-29 | 2022-06-02 | Microsoft Technology Licensing, Llc | Instruction color book painting for dual-screen devices |
US20230297310A1 (en) * | 2020-06-29 | 2023-09-21 | Microsoft Technology Licensing, Llc | Instruction color book painting for dual-screen devices |
US11281419B2 (en) * | 2020-06-29 | 2022-03-22 | Microsoft Technology Licensing, Llc | Instruction color book painting for dual-screen devices |
US20210409610A1 (en) * | 2020-06-30 | 2021-12-30 | Snap Inc. | Third-party modifications for a camera user interface |
US20220269464A1 (en) * | 2021-02-23 | 2022-08-25 | Beijing Xiaomi Mobile Software Co., Ltd. | Method for acquiring sensor data, terminal, and storage medium |
US20230276116A1 (en) * | 2022-02-28 | 2023-08-31 | Motorola Mobility Llc | Electronic device with automatic eye gaze tracking and camera adjustment |
US11889178B2 (en) * | 2022-02-28 | 2024-01-30 | Motorola Mobility Llc | Electronic device with automatic eye gaze tracking and camera adjustment |
Also Published As
Publication number | Publication date |
---|---|
US11561587B2 (en) | 2023-01-24 |
US20230152863A1 (en) | 2023-05-18 |
WO2021066923A1 (en) | 2021-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11561587B2 (en) | Camera and flashlight operation in hinged device | |
US11706521B2 (en) | User interfaces for capturing and managing visual media | |
EP3590028B1 (en) | Systems and methods for window control in virtual reality environment | |
KR102271289B1 (en) | Flexible device and method for performing interfacing thereof | |
US20080048980A1 (en) | Detecting movement of a computer device to effect movement of selected display objects | |
US9304591B2 (en) | Gesture control | |
US20180088775A1 (en) | Method of controlling mobile device with touch-sensitive display and motion sensor, and mobile device | |
US9888172B2 (en) | Systems and methods for capturing images from a lock screen | |
US8988459B2 (en) | Method and apparatus for operating a display unit of a mobile device | |
KR20200005211A (en) | Electronic device and method for changing location of preview image according to direction of camera | |
EP3625663A1 (en) | Configuration of primary and secondary displays | |
US20130145308A1 (en) | Information Processing Apparatus and Screen Selection Method | |
US20120038675A1 (en) | Assisted zoom | |
CN114585996A (en) | Mobile applications on multi-screen computing devices | |
KR20130127288A (en) | Portable device and controlling method thereof | |
TWI547855B (en) | Information processing device, information processing method and program | |
US10139986B2 (en) | Data sharing by displaying projections of devices | |
US9591226B2 (en) | Information processing apparatus, information processing method, and program | |
US9875075B1 (en) | Presentation of content on a video display and a headset display | |
KR101861377B1 (en) | Method for controlling screen based on motion of mobile terminal and the mobile terminal therefor | |
KR20200076588A (en) | System and method for head mounted device input | |
KR20170066916A (en) | Electronic apparatus and controlling method of thereof | |
JP5865615B2 (en) | Electronic apparatus and control method | |
WO2021066989A1 (en) | Drag and drop operations on a touch screen display | |
TW201403454A (en) | Screen rotating method and system |
Legal Events
Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHENONE, SCOTT D.;TUOMI, OTSO JOONA CASIMIR;SONNINO, EDUARDO;AND OTHERS;SIGNING DATES FROM 20191212 TO 20200428;REEL/FRAME:052899/0682
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
| STCF | Information on status: patent grant | Free format text: PATENTED CASE