US20210006730A1 - Computing device - Google Patents
Computing device
- Publication number
- US20210006730A1 (application US16/869,413)
- Authority
- US
- United States
- Prior art keywords
- computing device
- camera
- orientation
- view
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H04N5/247—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1601—Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
- G06F1/1605—Multimedia displays, e.g. with integrated or attached speakers, cameras, microphones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1656—Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories
- G06F1/166—Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories related to integrated arrangements for adjusting the position of the main body with respect to the supporting surface, e.g. legs for adjusting the tilt angle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1688—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being integrated loudspeakers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/73—Querying
- G06F16/732—Query formulation
- G06F16/7335—Graphical querying, e.g. query-by-region, query-by-sketch, query-by-trajectory, GUIs for designating a person/face/object as a query predicate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/73—Querying
- G06F16/738—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/40—Document-oriented image-based pattern recognition
- G06V30/41—Analysis of document content
- G06V30/413—Classification of content, e.g. text, photographs or tables
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/175—Static expression
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/08—Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
- G09B5/12—Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H04N5/2253—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/04—Circuits for transducers, loudspeakers or microphones for correcting frequency response
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/12—Circuits for transducers, loudspeakers or microphones for distributing signals to two or more loudspeakers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R5/00—Stereophonic arrangements
- H04R5/04—Circuit arrangements, e.g. for selective connection of amplifier inputs/outputs to loudspeakers, for loudspeaker detection, or for adaptation of settings to personal preferences or hearing impairments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1631—Panel PC, e.g. single housing hosting PC and display panel
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2420/00—Details of connection covered by H04R, not provided for in its groups
- H04R2420/01—Input selection or mixing for amplifiers or loudspeakers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2430/00—Signal processing covered by H04R, not provided for in its groups
- H04R2430/01—Aspects of volume control, not necessarily automatic, in sound systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2499/00—Aspects covered by H04R or H04S not otherwise provided for in their subgroups
- H04R2499/10—General applications
- H04R2499/11—Transducers incorporated or for use in hand-held devices, e.g. mobile phones, PDA's, camera's
Definitions
- the present disclosure relates to a computing device.
- An electronic device often has cameras for capturing videos or images.
- cameras of the electronic device usually include a front camera to capture objects in front of the electronic device and a back camera to capture objects behind the electronic device.
- a user often needs to manually select either the front camera or the back camera based on the location of the object relative to the electronic device. The selected camera may then capture the object and the non-selected camera may not be utilized.
- the electronic device often includes various input/output elements, and the position of the input/output elements relative to the user may change when the electronic device is situated in different positions.
- the electronic device often requires the user to memorize the functionality associated with each input/output element and accurately locate the desired input/output element to interact with the electronic device.
- the electronic device is therefore usually inconvenient for the user to use, especially for young children and the elderly.
- a computing device may include: a display screen located on a front surface of a housing; a support located on a back surface of the housing, where: the support is configured to support the display screen in a first position relative to a physical surface when situated in a first orientation; and the support is configured to support the display screen in a second position relative to the physical surface when situated in a second orientation; a first camera located on a first peripheral side of the front surface, the first camera being configured to capture a first field of view; and a second camera located on a second peripheral side of the front surface, where: the first peripheral side is opposite the second peripheral side; and the second camera is configured to capture a second field of view that is different from the first field of view.
- Implementations may include one or more of the following features.
- the computing device where: the second orientation and the first orientation are different; when the display screen is situated in the first orientation, the display screen is in a first viewing position and the first camera captures the first field of view that includes a portion of a surface in front of the housing and the second camera captures the second field of view that includes a portion of an area in front of the display screen; and when the display screen is situated in the second orientation, the display screen is in a second viewing position and the first camera captures the first field of view that includes a first portion of an area above the display screen and the second camera captures the second field of view that includes a second portion of an area above the display screen.
- the computing device where: a tilt angle between the display screen and the physical surface when the display screen is positioned in the first orientation is greater than a tilt angle between the display screen and the physical surface when the display screen is positioned in the second orientation.
- the computing device may include: an orientation sensor that detects when the display screen is situated in the first orientation and when the display screen is situated in the second orientation; and an activity application that executes a routine based on whether the orientation sensor detects the first orientation or the second orientation.
- the computing device may include: when the orientation sensor detects that the display screen is positioned in the first orientation, the activity application identifies the first camera located on the first peripheral side as a top camera of the display screen and the second camera located on the second peripheral side as a bottom camera of the display screen; the first field of view of the top camera is directed downward towards the physical surface; and the second field of view of the bottom camera is directed upward towards a user facing the display screen.
- the top camera is configured to capture a first video stream that includes an activity scene of the physical surface in the first field of view; and the bottom camera is configured to capture a second video stream that includes a face of the user in the second field of view.
- the computing device where: when the orientation sensor detects that the display screen is positioned in the second orientation: an activity application identifies the first camera located on the first peripheral side as a bottom camera of the display screen and the second camera located on the second peripheral side as a top camera of the display screen; and the first field of view of the bottom camera and the second field of view of the top camera are directed upward.
- the bottom camera is configured to capture a first video stream that includes a first portion of a face of a user in the first field of view
- the top camera is configured to capture a second video stream that includes a second portion of the face of the user in the second field of view.
- the computing device where: the bottom camera is configured to capture a first video stream that includes a first user in the first field of view; and the top camera is configured to capture a second video stream that includes a second user in the second field of view.
- the first peripheral side includes a protrusion that extends outwardly from the front surface of the housing; the first camera is positioned on the protrusion and directed towards the physical surface; and the first field of view of the first camera is configured to capture at least a portion of the physical surface.
- the computing device where the support may include: a supporting element that is integral with or coupled to the back surface of the housing; and the supporting element extends outwardly from the back surface of the housing and is situatable on a physical surface to support the display screen in the first orientation or the second orientation.
- One general aspect includes a method that may include: determining that a computing device is positioned in a first orientation on a physical surface, where: the computing device includes a first camera configured to capture a first field of view and a second camera configured to capture a second field of view that is different from the first field of view; and the first field of view of the first camera is directed towards the physical surface and the second field of view of the second camera is directed towards a user facing the computing device when the computing device is situated in the first orientation; capturing, using the first camera of the computing device, a first video stream that includes an activity scene of the physical surface in the first field of view; capturing, using the second camera of the computing device, a second video stream that includes the user in the second field of view; determining, in an activity application of the computing device, an operation routine of the activity application based on the first video stream including the activity scene of the physical surface and the second video stream including the user; and executing the operation routine in the activity application.
- Implementations may include one or more of the following features.
- the method where determining the operation routine of the activity application includes: detecting, in the first video stream, a tangible object in the activity scene of the physical surface; determining, in the second video stream, a user state of the user; and determining the operation routine of the activity application based on the tangible object in the activity scene and the user state of the user.
- the method where: determining the user state of the user includes: determining a facial feature of the user in the second video stream; and determining the user state of the user based on the facial feature of the user; and the operation routine of the activity application includes adjusting one or more of a task complexity level and an instruction detail level associated with the user in the activity application based on the user state of the user.
- the method where: the first camera is located on a first peripheral side of a front surface of a housing of a display screen; and the second camera is located on a second peripheral side of the front surface of the housing of the display screen, where the first peripheral side is opposite the second peripheral side.
- the method may include: when the computing device is positioned in the first orientation on the physical surface: the first camera becomes a top camera of the computing device, the top camera being directed downward towards the physical surface; and the second camera becomes a bottom camera of the computing device, the bottom camera being directed upward towards the user facing the computing device.
- the method may include: determining that the computing device is positioned in a second orientation on the physical surface, where the first field of view of the first camera is directed towards the user facing the computing device when the computing device is situated in the second orientation, and the second field of view of the second camera is directed towards the user facing the computing device when the computing device is situated in the second orientation; capturing, using the first camera of the computing device, a third video stream that includes a first portion of a face of the user in the first field of view; capturing, using the second camera of the computing device, a fourth video stream that includes a second portion of the face of the user in the second field of view; and adjusting the operation routine of the activity application based on one or more of the third video stream including the first portion of the face of the user and the fourth video stream including the second portion of the face of the user.
- the method may include: when the computing device is positioned in the second orientation on the physical surface: the first camera becomes a bottom camera of the computing device, the bottom camera being directed upward towards the first portion of the face of the user in the first field of view; and the second camera becomes a top camera of the computing device, the top camera being directed upward towards the second portion of the face of the user in the second field of view.
- the computing device in a second orientation has been rotated as compared to the computing device in the first orientation; and a tilt angle between the computing device and the physical surface when the computing device is positioned in the second orientation on the physical surface is smaller than a tilt angle between the computing device and the physical surface when the computing device is positioned in the first orientation on the physical surface.
- One general aspect includes a computing device that may include: a display screen located on a front surface of the computing device in a first orientation; a first camera located on the front surface of the computing device near a top edge of the computing device in the first orientation, the first camera being configured to capture a first field of view; a second camera located on the front surface of the computing device near a bottom edge of the computing device in the first orientation, the second camera being configured to capture a second field of view different from the first field of view; and an orientation sensor configured to detect when the computing device is oriented into a second orientation, where in the second orientation the first camera is located near a bottom edge of the computing device and the second camera is located near a top edge of the computing device. An illustrative sketch of this orientation-dependent camera arrangement follows this summary.
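- By way of a non-limiting illustration, the orientation-dependent camera-role assignment summarized above might be sketched as follows; the Python representation, the 45-degree tilt threshold, and the names (Orientation, CameraRoles, classify_orientation, assign_camera_roles) are illustrative assumptions rather than part of the original disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class Orientation(Enum):
    FIRST = 1   # display screen substantially upright (e.g., FIGS. 1A-1C, 2A)
    SECOND = 2  # display screen substantially flat (e.g., FIG. 2B)


@dataclass
class CameraRoles:
    top: str
    bottom: str


def classify_orientation(tilt_angle_deg: float, threshold_deg: float = 45.0) -> Orientation:
    """Classify the orientation from the tilt angle between the display screen and
    the physical surface; a larger tilt angle corresponds to the first orientation."""
    return Orientation.FIRST if tilt_angle_deg > threshold_deg else Orientation.SECOND


def assign_camera_roles(orientation: Orientation) -> CameraRoles:
    """In the first orientation the first camera (first peripheral side) serves as the
    top camera; in the second orientation the two roles are swapped."""
    if orientation is Orientation.FIRST:
        return CameraRoles(top="first_camera_110", bottom="second_camera_120")
    return CameraRoles(top="second_camera_120", bottom="first_camera_110")


# Example: a 70-degree tilt is treated as the upright first orientation.
print(assign_camera_roles(classify_orientation(70.0)))
# CameraRoles(top='first_camera_110', bottom='second_camera_120')
```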
- FIG. 1A is a front view of an example computing device.
- FIGS. 1B and 1C respectively illustrate perspective views of an example computing device from a front perspective and a rear perspective.
- FIG. 2A illustrates a side view of a computing device in a first orientation and depicts fields of view of a first camera and a second camera.
- FIG. 2B illustrates a side view of a computing device in a second orientation and depicts fields of view of a first camera and a second camera.
- FIG. 3 illustrates a field of view of a first camera and a field of view of a second camera of an example computing device when the example computing device is situated in a first orientation on an activity surface.
- FIG. 4 is a block diagram illustrating an example computer system that includes one or more example computing devices.
- FIG. 5 is a block diagram of an example computing device.
- FIG. 6 is a flowchart of an example method for determining an orientation of a computing device.
- FIG. 7 is a flowchart of an example method for capturing video streams in a second orientation.
- FIGS. 8A and 8B illustrate different orientations of the computing device.
- FIGS. 1A-1C illustrate an example computing device 100 that includes a first camera 110 and a second camera 120 that are capable of operating simultaneously to capture video streams of different fields of view.
- a field of view of the first camera 110 may point downward towards a physical surface in front of the computing device, where a user may place one or more tangible objects, and the field of view of the second camera 120 may be directed outwards from a display screen 140 on the computing device 100 towards a user viewing the display screen 140 .
- the computing device 100 may be able to see both what objects a user places in front of the computing device 100 and what the user is doing at the same time (such as capturing facial expressions, speech, gestures, etc.).
- the computing device 100 may be positionable in different orientations, and the fields of view of the first camera 110 and the second camera 120 may differ relative to the physical surface the computing device 100 is positioned on in the different orientations.
- the computing device 100 may transition from the first orientation 101 to the second orientation 103 .
- a first orientation 101 as shown in FIGS. 1A-1C may be a first viewing position of the display screen 140 , in which the display screen 140 is in a substantially upright position and the field of view of the first camera 110 is directed downwards towards a physical surface in front of the computing device 100 .
- a second orientation 103 as shown in FIG. 2B may be a second viewing position of the display screen 140 , in which the display screen 140 is substantially flat or horizontal in position (e.g., resting substantially flat on a surface) and the field of view of the first camera 110 is directed outwards towards a top area above the computing device 100 .
- the computing device 100 may include a housing 105 that includes a front surface 107 in which a display screen 140 is positioned.
- the front surface 107 may be substantially flat and a display screen 140 and one or more cameras (e.g., a first camera 110 and/or a second camera 120 ) may be integrated into the front surface 107 .
- the front surface 107 of the housing 105 of the computing device 100 may also include one or more audio outputs 150 .
- the audio output 150 may be positioned on another surface of the computing device 100 , such as a back surface and/or side surface.
- the display screen 140 may be positioned on the front surface 107 of the housing 105 of the computing device 100 to allow for the display screen 140 to be easily viewed by a user.
- the display screen 140 may be viewable by a user in different orientations, such as a first orientation 101 where the display screen 140 is substantially vertical or a second orientation 103 where the display screen 140 is substantially horizontal as shown in more detail in FIGS. 2A and 2B .
- the display screen 140 may be adapted to orient the content of the display screen 140 relative to the orientation of the computing device 100 in order to allow content to be presented to the user as being upright, based on the orientation of the computing device 100 .
- the top of the display screen 140 in the first orientation 101 becomes the bottom of the display screen 140 in the second orientation 103 .
- the activity application(s) 414 may receive the orientation information from the orientation sensor 522 and may cause the content of the display screen 140 to be rotated 180 degrees in order to account for the change in orientation as described in more detail elsewhere herein.
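- As a minimal sketch of this behavior, assuming the rendered content is represented as a two-dimensional array (e.g., via NumPy) and using a hypothetical function name, the 180-degree rotation might look like:

```python
import numpy as np


def orient_display_content(frame: np.ndarray, second_orientation: bool) -> np.ndarray:
    """Rotate the rendered frame by 180 degrees when the device has been flipped into
    the second orientation so that content still appears upright to the user."""
    return np.rot90(frame, 2) if second_orientation else frame


# Example with a tiny 2x3 "frame" of pixel values.
frame = np.arange(6).reshape(2, 3)
print(orient_display_content(frame, second_orientation=True))
# [[5 4 3]
#  [2 1 0]]
```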
- the display screen 140 may occupy a portion of the front surface 107 of the computing device 100 and the edges of the front surface 107 around the display screen 140 may be referred to as peripheral sides.
- the peripheral sides (such as the first peripheral side 102 and the second peripheral side 104 ) may occupy other portions of the front surface 107 of the housing 105 of the computing device 100 along the periphery of the display screen 140 (such as the first side 142 of the display screen 140 and/or the second side 144 of the display screen 140 ).
- These other peripheral sides of the front surface 107 of the housing 105 may allow for other components of the computing device 100 to be positioned. For example, as depicted in the figures, the first camera 110 may be located on the first peripheral side 102 proximate to the first side 142 of the display screen 140 .
- the computing device 100 is positioned in the first orientation 101 and the first camera 110 may be identified as the top camera along the top peripheral side of the housing 105 of the computing device 100 .
- the second camera 120 may be located on the second peripheral side 104 proximate to the second side 144 of the display screen 140 .
- the second camera 120 may be identified as the bottom camera along the bottom peripheral side of the computing device 100 .
- the first camera 110 , the second camera 120 , and the display screen 140 are all located on the front surface 107 of the computing device 100 . It should be understood that while the first camera 110 , second camera 120 , and the display screen 140 are integrated into the front surface 107 of the housing 105 of the computing device 100 as shown, in other implementations, one or more of the first camera 110 , the second camera 120 , and/or the display screen 140 may be separate components that may be attachable to the front surface 107 or other portions of the housing 105 of the computing device 100 and may be positionable around the peripheral sides of the housing 105 of the computing device 100 .
- the first camera 110 is positioned opposite the second camera 120 on the front surface 107 of the housing 105 of the computing device 100 .
- the first camera 110 is centered along the first peripheral side 102 and the second camera 120 is centered along the second peripheral side 104 .
- additional cameras may be positioned on other portions of the front surface 107 of the housing 105 of the computing device 100 and these additional cameras may be configured to capture additional fields of view separate from the fields of view of the first camera 110 and/or the second camera 120 .
- the different fields of view may overlap and allow for stereo vision from the additional cameras, as described elsewhere herein.
- the activity application(s) 414 may select which cameras (from the first camera 110 , second camera 120 , and/or additional cameras) to use to capture a video stream and may limit the number of cameras used to capture video streams to improve processing time. In further implementations, the activity application(s) 414 may select specific cameras (from the first camera 110 , second camera 120 , and/or additional cameras) based on the orientation of the computing device 100 .
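- A minimal sketch of such orientation-based camera selection is shown below; the camera identifiers, the preference ordering per orientation, and the two-stream cap are illustrative assumptions only:

```python
def select_active_cameras(orientation, available_cameras, max_streams=2):
    """Choose which cameras should stream video in the given orientation, capping the
    number of simultaneous streams to bound processing cost.

    The preference tables are illustrative: when upright (first orientation) the
    surface-facing camera is prioritized; when flat (second orientation) the
    user-facing cameras are prioritized.
    """
    preference = {
        "first": ["first_camera_110", "second_camera_120", "auxiliary_camera"],
        "second": ["second_camera_120", "first_camera_110", "auxiliary_camera"],
    }
    ordered = [cam for cam in preference.get(orientation, available_cameras)
               if cam in available_cameras]
    return ordered[:max_streams]


print(select_active_cameras("first",
                            ["first_camera_110", "second_camera_120", "auxiliary_camera"]))
# ['first_camera_110', 'second_camera_120']
```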
- the first camera 110 and the second camera 120 may be positioned on the same peripheral side of the front surface 107 (such as the first peripheral side 102 or the second peripheral side 104 ) and the first camera 110 and the second camera 120 may be positioned to capture different fields of view despite being positioned on the same peripheral side.
- the first camera 110 and the second camera 120 may both be positioned on the first peripheral side 102 (such as adjacent to each other, or within a distance of a couple of inches but positioned on the first peripheral side 102 , etc.).
- the first camera 110 may have a first field of view (such as forward looking to capture things in front of the computing device, such as a user, etc.) and the second camera 120 may have a second field of view (such as downward looking to capture things (such as tangible objects, etc.) on a physical surface the computing device 100 is resting on).
- FIGS. 1B and 1C illustrate perspective views of the computing device 100 from a front perspective and a rear perspective, respectively.
- the computing device 100 may include a support to position the computing device 100 in the different orientations.
- the support may include a supporting element 130 for supporting and positioning the housing 105 of the computing device 100 on a physical surface in different orientations.
- the supporting element 130 may rest against the physical surface on different surfaces, such as a bottom surface 133 of the supporting element 130 or a back surface 135 of the supporting element 130 in order to prop the housing 105 of the computing device 100 up in the different orientations, such as the first orientation 101 shown in FIGS. 1B and 1C .
- the supporting element 130 may extend out from a back surface 150 of the housing 105 of the computing device 100 .
- the back surface 150 of the housing 105 of the computing device 100 may be substantially flat, and the supporting element 130 may extend out from a portion of the back surface 150 of the housing 105 of the computing device 100 .
- the supporting element 130 may extend out of a lower portion 152 of the back surface 150 of the housing 105 of the computing device 100 .
- the supporting element 130 may extend along at least a portion of the lower portion 152 of the back surface 150 of the housing 105 of the computing device 100 .
- the supporting element 130 may be integral to the housing 105 of the computing device 100 .
- the computing device 100 may be a single unit and all of the components may be housed within the housing 105 for easy portability and use.
- the supporting element 130 may be a rod or protrusion that may be capable of supporting the computing device 100 while being minimal in size.
- the supporting element 130 may be detachable and may be removed from the back surface 150 of the housing 105 of the computing device 100 in order to allow the computing device 100 to lay flat on a surface.
- the supporting element 130 may be foldable and may be able to fold flat against the back surface 150 of the housing 105 of the computing device 100 in order to allow the computing device 100 to lay flat on a surface.
- the supporting element 130 may house one or more components of the computing device 100 .
- a battery and/or other components of the computing device 100 may be positioned within the larger area of the supporting element 130 compared to the thinner area of the computing device 100 that includes the display screen 140 .
- the supporting element may include one or more access ports 132 , such as a power button 132 a , a memory card slot 132 b , an input/output port 132 c (such as a 3.5 mm port, etc.), etc.
- an additional supporting element 170 may be present on one or more surfaces of the housing 105 of the computing device, such as the back surface 150 of the housing 105 of the computing device 100 .
- the additional supporting element 170 may be a protrusion on the back surface 150 of the housing 105 of the computing device 100 or a piece of plastic/rubber/metal/etc. that is positioned on the back surface 150 .
- the additional supporting element 170 may be positioned on the back surface 150 of the housing 105 of the computing device 100 , such that when the computing device 100 is placed in different orientations, such as the second orientation 103 , the additional supporting element 170 comes into contact with a physical surface and assists in positioning and retaining the computing device 100 in the second orientation 103 , as shown in FIG. 2B .
- the computing device 100 may include additional hardware input/outputs, such as a volume control 180 and/or a screen lock button 182 . These additional hardware input/outputs may be positioned on a side surface of the housing 105 of the computing device 100 , as shown in FIGS. 1B and 1C where the additional hardware input/outputs may be easily accessible by a user without obscuring the display screen 140 and/or field of views of the cameras. In some implementations, these additional hardware input/outputs may be configurable to provide similar functions independent of the orientation of the computing device 100 as described elsewhere herein with respect to FIGS. 8A and 8B .
- a camera protrusion 160 may be included on the housing 105 of the computing device 100 .
- the camera protrusion 160 may be a molded portion of one of the peripheral sides (such as the first peripheral side 102 or the second peripheral side 104 ) of the housing 105 .
- the camera protrusion 160 may extend out from the front surface 107 and have a camera (such as the first camera 110 and/or the second camera 120 ) positioned within the camera protrusion 160 in order to increase what is included in the field of view of the camera as compared to a camera mounted flat on the front surface 107 of the housing 105 of the computing device 100 .
- the camera protrusion 160 may angle the camera to look down towards a physical surface in front of the display screen 140 and the computing device 100 , whereas a camera mounted flat on the front surface 107 would look out towards the area in front of the computing device 100 rather than down towards the physical surface the computing device 100 is resting on.
- the camera protrusion 160 may direct the field of view of the camera to include at least a portion of the housing 105 of the computing device 100 and/or the display screen 140 , and the information captured related to the display screen 140 and/or the housing 105 of the computing device 100 may be used by the activity application(s) 414 to change one or more routines being executed by the activity application(s) 414 .
- FIG. 2A illustrates the computing device 100 situated in the first orientation 101 .
- the computing device 100 is resting on a bottom side edge 190 of the housing 105 of the computing device 100 and the bottom surface 133 of the supporting element 130 .
- the front of the computing device 100 and the display screen 140 are positioned in a substantially vertical position.
- a tilt angle between the display screen 140 or a back surface of the housing 105 and the physical surface on which the display screen 140 is positioned in the first orientation 101 is greater than a tilt angle between the display screen 140 or the back surface of the housing 105 and the physical surface when the display screen is positioned in the second orientation 103 .
- the display screen 140 may be leaning slightly back in an angled vertical position, such that if a user sits in front of the computing device 100 while the computing device 100 is resting on a surface 112 in the angled vertical position, the user is looking forward and slightly down to view the display screen 140 .
- FIG. 2A shows a side view of the computing device 100 .
- the first camera 110 may include a first field of view 230 and the second camera 120 may include a second field of view 240 .
- the first field of view 230 is angled downwards and includes the area immediately in front of the display screen 140 and the portion of the physical surface 112 proximate to the front of the computing device 100 .
- the second field of view 240 is angled outward and/or upward to capture the area in front of the computing device 100 , which may include a user viewing the computing device 100 from approximately 2 feet away, etc.
- the field of view 230 and the field of view 240 are depicted as triangles with limited bounds, but a field of view may have any shape and is not limited to the shapes shown. Additionally, the field of view 230 and the field of view 240 may extend beyond the shapes shown, and the distance of what can be captured is limited only by the constraints of what the camera can view. For example, special lenses, such as a fish-eye or telescopic lens, may be used to adapt the field of view of the camera to capture specific portions of an area.
- the field of view 230 and the field of view 240 may overlap where both the first camera 110 and the second camera 120 may capture an object within the overlap but from a different perspective.
- This may allow the activity application(s) 414 to perform stereo vision, where an object, such as a user or physical object, may be captured in the two different video streams and the locations of the separate cameras (first camera 110 and second camera 120 ) may be known relative to the object being captured in the images.
- This may allow the activity application(s) 414 to track a position, an acceleration, a depth, or other information related to the object using the two different views.
- this may allow the activity application(s) 414 to capture three-dimensional information using the two different two-dimensional captured video streams. It may allow the activity application(s) 414 to estimate the distance of the object, a range of the object, a depth, an acceleration of the object, etc.
- the activity application(s) 414 may perform stereopsis on the captured video streams to provide a three-dimensional visualization based on the captured video streams.
- the activity application(s) 414 may account for when an object in one of the fields of view (such as field of view 230 and/or field of view 240 ) is obscured (such as by a hand moving the object, or another object being placed in front of the object) and the activity application(s) 414 may use the other field of view to continue to track the object that is obscured. It should be understood that by adding additional cameras along with the first camera 110 and the second camera 120 , the stereo vision capabilities may be enhanced by the capturing of additional video streams showing additional fields of view.
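- As a simplified sketch of the depth estimation, assuming the two views have been rectified into a parallel stereo pair (a simplification, since the two cameras of the computing device 100 may point in different directions) and assuming illustrative values for the focal length and the baseline between the cameras:

```python
def depth_from_disparity(x1_px, x2_px, focal_length_px, baseline_m):
    """Estimate the distance to a point visible in both video streams using the
    standard rectified pinhole-stereo relation Z = f * B / d, where d is the
    disparity (shift of the point between the two images, in pixels)."""
    disparity = abs(x1_px - x2_px)
    if disparity == 0:
        raise ValueError("Zero disparity: the point is effectively at infinity.")
    return focal_length_px * baseline_m / disparity


# Example: with an assumed focal length of 600 px and a 0.20 m baseline between the
# two cameras, a feature shifted by 200 px between the streams is about 0.6 m away.
print(depth_from_disparity(520, 320, focal_length_px=600, baseline_m=0.20))  # 0.6
```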
- FIG. 2B illustrates the computing device 100 situated in the second orientation 103 .
- the computing device 100 may be positioned so that supporting element 130 is angling the computing device 100 , such as in an angled flat position and the back surface 135 of the supporting element 130 is resting against the physical surface and the additional supporting element 170 is resting against the physical surface.
- the substantially flat position of the computing device 100 may be such that when a user is viewing the display screen in this position, the bottom edge of the display screen 140 is lower than the top edge of the display screen 140 relative to the user and the surface on which the computing device 100 is resting.
- a user 210 may be viewing the computing device 100 on their lap and the surface of their lap is where the computing device 100 is resting.
- the back surface of the housing 105 of the computing device 100 where the supporting element 130 is positioned may be higher than the portion of the back surface of the housing 105 with the additional supporting element 170 , to tilt the screen slightly towards the user 210 .
- the computing device 100 may be oriented into the second orientation 103 by rotating the computing device 100 180 degrees around itself relative to the first orientation 101 and then resting the computing device 100 on the physical surface in the second orientation 103 .
- the supporting element 130 may be slidable or detachably removable and may be raised from the first portion of the back surface of the housing 105 of the computing device 100 to a second portion of the back surface of the housing 105 of the computing device 100 without rotating the computing device 100 , to position the computing device in the second orientation 103 .
- the first camera 110 may be identified as a bottom camera as it is located at the bottom edge of the display screen 140 and the second camera 120 may be identified as a top camera as it is located at the top edge of the display screen 140 in the second orientation 103 .
- the field of view 240 of the first camera 110 may be directed upward and out along the top edge of the computing device 100 in the second orientation 103 .
- the field of view 230 of the second camera 120 may be directed up towards the top area above the display screen 140 of the computing device 100 . As shown in the example in FIG. 2B , the field of view 230 may extend up and outward to capture a view of a user 210 viewing the display screen 140 from the bottom of the computing device 100 in the second orientation 103 .
- the field of view 230 may capture at least a portion of a face of the user 210 from an angle below the face of the user 210 .
- the second camera 120 may be able to include the face of the user 210 in the captured video streams and may provide the face of the user 210 to the activity application(s) 414 for facial analysis to detect facial expressions, such as confusion, disinterest, sleepiness, etc.
- the second camera 120 may also be directed upwards above the computing device 100 and may be able to capture a field of view that includes objects such as a ceiling, a lighting array, etc. Using this field of view 230 , the video stream of the second camera 120 may be analyzed to identify specific rooms or locations that the computing device 100 is located in.
- the activity application(s) 414 may have a database of room profiles, such as a user's classroom, home, etc. and the activity application(s) 414 may compare the field of view 230 to the database of room profiles and determine which ceiling matches with what is being captured in the video stream of the second camera 120 . This may allow the activity application(s) 414 to run specific applications in specific locations.
- the activity application(s) 414 may block a request to run a game when the activity application(s) 414 determines that the computing device 100 is currently located in the user's classroom.
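- A minimal sketch of such room-profile matching and location-based gating is shown below; the feature vectors, the cosine-similarity measure, the threshold, and the policy table are illustrative assumptions (a real implementation might use any image-matching or machine-learning technique):

```python
import math


def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def match_room(frame_features, room_profiles, threshold=0.9):
    """Return the stored room profile that best matches features extracted from the
    upward-directed camera's video frame, or None if no profile matches confidently."""
    best_room, best_score = None, 0.0
    for room, profile in room_profiles.items():
        score = cosine_similarity(frame_features, profile)
        if score > best_score:
            best_room, best_score = room, score
    return best_room if best_score >= threshold else None


def is_application_allowed(application, room, policy):
    """Gate an application based on the detected location."""
    return application not in policy.get(room, set())


room_profiles = {"classroom": [0.9, 0.1, 0.3], "home": [0.2, 0.8, 0.5]}
policy = {"classroom": {"game"}}  # e.g., games are blocked in the classroom
room = match_room([0.88, 0.12, 0.31], room_profiles)
print(room, is_application_allowed("game", room, policy))  # classroom False
```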
- the second camera 120 may be configured to capture a field of view 230 that includes the first user 210 and the first camera 110 may be configured to capture a field of view 240 of a second user 220 .
- the first user 210 and the second user 220 may be positioned on opposite sides of the computing device 100 and they may both be able to view the display screen 140 from their respective positions.
- the activity application(s) 414 may run an application that both users can participate in and the activity application(s) 414 may analyze objects, expressions, gestures, etc. that are presented by the first user 210 and the second user 220 in the respective fields of view.
- the first user 210 may be a student and the second user 220 may be a teacher.
- the activity application(s) 414 may capture the interactions between the teacher and the student, including the facial expressions and/or gestures of the teacher or student and execute an application based on the captured information.
- FIG. 3 illustrates an example computing device 100 in the first orientation 101 .
- the first camera 110 includes a field of view 230 that extends downward towards the surface on which an activity object 310 (e.g., a book, etc.) is present.
- the second camera 120 includes a field of view 240 that extends outward towards the area in front of the computing device as shown.
- the computing device 100 may capture a video stream of the activity object 310 using the first camera 110 and a video stream of a user (not shown) interacting with the activity object 310 using the second camera 120 .
- the activity application(s) 414 may generate a visualization of the activity object 310 for display on the display screen 140 and the user (not shown) may be able to simultaneously interact with the activity object 310 and the visualization on the display screen 140 , while the second camera captures at least a portion of the user, such as a face or hands, as the user interacts. This allows for the user to experience an enriched interaction with both physical and virtual objects while the activity application(s) 414 may capture an image of the user for additional routines. For example, the user may be doing homework and learning about a topic in a textbook. The activity application(s) 414 may identify the topic from the textbook and provide additional curated information related to the topic on the display screen 140 .
- the activity application(s) 414 may also capture one or more facial expressions of the user and based on the captured facial expressions, may remove or supplement the virtual information presented on the display screen 140 . For example, if the user appears confused when viewing the virtual information, the activity application(s) 414 may present a broader overview of the virtual information in order to guide the user as to what the virtual information is presenting.
- FIG. 4 is a block diagram illustrating an example computer system 400 that is used with the computing device 100 .
- the system 400 may include computing devices 100 a . . . 100 n and servers 402 a . . . 402 n communicatively coupled via a network 406 .
- a letter after a reference number, e.g., “ 100 a ”, refers to a specific instance of the element bearing that reference number, while a reference number in the text without a following letter, e.g., “ 100 ”, refers generally to any or all instances of the element bearing that reference number.
- the system 400 and/or further systems contemplated by this present disclosure may include additional and/or fewer components, may combine components and/or divide one or more of the components into additional components, etc.
- the system 400 may include any number of servers 402 , computing devices 100 , and/or networks 406 .
- the computing device 100 may be coupled to the network 406 via the signal line 408 and the server 402 may be coupled to the network 406 via the signal line 404 .
- the computing device 100 may be accessed by user 210 .
- the network 406 may include any number of networks and/or network types.
- the network 406 may include, but is not limited to, one or more local area networks (LANs), wide area networks (WANs) (e.g., the Internet), virtual private networks (VPNs), mobile (cellular) networks, wireless wide area network (WWANs), WiMAX® networks, Bluetooth® communication networks, peer-to-peer networks, other interconnected data paths across which multiple devices may communicate, various combinations thereof, etc.
- the computing device 100 may be a computing device that has data processing and communication capabilities.
- the computing device 100 may include a processor (e.g., virtual, physical, etc.), a memory, a power source, a network interface, and/or other software and/or hardware components, such as front and/or rear facing cameras, display screen, graphics processor, wireless transceivers, keyboard, firmware, operating systems, drivers, various physical connection interfaces (e.g., USB, HDMI, etc.).
- the computing devices 100 may be coupled to and communicate with one another and with other entities of the system 400 via the network 406 using a wireless and/or wired connection.
- the system 400 may include any number of computing devices 100 and the computing devices 100 may be the same or different types of devices (e.g., tablets, mobile phones, desktop computers, laptop computers, etc.).
- the computing device 100 may include the cameras 110 and 120 , a detection engine 412 , and one or more activity applications 414 .
- the detection engine 412 may detect and/or recognize objects in a video stream, and cooperate with the activity application(s) 414 to provide the user 210 with a virtual experience that incorporates in substantially real-time the objects and the user manipulation of the objects in the physical environment.
- the detection engine 412 may process the video stream captured by the cameras 110 or 120 to detect and recognize an object created by the user.
- the activity application 414 may generate a visualization of the object created by the user, and display to the user a virtual scene on the display screen 140 .
- the components and operations of the detection engine 412 and the activity application 414 are described in detail throughout this disclosure.
- the server 402 may include one or more computing devices that have data processing, storing, and communication capabilities.
- the server 402 may include one or more hardware servers, server arrays, storage devices and/or storage systems, etc.
- the server 402 may be a centralized, distributed and/or a cloud-based server.
- the server 402 may include one or more virtual servers that operate in a host server environment and access the physical hardware of the host server (e.g., processor, memory, storage, network interfaces, etc.) via an abstraction layer (e.g., a virtual machine manager).
- the server 402 may include software applications operable by one or more processors of the server 402 to provide various computing functionalities, services, and/or resources, and to send and receive data to and from the computing devices 100 .
- the software applications may provide the functionalities of internet searching, social networking, web-based email, blogging, micro-blogging, photo management, video/music/multimedia hosting/sharing/distribution, business services, news and media distribution, user account management, or any combination thereof.
- the server 402 may also provide other network-accessible services.
- the server 402 may include a search engine capable of retrieving results that match one or more search criteria from a data store.
- the search criteria may include an image and the search engine may compare the image to product images in its data store (not shown) to identify a product that matches the image.
- in some implementations, the server 402 may also include an instance of the detection engine 412 and/or the storage 510 (e.g., see FIG. 5 ).
- system 400 illustrated in FIG. 4 is provided by way of example, and that a variety of different system environments and configurations are contemplated and are within the scope of the present disclosure. For example, various functionalities may be moved from a server to a client, or vice versa and some implementations may include additional or fewer computing devices, services, and/or networks, and may implement various client or server-side functionalities. In addition, various entities of the system 400 may be integrated into a single computing device or system or divided into additional computing devices or systems, etc.
- FIG. 5 is a block diagram of an example computing device 100 .
- the computing device 100 may include a processor 512 , a memory 514 , a communication unit 516 , an input device 518 , a display 520 , a storage 510 , the camera 110 , the camera 120 , and the orientation sensor 522 communicatively coupled by a bus 508 . It should be understood that the computing device 100 is not limited to such and other components are also possible and contemplated.
- the processor 512 may execute software instructions by performing various input/output, logical, and/or mathematical operations.
- the processor 512 may have various computing architectures to process data signals including, for example, a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, and/or an architecture implementing a combination of instruction sets.
- the processor 512 may be physical and/or virtual, and may include a single core or plurality of processing units and/or cores.
- the memory 514 may be a non-transitory computer-readable medium that is configured to store and provide access to data to other components of the computing device 100 .
- the memory 514 may store instructions and/or data that are executable by the processor 512 .
- the memory 514 may store the detection engine 412 , the activity applications 414 , and a camera driver 506 .
- the memory 514 may also store other instructions and data, including, for example, an operating system, hardware drivers, other software applications, data, etc.
- the memory 514 may be coupled to the bus 508 for communication with the processor 512 and other components of the computing device 100 .
- the communication unit 516 may include one or more interface devices (I/F) for wired and/or wireless connectivity with the network 406 and/or other devices.
- the communication unit 516 may include transceivers for sending and receiving wireless signals.
- the communication unit 516 may include radio transceivers for communication with the network 406 and for communication with nearby devices using close-proximity connectivity (e.g., Bluetooth®, NFC, etc.).
- the communication unit 516 may include ports for wired connectivity with other devices.
- the communication unit 516 may include a CAT-5 interface, Thunderbolt™ interface, FireWire™ interface, USB interface, etc.
- the display 520 may display electronic images and data output by the computing device 100 for presentation to the user.
- the display 520 may include any display device, monitor or screen, including, for example, an organic light-emitting diode (OLED) display, a liquid crystal display (LCD), etc.
- the display 520 may be a touch-screen display capable of receiving input from one or more fingers of the user.
- the display 520 may be a capacitive touch-screen display capable of detecting and interpreting multiple points of contact with the display surface.
- the computing device 100 may include a graphic adapter (not shown) for rendering and outputting the images and data for presentation on display 520 .
- the graphic adapter may be a separate processing device including a separate processor and memory (not shown) or may be integrated with the processor 512 and memory 514 .
- the input device 518 may include any device for inputting information into the computing device 100 .
- the input device 518 may include one or more peripheral devices.
- the input device 518 may include a keyboard (e.g., a QWERTY keyboard), a pointing device (e.g., a mouse or touchpad), a microphone, a camera, etc.
- the input device 518 may include a touch-screen display capable of receiving input from one or more fingers of the user.
- the functionality of the input device 518 and the display 520 may be integrated, and the user may interact with the computing device 100 by touching a surface of the display 520 .
- the user may interact with an emulated keyboard (e.g., soft keyboard or virtual keyboard) displayed on the touch-screen display 520 by contacting the display 520 in the keyboard regions using his or her fingers.
- the orientation sensor 522 may include one or more sensors for detecting an orientation of the computing device 100 .
- the orientation sensor 522 may include one or more orientation sensors 522 that can detect the orientation of the computing device 100 .
- the orientation sensors may be configured to detect an angle or tilt of the computing device, such as by using an accelerometer or similar sensor relative to a known position, and to communicate the angle or tilt to the activity application(s) 414 .
- the orientation sensor 522 can detect the differences in the tilt of the computing device 100 in order to determine when the computing device 100 is positioned in the first orientation 101 or the second orientation 103 and may provide that information to the activity application(s) 414 .
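- A minimal sketch of how an accelerometer reading might be mapped to the first orientation 101 or the second orientation 103 follows; the axis convention and the 70-degree threshold are illustrative assumptions rather than values specified by this disclosure.

```python
import math


def tilt_from_accelerometer(ax: float, ay: float, az: float) -> float:
    """Angle in degrees between the screen normal and gravity (axis convention assumed)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))


def classify_orientation(tilt_degrees: float, threshold: float = 70.0) -> str:
    """Report which of the two orientations the measured tilt corresponds to."""
    return "second orientation 103" if tilt_degrees >= threshold else "first orientation 101"


def on_orientation_sample(ax: float, ay: float, az: float, notify) -> None:
    """Communicate the detected orientation to the activity application(s) 414."""
    notify(classify_orientation(tilt_from_accelerometer(ax, ay, az)))


on_orientation_sample(0.3, 0.0, 0.9, print)   # first orientation 101
on_orientation_sample(0.98, 0.0, 0.2, print)  # second orientation 103
```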
- the detection engine 412 may include a calibrator 502 and a detector 504 .
- the components 412 , 502 , and 504 may be communicatively coupled to one another and/or to other components 414 , 506 , 510 , 512 , 514 , 516 , 518 , 520 , 110 , 120 , and/or 522 of the computing device 100 by the bus 508 and/or the processor 512 .
- the components 412 , 502 , and 504 may be sets of instructions executable by the processor 512 to provide their functionality.
- the components 412 , 502 , and 504 may be stored in the memory 514 of the computing device 100 and may be accessible and executable by the processor 512 to provide their functionality. In any of the foregoing implementations, these components 412 , 502 , and 504 may be adapted for cooperation and communication with the processor 512 and other components of the computing device 100 .
- the calibrator 502 includes software and/or logic for performing image calibration on the video stream captured by the cameras 110 and/or 120 .
- the calibrator 502 may calibrate the images in the video stream to adapt to the capture position of the cameras 110 and/or 120 .
- the capture position of the cameras 110 and/or 120 may depend on the computing device 100 attributes and/or the orientation of the computing device 100 . Capturing the video stream from a camera position in different orientations may cause distortion effects on the video stream. Therefore, the calibrator 502 may adjust one or more operation parameters of the cameras 110 and 120 to compensate for these distortion effects.
- Examples of the operation parameters being adjusted include, but are not limited to, focus, exposure, white balance, aperture, f-stop, image compression, ISO, depth of field, noise reduction, focal length, etc.
- image calibration on the captured video streams is advantageous, because it can optimize the images of the video streams to accurately detect the objects depicted therein, and thus the operations of the activity applications 414 based on the objects detected in the video streams can be significantly improved.
- the calibrator 502 may also calibrate the images to compensate for the characteristics of the activity surface (e.g., size, angle, topography, etc.). For example, the calibrator 502 may perform the image calibration to account for the discontinuities and/or the non-uniformities of the activity surface, thereby enabling accurate detection of objects when the computing device 100 is set up on various activity surfaces (e.g., bumpy surface, beds, tables, whiteboards, etc.). In some embodiments, the calibrator 502 may calibrate the images to compensate for optical effect caused by the optical elements of the cameras 110 and/or 120 .
- the calibrator 502 may also calibrate the cameras 110 or 120 to split their field of view into multiple portions with the user being included in one portion of the field of view and the activity surface being included in another portion of the field of view of the cameras 110 and/or 120 .
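- The sketch below illustrates one way such per-orientation calibration could be organized: an orientation-keyed table of operation-parameter overrides and a simple split of a frame into an activity-surface region and a user region; the parameter names, values, and split boundary are placeholder assumptions, not the calibration actually performed by the calibrator 502 .

```python
from typing import Dict, List, Tuple

# Placeholder per-orientation overrides for a few camera operation parameters.
ORIENTATION_OVERRIDES: Dict[str, Dict[str, object]] = {
    "first orientation 101": {"exposure": 0.008, "white_balance": "indoor", "noise_reduction": "high"},
    "second orientation 103": {"exposure": 0.004, "white_balance": "auto", "noise_reduction": "medium"},
}


def calibrate(current_settings: Dict[str, object], orientation: str) -> Dict[str, object]:
    """Return the camera settings with the orientation-specific overrides applied."""
    adjusted = dict(current_settings)
    adjusted.update(ORIENTATION_OVERRIDES[orientation])
    return adjusted


def split_field_of_view(frame: List[List[int]],
                        boundary_row: int) -> Tuple[List[List[int]], List[List[int]]]:
    """Split a frame (a list of pixel rows) into an activity-surface portion and a user portion."""
    return frame[:boundary_row], frame[boundary_row:]


settings = calibrate({"exposure": 0.01, "iso": 200}, "first orientation 101")
print(settings["white_balance"])   # indoor
```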
- the detector 504 includes software and/or logic for processing the video stream captured by the cameras 110 or 120 to detect the objects present in the video stream.
- the detector 504 may analyze the images of the video streams to determine line segments, and determine the object that has the contour matching the line segments using the object data in the storage 510 .
- the detector 504 may provide the tangible objects detected in the video stream to the activity applications 414 .
- the detector 504 may store the objects detected in the video stream in the storage 510 for retrieval by these components.
- the detector 504 may determine whether the line segments and/or the object associated with the line segments can be identified in the video stream, and instruct the calibrator 502 to calibrate the images of the video stream accordingly.
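- A simplified sketch of this matching step is shown below: a detected contour (a list of sampled points derived from line segments) is compared against stored object contours and the closest match is reported when it falls under a distance threshold; the similarity measure and threshold value are assumptions chosen for the example, not the detector 504 's actual algorithm.

```python
import math
from typing import Dict, List, Optional, Tuple

Point = Tuple[float, float]


def contour_similarity(detected: List[Point], reference: List[Point]) -> float:
    """Mean point-to-point distance between two equally sampled contours (lower is closer)."""
    return sum(math.dist(p, q) for p, q in zip(detected, reference)) / len(reference)


def identify_object(detected: List[Point],
                    object_data: Dict[str, List[Point]],
                    max_distance: float = 5.0) -> Optional[str]:
    """Return the stored object whose contour best matches the detected line segments,
    or None when nothing matches closely enough."""
    name, contour = min(object_data.items(),
                        key=lambda item: contour_similarity(detected, item[1]))
    return name if contour_similarity(detected, contour) <= max_distance else None


stored = {"ball": [(0, 0), (10, 0), (10, 10), (0, 10)],
          "triangle": [(0, 0), (10, 0), (5, 8), (0, 0)]}
print(identify_object([(1, 0), (11, 1), (9, 10), (0, 9)], stored))   # ball
```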
- the activity application 414 includes software and/or logic executable on the computing device 100 .
- the activity application 414 may receive the objects detected in the video stream of the activity surface from the detector 504 .
- the activity application 414 may generate a virtual environment that incorporates, in real-time, the virtualization of the tangible objects and the user manipulation of the tangible objects on the activity surface, and display the virtual environment to the user on the computing device 100 .
- Non-limiting examples of the activity application 414 include video games, learning applications, assistive applications, storyboard applications, collaborative applications, productivity applications, etc. Other types of activity application are also possible and contemplated.
- the camera driver 506 includes software storable in the memory 514 and operable by the processor 512 to control/operate the cameras 110 and 120 .
- the camera driver 506 may be a software driver executable by the processor 512 for instructing the cameras 110 and 120 to capture and provide a video stream and/or a still image, etc.
- the camera driver 506 may be capable of controlling various features of the cameras 110 and 120 (e.g., flash, aperture, exposure, focal length, etc.).
- the camera driver 506 may be communicatively coupled to the cameras 110 and 120 and other components of the computing device 100 via the bus 508 , and these components may interface with the camera driver 506 to capture video and/or still images using the cameras 110 and 120 .
- the cameras 110 and 120 are video capture devices (e.g., a camera) adapted to capture video streams and/or images in their field of view.
- the cameras 110 and 120 may be coupled to the bus 508 for communication and interaction with the other components of the computing device 100 .
- one or more of the cameras 110 and 120 may include a lens for gathering and focusing light, a photo sensor including pixel regions for capturing the focused light, and a processor for generating image data based on signals provided by the pixel regions.
- the photo sensor may be any type of photo sensor (e.g., a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, a hybrid CCD/CMOS device, etc.).
- the cameras 110 and 120 may include a microphone for capturing sound.
- the cameras 110 and 120 may be coupled to a microphone that is coupled to the bus 508 or included in another component of the computing device 100 .
- the cameras 110 and 120 may also include a flash, a zoom lens, and/or other features.
- the processor of the cameras 110 and 120 may store video and/or still image data being captured in the memory 514 and/or provide the video and/or still image data to other components of the computing device 100 , such as the detection engine 412 and/or the activity applications 414 .
- the storage 510 is a non-transitory storage medium that stores and provides access to various types of data.
- Non-limiting examples of the data stored in the storage 510 include video stream and/or still images captured by the cameras 110 and 120 , object data describing various tangible objects (e.g., object contour, color, shape and size, etc.), object detection result indicating the tangible objects, etc.
- the data stored in the storage 510 may also include one or more orientation profiles.
- the orientation profile may include the position information of the various cameras 110 and 120 as well as expected fields of view in different orientations.
- the storage 510 may be included in the memory 514 or another storage device coupled to the bus 508 .
- the storage 510 may be included in a distributed data store, such as a cloud-based computing and/or data storage system.
- the storage 510 may include a database management system (DBMS).
- the DBMS may be a structured query language (SQL) DBMS.
- the storage 510 may store data in an object-based data store or multi-dimensional tables including rows and columns, and may manipulate (i.e., insert, query, update, and/or delete) data entries stored in the storage 510 using programmatic operations (e.g., SQL queries and statements or a similar database manipulation library).
- Other implementations of the storage 510 with additional characteristics, structures, acts, and functionalities are also possible and contemplated.
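- As one possible concrete form of such storage, the sketch below keeps the orientation profiles in an in-memory SQLite table and retrieves the expected field of view for a given orientation; the table layout and column names are assumptions made solely for illustration.

```python
import sqlite3

# Hypothetical schema for orientation profiles held by the storage 510.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orientation_profiles (
                    orientation TEXT PRIMARY KEY,
                    camera_110_position TEXT,
                    camera_120_position TEXT,
                    expected_fov_110 TEXT,
                    expected_fov_120 TEXT)""")
conn.execute("INSERT INTO orientation_profiles VALUES (?, ?, ?, ?, ?)",
             ("first", "top edge", "top edge",
              "downward toward activity surface", "outward toward user"))

row = conn.execute("SELECT expected_fov_110 FROM orientation_profiles WHERE orientation = ?",
                   ("first",)).fetchone()
print(row[0])   # downward toward activity surface
```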
- FIG. 6 depicts an example method for determining what orientation a computing device 100 is positioned in.
- the orientation sensor 522 may determine that the computing device 100 is positioned in a first orientation 101 on a physical surface.
- the orientation sensor 522 may determine that the computing device 100 is in the first orientation based on an angle detected by one or more orientation sensors 522 that can determine how the computing device 100 is orientated and angled.
- the computing device 100 may include a first camera 110 configured to capture a first field of view 230 and a second camera 120 configured to capture a second field of view 240 that is different from the first field of view.
- the first field of view 230 and the second field of view 240 may originate from different camera locations on the computing device 100 ; in further implementations, the first field of view 230 and the second field of view 240 may overlap in at least a portion of the field of view. In some implementations, the first field of view 230 of the first camera 110 may be directed towards the physical surface and the second field of view 240 of the second camera 120 may be directed towards a user facing the computing device 100 when the computing device 100 is situated in the first orientation 101 .
- the first camera 110 may capture a first video stream including an activity scene of the physical activity surface in the first field of view 230 .
- the activity scene may be a portion of a physical surface proximate to the computing device 100 .
- one or more objects or other items may be positioned on the physical surface within the field of view 230 of the first camera 110 .
- a user may draw or craft an image on a piece of paper or board situated on the physical surface and within the field of view 230 of the first camera 110 .
- the objects may be passed through the field of view 230 of the first camera 110 without being placed on the physical surface, such as a gesture performed by the user in the area proximate to the computing device 100 and within the field of view 230 of the first camera 110 .
- the second camera 120 may capture a second video stream including a user in the second field of view 240 .
- the second video stream may include a profile of a user positioned in front of the computing device 100 and within the second field of view 240 .
- the user can be positioned in front of the computing device 100 and viewing content on the display screen 140 while the computing device 100 rests on a table and the user sits in a chair.
- the video stream may capture at least a portion of a face of a user.
- the video stream may capture at least a portion of an appendage of the user, such as a hand, to capture a gesture.
- the video stream may include objects that are being held or manipulated by a user.
- the video stream may further include the environment around a user, such as posters in a classroom, etc. that may be detectable by the detector 504 and passed on to the activity application(s) 414 .
- the video stream may include facial expression information from the user.
- the activity application(s) 414 may determine an operation routine based on the first video stream and the second video stream.
- the first video stream may include a textbook for a student and the second video stream may identify which specific student is present in front of the computing device 100 .
- the activity application(s) 414 may retrieve a personalized study schedule from the storage 510 related to the identity of the specific student and find which topic the student has been assigned in the specific book that has been identified from the first video stream.
- the activity application(s) 414 may then cause the specific topic and page number to be displayed on the display screen 140 for the student to open the book to that page without having to receive any directions from the student. This is advantageous as it reduces the busy work and time needed for a student to begin learning assigned material and also reduces the opportunity for mistakes to arise from the student not knowing where to go in the book, etc.
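- The sketch below shows the shape of such an operation routine: the identified student and the identified textbook are looked up in a personalized study schedule and the resulting topic and page number are composed into the message displayed on the display screen 140 ; the student names, book identifiers, and page numbers are invented for the example and do not reflect stored data described by this disclosure.

```python
# Hypothetical personalized study schedule keyed by (student, book).
STUDY_SCHEDULES = {
    ("alice", "algebra-1"): {"topic": "Linear equations", "page": 42},
}


def determine_operation_routine(student_id: str, book_id: str) -> str:
    """Build the message shown on the display screen from the schedule entry
    retrieved for the identified student and the identified book."""
    assignment = STUDY_SCHEDULES.get((student_id, book_id))
    if assignment is None:
        return "No assignment found for this book."
    return f"Open {book_id} to page {assignment['page']}: {assignment['topic']}"


print(determine_operation_routine("alice", "algebra-1"))
```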
- the activity application(s) 414 may determine a user state of the user in the second video stream.
- the user state may include a facial feature, and the user state may be determined based on the facial feature.
- the facial features may indicate an expression of confusion, an attention level, or an emotional condition, such as happiness, sadness, anger, joy, frustration, etc.
- the detector 504 may compare identified points on a user's face to a database of facial expressions and map the identified points to the example facial expressions. When a facial expression mapping exceeds a threshold value, then the facial expression is identified based on the example.
- the example facial expressions may be based on a machine learning algorithm that is continuously updated as the sample size and accuracy increases.
- the facial expressions may be personalized for a specific user, and over time, as different facial expressions are captured of that specific user, the detector 504 may update the example facial expressions for that user to increase accuracy.
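- A minimal sketch of the expression-matching comparison follows: a vector of identified facial points is scored against stored example expressions and an expression is reported only when its score exceeds a threshold; the landmark encoding, scoring function, and threshold value are illustrative assumptions, not the detector 504 's actual model.

```python
from typing import Dict, List, Optional


def expression_score(landmarks: List[float], example: List[float]) -> float:
    """Similarity in [0, 1]; 1.0 means the landmark vectors are identical."""
    diff = sum(abs(a - b) for a, b in zip(landmarks, example))
    return max(0.0, 1.0 - diff)


def identify_expression(landmarks: List[float],
                        examples: Dict[str, List[float]],
                        threshold: float = 0.8) -> Optional[str]:
    """Return the example expression whose score exceeds the threshold, if any."""
    best_name = max(examples, key=lambda name: expression_score(landmarks, examples[name]))
    best_score = expression_score(landmarks, examples[best_name])
    return best_name if best_score > threshold else None


examples = {"confusion": [0.2, 0.7, 0.1], "happiness": [0.8, 0.1, 0.6]}
print(identify_expression([0.25, 0.65, 0.12], examples))   # confusion
```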
- the operation routine may be a task for the user and the complexity of the task may be updated based on the captured information.
- the activity application(s) 414 may determine a task complexity level based on the user or the progress of the task. The task complexity level may be increased when it appears a task is being completed too quickly and/or the task complexity level may be decreased when it appears a task is creating confusion and/or the user is struggling to complete the task. For example, if a user exhibits frustration, such as in a facial expression, or a threshold period of time has expired without interacting with an object, the activity application(s) 414 may identify an operation routine with a lower complexity level and/or provide an instruction detail level that is appropriate for what the user is currently doing.
- the activity application(s) 414 may present an operation routine that provides a hint for what to perform next to solve the math problem.
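- One way this adjustment rule could be expressed is sketched below: the task complexity level rises when the task is completed too quickly and falls (with a hint implied) when frustration or prolonged inactivity is observed; the state names, idle timeout, and step size are assumptions for the example.

```python
def adjust_complexity(level: int,
                      user_state: str,
                      seconds_since_last_interaction: float,
                      idle_timeout: float = 60.0) -> int:
    """Raise the complexity level when a task is finished too quickly; lower it
    when the user appears frustrated or has been idle past the timeout."""
    if user_state == "completed_quickly":
        return level + 1
    if user_state == "frustrated" or seconds_since_last_interaction > idle_timeout:
        return max(1, level - 1)
    return level


print(adjust_complexity(3, "frustrated", 10.0))        # 2
print(adjust_complexity(3, "engaged", 90.0))           # 2 (idle too long)
print(adjust_complexity(3, "completed_quickly", 5.0))  # 4
```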
- the activity application(s) 414 may execute the operation routine on the display screen 140 .
- the activity application(s) 414 may cause one or more programs to be executed based on the operation routine.
- the activity application(s) 414 may cause additional information to be displayed on the display screen 140 , such as page number or student information.
- the activity application(s) 414 may generate a visualization based on information detected in the first or second video stream and display the visualization on the display screen 140 .
- the first video stream may include a drawing of a ball and the second video stream may include a facial expression from the user that is happy.
- the activity application(s) 414 may generate a virtual avatar that is smiling and throwing a virtualization of the drawing of the ball, such as a virtualization with similar shapes, colors, contours, etc.
- FIG. 7 depicts an example method for determining an orientation of a computing device 100 .
- the computing device 100 is positioned in the second orientation and the activity application(s) 414 may determine that the computing device 100 is positioned in the second orientation 103 .
- the computing device 100 may change orientation from the first orientation 101 to the second orientation 103 and the orientation sensor 522 may detect that change in orientation and provide the change in orientation to the second orientation 103 to the activity application(s) 414 .
- the first field of view 240 of the first camera 110 is directed towards the user facing the computing device 100 and the second field of view 230 of the second camera 120 is also directed towards the user facing the computing device 100 .
- the first camera 110 may capture a third video stream including a first portion of a face of the user in the first field of view 240 .
- the first portion of the face of the user may be a view of the user from below the face of the user and include a portion of the mouth and the direction the head of the user is facing.
- the second camera 120 may capture a fourth video stream including a second portion of the face of the user in the second field of view 230 .
- the second portion of the face of the user may be a forward-looking view of the user and may include the eye and mouth expressions of the user.
- the third video stream may show where the user is looking and how the head is tilted, and the fourth video stream may include forward-looking facial expressions simultaneously captured with the third video stream.
- the activity application(s) 414 may be able to more accurately predict a user state and will receive more detailed facial expression information compared to a single camera that may be unable to provide the same level of detail as the two cameras capturing video streams simultaneously.
- the activity application(s) 414 may adjust an operation routine being displayed on the display screen 140 based on the information detected in the third video stream and the fourth video stream.
- the fourth video stream may capture the way the mouth of the user pronounces a word displayed on the display screen 140 and the third video stream may provide additional detail of how the mouth is shaping the word.
- a three-dimensional depiction of the mouth forming the words may be presented on the display screen 140 and a user may be able to see how they are supposed to shape a word using their mouth.
- FIG. 8A shows a computing device 100 in the first orientation 101 .
- a volume interface 802 may be displayed virtually on the display screen 140 .
- the volume interface 802 may slidably appear along an edge of the display screen 140 that corresponds to where the volume control 180 is located on the edge of the housing 105 of the computing device 100 .
- the volume control 180 may include a volume up button 183 and a volume down button 185 that a user may selectively press in order to change the volume of the audio in the audio output 160 .
- the volume control 180 may be discreetly positioned and have a minimal profile so as not to distract the user from the operation of the computing device 100 .
- a reminder in the form of the volume interface 802 may be presented on the display screen 140 in order to signal to the user how to operate the volume control 180 .
- a virtual volume up button 804 and a virtual volume down button 806 may be displayed when a user's finger is proximate to the volume control 180 .
- the volume interface 802 may be displayed.
- the user may not even have to press the volume control 180 .
- a proximity sensor such as a touch sensor or a light sensor may be installed in/near the volume control 180 and when the proximity sensor detects that the user is interacting with the volume control 180 , the volume interface 802 may be displayed to signal to the user how to interact with the volume control 180 .
- the first camera 110 or the second camera 120 may include a field of view that includes the volume control 180 and when the detector 504 detects that a user is trying to interact with the volume control, the volume interface 802 may be displayed.
- the user may also interact with the virtual volume interface 802 rather than the volume control 180 and if the display screen 140 detects a touch interaction with the volume interface 802 , the computing device 100 may cause the volume to be changed based on the interaction with the virtual volume interface 802 .
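- The sketch below illustrates the triggering behavior described above: a proximity (or camera-based) detection that a finger is near the volume control 180 reveals the virtual volume interface 802 , and the interface is hidden again when the finger moves away; the class and callback names are illustrative assumptions.

```python
class VolumeInterface:
    """Stand-in for the on-screen volume interface 802."""

    def __init__(self) -> None:
        self.visible = False

    def show(self) -> None:
        self.visible = True
        print("volume interface 802 slides in along the edge near volume control 180")

    def hide(self) -> None:
        self.visible = False


def on_proximity_event(finger_near_control: bool, ui: VolumeInterface) -> None:
    """Reveal the virtual volume buttons only while the user reaches for the
    physical volume control, and hide them afterwards."""
    if finger_near_control and not ui.visible:
        ui.show()
    elif not finger_near_control and ui.visible:
        ui.hide()


ui = VolumeInterface()
on_proximity_event(True, ui)    # interface appears
on_proximity_event(False, ui)   # interface hides again
```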
- the audio output 150 may include two or more audio outputs such as a first speaker 808 and a second speaker 810 .
- the audio output may control what is being output by the first speaker 808 and the second speaker 810 in order to create a stereo sound output as controlled by the activity application(s) 414 .
- the first speaker 808 outputs sound from the left side of the computing device 100 and the second speaker 810 outputs sound from the right side of the computing device 100 .
- a user can couple different audio output options, such as a pair of wired headphones or Bluetooth headphones and the activity application(s) 414 may send the left channel audio and the right channel audio to the correct audio output, such as the left ear headphone and the right ear headphone.
- the audio stream may be split by channel, such as when two users are interacting with the computing device and the first user may have a first audio output corresponding to the first speaker 808 and the second user may have a second audio output corresponding to the second speaker 810 or other corresponding audio output device.
- a teacher may be using a headset paired to the output of the second speaker 810 and a student may be listening to the audio output from the first speaker 808 .
- the output to the teacher may be presented simultaneously through the headphones while the student can hear the output channel going to the first speaker 808 . This would allow private information, such as whether or not answers being input by the student are correct for the teacher to hear without disrupting the student.
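- A minimal sketch of this per-listener channel routing is shown below; the channel labels, device names, and routing table are assumptions made for the example and do not describe a specific audio API.

```python
def route_audio(channels: dict, routing: dict) -> dict:
    """Map each logical channel (e.g., 'student', 'teacher') onto the output
    device assigned to it and return what each device will play."""
    return {routing[channel]: samples for channel, samples in channels.items()}


channels = {"student": "lesson audio", "teacher": "answer correctness feedback"}
routing = {"student": "first speaker 808", "teacher": "teacher headset (paired to 810)"}
print(route_audio(channels, routing))
```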
- the activity application(s) 414 may control the polarity of the inputs and outputs based on the orientation. For example, as shown in FIG. 8B , when the computing device 100 is orientated in the second orientation 103 , the volume control 180 is located on a right side of the computing device 100 . In this implementation, the volume up button 185 may be located on a top portion of the volume control 180 and the volume down button 183 may be located on a bottom portion of the volume control 180 .
- the volume interface 802 may be located proximate to the edge of the housing 105 of the computing device 100 where the volume control 180 is situated, and the volume interface 802 may be adjusted such that the volume up indicator 806 remains on a top portion of the volume interface 802 and the volume down indicator 804 remains on a bottom portion of the volume interface 802 even though the screen has changed position relative to where it is located in the first orientation 101 .
- the activity application(s) 414 may identify in the software commands for volume up and volume down, which input controls the hardware based on the orientation of the computing device 100 .
- the activity application(s) 414 may be configured to update the mapping of the volume controls based on how the computing device 100 is orientated to allow a user to always press the top of the volume control 180 to increase volume and press the bottom of the volume control 180 to decrease volume. By preserving the habit the user has for controlling the volume, the activity application(s) 414 provide a more immersive experience when the computing device 100 is situated in different orientations.
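- The remapping described above could be organized as sketched below, where the same hardware button is translated into volume-up or volume-down depending on the detected orientation so that the upper button always raises the volume; the step size and the dictionary representation are assumptions for illustration.

```python
# Orientation-aware mapping of the physical buttons 183 and 185 (per FIGS. 8A-8B).
BUTTON_ACTIONS = {
    "first orientation": {"button_183": "volume_up", "button_185": "volume_down"},
    "second orientation": {"button_183": "volume_down", "button_185": "volume_up"},
}


def handle_volume_button(orientation: str, button: str, volume: int) -> int:
    """Apply the orientation-aware action for the pressed hardware button."""
    action = BUTTON_ACTIONS[orientation][button]
    return min(100, volume + 5) if action == "volume_up" else max(0, volume - 5)


print(handle_volume_button("first orientation", "button_183", 50))    # 55
print(handle_volume_button("second orientation", "button_183", 50))   # 45
```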
- the activity application(s) 414 may also update the audio outputs when the computing device 100 is orientated in the second orientation 103 .
- the first speaker 808 may be identified as the right audio output and the second speaker 810 may be identified as the left audio output by the activity application(s) 414 .
- the sound quality may change based on whether the speakers are located above the display screen 140 or below the display screen 140 .
- An audio profile may be executed by the activity application(s) to change the audio quality when the different orientations are detected in order to improve the audio quality in the different orientations.
- various implementations may be presented herein in terms of algorithms and symbolic representations of operations on data bits within a computer memory.
- An algorithm is here, and generally, conceived to be a self-consistent set of operations leading to a desired result.
- the operations are those requiring physical manipulations of physical quantities.
- these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
- Various implementations described herein may relate to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a computer readable storage medium, including, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, flash memories including USB keys with non-volatile memory, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- the technology described herein can take the form of a hardware implementation, a software implementation, or implementations containing both hardware and software elements.
- the technology may be implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
- the technology can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
- a computer-usable or computer readable medium can be any non-transitory storage apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- a data processing system suitable for storing and/or executing program code may include at least one processor coupled directly or indirectly to memory elements through a system bus.
- the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers.
- Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems, storage devices, remote printers, etc., through intervening private and/or public networks.
- Wireless (e.g., Wi-Fi™) transceivers, Ethernet adapters, and modems are just a few examples of network adapters.
- the private and public networks may have any number of configurations and/or topologies. Data may be transmitted between these devices via the networks using a variety of different communication protocols including, for example, various Internet layer, transport layer, or application layer protocols.
- data may be transmitted via the networks using transmission control protocol/Internet protocol (TCP/IP), user datagram protocol (UDP), transmission control protocol (TCP), hypertext transfer protocol (HTTP), secure hypertext transfer protocol (HTTPS), dynamic adaptive streaming over HTTP (DASH), real-time streaming protocol (RTSP), real-time transport protocol (RTP) and the real-time transport control protocol (RTCP), voice over Internet protocol (VOIP), file transfer protocol (FTP), WebSocket (WS), wireless access protocol (WAP), various messaging protocols (SMS, MMS, XMS, IMAP, SMTP, POP, WebDAV, etc.), or other known protocols.
- modules, routines, features, attributes, methodologies and other aspects of the disclosure can be implemented as software, hardware, firmware, or any combination of the foregoing.
- wherever a component, an example of which is a module, of the specification is implemented as software, the component can be implemented as a standalone program, as part of a larger program, as a plurality of separate programs, as a statically or dynamically linked library, as a kernel loadable module, as a device driver, and/or in every and any other way known now or in the future.
- the disclosure is in no way limited to implementation in any specific programming language, or for any specific operating system or environment. Accordingly, the disclosure is intended to be illustrative, but not limiting, of the scope of the subject matter set forth in the following claims.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Business, Economics & Management (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Acoustics & Sound (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Computational Linguistics (AREA)
- Library & Information Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Biomedical Technology (AREA)
- Otolaryngology (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- User Interface Of Digital Computer (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Studio Devices (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Accessories Of Cameras (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/869,413 US20210006730A1 (en) | 2019-07-07 | 2020-05-07 | Computing device |
CA3146171A CA3146171A1 (en) | 2019-07-07 | 2020-07-07 | Computing device |
CN202080063146.4A CN114402204A (zh) | 2019-07-07 | 2020-07-07 | 计算设备 |
BR112022000318A BR112022000318A2 (pt) | 2019-07-07 | 2020-07-07 | Dispositivo de computação |
MX2022000340A MX2022000340A (es) | 2019-07-07 | 2020-07-07 | Dispositivo informatico. |
PCT/US2020/041057 WO2021007241A1 (en) | 2019-07-07 | 2020-07-07 | Computing device |
AU2020309531A AU2020309531A1 (en) | 2019-07-07 | 2020-07-07 | Computing device |
GB2200060.8A GB2599839A (en) | 2019-07-07 | 2020-07-07 | Computing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962871195P | 2019-07-07 | 2019-07-07 | |
US16/869,413 US20210006730A1 (en) | 2019-07-07 | 2020-05-07 | Computing device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210006730A1 true US20210006730A1 (en) | 2021-01-07 |
Family
ID=74065703
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/869,413 Abandoned US20210006730A1 (en) | 2019-07-07 | 2020-05-07 | Computing device |
US16/880,875 Active 2040-09-14 US11516410B2 (en) | 2019-07-07 | 2020-05-21 | Input polarity of computing device |
US16/880,882 Abandoned US20210004405A1 (en) | 2019-07-07 | 2020-05-21 | Enhancing tangible content on physical activity surface |
US18/148,277 Abandoned US20240031688A1 (en) | 2019-07-07 | 2022-12-29 | Enhancing tangible content on physical activity surface |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/880,875 Active 2040-09-14 US11516410B2 (en) | 2019-07-07 | 2020-05-21 | Input polarity of computing device |
US16/880,882 Abandoned US20210004405A1 (en) | 2019-07-07 | 2020-05-21 | Enhancing tangible content on physical activity surface |
US18/148,277 Abandoned US20240031688A1 (en) | 2019-07-07 | 2022-12-29 | Enhancing tangible content on physical activity surface |
Country Status (8)
Country | Link |
---|---|
US (4) | US20210006730A1 (es) |
CN (3) | CN114375435A (es) |
AU (3) | AU2020311356A1 (es) |
BR (3) | BR112022000301A2 (es) |
CA (3) | CA3146022A1 (es) |
GB (3) | GB2599838A (es) |
MX (3) | MX2022000340A (es) |
WO (3) | WO2021007241A1 (es) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD920326S1 (en) * | 2019-07-07 | 2021-05-25 | Tangible Play, Inc. | Virtualization device |
USD934828S1 (en) * | 2020-04-07 | 2021-11-02 | Yealink (Xiamen) Network Technology Co., Ltd. | Touch screen |
US20220050693A1 (en) * | 2020-08-11 | 2022-02-17 | International Business Machines Corporation | Determine step position to offer user assistance on an augmented reality system |
US11368651B1 (en) * | 2021-03-15 | 2022-06-21 | Amazon Technologies, Inc. | Audiovisual device |
US20220294993A1 (en) * | 2021-03-15 | 2022-09-15 | Amazon Technologies, Inc. | Electronic device with shutter assembly |
US11516410B2 (en) | 2019-07-07 | 2022-11-29 | Tangible Play, Inc. | Input polarity of computing device |
USD980208S1 (en) * | 2019-09-24 | 2023-03-07 | Tangible Play, Inc. | Activity surface |
USD991931S1 (en) * | 2021-04-28 | 2023-07-11 | Diebold Nixdorf Systems Gmbh | Computer screen |
USD991930S1 (en) * | 2021-04-28 | 2023-07-11 | Diebold Nixdorf Systems Gmbh | Computer screen |
USD993242S1 (en) * | 2021-04-28 | 2023-07-25 | Diebold Nixdorf Systems Gmbh | Computer screen |
US20240022759A1 (en) * | 2020-11-25 | 2024-01-18 | International Business Machines Corporation | Video encoding through non-saliency compression for live streaming of high definition videos in low-bandwidth transmission |
US12126828B2 (en) * | 2023-07-30 | 2024-10-22 | International Business Machines Corporation | Video encoding through non-saliency compression for live streaming of high definition videos in low-bandwidth transmission |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD922232S1 (en) | 2019-06-28 | 2021-06-15 | Ademco Inc. | Security system keypad |
USD1009671S1 (en) * | 2019-09-10 | 2024-01-02 | Ademco Inc. | Security system control panel with touchscreen interface |
US11983461B2 (en) * | 2020-03-26 | 2024-05-14 | Snap Inc. | Speech-based selection of augmented reality content for detected objects |
US11798550B2 (en) | 2020-03-26 | 2023-10-24 | Snap Inc. | Speech-based selection of augmented reality content |
US11816269B1 (en) * | 2020-05-29 | 2023-11-14 | Humane, Inc. | Gesture recognition for wearable multimedia device using real-time data streams |
TWI793851B (zh) * | 2021-07-16 | 2023-02-21 | 大立光電股份有限公司 | 具有雙通光區域的電子裝置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160282901A1 (en) * | 2015-03-26 | 2016-09-29 | Tangible Play, Inc. | Display Positioning System |
US20180329672A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Volume adjustment on hinged multi-screen device |
US20190098258A1 (en) * | 2016-03-25 | 2019-03-28 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying system and information providing terminal |
Family Cites Families (185)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD263824S (en) | 1979-06-28 | 1982-04-13 | Bell Telephone Laboratories, Incorporated | General-purpose electronic telephone set base |
USD282935S (en) | 1983-01-27 | 1986-03-11 | Shifflett David C | Teaching apparatus for correcting learning problems |
USD284084S (en) | 1983-03-21 | 1986-06-03 | Summagraphics Corporation | Digitizer |
USD289291S (en) | 1984-04-06 | 1987-04-14 | International Business Machines Corp. | Interactive graphics data terminal |
USD299491S (en) | 1986-04-30 | 1989-01-24 | Road Runner Co., Ltd. | Combined vehicle rear view mirror and television |
USD304338S (en) | 1986-08-18 | 1989-10-31 | British Telecommunications | Communications equipment keyboard housing |
USD299473S (en) | 1986-10-17 | 1989-01-17 | The Quaker Oats Company | Educational toy viewing device |
USD310521S (en) | 1987-10-28 | 1990-09-11 | Video Technology Industries, Inc. | Combined drawing tablet and cursor control for computer |
USD312533S (en) | 1987-11-30 | 1990-12-04 | Amity Leather Products Co. | Wallet with security feature |
USD324210S (en) | 1988-11-17 | 1992-02-25 | In Focus Systems, Inc. | Electronic data display panel for use with an overhead projector |
USD322777S (en) | 1989-01-31 | 1991-12-31 | Canon Kabushiki Kaisha | Manual data input electronic notebook |
USD313409S (en) | 1989-03-07 | 1991-01-01 | Cherry Electrical Products Limited | Digitizer for a graphic tablet |
USD321175S (en) | 1989-06-09 | 1991-10-29 | Huika Sangyo Kabushiki Kaisha | Electronic instrument for reading and displaying from magnetic and optical recording mediums such as floppy disks or the like |
USD333814S (en) | 1990-10-30 | 1993-03-09 | Video Technology Industries, Inc. | Combined electronic drawing stylus and tablet housing |
USD336053S (en) | 1991-01-17 | 1993-06-01 | Forerunner Corporation | Control console for signalling emergency services |
AU117925S (en) | 1991-12-24 | 1993-08-18 | Arnos Australia Pty Ltd | A display panel for use in a filing system |
USD426816S (en) | 1992-05-29 | 2000-06-20 | International Business Machines Corporation | Computer housing |
USD352279S (en) | 1992-11-02 | 1994-11-08 | International Business Machines Incorporated | Portable computer tablet with a touch screen and stylus |
USD370892S (en) | 1992-11-12 | 1996-06-18 | International Business Machines Corp. | Personal computer |
USD366499S (en) | 1993-12-06 | 1996-01-23 | Vtech Industries, Inc. | Housing for an electronic educational game |
USD362662S (en) | 1994-01-18 | 1995-09-26 | Hewlett-Packard Company | Write-on-image tablet |
USD362270S (en) | 1994-07-12 | 1995-09-12 | Irene Allen | Electronic instrument for teaching the alphabet |
USD380231S (en) | 1995-01-10 | 1997-06-24 | Vtech Industries, Inc. | Electronic educational game housing |
USD374224S (en) | 1995-01-26 | 1996-10-01 | Thomson Consumer Electronics (Societe Anonyme) | Television set |
USD361784S (en) | 1995-01-31 | 1995-08-29 | Educational Insights, Inc. | Electronic educational game board |
USD373576S (en) | 1995-08-30 | 1996-09-10 | Mastertouch Information Systems, Inc. | Handheld information and collection device with cradle |
USD384659S (en) | 1996-03-20 | 1997-10-07 | Sony Corporation | Television receiver |
USD393461S (en) | 1996-03-25 | 1998-04-14 | Sony Corporation | Television monitor |
USD388065S (en) | 1996-08-02 | 1997-12-23 | Alps Electric (Usa), Inc. | Cursor control input device |
USD395458S (en) | 1996-12-12 | 1998-06-23 | Gtech Corporation | Gaming terminal |
USD396217S (en) | 1997-02-24 | 1998-07-21 | Kabushiki Kaisha Toshiba | Electronic computer |
US6175954B1 (en) | 1997-10-30 | 2001-01-16 | Fuji Xerox Co., Ltd. | Computer programming using tangible user interface where physical icons (phicons) indicate: beginning and end of statements and program constructs; statements generated with re-programmable phicons and stored |
USD413595S (en) | 1998-02-17 | 1999-09-07 | Sony Corporation | Television monitor |
USD411517S (en) | 1998-07-13 | 1999-06-29 | Ncr Corporation | Computer |
USD429068S (en) | 1999-04-14 | 2000-08-08 | Mark John Kleinsmith | Combination card case with money clip and fabric insert |
USD437593S1 (en) | 2000-02-28 | 2001-02-13 | Intel Corporation | Internet tablet |
US7227526B2 (en) | 2000-07-24 | 2007-06-05 | Gesturetek, Inc. | Video-based image control system |
USD459394S1 (en) | 2001-03-23 | 2002-06-25 | Choice Obadiaru | Reader machine |
USD458255S1 (en) | 2001-05-31 | 2002-06-04 | Firich Enterprises Co., Ltd. | Dual screen touch point of sale device |
US8019121B2 (en) | 2002-07-27 | 2011-09-13 | Sony Computer Entertainment Inc. | Method and system for processing intensity from input devices for interfacing with a computer program |
DE10258794A1 (de) | 2002-12-16 | 2004-06-24 | Ibeo Automobile Sensor Gmbh | Verfahren zur Erkennung und Verfolgung von Objekten |
USD494943S1 (en) | 2003-01-28 | 2004-08-24 | Bpt S.P.A. | Video communication system |
US8064684B2 (en) | 2003-04-16 | 2011-11-22 | Massachusetts Institute Of Technology | Methods and apparatus for visualizing volumetric data using deformable physical object |
US20050276164A1 (en) | 2004-06-12 | 2005-12-15 | Scott Amron | Watch adapted to rotate a displayed image so as to appear in a substantially constant upright orientation |
USD517512S1 (en) | 2004-08-25 | 2006-03-21 | Action Electronics Co., Ltd. | Audio-video display device |
GB0420204D0 (en) | 2004-09-11 | 2004-10-13 | Univ Abertay | "Object recognition system" |
USD546895S1 (en) | 2005-04-11 | 2007-07-17 | Idt Electronic Products Limited | Electronic learning device |
USD524312S1 (en) | 2005-05-23 | 2006-07-04 | Associated Industries China, Inc. | Liquid crystal display |
TWM291671U (en) | 2005-11-09 | 2006-06-01 | Wistron Corp | Enclosure with rotary functions and electric device for combining the enclosure |
US20110199319A1 (en) | 2005-11-09 | 2011-08-18 | George Moser | Reconfigurable Computer |
USD533857S1 (en) | 2005-12-07 | 2006-12-19 | Hannspree Inc. | Television set |
US8611587B2 (en) | 2006-03-27 | 2013-12-17 | Eyecue Vision Technologies Ltd. | Device, system and method for determining compliance with an instruction by a figure in an image |
USD563405S1 (en) | 2006-05-17 | 2008-03-04 | Pfu Limited | Computer terminal |
USD578131S1 (en) | 2006-09-12 | 2008-10-07 | Control4 Corporation | Wall mounted touch screen faceplate |
USD599328S1 (en) | 2007-02-08 | 2009-09-01 | Hewlett-Packard Development Company, L.P. | Remote control for digital entertainment content |
WO2008129540A2 (en) | 2007-04-19 | 2008-10-30 | Eyecue Vision Technologies Ltd. | Device and method for identification of objects using color coding |
CA120748S (en) | 2007-05-16 | 2008-02-21 | Bce Inc | Telephone base |
US9138636B2 (en) | 2007-05-16 | 2015-09-22 | Eyecue Vision Technologies Ltd. | System and method for calculating values in tile games |
USD576177S1 (en) | 2007-12-14 | 2008-09-02 | Kabushiki Kaisha Toshiba | Digital audio player |
US20090273560A1 (en) | 2008-02-04 | 2009-11-05 | Massachusetts Institute Of Technology | Sensor-based distributed tangible user interface |
US8217964B2 (en) | 2008-02-14 | 2012-07-10 | Nokia Corporation | Information presentation based on display screen orientation |
US7777899B1 (en) | 2008-06-19 | 2010-08-17 | Gesturetek, Inc. | Interaction interface for controlling an application |
US8514251B2 (en) | 2008-06-23 | 2013-08-20 | Qualcomm Incorporated | Enhanced character input using recognized gestures |
USD607883S1 (en) | 2008-07-08 | 2010-01-12 | Fujitsu Limited | Personal computer |
US8896632B2 (en) | 2008-09-12 | 2014-11-25 | Qualcomm Incorporated | Orienting displayed elements relative to a user |
EP2350792B1 (en) | 2008-10-10 | 2016-06-22 | Qualcomm Incorporated | Single camera tracker |
USD600689S1 (en) | 2009-03-12 | 2009-09-22 | Quanta Computer Inc. | Notebook computer |
KR20120089452A (ko) | 2009-08-04 | 2012-08-10 | 아이큐 비젼 테크놀로지즈 리미티드 | 물체 추출 시스템 및 방법 |
TWD136375S1 (zh) | 2009-10-28 | 2010-08-11 | 緯創資通股份有限公司 | 電腦 |
USD641749S1 (en) | 2009-11-30 | 2011-07-19 | Creative Technology Ltd | Communication device |
US20110191692A1 (en) | 2010-02-03 | 2011-08-04 | Oto Technologies, Llc | System and method for e-book contextual communication |
TWD141813S1 (zh) | 2010-02-26 | 2011-08-01 | Wacom Co., Ltd. | Coordinate input device
USD638019S1 (en) | 2010-04-23 | 2011-05-17 | ASI Holdings Limited | Media player dock |
USD634316S1 (en) | 2010-05-28 | 2011-03-15 | Hewlett-Packard Development Company, L.P. | Computing device |
US8749499B2 (en) | 2010-06-08 | 2014-06-10 | Sap Ag | Touch screen for bridging multi and/or single touch points to applications |
US9262015B2 (en) | 2010-06-28 | 2016-02-16 | Intel Corporation | System for portable tangible interaction |
US20120026098A1 (en) * | 2010-07-30 | 2012-02-02 | Research In Motion Limited | Portable electronic device having tabletop mode |
US20120043235A1 (en) | 2010-08-23 | 2012-02-23 | James Robert Klement | Protective case for portable electrical device |
USD683027S1 (en) | 2011-01-26 | 2013-05-21 | Omron Healthcare Co., Ltd. | Sphygmomanometer |
USD654450S1 (en) | 2011-02-01 | 2012-02-21 | Vizio Inc | Electronic device with dock |
USD660837S1 (en) | 2011-02-23 | 2012-05-29 | Rgb Systems, Inc. | Desktop touch screen |
US8698873B2 (en) | 2011-03-07 | 2014-04-15 | Ricoh Company, Ltd. | Video conferencing with shared drawing |
USD665687S1 (en) | 2011-05-16 | 2012-08-21 | Otis Elevator Company | Elevator indicator |
USD660736S1 (en) | 2011-05-16 | 2012-05-29 | Otis Elevator Company | Elevator indicator |
USD663638S1 (en) | 2011-05-16 | 2012-07-17 | Otis Elevator Company | Elevator passenger interface |
TWD152721S (zh) | 2011-05-27 | 2013-04-01 | Wacom Co., Ltd. | Coordinate input device
TWD148540S (zh) | 2011-05-27 | 2012-08-01 | Wacom Co., Ltd. | Coordinate input device
USD678239S1 (en) | 2011-06-08 | 2013-03-19 | Clearone Communications, Inc. | Internet protocol conference telephone station |
US8935438B1 (en) * | 2011-06-28 | 2015-01-13 | Amazon Technologies, Inc. | Skin-dependent device components |
USD662089S1 (en) | 2011-09-15 | 2012-06-19 | Plantronics, Inc. | Base cradle for a communications headset |
US9350951B1 (en) | 2011-11-22 | 2016-05-24 | Scott Dallas Rowe | Method for interactive training and analysis |
US9182815B2 (en) | 2011-12-07 | 2015-11-10 | Microsoft Technology Licensing, Llc | Making static printed content dynamic with virtual data |
USD674801S1 (en) | 2011-12-15 | 2013-01-22 | Kevin Gary Wharram | Solar-powered electronic reader cover |
USD676900S1 (en) | 2011-12-19 | 2013-02-26 | Panasonic Corporation | Point of sales terminal |
USD688249S1 (en) | 2011-12-29 | 2013-08-20 | Kevin Gary Wharram | Solar-powered electronic reader cover |
US10457441B2 (en) | 2012-01-05 | 2019-10-29 | Portero Holdings, Llc | Case for a communication device |
USD679018S1 (en) | 2012-02-02 | 2013-03-26 | Cardiac Pacemakers, Inc. | Communicator |
WO2013117977A2 (en) | 2012-02-06 | 2013-08-15 | Sony Computer Entertainment Europe | Book object for augmented reality |
USD702579S1 (en) | 2012-02-17 | 2014-04-15 | Otis Elevator Company | Touch pad for destination selection controller |
US9134807B2 (en) | 2012-03-02 | 2015-09-15 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US8851280B2 (en) | 2012-05-22 | 2014-10-07 | Feng Wen | Tablet cover |
US9152173B2 (en) | 2012-10-09 | 2015-10-06 | Microsoft Technology Licensing, Llc | Transparent display device |
US9158389B1 (en) | 2012-10-15 | 2015-10-13 | Tangible Play, Inc. | Virtualization of tangible interface objects |
US10033943B1 (en) | 2012-10-15 | 2018-07-24 | Tangible Play, Inc. | Activity surface detection, display and enhancement |
US10657694B2 (en) | 2012-10-15 | 2020-05-19 | Tangible Play, Inc. | Activity surface detection, display and enhancement of a virtual scene |
USD693314S1 (en) | 2012-10-24 | 2013-11-12 | Evans Consoles Corporation | Control module |
CN104937920A (zh) | 2012-10-31 | 2015-09-23 | InVisage Technologies, Inc. | Expanded field of view image and video capture
KR102001218B1 (ko) | 2012-11-02 | 2019-07-17 | Samsung Electronics Co., Ltd. | Method for providing information related to an object and device therefor
US9235768B1 (en) | 2012-11-19 | 2016-01-12 | Evernote Corporation | Custom drawings as content access identifiers |
USD697910S1 (en) | 2012-12-11 | 2014-01-21 | Control Module, Inc. | Touch screen terminal for time and attendance system |
TWD158449S (zh) | 2012-12-17 | 2014-01-11 | Wacom Co., Ltd. | Coordinate input device
US9058693B2 (en) | 2012-12-21 | 2015-06-16 | Dassault Systemes Americas Corp. | Location correction of virtual objects |
US20150187225A1 (en) | 2012-12-26 | 2015-07-02 | Google Inc. | Providing quizzes in electronic books to measure and improve reading comprehension |
US9160915B1 (en) * | 2013-01-09 | 2015-10-13 | Amazon Technologies, Inc. | Modifying device functionality based on device orientation |
USD697506S1 (en) | 2013-01-17 | 2014-01-14 | Control Module, Inc. | Touch screen terminal for time and attendance system |
SE536902C2 (sv) | 2013-01-22 | 2014-10-21 | Crunchfish Ab | Scalable input from a tracked object in a touchless user interface
US9472113B1 (en) | 2013-02-05 | 2016-10-18 | Audible, Inc. | Synchronizing playback of digital content with physical content |
US9524282B2 (en) | 2013-02-07 | 2016-12-20 | Cherif Algreatly | Data augmentation with real-time annotations |
USD697060S1 (en) | 2013-03-08 | 2014-01-07 | The Joy Factory, Inc. | Protective case for electronic device |
USD704693S1 (en) | 2013-03-13 | 2014-05-13 | Samsung Electronics Co., Ltd. | Case for electronic device |
TWD160300S (zh) | 2013-04-23 | 2014-05-01 | Wacom Co., Ltd. | Coordinate input device
US9697562B2 (en) | 2013-06-07 | 2017-07-04 | International Business Machines Corporation | Resource provisioning for electronic books |
USD738394S1 (en) | 2013-06-09 | 2015-09-08 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
USD742371S1 (en) | 2013-07-29 | 2015-11-03 | Raytheon Company | Mounted computer design |
USD708184S1 (en) | 2013-08-19 | 2014-07-01 | The Neat Company, Inc. | Sheet scanner with input tray |
USD721665S1 (en) | 2013-08-26 | 2015-01-27 | Technicolor Delivery Technologies | Set-top box |
IN2013MU02915A (es) | 2013-09-10 | 2015-07-03 | Tata Consultancy Services Ltd | |
KR102065417B1 (ko) | 2013-09-23 | 2020-02-11 | LG Electronics Inc. | Wearable mobile terminal and control method thereof
WO2015065467A1 (en) | 2013-11-01 | 2015-05-07 | Hewlett-Packard Development Company, L. P. | Form adjustable angle between a battery cavity to selectively place a video display in viewing orientations |
US9462175B2 (en) | 2013-11-18 | 2016-10-04 | Heekwan Kim | Digital annotation-based visual recognition book pronunciation system and related method of operation |
USD766288S1 (en) | 2013-12-05 | 2016-09-13 | Lg Electronics Inc. | Display of a television receiver with graphical user interface |
US9402018B2 (en) | 2013-12-17 | 2016-07-26 | Amazon Technologies, Inc. | Distributing processing for imaging processing |
USD733714S1 (en) | 2013-12-20 | 2015-07-07 | Pelican Products, Inc. | Case for electronic pad |
USD755783S1 (en) | 2014-01-29 | 2016-05-10 | Rgb Systems, Inc. | Tabletop control panel |
WO2015116971A1 (en) | 2014-01-31 | 2015-08-06 | Heller Noah Raymond | Determination of aesthetic preferences based on user history |
US9571151B2 (en) | 2014-02-06 | 2017-02-14 | Olloclip, Llc | Cases for mobile electronic devices configured to receive auxiliary optical devices |
GB2583848B (en) | 2014-05-21 | 2021-03-24 | Tangible Play Inc | Virtualization of tangible interface objects |
USD816081S1 (en) | 2014-06-25 | 2018-04-24 | Sensel, Inc. | Touch sensor art keyboard |
USD750085S1 (en) | 2014-10-21 | 2016-02-23 | Logitech Europe S.A. | Protective folio with keyboard for multiple types of input devices |
USD778982S1 (en) | 2014-10-29 | 2017-02-14 | First Data Corporation | Mobile point-of-sale display and printer |
US20160203645A1 (en) | 2015-01-09 | 2016-07-14 | Marjorie Knepp | System and method for delivering augmented reality to printed books |
USD760248S1 (en) | 2015-01-13 | 2016-06-28 | Victor Alfonso Suarez | Display screen with graphical user interface |
USD775628S1 (en) | 2015-03-30 | 2017-01-03 | Incase Designs Corp. | Sleeve case for electronic device |
USD798378S1 (en) | 2015-05-15 | 2017-09-26 | Seoul Electronics & Telecom Co., Ltd. | Electronic payment terminal |
US20160378296A1 (en) | 2015-06-25 | 2016-12-29 | Ashok Mishra | Augmented Reality Electronic Book Mechanism |
KR20170006559A (ko) * | 2015-07-08 | 2017-01-18 | LG Electronics Inc. | Mobile terminal and control method thereof
US10956948B2 (en) | 2015-11-09 | 2021-03-23 | Anupam Madiratta | System and method for hotel discovery and generating generalized reviews |
USD807884S1 (en) | 2015-11-11 | 2018-01-16 | Technologies Humanware Inc. | Tactile braille tablet |
US10194089B2 (en) | 2016-02-08 | 2019-01-29 | Qualcomm Incorporated | Systems and methods for implementing seamless zoom function using multiple cameras |
JP2017175579A (ja) | 2016-03-25 | 2017-09-28 | Panasonic Intellectual Property Management Co., Ltd. | Information display system and information providing terminal
US9866927B2 (en) | 2016-04-22 | 2018-01-09 | Microsoft Technology Licensing, Llc | Identifying entities based on sensor data |
US10885801B2 (en) | 2016-05-24 | 2021-01-05 | Tangible Play, Inc. | Virtualized tangible programming |
USD839275S1 (en) | 2018-01-18 | 2019-01-29 | Amazon Technologies, Inc. | Device cover |
USD810088S1 (en) | 2016-06-06 | 2018-02-13 | Amazon Technologies, Inc. | Device cover |
USD812622S1 (en) | 2016-09-06 | 2018-03-13 | Hanwha Techwin Co., Ltd. | Portable terminal |
JP1605571S (es) | 2017-01-12 | 2018-06-04 | ||
BR112019014561A2 (pt) | 2017-01-17 | 2020-02-18 | Hewlett-Packard Development Company, L.P. | Simulated augmented content
USD825596S1 (en) | 2017-01-27 | 2018-08-14 | Charles Cannata | Display screen or portion thereof with graphical user interface |
USD871500S1 (en) | 2017-01-30 | 2019-12-31 | Tabletop Media Llc | Tabletop point-of-sale (POS) terminal |
USD844010S1 (en) | 2017-02-17 | 2019-03-26 | Crestron Electronics, Inc. | Tablet computer dock |
USD852211S1 (en) | 2017-03-21 | 2019-06-25 | Microsoft Corporation | Display screen with animated graphical user interface |
US10635255B2 (en) | 2017-04-18 | 2020-04-28 | Google Llc | Electronic device response to force-sensitive interface |
USD849741S1 (en) | 2017-04-18 | 2019-05-28 | Shenzhen Huion Animation Technology Co. Ltd. | Interactive pen display |
USD824406S1 (en) | 2017-05-01 | 2018-07-31 | Promontech Llc | Computer display panel with a graphical user interface for a mortgage application |
US10003371B1 (en) | 2017-05-31 | 2018-06-19 | Dennis Given | Electronic device case comprising sliding camera lens covers |
USD857007S1 (en) | 2017-08-22 | 2019-08-20 | Intel Corporation | Audio-visual display device |
USD877747S1 (en) | 2017-09-15 | 2020-03-10 | Express Scripts Strategic Development, Inc. | Display screen with graphical user interface |
USD834573S1 (en) | 2017-09-21 | 2018-11-27 | Facebook, Inc. | Display device |
US10897680B2 (en) * | 2017-10-04 | 2021-01-19 | Google Llc | Orientation-based device interface |
USD880327S1 (en) | 2017-11-02 | 2020-04-07 | Otis Elevator Company | Operating/display panel |
USD850440S1 (en) | 2017-12-05 | 2019-06-04 | Rgb Systems, Inc. | Table top touch screen |
US10854001B2 (en) | 2017-12-26 | 2020-12-01 | Tangible Play, Inc. | Tangible object virtualization station |
USD854565S1 (en) | 2018-01-30 | 2019-07-23 | Citrix Systems, Inc. | Display screen or portion thereof with graphical user interface |
WO2019195787A1 (en) | 2018-04-05 | 2019-10-10 | Tangible Play, Inc. | Display positioning system |
USD873819S1 (en) | 2018-06-19 | 2020-01-28 | Seagate Technology Llc | Storage device |
US10896235B2 (en) | 2018-07-13 | 2021-01-19 | Tyndale House Publishers, Inc. | Connecting a printed document to related digital content |
US10817582B2 (en) | 2018-07-20 | 2020-10-27 | Elsevier, Inc. | Systems and methods for providing concomitant augmentation via learning interstitials for books using a publishing platform |
USD919653S1 (en) | 2018-10-12 | 2021-05-18 | Vasona Networks, Inc. | Display screen or portion thereof with animated graphical user interface |
USD852801S1 (en) | 2018-11-07 | 2019-07-02 | Tangible Play, Inc. | Device case |
WO2020097381A1 (en) | 2018-11-07 | 2020-05-14 | Tangible Play, Inc. | Protective cover device |
USD902202S1 (en) | 2018-11-08 | 2020-11-17 | TecPal Ltd. | Kitchen tablet |
US20210006730A1 (en) | 2019-07-07 | 2021-01-07 | Tangible Play, Inc. | Computing device |
USD907032S1 (en) | 2019-07-07 | 2021-01-05 | Tangible Play, Inc. | Virtualization device |
USD929398S1 (en) | 2019-08-06 | 2021-08-31 | Ambit Microsystems (Shanghai) Ltd. | Smart home hub |
USD908122S1 (en) | 2020-06-28 | 2021-01-19 | Shenzhen Dushang Technology Co., Ltd. | All-in-one computer |
- 2020
- 2020-05-07 US US16/869,413 patent/US20210006730A1/en not_active Abandoned
- 2020-05-21 US US16/880,875 patent/US11516410B2/en active Active
- 2020-05-21 US US16/880,882 patent/US20210004405A1/en not_active Abandoned
- 2020-07-07 CA CA3146022A patent/CA3146022A1/en active Pending
- 2020-07-07 GB GB2200058.2A patent/GB2599838A/en not_active Withdrawn
- 2020-07-07 CN CN202080063135.6A patent/CN114375435A/zh active Pending
- 2020-07-07 MX MX2022000340A patent/MX2022000340A/es unknown
- 2020-07-07 BR BR112022000301A patent/BR112022000301A2/pt not_active IP Right Cessation
- 2020-07-07 CN CN202080063146.4A patent/CN114402204A/zh active Pending
- 2020-07-07 WO PCT/US2020/041057 patent/WO2021007241A1/en active Application Filing
- 2020-07-07 MX MX2022000299A patent/MX2022000299A/es unknown
- 2020-07-07 GB GB2200057.4A patent/GB2599597A/en not_active Withdrawn
- 2020-07-07 CN CN202080063171.2A patent/CN114424238A/zh active Pending
- 2020-07-07 AU AU2020311356A patent/AU2020311356A1/en not_active Abandoned
- 2020-07-07 GB GB2200060.8A patent/GB2599839A/en not_active Withdrawn
- 2020-07-07 WO PCT/US2020/041051 patent/WO2021007238A1/en active Application Filing
- 2020-07-07 BR BR112022000318A patent/BR112022000318A2/pt not_active IP Right Cessation
- 2020-07-07 BR BR112022000277A patent/BR112022000277A2/pt not_active IP Right Cessation
- 2020-07-07 CA CA3161129A patent/CA3161129A1/en active Pending
- 2020-07-07 CA CA3146171A patent/CA3146171A1/en active Pending
- 2020-07-07 AU AU2020311360A patent/AU2020311360A1/en not_active Abandoned
- 2020-07-07 MX MX2022000345A patent/MX2022000345A/es unknown
- 2020-07-07 WO PCT/US2020/041066 patent/WO2021007248A1/en active Application Filing
- 2020-07-07 AU AU2020309531A patent/AU2020309531A1/en not_active Abandoned
- 2022
- 2022-12-29 US US18/148,277 patent/US20240031688A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160282901A1 (en) * | 2015-03-26 | 2016-09-29 | Tangible Play, Inc. | Display Positioning System |
US20190098258A1 (en) * | 2016-03-25 | 2019-03-28 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying system and information providing terminal |
US20180329672A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Volume adjustment on hinged multi-screen device |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11516410B2 (en) | 2019-07-07 | 2022-11-29 | Tangible Play, Inc. | Input polarity of computing device |
USD920326S1 (en) * | 2019-07-07 | 2021-05-25 | Tangible Play, Inc. | Virtualization device |
USD954042S1 (en) * | 2019-07-07 | 2022-06-07 | Tangible Play, Inc. | Virtualization device |
USD980208S1 (en) * | 2019-09-24 | 2023-03-07 | Tangible Play, Inc. | Activity surface |
USD934828S1 (en) * | 2020-04-07 | 2021-11-02 | Yealink (Xiamen) Network Technology Co., Ltd. | Touch screen |
US20220050693A1 (en) * | 2020-08-11 | 2022-02-17 | International Business Machines Corporation | Determine step position to offer user assistance on an augmented reality system |
US20240022759A1 (en) * | 2020-11-25 | 2024-01-18 | International Business Machines Corporation | Video encoding through non-saliency compression for live streaming of high definition videos in low-bandwidth transmission |
US11368651B1 (en) * | 2021-03-15 | 2022-06-21 | Amazon Technologies, Inc. | Audiovisual device |
US20220294993A1 (en) * | 2021-03-15 | 2022-09-15 | Amazon Technologies, Inc. | Electronic device with shutter assembly |
US11949997B2 (en) * | 2021-03-15 | 2024-04-02 | Amazon Technologies, Inc. | Electronic device with shutter assembly |
USD993242S1 (en) * | 2021-04-28 | 2023-07-25 | Diebold Nixdorf Systems Gmbh | Computer screen |
USD991930S1 (en) * | 2021-04-28 | 2023-07-11 | Diebold Nixdorf Systems Gmbh | Computer screen |
USD991931S1 (en) * | 2021-04-28 | 2023-07-11 | Diebold Nixdorf Systems Gmbh | Computer screen |
US12126828B2 (en) * | 2023-07-30 | 2024-10-22 | International Business Machines Corporation | Video encoding through non-saliency compression for live streaming of high definition videos in low-bandwidth transmission |
Also Published As
Publication number | Publication date |
---|---|
WO2021007241A1 (en) | 2021-01-14 |
BR112022000301A2 (pt) | 2022-03-15 |
BR112022000277A2 (pt) | 2022-04-26 |
CN114375435A (zh) | 2022-04-19 |
CN114402204A (zh) | 2022-04-26 |
CA3146022A1 (en) | 2021-01-14 |
GB2599597A (en) | 2022-04-06 |
MX2022000299A (es) | 2022-04-25 |
GB202200060D0 (en) | 2022-02-16 |
AU2020311360A1 (en) | 2022-02-24 |
BR112022000318A2 (pt) | 2022-03-15 |
US11516410B2 (en) | 2022-11-29 |
US20210004405A1 (en) | 2021-01-07 |
CN114424238A (zh) | 2022-04-29 |
CA3161129A1 (en) | 2021-01-14 |
GB202200058D0 (en) | 2022-02-16 |
US20210004051A1 (en) | 2021-01-07 |
GB202200057D0 (en) | 2022-02-16 |
GB2599839A (en) | 2022-04-13 |
AU2020311356A1 (en) | 2022-02-24 |
WO2021007238A1 (en) | 2021-01-14 |
US20240031688A1 (en) | 2024-01-25 |
WO2021007248A1 (en) | 2021-01-14 |
CA3146171A1 (en) | 2021-01-14 |
MX2022000340A (es) | 2022-04-25 |
MX2022000345A (es) | 2022-04-25 |
AU2020309531A1 (en) | 2022-02-24 |
GB2599838A (en) | 2022-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210006730A1 (en) | Computing device | |
US10984576B2 (en) | Activity surface detection, display and enhancement of a virtual scene | |
US20230415030A1 (en) | Virtualization of Tangible Interface Objects | |
US10726266B2 (en) | Virtualization of tangible interface objects | |
US11022863B2 (en) | Display positioning system | |
US10033943B1 (en) | Activity surface detection, display and enhancement | |
US20220322555A1 (en) | Display Positioning System | |
US20200143567A1 (en) | Protective Cover Device | |
GB2564784B (en) | Activity surface detection, display and enhancement of a virtual scene | |
US20200233503A1 (en) | Virtualization of tangible object components |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
AS | Assignment |
Owner name: TANGIBLE PLAY, INC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SOLOMON, MARK;SCHOLLER, JEROME;SIGNING DATES FROM 20201022 TO 20201201;REEL/FRAME:054517/0398 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
AS | Assignment |
Owner name: GLAS TRUST COMPANY LLC, NEW JERSEY
Free format text: SECURITY INTEREST;ASSIGNOR:TANGIBLE PLAY, INC.;REEL/FRAME:060257/0811
Effective date: 20211124 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |