WO2022020452A1 - Virtualized user-interface device - Google Patents
- Publication number: WO2022020452A1 (application PCT/US2021/042549)
- Authority: WIPO (PCT)
- Prior art keywords: sensors, display panels, transformative, user, virtualized
Classifications
- G06F3/1423—Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/147—Digital output to display device using display panels
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0488—Interaction techniques based on graphical user interfaces using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G09G2354/00—Aspects of interface with display user
- A63F13/2145—Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
- A63F13/426—Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD]
- A63F13/90—Constructional details or arrangements of video game devices, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- A63F9/0834—Three-dimensional puzzles with slidable or rotatable elements, e.g. Rubik's cube, comprising only two layers, e.g. with eight elements
- A63F9/0838—Three-dimensional puzzles with a central retaining element and groups of elements rotatable about at least three axes intersecting in one point
- A63F9/0842—Such puzzles with additional elements rotatable about three orthogonal axes at both ends, e.g. Rubik's cube
- A63F2009/2435—Input using a video camera
- A63F2009/2447—Input using a motion detector
- A63F2009/2457—Output to display screens, e.g. monitors, video displays
- A63F2009/247—Audible output devices, e.g. using a loudspeaker
Definitions
- Aspects of the embodiments relate generally to electronics, transducer, and data-processing technologies and, more particularly, to handheld computing devices comprising user interfaces.
- Augmented- or mixed-reality games available in the market generally use either a camera or geolocation as real-world inputs.
- A game processes video-camera images of its surrounding "real" environment and superimposes additional, "virtual" elements on them.
- A cell-phone game called "Mosquitos," released circa 2004, displayed a phone-camera image on the screen of the phone and overlaid images of giant mosquitoes on it; the player's objective was to shoot the mosquitoes using superimposed crosshairs.
- Virtual objects in transreality puzzles may be displayed on a separate display, such as a monitor or a wearable VR/AR headset communicatively coupled to the transformable input device, with the latter receiving mechanical inputs from the user.
- Alternatively, virtual objects may be displayed, and subjected to manipulation or transformation, on a display or a plurality of displays placed on the outside surfaces of the transformable input device itself.
- The unique experience delivered by such transreality puzzles is based on integrating active three-dimensional fine-motor user inputs with purposely engineered sensory, visual, and haptic feedback.
- Transreality gaming devices comprise multiple moving parts requiring mutual rotations or positional shifts, leading to significant production costs and to reliability limited by mechanical movement, moving-surface contamination, and electrical-connection complexity, as well as to hazards due to the presence of small mechanical parts.
- FIG. 1 is a perspective-view diagram illustrating a virtualized transformative user-interface device according to some embodiments.
- FIG. 2A is a process flow diagram illustrating basic functionality of the device of FIG. 1 according to some examples.
- FIG. 2B is a diagram illustrating exemplary user interactions with a volumetric mixed reality device according to some embodiments.
- FIG. 3 is a simplified exploded-view diagram of a display sensor according to an example.
- FIG. 4 is a simplified perspective-view diagram illustrating some interior components of a device according to some embodiments.
- FIG. 5 is a simplified perspective-view diagram illustrating a core for use in a device according to some embodiments.
- FIG. 6 is a front elevational-view diagram illustrating a sleeve for use in a device according to some embodiments.
- FIG. 7 is a simplified perspective-view diagram illustrating a device according to some examples.
- FIGS. 8A-8D are perspective-view diagrams illustrating various devices in accordance with principles described in the present disclosure.
- FIG. 9 is a simplified perspective-view diagram illustrating an example of user interaction with a volumetric mixed reality device according to some embodiments.
- A virtualized transformable user-interface electronic device is built as a robust monolithic body with no externally observable moving parts.
- These aspects may mitigate some limitations of transreality puzzle-type devices.
- A virtualized transformative user-interface device includes a monolithic, non-transformable body having a plurality of outside-facing surfaces and a plurality of display panels respectively disposed along at least some of the plurality of outside-facing surfaces. The device detects whether mechanical user-input forces are consistent with an intent to move or transform the device and, in response to a positive determination, emulates transformative movement by changing the graphical images on the plurality of display panels.
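As a rough illustration of this detect-then-emulate behavior, the following Python sketch decides whether opposing edge forces look like a deliberate twist and, if so, "rotates" the displayed content. All function names, the force encoding, and the threshold are assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical sketch: deciding whether sensed forces look like an intent
# to "twist" the monolithic device, then emulating the move on-screen.

FORCE_THRESHOLD = 2.0  # newtons; assumed minimum for a deliberate twist


def is_transform_intent(edge_forces):
    """Opposing, sufficiently strong forces on opposite edges suggest a twist."""
    left, right = edge_forces["left"], edge_forces["right"]
    opposing = left * right < 0            # forces point in opposite directions
    strong = min(abs(left), abs(right)) >= FORCE_THRESHOLD
    return opposing and strong


def emulate_transform(display_state, intent):
    """Redraw panels as if a layer had physically rotated."""
    if intent:
        display_state = display_state[::-1]  # placeholder 'rotation' of content
    return display_state


state = ["face-A", "face-B"]
forces = {"left": 3.1, "right": -2.8}
state = emulate_transform(state, is_transform_intent(forces))
```

In a real device the threshold and the force model would come from the sensor geometry described later in the disclosure; this sketch only shows the decision structure.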
- The device has a cubic form factor with six flat-panel displays on its exterior faces, and includes additional sensor or actuator components supporting user interactivity, e.g., a microphone, accelerometer, temperature sensor, light sensor, camera, vibration actuator, or speaker.
- User input may be collected via touch-force sensor arrays either integrated directly with the displays (touch screens) or disposed on a sensor plate with force sensors placed underneath the displays.
- Force sensors may instead be placed in the device edges defined by the displays on its faces, or on the core of the device, mechanically connected to faces or edges using axles. In all cases, the quantity, locations, and arrangement of the force sensors are chosen to facilitate automated determination of the magnitude and direction of mechanical force applied to the device faces and edges by a user engaged interactively with the device.
- Additional sensory inputs and processing may combine multi-modal sensing (e.g., camera, accelerometer, pressure) with processing that fuses such inputs to discern gestures, mimicry, and other higher-level user behaviors.
- The present disclosure relates to a volumetric mixed-reality electronic device appearing from the exterior as a robust, monolithic body with no external moving parts.
- A virtualized transformative user-interface device is equipped with data-processing circuitry, such as a microprocessor-based system that includes a processor core, memory, non-volatile data storage, input/output circuitry, and interface circuitry coupled to input devices (such as sensors) and to output devices such as displays or a sound system (e.g., amplifier and speaker).
- The microprocessor-based system may comprise a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), or a combination of various types of processor architectures to support the algorithms utilized to implement the device's functionality, and may be integrated as a System-on-a-Chip (SoC).
- Suitable interface circuitry, such as analog-to-digital converter (ADC) or digital-to-analog converter (DAC) circuitry, may be included and suitably coupled to the processing circuitry.
- The microprocessor-based system may include data-communications circuitry, such as a Wi-Fi modem and radio compliant with an IEEE 802.11 standard, an IEEE 802.15 standard ("Bluetooth"), a 3GPP standard (LTE/5G), or the like, suitably interfaced with a processor circuit.
- The device may include suitable connectors, flex or other cabling, power sources, and related electronics such as a system bus or other internal interface between various hardware devices.
- Instructions executable by the microprocessor-based system may be stored in non-volatile memory, such as flash EEPROM or other suitable storage media. Certain processing may be carried out locally on-board the device, whereas other processing may be carried out remotely, on a peer device or on one or more servers consistent with a cloud-computing model, utilizing inputs collected locally by the device and transmitting processed results based on those inputs back to the device. In related embodiments, certain instructions may be transmitted to the device from a server, peer device, or other source, to be executed locally on the device.
- Certain machine-learning-based algorithms are executed on the device and are dynamically tuned under a supervised-learning regime, an unsupervised-learning regime, a reinforcement-learning regime, or a combination of such paradigms. Accordingly, certain parameters (e.g., weights, offsets, loss function, etc.) may be developed and adjusted over time, whether locally by the device or remotely (e.g., in the cloud) and transmitted to the device.
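As one hedged illustration of how such parameters might be adjusted under a supervised regime, the sketch below performs a single stochastic-gradient step on an assumed logistic-regression gesture classifier; the feature values, label, and learning rate are purely illustrative and not from the disclosure.

```python
# Illustrative sketch (assumed, not the patented method): one supervised
# gradient step tuning gesture-classifier weights from a labeled sensor sample.
import math


def sigmoid(z):
    """Logistic function mapping a score to a probability."""
    return 1.0 / (1.0 + math.exp(-z))


def sgd_step(weights, features, label, lr=0.1):
    """Logistic-regression update: nudge weights toward the labeled gesture."""
    pred = sigmoid(sum(w * x for w, x in zip(weights, features)))
    grad = [(pred - label) * x for x in features]
    return [w - lr * g for w, g in zip(weights, grad)]


w = [0.0, 0.0, 0.0]                              # initial classifier weights
w = sgd_step(w, features=[1.0, 0.5, -0.2], label=1.0)
```

Repeating such steps over many labeled samples, locally or in the cloud, is the kind of parameter development the passage above describes.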
- A mixed-reality electronic device has an electronic display, which may be a thin-film-transistor (TFT) device, e.g., an active-matrix video display such as an LCD, LED, OLED, or similar, situated at the outer surface of the device.
- The device has a plurality of electronic displays situated at its outer surfaces.
- The display(s) present content, such as movable graphical objects with which the user may interact via manipulation of the device, hand gestures, or some combination thereof. For instance, the user may produce readable input by pressing or swiping on or over the display(s) with a finger. This input may be interpreted as directions for moving the graphical object over the surface of the display as part of solving a puzzle, or for transforming a transformable geometric shape.
- The virtualized transformative user-interface device may be equipped with one or more sensors to detect other physical actions of the user according to some embodiments.
- The device comprises an accelerometer, the output of which is provided to a processor programmed to detect user gestures, including shaking and similar actions. Further, some embodiments of the device utilize inertial sensors (e.g., an accelerometer or gyroscope) to detect device orientation and movement in space.
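A minimal sketch of accelerometer-based shake detection consistent with the description above; the 1.5 g threshold, required peak count, and sample format are assumptions chosen for illustration.

```python
# Hedged sketch: detecting a 'shake' gesture from accelerometer magnitude.
import math

G = 9.81  # gravitational acceleration, m/s^2


def is_shake(samples, threshold=1.5 * G, min_peaks=3):
    """A shake shows several acceleration peaks well above gravity.

    samples: iterable of (x, y, z) accelerometer readings in m/s^2.
    """
    magnitudes = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    peaks = sum(1 for m in magnitudes if m > threshold)
    return peaks >= min_peaks


still = [(0.0, 0.0, G)] * 10                       # device at rest
shaken = [(0.0, 0.0, G), (15.0, 2.0, G), (-16.0, 1.0, G), (14.9, 0.0, G)]
```

A production classifier would likely use windowing and sign alternation rather than a bare peak count; this only shows the signal-to-gesture mapping the processor performs.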
- An advanced feature includes recognition of user hand gestures. The hand gestures may be detected through a number of presently existing, or future-arising, technologies, including, but not limited to, strain or pressure sensing, resistive touch sensing, capacitive sensing, or optical detection.
- One shared principle of these various sensing techniques exploits a change in their physical state, typically their electrical properties, when a human hand or finger is placed in direct contact with, or in proximity to, the sensing device. The change in electrical properties of the sensing device is processed so as to be interpreted by the processor-based circuitry of the device.
- One or more video cameras may be incorporated into a device and adapted to detect the position of the user's eyes.
- This input may be used in many ways, including, but not limited to, control of the content, as well as energy saving and battery-life extension through dimming of surfaces not visible to the user.
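One way such eye-position input could drive dimming is sketched below, assuming a per-face outward normal and an estimated viewing direction; the geometry and all names are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch: dim any face whose outward normal points away from the
# estimated viewing direction, to extend battery life.


def visible_faces(face_normals, view_dir):
    """A face counts as visible when its normal has a positive component
    toward the viewer (positive dot product with the view direction)."""
    def dot(a, b):
        return sum(ai * bi for ai, bi in zip(a, b))
    return {name: dot(n, view_dir) > 0 for name, n in face_normals.items()}


normals = {
    "front": (0, 0, 1), "back": (0, 0, -1),
    "left": (-1, 0, 0), "right": (1, 0, 0),
    "top": (0, 1, 0), "bottom": (0, -1, 0),
}
vis = visible_faces(normals, view_dir=(0.0, 0.3, 1.0))  # viewer slightly above front
dimmed = [face for face, seen in vis.items() if not seen]
```

Here the viewer looking mostly at the front face leaves the back, left, right, and bottom faces eligible for dimming.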
- The device may include instructions to integrate inputs from the various sensors, recognize the pattern of particular user gestures, and correlate the gestures to a set of allowed transformations of the content displayed on the electronic display or displays.
- The device comprises an output to provide sensory feedback other than visual, e.g., audio circuits and speakers, or a vibration motor and its controls for haptic feedback.
- Device 0100 is generally shaped like a cube, with display touch sensors 0110 (e.g., force-sensitive input assemblies, capacitive sensors, membrane sensors, etc.) disposed on each of its faces (“display sensors” hereinafter). Only display touch sensors disposed on faces visible directly to the viewer are shown for clarity.
- The device is built as a robust monolithic body with no moving parts (as observable from the outside), which mitigates certain drawbacks of mechanically transformable devices.
- Although the device has no externally observable moving parts, it may have moving parts internally, such as vibration actuators, gyroscopes or other accelerometers, piezoelectric sensors or actuators, certain microelectromechanical (MEMS) sensors or actuators, a speaker, a microphone, or the like.
- Referring to FIG. 2A, basic functionality of the device according to an example embodiment is illustrated as a flow diagram.
- The basic operation of this example includes storing current video content and displaying it on the display or plurality of displays at 210; integrating input from a plurality of sensors at 202; recognizing the user's gesture at 204, based on the input at 202 and on classification criteria applied at 203; and recognizing the user-intended transformation of the displayed video content at 206, which may be accomplished, in part, by comparing the interpreted gesture of the user (or series of gestures) to stored patterns of sensory responses, which may be part of the classification criteria at 203.
- The current video content may then be transformed and displayed.
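The flow of FIG. 2A can be sketched, under assumed data shapes and classification criteria, roughly as follows; the function names, feature summary, and criteria table are illustrative assumptions.

```python
# Minimal sketch of the FIG. 2A flow: integrate sensor input, classify the
# gesture against stored criteria, then transform the displayed content.


def integrate_sensors(readings):
    """Combine per-sensor readings into one feature summary (cf. step 202)."""
    return sum(readings) / len(readings)


def classify_gesture(feature, criteria):
    """Match the integrated input against stored patterns (cf. steps 203-204)."""
    for name, (lo, hi) in criteria.items():
        if lo <= feature < hi:
            return name
    return "none"


def transform_content(content, gesture):
    """Apply the user-intended transformation to the content (cf. step 206)."""
    if gesture == "rotate":
        return content[1:] + content[:1]   # placeholder cyclic 'rotation'
    return content


criteria = {"tap": (0.1, 1.0), "rotate": (1.0, 5.0)}
content = ["U", "F", "R"]
gesture = classify_gesture(integrate_sensors([2.0, 3.0]), criteria)
content = transform_content(content, gesture)
```

The real classification criteria would be multi-dimensional patterns of sensory responses rather than scalar ranges; only the pipeline structure is meant to match the figure.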
- Display touch sensors 0110 are used to detect user interaction with content on the displays.
- The content comprises movable graphical objects 0112 and 0114 intended to cause a user to interact with them through hand gestures, including, but not limited to, pressing on object 0112 with a finger, moving the object 0114 over the surface of the display, or the like.
- Content scripts in some embodiments of the present disclosure include moving objects as part of solving a puzzle, or transforming virtualized transformable geometric shapes into various emulated configurations.
- The displays 0130 are 3D displays.
- The device may be configured to create an illusion of a volumetric object with objects placed inside.
- FIG. 3 is an exploded view of a display touch sensor 0110 illustrating principal components supporting its functionality according to an example.
- The touch sensor 0110 of FIG. 3 includes cover glass 0120; an active-matrix display array (AMD) 0130 with an input flex circuit cable 0132 connecting AMD 0130 to the driving electronics; and a sensor plate 0140, with sensors 0142 disposed on it, and a flex cable 0144.
- Sensor plate 0140 comprises four distinct sensor devices 0142.
- The sensor devices are tensoresistors, i.e., electronic components whose electrical resistance depends on the applied mechanical force (e.g., piezoresistors). When pressure is applied to the movable object 0112, the four tensoresistors disposed as illustrated in FIG. 3 provide four measurements of force magnitude at known sensor locations. Measuring these values in real time provides enough information to map digital representations of the hand's mechanical action, i.e., the point of application, direction, and magnitude of the mechanical force applied to the surface of the device through, e.g., a finger or a palm touching the display sensor.
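A hedged sketch of this readout: with four force readings at known plate corners, the total force is their sum and the point of application is their force-weighted centroid. The sensor coordinates below are assumed unit-square positions, not dimensions from the disclosure.

```python
# Sketch of the four-tensoresistor readout: total force and point of
# application recovered from four known-position force measurements.

SENSOR_POSITIONS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # plate corners


def resolve_touch(forces):
    """Return (total force, (x, y) point of application) from four readings."""
    total = sum(forces)
    if total == 0:
        return 0.0, None                     # no touch detected
    x = sum(f * px for f, (px, _) in zip(forces, SENSOR_POSITIONS)) / total
    y = sum(f * py for f, (_, py) in zip(forces, SENSOR_POSITIONS)) / total
    return total, (x, y)


total, point = resolve_touch([1.0, 1.0, 1.0, 1.0])  # equal readings: centered press
```

A press directly over one corner loads only that sensor, so the resolved point coincides with that sensor's position; force direction (normal vs. shear) would need additional sensing.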
- Gesture recognition and related processing may employ artificial intelligence (AI) and deep-learning (DL) techniques, which may execute on dedicated hardware such as a Tensor Processing Unit (TPU). In some embodiments, the hardware executing the AI and DL training is remote from the device, such as in the cloud, and communicates with the device via communications circuitry such as the examples described above.
- A device 0100 comprises a core 0166 and a plurality of axles 0160.
- Each of the plurality of axles 0160 comprises a distal end 0162 and a proximal end 0164.
- The distal end 0162 is mechanically linked to display 0130, and the proximal end 0164 is mechanically linked to the core 0166, as illustrated in FIG. 4.
- One embodiment of core 0166 is shown in FIG. 5.
- Core 0166 comprises six sleeves 0180, one sleeve aligned orthogonally with a corresponding display 0130.
- A front elevational view of sleeve 0180 is shown in FIG. 6 according to an example.
- Each sleeve 0180 comprises a plurality of sensors 0182 and a plurality of axle retainer members 0184.
- Each of the plurality of axle retainer members 0184 comprises a sensor surface 0188 adjacent the sensor 0182, and an axle retainer surface 0186 adjacent the proximal end 0164 of the axle 0160.
- The axle retainer member 0184 is arranged to transmit force from the axle 0160, which is mechanically coupled to display 0130, to the sensor.
- This example embodiment includes four sensors and four axle retainer members per axle, though other configurations are also contemplated. The whole arrangement is intended to detect, in real time, the magnitude and direction of the force that the user applies to each screen.
- In other embodiments, the number of display screens differs from six, and the number of axles per active-matrix display screen may differ as well. Further, retaining the axles, transmitting force from axle to sensor, and collecting enough measurements from the sensors to determine force magnitude and direction may require different numbers of sensors and axle retainer members per axle.
- Sensors may be disposed along the edges of the device, as shown in FIG. 7.
- The device is of generally cubical shape, and two sensors are disposed per edge. In other embodiments (not shown) there may be more or fewer sensors situated along each of the edges.
- An aspect of the volumetric mixed-reality devices of the present disclosure is emulating user experiences similar to those provided by transformable electronic devices, using a monolithic, non-transformable electronic device such as one described in accordance with any of the foregoing embodiments.
- The images displayed on the electronic display or displays emulate the appearance and functionality of a transformable input device with elements that could be re-positioned, slanted, or turned.
- a volumetric mixed reality device may be configured to emulate a 2x2x2-cubelet transreality puzzle such as in the examples disclosed in Russian Federation Patent No. RU2644313C1, the disclosure of which is incorporated by reference herein.
- Other related embodiments may emulate, e.g., a 3x3x3 Rubik’s Cube-style transreality puzzle as disclosed in U.S. Patent No. 8,465,356 (not shown), or a 4x4x4 puzzle as shown in FIGS. 8C-8D.
- the sensors built into the non-transformable devices of FIGS. 8A-8D, or similar devices, are arranged to detect the user’s physical actions, typically hand gestures, that would be intended to transform the device if it were transformable.
- the images displayed on the display screens are animated in response to the sensor inputs to emulate transformative movement of a transformative device.
- the animation may depict re-positioning, slanting, pushing, compressing or turning the virtualized movable elements.
- a set of instructions may be programmed into the device to integrate the inputs from the multiple sensors, recognize the pattern of a particular user gesture, and correlate the gesture to a predefined transformation of the virtualized transformative device.
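The programmed pipeline described above (integrate sensor inputs, recognize a gesture pattern, correlate it to a predefined virtual transformation) can be sketched as follows. The gesture names, thresholds, and data layout are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the gesture pipeline: integrate per-edge sensor
# readings, match the aggregate against a pattern, and emit a predefined
# virtual-cube move for the display animation to depict.

def recognize_gesture(edge_torques, threshold=1.0):
    """Map per-edge torque readings to a virtual transformation.

    edge_torques -- dict of edge id -> signed torque about a face axis.
    Returns a move name, or None if no pattern exceeds the threshold.
    """
    # Integrate: net signed torque over the sampled edges.
    net = sum(edge_torques.values())
    if net > threshold:
        return "rotate_face_clockwise"
    if net < -threshold:
        return "rotate_face_counterclockwise"
    return None

def animate(move):
    """Placeholder for driving the display panels; returns a status label."""
    return f"animating: {move}" if move else "idle"
```

In a real device the pattern matcher would consider force direction, contact location, and timing rather than a single scalar sum; the sketch only shows the integrate-recognize-correlate structure.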
- FIG. 9 shows an example of a gesture employed to interact with a virtualized transformative device which comprises sensors 0190 incorporated into its edges.
- the generally cubic form of the device defines three axes of symmetry, each connecting the centers of an opposing pair of cube faces. One of the three axes is X1-X2, connecting points X1 and X2, which are the geometric centers of an opposing pair of device faces.
- the user is applying force to the edges of the device as if to transform an actual transformative device by rotating one group of four virtual cubelets around axis X1-X2 relative to the other group of four virtual cubelets, as shown by arrows A1 and A2.
- This gesture, reflecting an intent to transform the virtualized transformable device, manifests as rotating momenta, shown as A1 and A2, applied at the points of the user’s hand contact with the device, with the resulting forces measured by sensors 0190.
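The rotating momenta about axis X1-X2 described above amount to a standard torque computation: each measured contact force contributes a moment about the axis through the cube centre. A minimal sketch, with sensor positions, force vectors, and the axis direction all assumed for illustration:

```python
# Hypothetical sketch: net torque about the X1-X2 axis from forces
# measured at the edge sensors. Positions and forces are 3-vectors in a
# frame centred on the cube; all values are illustrative assumptions.

def torque_about_axis(axis, contacts):
    """Signed scalar torque about a unit axis through the cube centre.

    axis     -- unit 3-vector along X1-X2.
    contacts -- list of (position, force) pairs, each a 3-tuple.
    Torques of the same sign at opposite edges (arrows A1 and A2)
    reinforce each other, indicating an intended twist about the axis.
    """
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # Sum of (r x F) . axis over all contact points.
    return sum(dot(cross(pos, f), axis) for pos, f in contacts)
```

With the axis along z, opposing grips that push the two cubelet groups in opposite tangential directions both contribute positive torque, so their sum crossing a threshold could trigger the emulated rotation.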
- although the device of FIG. 9 is not actually transformative in a mechanical sense, it is programmed to emulate mechanical transformation, providing visual and, optionally, sensory feedback to the user that depicts such mechanical transformation.
- a virtualized transformative electronic device differs substantially from the actual mechanically transformable input devices employed in transreality puzzles in that it is not transformable, i.e., it is incapable of having its parts repositioned, slanted, or turned.
- the term virtualized in the present context means that the device is operative to receive users’ dynamic input and provide visual and, optionally, sensory feedback without significant relative movement of rotatable or shiftable parts, such as the rotating cubelets in the Rubik’s Cube-type puzzles referenced above, or joysticks.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- External Artificial Organs (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A virtualized transformative user interface device comprises a monolithic, non-transformable body having a plurality of outward-facing surfaces and a plurality of display panels disposed respectively along at least a portion of the plurality of outward-facing surfaces. The device detects whether mechanical forces input by a user are consistent with an intent to move or transform the virtualized transformative user interface device and, in response to a positive determination, emulates transformative movement via changes to the graphical images on the plurality of display panels.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063054272P | 2020-07-21 | 2020-07-21 | |
US63/054,272 | 2020-07-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022020452A1 true WO2022020452A1 (fr) | 2022-01-27 |
Family
ID=79728919
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/042549 WO2022020452A1 (fr) | 2020-07-21 | 2021-07-21 | Dispositif d'interface utilisateur virtualisée |
Country Status (2)
Country | Link |
---|---|
TW (1) | TWI807372B (fr) |
WO (1) | WO2022020452A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120302303A1 (en) * | 2010-11-22 | 2012-11-29 | Gonzalez Rosendo | Display puzzle |
US20140223378A1 (en) * | 2013-02-01 | 2014-08-07 | Akshay Sinha | Graphical user interface (gui) that receives directional input to change face for receiving passcode |
US20180161668A1 (en) * | 2015-04-27 | 2018-06-14 | Shanghai Dianhua Digital Technology Co., Ltd. | Smart puzzle cube having prompting and recording functions |
US20180311566A1 (en) * | 2017-04-28 | 2018-11-01 | Bong Gu SHIN | Smart magic cube and operation method thereof |
US20190358549A1 (en) * | 2016-10-20 | 2019-11-28 | Cubios, Inc | Electronic device with a three-dimensional transformable display |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10747404B2 (en) * | 2017-10-24 | 2020-08-18 | Microchip Technology Incorporated | Touchscreen including tactile feedback structures and corresponding virtual user interface elements |
2021
- 2021-07-21 WO PCT/US2021/042549 patent/WO2022020452A1/fr active Application Filing
- 2021-07-21 TW TW110126817A patent/TWI807372B/zh active
Also Published As
Publication number | Publication date |
---|---|
TWI807372B (zh) | 2023-07-01 |
TW202209054A (zh) | 2022-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3160608B1 (fr) | Commande de jouets physiques employant un moteur physique | |
ES2966949T3 (es) | Método y sistema de interacción háptica | |
US9229540B2 (en) | Deriving input from six degrees of freedom interfaces | |
US20170291104A1 (en) | User Interface Device Provided With Surface Haptic Sensations | |
US9545571B2 (en) | Methods and apparatus for a video game magic system | |
CN105653014B (zh) | 具有触觉减弱防止组件的外围设备 | |
US20160098095A1 (en) | Deriving Input from Six Degrees of Freedom Interfaces | |
Liang et al. | GaussBricks: magnetic building blocks for constructive tangible interactions on portable displays | |
JP6062621B2 (ja) | 物理モデルに基づくジェスチャ認識 | |
EP3364272A1 (fr) | Système de génération haptique localisée automatique | |
WO2016205143A1 (fr) | Gants incluant une rétroaction haptique et destinés à être utilisés avec des systèmes hmd | |
US20160328028A1 (en) | System, method and device for foot-operated motion and movement control in virtual reality and simulated environments | |
JP4982877B2 (ja) | 触覚ディスプレイ装置、多自由度アクチュエータ、及び、ハンドリング装置 | |
US20100261514A1 (en) | Hand-manipulable interface methods and systems | |
US20190278370A1 (en) | Single actuator haptic effects | |
US20080248871A1 (en) | Interface device | |
US20170087455A1 (en) | Filtering controller input mode | |
CN113508355A (zh) | 虚拟现实控制器 | |
WO2022020452A1 (fr) | Dispositif d'interface utilisateur virtualisée | |
US20150084848A1 (en) | Interaction between generic interaction devices and an interactive display | |
US12011658B2 (en) | Single unit deformable controller | |
Spanogianopoulos et al. | Human computer interaction using gestures for mobile devices and serious games: A review | |
US20230338827A1 (en) | Correlating gestures on deformable controller to computer simulation input signals | |
Henriques | SPR1NG Controller: An Interface for Kinesthetic Spatial Manipulation | |
US20240173618A1 (en) | User-customized flat computer simulation controller |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21845841 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.05.2023) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21845841 Country of ref document: EP Kind code of ref document: A1 |