WO2023072406A1 - Changing a layout of a virtual input device - Google Patents


Info

Publication number
WO2023072406A1
WO2023072406A1 (PCT/EP2021/080194)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
input device
layout
user
control device
Prior art date
Application number
PCT/EP2021/080194
Other languages
English (en)
Inventor
Andreas Kristensson
Peter ÖKVIST
Tommy Arngren
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/EP2021/080194 priority Critical patent/WO2023072406A1/fr
Publication of WO2023072406A1 publication Critical patent/WO2023072406A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1662 Details related to the integrated keyboard
    • G06F 1/1673 Arrangements for projecting a virtual keyboard
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F 3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/54 Details of telephonic subscriber devices including functional features of a projector or beamer module assembly

Definitions

  • Embodiments presented herein relate to a control device, a method, a computer program, and a computer program product for changing a layout of a virtual input device in a virtual environment.
  • Virtual input devices such as optical virtual keyboards allow input of characters without the need for physical keys.
  • the human interaction with the virtual keyboard occurs mostly via a touchscreen interface, but can also take place in a different form in virtual or augmented reality.
  • optical virtual input devices are configured to optically detect and analyze human hand and finger motions and interpret them as operations on a physically non-existent input device, such as a surface with painted or projected keys of a virtual keyboard. In that way, optical virtual input devices can emulate unlimited types of manually operated input devices (such as a mouse, keyboard, and other devices).
  • a projection keyboard is a form of optical virtual input device whereby the image of a virtual keyboard is projected onto a surface. When a user touches the surface covered by an image of a key, the input device records the corresponding keystroke.
  • some optical virtual input devices are based on combinations of laser and tactile sensors, where the presence of a finger on the projected area is considered in combination with micro-vibration detection. For example, a projected finger tap detected simultaneously with a tap-rendered vibration is indicative of a keystroke.
  • Mechanical input units can thereby be replaced by such virtual input devices, potentially optimized for a specific application and for the user's physiology, maintaining speed, simplicity and unambiguity of manual data input.
  • some virtual keyboards allow for the keyboard layout to be changed. This can be realized by providing keyboard shortcuts, or so-called “hot keys”, on the virtual keyboard, or by the user touching one or more physical buttons on a physical input device, such as on the physical device projecting the virtual keyboard.
  • One drawback of current virtual keyboards is that they do not enable the user to quickly and intuitively adapt and change the keyboard layout. It could therefore be difficult for the user to change the keyboard layout of a virtual keyboard. Expanding this to virtual input devices in general, there is therefore a need, in a virtual environment, for simplifying the reception of accurate user input for changing the layout of a virtual input device.
  • An object of embodiments herein is to address the above issues and to provide techniques that simplify the reception of accurate user input for changing the keyboard layout of a virtual input device.
  • the object is addressed by providing a control device for changing a layout of a virtual input device in a virtual environment.
  • the input device is associated with at least two different layouts for receiving user input, one layout at a time.
  • the control device comprises processing circuitry.
  • the processing circuitry is configured to cause the control device to obtain information about user input defining a user interaction with the input device in the virtual environment.
  • the processing circuitry is configured to cause the control device to calculate a virtual representation of the user interaction with the input device.
  • the processing circuitry is configured to cause the control device to identify, from the virtual representation, that the user interaction comprises the user interacting with the input device so as to move the input device in the virtual environment according to a predetermined pattern.
  • the processing circuitry is configured to cause the control device to, in response thereto, change the layout of the input device from one of the at least two layouts to another one of the at least two layouts.
  • the object is addressed by providing a method for changing a layout of a virtual input device in a virtual environment.
  • the input device is associated with at least two different layouts for receiving user input, one layout at a time.
  • the method is performed by a control device.
  • the method comprises obtaining information about user input defining a user interaction with the input device in the virtual environment.
  • the method comprises calculating a virtual representation of the user interaction with the input device.
  • the method comprises identifying, from the virtual representation, that the user interaction comprises the user interacting with the input device so as to move the input device in the virtual environment according to a predetermined pattern.
  • the method comprises, in response thereto, changing the layout of the input device from one of the at least two layouts to another one of the at least two layouts.
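The four method steps of the second aspect (obtain, calculate, identify, change) can be sketched as a small control loop. The class below is an illustrative sketch only, not the claimed implementation: the class name, the event format, and the way the grab-and-pivot gesture is recognised are all assumptions made for the example.

```python
class ControlDevice:
    """Minimal sketch of the claimed method: obtain user input, calculate a
    virtual representation, identify a predetermined movement pattern, and
    change the layout in response (all names are illustrative)."""

    def __init__(self, layouts):
        assert len(layouts) >= 2  # at least two layouts, one active at a time
        self.layouts = layouts
        self.active = 0

    def obtain_user_input(self, sensor_events):
        # information about user input defining an interaction with the device
        return list(sensor_events)

    def calculate_representation(self, events):
        # a virtual representation of the interaction (here: just the gestures)
        return [e["gesture"] for e in events]

    def matches_predetermined_pattern(self, representation):
        # e.g. the user grabbing and pivoting the virtual input device
        return "grab" in representation and "pivot" in representation

    def handle_interaction(self, sensor_events):
        events = self.obtain_user_input(sensor_events)
        representation = self.calculate_representation(events)
        if self.matches_predetermined_pattern(representation):
            # change from one of the at least two layouts to another one
            self.active = (self.active + 1) % len(self.layouts)
        return self.layouts[self.active]
```

For example, a device constructed with two layouts keeps its current layout on ordinary taps and switches to the other layout when a grab-and-pivot interaction is detected.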
  • the object is addressed by providing a computer program for changing a layout of a virtual input device in a virtual environment, the computer program comprising computer program code which, when run on a control device, causes the control device to perform a method according to the second aspect.
  • a computer program product comprising a computer program according to the third aspect and a computer readable storage medium on which the computer program is stored.
  • the computer readable storage medium could be a non-transitory computer readable storage medium.
  • the object is addressed by providing a communication device comprising a control device according to the first aspect.
  • a system comprising a control device according to the first aspect and a user interface device.
  • the user interface device comprises a projection module for making the virtual input device visible on a surface.
  • the user interface device comprises a sensor for sensing the user interaction of the user with the virtual input device.
  • these aspects enable the layout of a virtual input device to be intuitively, quickly, accurately, and dynamically changed.
  • these aspects do not require the user to remember any keyboard shortcuts, or so-called “hot keys”, for changing the layout of a virtual input device.
  • these aspects do not require one or more physical buttons on a physical input device for changing the layout of a virtual input device.
  • the predetermined pattern is defined by the virtual representation of the user grabbing and pivoting the virtual input device.
  • changing the layout comprises sequentially cycling through the layouts, one after another.
  • the user interaction with the virtual input device is performed while a software application is run in the virtual environment, and the user interaction provides user input to the software application.
  • the layout which the virtual input device is changed to depends on the software application.
  • the virtual input device is a virtual keyboard comprising virtual keys.
  • a change of the layout of the virtual input device is made by changing from a virtual keyboard with a first layout to a virtual keyboard with a second layout.
  • the first layout and the second layout differ from each other in terms of any of: character set, presence of shortcut keys, presence of functional keys, size of the virtual keys, size of the keyboard.
  • the virtual input device is a virtual communication device comprising virtual keys or a virtual musical instrument comprising virtual keys.
  • a change of the layout of the virtual input device is made by either changing the placement of the virtual keys on the virtual communication device or the virtual musical instrument or by changing from a first virtual communication device or a first virtual musical instrument to a second virtual communication device or a second virtual musical instrument.
  • the virtual representation of the user selectively touching the virtual input device at different parts of the virtual input device results in changing the layout differently depending on at which portion the virtual representation of the user touches the virtual input device.
  • the user interaction with the virtual input device is performed while a software application is run in the virtual environment
  • the processing circuitry is further configured to cause the control device to: identify, according to a predefined criterion, that there is another layout of the virtual input device than the currently used layout that corresponds to the software application, and in response thereto: provide, in the virtual environment, an indication to the user that it is possible to change the layout of the virtual input device and/or the action needed to change the layout of the virtual input device.
  • the virtual environment is an extended reality (XR) virtual environment.
  • the virtual environment is either an augmented reality (AR) virtual environment, a virtual reality (VR) virtual environment, or a mixed reality (MR) virtual environment.
  • the information related to user input is obtained from at least one sensor configured to track hand and finger movement of the user in relation to the virtual input device in the virtual environment.
  • Fig. 1 is a schematic diagram illustrating a system according to embodiments
  • Fig. 2 schematically illustrates a virtual input device according to embodiments
  • Figs. 3 and 4 are flowcharts of methods according to embodiments
  • Fig. 5 is a schematic diagram showing functional units of a control device according to an embodiment
  • Fig. 6 is a schematic diagram showing functional modules of a control device according to an embodiment.
  • Fig. 7 shows one example of a computer program product comprising computer readable storage medium according to an embodiment.
  • the embodiments disclosed herein therefore relate to mechanisms for changing a layout of a virtual input device in a virtual environment.
  • a control device, a method performed by the control device, and a computer program product comprising code, for example in the form of a computer program, that, when run on a control device, causes the control device to perform the method.
  • Fig. 1 is a schematic diagram of a system 100.
  • the system 100 comprises a user interface device 110 and a control device 500.
  • the user interface device 110 and the control device 500 are operatively connected to each other.
  • in some implementations, the control device 500 is part of, or integrated with, the user interface device 110. In some implementations, the control device 500 is part of, or integrated with, a communication device, such as a mobile phone, tablet computer, or the like.
  • the user interface device 110 comprises a projection module 114 for making a virtual input device 120 visible on a surface.
  • the user interface device 110 further comprises a sensor 112 for sensing user interaction of a user with the virtual input device 120 at coordinates along the surface.
  • the sensor 112 could be a radar module, a lidar module, a camera module, or the like.
  • the sensor 112 is an inertial measurement unit (IMU) and is provided on gloves, or another piece of garment, worn by the user.
  • the functionality of the user interface device 110 might be split between at least two physical devices. Further, also combinations of different types of sensors 112 are possible.
  • the user is schematically represented by two hands.
  • Software converts the coordinates to identify actions or characters input by the user 130a, 130b.
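The coordinate-to-character conversion mentioned above can be illustrated with a simple key-grid lookup. This is a minimal sketch under assumed details: the uniform key size, the row-string grid format, and the function name are all illustrative and not taken from the disclosure.

```python
def key_at(x_mm, y_mm, key_size_mm, rows):
    """Return the character of the virtual key at surface coordinates (x_mm, y_mm).

    rows is a list of strings, one per keyboard row, e.g.
    ["qwertyuiop", "asdfghjkl", "zxcvbnm"]. Keys are assumed to be square
    with side key_size_mm (an illustrative simplification).
    """
    row = int(y_mm // key_size_mm)
    col = int(x_mm // key_size_mm)
    if 0 <= row < len(rows) and 0 <= col < len(rows[row]):
        return rows[row][col]
    return None  # fingertip outside the projected keyboard area
```

In a real system the grid would be replaced by per-key bounding regions calibrated to the projected image.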
  • the control device 500 calculates a virtual representation of the user interaction with the virtual input device 120.
  • the virtual representation and the virtual input device 120 define parts of a virtual environment.
  • the virtual environment is an extended reality (XR) virtual environment.
  • the virtual environment is either an augmented reality (AR) virtual environment, a virtual reality (VR) virtual environment, or a mixed reality (MR) virtual environment.
  • although the user interface device 110 in Fig. 1 is illustrated as a stand-alone device, the user interface device 110 could be part of a headset (such as a VR headset), or wearable computer glasses (such as AR glasses, or smart glasses).
  • Fig. 2 schematically illustrates a virtual input device 120 having a layout that is changeable between a first layout 210a and a second layout 210b. It is here noted that the layout may be changeable between two or more layouts.
  • a change between the first layout 210a and the second layout 210b is schematically illustrated at arrow 220.
  • the change between the first layout 210a and a second layout 210b takes place when the control device 500 identifies that a virtual representation 230 of the user touches the virtual input device 120 so as to move the virtual input device 120 in the virtual environment according to a predetermined pattern 240.
  • Fig. 3 is a flowchart illustrating embodiments of methods for changing a layout 210a, 210b of a virtual input device 120 in a virtual environment.
  • the virtual input device 120 is associated with at least two different layouts 210a, 210b for receiving user input, one layout 210a, 210b at a time.
  • the methods are performed by the control device 500.
  • the methods are advantageously provided as computer programs 720.
  • S102 The control device 500 obtains information about user input defining a user interaction with the virtual input device 120 in the virtual environment.
  • S104 The control device 500 calculates a virtual representation 230 of the user interaction with the virtual input device 120.
  • S110 The control device 500 identifies, from the virtual representation 230, that the user interaction comprises the user 130a, 130b interacting with the virtual input device 120 so as to move the virtual input device 120 in the virtual environment according to a predetermined pattern 240.
  • S112 The control device 500, in response thereto, changes the layout 210a, 210b of the virtual input device 120 from a first layout of the at least two layouts 210a, 210b to a second layout of the at least two layouts 210a, 210b.
  • Embodiments relating to further details of changing a layout 210a, 210b of a virtual input device 120 in a virtual environment as performed by the control device 500 will now be disclosed.
  • sensors 112 are configured to track the virtual representation 230 of the user.
  • the information related to user input is obtained from at least one sensor 112 configured to track hand and finger movement of the user 130a, 130b in relation to the virtual input device 120 in the virtual environment.
  • sensors 112 have been disclosed above.
  • the virtual input device 120 in the virtual environment is moved according to a predetermined pattern 240.
  • the predetermined pattern 240 is defined by the virtual representation 230 of the user 130a, 130b grabbing and pivoting the virtual input device 120.
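One way such a grab-and-pivot pattern could be recognised from tracked hand data is to accumulate the rotation of the device while it is grabbed and compare it against a trigger angle. This is a hedged sketch: the sample format, the 45-degree threshold, and the function name are illustrative assumptions, not details from the disclosure.

```python
import math

PIVOT_THRESHOLD_RAD = math.radians(45)  # hypothetical trigger angle

def is_grab_and_pivot(samples):
    """Detect the predetermined pattern from a sequence of tracked samples.

    Each sample is (grabbing: bool, yaw_rad: float). The pattern is
    recognised when the virtual device is grabbed throughout the sequence
    and the accumulated rotation exceeds the threshold.
    """
    grabbed = [s for s in samples if s[0]]
    if len(grabbed) < 2 or len(grabbed) != len(samples):
        return False  # grip released, or too few samples to measure rotation
    rotation = abs(grabbed[-1][1] - grabbed[0][1])
    return rotation >= PIVOT_THRESHOLD_RAD
```

A production detector would also debounce the gesture and distinguish pivoting from ordinary repositioning of the device.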
  • changing the layout 210a, 210b comprises sequentially cycling through the layouts 210a, 210b, one after another. That is, assuming that there are three layouts, the layout would then be changed first from a first layout to a second layout, then from the second layout to a third layout, and then from the third layout back to the first layout.
  • which layout 210a, 210b to change to is dependent on the user application.
  • the user interaction with the virtual input device 120 is performed whilst a software application is run in the virtual environment.
  • the user interaction provides user input to the software application, and the layout 210a, 210b to which the virtual input device 120 is changed depends on the software application.
  • the layout might change such that the virtual input device 120 is changed from a keyboard with a first layout 210a optimized for a first natural language to a keyboard with a second layout 210b optimized for a second language. This could be useful for a document processing application.
  • the layout might change such that the virtual input device 120 is changed from a keyboard with a general-purpose layout to a keyboard with a layout optimized for the computer programming language of the computer programming application.
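The application-dependent choice of layout described in the two examples above could be realised as a simple lookup table that cycles through the layouts registered for the foreground application. The application names and layout identifiers below are hypothetical, chosen only to mirror the examples.

```python
# Hypothetical mapping from the foreground software application to the
# layouts that the virtual keyboard cycles through on a grab-and-pivot.
APP_LAYOUTS = {
    "document_editor": ["english_qwerty", "swedish_qwerty"],
    "programming_ide": ["general_purpose", "programming"],
}

def next_layout(app, current):
    """Return the layout to change to, given the application and current layout."""
    layouts = APP_LAYOUTS.get(app, ["general_purpose"])
    if current not in layouts:
        return layouts[0]  # fall back to the first layout for this application
    return layouts[(layouts.index(current) + 1) % len(layouts)]
```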
  • the layout 210a, 210b is changed.
  • the virtual input device 120 is a virtual keyboard comprising virtual keys. A change of layout 210a, 210b of the virtual input device 120 can then be made by changing from a virtual keyboard with a first layout 210a of the virtual keys to a virtual keyboard with a second layout 210b of the virtual keys.
  • the first layout 210a and the second layout 210b differ from each other.
  • the first layout 210a and the second layout 210b differ from each other in terms of any of: character set, presence of hot keys, presence of functional keys, colour, texture, activation sound, size of the virtual keys, size of the keyboard.
  • the virtual input device 120 is a virtual communication device comprising virtual keys or a virtual musical instrument comprising virtual keys.
  • a change of the layout 210a, 210b of the virtual input device 120 can then be made by either changing the placement of the virtual keys on the virtual communication device or the virtual musical instrument or by changing from a first virtual communication device or a first virtual musical instrument to a second virtual communication device or a second virtual musical instrument.
  • the virtual communication device might be a virtual representation of a smartphone, tablet computer, or the like.
  • a change of the layout 210a, 210b of the virtual communication device can then be made by changing the placement of virtual keys on the virtual representation of the smartphone, or tablet computer.
  • a further change of the layout 210a, 210b of the virtual communication device can then be made by changing from a virtual representation of a smartphone to a virtual representation of a tablet computer.
  • the layout 210a, 210b is changed dependent on which part of the virtual input device 120 the user grabs.
  • the virtual representation 230 of the user 130a, 130b selectively interacting with the virtual input device 120 at different parts of the virtual input device 120 results in changing the layout 210a, 210b differently depending on at which portion the virtual representation 230 of the user 130a, 130b interacts with the virtual input device 120.
  • One example of such an interaction may be the user touching the virtual input device 120.
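The portion-dependent behaviour could, for instance, divide the virtual input device into regions and map each region to a different layout change. The three equal-width regions and the action names below are illustrative assumptions, not details from the disclosure.

```python
# Illustrative: the part of the virtual input device that the user touches
# or grabs selects how the layout changes.
def layout_change_for_touch(x, y, width, height):
    """Return a layout-change action for a touch at (x, y) on a device of
    the given width and height (regions and actions are assumptions)."""
    if x < width / 3:
        return "cycle_character_set"   # e.g. touch the left portion
    if x > 2 * width / 3:
        return "toggle_function_keys"  # e.g. touch the right portion
    return "cycle_size"                # e.g. touch the middle portion
```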
  • an indication is provided to the user that a change of layout 210a, 210b is possible.
  • when the user interaction with the virtual input device 120 is performed while a software application is run in the virtual environment, the control device 500 is configured to perform (optional) steps S106 and S108:
  • S106 The control device 500 identifies, according to a predefined criterion, that there is another layout 210a, 210b of the virtual input device 120 than the currently used layout 210a, 210b that is available for the software application. This other layout may be better suited for using the software application.
  • the predefined criterion is met, for example, when the control device 500 detects that the user types English words on a virtual keyboard having a Swedish layout and that the user has not changed keyboard layouts before. The control device 500 could thereby, based on user input and context, determine the optimal adaptation and configuration of the virtual input device 120.
  • S108 The control device 500, in response thereto, provides, in the virtual environment, an indication to the user 130a, 130b that it is possible to change the layout 210a, 210b of the virtual input device 120 and/or the action needed to change the layout 210a, 210b of the virtual input device 120.
  • Non-limiting examples thereof are movie clips or graphical overlays illustrating how the user is enabled to change the layout 210a, 210b of the virtual input device 120 and the outcome resulting from such a change of the layout 210a, 210b of the virtual input device 120.
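The predefined criterion from the example above (English words typed on a Swedish layout by a user who has never switched layouts) could be sketched as follows. The tiny word lists stand in for a real language detector and, like the function and parameter names, are purely illustrative.

```python
# Placeholder word lists; a real system would use a proper language detector.
ENGLISH_WORDS = {"the", "and", "hello", "keyboard"}
SWEDISH_WORDS = {"och", "hej", "tangentbord", "det"}

def should_suggest_layout_change(typed_words, active_layout, user_has_switched_before):
    """Sketch of the predefined criterion: suggest a layout change when
    recently typed words look English while a Swedish layout is active and
    the user has never changed keyboard layouts before."""
    if active_layout != "swedish" or user_has_switched_before:
        return False
    words = [w.lower() for w in typed_words]
    english = sum(w in ENGLISH_WORDS for w in words)
    swedish = sum(w in SWEDISH_WORDS for w in words)
    return english > swedish and english >= 2
```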
  • Fig. 4 is a flowchart of a method for changing a layout 210a, 210b of a virtual input device 120 in a virtual environment as performed by a control device 500 according to at least some of the above disclosed embodiments, aspects, and examples.
  • S201 The control device 500 detects that the user intends to start a session with the virtual input device 120.
  • S202 The control device 500 instructs the user interface device 110 to make the virtual input device 120 visible.
  • the control device 500 obtains information from the sensor 112 that enables the control device 500 to track movement of the hands and fingers of the user.
  • the control device 500 thereby obtains information about user input defining a user interaction with the virtual input device 120 in the virtual environment and calculates a virtual representation 230 of the user interaction with the virtual input device 120.
  • the control device 500 identifies, from the virtual representation 230, that the user interaction comprises the user 130a, 130b interacting with the virtual input device 120 so as to move the virtual input device 120 in the virtual environment according to the predetermined pattern 240, for example by the virtual representation 230 of the user 130a, 130b grabbing and pivoting the virtual input device 120.
  • the control device 500 changes the layout 210a, 210b of the virtual input device 120 from one of the at least two layouts 210a, 210b to another one of the at least two layouts 210a, 210b.
  • S206 The control device 500 detects that the user no longer interacts with the virtual input device 120 and determines to end the session started in S201.
  • the control device 500 instructs the user interface device 110 to stop making the virtual input device 120 visible.
  • Fig. 5 schematically illustrates, in terms of a number of functional units, the components of a control device 500 according to an embodiment.
  • Processing circuitry 510 is provided using any combination of one or more of a suitable central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), etc., capable of executing software instructions stored in a computer program product 710 (as in Fig. 7), e.g. in the form of a storage medium 530.
  • the processing circuitry 510 may further be provided as at least one application specific integrated circuit (ASIC), or field programmable gate array (FPGA).
  • the processing circuitry 510 is configured to cause the control device 500 to perform a set of operations, or steps, as disclosed above.
  • the storage medium 530 may store the set of operations
  • the processing circuitry 510 may be configured to retrieve the set of operations from the storage medium 530 to cause the control device 500 to perform the set of operations.
  • the set of operations may be provided as a set of executable instructions.
  • the processing circuitry 510 is thereby arranged to execute methods as herein disclosed.
  • the storage medium 530 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
  • the control device 500 may further comprise a communications interface 520 at least configured for communications with other entities, functions, nodes, and devices. As such the communications interface 520 may comprise one or more transmitters and receivers, comprising analogue and digital components.
  • the processing circuitry 510 controls the general operation of the control device 500 e.g. by sending data and control signals to the communications interface 520 and the storage medium 530, by receiving data and reports from the communications interface 520, and by retrieving data and instructions from the storage medium 530.
  • Other components, as well as the related functionality, of the control device 500 are omitted in order not to obscure the concepts presented herein.
  • Fig. 6 schematically illustrates, in terms of a number of functional modules, the components of a control device 500 according to an embodiment.
  • the control device 500 of Fig. 6 comprises a number of functional modules: an obtain module 610 configured to perform step S102, a calculate module 620 configured to perform step S104, an identify module 650 configured to perform step S110, and a change module 660 configured to perform step S112.
  • the control device 500 of Fig. 6 may further comprise a number of optional functional modules, such as any of an identify module 630 configured to perform step S106, and a provide module 640 configured to perform step S108.
  • each functional module 610-660 may in one embodiment be implemented only in hardware and in another embodiment with the help of software, i.e., the latter embodiment having computer program instructions stored on the storage medium 530 which, when run on the processing circuitry, makes the control device 500 perform the corresponding steps mentioned above in conjunction with Fig. 6. It should also be mentioned that even though the modules correspond to parts of a computer program, they do not need to be separate modules therein, but the way in which they are implemented in software is dependent on the programming language used.
  • one or more or all functional modules 610:660 may be implemented by the processing circuitry 510, possibly in cooperation with the communications interface 520 and/or the storage medium 530.
  • the processing circuitry 510 may thus be configured to fetch instructions from the storage medium 530 as provided by a functional module 610:660 and to execute these instructions, thereby performing any steps as disclosed herein.
  • a first portion of the instructions performed by the control device 500 may be executed in a first device, and a second portion of the instructions performed by the control device 500 may be executed in a second device; the herein disclosed embodiments are not limited to any particular number of devices on which the instructions performed by the control device 500 may be executed.
  • the methods according to the herein disclosed embodiments are suitable to be performed by a control device 500 residing in a cloud computational environment. Therefore, although a single processing circuitry 510 is illustrated in Fig. 5, the processing circuitry 510 may be distributed among a plurality of devices, or nodes. The same applies to the functional modules 610:660 of Fig. 6 and the computer program 720 of Fig. 7.
  • Fig. 7 shows one example of a computer program product 710 comprising a computer readable storage medium 730.
  • on this computer readable storage medium 730, a computer program 720 can be stored, which computer program 720 can cause the processing circuitry 510 and thereto operatively coupled entities and devices, such as the communications interface 520 and the storage medium 530, to execute methods according to embodiments described herein.
  • the computer program 720 and/or computer program product 710 may thus provide means for performing any steps as herein disclosed.
  • the computer program product 710 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc.
  • the computer program product 710 could also be embodied as a memory, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory.
  • while the computer program 720 is here schematically shown as a track on the depicted optical disc, the computer program 720 can be stored in any way which is suitable for the computer program product 710.
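The module chain described above — obtain (step S102), calculate (step S104), identify (step S110), change (step S112) — can be sketched as follows. This is a minimal illustration only: the class, method, and layout names are assumptions and do not appear in the application.

```python
class ControlDevice:
    """Sketch of control device 500; each method stands in for one functional module."""

    def __init__(self, layouts):
        # The input device is associated with at least two layouts,
        # one active at a time (the first is active initially).
        self.layouts = layouts
        self.active = 0

    def obtain(self, sensor_events):
        # obtain module 610, step S102: obtain information about the
        # user interaction with the input device in the virtual environment.
        return list(sensor_events)

    def calculate(self, events):
        # calculate module 620, step S104: compute a virtual representation
        # of the interaction — here simply the sequence of positions.
        return [event["position"] for event in events]

    def identify(self, trajectory, pattern):
        # identify module 650, step S110: check whether the interaction
        # moves the input device according to a predetermined pattern.
        return trajectory[-len(pattern):] == pattern

    def change(self):
        # change module 660, step S112: switch from the current layout
        # to the next of the at least two layouts, and return it.
        self.active = (self.active + 1) % len(self.layouts)
        return self.layouts[self.active]
```

A caller would feed obtained events through `calculate`, and invoke `change` only when `identify` reports that the predetermined pattern was matched.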

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a mechanism for changing a layout of a virtual input device in a virtual environment. The input device is associated with at least two different layouts for receiving user input, one layout at a time. A method is performed by a control device. The method comprises obtaining information about user input defining user interaction with the input device in the virtual environment. The method comprises calculating a virtual representation of the user interaction with the input device. The method comprises identifying, from the virtual representation, that the user interaction comprises the user interacting with the input device so as to move the input device in the virtual environment according to a predetermined pattern. The method comprises, in response thereto, changing the layout of the input device from a first layout of the at least two layouts to a second layout of the at least two layouts.
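The claimed flow — identify that the user moves the input device along a predetermined pattern, then switch between the two layouts — can be illustrated with a short sketch. The "shake" heuristic, thresholds, and function names below are assumptions for illustration only and are not taken from the application.

```python
def detect_shake(xs, min_reversals=3, min_travel=0.05):
    """Return True if the x-trajectory reverses direction at least
    min_reversals times with sufficient travel per segment — a simple
    stand-in for identifying a 'predetermined pattern' of movement."""
    reversals, prev_direction = 0, 0
    for a, b in zip(xs, xs[1:]):
        delta = b - a
        if abs(delta) < min_travel:
            continue  # ignore jitter below the travel threshold
        direction = 1 if delta > 0 else -1
        if prev_direction and direction != prev_direction:
            reversals += 1
        prev_direction = direction
    return reversals >= min_reversals

def maybe_change_layout(xs, current, layouts):
    """Toggle between the first and second layout when the pattern is identified."""
    if detect_shake(xs):
        return layouts[1] if current == layouts[0] else layouts[0]
    return current
```

Here the virtual representation of the interaction is reduced to a list of x-positions; a real system would track the full pose of the hands and the virtual input device.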
PCT/EP2021/080194 2021-10-29 2021-10-29 Changing a layout of a virtual input device WO2023072406A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/080194 WO2023072406A1 (fr) 2021-10-29 2021-10-29 Changing a layout of a virtual input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/080194 WO2023072406A1 (fr) 2021-10-29 2021-10-29 Changing a layout of a virtual input device

Publications (1)

Publication Number Publication Date
WO2023072406A1 true WO2023072406A1 (fr) 2023-05-04

Family

ID=78536181

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/080194 WO2023072406A1 (fr) 2021-10-29 2021-10-29 Changing a layout of a virtual input device

Country Status (1)

Country Link
WO (1) WO2023072406A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030174125A1 (en) * 1999-11-04 2003-09-18 Ilhami Torunoglu Multiple input modes in overlapping physical space
US20140028567A1 (en) * 2011-04-19 2014-01-30 Lg Electronics Inc. Display device and control method thereof
EP2733593A2 (fr) * 2012-11-14 2014-05-21 Samsung Electronics Co., Ltd Procédé et dispositif électronique pour fournir un clavier virtuel
EP3046002A1 (fr) * 2015-01-16 2016-07-20 Samsung Electronics Co., Ltd. Dispositif d'entrée virtuel et procédé permettant de recevoir une entrée utilisateur à l'aide de celui-ci


Similar Documents

Publication Publication Date Title
US10409490B2 (en) Assisting input from a keyboard
US9146672B2 (en) Multidirectional swipe key for virtual keyboard
US10013143B2 (en) Interfacing with a computing application using a multi-digit sensor
KR101872426B1 (ko) Depth-based user interface gesture control
KR101847754B1 (ko) Apparatus and method for proximity-based input
US8432301B2 (en) Gesture-enabled keyboard and associated apparatus and computer-readable storage medium
US8816964B2 (en) Sensor-augmented, gesture-enabled keyboard and associated apparatus and computer-readable storage medium
US20140306897A1 (en) Virtual keyboard swipe gestures for cursor movement
TWI463355B (zh) Signal processing apparatus and signal processing method for a multi-touch interface, and user interface image selection method
EP3000016B1 (fr) Entrée utilisateur par entrée en survol
US20130106707A1 (en) Method and device for gesture determination
US9128609B2 (en) Touch interpretive architecture and touch interpretive method by using multi-fingers gesture to trigger application program
WO2014118602A1 (fr) Emulating pressure sensitivity on multi-touch devices
US20120050032A1 (en) Tracking multiple contacts on an electronic device
WO2023072406A1 (fr) Changing a layout of a virtual input device
US10474409B2 (en) Response control method and electronic device
WO2023104286A1 (fr) Rendering virtual keyboards in virtual environments
WO2010100503A2 (fr) User interface for a touch-surface electronic device
US9547515B2 (en) Convert a gesture
JP2023532794A (ja) Visual feedback from user equipment
CN108804007A (zh) Image acquisition method and apparatus, storage medium, and electronic device
US11416140B2 (en) Touchscreen devices to transmit input selectively
CN118159929A (zh) Ergonomics-based reconfiguration of a virtual input device
EP2816457A1 (fr) Interface display method and terminal device
US20150138102A1 (en) Inputting mode switching method and system utilizing the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21805428

Country of ref document: EP

Kind code of ref document: A1