US20220043264A1 - Method for switching input devices, head-mounted display and computer readable storage medium


Info

Publication number
US20220043264A1
US20220043264A1 (application US16/984,164)
Authority
US
United States
Prior art keywords
input device
specific
physical input
distance
threshold
Prior art date
Legal status
Granted
Application number
US16/984,164
Other versions
US11249314B1 (en)
Inventor
Sheng-Cherng Lin
Chien-Hsin Liu
Shih-Lung Lin
Current Assignee
HTC Corp
Original Assignee
HTC Corp
Priority date
Filing date
Publication date
Application filed by HTC Corp
Priority to US16/984,164 (granted as US11249314B1)
Assigned to HTC Corporation. Assignors: LIN, SHIH-LUNG; LIU, CHIEN-HSIN; LIN, SHENG-CHERNG
Priority to TW109132512A (TWI761960B)
Priority to CN202011046873.8A (CN114089827B)
Publication of US20220043264A1
Application granted
Publication of US11249314B1
Legal status: Active

Classifications

    • G02B27/017 Head-up displays; head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0185 Displaying image at variable distance
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/023 Arrangements for converting discrete items of information into a coded form
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations
    • G06K9/00355
    • G06T19/006 Mixed reality
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06V40/107 Static hand or arm
    • G06V40/11 Hand-related biometrics; Hand pose recognition

Definitions

  • In step S210, the processor 106 may control the display 104 to provide a visual content 310.
  • The visual content 310 may be an MR content that includes virtual objects (e.g., the objects VO1 and VO2) and real-world objects (e.g., the first physical input device D1 (e.g., a keyboard), the second physical input device D2 (e.g., a controller), and the hand H1 of the wearer of the HMD 100). That is, the first physical input device D1, the second physical input device D2, and the hand H1 of the wearer of the HMD 100 are visible in the visual content 310.
  • The wearer of the HMD 100 may use the first physical input device D1 to perform input operations (e.g., typing words) to the HMD 100 when the first physical input device D1 is enabled, and may not use the first physical input device D1 to perform input operations to the HMD 100 when it is disabled.
  • Likewise, the wearer of the HMD 100 may use the second physical input device D2 to perform input operations to the HMD 100 when the second physical input device D2 is enabled, and may not do so when it is disabled.
  • The wearer of the HMD 100 may use the first physical input device D1 and the second physical input device D2 to perform input operations to the HMD 100 when both are enabled.
  • The wearer of the HMD 100 may use the hand H1 to perform input operations to the HMD 100 via hand gestures when the first physical input device D1 and the second physical input device D2 are both disabled, but the disclosure is not limited thereto.
  • In step S220, the processor 106 may determine whether a first specific distance between a specific object and the first physical input device D1 is smaller than a first threshold. In the embodiments of the disclosure, the processor 106 may use the mechanisms discussed below to perform step S220.
  • The specific object may be the hand H1 in FIG. 3A, but the disclosure is not limited thereto.
  • The processor 106 may obtain a specific 3D object (e.g., a hand-shaped 3D object) corresponding to the specific object (i.e., the hand H1) and a first 3D object (e.g., a keyboard-shaped 3D object) corresponding to the first physical input device D1 via an inside-out tracking mechanism. The details of the inside-out tracking mechanism may be found in related conventional documents and are not repeated herein.
  • The processor 106 may define a first distance between the specific 3D object and the first 3D object as the first specific distance and determine whether the first distance is smaller than the first threshold. The first threshold may be designed to be any distance value based on the requirements of the designer, such as 3 cm, 5 cm, or any other required value.
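  • As an illustration only (not part of the claimed disclosure), the 3D-distance determination of step S220 can be sketched as follows, assuming the inside-out tracking mechanism reports positions as (x, y, z) coordinate tuples in meters; all function and variable names are hypothetical:

```python
import math

def specific_distance(object_pos, device_pos):
    """Euclidean distance between two tracked 3D positions (x, y, z)."""
    return math.dist(object_pos, device_pos)

def is_within_threshold(object_pos, device_pos, threshold_m=0.05):
    """Step S220 sketch: is the specific object (e.g., the hand) closer
    to the first physical input device than the first threshold
    (here, an assumed 5 cm)?"""
    return specific_distance(object_pos, device_pos) < threshold_m

# Example pose: hand 3 cm above the keyboard-shaped 3D object.
hand = (0.10, 0.20, 0.30)
keyboard = (0.10, 0.20, 0.33)
close = is_within_threshold(hand, keyboard)  # True: 3 cm < 5 cm
```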
  • In some embodiments, the first physical input device D1 may be disposed with a first proximity sensor connected with the HMD 100. The first proximity sensor may be a built-in element of the first physical input device D1 or an external dongle connected to the first physical input device D1, but the disclosure is not limited thereto.
  • The processor 106 may use the first proximity sensor to detect a second distance between the first proximity sensor and an approaching object (e.g., the hand H1). Next, the processor 106 may define the second distance as the first specific distance between the specific object and the first physical input device D1 and determine whether the second distance is smaller than the first threshold.
  • In other embodiments, the HMD 100 may be connected to a second proximity sensor, wherein the second proximity sensor may be worn on the hand H1 (i.e., the specific object), but the disclosure is not limited thereto.
  • The processor 106 may perform an image recognition (e.g., Google Lens™) on the visual content 310 to recognize a specific 2D object (e.g., a hand-shaped 2D object) corresponding to the specific object and a first 2D object (e.g., a keyboard-shaped 2D object) corresponding to the first physical input device D1 in the visual content 310.
  • The processor 106 may determine whether a third distance between the specific 2D object and the first 2D object is smaller than a specific threshold (which may be arbitrarily chosen by the designer). If the third distance is determined to be smaller than the specific threshold, it represents that the specific object may be close to the first physical input device D1.
  • In this case, the processor 106 may further use the second proximity sensor to detect a fourth distance between the second proximity sensor and an approaching object (e.g., the first physical input device D1). Afterwards, the processor 106 may define the fourth distance as the first specific distance between the specific object and the first physical input device D1 and determine whether the fourth distance is smaller than the first threshold.
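  • As an illustration only, the two-stage variant above (a coarse 2D image-space pre-check gating a reading from the worn proximity sensor) can be sketched as follows; the names, the pixel threshold, and the millimeter units are assumptions, not from the disclosure:

```python
def first_specific_distance(coarse_2d_distance_px, read_proximity_mm,
                            coarse_threshold_px=80):
    """Two-stage check sketch.

    coarse_2d_distance_px: pixel distance between the recognized
        hand-shaped and keyboard-shaped 2D objects (the third distance).
    read_proximity_mm: callable returning the worn sensor's distance in
        millimeters to the approaching object (the fourth distance).
    Returns the distance to compare against the first threshold, or
    None when the 2D pre-check says the hand is not near the device.
    """
    if coarse_2d_distance_px >= coarse_threshold_px:
        return None  # hand not near the device in the image; skip sensor
    return read_proximity_mm()  # fine-grained fourth distance

# Usage: enable D1 only when the returned distance beats the threshold.
d = first_specific_distance(40, lambda: 22.0)
enable_d1 = d is not None and d < 50.0  # assumed first threshold, 5 cm
```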
  • In response to determining that the first specific distance is smaller than the first threshold, the processor 106 may perform step S230 to enable the first physical input device D1. In this case, the wearer of the HMD 100 may use the first physical input device D1 to perform input operations to the HMD 100.
  • In some embodiments, the processor 106 may further disable a hand gesture recognition while the first physical input device D1 is enabled. Therefore, the wearer of the HMD 100 may not be allowed to perform input operations to the HMD 100 with his/her hands (e.g., the hand H1), but the disclosure is not limited thereto.
  • On the other hand, in response to determining that the first specific distance is not smaller than the first threshold, the processor 106 may perform step S240 to disable the first physical input device D1. In this case, the wearer of the HMD 100 may not use the first physical input device D1 to perform input operations to the HMD 100.
  • In brief, the processor 106 may determine whether the specific object (e.g., the hand H1) is close enough to the first physical input device D1. In response to determining that the specific object is close enough (e.g., the first specific distance being smaller than the first threshold), the processor 106 may correspondingly enable the first physical input device D1 for the wearer of the HMD 100 to perform input operations, as shown in FIG. 3A. Otherwise, the processor 106 may correspondingly disable the first physical input device D1, such that the wearer of the HMD 100 cannot use the first physical input device D1 to perform input operations, as shown in FIG. 3B, but the disclosure is not limited thereto.
  • Similarly, the processor 106 may determine whether to enable or disable the second physical input device D2 based on how close the specific object (e.g., the hand H1) is to the second physical input device D2.
  • Specifically, the processor 106 may be configured to determine whether a second specific distance between the specific object and the second physical input device D2 is smaller than a second threshold. In response to determining that the second specific distance is smaller than the second threshold, the processor 106 may correspondingly enable the second physical input device D2 for the wearer of the HMD 100 to perform input operations, as shown in FIG. 3C. On the other hand, in response to determining that the second specific distance is not smaller than the second threshold, the processor 106 may correspondingly disable the second physical input device D2, such that the wearer cannot use the second physical input device D2 to perform input operations, as shown in FIG. 3B.
  • For the related details, reference may be made to the teachings in the above embodiments, which are not repeated herein.
  • When the first physical input device D1 and the second physical input device D2 are both disabled, the processor 106 may enable the hand gesture recognition, such that the processor 106 may perform the hand gesture recognition on the specific object (e.g., the hand H1). Therefore, the wearer of the HMD 100 would be allowed to perform input operations to the HMD 100 by performing various hand gestures, but the disclosure is not limited thereto.
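  • As an illustration only, the overall switching behavior described above (each physical input device enabled while the hand is within its threshold, with hand gesture recognition as the fallback when both devices are disabled) can be sketched as follows; the function name and threshold values are assumptions:

```python
def switch_inputs(d1_distance, d2_distance,
                  d1_threshold=0.05, d2_threshold=0.05):
    """Sketch of the switching logic (all names hypothetical).

    d1_distance / d2_distance: first and second specific distances, in
        meters, from the specific object (the hand) to devices D1 and D2.
    Returns (d1_enabled, d2_enabled, gesture_recognition_enabled).
    """
    d1_enabled = d1_distance < d1_threshold    # steps S230/S240 for D1
    d2_enabled = d2_distance < d2_threshold    # analogous check for D2
    gestures = not (d1_enabled or d2_enabled)  # fallback input method
    return d1_enabled, d2_enabled, gestures

# Hand resting near the keyboard (3 cm) and far from the controller:
# D1 enabled, D2 disabled, gesture recognition off.
state = switch_inputs(0.03, 0.40)
```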
  • The disclosure further provides a computer readable storage medium for executing the method for switching input devices. The computer readable storage medium is composed of a plurality of program instructions (for example, a setting program instruction and a deployment program instruction) embodied therein. These program instructions can be loaded into the HMD 100 and executed by the same to execute the method for switching input devices and the functions of the HMD 100 described above.
  • In summary, with the method provided in the disclosure, whether the specific object is close enough to a physical input device paired with the HMD may be determined. In response to determining that the specific object is close enough to the physical input device, the HMD may correspondingly enable the physical input device for the wearer of the HMD to perform input operations through it. Otherwise, the HMD may correspondingly disable the physical input device, such that the wearer of the HMD cannot use it to perform input operations. Accordingly, the disclosure provides a novel, convenient, and intuitive way for the wearer of the HMD to switch input devices.


Abstract

The disclosure provides a method for switching input devices, a head-mounted display (HMD) and a computer readable storage medium. The method includes: providing a visual content, wherein a first physical input device is visible in the visual content; determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold; enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold; disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present disclosure generally relates to mechanisms for controlling input devices, in particular, to a method for switching input devices, a head-mounted display (HMD) and a computer readable storage medium.
  • 2. Description of Related Art
  • Along with the development of mixed reality (MR) technologies, current HMDs are capable of mixed reality access. For optical see-through or video pass-through reality experiences, hand gestures are the main input method because most virtual contents are an extension of real objects in space.
  • To interact with these contents, it makes sense for users to come into proximity of these real-world objects and reach out with their hands as a means of control. In a fully virtual immersive state, hand gestures are limited to simple operations that respect the physics of the real world; distant operations such as browsing a movie dashboard, for example, are still better performed with additional physical controllers. A complex gesture language can be designed for such scenarios, but it requires substantial machine learning and imposes a steep learning curve on users. Even with the entire language learned, the tactile feedback and control accuracy of real controllers cannot be replicated.
  • Currently, the switch between hand tracking and virtual six-degree-of-freedom (6DoF) controllers happens at the start of a virtual app. Users can manually switch in the menu layer if hand gestures are preferred. However, as MR contents mature, switching between hand gestures, virtual controllers, and even 2D virtual input controllers (such as a keyboard and a mouse) will become more frequent and will be expected to be accessible at the same time.
  • Therefore, it is crucial to design an intuitive way to switch between these different input devices.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present disclosure is directed to a method for switching input devices, an HMD and a computer readable storage medium.
  • The disclosure provides a method for switching input devices, adapted to a head-mounted display (HMD). The method includes: providing a visual content, wherein a first physical input device is visible in the visual content; determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold; enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold; disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
  • The disclosure provides a computer readable storage medium, recording an executable computer program to be loaded by a head-mounted display (HMD) to execute steps of: providing a visual content, wherein a first physical input device is visible in the visual content; determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold; enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold; disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
  • The disclosure provides a head-mounted display (HMD) including a display, a storage circuit, and a processor. The storage circuit stores a plurality of modules. The processor is coupled to the display and the storage circuit and accesses the modules to perform following steps: controlling the display to provide a visual content, wherein a first physical input device is visible in the visual content; determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold; enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold; disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
  • FIG. 1 shows a schematic diagram of an HMD according to an embodiment of the disclosure.
  • FIG. 2 shows a flow chart of the method for switching input devices according to an embodiment of the disclosure.
  • FIG. 3A to 3C are application scenarios according to a first embodiment of the disclosure.
  • DESCRIPTION OF THE EMBODIMENTS
  • Reference will now be made in detail to the present preferred embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
  • See FIG. 1, which shows a schematic diagram of an HMD according to an embodiment of the disclosure. In FIG. 1, the HMD 100 includes a storage circuit 102, a display 104, and a processor 106.
  • The storage circuit 102 is one or a combination of a stationary or mobile random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk, or any other similar device, and records a plurality of modules that can be executed by the processor 106.
  • In the embodiments of the disclosure, the display 104 may be an optical see-through display. That is, the visual content shown in the display 104 may include virtual contents and physical contents, wherein the virtual contents may include virtual reality contents, and the physical contents may include real world objects in front of the HMD 100, but the disclosure is not limited thereto.
  • The processor 106 may be coupled with the storage circuit 102 and the display 104, and the processor 106 may be, for example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, an ARM-based processor, and the like.
  • In the embodiments of the disclosure, the processor 106 may access the modules stored in the storage circuit 102 to implement the method for switching input devices provided in the disclosure, which would be further discussed in the following.
  • See FIG. 2, which shows a flow chart of the method for switching input devices according to an embodiment of the disclosure. The method of this embodiment may be executed by the HMD 100 in FIG. 1, and the details of each step in FIG. 2 will be described below with the components shown in FIG. 1. In addition, for a better understanding of the concept of the disclosure, FIG. 3A to FIG. 3C will be used as examples, wherein FIG. 3A to FIG. 3C are application scenarios according to a first embodiment of the disclosure.
  • In FIG. 3A, the processor 106 may perform step S210 to control the display 104 to provide a visual content 310. In one embodiment, the visual content 310 may be a mixed reality (MR) content that includes virtual objects (e.g., objects VO1 and VO2) and real world objects (e.g., the first physical input device D1, the second physical input device D2, and the hand H1 of the wearer of the HMD 100). That is, the first physical input device D1, the second physical input device D2, and the hand H1 of the wearer of the HMD 100 are visible in the visual content 310.
  • In the embodiments of the disclosure, the first physical input device D1 (e.g., a keyboard) and the second physical input device D2 (e.g., a controller) may be paired with the HMD 100.
  • In one embodiment, the wearer of the HMD 100 may use the first physical input device D1 to perform input operations (e.g., typing words) to the HMD 100 when the first physical input device D1 is enabled, and the wearer of the HMD 100 may not use the first physical input device D1 to perform input operations to the HMD 100 when the first physical input device D1 is disabled. Similarly, the wearer of the HMD 100 may use the second physical input device D2 to perform input operations to the HMD 100 when the second physical input device D2 is enabled, and the wearer of the HMD 100 may not use the second physical input device D2 to perform input operations to the HMD 100 when the second physical input device D2 is disabled.
  • In one embodiment, the wearer of the HMD 100 may use the first physical input device D1 and the second physical input device D2 to perform input operations to the HMD 100 when the first physical input device D1 and the second physical input device D2 are both enabled. On the other hand, the wearer of the HMD 100 may use the hand H1 to perform input operations to the HMD 100 via hand gestures when the first physical input device D1 and the second physical input device D2 are both disabled, but the disclosure is not limited thereto.
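The enable/disable semantics described above can be sketched as a small input-routing check (a minimal illustration under assumed device names, not language from the disclosure):

```python
def route_input(source, enabled_devices, gesture_enabled):
    """Accept an input event only when its source is currently enabled.

    `source` is either the name of a paired physical input device or
    the string "hand_gesture"; the names here are illustrative only.
    """
    if source == "hand_gesture":
        # Gesture input is honored only while gesture recognition is on.
        return gesture_enabled
    # A paired physical input device must be explicitly enabled.
    return enabled_devices.get(source, False)

# D1 (keyboard) enabled, D2 (controller) disabled, gestures off:
print(route_input("keyboard", {"keyboard": True, "controller": False}, False))  # True
```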
  • Next, in step S220, the processor 106 may determine whether a first specific distance between a specific object and the first physical input device D1 is smaller than a first threshold.
  • In various embodiments, the processor 106 may use the mechanisms discussed in the following to perform the step S220.
  • Specifically, in a first embodiment of the disclosure, the specific object may be the hand H1 in FIG. 3A, but the disclosure is not limited thereto. In this case, the processor 106 may obtain a specific 3D object (e.g., a hand-shaped 3D object) corresponding to the specific object (i.e., the hand H1) and a first 3D object (e.g., a keyboard-shaped 3D object) corresponding to the first physical input device D1 via an inside-out tracking mechanism. In various embodiments, the details of the inside-out tracking mechanism may be found in related conventional documents and are not repeated herein. Next, the processor 106 may define a first distance between the specific 3D object and the first 3D object as the first specific distance and determine whether the first distance is smaller than the first threshold.
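As a sketch of this first embodiment, the comparison in step S220 reduces to a Euclidean distance between the two tracked 3D positions (the coordinates and the 5 cm threshold below are assumed values for illustration):

```python
import math

def hand_near_device(hand_pos, device_pos, first_threshold_cm=5.0):
    """Step S220, first embodiment: compare the distance between the
    hand-shaped 3D object and the keyboard-shaped 3D object against
    the first threshold."""
    first_distance = math.dist(hand_pos, device_pos)  # Euclidean distance
    return first_distance < first_threshold_cm

# Hand hovering 3 cm above the keyboard -> enable D1 (step S230).
print(hand_near_device((10.0, 3.0, 20.0), (10.0, 0.0, 20.0)))  # True
# Hand roughly 30 cm away -> disable D1 (step S240).
print(hand_near_device((40.0, 3.0, 20.0), (10.0, 0.0, 20.0)))  # False
```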
  • In various embodiments, the first threshold may be designed to be any distance value based on the requirements of the designer, such as 3 cm, 5 cm or any required values.
  • In a second embodiment of the disclosure, the first physical input device D1 may be disposed with a first proximity sensor connected with the HMD 100. In some embodiments, the first proximity sensor may be a built-in element of the first physical input device D1 or an external dongle connected to the first physical input device D1, but the disclosure is not limited thereto.
  • In the second embodiment, the processor 106 may use the first proximity sensor to detect a second distance between the first proximity sensor and an approaching object (e.g., the hand H1). Next, the processor 106 may define the second distance as the first specific distance between the specific object and the first physical input device D1 and determine whether the second distance is smaller than the first threshold.
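A sketch of this second embodiment, in which the sensor reading itself is taken as the first specific distance (the sensor callback stands in for a real driver and is an assumption for illustration):

```python
def check_first_device(read_proximity_cm, first_threshold_cm=5.0):
    """Second embodiment: the second distance reported by the first
    proximity sensor is defined as the first specific distance and
    compared against the first threshold."""
    second_distance = read_proximity_cm()
    return second_distance < first_threshold_cm

# A fake sensor reporting an approaching hand at 2 cm -> enable D1.
print(check_first_device(lambda: 2.0))   # True
# The same sensor reporting 12 cm -> disable D1.
print(check_first_device(lambda: 12.0))  # False
```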
  • In a third embodiment, the HMD 100 may be connected to a second proximity sensor, wherein the second proximity sensor may be worn on the hand H1 (e.g., the specific object), but the disclosure is not limited thereto.
  • In this case, the processor 106 may perform an image recognition (e.g., via Google Lens™) on the visual content 310 to recognize a specific 2D object (e.g., a hand-shaped 2D object) corresponding to the specific object and a first 2D object (e.g., a keyboard-shaped 2D object) corresponding to the first physical input device D1 in the visual content 310. Next, the processor 106 may determine whether a third distance between the specific 2D object and the first 2D object is smaller than a specific threshold (which may be arbitrarily chosen by the designer). If the third distance between the specific 2D object and the first 2D object is determined to be smaller than the specific threshold, this indicates that the specific object may be close to the first physical input device D1. Therefore, the processor 106 may further use the second proximity sensor to detect a fourth distance between the second proximity sensor and an approaching object (e.g., the first physical input device D1). Afterwards, the processor 106 may define the fourth distance as the first specific distance between the specific object and the first physical input device D1 and determine whether the fourth distance is smaller than the first threshold.
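This third embodiment is a two-stage check: a coarse 2D image-space gate followed by a reading from the worn proximity sensor. A sketch (the pixel gate value and the sensor callback are assumptions for illustration):

```python
def first_specific_distance(third_distance_px, read_worn_sensor_cm,
                            specific_threshold_px=80.0):
    """Third embodiment: only when the 2D distance between the
    recognized hand and keyboard objects falls below the specific
    threshold is the worn proximity sensor consulted; its reading
    (the fourth distance) becomes the first specific distance."""
    if third_distance_px < specific_threshold_px:
        return read_worn_sensor_cm()  # fourth distance, in cm
    return None  # hand not close to the device in image space

# 2D objects 50 px apart -> consult the worn sensor (reads 3 cm).
print(first_specific_distance(50.0, lambda: 3.0))   # 3.0
# 2D objects 300 px apart -> the sensor is not consulted at all.
print(first_specific_distance(300.0, lambda: 3.0))  # None
```

Gating the sensor read on the 2D check avoids reacting to the worn sensor when the hand is approaching something other than the first physical input device.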
  • In one embodiment, in response to determining that the first specific distance is smaller than the first threshold, the processor 106 may perform step S230 to enable the first physical input device D1. In this case, the wearer of the HMD 100 may use the first physical input device D1 to perform input operations to the HMD 100.
  • In some embodiments, after enabling the first physical input device D1, the processor 106 may further disable a hand gesture recognition. Therefore, the wearer of the HMD 100 may not be allowed to perform input operations to the HMD 100 with his/her hands (e.g., the hand H1), but the disclosure is not limited thereto.
  • On the other hand, in response to determining that the first specific distance is not smaller than the first threshold, the processor 106 may perform step S240 to disable the first physical input device D1. In this case, the wearer of the HMD 100 may not use the first physical input device D1 to perform input operations to the HMD 100.
  • In brief, the processor 106 may determine whether the specific object (e.g., the hand H1) is close enough to the first physical input device D1. In response to determining that the specific object is close enough to the first physical input device D1 (e.g., the first specific distance being smaller than the first threshold), the processor 106 may correspondingly enable the first physical input device D1 for the wearer of the HMD 100 to perform input operations as shown in FIG. 3A.
  • On the other hand, in response to determining that the specific object is not close to the first physical input device D1 (e.g., the first specific distance not being smaller than the first threshold), the processor 106 may correspondingly disable the first physical input device D1, such that the wearer of the HMD 100 cannot use the first physical input device D1 to perform input operations as shown in FIG. 3B, but the disclosure is not limited thereto.
  • In other embodiments, the processor 106 may determine whether to enable/disable the second physical input device D2 based on how close the specific object (e.g., the hand H1) is to the second physical input device D2.
  • Specifically, the processor 106 may be configured to determine whether a second specific distance between the specific object and the second physical input device D2 is smaller than a second threshold. In response to determining that the second specific distance is smaller than the second threshold, the processor 106 may correspondingly enable the second physical input device D2 for the wearer of the HMD 100 to perform input operations as shown in FIG. 3C. On the other hand, in response to determining that the second specific distance is not smaller than the second threshold, the processor 106 may correspondingly disable the second physical input device D2, such that the wearer cannot use the second physical input device D2 to perform input operations as shown in FIG. 3B. For the related details, reference may be made to the teachings in the above embodiments, which are not repeated herein.
  • In some embodiments, if all of the physical input devices paired with the HMD 100 are disabled, the processor 106 may enable the hand gesture recognition, such that the processor 106 may perform the hand gesture recognition to the specific object (e.g., the hand H1). Therefore, the wearer of the HMD 100 would be allowed to perform input operations to the HMD 100 by performing various hand gestures, but the disclosure is not limited thereto.
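Combining the pieces above, the switching policy over all paired devices can be sketched as follows (the device names and threshold values are illustrative, not from the disclosure):

```python
def switch_inputs(specific_distances, thresholds):
    """Enable each paired physical input device whose specific distance
    is below its threshold (steps S230/S240); enable hand gesture
    recognition only when every paired device ends up disabled."""
    enabled = {name: dist < thresholds[name]
               for name, dist in specific_distances.items()}
    gesture_recognition = not any(enabled.values())
    return enabled, gesture_recognition

# Hand 2 cm from the keyboard (D1), 40 cm from the controller (D2):
state, gestures = switch_inputs(
    {"keyboard": 2.0, "controller": 40.0},
    {"keyboard": 5.0, "controller": 5.0},
)
print(state)     # {'keyboard': True, 'controller': False}
print(gestures)  # False

# Hand away from both devices -> fall back to hand gestures:
print(switch_inputs({"keyboard": 30.0, "controller": 40.0},
                    {"keyboard": 5.0, "controller": 5.0})[1])  # True
```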
  • The disclosure further provides a computer readable storage medium for executing the method for switching input devices. The computer readable storage medium is composed of a plurality of program instructions (for example, a setting program instruction and a deployment program instruction) embodied therein. These program instructions can be loaded into the HMD 100 and executed by the same to execute the method for switching input devices and the functions of the HMD 100 described above.
  • In summary, with the method provided in the disclosure, whether the specific object is close enough to the physical input device paired with the HMD may be determined. In response to determining that the specific object is close enough to the physical input device, the HMD may correspondingly enable the physical input device for the wearer of the HMD to perform input operations through the physical input device.
  • On the other hand, in response to determining that the specific object is not close to the physical input device, the HMD may correspondingly disable the physical input device, such that the wearer of the HMD cannot use the physical input device to perform input operations. Accordingly, the disclosure has provided a novel, convenient, and intuitive way for the wearer of the HMD to switch input devices.
  • It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present disclosure without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the present disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims and their equivalents.

Claims (20)

What is claimed is:
1. A method for switching input devices, adapted to a head-mounted display (HMD), comprising:
providing a visual content, wherein a first physical input device is visible in the visual content;
determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold;
enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold;
disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
2. The method according to claim 1, wherein the first physical input device is paired with the HMD, and the step of determining whether the first specific distance between the specific object and the first physical input device is smaller than the first threshold comprises:
obtaining a specific 3D object corresponding to the specific object and a first 3D object corresponding to the first physical input device via an inside-out tracking mechanism;
defining a first distance between the specific 3D object and the first 3D object as the first specific distance, and determining whether the first distance is smaller than the first threshold.
3. The method according to claim 1, wherein the first physical input device is paired with the HMD and disposed with a first proximity sensor, the HMD is connected with the first proximity sensor, and the step of determining whether the first specific distance between the specific object and the first physical input device is smaller than the first threshold comprises:
using the first proximity sensor to detect a second distance between the first proximity sensor and an approaching object;
defining the second distance as the first specific distance between the specific object and the first physical input device, and determining whether the second distance is smaller than the first threshold.
4. The method according to claim 1, wherein the HMD is paired with the first physical input device and connected to a second proximity sensor, and the step of determining whether the first specific distance between the specific object and the first physical input device is smaller than the first threshold comprises:
performing an image recognition to the visual content to recognize a specific 2D object corresponding to the specific object and a first 2D object corresponding to the first physical input device in the visual content;
in response to determining that a third distance between the specific 2D object and the first 2D object is smaller than a specific threshold, using the second proximity sensor to detect a fourth distance between the second proximity sensor and an approaching object;
defining the fourth distance as the first specific distance between the specific object and the first physical input device, and determining whether the fourth distance is smaller than the first threshold.
5. The method according to claim 4, wherein the second proximity sensor is worn on the specific object.
6. The method according to claim 1, wherein the HMD is paired with a second physical input device, the second physical input device is visible in the visual content, and the method further comprises:
determining whether a second specific distance between the specific object and the second physical input device is smaller than a second threshold;
enabling the second physical input device in response to determining that the second specific distance is smaller than the second threshold;
disabling the second physical input device in response to determining that the second specific distance is not smaller than the second threshold.
7. The method according to claim 1, wherein the visual content is a Mixed Reality (MR) content.
8. The method according to claim 1, wherein the specific object is a hand, and in response to disabling the first physical input device, the method further comprises:
performing a hand gesture recognition to the specific object.
9. The method according to claim 1, wherein after the step of enabling the first physical input device, the method further comprises:
disabling a hand gesture recognition.
10. A non-transitory computer readable storage medium, recording an executable computer program to be loaded by a head-mounted display (HMD) to execute steps of:
providing a visual content, wherein a first physical input device is visible in the visual content;
determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold;
enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold;
disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
11. A head-mounted display (HMD), comprising:
a display;
a storage circuit, storing a plurality of modules;
a processor, coupled to the display and the storage circuit, accessing the modules to perform following steps:
controlling the display to provide a visual content, wherein a first physical input device is visible in the visual content;
determining whether a first specific distance between a specific object and the first physical input device is smaller than a first threshold;
enabling the first physical input device in response to determining that the first specific distance is smaller than the first threshold;
disabling the first physical input device in response to determining that the first specific distance is not smaller than the first threshold.
12. The HMD according to claim 11, wherein the first physical input device is paired with the HMD, and the processor is configured to:
obtain a specific 3D object corresponding to the specific object and a first 3D object corresponding to the first physical input device via an inside-out tracking mechanism;
define a first distance between the specific 3D object and the first 3D object as the first specific distance, and determine whether the first distance is smaller than the first threshold.
13. The HMD according to claim 11, wherein the first physical input device is paired with the HMD and disposed with a first proximity sensor, the HMD is connected with the first proximity sensor, and the processor is configured to:
use the first proximity sensor to detect a second distance between the first proximity sensor and an approaching object;
define the second distance as the first specific distance between the specific object and the first physical input device, and determine whether the second distance is smaller than the first threshold.
14. The HMD according to claim 11, wherein the HMD is paired with the first physical input device and connected to a second proximity sensor, and the processor is configured to:
perform an image recognition to the visual content to recognize a specific 2D object corresponding to the specific object and a first 2D object corresponding to the first physical input device in the visual content;
in response to determining that a third distance between the specific 2D object and the first 2D object is smaller than a specific threshold, use the second proximity sensor to detect a fourth distance between the second proximity sensor and an approaching object;
define the fourth distance as the first specific distance between the specific object and the first physical input device, and determine whether the fourth distance is smaller than the first threshold.
15. The HMD according to claim 14, wherein the second proximity sensor is worn on the specific object.
16. The HMD according to claim 11, wherein the HMD is paired with a second physical input device, the second physical input device is visible in the visual content, and the processor is further configured to:
determine whether a second specific distance between the specific object and the second physical input device is smaller than a second threshold;
enable the second physical input device in response to determining that the second specific distance is smaller than the second threshold;
disable the second physical input device in response to determining that the second specific distance is not smaller than the second threshold.
17. The HMD according to claim 11, wherein the visual content is a Mixed Reality (MR) content.
18. The HMD according to claim 11, wherein the specific object is a hand, and in response to disabling the first physical input device, the processor is further configured to:
perform a hand gesture recognition to the specific object.
19. The HMD according to claim 11, wherein after enabling the first physical input device, the processor is further configured to disable a hand gesture recognition.
20. The HMD according to claim 11, wherein the specific object is a hand.
US16/984,164 2020-08-04 2020-08-04 Method for switching input devices, head-mounted display and computer readable storage medium Active 2040-08-14 US11249314B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/984,164 US11249314B1 (en) 2020-08-04 2020-08-04 Method for switching input devices, head-mounted display and computer readable storage medium
TW109132512A TWI761960B (en) 2020-08-04 2020-09-21 Method for switching input devices, head-mounted display and computer readable storage medium
CN202011046873.8A CN114089827B (en) 2020-08-04 2020-09-29 Method for switching input device, head-mounted display and computer readable storage medium


Publications (2)

Publication Number Publication Date
US20220043264A1 true US20220043264A1 (en) 2022-02-10
US11249314B1 US11249314B1 (en) 2022-02-15

Family

ID=80113738



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024090721A1 (en) 2022-10-28 2024-05-02 삼성전자주식회사 Wearable device for displaying visual object by using sensors within external electronic device, and method therefor

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180348860A1 (en) * 2017-06-02 2018-12-06 Htc Corporation Immersive headset system and control method thereof
US10778953B2 (en) * 2018-12-10 2020-09-15 Universal City Studios Llc Dynamic convergence adjustment in augmented reality headsets
US11076144B2 (en) * 2018-12-14 2021-07-27 Cloudminds (Shenzhen) Robotics Systems Co., Ltd. Method and apparatus for obtaining image, storage medium and electronic device

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102377871B (en) * 2010-08-24 2013-12-04 联想(北京)有限公司 Information processing equipment and control method thereof
US8912979B1 (en) * 2011-07-14 2014-12-16 Google Inc. Virtual window in head-mounted display
US9766806B2 (en) * 2014-07-15 2017-09-19 Microsoft Technology Licensing, Llc Holographic keyboard display
CN104300950A (en) * 2014-09-23 2015-01-21 青岛歌尔声学科技有限公司 Automatic triggering starting circuit integrating reset function and HMD equipment
JP6676071B2 (en) * 2015-08-04 2020-04-08 グーグル エルエルシー Input via Context-Dependent Hand Collision with Objects in Virtual Reality
US10101803B2 (en) * 2015-08-26 2018-10-16 Google Llc Dynamic switching and merging of head, gesture and touch input in virtual reality
TWI695307B (en) * 2016-04-29 2020-06-01 姚秉洋 Method for displaying an on-screen keyboard, computer program product thereof and non-transitory computer-readable medium thereof
WO2018039277A1 (en) * 2016-08-22 2018-03-01 Magic Leap, Inc. Diffractive eyepiece
CN115665236B (en) * 2016-11-21 2024-10-01 北京嘀嘀无限科技发展有限公司 System and method for performing actions based on location information
CN109478120B (en) * 2016-12-28 2022-03-25 英华达(上海)科技有限公司 Input method and system of electronic equipment
EP3542252B1 (en) * 2017-08-10 2023-08-02 Google LLC Context-sensitive hand interaction
US11126258B2 (en) * 2017-10-14 2021-09-21 Qualcomm Incorporated Managing and mapping multi-sided touch
CN109753196A (en) * 2017-11-06 2019-05-14 阿里巴巴集团控股有限公司 Processing method, device, equipment and machine readable media
JP7059619B2 (en) * 2017-12-22 2022-04-26 セイコーエプソン株式会社 Processing equipment, display systems, and programs
EP3797345A4 (en) * 2018-05-22 2022-03-09 Magic Leap, Inc. Transmodal input fusion for a wearable system
WO2020033875A1 (en) * 2018-08-10 2020-02-13 Compound Photonics Limited Apparatus, systems, and methods for foveated display
CN111309142A (en) * 2018-12-11 2020-06-19 托比股份公司 Method and device for switching input modality of display device
US10901495B2 (en) * 2019-01-10 2021-01-26 Microsofttechnology Licensing, Llc Techniques for multi-finger typing in mixed-reality


Also Published As

Publication number Publication date
US11249314B1 (en) 2022-02-15
TWI761960B (en) 2022-04-21
TW202207009A (en) 2022-02-16
CN114089827B (en) 2023-08-25
CN114089827A (en) 2022-02-25


Legal Events

Date Code Title Description
AS Assignment

Owner name: HTC CORPORATION, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIN, SHENG-CHERNG;LIU, CHIEN-HSIN;LIN, SHIH-LUNG;SIGNING DATES FROM 20200727 TO 20200730;REEL/FRAME:053401/0283

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE