AU2022201327A1 - Virtual interface control device - Google Patents

Virtual interface control device

Info

Publication number
AU2022201327A1
Authority
AU
Australia
Prior art keywords
processor
virtual
simulator
detector
operation action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2022201327A
Inventor
Chih-Feng Ho
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oxti Corp
Original Assignee
Oxti Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oxti Corp filed Critical Oxti Corp
Priority to AU2022201327A priority Critical patent/AU2022201327A1/en
Publication of AU2022201327A1 publication Critical patent/AU2022201327A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • G03H2001/0061Adaptation of holography to specific applications in haptic applications when the observer interacts with the holobject
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Abstract

The virtual interface control device for an appliance includes a main member; a simulator housed in the main member, where the simulator stores a number of virtual scenes, and the simulator selectively projects out one of the virtual scenes; a detector housed in the main member, where the detector detects an operation action of a user in the virtual scene, and produces a detection signal according to the operation action; and a processor housed in the main member, where the processor is respectively and electrically connected with the detector and the simulator; the processor receives and deciphers the detection signal to identify the operation action; the processor, based on the operation action, instructs the simulator to switch to another virtual scene, or produces a control signal to control the appliance. Therefore, a user may operate an appliance without physically contacting the appliance and its physical control interface.

Description

TITLE: VIRTUAL INTERFACE CONTROL DEVICE
BACKGROUND OF THE INVENTION
(a) Technical Field of the Invention
The present invention is generally related to control devices, and more
particularly to a control device operating in a virtual scene.
(b) Description of the Prior Art
R.O.C. Taiwan Patent Publication No. 201134141 teaches the remote
control of home appliances through a mobile phone. The mobile phone
transmits commands to a host computer through a wireless network. A digital
control disk connected with the computer generates an infrared or wireless
control signal to operate a home appliance.
For this prior art, a user's hand is occupied by holding the mobile phone
and cannot engage in other activities, thereby causing inconvenience.
SUMMARY OF THE INVENTION
Therefore, a virtual interface control device for an appliance is provided
herein. The virtual interface control device includes a main member; a
simulator housed in the main member, where the simulator stores a number of
virtual scenes, and the simulator selectively projects out one of the virtual
scenes; a detector housed in the main member, where the detector detects an
operation action of a user in the virtual scene, and produces a detection signal
according to the operation action; and a processor housed in the main member,
where the processor is respectively and electrically connected with the detector
and the simulator; the processor receives and deciphers the detection signal to
identify the operation action; the processor, based on the operation action,
instructs the simulator to switch to another virtual scene, or produces a control
signal to control the appliance.
Specifically, the main member is a pair of eyeglasses.
Specifically, the detector is an optical detector or an infrared detector.
Specifically, the virtual interface control device further includes a
transceiver. The transceiver is electrically connected with the processor, and
the transceiver is a wireless transceiver.
Specifically, the processor is a control circuit, a central processing unit
(CPU), a single-chip microcomputer, or a microcontroller (MCU).
Therefore, a user may operate an appliance without physically contacting
the appliance and its physical control interface. By switching from one
virtual scene to a different virtual scene, the user may operate another
appliance.
The foregoing objectives and summary provide only a brief introduction
to the present invention. To fully appreciate these and other objects of the
present invention as well as the invention itself, all of which will become
apparent to those skilled in the art, the following detailed description of the
invention and the claims should be read in conjunction with the accompanying
drawings. Throughout the specification and drawings identical reference
numerals refer to identical or similar parts.
Many other advantages and features of the present invention will become
manifest to those versed in the art upon making reference to the detailed
description and the accompanying sheets of drawings in which a preferred
structural embodiment incorporating the principles of the present invention is
shown by way of illustrative example.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing a virtual interface control device
according to an embodiment of the present invention.
FIG. 2 is a perspective diagram showing the virtual interface control
device embodied in a pair of eyeglasses.
FIG. 3 is a schematic diagram showing the virtual interface control device
of FIG. 2 projecting a virtual scene.
FIG. 4 is a schematic diagram showing the virtual interface control device
of FIG. 2 operating a light bulb.
FIG. 5 is a schematic diagram showing the virtual interface control device
of FIG. 2 operating a motor vehicle's computer.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The following descriptions are exemplary embodiments only, and are not
intended to limit the scope, applicability or configuration of the invention in
any way. Rather, the following description provides a convenient illustration
for implementing exemplary embodiments of the invention. Various changes
to the described embodiments may be made in the function and arrangement
of the elements described without departing from the scope of the invention as
set forth in the appended claims.
As shown in FIG. 1, a virtual interface control device according to an
embodiment of the present invention includes a main member 1, a simulator
10, a detector 11, a processor 12 and a transceiver 13.
As shown in FIG. 2, the main member 1 may be a pair of eyeglasses.
As shown in FIG. 1 and FIG. 3, the simulator 10 is housed in the main
member 1, and the simulator 10 has stored a number of virtual scenes 20, 30,
40, 50 that are selectively projected out by the simulator 10. Specifically, the
virtual scenes 20, 30, 40, 50 are computer-created 3D image simulations of real
scenes and stored in the simulator 10. The virtual scenes 20, 30, 40, 50
respectively provide a virtual control interface simulating the operation
buttons or switches of a real control interface in a real environment. For example, as shown in FIG. 4, the virtual scene 20 is an indoor scene which includes a light bulb, and the virtual control interface simulates the switches that turn the light bulb on and off. For another example, as shown in FIG. 5, the virtual scene 30 shows the inside of a motor vehicle which includes a dashboard, and the virtual control interface simulates the buttons on the dashboard controlling the motor vehicle's computer.
The detector 11 is housed in the main member 1, and the detector 11
detects an operation action of a user in the virtual scene 20, 30, 40, or 50. The
detector 11, according to the detected operation action, produces a detection
signal. Specifically, the detector 11 may be an optical detector or an infrared
detector. Taking an infrared detector as an example, the infrared detector may be
an active infrared motion detector (Active Infrared Motion Sensor), which
radiates infrared and receives the reflected infrared so as to detect the user's
operation action such as a hand movement or gesture. The infrared detector also
may be a passive infrared motion detector (Passive Infrared Motion Sensor)
that picks up infrared emitted by the user, so as to detect the user's
operation action.
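As a minimal sketch of the detection step, assume each control location corresponds to a region in the detector's field of view; the coordinate ranges and function name below are illustrative assumptions, not taken from the disclosure.

```python
# Normalized x-ranges in the detector's field of view for each control
# location; the numeric ranges are illustrative assumptions.
LOCATION_RANGES = {"G1": (0.2, 0.4), "G2": (0.6, 0.8)}

def detection_signal(hand_x):
    """Map a detected hand position to the reached control location.

    Returns the location label (the detection signal) or None when the
    hand is not over any control location.
    """
    for label, (lo, hi) in LOCATION_RANGES.items():
        if lo <= hand_x <= hi:
            return label
    return None
```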
The processor 12 is housed in the main member 1, and the processor 12 is
respectively and electrically connected with the detector 11 and the simulator 10.
The processor 12 receives and deciphers the detection signal to identify the intended operation action. The processor 12 then, based on the operation action, controls the simulator 10 to switch to an appropriate virtual scene 20,
30, 40, or 50. The processor 12 also may, based on the operation action, generate
a control signal to control the appliance. Specifically, the processor 12 may be a
control circuit, a central processing unit (CPU), a single-chip microcomputer,
or a microcontroller (MCU).
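The processor's two possible responses, switching the simulator to another virtual scene or producing a control signal for the appliance, can be sketched as follows, using the virtual scene 20 of FIG. 4. The table contents and names are illustrative assumptions.

```python
# Locations in virtual scene 20 (FIG. 4) that switch the simulator to
# another virtual scene, and locations that control the appliance.
# The mapping values are illustrative assumptions.
SCENE_SWITCHES = {"G3": 30, "G4": 40, "G5": 50}
CONTROL_SIGNALS = {"G1": "LIGHT_ON", "G2": "LIGHT_OFF"}

def process(location, current_scene):
    """Return (next_scene, control_signal) for a deciphered operation action."""
    if location in SCENE_SWITCHES:
        return SCENE_SWITCHES[location], None    # instruct simulator to switch
    if location in CONTROL_SIGNALS:
        return current_scene, CONTROL_SIGNALS[location]  # control the appliance
    return current_scene, None                   # unrecognized operation action
```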
The transceiver 13 is housed in the main member 1 and electrically
connected with the processor 12. The transceiver 13 may be a wireless
transceiver that transmits the control signal in a wireless manner and may
amplify the control signal. Specifically, the wireless transceiver may be a
wireless LAN (WLAN) transceiver, a Wireless-Fidelity (Wi-Fi) transceiver, a
Bluetooth transceiver, an infrared (IR) transceiver, a 4G transceiver, or a 5G
transceiver.
As shown in FIG. 3, in operation, the simulator 10 first projects out the
virtual scenes 20, 30, 40, and 50 for a user to select. As shown in FIG. 4, for
example, the user selects the virtual scene 20 which depicts an indoor
environment with a light bulb. The virtual scene also includes a virtual control
interface for the light bulb which includes a first location G1 for turning on the
light bulb and a second location G2 for turning off the light bulb. When the
detector 11 detects that the user's operation action (e.g., hand movement) reaches the first location G1, a corresponding detection signal is produced.
The processor 12 receives and deciphers the detection signal to identify that
the operation action is against the first location G1. The processor 12 then
produces a control signal to turn on the light bulb. When the detector 11
detects that the user's operation action reaches the second location G2, a
corresponding detection signal is produced. The processor 12 receives and
deciphers the detection signal to identify that the operation action is against the
second location G2. The processor 12 then produces a control signal to turn
off the light bulb.
The virtual scene 20 may also include three additional locations in the
virtual control interface: a third location G3 for switching to the virtual scene
30, a fourth location G4 for switching to the virtual scene 40, and a fifth
location G5 for switching to the virtual scene 50. When the detector 11 detects
that the user's operation action reaches the third location G3, a corresponding
detection signal is produced. The processor 12 receives and deciphers the
detection signal to identify that the operation action is against the third location
G3. The processor 12 then controls the simulator 10 to switch to the virtual
scene 30. As shown in FIG. 5, the virtual scene 30 may depict the interior of a
motor vehicle and a dashboard. The virtual scene 30 may include two
locations in the virtual control interface: a sixth location G6 for turning on the air conditioner of the motor vehicle and a seventh location G7 for turning on the stereo of the motor vehicle. When the detector 11 detects that the user's operation action reaches the sixth location G6, a corresponding detection signal is produced. The processor 12 receives and deciphers the detection signal to identify that the operation action is against the sixth location G6. The processor 12 then produces a control signal to instruct the motor vehicle's computer to turn on the air conditioner. When the detector 11 detects that the user's operation action reaches the seventh location G7, a corresponding detection signal is produced. The processor 12 receives and deciphers the detection signal to identify that the operation action is against the seventh location G7. The processor 12 then produces a control signal to instruct the motor vehicle's computer to turn on the stereo. The virtual scene 30 may also include three additional locations in the virtual control interface: an eighth location G8 for switching to the virtual scene 20, a ninth location G9 for switching to the virtual scene 40, and a tenth location G10 for switching to the virtual scene 50. When the detector 11 detects that the user's operation action reaches the eighth location G8, a corresponding detection signal is produced.
The processor 12 receives and deciphers the detection signal to identify that
the operation action is against the eighth location G8. The processor 12 then
controls the simulator 10 to switch to the virtual scene 20. The processor 12 may also transmit the control signal through the transceiver 13 in a wireless manner.
Therefore, the user may operate an appliance without physically
contacting the appliance and its physical control interface. The user's
hand is therefore freed to engage in other activities, achieving greater
convenience.
While certain novel features of this invention have been shown and
described and are pointed out in the annexed claims, it is not intended to be
limited to the details above, since it will be understood that various omissions,
modifications, substitutions and changes in the forms and details of the device
illustrated and in its operation can be made by those skilled in the art without
departing in any way from the claims of the present invention.

Claims (5)

I CLAIM:
1. A virtual interface control device for an appliance, comprising:
a main member;
a simulator housed in the main member, where the simulator stores a
plurality of virtual scenes, the simulator selectively projects out
one of the virtual scenes;
a detector housed in the main member, where the detector detects an
operation action of a user in the virtual scene, and produces a
detection signal according to the operation action; and
a processor housed in the main member, where the processor is
respectively and electrically connected with the detector and the
simulator; the processor receives and deciphers the detection
signal to identify the operation action; and the processor, based
on the operation action, instructs the simulator to switch to
another virtual scene, or produces a control signal to control the
appliance.
2. The virtual interface control device according to claim 1, wherein the
main member is a pair of eyeglasses.
3. The virtual interface control device according to claim 1, wherein the
detector is an optical detector or an infrared detector.
4. The virtual interface control device according to claim 1, further
comprising a transceiver, wherein the transceiver is electrically
connected with the processor; and the transceiver is a wireless
transceiver.
5. The virtual interface control device according to claim 1, wherein the
processor is a control circuit, a central processing unit (CPU), a
single-chip microcomputer, or a microcontroller (MCU).
AU2022201327A 2022-02-25 2022-02-25 Virtual interface control device Abandoned AU2022201327A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2022201327A AU2022201327A1 (en) 2022-02-25 2022-02-25 Virtual interface control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2022201327A AU2022201327A1 (en) 2022-02-25 2022-02-25 Virtual interface control device

Publications (1)

Publication Number Publication Date
AU2022201327A1 true AU2022201327A1 (en) 2023-09-21

Family

ID=88019515

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2022201327A Abandoned AU2022201327A1 (en) 2022-02-25 2022-02-25 Virtual interface control device

Country Status (1)

Country Link
AU (1) AU2022201327A1 (en)

Similar Documents

Publication Publication Date Title
US11676483B2 (en) System and method for facilitating appliance control via a smart device
JP6606269B2 (en) Smart home wireless control system
US10019894B2 (en) Remote controlling a plurality of controllable devices
EP1744290B1 (en) Integrated remote controller and method of selecting device controlled thereby
KR101591552B1 (en) Touch-sensitive wireless device and on screen display for remotely controlling a system
KR101224351B1 (en) Method for locating an object associated with a device to be controlled and a method for controlling the device
JP6114996B2 (en) System and method for gaze tracking
CN104391487A (en) Intelligent controller and control method thereof
CN104574932A (en) Intelligent-terminal-based remote control method and device and intelligent terminal
CN109324518A (en) Housed device control method, device, equipment and storage medium
WO2006013479A2 (en) Method for control of a device
EP1597662A1 (en) Method for controlling lighting parameters, controlling device, lighting system
CN104054331B (en) The configuration used for many side control devices
AU2022201327A1 (en) Virtual interface control device
KR20190079720A (en) Remote control system with intelligent cooling / heating control and illumination function for companion animals
RU2673464C1 (en) Method for recognition and control of household appliances via mobile phone and mobile phone for its implementation
JP5853006B2 (en) Remote control system and method
CN116668219A (en) Carrier control device
US11700063B2 (en) Appliance remote control
TWM631367U (en) Carrier control device
CN111492339A (en) Information processing apparatus, information processing method, and recording medium
CN107943342B (en) Touch control system
KR102493070B1 (en) A display device
KR102063631B1 (en) An controller and a method thereof
Shi et al. Direct gaze based environmental controls

Legal Events

Date Code Title Description
MK5 Application lapsed section 142(2)(e) - patent request and compl. specification not accepted