WO2021138848A1 - Systems and methods for controlling appliances - Google Patents

Systems and methods for controlling appliances

Info

Publication number
WO2021138848A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
slave
sensor
master
control interface
Prior art date
Application number
PCT/CN2020/070989
Other languages
French (fr)
Inventor
Shan GUAN
Original Assignee
Metis Ip (Suzhou) Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Metis Ip (Suzhou) Llc filed Critical Metis Ip (Suzhou) Llc
Priority to PCT/CN2020/070989 priority Critical patent/WO2021138848A1/en
Publication of WO2021138848A1 publication Critical patent/WO2021138848A1/en

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house

Definitions

  • the present disclosure generally relates to systems and methods for controlling appliances, and in particular, to an intelligent switch with holographic projection.
  • a system may include a holographic projector, at least one sensor, and at least one processor.
  • the holographic projector may be configured to project a control interface, the control interface having at least one icon.
  • the at least one sensor may be configured to detect a gesture by a user, the gesture being associated with the at least one icon.
  • the at least one processor may be directed to perform one or more of the following operations.
  • the at least one processor may determine a 3D coordinate system based on the control interface.
  • the at least one processor may determine a vector of the gesture associated with the at least one icon in the 3D coordinate system.
  • the at least one processor may generate a control signal according to the determined vector of the gesture for controlling one or more devices.
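The three processor operations above can be sketched in Python. This is a minimal illustration under assumed conventions, not the implementation described in the disclosure: fingertip positions are taken as sampled (x, y, z) points in a coordinate system anchored at the projected interface, and the icon names, thresholds, and command vocabulary are invented.

```python
# Sketch of the three operations: express a gesture as a vector in the
# interface's 3D coordinate system, then map it to a control signal.
# All names and thresholds below are illustrative assumptions.

def gesture_vector(start, end):
    """Vector of a gesture from two sampled fingertip positions (x, y, z),
    expressed in the coordinate system of the projected control interface."""
    return tuple(e - s for s, e in zip(start, end))

def to_control_signal(vector, icon):
    """Generate a control signal from the gesture vector. A motion into the
    interface plane (negative z) is read as a press; a dominant lateral
    motion is read as a slide."""
    dx, dy, dz = vector
    if dz < -0.01:  # finger moved toward/through the interface plane
        return {"icon": icon, "command": "press"}
    if abs(dx) > abs(dy):  # dominant horizontal motion
        return {"icon": icon, "command": "slide_right" if dx > 0 else "slide_left"}
    return {"icon": icon, "command": "slide_up" if dy > 0 else "slide_down"}
```

For example, a fingertip moving from (0.1, 0.2, 0.05) to (0.1, 0.2, 0.0) over an icon would be read as a press on that icon.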
  • the system may include a communication module configured to transmit the determined control signal to one or more devices.
  • the communication module may include at least one of a Wireless Local Area Network (WLAN) unit, a ZigBee unit, a Z-wave unit, or a Bluetooth unit.
  • the at least one sensor may include at least one of a camera, an infrared detector, an ultrasonic detector or a microwave detector.
  • the at least one sensor may be further configured to receive an instruction for triggering the holographic projector to project the control interface.
  • the at least one sensor configured to receive the instruction includes at least one of a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, or an acceleration sensor.
  • the holographic projector may be mounted on a top surface of a switch, and the at least one sensor is mounted on the top surface of the switch.
  • the one or more devices may include at least one of an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
  • the system may include a master switch and at least one slave switch.
  • the slave switch may include a slave holographic projector, at least one slave sensor, and a first communication module.
  • the slave holographic projector may be configured to project a slave control interface, the slave control interface having at least one slave icon.
  • the at least one slave sensor may be configured to detect a first gesture by a first user, the first gesture being associated with the at least one slave icon.
  • the first communication module may be configured to transmit the first gesture to the master switch.
  • the master switch may include a second communication module configured to receive the first gesture from the first communication module, and at least one processor configured to generate a slave control signal based on the first gesture for controlling one or more devices.
  • the at least one processor may be directed to perform the following operations for further generating the slave control signal.
  • the at least one processor may determine a 3D coordinate system based on the slave control interface.
  • the at least one processor may determine a vector of the first gesture associated with the at least one slave icon in the 3D coordinate system.
  • the at least one processor may generate the slave control signal according to the determined vector of the first gesture.
  • the master switch may further include a master holographic projector configured to project a master control interface having at least one master icon, and at least one master sensor configured to detect a second gesture by a second user.
  • the second gesture may be associated with the at least one master icon.
  • the at least one processor may be further configured to generate a master control signal based on the second gesture for controlling the one or more devices.
  • the at least one processor may be directed to perform the following operations for further generating the master control signal.
  • the at least one processor may determine a 3D coordinate system based on the master control interface.
  • the at least one processor may determine a vector of the second gesture associated with the at least one master icon in the 3D coordinate system.
  • the at least one processor may generate the master control signal according to the determined vector of the second gesture.
  • the at least one processor may be configured to assign the master control signal with a higher control priority over the slave control signal.
  • the master switch may further include a third communication module configured to transmit the slave control signal or the master control signal to the one or more devices.
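The master/slave arrangement described above can be sketched as a message flow: the slave switch only projects, detects, and forwards, while the master switch holds the processor that turns a forwarded gesture into a slave control signal. The class and field names below are hypothetical, chosen for illustration.

```python
# Minimal sketch of the slave-to-master flow: the slave switch forwards a
# detected gesture; the master switch generates the slave control signal.

class MasterSwitch:
    def receive_gesture(self, switch_id, gesture):
        # Second communication module + processor: generate the slave
        # control signal from the forwarded gesture.
        return {"source": switch_id,
                "signal": f"control:{gesture['icon']}:{gesture['command']}"}

class SlaveSwitch:
    def __init__(self, master, switch_id):
        self.master = master
        self.switch_id = switch_id

    def on_gesture(self, gesture):
        # First communication module: transmit the detected gesture
        # to the master switch rather than processing it locally.
        return self.master.receive_gesture(self.switch_id, gesture)
```

A usage example: a slave switch detecting a press on a "fan" icon would forward it and receive back a control signal tagged with its own identifier.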
  • a method may include one or more of the following operations: determining a 3D coordinate system based on a control interface, the control interface being a holographically projected control interface and having at least one icon; determining a vector of a gesture by a user, the gesture being associated with the at least one icon; and generating a control signal according to the determined vector of the gesture for controlling one or more devices.
  • a method may include one or more of the following operations: determining a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector; determining a vector of a first gesture by a first user associated with at least one slave icon in the 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch; and generating a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
  • a non-transitory computer-readable medium comprising at least one set of instructions for updating and loading an application installed in a mobile device, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to determine a 3D coordinate system based on a control interface, wherein the control interface is holographically projected by a holographic projector and includes at least one icon.
  • the at least one set of instructions also directs the at least one processor to determine a vector of a gesture by a user associated with the at least one icon in the 3D coordinate system, wherein the gesture is detected by at least one sensor.
  • the at least one set of instructions further directs the at least one processor to generate a control signal according to the determined vector of the gesture for controlling one or more devices.
  • a non-transitory computer-readable medium comprising at least one set of instructions for updating and loading an application installed in a mobile device, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to determine a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector.
  • the at least one set of instructions also directs the at least one processor to determine a vector of a first gesture by a first user associated with the at least one slave icon in a 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch.
  • the at least one set of instructions further directs the at least one processor to generate a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
  • FIG. 1 is a block diagram of an exemplary system according to some embodiments of the present disclosure.
  • FIG. 2 is a block diagram illustrating an exemplary master switch according to some embodiments of the present disclosure.
  • FIG. 3 is a block diagram illustrating an exemplary slave switch according to some embodiments of the present disclosure.
  • FIG. 4 is a block diagram illustrating an exemplary communication module according to some embodiments of the present disclosure.
  • FIG. 5 is a block diagram illustrating an exemplary control interface according to some embodiments of the present disclosure.
  • FIG. 6 is a block diagram illustrating an exemplary holographic projector according to some embodiments of the present disclosure.
  • FIG. 7 is a block diagram illustrating an exemplary processor according to some embodiments of the present disclosure.
  • FIG. 8-A is a schematic diagram illustrating exemplary locations of a master switch and a projected control interface according to some embodiments of the present disclosure.
  • FIG. 8-B is a schematic diagram illustrating exemplary locations of a slave switch and a projected control interface according to some embodiments of the present disclosure.
  • FIG. 9 is a flowchart illustrating an exemplary process for controlling an appliance according to some embodiments of the present disclosure.
  • the present disclosure may be embodied as an apparatus (including, for example, a system, device, computer program product, or any other apparatus), a method (including, for example, a computer-implemented process, or any other process), and/or any combination of the foregoing. Accordingly, the present disclosure may take the form of an entirely software embodiment (including firmware, resident software, microcode), an entirely hardware embodiment, or a combination of software and hardware aspects that may generally be referred to herein as a “system.”
  • The terms “system,” “engine,” “module,” “unit,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies at different levels in ascending order. However, these terms may be replaced by other expressions that achieve the same purpose.
  • The terms “load,” “load device,” “device,” “electrical load,” and “appliance” are used interchangeably herein to denote an apparatus that may consume electricity and convert it to one or more forms of energy including, for example, mechanical energy, electromagnetic energy, internal energy, chemical energy, or the like, or a combination thereof.
  • An aspect of the present disclosure relates to an intelligent switch with holographic projection.
  • the intelligent switch may be applied in various environments, such as home, office, other public or private occasions, etc.
  • the intelligent switch may include a holographic projector to project a control interface on a projection medium.
  • the control interface may include at least one icon.
  • An icon may refer to a virtual icon or a physical icon.
  • the projection medium may include a holographic film, a transparent and/or non-transparent glass medium, a wall, a water drop, air, etc.
  • the intelligent switch may control one or more devices, for example, an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, a robot, etc., based on a gesture associated with the at least one icon.
  • Another aspect of the present disclosure relates to a system for controlling one or more devices.
  • the system includes a master switch and at least one slave switch.
  • Each slave switch projects a slave control interface having at least one slave icon, detects a first gesture, and transmits the first gesture to the master switch.
  • the master switch controls one or more devices based on the first gesture.
  • the slave switch does not have a processor to generate a control signal; instead, the master switch generates the control signal based on the first gesture transmitted from the slave switch.
  • FIG. 1 is a block diagram of an exemplary system 100 according to some embodiments of the present disclosure.
  • the intelligent switch system may include a master switch 110, a user terminal 120, a server 130, one or more devices 140 (e.g., devices 140-1, 140-2, 140-3, 140-4, and so on), a storage 150, a plurality of slave switches 160 (e.g., slave switches 160-1, 160-2, 160-3, and so on), and a network (not shown).
  • the master switch 110 may be configured to control the one or more devices 140.
  • the master switch 110 may control, direct, or command the one or more devices 140 directly or indirectly.
  • the master switch 110 may include at least one sensor to detect a control signal to control the one or more devices 140.
  • the master switch 110 may receive the instruction of controlling the one or more devices 140 from the user terminal 120 via a network (not shown in FIG. 1) .
  • the master switch 110 may receive an instruction to control the one or more devices 140 from the slave switch 160 via the network.
  • the master switch 110 may communicate with the storage 150 to access information stored in the storage 150 via the network.
  • the master switch 110 may access at least one set of instructions stored in the storage 150 for controlling the one or more devices.
  • the master switch 110 may communicate with the server 130 via the network.
  • the master switch 110 may be controlled by the server 130.
  • a user of the controlling system 100 may control the one or more devices 140 via the user terminal 120.
  • the user terminal 120 may control one or more devices 140 via the master switch 110.
  • the user terminal 120 may send a signal (e.g., an instruction signal) to the master switch 110, and the master switch 110 may control the one or more devices 140 based on the signal.
  • the user terminal 120 may control the one or more devices 140 directly by sending a signal to the one or more devices.
  • the user terminal 120 may be configured to activate or change the settings of the master switch 110, to control the one or more devices 140 (e.g., a device or an appliance) , and/or to retrieve information related to the one or more devices 140 (e.g., information relating to energy consumption or the current status of one or more devices 140) .
  • the user terminal 120 may include a mobile device 120-1, a tablet computer 120-2, a laptop computer 120-3, or the like, or any combination thereof.
  • the mobile device 120-1 may include a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, a built-in device in a motor vehicle, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google Glass, an Oculus Rift, a HoloLens, a Gear VR, etc.
  • the built-in device in a motor vehicle may include an onboard computer, an onboard television, etc.
  • the server 130 may be configured to process information and/or data relating to the controlling system 100.
  • the server 130 may receive a control signal from the user terminal 120 and control the one or more devices 140 based on the control signal.
  • the server 130 may be a single server or a server group.
  • the server group may be centralized, or distributed.
  • the server 130 may be local or remote.
  • the server may access information and/or data stored in the master switch 110, the user terminal 120, the one or more devices 140, and/or the storage 150 via the network.
  • the server 130 may be directly connected to the master switch 110, the user terminal 120, the devices 140, and/or the storage 150 to access stored information and/or data.
  • the server 130 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the devices 140 may be any appliance that may consume electricity and/or convert electricity to another form of energy including, for example, mechanical energy (including potential energy, kinetic energy) , internal energy (heat) , chemical energy, light, electromagnetic radiation, or the like, or a combination thereof.
  • Exemplary devices may include an air conditioner 140-1, a washing machine 140-2, a television 140-3, a security device 140-4, etc.
  • the security device 140-4 may include a surveillance camera, an alarm, an electronic lock, etc. The security device 140-4 may monitor the environment and report certain events to the master switch 110. Exemplary events may include somebody approaching or entering through a door, someone entering the backyard, etc.
  • the security device 140-4 may further receive instructions from the master switch 110 and execute the instructed operations including, for example, locking the door, turning on the alarm, notifying a person (e.g., an owner of a house) or an entity (e.g., a security department of a building, police), taking a photo or a video of a suspicious person, etc.
  • the devices 140 may communicate with the master switch 110 and/or slave switch 160 through an electrical connection (e.g., a smart plug) .
  • the smart plug may be a plug or socket connected to a network (e.g., WLAN) .
  • the electrical connection may be based on electrical wire or another contact via a conductor.
  • the smart plug may send or receive information through a wireless network such as Bluetooth, WLAN, Wi-Fi, ZigBee, etc.
  • the devices 140 may also be in communication with the master switch 110 and/or slave switch 160 directly. The communication may be based on a wireless network such as Bluetooth, WLAN, Wi-Fi, ZigBee, etc.
  • an air conditioner 140-1 may have its own WLAN unit and report the monitored temperature and/or power consumption to the master switch 110 through a WLAN in the house.
  • the storage 150 may store data and/or instructions. In some embodiments, the storage 150 may store data obtained from the user terminal 120 and/or the device 140. In some embodiments, the storage 150 may store data and/or instructions that the master switch 110 and/or the server 130 may execute or use to perform exemplary methods described in the present disclosure. For example, the storage 150 may store holographic data associated with a master control interface of the master switch 110. As still another example, the storage 150 may store holographic data associated with a slave control interface of the slave switch 160. In some embodiments, the storage 150 may include a mass storage, removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the storage 150 may communicate with one or more components in the controlling system 100 (e.g., the master switch 110, the user terminal 120, the server 130, the device 140, the slave switch 160) via the network.
  • the one or more components in the controlling system 100 may access the data or instructions stored in the storage 150 via the network.
  • the storage 150 may be part of the master switch 110. In some embodiments, the storage 150 may be part of the server 130.
  • the slave switch 160 may be operably connected to the master switch 110 to control the device 140.
  • the device 140 may be operably connected to the slave switch 160.
  • “operably connected” may refer to the state that relevant elements/components are connected in such a way that they may cooperate to achieve their intended function or functions.
  • the “connection” may be direct, or indirect, physical, remote, via a wired connection, or via a wireless connection, etc.
  • the master switch 110 may be assigned a higher priority than the slave switch 160.
  • the master switch 110 may generate a master control signal for controlling a device 140, while the slave switch 160 may also generate a slave control signal for controlling the device 140. Since the master switch 110 has a higher priority than the slave switch 160, the master switch 110 may control the device 140, and the slave switch 160 may lose control of the device 140.
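The priority rule in this scenario can be sketched as a simple arbiter that prefers a master control signal when both switches target the same device. The function and field names are invented for illustration; the disclosure does not specify how the priority is implemented.

```python
# Sketch of the priority rule: when a master control signal and a slave
# control signal both target a device, the master signal wins.

def arbitrate(master_signal, slave_signal):
    """Return the signal that should reach the device. None means the
    corresponding switch issued no signal."""
    if master_signal is not None:
        return master_signal  # master has the higher control priority
    return slave_signal       # slave controls only in the master's absence
```

Under this rule, a slave switch "loses control" of a device exactly when the master switch has issued its own signal for that device.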
  • the master switch 110 and the slave switch 160 may be located in different locations.
  • for example, when the controlling system 100 is within a house, the master switch 110 may be located in the living room, and the slave switches 160 (e.g., 160-1, 160-2, 160-3) may be located in other rooms of the house.
  • the controlling system 100 may further include a network.
  • the network may be configured to connect one or more components of the controlling system 100.
  • the network may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a Z-Wave network, a near field communication (NFC) network, or the like, or any combination thereof.
  • FIG. 2 is a block diagram illustrating an exemplary master switch 110 according to some embodiments of the present disclosure. It should be noted that the master switch 110 described below is merely provided for illustration purposes, and not intended to limit the scope of the present disclosure. As illustrated in FIG. 2, the master switch 110 may include a communication module 210, a control panel 220, a holographic projector 230, a sensor 240, a processor 250, and a memory 260.
  • the communication module 210 may be configured to facilitate the communication between the master switch 110 and one or more components of the controlling system 100 (e.g., the user terminal 120, the server 130, the one or more devices 140, the storage 150, the slave switch 160) .
  • the communication may be implemented by transmitting and/or receiving electromagnetic wave signals.
  • the master switch 110 may transmit a master control signal to the device 140 or the slave switch 160, via the communication module 210.
  • the master switch 110 may receive the information related to the operation of an appliance (e.g., the device 140) from the slave switch 160 via the communication module 210.
  • the communication module 210 may be configured to exchange information and/or data among one or more components within the master switch 110, such as the control panel 220, the holographic projector 230, the sensor 240, the processor 250, and/or the memory 260.
  • the processor 250 may access information stored in the memory 260 via the communication module 210.
  • the control panel 220 may be configured to receive a command or an instruction from a user.
  • the control panel 220 may refer to a physical user interface.
  • the control panel 220 may include a button, a microphone, a display, an indicator lamp, a movable component, or the like, or any combination thereof.
  • the user may press on the button to control the one or more devices 140 associated with the button.
  • the user may click an icon on the display to control the one or more devices 140 associated with the icon.
  • the user may slide the movable component to control the one or more devices 140 associated with the movable component in the control panel 220.
  • the control panel 220 may display a notification to the user.
  • the control panel 220 may display a message to the user notifying whether a command or an instruction has been executed accordingly or not.
  • the user may control the control panel 220 through any type of wired or wireless network, or combination thereof.
  • the network may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a Z-Wave network, a near field communication (NFC) network, or the like, or any combination thereof.
  • a user may access and/or activate the control panel 220 remotely through the user terminal 120.
  • the control panel 220 may be projected on a projection medium (e.g., a holographic film, a wall) by a holographic projector in the master switch 110 and/or the slave switch 160.
  • the holographic projector 230 may project a control interface at a short distance.
  • the control interface may include one or more icons.
  • the holographic projector 230 may project the control interface on a projection medium based on holographic data.
  • the holographic data may include Computer-Generated Holograms (CGH) related to the control interface.
  • the projection medium may include a holographic film, a transparent and/or non-transparent glass medium, a wall, water drops, air, or the like, or any combination thereof.
  • the holographic projector 230 may project the control interface, and present the holographically projected control interface on the wall above the master switch 110.
  • the holographic projector 230 may include a holographic projection device.
  • the holographic projection device may include a plurality of optical elements in an optical path.
  • the holographic projection device may include a micro-projection component.
  • the micro-projection component may further generate the holographic projection based on a micro-projection display technology.
  • the micro-projection display technology may include Micro-Electro-Mechanical-System (MEMS) optical scanning micro-projection, Liquid Crystal Display (LCD) transmission micro-projection, Digital Light Processing (DLP) , Liquid Crystal on Silicon (LCoS) reflective-projection display technology, or the like, or any combination thereof.
  • the holographic projector 230 may be located within the master switch 110. Alternatively, the holographic projector 230 may be mounted on a surface of the master switch 110. For example, the holographic projector 230 may be mounted on the top surface of the master switch 110, and the holographic projector 230 may project the master control interface on the wall above the master switch 110. In some embodiments, the size of the holographically projected control interface may be adjustable according to different application scenarios.
  • the sensor 240 may include at least one sensor configured to detect signals.
  • the sensor may also transform the detected signals into other forms of information (e.g., electrical signals).
  • the sensor 240 may detect a gesture by a user.
  • the sensor 240 may generate a signal according to the detected gesture, which may be transmitted to the processor 250.
  • the processor 250 may generate a control signal according to the signal received from the sensor 240.
  • the gesture is associated with at least one master icon on the holographically projected master control interface.
  • the sensor 240 may include a camera, an infrared detector, an ultrasonic detector, a microwave detector, or the like, or any combination thereof.
  • the sensor 240 may be a camera that acquires an image (or video) of the gesture, and the sensor 240 may send the acquired image data (e.g., 3D information of the image) to the processor 250 to determine a master control signal based on the gesture.
  • the sensor 240 may include an infrared detector configured to detect the movement of the user (e.g., the user’s finger movement, hand movement, facial expression) and to generate a corresponding signal according to the detected movement of the user.
  • the sensor 240 may transmit the corresponding signal to the processor 250.
  • the processor 250 may further generate a master control signal based on the signal.
  • the sensor 240 may be inside and/or outside of the master switch 110.
  • the holographic projector 230 mounted on the top surface of the master switch 110
  • the sensor 240 may be mounted on the top surface of the master switch 110 to detect a gesture acted on the holographically projected control interface.
  • the sensor 240 may be further configured to receive an instruction for triggering the holographic projector to project the control interface.
  • the sensor 240 may include a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, an acceleration sensor, or the like, or any combination thereof.
  • the voice sensor may receive a voice instruction from a user to activate and/or trigger the holographic projector to project the master control interface.
  • the sensor 240 may detect or monitor the environment.
  • the sensor 240 may also generate data based on the environment.
  • Exemplary data generated by the sensor 240 according to the environment may include physical data, chemical data, and/or biological data relating to the ambient environment.
  • the physical data may include temperature, light, motion, vibration, pressure, humidity, image, fingerprint, air quality, or the like, or any combination thereof.
  • the chemical data may include a concentration of a gas or other chemicals (e.g., carbon monoxide, carbon dioxide, oxygen, hydrogen sulfide, ammonia, particulate matter) in the air, etc.
  • the biological data may include biological characteristics of people in the ambient environment of the master switch 110, such as blood pressure, heart rate, pulse rate, the concentration of blood sugar or insulin, or the like, or any combination thereof.
  • the sensor 240 may send the detected or monitored information to the processor 250 for further processing.
  • the sensor 240 may be an external device, not inside or on the surface of the master switch 110.
  • the external sensor 240 may communicate with the master switch 110 or other components of the controlling system 100 via the communication module 210.
  • the processor 250 may process information and/or data relating to controlling an appliance (e.g., the device 140) , to perform one or more functions described in the present disclosure. For example, the processor 250 may acquire holographic data relating to the holographically projected control interface and/or a gesture acted on the holographically projected control interface, and determine a control signal based on the gesture for controlling one or more devices associated with the master switch 110 or the slave switch 160. For example, the processor 250 may determine a three-dimensional (3D) coordinate system based on the holographically projected control interface. The holographically projected control interface may include at least one icon for controlling the one or more devices 140 associated with the master switch 110 and/or the slave switch 160. The processor 250 may determine a vector of the gesture associated with the at least one icon in the 3D coordinate system, and generate a control signal according to the determined vector of the gesture for controlling the one or more devices 140.
  • the processor 250 may be a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a microcontroller unit (MCU) , a digital signal processor (DSP) , a field-programmable gate array (FPGA) , an advanced RISC (reduced instruction set computing) machine (ARM) , or the like, or any combination thereof.
  • the memory 260 may be configured to store data and/or instructions associated with the master switch 110.
  • the memory 260 may store data obtained from one or more components of the controlling system 100 (e.g., the user terminal 120, the server 130, the slave switch 160) .
  • the memory 260 may store instructions that the server 130 and/or the processor 250 may execute to perform exemplary methods described in the present disclosure.
  • the instructions may relate to acquiring holographic data, generating a holographically projected control interface, analyzing a gesture from a user acted on the holographically projected control interface, determining a control signal for controlling one or more devices, or the like, or any combination thereof.
  • the memory 260 may include a mass storage, removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary removable memory may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random-access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically-erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • FIG. 3 is a block diagram illustrating an exemplary slave switch 160 according to some embodiments of the present disclosure. It should be noted that the slave switch 160 described below is merely provided for illustration purposes, and not intended to limit the scope of the present disclosure. As illustrated in FIG. 3, the slave switch 160 may include a communication module 310, a control panel 320, a holographic projector 330, and a sensor 340.
  • the communication module 310 in the slave switch 160 may have the same or similar configuration and function as the communication module 210 in the master switch 110.
  • the communication module 310 may be referred to as the first communication module herein, and the communication module 210 may be referred to as the second communication module herein.
  • the first communication module in the slave switch 160 may be configured to exchange information and/or data with the second communication module in the master switch 110.
  • the information and/or data may relate to an input from a user.
  • the input may include one or more operations of a device (e.g., the device 140) .
  • the sensor 340 may detect a first gesture by a first user acted on the control panel 320 and/or the holographically projected control interface (herein referred to as the slave control interface) .
  • the communication module 310 (herein referred to as the first communication module) in the slave switch 160 may transmit the first gesture in the form of an electromagnetic wave signal to the communication module 210 (herein referred to as the second communication module) in the master switch 110.
  • the processor 250 may further generate a slave control signal based on the first gesture for controlling one or more devices 140.
  • the control panel 320 may have the same or similar configuration and function as the control panel 220 in the master switch 110.
  • the slave control interface of the control panel 320 may have at least one slave icon.
  • the user may apply one or more operations onto the at least one slave icon for controlling the corresponding device (e.g., the one or more devices 140) .
  • the operation may include a gesture, such as a pressing, a sliding, a rotating, or the like, or any combination thereof.
  • the holographic projector 330 may have the same or similar configuration and function as the holographic projector 230 in the master switch 110.
  • the slave holographic projector may project the slave control interface based on holographic data relating to the slave control interface.
  • the processor 250 in the master switch 110 may determine the holographic data and transmit the holographic data to the slave switch 160 for projecting.
  • the position of the holographic projector 330 and/or structures inside the holographic projector 330 may be the same or similar to the holographic projector 230 described in FIG. 2 in the present disclosure.
  • the sensor 340 (referred to as the slave sensor herein) may have the same or similar configuration and function as the sensor 240 in the master switch 110.
  • the sensor 340 may be configured to detect a first gesture by a first user.
  • the sensor 340 may generate a signal according to the detected first gesture, which may be transmitted to the processor 250.
  • the processor 250 may generate a control signal according to the signal received from the sensor 340.
  • the first gesture may be associated with the at least one slave icon on the slave control interface holographically projected by the holographic projector 330.
  • An exemplary sensor may include a camera, an infrared detector, an ultrasonic detector, a microwave detector, or the like, or any combination thereof.
  • the sensor 340 may be a camera that acquires an image of the gesture, and the sensor 340 may send the acquired image data (e.g., three-dimensional information of the image) to the processor 250 in the master switch 110 so as to generate a slave control signal based on the gesture.
  • the sensor 340 may be an infrared detector that detects the movement of the gesture to generate a slave control signal.
  • the sensor 340 may be mounted on any appropriate location inside and/or outside of the slave switch 160.
  • the sensor 340 that is configured to detect gestures from the user may be mounted on the top surface of the slave switch 160 based on the holographic projector 330 mounted on the top surface of the slave switch 160, so that the sensor 340 may detect a gesture acted on the holographically projected control interface from the holographic projector 330.
  • the sensor 340 may be further configured to receive an instruction for triggering the holographic projector 330 (also referred to as the slave holographic projector) to project the slave control interface.
  • the sensor 340 may include a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, an acceleration sensor, or the like, or any combination thereof.
  • the voice sensor may receive a voice instruction from a user to activate and/or trigger the slave holographic projector 330 to project the slave control interface.
  • FIG. 4 is a block diagram illustrating an exemplary communication module 400 according to some embodiments of the present disclosure.
  • the communication module 400 may include a WLAN unit 410, a ZigBee unit 420, a Z-wave unit 430, and a Bluetooth unit 440.
  • the communication module 400 may have one or more other communication units.
  • the communication module 400 may further include a radio frequency communication unit.
  • FIG. 5 is a block diagram illustrating an exemplary control panel 500 according to some embodiments of the present disclosure.
  • the control panel 220 in the master switch 110 and/or the control panel 320 in the slave switch 160 may have the same or similar configuration as the control panel 500.
  • the control panel 500 may include a button 510, a microphone 520, a display 530, and an indicator 540.
  • the button 510 may be configured to connect or disconnect a control circuit. For example, a user may press the button 510 to activate or trigger the holographic projector of the master switch 110 or the slave switch 160. As another example, the user may press the button 510 to control one or more devices 140.
  • the microphone 520 may be configured to detect sound and convert the detected sound to an electrical signal.
  • the user may input audio via the microphone 520.
  • the audio may relate to any operation for controlling one or more devices (e.g., the device 140) , the master switch 110 and/or the slave switch 160.
  • the display 530 may display information of at least one component in the controlling system 100.
  • the display 530 may present a control interface, information relating to the operation of a device (e.g., the one or more devices 140, the master switch 110, the slave switch 160) , a working state of a device (e.g., the one or more devices 140, the master switch 110, the slave switch 160) , an alarm, or the like, or any combination thereof.
  • Exemplary display may include a Liquid Crystal Display (LCD) , a Light-Emitting Diode (LED) , an Organic Light Emitting Diode (OLED) , an electronic ink display, a Plasma Display Panel (PDP) , or the like, or any combination thereof.
  • the display 530 may include a touch screen that both displays information and receives control instructions.
  • the display 530 may display a control interface of the one or more devices 140, and be touchable to receive an on/off instruction from a user.
  • the indicator 540 may be configured to notify a user.
  • the indicator 540 may emit light to notify the user of certain information relating to an alarm, a state of operation, a working status, or the like, or any combination thereof.
  • the indicator 540 may present a color to notify the user of a state of the master switch 110.
  • the indicator 540 of the master switch 110 may emit green light when the master switch 110 operates normally, and red light when the master switch 110 operates abnormally.
  • the indicator 540 may include a light-emitting diode (LED) lamp, a gas discharge lamp (for example, a neon lamp) , an incandescent lamp, or any other light-emitting device or component, or any combination thereof.
  • the button 510 may be replaced by one or more of a slide bar, a knob, a dial, or the like, or any combination thereof.
  • the user may slide the slide bar, or rotate the knob or dial to input information to control one or more components (e.g., the master switch 110, the one or more devices 140, the slave switch 160) .
  • the control panel 500 included in the master switch 110 (also referred to as the control panel 220) may be different from that included in the slave switch 160 (also referred to as the control panel 320) .
  • the control panel 500 included in the master switch 110 may include the button 510, the microphone 520, the display 530, and the indicator 540.
  • the control panel 500 included in the slave switch 160 may only include the button 510.
  • FIG. 6 is a block diagram illustrating an exemplary holographic projector 600 according to some embodiments of the present disclosure.
  • the holographic projector 230 in the master switch 110 and/or the holographic projector 330 in the slave switch 160 may have the same or similar configuration as the holographic projector 600.
  • the holographic projector 600 may include one or more optical paths for holographic projection in principle. The one or more optical paths may be varied. All such variations are within the scope of the present disclosure.
  • the holographic projector 600 may include a light source 610, a lens 620, a beam splitter 630, and a spatial light modulator (SLM) 640.
  • the light source 610 may emit a light beam.
  • the light source 610 may include a light-emitting diode (LED) , a laser, a halogen lamp, a fluorescent lamp, or the like, or any combination thereof.
  • the lens 620 may be configured to reflect light and/or transmit light.
  • the lens 620 may include a convex lens, a concave lens, a plane mirror, or the like, or any combination thereof.
  • the beam splitter 630 may split the light beam into two or more beams, each aimed in different directions.
  • the two or more beams may include an illumination beam and a reference beam.
  • the illumination beam may be directed to an object that needs to be projected (e.g., the control interface) of the holographic projector 600.
  • the spatial light modulator (SLM) 640 may impose some form of spatially-varying modulation on the light beam.
  • the SLM 640 may modulate the intensity or the phase of the light beam individually, or both the intensity and the phase of the light beam simultaneously.
  • the SLM 640 may encode the holographic data into the light beam for further generating the holographically projected control interface.
  • the holographic data may relate to the user interface of the master switch 110.
  • the holographic projector 600 may include a micro-projection system.
  • the micro-projection system may be a system on a chip (SoC) .
  • the micro-projection system may further generate the holographically projected control interface based on a micro-projection display technology.
  • the micro-projection display technology may include Micro-Electro-Mechanical-System (MEMS) optical scanning micro-projection, Liquid Crystal Display (LCD) transmission micro-projection, Digital Light Processing (DLP) , Liquid Crystal on Silicon (LCoS) reflective-projection display technology, or the like, or any combination thereof.
  • FIG. 7 is a block diagram illustrating an exemplary processor 250 according to some embodiments of the present disclosure.
  • the processor 250 may include a holographic data determination unit 710, a gesture data acquisition unit 720, and a control signal determination unit 730.
  • Each unit may be a hardware circuit that is designed to perform the following actions, a set of instructions stored in one or more storage media (e.g., the storage 150, or the memory 260) , and/or a combination of the hardware circuit and the one or more storage media.
  • the holographic data determination unit 710 may be configured to determine the holographic data relating to the master control interface of a master switch 110.
  • the master control interface may have at least one master icon.
  • each of the at least one master icon may be configured to control a device of the one or more devices 140.
  • the device may include at least one of an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, a robot, or the like, or any combination thereof.
  • the holographic data determination unit 710 may determine the holographic data relating to the master control interface based on Computer-Generated Holograms (CGH) .
  • the holographic data determination unit 710 may encode the master control interface into the CGH for determining the holographic data.
  • the holographic data determination unit 710 may send the holographic data to the master holographic projector (e.g., the holographic projector 230) for further projecting the master control interface through the communication module 210.
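The disclosure does not spell out how the CGH encoding is computed. As a hedged illustration only, one common approach is a single Fourier-transform step of the Gerchberg-Saxton method: back-propagate the target interface image (with a random initial phase) to the hologram plane and keep only the phase, which a spatial light modulator such as the SLM 640 could display. The function name and the use of NumPy are assumptions, not part of the disclosure:

```python
import numpy as np

def interface_to_cgh_phase(target, seed=0):
    """Compute a phase-only hologram whose far-field reconstruction
    approximates the target interface image (one Gerchberg-Saxton step)."""
    rng = np.random.default_rng(seed)
    amplitude = np.sqrt(target / target.max())   # desired far-field amplitude
    random_phase = np.exp(1j * 2 * np.pi * rng.random(target.shape))
    # Inverse-propagate the target field to the hologram plane.
    field = np.fft.ifft2(np.fft.ifftshift(amplitude * random_phase))
    return np.angle(field)                       # phase pattern for the SLM
```

In practice the Gerchberg-Saxton method iterates this step, alternately enforcing the target amplitude in the image plane and unit amplitude in the hologram plane, to improve reconstruction quality.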
  • the holographic data determination unit 710 may determine first holographic data relating to the slave control interface, the slave control interface having at least one slave icon.
  • the slave icon may be the same or similar to the master icon.
  • the holographic data determination unit 710 may send the determined first holographic data to the slave holographic projector (e.g., the holographic projector 330 in the slave switch 160) for projecting the slave control interface.
  • the holographic data determination unit 710 may determine the first holographic data relating to the slave control interface of the slave switch 160 and transmit the first holographic data to the slave holographic projector (i.e., the holographic projector 330) for projecting the slave control interface.
  • the gesture data acquisition unit 720 may be configured to obtain the data related to a gesture by a user.
  • the gesture may be associated with the at least one icon (e.g., the master icon in the master control interface, the slave icon in the slave control interface) .
  • the gesture data may include starting coordinates of the gesture, ending coordinates of the gesture, a movement direction, a vector of the gesture, or the like, or any combination thereof.
  • the gesture data acquisition unit 720 may obtain the gesture data from at least one sensor (e.g., the sensor 240 in the master switch 110, the sensor 340 in the slave switch 160) .
  • the at least one sensor may be configured to detect the gesture acted on the holographically projected control interface.
  • the at least one sensor may include at least one of a camera, an infrared detector, an ultrasonic detector or a microwave detector.
  • the gesture data acquisition unit 720 may obtain a first gesture data from the slave switch 160.
  • the first gesture data may refer to the data (e.g., location data) associated with the first gesture.
  • the first gesture may be associated with at least one slave icon.
  • the first gesture may be performed by a first user by pressing on a slave icon.
  • the slave sensor (e.g., the sensor 340) may detect the first gesture and transmit it via a first communication module (e.g., the communication module 310 in the slave switch 160) to a second communication module (e.g., the communication module 210 in the master switch 110) .
  • the gesture data acquisition unit 720 may obtain the first gesture from the second communication module.
  • the gesture data acquisition unit 720 may obtain a second gesture data from the master switch 110.
  • the second gesture data may refer to the data (e.g., location data) associated with the second gesture.
  • the second gesture may be associated with at least one master icon.
  • the second gesture may be performed by a second user by rotating a master icon.
  • the master sensor (also referred to as the sensor 240) may detect the second gesture.
  • the control signal determination unit 730 may be configured to generate a control signal for controlling one or more devices 140. In some embodiments, the control signal determination unit 730 may generate the control signal based on a gesture by a user. For example, the control signal determination unit 730 may determine a slave control signal based on the first gesture associated with the slave switch 160. As another example, the control signal determination unit 730 may determine a master control signal based on the second gesture associated with the master switch 110.
  • the control signal determination unit 730 may first determine a 3D coordinate system based on the control interface of a switch (e.g., the master control interface of the master switch 110, the slave control interface of the slave switch 160) , and then determine a vector of the user's gesture associated with at least one icon in the 3D coordinate system.
  • the gesture may include the first gesture associated with the at least one slave switch 160 performed by the first user, and the second gesture associated with the master switch 110 performed by the second user.
  • the control signal determination unit 730 may determine the control signal based on the vector of the gesture.
  • FIG. 8-A is a schematic diagram illustrating exemplary locations of a master switch 110 and a projected control interface according to some embodiments of the present disclosure.
  • the holographic projector 230 may be mounted on the top surface of the master switch 110, and the control interface (also referred to as the master control interface) of the master switch 110 may be projected somewhere above the master switch 110.
  • the master switch 110 may be mounted on the wall of a living room, and the control interface (also referred to as the master control interface) of the master switch 110 may be projected on the wall right above the master switch 110.
  • the projected control interface (also referred to as the master control interface) may have a set of icons (also referred to as the master icons) for controlling the one or more devices 140.
  • FIG. 8-B is a schematic diagram illustrating exemplary locations of a slave switch 160 and a projected control interface according to some embodiments of the present disclosure.
  • the holographic projector 330 may be mounted on the top surface of the slave switch 160, and the control interface of the slave switch 160 may be projected somewhere above the slave switch 160.
  • the slave switch 160 may be mounted on the wall of a bedroom, and the control interface (also referred to as the slave control interface) of the slave switch 160 may be projected on the wall right above the slave switch 160.
  • the holographically projected control interface (also referred to as the slave control interface) may have a set of icons (also referred to as the slave icons) for controlling one or more devices 140.
  • the slave switch 160 may be placed in a different location from the master switch 110.
  • the master switch 110 may be set in the living room, and the slave switch 160 (e.g., 160-1, 160-2) may be placed in another room, for example, a bedroom, a bathroom, or a kitchen.
  • the control panel 220 of the master switch 110 in FIG. 8-A may have a different configuration than the control panel 320 of the slave switch 160 in FIG. 8-B.
  • the control panel 220 of the master switch 110 may include a plurality of buttons, and the control panel 320 of the slave switch 160 may include only one button.
  • It should be noted that FIG. 8-A and FIG. 8-B merely illustrate exemplary locations of the switch (the master switch or the slave switch) and the control interface (the master control interface or the slave control interface) . Other locations, such as the control interface being projected below the switch, to the right of the switch, or to the left of the switch, or any other relative locations, may fall within the scope of the present disclosure.
  • FIG. 9 is a flowchart illustrating an exemplary process for controlling a device according to some embodiments of the present disclosure.
  • the process 900 may be executed by at least one processor in the controlling system 100 (e.g., the server 130 or the processor 250 in the master switch 110) .
  • the process 900 may be implemented as a set of instructions (e.g., an application) stored in a non-transitory computer-readable storage media (e.g., the storage 150 or the memory 260) .
  • the processor may execute the instructions and may accordingly be directed to perform the process 900 via receiving and/or sending electronic signals or electrical currents.
  • the processor 250 may determine a three-dimensional (3D) coordinate system based on a control interface of a switch.
  • the control interface may refer to a holographically projected control interface (e.g., a master control interface, a slave control interface) .
  • the control interface may have at least one icon (e.g., at least one master icon, at least one slave icon) .
  • the switch may include the master switch 110 and at least one slave switch 160.
  • the holographic projector (e.g., the holographic projector 230 or the holographic projector 330) may project the control interface onto a projection medium (e.g., a holographic film, a transparent and/or non-transparent glass medium, a wall, a water drop, air) above the switch (e.g., the master switch 110, or the slave switch 160) .
  • the X-Y plane may refer to the plane of the control interface that is parallel to the projection medium
  • the Z-axis may be perpendicular to the projection medium.
  • a geometric center of the control interface may be the origin of the coordinate system
  • the processor 250 may determine the 3D coordinate system, including X, Y, and Z axes, based on the origin.
  • An icon in the at least one icon may correspond to three coordinates in the 3D coordinate system.
  • an icon A associated with a lighting device may be located in the 3D coordinate system at the coordinates (Xa, Ya, Za) .
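The coordinate convention just described can be sketched as a small helper that maps a pixel position on the projected interface image to the 3D coordinate system, with the origin at the geometric center of the interface. The function name and the image dimensions are illustrative assumptions:

```python
def interface_coords(px, py, width, height):
    """Convert a pixel position (px, py) on the projected interface image
    to the 3D coordinate system: origin at the geometric center, X-Y in
    the interface plane, Z perpendicular to it (Z = 0 on the plane)."""
    return (px - width / 2.0, (height / 2.0) - py, 0.0)

# Icon A of a lighting device centered at pixel (150, 40) on a 200x100 image:
interface_coords(150, 40, 200, 100)  # → (50.0, 10.0, 0.0)
```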
  • the processor 250 may determine a vector of a gesture associated with at least one icon in the 3D coordinate system from a user.
  • the gesture may be performed by the user on the control interface for controlling one or more devices 140.
  • the gesture may include a pressing, a rotating, a sliding, or the like, or any combination thereof.
  • at least one sensor (e.g., the sensor 240, or the sensor 340) in the switch (e.g., the master switch 110 or the slave switch 160) may detect the gesture.
  • the gesture may include a first gesture performed by the first user on the slave control interface, and a second gesture performed by the second user on the master control interface.
  • the processor 250 may first determine starting coordinates of the gesture, ending coordinates of the gesture, and a direction of the gesture in the 3D coordinate system. The processor 250 may then determine the vector of the gesture based on the starting coordinates, the ending coordinates, and the direction of the gesture.
  • a user may perform a sliding gesture on an icon A on the control interface, and the at least one sensor (e.g., the sensor 240, or the sensor 340) may detect the sliding gesture and transmit the sliding gesture data to the processor 250.
  • the processor 250 may determine a 3D coordinate system based on the control interface.
  • the processor 250 may also determine the starting coordinates of the gesture as (X1, Y1, Z1) and the ending coordinates of the gesture as (X2, Y2, Z2) in the 3D coordinate system.
  • the processor 250 may further determine the vector of the sliding gesture based on the starting coordinates (X1, Y1, Z1) and the ending coordinates (X2, Y2, Z2) .
  • the vector of the sliding gesture may be denoted as (X2-X1, Y2-Y1, Z2-Z1) .
  • the sliding gesture may be on the plane parallel to the projection medium, which means Z1 and Z2 are zero.
  • in this case, the vector of the sliding gesture may be (X2-X1, Y2-Y1, 0) .
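The vector computation above amounts to a coordinate-wise difference between the ending and starting coordinates; a minimal sketch (function name is illustrative):

```python
def gesture_vector(start, end):
    """Vector of a gesture from its starting coordinates to its ending
    coordinates in the interface's 3D coordinate system."""
    return tuple(e - s for s, e in zip(start, end))

# A sliding gesture parallel to the projection medium keeps Z at zero:
v = gesture_vector((1.0, 2.0, 0.0), (4.0, 6.0, 0.0))  # → (3.0, 4.0, 0.0)
```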
  • the processor 250 may generate a control signal based on the vector of the gesture for controlling one or more devices 140.
  • the processor 250 may determine the control signal based on the vector of the gesture and the coordinates of the at least one icon in the control interface. For example, the processor 250 may determine an icon that the gesture acts on according to the starting coordinates of the gesture and the coordinates of the icon. The icon that the gesture acted on may include coordinates nearest to the starting coordinates of the gesture.
  • the processor 250 may also generate a control signal based on the vector of the gesture.
  • the vector may include a moving direction and a moving range of the gesture associated with the icon.
  • the processor 250 may turn up the volume of a television by two levels based on a vector indicating a rightward movement of about two centimeters.
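The nearest-icon selection and vector-to-command mapping described above might look like the following sketch. The icon names, their coordinates, and the centimeters-per-level mapping are hypothetical, chosen only to mirror the television-volume example:

```python
import math

# Hypothetical icon layout: icon name -> (X, Y, Z) coordinates on the interface.
ICONS = {"light": (0.0, 1.0, 0.0), "tv_volume": (2.0, -1.0, 0.0)}

def icon_for_gesture(start):
    """Pick the icon whose coordinates are nearest to the gesture's
    starting coordinates."""
    return min(ICONS, key=lambda name: math.dist(ICONS[name], start))

def control_signal(start, vector, cm_per_level=1.0):
    """Map a gesture vector acting on an icon to a device command, e.g.
    a rightward slide on the volume icon raises the TV volume."""
    icon = icon_for_gesture(start)
    if icon == "tv_volume":
        levels = round(vector[0] / cm_per_level)  # X movement -> volume levels
        return ("tv", "volume", levels)
    return (icon, "toggle", None)                 # default: on/off toggle
```

A rightward slide of two centimeters starting near the volume icon would then yield `("tv", "volume", 2)`, matching the two-level volume increase in the example.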
  • the processor 250 may generate a slave control signal based on the first gesture by the first user.
  • the slave holographic projector 330 may project the control interface (also referred to as the slave control interface herein) of the slave control panel 320.
  • the slave control interface may have at least one slave icon.
  • the sensor 340 may detect the first gesture acted on the slave control interface, and the first gesture may be associated with the at least one slave icon.
  • the first gesture may be transmitted to the master switch 110 by the first communication module 310.
  • the master switch 110 may receive the first gesture from the first communication module 310 by the second communication module 210.
  • the processor 250 may determine a 3D coordinate system based on the slave control interface, determine a vector of the first gesture in the 3D coordinate system, and generate the slave control signal for controlling the one or more devices 140 according to the determined vector of the first gesture.
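The slave-to-master flow above can be illustrated with a short sketch. The message shape and identifier names are assumptions for illustration only: the slave switch packages the detected first gesture, and the master switch derives the vector and emits a slave control signal.

```python
# Hypothetical message format; field names are assumptions, not from the
# disclosure.

def slave_detects(start, end, interface_id):
    """What the first communication module might transmit to the master."""
    return {"interface": interface_id, "start": start, "end": end}

def master_handles(message):
    """Master switch: derive the gesture vector and a slave control signal."""
    s, e = message["start"], message["end"]
    vector = tuple(b - a for a, b in zip(s, e))
    return {"source": message["interface"], "vector": vector, "type": "slave"}

msg = slave_detects((0, 0, 0), (3, 1, 0), "slave-160-1")
signal = master_handles(msg)  # vector (3, 1, 0) tagged as a slave signal
```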
  • the processor 250 may generate a master control signal based on the second gesture by the second user.
  • the master holographic projector 230 may project the master control interface on the wall above the master switch 110, the master control interface having at least one master icon.
  • the sensor 240 may detect the second gesture acted on the master control interface, and the second gesture may be associated with the at least one master icon.
  • the processor 250 may determine a 3D coordinate system based on the projected master control interface.
  • the processor 250 may determine the vector of the second gesture in the 3D coordinate system.
  • the processor 250 may further generate the master control signal according to the determined vector of the second gesture.
  • the processor 250 may assign the master control signal a higher control priority than the slave control signal. In other words, when the processor 250 determines the master control signal and the slave control signal simultaneously for controlling the same device, the processor 250 may control the device based on the master control signal.
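The priority rule above amounts to a simple arbitration step. The sketch below is an assumption-laden illustration (the signal fields are hypothetical): when a master and a slave control signal address the same device simultaneously, the master signal prevails.

```python
# Minimal arbitration sketch; signal field names are assumptions.

def arbitrate(signals):
    """Pick one signal per device, preferring 'master' over 'slave'."""
    chosen = {}
    for sig in signals:
        dev = sig["device"]
        if dev not in chosen or (sig["priority"] == "master"
                                 and chosen[dev]["priority"] == "slave"):
            chosen[dev] = sig
    return chosen

master = {"device": "tv", "priority": "master", "action": "off"}
slave = {"device": "tv", "priority": "slave", "action": "on"}
result = arbitrate([slave, master])
# result["tv"]["action"] == "off": the master control signal prevails
```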
  • the processor 250 may transmit the control signal to the one or more devices.
  • the processor 250 may transmit the control signal in the form of an electromagnetic wave signal using any suitable communication protocol via the network.
  • exemplary communication protocols may include Transmission Control Protocol (TCP), Internet Protocol (IP), TCP/IP, Internetwork Packet Exchange Protocol (IPX), Sequenced Packet Exchange Protocol (SPX), IPX/SPX, or the like, or any combination thereof.
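As a hedged illustration of transmitting a control signal over one of the listed protocols (TCP), the sketch below sends a small payload over the loopback interface. The JSON payload shape and device names are assumptions, not part of the disclosure.

```python
# Illustrative only: a control signal sent as JSON bytes over TCP loopback.
import json
import socket
import threading

def serve_once(server_sock, received):
    """Accept one connection and record the decoded control signal."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(json.loads(conn.recv(1024).decode()))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # pick a free port
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=serve_once, args=(server, received))
t.start()

signal = {"device": "air_conditioner", "action": "set_temp", "value": 24}
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(json.dumps(signal).encode())
t.join()
server.close()
# received[0] now equals the transmitted control signal
```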
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code), or in a combination of software and hardware that may all generally be referred to herein as a “block,” “module,” “engine,” “unit,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
  • a computer-readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, or the like, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, or the like, conventional procedural programming languages such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP, dynamic programming languages such as Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • the numbers expressing quantities of ingredients, properties, and so forth, used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about,” “approximate,” or “substantially.”
  • “about,” “approximate,” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated.
  • the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment.
  • the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods for controlling appliances are provided. A method may include determining a 3D coordinate system based on a control interface of a switch and determining a vector of a gesture by a user, the gesture being associated with at least one icon in the 3D coordinate system. The method may further include determining a control signal based on the vector of the gesture for controlling one or more devices and transmitting the control signal to the one or more devices.

Description

SYSTEMS AND METHODS FOR CONTROLLING APPLIANCES TECHNICAL FIELD
The present disclosure generally relates to systems and methods for controlling appliances, and in particular, to an intelligent switch with holographic projection.
BACKGROUND
In an existing smart home system, more and more functional modules for controlling one or more smart household appliances are integrated into a single switch. To control more appliances, such a switch usually has a large physical user interface or control panel to present these functions to the user. On the other hand, a small-sized switch may not be able to present many functions in its user interface or control panel due to the limitation of its size. Therefore, an intelligent switch capable of controlling many appliances on a limited user interface or control panel may be welcomed and in demand.
SUMMARY
According to an aspect of the present disclosure, a system is provided. The system may include a holographic projector, at least one sensor, and at least one processor. The holographic projector may be configured to project a control interface, the control interface having at least one icon. The at least one sensor may be configured to detect a gesture by a user, the gesture being associated with the at least one icon. The at least one processor may be directed to perform one or more of the following operations. The at least one processor may determine a 3D coordinate system based on the control interface. The at least one processor may determine a vector of the gesture associated with the at least one icon in the 3D coordinate system. The at least one processor may generate a control signal according to the determined vector of the gesture for controlling one or more devices.
In some embodiments, the system may include a communication module configured to transmit the determined control signal to one or more devices.
In some embodiments, the communication module may include at least one of a Wireless Local Area Network (WLAN) unit, a ZigBee unit, a Z-wave unit, or a Bluetooth unit.
In some embodiments, the at least one sensor may include at least one of a camera, an infrared detector, an ultrasonic detector or a microwave detector.
In some embodiments, the at least one sensor may be further configured to receive an instruction for triggering the holographic projector to project the control interface.
In some embodiments, the at least one sensor configured to receive the instruction includes at least one of a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, or an acceleration sensor.
In some embodiments, the holographic projector may be mounted on a top surface of a switch, and the at least one sensor is mounted on the top surface of the switch.
In some embodiments, the one or more devices may include at least one of an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
According to an aspect of the present disclosure, another system is provided. The system may include a master switch and at least one slave switch. The slave switch may include a slave holographic projector, at least one slave sensor, and a first communication module. The slave holographic projector may be configured to project a slave control interface, the slave control interface having at least one slave icon. The at least one slave sensor may be configured to detect a first gesture by a first user, the first gesture being associated with the at least one slave icon. The first communication module may be configured to transmit the first gesture to the master switch. The master switch may include a second communication module configured to receive the first gesture from the first communication module, and at least one processor configured to generate a slave control signal based on the first gesture for controlling one or more devices.
In some embodiments, the at least one processor may be directed to perform the following operations for further generating the slave control signal. The at least one processor may determine a 3D coordinate system based on the slave control interface. The at least one processor may determine a vector of the first gesture associated with the at least one slave icon in the 3D coordinate system. The at least one processor may generate the slave control signal according to the determined vector of the first gesture.
In some embodiments, the master switch may further include a master holographic projector configured to project a master control interface having at least one master icon, and at least one master sensor configured to detect a second gesture by a second user. The second gesture may be associated with the at least one master icon.
In some embodiments, the at least one processor may be further configured to generate a master control signal based on the second gesture for controlling the one or more devices.
In some embodiments, the at least one processor may be directed to perform the following operations for further generating the master control signal. The at least one processor may determine a 3D coordinate system based on the master control interface. The at least one processor may  determine a vector of the second gesture associated with the at least one master icon in the 3D coordinate system. The at least one processor may generate the master control signal according to the determined vector of the second gesture.
In some embodiments, the at least one processor may be configured to assign the master control signal with a higher control priority over the slave control signal.
In some embodiments, the master switch may further include a third communication module configured to transmit the slave control signal or the master control signal to the one or more devices.
According to another aspect of the present disclosure, a method is provided. The method may include one or more of the following operations: determining a 3D coordinate system based on a control interface, the control interface being a holographically projected control interface and having at least one icon; determining a vector of a gesture by a user, the gesture being associated with the at least one icon; and generating a control signal according to the determined vector of the gesture for controlling one or more devices.
According to another aspect of the present disclosure, a method is provided. The method may include one or more of the following operations: determining a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector; determining a vector of a first gesture by a first user associated with at least one slave icon in the 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch; and generating a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
According to still another aspect of the present disclosure, a non-transitory computer-readable medium is provided, comprising at least one set of instructions for updating and loading an application installed in a mobile device, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to determine a 3D coordinate system based on a control interface, wherein the control interface is holographically projected by a holographic projector and includes at least one icon. The at least one set of instructions also directs the at least one processor to determine a vector of a gesture by a user associated with the at least one icon in the 3D coordinate system, wherein the gesture is detected by at least one sensor. The at least one set of instructions further directs the at least one processor to generate a control signal according to the determined vector of the gesture for controlling one or more devices.
According to still another aspect of the present disclosure, a non-transitory computer-readable medium, comprising at least one set of instructions for updating and loading an application installed in a mobile device, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to determine a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector. The at least one set of instructions also directs the at least one processor to determine a vector of a first gesture by a first user associated with the at least one slave icon in a 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch. The at least one set of instructions further directs the at least one processor to generate a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with  reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a block diagram of an exemplary system according to some embodiments of the present disclosure;
FIG. 2 is a block diagram illustrating an exemplary master switch according to some embodiments of the present disclosure;
FIG. 3 is a block diagram illustrating an exemplary slave switch according to some embodiments of the present disclosure;
FIG. 4 is a block diagram illustrating an exemplary communication module according to some embodiments of the present disclosure;
FIG. 5 is a block diagram illustrating an exemplary control interface according to some embodiments of the present disclosure;
FIG. 6 is a block diagram illustrating an exemplary holographic projector according to some embodiments of the present disclosure;
FIG. 7 is a block diagram illustrating an exemplary processor according to some embodiments of the present disclosure;
FIG. 8-A is a schematic diagram illustrating exemplary locations of a master switch and a projected control interface according to some embodiments of the present disclosure;
FIG. 8-B is a schematic diagram illustrating exemplary locations of a slave switch and a projected control interface according to some embodiments of the present disclosure; and
FIG. 9 is a flowchart illustrating an exemplary process for controlling an appliance according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in  the art that the present disclosure may be practiced without such details. In other instances, well-known methods, procedures, systems, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but to be accorded the broadest scope consistent with the claims.
As will be understood by those skilled in the art, the present disclosure may be disclosed as an apparatus (including, for example, a system, device, computer program product, or any other apparatus), a method (including, for example, a computer-implemented process, or any other process), and/or any combination of the foregoing. Accordingly, the present disclosure may take the form of an entirely software embodiment (including firmware, resident software, microcode), an entirely hardware embodiment, or a combination of software and hardware aspects that may generally be referred to herein as a “system.”
It will be understood that the terms “system,” “engine,” “module,” “unit,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be displaced by another expression if they achieve the same purpose.
It will be understood that when a unit, engine, module or block is referred to as being “on, ” “connected to, ” or “coupled to” another unit, engine, module, or block, it may be directly on, connected or coupled to, or communicate with the other unit, engine, module, or block, or an intervening unit, engine, module, or block may be present, unless the context clearly  indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
The devices, modules, units, components or pins with the same numeral or notation in the drawings refer to the same device or components.
The terminology used herein is to describe particular examples and embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “include, ” and/or “comprise, ” when used in this disclosure, specify the presence of integers, devices, behaviors, stated features, steps, elements, operations, and/or components, but do not exclude the presence or addition of one or more other integers, devices, behaviors, features, steps, elements, operations, components, and/or groups thereof.
The terms “load, ” “load device, ” “device” , “electrical load” and “appliance” are used interchangeably herein, to denote an apparatus that may consume electricity and convert it to one or more forms of energy including, for example, mechanical energy, electromagnetic energy, internal energy, chemical energy, or the like, or a combination thereof.
An aspect of the present disclosure relates to an intelligent switch with holographic projection. The intelligent switch may be applied in various environments, such as a home, an office, or other public or private occasions. The intelligent switch may include a holographic projector to project a control interface on a projection medium. The control interface may include at least one icon. An icon may refer to a virtual icon or a physical icon. The projection medium may include a holographic film, a transparent and/or non-transparent glass medium, a wall, a water drop, air, etc. The intelligent switch may control one or more devices based on a gesture associated with the at least one icon, for example, an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, a robot, etc. Another aspect of the present disclosure relates to a system for controlling one or more devices. The system includes a master switch and at least one slave switch. Each slave switch projects a slave control interface having at least one slave icon, detects a first gesture, and transmits the first gesture to the master switch. The master switch controls one or more devices based on the first gesture. The slave switch does not have a processor to control the one or more devices.
FIG. 1 is a block diagram of an exemplary system 100 according to some embodiments of the present disclosure. The intelligent switch system may include a master switch 110, a user terminal 120, a server 130, one or more devices 140 (e.g., devices 140-1, 140-2, 140-3, 140-4, and so on), a storage 150, a plurality of slave switches 160 (e.g., slave switches 160-1, 160-2, 160-3, and so on), and a network (not shown).
The master switch 110 may be configured to control the one or more devices 140. In some embodiments, the master switch 110 may control, direct, or command the one or more devices 140 directly or indirectly. For example, the master switch 110 may include at least one sensor to detect a control signal to control the one or more devices 140. As another example, the master switch 110 may receive the instruction of controlling the one or more devices 140 from the user terminal 120 via a network (not shown in FIG. 1) . As still another example, the master switch 110 may receive an instruction to control the one or more devices 140 from the slave switch 160 via the network. In some embodiments, the master switch 110 may communicate with the storage 150 to access information stored in the storage 150 via the network. For example, the master switch 110 may access at least one set of instructions stored in the storage 150 for controlling the one or more devices. In some embodiments, the master switch 110 may  communicate with the server 130 via the network. For example, the master switch 110 may be controlled by the server 130.
A user of the controlling system 100 may control the one or more devices 140 via the user terminal 120. For example, the user terminal 120 may control one or more devices 140 via the master switch 110. The user terminal 120 may send a signal (e.g., an instruction signal) to the master switch 110, and the master switch 110 may control the one or more devices 140 based on the signal. As another example, the user terminal 120 may control the one or more devices 140 directly by sending a signal to the one or more devices. In some embodiments, the user terminal 120 may be configured to activate or change the settings of the master switch 110, to control the one or more devices 140 (e.g., a device or an appliance) , and/or to retrieve information related to the one or more devices 140 (e.g., information relating to energy consumption or the current status of one or more devices 140) .
In some embodiments, the user terminal 120 may include a mobile device 120-1, a tablet computer 120-2, a laptop computer 120-3, or the like, or any combination thereof. In some embodiments, the mobile device 120-1 may include a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, a built-in device in a motor vehicle, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an  augmented reality glass, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and/or the augmented reality device may include a Google Glass, an Oculus Rift, a Hololens, a Gear VR, etc. The built-in device in a motor vehicle may include an onboard computer, an onboard television, etc.
The server 130 may be configured to process information and/or data relating to the controlling system 100. For example, the server 130 may receive a control signal from the user terminal 120 and control the one or more devices 140 based on the control signal. In some embodiments, the server 130 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the server 130 may be local or remote. For example, the server 130 may access information and/or data stored in the master switch 110, the user terminal 120, the one or more devices 140, and/or the storage 150 via the network. As another example, the server 130 may be directly connected to the master switch 110, the user terminal 120, the devices 140, and/or the storage 150 to access stored information and/or data. In some embodiments, the server 130 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the devices 140 may be any appliance that may consume electricity and/or convert electricity to another form of energy including, for example, mechanical energy (including potential energy, kinetic energy), internal energy (heat), chemical energy, light, electromagnetic radiation, or the like, or a combination thereof. Exemplary devices may include an air conditioner 140-1, a washing machine 140-2, a television 140-3, a security device 140-4, etc. In some embodiments, the security device 140-4 may include a surveillance camera, an alarm, an electronic lock, etc. The security device 140-4 may monitor the environment and report certain events to the master switch 110. Exemplary events may include somebody approaching or entering through a door, someone entering the backyard, etc. The security device 140-4 may further receive instructions from the master switch 110 and execute the instructed operations including, for example, locking the door, turning on the alarm, notifying a person (e.g., an owner of a house) or an entity (e.g., a security department of a building, police), taking a photo or a video of a suspicious person, etc.
In some embodiments, the devices 140 may communicate with the master switch 110 and/or the slave switch 160 through an electrical connection (e.g., a smart plug). The smart plug may be a plug or socket connected to a network (e.g., WLAN). The electrical connection may be based on an electrical wire or another contact via a conductor. The smart plug may send or receive information through a wireless network such as Bluetooth, WLAN, Wi-Fi, ZigBee, etc. In some embodiments, the devices 140 may also be in communication with the master switch 110 and/or the slave switch 160 directly. The communication may be based on a wireless network such as Bluetooth, WLAN, Wi-Fi, ZigBee, etc. For example, an air conditioner 140-1 may have its own WLAN unit and report the monitored temperature and/or power consumption to the master switch 110 through a WLAN in the house.
The storage 150 may store data and/or instructions. In some embodiments, the storage 150 may store data obtained from the user terminal 120 and/or the device 140. In some embodiments, the storage 150 may store data and/or instructions that the master switch 110 and/or the server 130 may execute or use to perform exemplary methods described in the present disclosure. For example, the storage 150 may store holographic data associated with a master control interface of the master switch 110. As still another example, the storage 150 may store holographic data associated with a slave control interface of the slave switch 160. In some embodiments,  the storage 150 may include a mass storage, removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
In some embodiments, the storage 150 may communicate with one or more components in the controlling system 100 (e.g., the master switch 110, the user terminal 120, the server 130, the device 140, the slave switch 160) via the network. The one or more components in the controlling system 100 may access the data or instructions stored in the storage 150 via the network. In some embodiments, the storage 150 may be part of the master switch 110. In some embodiments, the storage 150 may be part of the server 130.
The slave switch 160 may be operably connected to the master switch 110 to control the device 140. In some embodiments, the device 140 may be operably connected to the slave switch 160. As used herein and unless otherwise specifically stated, “operably connected” may refer to the state that relevant elements/components are connected in such a way that they may cooperate to achieve their intended function or functions. The “connection” may be direct, or indirect, physical, remote, via a wired connection, or via a wireless connection, etc.
In some embodiments, the master switch 110 may be assigned a higher priority than the slave switch 160. For example, the master switch 110 may generate a master control signal for controlling a device 140, while the slave switch 160 may also generate a slave control signal for controlling the device 140. Since the master switch 110 has a higher priority than the slave switch 160, the master switch 110 may control the device 140, and the slave switch 160 may lose control of the device 140. In some embodiments, the master switch 110 and the slave switch 160 may be located in different locations. For example, when the controlling system 100 is within a house, the master switch 110 may be located in the living room, and the slave switches 160 (e.g., 160-1, 160-2, 160-3) may be located in individual rooms, such as a bedroom, a bathroom, or a kitchen.
In some embodiments, the controlling system 100 may further include a network. The network may be configured to connect one or more components of the controlling system 100. Merely by way of example, the network may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a Z-Wave network, a near field communication (NFC) network, or the like, or any combination thereof.
FIG. 2 is a block diagram illustrating an exemplary master switch 110 according to some embodiments of the present disclosure. It should be noted that the master switch 110 described below is merely provided for illustration purposes, and not intended to limit the scope of the present disclosure. As illustrated in FIG. 2, the master switch 110 may include a communication module 210, a control panel 220, a holographic projector 230, a sensor 240, a processor 250, and a memory 260.
The communication module 210 may be configured to facilitate the communication between the master switch 110 and one or more components of the controlling system 100 (e.g., the user terminal 120, the server 130, the one or more devices 140, the storage 150, the slave switch 160) . In some embodiments, the communication may be implemented by transmitting and/or receiving electromagnetic wave signals. For example, the master switch 110 may transmit a master control signal to the device 140 or the slave switch 160, via the communication module 210. In some embodiments, the master switch 110 may receive the information related to the operation of an  appliance (e.g., the device 140) from the slave switch 160 via the communication module 210.
In some embodiments, the communication module 210 may be configured to exchange information and/or data among one or more components within the master switch 110, such as the control panel 220, the holographic projector 230, the sensor 240, the processor 250, and/or the memory 260. For example, the processor 250 may access information stored in the memory 260 via the communication module 210.
The control panel 220 may be configured to receive a command or an instruction from a user. The control panel 220 may refer to a physical user interface. In some embodiments, the control panel 220 may include a button, a microphone, a display, an indicator lamp, a movable component, or the like, or any combination thereof. For example, the user may press on the button to control the one or more devices 140 associated with the button. As another example, the user may click an icon on the display to control the one or more devices 140 associated with the icon. As still another example, the user may slide the movable component to control the one or more devices 140 associated with the movable component in the control panel 220. In some embodiments, the control panel 220 may display a notification to the user. For example, the control panel 220 may display a message to the user notifying whether a command or an instruction has been executed accordingly or not.
In some embodiments, the user may control the control panel 220 through any type of wired or wireless network, or combination thereof. Merely by way of example, the network may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a Z-Wave network, a near field communication (NFC) network, or the like, or any combination thereof. For example, a user may access and/or activate the control panel 220 remotely through the user terminal 120. In some embodiments, the control panel 220 may be projected on a projection medium (e.g., a holographic film, a wall) by a holographic projector in the master switch 110 and/or the slave switch 160.
The holographic projector 230 (also referred to as the master holographic projector herein) may project a control interface at a short distance. The control interface may include one or more icons. The holographic projector 230 may project the control interface on a projection medium based on holographic data. In some embodiments, the holographic data may include Computer-Generated Holograms (CGH) related to the control interface. The projection medium may include a holographic film, a transparent and/or non-transparent glass medium, a wall, water drops, air, or the like, or any combination thereof. For example, the holographic projector 230 may project the control interface, and present the holographically projected control interface on the wall above the master switch 110.
In some embodiments, the holographic projector 230 may include a holographic projection device. In some embodiments, the holographic projection device may include a plurality of optical elements in an optical path. In some embodiments, the holographic projection device may include a micro-projection component. The micro-projection component may generate the holographic projection based on a micro-projection display technology. The micro-projection display technology may include Micro-Electro-Mechanical-System (MEMS) optical scanning micro-projection, Liquid Crystal Display (LCD) transmission micro-projection, Digital Light Processing (DLP), Liquid Crystal on Silicon (LCoS) reflective-projection display technology, or the like, or any combination thereof. An exemplary holographic projector is illustrated in FIG. 6 and described below. It is understood for persons having ordinary skills in the art that a holographic projection technology may be varied. All such variations are within the scope of the present disclosure.
In some embodiments, the holographic projector 230 may be located within the master switch 110. Alternatively, the holographic projector 230 may be mounted on a surface of the master switch 110. For example, the holographic projector 230 may be mounted on the top surface of the master switch 110, and the holographic projector 230 may project the master control interface on the wall above the master switch 110. In some embodiments, a size of the holographically projected control interface may be adjustable according to different application scenarios.
The sensor 240 (also referred to as the master sensor herein) may include at least one sensor configured to detect signals. The sensor may also convert the detected signals into other forms of information (e.g., electrical signals). For example, the sensor 240 may detect a gesture by a user. The sensor 240 may generate a signal according to the detected gesture, which may be transmitted to the processor 250. The processor 250 may generate a control signal according to the signal received from the sensor 240. The gesture may be associated with at least one master icon on the holographically projected master control interface. The sensor 240 may include a camera, an infrared detector, an ultrasonic detector, a microwave detector, or the like, or any combination thereof. For example, the sensor 240 may be a camera that acquires an image (or video) of the gesture, and the sensor 240 may send the acquired image data (e.g., 3D information of the image) to the processor 250 to determine a master control signal based on the gesture. As another example, the sensor 240 may include an infrared detector configured to detect the movement of the user (e.g., the user's finger movement, hand movement, facial expression) and generate a corresponding signal according to the detected movement. The sensor 240 may transmit the corresponding signal to the processor 250. The processor 250 may further generate a master control signal based on the signal. In some embodiments, the sensor 240 may be inside and/or outside of the master switch 110. For example, when the holographic projector 230 is mounted on the top surface of the master switch 110, the sensor 240 may also be mounted on the top surface of the master switch 110 to detect a gesture acted on the holographically projected control interface.
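The step of associating a detected gesture with an icon on the projected interface can be sketched as a hit test. The 2D simplification, icon bounding boxes, and all names below are illustrative assumptions, not the disclosure's implementation.

```python
# Hedged sketch: map a detected gesture position to a projected icon by
# checking which icon's bounding box contains the point.
def hit_test(point, icons):
    """Return the name of the icon whose bounding box contains `point`."""
    x, y = point
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # gesture landed outside every icon

icons = {
    "lamp_on": (0.0, 0.0, 1.0, 1.0),   # normalized interface coordinates
    "lamp_off": (1.5, 0.0, 2.5, 1.0),
}
print(hit_test((0.4, 0.6), icons))  # -> lamp_on
```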
In some embodiments, the sensor 240 may be further configured to receive an instruction for triggering the holographic projector to project the control interface. The sensor 240 may include a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, an acceleration sensor, or the like, or any combination thereof. For example, the voice sensor may receive a voice instruction from a user to activate and/or trigger the holographic projector to project the master control interface.
In some embodiments, the sensor 240 may detect or monitor the environment. The sensor 240 may also generate data based on the environment. Exemplary data generated by the sensor 240 according to the environment may include physical data, chemical data, and/or biological data relating to the ambient environment. The physical data may include temperature, light, motion, vibration, pressure, humidity, image, fingerprint, air quality, or the like, or any combination thereof. The chemical data may include a concentration of a gas or other chemicals (e.g., carbon monoxide, carbon dioxide, oxygen, hydrogen sulfide, ammonia, particle matters) in the air, etc. The biological data may include biological characteristics of people in the ambient environment of the master switch 110, such as blood pressure, heart rate, pulse rate, the concentration of blood sugar or insulin, or the like, or any combination thereof. The sensor 240 may send the detected or monitored information to the processor 250 for further processing. In some  embodiments, the sensor 240 may be an external device, not inside or on the surface of the master switch 110. The external sensor 240 may communicate with the master switch 110 or other components of the controlling system 100 via the communication module 210.
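One way the processor 250 might act on environmental data from the sensor 240 is by comparing readings against thresholds. The thresholds, reading keys, and device names below are illustrative assumptions for the sketch.

```python
# Hedged sketch: derive suggested control actions from environment readings.
# Thresholds and names are arbitrary examples, not values from the disclosure.
def evaluate_environment(readings):
    """Return a list of (device, command) suggestions for the readings."""
    actions = []
    if readings.get("smoke", 0.0) > 0.1:            # smoke concentration
        actions.append(("alarm", "on"))
    if readings.get("temperature_c", 20.0) > 28.0:  # ambient temperature
        actions.append(("air_conditioner", "on"))
    if readings.get("light_lux", 500.0) < 50.0:     # ambient light level
        actions.append(("lighting_device", "on"))
    return actions

print(evaluate_environment({"temperature_c": 30.0, "light_lux": 20.0}))
# -> [('air_conditioner', 'on'), ('lighting_device', 'on')]
```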
The processor 250 may process information and/or data relating to controlling an appliance (e.g., the device 140) to perform one or more functions described in the present disclosure. For example, the processor 250 may acquire holographic data relating to the holographically projected control interface and/or a gesture acted on the holographically projected control interface, and determine a control signal based on the gesture for controlling one or more devices associated with the master switch 110 or the slave switch 160. Specifically, the processor 250 may determine a three-dimensional (3D) coordinate system based on the holographically projected control interface. The holographically projected control interface may include at least one icon for controlling the one or more devices 140 associated with the master switch 110 and/or the slave switch 160. The processor 250 may determine a vector of the gesture associated with the at least one icon in the 3D coordinate system, and generate a control signal according to the determined vector of the gesture for controlling the one or more devices 140.
Merely by way of example, the processor 250 may be a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a microcontroller unit (MCU) , a digital signal processor (DSP) , a field-programmable gate array (FPGA) , an advanced RISC (reduced instruction set computing) machines (ARM) , or the like, or any combination thereof.
The memory 260 may be configured to store data and/or instructions associated with the master switch 110. In some embodiments, the memory 260 may store data obtained from one or more components of  the controlling system 100 (e.g., the user terminal 120, the server 130, the slave switch 160) . In some embodiments, the memory 260 may store instructions that the server 130 and/or the processor 250 may execute to perform exemplary methods described in the present disclosure. For example, the instructions may relate to acquiring holographic data, generating a holographically projected control interface, analyzing a gesture from a user acted on the holographically projected control interface, determining a control signal for controlling one or more devices, or the like, or any combination thereof.
Merely by way of example, the memory 260 may include a mass storage, removable storage, a volatile read-and-write memory, a read-only memory (ROM), or the like, or any combination thereof. Exemplary removable memory may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. Exemplary volatile read-and-write memory may include a random-access memory (RAM). Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), and a zero-capacitor RAM (Z-RAM), etc. Exemplary ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically-erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), and a digital versatile disk ROM, etc.
FIG. 3 is a block diagram illustrating an exemplary slave switch 160 according to some embodiments of the present disclosure. It should be noted that the slave switch 160 described below is merely provided for illustration purposes, and not intended to limit the scope of the present disclosure. As illustrated in FIG. 3, the slave switch 160 may include a communication module 310, a control panel 320, a holographic projector 330, and a sensor 340.
In some embodiments, the communication module 310 in the slave switch 160 may have the same or similar configuration and function as the communication module 210 in the master switch 110. For convenience, the communication module 310 may be referred to as the first communication module herein, and the communication module 210 may be referred to as the second communication module herein. In some embodiments, the first communication module in the slave switch 160 may be configured to exchange information and/or data with the second communication module in the master switch 110. The information and/or data may relate to an input from a user. The input may include one or more operations of a device (e.g., the device 140). For example, the sensor 340 may detect a first gesture by a first user acted on the control panel 320 and/or the holographically projected control interface (herein referred to as the slave control interface). The communication module 310 (herein referred to as the first communication module) in the slave switch 160 may then transmit the first gesture in the form of an electromagnetic wave signal to the communication module 210 (herein referred to as the second communication module) in the master switch 110. The processor 250 may further generate a slave control signal based on the first gesture for controlling one or more devices 140.
In some embodiments, the control panel 320 may have the same or similar configuration and function as the control panel 220 in the master switch 110. The slave control interface of the control panel 320 may have at least one slave icon. The user may apply one or more operations onto the at least one slave icon for controlling the corresponding device (e.g., the one or more devices 140) . In some embodiments, the operation may include a gesture, such as a pressing, a sliding, a rotating, or the like, or any combination thereof.
In some embodiments, the holographic projector 330 (also referred to as the slave holographic projector herein) may have the same or similar configuration and function as the holographic projector 230 in the master switch 110. The slave holographic projector may project the slave control interface based on holographic data relating to the slave control interface. For example, the processor 250 in the master switch 110 may determine the holographic data and transmit the holographic data to the slave switch 160 for projecting. In some embodiments, the position of the holographic projector 330 and/or structures inside the holographic projector 330 may be the same or similar to the holographic projector 230 described in FIG. 2 in the present disclosure.
In some embodiments, the sensor 340 (referred to as the slave sensor herein) may have the same or similar configuration and function as the sensor 240 in the master switch 110. In some embodiments, the sensor 340 may be configured to detect a first gesture by a first user. The sensor 340 may generate a signal according to the detected first gesture, which may be transmitted to the processor 250. The processor 250 may generate a control signal according to the signal received from the sensor 340. The first gesture may be associated with the at least one slave icon on the slave control interface holographically projected by the holographic projector 330. An exemplary sensor may include a camera, an infrared detector, an ultrasonic detector, a microwave detector, or the like, or any combination thereof. For example, the sensor 340 may be a camera that acquires an image of the gesture, and the sensor 340 may send the acquired image data (e.g., three-dimensional information of the image) to the processor 250 in the master switch 110 so as to generate a slave control signal based on the gesture. As another example, the sensor 340 may be an infrared detector that detects the movement of the gesture to generate a slave control signal. In some embodiments, the sensor 340 may be mounted on any appropriate location inside and/or outside of the slave switch 160. For example, when the holographic projector 330 is mounted on the top surface of the slave switch 160, the sensor 340 configured to detect gestures from the user may also be mounted on the top surface of the slave switch 160, so that the sensor 340 may detect a gesture acted on the control interface holographically projected by the holographic projector 330.
In some embodiments, the sensor 340 may be further configured to receive an instruction for triggering the holographic projector 330 (also referred to as the slave holographic projector) to project the slave control interface. The sensor 340 may include a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, an acceleration sensor, or the like, or any combination thereof. For example, the voice sensor may receive a voice instruction from a user to activate and/or trigger the slave holographic projector 330 to project the slave control interface.
FIG. 4 is a block diagram illustrating an exemplary communication module 400 according to some embodiments of the present disclosure. In some embodiments, the communication module 210 (also referred to as the second communication module) and/or the communication module 310 (also referred to as the first communication module) may have the same or similar configuration as the communication module 400. As shown in FIG. 4, the communication module 400 may include a WLAN unit 410, a ZigBee unit 420, a Z-wave unit 430, and a Bluetooth unit 440. It should be noted that the communication module 400 may have one or more other communication units. For example, the communication module 400 may further include a radio frequency communication unit.
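The communication module of FIG. 4, with its several radio units, can be sketched as a dispatcher that routes a payload to the selected unit. The class shape and string-returning stubs are assumptions for illustration; real units would be hardware radio drivers.

```python
# Hedged sketch of a communication module that dispatches a payload to one
# of several radio units (WLAN, ZigBee, Z-Wave, Bluetooth), as in FIG. 4.
class CommunicationModule:
    def __init__(self):
        # Each "unit" is modeled as a send function for this sketch.
        self.units = {
            "wlan": lambda data: f"wlan:{data}",
            "zigbee": lambda data: f"zigbee:{data}",
            "zwave": lambda data: f"zwave:{data}",
            "bluetooth": lambda data: f"bluetooth:{data}",
        }

    def transmit(self, protocol, data):
        """Send `data` over the named unit; reject unknown protocols."""
        if protocol not in self.units:
            raise ValueError(f"unsupported protocol: {protocol}")
        return self.units[protocol](data)

module = CommunicationModule()
print(module.transmit("zigbee", "lamp_on"))  # -> zigbee:lamp_on
```

Additional units (e.g., a radio frequency unit, as noted above) would register one more entry in the dispatch table.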
FIG. 5 is a block diagram illustrating an exemplary control panel 500 according to some embodiments of the present disclosure. In some embodiments, the control panel 220 in the master switch 110 and/or the control panel 320 in the slave switch 160 may have the same or similar configuration as the control panel 500. As shown in FIG. 5, the control panel 500 may include a button 510, a microphone 520, a display 530, and an indicator 540.
In some embodiments, the button 510 may be configured to connect or disconnect a control circuit. For example, a user may press the button 510 to activate or trigger the holographic projector of the master switch 110 or the slave switch 160. As another example, the user may press the button 510 to control one or more devices 140.
In some embodiments, the microphone 520 may be configured to detect sound and convert the detected sound to an electrical signal. For example, the user may input audio through the microphone 520. The audio may relate to any operation for controlling one or more devices (e.g., the device 140), the master switch 110, and/or the slave switch 160.
In some embodiments, the display 530 may display information of at least one component in the controlling system 100. For example, the display 530 may present a control interface, information relating to the operation of a device (e.g., the one or more devices 140, the master switch 110, a slave switch 160), a working state of a device (e.g., the one or more devices 140, the master switch 110, a slave switch 160), an alarm, or the like, or any combination thereof. An exemplary display may include a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, an electronic ink display, a Plasma Display Panel (PDP), or the like, or any combination thereof. In some embodiments, the display 530 may include a touch screen that both displays information and receives control instructions. For example, the display 530 may display a control interface of the one or more devices 140, and be touchable to receive an on/off instruction from a user.
The indicator 540 may be configured to notify a user. For example, the indicator 540 may emit light to notify the user of certain information relating to an alarm, a state of operation, a working status, or the like, or any combination thereof. As another example, the indicator 540 may present a color to notify the user of a state of the master switch 110. As still another example, the indicator 540 of the master switch 110 may emit green light when the master switch 110 operates normally, and red light when the master switch 110 operates abnormally. The indicator 540 may include a light-emitting diode (LED) lamp, a gas discharge lamp (for example, a neon lamp), an incandescent lamp, or any other light-emitting device or component, or any combination thereof.
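The green-for-normal, red-for-abnormal rule above amounts to a small state-to-color mapping. The state names and the "off" fallback are illustrative assumptions.

```python
# Hedged sketch of the indicator rule: green when the switch operates
# normally, red when it operates abnormally; state names are assumptions.
def indicator_color(state):
    return {"normal": "green", "abnormal": "red"}.get(state, "off")

print(indicator_color("normal"))    # -> green
print(indicator_color("abnormal"))  # -> red
```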
It should be noted that the above description is for illustration purposes only. For a person having ordinary skill in the art, based on the contents and principle of the present disclosure, the form and specifics of the control panel 500 may be modified or changed without departing from certain principles. For example, the button 510 may be replaced by one or more of a slide bar, a knob, a dial, or the like, or any combination thereof. Correspondingly, the user may slide the slide bar, or rotate the knob or dial, to input information to control one or more components (e.g., the master switch 110, the one or more devices 140, the slave switch 160). As another example, the control panel 500 included in the master switch 110 (also referred to as the control panel 220) may be different from that included in the slave switch 160 (also referred to as the control panel 320). The control panel 500 included in the master switch 110 may include the button 510, the microphone 520, the display 530, and the indicator 540, while the control panel 500 included in the slave switch 160 may only include the button 510.
FIG. 6 is a block diagram illustrating an exemplary holographic  projector 600 according to some embodiments of the present disclosure. In some embodiments, the holographic projector 230 in the master switch 110 and/or the holographic projector 330 in the slave switch 160 may have the same or similar configuration as the holographic projector 600. For persons having ordinary skills in the art, the holographic projector 600 may include one or more optical paths for holographic projection in principle. The one or more optical paths may be varied. All such variations are within the scope of the present disclosure. As shown in FIG. 6, the holographic projector 600 may include a light source 610, a lens 620, a beam splitter 630, and a spatial light modulator (SLM) 640.
In some embodiments, the light source 610 may emit a light beam. The light source 610 may include a light-emitting diode (LED), a laser, a halogen lamp, a fluorescent lamp, or the like, or any combination thereof. The lens 620 may be configured to reflect light and/or transmit light. The lens 620 may include a convex lens, a concave lens, a plane mirror, or the like, or any combination thereof. The beam splitter 630 may split the light beam into two or more beams, each aimed in a different direction. For example, the two or more beams may include an illumination beam and a reference beam. The illumination beam may be directed to an object that needs to be projected (e.g., the control interface) by the holographic projector 600. Some of the light scattered (or reflected) from the object may fall onto the projection medium. The reference beam may be directed onto the projection medium. The illumination beam and the reference beam may reach the projection medium, and intersect and/or interfere with each other to generate the projected control interface. In some embodiments, the spatial light modulator (SLM) 640 may impose some form of spatially-varying modulation on the light beam. For example, the SLM 640 may modulate the intensity or the phase of the light beam individually, or both the intensity and the phase of the light beam simultaneously. The SLM 640 may encode the holographic data into the light beam for further generating the holographically projected control interface. The holographic data may relate to the user interface of the master switch 110.
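The interference of the reference beam with light from the object can be illustrated with the textbook intensity pattern of two coherent plane waves of equal amplitude: I(x) = 2A²(1 + cos(kx·sinθ)), with k = 2π/λ. The wavelength and beam angle below are arbitrary example values, not parameters from the disclosure.

```python
import math

# Intensity of two interfering coherent plane waves of equal amplitude A:
#   I(x) = 2 * A**2 * (1 + cos(k * x * sin(theta))), with k = 2*pi/wavelength,
# where theta is the angle between the beams. Values are arbitrary examples.
def fringe_intensity(x, wavelength=633e-9, theta=0.01, amplitude=1.0):
    k = 2 * math.pi / wavelength
    return 2 * amplitude**2 * (1 + math.cos(k * x * math.sin(theta)))

# Fringe spacing on the medium: d = wavelength / sin(theta)
d = 633e-9 / math.sin(0.01)
print(fringe_intensity(0.0))              # bright fringe: 4.0
print(round(fringe_intensity(d / 2), 6))  # dark fringe: 0.0
```

Recording this fringe pattern is what stores the object information in a hologram; an SLM reproduces an equivalent computed pattern directly.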
For persons having ordinary skills in the art, the one or more components described above, such as the light source 610, the lens 620, the beam splitter 630, and the SLM 640, may be integrated together in a chip to achieve the holographic projection. For example, the holographic projector 600 may include a micro-projection system. The micro-projection system may be a system on a chip (SoC). The micro-projection system may generate the holographically projected control interface based on a micro-projection display technology. In some embodiments, the micro-projection display technology may include Micro-Electro-Mechanical-System (MEMS) optical scanning micro-projection, Liquid Crystal Display (LCD) transmission micro-projection, Digital Light Processing (DLP), Liquid Crystal on Silicon (LCoS) reflective-projection display technology, or the like, or any combination thereof.
FIG. 7 is a block diagram illustrating an exemplary processor 250 according to some embodiments of the present disclosure. The processor 250 may include a holographic data determination unit 710, a gesture data acquisition unit 720, and a control signal determination unit 730. Each unit may be a hardware circuit that is designed to perform the following actions, a set of instructions stored in one or more storage media (e.g., the storage 150, or the memory 260) , and/or a combination of the hardware circuit and the one or more storage media.
In some embodiments, the holographic data determination unit 710 may be configured to determine the holographic data relating to the master control interface of a master switch 110. The master control interface may have at least one master icon. In some embodiments, each of the at least one master icon may be configured to control a device of the one or more devices 140. The device may include at least one of an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, a robot, or the like, or any combination thereof. In some embodiments, the holographic data determination unit 710 may determine the holographic data relating to the master control interface based on Computer-Generated Holograms (CGH). In other words, the holographic data determination unit 710 may encode the master control interface into the CGH for determining the holographic data. In some embodiments, the holographic data determination unit 710 may send the holographic data through the communication module 210 to the master holographic projector (e.g., the holographic projector 230) for further projecting the master control interface.
In some embodiments, the holographic data determination unit 710 may determine first holographic data relating to the slave control interface, the slave control interface having at least one slave icon. The slave icon may be the same or similar to the master icon. The holographic data determination unit 710 may send the determined first holographic data to the slave holographic projector (e.g., the holographic projector 330 in the slave switch 160) for projecting the slave control interface. For example, after a first user activates or triggers the slave switch, the holographic data determination unit 710 may determine the first holographic data relating to the slave control interface of the slave switch 160 and transmit the first holographic data to the slave holographic projector (i.e., the holographic projector 330) for projecting the slave control interface.
The gesture data acquisition unit 720 may be configured to obtain the data related to a gesture by a user. The gesture may be associated with the at least one icon (e.g., the master icon in the master control interface, the slave icon in the slave control interface) . The gesture data may include starting coordinates of the gesture, ending coordinates of the gesture, a  movement direction, a vector of the gesture, or the like, or any combination thereof. In some embodiments, the gesture data acquisition unit 720 may obtain the gesture data from at least one sensor (e.g., the sensor 240 in the master switch 110, the sensor 340 in the slave switch 160) . The at least one sensor may be configured to detect the gesture acted on the holographically projected control interface. The at least one sensor may include at least one of a camera, an infrared detector, an ultrasonic detector or a microwave detector.
In some embodiments, the gesture data acquisition unit 720 may obtain first gesture data from the slave switch 160. The first gesture data may refer to the data (e.g., location data) associated with the first gesture. The first gesture may be associated with at least one slave icon. For example, the first gesture may be performed by a first user by pressing on a slave icon. The slave sensor (e.g., the sensor 340) may detect the first gesture by the first user, and a first communication module (e.g., the communication module 310 in the slave switch 160) may transmit the detected first gesture to a second communication module (e.g., the communication module 210 in the master switch 110). The gesture data acquisition unit 720 may obtain the first gesture from the second communication module. In some embodiments, the gesture data acquisition unit 720 may obtain second gesture data from the master switch 110. The second gesture data may refer to the data (e.g., location data) associated with the second gesture. The second gesture may be associated with at least one master icon. For example, the second gesture may be performed by a second user by rotating a master icon. The master sensor (also referred to as the sensor 240) may detect the second gesture by the second user and transmit the second gesture data to the gesture data acquisition unit 720 via the second communication module (e.g., the communication module 210 in the master switch 110).
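The gesture data described above (starting coordinates, ending coordinates, and a derived vector) can be modeled as a small record type. The field names and the coordinate convention are assumptions for illustration.

```python
from dataclasses import dataclass

# Hedged sketch of the gesture data: start and end coordinates in the
# interface's 3D coordinate system, with the movement vector derived.
@dataclass
class GestureData:
    start: tuple  # (x, y, z) starting coordinates of the gesture
    end: tuple    # (x, y, z) ending coordinates of the gesture

    @property
    def vector(self):
        """Movement vector of the gesture: end minus start, per axis."""
        return tuple(e - s for s, e in zip(self.start, self.end))

g = GestureData(start=(0.0, 0.0, 0.0), end=(0.0, 2.0, 0.0))
print(g.vector)  # upward swipe: (0.0, 2.0, 0.0)
```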
The control signal determining unit 730 may be configured to generate a control signal for controlling one or more devices 140. In some embodiments, the control signal determining unit 730 may generate the control signal based on the gesture by a user. For example, the control signal determining unit 730 may determine a slave control signal based on the first gesture associated with the slave switch 160. As another example, the control signal determining unit 730 may determine a master control signal based on the second gesture associated with the master switch 110.
In some embodiments, the control signal determining unit 730 may first determine a 3D coordinate system based on the control interface of a switch (e.g., the master control interface of the master switch 110, the slave control interface of the slave switch 160) , and then determine a vector, in the 3D coordinate system, of the gesture associated with at least one icon. The gesture may include the first gesture associated with the at least one slave switch 160 performed by the first user, and/or the second gesture associated with the at least one master switch 110 performed by the second user. The control signal determining unit 730 may determine the control signal based on the vector of the gesture.
FIG. 8-A is a schematic diagram illustrating exemplary locations of a master switch 110 and a projected control interface according to some embodiments of the present disclosure. The holographic projector 230 may be mounted on the top surface of the master switch 110, and the control interface (also referred to as the master control interface) of the master switch 110 may be projected somewhere above the master switch 110. For example, the master switch 110 may be mounted on the wall of a living room, and the control interface (also referred to as the master control interface) of the master switch 110 may be projected on the wall right above the master switch 110. The projected control interface (also referred to as the master control interface) may have a set of icons (also referred to as the master icons) for controlling the one or more devices 140.
FIG. 8-B is a schematic diagram illustrating exemplary locations of a slave switch 160 and a projected control interface according to some embodiments of the present disclosure. The holographic projector 330 may be mounted on the top surface of the slave switch 160, and the control interface of the slave switch 160 may be projected somewhere above the slave switch 160. For example, the slave switch 160 may be mounted on the wall of a bedroom, and the control interface (also referred to as the slave control interface) of the slave switch 160 may be projected on the wall right above the slave switch 160. The holographically projected control interface (also referred to as the slave control interface) may have a set of icons (also referred to as the slave icons) for controlling one or more devices 140.
In some embodiments, the slave switch 160 may be placed in a different location from the master switch 110. For example, the master switch 110 may be set in the living room, and the slave switch 160 (e.g., 160-1, 160-2) may be placed in another room, for example, a bedroom, a bathroom, or a kitchen. In some embodiments, the control panel 220 of the master switch 110 in FIG. 8-A may have a different configuration than the control panel 320 of the slave switch 160 in FIG. 8-B. For example, the control panel 220 of the master switch 110 may include a plurality of buttons, and the control panel 320 of the slave switch 160 may include only one button. It should be noted that FIG. 8-A and FIG. 8-B merely illustrate exemplary locations of the switch (the master switch or the slave switch) and the control interface (the master control interface or the slave control interface) . Other locations, such as the control interface being projected under the switch, to the right of the switch, or to the left of the switch, or any other relative locations, may fall within the scope of the present disclosure.
FIG. 9 is a flowchart illustrating an exemplary process for controlling a device according to some embodiments of the present disclosure. The process 900 may be executed by at least one processor in the controlling system 100 (e.g., the server 130 or the processor 250 in the master switch 110) . For example, the process 900 may be implemented as a set of instructions (e.g., an application) stored in a non-transitory computer-readable storage medium (e.g., the storage 150 or the memory 260) . The processor may execute the instructions and may accordingly be directed to perform the process 900 via receiving and/or sending electronic signals or electrical currents.
In 910, the processor 250 (e.g., the control signal determination unit 730) may determine a three-dimensional (3D) coordinate system based on a control interface of a switch.
The control interface may refer to a holographically projected control interface (e.g., a master control interface, a slave control interface) . The control interface may have at least one icon (e.g., at least one master icon, at least one slave icon) . The switch may include the master switch 110 and at least one slave switch 160. The holographic projector (e.g., the holographic projector 230 or the holographic projector 330) may project the control interface onto a projection medium. In some embodiments, as shown in FIGs. 8-A and 8-B, the control interface may be projected onto a projection medium (e.g., a holographic film, a transparent and/or non-transparent glass medium, a wall, a water drop, air) above the switch (e.g., the master switch 110, or the slave switch 160) .
In the coordinate system, the X-Y plane may be a plane of the control interface parallel to the projection medium, and the Z-axis may be perpendicular to the projection medium. In some embodiments, a geometric center of the control interface may be the origin of the coordinate system, and the processor 250 may determine the 3D coordinate system, including X, Y, and Z axes, based on the origin. An icon in the at least one icon may correspond to three coordinates in the 3D coordinate system. For example, an icon A associated with a lighting device may be located in the 3D coordinate system at the coordinates (Xa, Ya, Za) .
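The coordinate system and icon positions described above can be modeled with a short sketch. The data structures below (Icon, ControlInterface) are assumptions made for illustration and are not part of the disclosure; icon coordinates are stored relative to the geometric center of the control interface, which serves as the origin.

```python
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float  # X-Y plane is parallel to the projection medium
    y: float
    z: float  # Z-axis is perpendicular to the projection medium

class ControlInterface:
    """A holographically projected control interface; the geometric
    center of the interface is the origin (0, 0, 0) of the 3D system."""
    def __init__(self, icons):
        self.icons = icons

    def icon_at(self, name):
        for icon in self.icons:
            if icon.name == name:
                return icon
        raise KeyError(name)

# Two hypothetical icons, positioned left and right of the origin.
interface = ControlInterface([Icon("light", -2.0, 1.0, 0.0),
                              Icon("tv", 2.0, 1.0, 0.0)])
print(interface.icon_at("light").x)  # → -2.0
```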
In 920, the processor 250 (e.g., the control signal determination unit 730) may determine a vector of a gesture associated with at least one icon in the 3D coordinate system from a user.
In some embodiments, the gesture may be performed by the user on the control interface for controlling one or more devices 140. For example, the gesture may include a pressing, a rotating, a sliding, or the like, or any combination thereof. At least one sensor (e.g., the sensor 240, or the sensor 340) in the switch (e.g., the master switch 110 or the slave switch 160) may detect the gesture. The gesture may include a first gesture performed by the first user on the slave control interface, and a second gesture performed by the second user on the master control interface.
The processor 250 may first determine starting coordinates of the gesture, ending coordinates of the gesture, and a direction of the gesture in the 3D coordinate system. The processor 250 may then determine the vector of the gesture based on the starting coordinates, the ending coordinates, and the direction of the gesture.
For example, a user may perform a sliding gesture on an icon A on the control interface, and the at least one sensor (e.g., the sensor 240, or the sensor 340) may detect the sliding gesture and transmit the sliding gesture data to the processor 250. The processor 250 may determine a 3D coordinate system based on the control interface. The processor 250 may also determine the starting coordinates of the gesture as (X1, Y1, Z1) and the ending coordinates of the gesture as (X2, Y2, Z2) in the 3D coordinate system. The processor 250 may further determine the vector of the sliding gesture based on the starting coordinates (X1, Y1, Z1) and the ending coordinates (X2, Y2, Z2) . In some embodiments, the vector of the sliding gesture may be denoted as (X2-X1, Y2-Y1, Z2-Z1) . In some embodiments, the sliding gesture may be in the plane parallel to the projection medium, which means Z1 and Z2 are zero. In that case, the vector of the sliding gesture may be (X2-X1, Y2-Y1, 0) .
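The vector computation above is a simple end-minus-start subtraction, which can be sketched as follows (the function name is an assumption made for illustration):

```python
def gesture_vector(start, end):
    """Vector of a gesture from its starting coordinates to its ending
    coordinates in the 3D coordinate system of the control interface."""
    return tuple(e - s for s, e in zip(start, end))

# A sliding gesture in the plane parallel to the projection medium,
# so the Z components of both endpoints are zero:
v = gesture_vector((1.0, 2.0, 0.0), (4.0, 2.0, 0.0))
print(v)  # → (3.0, 0.0, 0.0)
```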
In 930, the processor 250 (e.g., the control signal determination unit 730) may generate a control signal based on the vector of the gesture for controlling one or more devices 140. In some embodiments, the processor 250 may determine the control signal based on the vector of the gesture and the coordinates of the at least one icon in the control interface. For example, the processor 250 may determine the icon that the gesture acts on according to the starting coordinates of the gesture and the coordinates of the icon. The icon that the gesture acts on may be the icon whose coordinates are nearest to the starting coordinates of the gesture. The processor 250 may also generate a control signal based on the vector of the gesture. For example, the vector may include a moving direction and a moving range of the gesture associated with the icon. For instance, the processor 250 may turn up the volume of a television by two levels based on a vector indicating a rightward movement of about two centimeters.
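The two steps above, selecting the icon nearest the gesture's starting coordinates and mapping the vector to a device command, can be sketched as follows. The icon positions and the one-centimeter-per-volume-level scale are assumptions made for this example; the disclosure only gives the illustration that a rightward movement of about two centimeters maps to two volume levels.

```python
import math

# Hypothetical icon coordinates in the interface's 3D coordinate system.
ICONS = {"tv_volume": (0.0, 0.0, 0.0), "light": (5.0, 0.0, 0.0)}
CM_PER_LEVEL = 1.0  # assumed scale: 1 cm of movement per volume level

def nearest_icon(start):
    # The icon the gesture acts on is the one whose coordinates are
    # nearest to the gesture's starting coordinates.
    return min(ICONS, key=lambda name: math.dist(ICONS[name], start))

def control_signal(start, vector):
    # Map the vector's X component (rightward movement, in cm) to a
    # signed number of volume levels for the selected device.
    icon = nearest_icon(start)
    steps = round(vector[0] / CM_PER_LEVEL)
    return {"target": icon, "command": "volume", "delta": steps}

# A gesture starting near the volume icon and sliding ~2 cm to the right:
sig = control_signal((0.2, 0.1, 0.0), (2.0, 0.0, 0.0))
print(sig["target"], sig["delta"])  # → tv_volume 2
```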
In some embodiments, the processor 250 (e.g., the control signal determination unit 730) may generate a slave control signal based on the first gesture by the first user. For example, the slave holographic projector 330 may project the control interface (also referred to as the slave control interface herein) of the slave control panel 320. The slave control interface may have at least one slave icon. The sensor 340 may detect the first gesture performed on the slave control interface, and the first gesture may be associated with the at least one slave icon. The first gesture may be transmitted to the master switch 110 by the first communication module 310. The master switch 110 may receive the first gesture from the first communication module 310 via the second communication module 210. Then the processor 250 may determine a 3D coordinate system based on the slave control interface, determine a vector of the first gesture in the 3D coordinate system, and generate the slave control signal for controlling the one or more devices 140 according to the determined vector of the first gesture.
In some embodiments, the processor 250 (e.g., the control signal determination unit 730) may generate a master control signal based on the second gesture by the second user. For example, the master holographic projector 230 may project the master control interface on the wall above the master switch 110, the master control interface having at least one master icon. The sensor 240 may detect the second gesture performed on the master control interface, and the second gesture may be associated with the at least one master icon. The processor 250 may determine a 3D coordinate system based on the projected master control interface. The processor 250 may determine the vector of the second gesture in the 3D coordinate system. The processor 250 may further generate the master control signal according to the determined vector of the second gesture.
In some embodiments, the processor 250 may assign the master control signal a higher control priority than the slave control signal. In other words, when the processor 250 determines the master control signal and the slave control signal simultaneously for controlling the same device, the processor 250 may control the device based on the master control signal.
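The priority rule above can be sketched as a simple arbitration over simultaneous control signals: for each device, a master-originated signal overrides a slave-originated one. The signal representation is an assumption made for illustration.

```python
def resolve(signals):
    """Pick one control signal per device from a batch of simultaneous
    signals; a 'master' signal takes priority over a 'slave' signal."""
    chosen = {}
    for sig in signals:
        device = sig["device"]
        current = chosen.get(device)
        # Keep the first signal seen, unless a master signal arrives
        # for a device currently held by a slave signal.
        if current is None or (sig["source"] == "master"
                               and current["source"] == "slave"):
            chosen[device] = sig
    return chosen

simultaneous = [
    {"source": "slave", "device": "light", "command": "off"},
    {"source": "master", "device": "light", "command": "on"},
]
winner = resolve(simultaneous)
print(winner["light"]["command"])  # → on
```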
In 940, the processor 250 (e.g., the communication module 210) may transmit the control signal to the one or more devices. For example, the processor 250 may transmit the control signal in the form of an electromagnetic wave using any suitable communication protocol via the network. Exemplary communication protocols may include Transmission Control Protocol (TCP) , Internet Protocol (IP) , TCP/IP, Internetwork Packet Exchange Protocol (IPX) , Sequenced Packet Exchange Protocol (SPX) , IPX/SPX, or the like, or any combination thereof.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment,” “an embodiment,” and/or “some embodiments” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, and micro-code) , or a combination of software and hardware implementations that may all generally be referred to herein as a “block,” “module,” “engine,” “unit,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
A computer-readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, or the like, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, or the like; conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) , in a cloud computing environment, or offered as a service such as Software as a Service (SaaS) .
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefor, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof to streamline the disclosure, aiding in the understanding of one or more of the various embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.
In some embodiments, the numbers expressing quantities of ingredients, properties, and so forth, used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about,” “approximate,” or “substantially.” For example, “about,” “approximate,” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated. Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.
Each of the patents, patent applications, publications of patent applications, and other material, such as articles, books, specifications, publications, documents, things, and/or the like, referenced herein is hereby incorporated herein by this reference for all purposes, excepting any prosecution file history associated with same, any of same that is inconsistent with or in conflict with the present document, or any of same that may have a limiting effect as to the broadest scope of the claims now or later associated with the present document. By way of example, should there be any inconsistency or conflict between the description, definition, and/or the use of a term associated with any of the incorporated material and that associated with the present document, the description, definition, and/or the use of the term in the present document shall prevail.
In closing, it is to be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the application. Other modifications that may be employed may be within the scope of the application. Thus, by way of example, but not of limitation, alternative configurations of the embodiments of the application may be utilized in accordance with the teachings herein. Accordingly, embodiments of the present application are not limited to precisely that which is shown and described.

Claims (37)

  1. A system comprising:
    a holographic projector configured to project a control interface, the control interface having at least one icon;
    at least one sensor configured to detect a gesture by a user, the gesture being associated with the at least one icon; and
    at least one processor configured to:
    determine a 3D coordinate system based on the control interface;
    determine a vector of the gesture associated with the at least one icon in the 3D coordinate system; and
    generate a control signal according to the determined vector of the gesture for controlling one or more devices.
  2. The system of claim 1, further comprising a communication module configured to transmit the control signal to the one or more devices.
  3. The system of claim 2, wherein the communication module includes at least one of: a Wireless Local Area Network (WLAN) unit, a ZigBee unit, a Z-wave unit, or a Bluetooth unit.
  4. The system of claim 1, wherein the at least one sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  5. The system of claim 1, wherein the at least one sensor is further configured to receive an instruction for triggering the holographic projector to project the control interface.
  6. The system of claim 5, wherein the at least one sensor configured to  receive the instruction includes at least one of: a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, or an acceleration sensor.
  7. The system of claim 1, wherein the holographic projector is mounted on a top surface of a switch, and the at least one sensor is mounted on the top surface of the switch.
  8. The system of claim 1, wherein the one or more devices include at least one of: an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
  9. A system comprising:
    a master switch; and
    at least one slave switch, wherein:
    a slave switch of the at least one slave switch includes:
    a slave holographic projector configured to project a slave control interface, the slave control interface having at least one slave icon,
    at least one slave sensor configured to detect a first gesture by a first user, the first gesture being associated with the at least one slave icon, and
    a first communication module configured to transmit the first gesture to the master switch; and
    the master switch includes:
    a second communication module configured to receive the first gesture from the first communication module, and
    at least one processor configured to generate a slave control  signal based on the first gesture for controlling one or more devices.
  10. The system of claim 9, wherein to generate the slave control signal, the at least one processor is further configured to:
    determine a 3D coordinate system based on the slave control interface;
    determine a vector of the first gesture associated with the at least one slave icon in the 3D coordinate system; and
    generate the slave control signal according to the determined vector of the first gesture.
  11. The system of claim 9, wherein the master switch further includes:
    a master holographic projector configured to project a master control interface, the master control interface having at least one master icon; and
    at least one master sensor configured to detect a second gesture by a second user, the second gesture being associated with the at least one master icon.
  12. The system of claim 11, wherein the at least one processor is further configured to generate a master control signal based on the second gesture for controlling the one or more devices.
  13. The system of claim 12, wherein to generate the master control signal, the at least one processor is further configured to:
    determine a 3D coordinate system based on the master control interface;
    determine a vector of the second gesture associated with the at least one master icon in the 3D coordinate system; and
    generate the master control signal according to the determined vector of the second gesture.
  14. The system of claim 12, wherein the at least one processor is configured to assign the master control signal with a higher control priority over the slave control signal.
  15. The system of claim 12, wherein the master switch further includes a third communication module configured to transmit the slave control signal or the master control signal to the one or more devices.
  16. The system of claim 9, wherein the at least one slave sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  17. The system of claim 11, wherein the at least one master sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  18. The system of claim 9, wherein the one or more devices include at least one of: an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
  19. A method, comprising:
    determining a 3D coordinate system based on a control interface, the control interface being a holographically projected control interface projected by a holographic projector and having at least one icon;
    determining a vector of a gesture by a user, the gesture being detected by at least one sensor and associated with the at least one icon; and
    generating a control signal according to the determined vector of the gesture for controlling one or more devices.
  20. The method of claim 19, further comprising transmitting, by a communication module, the control signal to the one or more devices.
  21. The method of claim 20, wherein the communication module includes at least one of: a Wireless Local Area Network (WLAN) unit, a ZigBee unit, a Z-wave unit, or a Bluetooth unit.
  22. The method of claim 19, wherein the at least one sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  23. The method of claim 19, further comprising:
    receiving, by the at least one sensor, an instruction for triggering the holographic projector to project the control interface.
  24. The method of claim 23, wherein the at least one sensor configured to receive the instruction includes at least one of: a voice sensor, a temperature sensor, a humidity sensor, a smoke sensor, a light sensor, a camera, a proximity sensor, an energy spectrum sensor, a 3D pressure sensor, or an acceleration sensor.
  25. The method of claim 19, wherein the holographic projector is mounted on a top surface of a switch, and the at least one sensor is mounted on the top surface of the switch.
  26. The method of claim 19, wherein the one or more devices include at least one of: an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
  27. A method, comprising:
    determining a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector;
    determining a vector of a first gesture by a first user associated with at least one slave icon in the 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch; and
    generating a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
  28. The method of claim 27, wherein the master switch includes:
    a master holographic projector configured to project a master control interface, the master control interface having at least one master icon; and
    at least one master sensor configured to detect a second gesture by a second user, the second gesture being associated with the at least one master icon.
  29. The method of claim 28, further comprising:
    generating a master control signal based on the second gesture for controlling the one or more devices.
  30. The method of claim 29, wherein the generating the master control signal further comprises:
    determining a 3D coordinate system based on the master control interface;
    determining a vector of the second gesture associated with the at least one master icon in the 3D coordinate system; and
    generating the master control signal according to the determined vector of  the second gesture.
  31. The method of claim 29, further comprising:
    assigning the master control signal with a higher control priority over the slave control signal.
  32. The method of claim 29, wherein the master switch is further configured to transmit the slave control signal or the master control signal to the one or more devices.
  33. The method of claim 27, wherein the at least one slave sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  34. The method of claim 28, wherein the at least one master sensor includes at least one of: a camera, an infrared detector, an ultrasonic detector, or a microwave detector.
  35. The method of claim 27, wherein the one or more devices include at least one of: an air conditioner, a television, a refrigerator, an oven, a washing machine, a media player, a monitor, a doorbell, an alarm, a lighting device, a curtain, a lock, or a robot.
  36. A non-transitory computer readable medium, comprising at least one set of instructions for controlling one or more devices, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to:
    determine a 3D coordinate system based on a control interface, wherein the control interface is holographically projected by a holographic projector  and includes at least one icon;
    determine a vector of a gesture by a user associated with the at least one icon in the 3D coordinate system, wherein the gesture is detected by at least one sensor; and
    generate a control signal according to the determined vector of the gesture for controlling one or more devices.
  37. A non-transitory computer readable medium, comprising at least one set of instructions for controlling one or more devices, wherein when executed by at least one processor of a computer device, the at least one set of instructions directs the at least one processor to:
    determine a 3D coordinate system based on a slave control interface, wherein the slave control interface is holographically projected by a slave holographic projector;
    determine a vector of a first gesture by a first user associated with at least one slave icon in the 3D coordinate system, wherein the first gesture is detected by at least one slave sensor and transmitted to a master switch; and
    generate a slave control signal according to the determined vector of the first gesture for controlling one or more devices.
PCT/CN2020/070989 2020-01-08 2020-01-08 Systems and methods for controlling appliances WO2021138848A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/070989 WO2021138848A1 (en) 2020-01-08 2020-01-08 Systems and methods for controlling appliances

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/070989 WO2021138848A1 (en) 2020-01-08 2020-01-08 Systems and methods for controlling appliances

Publications (1)

Publication Number Publication Date
WO2021138848A1 2021-07-15

Family

ID=76787645

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/070989 WO2021138848A1 (en) 2020-01-08 2020-01-08 Systems and methods for controlling appliances

Country Status (1)

Country Link
WO (1) WO2021138848A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107300867A (en) * 2017-06-30 2017-10-27 广东美的制冷设备有限公司 Project the control method of touch-control control device, household electrical appliance and household electrical appliance
CN108681289A (en) * 2018-07-06 2018-10-19 广州天通智能技术有限公司 A kind of intelligent radio switch
CN208027099U (en) * 2017-07-28 2018-10-30 美的智慧家居科技有限公司 A kind of intelligent appliance control system based on holographic technique
WO2019143903A1 (en) * 2018-01-18 2019-07-25 Chia Ming Chen Color mixing from different light sources
CN110321003A (en) * 2019-05-30 2019-10-11 苏宁智能终端有限公司 Smart home exchange method and device based on MR technology


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115095843A (en) * 2022-06-24 2022-09-23 中国第一汽车股份有限公司 Car lamp structure capable of realizing sound and light integration and control method thereof
CN115095843B (en) * 2022-06-24 2023-11-10 中国第一汽车股份有限公司 Car lamp structure capable of realizing sound and light integration and control method thereof

Similar Documents

Publication Publication Date Title
US20210165555A1 (en) User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
KR102498451B1 (en) Electronic device and method for provideing information in the electronic device
KR102498364B1 (en) Electronic device and method for provideing information in the electronic device
US11688140B2 (en) Three dimensional virtual room-based user interface for a home automation system
CN110168485B (en) Augmented reality control of internet of things devices
CN106030673B (en) Vision and auditory user notification method for intelligent household's hazard detector
KR102265086B1 (en) Virtual Environment for sharing of Information
US20180232571A1 (en) Intelligent assistant device communicating non-verbal cues
KR102537543B1 (en) Intelligent electronic device and operating method thereof
US20210358294A1 (en) Holographic device control
CN108370488B (en) Audio providing method and apparatus thereof
US20160343173A1 (en) Acousto-optical display for augmented reality
US10802668B2 (en) Small screen virtual room-based user interface
KR102501384B1 (en) Electronic device and method for controlling operation thereof
KR102651726B1 (en) Electronic device including light emitting device and operating method thereof
KR20170052976A (en) Electronic device for performing motion and method for controlling thereof
WO2021138848A1 (en) Systems and methods for controlling appliances
CN105892572A (en) Method and apparatus for displaying content
CN115496850A (en) Household equipment control method, intelligent wearable equipment and readable storage medium
US10958894B2 (en) Image processing method and electronic device supporting image processing
KR102333931B1 (en) Video projector and operating method thereof
EP4062265B1 (en) Brain-computer interface
CN111033606A (en) Information processing apparatus, information processing method, and program
EP3814852B1 (en) Wall clock ai voice assistant
Schenkluhn et al. Augmented Reality-based Indoor Positioning for Smart Home Automations

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application
     Ref document number: 20911480; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
     Ref country code: DE
122  Ep: pct application non-entry in european phase
     Ref document number: 20911480; Country of ref document: EP; Kind code of ref document: A1