WO2019189965A1 - IoT device control system and method using virtual reality and augmented reality - Google Patents
IoT device control system and method using virtual reality and augmented reality
- Publication number
- WO2019189965A1 (PCT/KR2018/003755)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- iot device
- user terminal
- posture
- user
- location
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 24
- 230000003190 augmentative effect Effects 0.000 title claims abstract description 15
- 238000004891 communication Methods 0.000 description 9
- 238000010586 diagram Methods 0.000 description 8
- 238000005286 illumination Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000010348 incorporation Methods 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000005406 washing Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y10/00—Economic sectors
- G16Y10/75—Information technology; Communication
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/30—Control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
Definitions
- the present invention relates to an IoT device control system and method, and more particularly, to an IoT device control system and method using virtual reality and augmented reality.
- the term Internet of Things first appeared in 1998 at the Auto-ID Lab of the Massachusetts Institute of Technology (MIT). Later, in 2005, the ITU-T published the annual report 'The Internet of Things,' which predicted that the Internet of Things would be the most basic framework for all structures of the future information technology (IT) industrial revolution.
- the report defines the Internet of Things as "a new ICT foundation that connects all things in the world to networks, enabling humans and things to communicate with each other anytime, anywhere.”
- the Internet of Things can be seen as an infrastructure for implementing a ubiquitous space. This ubiquitous space begins with the incorporation of computing devices with specific functions into the environment and into things, whereby the environment and the things themselves become intelligent.
- conventionally, an application for controlling an IoT device is installed on a smartphone or PC, and the device is controlled through that application.
- for example, a dedicated application installed on a device such as a smartphone or PC may be run to check information on a smart light and control it.
- conventionally, the control target has been selected by presenting the user with a list of controllable devices.
- however, it is inconvenient for the user to pick the control target device from such a device list.
- the technical problem to be solved by the present invention is to provide a system and method that allow a user to conveniently identify and control a target IoT device using virtual reality and augmented reality.
- a method for controlling an IoT device using virtual reality and augmented reality includes: photographing, by a user terminal, a real space including one or more IoT devices; registering at least one of the location and the posture of the user terminal at the time of capture by matching it to an IoT device selected by the user; displaying a control screen for the selected IoT device when the user terminal satisfies a condition determined using at least one of the registered location and posture matched to the selected IoT device; and controlling the selected IoT device according to a user input through the control screen.
- the condition may be satisfied if the user terminal is located within a location range determined based on the location of the registered user terminal matching the selected IoT device.
- the condition may be satisfied if the user terminal is located within a location range determined based on the location of the registered user terminal matching the selected IoT device, and the posture of the user terminal is within a posture range determined based on the posture of the registered user terminal matching the selected IoT device.
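As an illustration only (the patent does not specify thresholds or coordinate frames), the location-and-posture condition above can be sketched as a simple predicate; the `Pose` fields, radius, and yaw tolerance below are assumptions introduced for the sketch:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float          # metres, indoor coordinate frame (assumed)
    y: float
    yaw_deg: float    # terminal heading in degrees (assumed)

def control_screen_visible(current: Pose, registered: Pose,
                           loc_radius_m: float = 1.5,
                           yaw_tol_deg: float = 30.0) -> bool:
    """True when the terminal lies inside the location range registered
    for the selected IoT device and its posture lies inside the
    registered posture range."""
    dx, dy = current.x - registered.x, current.y - registered.y
    within_location = (dx * dx + dy * dy) ** 0.5 <= loc_radius_m
    # smallest angular difference between headings
    yaw_diff = abs((current.yaw_deg - registered.yaw_deg + 180) % 360 - 180)
    within_posture = yaw_diff <= yaw_tol_deg
    return within_location and within_posture
```

A condition using only the location range, as in the simpler variant above, would drop the `within_posture` term.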
- the method may further include displaying a control screen for the selected IoT device in a virtual space in which the one or more IoT devices are disposed corresponding to the real space, when the location and viewpoint of the user satisfy the condition in that virtual space, and controlling the selected IoT device according to a user input through the control screen displayed in the virtual space.
- an IoT device control system using virtual reality and augmented reality for solving the above technical problem includes a user terminal that photographs a real space including one or more IoT devices, and a server that registers at least one of the location and the posture of the user terminal at the time of capture by matching it to an IoT device selected by the user.
- the user terminal may display a control screen for the selected IoT device when it satisfies a condition determined using at least one of the registered location and posture matched to the selected IoT device.
- the server may transmit a control command for controlling the selected IoT device to the selected IoT device according to a user input through the control screen.
- the server may display a control screen for the selected IoT device in a virtual space in which the one or more IoT devices are disposed corresponding to the real space, when the location and viewpoint of the user satisfy the condition in that virtual space, and may transmit a control command for controlling the selected IoT device to the selected IoT device according to a user input through the control screen displayed in the virtual space.
- the user can thus conveniently identify and control the target IoT device by using virtual reality and augmented reality.
- FIG. 1 is a block diagram showing the configuration of an IoT device control system according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating an operation of an IoT device control system according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating an IoT device list displayed on a user terminal according to an embodiment of the present invention.
- FIG. 4 is a diagram illustrating an example of photographing an IoT device in a user terminal according to an embodiment of the present invention.
- FIG. 5 is a diagram illustrating an IoT device control screen according to an embodiment of the present invention.
- FIG. 6 is a view provided to explain an IoT device registration process according to an embodiment of the present invention.
- FIG. 1 is a block diagram showing the configuration of an IoT device control system according to an embodiment of the present invention.
- an IoT device control system may include a plurality of IoT devices 100a, 100b, and 100c, an access point 110, an indoor positioning system 120, a user terminal 200, and a service server 300.
- the plurality of IoT devices 100a, 100b, and 100c, the access point 110, the indoor positioning system 120, the user terminal 200, and the service server 300 may exchange information and data with one another through the communication network 400.
- the communication network 400 may include a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), the Internet, 2G, 3G, 4G, and LTE mobile communication networks, Bluetooth, Wi-Fi, WiBro, satellite communication networks, and low power wide area (LPWA) networks such as LoRa and Sigfox; the communication method may be wired or wireless, and any communication method may be used.
- in FIG. 1, three IoT devices 100a, 100b, and 100c and one access point 110 are illustrated for convenience of description, but their number may increase or decrease according to the spatial environment in which the IoT device service is provided.
- the IoT devices 100a, 100b, and 100c may be any devices, such as various sensors, information providing apparatuses, convenience facilities, game machines, security apparatuses, and home appliances; there is no limitation on their types.
- the IoT devices may include any communicating household appliance, such as a refrigerator, oven, washing machine, cleaner, printer, fax machine, multifunction printer, webcam, television, video player, DVD player, audio system, electric recorder, interphone, air conditioner, heater, or dehumidifier.
- the IoT devices may also include various sensors from which information about tangible objects and the surrounding environment can be obtained, including temperature sensors, humidity sensors, thermal sensors, gas sensors, illuminance sensors, ultrasonic sensors, remote sensing sensors, synthetic aperture radar (SAR), radar, position sensors, motion sensors, and image sensors.
- the IoT devices 100a, 100b, and 100c may be implemented as devices that support wireless communication through Bluetooth, Wi-Fi, and/or low power wide area (LPWA) networks such as LoRa and Sigfox.
- the access point 110 may support the IoT devices 100a, 100b, and 100c in accessing the service server 300 through the communication network 400 in the space where they are installed.
- the access point 110 may be implemented as a device that connects wireless devices to a wired network in a computer network using a related standard such as Wi-Fi.
- the access point 110 may be implemented as a device capable of connecting a plurality of devices to the service server 300 through a short-range wireless communication method other than Wi-Fi.
- the indoor positioning system 120 may determine the position of the user terminal 200 relative to a predetermined indoor reference point.
- various indoor positioning methods, such as those based on Wi-Fi, Bluetooth, and visible light communication, are known; the indoor positioning system 120 according to the present invention may be implemented with any indoor positioning method capable of determining the position of the user terminal 200 in the real space where the IoT devices 100a, 100b, and 100c are installed.
- in FIG. 1, the indoor positioning system 120 is connected to the user terminal 200 through the access point 110, but the indoor positioning system 120 may also perform indoor positioning in cooperation with the user terminal 200 without passing through the access point 110.
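For concreteness only, one well-known family of indoor positioning methods is a Wi-Fi RSSI weighted centroid. The sketch below assumes access-point coordinates are known in advance; it is merely one possible stand-in for the indoor positioning system 120, since the document allows any indoor positioning method:

```python
def estimate_position(ap_readings):
    """Weighted-centroid indoor position estimate.

    ap_readings: list of ((x, y), rssi_dbm) pairs for access points
    at known coordinates. A stronger (less negative) RSSI pulls the
    estimate toward that AP. Deliberately simplistic: real systems
    also model path loss, multipath, and calibration.
    """
    weights = [10 ** (rssi / 20.0) for _, rssi in ap_readings]
    total = sum(weights)
    x = sum(w * pos[0] for (pos, _), w in zip(ap_readings, weights)) / total
    y = sum(w * pos[1] for (pos, _), w in zip(ap_readings, weights)) / total
    return x, y
```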
- the user terminal 200 may be any terminal, such as a smartphone, tablet PC, personal digital assistant (PDA), or web pad, that includes memory means and a microprocessor with computing capability, and various applications may be installed on it to provide various services to the user.
- the user terminal 200 may include one or more sensors capable of measuring its own movement or posture.
- the user terminal 200 may include at least one of a geomagnetic sensor, an acceleration sensor, a gyroscope sensor, and a motion sensor. Since methods of measuring the movement and posture of the user terminal 200 using such sensors are already known, a detailed description thereof is omitted.
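Since the document omits the posture computation as known art, here is the standard accelerometer tilt formula as a minimal illustration of how a static posture estimate can be obtained; the axis convention (z up when the device lies flat) is an assumption of this sketch:

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    reading taken at rest, with gravity along +z when the device lies
    flat. Textbook tilt-sensing formula, shown only to make the
    omitted posture-measurement step concrete."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A full posture (including heading) would additionally fuse the geomagnetic and gyroscope sensors, as the known methods mentioned above do.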
- a service application (hereinafter referred to as the "service application") that allows the user to check information on the IoT devices 100a, 100b, and 100c and to control them may be installed on the user terminal 200.
- the service server 300 may register the IoT devices 100a, 100b, and 100c.
- the service server 300 may provide information on the registered IoT devices 100a, 100b, and 100c to an authorized user terminal 200 so that it can be displayed on the screen through the service application.
- the service server 300 may also transmit an IoT device control command, input by the user through the IoT device control screen displayed on the user terminal 200 via the service application, to the IoT devices 100a, 100b, and 100c so that they operate according to the command.
- FIG. 2 is a flowchart illustrating an operation of an IoT device control system according to an embodiment of the present invention.
- the user terminal 200 may receive a list of registered IoT devices from the service server 300 at the user's request and present it to the user through the service application (S210).
- FIG. 3 is a diagram illustrating an IoT device list displayed on a user terminal according to an embodiment of the present invention.
- the service application may display the IoT device list 11 as illustrated in FIG. 3 on the screen of the user terminal 200.
- FIG. 3 shows that an IoT device corresponding to 'lighting 4' is newly registered.
- methods of displaying the registered IoT device list other than the example illustrated in FIG. 3 may also be applied.
- the user terminal 200 may receive a selection of the IoT device corresponding to 'lighting 4' from the IoT device list (S220), and may then activate its camera function.
- the user may move near the IoT device corresponding to 'lighting 4' with the user terminal 200 and photograph a space including the corresponding IoT device (S230).
- FIG. 4 is a diagram illustrating an example of photographing an IoT device in a user terminal according to an embodiment of the present invention.
- the user may carry the user terminal 200 in the real space where the IoT devices 100a, 100b, and 100c are located and move near the IoT device 100a.
- the user may photograph a space including the IoT device 100a using the user terminal 200.
- a photographing guide 10 may be displayed on the camera screen of the user terminal 200 so that the photograph is taken with the IoT device 100a inside the photographing guide 10.
- the user terminal 200 may check its location in cooperation with the indoor positioning system 120.
- the user terminal 200 may also check its own posture information using an inertial sensor provided in the user terminal 200.
- the step of photographing the space including the IoT device 100a (S230) may also be performed first, followed by the IoT device list display step (S210) and the IoT device selection step (S220).
- the user terminal 200 transmits its location and posture information to the service server 300 so that it is registered by matching it to the IoT device 100a.
- the user terminal 200 may transmit identification information (e.g., a device ID) of the IoT device 100a to the service server 300 along with the location and posture information of the user terminal 200.
- only location information of the user terminal 200 may be registered to be matched with the IoT device 100a.
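A hypothetical registration message illustrating what the user terminal might send to the service server 300; the field names and JSON encoding are assumptions of this sketch, not part of the disclosure:

```python
import json

def build_registration_payload(device_id, location, posture=None):
    """Illustrative registration message: match the terminal's
    capture-time location (and optionally posture) to the selected
    IoT device. All field names are hypothetical."""
    payload = {
        "device_id": device_id,                       # e.g. the selected device's ID
        "location": {"x": location[0], "y": location[1]},
    }
    if posture is not None:
        # posture may be omitted when only location is registered
        payload["posture"] = {"yaw": posture[0],
                              "pitch": posture[1],
                              "roll": posture[2]}
    return json.dumps(payload)
```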
- when the IoT device control screen output condition is satisfied, a control screen for the corresponding IoT device 100a may be displayed (S260).
- the IoT device control screen output condition may be determined to be satisfied when the user terminal 200 is located within a location range determined based on the location of the registered user terminal matching the IoT device 100a.
- the condition may also be determined to be satisfied when, in addition, the posture of the user terminal 200 is within a posture range determined based on the posture of the registered user terminal matching the IoT device 100a.
- FIG. 5 is a diagram illustrating an IoT device control screen according to an embodiment of the present invention.
- the user terminal 200 may output a control screen 20 for the IoT device 100a over the image capturing the space including the IoT device 100a.
- the control screen 20 may include information 22 indicating the location of the IoT device 100a in space.
- the location of the IoT device 100a displayed in the information 22 may be the location of the user terminal 200 obtained in step S230.
- the control screen 20 may include a user interface 21 for controlling the IoT device 100a.
- FIG. 5 illustrates an example in which, for the IoT device 100a, which is a light, the brightness level and the lighting duration can be adjusted.
- the IoT device 100a may be controlled according to a user input through the control screen 20 (S270).
- the user terminal 200 may transmit a control command corresponding to the user input on the control screen 20 to the service server 300.
- the service server 300 may then transmit it to the IoT device 100a so that the device operates according to the control command.
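The relay role of the service server 300 described above can be sketched as follows; the class, the method names, and the command format are illustrative assumptions:

```python
class ServiceServer:
    """Minimal sketch of the relay role of service server 300:
    registered devices are looked up by ID and control commands from
    a terminal's control screen are forwarded to them."""

    def __init__(self):
        self.devices = {}   # device_id -> callable accepting a command dict

    def register(self, device_id, handler):
        self.devices[device_id] = handler

    def relay(self, device_id, command):
        # forward the user's control input (e.g. a brightness level
        # from control screen 20) to the matching IoT device
        return self.devices[device_id](command)

# usage: a lighting device that applies brightness commands
state = {"brightness": 0}
server = ServiceServer()
server.register("light-4", lambda cmd: state.update(brightness=cmd["brightness"]))
server.relay("light-4", {"brightness": 70})
```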
- after the location and/or posture information of the user terminal is matched to the IoT device 100a and registered through steps S210 to S230, control of the IoT device 100a may also be performed in a virtual space.
- the service server 300 may create a virtual space corresponding to the real space where the IoT device 100a is located and provide it through the user terminal 200 or a virtual reality support terminal (not shown) such as a head mounted display (HMD) terminal.
- in the virtual space, the IoT device 100a may be disposed at a position corresponding to its location in the real space.
- the location of the IoT device 100a in the virtual space may be determined using the location information of the user terminal registered for the IoT device 100a as described above.
- the service server 300 may provide a virtual space screen to the user based on a predetermined position and viewpoint in the virtual space.
- the service server 300 may provide a virtual space screen corresponding to the user's position and viewpoint; if the user's location and viewpoint satisfy the predetermined condition in the virtual space, the service server 300 may display a control screen for the IoT device 100a in the virtual space.
- the service server 300 may transmit a control command for controlling the IoT device 100a according to a user input through a control screen displayed in the virtual space.
- FIG. 6 is a view provided to explain an IoT device registration process according to an embodiment of the present invention.
- the IoT device 100a may operate in a mode (AP mode) in which it acts as a Wi-Fi access point (S610). The IoT device 100a may then broadcast IoT AP access information, including its SSID (S620).
- the user terminal 200 may access the IoT device 100a as an access point using IoT AP access information (S630).
- the user terminal 200 may transmit home AP information, service server information, and the user terminal's location and/or posture information to the IoT device 100a (S640).
- the user terminal location and/or posture information may be the information obtained when the user terminal 200 photographed the space including the IoT device 100a.
- the home AP information may include the SSID, password, and the like of the access point 110.
- the service server information may include information necessary for the IoT device 100a to access the service server 300 and register itself.
- the IoT device 100a may access the access point 110 using the information transmitted in step S640 (S650), and may terminate the AP mode (S660).
- the IoT device 100a may access the service server 300 through the access point 110 (S670).
- the IoT device 100a may transmit network-related information, IoT device attribute-related information, and the user terminal location and/or posture information to the service server 300 (S680), and IoT device registration may be performed (S690).
- in this process, the user terminal location and/or posture information may be matched to the IoT device 100a and registered.
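The registration sequence of FIG. 6 (S610 through S690) can be summarized as a runnable sketch; the `SimDevice` class and its method names are assumptions introduced only to make the step order explicit, not API from the disclosure:

```python
class SimDevice:
    """Toy device that records each provisioning step it performs."""
    def __init__(self):
        self.log = []
        self.terminal_info = None

    def start_ap_mode(self):
        self.log.append("S610: AP mode on")

    def broadcast_ap_info(self):
        self.log.append("S620: broadcast SSID")

    def receive(self, info):
        # home-AP, server, and location/posture info from the terminal
        self.terminal_info = info
        self.log.append("S640: got terminal info")

    def join_home_ap(self, ssid):
        self.log.append(f"S650: join {ssid}")

    def stop_ap_mode(self):
        self.log.append("S660: AP mode off")

    def connect_server(self, server):
        self.log.append(f"S670: connect {server}")

    def register_with_server(self, pose):
        self.log.append("S680/S690: registered with pose")

def provision(device, terminal_info):
    """Run the FIG. 6 sequence on a simulated device."""
    device.start_ap_mode()                           # S610
    device.broadcast_ap_info()                       # S620
    # the user terminal connects (S630) and sends its info (S640)
    device.receive(terminal_info)
    device.join_home_ap(terminal_info["home_ap"])    # S650
    device.stop_ap_mode()                            # S660
    device.connect_server(terminal_info["server"])   # S670
    device.register_with_server(                     # S680-S690
        terminal_info["terminal_pose"])
    return device.log
```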
- the control screen for the IoT device 100a may be automatically output on the screen of the user terminal 200.
- the control screen for the IoT device 100a may be displayed on the virtual space displayed through the user terminal 200 or the virtual reality support terminal.
- Embodiments of the invention include a computer readable medium containing program instructions for performing various computer-implemented operations.
- This medium records a program for executing the method described so far.
- the media may include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of such media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CDs and DVDs; magneto-optical media; and hardware devices configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Computer Hardware Design (AREA)
- Development Economics (AREA)
- Business, Economics & Management (AREA)
- Accounting & Taxation (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (8)
- A method for controlling an IoT device using virtual reality and augmented reality, comprising: photographing, by a user terminal, a real space including one or more IoT devices; registering at least one of a location and a posture of the user terminal at the time of capturing the image by matching it to an IoT device selected by a user; displaying a control screen for the selected IoT device when the user terminal satisfies a condition determined using at least one of the registered location and posture of the user terminal matched to the selected IoT device; and controlling the selected IoT device according to a user input through the control screen.
- The method of claim 1, wherein the condition is satisfied when the user terminal is located within a location range determined based on the location of the registered user terminal matched to the selected IoT device.
- The method of claim 1, wherein the condition is satisfied when the user terminal is located within a location range determined based on the location of the registered user terminal matched to the selected IoT device, and the posture of the user terminal is within a posture range determined based on the posture of the registered user terminal matched to the selected IoT device.
- The method of claim 1, further comprising: displaying a control screen for the selected IoT device in a virtual space in which the one or more IoT terminals are disposed corresponding to the real space, when a location and a viewpoint of the user satisfy the condition in the virtual space; and controlling the selected IoT device according to a user input through the control screen displayed in the virtual space.
- An IoT device control system using virtual reality and augmented reality, comprising: a user terminal that photographs a real space including one or more IoT devices; and a server that registers at least one of a location and a posture of the user terminal at the time of capturing the image by matching it to an IoT device selected by a user, wherein the user terminal displays a control screen for the selected IoT device when the user terminal satisfies a condition determined using at least one of the registered location and posture of the user terminal matched to the selected IoT device, and the server transmits, to the selected IoT device, a control command for controlling the selected IoT device according to a user input through the control screen.
- The system of claim 5, wherein the condition is satisfied when the user terminal is located within a location range determined based on the location of the registered user terminal matched to the selected IoT device.
- The system of claim 5, wherein the condition is satisfied when the user terminal is located within a location range determined based on the location of the registered user terminal matched to the selected IoT device, and the posture of the user terminal is within a posture range determined based on the posture of the registered user terminal matched to the selected IoT device.
- 제 5 항에서,상기 서버는,상기 실제 공간에 대응하여 상기 하나 이상의 IoT 단말이 배치되는 가상 공간 상에서 사용자의 위치 및 시점이 상기 조건을 만족하면, 상기 선택된 IoT 디바이스에 대한 제어 화면을 상기 가상 공간에 표시하며,상기 가상 공간에 표시된 제어 화면을 통한 사용자 입력에 따라 상기 선택된 IoT 디바이스를 제어하는 제어 명령을 상기 선택된 IoT 디바이스로 전달하는 가상 현실과 증강 현실을 이용한 IoT 디바이스 제어 시스템.
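The condition recited in the claims above — show the control screen only when the terminal's current pose is close enough to the pose registered for the selected device — can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the registry layout, device ID, coordinate frame, and the Euclidean-distance and per-axis angular thresholds used to realize the "position range" and "posture range" are all assumptions for the example.

```python
import math

# Hypothetical registry: the terminal pose captured while filming the real
# space, registered in association with the IoT device the user selected.
REGISTERED = {
    "lamp-01": {
        "position": (2.0, 0.5, 1.2),   # x, y, z in meters (assumed frame)
        "posture": (0.0, 90.0, 0.0),   # roll, pitch, yaw in degrees
    },
}

POSITION_RANGE_M = 1.0    # assumed radius of the "position range"
POSTURE_RANGE_DEG = 30.0  # assumed half-width of the "posture range"


def angle_diff(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def should_show_control_screen(device_id, position, posture=None):
    """True when the terminal's current pose satisfies the condition
    registered for the selected IoT device."""
    reg = REGISTERED[device_id]
    # Position-only condition (claim 2): within the registered position range.
    if math.dist(position, reg["position"]) > POSITION_RANGE_M:
        return False
    if posture is None:
        return True
    # Combined condition (claim 3): position AND posture both within range.
    return all(angle_diff(c, r) <= POSTURE_RANGE_DEG
               for c, r in zip(posture, reg["posture"]))


# Terminal near the registered spot, pointed roughly the registered way:
print(should_show_control_screen("lamp-01", (2.3, 0.5, 1.0), (5.0, 80.0, -10.0)))
# Terminal too far from the registered position:
print(should_show_control_screen("lamp-01", (6.0, 0.5, 1.0)))
```

In the system claims the same check would run on the user terminal, which then asks the server to relay the actual control command to the device; the claim 4/8 variant applies the identical test to the user's position and viewpoint inside the mirrored virtual space.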
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/042,748 US11262903B2 (en) | 2018-03-30 | 2018-03-30 | IoT device control system and method using virtual reality and augmented reality |
PCT/KR2018/003755 WO2019189965A1 (ko) | 2018-03-30 | 2018-03-30 | IoT device control system and method using virtual reality and augmented reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2018/003755 WO2019189965A1 (ko) | 2018-03-30 | 2018-03-30 | IoT device control system and method using virtual reality and augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019189965A1 (ko) | 2019-10-03 |
Family
ID=68059098
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2018/003755 WO2019189965A1 (ko) | IoT device control system and method using virtual reality and augmented reality | 2018-03-30 | 2018-03-30 |
Country Status (2)
Country | Link |
---|---|
US (1) | US11262903B2 (ko) |
WO (1) | WO2019189965A1 (ko) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11170540B1 (en) * | 2021-03-15 | 2021-11-09 | International Business Machines Corporation | Directional based commands |
US11924209B2 (en) * | 2021-04-28 | 2024-03-05 | International Business Machines Corporation | Augmented reality-based access control for network devices |
US11941231B2 (en) | 2021-08-29 | 2024-03-26 | Snap Inc. | Camera interfaces to interact with IoT devices |
US11954774B2 (en) | 2021-08-29 | 2024-04-09 | Snap Inc. | Building augmented reality experiences with IoT devices |
US20230063944A1 (en) * | 2021-08-29 | 2023-03-02 | Yu Jiang Tham | Two-way control of iot devices using ar camera |
US20230063194A1 (en) * | 2021-08-29 | 2023-03-02 | Yu Jiang Tham | Controlling iot devices through ar object interaction |
US11809680B2 (en) | 2021-12-30 | 2023-11-07 | Snap Inc. | Interface for engaging IoT devices with AR camera |
WO2023244490A1 (en) * | 2022-06-15 | 2023-12-21 | Snap Inc. | Standardized ar interfaces for iot devices |
US20230410437A1 (en) * | 2022-06-15 | 2023-12-21 | Sven Kratz | Ar system for providing interactive experiences in smart spaces |
CN114926614B (zh) * | 2022-07-14 | 2022-10-25 | 北京奇岱松科技有限公司 | Information interaction system based on the virtual world and the real world |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160140347A (ko) * | 2015-05-27 | 2016-12-07 | (주)원퍼스트 | IoT-based control service providing method and smart control device |
KR101709715B1 (ko) * | 2015-11-16 | 2017-02-24 | (주)아이오텍 | IoT-based control and monitoring apparatus using a camera |
US20170180489A1 (en) * | 2015-12-22 | 2017-06-22 | Samsung Electronics Co., Ltd. | Electronic device and server for providing service related to internet of things device |
KR101773768B1 (ko) * | 2017-04-24 | 2017-09-13 | (주)지란지교시큐리티 | IoT device monitoring system and method |
KR20180012038A (ko) * | 2016-07-26 | 2018-02-05 | 코나에스 주식회사 | Electronic device management method and apparatus using an IoT service |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2411532B (en) * | 2004-02-11 | 2010-04-28 | British Broadcasting Corp | Position determination |
TWI423112B (zh) * | 2009-12-09 | 2014-01-11 | Ind Tech Res Inst | Portable virtual input operation device and operation method thereof |
US10395018B2 (en) * | 2010-11-29 | 2019-08-27 | Biocatch Ltd. | System, method, and device of detecting identity of a user and authenticating a user |
US9213405B2 (en) * | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
US8965049B2 (en) * | 2011-02-01 | 2015-02-24 | Panasonic Intellectual Property Corporation Of America | Function extension device, function extension method, computer-readable recording medium, and integrated circuit |
EP2751742A1 (en) * | 2011-08-31 | 2014-07-09 | metaio GmbH | Method of matching image features with reference features |
WO2013033842A1 (en) * | 2011-09-07 | 2013-03-14 | Tandemlaunch Technologies Inc. | System and method for using eye gaze information to enhance interactions |
US20130314443A1 (en) * | 2012-05-28 | 2013-11-28 | Clayton Grassick | Methods, mobile device and server for support of augmented reality on the mobile device |
US9448404B2 (en) * | 2012-11-13 | 2016-09-20 | Qualcomm Incorporated | Modifying virtual object display properties to increase power performance of augmented reality devices |
US9317972B2 (en) * | 2012-12-18 | 2016-04-19 | Qualcomm Incorporated | User interface for augmented reality enabled devices |
JP5900393B2 (ja) * | 2013-03-21 | 2016-04-06 | ソニー株式会社 | Information processing apparatus, operation control method, and program |
US9728009B2 (en) * | 2014-04-29 | 2017-08-08 | Alcatel Lucent | Augmented reality based management of a representation of a smart environment |
KR101679504B1 (ko) * | 2015-02-26 | 2016-12-06 | 인제대학교 산학협력단 | IV drip stand |
US20170061696A1 (en) * | 2015-08-31 | 2017-03-02 | Samsung Electronics Co., Ltd. | Virtual reality display apparatus and display method thereof |
KR20170091913A (ko) * | 2016-02-02 | 2017-08-10 | 삼성전자주식회사 | Method and apparatus for providing a video service |
KR20170108285A (ko) | 2016-03-17 | 2017-09-27 | 주식회사 이웃 | Indoor device control system and method using augmented reality technology |
US10754416B2 (en) * | 2016-11-14 | 2020-08-25 | Logitech Europe S.A. | Systems and methods for a peripheral-centric augmented/virtual reality environment |
KR102313981B1 (ko) * | 2017-06-20 | 2021-10-18 | 삼성전자주식회사 | Fingerprint authentication method and apparatus |
2018
- 2018-03-30 WO PCT/KR2018/003755 patent/WO2019189965A1/ko active Application Filing
- 2018-03-30 US US17/042,748 patent/US11262903B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20210149551A1 (en) | 2021-05-20 |
US11262903B2 (en) | 2022-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019189965A1 (ko) | IoT device control system and method using virtual reality and augmented reality | |
WO2015167080A1 (en) | Unmanned aerial vehicle control apparatus and method | |
WO2013133486A1 (en) | Device, method and timeline user interface for controlling home devices | |
WO2015174729A1 (ko) | Augmented reality providing method and system for providing spatial information, and recording medium and file distribution system | |
WO2012118299A2 (en) | Method and apparatus for sharing media based on social network in communication system | |
KR102076647B1 (ko) | IoT device control system and method using virtual reality and augmented reality | |
WO2012124860A1 (ko) | NFC-based device control method and device control system using the same | |
WO2017065569A1 (en) | Method for locking and unlocking touchscreen-equipped mobile device and mobile device | |
WO2018070768A1 (ko) | Monitoring system control method and electronic device supporting the same | |
WO2021002687A1 (ko) | Method, system, and non-transitory computer-readable recording medium for supporting experience sharing between users | |
WO2017217713A1 (en) | Method and apparatus for providing augmented reality services | |
WO2017119800A1 (ko) | Sensor management method and apparatus | |
WO2014189186A1 (ko) | Method for controlling electronic devices using an IP camera with a wireless remote control function | |
WO2017111332A1 (ko) | Electronic device and control method thereof | |
WO2013042815A1 (ko) | Method for controlling an Android platform-based application execution terminal using a smart terminal, and computer-readable recording medium storing a program for that control | |
WO2015102467A1 (ko) | Home device control apparatus and control method using a wearable device | |
WO2015093893A1 (ko) | Event notification method and apparatus in a home network system | |
WO2017200153A1 (ko) | Method and system for correcting the playback area using user terminal tilt information during 360-degree video playback | |
WO2019198868A1 (ko) | Mutual recognition method between an unmanned aerial vehicle and a wireless terminal | |
WO2014065495A1 (en) | Method for providing contents and a digital device for the same | |
WO2021045430A1 (ko) | In-home relay device and electronic device connected thereto | |
WO2014010785A1 (ko) | Tourism-related IoT service method | |
WO2015102476A1 (ko) | Vehicle providing immersive education services based on mobile 3D multi-displays | |
JP2009230193A (ja) | Imaging device, display device, and information system functioning as a pointing device | |
WO2019135543A1 (ko) | Electronic device and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18911483; Country of ref document: EP; Kind code of ref document: A1 |
122 | Ep: pct application non-entry in european phase | Ref document number: 18911483; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16.04.2021) |