US20160133052A1 - Virtual environment for sharing information - Google Patents

Virtual environment for sharing information

Info

Publication number
US20160133052A1
US20160133052A1
Authority
US
United States
Prior art keywords
electronic device
information
contents
display
external electronic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/934,587
Other languages
English (en)
Inventor
Woosung CHOI
Hyuk Kang
Minji Kim
Dongil Son
Buseop JUNG
Jongho Choi
Jooman HAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO, LTD. reassignment SAMSUNG ELECTRONICS CO, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Kim, Minji, CHOI, JONGHO, Jung, Buseop, KANG, HYUK, CHOI, WOOSUNG, HAN, JOOMAN, Son, Dongil
Publication of US20160133052A1
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE COUNTRY PREVIOUSLY RECORDED AT REEL: 036979 FRAME: 0264. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT . Assignors: Kim, Minji, CHOI, JONGHO, Jung, Buseop, KANG, HYUK, CHOI, WOOSUNG, HAN, JOOMAN, Son, Dongil
Priority to US16/536,195 (US11120630B2)
Legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F 9/45533 - Hypervisors; Virtual machine monitors
    • G06F 9/45537 - Provision of facilities of other operating environments, e.g. WINE
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 - Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00 - Aspects of interface with display user
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00 - Aspects of data communication
    • G09G 2370/12 - Use of DVI or HDMI protocol in interfaces along the display data pipeline
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00 - Specific applications
    • G09G 2380/10 - Automotive applications

Definitions

  • the disclosure relates to an electronic device, and for example, to an apparatus and a method for sharing information through a virtual environment.
  • a portable device (for example, a smart phone and a tablet PC)
  • a wearable device (for example, a smart watch and a head-mounted device)
  • a smart home appliance (for example, a smart television and a game console)
  • the electronic devices having various forms may exchange information.
  • a smart television, a smart phone, and a notebook computer may be connected with each other through a network.
  • the smart phone may transmit a picture stored in the smart phone to the smart television.
  • a user of the smart television may check the picture stored in the smart phone through the smart television.
  • the notebook computer may transmit a movie to the smart television.
  • the user of the smart television may watch the movie, instead of the picture, through the smart television.
  • an electronic device may provide a user with contents obtained from an external electronic device.
  • the electronic device may provide the contents without discriminating the contents from other previously obtained contents. Accordingly, it may be difficult for a user to discriminate the contents obtained from the external electronic device from other contents. For example, it is difficult for a user to recognize information about a device, from which the contents are obtained.
  • When the electronic device obtains a plurality of contents from a plurality of external electronic devices, respectively, the electronic device in the related art may not simultaneously provide the user with the plurality of contents. Accordingly, the user is inconvenienced by needing to first check one of the plurality of contents, and then check the other contents.
  • the electronic device may provide a user with various virtual environments so that the user may view a 2D or 3D image.
  • the electronic device may provide a user with a new environment regardless of a space, in which a user is actually located. Accordingly, the user may more vividly view the 2D or 3D image based on a virtual environment.
  • the electronic device in the related art cannot reflect a space, in which a user is actually located, to a virtual environment.
  • since the electronic device in the related art is capable of using only a preset image, the electronic device cannot represent various electronic devices (for example, a television, a refrigerator, or a computer) located in an actual space in a virtual environment. Accordingly, when a user uses a virtual environment, the user may not directly control an electronic device located in the actual space.
  • An example of the disclosure provides an electronic device and a method capable of relating contents obtained from an external electronic device to an object corresponding to the external electronic device through a virtual environment and providing the obtained contents. Accordingly, the electronic device may discriminate the contents obtained from the external electronic device from other contents and provide the obtained contents through the virtual environment.
  • Another example of the disclosure provides an electronic device and a method capable of relating contents obtained from a plurality of external electronic devices to a plurality of objects corresponding to the plurality of external electronic devices, respectively, and simultaneously providing the obtained contents.
  • Another example of the disclosure provides an electronic device and a method capable of enabling a user to directly control electronic devices disposed in an actual space through a virtual environment by reflecting a space, in which the user is actually located, to the virtual environment.
  • an electronic device includes: a display; and an information providing module in the form of processing circuitry functionally connected with the display, wherein the information providing module is configured to display an object corresponding to an external electronic device for the electronic device through the display, to obtain information to be output through the external electronic device, and to provide contents corresponding to the information in relation to a region, on which the object is displayed.
  • the electronic device and the method according to various examples may provide, for example, contents obtained from an external electronic device to a user in relation to an object corresponding to the external electronic device, thereby enabling the user to discriminate the contents obtained from the external electronic device from other contents based on the object.
  • the electronic device and the method according to various examples may simultaneously provide, for example, a plurality of contents obtained from a plurality of external electronic devices through a plurality of objects corresponding to the plurality of external electronic devices, respectively, thereby reducing or overcoming inconvenience caused when a user needs to check each of the plurality of contents.
  • the electronic device and the method according to various examples may provide, for example, contents obtained through an external electronic device through a virtual environment, thereby providing a service linked with the external electronic device through the virtual environment.
  • FIG. 1 illustrates an example use environment of a plurality of electronic devices.
  • FIG. 2 is a block diagram illustrating an example electronic device.
  • FIG. 3 is a block diagram illustrating an example electronic device.
  • FIG. 4 illustrates an example of an actual use environment, in which an electronic device is linked with an external electronic device.
  • FIG. 5 illustrates an example of a virtual environment provided by an electronic device.
  • FIG. 6 is a block diagram illustrating an example information providing module.
  • FIG. 7 illustrates an example of a user interface provided through an electronic device.
  • FIG. 8 illustrates an example of a user interface provided through the electronic device.
  • FIG. 9 illustrates an example of a user interface provided through the electronic device.
  • FIG. 10 illustrates an example of a user interface provided through the electronic device.
  • FIG. 11 illustrates an example of a user interface provided through the electronic device.
  • FIG. 12 is a flowchart illustrating an example method for providing information by an electronic device.
  • the term “include” or “may include” which may be used in describing various examples of the disclosure refers to the existence of a corresponding disclosed function, operation or component which can be used in various examples of the disclosure and does not limit one or more additional functions, operations, or components.
  • the terms such as “include” or “have” may be construed to denote a certain characteristic, number, step, operation, constituent element, component or a combination thereof, but may not be construed to exclude the existence of or a possibility of addition of one or more other characteristics, numbers, steps, operations, constituent elements, components or combinations thereof.
  • the expression “or” or “at least one of A or/and B” includes any or all of combinations of words listed together.
  • the expression “A or B” or “at least one of A or/and B” may include A, may include B, or may include both A and B.
  • the expression “1”, “2”, “first”, or “second” used in various examples of the disclosure may modify various components of the various examples but does not limit the corresponding components.
  • the above expressions do not limit the sequence and/or importance of the components.
  • the expressions may be used for distinguishing one component from other components.
  • a first user device and a second user device indicate different user devices although both of them are user devices.
  • a first structural element may be referred to as a second structural element.
  • the second structural element also may be referred to as the first structural element.
  • a component When it is stated that a component is “coupled to” or “connected to” another component, the component may be directly coupled or connected to another component or another component may exist between the component and another component. In contrast, when it is stated that a component is “directly coupled to” or “directly connected to” another component, a component does not exist between the component and another component.
  • An electronic device may be a device including a communication function.
  • the electronic device may be one or a combination of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a camera, and a wearable device (for example, a Head-Mounted Device (HMD) such as electronic glasses, electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch).
  • the electronic device may be a smart home appliance having a communication function.
  • the smart home appliance may include at least one of a TeleVision (TV), a Digital Video Disk (DVD) player, an audio player, an air conditioner, a cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a TV box (for example, Samsung HomeSyncTM, Apple TVTM, or Google TVTM), game consoles, an electronic dictionary, an electronic key, a camcorder, and an electronic frame.
  • the electronic device may include at least one of various types of medical devices (for example, Magnetic Resonance Angiography (MRA), Magnetic Resonance Imaging (MRI), Computed Tomography (CT), a scanner, an ultrasonic device and the like), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a vehicle infotainment device, electronic equipment for a ship (for example, a navigation device for ship, a gyro compass and the like), avionics, a security device, a head unit for a vehicle, an industrial or home robot, an Automatic Teller Machine (ATM) of financial institutions, and a Point Of Sale (POS) device of shops.
  • the electronic device may include at least one of furniture or a part of a building/structure, an electronic board, an electronic signature receiving device, a projector, and various types of measuring devices (for example, a water meter, an electricity meter, a gas meter, a radio wave meter and the like) including a camera function.
  • the electronic device according to various examples of the disclosure may be one or a combination of the above described various devices. Further, the electronic device according to various examples of the disclosure may be a flexible device. It will be apparent to those skilled in the art that the electronic device according to various examples of the disclosure is not limited to the above described devices.
  • the term “user” used in various examples may refer to a person who uses an electronic device or a device (for example, an artificial intelligence electronic device) which uses an electronic device.
  • a screen of an electronic device may be split into at least two windows according to a predefined split manner and displayed through a display of an electronic device.
  • the windows may be referred to as split windows.
  • the split windows refer to windows displayed on a display of an electronic device so as not to be superposed on one another.
  • a popup window may refer to a window displayed on a display of an electronic device to hide or to be superposed on a portion of a screen under execution.
  • an electronic device using a split window and a popup window is capable of displaying two or more application execution screens or function execution screens.
  • the split windows and the popup window may be referred to as a multi-window.
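  • the split-window/popup-window distinction described above can be sketched as a small data model (a hypothetical illustration; the disclosure does not specify any implementation):

```python
from dataclasses import dataclass

@dataclass
class Window:
    """A rectangular screen region occupied by an application window."""
    app: str
    x: int
    y: int
    w: int
    h: int

    def overlaps(self, other: "Window") -> bool:
        # Rectangles overlap unless one lies entirely to the side of the other.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x or
                    self.y + self.h <= other.y or other.y + other.h <= self.y)

# Split windows tile the display so they are not superposed on one another.
left = Window("gallery", 0, 0, 540, 1920)
right = Window("browser", 540, 0, 540, 1920)

# A popup window is superposed on a portion of the screen under execution.
popup = Window("notification", 300, 100, 400, 200)
```

  • here the two split windows never overlap, while the popup overlaps both; together such windows form the multi-window described above.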
  • An electronic device 101 within a network environment 100 will be described with reference to FIG. 1 .
  • the electronic device 101 may include an information providing module 110 in the form of processing circuitry, a bus 120 , a processor 130 , a memory 140 , an input/output interface 160 , a display 170 , and a communication interface 180 .
  • the electronic device 101 may omit at least one of the above elements or may further include other elements.
  • the information providing module 110 may be configured to process at least some of the information obtained from other elements (for example, the processor 130 , the memory 140 , the input/output interface 160 , and the communication interface 180 ), and provide the user with the obtained information by various methods.
  • the information providing module or processing circuitry 110 may be configured to display an object corresponding to an external electronic device (for example, the electronic devices 102 and 104 or the server 106 ) through the display 170 functionally connected with the electronic device 101 using the processor 130 or independently from the processor 130 .
  • the object may have an image corresponding to an appearance of an external electronic device.
  • the information providing module 110 may provide information outputtable by an external electronic device (for example, the electronic device 102 ) through at least one of the object corresponding to the external electronic device displayed on the display 170 or the external electronic device depending on whether the electronic device 101 is detached from or attached to the user.
  • the external electronic device may be a television.
  • the information providing module 110 may provide (for example, display) at least some of information (for example, an image, text, a sound, or notification information) outputtable through the television on a partial area of the display 170 , on which the object having a television image corresponding to the television is displayed. Additional information associated with the information providing module 110 will be described below in conjunction with FIGS. 4 to 12 .
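  • as a sketch of this behavior (hypothetical names and structures; the disclosure describes the information providing module 110 functionally, not as code), the module might track which display region each external device's object occupies and relate obtained content to that region:

```python
class InformationProvidingModule:
    """Sketch: relate content obtained from an external electronic device
    to the display region of the object representing that device."""

    def __init__(self):
        self.object_regions = {}  # device id -> (x, y, w, h) of its object
        self.displayed = {}       # device id -> content shown in that region

    def display_object(self, device_id, region):
        # Display an object (e.g., a television image) for an external device.
        self.object_regions[device_id] = region

    def provide_content(self, device_id, content):
        # Provide the content in relation to the region on which the object
        # corresponding to the source device is displayed.
        if device_id not in self.object_regions:
            raise KeyError(f"no object displayed for {device_id}")
        self.displayed[device_id] = content
        return self.object_regions[device_id]

module = InformationProvidingModule()
module.display_object("living-room-tv", (100, 50, 320, 180))
region = module.provide_content("living-room-tv", "recording started")
```

  • because each content item is keyed by its source device, content obtained from the television stays visually tied to the television's object, which is what lets the user distinguish it from other contents.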
  • the bus 120 may be a circuit connecting the above described components and transmitting communication (for example, a control message) between the above described components.
  • the processor 130 may include one or more of a Central Processing Unit (CPU), an Application Processor (AP), or a Communication Processor (CP).
  • the processor 130 may be configured to execute operations or data processing related to control and/or communication of at least one of the other components of the electronic device 101.
  • the memory 140 stores commands or data received from the processor 130 or other components of the electronic device 101 or generated by the processor 130 or other components of the electronic device 101 .
  • the memory 140 may include programming modules 150 , for example, a kernel 151 , middleware 153 , an Application Programming Interface (API) 155 , and an application 157 .
  • Each of the aforementioned programming modules may be implemented by software, firmware, hardware, or a combination of two or more thereof.
  • the kernel 151 controls or manages system resources (for example, the bus 120 , the processor 130 , or the memory 140 ) used for executing an operation or function implemented by the remaining other programming modules, for example, the middleware 153 , the API 155 , or the application 157 .
  • the kernel 151 provides an interface for accessing individual components of the electronic device 101 from the middleware 153 , the API 155 , or the application 157 to control or manage the components.
  • the middleware 153 performs a relay function of allowing the API 155 or the application 157 to communicate with the kernel 151 to exchange data.
  • the middleware 153 performs a control for the operation requests (for example, scheduling or load balancing) by using a method of assigning a priority, by which system resources (for example, the bus 120 , the processor 130 , the memory 140 and the like) of the electronic device 101 can be used, to the application 157 .
  • the API 155 may, for example, be an interface by which the application 157 can control a function provided by the kernel 151 or the middleware 153 , and may include, for example, at least one interface or function (for example, a command) for file control, window control, image processing, or character control.
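  • the relay-and-priority role of the middleware 153 might be sketched as follows (a simplified, hypothetical illustration of priority-based scheduling of operation requests):

```python
import heapq

class Middleware:
    """Sketch: relay operation requests from applications toward the kernel,
    dispatching them by the priority assigned to each application."""

    def __init__(self):
        self.priorities = {}  # application name -> priority (lower runs first)
        self.queue = []
        self._seq = 0         # tie-breaker keeping insertion order stable

    def assign_priority(self, app, priority):
        self.priorities[app] = priority

    def request(self, app, operation):
        prio = self.priorities.get(app, 100)  # unprioritized apps go last
        heapq.heappush(self.queue, (prio, self._seq, app, operation))
        self._seq += 1

    def schedule(self):
        # Drain queued requests in priority order; real scheduling or load
        # balancing policies would hook in here.
        order = []
        while self.queue:
            _, _, app, op = heapq.heappop(self.queue)
            order.append((app, op))
        return order

mw = Middleware()
mw.assign_priority("call-app", 1)
mw.assign_priority("game", 10)
mw.request("game", "render-frame")
mw.request("call-app", "ring")
```

  • the higher-priority call application's request is dispatched before the game's, even though it was submitted later.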
  • the input/output interface 160 can receive, for example, a command and/or data from a user, and transfer the received command and/or data to the processor 130 and/or the memory 140 through the bus 120 .
  • Examples of the display 170 may include a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, a MicroElectroMechanical Systems (MEMS) display, and an electronic paper display.
  • the display 170 may display various contents (e.g., text, images, videos, icons, or symbols) to the user.
  • the display 170 may include a touch screen and receive, for example, a touch, gesture, proximity, or hovering input using an electronic pen or the user's body part.
  • the communication interface 180 may enable communication, for example, between the electronic device 101 and an external device (e.g., the first external electronic device 102 , the second external electronic device 104 , or the server 106 ).
  • the communication interface 180 may be connected to a network 162 through wireless or wired communication to communicate with the external device (e.g., the second external electronic device 104 or the server 106 ).
  • the wireless communication may use at least one of, for example, Long Term Evolution (LTE), LTE-Advance (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), WiBro (Wireless Broadband), and Global System for Mobile Communications (GSM), as a cellular communication protocol.
  • the wireless communication may include, for example, short range communication 164 .
  • the short-range communication 164 may include at least one of, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), infrared communication, and Global Positioning System (GPS).
  • the wired communication may include at least one of, for example, a Universal Serial Bus (USB), a High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), and a Plain Old Telephone Service (POTS).
  • the network 162 may, for example, include at least one of communication networks, such as a computer network (e.g., a LAN or a WAN), the Internet, and a telephone network.
  • At least one of the first and second external electronic devices 102 and 104 may be the same or a different type of device from the electronic device 101 .
  • the server 106 may include a group of one or more servers.
  • all or some of the operations performed in the electronic device 101 may be performed in another electronic device or a plurality of electronic devices (e.g., the electronic devices 102 and 104 or the server 106 ).
  • the electronic device 101 may make a request for performing at least some functions relating thereto to another device (e.g., the electronic device 102 or 104 or the server 106 ) instead of performing the functions or services by itself.
  • the other electronic device may carry out the requested functions or the additional functions and transfer the result to the electronic device 101 .
  • the electronic device 101 may process the received result as it is or additionally provide the requested functions or services.
  • cloud computing, distributed computing, or client-server computing technology may be used.
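  • this request-and-delegate pattern can be sketched as follows (illustrative names only; the disclosure does not prescribe any particular protocol):

```python
def perform_function(device, function, peers):
    """Sketch: run a function locally, or request a connected device or
    server to perform it and receive the result."""
    if function in device["capabilities"]:
        return f"{device['name']} ran {function} locally"
    # Delegate to the first peer that can perform the requested function
    # (cloud, distributed, or client-server computing may be used here).
    for peer in peers:
        if function in peer["capabilities"]:
            result = f"{peer['name']} ran {function}"
            return f"{device['name']} received result: {result}"
    raise RuntimeError(f"no device can perform {function}")

phone = {"name": "electronic device 101", "capabilities": {"display"}}
server = {"name": "server 106", "capabilities": {"display", "render-3d"}}
outcome = perform_function(phone, "render-3d", [server])
```

  • the device performs what it can locally and receives the result of what it delegated, which it may then provide as-is or process further.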
  • FIG. 2 is a block diagram illustrating an example electronic device 201 .
  • the electronic device 201 may, for example, include a whole or a part of the electronic device 101 illustrated in FIG. 1 .
  • the electronic device 201 may, for example, include one or more Processors (for example, Application Processors (APs)) 210 , a communication module 220 , a Subscriber Identification Module (SIM) card 224 , a memory 230 , a sensor module 240 including various sensors, an input device 250 , a display 260 , an interface 270 , an audio module 280 , a camera module 291 , a power managing module 295 , a battery 296 , an indicator 297 , and a motor 298 .
  • the processor 210 operates an operating system (OS) or an application program so as to control a plurality of hardware or software component elements connected to the processor 210 and execute various data processing and calculations including multimedia data.
  • the processor 210 may be implemented by, for example, a System on Chip (SoC).
  • the processor 210 may further include a Graphic Processing Unit (GPU).
  • the communication module 220 may have a configuration the same or similar to that of the communication interface 180 of FIG. 1 .
  • the communication module 220 may include, for example, a cellular module 221 , a Wi-Fi module 223 , a Bluetooth module 225 , a GPS module 227 , an NFC module 228 , and a Radio Frequency (RF) module 229 .
  • the cellular module 221 may provide a voice call, image call, a text message service, or an Internet service through, for example, a communication network. According to an example, the cellular module 221 may distinguish and authenticate the electronic device 201 within a communication network by using a subscriber identification module 224 (for example, the SIM card). According to an example of the disclosure, the cellular module 221 may perform at least some of the functions that the processor 210 may provide. According to an example, the cellular module 221 may include a Communication Processor (CP).
  • At least one of the Wi-Fi module 223 , the Bluetooth module 225 , the GPS module 227 , and the NFC module 228 may include, for example, a processor for processing data transmitted and received through the corresponding module.
  • at least some (two or more) of the cellular module 221 , the Wi-Fi module 223 , the Bluetooth module 225 , the GPS module 227 , and the NFC module 228 may be included in one Integrated Chip (IC) or IC package.
  • the RF module 229 may transmit/receive, for example, a communication signal (for example, an RF signal).
  • the RF module 229 may include, for example, a transceiver, a Power Amplifier Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), or an antenna.
  • at least one of the cellular module 221 , the Wi-Fi module 223 , the Bluetooth module 225 , the GPS module 227 , and the NFC module 228 may transmit/receive an RF signal through a separate RF module.
  • the subscriber identification module 224 may include, for example, a card including a subscriber identification module and/or an embedded SIM, and may contain unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
  • the memory 230 may include, for example, an internal memory 232 or an external memory 234 .
  • An internal memory 232 may include at least one of a volatile memory (for example, a Dynamic Random Access Memory (DRAM), a Static RAM (SRAM), a Synchronous Dynamic RAM (SDRAM), and the like) and a non-volatile memory (for example, a One Time Programmable Read Only Memory (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (for example, a NAND flash memory or a NOR flash memory), a hard disc drive, a Solid State Drive (SSD), and the like).
  • An external memory 234 may further include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro Secure Digital (Micro-SD), a Mini Secure Digital (Mini-SD), an eXtreme Digital (xD), a Multi Media Card (MMC), a memory stick, or the like.
  • the external memory 234 may be functionally and/or physically connected to the electronic device 201 through various interfaces.
  • the sensor module 240 may, for example, include various sensors configured to measure a physical quantity or detect an operation state of the electronic device 201 , and may convert the measured or detected information into an electrical signal.
  • the sensor module 240 may include at least one of a gesture sensor 240 A, a gyro sensor 240 B, an atmospheric pressure sensor 240 C, a magnetic sensor 240 D, an acceleration sensor 240 E, a grip sensor 240 F, a proximity sensor 240 G, a color sensor 240 H (for example, a Red/Green/Blue (RGB) sensor), a bio-sensor 240 I, a temperature/humidity sensor 240 J, an illumination sensor 240 K, and an Ultra Violet (UV) sensor 240 M.
  • the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor.
  • the sensor module 240 may further include a control circuit for controlling one or more sensors included therein.
  • an electronic device 201 may further include a processor configured to control the sensor module 240 as a part of or separately from the processor 210 , and may control the sensor module 240 while the processor 210 is in a sleep state.
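  • the pattern of a dedicated control processor servicing the sensor module 240 while the processor 210 sleeps might be sketched like this (a hypothetical sensor-hub model, not the disclosed implementation):

```python
class SensorHub:
    """Sketch: a control processor that keeps servicing sensors while the
    main application processor (AP) is in a sleep state."""

    def __init__(self):
        self.ap_awake = True
        self.buffer = []  # events collected while the AP sleeps

    def on_sensor_event(self, sensor, value):
        event = (sensor, value)
        if self.ap_awake:
            return self.deliver(event)
        # AP is sleeping: buffer instead of waking it for every reading.
        self.buffer.append(event)
        return None

    def deliver(self, event):
        return f"AP handled {event[0]}={event[1]}"

    def wake_ap(self):
        # On wake-up, deliver everything buffered during sleep.
        self.ap_awake = True
        delivered = [self.deliver(e) for e in self.buffer]
        self.buffer.clear()
        return delivered

hub = SensorHub()
hub.ap_awake = False
hub.on_sensor_event("acceleration", 9.8)
hub.on_sensor_event("illumination", 120)
results = hub.wake_ap()
```

  • buffering sensor readings in the separate controller lets the main processor stay asleep between wake-ups, which is the usual motivation for such a design.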
  • the input device 250 may include, for example, a touch panel 252 , a (digital) pen sensor 254 , a key 256 , or an ultrasonic input device 258 .
  • the touch panel 252 may use at least one of, for example, a capacitive type, a resistive type, an infrared type, and an ultrasonic type. Also, the touch panel 252 may further include a control circuit.
  • the touch panel 252 may further include a tactile layer and provide a tactile reaction to the user.
  • the (digital) pen sensor 254 may include, for example, a recognition sheet which is a part of the touch panel or is separated from the touch panel.
  • the key 256 may include, for example, a physical button, an optical key or a keypad.
  • the ultrasonic input device 258 may detect ultrasonic waves generated by an input tool through a microphone (for example, a microphone 288 ) and identify data corresponding to the detected ultrasonic waves.
  • the display 260 may include a panel 262 , a hologram device 264 or a projector 266 .
  • the panel 262 may include a configuration identical or similar to that of the display 170 illustrated in FIG. 1 .
  • the panel 262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 262 and the touch panel 252 may be implemented as one module.
  • the hologram device 264 may show a three-dimensional image in the air using an interference of light.
  • the projector 266 may display an image by projecting light onto a screen.
  • the screen may be located, for example, inside or outside the electronic device 201 .
  • the display 260 may further include a control circuit for controlling the panel 262 , the hologram device 264 , or the projector 266 .
  • the interface 270 may include, for example, a High-Definition Multimedia Interface (HDMI) 272 , a Universal Serial Bus (USB) 274 , an optical interface 276 , or a D-subminiature (D-sub) 278 .
  • the interface 270 may be included in, for example, the communication interface 180 illustrated in FIG. 1 .
  • the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
  • the audio module 280 may bidirectionally convert between a sound and an electrical signal. At least some components of the audio module 280 may be included in, for example, the input/output interface 160 illustrated in FIG. 1 .
  • the audio module 280 processes sound information input or output through, for example, a speaker 282 , a receiver 284 , an earphone 286 , the microphone 288 or the like.
  • the camera module 291 is a device which can photograph a still image and a video.
  • the camera module 291 may include one or more image sensors (for example, a front sensor or a back sensor), an Image Signal Processor (ISP) (not shown) or a flash (for example, an LED or xenon lamp).
  • the power managing module 295 manages power of the electronic device 201 .
  • the power managing module 295 may include, for example, a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (IC), or a battery or fuel gauge.
  • the PMIC may be mounted to, for example, an integrated circuit or an SoC semiconductor.
  • a charging method may, for example, include wired and wireless methods.
  • the charger IC charges a battery and prevents overvoltage or overcurrent from flowing from a charger.
  • the charger IC includes a charger IC for at least one of the wired charging method and the wireless charging method.
  • the wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method and an electromagnetic wave method, and additional circuits for wireless charging, for example, circuits such as a coil loop, a resonant circuit, a rectifier or the like may be added.
  • the battery gauge measures, for example, a remaining quantity of the battery 296 , or a voltage, a current, or a temperature during charging.
  • the battery 296 may store or generate electricity and supply power to the electronic device 201 by using the stored or generated electricity.
  • the battery 296 may include a rechargeable battery or a solar battery.
  • the indicator 297 shows particular statuses of the electronic device 201 or a part (for example, AP 210 ) of the electronic device 201 , for example, a booting status, a message status, a charging status and the like.
  • the motor 298 converts an electrical signal to a mechanical vibration.
  • the electronic device 201 may include a processing unit in the form of a processor (for example, a GPU) for supporting mobile TV.
  • the processing unit for supporting the mobile TV may process, for example, media data according to a standard of Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), media flow or the like.
  • Each of the components of the electronic device according to various examples of the disclosure may be implemented by one or more components and the name of the corresponding component may vary depending on a type of the electronic device.
  • the electronic device according to various examples of the disclosure may include at least one of the above described components, a few of the components may be omitted, or additional components may be further included. Also, some of the components of the electronic device according to various examples of the disclosure may be combined to form a single entity, and thus may equivalently execute functions of the corresponding components before being combined.
  • FIG. 3 is a block diagram illustrating an example program module.
  • a program module 310 (for example, the program 150 ) may include an Operating System (OS) for controlling resources related to the electronic device (for example, the electronic device 101 ) and/or various applications (for example, application programs 157 ) executed in the operating system.
  • the operating system may be, for example, Android, iOS, Windows, Symbian, Tizen, Bada, or the like.
  • the program module 310 may include a kernel 320 , middleware 330 , an Application Programming Interface (API) 360 , and/or an application 370 . At least some of the program module 310 may be preloaded on the electronic device, or may be downloaded from an external electronic device (e.g., the electronic device 102 or 104 , or the server 106 ).
  • the kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323 .
  • the system resource manager 321 may perform the control, allocation, retrieval, or the like of system resources.
  • the system resource manager 321 may include a process manager, a memory manager, a file system manager, or the like.
  • the device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an Inter-Process Communication (IPC) driver.
  • the middleware 330 may, for example, provide a function required by the applications 370 in common or provide various functions to the applications 370 through the API 360 so that the applications 370 can efficiently use limited system resources within the electronic device.
  • the middleware 330 (for example, the middleware 153 ) may include, for example, at least one of a runtime library 335 , an application manager 341 , a window manager 342 , a multimedia manager 343 , a resource manager 344 , a power manager 345 , a database manager 346 , a package manager 347 , a connectivity manager 348 , a notification manager 349 , a location manager 350 , a graphic manager 351 , and a security manager 352 .
  • the runtime library 335 may include a library module that a compiler uses in order to add a new function through a programming language while the applications 370 are being executed.
  • the runtime library 335 may perform input/output management, memory management, the functionality for an arithmetic function, or the like.
  • the application manager 341 may manage, for example, the life cycle of at least one of the applications 370 .
  • the window manager 342 may manage Graphical User Interface (GUI) resources used for the screen.
  • the multimedia manager 343 may determine a format required to reproduce various media files, and may encode or decode a media file by using a coder/decoder (codec) appropriate for the corresponding format.
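The multimedia manager's format-based codec selection can be sketched as a simple lookup. This is a minimal sketch under stated assumptions: the table contents, function name, and format/codec pairs below are illustrative and are not part of the disclosure.

```python
# Hypothetical format-to-codec table; the disclosure does not enumerate
# which formats map to which codecs.
CODECS = {
    "mp4": "h264",
    "mkv": "vp9",
    "mp3": "mp3",
    "flac": "flac",
}

def select_codec(media_file: str) -> str:
    """Determine the format from the media file name and return a codec
    appropriate for reproducing that format."""
    fmt = media_file.rsplit(".", 1)[-1].lower()
    try:
        return CODECS[fmt]
    except KeyError:
        raise ValueError(f"no codec registered for format {fmt!r}")

print(select_codec("movie.MP4"))  # h264
```

In practice the manager would inspect the file's container metadata rather than its extension; the extension lookup here only illustrates the selection step.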
  • the resource manager 344 may manage resources, such as a source code, a memory, a storage space, and the like of at least one of the applications 370 .
  • the power manager 345 may operate together with a Basic Input/Output System (BIOS) to manage a battery or power and may provide power information required for the operation of the electronic device.
  • the database manager 346 may generate, search for, and/or change a database to be used by at least one of the applications 370 .
  • the package manager 347 may manage the installation or update of an application distributed in the form of a package file.
  • the connectivity manager 348 may manage a wireless connection such as, for example, Wi-Fi or Bluetooth.
  • the notification manager 349 may display or notify of an event, such as an arrival message, an appointment, a proximity notification, and the like, in such a manner as not to disturb the user.
  • the location manager 350 may manage location information of the electronic device.
  • the graphic manager 351 may manage a graphic effect, which is to be provided to the user, or a user interface related to the graphic effect.
  • the security manager 352 may provide various security functions required for system security, user authentication, and the like. According to an example, when the electronic device (e.g., the electronic device 101 ) has a telephone call function, the middleware 330 may further include a telephony manager for managing a voice call function or a video call function of the electronic device.
  • the middleware 330 may include a middleware module that forms a combination of various functions of the above-described elements.
  • the middleware 330 may provide a module specialized for each type of OS in order to provide a differentiated function. Also, the middleware 330 may dynamically delete some of the existing elements, or may add new elements.
  • the API 360 may, for example, be a set of API programming functions, and may be provided with a different configuration according to an OS. For example, in the case of Android or iOS, one API set may be provided for each platform. In the case of Tizen, for example, two or more API sets may be provided for each platform.
  • the applications 370 may include, for example, one or more applications which can provide functions such as home 371 , dialer 372 , SMS/MMS 373 , Instant Message (IM) 374 , browser 375 , camera 376 , alarm 377 , contacts 378 , voice dialer 379 , email 380 , calendar 381 , media player 382 , album 383 , clock 384 , health care (for example, measure exercise quantity or blood sugar), or environment information (for example, atmospheric pressure, humidity, or temperature information).
  • the applications 370 may include an application (hereinafter, referred to as an “information exchange application” for convenience of description) supporting information exchange between the electronic device (e.g., the electronic device 101 ) and an external electronic device (e.g., the electronic device 102 or 104 ).
  • the application associated with information exchange may include, for example, a notification relay application for forwarding specific information to an external electronic device, or a device management application for managing an external electronic device.
  • the notification relay application may include a function of delivering, to the external electronic device (e.g., the electronic device 102 or 104 ), notification information generated by other applications (e.g., an SMS/MMS application, an email application, a health care application, an environmental information application, etc.) of the electronic device 101 . Further, the notification relay application may receive notification information from, for example, an external electronic device and provide the received notification information to a user.
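The notification relay described above can be sketched as follows. All class and method names here (`NotificationRelay`, `ExternalDevice`, `deliver`, and so on) are hypothetical assumptions for illustration; the disclosure does not specify an API.

```python
from dataclasses import dataclass, field

@dataclass
class Notification:
    source_app: str   # e.g. "sms", "email", "health_care"
    message: str

@dataclass
class ExternalDevice:
    name: str
    inbox: list = field(default_factory=list)

    def receive(self, notification: Notification) -> None:
        self.inbox.append(notification)

class NotificationRelay:
    """Forwards notification information generated by other applications
    of the electronic device to registered external electronic devices."""

    def __init__(self) -> None:
        self.targets: list[ExternalDevice] = []

    def register(self, device: ExternalDevice) -> None:
        self.targets.append(device)

    def deliver(self, notification: Notification) -> None:
        # Relay the notification to every registered external device.
        for device in self.targets:
            device.receive(notification)

relay = NotificationRelay()
watch = ExternalDevice("electronic watch")
relay.register(watch)
relay.deliver(Notification("sms", "new text message"))
print(watch.inbox[0].message)  # new text message
```

The reverse direction (receiving notification information from an external device and presenting it to the user) would mirror this structure with the roles swapped.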
  • the device management application may manage (for example, install, delete, or update) at least one function of an external electronic device (for example, the electronic devices 102 and 104 ) communicating with the electronic device (for example, a function of turning on/off the external electronic device itself (or some components) or a function of adjusting luminance (or a resolution) of the display), applications operating in the external electronic device, or services provided by the external electronic device (for example, a call service and a message service).
  • the applications 370 may include applications (for example, a health care application of a mobile medical appliance or the like) designated according to attributes of the external electronic device 102 or 104 .
  • the application 370 may include an application received from the external electronic device (e.g., the server 106 , or the electronic device 102 or 104 ).
  • the application 370 may include a preloaded application or a third party application which can be downloaded from the server. Names of the elements of the program module 310 , according to the above-described examples, may change depending on the type of OS.
  • At least some of the program module 310 may be implemented in software, firmware, hardware, or a combination of two or more thereof. At least some of the program module 310 may be implemented (e.g., executed) by, for example, the processor (e.g., the processor 210 ). At least some of the program module 310 may include, for example, a module, a program, a routine, a set of instructions, and/or a process for performing one or more functions.
  • the term “module” as used herein may, for example, mean a unit including one of hardware, software, and firmware or a combination of two or more of them.
  • the “module” may be interchangeably used with, for example, the term “unit”, “logic”, “logical block”, “component”, or “circuit”.
  • the “module” may be a minimum unit of an integrated component element or a part thereof.
  • the “module” may be a minimum unit for performing one or more functions or a part thereof.
  • the “module” may be mechanically or electronically implemented.
  • the “module” may include at least one of an Application-Specific Integrated Circuit (ASIC) chip, a Field-Programmable Gate Array (FPGA), and a programmable-logic device for performing operations which have been known or are to be developed hereinafter.
  • At least some of the devices (for example, modules or functions thereof) or the method (for example, operations) according to the disclosure may be implemented by a command stored in a computer-readable storage medium in a programming module form.
  • the instruction, when executed by one or more processors (e.g., the processor 130 ), may cause the one or more processors to execute the function corresponding to the instruction.
  • the computer-readable storage medium may be, for example, the memory 140 .
  • the computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (for example, a magnetic tape), optical media (for example, a Compact Disc Read Only Memory (CD-ROM) and a Digital Versatile Disc (DVD)), magneto-optical media (for example, a floptical disk), a hardware device (for example, a Read Only Memory (ROM), a Random Access Memory (RAM), a flash memory), and the like.
  • the program instructions may include high-level language code, which can be executed in a computer by using an interpreter, as well as machine code generated by a compiler.
  • the aforementioned hardware device according to various examples may be configured to be operated as one or more software modules in order to perform the operation of the disclosure, and vice versa.
  • the module or the programming module, e.g., processing circuitry, may include one or more of the aforementioned components or may further include other additional components, or some of the aforementioned components may be omitted.
  • Operations executed by a module, a programming module, or other component elements according to various examples may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. Further, some operations may be executed according to another order or may be omitted, or other operations may be added.
  • the examples disclosed herein are suggested for describing and helping understanding of disclosed technical contents, and do not limit the scope of the technology described in the disclosure. Accordingly, the scope of the disclosure should be construed as including all modifications or various other examples based on the technical idea of the disclosure.
  • an apparatus and a method of providing information by using an object corresponding to an external electronic device (for example, the electronic devices 102 and 104 , or the server 106 ) are described with reference to FIGS. 4 to 12 .
  • FIG. 4 illustrates an example of an actual use environment 400 , in which an electronic device (for example, the electronic device 101 ) is linked with an external electronic device (for example, the electronic devices 102 and 104 , and the server 106 ) within a designated space.
  • FIG. 5 illustrates an example of a virtual environment 500 provided by an electronic device (for example, the electronic device 101 ).
  • an actual use environment 400 may be an environment in which a plurality of electronic devices located within a space are linked with each other to provide a user 410 with information, based on the space in which the user 410 is located.
  • the actual use environment 400 may be an environment in which a plurality of electronic devices 401 and 413 to 445 located in the living room provide the user 410 with information.
  • the actual use environment 400 may be an environment providing the user 410 with information through the linkage of a plurality of electronic devices located in the office.
  • the user 410 may wear a Head-Mounted Device (HMD) 401 (for example, the electronic device 101 ) in a living room and look at a partial area 490 of the living room.
  • the external electronic devices 413 to 445 for the HMD 401 may be located in the living room.
  • the external electronic devices 413 to 445 may be, for example, an electronic watch 413 , an electronic tattoo 415 , a lighting 420 , speakers 423 - 1 and 423 - 2 , a television 425 , an interphone 427 , an air conditioner 429 , an electronic drawer 430 , a smart blind 433 , a smart phone 440 , a tablet PC 443 , or a laptop PC 445 .
  • the external electronic devices 413 to 445 are not limited to the aforementioned devices, and may include various electronic devices.
  • the electronic device 401 is not limited to the illustrated HMD 401 , and may be various devices capable of providing the user 410 with visual information.
  • At least some of the electronic device 401 and the external electronic devices 413 to 445 may be connected with each other through wired or wireless communication.
  • at least some of the electronic device 401 and the external electronic devices 413 to 445 may be directly connected through a local network including a home network or Device to Device (D2D) communication.
  • the external electronic devices 413 to 445 may output information (for example, a notification or contents).
  • the television 425 may output a video 465 .
  • the tablet PC 443 may output text 483 .
  • the laptop PC 445 may output a document 485 .
  • the electronic drawer 430 may output a notification notifying that a drawer is opened.
  • the smart phone 440 may output a notification notifying a reception of a text message.
  • the notification may be a notification sound, vibration, scent, light, waves corresponding to a brainwave of a person, or an image.
  • the external electronic devices 413 to 445 may output at least one of the aforementioned notifications through which the user is capable of recognizing states or locations of the external electronic devices 413 to 445 .
  • the electronic device 401 may provide the user 410 with contents corresponding to the information together with a virtual environment.
  • the notification provided by the electronic drawer 430 and notifying that the drawer is opened may be provided in a form of an image through the electronic device 401 .
  • the image may be displayed on at least a part of a region in which an object corresponding to the electronic drawer 430 is displayed in the virtual environment.
  • a virtual environment 500 may be provided through an electronic device 501 (for example, a display (for example, the display 170 ) functionally connected with the electronic device 501 ).
  • the virtual environment 500 may include an image (for example, a still image or a video) identical or similar to an actual use environment (for example, the actual use environment 400 ), in which the electronic device 501 or external electronic devices for the electronic device 501 are disposed.
  • the virtual environment 500 may include an image (for example, a photograph of the living room or a drawing image of the living room) for the living room.
  • the virtual environment 500 may include an image for an environment different from the actual use environment.
  • a space, in which the electronic device 501 and a user 510 are actually located is the “living room”, but the virtual environment 500 may include a “classroom”.
  • the electronic device 501 may provide the virtual environment 500 different from the actual space based on contents which have been provided (or are to be provided) to the user 510 or a user's input. For example, if the user 510 desires a simulation for “shopping”, the electronic device 501 may provide an image for a shopping mall as an example of the virtual environment 500 . As another example, if the contents are movie contents, the electronic device 501 may provide an image of a movie theater as the virtual environment 500 . According to an example, the virtual environment 500 may include a 3D image, as well as a 2D image.
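The content-based selection of a virtual environment described above can be sketched as a lookup with a fallback to the actual use environment. The mapping entries and image names below are illustrative assumptions only.

```python
# Hypothetical mapping from content type to a virtual-environment image;
# the disclosure gives only "shopping" and "movie" as examples.
ENVIRONMENTS = {
    "shopping": "shopping_mall.img",
    "movie": "movie_theater.img",
}
# Fall back to an image of the actual space (e.g. the living room).
DEFAULT_ENVIRONMENT = "living_room.img"

def choose_environment(content_type: str) -> str:
    """Return the virtual-environment image for the contents to be
    provided, or the actual-use-environment image otherwise."""
    return ENVIRONMENTS.get(content_type, DEFAULT_ENVIRONMENT)

print(choose_environment("movie"))  # movie_theater.img
```

A user input (for example, explicitly selecting a simulation) could be handled the same way, by consulting the table with the user's choice before falling back.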
  • one or more objects 513 to 547 (hereinafter, candidate objects for convenience of description) corresponding to one or more external electronic devices (hereinafter, for convenience of description, “candidate electronic devices”) among the external electronic devices 413 to 445 disposed in the space (for example, the living room) in which the electronic device 501 is located, or other external electronic devices disposed in a space (not shown) different from the space, may be displayed on the virtual environment 500 .
  • the electronic watch 413 , the electronic tattoo 415 , the speakers 423 - 1 and 423 - 2 , the television 425 , the air conditioner 429 , the smart phone 440 , and the smart blind 433 which are disposed in a region 490 (viewed by the user) corresponding to the gaze of the user 510 among the external electronic devices 413 to 445 may be selected as candidate electronic devices.
  • a washing machine located at another space (for example, a utility room) connected with the electronic device 501 through a local network may be selected as a candidate electronic device. Additional information about the method of selecting the candidate electronic device will be described with reference to FIG. 6 .
  • the candidate objects 513 to 547 may be, for example, an image, text, a video, or an icon corresponding to the candidate electronic device.
  • the candidate objects 513 to 547 may be images related to an appearance of the candidate electronic device.
  • the candidate objects 513 to 547 may include speaker images 523 - 1 and 523 - 2 identical or similar to appearances of the speakers 423 - 1 and 423 - 2 , a television image 525 identical or similar to an appearance of the television 425 , an image 529 different from an actual image of the air conditioner 429 but representing the air conditioner 429 , an image 533 different from an actual image of the smart blind 433 but representing the smart blind 433 and an image 540 different from an actual image of the smart phone 440 but representing the smart phone 440 , e.g., icons.
  • the candidate objects 513 to 547 may include, for example, text (for example, a product name or identification information) for the candidate electronic device.
  • the candidate objects 513 to 547 may include an output window 515 including “electronic tattoo” that is text corresponding to the electronic tattoo 415 .
  • the candidate objects 513 to 547 may be icons for the candidate electronic devices.
  • the candidate objects 513 to 547 may include an electronic watch icon 513 representing the electronic watch 413 or a washing machine icon 547 representing a washing machine.
  • the candidate objects 513 to 547 may be displayed on a region of the virtual environment 500 corresponding to relative positions of the candidate electronic devices for the electronic device 501 .
  • the user 510 may use the HMD 501 that is the electronic device 501 at a first location corresponding to a left-rear side of the living room.
  • the user 510 may have the electronic tattoo 415 attached to his/her arm, and may hold the smart phone 440 in a hand.
  • locations of the electronic tattoo 415 and the smart phone 440 may be the same as or similar to the location of the HMD 501 .
  • the electronic tattoo object 515 corresponding to the electronic tattoo 415 and the smart phone object 540 corresponding to the smart phone 440 may be displayed in a first region corresponding to a left lower side of the virtual environment 500 as candidate objects.
  • the television 425 may be located at a second location corresponding to a center front side of the living room.
  • the television object 525 corresponding to the television 425 may be displayed in a second region corresponding to the center front side of the virtual environment 500 as a candidate object.
  • the first speaker 423 - 1 and the second speaker 423 - 2 may be located at third locations which are both sides of the television 425 in the living room.
  • the first speaker object 523 - 1 and the second speaker object 523 - 2 , corresponding to the first speaker 423 - 1 and the second speaker 423 - 2 respectively, may be displayed on third regions, which are both sides of the television object 525 in the virtual environment 500 , as candidate objects.
  • the air conditioner 429 may be located at a fourth location corresponding to a front side of the living room.
  • the air conditioner object 529 corresponding to the air conditioner 429 may be displayed in a fourth region corresponding to a right center side of the virtual environment 500 as a candidate object.
  • the smart blind 433 may be located at a fifth location corresponding to a right front side of the air conditioner 429 in the living room.
  • the smart blind object 533 corresponding to the smart blind 433 may be displayed in a fifth region corresponding to a right upper end of the air conditioner object 529 in the virtual environment 500 as a candidate object.
  • the candidate objects 513 to 547 may be displayed on a region 503 designated in the virtual environment 500 .
  • the right lower region 503 may be designated as a region for displaying the candidate objects 513 to 547 in the virtual environment 500 .
  • the electronic watch object 513 corresponding to the electronic watch 413 and the washing machine object 547 corresponding to the washing machine may be displayed on the region 503 designated for displaying the objects.
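The placement rule above (a region matching the device's position relative to the electronic device 501, with a designated fallback region for devices whose position is unknown, such as the washing machine in another room) can be sketched as follows. The normalized coordinates and names are illustrative assumptions.

```python
from typing import Optional, Tuple

# Hypothetical right-lower region designated for objects whose
# corresponding devices have no known relative position (region 503).
DESIGNATED_REGION = (0.8, 0.8)

def place_object(relative_pos: Optional[Tuple[float, float]]) -> Tuple[float, float]:
    """Map a candidate device's relative position (x: left 0.0 .. right 1.0,
    y: top 0.0 .. bottom 1.0) to the display region of its candidate object,
    or use the designated region when no position is known."""
    if relative_pos is None:
        return DESIGNATED_REGION
    return relative_pos

# The television at the center front of the living room:
print(place_object((0.5, 0.5)))  # (0.5, 0.5)
# A washing machine in the utility room (no relative position):
print(place_object(None))        # (0.8, 0.8)
```

A fuller implementation would derive the relative position from sensing (for example, signal strength or a stored floor plan); the sketch only shows the region-selection step.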
  • contents may be provided in relation to the region on which at least some objects (hereinafter, target objects for convenience of description) among one or more candidate objects are displayed in the virtual environment 500 .
  • the contents may correspond to information to be output by at least some electronic devices (hereinafter, referred to as target electronic devices for convenience of description) corresponding to the target objects among one or more candidate electronic devices.
  • text 565 “Body temperature: 37°, and Stress: 90%” corresponding to information on a body temperature and a degree of stress of the user 510 , which is provided through the electronic tattoo 415 may be provided through the region on which the electronic tattoo object 515 is displayed.
  • contents 465 (for example, a movie) to be output through the television 425 may be provided through the region, on which the television object 525 that is the target object is displayed.
  • the text message icon 590 corresponding to notification information about the reception of the text message to be provided by the smart phone 440 may be provided through the region on which the smart phone object 540 that is the target object is displayed.
  • text 597 corresponding to notification information about “washing is completed” to be output by the washing machine may be provided through an upper region of the region, on which the washing machine object 547 that is the target object is displayed.
  • a control menu (for example, an image, text, or an icon) for controlling a target electronic device may be provided through a region on which the corresponding target object is displayed.
  • a control menu 583 for opening or closing the smart blind 433 may be provided through the region, on which the smart blind object 533 is displayed.
  • FIG. 6 is a block diagram illustrating an example information providing module (for example, the information providing module 110 in the form, for example, of processing circuitry) included in an electronic device (for example, the electronic device 101 ).
  • an information providing module 610 may include, for example, a device checking module 620 , an object display module 630 , an information obtaining module 640 , a content providing module 650 , and a device control module 660 .
  • the device checking module 620 may be implemented by hardware, software, or firmware, or any combination thereof which is capable of performing or being configured to perform designated operations.
  • the device checking module 620 may be configured to check, for example, a candidate electronic device among one or more external electronic devices (for example, the electronic devices 102 and 104 , and the server 106 ) for the electronic device. According to an example, the device checking module 620 may be configured to recognize (for example, scan) one or more external electronic devices communicable with the electronic device. One or more external electronic devices may be devices connected with the electronic device through a network (for example, the network 162 ) or a short range communication (for example, the short range communication 164 ).
  • the device checking module 620 may be configured to transmit a request signal requesting communication to the plurality of external electronic devices through Wi-Fi communication (or Bluetooth, infrared communication, or short range communication).
  • the device checking module 620 may be configured to receive a response signal for the request signal from one or more external electronic devices communicable through the Wi-Fi communication among the plurality of external electronic devices. Accordingly, the device checking module 620 may be configured to recognize the one or more external electronic devices.
  • the device checking module 620 may be configured to receive a signal transmitted (for example, broadcasted) from one or more external electronic devices (a Bluetooth Low Energy (BLE) tag included in the one or more external electronic devices) without requesting communication from the plurality of external electronic devices. Accordingly, the device checking module 620 may recognize the one or more external electronic devices.
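The two discovery modes above (an active request/response scan and a passive scan that only listens for broadcast signals such as BLE advertisements) can be sketched as follows. This is a minimal illustration, not the patented implementation; the device and advertisement shapes are assumptions made for the example.

```python
# Hypothetical sketch of the two recognition modes described above.
from dataclasses import dataclass

@dataclass(frozen=True)
class Advertisement:
    device_id: str   # identifier carried in the broadcast packet
    transport: str   # e.g., "ble", "wifi"

class DeviceChecker:
    def __init__(self):
        self.recognized = {}

    def active_scan(self, devices):
        """Send a (simulated) request and keep the devices that respond."""
        for device in devices:
            if device.get("responds", False):
                self.recognized[device["id"]] = device
        return list(self.recognized)

    def on_advertisement(self, adv: Advertisement):
        """Recognize a device from its broadcast alone, without a request."""
        self.recognized.setdefault(
            adv.device_id, {"id": adv.device_id, "transport": adv.transport})
        return adv.device_id in self.recognized
```

Either path ends with the same recognized-device set, which is what the candidate-selection step below consumes.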
  • the device checking module 620 may be configured to recognize one or more external electronic devices in response to the use (e.g., wearing) of the electronic device by a user. For example, if the electronic device is a wearable device, the device checking module 620 may be configured to check whether the electronic device is worn by the user. Accordingly, if the electronic device is worn by the user, the device checking module 620 may recognize, for example, the one or more external electronic devices. On the other hand, if the electronic device is not worn by the user, the device checking module 620 may not recognize, for example, the one or more external electronic devices. According to various examples, the device checking module 620 may recognize one or more external electronic devices based on various user inputs (for example, selection of a menu item related to the recognition of the external electronic device), as well as whether the user uses the electronic device.
  • the device checking module 620 may be configured to automatically recognize one or more external electronic devices.
  • the device checking module 620 may be configured to periodically (for example, about once an hour) recognize one or more external electronic devices.
  • the device checking module 620 may recognize one or more external electronic devices in response to an operation of the one or more external electronic devices.
  • the device checking module 620 may detect a signal corresponding to a turn-on operation of the one or more external electronic devices.
  • the device checking module 620 may recognize the one or more external electronic devices based on the signal.
  • the device checking module 620 may be configured to select a candidate electronic device from the recognized one or more external electronic devices. According to an example, the device checking module 620 may be configured to select an electronic device located within a designated range from the electronic device as a candidate electronic device.
  • the designated range may include at least one of, for example, a range detectable by a sensor (for example, the image sensor) included in the electronic device, a range in which short range communication (for example, the short range communication 164 ) with the electronic device may be established, a range in which communication with the electronic device at a designated signal intensity or higher may be established, or ranges connected with each other through a local network.
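Candidate selection against the designated-range criteria listed above might be sketched as below. The field names (distance_m, signal, on_local_network) and the 10-meter cutoff are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch: a device satisfying any one designated-range criterion
# becomes a candidate electronic device.
def select_candidates(devices, max_distance_m=10.0, min_signal="middle"):
    levels = {"low": 0, "middle": 1, "high": 2}
    candidates = []
    for d in devices:
        in_range = d.get("distance_m", float("inf")) <= max_distance_m
        strong = levels[d.get("signal", "low")] >= levels[min_signal]
        local = d.get("on_local_network", False)
        if in_range or strong or local:
            candidates.append(d["id"])
    return candidates
```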
  • the device checking module 620 may select an electronic device located at a place that the user can visually recognize, among the recognized one or more external electronic devices, as a candidate electronic device.
  • the device checking module 620 may be configured to obtain an image at a location corresponding to the gaze of the user through the image sensor included in the electronic device.
  • the device checking module 620 may select one or more electronic devices included in the image as a candidate electronic device.
  • the device checking module 620 may be configured to select an electronic device located proximate to the user among the recognized one or more external electronic devices as a candidate electronic device. To this end, the device checking module 620 may select, as a candidate electronic device, one or more electronic devices that can establish short range communication (for example, Wi-Fi, Bluetooth, or infrared communication) with the electronic device among the one or more external electronic devices connected with the electronic device through various communication methods.
  • the electronic device may include an infrared communication module communicable with an external electronic device corresponding to a designated direction.
  • the device checking module 620 may select an external electronic device located in a desired direction as a candidate electronic device by using the infrared communication module.
  • the device checking module 620 may receive an infrared signal from one or more external electronic devices located in a direction corresponding to the infrared communication module through infrared communication.
  • the device checking module 620 may select the one or more external electronic devices as a candidate electronic device.
  • the device checking module 620 may be configured to select, for example, one or more electronic devices, in which a communication signal intensity between the electronic device and one or more external electronic devices is a designated signal intensity or more (for example, high when signal intensity is divided into high, middle, and low), among the recognized one or more external electronic devices as a candidate electronic device.
  • the recognized one or more external electronic devices may be connected with the electronic device through various networks.
  • the device checking module 620 may select one or more electronic devices connected through the local network (for example, a home network) among the one or more external electronic devices as a candidate electronic device.
  • the device checking module 620 may be configured to select a candidate electronic device based on an input, such as, for example, a user input.
  • the device checking module 620 may be configured to recognize, for example, the gaze of the user for the electronic device through an eye recognizing sensor functionally connected with the electronic device (for example, included in the electronic device).
  • the device checking module 620 may be configured to select an electronic device corresponding to the gaze of the user among the one or more external electronic devices as a candidate electronic device.
  • the device checking module 620 may be configured to recognize, for example, a motion of the user (for example, a motion of indicating the electronic device with a hand or a finger of the user) through the image sensor functionally connected with the electronic device.
  • the device checking module 620 may select an electronic device corresponding to the motion (for example, the electronic device indicated by the hand or the finger of the user) as a candidate electronic device.
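The gaze-based and gesture-based selections above both reduce to mapping a recognized point (gaze location or pointing direction projected onto the display) to the device whose region contains it. A minimal hit-test sketch follows; the axis-aligned rectangle representation of each device's region is an assumption for illustration.

```python
# Hedged sketch: resolve a gaze/pointing coordinate to a candidate device.
def pick_by_gaze(gaze_xy, regions):
    """regions: {device_id: (x, y, width, height)} in display coordinates."""
    gx, gy = gaze_xy
    for device_id, (x, y, w, h) in regions.items():
        if x <= gx < x + w and y <= gy < y + h:
            return device_id
    return None  # the gaze does not correspond to any candidate
```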
  • the device checking module 620 may be configured to select a candidate electronic device based on operation states of one or more communicable external electronic devices.
  • the device checking module 620 may be configured to select an electronic device which is outputting (or is to output) information (for example, contents, a sound, scent, light, radio waves, or vibrations) among one or more external electronic devices as a candidate electronic device.
  • the device checking module 620 may be configured to obtain, for example, operation information about an external electronic device regarding whether one or more external electronic devices are outputting information, through wired or wireless communication. Based on the operation information, the device checking module 620 may select the electronic device, which is outputting the information, as a candidate electronic device.
  • the device checking module 620 may be configured to obtain, for example, an image of a first external electronic device, which outputs a video through a display and an image of a second external electronic device, which does not output the information, through the image sensor. For example, based on the images, the device checking module 620 may select the first external electronic device as a candidate electronic device.
  • the object display module 630 may be configured to display a candidate object corresponding to the candidate electronic device. According to an example, the object display module 630 may be configured to display a virtual environment (for example, the virtual environment 500 ) for displaying a candidate object through a display (for example, the display 170 ) functionally connected with the electronic device. According to an example, the object display module 630 may be configured to select a virtual environment to be displayed through the electronic device among a plurality of virtual environments based on at least one input for the electronic device. The object display module 630 may be configured to provide the selected virtual environment. According to an example, the object display module 630 may be configured to display a virtual environment obtained through the candidate electronic device or another external electronic device through the display.
  • the object display module 630 may be configured to obtain a virtual environment through a candidate electronic device based on a communication connection between the candidate electronic device and the electronic device or a request of the user. Additional information on the method of providing the virtual environment will be described with reference to FIG. 7 below.
  • the object display module 630 may be configured to display the candidate object within the virtual environment.
  • the object display module 630 may display the candidate object through a region corresponding to a location of the candidate electronic device in an actual use environment or a region (for example, the region 503 ) designated for displaying the candidate object.
  • the object display module 630 may be configured to differently display the candidate objects based on processing information for the candidate object or the virtual environment.
  • the processing information may, for example, include information on a time for which the gaze of the user for the candidate electronic device stays, or a degree of communication connection (pairing) with the electronic device.
  • the object display module 630 may be configured to display the candidate object with a chroma, luminance, brightness, light and shade, transparency, color, shape, shaking, motion, or size designated according to the time for which the gaze of the user for the candidate electronic device stays, or a communication connection state.
  • for example, if the user views the candidate electronic device for a first time, the object display module 630 may be configured to display the candidate object with a first color (for example, black and white); if the user views the candidate electronic device for a second time (for example, about 30 seconds) longer than the first time, the object display module 630 may display the candidate object with a second color (for example, full color).
  • the object display module 630 may be configured to display the candidate object with a different size according to the communication connection state between the electronic device and the candidate electronic device.
  • the communication connection state may be divided into a “communication connection attempt”, “ongoing communication connection”, and “communication connection completion”. If the communication connection state between the electronic device and the candidate electronic device is the “communication connection attempt”, the object display module 630 may be configured to display the candidate object with a first size. If the communication connection state between the electronic device and the candidate electronic device is changed to the “ongoing communication connection”, the object display module 630 may change a size of the candidate object to a second size larger than the first size. If the communication connection state between the electronic device and the candidate electronic device is changed to the “communication connection completion”, the object display module 630 may change a size of the candidate object to a third size larger than the second size.
  • in the above example, the communication connection state is divided into three steps (for example, the “communication connection attempt”, the “ongoing communication connection”, and the “communication connection completion”), but it may be variously set according to various examples.
  • the communication connection state may be divided into four steps, that is, “communication connection attempt”, “ongoing communication connection by 40%”, “ongoing communication connection by 80%”, and “communication connection completion”. Additional information about the method of displaying the candidate object with a designated format will be described with reference to FIG. 11 .
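The state-to-size mapping described above (a strictly larger candidate object at each later connection step) might look like the following sketch. The pixel values are assumptions; only the monotonic growth across the three example steps comes from the text.

```python
# Hedged sketch: later connection steps map to strictly larger object sizes.
CONNECTION_STEPS = [
    "communication connection attempt",
    "ongoing communication connection",
    "communication connection completion",
]

def object_size_for_state(state, base_px=40, step_px=20):
    """Return a display size (in pixels) for the candidate object."""
    return base_px + CONNECTION_STEPS.index(state) * step_px
```

A four-step variant, as in the 40%/80% example, would only extend CONNECTION_STEPS.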
  • the object display module 630 may be configured to display a user interface (for example, a progress bar or a loading bar) indicating a degree of processing the candidate object or the virtual environment in relation to the candidate object.
  • the object display module 630 may be configured to display a progress bar for displaying a time, for which the gaze of the user for the candidate object stays, in the region, in which the candidate object is displayed. If the gaze of the user for the candidate object stays for the first time (for example, about 5 seconds), the object display module 630 may display a first progress bar indicating a first percentage (for example, about 30%). Further, if the gaze of the user for the candidate object stays for the second time (for example, about 15 seconds) longer than the first time, the object display module 630 may display a second progress bar indicating a second percentage (for example, about 90%) larger than the first percentage.
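The dwell-time progress indicator above reduces to mapping gaze-dwell duration to a percentage. A sketch, assuming (hypothetically) that 30 seconds of dwell corresponds to 100%:

```python
# Hedged sketch: the displayed percentage grows with gaze dwell time.
def gaze_progress_percent(dwell_seconds, full_at_seconds=30.0):
    fraction = min(dwell_seconds / full_at_seconds, 1.0)  # cap at 100%
    return round(fraction * 100)
```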
  • the object display module 630 may be configured to differently display the plurality of candidate objects so that the plurality of candidate objects can be distinguished from each other.
  • the object display module 630 may be configured to display, for example, the plurality of candidate objects with different frequencies, colors, or brightness so that the plurality of candidate objects can be distinguished through a brainwave signal of the user viewing the plurality of candidate objects.
  • the object display module 630 may be configured to check a first region of the display for displaying a first candidate object and a second region of the display for displaying a second candidate object.
  • the object display module 630 may be configured to display the first candidate object at a first frequency (for example, about 60.2 Hz) through the first region, and display the second candidate object at a second frequency (for example, about 60.4 Hz) through the second region.
  • the object display module 630 may be configured to display the first candidate object with a first color (for example, red) and display the second candidate object with a second color (for example, blue). Further, the object display module 630 may be configured to display the first candidate object with first brightness (for example, about 2 lux), and display the second candidate object with second brightness (for example, about 5 lux).
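Assigning each candidate object a distinct flicker frequency, as in the 60.2 Hz / 60.4 Hz example above, can be sketched as below. The base and step values mirror that example; treating the resulting brainwave response as distinguishable per frequency is the steady-state visual evoked potential idea the text relies on.

```python
# Hedged sketch: tag each candidate object with a distinct display frequency.
def assign_frequencies(object_ids, base_hz=60.2, step_hz=0.2):
    return {obj: round(base_hz + i * step_hz, 1)
            for i, obj in enumerate(object_ids)}
```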
  • the information obtaining module 640 may be configured to obtain, for example, information (for example, the video 465 , the text 483 , and the text message 485 ) to be output (or currently output) through a target electronic device corresponding to a target object among one or more candidate objects. According to an example, the information obtaining module 640 may be configured to determine a target object based on a user's input (for example, a gesture, a touch, or the eye), execution information of the candidate electronic device, or the motion information about the candidate electronic device.
  • the information obtaining module 640 may be configured to check the gaze of the user based on a user's input. If the gaze of the user corresponds to the first candidate object, the information obtaining module 640 may be configured to determine the first candidate object as a target object. If the gaze of the user corresponds to a second candidate object, the information obtaining module 640 may determine the second candidate object as a target object.
  • the information obtaining module 640 may be configured to recognize, for example, a motion (for example, rotation) of a head of the user based on a user's input through a motion sensor functionally connected with the electronic device (for example, included in the electronic device). If the motion corresponds to a first motion corresponding to the first candidate object (for example, the head rotates in a first direction corresponding to the first candidate object), the information obtaining module 640 may determine the first candidate object as a target object. If the motion corresponds to a second motion corresponding to the second candidate object (for example, the head rotates in a second direction corresponding to the second candidate object), the information obtaining module 640 may determine the second candidate object as a target object. For example, the electronic device (for example, the electronic device 101 ) may use at least one of the gesture sensor 240 A, the gyro sensor 240 B, and the acceleration sensor 240 E in the sensor module 240 as a motion sensor.
  • the information obtaining module 640 may be configured to determine a target object based on biometric information (for example, a brainwave) of the user as a user's input.
  • the brainwave of the user may be differently measured according to visual information viewed by the user (for example, an output frequency, a color, or brightness of visual information).
  • the information obtaining module 640 may be configured to determine which of the first image and the second image the user is viewing, by using the brainwave obtained through a brainwave recognizing sensor functionally connected with the electronic device.
  • the object display module 630 may be configured to output the first candidate object through the display at the first frequency, and output the second candidate object through the display at the second frequency.
  • the information obtaining module 640 may be configured to obtain the brainwave of the user. If the brainwave of the user is the first signal corresponding to the first frequency, the information obtaining module 640 may be configured to determine the first candidate object as a target object. If the brainwave of the user is the second signal corresponding to the second frequency, the information obtaining module 640 may be configured to determine the second candidate object as a target object.
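Resolving the target object from the measured brainwave response, per the two-frequency example above, amounts to nearest-frequency matching. A sketch, with a hypothetical matching tolerance:

```python
# Hedged sketch: pick the candidate whose display frequency is closest to
# the measured brainwave response frequency, within a tolerance.
def target_from_brainwave(measured_hz, object_freqs, tolerance_hz=0.1):
    """object_freqs: {object_id: display_frequency_hz}."""
    best_id = min(object_freqs,
                  key=lambda o: abs(object_freqs[o] - measured_hz))
    if abs(object_freqs[best_id] - measured_hz) <= tolerance_hz:
        return best_id
    return None  # no candidate matches the measured response
```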
  • the information obtaining module 640 may be configured to check information about reception of a notification by the candidate electronic device as execution information. Accordingly, the information obtaining module 640 may be configured to determine the candidate object corresponding to the candidate electronic device as a target object. For example, the information obtaining module 640 may be configured to check information about completion of washing by the washing machine that is the candidate electronic device as execution information. Accordingly, the information obtaining module 640 may determine the washing machine object corresponding to the washing machine as a target object.
  • the information obtaining module 640 may be configured to check, for example, motion information about one or more candidate electronic devices. For example, if the motion information corresponds to designated motion information, the object display module 630 may be configured to determine a candidate object corresponding to the candidate electronic device as a target object. For example, if a user wearing an electronic watch raises his/her arm in a direction of a face of the user, the information obtaining module 640 may be configured to determine that the user intends to check the electronic watch. Accordingly, the motion information corresponding to the movement of the electronic watch in an upper direction may be set as the designated motion information. For example, a motion sensor included in the electronic watch may detect the motion information corresponding to the movement of the electronic watch in the upper direction. For example, the information obtaining module 640 may obtain the motion information from the electronic watch. The information obtaining module 640 may determine the electronic watch as a target electronic device based on the motion information.
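The wrist-raise check above could be implemented, in a simplified form, as a threshold test over recent accelerometer samples. The axis convention, threshold, and sample count are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch: treat sustained upward acceleration as the designated
# "watch raised toward the face" motion.
def is_wrist_raise(z_accel_samples, threshold=0.5, min_samples=3):
    """z_accel_samples: upward acceleration readings, most recent last."""
    recent = z_accel_samples[-min_samples:]
    return (len(recent) == min_samples
            and all(sample > threshold for sample in recent))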
  • motion information about one or more candidate electronic devices may be checked by various sensors, such as the image sensor functionally connected with the one or more candidate electronic devices or the electronic device, as well as a motion sensor included in the one or more candidate electronic devices.
  • the information obtaining module 640 may be configured to determine a group including one or more candidate objects as a target object, as well as determine each candidate object as a target object.
  • the group may be determined based on at least one of, for example, a user's input, a region, in which the candidate object is displayed, and functions of the candidate electronic devices.
  • first to third candidate objects may be determined as a first group
  • fourth and fifth candidate objects may be determined as a second group.
  • first and second candidate objects located in the first region of the virtual environment may be determined as the first group
  • third to fifth candidate objects located in the second region of the virtual environment may be determined as the second group.
  • one or more candidate objects corresponding to one or more candidate electronic devices, in which a relevant function is executed, may be determined as one group.
  • a monitor object corresponding to a monitor for outputting a video and a speaker object corresponding to a speaker for outputting a sound corresponding to the video may be determined as one group.
  • a game console object corresponding to a game console for controlling a game, a monitor object corresponding to a monitor for displaying an image of the game, and a speaker object corresponding to a speaker for outputting a sound for the game may be determined as one group.
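Grouping candidate objects whose devices run related functions, as in the game console/monitor/speaker example, can be sketched by keying devices on a shared activity. The "session" key is a hypothetical representation introduced only for this illustration.

```python
# Hedged sketch: devices participating in the same activity form one group.
def group_by_function(devices):
    """devices: [{"id": ..., "session": ...}] -> {session: [device ids]}."""
    groups = {}
    for d in devices:
        groups.setdefault(d["session"], []).append(d["id"])
    return groups
```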
  • the information obtaining module 640 may be configured to obtain information to be output through the target electronic device corresponding to the target object. For example, a movie may be currently output through the target electronic device. The information obtaining module 640 may be configured to obtain information on at least a part (for example, a next part of the part output by the target electronic device) of the currently output movie. For example, when a notification (for example, a notification for reception of a text message when the target electronic device is a smart phone) is output (or to be output) by the target electronic device, the information obtaining module 640 may receive information about the notification.
  • for example, when a smart phone that is the target electronic device receives a text message, the smart phone may need to output at least one of a ring sound, a vibration, text message contents, and text message sender information corresponding to the reception of the text message.
  • the information obtaining module 640 may be configured to obtain at least one of information indicating that the text message is received, text message contents, and the text message sender information.
  • the information obtaining module 640 may be configured to directly obtain information from the target electronic device through wireless or wired communication (for example, D2D communication). Further, according to another example, the information obtaining module 640 may be configured to obtain information about the target electronic device through another electronic device (for example, the electronic devices 102 and 104 , and the server 106 ) connected with the target electronic device. For example, the information obtaining module 640 may obtain the information through a server managing the information to be output by the target electronic device.
  • the content providing module 650 may be configured to provide, for example, contents corresponding to the obtained information in relation to the region, on which the target object is displayed (for example, within the displayed region or a region close to the displayed region). According to an example, the content providing module 650 may be configured to determine contents corresponding to the obtained information. For example, when the obtained information is information about a notification, the content providing module 650 may be configured to determine an image, a video, text, an icon, a sound, or a vibration related to the notification as the contents corresponding to the notification.
  • the obtained information may be information about contents (for example, a video, an image, or text) output by the target electronic device.
  • the content providing module 650 may be configured to determine contents identical or similar to the contents output by the target electronic device or contents obtained by processing (for example, changing a format, resolution, or a size of data) at least a part of the contents output by the target electronic device as the contents corresponding to the information.
  • the obtained information may be information about a second part, which is a next part of a first part of a movie currently output through the target electronic device.
  • the content providing module 650 may be configured to determine the second part of the movie as contents corresponding to the information.
  • a data format of the video that is the obtained information may be a first data format.
  • the content providing module 650 may be configured to determine the video, of which the data format is changed from the first data format to the second data format, as contents corresponding to the information.
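The format-adaptation step above (changing the data from a first format to a second format the providing device can render) might be sketched as a simple decision function. The format names and the fallback-to-first-supported-format policy are assumptions for illustration.

```python
# Hedged sketch: decide whether obtained content must be converted before
# it can be provided through the electronic device's display.
def adapt_content(content_format, supported_formats):
    """Return (needs_conversion, format_to_use)."""
    if content_format in supported_formats:
        return False, content_format
    # Fall back to the first format the providing device supports.
    return True, supported_formats[0]
```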
  • the content providing module 650 may be configured to provide the contents to the region, on which the target object is displayed, the region close to (or connected with) the region, on which the target object is displayed, or a direction corresponding to the region, on which the target object is displayed.
  • for example, when the target object is an image of the television that is the target electronic device, the content providing module 650 may be configured to display contents on a partial region (for example, an image part corresponding to a display of the television) included in the television image.
  • if the target object is an icon (for example, the washing machine icon 547 ) for the target electronic device, the content providing module 650 may be configured to display the contents (for example, the contents 597 ) on an upper, lower, left, or right region of the icon.
  • the content providing module 650 may be configured to display the contents together with visual information (for example, a connection line) connecting the target object and the contents.
  • the content providing module 650 may display the contents on a region different from that of the target object, and then move the contents to the region, on which the target object is displayed.
  • the electronic device may include a first speaker located in a first direction (for example, a right side) of the display and a second speaker located in a second direction (for example, a left side) of the electronic device.
  • a first target object may be displayed on a region corresponding to the first direction of the display.
  • the content providing module 650 may be configured to output a louder sound through the first speaker than the second speaker so that the user perceives the sound that is the content as being output from the first direction corresponding to the first target object. Alternatively, the content providing module 650 may be configured to output the sound only through the first speaker.
  • a second target object may be displayed on a region corresponding to the second direction of the display.
  • the content providing module 650 may output a louder sound through the second speaker than the first speaker so that the user perceives the sound that is the content as being output from the second direction corresponding to the second target object. Alternatively, the content providing module 650 may output the sound only through the second speaker.
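The directional-sound behavior above can be sketched as a panning computation that weights the two speakers by the target object's horizontal position on the display. Linear panning over a normalized x position is a simplifying assumption.

```python
# Hedged sketch: pan the content's audio toward the side of the display
# where the target object appears.
def speaker_gains(object_x, display_width):
    """Return (left_gain, right_gain) for an object at horizontal pixel x."""
    right = object_x / display_width   # 0.0 = far left, 1.0 = far right
    left = 1.0 - right
    return left, right
```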
  • the content providing module 650 may be configured to provide another information obtained through an electronic device different from the target electronic device in relation to the region, on which the target object corresponding to the target electronic device is displayed. For example, information (for example, writing contents for a lecture when the content is a video lecture) related to contents provided to the region, on which the target object corresponding to the target electronic device is displayed, may be obtained through an electronic device different from the target electronic device.
  • the content providing module 650 may be configured to provide the information, in relation to the contents, to the region on which the target object is displayed (or a region close to that region). A method of providing information obtained through another electronic device will be described with reference to FIG. 10.
  • the content providing module 650 may be configured to control (for example, temporarily stop or turn off) the provision of the contents based on external environment information (for example, external object recognition) for the electronic device.
  • the content providing module 650 may be configured to check for, for example, a person located around the electronic device (for example, within about 1 m) as external environment information through the image sensor functionally connected with the electronic device while providing the contents.
  • the content providing module 650 may be configured to stop providing the contents so as to enable the user of the electronic device to recognize that a person is located in a neighboring region.
  • the content providing module 650 may be configured to display information indicating that a person is located in a neighboring region.
  • the content providing module 650 may display an image (for example, a picture of the person) or text corresponding to the person.
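The pause-on-proximity behavior described above might look like the following sketch; the action strings, the 1 m threshold default, and the input format (a list of detected person distances) are all hypothetical.

```python
def handle_environment(person_distances_m, threshold_m=1.0):
    """Decide how to react to external environment information while
    providing contents. Returns a list of actions; names are hypothetical.

    person_distances_m: distances (in meters) of persons detected by an
    image sensor functionally connected with the electronic device.
    """
    actions = []
    nearby = [d for d in person_distances_m if d <= threshold_m]
    if nearby:
        actions.append("pause_contents")      # temporarily stop provision
        actions.append("show_person_notice")  # image/text indicating a person
    else:
        actions.append("continue_contents")
    return actions

# A person detected within about 1 m pauses the contents and shows a notice.
result = handle_environment([0.5, 3.0])
```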
  • the device control module 660 may be configured to control, for example, a function of the target electronic device corresponding to the target object. According to an example, the device control module 660 may be configured to control an output function of the target electronic device. For example, contents corresponding to information output (for example, currently being output) through the target electronic device may be output through the electronic device by the content providing module 650. Accordingly, the information may be provided through the target electronic device while the contents corresponding to the information are simultaneously provided by the electronic device. It may be difficult for the user to simultaneously check the information output through the target electronic device and the contents provided by the electronic device.
  • the device control module 660 may be configured to control the target electronic device so that the information is not output through the target electronic device.
  • the device control module 660 may be configured to control the target electronic device so that the information is output through the target electronic device. Further, the device control module 660 may be configured to stop the output of the contents for the information by the electronic device.
  • the device control module 660 may be configured to determine whether the electronic device is used based on whether the electronic device is worn by the user. For example, the contents may be provided by a television, which is an external electronic device for the electronic device. If the electronic device is an HMD, the device control module 660 may check whether the HMD is worn by the user.
  • if the HMD is worn by the user, the device control module 660 may be configured to control the television so that the contents are not provided through the television (for example, turn off only a screen of the television), or to turn off the television. Then, if the HMD is detached from the user, the device control module 660 may control the television so that the contents are output through the television again, or turn the television back on. The method of controlling the target electronic device will be described with reference to FIG. 8 below.
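The worn/detached control flow above can be illustrated with a small sketch; the `DeviceControl` class, the command strings, and the television interface are invented for illustration and are not part of this document.

```python
class DeviceControl:
    """Minimal sketch of the worn/detached logic described above."""

    def __init__(self, tv):
        self.tv = tv

    def on_wear_state(self, hmd_worn):
        if hmd_worn:
            # Contents move to the HMD; blank (or power off) the television.
            self.tv.send("screen_off")
        else:
            # HMD detached: resume output through the television.
            self.tv.send("screen_on")

class FakeTV:
    """Stand-in for a controllable television; records received commands."""
    def __init__(self):
        self.commands = []

    def send(self, cmd):
        self.commands.append(cmd)

tv = FakeTV()
ctrl = DeviceControl(tv)
ctrl.on_wear_state(True)   # user puts the HMD on
ctrl.on_wear_state(False)  # user takes the HMD off
```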
  • the device control module 660 may also be configured to control (for example, turn off) an operation of another external electronic device for the electronic device, as well as the target electronic device.
  • the device control module 660 may be configured to turn off a lighting located around the electronic device and at least some functions of the television (for example, turn off only a screen output function of the television) when the electronic device is used by the user.
  • the device control module 660 may be configured to turn on the lighting and the television again when the electronic device is not used by the user.
  • the device control module 660 may be configured to provide a control menu for controlling the contents output through the target electronic device or the electronic device in relation to the target object.
  • the device control module 660 may be configured to display control menu items corresponding to a function of “turning off” and “turning on” the lighting on a region, on which the lighting object is displayed.
  • if the "turning off" menu item is selected, the device control module 660 may be configured to control the lighting so that the lighting is turned off. The method of providing the control menu item will be described with reference to FIG. 8 below.
  • the device control module 660 may be configured to transmit at least some of the contents output through the target object, or information related to the contents (for example, some of the contents or link information about the contents), to an electronic device corresponding to another target object. Accordingly, at least some of the information may be provided through the electronic device corresponding to the other target object.
  • the device control module 660 may be configured to select at least a part of an advertisement image that is the content provided through a region, on which a first target object is displayed.
  • the device control module 660 may be configured to transmit link information related to an advertisement product corresponding to the advertisement image to a target electronic device corresponding to a second target object based on a user's input (for example, a drag from the first target object to the second target object or a movement of the gaze of the user from the first target object to the second target object) for a region, on which the second target object is displayed.
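The drag/gaze-based transfer above could be modeled as a simple routing step; the dictionary shapes and names (`gesture`, `objects`, `link_info`) are illustrative assumptions, not an API defined in this document.

```python
def route_content_transfer(gesture, objects, link_info):
    """Given a drag (or a movement of the gaze) from one target object to
    another, return the device that should receive the content's link
    information. The structure is a sketch; field names are assumptions."""
    src = gesture["from"]
    dst = gesture["to"]
    if src in objects and dst in objects and src != dst:
        target_device = objects[dst]["device"]
        return {"device": target_device, "payload": link_info}
    return None  # no valid transfer between distinct target objects

objects = {
    "ad_object": {"device": "television"},  # first target object
    "shop_object": {"device": "tablet"},    # second target object
}
# Dragging from the advertisement object to the second object sends the
# advertised product's link information to the second object's device.
result = route_content_transfer(
    {"from": "ad_object", "to": "shop_object"},
    objects,
    link_info="https://example.com/product",
)
```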
  • FIG. 7 illustrates an example of a user interface (for example, the virtual environment 500 ) provided through an electronic device (for example, the electronic device 101 ).
  • a background of a living room in which a user 710 is located, may be displayed through a display 703 of an electronic device 701 as a background of a virtual environment 705 (for example, the virtual environment 500 ).
  • contents 709 which may be output through a television disposed in the living room, may be displayed through a television object 707 corresponding to the television.
  • the user 710 may watch the contents 709 through the television object 707 within the virtual environment 705 identically or similarly to the contents 709 provided by the actual television in the actual living room.
  • a selection menu 720 through which the background of the virtual environment 705 may be changed, may be provided to the user 710 through the virtual environment 705 .
  • the selection menu 720 may be displayed with a second depth effect.
  • the selection menu 720 having the second depth effect may be recognized to be closer to the user 710 than the television object 707 having the first depth effect.
  • the selection menu 720 may be more transparently displayed than other objects within the virtual environment 705 .
  • the selection menu 720 may include, for example, a first menu item 721 having a background of a lecture room, a second menu item 723 having a background of a conference room, and a third menu item 725 having a background of a café, or other menu items (not shown).
  • the first menu item 721 to the third menu item 725 may be moved in a specific direction 729 (for example, a right direction) based on the gaze or a gesture of the user 710. For example, in FIG. 7, the first menu item 721, the second menu item 723, and the third menu item 725 are sequentially disposed from a left side in the right direction (for example, the specific direction 729), but the third menu item 725, the first menu item 721, and the second menu item 723 may be sequentially disposed from the left side to the right side according to an input of the user 710.
  • the second menu item 723 may be selected by the gaze or the gesture of the user 710 .
  • the user 710 may select the second menu item 723 by viewing the second menu item 723 for a predetermined time, or by touching the second menu item 723 .
  • the electronic device 701 may provide the user 710 with visual information notifying that the second menu item 723 is selected.
  • the electronic device 701 may display the second menu item 723 larger than the first menu item 721 or the third menu item 725, display the second menu item 723 more boldly than the first menu item 721 or the third menu item 725, display the second menu item 723 with a flickering effect, or display the second menu item 723 with a shadow.
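The gaze-based selection described above (viewing a menu item for a predetermined time) is essentially a dwell timer, which might be sketched as follows; the sample format and the 1.5 s default are assumptions made for illustration.

```python
def dwell_select(gaze_samples, item_id, dwell_time_s=1.5):
    """Select a menu item if the user's gaze stays on it for a
    predetermined time. gaze_samples: list of (timestamp_s, looked_at_id).
    A sketch; the sampling format and threshold are assumptions."""
    start = None
    for t, target in gaze_samples:
        if target == item_id:
            if start is None:
                start = t
            if t - start >= dwell_time_s:
                return True   # e.g. enlarge/flicker the item and select it
        else:
            start = None      # gaze left the item: restart the timer
    return False

# Gaze stays on the second menu item for 1.6 s, so it is selected.
samples = [(0.0, "menu2"), (0.8, "menu2"), (1.6, "menu2")]
selected = dwell_select(samples, "menu2")
```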
  • FIG. 8 illustrates an example of a user interface (for example, the virtual environment 500 ) provided through an electronic device (for example, the electronic device 101 ). Descriptions of parts identical or similar to those of FIG. 7 will be omitted, and reference numerals of parts identical or similar to those of FIG. 7 will be denoted with the identical or similar reference numerals.
  • contents 809 (for example, the contents 709 ) providable through an actual television 851 disposed in a laboratory room 800 , in which a user 810 is located, may be provided through a television object 807 (for example, the television object 707 ) in a virtual environment 805 (for example, the virtual environment 705 ).
  • the electronic device 801 may display the contents 809 within at least a partial region of the virtual environment 805 displayed through the electronic device 801 .
  • the television 851 may continuously output the contents 809 , but may also stop providing the contents 809 and provide a default image 849 .
  • the default image 849 may include, for example, an image (for example, black data) requiring a smaller amount of data than the contents 809, or consuming a smaller amount of current than would be consumed if the contents 809 were output.
  • a screen of the television 851 may be turned off while some other functions of the television 851 (for example, an operation of waiting for reception of a signal provided by the electronic device 101) are performed.
  • the television 851 may be turned off.
  • a control menu 830 through which the television 851 may be controlled, may be provided to the user 810 through the virtual environment 805 .
  • the control menu 830 may include a channel menu 837 , through which a channel of the television 851 may be changed.
  • the control menu 830 may include various menu items such as volume, screen resolution, pause, play, fast forward, rewind, or fast play. For example, if the television object 807 displayed within the virtual environment 805 is displayed with a first depth effect (or three-dimensional effect), the control menu 830 may be displayed with a second depth effect or more transparently than other objects.
  • the user 810 may select a channel of the television 851 through a hardware key 803 formed on the electronic device 801. Alternatively, the user 810 may select a channel of the television 851 by, for example, viewing for a predetermined time, or touching with a finger, at least one of first to third channel information 831, 833, and 835 displayed within the virtual environment 805. According to an example, the first channel information 831 to the third channel information 835 may be moved in a specific direction 829 (for example, a left direction) based on the gaze or a gesture of the user 810.
  • the sequentially disposed first to third channel information 831 , 833 , and 835 may be displayed with different sizes based on the selected second channel information 833 .
  • the selected second channel information 833 may be displayed to be largest, and the non-selected first channel information 831 and third channel information 835 may be displayed to be relatively smaller than the second channel information 833 .
  • other non-selected channel information may be displayed gradually smaller the farther it is from the selected second channel information 833.
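The size gradient described above (items shrinking with distance from the selected channel) can be sketched as a geometric falloff; the base size and shrink factor are arbitrary illustrative values, not taken from this document.

```python
def channel_item_sizes(num_items, selected_index, base_size=100, shrink=0.8):
    """Sizes for sequentially disposed channel items: the selected item is
    largest and the others shrink with distance from it. The shrink factor
    is an assumption made for illustration."""
    return [
        round(base_size * shrink ** abs(i - selected_index))
        for i in range(num_items)
    ]

# Three channel items with the second one selected: it is displayed largest.
sizes = channel_item_sizes(3, selected_index=1)
```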
  • FIG. 9 illustrates an example of a user interface (for example, the virtual environment 500 ) provided through an electronic device (for example, the electronic device 101 ). Descriptions of parts identical or similar to those of FIG. 7 or 8 will be omitted, and reference numerals of parts identical or similar to those of FIG. 7 or 8 will be denoted with the identical or similar reference numerals.
  • contents 909 (for example, the contents 709) providable through an actual monitor 951 may be provided through a monitor object 907 corresponding to the monitor 951 in a virtual environment 905 (for example, the virtual environment 705) corresponding to an office 900.
  • an icon 951 through which information related to the contents 909 is transmittable to an actual notebook computer 960 , may be displayed in the virtual environment 905 .
  • the icon 951 may be displayed on, for example, at least a partial region of the monitor object 907 , on which the contents 909 are displayed.
  • the icon 951 may be displayed in a form of an image or text representing a transportation means (for example, an automobile, a cart, a bicycle, a wagon, or a wheel).
  • the icon 951 may be activated.
  • the icon 951 may flicker, be displayed bolder, be displayed larger, or move slightly.
  • for example, if the icon 951 has the form of a cart, the cart may roll its wheels in place or move forward in a specific direction 929 (for example, a forward direction).
  • the information related to the contents 909 may be transmitted to the notebook computer 960 .
  • the information related to the contents 909 may include, for example, at least a part of the contents 909 currently displayed through the monitor object 907, link information about the contents 909, or detailed information about an image, terms, or text included in the contents 909.
  • FIG. 10 illustrates an example of a user interface (for example, the virtual environment 500 ) provided through an electronic device (for example, the electronic device 101 ). Descriptions of parts identical or similar to those of FIGS. 7 to 9 will be omitted, and reference numerals of parts identical or similar to those of FIGS. 7 to 9 will be denoted with the identical or similar reference numerals.
  • a user 1010 may use a first electronic device 1001 (for example, the electronic device 101), a second electronic device 1031, and a third electronic device 1051 within a network space 1000.
  • the network space 1000 represents, for example, an environment, in which a plurality of electronic devices are linked with each other by using wired or wireless communication to provide information to the user 1010 or obtain information from the user 1010 .
  • first contents 1009 (for example, education contents) outputtable through the third electronic device 1051 (for example, the monitor 951 ) may be provided through at least a partial region of the object 1007 (for example, the monitor object 907 ) disposed within the virtual environment 1005 displayed through a display of the first electronic device 1001 .
  • the user 1010 may execute a memo application as second contents 1033 through the second electronic device 1031 .
  • the second contents 1033 may be provided through the object 1007 together with the first contents 1009 .
  • the first contents 1009 may be provided through a first region 1007 - 1 of the object 1007 (for example, a left portion of the object 1007 ), and the second contents 1033 may be provided through a second region 1007 - 3 of the object 1007 (for example, a right portion of the object 1007 ).
  • if the user 1010 inputs specific contents through the second electronic device 1031, the specific contents may be displayed through the object 1007.
  • for example, if the second electronic device 1031 receives a phrase "Hello!" from the user 1010, the first electronic device 1001 may display the phrase "Hello!" 1035 on the second region 1007-3 of the object 1007, on which the second contents 1033 are displayed.
  • the first electronic device 1001 may simultaneously display the phrase 1035 on the object 1007 .
  • the phrase “Hello!” 1035 may be simultaneously output through a display of the electronic device 1001 (for example, by using one frame provided for one vertical clock).
  • according to an example, in response to the input of the phrase by the user 1010, the first electronic device 1001 may instantly display the phrase 1035 through the display. For example, if the user 1010 inputs "H" into the second electronic device 1031, the first electronic device 1001 may output "H" through the display in response to the input of "H". Next, if the user 1010 sequentially inputs "e", "l", "l", "o", and "!", the first electronic device 1001 may sequentially display the respective characters through the display in response to the respective inputs.
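The per-character mirroring above (versus displaying the whole phrase at once) might be sketched as follows; the class and method names are hypothetical and not part of this document.

```python
class PhraseMirror:
    """Mirror characters typed on the second electronic device onto the
    object displayed by the first device, either per character (instant)
    or per completed phrase."""

    def __init__(self, instant=True):
        self.instant = instant
        self.buffer = ""     # characters received so far
        self.displayed = ""  # what the first device currently shows

    def on_input(self, ch):
        self.buffer += ch
        if self.instant:
            self.displayed = self.buffer  # show each character as typed

    def on_commit(self):
        self.displayed = self.buffer      # show the whole phrase at once

# Instant mode: each character appears on the object as it is typed.
mirror = PhraseMirror(instant=True)
for ch in "Hello!":
    mirror.on_input(ch)
```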
  • an actual use environment of an external electronic device 1031 may correspond to the virtual environment 1005 displayed on the monitor object 1007 .
  • for example, if the phrase 1035 is displayed on a center region of the memo application on the second electronic device 1031, the phrase 1035 may be displayed on a center region of the second region 1007-3 of the monitor object 1007.
  • similarly, if the phrase 1035 is displayed with a first size on the second electronic device 1031, the phrase 1035 may be displayed on the monitor object 1007 with a second size (for example, about twice the first size) corresponding to a ratio of the first size; that is, the phrase 1035 may be displayed larger on the second region 1007-3 of the monitor object 1007.
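The proportional placement and scaling above (a centered phrase stays centered, and its size follows the region ratio) can be sketched as a coordinate mapping; the resolutions and the choice to scale the text size by the horizontal ratio are assumptions.

```python
def map_to_object_region(pos, size, device_res, region_res):
    """Map a phrase's position and text size from the external device's
    screen to the corresponding region of the monitor object, preserving
    the ratio between the two. All parameters are illustrative."""
    sx = region_res[0] / device_res[0]
    sy = region_res[1] / device_res[1]
    mapped_pos = (pos[0] * sx, pos[1] * sy)
    mapped_size = size * sx  # scale text with the horizontal ratio
    return mapped_pos, mapped_size

# Device screen 540x960; object region 1080x1920 (twice as large):
# a centered phrase of size 20 maps to the region's center at size 40.
pos, size = map_to_object_region((270, 480), 20, (540, 960), (1080, 1920))
```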
  • the first electronic device 1001 may display at least some contents 1009-1 of the first contents 1009, output through the first region 1007-1 of the object 1007, on the second contents 1033 output through the second region 1007-3 of the object 1007, automatically or based on a user's input.
  • the first electronic device 1001 may automatically back up at least some contents 1009-1 in the second contents 1033 in various cases, for example, a case where at least some contents 1009-1 include a specific term designated by the user 1010, a case where the importance of at least some contents 1009-1 is relatively higher than that of other contents, or a case where at least some contents 1009-1 are not properly displayed through the object 1007.
  • At least some contents 1009 - 1 may also be displayed on a display of the second electronic device 1031 .
  • the first electronic device 1001 may transmit at least some contents 1009-1 to the second electronic device 1031, so that at least some contents 1009-1 may be displayed or stored through the second electronic device 1031.
  • the third electronic device 1051 may have additional information 1053 for at least some contents 1009 - 1 .
  • the third electronic device 1051 may obtain information indicating that at least some contents 1009 - 1 are displayed or stored through the second electronic device 1031 as the additional information 1053 for at least some contents 1009 - 1 from the first electronic device 1001 or the second electronic device 1031 .
  • the third electronic device 1051 may display the additional information 1053 through a display of the third electronic device 1051 .
  • information provided through at least one of the first electronic device 1001 to the third electronic device 1051 may be shared with another electronic device, and information obtained through at least one of the first electronic device 1001 to the third electronic device 1051 may also be displayed or stored through another electronic device.
  • FIG. 11 illustrates an example of a user interface (for example, the virtual environment 500 ) provided through an electronic device (for example, the electronic device 101 ).
  • a user 1110 may scan the living room 1130 in a predetermined direction 1135 (for example, from the left to the right) while wearing an electronic device 1101 (for example, the electronic device 101 ).
  • an electronic flowerpot 1141 , a desktop computer 1143 , a monitor 1145 , and a keyboard 1147 disposed in the living room 1130 may be discovered by the electronic device 1101 , and displayed as first to fourth objects 1141 - 1 , 1143 - 1 , 1145 - 1 , and 1147 - 1 at locations corresponding to actual spaces thereof, respectively, within the virtual environment 1105 .
  • the electronic device 1101 may display user interfaces 1150 and 1170 differently according to a processing degree in generating the virtual environment 1105.
  • the first user interface 1150 may, for example, vary a transparency of the virtual environment 1105 according to the processing degree. For example, if the processing degree is about 10%, a transparency 1151 of the virtual environment 1105 may be a first transparency, at which the virtual environment is most unclear. If the processing degree is about 50%, a transparency 1153 of the virtual environment 1105 may be a second transparency higher than the first transparency. If the processing degree is about 80%, a transparency 1155 of the virtual environment 1105 may be a third transparency, the highest transparency. According to an example, the transparency of the virtual environment 1105 provided by the first user interface 1150 may be gradually changed from about 0% to about 100% according to the processing degree.
  • the second user interface 1170 may, for example, vary a region of the virtual environment 1105 output through a display of the electronic device 1101 based on the processing degree. For example, if the processing degree is about 10%, a region 1171 of the virtual environment 1105 shown to the user 1110 may be a first region, the narrowest region. If the processing degree is about 50%, a region 1173 of the virtual environment 1105 may be a second region, a middle region wider than the first region. If the processing degree is about 80%, a region 1175 of the virtual environment 1105 may be a third region, the widest region. According to an example, the region of the virtual environment 1105 provided by the second user interface 1170 may be gradually changed from about 0% to about 100% based on the processing degree.
  • in the examples above, the processing degree of the virtual environment 1105 is represented by transparency or a shown region, but according to various examples, the processing degree of the virtual environment 1105 may be variously represented by chroma, luminance, brightness, color, light and shade, shaking, movement, or the like.
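The mapping from processing degree to presentation described above might be sketched as follows; the linear mapping and the returned dictionary keys are assumptions, since the document only states that transparency or the visible region varies with the processing degree.

```python
def render_progress(processing_pct, mode="transparency"):
    """Map the generation progress of the virtual environment to how it is
    presented: the first user interface varies transparency, the second
    varies the visible region. Both mappings here are illustrative."""
    # Clamp the processing degree to [0, 100] and normalize to [0, 1].
    p = max(0.0, min(100.0, processing_pct)) / 100.0
    if mode == "transparency":
        return {"transparency": p}        # e.g. 10% done -> low transparency
    elif mode == "region":
        return {"visible_fraction": p}    # e.g. 10% done -> narrow region
    raise ValueError("unknown mode")

# At 50% processing, the first user interface shows a middle transparency.
state = render_progress(50)
```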
  • an electronic device for example, the electronic device 101 for sharing information through a virtual environment (for example, the virtual environment 500 ) may include a display (for example, the display 170 ) and an information providing module, e.g., in the form of processing circuitry, (for example, the information providing module 110 ) functionally connected with the display, and the information providing module may display an object corresponding to an external electronic device for the electronic device through the display, obtain information to be output through the external electronic device, and provide contents corresponding to the information in relation to a region, on which the object is displayed.
  • the information providing module may check a plurality of external electronic devices for the electronic device, and select the external electronic device among the plurality of external electronic devices based on a user's input or functions of the plurality of external electronic devices.
  • the information providing module may check a plurality of external electronic devices for the electronic device, and determine a currently operated electronic device among the plurality of external electronic devices as the external electronic device.
  • At least one object may include at least one of an image for an appearance of the external electronic device, an icon corresponding to the external electronic device, and text corresponding to the external electronic device.
  • the information providing module may obtain a relative location of the external electronic device with respect to the electronic device, and if the relative location is a first location, the information providing module may display the object on a first region of the display, and if the relative location is a second location, the information providing module may display the object on a second region of the display.
  • the information providing module may check a time, for which the gaze of a user for the electronic device stays on the object, and if the time is a first time, the information providing module may display the object in a first format, and if the time is a second time, the information providing module may display the object in a second format.
  • the information providing module may provide the contents to at least a partial region of the region.
  • if the contents include a sound, the information providing module may check a location of the region in the display, provide the sound in a first direction with respect to the electronic device if the location is a first location, and provide the sound in a second direction with respect to the electronic device if the location is a second location.
  • the information providing module may obtain other information through another external electronic device for the electronic device, and further provide other contents corresponding to the other information.
  • the information providing module may provide the contents if the electronic device is worn by the user.
  • the contents may be provided to the user through at least one of the electronic device and the external electronic device.
  • the information providing module may control the external electronic device so that the contents are no longer provided through the external electronic device if the electronic device is worn by the user.
  • the information providing module may control the external electronic device so that the contents are provided to the user through the external electronic device based on the fact that the electronic device is detached from the user of the electronic device.
  • the information providing module may stop providing the contents based on external environment information about the electronic device.
  • an electronic device for sharing information through a virtual environment may include a display and an information providing module functionally connected with the electronic device, and the information providing module may display a plurality of objects corresponding to a plurality of external electronic devices, respectively, for the electronic device through the display, select at least one object among the plurality of objects, obtain information to be output by at least one external electronic device corresponding to at least one object among the plurality of external electronic devices, and provide the contents corresponding to the information in relation to a region, on which at least one object is displayed.
  • the information providing module may display a virtual environment through the display, and display the plurality of objects within the virtual environment.
  • the information providing module may change the virtual environment to another virtual environment based on at least one input for the electronic device.
  • At least one object may be displayed on a designated region of the virtual environment.
  • the contents may include first sub contents and second sub contents, and at least one object may include a first object for outputting the first sub contents and a second object for outputting the second sub contents.
  • the plurality of objects may include a first object and a second object, and the information providing module may output the first object based on a first frequency and the second object based on a second frequency.
  • the plurality of objects may include a first object and a second object, and the information providing module may obtain biometric information about a user of the electronic device, determine the first object as the at least one object if the biometric information corresponds to a first frequency, and determine the second object as the at least one object if the biometric information corresponds to a second frequency.
  • the information providing module may determine at least one object based on at least one of the gaze and a gesture of the user for the electronic device.
  • the information providing module may provide, as the information, a second part of the contents that is the next part following a first part of the contents previously output through at least one external electronic device.
  • the information providing module may display a control menu for controlling the information in relation to at least one content through the region.
  • the information providing module may obtain an input for another object among the plurality of objects and, based on the input, transmit at least a part of the at least one content, or at least a part of additional information about the at least one content, to another external electronic device corresponding to the other object, so that the transmitted part is provided through the other external electronic device.
  • FIG. 12 is a flowchart 1200 illustrating an example method of providing information by an electronic device (for example, the electronic device 101 ).
  • an electronic device (for example, the object display module 630) may display an object corresponding to an external electronic device for the electronic device.
  • the electronic device may provide a virtual environment (for example, the virtual environment 500 ).
  • the electronic device may display the object within the virtual environment.
  • the electronic device may obtain information (for example, the video 465 ) to be output through the external electronic device. According to an example, the electronic device may obtain the information if the external electronic device is selected as a target electronic device based on a user's input.
  • the electronic device (for example, the content providing module 650 ) may provide contents (for example, the video 465 ) corresponding to the information in relation to a region, on which the object is displayed.
  • a method of sharing information through a virtual environment by an electronic device may include an operation of displaying an object corresponding to an external electronic device for the electronic device through a display (for example, the display 170 ) functionally connected with the electronic device, an operation of obtaining information to be output through the external electronic device, and an operation of providing contents corresponding to the information in relation to a region, on which the object is displayed.
  • the operation of displaying the object may include an operation of checking a plurality of external electronic devices for the electronic device, and an operation of selecting the external electronic device among the plurality of external electronic devices based on a user's input or functions of the plurality of external electronic devices.
  • the operation of displaying the object may include an operation of checking a plurality of external electronic devices for the electronic device, and an operation of determining a currently operated electronic device among the plurality of external electronic devices as the external electronic device.
  • At least one object may include at least one of an image for an appearance of the external electronic device, an icon corresponding to the external electronic device, and text corresponding to the external electronic device.
  • the operation of displaying the object may include an operation of obtaining a relative location of the external electronic device with respect to the electronic device, an operation of displaying the object on a first region of the display if the relative location is a first location, and an operation of displaying the object on a second region of the display if the relative location is a second location.
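The location-dependent placement described in the bullet above can be sketched as follows. This is an illustrative reading only; the function name, the azimuth convention, and the 30-degree cutoff are assumptions, not values from the application:

```python
def region_for_relative_location(azimuth_deg: float) -> str:
    """Map the external device's relative azimuth (degrees; 0 means
    straight ahead of the wearer, negative means to the wearer's left)
    to a coarse display region, mirroring the first-location/second-
    location behavior described above."""
    if abs(azimuth_deg) <= 30:
        return "center"
    return "left" if azimuth_deg < 0 else "right"
```

A device detected slightly to the left of the wearer would then have its object drawn in the left region of the virtual environment.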
  • the operation of displaying the object may include an operation of checking a time, for which the gaze of a user for the electronic device stays on the object, an operation of displaying the object in a first format if the time is a first time, and an operation of displaying the object in a second format if the time is a second time.
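The dwell-time behavior above (first format for a short gaze, second format for a longer one) can be illustrated with a minimal sketch; the style names and the 1.5-second threshold are hypothetical:

```python
def object_format(dwell_seconds: float, threshold_seconds: float = 1.5) -> str:
    """Return a rendering style for the object based on how long the
    user's gaze has stayed on it: a plain 'first format' for a brief
    glance, an emphasized 'second format' once the dwell time passes
    the threshold."""
    return "emphasized" if dwell_seconds >= threshold_seconds else "plain"
```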
  • the operation of providing the contents may include an operation of providing the contents to at least a partial region of the region.
  • when the contents include a sound, the providing of the contents may include an operation of checking a location of the region in the display, an operation of providing the sound in a first direction with respect to the electronic device if the location is a first location, and an operation of providing the sound in a second direction with respect to the electronic device if the location is a second location.
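One way to realize the direction-by-region behavior above is a constant-power stereo pan driven by the region's horizontal position. This is a sketch under assumed names and conventions, not the application's method:

```python
import math

def stereo_gains(region_x: float) -> tuple:
    """Constant-power pan for a sound tied to the object's region.
    region_x is the region's horizontal position on the display in
    [0.0, 1.0] (0 = left edge, 1 = right edge). Returns
    (left_gain, right_gain) so the sound appears to come from the
    direction of the displayed object."""
    theta = max(0.0, min(1.0, region_x)) * math.pi / 2
    return (math.cos(theta), math.sin(theta))
```

An object at the left edge yields full left gain; a centered object yields equal gains in both channels.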
  • the method of providing information through the virtual environment may further include an operation of obtaining other information through another external electronic device for the electronic device, and an operation of providing other contents corresponding to the other information.
  • the providing of the contents may include an operation of providing the contents if the electronic device is worn by the user.
  • the contents may be provided to the user through at least one of the electronic device and the external electronic device.
  • the method of providing information through the virtual environment may further include an operation of controlling the external electronic device so that the contents are no longer provided through the external electronic device if the electronic device is attached to the user.
  • the operation of controlling the external electronic device may include an operation of controlling the external electronic device so that the contents are provided to the user through the external electronic device if the electronic device is detached from the user.
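The attach/detach handover in the two bullets above amounts to routing content by wear state: putting the device on moves playback to it, taking it off hands playback back to the external device. A minimal sketch (class and method names are illustrative):

```python
class ContentRouter:
    """Track whether the head-worn device is attached and pick the
    content sink accordingly."""

    def __init__(self) -> None:
        self.worn = False

    def attach(self) -> str:
        """User puts the device on: content stops on the external
        device and is provided through the worn device."""
        self.worn = True
        return self.sink()

    def detach(self) -> str:
        """User takes the device off: the external device resumes
        providing the contents."""
        self.worn = False
        return self.sink()

    def sink(self) -> str:
        return "hmd" if self.worn else "external_device"
```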
  • the method of providing information through the virtual environment may further include an operation of stopping the provision of the contents based on external environment information about the electronic device.
  • a method of providing information through a virtual environment may include an operation of displaying, through a display functionally connected with the electronic device, a plurality of objects corresponding respectively to a plurality of external electronic devices for the electronic device, an operation of selecting at least one object among the plurality of objects, an operation of obtaining information to be output by at least one external electronic device corresponding to the at least one object among the plurality of external electronic devices, and an operation of providing contents corresponding to the information in relation to a region on which the at least one object is displayed.
  • the operation of displaying the plurality of objects may include an operation of displaying a virtual environment through the display, and an operation of displaying the plurality of objects within the virtual environment.
  • the method of providing information through the virtual environment may further include an operation of changing the virtual environment to another virtual environment based on at least one input for the electronic device.
  • At least one object may be displayed on a designated region of the virtual environment.
  • the contents may include first sub contents and second sub contents
  • at least one object may include a first object for outputting the first sub contents and a second object for outputting the second sub contents.
  • the plurality of objects may include a first object and a second object
  • the displaying of the plurality of objects may include an operation of outputting the first object based on a first frequency and an operation of outputting the second object based on a second frequency
  • the plurality of objects may include a first object and a second object
  • the operation of obtaining the information may include an operation of obtaining biometric information about a user of the electronic device, an operation of determining the first object as the at least one object if the biometric information corresponds to a first frequency, and an operation of determining the second object as the at least one object if the biometric information corresponds to a second frequency.
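The frequency-based selection above can be sketched as matching the dominant frequency of the biometric signal (for example, a visual response evoked by objects flickering at distinct rates) against the frequency assigned to each object. The function name, the dictionary layout, and the 0.5 Hz tolerance are assumptions for illustration:

```python
def select_by_frequency(measured_hz: float, objects: dict, tolerance_hz: float = 0.5):
    """Return the object whose assigned display frequency is closest
    to the frequency measured in the user's biometric signal, or None
    if nothing lies within the tolerance. Keys of `objects` are
    frequencies in Hz; values identify the objects."""
    freq = min(objects, key=lambda f: abs(f - measured_hz))
    return objects[freq] if abs(freq - measured_hz) <= tolerance_hz else None
```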
  • the operation of obtaining the information may include an operation of determining at least one object based on at least one of the gaze and a gesture of the user for the electronic device.
  • the operation of providing the contents may include an operation of providing, as the information, a second part of the contents that follows a first part previously output through the at least one external electronic device.
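The continue-from-the-next-part behavior above can be sketched as tracking how far the external device has played so the receiving device resumes there instead of restarting. All names here are illustrative:

```python
class SharedPlayback:
    """Track playback progress on one device so that another device
    can continue with the part that follows what was already output."""

    def __init__(self, duration_s: float) -> None:
        self.duration_s = duration_s
        self.position_s = 0.0

    def play_until(self, t_s: float) -> None:
        # Progress only moves forward and never past the end.
        self.position_s = min(max(t_s, self.position_s), self.duration_s)

    def resume_point(self) -> float:
        # The "second part" begins exactly where the already-output
        # "first part" ended.
        return self.position_s
```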
  • the operation of providing the contents may include an operation of displaying, through the region, a control menu for controlling the information in relation to the at least one content.
  • the method of providing information through the virtual environment may further include an operation of obtaining an input for another object among the plurality of objects, and an operation of transmitting, based on the input, at least a part of the at least one content, or at least a part of additional information about the at least one content, to another external electronic device corresponding to the other object, so that it is provided through that other external electronic device.
  • the commands may be set to make one or more processors perform one or more operations when executed by the one or more processors, and the one or more operations may include an operation of displaying, by an electronic device (for example, the electronic device 101), at least one object corresponding to at least one external electronic device through a display (for example, the display 170) functionally connected with the electronic device, an operation of obtaining information to be output through the at least one external electronic device, and an operation of providing contents corresponding to the information through a region on which the at least one object is displayed.
  • an electronic device (for example, the electronic device 101)
  • a display (for example, the display 170)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
US14/934,587 2014-11-07 2015-11-06 Virtual environment for sharing information Abandoned US20160133052A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/536,195 US11120630B2 (en) 2014-11-07 2019-08-08 Virtual environment for sharing information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0154367 2014-11-07
KR1020140154367A KR102265086B1 (ko) 2014-11-07 2014-11-07 Virtual environment for sharing information

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/536,195 Continuation US11120630B2 (en) 2014-11-07 2019-08-08 Virtual environment for sharing information

Publications (1)

Publication Number Publication Date
US20160133052A1 true US20160133052A1 (en) 2016-05-12

Family

ID=54608278

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/934,587 Abandoned US20160133052A1 (en) 2014-11-07 2015-11-06 Virtual environment for sharing information
US16/536,195 Active US11120630B2 (en) 2014-11-07 2019-08-08 Virtual environment for sharing information

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/536,195 Active US11120630B2 (en) 2014-11-07 2019-08-08 Virtual environment for sharing information

Country Status (5)

Country Link
US (2) US20160133052A1 (fr)
EP (1) EP3018561B1 (fr)
KR (1) KR102265086B1 (fr)
CN (1) CN105589732B (fr)
ES (1) ES2750682T3 (fr)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10643391B2 (en) 2016-09-23 2020-05-05 Apple Inc. Immersive virtual display
CN114706483A (zh) * 2016-09-23 2022-07-05 Apple Inc. Immersive virtual display
CN106775934B (zh) * 2016-11-29 2020-12-11 Beijing Yuanxin Science and Technology Co., Ltd. Multi-system-based input/output method and device
KR102679047B1 (ko) * 2017-01-25 2024-07-01 Samsung Electronics Co., Ltd. Electronic device and electronic device control method
CN106997242B (zh) * 2017-03-28 2020-10-30 Lenovo (Beijing) Co., Ltd. Interface management method and head-mounted display device
US10921595B2 (en) 2018-06-29 2021-02-16 International Business Machines Corporation Contextual adjustment to augmented reality glasses
CN109045695B (zh) * 2018-08-08 2020-09-29 Tencent Technology (Shenzhen) Company Limited Accessory selection method, device, and storage medium in a virtual environment
US11468611B1 (en) * 2019-05-16 2022-10-11 Apple Inc. Method and device for supplementing a virtual environment
KR20210063928A (ko) 2019-11-25 2021-06-02 Samsung Electronics Co., Ltd. Electronic device for providing augmented reality service and operation method thereof
US11363081B2 (en) * 2020-04-16 2022-06-14 Kathleen A. SANVIDGE Method and system for conducting remote communications at a funeral home
US20210358294A1 (en) * 2020-05-15 2021-11-18 Microsoft Technology Licensing, Llc Holographic device control
US11927756B2 (en) * 2021-04-01 2024-03-12 Samsung Electronics Co., Ltd. Method for providing augmented reality image and head mounted display device supporting the same
US11908088B2 (en) * 2021-06-09 2024-02-20 Red Hat, Inc. Controlling virtual resources from within an augmented reality environment
KR20230013407A (ko) * 2021-07-19 2023-01-26 Samsung Electronics Co., Ltd. Electronic device for controlling an external electronic device and operation method of the electronic device
EP4295567A4 (fr) 2021-08-02 2024-08-07 Samsung Electronics Co Ltd Method and device for capturing an image by configuring Internet of Things (IoT) lighting devices in an IoT environment
CN118215931A (zh) 2021-11-09 2024-06-18 Samsung Electronics Co., Ltd. Method and device for providing content related to an augmented reality service between an electronic device and a wearable electronic device
US11935201B2 (en) * 2022-04-28 2024-03-19 Dell Products Lp Method and apparatus for using physical devices in extended reality environments
KR20240007562A (ko) * 2022-07-08 2024-01-16 Samsung Electronics Co., Ltd. Electronic device and method for selecting and controlling objects based on classification
WO2024071661A1 (fr) * 2022-09-30 2024-04-04 Samsung Electronics Co., Ltd. Electronic device and operating method thereof

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070132662A1 (en) * 2004-05-27 2007-06-14 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and image sensing apparatus
US20110037712A1 (en) * 2009-08-11 2011-02-17 Lg Electronics Inc. Electronic device and control method thereof
US20110170787A1 (en) * 2010-01-12 2011-07-14 Qualcomm Incorporated Using a display to select a target object for communication
US20130002701A1 (en) * 2010-08-18 2013-01-03 Brother Kogyo Kabushiki Kaisha Systems for displaying images on portable display devices and head-mountable displays, methods for controlling such systems, and computer-readable storage media storing instructions for controlling such systems
US20130050432A1 (en) * 2011-08-30 2013-02-28 Kathryn Stone Perez Enhancing an object of interest in a see-through, mixed reality display device
US20130069985A1 (en) * 2011-09-21 2013-03-21 Google Inc. Wearable Computer with Superimposed Controls and Instructions for External Device
US20130194164A1 (en) * 2012-01-27 2013-08-01 Ben Sugden Executable virtual objects associated with real objects
US20140192085A1 (en) * 2013-01-04 2014-07-10 Lg Electronics Inc. Head mounted display and method for controlling the same
US20150332620A1 (en) * 2012-12-21 2015-11-19 Sony Corporation Display control apparatus and recording medium
US20150346816A1 (en) * 2014-05-30 2015-12-03 Moriahtown Co., Ltd. Display device using wearable eyeglasses and method of operating the same
US20160124579A1 (en) * 2014-10-29 2016-05-05 Sony Corporation Controlling multiple devices with a wearable input device
US20170185276A1 (en) * 2015-12-23 2017-06-29 Samsung Electronics Co., Ltd. Method for electronic device to control object and electronic device
US20170295278A1 (en) * 2016-04-10 2017-10-12 Philip Scott Lyren Display where a voice of a calling party will externally localize as binaural sound for a telephone call

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8605008B1 (en) 2007-05-04 2013-12-10 Apple Inc. Head-mounted display
WO2012002603A1 (fr) 2010-06-28 2012-01-05 LG Electronics Inc. Method and apparatus for producing the functional state of an external device
CN102542921A (zh) * 2010-12-22 2012-07-04 Huang Zhiqi A head-mounted display drive control structure with adaptive parameter control
JP5810540B2 (ja) 2011-02-04 2015-11-11 Seiko Epson Corporation Head-mounted display device and control method for head-mounted display device
KR20160084502A (ko) 2011-03-29 2016-07-13 Qualcomm Incorporated Modular mobile-connected pico projectors for local multi-user collaboration
US20130147686A1 (en) 2011-12-12 2013-06-13 John Clavin Connecting Head Mounted Displays To External Displays And Other Communication Networks
US9024844B2 (en) 2012-01-25 2015-05-05 Microsoft Technology Licensing, Llc Recognition of image on external display
JP6066037B2 (ja) * 2012-03-27 2017-01-25 Seiko Epson Corporation Head-mounted display device
JP6056178B2 (ja) * 2012-04-11 2017-01-11 Sony Corporation Information processing device, display control method, and program
KR101861380B1 (ko) * 2012-07-16 2018-05-28 Microsoft Technology Licensing, LLC Method for outputting content using a head-mounted display, and head-mounted display therefor
KR101971624B1 (ko) * 2012-07-25 2019-04-23 Samsung Electronics Co., Ltd. Method for displaying information on a mobile terminal, method for providing information on a display device, and method for generating a control signal of a mobile terminal
KR101958778B1 (ko) * 2012-08-31 2019-03-15 LG Electronics Inc. Head-mounted display and digital device control method using the same
US9535496B2 (en) 2013-03-15 2017-01-03 Daqri, Llc Visual gestures
EP3001406A4 (fr) * 2013-05-21 2017-01-25 Sony Corporation Display control device, display control method, and recording medium
US9753687B1 (en) * 2014-01-03 2017-09-05 Sony Interactive Entertainment America Llc Wearable computer using programmed local tag
US9836122B2 (en) * 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US10318007B2 (en) 2014-01-27 2019-06-11 Lg Electronics Inc. Head mounted display device for multi-tasking and method for controlling same
US9836987B2 (en) 2014-02-14 2017-12-05 Lincoln Global, Inc. Virtual reality pipe welding simulator and setup
US10203762B2 (en) * 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
KR102265086B1 (ko) 2014-11-07 2021-06-15 Samsung Electronics Co., Ltd. Virtual environment for sharing information
US9898865B2 (en) * 2015-06-22 2018-02-20 Microsoft Technology Licensing, Llc System and method for spawning drawing surfaces
EP3446290B1 (fr) * 2016-04-22 2021-11-03 InterDigital CE Patent Holdings Method and device for composing an image
US10235809B2 (en) * 2016-06-30 2019-03-19 Microsoft Technology Licensing, Llc Reality to virtual reality portal for dual presence of devices
KR20180054228A (ko) * 2016-11-15 2018-05-24 Samsung Electronics Co., Ltd. Method for providing content and electronic device therefor
US11132840B2 (en) * 2017-01-16 2021-09-28 Samsung Electronics Co., Ltd Method and device for obtaining real time status and controlling of transmitting devices
US20190129607A1 (en) * 2017-11-02 2019-05-02 Samsung Electronics Co., Ltd. Method and device for performing remote control

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11508125B1 (en) 2014-05-28 2022-11-22 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
US10602200B2 (en) 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Switching modes of a media content item
US10600245B1 (en) * 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
US12007570B2 (en) * 2014-10-15 2024-06-11 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device
US11914153B2 (en) * 2014-10-15 2024-02-27 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device
US20220269089A1 (en) * 2014-10-15 2022-08-25 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device
US11120630B2 (en) 2014-11-07 2021-09-14 Samsung Electronics Co., Ltd. Virtual environment for sharing information
US10521941B2 (en) * 2015-05-22 2019-12-31 Samsung Electronics Co., Ltd. System and method for displaying virtual image through HMD device
US11386600B2 (en) 2015-05-22 2022-07-12 Samsung Electronics Co., Ltd. System and method for displaying virtual image through HMD device
US20170185276A1 (en) * 2015-12-23 2017-06-29 Samsung Electronics Co., Ltd. Method for electronic device to control object and electronic device
US10163198B2 (en) * 2016-02-26 2018-12-25 Samsung Electronics Co., Ltd. Portable image device for simulating interaction with electronic device
US10838502B2 (en) * 2016-03-29 2020-11-17 Microsoft Technology Licensing, Llc Sharing across environments
US10540003B2 (en) * 2016-05-09 2020-01-21 Lg Electronics Inc. Head mounted display device and method for controlling the same
US10506221B2 (en) 2016-08-03 2019-12-10 Adobe Inc. Field of view rendering control of digital content
US10477138B2 (en) 2016-08-11 2019-11-12 Hooloop Corporation Methods and systems for presenting specific information in a virtual reality environment
US11461820B2 (en) 2016-08-16 2022-10-04 Adobe Inc. Navigation and rewards involving physical goods and services
US10198846B2 (en) 2016-08-22 2019-02-05 Adobe Inc. Digital Image Animation
US20180061128A1 (en) * 2016-08-23 2018-03-01 Adobe Systems Incorporated Digital Content Rendering Coordination in Augmented Reality
US10521967B2 (en) 2016-09-12 2019-12-31 Adobe Inc. Digital content interaction and navigation in virtual and augmented reality
US10430559B2 (en) 2016-10-18 2019-10-01 Adobe Inc. Digital rights management in virtual and augmented reality
US20190293942A1 (en) * 2017-03-30 2019-09-26 Tencent Technology (Shenzhen) Company Limited Virtual reality glasses, lens barrel adjustment method and device
US11042033B2 (en) * 2017-03-30 2021-06-22 Tencent Technology (Shenzhen) Company Limited Virtual reality glasses, lens barrel adjustment method and device
WO2019088737A1 (fr) * 2017-11-02 2019-05-09 Samsung Electronics Co., Ltd. Method and device for performing remote control
US10678401B2 (en) 2017-11-06 2020-06-09 Whatsapp Inc. Providing group messaging thread highlights
US11604561B2 (en) 2017-11-06 2023-03-14 Whatsapp Llc Providing group messaging thread highlights
USD851671S1 (en) * 2017-11-06 2019-06-18 Whatsapp Inc. Display screen or portion thereof with graphical user interface
USD904435S1 (en) 2017-11-06 2020-12-08 Whatsapp Inc. Display screen or portion thereof with graphical user interface
US10536411B2 (en) 2017-11-06 2020-01-14 Whatsapp Inc. Providing group messaging thread highlights
US10664150B2 (en) 2017-11-06 2020-05-26 Whatsapp Inc. Providing group messaging thread highlights
US10685074B2 (en) 2017-11-06 2020-06-16 Whatsapp Inc. Providing group messaging thread highlights
US20190188916A1 (en) * 2017-11-15 2019-06-20 Xiaoyin ZHANG Method and apparatus for augmenting reality
US20190243443A1 (en) * 2018-01-31 2019-08-08 Colopl, Inc. Program, information processing apparatus and method thereof
US11217031B2 (en) 2018-02-23 2022-01-04 Samsung Electronics Co., Ltd. Electronic device for providing second content for first content displayed on display according to movement of external object, and operating method therefor
CN111742281A (zh) * 2018-02-23 2020-10-02 Samsung Electronics Co., Ltd. Electronic device for providing second content according to movement of an external object for first content displayed on a display, and operating method therefor
US20210125468A1 (en) * 2018-06-28 2021-04-29 3M Innovative Properties Company Notification delivery for workers wearing personal protective equipment
US20200147483A1 (en) * 2018-11-09 2020-05-14 Primax Electronics Ltd. Interactive gaming system
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
EP3859503A1 (fr) * 2020-02-03 2021-08-04 Evolution Malta Ltd Selection of remote objects
US20210240427A1 (en) * 2020-02-03 2021-08-05 Evolution Malta Limited Selection of Remote Objects
US11822728B2 (en) * 2020-12-17 2023-11-21 Samsung Electronics Co., Ltd. Electronic device and operation method thereof
US20220197394A1 (en) * 2020-12-17 2022-06-23 Samsung Electronics Co., Ltd. Electronic device and operation method thereof
US20230418544A1 (en) * 2021-02-18 2023-12-28 Canon Kabushiki Kaisha Glasses-type information device, and method and storage medium for the same
US12039098B2 (en) 2021-07-19 2024-07-16 Samsung Electronics Co., Ltd. Electronic device and operation method of electronic device for controlling external electronic device

Also Published As

Publication number Publication date
CN105589732B (zh) 2020-11-06
US20190362560A1 (en) 2019-11-28
CN105589732A (zh) 2016-05-18
EP3018561B1 (fr) 2019-09-18
KR20160054840A (ko) 2016-05-17
ES2750682T3 (es) 2020-03-26
EP3018561A1 (fr) 2016-05-11
US11120630B2 (en) 2021-09-14
KR102265086B1 (ko) 2021-06-15

Similar Documents

Publication Publication Date Title
US11120630B2 (en) Virtual environment for sharing information
US10209766B2 (en) Displaying method in low power mode and electronic device supporting the same
KR102673224B1 (ko) Electronic device and electronic device control method
US20190318545A1 (en) Command displaying method and command displaying device
CN107257954B (zh) 用于提供屏幕镜像服务的设备和方法
US10261573B2 (en) Power control method and apparatus for reducing power consumption
US10503459B2 (en) Method for sharing screen and electronic device thereof
KR102276847B1 (ko) Virtual object providing method and electronic device therefor
US10261683B2 (en) Electronic apparatus and screen display method thereof
US9916120B2 (en) Method and apparatus for providing of screen mirroring service
US9912880B2 (en) Method and apparatus for adjusting color
US10908712B2 (en) Method for recognizing rotation of rotating body and electronic device for processing the same
US20170041272A1 (en) Electronic device and method for transmitting and receiving content
US20160142703A1 (en) Display method and electronic device
KR102277460B1 (ko) Method for sharing a screen and electronic device therefor
US20160351047A1 (en) Method and system for remote control of electronic device
US20190208557A1 (en) Method using a time point for sharing data between electronic devices based on situation information
EP3107087B1 (fr) Dispositif permettant de commander indépendamment de multiples zones d'affichage et procédé associé
US10719209B2 (en) Method for outputting screen and electronic device supporting the same
KR102192155B1 (ko) Method and apparatus for providing application information
US20190129601A1 (en) Electronic device and control method therefor
KR102434754B1 (ko) Electronic device and content display method of the electronic device
KR102323797B1 (ko) Electronic device and information sharing method thereof
US20160267886A1 (en) Method of controlling screen and electronic device for processing method
US20160085433A1 (en) Apparatus and Method for Displaying Preference for Contents in Electronic Device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO, LTD., KOREA, DEMOCRATIC PE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHOI, WOOSUNG;KANG, HYUK;KIM, MINJI;AND OTHERS;SIGNING DATES FROM 20151006 TO 20151031;REEL/FRAME:036979/0264

AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE COUNTRY PREVIOUSLY RECORDED AT REEL: 036979 FRAME: 0264. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:CHOI, WOOSUNG;KANG, HYUK;KIM, MINJI;AND OTHERS;SIGNING DATES FROM 20151006 TO 20151031;REEL/FRAME:045337/0176

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION