WO2019199043A1 - Mobile terminal - Google Patents


Info

Publication number
WO2019199043A1
Authority
WO
WIPO (PCT)
Prior art keywords
target device
mobile terminal
target
information
identification information
Prior art date
Application number
PCT/KR2019/004252
Other languages
English (en)
Korean (ko)
Inventor
송태영
송종훈
최진구
김병수
김인숙
김희진
송영훈
최지나
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date
Filing date
Publication date
Priority claimed from KR1020180123268A (KR102102396B1)
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Publication of WO2019199043A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W76/00 Connection management
    • H04W76/10 Connection setup
    • H04W76/14 Direct-mode setup
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W8/00 Network data management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W8/00 Network data management
    • H04W8/22 Processing or transfer of terminal data, e.g. status or physical capabilities
    • H04W8/24 Transfer of terminal data

Definitions

  • The present invention relates to a mobile terminal capable of establishing a BLE connection with a target device by photographing the target device to be connected.
  • Bluetooth is a short-range wireless technology standard that can transmit and receive data by wirelessly connecting various devices over a short distance.
  • When performing wireless communication between two devices using Bluetooth, a user performs a procedure of searching for a Bluetooth device and requesting a connection.
  • Here, a device may mean an apparatus or an appliance.
  • That is, the user may search for a Bluetooth device according to the Bluetooth communication method to be used and then establish a connection with it.
  • Bluetooth communication methods include a Basic Rate/Enhanced Data Rate (BR/EDR) method and a Low Energy (LE) method, which is a low-power method.
  • The BR/EDR scheme may be referred to as Bluetooth Classic.
  • Bluetooth Classic includes Bluetooth technology using the Basic Rate, adopted since Bluetooth 1.0, and Bluetooth technology using the Enhanced Data Rate, supported since Bluetooth 2.0.
  • Bluetooth Low Energy (hereinafter referred to as Bluetooth LE) technology has been applied since Bluetooth 4.0 and can stably exchange hundreds of kilobytes (KB) of information with low power consumption.
  • Bluetooth LE technology uses an attribute protocol to exchange information between devices. This Bluetooth LE method can reduce energy consumption by reducing header overhead and simplifying operation.
  • Some Bluetooth devices do not have a display or a user interface.
  • The complexity of connection, management, control, and disconnection among various kinds of Bluetooth devices, including Bluetooth devices to which similar technologies are applied, is increasing.
  • Bluetooth can achieve relatively high speeds at relatively low power and low cost, but since its transmission distance is generally limited to a maximum of 100 m, it is suitable for use in a limited space.
  • Conventionally, the user enters the Bluetooth menu, presses the search button, checks and selects the device to be connected from the search results, and presses the connect button to set up a connection. After connecting, the user runs a dedicated app to control the selected device.
  • The present invention has been made to solve the above problems, and an object of the present invention is to provide a mobile terminal that can establish a BLE connection with a target device by photographing the target device to be connected.
  • According to an embodiment of the present invention, a mobile terminal includes a communication unit for communicating with a plurality of target devices in a Bluetooth Low Energy (BLE) manner, a camera unit for photographing an image, and a controller that receives identification information of the plurality of target devices from the plurality of target devices and uses the captured image to determine the target device to be connected.
  • According to an embodiment of the present invention, an operation method of a mobile terminal communicating with a plurality of target devices in a Bluetooth Low Energy (BLE) manner may include receiving identification information of the plurality of target devices from the plurality of target devices.
  • The present invention not only estimates the target device using machine learning but also determines the target device to be connected using the identification information included in the advertisement packet. Accordingly, the user only needs to point the camera at the target device to be connected, and the mobile terminal can accurately determine which target device has been photographed and establish a connection.
  • If the target device were estimated using machine learning alone, the accuracy could be significantly lower.
  • However, by additionally using the model ID included in the advertisement packet, it is possible to accurately determine which target device has been photographed and to establish a connection; a sketch of reading such a model ID from an advertisement follows.
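  • As an illustration only, the following Kotlin sketch shows one way an Android client might read a model ID carried in the manufacturer-specific data of a BLE advertisement. The manufacturer ID (0x00FF) and the payload layout are hypothetical assumptions, not part of this disclosure.

```kotlin
import android.bluetooth.le.ScanResult

// Hypothetical manufacturer ID; the actual value is vendor-specific and
// not defined by this disclosure.
const val ASSUMED_MANUFACTURER_ID = 0x00FF

/**
 * Extracts a model ID string from the manufacturer-specific data of a
 * BLE advertisement, if present. Returns null when the field is absent.
 */
fun extractModelId(result: ScanResult): String? {
    val data = result.scanRecord
        ?.getManufacturerSpecificData(ASSUMED_MANUFACTURER_ID)
        ?: return null
    // Assume the payload is an ASCII-encoded model identifier.
    return data.toString(Charsets.US_ASCII).trim()
}
```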
  • In addition, the process in which the user presses a search button to perform a scan, a list of the scanned devices is displayed, and the user selects a device to be connected is omitted.
  • Also omitted is the process in which the list of scanned devices is displayed and the user, who must know the model name of the target device, selects that model name from the list.
  • A connection is possible simply by pointing the camera at the device to be connected, which has the advantage of improving user convenience. A sketch of matching camera-based estimates against scanned identification information follows.
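  • The disclosure does not give code for this matching step; the following Kotlin sketch is a minimal illustration, assuming that the machine-learning model produces a list of estimated model IDs from the photograph and that scanning produces the model IDs actually advertised nearby. All names here are hypothetical.

```kotlin
// Hypothetical data holder for a scanned device and its advertised model ID.
data class ScannedDevice(val address: String, val modelId: String)

/**
 * Selects the target device to connect to by intersecting the model IDs
 * estimated from the captured image with the model IDs obtained by scanning.
 * Estimated IDs are assumed to be ordered by descending confidence.
 */
fun selectTarget(
    estimatedModelIds: List<String>,    // from the machine-learning model
    scannedDevices: List<ScannedDevice> // from BLE advertisement packets
): ScannedDevice? {
    for (modelId in estimatedModelIds) {
        scannedDevices.firstOrNull { it.modelId == modelId }?.let { return it }
    }
    return null // no photographed device is advertising nearby
}
```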
  • FIG. 1 is a schematic diagram illustrating an example of a wireless communication system using Bluetooth low power energy technology according to an embodiment of the present invention.
  • FIGS. 2A through 2D are diagrams for describing problems that may occur when establishing a BLE connection.
  • FIG. 3 is a block diagram illustrating a mobile terminal 100 according to an embodiment of the present invention.
  • FIG. 4 is a view for explaining a method of operation of a wireless communication system according to an embodiment of the present invention.
  • FIG. 5 is a diagram for describing a process for establishing a connection between a mobile terminal and a first target device.
  • FIG. 6 is a diagram illustrating an advertisement packet transmitted from a first target device.
  • FIG. 7 is a diagram illustrating a learning process of a machine learning model.
  • FIG. 8 is a diagram illustrating a scene of photographing a target device with a mobile terminal.
  • FIG. 9 is a diagram for describing a method of generating a list of estimated identification information using a captured image, according to an exemplary embodiment.
  • FIGS. 10 and 11 are diagrams for describing a method of selecting a target device for connection and control using estimated identification information and identification information of a plurality of target devices obtained by scanning, according to an embodiment of the present invention.
  • FIG. 12 is a diagram illustrating a stack of the mobile terminal 100 and the first target device 200.
  • FIG. 13 is a diagram illustrating a detailed stack of the mobile terminal 100.
  • FIG. 14 is a diagram illustrating a state change of the target device.
  • FIG. 15 is a diagram illustrating a state change of the mobile terminal.
  • FIGS. 16A to 16C illustrate a process of establishing a connection with the first target device 600 according to an embodiment of the present invention.
  • FIG. 17 is a diagram illustrating a control icon corresponding to the function of the target device.
  • FIGS. 18A to 18C are diagrams for describing a method of inputting a voice command using a mobile terminal according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram illustrating an example of a wireless communication system using Bluetooth low power energy technology according to an embodiment of the present invention.
  • the wireless communication system may include a mobile terminal 100 and a plurality of target devices 200, 300, 400, and 500.
  • the mobile terminal 100 and the plurality of target devices 200, 300, 400, and 500 may perform Bluetooth communication using a Bluetooth Low Energy (BLE) (hereinafter, referred to as BLE) technology.
  • Compared with Bluetooth Basic Rate/Enhanced Data Rate (BR/EDR) technology, BLE technology has a relatively small duty cycle, enables low-cost production, and significantly reduces power consumption through a low data rate, so that a device can operate for more than a year on a coin-cell battery.
  • In addition, the BLE technology simplifies the connection procedure between devices, and the packet size is smaller than that of the Bluetooth BR/EDR technology.
  • The BLE technology also has the following characteristics: the number of RF channels is 40; the data rate supports 1 Mbps; the topology is a scatternet structure; the latency is 3 ms; the output power is 10 mW (10 dBm) or less; and it is mainly used in applications such as mobile phones, watches, sports, healthcare, sensors, and device control.
  • the mobile terminal 100 may operate as a client device in relation to a plurality of target devices.
  • The mobile terminal 100 may be represented as a master device, a master, a client, a member, a sensor device, a sink device, a collector, a first device, a hands-free device, or the like.
  • the plurality of target devices 200, 300, 400, and 500 may operate as server devices in relation to the mobile terminal 100.
  • The plurality of target devices may be represented as a data service device, a slave device, a slave, a server, a conductor, a host device, a gateway, a sensing device, a monitoring device, a second device, an audio gate (AG), or the like.
  • The plurality of target devices communicate directly with the mobile terminal, and when receiving a data request from the mobile terminal, they provide the data to the mobile terminal through a response.
  • In addition, the plurality of target devices may send notification messages and indication messages to the mobile terminal in order to provide data information to the mobile terminal.
  • When the plurality of target devices transmit an indication message to the mobile terminal, they receive a confirmation message corresponding to the indication message from the mobile terminal.
  • In the process of transmitting and receiving notification, indication, and confirmation messages with the mobile terminal (client device), the plurality of target devices may provide data information to the user through a display unit or may receive a request input from the user through a user input interface.
  • the plurality of target devices may read data from a memory unit or write new data to the corresponding memory in the process of transmitting and receiving a message with the mobile terminal.
  • the mobile terminal refers to a device for requesting data information and data transmission from a plurality of target devices.
  • The mobile terminal receives data through notification messages, indication messages, and the like from the plurality of target devices, and when receiving an indication message from the plurality of target devices, sends a confirmation message in response to the indication message.
  • the mobile terminal may provide information to the user through an output unit or receive input from the user through the input unit in the process of transmitting and receiving messages with a plurality of target devices.
  • the mobile terminal may read data from the memory or write new data to the corresponding memory in the process of transmitting and receiving messages with the plurality of target devices.
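  • The notification and indication exchange described above maps onto standard GATT operations. As a minimal, hedged sketch (not the disclosed implementation), an Android client might subscribe to notifications or indications from a target device as follows; the service and characteristic UUIDs are placeholders.

```kotlin
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattDescriptor
import java.util.UUID

// Placeholder UUIDs; a real target device would define its own.
val SERVICE_UUID: UUID = UUID.fromString("0000abcd-0000-1000-8000-00805f9b34fb")
val CHAR_UUID: UUID = UUID.fromString("0000abce-0000-1000-8000-00805f9b34fb")
val CCCD_UUID: UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

/**
 * Enables notifications (or indications) for a characteristic. For indications,
 * the confirmation message mentioned above is sent automatically by the
 * Android Bluetooth stack.
 */
fun subscribe(gatt: BluetoothGatt, useIndication: Boolean) {
    val characteristic = gatt.getService(SERVICE_UUID)
        ?.getCharacteristic(CHAR_UUID) ?: return
    gatt.setCharacteristicNotification(characteristic, true)
    val cccd = characteristic.getDescriptor(CCCD_UUID) ?: return
    cccd.value = if (useIndication)
        BluetoothGattDescriptor.ENABLE_INDICATION_VALUE
    else
        BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
    gatt.writeDescriptor(cccd) // deprecated in API 33+, shown for brevity
}
```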
  • FIG. 2 is a view for explaining problems that may occur when establishing a BLE connection.
  • a user selects a Bluetooth icon 210 to enter a Bluetooth connection menu.
  • When the user presses the search button 220 shown in FIG. 2B, the mobile terminal 100 performs a scan. When the scan is completed, the mobile terminal displays a list 230 of the scanned devices, as shown in FIG. 2C.
  • the user checks a device to be connected in the list and selects the device.
  • Then, the mobile terminal can establish a connection with the selected device.
  • After the connection, the mobile terminal runs a dedicated app for controlling the selected device, and the user can control the selected device using a control screen of the dedicated app.
  • However, since the list of scanned connectable devices shows raw device names such as WMRRD11-NC1012P, the user has difficulty identifying on the list which entry corresponds to the device he or she wants to control.
  • FIG. 3 is a block diagram illustrating a mobile terminal 100 according to an embodiment of the present invention.
  • The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, an artificial intelligence unit 130, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, a power supply unit 190, and the like.
  • The components shown in FIG. 3 are not essential to implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.
  • Among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
  • the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
  • The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and a location information module 115.
  • The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, touch keys and mechanical keys) for receiving information from a user.
  • the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
  • The artificial intelligence unit 130 processes information based on artificial intelligence technology, and may include one or more modules that perform at least one of learning information, inferring information, perceiving information, and processing natural language.
  • The artificial intelligence unit 130 may use machine learning techniques to perform at least one of learning, inferring, and processing a large amount of information (big data), such as information stored in the mobile terminal, environment information around the mobile terminal, and information stored in an external storage with which communication is possible. The artificial intelligence unit 130 may predict (or infer) at least one executable operation of the mobile terminal using the information learned through the machine learning techniques, and may control the mobile terminal to execute the most feasible of the predicted operations.
  • Machine learning technology is a technology that collects and learns a large amount of information based on at least one algorithm, and judges and predicts information based on the learned information.
  • The learning of information is an operation of grasping characteristics, rules, and judgment criteria of the information, quantifying the relationships between pieces of information, and predicting new data using the quantified patterns.
  • The algorithms used by such machine learning techniques may be algorithms based on statistics, for example, decision trees that use a tree structure as a predictive model, artificial neural networks that mimic the structure and function of biological neural networks, genetic programming based on biological evolutionary algorithms, clustering that distributes observed examples into subsets called clusters, and the Monte Carlo method, which computes function values probabilistically using randomly drawn numbers.
  • deep learning technology is a technology that performs at least one of learning, determining, and processing information by using an artificial neural network algorithm.
  • the artificial neural network may have a structure that connects layers to layers and transfers data between layers.
  • Such deep learning technology can learn a large amount of information through an artificial neural network using a graphic processing unit (GPU) optimized for parallel computing.
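  • As purely illustrative context (not the disclosed model), a lightweight on-device classifier of the kind such deep learning technology enables could be run with TensorFlow Lite to estimate which appliance appears in a camera frame; the model file, input size, and label set below are assumptions.

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer

// Assumed label set; a real deployment would ship its own model and labels.
val labels = listOf("air_conditioner", "refrigerator", "washing_machine", "tv")

/**
 * Runs a TensorFlow Lite image classifier and returns labels sorted by score.
 * `input` is assumed to be a preprocessed camera frame matching the model's
 * expected shape (e.g. 1 x 224 x 224 x 3, float32).
 */
fun estimateDeviceTypes(modelFile: File, input: ByteBuffer): List<Pair<String, Float>> {
    Interpreter(modelFile).use { interpreter ->
        val scores = Array(1) { FloatArray(labels.size) }
        interpreter.run(input, scores)
        return labels.zip(scores[0].toList()).sortedByDescending { it.second }
    }
}
```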
  • In order to collect a large amount of information to which machine learning technology can be applied, the artificial intelligence unit 130 may collect (sense, monitor, extract, detect, or receive) signals, data, information, and the like that are input to or output from the components of the mobile terminal.
  • The artificial intelligence unit 130 may also collect (sense, monitor, extract, detect, or receive) data, information, and the like stored in an external storage (for example, a cloud server) connected through communication. More specifically, the collection of information may be understood as a term that includes operations of sensing information through a sensor, extracting information stored in the memory 170, or receiving information from an external storage through communication.
  • The artificial intelligence unit 130 may sense information in the mobile terminal, surrounding environment information around the mobile terminal, and user information through the sensing unit 140. In addition, the artificial intelligence unit 130 may receive a broadcast signal and/or broadcast related information, wireless signals, wireless data, and the like through the wireless communication unit 110, and may receive image information (or signals), audio information (or signals), data, or information input by a user from the input unit.
  • The artificial intelligence unit 130 may collect a large amount of information in real time in the background, learn from it, and store the processed information (for example, a knowledge graph, command policy, personalization database, conversation engine, and the like) in the memory 170 in an appropriate form.
  • When an operation of the mobile terminal is predicted, the artificial intelligence unit 130 may control the components of the mobile terminal to execute the predicted operation, or may transmit a control command for executing the operation to the controller 180.
  • The controller 180 may then execute the predicted operation by controlling the mobile terminal based on the control command.
  • When a specific operation is performed, the artificial intelligence unit 130 may analyze historical information indicating the performance of the specific operation through machine learning technology and update the previously learned information based on the analyzed information. In this way, the artificial intelligence unit 130 may improve the accuracy of its information prediction.
  • the artificial intelligence unit 130 and the controller 180 may be understood as the same component.
  • Accordingly, a function performed by the controller 180 described herein may be expressed as being performed by the artificial intelligence unit 130; the controller 180 may be named the artificial intelligence unit 130, or conversely, the artificial intelligence unit 130 may be referred to as the controller 180.
  • the artificial intelligence unit 130 and the controller 180 may be understood as separate components.
  • the artificial intelligence unit 130 and the controller 180 may perform various controls on the mobile terminal through data exchange with each other.
  • the controller 180 may perform at least one function on the mobile terminal or control at least one of the components of the mobile terminal based on the result derived from the artificial intelligence unit 130.
  • the artificial intelligence unit 130 may also be operated under the control of the controller 180.
  • the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
  • For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), and the like.
  • the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
  • The output unit 150 is used to generate output related to sight, hearing, or touch, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and an optical output unit 154.
  • the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • the touch screen may function as a user input unit 123 that provides an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.
  • the interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100.
  • The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • the memory 170 stores data supporting various functions of the mobile terminal 100.
  • The memory 170 may store a plurality of application programs (applications) to be driven in the mobile terminal 100, data and instructions for operating the mobile terminal 100, and data for the operation of the artificial intelligence unit 130 (for example, at least one piece of algorithm information for machine learning). At least some of these application programs may be downloaded from an external server via wireless communication. In addition, at least some of these application programs may exist on the mobile terminal 100 from the time of shipment for basic functions of the mobile terminal 100 (for example, call receiving and calling functions, and message receiving and sending functions). An application program may be stored in the memory 170, installed on the mobile terminal 100, and driven by the controller 180 to perform an operation (or function) of the mobile terminal.
  • In addition to operations related to application programs, the controller 180 typically controls the overall operation of the mobile terminal 100.
  • the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
  • controller 180 may control at least some of the components described with reference to FIG. 3 in order to drive an application program stored in the memory 170. In addition, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
  • the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the mobile terminal 100.
  • the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
  • the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or switching of broadcast channels for at least two broadcast channels.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • The broadcast signal may be encoded according to at least one of the technical standards (or broadcast methods, for example, ISO, IEC, DVB, ATSC, and the like) for transmitting and receiving digital broadcast signals, and the broadcast receiving module 111 may receive the digital broadcast signal using a method suitable for the technical standard adopted.
  • the broadcast associated information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
  • the broadcast related information may exist in various forms such as an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).
  • The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 170.
  • The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (for example, Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).
  • The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
  • the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
  • the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
  • Wireless internet technologies include, for example, Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like, and the wireless internet module 113 transmits and receives data according to at least one wireless internet technology, including internet technologies not listed above.
  • The wireless internet module 113 that performs wireless internet access through a mobile communication network may be understood as a kind of mobile communication module 112.
  • The short range communication module 114 is for short range communication and may support short range communication by using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • The short range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located.
  • the short range wireless communication network may be short range wireless personal area networks.
  • Here, the other mobile terminal 100 may be a wearable device capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention, for example, a smartwatch, smart glasses, or a head mounted display (HMD).
  • the short range communication module 114 may sense (or recognize) a wearable device that can communicate with the mobile terminal 100, around the mobile terminal 100.
  • The controller 180 may transmit at least a portion of the data processed by the mobile terminal 100 to the wearable device through the short range communication module 114. Accordingly, the user of the wearable device may use the data processed by the mobile terminal 100 through the wearable device.
  • For example, when a phone call is received by the mobile terminal 100, the user may answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user may check the received message through the wearable device.
  • the location information module 115 is a module for obtaining a location (or current location) of a mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
  • For example, when the mobile terminal utilizes the GPS module, the mobile terminal may acquire its location using a signal transmitted from a GPS satellite.
  • As another example, when the mobile terminal utilizes the Wi-Fi module, the mobile terminal may acquire its location based on information of a wireless access point (AP) that transmits wireless signals to or receives wireless signals from the Wi-Fi module. If necessary, the location information module 115 may perform a function of another module of the wireless communication unit 110 in order to obtain, substitutively or additionally, data regarding the location of the mobile terminal.
  • the location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
  • The input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user.
  • For inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
  • The plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 121 forming the matrix structure, a plurality of pieces of image information having various angles or focuses may be input to the mobile terminal 100.
  • the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for implementing a stereoscopic image.
  • the microphone 122 processes external sound signals into electrical voice data.
  • the processed voice data may be variously used according to a function (or an application program being executed) performed by the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving an external sound signal.
  • The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 may control an operation of the mobile terminal 100 to correspond to the input information.
  • The user input unit 123 may include a mechanical input means (or a mechanical key, for example, a button, a dome switch, a jog wheel, or a jog switch located on the front, rear, or side of the mobile terminal 100) and a touch input means.
  • For example, the touch input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
  • The virtual key or the visual key may be displayed on the touch screen in various forms, for example, a graphic, text, an icon, a video, or a combination thereof.
  • the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
  • the controller 180 may control driving or operation of the mobile terminal 100 or perform data processing, function or operation related to an application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
  • the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • the proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
  • Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • the proximity sensor 141 may be configured to detect the proximity of the object by the change of the electric field according to the proximity of the conductive object.
  • the touch screen (or touch sensor) itself may be classified as a proximity sensor.
  • The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
  • The controller 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern detected through the proximity sensor 141 as described above, and may output visual information corresponding to the processed data on the touch screen. Further, the controller 180 may control the mobile terminal 100 to process different operations or data (or information) depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
  • The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods such as a resistive film method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
  • the touch sensor may be configured to detect a position, an area, a pressure at the touch, a capacitance at the touch, and the like, when the touch object applying the touch on the touch screen is touched on the touch sensor.
  • the touch object is an object applying a touch to the touch sensor, and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
  • the touch controller processes the signal (s) and then transmits the corresponding data to the controller 180.
  • the controller 180 can know which area of the display unit 151 is touched.
  • the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
  • the controller 180 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
  • The touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
  • the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
  • the controller 180 can calculate the position of the wave generation source through the information detected from the optical sensor and the plurality of ultrasonic sensors.
  • The position of the wave generation source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source may be calculated using the time difference between the arrival of the ultrasonic wave and that of the light, with the light used as a reference signal.
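  • As a simple illustration of this calculation (not taken from the disclosure), if the light arrival is treated as time zero, the distance to the wave source is approximately the speed of sound multiplied by the measured delay of the ultrasonic wave:

```kotlin
// Speed of sound in air at room temperature (approximate).
const val SPEED_OF_SOUND_M_PER_S = 343.0

/**
 * Estimates the distance to the wave source from the delay between the light
 * (reference) arrival and the ultrasonic arrival. The light travel time is
 * neglected because it is several orders of magnitude smaller.
 */
fun distanceToSource(ultrasoundDelaySeconds: Double): Double =
    SPEED_OF_SOUND_M_PER_S * ultrasoundDelaySeconds

fun main() {
    // Example: a 5 ms delay corresponds to roughly 1.7 m.
    println(distanceToSource(0.005)) // ≈ 1.715
}
```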
  • the camera 121 which has been described as the configuration of the input unit 120, includes at least one of a camera sensor (eg, CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
  • the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
  • The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TR) in rows and columns and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to changes in the amount of light, and thereby the position information of the sensing object can be obtained.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
  • the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information according to the execution screen information. .
  • the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
  • The stereoscopic display unit may employ a three-dimensional display method such as a stereoscopic method (glasses method), an auto-stereoscopic method (glasses-free method), or a projection method (holographic method).
  • a 3D stereoscopic image is composed of a left image (left eye image) and a right image (right eye image).
  • Depending on how the left and right images are combined into a three-dimensional stereoscopic image, such methods include a top-down method in which the left and right images are arranged up and down in one frame, an L-to-R (left-to-right, side by side) method in which the left and right images are arranged left and right in one frame, a checker board method in which pieces of the left and right images are arranged in a tile form, an interlaced method in which the left and right images are alternately arranged in units of columns or rows, and a time sequential (frame by frame) method in which the left and right images are alternately displayed over time.
  • the 3D thumbnail image may generate a left image thumbnail and a right image thumbnail from the left image and the right image of the original image frame, respectively, and may be generated as one image as they are combined.
  • a thumbnail refers to a reduced image or a reduced still image.
  • the left image thumbnail and the right image thumbnail generated as described above are displayed with a left and right distance difference on the screen by a depth corresponding to the parallax of the left image and the right image, thereby representing a three-dimensional space.
  • the left image and the right image necessary for implementing the 3D stereoscopic image may be displayed on the stereoscopic display unit by the stereoscopic processing unit.
  • the stereoscopic processor is configured to receive a 3D image (the image of the base view and the image of the extended view) and set a left image and a right image therefrom, or to receive a 2D image and convert it to a left image and a right image.
  • the sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output unit 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100.
  • the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
  • the haptic module 153 generates various haptic effects that a user can feel.
  • a representative example of the tactile effect generated by the haptic module 153 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 153 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 153 may synthesize different vibrations and output them or sequentially output them.
  • In addition to vibration, the haptic module 153 can generate various tactile effects, such as effects from a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a jet or suction port, brushing against the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 153 may not only deliver a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 153 may be provided according to configuration aspects of the mobile terminal 100.
  • the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the mobile terminal 100.
  • Examples of events occurring in the mobile terminal 100 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • the signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or the rear.
  • the signal output may be terminated by the mobile terminal detecting the user's event confirmation.
  • the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
  • the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 160.
  • The identification module is a chip that stores various pieces of information for authenticating the usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 160.
  • When the mobile terminal 100 is connected to an external cradle, the interface unit 160 may be a passage through which power from the cradle is supplied to the mobile terminal 100, or a passage through which various command signals input from the cradle by the user are transferred to the mobile terminal 100.
  • Various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 170 on the Internet.
  • the controller 180 controls the operation related to the application program, and generally the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute or release a lock state that restricts input of a user's control command to applications.
  • The controller 180 may perform control and processing related to voice calls, data communication, video calls, and the like, or may perform pattern recognition processing for recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively. Furthermore, the controller 180 may control any one or a combination of the components described above in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.
  • the power supply unit 190 receives external power and internal power under the control of the controller 180 to supply power required for the operation of each component.
  • the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
  • The power supply unit 190 may be provided with a connection port, and the connection port may be configured as one example of the interface unit 160 to which an external charger that supplies power for charging the battery is electrically connected.
  • the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
  • In this case, the power supply unit 190 may receive power transferred from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
  • various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • Although the input unit 120 and the sensing unit 140 have been described as separate components, the present disclosure is not limited thereto, and the sensing unit 140 may include the input unit 120.
  • The wireless communication unit 110 may be used interchangeably with the term communication unit.
  • The memory 170 may be used interchangeably with the term storage.
  • The controller 180 may be used interchangeably with the terms controller and processor.
  • The controller 180 may include an application-specific integrated circuit (ASIC), another chipset, a logic circuit, and/or a data processing device.
  • The artificial intelligence unit 130 may be used interchangeably with the terms AI controller and AI processor.
  • the wireless communication unit 110 may include a network interface.
  • the network interface refers to a device that enables a mobile terminal to perform wired or wireless communication with a plurality of target devices, and may include an energy efficiency interface and a legacy interface.
  • the energy efficiency interface is a device for low power wireless communication with low energy consumption, and refers to a unit (or module) that enables a mobile device to search for a target device or to transmit data.
  • the legacy interface is a device for wireless communication, and refers to a unit (or module) that enables the mobile terminal to search for another device or to transmit data.
  • the display unit 151 may be a unit (or a module) that outputs data received through a network interface or data stored in the memory 170 under the control of the controller 180.
  • The controller 180 controls the network interface to receive advertisement information from the target device, controls the network interface to transmit a scan request to the target device and to receive a scan response from the target device in response to the scan request, and controls the network interface to transmit a connect request message to the target device for establishing a Bluetooth connection with the target device.
  • After the connection is established, the controller 180 controls the communication unit to read data from or write data to the target device using the attribute protocol. A hedged sketch of these steps on a typical client stack follows.
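  • The disclosure describes these steps abstractly; the following Kotlin sketch shows how a typical Android client might perform them (scan for advertisements, connect, then read a characteristic over the attribute protocol). Active scanning (the scan request/response exchange) is handled by the Android BLE stack itself; the characteristic UUIDs reuse the placeholder values above, runtime Bluetooth permissions are omitted, and the whole sketch is an assumption rather than the disclosed implementation.

```kotlin
import android.bluetooth.BluetoothAdapter
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCallback
import android.bluetooth.BluetoothProfile
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanResult
import android.content.Context

class BleClient(private val context: Context, private val adapter: BluetoothAdapter) {

    // Receives advertisement (and scan response) data; the scan request itself
    // is issued by the controller stack, not by application code.
    private val scanCallback = object : ScanCallback() {
        override fun onScanResult(callbackType: Int, result: ScanResult) {
            val modelId = extractModelId(result) ?: return // helper sketched earlier
            // Matching against the camera-based estimate would happen here.
            adapter.bluetoothLeScanner?.stopScan(this)
            result.device.connectGatt(context, false, gattCallback) // connect request
        }
    }

    private val gattCallback = object : BluetoothGattCallback() {
        override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
            if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
        }

        override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
            // Read a characteristic using the attribute protocol (GATT/ATT).
            gatt.getService(SERVICE_UUID)
                ?.getCharacteristic(CHAR_UUID)
                ?.let { gatt.readCharacteristic(it) }
        }
    }

    fun startScan() {
        adapter.bluetoothLeScanner?.startScan(scanCallback)
    }
}
```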
  • the BLE procedure may be classified into a device filtering procedure, an advertising procedure, a scanning procedure, a discovery procedure, a connecting procedure, and the like.
  • the device filtering procedure is a method for reducing the number of devices performing a response to a request, an indication, a notification, etc. in the controller stack.
  • By filtering devices, the controller stack can reduce the number of requests to which it responds, thereby reducing power consumption in the BLE controller stack.
  • the advertising device or scanning device may perform the device filtering procedure to limit the device receiving the advertising packet, scan request or connection request.
  • the advertising device refers to a device that transmits an advertising event, that is, performs an advertisement, and is also referred to as an advertiser.
  • the plurality of target devices 200, 300, 400, and 500 correspond to advertising devices.
  • the scanning device refers to a device that performs scanning and a device that transmits a scan request.
  • the mobile terminal 100 corresponds to a scanning device.
  • When device filtering is not used, the scanning device must send a scan request whenever it receives certain advertising packets from the advertising device.
  • However, when device filtering is used and a scan request is unnecessary, the scanning device may ignore the advertisement packets transmitted from the advertising device.
  • The device filtering procedure may also be used in the connection request process. If device filtering is used in the connection request process, it becomes unnecessary to transmit a response to the connection request, because the connection request can simply be ignored.
  • the advertising device performs an advertising procedure to perform a non-directional broadcast to the devices in the area.
  • non-directional broadcast refers to broadcast in all directions rather than broadcast in a specific direction.
  • In contrast, a directional broadcast refers to a broadcast in a specific direction. Non-directional broadcasts occur without a connection procedure between an advertising device and a device in a listening state (hereinafter referred to as a listening device).
  • the advertising procedure is used to establish a Bluetooth connection with a nearby initiating device.
  • the advertising procedure may be used to provide periodic broadcast of user data to the scanning devices that are listening on the advertising channel.
  • the advertising devices may receive a scan request from listening devices that are listening to obtain additional user data from the advertising device.
  • the advertising device transmits a response to the scan request to the device that sent the scan request through the same advertising physical channel as the received advertising physical channel.
  • Broadcast user data sent as part of an advertisement packet is dynamic data, while scan response data is generally static data.
  • the advertising device may receive a connection request from the initiating device on the advertising (broadcast) physical channel. If the advertising device used a connectable advertising event and the initiating device was not filtered by the device filtering procedure, the advertising device stops the advertising and enters the connected mode. The advertising device may start advertising again after the connected mode.
  • the device performing the scanning, i.e., the scanning device, performs a scanning procedure to listen to non-directional broadcasts of user data from the advertising devices using the advertising physical channel.
  • the scanning device sends a scan request to the advertising device via the advertising physical channel to request additional data from the advertising device.
  • the advertising device transmits a scan response that is a response to the scan request, including additional data requested by the scanning device over the advertising physical channel.
  • the scanning procedure can be used while connected to other BLE devices in the BLE piconet.
  • if the scanning device is in an initiator mode in which it can receive the broadcast advertising event and initiate a connection request, the scanning device can send a connection request to the advertising device via the advertising physical channel and start a Bluetooth connection with the advertising device.
  • when the scanning device sends a connection request to the advertising device, the scanning device stops the initiator-mode scanning for additional broadcasts and enters the connected mode.
  • devices capable of Bluetooth communication (hereinafter referred to as "Bluetooth devices") perform an advertisement procedure and a scanning procedure to find devices that are nearby or to be found by other devices within a given area.
  • the discovery procedure is performed asymmetrically.
  • a Bluetooth device that attempts to find another device around it is called a discovering device and listens for devices that advertise a scannable advertisement event.
  • Bluetooth devices discovered and available from other devices are referred to as discoverable devices, and actively broadcast advertising events so that other devices can scan through an advertising (broadcast) physical channel.
  • Both the discovering device and the discoverable device may already be connected with other Bluetooth devices in the piconet.
  • connection procedure is asymmetric, and the connection procedure requires the other Bluetooth device to perform the scanning procedure while the specific Bluetooth device performs the advertisement procedure.
  • the advertising procedure can be targeted, so that only one device will respond to the advertising.
  • the connection may be initiated by sending a connection request to the advertising device via the advertising (broadcast) physical channel.
  • the link layer (LL) enters the advertisement state by the instruction of the host (stack). If the link layer is in the advertisement state, the link layer sends advertisement packet data units (PDUs) in the advertisement events.
  • Each advertising event consists of at least one advertising PDU, which is transmitted via the advertising channel indexes used.
  • the advertisement event may terminate when the advertisement PDU is transmitted through each of the advertisement channel indexes used, or may terminate the advertisement event earlier when the advertisement device needs to make space for performing another function.
  • the link layer enters the scanning state by the indication of the host (stack). In the scanning state, the link layer listens for advertising channel indices.
  • there are two types of scanning states, passive scanning and active scanning, and each scanning type is determined by the host.
  • ScanInterval is defined as the interval between the starting points of two consecutive scan windows.
  • the link layer must listen during the scan window of every scan interval, as instructed by the host. In each scan window, the link layer must scan a different advertising channel index. The link layer uses all available advertising channel indexes.
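  • as a quick illustration of these definitions (the example values below are assumptions, not values from the disclosure), the fraction of time the link layer spends listening is simply the scan window divided by the scan interval:
```python
# Illustrative values only; the scan window must not exceed the scan interval.
scan_interval_ms = 100.0  # time between the starting points of two consecutive scan windows
scan_window_ms = 30.0     # listening time within each scan interval
duty_cycle = scan_window_ms / scan_interval_ms
print(f"scanning duty cycle: {duty_cycle:.0%}")  # -> 30%
```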
  • when performing passive scanning, the link layer only receives packets and does not transmit any packets.
  • when performing active scanning, the link layer listens for advertising PDUs and, depending on the advertising PDU type, may request additional information related to the advertising device from the advertising device.
  • the link layer enters the initiation state by the indication of the host (stack).
  • when the link layer is in the initiating state, the link layer performs listening for the advertising channel indexes.
  • the link layer listens for the advertising channel index during the scan window period.
  • the link layer enters the connected state when the device performing the connection request, i.e., the initiating device, sends the CONNECT_REQ PDU to the advertising device or when the advertising device receives the CONNECT_REQ PDU from the initiating device.
  • after entering the connected state, the connection is considered to be created; however, the connection need not be considered established at the moment it enters the connected state. The only difference between a newly created connection and an established connection is the link layer connection supervision timeout value.
  • the link layer that performs the master role is called a master, and the link layer that performs the slave role is called a slave.
  • the master controls the timing of the connection event, and the connection event is the point in time when the master and the slave are synchronized.
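  • the link layer states described above can be summarized in a small sketch; the following is an illustrative simplification in Python, and the transition table is an assumption drawn from this description, not the complete Bluetooth state machine:
```python
from enum import Enum, auto

class LinkLayerState(Enum):
    """Link layer states described above (simplified illustration)."""
    STANDBY = auto()
    ADVERTISING = auto()
    SCANNING = auto()
    INITIATING = auto()
    CONNECTED = auto()

# Transitions implied by the description: the host moves the link layer out of
# standby, and a CONNECT_REQ PDU moves an advertiser or initiator to connected.
ALLOWED_TRANSITIONS = {
    LinkLayerState.STANDBY: {LinkLayerState.ADVERTISING,
                             LinkLayerState.SCANNING,
                             LinkLayerState.INITIATING},
    LinkLayerState.ADVERTISING: {LinkLayerState.CONNECTED, LinkLayerState.STANDBY},
    LinkLayerState.SCANNING: {LinkLayerState.STANDBY, LinkLayerState.INITIATING},
    LinkLayerState.INITIATING: {LinkLayerState.CONNECTED, LinkLayerState.STANDBY},
    LinkLayerState.CONNECTED: {LinkLayerState.STANDBY},
}

def transition(current: LinkLayerState, new: LinkLayerState) -> LinkLayerState:
    """Return the new state, or raise if the description does not allow it."""
    if new not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {new.name}")
    return new

print(transition(LinkLayerState.STANDBY, LinkLayerState.SCANNING))
```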
  • FIG. 4 is a view for explaining a method of operation of a wireless communication system according to an embodiment of the present invention.
  • the plurality of target devices 200 and 300 may transmit advertisement information to the mobile terminal 100 (S410 and S420).
  • the mobile terminal 100 may scan the peripheral target device (S430).
  • the first target device 200 in an advertisement state may transmit advertisement information to adjacent devices for each advertisement event.
  • the time between the advertising events may be defined as an advertising interval.
  • the first target device 200 may transmit advertisement information to adjacent devices at each advertising interval.
  • the advertisement information may be used interchangeably with the term advertisement packet or advertisement message.
  • the first target device 200 may include specific information in the advertisement packet and transmit the same to the adjacent devices through a procedure of setting the advertisement packet.
  • the first target device 200 performs an advertisement parameter setting procedure to include specific information in the advertisement packet in the standby state.
  • the host of the first target device 200 transmits a Set Advertising Parameter Command to the controller to set parameters related to the interval and address at which the advertisement packet is transmitted.
  • the advertisement parameter setting command may set the maximum transmission interval, the minimum transmission interval, the type of the advertisement packet, the type of address used for the advertisement packet, the advertisement channel used for the transmission of the advertisement packet, and the like.
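  • a minimal sketch of the values such a command carries is shown below; the field names and defaults are illustrative assumptions for readability, not the actual HCI field layout:
```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AdvertisingParameters:
    """Illustrative container for values set by the Set Advertising Parameter Command."""
    interval_min_ms: float = 100.0      # minimum advertising interval
    interval_max_ms: float = 200.0      # maximum advertising interval
    advertising_type: str = "ADV_IND"   # e.g. connectable undirected advertising
    own_address_type: str = "public"    # type of address used for the advertisement packet
    channel_map: Tuple[int, ...] = (37, 38, 39)  # advertising channels used for transmission

print(AdvertisingParameters())
```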
  • the host transmits a Set Advertising Data Command to the controller to set data used in the advertisement packets including the data field.
  • the first target device may set data included in the advertisement packet.
  • the advertisement packet may include data for TDS.
  • the host then sends an advertisement activation command to the controller to start or stop the operation of sending the advertisement packet.
  • after the controller receives the advertisement activation command from the host, the controller continuously transmits the advertisement packet to neighboring devices until it receives an advertisement activation command for stopping transmission of the advertisement packet from the host.
  • the first target device 200 enters an advertisement state for transmitting the advertisement packet, and the controller of the first target device 200 periodically transmits the advertisement packet in the advertisement event.
  • the mobile terminal 100 may perform a scanning operation by setting a scanning parameter to receive an advertisement packet transmitted from adjacent devices in a standby state.
  • the host of the mobile terminal 100 transmits a set scan parameters command to the controller to set parameters for scanning.
  • the scan parameter setting command may set a type of a scanning operation such as passive scanning or active scanning, an interval at which a scanning operation is to be performed, a scan window, and a type of an address used in scan request packets.
  • after setting the parameters for the scanning operation through the scan parameter setting command, the host transmits an enable scanning command to the controller to start the scanning operation.
  • the mobile terminal 100 enters a scanning state to perform a scanning operation and receives advertisement packets transmitted from neighboring devices during the scanning window period.
  • the advertisement packets transmitted from the plurality of target devices may be received during the scanning period of the mobile terminal 100.
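  • the scanning flow above (set scan parameters, enable scanning, collect advertisement packets during the scan window) can be sketched with the third-party Python library "bleak"; this is a hedged illustration rather than the terminal's actual implementation, and it assumes a recent bleak version whose AdvertisementData exposes local_name, tx_power, and manufacturer_data:
```python
import asyncio
from bleak import BleakScanner

async def scan_for_advertisements(duration_s: float = 5.0):
    """Collect advertisement reports from nearby advertising devices."""
    reports = []

    def on_advertisement(device, advertisement_data):
        # Each callback corresponds to one received advertisement packet.
        reports.append({
            "address": device.address,
            "local_name": advertisement_data.local_name,
            "tx_power": advertisement_data.tx_power,
            # Dict mapping company ID -> manufacturer-specific bytes.
            "manufacturer_data": dict(advertisement_data.manufacturer_data),
        })

    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()            # enter the scanning state
    await asyncio.sleep(duration_s)  # listen during the scan window period
    await scanner.stop()
    return reports

if __name__ == "__main__":
    print(asyncio.run(scan_for_advertisements()))
```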
  • when the controller of the device receives the advertisement packet, the controller generates an advertisement report and sends it to the host in order to report the received advertisement packet to the host.
  • the advertisement report may be called LE_Advertising_Report_Event and may be generated based on one or a plurality of advertisement packets.
  • FIG. 6 is a diagram illustrating an example of a data format of an advertisement packet.
  • the advertisement packet has an AD Structure format, and the mobile terminal 100 may parse the advertisement packet to find the data it needs.
  • the plurality of target devices may include specific data in the advertisement packet in the AD Structure format through the above-described advertisement parameter setting procedure.
  • the plurality of target devices may configure the advertisement packet so as to transmit, through the advertisement packet, Tx Power (transmission power), data for providing TDS, and Manufacturer information (information related to the manufacturer).
  • the advertisement packet is composed of a plurality of AD structures, and each AD structure may have a Length-Type-Value (LTV) structure.
  • the Length field indicates the length of data
  • the Ad Type field indicates the type of data included
  • the Ad Data may include actual data.
  • each AD structure may be composed of Length, AD Type, and AD Data fields.
  • the AD type may indicate a type of data included in AD data.
  • the AD Data field includes data according to an AD type and may be configured of one or a plurality of LTV structures.
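  • for illustration, a short Python sketch of parsing such a payload into AD structures follows; the sample bytes are made-up placeholders, not data from the disclosure:
```python
def parse_ad_structures(payload: bytes):
    """Split raw advertising data into (ad_type, ad_data) pairs.

    Each AD structure is Length | AD Type | AD Data, where Length counts the
    AD Type byte plus the AD Data bytes, as described above.
    """
    structures = []
    i = 0
    while i < len(payload):
        length = payload[i]
        if length == 0:            # padding / end of significant part
            break
        ad_type = payload[i + 1]
        ad_data = payload[i + 2 : i + 1 + length]
        structures.append((ad_type, ad_data))
        i += 1 + length
    return structures

# Example payload: Flags (0x01), Tx Power Level (0x0A), Manufacturer Specific Data (0xFF).
sample = bytes([0x02, 0x01, 0x06,
                0x02, 0x0A, 0x08,
                0x05, 0xFF, 0xFF, 0x01, 0x4D, 0x31])  # 0xFF 0x01 is a placeholder company ID
print(parse_ad_structures(sample))
```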
  • the advertisement information transmitted from the target device may include identification information of the target device.
  • the identification information of the target device may be a model ID of the target device.
  • the advertisement information transmitted from the second target device 300 may be a model ID unique to the second target device 300 which is identification information of the second target device 300.
  • the advertisement information transmitted from the first target device 200 may be a model ID unique to the first target device 200 which is identification information of the first target device 200.
  • the plurality of target devices may include identification information of each of the plurality of target devices in the advertisement packet through the above-described advertisement parameter setting procedure.
  • the advertisement packet may include data representing the model ID of the target device.
  • an extended field for including data representing a model ID of a target device may be configured using a manufacturer AD type.
  • the extended field may include an AI camera field, which allows the mobile terminal 100 to recognize that the target device can be recognized by photographing, and a data field indicating the model ID of the target device.
  • in general, the manufacturer AD type carries data provided by the manufacturer of the target device, for example, the manufacturer's unique ID (Company ID) and specific data provided by the manufacturer.
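  • building on the parser above, the sketch below shows how such an extended field could be read. The internal layout assumed here (a 2-byte company ID, an "AI camera" marker byte, then a UTF-8 model ID) is a hypothetical layout chosen for illustration; the disclosure does not fix the byte order or field sizes:
```python
import struct

MANUFACTURER_SPECIFIC_AD_TYPE = 0xFF   # standard AD type for manufacturer-specific data

def extract_model_id(ad_structures):
    """Return (company_id, model_id) from a hypothetical extended manufacturer field."""
    for ad_type, ad_data in ad_structures:
        if ad_type != MANUFACTURER_SPECIFIC_AD_TYPE or len(ad_data) < 4:
            continue
        company_id = struct.unpack_from("<H", ad_data, 0)[0]  # 2-byte little-endian company ID
        ai_camera_flag = ad_data[2]                           # assumed marker for camera-based recognition
        if ai_camera_flag != 0x01:
            continue
        model_id = ad_data[3:].decode("utf-8", errors="replace")
        return company_id, model_id
    return None

# Example input shaped like the output of parse_ad_structures() above; values are placeholders.
print(extract_model_id([(0xFF, b"\xff\x01\x01LG-TV-01")]))
```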
  • the advertisement information may include an access address of each of the plurality of target devices for accessing the plurality of target devices.
  • the plurality of target devices may include a plurality of network interfaces (for example, Wi-Fi, WiGig, BR / EDR, etc.) in addition to the Bluetooth LE.
  • the advertisement message may include information of a network interface supported by the target device.
  • the controller 180 may receive a plurality of advertisement information including identification information of the plurality of target devices, respectively, from the plurality of target devices.
  • the mobile terminal 100 initially exists in a standby state. In the standby state, the mobile terminal 100 cannot receive a message.
  • the mobile terminal 100 may enter an advertising state, a scanning state, or an initiating state.
  • the mobile terminal 100 may perform a scan. In this case, the mobile terminal 100 may enter a scanning state and receive an advertisement packet transmitted from a nearby target device.
  • the controller 180 may obtain the model ID and the access address of the target device by parsing the received advertisement packet.
  • the controller 180 may parse the advertisement packet received from the first target device 200 to obtain the identification information (the model ID of the first target device 200) and the access address of the first target device 200.
  • likewise, the controller 180 may parse the advertisement packet received from the second target device 300 to obtain the identification information (the model ID of the second target device 300) and the access address of the second target device 300.
  • through the Bluetooth LE, the mobile terminal 100 may obtain information on the alternative communication means supported by the plurality of target devices (for example, Bluetooth BR / EDR, Wi-Fi, Wi-Fi Direct, etc.) and information on the services that can be provided through those alternative communication means (for example, Bluetooth BR / EDR A2DP HFP, Wi-Fi Direct Miracast, Wi-Fi Direct File Transfer, etc.).
  • the mobile terminal 100 may be connected to the target device through alternative communication means, and may provide various services through the connected alternative communication means.
  • the mobile terminal and the plurality of target devices may exchange alternative communication means information supported by each device and service information that may be provided through the alternative communication means in the connection procedure of the Bluetooth LE.
  • the mobile terminal 100 exists in a scanning state in order to search for a device supporting Bluetooth nearby, and a plurality of target devices exist in an advertising state.
  • the plurality of target devices transmit the advertisement information to the mobile terminal 100 in the advertisement state.
  • the advertisement information may be transmitted to a plurality of devices in a broadcast manner through an advertisement channel.
  • the mobile terminal 100 may receive the advertisement information during the search window section in the search state.
  • the advertisement information may include alternative communication means information indicating information of alternative communication means supported by each of the plurality of external devices, and service information indicating information of a service that can be provided through the alternative communication means.
  • the service information may be included as list information listing a plurality of services.
  • the alternative communication means information and the service information may be basic information for allowing the mobile terminal 100 to select an alternative communication means and a service to be activated through the GATT of the Bluetooth LE.
  • the mobile terminal 100 may transmit a scan request message to the target device.
  • the mobile terminal 100 may receive a scan response message including additional information from the target device.
  • the additional information may include a local name of the target device, a device class, a device type, and / or a major service class.
  • the controller 180 may control the camera unit to photograph the first target device of the plurality of target devices (S440). In addition, the controller 180 may obtain identification information labeled on the first target device by using the captured image of the first target device (S450).
  • the controller 180 may obtain identification information labeled on the first target device by using the previously learned machine learning model.
  • the learning process of the machine learning model is illustrated in FIG. 7.
  • data for learning a neural network may be collected (S710).
  • the data is data for a plurality of target devices, and may include photos, model IDs, etc. of the plurality of target devices.
  • the collected data may be preprocessed (S720). Specifically, the collected data may be processed and classified in a state suitable for machine learning. In this case, the image format supported by the learning program, the number of data, and the data for each model may be classified and collected. The collected data can also be divided into training data and review data.
  • the learning algorithm may include Naive-Bayes, K-Nearest Neighbors, Logistic Regression, and Support Vector Machine.
  • the learning process of the neural network may be a process of training by labeling identification information (model ID) on an image of the target device.
  • images taken of the target device at various angles, distances, and illuminance may be used as training data, and images of all or a portion of the target device may be used as training data.
  • the learning process of the neural network may be a process of training by labeling identification information (model ID) on images of each of the plurality of target devices.
  • the machine learning model generated by such training may be a model learned by labeling identification information on each of a plurality of target devices. Accordingly, when an image including the target device is input, the learned machine learning model may predict and output identification information (model ID) labeled on the target device.
  • the learning result, that is, the learned machine learning model, is verified with the review data (S740), and when the accuracy on the review data exceeds a target value, the learned machine learning model may be mounted in the mobile terminal 100 (S750).
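  • for illustration only, the sketch below mirrors this flow (S710–S750) in Python with scikit-learn, using an SVM as one of the algorithm families mentioned above; the model IDs, feature vectors, and target accuracy are stand-in assumptions rather than the disclosed training data:
```python
import numpy as np
import joblib
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
model_ids = ["LG-TV-01", "LG-HS-02", "LG-HUM-03"]   # hypothetical model-ID labels
# S710: in practice, X would hold features of photographs taken of each target
# device at various angles, distances, and illuminance; random vectors stand in here.
X = rng.normal(size=(300, 128))
y = rng.choice(model_ids, size=300)                 # model-ID label per image

# S720: split the collected data into training data and review data.
X_train, X_review, y_train, y_review = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)       # S730: train the classifier

# S740: confirm the learning result with the review data.
accuracy = accuracy_score(y_review, clf.predict(X_review))
print(f"review accuracy: {accuracy:.2f}")

TARGET_ACCURACY = 0.9                               # assumed target value
if accuracy > TARGET_ACCURACY:
    # S750: the learned model would be serialized and mounted on the terminal.
    joblib.dump(clf, "device_classifier.joblib")
```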
  • a program implementing the machine learning model may be stored in the storage unit.
  • the controller 180 may control the camera unit to photograph the first target device 600.
  • the controller 180 can activate a photographing function of the camera unit.
  • the controller 180 can display a preview screen 810 indicating an image received through the camera unit.
  • the user may point the camera at the first target device 600 to be controlled among the plurality of target devices.
  • the controller 180 may acquire a captured image of the first target device by capturing the image.
  • the controller 180 may acquire the image of the first target device by capturing the image at a predetermined time interval.
  • FIG. 9 is a diagram for describing a method of generating a list of estimated identification information using a captured image, according to an exemplary embodiment.
  • the controller 180 may photograph the first target device (S810). When the captured image of the first target device is obtained, the controller 180 may estimate identification information of the first target device (S820).
  • the controller 180 may input an image captured by the first target device into the learned machine learning model.
  • the learned machine learning model may predict identification information of the first target device by using the input image and output a prediction result.
  • the learned machine learning model may estimate a plurality of pieces of identification information.
  • the learned machine learning model may predict a plurality of pieces of identification information with respect to the photographed first target device and output a prediction result.
  • the controller 180 may generate a list of estimated identification information (S830).
  • the controller 180 can select a target device to activate a connection with the mobile terminal 100 among the plurality of scanned target devices (S460).
  • the controller 180 may select, from among the plurality of target devices, the first target device as the target device whose connection to the mobile terminal is to be activated, using the captured image of the first target device and the plurality of advertisement information received from the plurality of target devices.
  • FIGS. 10 and 11 are diagrams for describing a method of selecting a target device for connection and control using the predicted identification information and the identification information of the plurality of target devices obtained by scanning, according to an embodiment of the present invention.
  • in the following, it is assumed that the mobile terminal 100 receives the advertisement information of the first target device, the advertisement information of the second target device, and the advertisement information of the third target device.
  • since the third target device does not support the target device recognition service through photographing, it is assumed that identification information (a model ID) is not included in the advertisement information transmitted from the third target device.
  • it is also assumed that the first target device is photographed, the captured image of the first target device is input to the machine learning model, and the machine learning model predicts that the target device in the image is the first target device and outputs the identification information of the first target device.
  • the controller 180 may obtain a prediction about identification information of the target device included in the captured image by inputting the captured image to the machine learning model (S1010).
  • the machine learning model may determine that the target device included in the captured image is the first target device and output identification information of the first target device.
  • the controller 180 may generate a list of the scanned plurality of target devices using the plurality of advertisement information (S1020).
  • the advertisement information may not include identification information (model ID).
  • the controller 180 may generate a list of target devices that transmit the advertisement information including the identification information (model ID). That is, as illustrated in FIG. 11B, a target device list including identification information (model ID) of the first target device and identification information (model ID) of the second target device can be generated.
  • the controller 180 may select a target device to activate a connection with the mobile terminal 100 from among a plurality of target devices.
  • the controller 180 may select a target device to activate a connection with the mobile terminal 100 among the plurality of target devices by using the predicted identification information and the identification information included in the plurality of advertisement information.
  • the controller 180 can find the identification information predicted from the scanned target device list (S1030).
  • the machine learning model predicts the identification information of the target device included in the captured image as the identification information of the first target device, and the list of the scanned target devices includes the identification information of the first target device.
  • the controller 180 can select the first target device as the target device to activate the connection.
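  • a minimal sketch of this selection step is shown below, assuming (for illustration) that the scanned list and the prediction output have already been collected as plain Python structures:
```python
def select_target_device(predicted_model_ids, scanned_devices):
    """Pick the scanned device whose advertised model ID matches a prediction.

    predicted_model_ids: model IDs output by the machine learning model,
        ordered from most to least likely.
    scanned_devices: list of dicts built from received advertisement packets,
        e.g. {"model_id": "LG-TV-01", "access_address": "AA:BB:CC:DD:EE:FF"}.
    """
    by_model_id = {d["model_id"]: d for d in scanned_devices if d.get("model_id")}
    for model_id in predicted_model_ids:
        if model_id in by_model_id:
            return by_model_id[model_id]   # device whose connection will be activated
    return None                            # no scanned device matches the prediction

# Hypothetical example values.
scanned = [{"model_id": "LG-TV-01", "access_address": "AA:BB:CC:DD:EE:FF"},
           {"model_id": "LG-HS-02", "access_address": "11:22:33:44:55:66"}]
print(select_target_device(["LG-TV-01"], scanned))
```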
  • the controller 180 can activate a connection with the selected first target device (S470).
  • the controller 180 may control a network interface to transmit a connect request to the first target device in order to be connected to the first target device through the Bluetooth LE method.
  • the access address included in the advertisement information transmitted from the first target device may be used.
  • the first target device may enter a connected mode and establish a Bluetooth LE connection with the mobile terminal 100.
  • the controller 180 can control the network interface unit to read data from or write data to the target device using the attribute protocol.
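  • as a hedged illustration of the connect request and a subsequent attribute-protocol read (using the third-party "bleak" library and the standard Battery Level characteristic as a stand-in; the characteristics of a given target device will differ):
```python
import asyncio
from bleak import BleakClient

BATTERY_LEVEL_UUID = "00002a19-0000-1000-8000-00805f9b34fb"  # standard GATT Battery Level characteristic

async def connect_and_read(address: str) -> None:
    # Entering the context manager issues the connection request to the target device.
    async with BleakClient(address) as client:
        value = await client.read_gatt_char(BATTERY_LEVEL_UUID)  # read via ATT/GATT
        print("battery level:", value[0])

# The address would come from the advertisement information of the selected device.
# asyncio.run(connect_and_read("AA:BB:CC:DD:EE:FF"))
```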
  • the present invention not only estimates the target device using machine learning, but also determines the target device to be connected using the identification information included in the advertisement packet. Accordingly, simply by pointing the camera at the target device to be connected, the user can accurately determine which target device has been photographed and establish a connection with it.
  • if the target device were estimated using machine learning alone, accuracy could be significantly lower.
  • however, by also using the model ID included in the advertisement packet, it is possible to accurately determine which target device has been photographed and to establish a connection.
  • in addition, the conventional process in which the user presses a search button to perform a scan, a list of the scanned devices is displayed, and the user selects the device to be connected can be omitted.
  • likewise, the process in which the list of scanned devices is displayed and the user, who must already know the model name of the target device, selects that model name from the list can be omitted.
  • a connection can be established simply by pointing the camera at the device to be connected, which has the advantage of improving user convenience.
  • the controller 180 may activate the connection with the first target device through a communication method corresponding to a service provided by the first target device.
  • the controller 180 may perform the service by using the already formed Bluetooth LE connection.
  • the controller 180 can handover to an alternative communication means to perform the service.
  • it has been described above that, through the Bluetooth LE, the mobile terminal 100 can obtain information on the alternative communication means supported by the plurality of target devices (for example, Bluetooth BR / EDR, Wi-Fi, Wi-Fi Direct, etc.) and information on the services that can be provided through those alternative communication means (for example, Bluetooth BR / EDR A2DP HFP, Wi-Fi Direct Miracast, Wi-Fi Direct File Transfer, etc.).
  • the controller 180 may request the first target device to activate an alternative communication means and service to be connected.
  • the controller 180 may perform a handover from Bluetooth LE to Bluetooth BR / EDR.
  • the mobile terminal 100 and the first target device may enter a connection state.
  • the mobile terminal 100 may make a read request for the additional information request to the first target device.
  • Read request is a message for requesting information stored in the GATT Database of the first target device.
  • the mobile terminal may receive a read response transmitted by the first target device in response to the read request. Accordingly, the mobile terminal can receive additional information requested by the mobile terminal.
  • the mobile terminal 100 may transmit a write request message to the first target device in order to request the ON of the Bluetooth BR / EDR, which is an alternative communication technology to be connected, and activation of a service.
  • the write request requests writing of the Handover Control Point characteristic in the GATT Database of the first target device; through the write request, the mobile terminal 100 may request activation of some or all of the services that the first target device can support.
  • the first target device may enter the BR / EDR page scan state.
  • the mobile terminal may receive a write response from the first target device in response to the write request.
  • the first target device turns on the Bluetooth BR / EDR, which is an alternative communication means, and activates the service.
  • the first target device may activate all or part of the service requested from the mobile terminal.
  • the mobile terminal receiving the write response message may enter the BR / EDR page state, transmit a page message to the first target device, and establish a Bluetooth BR / EDR connection.
  • the mobile terminal and the first target device can provide a service through the Bluetooth BR / EDR.
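  • a hedged sketch of the write request to the Handover Control Point follows; the characteristic UUID and the value format are placeholders invented for illustration, since the disclosure does not define them:
```python
import asyncio
from bleak import BleakClient

HANDOVER_CONTROL_POINT_UUID = "0000f00d-0000-1000-8000-00805f9b34fb"  # placeholder UUID
ACTIVATE_BR_EDR_A2DP = bytes([0x01, 0x02])  # placeholder opcode + service identifier

async def request_handover(address: str) -> None:
    async with BleakClient(address) as client:
        # A write request with response; the write response indicates that the
        # target device will turn on the alternative transport and activate the service.
        await client.write_gatt_char(HANDOVER_CONTROL_POINT_UUID,
                                     ACTIVATE_BR_EDR_A2DP, response=True)

# asyncio.run(request_handover("AA:BB:CC:DD:EE:FF"))
```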
  • as another example, the controller 180 may perform a handover from Bluetooth LE to Wi-Fi Direct.
  • the mobile terminal 100 and the first target device may enter a connection state.
  • the mobile terminal 100 may make a read request for the additional information request to the first target device.
  • Read request is a message for requesting information stored in the GATT Database of the first target device.
  • the mobile terminal may receive a read response transmitted by the first target device in response to the read request. Accordingly, the mobile terminal can receive additional information requested by the mobile terminal.
  • the mobile terminal 100 may transmit a write request message to the first target device in order to request ON of Wi-Fi Direct, which is an alternative communication technology to be connected, and activation of a service.
  • the write request requests writing of the Handover Control Point characteristic in the GATT Database of the first target device; through the write request, the mobile terminal may request activation of some (for example, Miracast) or all of the services that the first target device can support.
  • upon receiving the write request message, the first target device can enter a listening state of Wi-Fi Direct.
  • the mobile terminal may receive a write response from the first target device in response to the write request.
  • the first target device activates Wi-Fi Direct, which is an alternative communication means, and activates the service.
  • the first target device may activate all or part of the service requested from the mobile terminal.
  • the mobile terminal receiving the write response message may enter a search state of Wi-Fi Direct, transmit a probe request message to the first target device, and form a Wi-Fi Direct connection.
  • accordingly, the mobile terminal 100 and the first target device may provide a service through Wi-Fi Direct.
  • FIG. 12 is a diagram illustrating a stack of the mobile terminal 100 and the first target device 200
  • FIG. 13 is a diagram illustrating a detailed stack of the mobile terminal 100.
  • the mobile terminal can scan and receive advertisement information from the target device.
  • network interface information for providing a model ID or a service may be obtained.
  • the mobile terminal may manage the model ID, recognize the target device from the captured image, and filter the scanned target device. Accordingly, the target device for establishing the connection can be determined.
  • the GATT Client of the mobile terminal may request information stored in the GATT Database from the GATT Server of the target device, and the GATT Server may provide the GATT Client with a service provided by the target device and a network interface for providing the service.
  • the mobile terminal can acquire a network interface for connection and control of the determined target device.
  • the mobile terminal may be connected to the target device using the network interface requested for handover, and may provide content such as audio to the target device or control the target device.
  • FIG. 14 is a diagram illustrating a state change of the target device
  • FIG. 15 is a diagram illustrating a state change of the mobile terminal.
  • when the target device is powered on, the target device enters the BLE ready state. In this state, the target device may transmit an advertisement packet to the mobile terminal over BLE.
  • the mobile terminal may search for the target device in the BLE Ready state, and may obtain Model ID information included in the advertisement packet through the search result.
  • when receiving a service request from the mobile terminal (AI Camera device), the target device enters a transport ready state.
  • the target device turns on a transport capable of performing the corresponding service and enters the transport into a connection standby state.
  • the target device enters the service state and performs the service requested by the seeker.
  • the BLE continues to advertise in preparation for a request of a mobile terminal (AI Camera device).
  • the mobile terminal (AI Camera device) may search for a target device in a service state.
  • the target device enters the BLE ready state.
  • a mobile terminal may enter an initiating state when the power is turned on.
  • the mobile terminal (AI Camera device) moves to the DISCOVERY state and performs a scanning operation.
  • alternatively, the mobile terminal may move from the Initiating State to the DISCOVERY state by a separate user action (for example, selecting a search button) and perform the scanning operation.
  • based on a BLE GATT connection, the mobile terminal exchanges information about which transport (BR / EDR, Wi-Fi, …) it wants to use for the connection and which service it wants to use, and then enters the Transport Ready state.
  • in the transport ready state, the mobile terminal turns on a transport capable of performing the service desired by the user, and places the transport in a connection standby state.
  • the mobile terminal may perform a service in a service state and enter an initiating state when the service is terminated.
  • FIG. 16 is a diagram illustrating a process of establishing a connection with a first target device 600 according to an embodiment of the present invention.
  • the process of generating a list of the scanned target devices by scanning the surrounding target devices by the mobile terminal may be performed before the process of capturing the target devices and predicting identification information of the target devices, or may be performed later.
  • the mobile terminal may first photograph the target device to predict identification information of the target device, and then generate a list of scanned target devices by scanning the peripheral target device.
  • the mobile terminal may first scan a nearby target device to generate a list of scanned target devices, and then photograph the target device to predict identification information of the target device.
  • the controller may display an icon 1610 for connecting a Bluetooth LE.
  • the controller may activate a photographing function of the camera unit and scan a peripheral target device.
  • the controller may display a preview screen 1620 indicating an image received through the camera unit.
  • the user may point the camera at the first target device 600 to be controlled among the plurality of target devices.
  • the controller 180 may acquire a captured image of the first target device by capturing the image. In this case, even if there is no user input for the photographing button, the controller 180 can automatically photograph.
  • the controller 180 may acquire the image of the first target device by capturing the image at a predetermined time interval.
  • the controller 180 may continue photographing until the recognition rate determined by the machine learning model, to which the captured images are input, exceeds a preset value.
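  • a minimal sketch of this repeat-until-recognized loop follows, where capture_frame and predict_model_id are hypothetical callables standing in for the camera unit and the learned machine learning model:
```python
import time

CONFIDENCE_THRESHOLD = 0.8   # preset recognition-rate value (assumed)
CAPTURE_INTERVAL_S = 0.5     # predetermined time interval between captures (assumed)

def recognize_target(capture_frame, predict_model_id, timeout_s: float = 10.0):
    """Keep photographing until the model's recognition rate exceeds the threshold.

    capture_frame: callable returning the current camera frame (hypothetical).
    predict_model_id: callable returning (model_id, confidence) for a frame
        (hypothetical wrapper around the learned machine learning model).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        frame = capture_frame()
        model_id, confidence = predict_model_id(frame)
        if confidence > CONFIDENCE_THRESHOLD:
            return model_id          # recognition succeeded; stop photographing
        time.sleep(CAPTURE_INTERVAL_S)
    return None                      # no confident recognition within the timeout
```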
  • the controller can scan the peripheral target device.
  • the controller may receive advertisement information from a plurality of target devices and generate a list of the plurality of target devices using the received advertisement information.
  • the control unit may obtain, from the list, the first target device corresponding to the predicted identification information.
  • activation of the photographing function and the scanning are performed at the same time when the icon 1610 is selected; however, there is normally a time lag between activation of the photographing function and the actual photographing. Therefore, the generation of the target device list is performed first, and the predicted identification information is output afterwards. Since the list of peripheral target devices has already been generated by the time the identification information of the target device is predicted, the time required for connection with the target device can be minimized.
  • the user selecting the icon 1610 can be regarded as intending to establish a Bluetooth LE connection.
  • since the present invention performs a scan when the user selects the icon 1610, the scan does not need to be performed continuously merely to keep a list of peripheral target devices ready in advance.
  • the controller may display a connection icon 1630 for connecting to the first target device.
  • the controller may establish a connection between the mobile terminal and the first target device.
  • establishing the connection by selecting the connection icon 1630 is only one embodiment; when the first target device corresponding to the predicted identification information is obtained from the list, the control unit may automatically connect the mobile terminal and the first target device without a separate user input.
  • the controller may select a Bluetooth LE method or an alternative communication means to connect the mobile terminal to the first target device.
  • the first target device when the first target device is a headset, the first target device may provide an audio service.
  • in this case, the controller may establish a BR / EDR connection with the first target device and then transmit content stored in, or being played back on, the mobile terminal to the first target device.
  • the first target device when the first target device is a TV, the first target device may provide a control service or a miracast service by the mobile terminal.
  • the control unit may establish a Wi-Fi connection with the first target device and then transmit a control command or a screen of the mobile terminal to the first target device.
  • the first target device when the first target device is a sensor, the first target device may provide a sensing information providing service acquired by the first target device.
  • the controller may establish a Bluetooth LE connection with the first target device and then receive sensing information from the first target device.
  • FIG. 17 is a diagram illustrating a control icon corresponding to a function of the target device.
  • the controller may activate a connection with the first target device.
  • the controller may display a control icon 1721, 1722, or 1723 corresponding to a function of the first target device, or a property icon for checking the property of the first target device.
  • the controller may display a control icon or a property icon on the preview screen 1710.
  • the controller may display an icon for turning on / off the humidifier, an icon for adjusting the intensity of the humidifier, an icon for checking the current humidity, and the like.
  • the mobile terminal may transmit a control command to the first target device, and the first target device may transmit sensing information to the mobile terminal.
  • in this way, the present invention displays control icons without launching a dedicated app for controlling the first target device, allowing the user to directly control the first target device shown on the preview screen.
  • FIG. 18 is a diagram for describing a method of inputting a voice command using a mobile terminal according to an embodiment of the present invention.
  • the controller may activate the camera function to display the preview screen 1810 and activate the connection with the first target device 200 included in the preview screen 1810.
  • the controller may display text of a voice command corresponding to the function of the first target device.
  • the controller may display the text of the voice command on the preview screen 1810.
  • a voice command frequently used by a user may be displayed at the top.
  • text of a voice command based on a user's favorite channel or content information currently being played may be displayed.
  • the user may input a voice command to the mobile terminal 100 or directly input the voice command to the first target device 200.
  • the mobile terminal 100 may transmit a voice command to the first target device 200.
  • the present invention has the advantage of visually displaying the available voice commands, so that the user does not need to memorize the voice commands, thereby improving user convenience.
  • the mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display.
  • the target device described in the present invention may include a television, a headset, a washing machine, a refrigerator, a speaker, an air cleaner, an air conditioner, a humidifier, a dehumidifier, an audio, a sensor, and the like.
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be interpreted as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Abstract

The present invention relates to a mobile terminal. A mobile terminal according to an embodiment of the present invention comprises: a communication unit for communicating with a plurality of target devices by a Bluetooth Low Energy (BLE) method; a camera unit for capturing an image; and a control unit for receiving, from the plurality of target devices, a plurality of pieces of advertisement information each including identification information of the respective target devices, controlling the camera unit so as to photograph a first target device among the plurality of target devices, and activating a connection with the first target device using the image in which the first target device is photographed and the plurality of pieces of advertisement information.
PCT/KR2019/004252 2018-04-10 2019-04-10 Terminal mobile WO2019199043A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2018-0041775 2018-04-10
KR20180041775 2018-04-10
KR10-2018-0123268 2018-10-16
KR1020180123268A KR102102396B1 (ko) 2018-04-10 2018-10-16 이동 단말기

Publications (1)

Publication Number Publication Date
WO2019199043A1 true WO2019199043A1 (fr) 2019-10-17

Family

ID=68163650

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/004252 WO2019199043A1 (fr) 2018-04-10 2019-04-10 Terminal mobile

Country Status (1)

Country Link
WO (1) WO2019199043A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110136096A (ko) * 2010-06-14 2011-12-21 삼성에스디에스 주식회사 근거리 무선 통신 디바이스 및 그 제어방법
KR20120045848A (ko) * 2010-11-01 2012-05-09 삼성전기주식회사 블루투스 장치 간 페어링 장치 및 방법
KR20160007098A (ko) * 2014-07-11 2016-01-20 주식회사 네오랩컨버전스 블루투스 대상 기기 등록 방법 및 그 장치
KR20160023065A (ko) * 2014-08-21 2016-03-03 삼성전자주식회사 적어도 하나 이상의 통신 방식을 선택하기 위한 방법 및 장치
KR101789690B1 (ko) * 2017-07-11 2017-10-25 (주)블루비스 딥 러닝 기반 보안 서비스 제공 시스템 및 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110784940A (zh) * 2019-11-14 2020-02-11 新乡学院 一种用于网络连接的方法和电子装置
US12021719B2 (en) 2022-07-15 2024-06-25 Lg Electronics Inc. Artificial intelligence apparatus and method for providing target device manual thereof


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19785273

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19785273

Country of ref document: EP

Kind code of ref document: A1