WO2013129860A1 - Mobile terminal and network system - Google Patents

Mobile terminal and network system

Info

Publication number
WO2013129860A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile terminal
management
camera
current location
information
Prior art date
Application number
PCT/KR2013/001625
Other languages
French (fr)
Inventor
Duck Gu Jeon
Original Assignee
Lg Electronics Inc.
Priority date
Filing date
Publication date
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2013129860A1 publication Critical patent/WO2013129860A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes

Definitions

  • the interface 270 may be used as a passage allowing the mobile terminal 110 to exchange data with an external device.
  • a broadcast signal reception antenna (not shown) may be disposed at one side of the front or rear case 200-1 or 200-2, in addition to an antenna used for calls. The broadcast signal reception antenna may be installed such that it can be extended from the rear case 200-2.
  • the wireless communication unit 210 may transmit information about the current location and camera direction of the mobile terminal 110 to the network server 120 in requesting object information.
  • the wireless communication unit 210 may receive object information about management objects 130 located within a predetermined angular range in the camera direction from the network server 120.
  • the angular range may be set freely by the user or by default, which should not be construed as limiting the present invention.
  • the object icons 420 may be, but are not limited to, UIs or GUIs.
  • the object icons 420 may provide minimum object information.
  • each of the object icons 420 may indicate at least one of the name of a management object 130 represented by the object icon 420 and the distance from the current location of the mobile terminal 110 to the management object 130.

Abstract

A method for operating a mobile terminal according to an embodiment of the present invention includes receiving a search distance being a search range to be searched from a current location, receiving object information about a management object located within the search distance in a direction of a camera, displaying a preview image of the current location received from the camera, and displaying an object icon representing the management object along with the preview image. A mobile terminal according to an embodiment of the present invention includes a camera, a Global Positioning System (GPS) module for determining a current location, a motion sensor for sensing a direction of the camera, a wireless communication unit for receiving object information about a management object located within a search distance from the current location in the camera direction, and a display for displaying a preview image received from the camera and, upon receipt of the object information, displaying an object icon representing the management object on the preview image. A network system according to an embodiment of the present invention includes a plurality of management objects for controlling an ambient environment, a network server for receiving a plurality of pieces of object information including information about positions of the plurality of management objects from the plurality of management objects and storing the plurality of pieces of object information, and a mobile terminal for receiving the plurality of pieces of object information from the network server and displaying a plurality of object icons indicating the positions of the plurality of management objects on a preview image.

Description

MOBILE TERMINAL AND NETWORK SYSTEM
The present invention relates to a mobile terminal and a network system, and more particularly, to a mobile terminal for enabling a user to view management objects by augmented reality and a remote control method using the same.
A mobile terminal is a portable device capable of performing one or more of a voice call and video call function, an information input/output function, and a data storing function, while being carried with a user.
Because a mobile terminal must remain mobile and portable, it faces limitations in the space that can be allocated to a user interface such as a display or a keypad. Accordingly, even when the mobile terminal is equipped with a front touch screen, the size of that touch screen is limited.
Along with the diversification of its functions, the mobile terminal has evolved into an integrated multimedia player with complex functions such as picture-taking, video recording, playback of music or video, gaming, broadcasting reception, and wireless Internet.
To implement complex functions in such a mobile terminal developed into a multimedia player, new attempts have been made in terms of hardware or software. For example, a user interface environment is built to render function search or function selection user-friendly.
An object of the present invention devised to solve the problem is to provide information about the positions or failures of management objects using a mobile terminal.
The objects of the present invention are not limited to the object set forth above and other objects of the present invention will be apparent from the following description to those skilled in the art.
According to an embodiment of the present invention, a method for operating a mobile terminal includes receiving a search distance being a search range to be searched from a current location, receiving object information about a management object located within the search distance in a direction of a camera, displaying a preview image of the current location received from the camera, and displaying an object icon representing the management object along with the preview image.
According to another embodiment of the present invention, a mobile terminal includes a camera, a Global Positioning System (GPS) module for determining a current location, a motion sensor for sensing a direction of the camera, a wireless communication unit for receiving object information about a management object located within a search distance from the current location in the camera direction, and a display for displaying a preview image received from the camera and, upon receipt of the object information, displaying an object icon representing the management object on the preview image.
According to a further embodiment of the present invention, a network system includes a plurality of management objects for controlling an ambient environment, a network server for receiving a plurality of pieces of object information including information about positions of the plurality of management objects from the plurality of management objects and storing the plurality of pieces of object information, and a mobile terminal for receiving the plurality of pieces of object information from the network server and displaying a plurality of object icons indicating the positions of the plurality of management objects on a preview image.
Details of other embodiments are incorporated in the following detailed description and the attached drawings.
A mobile terminal and a network system according to the present invention have one or more of the following effects.
The network system according to an embodiment can allow a user to readily search for management objects at any place using a mobile terminal.
The mobile terminal according to an embodiment indicates the positions of management objects on a preview image so that an engineer near the management objects can take an immediate action.
It will be appreciated by persons skilled in the art that the effects that can be achieved with the present invention are not limited to what has been particularly described hereinabove and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic view illustrating a network system including a mobile terminal according to an embodiment;
FIG. 2 is a diagram illustrating a signal flow for a call procedure in the network system including the mobile terminal according to an embodiment;
FIG. 3 is a block diagram of the mobile terminal according to an embodiment;
FIG. 4 is a front perspective view of the mobile terminal according to an embodiment;
FIG. 5 is a rear perspective view of the mobile terminal according to an embodiment;
FIG. 6 is a flowchart illustrating a method for operating the mobile terminal according to an embodiment; and
FIG. 7 illustrates a screen displayed on a display according to an embodiment.
The advantages and features of the present invention and the technical configurations of the present invention to achieve them will be apparent with reference to embodiments described in detail with the attached drawings. It is to be clearly understood that the present invention may be implemented in various manners, not limited to embodiments as set forth herein. The embodiments are provided only to render the disclosure of the present invention comprehensive and indicate the scope of the present invention to those skilled in the art. The present invention is defined only by the appended claims. Accordingly, the scope of the invention should be determined by the overall description of the specification. Like reference numerals denote the same components throughout the specification.
A mobile terminal described herein may be any of a portable phone, a smart phone, a laptop computer, a digital broadcasting terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a navigator, etc. The terms “module” and “unit” used to signify components are used herein to help the understanding of the components and thus they should not be considered as having specific meanings or roles. Accordingly, the terms “module” and “unit” may be used interchangeably.
FIG. 1 is a schematic view illustrating a network system including a mobile terminal 110 according to an embodiment.
Referring to FIG. 1, the network system according to the embodiment includes a plurality of management objects 130 for controlling an ambient environment, a network server 120 for receiving object information about the plurality of management objects 130 including position information about the management objects 130 from the plurality of management objects 130 and storing the object information, and the mobile terminal 110 for receiving the object information about the plurality of management objects 130 from the network server 120 and displaying a plurality of object icons indicating the positions of the management objects 130 on a preview image. The mobile terminal 110 may wirelessly communicate with the network server 120.
Wireless communication refers to exchanging information such as signals, codes, images, and voice over the air, without a wired connection. Wireless communication technology may cover wireless networks, wireless Internet, Wireless Local Area Network (WLAN), mobile communication, short-range communication, etc. For example, the mobile terminal 110 and the network server 120 may communicate with each other through an Internet Protocol (IP) network according to a wireless communication scheme.
The network server 120 may communicate with the plurality of management objects 130. The plurality of management objects 130 may include facilities requiring management. For example, the management objects 130 may include, but are not limited to, control facilities connected to air conditioners or lighting, or facilities that directly control an ambient environment, such as air conditioners or lighting.
The network server 120 may collect the object information about the plurality of management objects 130 by communicating with the plurality of management objects 130 wirelessly or wiredly.
The object information may include information needed to manage the management objects 130 in a broad sense. For example, the object information may include information about at least one of the names of the management objects 130, the distances from a current location of the mobile terminal 110 to the management objects 130, the positions of the management objects 130, and failures of the management objects 130.
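The object information fields listed above could be modeled as a simple record. The following Python sketch is purely illustrative, since the patent defines no data format; the field names are assumptions, and a haversine helper derives the terminal-to-object distance from GPS coordinates:

```python
import math
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Illustrative record for a management object's information.

    Field names are assumptions; the patent only lists the kinds of
    information carried (name, position, distance, failure status).
    """
    name: str
    latitude: float
    longitude: float
    has_failure: bool = False

    def distance_m(self, lat: float, lon: float) -> float:
        """Great-circle (haversine) distance in meters from a terminal
        location (lat, lon) to this management object."""
        r = 6_371_000.0  # mean Earth radius in meters
        p1, p2 = math.radians(lat), math.radians(self.latitude)
        dphi = math.radians(self.latitude - lat)
        dlam = math.radians(self.longitude - lon)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))
```

With such a record, the "distance from the current location" item need not be stored at all; it can be recomputed whenever the terminal's location changes.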
The management objects 130 may control an ambient environment. For example, the management objects 130 may control temperature or brightness inside a building. The management objects 130 may be, but are not limited to, indoor or outdoor units of air conditioners, lighting, a plurality of sensors, etc.
The management objects 130 may have addresses and Global Positioning System (GPS) coordinates. Upon receipt of an information request from the network server 120, the management objects 130 may transmit their addresses and GPS coordinates to the network server 120.
FIG. 2 is a call process diagram illustrating a signal processing operation of the network system.
Referring to FIG. 2, the network server 120 may communicate with the plurality of management objects 130 wirelessly or wiredly.
The network server 120 may request object information from the plurality of management objects 130 (s141). While the network server 120 may issue this request at predetermined intervals, the present invention is not limited thereto, and the object information may be requested in many other manners.
Upon receipt of the object information request from the network server 120 (s141), the plurality of management objects 130 may transmit the object information to the network server 120 (s142).
Upon receipt of the object information from the plurality of management objects 130 (s142), the network server 120 may store the received object information (s143).
The mobile terminal 110 may determine its current location (s144). For example, the mobile terminal 110 may include a GPS module that may detect the current location of the mobile terminal 110.
The mobile terminal 110 may sense its direction with respect to a reference direction (s145). For example, the mobile terminal 110 may include a gyro sensor that senses the direction in which the mobile terminal 110 is headed. Current location detection and direction sensing may be performed in any order, which may change according to embodiments. The mobile terminal 110 may determine the direction of a camera based on the direction of the mobile terminal 110 and the position of the camera on the mobile terminal 110.
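As a rough illustration of step s145, a heading can be tracked by integrating the gyro sensor's angular-velocity samples over time. This sketch is a simplification with invented function and parameter names; real terminals typically fuse gyro data with a magnetometer to correct drift:

```python
def integrate_heading(heading_deg: float, rates_dps: list, dt: float) -> float:
    """Update a heading (degrees clockwise from the reference direction)
    by integrating z-axis angular-velocity samples, given in degrees per
    second and taken dt seconds apart."""
    for rate in rates_dps:
        heading_deg = (heading_deg + rate * dt) % 360.0  # wrap at 0/360
    return heading_deg
```

The modulo keeps the heading in [0, 360), so turning clockwise through north (e.g. from 350 degrees) wraps correctly instead of exceeding 360.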
In an embodiment, the mobile terminal 110 may receive a search distance. For example, the mobile terminal 110 may display, on a display, a search distance control icon that recognizes a touch input, so that the user may enter a search distance setting the search range of the management objects 130.
The mobile terminal 110 may include the camera. The mobile terminal 110 may request object information about management objects 130 from the network server 120 (s146). The mobile terminal 110 may transmit the direction in which the camera faces and the current location of the mobile terminal 110 to the network server 120. More specifically, the mobile terminal 110 may request from the network server 120 object information about management objects 130 located within a predetermined angular range of the facing direction of the camera.
The mobile terminal 110 may transmit the search distance to the network server 120.
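Steps s144 to s146 amount to the terminal sending its current location, camera direction, and search distance to the server. A hypothetical request payload might look as follows; all field names are invented for illustration, as the patent specifies no wire format:

```python
import json

# Hypothetical object-information request from terminal to server.
request = {
    "latitude": 37.5665,         # current location (GPS), step s144
    "longitude": 126.9780,
    "camera_heading_deg": 90.0,  # direction the camera faces, step s145
    "search_distance_m": 500.0,  # user-entered search range
}
payload = json.dumps(request)
```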
Upon receipt of the object information request from the mobile terminal 110 (s146), the network server 120 may transmit to the mobile terminal 110 object information received from the management objects 130 with which it communicates (s147). The network server 120 may detect the management objects 130 located within the predetermined angular range of the facing direction of the camera of the mobile terminal 110 and transmit object information about the detected management objects 130 to the mobile terminal 110.
Upon receipt of the search distance from the mobile terminal 110, the network server 120 may transmit to the mobile terminal 110 object information about those management objects 130 located within the search distance from the current location of the mobile terminal 110.
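The server-side filtering described in steps s146 and s147 can be sketched as follows: compute the bearing from the terminal to each management object and keep only objects whose bearing falls within the predetermined angular range around the camera heading. The function names and great-circle bearing formula are illustrative assumptions, not taken from the patent:

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from north, in [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlam)
    return math.degrees(math.atan2(y, x)) % 360.0

def within_view(obj_bearing: float, camera_heading: float,
                half_angle: float) -> bool:
    """True if the object lies within +/- half_angle degrees of the
    camera heading, handling wraparound at 0/360 degrees."""
    diff = (obj_bearing - camera_heading + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle
```

The wraparound arithmetic in `within_view` matters: an object bearing 5 degrees is only 10 degrees away from a camera heading of 355 degrees, even though the raw difference is 350.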
The mobile terminal 110 may receive the object information about the management objects 130 from the network server 120 (s147). The mobile terminal 110 may also display a preview image on a display; the object information reception and the preview image display are not limited to a specific order, which may change according to embodiments.
The mobile terminal 110 may display object icons representing the management objects 130 on the preview image. The mobile terminal 110 may display the object icons on the preview image in such a manner that the directions of the management objects 130 are indicated.
The mobile terminal 110 may display the preview image and the object icons on the display so that a user can visually identify the directions of the management objects 130.
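One simple way to realize such an overlay, offered as an implementation assumption rather than anything the patent prescribes, is to map the difference between each object's bearing and the camera heading onto the horizontal axis of the preview image:

```python
def icon_x(obj_bearing: float, camera_heading: float,
           h_fov_deg: float, screen_w: int) -> int:
    """Horizontal pixel position for an object icon on the preview image.
    An object straight ahead lands at the screen center; objects at the
    edges of the camera's horizontal field of view land at the screen
    edges. Objects outside the field of view map off-screen."""
    # Signed angular offset of the object from the camera heading.
    offset = (obj_bearing - camera_heading + 180.0) % 360.0 - 180.0
    return int(screen_w / 2 + (offset / h_fov_deg) * screen_w)
```

For example, with a 60-degree horizontal field of view on a 600-pixel-wide preview, an object dead ahead draws at x = 300, while an object 30 degrees to the right draws at the right screen edge.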
FIG. 3 is a block diagram of the mobile terminal according to an embodiment of the present invention. With reference to FIG. 3, the mobile terminal according to the embodiment of the present invention will be described below from the viewpoint of functional components.
Referring to FIG. 3, the mobile terminal 110 may include a wireless communication unit 210, an Audio/Video (A/V) input unit 220, a user input unit 230, a sensor unit 240, an output unit 250, a memory 260, an interface 270, a controller 280, and a power supply 290. Two or more components of the mobile terminal 110 may be combined into a single component or a single component thereof may be separated into two or more components in real implementation.
The wireless communication unit 210 may include a broadcasting reception module 211, a mobile communication module 213, a wireless Internet module 215, a short-range communication module 217, and a GPS module 219.
The broadcasting reception module 211 receives at least one of a broadcast signal or broadcasting-related information on a broadcast channel from an external broadcasting management server. The broadcast channel may be any of a satellite channel, a terrestrial channel, etc. The broadcasting management server may refer to a server for generating and transmitting at least one of a broadcast signal or broadcasting-related information or a server for receiving at least one of an already generated broadcast signal or already generated broadcasting-related information and providing the received at least one of the broadcast signal or the broadcasting-related information to terminals.
The broadcasting-related information may be information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may be a TV broadcast signal, a radio broadcast signal, a data broadcast signal, a combination of the TV broadcast signal and the data broadcast signal, or a combination of the radio broadcast signal and the data broadcast signal. The broadcasting-related information may be provided over a mobile communication network. In this case, the mobile communication module 213 may receive the broadcasting-related information. The broadcasting-related information may take various forms. For example, the broadcasting-related information may be a Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG) or a Digital Video Broadcast-Handheld (DVB-H) Electronic Service Guide (ESG).
The broadcasting reception module 211 may receive a broadcast signal through a broadcasting system, particularly a digital broadcast signal through a digital broadcasting system such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), DVB-H, or Integrated Services Digital Broadcast-Terrestrial (ISDB-T). The broadcasting reception module 211 may be adapted to all other broadcasting systems that provide broadcast signals as well as the digital broadcasting system. The broadcast signal and/or broadcasting-related information received at the broadcasting reception module 211 may be stored in the memory 260.
The mobile communication module 213 transmits a wireless signal to and receives a wireless signal from at least one of a Base Station (BS), an external terminal, or a server over a mobile communication network. The wireless signal may include a voice call signal, a video call signal, or text/other various types of data involved in multimedia message transmission and reception.
The wireless Internet module 215 is a built-in or external module for providing wireless Internet connectivity to the mobile terminal 110. The wireless Internet module 215 may operate in conformance to WLAN (Wireless Fidelity (Wi-Fi)), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), etc.
The mobile terminal 110 may communicate with the network server 120 through the wireless communication unit 210. In some embodiments, the mobile terminal 110 may communicate with the network server 120 through the mobile communication module 213 or the wireless Internet module 215.
The mobile terminal 110 may request object information about management objects 130 from the network server 120 and may receive the object information from the network server 120 through the mobile communication module 213 or the wireless Internet module 215.
The short-range communication module 217 is used for short-range communication. For short-range communication, the short-range communication module 217 may conform to Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBee, etc.
The GPS module 219 receives location information from a plurality of GPS satellites. The GPS module 219 may detect the current location of the mobile terminal 110.
The A/V input unit 220 is used to receive an audio signal or a video signal and may include a camera 221 and a microphone 223. The camera 221 processes a video frame of a still image or video acquired from an image sensor in video call mode or camera mode. The processed video frame may be displayed on a display 251.
The video frame processed by the camera 221 may be stored in the memory 260 or transmitted externally through the wireless communication unit 210. Two or more cameras 221 may be provided to the mobile terminal 110 depending on the configuration specification of the mobile terminal 110.
The microphone 223 receives an external audio signal and processes the audio signal to electrical voice data in call mode, recording mode, or voice recognition mode. In the call mode, the processed voice data may be converted to a format transmittable to a BS and output through the mobile communication module 213. Various noise cancellation algorithms are available to the microphone 223 in order to eliminate noise introduced during input of an external audio signal.
The user input unit 230 generates key input data that the user inputs to control the operation of the mobile terminal 110. The user input unit 230 may include a keypad, a dome switch, a (resistive/capacitive) touch pad, etc. to receive a command or information through the user’s push or touch manipulation. The user input unit 230 may be configured to operate in a jog wheel or jog fashion involving key rotation, in a joy stick fashion, or in a finger mouse fashion. Especially when a touch pad is layered with the display 251, the resulting structure may be referred to as a touch screen.
The sensor unit 240 senses the current state of the mobile terminal 110, such as the open or closed state, position, or user touch of the mobile terminal 110 and generates a sensing signal to control the operation of the mobile terminal 110 according to the sensed state. For example, if the mobile terminal 110 is a sliding phone, the sensor unit 240 may sense whether the sliding phone is opened or closed. In addition, the sensor unit 240 may sense whether the power supply 290 is supplying power or whether the interface 270 is coupled with an external device.
The sensor unit 240 may include a proximity sensor 241, a pressure sensor 243, and a motion sensor 245. The proximity sensor 241 may detect an object approaching the mobile terminal 110 or the existence or absence of an object in the vicinity of the mobile terminal 110 without mechanical contact. The proximity sensor 241 may detect a nearby object based on a change in an alternating or electromagnetic field or the variation rate of capacitance. One or more proximity sensors 241 may be provided to the mobile terminal 110 depending on the specification of the mobile terminal 110.
The pressure sensor 243 may determine whether pressure is applied to the mobile terminal 110 and how strong the pressure is. The pressure sensor 243 may be installed at a part of the mobile terminal 110 requiring pressure detection according to the environment in which the mobile terminal 110 is used. If the pressure sensor 243 is installed on the display 251, a touch input on the display 251 may be distinguished from a pressed touch input on the display 251, for which a stronger pressure is applied, according to a signal output from the pressure sensor 243. In addition, in case of the pressed touch input, the magnitude of pressure applied to the display 251 may also be determined from the signal output from the pressure sensor 243.
The motion sensor 245 senses the position or motion of the mobile terminal 110 using an acceleration sensor, a gyro sensor, etc. The acceleration sensor is a device that converts an acceleration change in a given direction to an electrical signal. Along with the development of Micro-ElectroMechanical System (MEMS) technology, acceleration sensors have become popular. They range from sensors that measure large acceleration values, such as those sensing collisions in a vehicle airbag system, to sensors that measure very small acceleration values for use as input means capable of recognizing fine hand motions during gaming. Typically, 2- or 3-axis acceleration sensors are packed into one package, or a single z-axis acceleration sensor is used, depending on the use environment. Accordingly, when an X-axis or Y-axis acceleration sensor is to be used instead of a Z-axis acceleration sensor, the acceleration sensor may be erected on a main substrate by means of a substrate fragment.
The gyro sensor measures an angular velocity and thus senses a direction with respect to a reference direction.
The output unit 250 outputs an audio signal, a video signal, or an alarm signal. The output unit 250 may include the display 251, an audio output module 253, an alarm emitter 255, and a haptic module 257.
The display 251 displays information processed in the mobile terminal 110. For example, when the mobile terminal 110 is in the call mode, the display 251 displays a User Interface (UI) or Graphical User Interface (GUI) related to a call. In the video call mode or the camera mode, the display 251 may display captured or received images separately or simultaneously and may also display a UI or GUI.
As described before, if a touch screen is configured by layering the display 251 with a touch pad, the display 251 may be used not only as an output device but also as an input device capable of receiving information by a user’s touch.
If the display 251 is configured into a touch screen, it may include a touch screen panel, a touch screen panel controller, etc. In this case, the touch screen panel, which is a transparent panel attached on the exterior of the touch screen, may be connected to an internal bus of the mobile terminal 110. The touch screen panel keeps monitoring whether it is touched by a user. Upon detection of a touch input, the touch screen panel provides a signal corresponding to the touch input to the touch screen panel controller. The touch screen panel controller processes the received signal into data and transmits the data to the controller 280 so that the controller 280 may determine the presence or absence of a touch input and may locate a touched point on the touch screen.
The display 251 may be configured into electronic paper (e-paper). E-paper is a kind of reflective display having excellent visual characteristics including a high resolution, a wide viewing angle, and a bright white background, like paper and ink. The e-paper may be formed on any substrate of a material such as plastic, metal, paper, etc. Since the e-paper can keep an image after power is off and does not require a backlight assembly, it lengthens the battery lifetime of the mobile terminal 110. The display 251 may be configured into e-paper using electrostatic-charged hemispherical twist balls, electrophoretic deposition, or microcapsules.
Besides, the display 251 may be configured into at least one of a Liquid Crystal Display (LCD), a thin film transistor-LCD, an Organic Light Emitting Diode (OLED) display, a flexible display, or a 3D display. Depending on implementation of the mobile terminal 110, the mobile terminal 110 may be provided with two or more displays 251. For example, both external and internal displays (not shown) may be mounted to the mobile terminal 110.
The audio output module 253 outputs audio data received from the wireless communication unit 210 or stored in the memory 260 in call termination mode, call mode, recording mode, voice recognition mode, or broadcasting reception mode. The audio output module 253 also outputs an audio signal involved in a function performed by the mobile terminal 110, for example, an audio signal related to a call incoming sound, a message reception sound, etc. The audio output module 253 may include a speaker, a buzzer, etc.
The alarm emitter 255 outputs a signal notifying occurrence of an event to the mobile terminal 110. Events of the mobile terminal 110 include call signal reception, message reception, key signal input, etc. The alarm emitter 255 may output an event notification signal in a form other than an audio signal or a video signal, for example, in the form of vibrations. Upon receipt of a call signal or a message, the alarm emitter 255 may output a signal notifying the call signal or message reception. Upon receipt of a key signal, the alarm emitter 255 may output a feedback signal for the key signal input. Thus, the user becomes aware of the occurrence of an event from a signal output from the alarm emitter 255. A signal notifying the occurrence of an event may also be output through the display 251 or the audio output module 253 of the mobile terminal 110.
The haptic module 257 generates various tactile effects that a user can feel. A major example of the tactile effects is vibrations. When the haptic module 257 generates vibrations as tactile effects, the intensity and pattern of the vibrations may be altered. The haptic module 257 may synthesize different vibration effects and output the synthesized vibrations. Alternatively or additionally, the haptic module 257 may output different vibration effects sequentially.
The haptic module 257 may provide various haptic effects, other than vibration, such as a haptic effect obtained using a pin array that moves perpendicularly to a contact skin surface, a haptic effect obtained by injecting or sucking in air through an injection hole or a suction hole, a haptic effect obtained by giving a stimulus to the surface of the skin, a haptic effect obtained through contact with an electrode, a haptic effect obtained using an electrostatic force, and a haptic effect obtained by realizing the sense of heat or cold using a device capable of absorbing heat or generating heat. The haptic module 257 may be configured to enable the user to recognize a haptic effect using the kinesthetic sense of the fingers or the arms. The mobile terminal 110 may include two or more haptic modules 257 depending on the configuration specification of the mobile terminal 110.
The memory 260 may store programs required for processing and controlling in the controller 280 or temporarily store input or output data (e.g. a phone book, messages, still images, videos, etc.).
The memory 260 may store an input radius that has been set or received object information.
The memory 260 may include at least one of a flash memory type storage medium, a hard disk type storage medium, a multimedia card micro type storage medium, a card type memory (e.g., a Secure Digital (SD) or eXtreme Digital (XD) memory), a Random Access Memory (RAM), or a Read-Only Memory (ROM). The mobile terminal 110 may also operate in conjunction with a Web storage that performs the storage function of the memory 260 over the Internet.
The interface 270 interfaces between the mobile terminal 110 and all external devices connected to the mobile terminal 110. The external devices may include a wired/wireless headset, an external charger, a wired/wireless data port, a memory card, a card socket for a Subscriber Identification Module (SIM) card or a User Identity Module (UIM) card, an audio Input/Output (I/O) port, a video I/O port, an earphone, etc. The interface 270 may receive data or power from such an external device and transfer the data or power to each component of the mobile terminal 110. In addition, the interface 270 may transmit data from the mobile terminal 110 to the external device.
When the mobile terminal 110 is connected to an external cradle, the interface 270 may provide a path for supplying power from the external cradle to the mobile terminal 110 or for transmitting various user-input command signals from the external cradle to the mobile terminal 110.
The controller 280 typically provides overall control to the mobile terminal 110 by controlling the operation of each component. For example, the controller 280 controls and processes voice call, data communication, video call, etc. The controller 280 may include a multimedia player 281 for playing multimedia. The multimedia player 281 may be configured in hardware inside the controller 280 or in software separately from the controller 280.
The power supply 290 may receive power from an external power source or an internal power source and supply power to each component of the mobile terminal 110.
The mobile terminal 110 having the above-described configuration may be configured to operate in communication systems capable of transmitting data in frames or packets, including a wired/wireless communication system or a satellite communication system.
FIG. 4 is a front perspective view of the mobile terminal according to an embodiment of the present invention and FIG. 5 is a rear perspective view of the mobile terminal illustrated in FIG. 4. With reference to FIGS. 4 and 5, the exterior of the mobile terminal according to the present invention will be described, centering on its exterior components. While the following description is given in the context of a bar-type mobile terminal having a front touch screen, it is purely exemplary. Thus it is to be understood that the present invention is also applicable to other types of mobile terminals including a folder type, a swing type, and a slider type.
Referring to FIG. 4, a front case 200-1 and a rear case 200-2 form the exterior case of the mobile terminal 110. A number of electronic parts are mounted in the space defined by the front case 200-1 and the rear case 200-2. The front case 200-1 and the rear case 200-2 may be formed of synthetic resin by injection molding or may be formed of a metal such as stainless steel (STS) or titanium (Ti).
The display 251, a first audio output module 253a, a first camera 221a, and first, second and third user input units 230a, 230b and 230c may be disposed in a body of the mobile terminal 110, particularly on the front case 200-1. Fourth and fifth user input units 230d and 230e and the microphone 223 may be disposed on side surfaces of the rear case 200-2.
If a touch pad is layered with the display 251, the display 251 may serve as a touch screen so that the user may enter various types of information to the mobile terminal 110 by touching the display 251.
The first audio output module 253a may be implemented as a receiver or a speaker. The first camera 221a may be configured to be suitable for capturing a still image or video of the user. The microphone 223 may be configured to properly receive the user's voice or other sounds.
The first to fifth user input units 230a to 230e and later-described sixth and seventh user input units 230f and 230g may be collectively referred to as the user input unit 230, and any means can be employed as the first to seventh user input units 230a to 230g so long as it can operate in a tactile manner.
For example, the user input unit 230 may be implemented as a dome switch or a touch pad that can receive a command or information according to a push or touch manipulation of the user, or may be implemented as a wheel or jog type for rotating a key or as a joystick. In terms of function, the first, second and third user input units 230a, 230b and 230c may operate as function keys for entering a command such as call, mouse point shift, screen scroll, start, or end, the fourth user input unit 230d may be used for selecting an operation mode, and the fifth user input unit 230e may operate as a hot key for activating a special function within the mobile terminal 110.
Referring to FIG. 5, a second camera 221b may be additionally provided on the rear surface of the rear case 200-2, and the sixth and seventh user input units 230f and 230g and the interface 270 may be disposed on one side surface of the rear case 200-2.
The second camera 221b may have a shooting direction substantially opposite to that of the first camera 221a, and may have a resolution (pixel count) different from that of the first camera 221a, which should not be construed as limiting the present invention. A flash (not shown) and a mirror (not shown) may be additionally disposed in the vicinity of the second camera 221b. When an image of an object is captured with the second camera 221b, the flash may illuminate the object. The mirror may allow the user to see himself or herself when he or she wants to capture his or her own image (self-portrait taking) using the second camera 221b.
A second audio output module (not shown) may be additionally provided on the rear case 200-2. The second audio output module may realize a stereo function along with the first audio output module 253a. The second audio output module may also be used in speaker-phone mode.
The interface 270 may be used as a passage allowing the mobile terminal 110 to exchange data with an external device. A broadcast signal reception antenna (not shown) may be disposed at one side of the front or rear case 200-1 or 200-2, in addition to an antenna used for calls. The broadcast signal reception antenna may be installed such that it can be extended from the rear case 200-2.
The power supply 290 may be mounted in the rear case 200-2 to supply power to the mobile terminal 110. The power supply 290 may be, for example, a chargeable battery which can be detachably mounted to the rear case 200-2 for being charged.
While the second camera 221b and the other elements have been described above as being provided on the rear case 200-2, the present invention is not limited thereto. Even if the second camera 221b is not provided, the first camera 221a may be configured to be rotatable and thus to capture an image in the shooting direction of the second camera 221b.
FIG. 6 is a flowchart illustrating a method for operating the mobile terminal according to an embodiment.
Referring to FIG. 6, the method for operating the mobile terminal according to the embodiment includes receiving a search distance between the current location of the mobile terminal and management objects to be searched for (s310), receiving object information about management objects located within the search distance in the direction of the camera (s350), displaying a preview image of the current location, received through the camera (s360), and displaying object icons representing the management objects along with the preview image (s370).
In the step s310 of receiving a search distance between the current location of the mobile terminal 110 and management objects to be searched for, the mobile terminal 110 may display a search distance control icon for inputting a search distance on the display 251. Upon receipt of a touch input on the search distance control icon, the mobile terminal 110 may recognize a search distance corresponding to the touch input.
The method for operating the mobile terminal may further include determining the current location of the mobile terminal 110 by means of the GPS module 219 (s320). The operation of the GPS module 219 has been described before and thus its description will not be provided herein to avoid redundancy.
The method for operating the mobile terminal may further include sensing the direction of the camera 221 by means of the motion sensor 245 (s330). The motion sensor 245 may include an acceleration sensor and a gyro sensor and thus may sense the location or movement of the mobile terminal 110. The mobile terminal 110 may determine the direction of the camera 221 according to the direction of the mobile terminal 110 sensed by the motion sensor 245 and the position of the camera 221 on the mobile terminal 110. The operation of the motion sensor 245 has been described before and thus its description will not be provided herein to avoid redundancy.
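As an illustration of how the camera direction may be derived from the orientation sensed by the motion sensor, consider the following minimal Python sketch. The function name, the yaw input, and the mounting-offset parameter are assumptions for illustration, not defined by the document:

```python
def camera_bearing(device_yaw_deg: float, camera_offset_deg: float = 180.0) -> float:
    """Derive the camera's compass bearing from the sensed device orientation.

    device_yaw_deg:    heading of the terminal body (0 = north), as fused
                       from the acceleration sensor and gyro sensor.
    camera_offset_deg: mounting angle of the camera relative to the body
                       (180 for a rear-facing camera such as 221b).
    """
    return (device_yaw_deg + camera_offset_deg) % 360.0
```

For example, a terminal body facing 30 degrees east of north with a rear-facing camera yields a camera bearing of 210 degrees.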
While the location determination step s320 precedes the direction sensing step s330 in FIG. 6, this should not be construed as limiting the present invention; the order of the steps may be changed according to embodiments.
The method for operating the mobile terminal may further include requesting object information about the management objects 130 from the network server 120 (s340).
In the object information requesting step s340, the mobile terminal 110 may transmit its current location, the direction of the camera 221, and the search distance to the network server 120, and may request from the network server 120 object information about management objects 130 located in the direction of the camera 221.
When the mobile terminal 110 requests the object information from the network server 120 (s340), the network server 120 may transmit the object information to the mobile terminal 110.
The network server 120 may selectively transmit latest object information received from the management objects 130 to the mobile terminal 110. For example, the network server 120 may select object information to be transmitted to the mobile terminal 110 based on the received camera direction, current location, and search distance of the mobile terminal 110. The object information that the network server 120 will transmit to the mobile terminal 110 may be about management objects located within a predetermined angle in the camera direction within the search distance from the current location of the mobile terminal 110.
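The selection the network server performs — management objects within the search distance of the current location and within a predetermined angle of the camera direction — can be sketched in Python as follows. The function and field names (`lat`, `lon`) are hypothetical, and the great-circle (haversine) distance is one reasonable choice for comparing GPS coordinates:

```python
import math

def select_objects(objects, lat, lon, camera_bearing_deg, search_m, half_angle_deg=30.0):
    """Return the records in `objects` lying within search_m metres of
    (lat, lon) AND within +/- half_angle_deg of the camera bearing.
    Each record is assumed to be a dict with 'lat' and 'lon' keys."""
    R = 6371000.0  # mean Earth radius in metres

    def haversine(lat1, lon1, lat2, lon2):
        # Great-circle distance between two GPS coordinates, in metres.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def bearing(lat1, lon1, lat2, lon2):
        # Initial compass bearing from point 1 to point 2, in degrees.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dl = math.radians(lon2 - lon1)
        y = math.sin(dl) * math.cos(p2)
        x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
        return math.degrees(math.atan2(y, x)) % 360.0

    selected = []
    for obj in objects:
        if haversine(lat, lon, obj["lat"], obj["lon"]) > search_m:
            continue  # outside the search distance
        diff = abs((bearing(lat, lon, obj["lat"], obj["lon"])
                    - camera_bearing_deg + 180) % 360 - 180)
        if diff <= half_angle_deg:
            selected.append(obj)  # within the angular range of the camera
    return selected
```

With the camera pointing north, an object 100 m to the north would be selected, while one 100 m to the south (outside the angular range) or 900 m to the east (outside the search distance) would not.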
The network server 120 may store the object information by communicating with the management objects 130 before receiving the object information request from the mobile terminal 110; however, the present invention is not limited thereto, and in some embodiments the network server 120 may receive the object information from the management objects 130 after receiving the object information request from the mobile terminal 110.
The mobile terminal 110 may receive the object information from the network server 120 (s350). The object information received from the network server 120 may be about management objects 130 located within a predetermined angular range in the camera direction. The object information received from the network server 120 may be about management objects 130 located within the search distance from the current location of the mobile terminal 110.
If the mobile terminal 110 fails to receive a signal from the network server 120 in response to the object information request transmitted to the network server 120, the mobile terminal 110 may request the object information again.
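The re-request behavior on a missing response can be sketched as a simple retry loop. The callable `send_request`, the attempt count, and the backoff interval are illustrative assumptions, not specified by the document:

```python
import time

def request_with_retry(send_request, max_attempts=3, timeout_s=2.0):
    """Re-issue the object information request when no response arrives.

    send_request is a hypothetical callable that returns the server's
    response, or None when the request times out."""
    for attempt in range(max_attempts):
        response = send_request()
        if response is not None:
            return response
        time.sleep(timeout_s * (attempt + 1))  # back off before retrying
    return None  # give up after max_attempts
```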
The mobile terminal 110 may display a preview image on the display 251 (s360). The preview image may be captured by the camera 221 in the mobile terminal 110. While the image displaying step s360 follows the object information reception step s350 in FIG. 6, the present invention is not limited thereto; in other embodiments, the search distance reception step s310, the location determination step s320, the direction sensing step s330, the object information requesting step s340, or the object information reception step s350 may be performed after the preview image is displayed.
If the direction of the camera 221 has been changed in the mobile terminal 110, the mobile terminal 110 may determine its current location and the changed direction of the camera 221. The mobile terminal 110 may then re-request, from the network server 120, object information about management objects 130 located within a predetermined angle in the changed direction of the camera 221.
Upon receipt of the object information from the network server 120 (s350), the mobile terminal 110 may display object icons representing the management objects 130 on the preview image.
The object icons may be UIs or GUIs. The object icons may be displayed on the preview image. The object icons may be disposed on the preview image in such a manner that they indicate the directions or positions of the management objects 130.
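One way to dispose an object icon on the preview image so that it indicates the direction of its management object is to map the angular offset between the object's bearing and the camera bearing onto the horizontal axis of the screen. A minimal sketch, assuming a linear mapping across the camera's horizontal field of view (the function and parameter names are illustrative):

```python
def icon_x(object_bearing_deg: float, camera_bearing_deg: float,
           fov_deg: float, screen_width_px: int) -> int:
    """Map a management object's compass bearing to a horizontal pixel
    position on the preview image."""
    # Signed angular offset in (-180, 180]: negative = left of centre.
    diff = (object_bearing_deg - camera_bearing_deg + 180) % 360 - 180
    # Linear mapping: -fov/2 -> left edge, 0 -> centre, +fov/2 -> right edge.
    return int(round((diff / fov_deg + 0.5) * screen_width_px))
```

An object straight ahead lands at the centre of a 600-pixel-wide preview; one at the edge of a 60-degree field of view lands at the screen edge.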
FIG. 7 illustrates the mobile terminal 110 according to an embodiment.
Referring to FIG. 7, the mobile terminal 110 according to the embodiment includes the camera 221, the GPS module 219 for determining the current location of the mobile terminal 110, the motion sensor 245 for sensing the direction of the camera 221, the wireless communication unit 210 for receiving object information about management objects 130 located within a search distance from the current location in the direction of the camera 221, and the display 251 for displaying a preview image 410 received through the camera 221 and, upon receipt of the object information, displaying object icons 420 representing the management objects 130 on the preview image 410.
The camera 221 may capture the preview image 410. The preview image 410 may be displayed on the display 251.
The GPS module 219 may determine the current location of the mobile terminal 110. The motion sensor 245 may sense the direction of the mobile terminal 110. The mobile terminal 110 may determine the direction of the camera 221 based on the direction sensed by the motion sensor 245.
The camera 221 and the motion sensor 245 have been described before and thus their detailed description will not be provided herein.
The wireless communication unit 210 may request object information and receive the object information. The wireless communication unit 210 may communicate with the network server 120. The wireless communication unit 210 may use the mobile communication module 213 or the wireless Internet module 215 according to the communication scheme used for communicating with the network server 120.
For example, if the mobile terminal 110 communicates with the network server 120 over a mobile communication network, the mobile terminal 110 may request object information from the network server 120 through the mobile communication module 213. If the mobile terminal 110 communicates with the network server 120 over an IP network, the mobile terminal 110 may request object information from the network server 120 through the wireless Internet module 215.
The wireless communication unit 210 may transmit information about the current location and camera direction of the mobile terminal 110 to the network server 120 in requesting object information. The wireless communication unit 210 may receive object information about management objects 130 located within a predetermined angular range in the camera direction from the network server 120. The angular range may be set freely by the user or by default, which should not be construed as limiting the present invention.
If the preview image 410 has been changed due to a change in the current location or direction of the mobile terminal 110, the wireless communication unit 210 may re-request object information from the network server 120.
The wireless communication unit 210 may receive the object information from the network server 120 and the display 251 may reconfigure a screen based on the object information.
The display 251 may display the preview image 410. As an image captured by the camera 221 is changed, the display 251 may in turn change the preview image 410. The display 251 may recognize a touch input.
The display 251 may display a search distance control icon 430. The search distance may be a maximum range of management objects 130 to be searched for from the current location of the mobile terminal 110.
While the search distance control icon 430 is displayed on the preview image in FIG. 7, this does not limit the present invention. In FIG. 7, the search distance control icon 430 is shown as a window containing a search distance control button and the current search distance; however, the search distance control icon 430 is not limited to this specific form and may be configured differently in various embodiments. Upon receipt of a touch input on the search distance control icon 430, the mobile terminal 110 may transmit a search distance corresponding to the touch input to the network server 120 and thus receive object information accordingly from the network server 120.
The display 251 may display the object icons 420. The display 251 may display a preview image captured by the camera 221. The display 251 may display the object icons 420 on the preview image 410.
The controller 280 may control display of the object icons 420 on the preview image 410 in such a manner that the positions of the management objects 130 represented by the object icons 420 are indicated.
If the direction of the camera 221 has been changed, the display 251 may display object icons 420 representing management objects 130 located within a predetermined angular range in the changed direction of the camera 221.
A plurality of object icons 420 may be displayed. Each of the object icons 420 may represent a management object 130. The object icons 420 may be disposed on the preview image 410 so as to indicate the positions of the management objects 130 represented by the object icons 420.
The object icons 420 may be, but not limited to, UIs or GUIs. The object icons 420 may provide minimum object information. For example, each of the object icons 420 may indicate at least one of the name of a management object 130 represented by the object icon 420 and the distance from the current location of the mobile terminal 110 to the management object 130.
The management objects 130 may be facilities that control an ambient environment, requiring management. For example, the management objects 130 may be air conditioners for controlling temperature or humidity in their installed places or lightings for controlling brightness in their installed places.
The object information may cover information required to manage the facilities in a broad sense. For example, the object information about a management object 130 may include at least one of the name of the management object, the distance from the current location of the mobile terminal 110 to the management object 130, failure information about the management object 130, and the position of the management object 130.
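A piece of object information carrying these fields could be sketched as a simple data record; the class, field names, and label format below are illustrative assumptions, not defined by the document:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectInfo:
    """One piece of object information as exchanged with the network server."""
    name: str                            # e.g. "Air conditioner 3F-2"
    distance_m: float                    # distance from the terminal's current location
    position: Tuple[float, float]        # (latitude, longitude) of the management object
    failure_info: Optional[str] = None   # present only when a fault is reported

def icon_label(info: ObjectInfo) -> str:
    """Minimal label an object icon may show: the name and the distance
    from the current location to the management object."""
    return f"{info.name} ({info.distance_m:.0f} m)"
```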
While the present invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (22)

  1. A method for operating a mobile terminal, the method comprising:
    receiving a search distance being a search range to be searched from a current location;
    receiving object information about a management object located within the search distance in a direction of a camera;
    displaying a preview image of the current location received from the camera; and
    displaying an object icon representing the management object along with the preview image.
  2. The method according to claim 1, wherein the received object information is about the management object located within a predetermined angular range in the camera direction.
  3. The method according to claim 1, further comprising determining the current location through a Global Positioning System (GPS) module.
  4. The method according to claim 1, further comprising determining the camera direction through a motion sensor.
  5. The method according to claim 1, wherein the reception comprises:
    displaying a search distance control icon for inputting the search distance; and
    receiving the search distance, upon touch on the search distance control icon.
  6. The method according to claim 1, wherein the object icon indicates at least one of a name of the management object and a distance from the current location to the management object.
  7. The method according to claim 1, wherein the object information includes at least one of a name of the management object, a distance from the current location to the management object, and failure information about the management object.
  8. The method according to claim 1, wherein the displaying of object information comprises disposing the object icon on the preview image to indicate a position of the management object.
  9. The method according to claim 1, further comprising displaying an information window including the object information, when the object icon is touched.
  10. A mobile terminal comprising:
    a camera;
    a Global Positioning System (GPS) module for determining a current location;
    a motion sensor for sensing a direction of the camera;
    a wireless communication unit for receiving object information about a management object located within a search distance from the current location in the camera direction; and
    a display for displaying a preview image received from the camera and, upon receipt of the object information, displaying an object icon representing the management object on the preview image.
  11. The mobile terminal according to claim 10, wherein the search distance is a maximum range of management objects to be searched for from the current location, and the display displays a search distance control icon for inputting the search distance.
  12. The mobile terminal according to claim 10, wherein the object icon is disposed on the preview image to indicate a position of the management object.
  13. The mobile terminal according to claim 10, wherein the object icon indicates at least one of a name of the management object and a distance from the current location to the management object.
  14. The mobile terminal according to claim 10, wherein the object information includes at least one of a name of the management object, a distance from the current location to the management object, and failure information about the management object.
  15. The mobile terminal according to claim 10, wherein when the camera direction is changed, object information about a management object located within a predetermined angular range in the changed camera direction is requested.
  16. A network system comprising:
    a plurality of management objects for controlling an ambient environment;
    a network server for receiving a plurality of pieces of object information including information about positions of the plurality of management objects from the plurality of management objects and storing the plurality of pieces of object information; and
    a mobile terminal for receiving the plurality of pieces of object information from the network server and displaying a plurality of object icons indicating the positions of the plurality of management objects on a preview image.
  17. The network system according to claim 16, wherein the plurality of management objects include air conditioners or lightings.
  18. The network system according to claim 16, wherein the plurality of management objects have addresses and Global Positioning System (GPS) coordinates.
  19. The network system according to claim 16, wherein when the object icon is touched, the mobile terminal displays an information window including the object information.
  20. The network system according to claim 16, wherein the mobile terminal receives a search distance being a search range from a current location.
  21. The network system according to claim 20, wherein the mobile terminal displays a search distance control icon for inputting the search distance and the search distance control icon indicates the search distance.
  22. The network system according to claim 16, wherein the object icon indicates at least one of a name of the management object and a distance from the current location to the management object.
Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0021240 2012-02-29
KR1020120021240A KR101867853B1 (en) 2012-02-29 2012-02-29 Mobile phone and network system comprising thereof

Publications (1)

Publication Number Publication Date
WO2013129860A1 true WO2013129860A1 (en) 2013-09-06

Family

ID=49082985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/001625 WO2013129860A1 (en) 2012-02-29 2013-02-28 Mobile terminal and network system

Country Status (2)

Country Link
KR (1) KR101867853B1 (en)
WO (1) WO2013129860A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6815290B2 (en) * 2017-07-13 2021-01-20 ヤンマーパワーテクノロジー株式会社 Object identification system
KR101949361B1 (en) * 2017-09-26 2019-02-18 엘지전자 주식회사 Mobile terminal and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090102859A1 (en) * 2007-10-18 2009-04-23 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
US20110164163A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Synchronized, interactive augmented reality displays for multifunction devices
WO2011096668A2 (en) * 2010-02-05 2011-08-11 (주)올라웍스 Method for providing information on object within view of terminal device, terminal device for same and computer-readable recording medium
US20110319130A1 (en) * 2010-06-28 2011-12-29 Lg Electronics Inc. Mobile terminal and method of operation
US20120041966A1 (en) * 2010-07-15 2012-02-16 Virtual Beam, Inc. Directional information search from a mobile device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101606727B1 (en) * 2010-06-25 2016-03-28 엘지전자 주식회사 Mobile terminal and operation method thereof


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105023020A (en) * 2014-04-21 2015-11-04 三星电子株式会社 Semantic labeling apparatus and method thereof
EP2950252A1 (en) * 2014-04-21 2015-12-02 Samsung Electronics Co., Ltd Semantic labeling apparatus and method thereof
CN105023020B (en) * 2014-04-21 2019-09-13 三星电子株式会社 Semantic tagger device and method thereof
US10762119B2 (en) 2014-04-21 2020-09-01 Samsung Electronics Co., Ltd. Semantic labeling apparatus and method thereof

Also Published As

Publication number Publication date
KR101867853B1 (en) 2018-06-15
KR20130099607A (en) 2013-09-06


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13754969

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13754969

Country of ref document: EP

Kind code of ref document: A1