KR20150017955A - Control apparatus of mobile terminal and method thereof - Google Patents
Control apparatus of mobile terminal and method thereof
- Publication number
- KR20150017955A KR1020130094235A KR20130094235A
- Authority
- KR
- South Korea
- Prior art keywords
- sensor
- display device
- gesture
- video display
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/14—Handling requests for interconnection or transfer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Telephone Function (AREA)
Abstract
The present invention relates to a control apparatus and method for a mobile terminal that can precisely control an external device (for example, an image display device) based on the different object recognition distances of its object detection sensors. The apparatus includes: a communication unit for forming a communication network with the image display device; object detection sensors having different object recognition distances; a gesture sensor for sensing a gesture; and a control unit that generates a first control signal for controlling the image display device when an object is sequentially detected by the object detection sensors and a first predetermined gesture is detected by the gesture sensor, and transmits the generated first control signal to the image display device through the communication network.
Description
The present invention relates to a control apparatus and method for a mobile terminal.
2. Description of the Related Art. Generally, a mobile terminal (portable electronic device) is a portable device having one or more functions such as voice and video communication, information input/output, and data storage. As the functions of mobile terminals have diversified (for example, still or video photographing, music or video file playback, gaming, and broadcast reception), mobile terminals have come to be implemented as complex multimedia players. A mobile terminal according to the prior art is disclosed in U.S. Patent Publication No. US 20100088634.
An object of the present invention is to provide a control apparatus and method for a mobile terminal which can precisely control an external device (for example, an image display device) based on different object recognition distance differences of the object detection sensors.
A control device of a mobile terminal according to embodiments of the present invention includes: a communication unit that forms a communication network with a video display device; object detection sensors having different object recognition distances; a gesture sensor for sensing a gesture; and a control unit that generates a first control signal for controlling the video display device when an object is sequentially detected by the object detection sensors and a first predetermined gesture is detected by the gesture sensor, and transmits the generated first control signal to the video display device through the communication network.
In one embodiment of the present invention, the object detection sensors may include a proximity touch sensor for sensing a proximity touch on a touch screen and a proximity sensor for sensing the presence or absence of the object.
In one embodiment of the present invention, the maximum object recognition distance of the gesture sensor is larger than those of the proximity sensor and the proximity touch sensor, and the maximum object recognition distance of the proximity sensor is larger than that of the proximity touch sensor and smaller than that of the gesture sensor.
As an example related to the present invention, the control unit may generate a first control signal for controlling the image display device when the object is sensed by the gesture sensor, then by the proximity sensor, then by the proximity touch sensor, and the first predetermined gesture is thereafter detected by the gesture sensor.
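The sequential-detection condition described above (gesture sensor, then proximity sensor, then proximity touch sensor, followed by a predetermined gesture) can be sketched as a small state machine. This is a hedged illustration only: the sensor names, the recognition distances, and the signal format are assumptions for the example, not taken from the patent.

```python
# Illustrative sketch of the sequential-detection logic described above.
# Sensor names, distances, and the control-signal format are assumptions.

GESTURE_MAX_CM = 20.0          # gesture sensor: largest recognition distance
PROXIMITY_MAX_CM = 5.0         # proximity sensor: intermediate distance
PROXIMITY_TOUCH_MAX_CM = 2.0   # proximity touch sensor: smallest (1-2 cm)

EXPECTED_ORDER = ["gesture", "proximity", "proximity_touch"]

class SequentialDetector:
    """Tracks whether an approaching object trips the sensors in order."""
    def __init__(self):
        self.seen = []

    def on_sensor(self, sensor_name):
        # Advance only if this sensor is the next one in the expected order;
        # an out-of-order detection resets the sequence.
        if (len(self.seen) < len(EXPECTED_ORDER)
                and sensor_name == EXPECTED_ORDER[len(self.seen)]):
            self.seen.append(sensor_name)
        else:
            self.seen = []
        return self.seen == EXPECTED_ORDER

    def on_gesture(self, gesture, predetermined="first_gesture"):
        # A control signal is generated only after the full sensor
        # sequence AND the predetermined gesture are both observed.
        if self.seen == EXPECTED_ORDER and gesture == predetermined:
            self.seen = []
            return {"type": "first_control_signal"}
        return None

detector = SequentialDetector()
for s in ["gesture", "proximity", "proximity_touch"]:
    detector.on_sensor(s)
signal = detector.on_gesture("first_gesture")
print(signal)  # {'type': 'first_control_signal'}
```

Because the sensors have strictly decreasing maximum recognition distances, an object moving toward the terminal naturally trips them in this order, which is what lets the detector distinguish a deliberate approach from incidental motion.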
As an example related to the present invention, the controller may generate a second control signal for controlling the image display device when the object is sequentially sensed by the object detection sensors and the first predetermined gesture is detected by the gesture sensor, and may transmit the generated second control signal to the video display device through the communication network, the first and second control signals being different from each other.
In one embodiment of the present invention, the mobile terminal further includes a display unit for displaying content different from the content displayed on the video display device, and the control unit may control the content of the video display device when, while the content is displayed on the display unit, the object is sequentially detected by the object detection sensors and a second predetermined gesture is detected by the gesture sensor.
As an example related to the present invention, the controller may capture the content displayed on the video display device when an object is sequentially detected by the object detection sensors and a second preset gesture is detected by the gesture sensor, and may display the captured content on the display unit.
As an example of the present invention, the captured content may include a caption of the content displayed on the video display device, a voice file corresponding to the caption, and an image corresponding to the caption.
As an example related to the present invention, when the subtitle displayed on the display unit is selected, the control unit can execute an application program linked to the subtitle.
In an embodiment of the present invention, the control unit may display, on the display unit, an icon for outputting an audio file corresponding to the caption or an image corresponding to the caption, together with the caption of the content displayed on the video display device.
As an example related to the present invention, when a call signal is received, the controller may display the content displayed on the display unit on the video display device, or may temporarily stop the playback of the content displayed on the video display device.
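The two call-signal behaviors described above (hand the terminal's content off to the display device, or pause the display device's playback) can be sketched as a single handler. The function name, the device representation, and the policy flag are illustrative assumptions, not part of the patent.

```python
# Hedged sketch of the call-signal behavior described above. Which branch
# runs (handoff vs. pause) is an illustrative policy choice.

def on_call_signal(terminal_content, display_device, handoff=True):
    """Handle an incoming call: move content to the TV, or pause the TV."""
    if handoff:
        # Show the mobile terminal's content on the video display device
        # so the user can take the call without losing the content.
        display_device["now_showing"] = terminal_content
        return "handed_off"
    # Otherwise temporarily stop playback on the video display device.
    display_device["paused"] = True
    return "paused"

tv = {"now_showing": "movie", "paused": False}
print(on_call_signal("mobile_clip", tv))         # handed_off
print(on_call_signal("mobile_clip", tv, False))  # paused
```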
A method of controlling a mobile terminal according to embodiments of the present invention includes: forming a communication network with a video display device; sequentially sensing an object by object detection sensors having different object recognition distances; generating a first control signal for controlling the video display device when the object is sequentially detected by the object detection sensors and a first predetermined gesture is detected by a gesture sensor; and transmitting the generated first control signal to the video display device through the communication network.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can accurately control an external device (for example, a video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can control the content of the external device (for example, the video display device) while the user watches desired content through the mobile terminal, based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can easily and quickly capture the content of the external device (for example, the video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can capture the content of the external device (for example, the video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor, and can classify the captured content by category (caption, voice file, image, and the like) so that the user can easily and quickly utilize it.
FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.
FIGS. 2A and 2B are conceptual diagrams of a communication system in which a mobile terminal according to the present invention can operate.
FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to a first embodiment of the present invention.
FIG. 4 is an exemplary view illustrating a mobile terminal connected to an image display apparatus through a wireless communication network according to the first embodiment of the present invention.
FIG. 5 is a view illustrating the maximum object recognition distances of the object detection sensors according to the first embodiment of the present invention.
FIG. 6 is a diagram illustrating a process of controlling an image display apparatus according to the first embodiment of the present invention.
FIG. 7 is a flowchart illustrating a method of controlling a mobile terminal according to a second embodiment of the present invention.
FIGS. 8A and 8B illustrate a process of controlling content of an image display apparatus according to the second embodiment of the present invention.
FIG. 9 is a diagram illustrating an example of a gesture table for controlling content of an image display apparatus according to the second embodiment of the present invention.
FIG. 10 is a flowchart illustrating a method of controlling a mobile terminal according to a third embodiment of the present invention.
FIG. 11 is a view illustrating an example of a process for capturing content of an image display apparatus according to the third embodiment of the present invention.
FIG. 12 is an exemplary view showing captured content according to the third embodiment of the present invention.
FIG. 13 is a diagram illustrating a process of executing a program linked to captured content according to the third embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like reference numerals designate identical or similar elements; redundant description thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments of the present invention, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. In addition, the attached drawings are provided only for easy understanding of the embodiments disclosed in this specification, and the technical idea disclosed herein should not be construed as limited by the attached drawings.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and the like. However, it will be understood by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV and a desktop computer, except where a configuration is applicable only to a mobile terminal.
FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.
The
Hereinafter, the components will be described in order.
The
The
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast related information, or a server that receives previously generated broadcast signals and/or broadcast related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.
The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast-related information may exist in various forms, for example, an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).
For example, the
The broadcast signal and / or broadcast related information received through the
The
The
The
The short-
The
Referring to FIG. 1, an A / V (Audio / Video)
The image frame processed by the
The
The
The
The
The
The
Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the
There may be two or
Also, the
Here, a stereoscopic image represents a 3-dimensional (3D) stereoscopic image, and a 3D stereoscopic image is an image that gives a progressive sense of depth and realism, so that an object on a monitor or screen is perceived as if located in the same space as the viewer. 3D stereoscopic images are implemented using binocular disparity. Binocular disparity refers to the disparity caused by the two eyes being positioned apart from each other; when the two eyes see different images and those images are transmitted to the brain through the retinas and fused, the depth and realism of the stereoscopic image can be perceived.
The
Examples of the autostereoscopic method include a parallax barrier method, a lenticular method, an integral imaging method, and a switchable lens method. Projection methods include a reflection-type holographic method and a transmission-type holographic method.
Generally, a 3D stereoscopic image consists of a left image (left-eye image) and a right image (right-eye image). Depending on how the left and right images are combined into a 3D stereoscopic image, the methods include: a top-down method in which the left and right images are arranged one above the other in a single frame; a left-to-right (side-by-side) method in which the left and right images are arranged side by side in a single frame; a checkerboard method in which pieces of the left and right images are arranged in tile form; an interlaced method in which the left and right images are alternately arranged in columns or rows; and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed over time.
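Two of the frame-packing arrangements listed above can be sketched concretely. This is an illustration only, using tiny nested-list "images" rather than real frames; the function names are assumptions.

```python
# Illustrative sketch (not from the patent) of two left/right frame
# arrangements: side-by-side and top-down packing into one frame.

def side_by_side(left, right):
    """Left and right images packed left/right in one frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def top_down(left, right):
    """Left image on top, right image below, in one frame."""
    return left + right

left = [["L", "L"], ["L", "L"]]
right = [["R", "R"], ["R", "R"]]

print(side_by_side(left, right))  # [['L', 'L', 'R', 'R'], ['L', 'L', 'R', 'R']]
print(top_down(left, right))      # four rows: two of 'L', then two of 'R'
```

A real display pipeline would then unpack the frame and route each half to the corresponding eye (via a parallax barrier, shutter glasses, etc.), which is the step the surrounding text describes.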
In addition, a 3D thumbnail image can be generated by creating a left-image thumbnail and a right-image thumbnail from the left and right images of the original image frame, respectively, and combining them into one image. In general, a thumbnail means a reduced image or a reduced still image. The left-image thumbnail and right-image thumbnail thus generated are displayed with a left-right distance difference on the screen corresponding to the disparity between the left image and the right image, thereby producing a stereoscopic sense of space.
The left and right images necessary for realizing the three-dimensional stereoscopic image can be displayed on the
On the other hand, when a
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
Referring to FIG. 1, a
Hereinafter, for convenience of explanation, the act of recognizing that a pointer is positioned over the touch screen without the pointer contacting it is referred to as a "floating touch" or "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer is proximity-touched.
Examples of the proximity touch sensor (not shown) include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation-type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and the like. When the touch screen is electrostatic (capacitive), it is configured to detect the proximity of the pointer by a change in the electric field caused by the approach of a conductive object (hereinafter referred to as a pointer). In this case, the touch screen (touch sensor) may itself be classified as a proximity sensor. That is, the touch screen (touch sensor) may include a touch sensor for sensing a contact touch and a proximity touch sensor for sensing a proximity (non-contact) touch. The maximum object recognition (sensing) distance of the proximity touch sensor may be 1 to 2 cm.
The proximity touch sensor senses a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
In the case where the three-
The
The proximity sensor and the
The stereoscopic
The
The
The
For example, the
As another example, a photosensor may be stacked on the display element. The photosensor is configured to scan the movement of an object in proximity to the touch screen. More specifically, the photosensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the content placed on the photosensor using an electrical signal that varies according to the amount of light applied to the photodiodes. That is, the photosensor calculates the coordinates of the sensed object according to the amount of change in light, thereby acquiring position information of the sensed object.
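The coordinate calculation described above can be sketched as taking the light-change readings of the photodiode grid and locating the object at their weighted centroid. This is a toy illustration under that assumption; the patent does not specify the exact calculation, and the grid values are made up.

```python
# Toy sketch of the photosensor coordinate calculation described above:
# each grid cell holds a light-change amount, and the object position is
# estimated as the centroid weighted by that amount.

def locate_object(light_change):
    """Return the (row, col) centroid weighted by light-change amount."""
    total = sum(sum(row) for row in light_change)
    if total == 0:
        return None  # no light change -> no object sensed
    r = sum(i * sum(row) for i, row in enumerate(light_change)) / total
    c = sum(j * v for row in light_change for j, v in enumerate(row)) / total
    return (r, c)

grid = [
    [0, 0, 0],
    [0, 4, 0],   # strongest light change at the center cell
    [0, 0, 0],
]
print(locate_object(grid))  # (1.0, 1.0)
```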
The
The
The
The
In addition to the vibration, the
The
The
The
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
In addition, the
If the condition of the mobile terminal satisfies the set condition, the
The
The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing other functions. In some cases, the embodiments described herein may be implemented by the
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.
The software code may be implemented in a software application written in a suitable programming language. The software code is stored in the
Next, a communication system that can be implemented through the
2A and 2B are conceptual diagrams of a communication system in which the
First, referring to FIG. 2A, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces usable by the communication system include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
Hereinafter, for convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention is applicable to all communication systems, including CDMA wireless communication systems.
2A, a CDMA wireless communication system includes at least one
Each of the plurality of
The intersection of sector and frequency assignment may be referred to as a CDMA channel. The
As shown in FIG. 2A, a broadcasting transmitter (BT) 295 transmits a broadcasting signal to
In addition, FIG. 2A illustrates a
Of the typical operation of a wireless communication system, the
Next, a method of acquiring location information of a mobile terminal using a WiFi (Wireless Fidelity) Positioning System (WPS) will be described with reference to FIG. 2B.
A WiFi Positioning System (WPS) 300 uses a WiFi module included in the
The WiFi
The
SSID, RSSI, channel information, Privacy, Network Type, Signal Strength, and Noise Strength based on the location information request message of the
The
In FIG. 2B, the wireless APs connected to the
Next, the
The information of any wireless APs stored in the
Since the
The extracted location information of the
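The database-comparison step described above (matching the scanned wireless APs against a pre-built AP database to estimate the terminal's position) can be sketched as follows. The database contents, the MAC addresses, and the RSSI-to-weight conversion are all illustrative assumptions; the patent text does not specify the estimation formula.

```python
# Hedged sketch of WiFi positioning by database comparison: scanned APs
# (MAC address + RSSI) are matched against a known-AP database, and the
# position is estimated as an RSSI-weighted average of the matched APs'
# stored coordinates. All values here are made-up illustration data.

AP_DATABASE = {
    "aa:bb:cc:00:00:01": (37.50, 127.00),
    "aa:bb:cc:00:00:02": (37.51, 127.01),
}

def estimate_position(scan):
    """scan: list of (mac, rssi_dbm) pairs; returns (lat, lon) or None."""
    matched = [(AP_DATABASE[mac], rssi) for mac, rssi in scan
               if mac in AP_DATABASE]
    if not matched:
        return None
    # Stronger (less negative) RSSI -> larger weight; the +100 dB offset
    # is an arbitrary illustrative choice.
    weights = [max(rssi + 100.0, 1.0) for _, rssi in matched]
    total = sum(weights)
    lat = sum(w * pos[0] for (pos, _), w in zip(matched, weights)) / total
    lon = sum(w * pos[1] for (pos, _), w in zip(matched, weights)) / total
    return (lat, lon)

scan = [("aa:bb:cc:00:00:01", -40), ("aa:bb:cc:00:00:02", -60),
        ("ff:ff:ff:ff:ff:ff", -30)]  # last AP is unknown to the database
print(estimate_position(scan))
```

Real WPS implementations refine this with signal-propagation models and larger AP fingerprint databases, but the match-then-weight structure is the essence of the comparison step the text describes.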
Hereinafter, a control apparatus and method of a mobile terminal capable of accurately controlling an external device (e.g., an image display device) based on the different object recognition distances of the object detection sensors will be described. The image display device may be any of various image display devices, such as a tablet personal computer (PC), a television, a notebook computer, and the like.
3 is a flowchart illustrating a method of controlling a mobile terminal according to a first embodiment of the present invention.
First, the
4 is an exemplary view illustrating a mobile terminal connected to an image display apparatus through a wireless communication network according to a first embodiment of the present invention.
4, the
The
The
The
The
5 is a view illustrating an object recognition maximum distance of the object detection sensors according to the first embodiment of the present invention.
5, the object detection maximum distances of the
The
The
FIG. 6 is a diagram illustrating a process of controlling an image display apparatus according to the first embodiment of the present invention.
6, the
The
On the other hand, when the object is not detected (not detected) in the order of the
Therefore, the control apparatus and method for a mobile terminal according to the first embodiment of the present invention can accurately control an external device (e.g., a video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
7 is a flowchart illustrating a method of controlling a mobile terminal according to a second embodiment of the present invention.
First, the
The
The
When the object is detected in the order of the
The
The
8A and 8B illustrate a process of controlling contents of an image display apparatus according to a second embodiment of the present invention.
8A, the
The
The
On the other hand, when the object is not detected (not detected) in the order of the
The
On the other hand, if the gesture preset by the
8B, the
If the gesture preset by the gesture sensor 145 (for example, a gesture for moving the moving picture forward) continues to be sensed over time, the
The
On the other hand, when the object is not detected (not detected) in the order of the
The
On the other hand, if the gesture detected by the
9 is a diagram illustrating an example of a gesture table for controlling contents of an image display apparatus according to a second embodiment of the present invention.
9, when the gesture table 9-1 for controlling the content of the
Therefore, the control apparatus and method for a mobile terminal according to the second embodiment of the present invention can control the content of an external device (for example, a video display device) while the user watches desired content through the mobile terminal, based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
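A gesture table like the one in FIG. 9 can be sketched as a simple lookup from detected gestures to content-control commands. The specific gesture names and commands below are assumptions for illustration; the text only states that preset gestures map to content controls such as rewinding or fast-forwarding a moving picture.

```python
# Minimal sketch of a gesture table for content control. The gesture
# names and the commands they map to are illustrative assumptions.

GESTURE_TABLE = {
    "swipe_left":  "rewind",
    "swipe_right": "fast_forward",
    "swipe_up":    "volume_up",
    "swipe_down":  "volume_down",
}

def to_control_signal(gesture):
    """Look up the content-control command for a detected gesture."""
    command = GESTURE_TABLE.get(gesture)
    if command is None:
        return None  # unrecognized gesture -> no control signal sent
    return {"target": "video_display_device", "command": command}

print(to_control_signal("swipe_right"))
# {'target': 'video_display_device', 'command': 'fast_forward'}
```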
10 is a flowchart illustrating a method of controlling a mobile terminal according to a third embodiment of the present invention.
First, the
The
The
When the object is detected in the order of the
The
The
The
When the icon for outputting the audio file corresponding to the subtitles or the image corresponding to the subtitles is selected, the
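The captured-content structure described above — a caption plus its corresponding voice file and image, where selecting the caption launches a linked application program — can be sketched as follows. The field names, file names, and launcher behavior are illustrative assumptions.

```python
# Illustrative sketch (names are assumptions) of captured content split
# into caption / voice-file / image categories, with a caption-selection
# handler that runs a linked application program.

from dataclasses import dataclass

@dataclass
class CapturedContent:
    caption: str      # subtitle text of the displayed content
    voice_file: str   # audio corresponding to the caption
    image_file: str   # image corresponding to the caption

def on_caption_selected(content, app_launcher):
    """Run the application program linked to the selected caption."""
    return app_launcher(content.caption)

capture = CapturedContent("Hello there", "cap_0001.wav", "cap_0001.png")
# Hypothetical linked app: a search program invoked with the caption text.
result = on_caption_selected(capture, lambda text: f"search:{text}")
print(result)  # search:Hello there
```

Classifying each capture into these three categories up front is what lets the terminal show a playback icon next to the caption and dispatch a different output path per category, as the surrounding text describes.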
11 is a view illustrating an example of a process for capturing a content of an image display apparatus according to a third embodiment of the present invention.
11, the
12 is an exemplary view showing captured content according to a third embodiment of the present invention.
12, the
When the icon 12-1 for outputting the audio file corresponding to the caption 11-1 or the image corresponding to the caption 11-1 is selected, the
FIG. 13 is a diagram illustrating a process of executing a program linked to a captured content according to the third embodiment of the present invention.
13, when the subtitle 11-1 displayed on the
When the subtitle 11-1 displayed on the
Therefore, the control apparatus and method for a mobile terminal according to the third embodiment of the present invention can easily and quickly capture the content of an external device (for example, a video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to the third embodiment of the present invention can capture the content of the external device (for example, the video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor, and can classify the captured content by category (caption, voice file, image, etc.) so that the user can easily and quickly utilize it.
As described in detail above, the control apparatus and method for a mobile terminal according to embodiments of the present invention can accurately control an external device (for example, a video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can control the content of the external device (for example, the video display device) while the user watches desired content through the mobile terminal, based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can easily and quickly capture the content of the external device (for example, the video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor.
The control apparatus and method for a mobile terminal according to embodiments of the present invention can capture the content of the external device (for example, the video display device) based on the different object recognition distances of the object detection sensors and a gesture sensed through the gesture sensor, and can classify the captured content by category (caption, voice file, image, and the like) so that the user can easily and quickly utilize it.
It will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the invention as defined by the appended claims. The embodiments disclosed herein are therefore intended to illustrate rather than limit the technical idea of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of their equivalents should be construed as falling within the scope of the present invention.
141: Proximity sensor 145: Gesture sensor
151: Display unit 180:
Claims (18)
Object detecting sensors having different object recognition distances;
A gesture sensor for sensing a gesture;
A control unit that generates a first control signal for controlling the image display device when an object is sequentially detected by the object detection sensors and a first predetermined gesture is detected by the gesture sensor, and transmits the generated first control signal to the image display device through the communication network.
Wherein the control unit generates the first control signal for controlling the image display device when the object is sensed by the gesture sensor, then by the proximity sensor, then by the proximity touch sensor, and the first predetermined gesture is thereafter sensed by the gesture sensor.
Wherein the control unit generates a second control signal for controlling the image display device when the object is sequentially sensed by the object detection sensors and the first predetermined gesture is sensed by the gesture sensor, and transmits the generated second control signal to the video display device through the communication network, the first and second control signals being different from each other.
Wherein the control unit controls the content of the video display device when the object is sequentially detected by the object detection sensors in a state where other content is displayed on the display unit and a second predetermined gesture is detected by the gesture sensor.
Wherein the control unit captures the content displayed on the video display device and displays the captured content on the display unit when the object is sequentially detected by the object detection sensors and a second predetermined gesture is detected by the gesture sensor.
Wherein the captured content includes a subtitle of the content displayed on the video display device, an audio file corresponding to the subtitle, and an image corresponding to the subtitle.
Wherein the control unit executes an application program linked to the subtitle when the subtitle displayed on the display unit is selected.
Wherein the display unit displays, together with the subtitle of the content displayed on the video display device, an icon for outputting the audio file corresponding to the subtitle or the image corresponding to the subtitle.
Wherein, when a call signal is received, the control unit displays the content displayed on the display unit on the video display device, or temporarily stops playback of the content displayed on the video display device.
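The apparatus claims above describe a sensing sequence: the object must be detected by each object detection sensor in order (they have different recognition distances), and only then does a predetermined gesture produce a control signal. A minimal sketch of that sequencing, with event names and the signal value as illustrative assumptions, might look like:

```python
# Hedged sketch of the claimed sensing sequence. The sensor order and the
# returned signal name are assumptions for illustration, not the patent's API.

# Object detection sensors, ordered by recognition distance (farthest first).
SENSOR_SEQUENCE = ["proximity_sensor", "proximity_touch_sensor"]

def control_signal(events, gesture="first_gesture"):
    """Return a control signal if the event stream shows the object passing
    each sensor in sequence and then the predetermined gesture; else None."""
    stage = 0
    for event in events:
        if stage < len(SENSOR_SEQUENCE) and event == SENSOR_SEQUENCE[stage]:
            stage += 1  # object reached the next (closer) sensor
        elif stage == len(SENSOR_SEQUENCE) and event == gesture:
            return "FIRST_CONTROL_SIGNAL"  # sequence complete, gesture seen
    return None

# The full sequence yields a signal; a partial sequence yields nothing.
signal = control_signal(
    ["proximity_sensor", "proximity_touch_sensor", "first_gesture"]
)
```

Requiring the full sensor sequence before honoring the gesture is what lets the terminal distinguish a deliberate approach-and-gesture from an incidental hand movement.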
Sequentially sensing an object by object detection sensors having different object recognition distances;
Generating a first control signal for controlling the image display device when the object is sequentially detected by the object detection sensors and a first predetermined gesture is detected by the gesture sensor;
And transmitting the generated first control signal to the video display device through the communication network.
Generating a second control signal for controlling the image display device when the object is sequentially sensed by the object detection sensors and the first predetermined gesture is sensed by the gesture sensor;
And transmitting the generated second control signal to the image display apparatus through the communication network, wherein the first and second control signals are different from each other.
Further comprising controlling the content of the video display device when the object is sequentially detected by the object detection sensors in a state where other content is displayed on the display unit and a second predetermined gesture is detected by the gesture sensor.
Capturing content displayed on the video display device when objects are sequentially detected by the object detection sensors and a second predetermined gesture is detected by the gesture sensor;
And displaying the captured content on a display unit.
Wherein the captured content includes a subtitle of the content displayed on the video display device, an audio file corresponding to the subtitle, and an image corresponding to the subtitle.
Receiving a call signal;
Further comprising displaying the content displayed on the display unit on the video display device, or temporarily stopping the playback of the content displayed on the video display device, based on the received call signal.
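The call-signal step of the method claims offers two behaviors: hand the content over to the video display device, or pause its playback. A small sketch under stated assumptions (the class, method, and mode names are illustrative, not from the patent):

```python
# Illustrative sketch of the call-signal handling in the method claims.
# Class and attribute names are assumptions for demonstration only.

class CallHandler:
    def __init__(self, mode="handover"):
        self.mode = mode          # "handover" or "pause"
        self.display_device = []  # content shown on the video display device
        self.paused = False       # playback state of the video display device

    def on_call_signal(self, terminal_content):
        """React to an incoming call signal."""
        if self.mode == "handover":
            # Display the terminal's content on the video display device,
            # freeing the terminal's screen for the call.
            self.display_device.append(terminal_content)
        else:
            # Temporarily stop playback on the video display device
            # so the user can take the call without missing anything.
            self.paused = True

# Handover mode: the content moves to the video display device.
handler = CallHandler("handover")
handler.on_call_signal("movie")
```

Which behavior applies could be a user setting or depend on what is currently playing; the claims leave that choice open.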
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130094235A KR20150017955A (en) | 2013-08-08 | 2013-08-08 | Control apparatus of mobile terminal and method thereof |
US14/341,331 US20150042580A1 (en) | 2013-08-08 | 2014-07-25 | Mobile terminal and a method of controlling the mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130094235A KR20150017955A (en) | 2013-08-08 | 2013-08-08 | Control apparatus of mobile terminal and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150017955A true KR20150017955A (en) | 2015-02-23 |
Family
ID=53046559
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130094235A KR20150017955A (en) | 2013-08-08 | 2013-08-08 | Control apparatus of mobile terminal and method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150017955A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180056973A (en) * | 2016-11-21 | 2018-05-30 | 연세대학교 산학협력단 | Apparatus and method of providing additional function and feedback to other apparatus by using information of multiple sensor |
- 2013-08-08: KR KR1020130094235A patent/KR20150017955A/en not_active Application Discontinuation
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102099358B1 (en) | Mobile terminal and control method thereof | |
KR102080743B1 (en) | Mobile terminal and control method thereof | |
KR20150026026A (en) | Wearable watch-type device and system having the same | |
KR20150045748A (en) | Control apparatus of mobile terminal and method thereof | |
KR102045893B1 (en) | Mobile terminal and control method thereof | |
KR101763227B1 (en) | Mobile terminal and method for controlling the same | |
KR20150059473A (en) | Mobile terminal and multi-device interworking method using finger print thereof | |
KR102195773B1 (en) | Wearable glass-type device and method of controlling the device | |
KR20140110403A (en) | Mobile terminal and tag identification blocking method thereof | |
KR101988897B1 (en) | Mobile terminal and audio zooming sharing method thereof | |
KR101968526B1 (en) | Mobile terminal and control method thereof | |
KR102018550B1 (en) | Control apparatus of mobile terminal and method thereof | |
KR20140125630A (en) | Control apparatus of mobile terminal and method thereof | |
KR20140102936A (en) | Mobile terminal and control method thereof | |
KR20140085039A (en) | Control apparatus of mobile terminal and method thereof | |
KR102123352B1 (en) | Control apparatus of mobile terminal and method thereof | |
KR20150050077A (en) | Control apparatus of mobile terminal and method thereof | |
KR20150017955A (en) | Control apparatus of mobile terminal and method thereof | |
KR20150012886A (en) | Mobile terminal and information exchangeing method using nfc tag thereof | |
KR20140141300A (en) | Watch type terminal and control method thereof | |
KR102025774B1 (en) | Mobile terminal having operation control function via drawing input and operation control method thereof | |
KR101965742B1 (en) | Power control system and method for controlling the same | |
KR20140090479A (en) | Call control system and method for controlling the same | |
KR20150015801A (en) | Mobile terminal and control method thereof | |
KR20140133078A (en) | Control apparatus of mobile terminal and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |