MX2014015378A - Input apparatus, display apparatus, control method thereof and display system. - Google Patents

Input apparatus, display apparatus, control method thereof and display system.

Info

Publication number
MX2014015378A
Authority
MX
Mexico
Prior art keywords
input
mode
display
touch
change
Prior art date
Application number
MX2014015378A
Other languages
Spanish (es)
Other versions
MX340896B (en)
Inventor
Dong-Heon Lee
Yong-Hwan Kwon
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020120065397A external-priority patent/KR101943419B1/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of MX2014015378A publication Critical patent/MX2014015378A/en
Publication of MX340896B publication Critical patent/MX340896B/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0383Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN

Abstract

An input apparatus, a display apparatus, a control method thereof and a display system are provided. The input apparatus includes: a communication unit which communicates with the display apparatus; a touch sensor which senses a touch input; a motion sensor which senses a motion of the input apparatus; and a controller which determines whether a value of a touch input, sensed by the touch sensor while the input apparatus operates in a gesture mode according to the motion of the input apparatus, is greater than or equal to a first reference value, and changes an input mode of the input apparatus to a touch mode if it is determined that the value of the touch input is greater than or equal to the first reference value.

Description

INPUT APPARATUS, DISPLAY APPARATUS, CONTROL METHOD THEREOF AND DISPLAY SYSTEM FIELD OF THE INVENTION Apparatus and methods consistent with exemplary embodiments relate to an input apparatus, a display apparatus, a control method thereof and a display system, and more particularly, to an input apparatus, a display apparatus, a control method thereof and a display system that displays an input mode of the input apparatus.
BACKGROUND OF THE INVENTION A display system displays an image based on an image signal that is supplied from an external source or generated by the display system itself, and includes a plurality of devices as well as a display apparatus having a display panel to display an image thereon and to execute various functions. The display system can incorporate several types of devices depending on the functions to be executed. For example, the display system may include a display apparatus and an input apparatus that transmits various commands to the display apparatus. Generally, the input apparatus is implemented as a remote controller of the display apparatus.
The input apparatus may include a touch sensor for detecting a user's touch input and a motion sensor for detecting a movement of the input apparatus. The input apparatus can operate in one of a touch mode according to a touch input of the user, and a gesture mode according to a movement of the input apparatus. The two modes are exchanged using a mode switch button on the remote controller.
However, a user must manipulate the change button as designed to change the mode while the remote controller is operating, and such manipulation suspends the interaction, so that the user may be less focused on manipulating the interface and may feel bothered. If a user is not informed about the mode change, the user cannot be aware of the input mode of the input apparatus and may feel uncomfortable when using the input apparatus.
BRIEF DESCRIPTION OF THE INVENTION In accordance with one aspect of an exemplary embodiment, an input apparatus of a display apparatus is provided, the input apparatus including: a communication unit communicating with the display apparatus; a touch sensor that receives a user's touch input; a motion sensor that detects a movement of the input apparatus; and a controller that changes an input mode of the input apparatus to a touch mode in case a user's touch input that is a first reference value or more is detected while the input apparatus operates in a gesture mode in accordance with the movement of the input apparatus.
The controller may change the input mode of the input apparatus to the gesture mode in case a movement of the input apparatus is detected which is a second reference value or more when the input apparatus operates in the touch mode.
The controller can control the communication unit to transmit a command to the display apparatus to display a user interface (UI) concerning the change of the input mode.
The UI can display a process of changing the input mode in the display apparatus in stages.
The UI can be adjusted in stages using a degree of transparency according to the change of the input mode.
The controller can control the communication unit to transmit a command to display the UI opaquely in case the change of the input mode is completed.
The UI may include a gauge that displays the change of the input mode in stages.
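The staged transparency adjustment described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the stage count and the names `STAGES` and `ui_alpha` are assumptions not taken from the patent text.

```python
# Hypothetical sketch: map mode-change progress to a UI opacity shown in
# discrete stages, becoming fully opaque once the change completes.
STAGES = 4  # assumed number of discrete steps shown by the on-screen gauge

def ui_alpha(progress: float) -> float:
    """Return UI opacity (0.0 transparent .. 1.0 opaque) for a given
    mode-change progress in [0.0, 1.0], quantized into stages."""
    progress = min(max(progress, 0.0), 1.0)   # clamp to the valid range
    stage = round(progress * STAGES)          # snap to the nearest stage
    return stage / STAGES                     # 1.0 (opaque) at completion
```

With these assumptions, partial progress yields a partially transparent UI, and a completed mode change yields a fully opaque one, matching the staged behavior described for the gauge.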
The controller may ignore the user's touch input that is less than the first reference value in the gesture mode, and may ignore the movement of the input apparatus that is less than the second reference value in the touch mode.
The first reference value may include a value corresponding to at least one of a drag over a predetermined distance or more, a tap with a predetermined force or more, and a tap repeated a predetermined number of times or more with respect to the touch sensor, and the second reference value may include a value corresponding to at least one of a movement over a predetermined distance or more, with a predetermined force or more, or repeated a predetermined number of times or more, as detected by the motion sensor.
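The mode change triggered by the reference values can be sketched as follows. The concrete threshold values, their units and the name `next_mode` are illustrative assumptions; the patent specifies only that a touch input at or above a first reference value, or a movement at or above a second reference value, triggers the change.

```python
# Illustrative sketch of the described mode change; thresholds are assumed.
TOUCH_MODE, GESTURE_MODE = "touch", "gesture"
FIRST_REF = 30.0    # e.g. drag distance (px) needed to enter touch mode
SECOND_REF = 2.0    # e.g. angular velocity (rad/s) needed to enter gesture mode

def next_mode(mode: str, touch_value: float = 0.0, motion_value: float = 0.0) -> str:
    """Change the input mode only when the opposite kind of input meets its
    reference value; sub-threshold inputs leave the mode unchanged (ignored)."""
    if mode == GESTURE_MODE and touch_value >= FIRST_REF:
        return TOUCH_MODE
    if mode == TOUCH_MODE and motion_value >= SECOND_REF:
        return GESTURE_MODE
    return mode
```

The two thresholds act as a hysteresis: a small touch while gesturing, or a small jiggle while touching, does not flip the mode, which is exactly the ignoring behavior described above.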
In accordance with an aspect of another exemplary embodiment, an input apparatus of a display apparatus is provided, including: a communication unit communicating with the display apparatus; a touch sensor that receives a user's touch input; a motion sensor that detects a movement of the input apparatus; and a controller that changes an input mode of the input apparatus to a gesture mode in case a movement of the input apparatus that is a second reference value or more is detected when the input apparatus operates in a touch mode according to the user's touch input.
The controller can control the communication unit to transmit a command to the display apparatus to display a UI indicating, in stages, a process of changing the input mode of the input apparatus according to at least one of force, number of times and distance of a movement of the input apparatus that is detected by the motion sensor.
The UI may also include information regarding at least one of distance, force and number of times of a movement that is necessary to complete the change of the input mode.
In accordance with an aspect of another exemplary embodiment, a display apparatus is provided including: a display unit that displays an image; a communication unit communicating with an input apparatus of the display apparatus; and a controller controlling the display unit to display a UI thereon with reference to a change of the input mode of the input apparatus to a touch mode in case a command is received to change the input mode from a gesture mode to the touch mode through the communication unit.
The controller can control the display unit to display a UI with reference to changing the input mode to the gesture mode in case a command is received to change the input mode from the touch mode to the gesture mode.
The UI can display a process to change the input mode in stages.
The UI can be adjusted in stages using a degree of transparency according to the change of the input mode.
The controller can control the display unit to display the UI opaquely in case the change of the input mode is completed.
The UI may include a gauge that displays the change of the input mode in stages.
In accordance with an aspect of another exemplary embodiment, a display apparatus is provided including: a display unit that displays an image; a communication unit communicating with an input apparatus of the display apparatus; and a controller controlling the display unit to display a UI with reference to a change of an input mode of the input apparatus to a gesture mode in case a command is received to change the input mode from a touch mode to the gesture mode through the communication unit.
The UI can display the change of the input mode to the gesture mode in stages.
The UI may also include information concerning at least one of distance, force and number of times of a movement that is necessary to complete the change of the input mode.
According to an aspect of another exemplary embodiment, there is provided a method of controlling an input apparatus of a display apparatus, the control method including: operating the input apparatus in a gesture mode as an input mode according to a movement of the input apparatus; detecting a user's touch input; and changing the input mode to a touch mode in case the detected touch input is a first reference value or more.
The control method may further include detecting a movement of the input apparatus; and changing the input mode to the gesture mode in case the motion detected is a second reference value or more.
The control method may further include transmitting a command to the display apparatus to display a UI with reference to a change of the input mode.
The UI can display a process of changing the input mode in the display apparatus in stages.
The UI can be adjusted in stages using a degree of transparency according to the change of the input mode.
The control method may further include transmitting a command to display the UI opaquely in case the change of the input mode is completed.
The UI may include a gauge to display the change of the input mode in stages.
The control method may further include ignoring a movement of the input apparatus that is less than the second reference value in case the input apparatus is changed to the touch mode.
The control method may further include ignoring the user's touch input that is less than the first reference value in case the input apparatus is changed to the gesture mode.
The first reference value may include a value corresponding to at least one of a drag over a predetermined distance or more, a tap with a predetermined force or more, and a tap repeated a predetermined number of times or more with respect to the touch sensor, and the second reference value may include a value corresponding to at least one of a movement over a predetermined distance or more, with a predetermined force or more, or repeated a predetermined number of times or more, as detected by the motion sensor.
In accordance with an aspect of another exemplary embodiment, there is provided a method of controlling an input apparatus of a display apparatus, the control method including: operating the input apparatus in a touch mode as an input mode in accordance with a user's touch input; detecting a movement of the input apparatus; and changing the input mode to a gesture mode in case the detected movement is a second reference value or more.
The control method may further include transmitting a command to the display apparatus to display a UI in stages with reference to a change of the input mode.
The transmission of the command may include transmitting a command to the display apparatus to display information relating to at least one of distance, force and number of times of a movement that is necessary to complete the change of the input mode.
In accordance with an aspect of another exemplary embodiment, there is provided a method of controlling a display apparatus, including: receiving a command from an input apparatus for changing an input mode from a gesture mode to a touch mode; and displaying a UI to change the input mode to the touch mode based on the received command.
The control method may further include receiving a command from the input apparatus to change the input mode from the touch mode to the gesture mode; and displaying a UI to change the input mode to the gesture mode based on the received command.
The control method may also include displaying a process to change the input mode in stages.
The displaying in stages may include adjusting a degree of transparency of the UI in stages according to the change of the input mode.
The control method may further include displaying the UI opaquely in case the change of the input mode is completed.
The displaying in stages may include displaying the change of the input mode in stages through a gauge.
In accordance with an aspect of another exemplary embodiment, a method of controlling a display apparatus is provided, including: receiving a command from an input apparatus for changing an input mode from a touch mode to a gesture mode; and displaying a UI to change the input mode to the gesture mode based on the received command.
The displaying of the UI may include displaying the change of the input mode to the gesture mode in stages.
The displaying of the UI may include displaying information concerning at least one of distance, force and number of times of a movement that is necessary to complete the change of the input mode.
In accordance with an aspect of another exemplary embodiment, a display system is provided including: a display apparatus communicating with an input apparatus and displaying a UI in accordance with a command transmitted by the input apparatus; and an input apparatus that communicates with the display apparatus, includes a touch sensor for receiving a user's touch input and a motion sensor for detecting a movement of the input apparatus, changes an input mode of the input apparatus from a gesture mode to a touch mode in case a user's touch input that is a first reference value or more is detected when the input apparatus operates in the gesture mode, changes the input mode of the input apparatus from the touch mode to the gesture mode in case a movement of the input apparatus that is a second reference value or more is detected when the input apparatus operates in the touch mode, and transmits a command to the display apparatus to display a UI with reference to the change of the input mode.
The UI displays the change of the input mode in stages.
BRIEF DESCRIPTION OF THE FIGURES The above and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying figures, in which: Figure 1 illustrates a display system according to an exemplary embodiment; Figure 2 is a block diagram of the display system according to the exemplary embodiment; Figures 3 to 5 illustrate a screen that is displayed in a display unit of a display apparatus according to a first exemplary embodiment; Figures 6 and 7 are flow charts showing a method of controlling the display system according to the first exemplary embodiment; Figures 8 to 12 illustrate a screen that is displayed in a display unit of a display apparatus according to a second exemplary embodiment; and Figure 13 is a flow chart showing a control method of a display system according to the second exemplary embodiment.
DETAILED DESCRIPTION OF THE INVENTION Exemplary embodiments will now be described in detail with reference to the accompanying figures so that they are readily appreciated by one skilled in the art. Exemplary embodiments may be embodied in various forms without being limited to the exemplary embodiments set forth herein. Descriptions of well-known parts are omitted for clarity, and similar reference numbers refer to similar elements throughout the document.
Figure 1 illustrates a display system 1 according to an exemplary embodiment.
As shown therein, the display system 1 according to the present exemplary embodiment includes a display apparatus 100 that processes an image signal in accordance with a preset processing operation and displays an image based on the processed image signal, and an input apparatus 200 that generates and transmits preset commands/data/information/signals to the display apparatus 100 to remotely control the display apparatus 100.
The present exemplary embodiment refers to the display system 1 including the display apparatus 100, which is implemented as a television (TV) that displays a broadcast image based on a broadcast signal/broadcast information/broadcast data transmitted by transmission equipment of a broadcasting station, and the input apparatus 200, which is implemented as a remote controller. Nevertheless, the type of image that can be displayed by the display apparatus 100 is not limited to a broadcast image, and may include video, still images, applications, on-screen display (OSD), a user interface (UI) for controlling various operations (hereinafter also referred to as "graphic user interface (GUI)") and other images, based on signals/data supplied by various types of image supply sources (not shown).
The inventive concept can also be applied to other display systems that differ from the present exemplary embodiment, such as a system that includes a monitor connected to a computer main body as the display apparatus 100, and the input apparatus 200 communicating with the main body of the computer. That is, the exemplary embodiment which will be described hereinafter is one of several exemplary embodiments of the display system 1, and the inventive concept is not limited thereto.
The input apparatus 200 is an external apparatus that can communicate with the display apparatus 100 wirelessly, and the wireless communication includes infrared communication and radio frequency (RF) communication. The input apparatus 200 transmits a preset command to the display apparatus 100 according to the manipulation of a user.
The input apparatus 200, in accordance with the present exemplary embodiment, includes a touch sensor 210 that detects a user's touch input, and a motion sensor 220 that detects a movement of the input apparatus 200 by a user. The motion sensor 220 includes a gyroscope sensor, an angular velocity sensor and a terrestrial magnetism sensor. Therefore, the display apparatus 100 can control an image displayed by the display unit 130 according to touch information or movement information transmitted by the input apparatus 200.
In the following, configurations of the display apparatus 100 and the input apparatus 200 will be described in detail in accordance with the present exemplary embodiment. Figure 2 is a block diagram of the display system 1 according to the exemplary embodiment.
As shown therein, the display apparatus 100 includes an image receiver 110 configured to receive an image signal, an image processor 120 configured to process the image signal received by the image receiver 110, a display unit 130 configured to display an image thereon based on the image signal processed by the image processor 120, a first communication unit 140 configured to communicate with the outside, a first storage unit 150 configured to store data therein, and a first controller 160 configured to control the display apparatus 100.
The input apparatus 200 includes a touch sensor 210 configured to detect a user's touch input, a motion sensor 220 configured to detect a movement of the input apparatus 200, a second communication unit 240 configured to communicate with the outside, a second storage unit 250 configured to store data therein, and a second controller 260 configured to control the second communication unit 240 to transmit a command to the display apparatus 100 based on a result of detection of the touch sensor 210 or the motion sensor 220. The input apparatus 200 can also include various menu keys, number keys and other buttons that can be manipulated by a user.
In the following, a detailed configuration of the display apparatus 100 will be described.
The image receiver 110 receives an image signal and transmits the received image signal to the image processor 120. The image receiver 110 may be implemented in various configurations corresponding to a standard of a received image signal and an implementation type of the display apparatus 100. For example, the image receiver 110 may receive a radio frequency (RF) signal transmitted in a wireless manner from a broadcasting station (not shown), or may receive an image signal transmitted in a wired manner according to standards such as composite video, component video, super video, SCART and high definition multimedia interface (HDMI). If an image signal is a broadcast signal, the image receiver 110 includes a tuner for tuning the broadcast signal by channel.
The image signal may be input from an external device, for example, a personal computer (PC), audio/video (A/V) devices, a smart phone or a smart pad. The image signal may be the result of data that is transmitted through a network such as the Internet. In this case, the display apparatus 100 may execute network communication through the first communication unit 140 or may further include an additional network communicator. The image signal may be the result of data stored in the first storage unit 150, which is a non-volatile memory such as a flash memory or a hard disk drive. The first storage unit 150 may be provided within the display apparatus 100 or outside the display apparatus 100. If the first storage unit 150 is provided outside the display apparatus 100, the display apparatus 100 may additionally include a connector (not shown) to be connected to the first storage unit 150.
The image processor 120 executes various image processing operations with respect to an image signal. The image processor 120 outputs the processed image signal to the display unit 130, on which an image is displayed based on the image signal.
The types of image processing operations executed by the image processor 120 may include a decoding operation corresponding to various image formats, a de-interlacing operation, a frame refresh rate conversion operation, a scaling operation, a noise reduction operation to improve image quality, a detail enhancement operation, a line scanning operation, etc., but are not limited thereto. The image processor 120 can be implemented as a group of individual elements that execute the above processes independently, or as a system-on-chip (SOC) that integrates the above processes.
The display unit 130 displays an image based on an image signal that is processed by the image processor 120. The display unit 130 can be implemented as various display panels including liquid crystal, plasma, light emitting diode (LED), organic light-emitting diode (OLED), surface-conduction electron emitter, carbon nano-tube and nano-crystal, but is not limited thereto.
The display unit 130 may additionally include additional components according to its embodiment type. For example, if the display unit 130 includes liquid crystal, the display unit 130 includes an LCD panel (not shown), a backlight unit (not shown) emitting light to the LCD panel, and a panel driving substrate (not shown) that drives the panel.
The display unit 130 according to the present exemplary embodiment displays a user interface (UI) indicating an input mode of the input apparatus 200. The UI includes text and icons (graphics), wherein the text includes numbers and letters. If the display unit 130 is a touch screen, a user may touch the UI displayed by the display unit 130, which then transmits a command corresponding to the touch input to the first controller 160.
The first communication unit 140 transmits commands, data, information and / or signals from the input device 200 to the image processor 120. The first communication unit 140 can transmit commands, data, information and / or signals from the image processor 120 to the input device 200. The first communication unit 140 can use wireless communication between the display apparatus 100 and the input device 200. The wireless communication can include infrared communication, RF communication, Zigbee and Bluetooth.
In accordance with the present exemplary embodiment, the first communication unit 140 is included in the display apparatus 100, but the inventive concept is not limited thereto. Alternatively, the first communication unit 140 can be implemented as a dongle or a module and can be removably mounted on a connector (not shown) of the display apparatus 100 connected to the image processor 120.
The first storage unit 150 stores data therein according to a control of the first controller 160. The first storage unit 150 includes a non-volatile storage medium such as a flash memory or a hard disk drive. The first controller 160 has access to the first storage unit 150, and reads, writes, modifies, deletes and updates data stored in the first storage unit 150.
The data that is stored in the first storage unit 150 includes an operating system (OS) for operating the display apparatus 100, various applications that are executed on the OS, and additional image data and other data.
The first controller 160 controls various elements of the display apparatus 100. For example, the first controller 160 controls the image processor 120 to execute an image processing operation, and executes a corresponding control operation with respect to a command from the input apparatus 200 so as to control all the operations of the display apparatus 100.
The first controller 160, in accordance with the present exemplary embodiment, controls the display unit 130 to display UIs (31, 32, 35 and 36 in Figures 3 to 5, and 61, 62 and 63 in Figures 8 to 12) indicating an input mode (for example, touch mode or gesture mode) of the input apparatus 200, based on a command transmitted by the input apparatus 200.
In the following, a detailed configuration of the input apparatus 200 will be described.
The input apparatus 200 according to the present exemplary embodiment operates in one of a plurality of input modes, which includes a touch mode in which the input apparatus 200 operates in accordance with a user's touch input to a touch detection unit, and a gesture mode in which the input apparatus 200 operates in accordance with a movement of the input apparatus 200.
For example, the input apparatus 200 may move a page of a screen displayed by the display apparatus 100 according to a movement thereof in the gesture mode, and may select an item from a screen displayed by the display apparatus 100 by a user's touch input in the touch mode.
In exemplary mode, the input apparatus 200 is configured to operate in the gesture mode with respect to a larger movement such as the page shift, and to operate in the tactile mode with respect to a smaller movement such as the selection of the element. However, the exemplary modality it is not limited to the previous thing, and diverse other entrances of the users can apply.
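For illustration only, the division of labor between the two input modes can be sketched as follows. The mode names, event names and handler labels below are assumptions of this sketch, not elements of the disclosed apparatus:

```python
from enum import Enum

class InputMode(Enum):
    TOUCH = "touch"      # smaller movements: item selection via the touch sensor
    GESTURE = "gesture"  # larger movements: page shifts via the motion sensor

def route_input(mode, event):
    """Route a detected event according to the current input mode.

    Illustrative only: in gesture mode a movement shifts the page, in
    touch mode a tap selects an item; anything else is ignored.
    """
    if mode is InputMode.GESTURE and event == "move":
        return "shift_page"
    if mode is InputMode.TOUCH and event == "tap":
        return "select_item"
    return "ignored"
```

A caller would hold the current mode and pass each sensor event through this dispatcher; only the mode decides which action a given event produces.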
The touch sensor 210 includes a touch detection unit that detects a user's touch input. The user's touch input may vary, and includes tapping, clicking, double clicking, dragging, sliding and flicking. If the touch sensor 210 detects a user's touch input, the second controller 260 generates and transmits to the display apparatus 100 a command corresponding to the detected touch input.
The motion sensor 220 includes a motion detection unit which detects a movement of the input apparatus 200. The motion detection unit includes a gyroscope sensor, an angular velocity sensor and a geomagnetic sensor. The motion sensor 220 measures the acceleration and angular velocity of the input apparatus 200 with respect to three or six axes from a current position of the input apparatus 200, and transmits the result of the measurement to the second controller 260. The second controller 260 generates and transmits to the display apparatus 100, through the second communication unit 240, a command corresponding to the movement information of the input apparatus 200.
The second communication unit 240 communicates with the display apparatus 100, and uses a communication standard corresponding to that of the first communication unit 140. For example, the second communication unit 240 transmits and receives a predetermined command through wireless communication, which includes infrared communication, RF communication, Zigbee and Bluetooth.
Accordingly, the display apparatus 100 can execute a control operation corresponding to the command.
The second storage unit 250 stores data therein, which are accessed by the second controller 260. The data of the current input mode of the input apparatus 200 stored in the second storage unit 250 are read, stored and updated, and posture information used to detect a movement of the input apparatus 200 is likewise read, stored and updated.
The second controller 260 calculates movement information of the input apparatus 200 based on the angular velocity and acceleration of the input apparatus 200 detected by the motion sensor 220. The second controller 260 compares the calculated movement information with the posture information stored in the second storage unit 250, adjusts the movement information, and transmits the result as a predetermined command to the display apparatus 100 through the second communication unit 240.
The second storage unit 250 is implemented as a non-volatile storage medium such as a flash memory. The second controller 260 has access to the second storage unit 250, and reads, writes, modifies, deletes and/or updates the data stored in the second storage unit 250.
The second controller 260 changes the input mode of the input apparatus 200 according to one of a user's touch input to the touch sensor 210 and a movement of the input apparatus 200 detected by the motion sensor 220, and controls the second communication unit 240 to transmit a corresponding command to the display apparatus 100.
Figures 3 to 5 illustrate a screen that is displayed in the display unit 130 of the display apparatus 100 according to the first exemplary embodiment. The displayed UIs 31, 32, 35 and 36 may include current input mode information and input mode change information of the input apparatus 200 (e.g., a change from the touch mode to the gesture mode and vice versa).
According to the first exemplary embodiment, the second storage unit 250 stores therein a first reference value for a user's touch input and a second reference value for a movement of the input apparatus 200. The first reference value corresponds to at least one of a drag of a predetermined distance or more, a touch with a predetermined force or more, and a touch occurring a predetermined number of times or more with respect to the touch sensor 210. The second reference value corresponds to at least one of a movement of a predetermined distance or more, a movement with a predetermined force or more, and a movement occurring a predetermined number of times or more detected by the motion sensor 220. The force may be determined by one of, or a combination of, the distance, speed and area of the movement of the input apparatus 200.
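The passage above only states that a force may be determined from the distance, speed and area of a movement, combined in some way; one plausible reading is a weighted sum, sketched below. The linear weighting and the weight values are illustrative assumptions, not part of the disclosure:

```python
def movement_force(distance, speed, area, w_dist=1.0, w_speed=1.0, w_area=1.0):
    """Estimate a scalar 'force' of a movement from its distance, speed
    and swept area. The linear combination is an assumption made for
    illustration; the embodiment only says these may be combined."""
    return w_dist * distance + w_speed * speed + w_area * area

def reaches_reference(value, reference):
    """A detected input is valid when it is the reference value or more."""
    return value >= reference
```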
In the present exemplary embodiment, the first and second reference values may be configured as the same value or as different values.
The second controller 260 controls the second communication unit 240 to transmit to the display apparatus 100 one of a command corresponding to a user's touch input that is the first reference value or more detected by the touch sensor 210, and a command corresponding to a movement of the input apparatus 200 that is the second reference value or more detected by the motion sensor 220.
The transmitted command may include a command to display a UI indicating one of the plurality of input modes of the input apparatus 200. For example, if the motion sensor 220 detects a movement of the input apparatus 200 that is the second reference value or more while the input apparatus 200 operates in the gesture mode, the second controller 260 can control the second communication unit 240 to transmit a command to display in the display unit 130 the UIs, i.e., the messages 31 and 32, indicating the gesture mode as the input mode of the input apparatus 200, together with a command corresponding to the detected movement (for example, a page shift command). The displayed messages 31 and 32 include icons and text.
The first controller 160 of the display apparatus 100 receives the command from the input apparatus 200 through the first communication unit 140, and controls the display unit 130 to display therein the shifted screen as in Figure 4, in accordance with the command transmitted by the input apparatus 200. As shown in Figure 4, the first controller 160 controls the display unit 130 to display the messages 31 and 32, such as icons and text, indicating the current input mode of the input apparatus 200.
The command that is transmitted by the input device 200 to the display apparatus 100 may include a command that changes the input mode of the input apparatus 200 from the current mode of operation to one of the plurality of input modes.
For example, while the input apparatus 200 operates in the gesture mode, the second controller 260 may change the input mode of the input apparatus 200 to the touch mode if a touch input that is the first reference value or more is detected by the touch sensor 210 of the input apparatus 200 (i.e., if a value of a touch input detected by the touch sensor 210 of the input apparatus 200 is greater than or equal to the first reference value), as shown in Figure 4, and, as shown in Figure 5, may control the second communication unit 240 to transmit a command to display in the display unit 130 the messages 33 and 34 indicating the change of the input mode of the input apparatus 200 to the touch mode. The displayed messages 33 and 34 may include icons and text.
The first controller 160 of the display apparatus 100 receives the command from the input apparatus 200 through the first communication unit 140, and controls the display unit 130 to display therein a plurality of elements 33 and 34 which are selectable as the input apparatus 200 is changed to the touch mode, as in Figure 5, according to the command transmitted by the input apparatus 200. As shown in Figure 5, the first controller 160 controls the display unit 130 to display thereon the messages 35 and 36, such as icons and text, indicating the current input mode of the input apparatus 200.
Figures 3 and 5 illustrate the exemplary embodiment in which the messages 31, 32, 35 and 36 indicating the current input mode are displayed. However, the display apparatus 100 may display a message indicating the change of the input mode from one mode (e.g., the gesture mode) to another mode (e.g., the touch mode), such as "switched to touch mode".
A user may select the element 34 from the plurality of elements 33 and 34 displayed in the display unit 130 by manipulating the touch sensor 210 of the input apparatus 200, such as by tapping and dragging. The second controller 260 controls the second communication unit 240 to transmit to the display apparatus 100 a command corresponding to the detection result of the touch sensor 210.
The first controller 160 controls the display unit 130 to display a selection result of the user in accordance with the command transmitted by the input apparatus 200. The first controller 160 controls the display unit 130 to highlight the selected element 34, as in Figure 5, through focusing, pointing, a cursor, highlighting or flickering.
If buttons such as menu keys, number keys and arrow keys (four direction keys) are additionally provided in the input apparatus 200, a user may select a UI displayed in the display unit 130 by manipulating the buttons as well as the touch sensor 210.
According to the exemplary embodiment, after the input mode has been changed to the gesture mode by the detected movement of the input device 200 which is the second reference value or more, the second controller 260 ignores the touch input of a user which is less than the first reference value.
As a result, in accordance with the present exemplary embodiment, if one of the touch sensor 210 and the motion sensor 220 detects an input that is the first or second reference value or more, an input that is less than the first or second reference value detected by the other of the touch sensor 210 and the motion sensor 220 is ignored. Even after one of the touch sensor 210 and the motion sensor 220 detects an input that is the first or second reference value or more, an input that is the first or second reference value or more and that is detected by the other of the touch sensor 210 and the motion sensor 220 becomes a valid input, and thus the input mode is changed.
According to the embodiment, the input apparatus 200 that supports both the touch sensor 210 and the motion sensor 220 operates in only one input mode at a time and, therefore, the possibility of operational errors is reduced. Also, the input modes are switched by a touch that is the first reference value or more (i.e., if a detected touch input value is greater than or equal to the first reference value) or by a movement that is the second reference value or more (i.e., if a detected movement value is greater than or equal to the second reference value), and therefore the mode is changed naturally without artificial manipulation by a user, thus reducing discomfort.
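The arbitration described above can be sketched as a single decision function. The threshold values, the string labels for modes and sources, and the tuple return shape are assumptions of this sketch:

```python
FIRST_REFERENCE = 10.0   # touch threshold (illustrative value)
SECOND_REFERENCE = 10.0  # motion threshold (may equal or differ from the first)

def handle_input(mode, source, value):
    """Return (new_mode, accepted) for an input of the given strength.

    In gesture mode, a touch input below the first reference value is
    ignored; a touch at or above it becomes valid and switches the
    apparatus to touch mode. Symmetrically, in touch mode a movement at
    or above the second reference value switches to gesture mode.
    """
    if mode == "gesture":
        if source == "touch":
            if value >= FIRST_REFERENCE:
                return "touch", True   # valid input: mode is changed
            return "gesture", False    # sub-threshold touch is ignored
        return "gesture", True         # motion drives gesture mode normally
    else:  # touch mode
        if source == "motion":
            if value >= SECOND_REFERENCE:
                return "gesture", True
            return "touch", False      # sub-threshold movement is ignored
        return "touch", True           # touch drives touch mode normally
```

Because only one sensor can cross its threshold at a time in this scheme, the apparatus effectively operates in a single input mode, matching the reduced-error behavior described above.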
In the following, a method of controlling the display system 1 according to the first exemplary embodiment will be described.
Figures 6 and 7 are flowcharts showing the control method of the display system 1 according to the first exemplary embodiment.
As shown therein, the input apparatus 200 can operate in the gesture mode as the input mode in which the input apparatus 200 operates in accordance with a detection result of the motion sensor 220 (S401).
The input apparatus 200 detects a user's touch input to the touch sensor 210 while operating in the gesture mode (S403). The gesture mode may have been entered in step S507 of Figure 7, which will be described later.
The second controller 260 of the input apparatus 200 determines whether the touch input detected in step S403 is the first reference value or more, that is, whether a value of the detected touch input is greater than or equal to the first reference value (S405). The first reference value may correspond to at least one of a drag of a predetermined distance or more, a touch with a predetermined force or more, and a touch occurring a predetermined number of times or more with respect to the touch sensor 210, and is stored in advance in the second storage unit 250.
If it is determined in operation S405 that the touch input is the first reference value or more, the second controller 260 changes the input mode of the input apparatus 200 to the touch mode, and controls the second communication unit 240 to transmit to the display apparatus 100 a command corresponding to the touch input detected in step S403 (S407).
The first controller 160 of the display apparatus 100 executes an operation corresponding to the command received in step S407, and controls the display unit 130 to display the messages 35 and 36 indicating the input mode of the input apparatus 200 (S409). The messages displayed in step S409 may include messages indicating that the input mode of the input apparatus 200 has been switched to the touch mode.
In case it is determined in operation S405 that the touch input is less than the first reference value, the second controller 260 ignores the touch input detected in operation S403.
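The S401-S409 sequence of Figure 6 can be traced as a small function. The threshold value, the returned message text and the tuple return shape are illustrative assumptions:

```python
def gesture_to_touch_flow(touch_value, first_reference=10.0):
    """Trace the S401-S409 flow of Figure 6 (values are illustrative).

    Returns the resulting input mode and the message the display
    apparatus would show, or None when the touch is ignored (the 'no'
    branch of S405)."""
    mode = "gesture"                        # S401: operating in gesture mode
    # S403: a touch input is detected; S405: compare with first reference
    if touch_value >= first_reference:
        mode = "touch"                      # S407: mode changed, command sent
        message = "switched to touch mode"  # S409: display indicates new mode
        return mode, message
    return mode, None                       # sub-threshold touch is ignored
```

The flow of Figure 7 is symmetric, with the motion sensor, the second reference value, and a switch back to the gesture mode.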
The exemplary embodiment in Figure 7 differs from the exemplary embodiment in Figure 6 in that the input apparatus 200 is switched to the gesture mode while operating in the touch mode.
As shown in Figure 7, the input apparatus 200 can operate in the touch mode as the input mode in which the input apparatus 200 operates according to the detection result of the touch sensor 210 (S501). The touch mode may have been entered in step S407 of Figure 6.
While in the touch mode, the input apparatus 200 detects a movement of the input apparatus 200 through the motion sensor 220 (S503).
The second controller 260 of the input apparatus 200 determines whether the movement detected in step S503 is the second reference value or more, that is, whether a detected movement value is greater than or equal to the second reference value (S505). The second reference value may correspond to at least one of a movement of a predetermined distance or more, a movement with a predetermined force or more, and a movement occurring a predetermined number of times or more detected by the motion sensor 220, and is stored in advance in the second storage unit 250. In the present exemplary embodiment, the second reference value may be set as the same value as, or a different value from, the first reference value in step S405 of Figure 6.
If it is determined in operation S505 that the movement input is the second reference value or more, the second controller 260 changes the input mode of the input apparatus 200 to the gesture mode, and controls the second communication unit 240 to transmit to the display apparatus 100 a command corresponding to the movement input detected in step S503 (S507).
The first controller 160 of the display apparatus 100 executes an operation according to the command transmitted in step S507, and controls the display unit 130 to display the messages 31 and 32 indicating the input mode of the input apparatus 200 (S509). The messages displayed in step S509 may include messages indicating that the input mode of the input apparatus 200 has been changed to the gesture mode.
In case it is determined in step S505 that the movement input is less than the second reference value, the second controller 260 ignores the movement input detected in step S503.
According to the first exemplary embodiment, because the input modes are switched by a touch or movement that is the corresponding reference value or more, the interaction is not interrupted and the input mode is changed naturally without artificial manipulation by a user, thus improving user convenience.
Figures 8 to 12 illustrate a screen that is displayed in the display unit 130 of the display apparatus 100 according to a second exemplary embodiment.
The screen according to the second exemplary embodiment is characterized by displaying a process of changing the input mode in stages, compared to the screen according to the first exemplary embodiment shown in Figures 3 to 5. Because elements other than the UIs displayed in the display unit 130 have the same reference numerals and names as those according to the first exemplary embodiment, their detailed description will not be repeated here.
According to the second exemplary embodiment, the second storage unit 250 stores therein a third reference value for a user's touch input and for a movement of the input apparatus 200. The third reference value is a plurality of values which are set in stages to change the input mode.
More specifically, the third reference value includes a plurality of values set in stages (e.g., a first input value, a second input value and a third input value) corresponding to a movement with a predetermined force or more, occurring a predetermined number of times or more, and of a predetermined distance or more detected by the motion sensor 220, and corresponding to a drag of a predetermined distance or more, a tap or click occurring a predetermined number of times or more, and a touch with a predetermined force or more with respect to the touch sensor 210. The force is determined by at least one of, or a combination of, the distance, speed and area of the movement of the input apparatus 200.
The second controller 260 controls the second communication unit 240 to transmit a command to the display apparatus 100 to display a UI referring to the change of the input mode of the input apparatus 200.
The second controller 260 can control the second communication unit 240 to transmit a command to display a UI which displays, in stages, the change of the input mode of the input apparatus 200 to the gesture mode according to at least one of the distance, force and number of times of the movement of the input apparatus 200 detected by the motion sensor 220.
For example, while the display apparatus 100 communicates with the input apparatus 200 as in Figure 8, a user may shake the input apparatus 200 in order to use it in the gesture mode.
The motion sensor 220 can detect the movement of the input apparatus 200, and the second controller 260 can control the second communication unit 240 to transmit a command to display in the display apparatus 100 a UI referring to the input mode, based on at least one of the distance, force and number of times of the detected movement. The displayed UIs 61 and 62 include icons and text. The second controller 260 can compare the detection result of the motion sensor 220 with the values set in stages and stored in the second storage unit 250, and transmit a command corresponding to the comparison result through the second communication unit 240. The first controller 160 of the display apparatus 100 receives the command from the input apparatus 200 through the first communication unit 140, and controls the display unit 130 to display the UIs 61, 62 and 63, such as icons and text, displaying the change of the input mode of the input apparatus 200 to the gesture mode in stages, as in Figures 8 to 12, according to the command transmitted by the input apparatus 200. A user can immediately check the feedback on the force, number of times and distance of the shaking movement of the input apparatus 200 through the display apparatus 100.
The first controller 160 adjusts, in stages, a degree of transparency of the UIs 61 and 62 displayed in the display unit 130 corresponding to at least one of the distance, force and number of times of the movement of the input apparatus 200.
For example, if a user shakes the input apparatus 200 weakly as a first input, the second controller 260 transmits information corresponding to the first input value to the display apparatus 100 through the second communication unit 240, and the first controller 160 receives the force information of the first input through the first communication unit 140 and controls the display unit 130 to display the UIs 61 and 62 with a corresponding degree of transparency as in Figure 9.
A user may determine that a stronger movement is needed to change the input mode to the gesture mode by checking the degree of transparency of the UIs 61 and 62 displayed in the display unit 130 in Figure 9, and may shake the input apparatus 200 more strongly as a second input. The second controller 260 transmits to the display apparatus 100 information corresponding to the second input value through the second communication unit 240, and the first controller 160 receives the force information of the second input through the first communication unit 140 and controls the display unit 130 to display the UIs 63 and 64 with a corresponding degree of transparency as in Figure 10. The UIs 63 and 64 in Figure 10 are clearer (less transparent) than the UIs 61 and 62 in Figure 9.
In addition, a user may determine that a still stronger movement is needed to change the input mode to the gesture mode by checking the degree of transparency of the UIs 63 and 64 displayed in the display unit 130 as in Figure 10, and may then shake the input apparatus 200 more strongly as a third input. The second controller 260 transmits to the display apparatus 100 information corresponding to the third input value through the second communication unit 240, and the first controller 160 receives the force information of the third input through the first communication unit 140 and controls the display unit 130 to display the UIs 65 and 66 with a corresponding degree of transparency as in Figure 11. The UIs 65 and 66 in Figure 11 are clearer (less transparent) than the UIs 61, 62, 63 and 64 in Figures 9 and 10.
That is, the degree of transparency of the UIs 61, 62, 63, 64, 65 and 66 displayed in the display unit 130 is adjusted in stages according to the progress of the change of the input mode.
The first to third inputs have been described as being distinguished by the force of the movement of the input apparatus 200. However, the change of the input mode can also be displayed in stages according to the number of times of the movement. For example, if the input apparatus 200 is shaken once, the UIs 61 and 62 can be displayed as in Figure 9. If the input apparatus 200 is shaken two to four times, the UIs 63 and 64 can be displayed as in Figure 10. If the input apparatus 200 is shaken five times or more, the UIs 65 and 66 can be displayed as in Figure 11. Also, the stages of changing the input mode can be determined taking into account collectively the force and the number of times of the movement of the input apparatus 200.
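The staged count example above (one shake, two to four shakes, five or more) maps directly to an opacity step. The shake-count thresholds come from the text; the opacity values themselves are illustrative assumptions:

```python
def ui_alpha(shake_count):
    """Map a number of shakes to a UI opacity step, following the staged
    example in the text: one shake, two to four shakes, five or more.
    The opacity values are illustrative assumptions."""
    if shake_count >= 5:
        return 1.0   # fully opaque: mode change complete (Figure 11)
    if shake_count >= 2:
        return 0.6   # clearer than the first step (Figure 10)
    if shake_count >= 1:
        return 0.3   # faint UI on the first shake (Figure 9)
    return 0.0       # no movement yet: no UI shown
```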
The first to third inputs may correspond to the first to third input values, respectively, stored in the second storage unit 250.
The display apparatus 100, according to the present exemplary embodiment, can display the UIs 65 and 66 in the display unit 130 in an opaque manner, as shown in Figure 11, in case the change of the input mode to the gesture mode is completed.
That is, if the change of the input mode to the gesture mode is completed, the second controller 260 can control the second communication unit 240 to transmit a command to the display apparatus 100 to display the UI referring to the input mode opaquely, and the first controller 160 provides feedback to a user on the completion of the change by displaying the UIs 65 and 66 in an opaque manner in accordance with the command transmitted by the input apparatus 200.
The UIs 61, 62, 63, 64, 65 and 66 displayed in Figures 9 to 11 include icons 61, 63 and 65 indicating the input mode of the input apparatus 200, and guides 62, 64 and 66 that are provided for entering the movement of the input apparatus 200 in the gesture mode.
That is, a user can determine that the input mode is changed to the gesture mode through the icons 61, 63 and 65 displayed on the display apparatus 100, and, following the guides 62, 64 and 66, can rotate the input apparatus 200 to the left and right sides to select a previous or next image, and can rotate the input apparatus 200 up and down to zoom in/out of the screen.
The UI referring to the input mode, according to the present exemplary embodiment, can also include information regarding the force or number of times of a movement that is necessary to complete the change of the input mode. For example, the guides 62, 64 and 66 displayed in the display unit 130 may provide text indicating the force or number of times of the movement of the input apparatus 200 that is needed to complete the change of the input mode to the gesture mode.
It has been explained that the UIs 61, 62, 63, 64, 65 and 66 referring to the input mode, in accordance with the exemplary embodiment shown in Figures 9 to 11, display the process of changing the input mode by the degree of transparency. However, the UI according to the embodiment is not limited thereto. For example, the UIs 67 and 68 referring to the input mode may include a gauge indicating the change of the input mode in stages, as shown in Figure 12.
The first controller 160 controls the display unit 130 to move a status bar of the gauge in Figure 12 to the right side corresponding to the state of change of the input mode and to display the moved status bar, based on the command, including information referring to the input mode, transmitted by the input apparatus 200. A user can determine the degree of change of the input mode according to the location of the gauge status bar, and can complete the change of the input mode by shaking the input apparatus 200 more strongly, and so on.
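The rightward-moving gauge can be illustrated with a text rendering; a character bar is, of course, only a stand-in for the on-screen gauge of Figure 12, and the bar width is an arbitrary choice:

```python
def gauge_bar(progress, width=10):
    """Render a text stand-in for the gauge of Figure 12: the filled
    portion moves to the right as the input-mode change progresses
    (progress runs from 0.0 to 1.0 and is clamped to that range)."""
    progress = max(0.0, min(1.0, progress))
    filled = round(progress * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"
```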
The UIs referring to the input mode, according to the present exemplary embodiment, can likewise display the change of the input mode to the touch mode in stages. A user can check the UI referring to the input mode displayed on the display apparatus 100, determine the degree of change of the input mode to the touch mode, and complete the change of the input mode by touching the touch sensor 210 more strongly, and so on.
In the following, a method of controlling the display system 1 according to the second exemplary embodiment will be described.
Figure 13 is a flowchart showing the control method of the display system 1 according to the second exemplary embodiment.
As shown therein, the input apparatus 200 can operate in the touch mode as the input mode in which it operates in accordance with the detection result of the touch sensor 210 (S701).
The input apparatus 200 detects a movement of itself through the motion sensor 220 while operating in the touch mode (S703).
The second controller 260 of the input device 200 transmits a corresponding command to the display apparatus 100, based on at least one of the distance, force and number of times of the movement of the input apparatus 200 detected in step S703 (S705).
The first controller 160 of the display apparatus 100 receives the command transmitted in operation S705, and controls the display unit 130 to display the UIs 61, 62, 63, 64, 65, 66, 67 and 68 referring to the input mode of the input apparatus 200 according to the received command (S707). The UIs 61, 62, 63, 64, 65, 66, 67 and 68 displayed in step S707 can display the change of the input mode of the input apparatus 200 to the gesture mode in stages, using the degree of transparency or the gauge, and can display the UIs 65 and 66 opaquely in case the change of the input mode is completed. The UIs referring to the input mode can also include information regarding the force or number of times of the movement that is needed to complete the change of the input mode.
In the exemplary embodiment in Figure 13, the input apparatus 200 is changed from the touch mode to the gesture mode. However, the exemplary embodiments include the case where the input apparatus 200 is changed from the gesture mode to the touch mode, and can apply to the case where the input mode is changed, generally by using buttons, to the gesture mode or the touch mode.
According to the second exemplary embodiment, the UIs that provide information relating to the change of the input mode of the input apparatus 200 are displayed, and a user receives adequate visual feedback in real time with reference to the change of the input mode, and thus can feel less confused and more comfortable.
Although a few exemplary embodiments have been shown and described, those skilled in the art will appreciate that changes can be made to these exemplary embodiments without departing from the principles and spirit of the inventive concept, the scope of which is defined in the appended claims and their equivalents.

Claims (15)

1. An input apparatus of a display apparatus, the input apparatus comprising: a communication unit communicating with the deployment apparatus; a touch sensor that receives a tactile input from the user; a motion sensor that detects a movement of the input apparatus; Y a controller that changes an input mode of the input device to a touch mode in the event that a user touch input that is a first reference value or more is detected while the input apparatus operates in a gesture mode in accordance with the movement of the input device.
2. The input apparatus according to claim 1, characterized in that the controller changes the input mode of the input apparatus to the gesture mode in case a movement of the input apparatus is detected which is a second reference value or more when the Input device operates in touch mode.
3. The input apparatus according to claim 2, characterized in that the controller controls the communication unit for transmitting a command to the deployment apparatus to display a user interface (UI) referring to the change of the input mode.
4. The input apparatus according to claim 2, characterized in that the controller ignores the user's tactile input that is less than the first reference value in the gesture mode, and ignores the movement of the input mode that is less than the second reference value in touch mode.
5. The input apparatus according to any of claims 2 to 4, characterized in that the first reference value comprises a value corresponding to at least one of a drag at a predetermined distance or more, a touch with a predetermined force or more and a touch by a predetermined number of times or more with respect to the touch sensor, and the second reference value comprises a value corresponding to at least one of a movement at a predetermined distance or more, with a predetermined force or more and by a predetermined number times or more detected by the motion sensor.
6. A deployment apparatus comprising: a deployment unit that displays an image; a communication unit communicating with an input device of the deployment apparatus; Y a controller controlling the display unit to display a user interface (UI) with reference to a change from the input mode of the input device to a touch mode in the event that a command to change the input mode from a gesture mode the touch mode is received through the communication unit.
7. The display apparatus according to claim 6, characterized in that the controller controls the display unit to display a UI with reference to the change of the input mode to the gesture mode in case a command is received to change the input mode from the tactile mode to gesture mode.
8. The deployment apparatus according to claim 7, characterized in that the UI displays a process for changing the input mode in stages.
9. A method of controlling an input apparatus of a display apparatus, the control method comprising: operating the input apparatus in a gesture mode as an input mode according to a movement of the input apparatus; detect a user's tactile input; and changing the input mode to a tactile mode in case the detected tactile input is a first reference value or more.
10. The control method according to claim 9, characterized in that it further comprises detecting a movement of the input apparatus; Y change the input mode to the gesture mode in case the motion detected is a second reference value or more.
11. The control method according to claim 10, characterized in that it further comprises transmitting a command to the display apparatus to display a user interface (UI) with reference to a change of the input mode.
12. The control method according to claim 10, characterized in that it further comprises: ignoring a movement of the input apparatus that is less than the second reference value in the event that the input apparatus is changed to the touch mode; and ignoring a touch input of the user that is less than the first reference value in the event that the input apparatus is changed to the gesture mode.
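The mode-switching logic of claims 9 through 12 can be summarized in a short sketch: the input apparatus operates in a gesture mode by default, changes to a touch mode only when a touch input meets a first reference value, changes back only when a detected movement meets a second reference value, and ignores sub-threshold inputs from the other modality. The class, method names, and threshold values below are illustrative assumptions, not part of the patent.

```python
GESTURE_MODE = "gesture"
TOUCH_MODE = "touch"

class InputApparatus:
    """Illustrative model of the claimed threshold-based mode switching."""

    def __init__(self, first_reference=10.0, second_reference=5.0):
        self.mode = GESTURE_MODE                  # operate in gesture mode by default
        self.first_reference = first_reference    # e.g. minimum drag distance for touch
        self.second_reference = second_reference  # e.g. minimum motion magnitude
        self.notifications = []                   # commands sent to the display apparatus

    def on_touch(self, magnitude):
        # Change to touch mode only when the touch input reaches the first
        # reference value; a lesser touch input is ignored (claims 9 and 12).
        if self.mode == GESTURE_MODE and magnitude >= self.first_reference:
            self.mode = TOUCH_MODE
            self.notifications.append("show UI: changed to touch mode")

    def on_motion(self, magnitude):
        # Change back to gesture mode only when the detected movement reaches
        # the second reference value; lesser movements are ignored (claims 10 and 12).
        if self.mode == TOUCH_MODE and magnitude >= self.second_reference:
            self.mode = GESTURE_MODE
            self.notifications.append("show UI: changed to gesture mode")
```

Appending a notification on each change corresponds to claim 11, where the input apparatus transmits a command so the display apparatus can show a UI reflecting the mode change.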
13. A control method of a display apparatus, characterized in that it comprises: receiving a command from an input apparatus to change an input mode from a gesture mode to a touch mode; and displaying a user interface (UI) to change the input mode to the touch mode based on the received command.
14. The control method according to claim 13, characterized in that it further comprises: receiving a command from the input apparatus to change the input mode from the touch mode to the gesture mode; and displaying a UI to change the input mode to the gesture mode based on the received command.
15. The control method according to claim 14, characterized in that it further comprises displaying a process of changing the input mode in stages.
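On the display side (claims 13 to 15), the received mode-change command drives a UI that presents the change in stages rather than all at once. A minimal sketch of such staged UI output follows; the function name, frame strings, and stage count are illustrative assumptions only.

```python
def staged_ui_frames(target_mode, stages=3):
    # Build the sequence of UI frames that presents the mode change in
    # stages (claim 15), ending with confirmation of the new input mode.
    frames = [f"changing to {target_mode} mode: stage {i}/{stages}"
              for i in range(1, stages + 1)]
    frames.append(f"input mode: {target_mode}")
    return frames
```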
MX2014015378A 2012-06-19 2013-05-21 Input apparatus, display apparatus, control method thereof and display system. MX340896B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120065397A KR101943419B1 (en) 2012-01-06 2012-06-19 Input apparatus, display apparatus, control method thereof and display system
PCT/KR2013/004448 WO2013191382A1 (en) 2012-06-19 2013-05-21 Input apparatus, display apparatus, control method thereof and display system

Publications (2)

Publication Number Publication Date
MX2014015378A true MX2014015378A (en) 2015-06-10
MX340896B MX340896B (en) 2016-07-29

Family

ID=49769950

Family Applications (1)

Application Number Title Priority Date Filing Date
MX2014015378A MX340896B (en) 2012-06-19 2013-05-21 Input apparatus, display apparatus, control method thereof and display system.

Country Status (3)

Country Link
BR (1) BR112014031855A2 (en)
MX (1) MX340896B (en)
WO (1) WO2013191382A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150101703A (en) * 2014-02-27 2015-09-04 삼성전자주식회사 Display apparatus and method for processing gesture input

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7903084B2 (en) * 2004-03-23 2011-03-08 Fujitsu Limited Selective engagement of motion input modes
US7432916B2 (en) * 2004-12-09 2008-10-07 Universal Electronics, Inc. Controlling device with dual-mode, touch-sensitive display
KR101568128B1 (en) * 2008-11-14 2015-11-12 삼성전자주식회사 Method for operating user interface based on motion sensor and mobile terminal using the same
US8742885B2 (en) * 2009-05-01 2014-06-03 Apple Inc. Directional touch remote
US20120062551A1 (en) * 2010-09-13 2012-03-15 Lg Electronics Inc. Image display apparatus and method for operating image display apparatus

Also Published As

Publication number Publication date
MX340896B (en) 2016-07-29
BR112014031855A2 (en) 2017-06-27
WO2013191382A1 (en) 2013-12-27

Similar Documents

Publication Publication Date Title
EP2613227B1 (en) Input apparatus and control method thereof
US9467732B2 (en) Display apparatus and control method for displaying an operational state of a user input
KR102033764B1 (en) User interface display method and remote controller using the same
US8643790B2 (en) Remote controller, image processing apparatus, and imaging system comprising the same
US9811303B2 (en) Display apparatus, multi display system including the same, and control method thereof
KR102169521B1 (en) Input apparatus, display apparatus and control method thereof
RU2689412C2 (en) Display device and display method
KR20130080891A (en) Display apparatus and control method thereof
KR101943419B1 (en) Input apparatus, display apparatus, control method thereof and display system
US20140240263A1 (en) Display apparatus, input apparatus, and control method thereof
KR20150031986A (en) Display apparatus and control method thereof
US20130176505A1 (en) Input apparatus, display apparatus and methods for controlling a display through user manipulation
EP3056974A2 (en) Display apparatus and method
JP2013143775A (en) Display apparatus, input apparatus, and method for controlling the same
US20150009143A1 (en) Operating system
EP1961218B1 (en) Display apparatus and method and information processing apparatus and method for providing picture in picture function
MX2014015378A (en) Input apparatus, display apparatus, control method thereof and display system.
EP3226568A1 (en) Display device and display method
KR20080089003A (en) Apparatus and method for controlling a display device that displays plural screens
KR101237353B1 (en) Remote control device controlling multimedia device by using touch screen and multimedia apparatus comprising them
US20160227151A1 (en) Display apparatus, remote control apparatus, remote control system and control method thereof

Legal Events

Date Code Title Description
FG Grant or registration