WO2013042815A1 - Method of controlling an android platform-based application execution terminal using a smart terminal and computer-readable medium having a computer program for controlling the android platform-based application execution terminal using the smart terminal recorded thereon - Google Patents


Info

Publication number
WO2013042815A1
WO2013042815A1 (PCT/KR2011/007039)
Authority
WO
WIPO (PCT)
Prior art keywords
application execution
event
terminal
smart terminal
execution terminal
Prior art date
Application number
PCT/KR2011/007039
Other languages
French (fr)
Korean (ko)
Inventor
변정섭
김수한
류혁곤
Original Assignee
주식회사 인프라웨어테크놀러지
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 인프라웨어테크놀러지
Publication of WO2013042815A1 publication Critical patent/WO2013042815A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 15/00 Digital computers in general; Data processing equipment in general
    • G06F 15/16 Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N 21/42224 Touch pad or touch panel provided on the remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/14 Handling requests for interconnection or transfer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72415 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N 21/4222 Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • The present invention relates to a method of controlling an Android platform-based application execution terminal using a smart terminal, and to a computer-readable recording medium on which a program for such control is recorded. More specifically, it relates to a method and program that let a user conveniently control the application execution terminal from a smart terminal, in accordance with the operating characteristics of the application being executed.
  • Devices such as smart TVs and smart set-top boxes have mainly been controlled through remote controllers, but remote controllers are inconvenient for using the various functions of such smart devices. Recently, techniques have been proposed for controlling them through handheld smart touch terminals such as smartphones or tablets.
  • Such conventional control technology merely provides the relatively simple function of manipulating a mouse pointer through a handheld smart touch terminal, by offering an interface identical or similar to a conventional remote control or by simply providing a touch pad area. That is, it does not provide a precise control interface comparable to controlling the functions of the touch terminal itself.
  • An application execution terminal such as an Android platform-based smart TV or smart set-top box may perform operations for controlling various Android applications, or services using them, on the terminal device itself.
  • An object of the present invention is to exploit the fact that the smart terminal and the application execution terminal adopt a common platform (for example, the Android platform) in order to provide a remote UI screen, and a control method for it, suited to the operating characteristics of each function and operation method when Android applications and services using them are executed on the application execution terminal.
  • Another object is to provide a method of controlling an Android platform-based application execution terminal using a smart terminal that allows the user to control the application execution terminal conveniently from the smart terminal, and a computer-readable recording medium on which a program for that control is recorded.
  • To achieve these objects, a method of controlling an Android platform-based application execution terminal using a smart terminal comprises: mapping the UI screens of the input/output unit of the smart terminal and the output unit of the application execution terminal 1:1; determining whether the smart terminal, whose input/output unit is formed as a touch interface, receives a touch-up event as the event following a touch-down event; if the touch-up event is received, determining whether a touch-down event is input again; and, when the touch-down event is received again, judging this to be a double-touch event and generating and transmitting a request signal to the application execution terminal according to the type of event occurring after the double touch, so that the events are processed in synchronization.
  • Preferably, the method further comprises: determining whether the smart terminal receives a touch-drag event within a preset first time after the double-touch event occurs; and, if the touch-drag event is received within the preset first time, generating a drag request signal and transmitting it to the application execution terminal for processing.
  • Preferably, after the transmitting of the drag request signal, the method further comprises: when the smart terminal does not receive the touch-drag event, determining whether the re-input touch-down event is maintained for a preset second time; and, if it is maintained for the preset second time, generating a long-press request signal and transmitting it to the application execution terminal for processing.
  • Preferably, after the transmitting of the long-press request signal, the method further comprises: when the re-input touch-down event is not maintained for the preset second time, determining whether a touch-up event is received again following the double-touch event; and, if the touch-up event is received again, generating a double-click request signal and transmitting it to the application execution terminal for processing.
  • Preferably, in the transmitting of the double-click request signal, when the smart terminal does not receive the touch-up event again, the input is judged to be a general click by the user, and a click request signal is generated and transmitted to the application execution terminal for processing.
  • Preferably, the 1:1 mapping step comprises: (g) capturing the output unit screen of the application execution terminal using the camera module of the smart terminal; (h) displaying the captured image on the input/output unit of the smart terminal; and (i) achieving the mapping for event processing by reflecting the resolutions and aspect ratios of the smart terminal and the application execution terminal.
  • A computer-readable recording medium according to the present invention, for achieving the above objects, records an Android platform-based application execution terminal control program comprising: synchronization-control means for performing synchronization for remote control through a remote control framework installed on both the smart terminal and the application execution terminal on the Android platform; event-processing means for generating and transmitting a request signal to the application execution terminal according to the event occurring after a double-touch event is input to the input/output unit of the smart terminal, so that the event input after the double touch is processed in synchronization by the synchronization-control means; and mapping-processing means for controlling the UI screens of the input/output unit and the output unit to be 1:1 mapped during event synchronization by the event-processing means.
  • Preferably, the mapping-processing means implements the back key, which is implemented as a physical interface (PI) on the application execution terminal, as a mapping icon on the input/output unit; when the back key implemented as the mapping icon is touched by the user, a back-key request signal is generated and transmitted to the application execution terminal for processing.
  • Preferably, the application execution terminal control program further comprises mode switching-processing means for controlling control-mode switching and control-method switching when a drag event occurs in the outward direction in the top, bottom, left, or right corner regions of the remote control UI screen implemented on the input/output unit.
  • Preferably, the control-mode switching is switching among a touch mode, a mouse mode, a keypad mode, a browser mode, and a game mode, and the control-method switching is switching among a drag method, a zoom-in/zoom-out method, movement using the mouse pointer, and a click method using the mouse pointer.
  • Preferably, the application execution terminal control program further comprises remote UI-implementing means for implementing the remote control UI screen of the input/output unit for controlling the application execution terminal, and for controlling its output to divided areas: a scrolling area operated by dragging, a zoom area implementing screen zoom-in/zoom-out, and a touch pad area for moving or clicking the mouse pointer.
  • Preferably, the remote UI-implementing means implements a QWERTY keypad screen on the input/output unit when keypad input is required on the application execution terminal; when a character is input using the QWERTY keypad screen, it generates a request signal for inputting the same character and transmits it to the application execution terminal for processing.
  • Preferably, the mapping-processing means captures the output unit screen of the application execution terminal using the camera module of the smart terminal, displays the captured image on the input/output unit of the smart terminal, and achieves the mapping for event processing by reflecting the resolutions and aspect ratios of both terminals.
  • According to the present invention, complex control operations for various Android applications, and services using them, on the application execution terminal can be performed using a smart terminal without the problem of response speed degrading under communication load.
  • FIG. 1 is a diagram showing a system in which an Android platform-based application execution terminal control method using a smart terminal according to an embodiment of the present invention is implemented.
  • FIGS. 2 to 7 are diagrams showing examples of UI screens implemented on the smart terminal and the application execution terminal.
  • FIG. 8 is a flowchart illustrating a method for controlling an application execution terminal based on the Android platform using a smart terminal according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a system in which an Android platform-based application execution terminal control method using a smart terminal according to an embodiment of the present invention is implemented, and FIGS. 2 to 7 are diagrams illustrating examples of UI screens implemented on the smart terminal 20 and the application execution terminal 10.
  • A system in which the control method for the Android platform-based application execution terminal 10 is implemented includes an application execution terminal 10 and a smart terminal 20.
  • Here, the application execution terminal 10 is described as a concept including a smart TV, a smart set-top box, a personal computer (PC), and the like, and refers to a terminal capable of executing Android applications.
  • Its internal configuration includes a first wireless transmission/reception unit 11, an output unit 12, a control unit 13, and a first storage unit 14, where the control unit 13 includes emulation-implementing means 13a.
  • The emulation-implementing means 13a receives the various request signals transmitted through the remote control framework from the remote control-implementing means 25 and the remote UI-implementing means 26 of the smart terminal 20 (described later), and emulates them as if they had been made through the internal interface of the application execution terminal 10.
  • An Android application is developed on the assumption that control events are input through the UI screen implemented on the output unit 12 of the terminal device itself, and so has no separate processing for remote control by an external smart terminal 20.
  • The emulation-implementing means 13a is therefore provided so that a separate software version of the application execution terminal 10 need not be created.
  • Here, the smart terminal 20 refers to various terminals (preferably handheld) that have a touch interface, such as a smartphone or smart pad, and have a smart operating system installed.
  • the internal configuration includes a second wireless transmission / reception unit 21, an input / output unit 22, a remote control function unit 23, and a second storage unit 27.
  • The second wireless transmission/reception unit 21 is a component for wirelessly transmitting and receiving signals and data to and from the first wireless transmission/reception unit 11 of the application execution terminal 10.
  • For this, wireless communication technologies such as WLAN, Zigbee, infrared, and various mobile communication networks (e.g., 3G, 4G) may be used.
  • The input/output unit 22 is formed as a touch interface capable of receiving touch input from the user; the remote control UI screen of the remote control function unit 23 is implemented on it, and it may be implemented as a general touch screen.
  • The remote control function unit 23 comprises synchronization-control means 24, remote control-implementing means 25, and remote UI-implementing means 26, and corresponds to a control application for the application execution terminal 10.
  • The synchronization-control means 24 performs synchronization for remote control through a remote control framework installed on both the smart terminal 20 and the application execution terminal 10 on the Android platform.
  • In other words, the synchronization-control means 24 forms a data link between the smart terminal 20 and the application execution terminal 10 so that the various request signals generated by the remote control-implementing means 25 and the remote UI-implementing means 26 of the smart terminal 20 can be processed identically on the application execution terminal 10.
  • More specifically, the synchronization-control means 24 matches the clock frequency and the IP address to specific values on both the smart terminal 20 and the application execution terminal 10, and performs synchronization through interworking with the remote control-implementing means 25 and the remote UI-implementing means 26.
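The request signals exchanged over this data link are not given a concrete wire format in the text. The following is a minimal sketch of one plausible shape; the JSON envelope and the field names ("type", "payload") are illustrative assumptions, not part of the specification.

```python
import json

def make_request_signal(event_type, payload):
    """Serialize an illustrative remote-control request signal.

    The JSON envelope and field names are assumptions of this sketch;
    the document does not define an actual wire format.
    """
    return json.dumps({"type": event_type, "payload": payload})

def parse_request_signal(raw):
    """Decode a request signal, as the emulation-implementing means 13a
    on the application execution terminal might do before replaying the
    event through the terminal's internal interface."""
    msg = json.loads(raw)
    return msg["type"], msg["payload"]
```

On the receiving side, the decoded event would then be injected as if it had originated locally, which is the role the text assigns to the emulation-implementing means 13a.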
  • the remote control-implementing means 25 serves as an interface for controlling the application execution terminal 10.
  • The remote control-implementing means 25 controls the remote control UI screen implemented on the input/output unit 22 of the smart terminal 20 according to input by the user of the smart terminal 20, and additionally serves to transmit the signals generated through that control to the application execution terminal 10.
  • The remote control-implementing means 25 is used for the control required to select and execute the correct position, via remote control by the smart terminal 20, when using Android applications and services on the application execution terminal 10.
  • To enable precise control of the mouse pointer on the remote control UI screen, the remote control-implementing means 25 comprises event-processing means 25a, mapping-processing means 25b, and mode switching-processing means 25c.
  • The event-processing means 25a performs various event controls so that Android applications, and services using them, can be controlled by the application execution terminal 10 according to each function and operation method, through input to the input/output unit 22.
  • Specifically, the event-processing means 25a receives a double-touch event from the user on the input/output unit 22 and then, according to the type of events input subsequently, generates and transmits the various request signals for the application execution terminal 10.
  • The event-processing means 25a controls events occurring after the double touch so that they are processed simultaneously by the smart terminal 20 and the application execution terminal 10, according to the synchronization interworking performed by the synchronization-control means 24.
  • First, the event-processing means 25a determines whether a double-touch event is input by the user to the input/output unit 22. For example, when a touch-down event, a touch-up event, and a touch-down event are input in succession to the input/output unit 22 by the user, the event-processing means 25a judges that a double-touch event has occurred.
  • The event-processing means 25a generates a drag request signal when a touch-drag event occurs on the input/output unit 22 within a preset first time following the double-touch event, and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
  • Here, a touch-drag event on the input/output unit 22 occurs when the user slides with directionality, without lifting the hand from the input/output unit 22, after the double-touch event.
  • The event-processing means 25a generates a long-press request signal when the last touch state (in the above example, the state in which the touch-down event has occurred) is maintained for a preset second time following the double-touch event on the input/output unit 22, and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
  • Here, the second time may be the same as the first time described above, or may be set to a different time so that a different function setting by the user can be recognized.
  • The event-processing means 25a generates a double-click request signal when, after the double-touch event on the input/output unit 22, a touch event with directionality opposite to the last touch state occurs (in the above example, a touch-up event, opposite to the touch-down event), and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
  • The double-click request signal generated by the event-processing means 25a can be used in the same sense as a double click corresponding to selection with a conventional mouse input device.
  • The mapping-processing means 25b controls the synchronization-control means 24 when the event-processing means 25a processes event synchronization, so that the input/output unit 22 of the smart terminal 20 and the screen of the output unit 12 of the application execution terminal 10 are 1:1 mapped.
  • When the user touches a target position on the input/output unit 22 of the smart terminal 20, the mapping-processing means 25b generates a touch request signal for control at the mapped position on the output unit 12 of the application execution terminal 10, and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
  • The mapping processed by the mapping-processing means 25b is shown in FIG. 2. In FIG. 2, when the aspect ratio remains the same and only the number of pixels (resolution) differs, or when the aspect ratios differ, conversion may be performed according to a fixed aspect-ratio setting.
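The two cases FIG. 2 distinguishes (same aspect ratio at a different resolution, and differing aspect ratios converted under a fixed aspect-ratio setting) amount to a small coordinate transform. A minimal sketch follows; the centered letterboxing offsets are an assumption about how a fixed aspect ratio might be honored, not a detail from the text.

```python
def map_touch(x, y, src_w, src_h, dst_w, dst_h, keep_aspect=True):
    """Map a touch point on the smart terminal's screen (src) to the
    application execution terminal's screen (dst).

    With keep_aspect=True the source area is scaled uniformly and
    centered in the target (an assumed reading of the fixed
    aspect-ratio setting); with keep_aspect=False each axis scales
    independently (same aspect ratio, different resolution).
    """
    if not keep_aspect:
        return x * dst_w / src_w, y * dst_h / src_h
    scale = min(dst_w / src_w, dst_h / src_h)  # uniform scale factor
    off_x = (dst_w - src_w * scale) / 2        # center the mapped area
    off_y = (dst_h - src_h * scale) / 2
    return x * scale + off_x, y * scale + off_y
```

For example, an 800x480 touch surface mapped onto a 1920x1080 output scales by 2.25 and is centered with a 60-pixel horizontal offset.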
  • When controlling the application execution terminal 10, the mapping-processing means 25b can implement the back key, which is implemented as a physical interface (PI) on the application execution terminal 10, as a mapping icon on the input/output unit 22 of the smart terminal 20. Accordingly, when the mapping icon embodying the back key is touched by the user, the mapping-processing means 25b generates a back-key request signal and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
  • Meanwhile, the mapping-processing means 25b may be implemented using the camera module of the smart terminal 20.
  • That is, a camera module (not shown) of the smart terminal 20 captures the display screen of the application execution terminal 10, and the captured screen is displayed on the input/output unit of the smart terminal 20.
  • Matching the resolutions and aspect ratios of the smart terminal 20 and the application execution terminal 10 is as described above.
  • In this way, the screens of the input/output unit 22 of the smart terminal 20 and the output unit 12 of the application execution terminal 10 are controlled to have a 1:1 mapping.
  • The mode switching-processing means 25c performs certain function switching, as illustrated in FIGS. 3 and 4, when a drag event occurs in the outward direction in the top, bottom, left, or right corner regions of the remote control UI screen of the smart terminal 20.
  • The specific functions performed by the mode switching-processing means 25c include control-mode switching and control-method switching.
  • Here, the 'control mode' may be a touch mode, a mouse mode, a keypad mode, a browser mode, a game mode, and the like, and the 'control method' may include dragging, zooming in/out, movement using the mouse pointer, and clicking using the mouse pointer.
  • In addition, the mode switching-processing means 25c transmits the various operation events generated on the smart terminal 20 to the first wireless transmission/reception unit 11 of the application execution terminal 10 through the second wireless transmission/reception unit 21.
  • the remote UI-implementing means 26 implements a remote control UI screen of the smart terminal 20 for controlling the application execution terminal 10.
  • More specifically, the remote UI-implementing means 26 divides the remote control UI screen of the input/output unit 22 into three separate control areas, receives events from them, and generates request signals.
  • An example of output to the individual partitions may be implemented as shown in FIG. 6.
  • the remote UI-implementing means 26 controls the second radio transmitter / receiver 21 to transmit the request signal generated according to the event for the individual partition to the application execution terminal 10.
  • The remote UI-implementing means 26 implements a QWERTY keypad screen, as shown in FIG. 7, on the input/output unit 22 of the smart terminal 20 when keypad input is required on the application execution terminal 10. Accordingly, when characters such as English, Korean, or special characters are input by the user using the QWERTY keypad screen, it generates a request signal for inputting the same characters and controls the second wireless transmission/reception unit 21 to transmit it to the application execution terminal 10.
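The character-input path can be sketched in the same request-signal style. The "KEY_INPUT" tag and the dictionary layout are assumptions of this sketch, not from the text.

```python
def keypad_request(char):
    """Build an illustrative request signal for one character touched
    on the smart terminal's QWERTY keypad screen. The "KEY_INPUT" tag
    and dict layout are assumptions, not from the specification."""
    return {"type": "KEY_INPUT", "char": char}

def apply_keypad_requests(field_text, requests):
    """Sketch of the receiving side: the application execution terminal
    appends the same characters to its own input field so that both
    screens stay synchronized."""
    for req in requests:
        if req.get("type") == "KEY_INPUT":
            field_text += req["char"]
    return field_text
```

Because each request carries the character itself, English, Korean, and special characters all travel the same path, as the text describes.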
  • The second storage unit 27 has installed in it a remote control framework for performing remote control based on the Android platform.
  • the remote control framework is similarly installed in the first storage unit 14 of the application execution terminal 10.
  • Referring to FIG. 8, the remote control function unit 23 recognizes that, after a touch-down event is input to the input/output unit 22 (S1), a touch-up event is input by the user as the subsequent event (S2).
  • After step S2, the remote control function unit 23 determines whether a touch-down event as in step S1 is input again through the input/output unit 22 (S3). When the touch-down event is input again, the remote control function unit 23 judges it to be a double-touch event and performs processing according to the type of events input subsequently.
  • When it is determined in step S3 that the touch-down event is re-input, the remote control function unit 23 determines whether a touch-drag event is input to the input/output unit 22 within a preset first time (e.g., 1.0 second) following the double-touch event (S4).
  • When the touch-drag event is input as a result of the determination in step S4, the remote control function unit 23 generates a drag request signal and transmits it to the application execution terminal 10 so that the touch-drag event is processed in synchronization (S5).
  • the remote control function unit 23 is a preset state in which the touch-down event is re-inputted after the double touch event is generated to the input / output unit 22. It is determined whether the second time (eg, 1.5 seconds) is maintained (S6).
  • If it is determined in step S6 that the re-input touchdown state has been maintained for the second time, the remote control function unit 23 generates a long-press request signal and transmits it to the application execution terminal 10 so that the long-press event is synchronized and processed (S7).
  • Otherwise, the remote control function unit 23 determines whether the touch-up event is re-input following the occurrence of the double-touch event (S8).
  • If the touch-up event is input again as a result of the determination in step S8, the remote control function unit 23 generates a double-click request signal and transmits it to the application execution terminal 10 so that the double-click event is synchronized and processed (S9).
  • If the touch-up event is not re-input, the remote control function unit 23 determines that the input is a general click by the user, generates a click request signal, and transmits it to the application execution terminal 10 so that the click event is synchronized and processed (S10).
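The decision flow of steps S1 to S10 can be sketched as a small event classifier. This is a minimal sketch, not the patented implementation: the event names, the timestamped-tuple representation, and the `classify` helper are illustrative assumptions; the thresholds reuse the example values given above (first time 1.0 second, second time 1.5 seconds).

```python
# Hedged sketch of the S1-S10 gesture classification described above.
# Event names, the input format, and this function are illustrative assumptions.

FIRST_TIME = 1.0   # preset first time (seconds), per the example above
SECOND_TIME = 1.5  # preset second time (seconds), per the example above

def classify(events):
    """Classify a sequence of (name, timestamp) touch events into the
    request signal the smart terminal would generate (S5/S7/S9/S10)."""
    names = [n for n, _ in events]
    # S1-S2: a touchdown followed by a touch-up
    if names[:2] != ["down", "up"]:
        return None
    # S3: no second touchdown -> general click (S10)
    if len(names) < 3 or names[2] != "down":
        return "click"
    t_double = events[2][1]  # moment the double touch is recognized
    for name, t in events[3:]:
        # S4-S5: touch-drag within the first time -> drag request
        if name == "drag" and t - t_double <= FIRST_TIME:
            return "drag"
        if name == "up":
            # S6-S7: touchdown held for the second time -> long press
            if t - t_double >= SECOND_TIME:
                return "long_press"
            # S8-S9: touch-up re-input -> double click
            return "double_click"
    # touchdown still held with no further events recorded
    return "long_press"
```

For example, the sequence down, up, down, up within a fraction of a second classifies as a double click, while holding the second touchdown past the second time classifies as a long press.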
  • The invention can also be embodied as computer-readable code on a computer-readable recording medium.
  • Computer-readable recording media include all kinds of recording devices that store data readable by a computer system.
  • Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the code may also be implemented in the form of carrier waves (e.g., transmission over the Internet).
  • The computer-readable recording medium can also store and execute computer-readable code in a distributed manner over networked computer systems. Functional programs, code, and code segments for implementing the present invention can be readily inferred by programmers in the technical field to which the present invention belongs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a method of controlling an Android platform-based application execution terminal using a smart terminal, and to a computer-readable medium on which a computer program for controlling the Android platform-based application execution terminal using the smart terminal is recorded. When the smart terminal, which has a touch interface, receives a touch-down event from a user, it determines whether a touch-up event is received as the next event. When the touch-up event is received, the smart terminal determines whether the touch-down event is received again. When the touch-down event is re-received, the smart terminal judges it to be a double-touch event. According to the type of event generated after the double touch, a request signal for the application execution terminal is generated, transmitted, and processed in synchronization with the event input after the double touch. Therefore, complex control operations for an Android application executed on the application execution terminal, and for services using it, can be performed from the smart terminal without the problem of reduced response speed due to communication load.

Description

A method of controlling an Android platform-based application execution terminal using a smart terminal, and a computer-readable recording medium on which an Android platform-based application execution terminal control program using a smart terminal is recorded
The present invention relates to a method of controlling an Android platform-based application execution terminal using a smart terminal, and to a computer-readable recording medium on which an Android platform-based application execution terminal control program using a smart terminal is recorded. More particularly, it relates to a control method, and a recording medium for a corresponding control program, that allow the user to conveniently perform control via a smart terminal in accordance with the operating characteristics of the application execution terminal.
Devices such as smart TVs and smart set-top boxes have mainly been controlled through remote controls, but remote controls are inconvenient for exploiting the various functions of such smart devices. Recently, therefore, techniques for controlling them through handheld smart touch terminals, such as smartphones and tablets, have been proposed.
However, such conventional control techniques merely provide an interface identical or similar to the existing remote control interface, or simply provide a touch pad area for a relatively simple function such as moving a mouse pointer with the handheld smart touch terminal. That is, they do not provide a precise control interface comparable to controlling the tablet terminal's own functions.
In particular, for terminals capable of executing applications (hereinafter, "application execution terminals"), such as Android platform-based smart TVs and smart set-top boxes, the prior art supports only simple operations when manipulating the various Android applications running on the terminal device or the services using them.
Techniques for remotely controlling the screen of a personal computer (PC) through a smart terminal have also been developed. However, because the communication load of such remote control is very large, supporting a relatively complex device control interface sharply degrades the response speed of the application execution terminal and the operating speed of the smart terminal. Accordingly, this prior art is not an effective way to smoothly and remotely control the various functions provided by devices such as smart TVs and smart set-top boxes.
An object of the present invention is to exploit the fact that the smart terminal and the application execution terminal adopt a common platform (e.g., the Android platform) in order to provide a remote control UI screen, and a control scheme for it, matched to the functions and operating characteristics of each Android application and service executed on the application execution terminal. The invention thus provides a method of controlling an Android platform-based application execution terminal using a smart terminal, which allows the user to conveniently control the application execution terminal from the smart terminal, and a computer-readable recording medium on which a corresponding control program is recorded.
To achieve this object, the method of controlling an Android platform-based application execution terminal using a smart terminal according to the present invention comprises: mapping the input/output unit of the smart terminal 1:1 to the UI screen of the output unit of the application execution terminal; when the smart terminal, formed with a touch interface, receives a touch-down event, determining whether a touch-up event is received as the next event; when it is determined that the touch-up event has been received, determining whether the touch-down event is input again; and, when the touch-down event is re-input, judging it to be a double-touch event and, according to the type of event occurring after the double touch, generating and transmitting a request signal to the application execution terminal so that it is processed in synchronization with the event input after the double touch.
In this case, the event processing step preferably includes: determining whether the smart terminal receives a touch-drag event within a preset first time after the double-touch event occurs; and, when the touch-drag event is received within the preset first time, generating a drag request signal and transmitting it to the application execution terminal for processing.
The step of transmitting the drag request signal to the application execution terminal preferably further includes: when the smart terminal does not receive the touch-drag event, determining whether the re-input touch-down state is maintained for a preset second time; and, when it is maintained for the preset second time, generating a long-press request signal and transmitting it to the application execution terminal for processing.
The step of transmitting the long-press request signal to the application execution terminal preferably further includes: when the touch-down event is not re-input or is not maintained for the preset second time, determining whether a touch-up event is re-input after the double-touch event occurs; and, when the smart terminal re-receives the touch-up event, generating a double-click request signal and transmitting it to the application execution terminal for processing.
In the step of transmitting the double-click request signal to the application execution terminal, when the smart terminal does not re-receive the touch-up event, it preferably judges the input to be a general click by the user, generates a click request signal, and transmits it to the application execution terminal for processing.
The 1:1 mapping step preferably comprises: (g) capturing the screen of the output unit of the application execution terminal using the camera module of the smart terminal; (h) displaying the captured image on the input/output unit of the smart terminal; and (i) achieving the mapping for event processing by reflecting the resolutions and aspect ratios of the smart terminal and the application execution terminal.
To achieve this object, the computer-readable recording medium according to the present invention, on which an Android platform-based application execution terminal control program using a smart terminal is recorded, comprises: synchronization-control means for performing synchronization for remote control through a remote control framework installed on the smart terminal and the application execution terminal on the Android platform; event-processing means for receiving a double-touch event through the input/output unit of the smart terminal and, according to the type of event occurring after the double-touch event, generating and transmitting a request signal to the application execution terminal so that it is processed, by means of the synchronization-control means, in synchronization with the event input after the double touch; and mapping-processing means for controlling the input/output unit and the UI screen of the output unit of the application execution terminal to be mapped 1:1 during event synchronization by the event-processing means.
Here, the mapping-processing means preferably implements the back key, which is realized as a physical interface (PI) on the application execution terminal, as a mapping icon on the input/output unit; when the back key implemented as a mapping icon is touched by the user, a back-key request signal is generated and transmitted to the application execution terminal for processing.
The application execution terminal control program preferably further comprises mode switching-processing means for switching the control mode and the control method when a drag event occurs in the outward direction from the upper, lower, left, or right edge region of the remote control UI screen implemented on the input/output unit. Here, the control mode switching is between a touch mode, a mouse mode, a keypad mode, a browser mode, and a game mode, and the control method switching is between a drag method, a zoom-in/zoom-out method, movement using a mouse pointer, and a click method using a mouse pointer.
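The edge-drag detection underlying this mode switching can be pictured with a minimal sketch. The screen dimensions, the edge margin, and the helper function below are illustrative assumptions, not values specified in the patent.

```python
# Illustrative sketch: detect an outward drag starting in an edge region
# of the remote control UI screen. Screen size and margin are assumed values.
WIDTH, HEIGHT, MARGIN = 480, 800, 40

def edge_drag_direction(start, end):
    """Return which screen edge an outward drag began from, or None
    if the drag did not start in an edge region or did not move outward."""
    x0, y0 = start
    x1, y1 = end
    if x0 <= MARGIN and x1 < x0:            # started at left edge, moved further left
        return "left"
    if x0 >= WIDTH - MARGIN and x1 > x0:    # started at right edge, moved further right
        return "right"
    if y0 <= MARGIN and y1 < y0:            # started at top edge, moved further up
        return "top"
    if y0 >= HEIGHT - MARGIN and y1 > y0:   # started at bottom edge, moved further down
        return "bottom"
    return None
```

A returned edge label could then be mapped to the next control mode (touch, mouse, keypad, browser, game) or control method in a cycle.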
The application execution terminal control program implements the remote control UI screen of the input/output unit for controlling the application execution terminal. The control program preferably further comprises remote UI-implementing means for controlling this remote control UI screen, when a browser is executed on the application execution terminal, to be output as divided regions: a scroll area using drag, a zoom area for implementing screen zoom-in/zoom-out, and a touch pad area for moving the mouse pointer or clicking with it.
The remote UI-implementing means preferably implements a QWERTY keypad screen on the input/output unit when keypad input is required at the application execution terminal; when a character is input using the QWERTY keypad screen, a request signal for the same character input is generated and transmitted to the application execution terminal for processing.
The mapping-processing means preferably captures the screen of the output unit of the application execution terminal using the camera module of the smart terminal, displays the captured image on the input/output unit of the smart terminal, and then achieves the mapping for event processing by reflecting the resolutions and aspect ratios of both terminals.
According to the control technology of the present invention, even complex function control operations for the various Android applications executed on the application execution terminal, and for the services using them, can be performed from the smart terminal without the problem of reduced response speed due to communication load.
In addition, according to the present invention, various kinds of Android applications requiring various input methods can each be controlled, according to their purpose, through the remote control UI screen of an always-portable smart terminal, and can therefore be conveniently used anywhere an application execution terminal is located.
FIG. 1 is a diagram showing a system in which an Android platform-based application execution terminal control method using a smart terminal according to an embodiment of the present invention is implemented.
FIGS. 2 to 7 are diagrams showing examples of UI screens implemented on the smart terminal and the application execution terminal.
FIG. 8 is a flowchart illustrating a method of controlling an Android platform-based application execution terminal using a smart terminal according to an embodiment of the present invention.
Hereinafter, the present invention will be described in detail with reference to the drawings.
FIG. 1 is a diagram showing a system in which an Android platform-based application execution terminal control method using a smart terminal according to an embodiment of the present invention is implemented, and FIGS. 2 to 7 are diagrams showing examples of UI screens implemented on the smart terminal 20 and the application execution terminal 10.
Referring to FIGS. 1 to 7, the system in which the control method for the Android platform-based application execution terminal 10 is implemented includes the application execution terminal 10 and the smart terminal 20.
The application execution terminal 10 refers broadly to any terminal capable of executing Android applications, including smart TVs, smart set-top boxes, and personal computers (PCs). Its internal configuration includes a first radio transmitter/receiver 11, an output unit 12, a control unit 13, and a first storage unit 14, where the control unit 13 includes emulation-implementing means 13a.
The emulation-implementing means 13a emulates the various request signals received through the remote control framework from the remote control-implementing means 25 and remote UI-implementing means 26 of the smart terminal 20 (described later) as if they had been generated through the internal interface of the application execution terminal 10.
In general, Android application software is developed on the assumption that control events are input through the UI screen implemented on the terminal device's own output unit 12. To avoid building a separate software version that handles remote control by an external smart terminal 20, it is therefore preferable that the application execution terminal 10 is provided with the emulation-implementing means 13a.
The smart terminal 20 refers broadly to various (preferably handheld) terminals that have a touch interface and an installed smart operating system, such as smartphones and smart pads. Its internal configuration includes a second radio transmitter/receiver 21, an input/output unit 22, a remote control function unit 23, and a second storage unit 27.
The second radio transmitter/receiver 21 is a component for wirelessly transmitting and receiving signals and data to and from the first radio transmitter/receiver 11 of the application execution terminal 10. Wireless communication technologies such as wireless LAN (WiFi), Zigbee, infrared, and various mobile communication networks (e.g., 3G, 4G) may be used.
The input/output unit 22 is formed as a touch interface capable of receiving touch input from the user; the remote control UI screen of the remote control function unit 23 is implemented on it, and it may be implemented as a general touch screen.
The remote control function unit 23 comprises synchronization-control means 24, remote control-implementing means 25, and remote UI-implementing means 26, and corresponds to a control application for the application execution terminal 10.
Since the remote control framework is installed on both the smart terminal 20 and the application execution terminal 10 on the Android platform, the synchronization-control means 24 performs synchronization for remote control through this framework.
Specifically, the synchronization-control means 24 forms a data link between the smart terminal 20 and the application execution terminal 10 and ensures that the various request signals generated by the remote control-implementing means 25 and remote UI-implementing means 26 of the smart terminal 20 are processed identically at the application execution terminal 10. To this end, the synchronization-control means 24 matches the clock frequency and IP address to identical specific values at the smart terminal 20 and the application execution terminal 10, and performs synchronization in conjunction with the remote control-implementing means 25 and the remote UI-implementing means 26.
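One way to picture this synchronization step is as an exchange of a small message carrying the shared parameters before request signals are relayed. The patent does not specify the wire format of the remote control framework, so the JSON message shape and both helper functions below are assumptions for illustration only.

```python
import json

def make_sync_message(clock_hz, ip_addr):
    """Build an illustrative synchronization message carrying the clock
    frequency and IP address that both terminals set to the same values.
    The field names are assumed, not taken from the patent."""
    return json.dumps({"type": "sync", "clock_hz": clock_hz, "ip": ip_addr})

def apply_sync(state, message):
    """Apply a received sync message to a terminal's local state dict,
    so both terminals hold identical synchronization parameters."""
    msg = json.loads(message)
    if msg.get("type") == "sync":
        state["clock_hz"] = msg["clock_hz"]
        state["ip"] = msg["ip"]
    return state
```

After this exchange, each subsequent request signal (drag, long press, double click, click) can be interpreted against the same agreed parameters on both sides.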
The remote control-implementing means 25 serves as an interface for controlling the application execution terminal 10.
Basically, the remote control-implementing means 25 allows the user of the smart terminal 20 to control the remote control UI screen implemented on the input/output unit 22 of the smart terminal 20, and additionally transmits the signals generated through this control to the application execution terminal 10.
That is, the remote control-implementing means 25 is used when an exact position must be selected and executed through remote control by the smart terminal 20 while using an Android application, or a service based on it, on the application execution terminal 10. To enable precise control of the mouse pointer on the remote control UI screen, the remote control-implementing means 25 includes event-processing means 25a, mapping-processing means 25b, and mode switching-processing means 25c.
The event-processing means 25a performs various event controls through input to the input/output unit 22 so that the Android application and the service using it can be controlled at the application execution terminal 10 in accordance with their respective functions and operating methods.
More specifically, after receiving a double-touch event from the user through the input/output unit 22, the event-processing means 25a generates various request signals for the application execution terminal 10 corresponding to the type of the subsequently input events, and transmits them.
In addition, the event-processing means 25a controls the events occurring after the double touch to be processed simultaneously at the smart terminal 20 and the application execution terminal 10 in accordance with the synchronization performed by the synchronization-control means 24.
As a specific example, the event-processing means 25a determines whether a double-touch event has been input by the user through the input/output unit 22. For example, when a touch-down event (↓), a touch-up event (↑), and a touch-down event (↓) are consecutively input to the input/output unit 22 by the user, the event-processing means 25a determines that a double-touch event has occurred.
Thereafter, when a touch-drag event occurs on the input/output unit 22 within the preset first time following the double-touch event, the event-processing means 25a generates a drag request signal and controls the second radio transmitter/receiver 21 to transmit it to the application execution terminal 10. In this case, the touch-drag event input through the input/output unit 22 occurs when, after the double-touch event, the user slides directionally across the input/output unit 22 without lifting the finger.
Meanwhile, when the last touch state following the double-touch event (in the above example, the state in which the touch-down event has occurred) is maintained on the input/output unit 22 for the preset second time, the event-processing means 25a generates a long-press request signal and controls the second radio transmitter/receiver 21 to transmit it to the application execution terminal 10. The second time may be the same as the first time described above, or may be set to a different value so that the user can distinguish the two functions.
또한, 이벤트-처리수단(25a)은 입출력부(22)로 더블터치 이벤트 발생에 이어, 마지막 터치 상태와 방대의 방향성을 갖는 터치 이벤트(상술한 예에서의 터치다운 이벤트와 반대 방향성을 갖는 터치업 이벤트)가 발생한 경우, 더블클릭 요청신호를 생성한 뒤, 애플리케이션 실행 단말(10)로 전송하도록 제 2 무선송수신부(21)를 제어한다.In addition, the event-processing means 25a is a touch event having a directionality of the last touch state and a massive touch after the occurrence of a double touch event to the input / output unit 22 (touch-up having a direction opposite to the touchdown event in the above-described example) Event), the second radio transmitter / receiver 21 is controlled to generate a double-click request signal and then transmit it to the application execution terminal 10.
이 경우에, 이벤트-처리수단(25a)에 의해 생성되는 더블클릭 요청신호의 예는 기존의 마우스 입력 장치를 이용한 선택에 해당되는 더블 클릭과 동일한 의미로 사용될 수 있다.In this case, the example of the double click request signal generated by the event-processing means 25a can be used in the same sense as the double click corresponding to the selection using the existing mouse input device.
In addition, the mapping-processing means 25b controls the synchronization-control means 24 while the event-processing means 25a processes event synchronization, so that the screen of the input/output unit 22 of the smart terminal 20 and the screen of the output unit 12 of the application execution terminal 10 are mapped 1:1.
Accordingly, when the user touches a target position on the input/output unit 22 of the smart terminal 20, the mapping-processing means 25b generates a touch request signal for the mapped position on the output unit 12 of the application execution terminal 10 and controls the second wireless transceiver 21 to transmit it to the application execution terminal 10.
The concept of the mapping performed by the mapping-processing means 25b is illustrated in FIG. 2. FIG. 2 shows the case in which the aspect ratio is kept the same and only the number of pixels (the resolution) differs; when the aspect ratios differ, the coordinates can be converted according to a fixed-aspect-ratio setting.
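By way of illustration only (not part of the disclosure), the 1:1 coordinate mapping just described can be sketched as a small function. The function name is an assumption, and centring the mapped area (letterboxing) in the fixed-aspect-ratio case is likewise an assumed design choice; the uniform scaling when only the resolution differs follows the text.

```python
def map_touch(x, y, src_w, src_h, dst_w, dst_h, keep_aspect=True):
    """Map a touch point from the smart terminal screen (src) to the
    application execution terminal screen (dst)."""
    if not keep_aspect:
        # Aspect ratios match: scale each axis independently.
        return round(x * dst_w / src_w), round(y * dst_h / src_h)
    # Fixed-aspect-ratio setting: one uniform scale factor, with the
    # mapped area centred on the destination screen (letterboxing).
    scale = min(dst_w / src_w, dst_h / src_h)
    off_x = (dst_w - src_w * scale) / 2
    off_y = (dst_h - src_h * scale) / 2
    return round(x * scale + off_x), round(y * scale + off_y)
```

For example, with a 480x800 smart terminal and a 1280x720 output screen, the centre point (240, 400) maps to the destination centre (640, 360).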
In addition, when controlling the application execution terminal 10, the mapping-processing means 25b may implement the back-key, which is provided on the application execution terminal 10 as a physical interface (PI), as a mapped icon on the input/output unit 22 of the smart terminal 20. Accordingly, when the mapped icon implementing the back-key is touched by the user, the mapping-processing means 25b generates a back-key request signal and controls the second wireless transceiver 21 to transmit it to the application execution terminal 10.
Meanwhile, the mapping-processing means 25b may also be implemented using the camera module of the smart terminal 20. Referring to FIG. 2(b), the camera module (not shown) of the smart terminal 20 captures the display screen of the application execution terminal 10, and the captured screen is displayed on the input/output unit of the smart terminal 20. The resolution and aspect ratio between the smart terminal 20 and the application execution terminal 10 are matched as described above. In this way, the screens of the input/output unit 22 of the smart terminal 20 and the output unit 12 of the application execution terminal 10 are controlled so that 1:1 mapping is achieved.
Next, when a drag event occurs in an outward direction from the top, bottom, left, or right edge region of the remote control UI screen of the smart terminal 20, the mode-switching-processing means 25c can perform a specific function switch, as shown in FIGS. 3 and 4.
The specific functions performed by the mode-switching-processing means 25c include control-mode switching and control-method switching. Here, the 'control mode' may be a touch mode, a mouse mode, a keypad mode, a browser mode, a game mode, and the like, and the 'control method' may be a drag, a zoom-in/zoom-out, movement using the mouse pointer, a click using the mouse pointer, and the like.
Meanwhile, as shown in FIG. 4, when a drag event occurs outward from the left edge region of the remote control UI screen, the terminal switches to the mouse mode, and when a drag event occurs outward from the right edge region, the control method is switched. Likewise, an outward drag event from the top edge region may switch the terminal to the touch mode, and an outward drag event from the bottom edge region may switch it to the keypad mode; this is only one example, and various combinations of control-mode and control-method switching may be implemented.
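As a purely illustrative sketch (not the disclosed implementation), the edge-drag detection above can be expressed as a classifier. The edge-to-action assignment mirrors the example of FIG. 4 described in the text, while the 40-pixel edge-region size, the action names, and the outward-movement test are assumptions.

```python
# Edge-to-action assignment, following the example in the text.
EDGE_ACTIONS = {
    "left": "switch_to_mouse_mode",
    "right": "switch_control_method",
    "top": "switch_to_touch_mode",
    "bottom": "switch_to_keypad_mode",
}

def classify_edge_drag(start, end, width, height, margin=40):
    """Return the action for a drag that starts in an edge region and
    moves outward, or None for an ordinary drag. `margin` (pixels) is
    an assumed size for the edge regions."""
    x0, y0 = start
    x1, y1 = end
    if x0 <= margin and x1 < x0:                 # outward past the left edge
        return EDGE_ACTIONS["left"]
    if x0 >= width - margin and x1 > x0:         # outward past the right edge
        return EDGE_ACTIONS["right"]
    if y0 <= margin and y1 < y0:                 # outward past the top edge
        return EDGE_ACTIONS["top"]
    if y0 >= height - margin and y1 > y0:        # outward past the bottom edge
        return EDGE_ACTIONS["bottom"]
    return None
```

A drag starting in the middle of the screen returns None, so ordinary drags remain available for the current control method.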
When the control mode is switched to the keypad mode, as shown in FIG. 5, the mode-switching-processing means 25c transmits the various operation events generated on the smart terminal 20 through the second wireless transceiver 21 to the first wireless transceiver 11 of the application execution terminal 10.
The remote-UI-implementing means 26 implements the remote control UI screen of the smart terminal 20 for controlling the application execution terminal 10.
More specifically, when a browser is executed on the application execution terminal 10, three control regions are needed: a scroll region 22a operated by dragging, a zoom region 22b for zooming the screen in and out, and a touchpad region 22c for moving the mouse pointer or clicking with it. Accordingly, the remote-UI-implementing means 26 divides the remote control UI screen of the input/output unit 22 so that these three control regions are output as separate partitions, receives events from them, and generates the corresponding request signals. An example of this partitioned output may be implemented as shown in FIG. 6.
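By way of illustration (the actual layout is the one shown in FIG. 6, which is not reproduced here), dispatching a touch to one of the three browser-mode control regions can be sketched as follows; the particular split into left, right, and centre strips is an assumption.

```python
def region_for_touch(x, y, width, height):
    """Assumed layout: the left quarter is the scroll region 22a, the
    right quarter the zoom region 22b, and the centre the touchpad
    region 22c."""
    if x < width // 4:
        return "scroll"     # 22a: drag to scroll
    if x >= width - width // 4:
        return "zoom"       # 22b: zoom in / zoom out
    return "touchpad"       # 22c: move or click the mouse pointer
```

Each region's events would then be turned into the corresponding request signal before transmission.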
Accordingly, the remote-UI-implementing means 26 controls the second wireless transceiver 21 to transmit the request signals generated from events in the individual partitions to the application execution terminal 10.
In addition, when keypad input is required on the application execution terminal 10, the remote-UI-implementing means 26 implements a QWERTY keypad screen on the input/output unit 22 of the smart terminal 20, as shown in FIG. 7. Accordingly, when the user enters characters such as English letters, Korean characters, or special characters using the QWERTY keypad screen, a request signal for entering the same characters is generated, and the second wireless transceiver 21 is controlled to transmit it to the application execution terminal 10.
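The patent does not specify the wire format of these request signals; as one hypothetical framing, a key-input request could be encoded as a small JSON message. The field names and the use of JSON are illustrative assumptions, not taken from the disclosure.

```python
import json

def keypad_request(char):
    """Build a request signal (hypothetical JSON framing) asking the
    application execution terminal to input the same character that was
    typed on the smart terminal's QWERTY keypad screen."""
    return json.dumps({"type": "key_input", "char": char})
```

The receiving framework on the application execution terminal would decode the message and inject the character into the focused input field.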
A remote control framework for performing Android platform-based remote control is installed in the second storage unit 27.
Meanwhile, the same remote control framework is also installed in the first storage unit 14 of the application execution terminal 10.
FIG. 8 is a flowchart illustrating a method of controlling an Android platform-based application execution terminal using a smart terminal according to an embodiment of the present invention. Referring to FIGS. 1 to 8, after a touchdown event is input to the input/output unit 22 (S1), the remote control function unit 23 recognizes that a touch-up event is input by the user as the next event (S2).
After step S2, the remote control function unit 23 determines whether the same touchdown event as in step S1 is input again through the input/output unit 22 (S3). If the touchdown event is input again, the remote control function unit 23 judges it to be a double-touch event and performs processing according to the type of the events that follow.
After step S3, if it is determined that the touchdown event has been re-input, the remote control function unit 23 determines whether a touch-drag event is input to the input/output unit 22 within a preset first time (e.g., 1.0 second) following the double-touch event (S4).
If the touch-drag event is input as a result of the determination in step S4, the remote control function unit 23 generates a drag request signal and transmits it to the application execution terminal 10 so that the touch-drag event is processed in synchronization (S5).
If the touch-drag event is not input as a result of the determination in step S4, the remote control function unit 23 determines whether the state in which the touchdown event was re-input to the input/output unit 22 following the double-touch event is maintained for a preset second time (e.g., 1.5 seconds) (S6).
If, as a result of the determination in step S6, the re-input touchdown state is maintained for the second time, the remote control function unit 23 generates a long-press request signal and transmits it to the application execution terminal 10 so that the long-press event is processed in synchronization (S7).
If, as a result of the determination in step S6, the touchdown event is not re-input or the state is not maintained for the second time, the remote control function unit 23 determines whether a touch-up event is re-input following the double-touch event (S8).
If the touch-up event is re-input as a result of the determination in step S8, the remote control function unit 23 generates a double-click request signal and transmits it to the application execution terminal 10 so that the double-click event is processed in synchronization (S9).
Meanwhile, if the touchdown event is not re-input as a result of the determination in step S3, the remote control function unit 23 judges the input to be an ordinary click by the user, generates a click request signal, and transmits it to the application execution terminal 10 so that the click event is processed in synchronization (S10).
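The decision sequence of FIG. 8 can be sketched as a classifier over a timestamped list of touch events. This is an illustrative sketch under stated assumptions: the event representation and signal names are invented here, and the 1.0 s and 1.5 s thresholds are the example values given in the text.

```python
FIRST_TIME = 1.0   # threshold for S4 (seconds); example value from the text
SECOND_TIME = 1.5  # threshold for S6 (seconds); example value from the text

def classify(events):
    """Classify a completed gesture. `events` is a list of
    (timestamp, kind) tuples with kind in {'down', 'up', 'drag'};
    the return value names the request signal to generate."""
    kinds = [k for _, k in events]
    # S1-S3: down, up, then down again constitutes a double touch.
    if kinds[:3] != ['down', 'up', 'down']:
        return 'click_request'          # S10: an ordinary click
    if len(events) < 4:
        return 'click_request'          # incomplete gesture; treated as a click here
    t_follow, kind = events[3]
    dt = t_follow - events[2][0]
    # S4-S5: a drag within the first time becomes a drag request.
    if kind == 'drag' and dt <= FIRST_TIME:
        return 'drag_request'
    # S8-S9: a prompt touch-up completes a double click.
    if kind == 'up' and dt < SECOND_TIME:
        return 'double_click_request'
    # S6-S7: holding the re-input touchdown for the second time is a long press.
    if dt >= SECOND_TIME:
        return 'long_press_request'
    return 'click_request'
```

For example, `[(0.0, 'down'), (0.1, 'up'), (0.2, 'down'), (2.0, 'up')]` is classified as a long press, since the second touchdown is held past the 1.5-second threshold before release.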
The present invention can also be embodied as computer-readable code on a computer-readable recording medium. Computer-readable recording media include all kinds of recording devices in which data readable by a computer system is stored.
Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include media embodied in the form of carrier waves (e.g., transmission over the Internet). The computer-readable recording medium can also be distributed over network-connected computer systems so that the computer-readable code is stored and executed in a distributed manner. Functional programs, code, and code segments for implementing the present invention can be readily inferred by programmers skilled in the art to which the present invention pertains.

Claims (12)

  1. A method of controlling an application execution terminal using a smart terminal through a remote control framework installed on the smart terminal and the application execution terminal on the Android platform, the method comprising:
    a first step of mapping the UI screen of the input/output unit of the smart terminal and the UI screen of the output unit of the application execution terminal 1:1;
    a second step of determining, when the smart terminal, which has a touch interface, receives a touchdown event from a user, whether a touch-up event is input as the next event;
    a third step of determining, when the smart terminal determines that the touch-up event has been input, whether the touchdown event is input again; and
    a fourth step of, when the smart terminal receives the touchdown event again, judging it to be a double-touch event, generating a request signal for the application execution terminal according to the type of event occurring after the double touch, and transmitting the request signal so that it is processed in synchronization with the event input after the double touch.
  2. The method according to claim 1, wherein the fourth step comprises:
    a step (a) of determining, by the smart terminal, whether a touch-drag event has been received within a preset first time after the double-touch event occurs; and
    a step (b) of, when the smart terminal receives the touch-drag event within the first time, generating a drag request signal and transmitting it to the application execution terminal for processing.
  3. The method according to claim 2, wherein step (b) further comprises:
    a step (c) of, when the smart terminal does not receive the touch-drag event, determining whether the state in which the touchdown event was re-input is maintained for a preset second time; and
    a step (d) of, when the state is maintained for the preset second time, generating a long-press request signal and transmitting it to the application execution terminal for processing.
  4. The method according to claim 3, wherein step (d) further comprises:
    a step (e) of, when the smart terminal does not receive the touchdown event again or the state is not maintained for the preset second time, determining whether the touch-up event is received again after the double-touch event occurs; and
    a step (f) of, when the smart terminal receives the touch-up event again, generating a double-click request signal and transmitting it to the application execution terminal for processing.
  5. The method according to claim 4, wherein the first step comprises:
    a step (g) of capturing the screen of the output unit of the application execution terminal using the camera module of the smart terminal;
    a step (h) of displaying the captured image on the input/output unit of the smart terminal; and
    a step (i) of achieving the mapping for event processing by reflecting the resolution and aspect ratio of the smart terminal and the application execution terminal.
  6. A computer-readable recording medium on which an Android platform-based application execution terminal control program using a smart terminal is recorded, the program comprising:
    synchronization-control means for performing synchronization for remote control through a remote control framework installed on the smart terminal and the application execution terminal on the Android platform;
    event-processing means for receiving a double-touch event from a user through the input/output unit of the smart terminal, generating a request signal for the application execution terminal according to the type of event occurring after the double-touch event, and transmitting the request signal so that, through the synchronization-control means, it is processed in synchronization with the event input after the double touch; and
    mapping-processing means for controlling the UI screen of the input/output unit and the UI screen of the output unit of the application execution terminal to be mapped 1:1 during the event synchronization processing by the event-processing means.
  7. The computer-readable recording medium according to claim 6, wherein the mapping-processing means implements a back-key, provided on the application execution terminal as a physical interface (PI), as a mapped icon on the input/output unit, and, when the back-key implemented as the mapped icon is touched by the user, generates a back-key request signal and transmits it to the application execution terminal for processing.
  8. The computer-readable recording medium according to claim 6, wherein the program further comprises mode-switching-processing means for controlling control-mode switching and control-method switching when a drag event occurs in an outward direction from the top, bottom, left, or right edge region of the remote control UI screen implemented on the input/output unit.
  9. The computer-readable recording medium according to claim 8, wherein the control-mode switching is switching among a touch mode, a mouse mode, a keypad mode, a browser mode, and a game mode, and the control-method switching is switching among a drag method, a zoom-in/zoom-out method, movement using a mouse pointer, and a click method using a mouse pointer.
  10. The computer-readable recording medium according to claim 6, wherein the program further comprises remote-UI-implementing means for implementing the remote control UI screen of the input/output unit for controlling the application execution terminal, and for controlling the remote control UI screen, when a browser is executed on the application execution terminal, to be output as partitions divided into a scroll region operated by dragging, a zoom region for zooming the screen in and out, and a touchpad region for moving the mouse pointer or clicking with it.
  11. The computer-readable recording medium according to claim 10, wherein the remote-UI-implementing means implements a QWERTY keypad screen on the input/output unit when keypad input is required on the application execution terminal, and, when characters are entered using the QWERTY keypad screen, generates a request signal for entering the same characters and transmits it to the application execution terminal for processing.
  12. The computer-readable recording medium according to claim 7, wherein the mapping-processing means captures the screen of the output unit of the application execution terminal using the camera module of the smart terminal, displays the captured image on the input/output unit of the smart terminal, and then achieves the mapping for event processing by reflecting the resolution and aspect ratio of the smart terminal and the application execution terminal.
PCT/KR2011/007039 2011-09-24 2011-09-25 Method of controlling an android platform-based application execution terminal using a smart terminal and computer-readable medium having a computer program for controlling the android platform-based application execution terminal using the smart terminal recorded thereon WO2013042815A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110096618A KR101325026B1 (en) 2011-09-24 2011-09-24 Control method for application execution terminal based on android platform using smart-terminal, and computer-readable recording medium for the same
KR10-2011-0096618 2011-09-24

Publications (1)

Publication Number Publication Date
WO2013042815A1 true WO2013042815A1 (en) 2013-03-28

Family

ID=47914563

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/007039 WO2013042815A1 (en) 2011-09-24 2011-09-25 Method of controlling an android platform-based application execution terminal using a smart terminal and computer-readable medium having a computer program for controlling the android platform-based application execution terminal using the smart terminal recorded thereon

Country Status (2)

Country Link
KR (1) KR101325026B1 (en)
WO (1) WO2013042815A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150139336A (en) 2014-06-03 2015-12-11 삼성전자주식회사 Image forming apparatus and method for providing user interface screen thereof
KR101635614B1 (en) * 2014-11-27 2016-07-01 이동섭 An interface configuration method of a smart device for controlling eletronic devices
KR20190025328A (en) * 2017-09-01 2019-03-11 삼성전자주식회사 Application launch method and electronic device implementing the same
KR102495326B1 (en) 2018-01-31 2023-02-02 삼성전자주식회사 Electronic device and control method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004064269A (en) * 2002-07-26 2004-02-26 Matsushita Electric Ind Co Ltd Portable terminal
KR20060125735A (en) * 2003-11-04 2006-12-06 코닌클리케 필립스 일렉트로닉스 엔.브이. Universal remote control device with touch screen
KR20090128142A (en) * 2008-06-10 2009-12-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR20110062296A (en) * 2009-12-03 2011-06-10 엘지전자 주식회사 Mobile terminal, electronic device and method of controlling electronic device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678081B2 (en) 2012-09-27 2017-06-13 Fujifilm Corporation Chromatography method and chromatographic kit
CN103634636A (en) * 2013-11-13 2014-03-12 Tcl集团股份有限公司 Application shortcut operation and control method and system of Android intelligent television
CN103634636B (en) * 2013-11-13 2017-10-24 Tcl集团股份有限公司 A kind of application fast control method and system of Android intelligent television
CN103838375A (en) * 2014-02-28 2014-06-04 深圳市深信服电子科技有限公司 Terminal input method and terminal
CN111818153A (en) * 2020-07-06 2020-10-23 斑马网络技术有限公司 Vehicle machine remote touch control method, server, vehicle machine and user terminal
CN113792284A (en) * 2021-09-18 2021-12-14 读书郎教育科技有限公司 Method and device for realizing application control of Android terminal
CN113792284B (en) * 2021-09-18 2023-06-13 读书郎教育科技有限公司 Method and device for realizing application management and control of Android terminal

Also Published As

Publication number Publication date
KR20130032924A (en) 2013-04-03
KR101325026B1 (en) 2013-11-08

Similar Documents

Publication Publication Date Title
WO2013042815A1 (en) Method of controlling an android platform-based application execution terminal using a smart terminal and computer-readable medium having a computer program for controlling the android platform-based application execution terminal using the smart terminal recorded thereon
CN103024504B (en) Based on the intelligent remote control system of digital TV set-top box
CN111083684B (en) Method for controlling electronic equipment and electronic equipment
WO2014142471A1 (en) Multi-input control method and system, and electronic device supporting the same
WO2012169784A2 (en) Apparatus and method for providing web browser interface using gesture in device
CN104618793B (en) A kind of information processing method and electronic equipment
WO2011099803A2 (en) Apparatus and method for performing multi-tasking
WO2011046345A2 (en) Method for controlling portable device, display device, and video system
US20120146884A1 (en) Control transfer apparatus, control transfer system and method thereof
WO2013070024A1 (en) Method and apparatus for designating entire area using partial area touch in a portable equipment
CN105808181B (en) Image mediating device, interactive display system and its operating method
WO2014107005A1 (en) Mouse function provision method and terminal implementing the same
WO2009157730A2 (en) System for controlling devices and information on network by using hand gestures
WO2011025188A2 (en) Method for providing control widget and device using the same
WO2013042921A1 (en) Apparatus and method for running application in mobile terminal
JP2014146229A (en) Screen sharing system and central device
WO2012150744A1 (en) Application clone executing method, computer readable recording medium, and clone terminal for supporting same
WO2012093779A2 (en) User terminal supporting multimodal interface using user touch and breath and method for controlling same
CN103516882A (en) Method and system for playing pictures based on multi-screen interaction scene
CN104202637A (en) Key remote control and target dragging method
WO2015064984A1 (en) Electronic device and communication system having the same
CN201985899U (en) Control switching device and control switching system
WO2023234495A1 (en) Control method of portable terminal
WO2019203591A1 (en) High efficiency input apparatus and method for virtual reality and augmented reality
CN103607620A (en) Mobile communication terminal method and apparatus for controlling intelligent television

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11872774

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11872774

Country of ref document: EP

Kind code of ref document: A1